How to list files from the n biggest directories?
I want to find the n biggest subdirectories (e.g. 3) of my current directory and then list all the files in them. I am not interested in finding the biggest files; there are plenty of solutions for that, and it is not what I want.
I have found my 3 biggest subdirectories with:
$ du -hs */ | sort -rh | head -3
with result:
212K 04/
52K 02/
20K 03/
but somehow I am not able to list the files from these directories (piping to ls didn't work). Any suggestions on how to do that?
Solution 1:
You can use the results of your piped commands with a while loop:
du -hs */ | sort -rh | head -3 | while read -r size dir
do
ls -l "$dir"
done
As a one liner:
du -hs */ | sort -rh | head -3 | while read -r size dir ; do ls -l "$dir"; done
Thanks to steeldriver for suggesting the use of read in a while loop rather than a for loop over awk output, which handles filenames with spaces and special characters more robustly.
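As a quick way to see the loop in action, here is a self-contained sketch (the scratch directory, subdirectory names, and file sizes are invented for illustration) that builds three subdirectories and lists the files of the two biggest:

```shell
#!/bin/sh
# Sketch: exercise the while/read loop in a throwaway directory.
# Names and sizes below are made up purely for illustration.
set -e
tmp=$(mktemp -d)
cd "$tmp"
mkdir big mid small

# Give each subdirectory a file of a different size (written out in
# full, not sparse, so du sees real blocks).
dd if=/dev/zero of=big/data.bin   bs=1024 count=300 2>/dev/null
dd if=/dev/zero of=mid/data.bin   bs=1024 count=100 2>/dev/null
dd if=/dev/zero of=small/data.bin bs=1024 count=10  2>/dev/null

# List the contents of the 2 biggest subdirectories.
report=$(du -hs */ | sort -rh | head -2 | while read -r size dir
do
    printf '== %s (%s) ==\n' "$dir" "$size"
    ls -l "$dir"
done)
echo "$report"
```

`read -r size dir` puts the first field into `size` and everything after it into `dir`, so a directory name containing spaces survives as long as `"$dir"` stays quoted.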
Solution 2:
Command substitution, with help from awk to get the first 3 directories:
ls -l $(du -hs */ | sort -rh | awk 'NR==4{exit} {print $2}')
This assumes no directory name contains whitespace or other unusual characters.
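To see why the caveat matters, here is a small sketch (scratch directory and the name "my docs" are invented for illustration). awk's `$2` stops at the first blank, and the unquoted `$(...)` is word-split anyway, so a name with a space never reaches ls intact:

```shell
#!/bin/sh
# Sketch: demonstrate how a space in a directory name breaks the
# command-substitution approach.
set -e
tmp=$(mktemp -d)
cd "$tmp"
mkdir "my docs"
printf 'hello\n' > "my docs/file.txt"

# du prints "8.0K<TAB>my docs/"; awk's $2 is only "my", and even a
# correctly extracted name would be word-split by the unquoted $(...),
# so ls is handed a name that does not exist.
if ls -l $(du -hs */ | sort -rh | awk 'NR==4{exit} {print $2}') >/dev/null 2>&1
then
    result="listed"
else
    result="ls could not resolve the split name"
fi
echo "$result"
```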
Robust approach: to handle any possible directory name, have awk output the names NUL-separated and let xargs deal with each directory:
du -0hs */ | sort -zrh | awk 'BEGIN{RS="\0"} NR==4{exit} {sub(/^[^\t]+\t/, ""); printf("%s\0", $0)}' | \
xargs -0 ls -l
(The size field is stripped with sub() rather than read from $2, because $2 would stop at the first blank in a directory name.)
Or one directory's contents at a time:
du -0hs */ | sort -zrh | awk 'BEGIN{RS="\0"} NR==4{exit} {sub(/^[^\t]+\t/, ""); printf("%s\0", $0)}' | \
xargs -0 -I{} ls -l {}