How to delete folders that have n or fewer files in them?
Solution 1:
One way would be to use find's -exec action to execute a custom test of the number of files. One could use a second find command along with wc to find and count the files within each directory, but a better option is probably to use shell globbing to slurp the filenames into an array, then return a logical value indicating whether the size of the array is below the threshold, i.e.
files=( dir/* ); ((${#files[@]} < 10))
Putting it all together, we should be able to list all subdirectories with fewer than 10 files (including sub-subdirectories) using
find . -depth -type d ! -name '.' -exec bash -c '
shopt -s nullglob; files=( "$1"/* ); ((${#files[@]} < ${2:-10}))
' bash {} 10 \; -print0 | xargs -0
(You can adjust the number 10 after the bash {} for different thresholds - the ${2:-10} parameter expansion makes it default to 10 files if no second argument is given.) For example, given
$ tree .
.
├── bar
│   ├── file1
│   ├── file2
│   ├── file3
│   ├── file4
│   ├── file5
│   ├── file6
│   ├── file7
│   ├── file8
│   ├── file9
│   └── other file
├── baz
│   ├── other file
│   └── subdir
└── foo
    └── somefile

4 directories, 12 files
then
find . -depth -type d ! -name '.' -exec bash -c '
shopt -s nullglob; files=( "$1"/* ); ((${#files[@]} < "${2:-10}"))
' bash {} 10 \; -print0 | xargs -0
./foo ./baz/subdir ./baz
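If you want to reproduce this locally, the sample tree can be rebuilt in a scratch directory and the same command run against it; a sketch (the temporary directory name is arbitrary):

```shell
#!/bin/bash
# Recreate the example tree in a scratch directory and run the find test.
tmp=$(mktemp -d) && cd "$tmp" || exit 1

mkdir -p bar baz/subdir foo
touch bar/file{1..9} 'bar/other file' 'baz/other file' foo/somefile

# Print every directory (other than '.') containing fewer than 10 files
find . -depth -type d ! -name '.' -exec bash -c '
    shopt -s nullglob; files=( "$1"/* ); ((${#files[@]} < ${2:-10}))
' bash {} 10 \; -print0 | xargs -0

cd / && rm -rf "$tmp"           # clean up the scratch directory
```

bar has 10 entries, so it is not printed; foo, baz and baz/subdir all fall below the threshold (the exact order may vary with filesystem directory order).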
If that appears to be doing the right thing, you can actually remove the directories by adding rm -rf - but please be very careful with this; remember there is no 'undo':
find . -depth -type d ! -name '.' -exec bash -c '
shopt -s nullglob; files=( "$1"/* ); ((${#files[@]} < ${2:-10}))
' bash {} 10 \; -print0 | xargs -0 rm -rf
(The xargs could be eliminated by using another -exec action to run rm more directly, but the formulation above makes it easy to generalize - print, stat, remove, or whatever.)
Note that this will act recursively, i.e. a directory that initially has more than n files (including subdirectories) may be removed as a result of some of those subdirectories themselves getting removed as find backs up the directory tree. The current directory is explicitly protected from possible deletion by the ! -name '.' test.
If you don't need it to act recursively, you can simply loop over directories and perform the same file-count test, e.g. to remove all first-level directories in the current directory containing fewer than 10 files:
shopt -s nullglob
for d in */; do
  files=( "$d"* )
  ((${#files[@]} < 10)) && echo rm -rf -- "$d"
done