Bash script to delete files older than x days with subdirectories

I'm trying to delete a ton of files older than x days.

Right now I have a command that does that:

find /path/to/files* -mtime +10 -exec rm {} \; 

But this will also delete the subdirectories. There are a ton of folders, but I would like to keep them and only delete the files older than 10 days within those folders.

Is there a way to do this?


Solution 1:

The -type option for filtering results

find accepts the -type option for selecting, for example, only files.

find /path/to/files -type f -mtime +10 -delete

Leave out -delete to show what it'd delete, and once you've verified that, go ahead and run the full command.

That would only run on files, not directories. Use -type d for the inverse, only listing directories that match your arguments.
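
For example, applied to the path from the question, a preview-then-delete run could look like this:

# Preview: list regular files older than 10 days (nothing is deleted yet)
find /path/to/files -type f -mtime +10

# Once the list looks right, add -delete to actually remove them
find /path/to/files -type f -mtime +10 -delete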


Additional options

You might want to read man find, as there are more options you could need in the future. For example, -maxdepth lets you restrict the results to a specific depth, e.g. -maxdepth 1 keeps find from descending into subdirectories.
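
For instance, if you only wanted to touch files that sit directly in /path/to/files and leave everything in its subdirectories alone, something like this should do:

# Act only on the top level; don't descend into subdirectories
find /path/to/files -maxdepth 1 -type f -mtime +10 -delete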

Some remarks

  • I wonder how the command would have removed a folder, since you can't remove a folder with plain rm. You'd need rm -r for that.

  • Also, /path/to/files* is confusing. Did you mean /path/to/files/ or are you expecting the wildcard to expand to several file and folder names?

  • Put the {} in single quotes, i.e. '{}', to avoid having the substituted file/directory name interpreted by the shell, just like we protect the semicolon with a backslash (see the example below).
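
Putting those remarks together, the original command rewritten to touch only files would look roughly like this:

# Delete only regular files older than 10 days; {} is quoted and the ; escaped
find /path/to/files -type f -mtime +10 -exec rm '{}' \;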

Solution 2:

As in the previous answers (+1 for both), the trick is to use the -type f predicate.

Note that instead of -exec rm '{}' \; you can also use the -delete predicate. But don't do that right away. With -exec rm '{}' \; you can (and should) first run -exec echo rm '{}' \; to verify that this is really what you want. After that, rerun the command without the echo.
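
Applied to the question's case, that verification step could look like this:

# Dry run: print the rm commands instead of executing them
find /path/to/files -type f -mtime +10 -exec echo rm '{}' \;

# When the output looks right, drop the echo and run it for real
find /path/to/files -type f -mtime +10 -exec rm '{}' \;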

Using -delete is faster (no extra fork() and execve() for each file), but it is risky, because -delete also acts as a condition that always evaluates to true, so the order of arguments matters:

# delete *.tmp files
find . -type f -name '*.tmp' -delete

but if you merely swap the arguments around:

# delete ALL files (and directories): -delete acts before the tests are applied
find . -delete -type f -name '*.tmp'

If you ever need find and rm to work faster for tons of files, check out the find ... | xargs ... rm UNIX idiom.
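
A minimal sketch of that idiom (assuming GNU find and xargs; -print0 and -0 keep names with spaces or newlines safe):

# Batch deletions through xargs instead of forking one rm per file
find /path/to/files -type f -mtime +10 -print0 | xargs -0 rm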

Solution 3:

You can easily do this with the find command:

$ find -type f

which restricts the results to regular files.

Solution 4:

I was struggling to get this right using the scripts provided above and some other scripts, especially when file and folder names contained newlines or spaces.

I finally stumbled on tmpreaper, and it has worked pretty well for us so far.

tmpreaper -t 5d ~/Downloads


tmpreaper --protect '*.c' -t 5h ~/my_prg


It has features like a test mode, which recursively checks the directories and lists what would be removed, the ability to delete symlinks, files, or directories, and a protection mode that keeps files matching a given pattern while deleting.
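
For example, tmpreaper's --test option (its dry-run mode) can be used to see what would be removed before running it for real:

# Dry run: report what would be removed from ~/Downloads, but delete nothing
tmpreaper --test -t 5d ~/Downloads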