Argument list too long error for rm, cp, mv commands
Solution 1:
This occurs because bash expands the asterisk to every matching file, producing a very long command line.
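For example, in a directory with enough matching files to overflow the limit, the failure happens before rm even runs (a hypothetical session; the exact wording of the error varies by shell and system):
rm *.pdf
# bash: /bin/rm: Argument list too long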
Try this:
find . -name "*.pdf" -print0 | xargs -0 rm
Warning: this is a recursive search and will find (and delete) files in subdirectories as well. Tack on -f to the rm command only if you are sure you don't want confirmation.
You can do the following to make the command non-recursive:
find . -maxdepth 1 -name "*.pdf" -print0 | xargs -0 rm
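If you want to preview what would be removed first, a simple sketch is to swap rm for echo in the same pipeline (xargs batches the names across one or more echo invocations, so nothing is deleted):
find . -maxdepth 1 -name "*.pdf" -print0 | xargs -0 echo rm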
Another option is to use find's -delete action:
find . -name "*.pdf" -delete
Solution 2:
tl;dr
It's a kernel limitation on the size of the command-line argument list. Use a for loop instead.
Origin of the problem
This is a system issue, related to the execve system call and the ARG_MAX constant. There is plenty of documentation about it (see man execve and Debian's wiki). Basically, the expansion produces a command (with its parameters) that exceeds the ARG_MAX limit.
Before kernel 2.6.23, the limit was set at 128 kB. This constant has since been increased, and you can get its current value by executing:
getconf ARG_MAX
# 2097152 # on 3.5.0-40-generic
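If you are curious how close your expansion comes to that limit, here is a rough sketch using a shell builtin (printf is built into bash, so the glob here never passes through execve and cannot itself hit the limit; the count is a lower bound, since the kernel also charges environment variables and per-argument pointer overhead):
printf '%s\0' *.pdf | wc -c
# bytes the expanded file names (plus their terminators) would occupy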
Solution: Using a for loop
Use a for loop, as recommended on BashFAQ/095; there is no limit except for RAM/memory space:
Dry run to verify that it will delete what you expect:
for f in *.pdf; do echo rm "$f"; done
And execute it:
for f in *.pdf; do rm "$f"; done
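One bash-specific caveat, sketched below: if no file matches, the unexpanded pattern is passed through literally and the loop runs once on the string *.pdf. Enabling nullglob avoids that, and -- guards against file names starting with a dash:
shopt -s nullglob
for f in *.pdf; do rm -- "$f"; done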
This is also a portable approach, as globs have strong and consistent behavior among shells (they are part of the POSIX spec).
Note: As several comments have pointed out, this is indeed slower but more maintainable, since it can adapt to more complex scenarios, e.g. where one wants to do more than just one action.
Solution: Using find
If you insist, you can use find, but really don't use xargs, as it "is dangerous (broken, exploitable, etc.) when reading non-NUL-delimited input":
find . -maxdepth 1 -name '*.pdf' -delete
Using -maxdepth 1 ... -delete instead of -exec rm {} + allows find to perform the required system calls itself, without spawning an external process, which makes it faster (thanks to @chepner's comment).
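The question also mentions cp and mv; find has no built-in action for those, so the same batching can be had with -exec ... {} +. A sketch for moving the files (mv -t is a GNU coreutils option, and /path/to/dest is a placeholder):
find . -maxdepth 1 -name '*.pdf' -exec mv -t /path/to/dest/ {} +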
References
- I'm getting "Argument list too long". How can I process a large list in chunks? @ wooledge
- execve(2) - Linux man page (search for ARG_MAX)
- Error: Argument list too long @ Debian's wiki
- Why do I get "/bin/sh: Argument list too long" when passing quoted arguments? @ SuperUser
Solution 3:
find has a -delete action:
find . -maxdepth 1 -name '*.pdf' -delete
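Note that -delete (like -maxdepth) is a GNU/BSD extension rather than POSIX. A portable sketch that still batches the names into as few rm invocations as possible, at the cost of descending into subdirectories, is:
find . -name '*.pdf' -exec rm -f {} +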