Why is using the '-execdir' action insecure for a directory that is in the PATH?
Solution 1:
You could run the wrong program. Someone could make you run their program.
The -execdir action runs your command from the directory that contains the file(s) found. When $PATH contains relative paths, such as . or anything else that doesn't start with /, -execdir is insecure: a directory where a file is found (or another directory resolved relative to it) could also contain an executable with the same name as the one you are trying to run, and that potentially untrusted executable would then be run instead.
Another user could deliberately exploit this to make you run their program instead of the one you intended, which might cause harm or breach data security. Or, less often, the wrong program might simply be run inadvertently, even without anyone trying to cause the problem.
If everything in your PATH environment variable is an absolute path, this error should not occur, even if the directory you're searching and -execdir'ing from is contained in PATH. (I've checked that this works.) If you believe you don't have any relative directories in $PATH but are still getting this error, please update your question with details, including the output of echo "$PATH".
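One quick way to list any offending entries (a minimal sketch; note that an empty entry, such as a leading or trailing colon, also counts as the current directory):

```shell
# List every $PATH entry that is not absolute. Empty lines in the output
# correspond to empty entries (e.g. a leading/trailing colon), which the
# shell treats as the current directory.
echo "$PATH" | tr ':' '\n' | grep -v '^/'
```

If this prints nothing, find should accept -execdir.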
A concrete example.
As an example of what could go wrong, suppose:
- Alice has . in her $PATH because she wants to be able to run programs in whatever directory she's cd'd to, without bothering to prepend their names with ./.
- Alice's frenemy Eve has shared /home/eve/shared with Alice.
- Alice wants statistics (lines, words, bytes) on the .c files Eve has shared with her.
So Alice runs:
find ~eve/shared -name \*.c -execdir wc {} \;
Unfortunately for Alice, Eve has created her own script, named it wc, set it executable (chmod +x), and placed it clandestinely in one of the directories under /home/eve/shared. Eve's script looks like this:
#!/bin/sh
/usr/bin/wc "$@"
do_evil # Eve replaces this command with whatever evil she wishes to do
So when Alice uses find with -execdir to run wc on the files Eve has shared, and it gets to the files in the same directory as Eve's custom wc script, Eve's wc runs, with all of Alice's privileges!
(Being crafty, Eve has made her wc script act as a wrapper for the system wc, so Alice won't even know something has gone wrong, i.e., that do_evil was run. However, both simpler and more sophisticated variations are possible.)
How find prevents this.
find prevents this security problem by refusing to take the -execdir action when $PATH contains a relative directory. It offers two diagnostic messages, depending on the specific situation.
- If . is in $PATH, then (as you've seen) it says:
  find: The current directory is included in the PATH environment variable, which is insecure in combination with the -execdir action of find. Please remove the current directory from your $PATH (that is, remove "." or leading or trailing colons)
  It probably has a special message for the . case as it's especially common.
- If a relative path other than . (say, foo) appears in $PATH and you run find with -execdir, it says:
  find: The relative path `foo' is included in the PATH environment variable, which is insecure in combination with the -execdir action of find. Please remove that entry from $PATH
It's better not to have relative paths in $PATH at all. The risk posed by . or other relative paths in $PATH is especially heightened when using a utility that automatically changes directories, which is why find won't let you use -execdir in this situation.
But having relative paths, especially ., in your $PATH is inherently risky and really best avoided anyway. Consider the fictional situation in the example above. Suppose that instead of running find, Alice simply cd's to ~eve/shared/blah and runs wc *.c. If blah contains Eve's wc script, do_evil runs as Alice.
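This shadowing is easy to reproduce safely. The following sketch uses a throwaway directory and a harmless fake wc (all names here are illustrative, standing in for Eve's script):

```shell
# Demonstrate that "." in PATH makes a local script shadow the real wc.
demo=$(mktemp -d)              # throwaway directory; nothing evil in here
cd "$demo"
printf '#!/bin/sh\necho "not the real wc"\n' > wc
chmod +x wc                    # just like Eve's chmod +x
PATH=.:/usr/bin wc /dev/null   # the local ./wc wins the PATH lookup
```

Running it prints "not the real wc" rather than wc's usual counts, because the . entry is searched before /usr/bin.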
Solution 2:
There is much more detailed information here. Another excellent reference is here. To quote from the first reference:
The option -execdir is a more modern option introduced in GNU find as an attempt to create a safer version of -exec. It has the same semantics as -exec, with two important enhancements:
It always provides an absolute path to the file (using a relative path to a file is really dangerous in the case of -exec).
In addition to providing an absolute path, it also checks the PATH variable for safety (if a dot is present in the PATH env variable, you can pick up an executable from the wrong directory).
From the second reference:
The ‘-execdir’ action refuses to do anything if the current directory is included in the $PATH environment variable. This is necessary because ‘-execdir’ runs programs in the same directory in which it finds files – in general, such a directory might be writable by untrusted users. For similar reasons, ‘-execdir’ does not allow ‘{}’ to appear in the name of the command to be run.
Solution 3:
The main problem is with the value of the PATH environment variable, which contains relative folders; for security reasons the find command therefore won't let you execute binaries, because it could potentially execute the wrong programs.
So, for example, if you have your current dir in your PATH, as per the warning you get:
The current directory is included in the PATH environment variable.
and you run your command:
find . -type f -name 'partOfFileNames*' -execdir rm -- {} +
then, if you happen to have a local script named rm (with the executable flag set) containing rm -fr / in it, it could remove all your files, because instead of executing the expected /bin/rm you would execute the rm from the current dir, which is probably not what you wanted.
As a side note, this is a known issue in Travis CI (GH #2811), where it fails with the error:
find: The relative path `./node_modules/.bin' is included in the PATH environment variable, which is insecure in combination with the -execdir action of find. Please remove that entry from $PATH
So the solution is to remove the affected entry from the PATH variable, e.g.
PATH=`echo $PATH | sed -e 's/:\.\/node_modules\/\.bin//'`
as proposed by drogus. The progress of this bug can be followed at GH #4862.
Here is a Bash version of the workaround:
PATH=${PATH//:\.\/node_modules\/\.bin/}
Example usage (passing the filtered PATH to a specific command):
env PATH=${PATH//:\.\/node_modules\/\.bin/} find . -type f
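A quick sanity check of the parameter expansion above, using a made-up example PATH value:

```shell
# Assumed example PATH containing the offending relative entry.
PATH_EXAMPLE='/usr/bin:./node_modules/.bin:/bin'
# The ${var//pattern/} expansion deletes every ":./node_modules/.bin".
echo "${PATH_EXAMPLE//:\.\/node_modules\/\.bin/}"   # → /usr/bin:/bin
```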
Solution 4:
Set PATH=/usr/bin
This is not ideal, but it tends to solve most use cases, supposing both find and the program to be run (rm here) are in that directory:
PATH=/usr/bin find -execdir rm ...
You can just add any further directories to that PATH as needed.
xargs and bash -c cd workaround
OK, I give up:
find . -type f |
xargs -I '{}' bash -c 'cd "$(dirname "{}")" && pwd && echo "$(basename "{}")"'
sed workaround
A bit less nice than the previous workaround:
PATH="$(echo "$PATH" | sed -E 's/(^|:)[^\/][^:]*//g')" find . -execdir echo '{}' \;
A testcase:
[ "$(printf '/a/b::c/d:/e/f\n' | sed -E 's/(^|:)[^\/][^:]*//g')" = '/a/b:/e/f' ] || echo fail
For rename specifically, you can also work around it with some Perl regex-fu: https://stackoverflow.com/questions/16541582/finding-multiple-files-recursively-and-renaming-in-linux/54163971#54163971
RTFS hope crushing
For those who have hopes that there exists a way to ignore find's opinionatedness, let me crush that hope with some source:
- https://git.savannah.gnu.org/cgit/findutils.git/tree/find/parser.c?h=v4.6.0#n2847
- https://git.savannah.gnu.org/cgit/findutils.git/tree/find/parser.c?h=v4.6.0#n2944
From that we see that there seems to be no way to turn off the path checking. The exact rule it checks is: fail if any entry in PATH is empty or does not start with /.
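That rule can be sketched in portable shell. This is a hypothetical re-implementation for illustration, not find's actual code; the function name is made up:

```shell
# Return 0 (safe) only if every $PATH entry is non-empty and absolute,
# mirroring the rule described above.
path_is_safe() {
  case ":$1:" in
    *::*) return 1 ;;                  # empty entry, e.g. leading/trailing colon
  esac
  old_ifs=$IFS; IFS=:
  for entry in $1; do                  # split on colons
    case "$entry" in
      /*) ;;                           # absolute entry: fine
      *)  IFS=$old_ifs; return 1 ;;    # relative entry: -execdir would refuse
    esac
  done
  IFS=$old_ifs
  return 0
}

path_is_safe '/usr/bin:/bin' && echo safe      # prints "safe"
path_is_safe '/usr/bin:.'    || echo unsafe    # prints "unsafe"
```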