Problem with spaces in file names
I want to do something repeatedly on a list of files. The files in question have spaces in their names:
david@david: ls -l
total 32
-rw-rw-r-- 1 david david 13 May 8 11:55 haha
-rw-rw-r-- 1 david david 0 May 8 11:55 haha~
-rw-rw-r-- 1 david david 13 May 8 11:55 haha (3rd copy)
-rw-rw-r-- 1 david david 13 May 8 11:55 haha (4th copy)
-rw-rw-r-- 1 david david 13 May 8 11:55 haha (5th copy)
-rw-rw-r-- 1 david david 13 May 8 11:55 haha (6th copy)
-rw-rw-r-- 1 david david 13 May 8 11:55 haha (7th copy)
-rw-rw-r-- 1 david david 13 May 8 11:55 haha (another copy)
-rw-rw-r-- 1 david david 13 May 8 11:55 haha (copy)
Now I want to stat each of these files:
david@david: echo '
for file in $(ls)
do
stat $file
done' | bash
(I use echo and a pipe in order to write multi-line commands.)
When I do that, it works correctly on those files that do not have any spaces in their names. But the others...
stat: cannot stat ‘(another’: No such file or directory
stat: cannot stat ‘copy)’: No such file or directory
Changing $(ls) to "$(ls)", or $file to "$file", does not work. What can I do?
Edit:
echo '
for files in *
do
stat "$files"
done' | bash
does the trick! As I'm new to bash, I want to keep things as simple as possible, so nothing involving escaping spaces, xargs, or the read -r solution, although they do solve the problem.
As some have asked: yes, using this instead of stat * is weird. But I just wanted to find a general way to apply the same command to a bunch of file names in bash, using a for loop. So stat could stand for gzip, gpg or rm.
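For example, the same glob-based loop from the edit above works unchanged with any of those commands; here is a minimal sketch using gzip (any command that takes file name arguments would do):

for file in *
do
  gzip -- "$file"    # the quotes keep names with spaces in one piece; -- guards against names starting with a dash
done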
Solution 1:
The extra level of quoting introduced by the echo '...' | bash construct is complicating things.
You can just use:
for f in *; do stat -- "$f"; done
But you can also simply use:
stat -- *
...and if you want to collect the files first and then apply the command (why?), you can go with (but be careful with files containing newlines... (1)):
for f in *; do echo "$f"; done | xargs stat --
...and if you want hidden files too, just use * .* as a pattern, but then remember that . and .. will be in the set.
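If you do go with * .*, one way to skip the . and .. entries explicitly is the following sketch:

for f in * .*
do
  [ "$f" = . ] && continue     # skip the current directory entry
  [ "$f" = .. ] && continue    # skip the parent directory entry
  stat -- "$f"
done

In bash you can instead enable shopt -s dotglob, after which a plain * also matches hidden files and never yields . or .. at all.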
As an aside, you shouldn't parse ls output.
(1) but if you have file names with newlines, you somewhat deserve it... ;-)
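To see why parsing ls output goes wrong, here is a quick demonstration in an otherwise empty test directory (the file name is invented purely to show the breakage):

touch 'one
two'                                    # a single file whose name contains a newline
for f in $(ls); do echo "<$f>"; done    # word splitting yields two bogus items: <one> and <two>
for f in *; do echo "<$f>"; done        # the glob yields the one real name, newline included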
Solution 2:
On a side note: you can split long / complicated commands over multiple lines by ending each line with a backslash and hitting Enter every time you want to continue on a new line, instead of forking extra processes with echo [...] | bash; also, you should enclose $file in double quotes to prevent stat from breaking on filenames containing spaces:
for file in $(ls); \
do \
stat "$file"; \
done
The remaining problem is that $(ls) is subject to word splitting, so filenames containing spaces are broken into several words; "$(ls)" does not help either, because then the entire ls output becomes a single word.
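You can see the difference by printing what each form actually iterates over (a small sketch, run in the directory from the question):

for f in $(ls); do printf '<%s>\n' "$f"; done      # names with spaces are split into several items
for f in "$(ls)"; do printf '<%s>\n' "$f"; done    # the whole ls output becomes one single item
for f in *; do printf '<%s>\n' "$f"; done          # one item per file, spaces preserved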
Even after solving this problem, this method will still break on filenames containing backslashes and on filenames containing newlines (as pointed out by terdon).
A solution for the backslash problem is to pipe the output of find to a while loop running read -r, so that at each iteration read -r stores one line of find's output into $file without interpreting backslashes; note that filenames containing newlines will still be split, because find separates its output with newlines:
find . -maxdepth 1 -type f | while read -r file; do \
stat "$file"; \
done
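If you also need to cope with newlines in file names, a NUL-delimited variant of the same idea should work (a sketch, assuming GNU find and bash):

find . -maxdepth 1 -type f -print0 | while IFS= read -r -d '' file
do
  stat "$file"    # IFS= keeps leading/trailing spaces, -d '' makes read stop at each NUL byte
done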
Solution 3:
Use the good old find; it works with hidden files, newlines and spaces.
find . -print0 | xargs -I {} -0 stat {}
or any other command instead of stat:
find . -print0 | xargs -I {} -0 file {}
find . -print0 | xargs -I {} -0 cat {}
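As a side note, -I {} makes xargs run the command once per file; since stat, file and cat all accept several file arguments at once, you can also let xargs batch them, for example (the -- guards against names starting with a dash):

find . -print0 | xargs -0 stat --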