How to iterate over arguments in a Bash script
I have a complex command that I'd like to turn into a shell/bash script. I can write it in terms of $1 easily:
foo $1 args -o $1.ext
I want to be able to pass multiple input names to the script. What's the right way to do it?
And, of course, I want to handle filenames with spaces in them.
Solution 1:
Use "$@"
to represent all the arguments:
for var in "$@"
do
echo "$var"
done
This will iterate over each argument and print it out on a separate line. $@ behaves like $* except that, when quoted, the arguments remain properly separated even if they contain spaces:
sh test.sh 1 2 '3 4'
1
2
3 4
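Applied to the command in the question, the same loop might look like this (foo, args, and the .ext suffix come straight from the question; quoting "$var" keeps names with spaces intact):
for var in "$@"
do
    foo "$var" args -o "$var.ext"
done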
Solution 2:
Rewrite of a now-deleted answer by VonC.
Robert Gamble's succinct answer deals directly with the question. This one amplifies on some issues with filenames containing spaces.
See also: ${1:+"$@"} in /bin/sh
Basic thesis: "$@"
is correct, and $*
(unquoted) is almost always wrong.
This is because "$@"
works fine when arguments contain spaces, and
works the same as $*
when they don't.
In some circumstances, "$*"
is OK too, but "$@"
usually (but not
always) works in the same places.
Unquoted, $@
and $*
are equivalent (and almost always wrong).
So, what is the difference between $*, $@, "$*", and "$@"? They are all related to 'all the arguments to the shell', but they do different things. When unquoted, $* and $@ do the same thing. They treat each 'word' (sequence of non-whitespace) as a separate argument. The quoted forms are quite different, though: "$*" treats the argument list as a single space-separated string, whereas "$@" treats the arguments almost exactly as they were when specified on the command line. "$@" expands to nothing at all when there are no positional arguments; "$*" expands to an empty string - and yes, there's a difference, though it can be hard to perceive it. See more information below, after the introduction of the (non-standard) command al.
Secondary thesis: if you need to process arguments with spaces and then pass them on to other commands, then you sometimes need non-standard tools to assist. (Or you should use arrays, carefully: "${array[@]}" behaves analogously to "$@".)
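As a concrete illustration of that array alternative, here is a minimal bash sketch (myprog and the option filtering are placeholders, not part of the original answer); each array element expands as a single word, so spaces survive without eval:
files=()                        # collect the arguments to forward
for arg in "$@"
do
    case "$arg" in
    -*) ;;                      # ignore options in this sketch
    *)  files+=("$arg") ;;      # each element keeps embedded spaces intact
    esac
done
myprog "${files[@]}"            # expands to one word per element, like "$@"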
Example:
$ mkdir "my dir" anotherdir
$ ls
anotherdir my dir
$ cp /dev/null "my dir/my file"
$ cp /dev/null "anotherdir/myfile"
$ ls -Fltr
total 0
drwxr-xr-x 3 jleffler staff 102 Nov 1 14:55 my dir/
drwxr-xr-x 3 jleffler staff 102 Nov 1 14:55 anotherdir/
$ ls -Fltr *
my dir:
total 0
-rw-r--r-- 1 jleffler staff 0 Nov 1 14:55 my file
anotherdir:
total 0
-rw-r--r-- 1 jleffler staff 0 Nov 1 14:55 myfile
$ ls -Fltr "./my dir" "./anotherdir"
./my dir:
total 0
-rw-r--r-- 1 jleffler staff 0 Nov 1 14:55 my file
./anotherdir:
total 0
-rw-r--r-- 1 jleffler staff 0 Nov 1 14:55 myfile
$ var='"./my dir" "./anotherdir"' && echo $var
"./my dir" "./anotherdir"
$ ls -Fltr $var
ls: "./anotherdir": No such file or directory
ls: "./my: No such file or directory
ls: dir": No such file or directory
$
Why doesn't that work?
It doesn't work because the shell processes quotes before it expands variables. So, to get the shell to pay attention to the quotes embedded in $var, you have to use eval:
$ eval ls -Fltr $var
./my dir:
total 0
-rw-r--r-- 1 jleffler staff 0 Nov 1 14:55 my file
./anotherdir:
total 0
-rw-r--r-- 1 jleffler staff 0 Nov 1 14:55 myfile
$
This gets really tricky when you have file names such as "He said, "Don't do this!"" (with quotes and double quotes and spaces).
$ cp /dev/null "He said, \"Don't do this!\""
$ ls
He said, "Don't do this!" anotherdir my dir
$ ls -l
total 0
-rw-r--r-- 1 jleffler staff 0 Nov 1 15:54 He said, "Don't do this!"
drwxr-xr-x 3 jleffler staff 102 Nov 1 14:55 anotherdir
drwxr-xr-x 3 jleffler staff 102 Nov 1 14:55 my dir
$
The shells (all of them) do not make it particularly easy to handle such stuff, so (funnily enough) many Unix programs do not do a good job of handling them. On Unix, a filename (single component) can contain any characters except slash and NUL '\0'. However, the shells strongly encourage no spaces, newlines or tabs anywhere in path names. That is also why standard Unix file names do not contain spaces, etc.
When dealing with file names that may contain spaces and other troublesome characters, you have to be extremely careful, and I found long ago that I needed a program that is not standard on Unix. I call it escape (version 1.1 was dated 1989-08-23T16:01:45Z).
Here is an example of escape in use - with the SCCS control system. It is a cover script that does both a delta (think check-in) and a get (think check-out). Various arguments, especially -y (the reason why you made the change) would contain blanks and newlines. Note that the script dates from 1992, so it uses back-ticks instead of $(cmd ...) notation and does not use #!/bin/sh on the first line.
: "@(#)$Id: delget.sh,v 1.8 1992/12/29 10:46:21 jl Exp $"
#
# Delta and get files
# Uses escape to allow for all weird combinations of quotes in arguments
case `basename $0 .sh` in
deledit) eflag="-e";;
esac
sflag="-s"
for arg in "$@"
do
case "$arg" in
-r*) gargs="$gargs `escape \"$arg\"`"
dargs="$dargs `escape \"$arg\"`"
;;
-e) gargs="$gargs `escape \"$arg\"`"
sflag=""
eflag=""
;;
-*) dargs="$dargs `escape \"$arg\"`"
;;
*) gargs="$gargs `escape \"$arg\"`"
dargs="$dargs `escape \"$arg\"`"
;;
esac
done
eval delta "$dargs" && eval get $eflag $sflag "$gargs"
(I would probably not use escape quite so thoroughly these days - it is not needed with the -e argument, for example - but overall, this is one of my simpler scripts using escape.)
The escape program simply outputs its arguments, rather like echo does, but it ensures that the arguments are protected for use with eval (one level of eval; I do have a program which did remote shell execution, and that needed to escape the output of escape).
$ escape $var
'"./my' 'dir"' '"./anotherdir"'
$ escape "$var"
'"./my dir" "./anotherdir"'
$ escape x y z
x y z
$
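The escape source is not reproduced here, but in modern bash a rough stand-in can be built from the printf builtin's %q format, which quotes each argument so that it survives one level of eval. This is only a sketch of the idea, not the original program (its output uses backslash escapes rather than the single quotes shown above):
esc()
{
    printf '%q ' "$@"       # %q quotes each argument for reuse as shell input
    echo
}

var='"./my dir" "./anotherdir"'
esc $var                    # three separately quoted words
esc "$var"                  # one quoted word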
I have another program called al that lists its arguments one per line (and it is even more ancient: version 1.1 dated 1987-01-27T14:35:49). It is most useful when debugging scripts, as it can be plugged into a command line to see what arguments are actually passed to the command.
$ echo "$var"
"./my dir" "./anotherdir"
$ al $var
"./my
dir"
"./anotherdir"
$ al "$var"
"./my dir" "./anotherdir"
$
[Added:
And now to show the difference between the various "$@" notations, here is one more example:
$ cat xx.sh
set -x
al $@
al $*
al "$*"
al "$@"
$ sh xx.sh * */*
+ al He said, '"Don'\''t' do 'this!"' anotherdir my dir xx.sh anotherdir/myfile my dir/my file
He
said,
"Don't
do
this!"
anotherdir
my
dir
xx.sh
anotherdir/myfile
my
dir/my
file
+ al He said, '"Don'\''t' do 'this!"' anotherdir my dir xx.sh anotherdir/myfile my dir/my file
He
said,
"Don't
do
this!"
anotherdir
my
dir
xx.sh
anotherdir/myfile
my
dir/my
file
+ al 'He said, "Don'\''t do this!" anotherdir my dir xx.sh anotherdir/myfile my dir/my file'
He said, "Don't do this!" anotherdir my dir xx.sh anotherdir/myfile my dir/my file
+ al 'He said, "Don'\''t do this!"' anotherdir 'my dir' xx.sh anotherdir/myfile 'my dir/my file'
He said, "Don't do this!"
anotherdir
my dir
xx.sh
anotherdir/myfile
my dir/my file
$
Notice that nothing preserves the original blanks between the * and */* on the command line. Also, note that you can change the 'command line arguments' in the shell by using:
set -- -new -opt and "arg with space"
This sets 4 options, '-new', '-opt', 'and', and 'arg with space'.
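For instance, a quick way to confirm what that set command did, using only the standard printf:
set -- -new -opt and "arg with space"
echo $#                   # 4
printf '%s\n' "$@"        # one argument per line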
]
Hmm, that's quite a long answer - perhaps exegesis is the better term. Source code for escape available on request (email to firstname dot lastname at gmail dot com). The source code for al is incredibly simple:
#include <stdio.h>

int main(int argc, char **argv)
{
    while (*++argv != 0)
        puts(*argv);
    return(0);
}
That's all. It is equivalent to the test.sh script that Robert Gamble showed, and could be written as a shell function (but shell functions didn't exist in the local version of Bourne shell when I first wrote al).
Also note that you can write al as a simple shell script:
[ $# != 0 ] && printf "%s\n" "$@"
The conditional is needed so that it produces no output when passed no arguments. The printf command will produce a blank line with only the format string argument, but the C program produces nothing.
Solution 3:
Note that Robert's answer is correct, and it works in sh as well. You can (portably) simplify it even further:
for i in "$@"
is equivalent to:
for i
I.e., you don't need anything!
Testing ($ is the command prompt):
$ set a b "spaces here" d
$ for i; do echo "$i"; done
a
b
spaces here
d
$ for i in "$@"; do echo "$i"; done
a
b
spaces here
d
I first read about this in The Unix Programming Environment by Kernighan and Pike.
In bash, help for documents this:
for NAME [in WORDS ... ;] do COMMANDS; done
    If 'in WORDS ...;' is not present, then 'in "$@"' is assumed.
Solution 4:
For simple cases you can also use shift. It treats the argument list like a queue. Each shift throws the first argument out, and the index of each of the remaining arguments is decremented.
# this prints all arguments
while test $# -gt 0
do
    echo "$1"
    shift
done
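shift also makes it easy to consume arguments in pairs. A small sketch, assuming a hypothetical -o option that is always followed by its value (the option name and the bash array are illustrative, not part of the answer above):
out="default.txt"
files=()                             # bash array for the non-option arguments
while test $# -gt 0
do
    case "$1" in
    -o) out="$2"; shift; shift ;;    # consume the flag and its value
    *)  files+=("$1"); shift ;;      # consume one ordinary argument
    esac
done
echo "output file: $out"
for f in "${files[@]}"; do echo "input file: $f"; done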
Solution 5:
You can also access the arguments as array elements, for example if you don't want to iterate through all of them:
argc=$#
argv=("$@")
for (( j=0; j<argc; j++ )); do
echo "${argv[j]}"
done
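For instance, individual elements can be read directly without a loop (indices are zero-based; the last-element lookup assumes at least one argument was passed):
argc=$#
argv=("$@")
echo "first: ${argv[0]}"
echo "last:  ${argv[argc-1]}"     # arithmetic is evaluated inside the subscript
echo "count: ${#argv[@]}"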