Is there a way to avoid positional arguments in bash?

I have to write a function in bash. The function will take about 7 arguments. I know that I can call a function with parameters like this:

function_name $arg1 $arg2

And I can refer to the parameters inside the function like this:

function_name () {
   echo "Parameter #1 is $1"
}

My question is: is there a better way to refer to the parameters inside the function? Can I avoid the $1, $2, $3, ... business and simply use $arg1, $arg2, ...?

Is there a proper method for this or do I need to re-assign these parameters to some other variables inside the function? E.g.:

function_name () {
   ARG1=$1
   echo "Parameter #1 is $ARG1"
}

Any example would be much appreciated.


The common way of doing that is assigning the arguments to local variables in the function, i.e.:

copy() {
    local from=${1}
    local to=${2}

    # ...
}
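If some of those arguments are required or have sensible defaults, bash parameter expansion can enforce that right at the assignment. A sketch (the usage message and the default path are made up for illustration):

```shell
copy() {
    # ${1:?...} makes bash abort with the given message if $1 is missing or empty
    local from=${1:?usage: copy FROM [TO]}
    # ${2:-...} substitutes a default when $2 is missing or empty
    local to=${2:-/tmp/backup}

    echo "copying ${from} to ${to}"
}
```

So `copy /tmp/a` prints `copying /tmp/a to /tmp/backup`, while calling `copy` with no arguments fails with the usage message.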

Another solution may be getopts-style option parsing.

copy() {
    local arg from to
    local OPTIND=1    # reset so repeated calls in the same shell parse correctly

    while getopts 'f:t:' arg
    do
        case ${arg} in
            f) from=${OPTARG};;
            t) to=${OPTARG};;
            *) return 1 # illegal option
        esac
    done
}

copy -f /tmp/a -t /tmp/b

Sadly, the getopts builtin can't handle long options, which would be more readable, e.g.:

copy --from /tmp/a --to /tmp/b

For that, you either need to use the external getopt program (which, as far as I know, has long option support only in its GNU/util-linux implementation) or implement a long option parser by hand, e.g.:

copy() {
    local from to

    while [[ ${1} ]]; do
        case "${1}" in
            --from)
                from=${2}
                shift
                ;;
            --to)
                to=${2}
                shift
                ;;
            *)
                echo "Unknown parameter: ${1}" >&2
                return 1
        esac

        if ! shift; then
            echo 'Missing parameter argument.' >&2
            return 1
        fi
    done
}

copy --from /tmp/a --to /tmp/b
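For completeness, here is a sketch of the external-getopt route. It relies on the enhanced getopt shipped with util-linux (standard on GNU/Linux; the BSD/macOS getopt lacks long-option support), and reuses the same option names as above:

```shell
copy() {
    local from to parsed

    # GNU getopt rewrites the arguments into a canonical, quoted form;
    # -o '' declares no short options, -l declares long options taking values
    parsed=$(getopt -o '' -l from:,to: -- "$@") || return 1
    eval set -- "$parsed"

    while true; do
        case $1 in
            --from) from=$2; shift 2 ;;
            --to)   to=$2; shift 2 ;;
            --)     shift; break ;;
        esac
    done

    echo "from=${from} to=${to}"
}

copy --from /tmp/a --to /tmp/b
```

A nice side effect is that getopt reports unknown or incomplete options for you, so the hand-written error handling goes away.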

Also see: using getopts in bash shell script to get long and short command line options


You can also be lazy, and just pass the 'variables' as arguments to the function, i.e.:

copy() {
    local "${@}"

    # ...
}

copy from=/tmp/a to=/tmp/b

and you'll have ${from} and ${to} in the function as local variables.

Just note that the same caveat applies as with the environment-passing approaches below: if a particular variable is not passed, it will be inherited from the parent environment. You may want to add a 'safety line' like:

copy() {
    local from to    # reset first
    local "${@}"

    # ...
}

to ensure that ${from} and ${to} will be unset when not passed.
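To see the difference the reset makes, here is a small demonstration (the echo line and the ${var-unset} fallbacks are only for illustration):

```shell
copy() {
    local from to    # declared but unset: outer values can no longer leak in
    local "$@"

    echo "from=${from-unset} to=${to-unset}"
}

from=/outer        # without the reset line, this would leak into copy()
copy to=/tmp/b     # prints: from=unset to=/tmp/b
```

Without the `local from to` line, the same call would print `from=/outer to=/tmp/b`.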


And if you don't mind doing something rather bad, you could also assign the arguments as global variables when invoking the function, i.e.:

from=/tmp/a to=/tmp/b copy

Then you could just use ${from} and ${to} within the copy() function. Just note that you should then always pass all parameters. Otherwise, a random variable may leak into the function.

from= to=/tmp/b copy   # safe
to=/tmp/b copy         # unsafe: ${from} may be declared elsewhere

If you have bash 4.0 or newer, you can also try using associative arrays. They will allow you to pass named arguments, but it will be ugly. Something like:

declare -A args=( [from]=/tmp/a [to]=/tmp/b )
copy args

And then in copy(), you'd need to grab the array.
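With bash 4.3 or newer, "grabbing the array" can be done with a nameref (local -n), which makes a local name an alias for the caller's array. A sketch reusing the names from above (the underscore prefix just avoids a clash with the caller's variable name):

```shell
copy() {
    # _args becomes an alias for the array whose *name* was passed as $1
    local -n _args=$1

    echo "from=${_args[from]} to=${_args[to]}"
}

declare -A args=( [from]=/tmp/a [to]=/tmp/b )
copy args
```

Note that the function receives the array's name, not its contents, so the caller's array must be in scope when the function runs.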


You can always pass things through the environment:

#!/bin/sh
foo() {
  echo arg1 = "$arg1"
  echo arg2 = "$arg2"
}

arg1=banana arg2=apple foo

All you have to do is name the variables on the way into the function call.

function test() {
    echo "$a"
}

a='hello world' test
# prove the variable didn't leak
echo "$a" .


This isn't just a feature of functions; you could have that function in its own script, call a='hello world' test.sh, and it would work just the same.


As an extra little bit of fun, you can combine this method with positional arguments (say you were making a script and some users mightn't know the variable names).
Heck, why not let it have defaults for those arguments too? Well sure, easy peasy!

function test2() {
    [[ -n "$1" ]] && local a="$1"; [[ -z "$a" ]] && local a='hi'
    [[ -n "$2" ]] && local b="$2"; [[ -z "$b" ]] && local b='bye'
    echo $a $b
}

#see the defaults
test2

#use positional as usual
test2 '' there
#use named parameter
a=well test2
#mix it up
b=one test2 nice

#prove variables didn't leak
echo $a $b .


Note that if test were its own script, you wouldn't need to use the local keyword.


Shell functions have full access to any variable available in their calling scope, except for those variable names that are used as local variables inside the function itself. In addition, any non-local variable set within a function is available on the outside after the function has been called. Consider the following example:

A=aaa
B=bbb

echo "A=$A B=$B C=$C"

example() {
    echo "example(): A=$A B=$B C=$C"

    A=AAA
    local B=BBB
    C=CCC

    echo "example(): A=$A B=$B C=$C"
}

example

echo "A=$A B=$B C=$C"

This snippet has the following output:

A=aaa B=bbb C=
example(): A=aaa B=bbb C=
example(): A=AAA B=BBB C=CCC
A=AAA B=bbb C=CCC

The obvious disadvantage of this approach is that functions are not self-contained any more and that setting a variable outside a function may have unintended side-effects. It would also make things harder if you wanted to pass data to a function without assigning it to a variable first, since this function is not using positional parameters any more.

The most common way to handle this is to use local variables for arguments and any temporary variable within a function:

example() {
   local A="$1" B="$2" C="$3" TMP="/tmp"

   ...
}

This way the function's working variables never leak into, or collide with, the caller's namespace.