bash: start multiple chained commands in background

Solution 1:

I haven't tested this but how about

(touch .file1.lock; cp bigfile1 /destination; rm .file1.lock) &

The parentheses run the sequence in a subshell, but that shouldn't hurt here.
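The answer above is untested, but the pattern does work; a minimal runnable sketch using placeholder paths under /tmp (the file names are illustrative, not from the question):

```shell
#!/usr/bin/env bash
# Placeholder setup so the sketch runs end to end.
mkdir -p /tmp/demo-dest
echo "data" > /tmp/bigfile1

# Run the whole chain in one background subshell; the lock file
# exists only while the copy is in flight.
(touch /tmp/.file1.lock; cp /tmp/bigfile1 /tmp/demo-dest/; rm /tmp/.file1.lock) &

wait   # block until the background chain finishes
```

After `wait` returns, the copy is complete and the lock file is gone.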

Solution 2:

Thanks Hugh, that did it:

adrianp@frost:~$ (echo "started"; sleep 15; echo "stopped")
started
stopped
adrianp@frost:~$ (echo "started"; sleep 15; echo "stopped") &
started
[1] 7101
adrianp@frost:~$ stopped

[1]+  Done                    ( echo "started"; sleep 15; echo "stopped" )
adrianp@frost:~$ 

The other ideas don't work because they start each command in the background, not the command sequence as a whole (which is important in my case!).
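The distinction matters for ordering: backgrounding each command separately lets them race, while backgrounding the subshell keeps the sequence intact. A small sketch (file path is illustrative):

```shell
#!/usr/bin/env bash
out=/tmp/order.txt
: > "$out"   # truncate the output file

# Background the whole sequence: "first" is always written before "second",
# even though the group itself runs in the background.
(sleep 0.2; echo first >> "$out"; echo second >> "$out") &

wait         # let the background sequence finish
cat "$out"
```

Had each `echo` been backgrounded individually (`echo first >> "$out" & echo second >> "$out" &`), the two writes could land in either order.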

Thank you again!

Solution 3:

Another way is to use the following syntax:

{ command1; command2; command3; } &
wait

Note that the & goes at the end of the command group, not after each command. The semicolon after the final command is required, as is the space after the opening brace and before the closing brace. The wait at the end ensures that the parent process is not killed before the spawned child process (the command group) finishes.
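If you need to wait for one specific group rather than all background jobs, you can capture its PID via `$!` right after launching it; a minimal sketch (file path is illustrative):

```shell
#!/usr/bin/env bash
# Launch a command group in the background and remember its PID.
{ sleep 0.1; echo done > /tmp/group.out; } &
pid=$!

wait "$pid"      # block on just that group
status=$?        # exit status of the group
echo "group exited with $status"
```

Plain `wait` with no arguments, as in the answer, waits for every background child instead.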

You can also do fancy stuff like redirecting stdout and stderr for the whole group, e.g. into a single log file:

{ command1; command2; command3; } > output.log 2>&1 &

Your example would look like:

forloop() {
    { touch .file1.lock; cp bigfile1 /destination; rm .file1.lock; } &
}
forloop
# ... do some other concurrent stuff
wait # wait for children to finish

Solution 4:

for command in $commands
do
    "$command" &
done
wait

The ampersand at the end of each command runs it in the background, and the wait blocks until all the background tasks have completed.
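One caveat: `for command in $commands` relies on word splitting, so it only handles single-word commands. A bash array handles command lines with arguments; a sketch with illustrative placeholder commands:

```shell
#!/usr/bin/env bash
# Each array element is one full command line (placeholders for illustration).
commands=("echo hello" "echo world")

: > /tmp/par.out
for cmd in "${commands[@]}"; do
    bash -c "$cmd" >> /tmp/par.out &   # run each command string in the background
done
wait   # block until every background job has finished
```

Both lines end up in /tmp/par.out, though their order is not guaranteed since the jobs run concurrently.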