Execute curl requests in parallel in bash

Append '&' to a command to run it in the background, and use 'wait' to block until all background jobs have finished. Wrap the commands in '( )' if you need them to run in a subshell.

#!/bin/bash

curl -s -o foo http://example.com/file1 && echo "done1" &
curl -s -o bar http://example.com/file2 && echo "done2" & 
curl -s -o baz http://example.com/file3 && echo "done3" &

wait
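A small extension of the snippet above: collect each background job's PID so you can check every download's exit status individually, rather than just waiting blindly. The URLs are the same placeholders as in the answer, and the 5-second timeout is an assumption for illustration:

```shell
#!/bin/bash
# Start each download in the background and record its PID
pids=()
for url in http://example.com/file1 http://example.com/file2; do
  curl -s -m 5 -o "/tmp/$(basename "$url")" "$url" &
  pids+=($!)
done

# Wait on each PID individually to collect per-job exit codes
failed=0
for pid in "${pids[@]}"; do
  wait "$pid" || failed=$((failed + 1))
done
echo "downloads that failed: $failed"
```

'wait' with an explicit PID returns that job's exit status, which a bare 'wait' discards.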

xargs has a '-P' option to run up to that many processes in parallel, and '-n 1' passes one argument per invocation. For example:

wget -nv http://en.wikipedia.org/wiki/Linux -O- | grep -Eo "http://[^[:space:]]*\.jpg" | xargs -P 10 -r -n 1 wget -nv

Reference: http://www.commandlinefu.com/commands/view/3269/parallel-file-downloading-with-wget
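The same '-P' approach works with curl and a plain list of URLs. A minimal sketch, assuming a file named urls.txt with one URL per line (here generated inline with placeholder URLs):

```shell
#!/bin/bash
# Hypothetical input file: one URL per line
printf '%s\n' \
  http://example.com/file1 \
  http://example.com/file2 > urls.txt

# Fetch up to 4 URLs at a time; -n 1 gives each curl one URL,
# -O saves each file under its remote name, -m 5 caps each attempt
xargs -P 4 -n 1 curl -s -m 5 -O < urls.txt
echo "all downloads attempted"
```

xargs exits nonzero (status 123) if any curl invocation fails, so you still get an overall pass/fail signal.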


I use GNU parallel for tasks like this. It gives you a job limit, per-job output grouping, and retries without hand-rolling the PID bookkeeping.
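A minimal sketch of the GNU parallel approach, assuming placeholder URLs; the 'command -v' guard is only there so the script degrades gracefully when parallel is not installed:

```shell
#!/bin/bash
# Hypothetical URL list; -j 4 limits concurrency to four jobs
urls="http://example.com/file1 http://example.com/file2"

if command -v parallel >/dev/null 2>&1; then
  # {} is replaced by each argument; ::: supplies the arguments inline
  # shellcheck disable=SC2086  # word splitting of $urls is intentional
  parallel -j 4 curl -s -m 5 -O {} ::: $urls
else
  echo "GNU parallel is not installed"
fi
```

You can also pipe a URL list in ('parallel -j 4 curl -s -O {} < urls.txt') instead of using ':::'.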