Can I download all pictures on a page?

Is there a way to run a script that takes a web page as an argument and downloads all the images from that page?


You can use an Automator workflow to download images embedded in a web page, or images linked from a web page. A good starting point for a workflow is:

  1. Get Current Webpage from Safari
  2. Get Image URLs from Webpage
  3. Download URLs

(Screenshot: downloading images from web pages with Automator on Mac OS X 10.8.)

You can also change the workflow to take a list of web pages as input.

Automator is included with Mac OS X; you'll find it in the Applications folder.
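
Once you've saved the workflow, you can also run it from Terminal with the automator command-line tool. This is just a sketch: the workflow path is a placeholder, and it assumes the workflow was saved as a regular workflow document that accepts URLs as input (per the automator man page, -i - reads the input from stdin):

echo "http://www.apple.com/itunes/" | automator -i - ~/Desktop/Download\ Images.workflow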


wget -nd -r -l1 -p -np -A jpg,jpeg,png,svg,gif -e robots=off http://www.apple.com/itunes/
  • -nd (no directories) saves all files to the current directory instead of recreating the site's directory structure
  • -r -l1 (recursive, level 1) follows links on the given page, but no deeper
  • -p (page requisites) also downloads resources, such as images, embedded in the pages that are fetched
  • -np (no parent) doesn't follow links to parent directories
  • -A (accept list) downloads or keeps only files with the listed extensions
  • -e robots=off ignores robots.txt and doesn't download a robots.txt to the current directory
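
Since the question asks for a script that takes a page as its argument, here is a minimal sketch that wraps the wget command above (the script name getimages.sh is just a placeholder):

#!/bin/bash
# Usage: ./getimages.sh http://www.apple.com/itunes/
# Downloads the images from the page given as the first argument
# into the current directory.
wget -nd -r -l1 -p -np -A jpg,jpeg,png,svg,gif -e robots=off "$1"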

If the images are on a different host or subdomain, you have to add -H to span hosts:

wget -nd -H -p -A jpg,jpeg,png,gif -e robots=off http://example.tumblr.com/page/{1..2}
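
The {1..2} is shell brace expansion, not a wget feature: the shell expands it into one URL per page before wget runs. For example:

echo http://example.tumblr.com/page/{1..2}
# prints: http://example.tumblr.com/page/1 http://example.tumblr.com/page/2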

You can also use curl:

cd ~/Desktop/
IFS=$'\n'
for u in $(curl -Ls http://example.tumblr.com/page/{1..2} |
  sed -En 's/.*src="([^"]+\.(jpe?g|png))".*/\1/p' | sort -u); do
  curl -s "$u" -O
done

-L makes curl follow Location headers (redirects), -O saves each file in the current directory under its remote name, and IFS=$'\n' makes the for loop split the list of URLs on newlines only.
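
To reuse the curl approach, you can wrap it in a shell function. This is a sketch under the same assumptions as the one-liner: the function name getimages is made up, and only absolute image URLs in src attributes will download successfully (relative paths are passed to curl as-is and will fail):

getimages() {
  # Fetch the page, extract unique absolute image URLs, download each one.
  curl -Ls "$1" |
    sed -En 's/.*src="([^"]+\.(jpe?g|png))".*/\1/p' |
    sort -u |
    while read -r u; do curl -s "$u" -O; done
}
getimages http://example.tumblr.com/page/1

Reading the URLs with while read -r instead of a for loop avoids having to change IFS.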