Download entire album from Google+?

Is there any software that can download all the full-size pictures of a publicly-shared Google+ album?


As suggested by @vurdalakov

With this tool: http://code.google.com/p/gpalbum/
and a copy of wget, it is possible to have the files in seconds:

1. Download, unzip, and run the gpalbum program
2. Visit the album and copy the URL out of the address bar
3. Paste the URL into gpalbum
4. Click "Get Image URLs"
5. Click "Copy Image URLs to clipboard"
6. Paste the URLs into any text editor
7. Replace all "https" with "wget https"
8. Save it as a batch file
9. Run it

All the images are now downloaded locally. I have been looking for a way to do this for a while; I just tried it and it is fantastic.

FYI, each line looks something like this:

wget https://lh5.googleusercontent.com/FULLIMAGEPATH.jpg

The specific version used for testing was 1.00 (update: 1.03 also tested).
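
If you do not have wget available, a minimal Python sketch can stand in for steps 7 to 9. The file name urls.txt and the downloads folder are assumptions here: paste the copied URLs into urls.txt, one per line, and run the script instead of a batch file.

import os
import urllib.request

# Assumed names: urls.txt holds one image URL per line (pasted from
# gpalbum), and downloads/ is the target folder for the images.
os.makedirs('downloads', exist_ok=True)

with open('urls.txt') as f:
    for line in f:
        url = line.strip()
        if not url:
            continue
        # Use the last path segment of the URL as the local file name.
        name = url.rsplit('/', 1)[-1]
        urllib.request.urlretrieve(url, os.path.join('downloads', name))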


Use the following commands in a bash shell (e.g. a Linux terminal), pasting your album URL at the end of the first one:

wget -O album.html 'https://plus.google.com/photos/XXX/albums/YYY?authkey=ZZZ'
grep '"https://lh..googleusercontent.com/.*",\[' album.html | sed 's%,"\(https://lh..googleusercontent.com/.*\)",\[".*%\1=w0-h0%' >images.txt
wget -i images.txt --content-disposition

That gets you all of them, in full size (but without EXIF data)! This even works for private albums (visibility: ‘anyone with the link’).
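
If grep and sed are not available (e.g. on Windows), here is a rough Python sketch of the same extraction step. It assumes the album page was already saved as album.html with the wget command above, and it approximates the grep/sed pipeline rather than matching it byte for byte:

import re

# Read the album page saved by: wget -O album.html <album URL>
with open('album.html', encoding='utf-8') as f:
    html = f.read()

# Match the same pattern the grep/sed pipeline targets:
# "https://lhN.googleusercontent.com/...",[
urls = re.findall(r'"(https://lh.\.googleusercontent\.com/[^"]*)",\[', html)

with open('images.txt', 'w') as out:
    for url in urls:
        # The =w0-h0 suffix requests the original, full-size image.
        out.write(url + '=w0-h0\n')

Then run the final wget -i images.txt --content-disposition command as before.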


Update to the latest version of Picasa, then start Picasa, and check the top right side of the window to make sure you are logged into Google Plus.

Then click:

File > Import from Google+ Photos


I had to do this today since they are closing Google Plus in a couple of days, and a deceased friend of mine had shared my birthday photos with me, so I wanted to download a copy before they shut down. What I ended up doing was looking at the album source at https://get.google.com/albumarchive/<user_id> (I had to get the user_id from their Google Plus page: https://plus.google.com/<user_id>/).

At the bottom of the source file for each album, look for "album_id", [list], where album_id is the ID that appears in the URL of each album on the albumarchive page.

You can then use a script to read the [list] as JSON and iterate through it. For example, I wrote a Python script to download all the images:

import json
import urllib.request

# file.json holds the [list] copied from the album source
with open('file.json') as jf:
    images = json.load(jf)

for image in images:
    url = image[1]  # index 1 holds the image URL
    # index 11 holds the file name; save under the target directory
    urllib.request.urlretrieve(url, '<path_to_save_directory>' + image[11])
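
A note on running it: the script assumes Python 3 (for urllib.request) and that the [list] portion of the album source has been saved verbatim into file.json first.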