How can I use a command line tool like Wget to log into an OpenID site?

Specifically, I would like to be able to download certain pages from my user profile on the various Stack Exchange sites. I would, however, like to do this automatically (using a cron job), from the command line and in a parsable format. I much prefer using Linux for this, but I could get access to a Mac or Windows machine if necessary.
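For concreteness, the kind of cron job I have in mind would be something like this (the script path and schedule are just placeholders for whatever solution ends up working):

# hypothetical crontab entry: run a (yet-to-be-written) fetch script hourly
0 * * * * /home/terdon/bin/fetch-profile.sh >> /home/terdon/stack/fetch.log 2>&1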

Ideally, I would like to use a tool like Wget or cURL to fetch the pages. I don't know how to get past the login, though. I have seen suggestions that you can log in via Firefox, export the relevant cookie and import it into Wget through its --load-cookies option, for example here and here. While this works if I have just logged in, it stops working after a while, I assume because the ID token has to be refreshed.

So, just after logging in to SU and exporting my cookies, I can do:

wget --load-cookies cookies.txt \
  https://superuser.com/users/151431/terdon?tab=responses
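
For reference, the exported cookies.txt is in the Netscape cookie-file format that Wget expects: one tab-separated line per cookie, with the fifth field being the expiry time as a Unix timestamp (the cookie name and value below are made-up placeholders, not the real Stack Exchange cookies):

# Netscape HTTP Cookie File
# domain         subdomains  path  secure  expiry      name        value
superuser.com    FALSE       /     TRUE    1375862654  session_id  PLACEHOLDERVALUE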

After a few minutes, though, I get a 404 error:

wget -O ~/stack/$(date +%s) --load-cookies ~/cookies.txt \
   https://superuser.com/users/151431/terdon?tab=responses

--2013-08-06 04:04:14--  https://superuser.com/users/151431/terdon?tab=responses
Resolving superuser.com (superuser.com)... 198.252.206.16
Connecting to superuser.com (superuser.com)|198.252.206.16|:80... connected.
HTTP request sent, awaiting response... 404 Not Found
2013-08-06 04:04:15 ERROR 404: Not Found.

So, how can I automatically log in to an OpenID enabled website from the command line?


PS. I think this is better suited here than on Web Applications since my question is really about the command-line aspect and not the actual details of the web page in question. I would guess that any solution will be applicable to all OpenID sites.


You can't, because the cookies refresh every so often. This is done for security purposes; as far as I understand, the only way you could do this is the way you already did.
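
If you want to experiment anyway, Wget can at least re-save whatever cookies the site hands back on each run, so the file stays as fresh as the last successful request. A rough sketch (not something I've tested against Stack Exchange):

wget --load-cookies ~/cookies.txt \
     --save-cookies ~/cookies.txt \
     --keep-session-cookies \
     -O ~/stack/$(date +%s) \
     "https://superuser.com/users/151431/terdon?tab=responses"

That still won't survive the site deciding your login has expired, though; at that point you'd have to log in through the browser and export the cookies again.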