urllib2.URLError: <urlopen error [Errno 11004] getaddrinfo failed>

If I run:

urllib2.urlopen('http://google.com')

I get the error above, and the same thing happens even if I use a different URL.

I'm pretty sure there is no firewall running on my computer or router, and the internet (from a browser) works fine.


Solution 1:

The problem, in my case, was that some installer at some point had defined an http_proxy environment variable on my machine, even though I had no proxy.

Removing the http_proxy environment variable fixed the problem.
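If it helps, here is a minimal sketch of checking for and clearing the variable from within Python before retrying; the variable names listed are the usual suspects, and deleting them from os.environ only affects the current process:

import os
import urllib2

# A leftover proxy variable makes urllib2 try to reach a proxy host that may
# no longer exist, which surfaces as the same getaddrinfo error.
for var in ('http_proxy', 'HTTP_PROXY', 'https_proxy', 'HTTPS_PROXY'):
    if var in os.environ:
        print 'clearing', var, '=', os.environ[var]
        del os.environ[var]

# With no proxy variables set, urllib2 connects directly again.
print urllib2.urlopen('http://google.com').getcode()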

Solution 2:

The site's DNS record is such that Python's DNS lookup fails in a peculiar way: it finds the entry but gets zero associated IP addresses. (Verify with nslookup.) Hence error 11004, WSANO_DATA.

Prefix the site with 'www.' and try the request again. (Use nslookup to verify that its result is different, too.)
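As a quick check from Python, socket.getaddrinfo shows the difference directly. This is only a sketch; 'example-site.com' is a placeholder for the host that fails for you:

import socket

# On Windows, a name that exists in DNS but has no address records raises
# gaierror 11004 (WSANO_DATA); the 'www.' form typically resolves fine.
for host in ('example-site.com', 'www.example-site.com'):
    try:
        print host, '->', socket.getaddrinfo(host, 80)[0][4][0]
    except socket.gaierror as e:
        print host, 'failed:', e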

This fails essentially the same way with the Python Requests module:

requests.exceptions.ConnectionError: HTTPConnectionPool(host='...', port=80): Max retries exceeded with url: / (Caused by : [Errno 11004] getaddrinfo failed)
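For completeness, a sketch of how to reproduce and catch that with Requests (again, 'example-site.com' is a placeholder for the failing host):

import requests

try:
    requests.get('http://example-site.com/', timeout=10)
except requests.exceptions.ConnectionError as e:
    # The wrapped error includes the same [Errno 11004] getaddrinfo failure.
    print e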

Solution 3:

This may not help if it's a network-level issue, but you can get some debugging information by setting debuglevel on httplib. Try this:

import urllib, urllib2, httplib

url = 'http://www.mozillazine.org/atom.xml'

# Print HTTP-level debug output (request and response headers) for
# connections made through httplib.
httplib.HTTPConnection.debuglevel = 1

print "urllib"
data = urllib.urlopen(url)

print "urllib2"
request = urllib2.Request(url)
opener = urllib2.build_opener()
feeddata = opener.open(request).read()

This is copied directly from here; I hope that's kosher: http://bytes.com/topic/python/answers/517894-getting-debug-urllib2