I tried running this:

>>> urllib2.urlopen('http://tycho.usno.navy.mil/cgi-bin/timer.pl')

But it gives the error below. Can anyone suggest a solution?

Traceback (most recent call last):
  File "<pyshell#11>", line 1, in <module>
    urllib2.urlopen('http://tycho.usno.navy.mil/cgi-bin/timer.pl')
  File "C:\Python26\lib\urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "C:\Python26\lib\urllib2.py", line 391, in open
    response = self._open(req, data)
  File "C:\Python26\lib\urllib2.py", line 409, in _open
    '_open', req)
  File "C:\Python26\lib\urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "C:\Python26\lib\urllib2.py", line 1161, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "C:\Python26\lib\urllib2.py", line 1136, in do_open
    raise URLError(err)
URLError: <urlopen error [Errno 11001] getaddrinfo failed>
+3  A: 

Double-check whether the domain is accessible.

I am getting a 504 Gateway Timeout error for tycho.usno.navy.mil at the moment.

It looks like the site is down; downforeveryoneorjustme.com also says:

It's not just you! http://tycho.usno.navy.mil looks down from here.

That's why getaddrinfo is failing.
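
If you want to confirm from Python whether the hostname resolves at all (Errno 11001 on Windows means the getaddrinfo DNS lookup failed), a minimal sketch using only the standard socket module could look like this:

import socket

# Try to resolve the hostname; getaddrinfo is the same call urllib2
# relies on, so this separates DNS problems from HTTP-level ones.
try:
    socket.getaddrinfo('tycho.usno.navy.mil', 80)
    print "Hostname resolves; the site itself may still be down."
except socket.gaierror as e:
    print "DNS lookup failed:", e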

S.Mark
Confirmed from an undisclosed location near Earth.
msw
A: 

Wrapping the call in try..except helps keep things neat:

import urllib2
from urllib2 import URLError

try:
    urllib2.urlopen('http://tycho.usno.navy.mil/cgi-bin/timer.pl')
except URLError:
    print "Error opening URL"
reech