I have installed 3 different Python scripts on my Ubuntu 10.04 32-bit machine with Python 2.6.5.

All of them use urllib2, and I always get this error:

urllib2.URLError: <urlopen error [Errno 110] Connection timed out>

Why?

Examples:

>>> import urllib2
>>> response = urllib2.urlopen("http://www.google.com")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.6/urllib2.py", line 391, in open
    response = self._open(req, data)
  File "/usr/lib/python2.6/urllib2.py", line 409, in _open
    '_open', req)
  File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.6/urllib2.py", line 1161, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 110] Connection timed out>



>>> response = urllib2.urlopen("http://search.twitter.com/search.atom?q=hello&rpp=10&page=1")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.6/urllib2.py", line 391, in open
    response = self._open(req, data)
  File "/usr/lib/python2.6/urllib2.py", line 409, in _open
    '_open', req)
  File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.6/urllib2.py", line 1161, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 110] Connection timed out>

UPDATE:

$ ping google.com
PING google.com (72.14.234.104) 56(84) bytes of data.
64 bytes from google.com (72.14.234.104): icmp_seq=1 ttl=54 time=25.3 ms
64 bytes from google.com (72.14.234.104): icmp_seq=2 ttl=54 time=24.6 ms
64 bytes from google.com (72.14.234.104): icmp_seq=3 ttl=54 time=25.1 ms
64 bytes from google.com (72.14.234.104): icmp_seq=4 ttl=54 time=25.0 ms
64 bytes from google.com (72.14.234.104): icmp_seq=5 ttl=54 time=23.9 ms
^C
--- google.com ping statistics ---
5 packets transmitted, 5 received, 0% packet loss, time 4003ms
rtt min/avg/max/mdev = 23.959/24.832/25.365/0.535 ms


$ w3m http://www.google.com
w3m: Can't load http://www.google.com.

$ telnet google.com 80
Trying 1.0.0.0...
telnet: Unable to connect to remote host: Connection timed out
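The same check telnet performs can be reproduced from Python itself, bypassing urllib2 entirely. This is just a diagnostic sketch (the host and port are examples): it resolves the name first, then attempts a plain TCP connection with a timeout.

```python
import socket

def can_connect(host, port, timeout=5):
    """Resolve `host`, then attempt a plain TCP connection (what telnet does)."""
    try:
        addr = socket.gethostbyname(host)  # DNS step; compare with what ping shows
    except socket.error as e:
        return False, "DNS lookup failed: %s" % e
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        s.connect((addr, port))
        return True, addr
    except socket.error as e:
        return False, "connect to %s failed: %s" % (addr, e)
    finally:
        s.close()

print(can_connect("google.com", 80))
```

If this fails too while ping succeeds, the problem is below Python: DNS or TCP port 80 is being interfered with, and no urllib2 setting will fix it.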

UPDATE 2:

I am at home, behind a router and an access point. However, I have just noticed that Firefox doesn't work for me either, but Chrome, Synaptic, and other browsers like Midori and Epiphany do work.

UPDATE 3:

>>> useragent = 'Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Ubuntu/10.04 Chromium/6.0.472.62 Chrome/6.0.472.62 Safari/534.3'
>>> request = urllib2.Request('http://www.google.com/')
>>> request.add_header('User-agent', useragent )
>>> urllib2.urlopen(request)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.6/urllib2.py", line 391, in open
    response = self._open(req, data)
  File "/usr/lib/python2.6/urllib2.py", line 409, in _open
    '_open', req)
  File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.6/urllib2.py", line 1161, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 110] Connection timed out>

UPDATE 4:

>>> socket.setdefaulttimeout(50)
>>> urllib2.urlopen('http://www.google.com')
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib/python2.6/urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "/usr/lib/python2.6/urllib2.py", line 391, in open
    response = self._open(req, data)
  File "/usr/lib/python2.6/urllib2.py", line 409, in _open
    '_open', req)
  File "/usr/lib/python2.6/urllib2.py", line 369, in _call_chain
    result = func(*args)
  File "/usr/lib/python2.6/urllib2.py", line 1161, in http_open
    return self.do_open(httplib.HTTPConnection, req)
  File "/usr/lib/python2.6/urllib2.py", line 1136, in do_open
    raise URLError(err)
urllib2.URLError: <urlopen error [Errno 110] Connection timed out>

UPDATE 5:

Wireshark (packet sniffer) results:

Firefox: http://bit.ly/chtynm

Chrome: http://bit.ly/9ZjILK

Midori: http://bit.ly/cKilC4

Midori is another browser that works for me. Only Firefox doesn't work.

+1  A: 

To what URL are you trying to connect? There could be any number of reasons for this error, most of them having to do with either an incorrect name or IP address or a problem with your link to the remote host.
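For what it's worth, the [Errno 110] in that message is the operating system's ETIMEDOUT: the TCP handshake itself is timing out before any HTTP traffic happens, which already points below urllib2. A quick way to confirm the mapping (the numeric value 110 is Linux-specific):

```python
import errno

# On Linux, errno 110 maps to ETIMEDOUT: the kernel gave up waiting for
# the remote end to answer the TCP handshake.
print(errno.errorcode[errno.ETIMEDOUT])  # -> ETIMEDOUT
```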

jathanism
I have written 2 examples above :)
xRobot
Are you able to successfully perform `telnet google.com 80` from a terminal? What happens?
jathanism
$ telnet google.com 80
Trying 1.0.0.0...
telnet: Unable to connect to remote host: Connection timed out
xRobot
Something is weird with your name resolution. It looks like you're probably being proxied in some way. Since your `ping google.com` showed the IP address for google.com as `72.14.234.104`, try `telnet 72.14.234.104 80`. If that returns a `Connected to ...` message, you know you can make direct connections, but you have some other system-level issues (namely proxying and DNS name resolution) to solve before you can successfully connect using `urllib2`, which relies on your system to return accurate name-to-IP resolution.
jathanism
$ telnet 72.14.234.104 80
Trying 72.14.234.104...
telnet: Unable to connect to remote host: Connection timed out
xRobot
A: 

Have you tested your network connection? Something on the other end is not responding: either the connection is being dropped or it is being refused.

Also, post the version of Python you're using.

UPDATE:

This is almost certainly a network issue. I also have an Ubuntu 10.04 machine (32-bit) with Python 2.6.5 that's a nearly pristine install, and I am unable to reproduce the problem.

Python 2.6.5 (r265:79063, Apr 16 2010, 13:09:56)
[GCC 4.4.3] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> import urllib2
>>> response = urllib2.urlopen("http://www.google.com")
>>> print response.read(100)
<!doctype html><html><head><meta http-equiv="content-type" content="text/html; charset=ISO-8859-1"><
Andrew Sledge
I have written 2 examples above, and the Python version :)
xRobot
Looks like port 80 may be blocked: you can ping and resolve the address, but you cannot pull the web page. Check your firewall.
Andrew Sledge
Chrome works... Firefox doesn't. I don't have a firewall installed :-\
xRobot
+3  A: 

As suggested, troubleshoot the network setup first.

First, check that you can ping the host you're trying to connect to:

$ ping www.google.com

Then try an HTTP connection using, for instance, w3m:

$ w3m http://www.google.com
codeape
Updated above :)
xRobot
+1  A: 

I can think of only one reason right now, xRobot: they don't trust you.

Who are "they"? They :)

When you want to do some crawling or scraping and you see that they don't trust you, you just have to fool them. How?

First of all, you should know that some web servers filter out clients they consider malicious, such as robots (maybe they know you are a robot, hmmm, xRobot :) ). How do they do that? There are many ways to filter: using a CAPTCHA in the web page, filtering by User-Agent, ...

And because your ICMP ping works and the Chrome browser works, but w3m doesn't, I suggest you change the User-Agent like this:

user_agent = ('Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.2.10) Gecko/20100915 '
              'Ubuntu/10.04 (lucid) Firefox/3.6.10')

request = urllib2.Request('http://www.google.com/')
request.add_header('User-Agent', user_agent)

response = urllib2.urlopen(request)

Maybe I'm getting paranoid here, but hopefully this can help you :)

singularity
I get the same error: urllib2.URLError: <urlopen error [Errno 110] Connection timed out> :(
xRobot
You said that Firefox doesn't work either. Try it with a Chrome header (this one is a Chromium header: __Mozilla/5.0 (X11; U; Linux x86_64; en-US) AppleWebKit/534.3 (KHTML, like Gecko) Ubuntu/10.04 Chromium/6.0.472.62 Chrome/6.0.472.62 Safari/534.3__), and try the command `dig google.com` to make sure your DNS works fine.
singularity
Same error :(. See above... I have edited the question at the end.
xRobot
This is getting harder :). Set a bigger timeout by putting this code first: `import socket; socket.setdefaulttimeout(10)`. And now? Also, what do you mean Firefox doesn't work? For google.com or for all websites?
singularity
I have set a bigger timeout, but I get the same error :(. Firefox doesn't work for any website. Firefox says this: "The connection to the server timed out: www.youtube.com is taking too long to respond."
xRobot
I suggest you install Wireshark (the packet sniffer); it can give you more detail on what's going on at the socket level. Try to see the difference between Firefox and Chrome.
singularity
I have just published the wireshark results above :)
xRobot
+1 for your answer. It's only paranoia if they are not actually listening.
chiggsy
I just noticed that for Firefox the DNS is going crazy. You can see the line __"DNS","Standard query response A 1.0.0.0"__: after Firefox asks for the IP of www.google.it, the DNS server tells it that Google is at 1.0.0.0??? Firefox then tries to create a TCP connection (SYN) to 1.0.0.0, but it can't connect because there is no such host at 1.0.0.0. So the DNS is giving Firefox a wrong response. Why is that??? Can you go in Firefox to __Edit -> Preferences -> Advanced -> Network__, then Settings: do you have some proxy set there? If yes, change it to No proxy. Also, have you installed any add-ons lately?
singularity
In the Firefox settings there is No proxy. However, I have just rebooted my system, and now Firefox works... but when I run urllib2.urlopen('http://www.google.com'), I get the error above in the terminal and Firefox no longer works :-\
xRobot
Which means that urllib2.urlopen broke Firefox :). This is getting more fun. Restart your PC and try urllib2.urlopen('someothersite_no_google'); maybe it only happens when you hit Google. If yes, you can sue Google Inc. for sabotage, and you will become rich, and don't forget me :)
singularity
I get the same error with other sites as well... so I can't sue Google :D
xRobot
So basically, when you run urllib2.urlopen, Firefox stops working. Very weird. You can check the system log (__System -> Administration -> Log File Viewer__) to find out what's going on when you run urllib2.urlopen(), if you are not tired yet :). But it's very weird, because like I told you, it's the DNS that gives you the wrong IP address, as I noticed in the Wireshark dump. Still very weird :)
singularity
Now urllib2 works sometimes... Maybe it depends on some programs I have installed these past few days... I will check the Log File Viewer tomorrow; now I am tired :D. Thanks for the help ^_^
xRobot
A: 

Sounds like Chrome and Synaptic might be using an HTTP proxy. In Chromium, go to Options / Under the Hood / Change Proxy Settings. Check the GNOME proxy settings with:

$ gconftool-2 -R /system/proxy
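On the urllib2 side, the default opener on Linux builds its proxy list from environment variables rather than from gconf, so it is worth dumping those too and forcing a direct connection to rule proxying out. A sketch (the Python 3 import fallback is only there to keep the snippet runnable anywhere):

```python
import os

try:
    import urllib2                    # Python 2, as in the question
except ImportError:
    import urllib.request as urllib2  # same API for what we use here

# The usual variables urllib2's default opener consults on Linux:
for var in ("http_proxy", "https_proxy", "no_proxy"):
    print("%s=%s" % (var, os.environ.get(var, "<unset>")))

# An empty ProxyHandler forces a direct connection, ignoring the environment:
opener = urllib2.build_opener(urllib2.ProxyHandler({}))
```

If `opener.open('http://www.google.com')` still times out with the empty ProxyHandler, a proxy is not the cause.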
ataylor
In Chromium: "Direct connection to internet". GNOME settings:
$ gconftool-2 -R /system/proxy
 ftp_host =
 socks_port = 0
 socks_host =
 secure_host =
 secure_port = 0
 mode = none
 ftp_port = 0
 autoconfig_url =
xRobot
OK, according to your Wireshark dumps, Firefox is doing an IPv6 DNS request and Chromium is doing IPv4. Try disabling IPv6 DNS in Firefox (http://kb.mozillazine.org/Network.dns.disableIPv6) to confirm.
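If it helps, the IPv4-vs-IPv6 split is visible from Python as well: `socket.getaddrinfo` reports every address the resolver returns, per family. A small sketch (`localhost` is just a stand-in for the host being tested):

```python
import socket

def addresses_by_family(host):
    """Group the resolver's answers for `host` into IPv4 and IPv6."""
    out = {"IPv4": [], "IPv6": []}
    for family, _, _, _, sockaddr in socket.getaddrinfo(host, 80, 0, socket.SOCK_STREAM):
        key = "IPv6" if family == socket.AF_INET6 else "IPv4"
        out[key].append(sockaddr[0])
    return out

print(addresses_by_family("localhost"))
```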
ataylor
+1  A: 

Do these steps one by one:

  1. Check that you are connected and the network works: ping google.com
  2. If all is fine and your internet connection is just slow, then do this:

    import socket
    socket.setdefaulttimeout(300) # in seconds

This will extend the timeout of your sockets.
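One caveat worth adding: `setdefaulttimeout` only affects sockets created after the call; sockets that already exist keep their old timeout. A quick check of the behaviour:

```python
import socket

socket.setdefaulttimeout(300)  # seconds
s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
print(s.gettimeout())  # -> 300.0, the new socket inherits the default
s.close()
```

In Python 2.6 you can also pass a per-call timeout directly: `urllib2.urlopen(url, timeout=300)`.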

MovieYoda
I have just tried... I get the same error :(
xRobot