Hi, I'm trying to create a script to automate some actions involving chained GET and POST HTTP requests.
I'm using Python 3.1 on a Windows XP SP3 machine, and I need to interact with a web server located on our intranet (so no proxy should be involved).
Keep in mind that I'm quite new to Python, so class inheritance, overriding and the like are still quite obscure to me; if you post code, please write it in "n00b mode", thanks! (Note: I'm not asking for extensive comments, just please don't simply say "override this" or "inherit that class".)
My current test code for a GET request is:
import urllib
import urllib.request

# "$proxy" is a placeholder for the proxy address (host:port);
# "$uri" below is a placeholder for the target URL.
proxy = { "http": "$proxy" }
request = urllib.request.FancyURLopener(proxy)
response = request.open("$uri")
text = response.read()
response.close()
request.close()
print(text)
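(As an aside, for the POST half of the chain I plan to reuse the same pattern; this is only a rough, untested sketch on my part, and the form field names are made up:)

import urllib.parse
import urllib.request

proxy = { "http": "$proxy" }
request = urllib.request.FancyURLopener(proxy)

# Hypothetical form fields -- the real names depend on the target page.
form = urllib.parse.urlencode({"field1": "value1", "field2": "value2"})

# Passing a second argument to open() should turn the request into a POST;
# I encode to bytes since I'm not sure a plain str body is accepted.
response = request.open("$uri", form.encode("ascii"))
text = response.read()
response.close()
request.close()
print(text)

That part isn't what's failing yet, though; the problem below is with the plain GET requests and the proxy.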
With $uri set to http://www.google.com/ and $proxy set to an invalid address, I get the expected error message:
Traceback (most recent call last):
File "C:\test.py", line 8, in <module>
response = request.open("http://www.google.com/")
File "C:\Program Files\Python\lib\urllib\request.py", line 1458, in open
raise IOError('socket error', msg).with_traceback(sys.exc_info()[2])
File "C:\Program Files\Python\lib\urllib\request.py", line 1454, in open
return getattr(self, name)(url)
File "C:\Program Files\Python\lib\urllib\request.py", line 1628, in open_http
return self._open_generic_http(http.client.HTTPConnection, url, data)
File "C:\Program Files\Python\lib\urllib\request.py", line 1608, in _open_generic_http
http_conn.request("GET", selector, headers=headers)
File "C:\Program Files\Python\lib\http\client.py", line 932, in request
self._send_request(method, url, body, headers)
File "C:\Program Files\Python\lib\http\client.py", line 970, in _send_request
self.endheaders(body)
File "C:\Program Files\Python\lib\http\client.py", line 928, in endheaders
self._send_output(message_body)
File "C:\Program Files\Python\lib\http\client.py", line 782, in _send_output
self.send(msg)
File "C:\Program Files\Python\lib\http\client.py", line 723, in send
self.connect()
File "C:\Program Files\Python\lib\http\client.py", line 705, in connect
self.timeout)
File "C:\Program Files\Python\lib\socket.py", line 307, in create_connection
raise error(msg)
IOError: [Errno socket error] [Errno 10061] No connection could be made because the target machine actively refused it
Then, with $uri set to http://www.google.com/ and $proxy set to a valid Squid proxy (which doesn't require authentication), the request succeeds and the page content is saved.
Now, with $uri set to http://www.google.com/ and $proxy set to a valid ISA proxy (which requires NTLM authentication), Python hangs without printing anything.
The same thing happens when $uri is an internal address and $proxy is null, a valid Squid proxy, or a valid ISA proxy.
Update:
I found that, when I'm using the ISA server, leaving the script running for 5+ minutes yields the following error (which is fine, since the proxy requires authentication, but the 5+ minute wait is not normal):
Traceback (most recent call last):
File "C:\test.py", line 13, in <module>
response = request.open("http://10.42.8.176/dbapps/tc/tcgs_new/")
File "C:\Program Files\Python\lib\urllib\request.py", line 1454, in open
return getattr(self, name)(url)
File "C:\Program Files\Python\lib\urllib\request.py", line 1628, in open_http
return self._open_generic_http(http.client.HTTPConnection, url, data)
File "C:\Program Files\Python\lib\urllib\request.py", line 1624, in _open_generic_http
response.status, response.reason, response.msg, data)
File "C:\Program Files\Python\lib\urllib\request.py", line 1640, in http_error
result = method(url, fp, errcode, errmsg, headers)
File "C:\Program Files\Python\lib\urllib\request.py", line 1893, in http_error_407
errcode, errmsg, headers)
File "C:\Program Files\Python\lib\urllib\request.py", line 1650, in http_error_default
raise HTTPError(url, errcode, errmsg, headers, None)
urllib.error.HTTPError: HTTP Error 407: Proxy Authentication Required ( The ISA Server requires authorization to fulfill the request. Access to the Web Proxy service is denied. )
Passing an empty proxies dict (e.g. urllib.request.FancyURLopener(proxies={})) seems to yield the same result, i.e. it seems that Python falls back to Windows's proxy settings (the ISA one).
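To check that suspicion, I believe urllib.request.getproxies() reports what Python picks up from the Windows registry/environment, so I could print it with something like this (a quick sketch):

import urllib.request

# Show which proxies Python autodetects on this machine;
# I expect it to report the ISA server.
print(urllib.request.getproxies())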
How can I force Python not to use any proxy at all?
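The closest thing I've found in the urllib.request docs is ProxyHandler; would something along these lines (an untested sketch on my part, using the internal URL from the traceback above) be the right way to bypass the system proxy?

import urllib.request

# Build an opener with an explicitly empty proxy mapping, which (if I
# understand the docs) should stop urllib from autodetecting the
# Windows/ISA proxy.
no_proxy_opener = urllib.request.build_opener(urllib.request.ProxyHandler({}))

response = no_proxy_opener.open("http://10.42.8.176/dbapps/tc/tcgs_new/")
print(response.read())
response.close()

I'm not sure whether this can be combined with FancyURLopener or whether I should switch to urlopen-style calls entirely.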
Thanks in advance, Andrea.