as pointed out, you have some basic
quotation issues. but more fundamentally:
- you're not using Pythonic constructs to handle things but coding them as plain imperative code. that's fine, of course, but below are examples of funner (and better) ways to express things
- you need to explicitly set timeouts or it'll take forever
- you need to multithread or it'll take forever
- you need to handle various common exception types or your code will crash: against real web servers, connections fail (including timing out) under numerous conditions
- 10.1.1.* is only one possible set of "local" servers. RFC 1918 spells out that the private ranges are 10.0.0.0 - 10.255.255.255, 172.16.0.0 - 172.31.255.255, and 192.168.0.0 - 192.168.255.255. generic detection of responders in your "local" network is a hard problem
- web servers (especially local ones) often run on ports other than 80 (notably 8000, 8001, or 8080)
- general web servers, DNS, etc are complex enough that you can get various timeout behaviors at different times (and affected by recent operations)
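as an aside, the three RFC 1918 ranges are easy to test against with the `ipaddress` module (Python 3 stdlib; this is just a sketch of the range check, not part of the scanner below):

```python
import ipaddress

# the three RFC 1918 private ranges
RFC1918 = [ipaddress.ip_network(n) for n in
           ("10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16")]

def is_rfc1918(ip):
    """True if ip falls in any RFC 1918 private range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in RFC1918)

print(is_rfc1918("10.1.1.7"))      # True
print(is_rfc1918("172.31.255.1"))  # True
print(is_rfc1918("209.131.36.1"))  # False
```

note that `ipaddress.ip_address(x).is_private` gives roughly the same answer in one call, though it also covers some non-RFC-1918 special ranges (loopback, link-local).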
below, some sample code to get you started; it addresses pretty much all of the above problems except (5), the generic detection of local responders, which i'll assume is (well) beyond the scope of the question.
btw i'm printing the size of the returned web page, since it's a simple
"signature" of what the page is. the sample IPs return various Yahoo
assets.
import urllib
import threading
import socket

def t_run(thread_list, chunks):
    t_count = len(thread_list)
    print "Running %s jobs in groups of %s threads" % (t_count, chunks)
    for x in range(t_count / chunks + 1):
        i = x * chunks
        i_c = min(i + chunks, t_count)
        c = len([t.start() for t in thread_list[i:i_c]])
        print "Started %s threads for jobs %s...%s" % (c, i, i_c - 1)
        c = len([t.join() for t in thread_list[i:i_c]])
        print "Finished %s threads for job index %s" % (c, i)

def url_scan(ip_base, timeout=5):
    socket.setdefaulttimeout(timeout)
    def f(url):
        # print "-- Trying (%s)" % url
        try:
            # the urlopen/read will only succeed if there's a server there
            r = urllib.urlopen(url)
            if r:
                print "## (%s) got %s bytes" % (url, len(r.read()))
            else:
                print "## (%s) failed to connect" % url
        except IOError, msg:
            # these are just the common cases
            if str(msg) == "[Errno socket error] timed out":
                return
            if str(msg) == "[Errno socket error] (10061, 'Connection refused')":
                return
            print "## (%s) got error '%s'" % (url, msg)
    # you might want 8000 and 8001, too
    return [threading.Thread(target=f,
                             args=("http://" + ip_base + str(x) + ":" + str(p),))
            for x in range(255) for p in [80, 8080]]
# run them (increase chunk size depending on your memory)
# also, try different timeouts
t_run(url_scan("209.131.36."), 100)
t_run(url_scan("209.131.36.", 30), 100)
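if you're on Python 3, a rough equivalent is shorter: `concurrent.futures.ThreadPoolExecutor` replaces the hand-rolled chunking in `t_run`, `urllib.request.urlopen` takes an explicit `timeout`, and one except clause covers the refused/timed-out/DNS-failure cases. this is a sketch of the same idea, not the code above; the function names and defaults are mine:

```python
import socket
import urllib.request
import urllib.error
from concurrent.futures import ThreadPoolExecutor

def probe(url, timeout=5):
    """Try one URL; return a printable result line."""
    try:
        r = urllib.request.urlopen(url, timeout=timeout)
        return "## (%s) got %s bytes" % (url, len(r.read()))
    except (urllib.error.URLError, socket.timeout) as e:
        # covers refused connections, timeouts, and DNS failures
        return "## (%s) no server (%s)" % (url, e)

def url_scan3(ip_base, ports=(80, 8080), timeout=5, workers=100):
    urls = ["http://%s%d:%d" % (ip_base, x, p)
            for x in range(255) for p in ports]
    # the pool does the chunking/joining for you
    with ThreadPoolExecutor(max_workers=workers) as pool:
        for line in pool.map(lambda u: probe(u, timeout), urls):
            print(line)

# url_scan3("209.131.36.")   # uncomment to actually scan
```

same caveats apply: tune `workers` to your memory and `timeout` to your network.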