I want to have a robot fetch a URL every hour, but a malicious site operator could have the server send me a 1 GB file. Is there a good way to limit the download to, say, 100 KB and stop once that limit is reached?
I can imagine writing my own connection handler from scratch, but I'd like to use urllib2 if at all possible, just specifying the limit somehow.
Thanks!