The site that I'm trying to scrape uses JavaScript to create a cookie. What I was thinking was that I could create the cookie in Python and then use that cookie to scrape the site. However, I don't know of any way of doing that. Does anybody have any ideas?
+2
A:
Please see Python httplib2 - Handling Cookies in HTTP Form Posts for an example of adding a cookie to a request.
I often need to automate tasks in web-based applications. I like to do this at the protocol level by simulating a real user's interactions via HTTP. Python comes with two built-in modules for this: urllib (the higher-level web interface) and httplib (the lower-level HTTP interface).
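As a minimal sketch of the idea: once you know what cookie the site's JavaScript would have set (by inspecting it in a browser), you can attach it to the request yourself. The URL and the cookie name/value below are illustrative assumptions; substitute the real ones from the target site. This uses the modern urllib.request API; on Python 2 the equivalent lives in urllib2.

```python
import urllib.request

# Hypothetical target URL and cookie; copy the actual name/value
# that the site's JavaScript sets (inspect it in your browser).
url = "http://example.com/page"
req = urllib.request.Request(url)
req.add_header("Cookie", "session_id=abc123")

# response = urllib.request.urlopen(req)  # network call, not run here
```

The server only sees the Cookie header; it can't tell whether JavaScript or your script set it.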
Andrew Hare
2009-07-13 02:13:49
+1 looks like a really useful library. Very relevant link!
Tom Leys
2009-07-13 03:12:05
Thanks, I managed to get it to work :)
PCBEEF
2009-07-14 18:30:02
+2
A:
If you want to do more involved browser emulation (including setting cookies) take a look at mechanize. Its simulation capabilities are almost complete (no JavaScript support, unfortunately): I've used it to build several scrapers with much success.
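Under the hood, this style of browser emulation rests on a cookie jar attached to an opener, which the standard library also provides directly. The sketch below shows that machinery with http.cookiejar, including manually injecting the cookie the site's JavaScript would have created; the cookie name, value, and domain are illustrative assumptions.

```python
import http.cookiejar
import urllib.request

# A CookieJar stores cookies across requests; attaching it to an
# opener makes subsequent requests send and receive cookies
# automatically, much as a browser (or mechanize) would.
jar = http.cookiejar.CookieJar()
opener = urllib.request.build_opener(
    urllib.request.HTTPCookieProcessor(jar)
)

# Manually inject the cookie the site's JavaScript would have set.
# Name, value, and domain here are made up for illustration.
cookie = http.cookiejar.Cookie(
    version=0, name="js_cookie", value="12345",
    port=None, port_specified=False,
    domain="example.com", domain_specified=True,
    domain_initial_dot=False,
    path="/", path_specified=True,
    secure=False, expires=None, discard=True,
    comment=None, comment_url=None, rest={},
)
jar.set_cookie(cookie)

# opener.open("http://example.com/")  # would now send js_cookie=12345
```

Any response that sets further cookies (e.g. a session ID after login) is captured in the same jar, so the whole session behaves like one browsing user.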
jkp
2009-07-13 07:14:21
There are a few remote-control browser solutions. I like Selenium, particularly since I can run it in a virtual framebuffer X server (screenshots still work just fine). Don't know much about the others, though.
Anders Eurenius
2009-07-13 07:34:09
mechanize is not a browser automator; it emulates a browser at the level of HTTP requests and responses.
jkp
2009-07-13 10:10:46