The site that I'm trying to scrape uses JavaScript to create a cookie. What I was thinking was that I could create the cookie in Python myself and then use it to scrape the site. However, I don't know of any way to do that. Does anybody have any ideas?

+2  A: 

Please see Python httplib2 - Handling Cookies in HTTP Form Posts for an example of adding a cookie to a request.

I often need to automate tasks in web-based applications. I like to do this at the protocol level, by simulating a real user's interactions via HTTP. Python comes with two built-in modules for this: urllib (a higher-level web interface) and httplib (a lower-level HTTP interface).
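As a minimal sketch of the idea in that link: since the cookie is created client-side by JavaScript, you can read the site's JS to see what name/value it sets, then hand-build the same cookie and send it in the Cookie header with httplib2. The URL and cookie name/value below are made up; substitute whatever the site actually uses.

    import httplib2

    # Hypothetical URL and cookie -- inspect the site's JavaScript to
    # find the real name/value it stores in document.cookie.
    url = "http://example.com/page"
    headers = {"Cookie": "jscookie=somevalue"}

    http = httplib2.Http()
    # httplib2 sends the headers verbatim, so the server sees the cookie
    # exactly as if the browser's JavaScript had set it.
    response, content = http.request(url, "GET", headers=headers)
    print(response.status)

If the cookie is actually issued by the server via Set-Cookie rather than by JS, you'd instead grab response["set-cookie"] from the first request and echo it back in the headers of the next one.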

Andrew Hare
+1 looks like a really useful library. Very relevant link!
Tom Leys
Thanks, I managed to get it to work :)
PCBEEF
+2  A: 

If you want to do more involved browser emulation (including setting cookies), take a look at mechanize. Its simulation capabilities are almost complete (no JavaScript support, unfortunately); I've used it to build several scrapers with much success.
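A rough sketch of injecting a cookie with mechanize (the URLs and the cookie string are made up, and disabling the robots.txt check is only needed if the site blocks crawlers):

    import mechanize

    br = mechanize.Browser()
    br.set_handle_robots(False)  # assumption: skip robots.txt checks

    # Open a page first so the Browser has a current response, then
    # inject the cookie the site's JavaScript would normally create.
    br.open("http://example.com/")        # hypothetical URL
    br.set_cookie("jscookie=somevalue")   # hypothetical name/value

    response = br.open("http://example.com/data")
    print(response.read())

mechanize carries the cookie along on subsequent requests via its internal cookie jar, so you only need to set it once per session.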

jkp
There are a few remote-control browser solutions. I like Selenium, particularly since I can run it under a virtual framebuffer X server (screenshots still work just fine). I don't know much about the others, though.
Anders Eurenius
mechanize is not a browser automator; it emulates a browser at the level of HTTP requests and responses.
jkp
This looks interesting, I'll look into it
PCBEEF