views: 5253
answers: 7

Is there a good test suite or tool set that can automate website navigation -- with JavaScript support -- and collect the HTML from the pages?

Of course I can scrape straight HTML with BeautifulSoup. But this does me no good for sites that require JavaScript. :)

A: 

Keep in mind that any JavaScript fanciness is manipulating the browser's internal DOM model of the page, and does nothing to the raw HTML.

William Keller
+1  A: 

It would be very difficult to code a solution that would work with any arbitrary site out there. Each navigation menu implementation can be quite unique. I've worked a great deal with scrapers, and, provided you know the site you wish to target, here is how I'd approach it.

Usually, if you analyze the particular JavaScript used in a nav menu, it is fairly easy to use regular expressions to pull out the entire set of variables that are used to build the menu. I have never used Beautiful Soup, but from your description it sounds like it may only work on HTML elements and not be able to work inside the script tags.
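As a minimal sketch of that approach in Python, assuming (hypothetically) that the menu data lives in simple string-variable assignments inside the page's script tags (the URL and variable pattern are placeholders):

    import re
    import urllib.request

    # Hypothetical target URL -- substitute the site you are scraping.
    url = "http://example.com/"
    html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")

    # Grab the contents of every <script> block; the raw source is still
    # there even when an HTML parser ignores it for text extraction.
    scripts = re.findall(r"<script[^>]*>(.*?)</script>", html,
                         re.DOTALL | re.IGNORECASE)

    # Pull out simple string assignments, e.g. var navUrl = "/products";
    assignment = re.compile(r'var\s+(\w+)\s*=\s*"([^"]*)"')
    for script in scripts:
        for name, value in assignment.findall(script):
            print(name, "=", value)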

If you're still having problems, or need to emulate some form POSTs or AJAX, get Firefox and install the LiveHttpHeaders plugin. This plugin will let you manually browse the site and capture the URLs being navigated, along with any cookies being passed during your manual browsing. That is what your scraperbot needs to send in a request to get a valid response from the target webserver(s). This will also capture any AJAX calls being made, and in many cases the same AJAX calls must be implemented in your scraper to get the responses you want.
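As a rough sketch of replaying what you captured (every header and cookie value below is a placeholder for whatever LiveHttpHeaders actually shows you on your target site):

    import urllib.parse
    import urllib.request

    # Hypothetical AJAX endpoint and form data, as seen in the capture.
    url = "http://example.com/ajax/menu"
    data = urllib.parse.urlencode({"section": "products"}).encode()

    request = urllib.request.Request(url, data, headers={
        "User-Agent": "Mozilla/5.0",              # mimic a real browser
        "Cookie": "session_id=abc123",            # captured session cookie
        "X-Requested-With": "XMLHttpRequest",     # many AJAX handlers check this
    })
    response = urllib.request.urlopen(request)
    print(response.read().decode("utf-8", errors="replace"))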

tyshock
+4  A: 

You could use Selenium or WATIR to drive a real browser (IE only in WATIR's case).

Selenium runs cross-browser, has support for writing automation scripts in a good number of languages and has more mature tooling, such as the excellent Selenium IDE extension for Firefox, which can be used to write and run test cases, and can export test scripts to many languages.
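For example, a minimal sketch with Selenium's Python bindings (assuming a local Firefox install that Selenium can drive; the URL is a placeholder) that loads a page, lets the browser execute its JavaScript, and collects the rendered HTML:

    from selenium import webdriver

    driver = webdriver.Firefox()  # assumes Firefox is available to Selenium
    try:
        driver.get("http://example.com/")  # hypothetical target page
        # page_source returns the DOM as rendered after JavaScript has run,
        # not the raw HTML the server originally sent.
        html = driver.page_source
        print(html)
    finally:
        driver.quit()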

insin
A: 

I've been using Selenium for this and I find that it works great. Selenium runs in the browser and will work with Firefox, WebKit and IE. http://selenium.openqa.org/
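One caveat: if the HTML you want is built by JavaScript after the page loads, you may need to wait for it explicitly before collecting the source. A sketch using Selenium's explicit waits (the element ID here is hypothetical):

    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    driver = webdriver.Firefox()
    try:
        driver.get("http://example.com/")  # hypothetical page with AJAX content
        # Block until the JavaScript-built element appears (up to 10 seconds).
        WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.ID, "nav-menu"))  # hypothetical ID
        )
        print(driver.page_source)
    finally:
        driver.quit()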

DanielHonig
A: 

@insin Watir is not IE only.

http://stackoverflow.com/questions/81566#83387

Željko Filipin
+2  A: 

Using HtmlUnit is also a possibility.

HtmlUnit is a "GUI-Less browser for Java programs". It models HTML documents and provides an API that allows you to invoke pages, fill out forms, click links, etc... just like you do in your "normal" browser.

It has fairly good JavaScript support (which is constantly improving) and is able to work even with quite complex AJAX libraries, simulating either Firefox or Internet Explorer depending on the configuration you want to use.

It is typically used for testing purposes or to retrieve information from web sites.

Kevin Hakanson
I agree that HtmlUnit is good for this... and it doesn't need to have a browser to run, so you can stick it in a script and have it run automatically on a server.
Spike Williams
Its only real JavaScript support is following location redirects. It isn't going to be helpful for scraping.
Zombies
A: 

Mozenda is a great tool to use as well.

Justin McClelland