views:

327

answers:

1

After working with .NET's HttpWebRequest/HttpWebResponse objects, I'd rather shoot myself than use them to crawl through web sites. I'm looking for an existing .NET library that can fetch URLs and give you the ability to follow links, extract/fill in/submit forms on the page, etc. Perl's LWP and WWW::Mechanize modules do this very well, but I'm working on a .NET project.
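For context, here's roughly what a single form POST looks like with raw HttpWebRequest (the URL and field names below are just placeholders):

    using System;
    using System.IO;
    using System.Net;
    using System.Text;

    class RawPostSketch
    {
        static void Main()
        {
            // Every request means hand-rolling cookies, encoding, and streams.
            var request = (HttpWebRequest)WebRequest.Create("http://example.com/login"); // placeholder URL
            request.Method = "POST";
            request.ContentType = "application/x-www-form-urlencoded";
            request.CookieContainer = new CookieContainer();   // track the session yourself

            byte[] body = Encoding.UTF8.GetBytes("username=foo&password=bar"); // placeholder fields
            request.ContentLength = body.Length;
            using (Stream stream = request.GetRequestStream())
                stream.Write(body, 0, body.Length);

            using (var response = (HttpWebResponse)request.GetResponse())
            using (var reader = new StreamReader(response.GetResponseStream()))
            {
                string html = reader.ReadToEnd();   // ...and now parse this by hand
                Console.WriteLine(html.Length);
            }
        }
    }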

I've come across the HTML Agility Pack, which looks awesome, but it stops short of simulating link navigation and form submission.
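For what it's worth, here's a minimal sketch of what the Agility Pack does cover (placeholder URL; assumes the HtmlAgilityPack assembly is referenced): it will fetch and parse a page and let you query it with XPath, but "clicking" a link or submitting a filled-in form is still up to you.

    using System;
    using HtmlAgilityPack;

    class HapSketch
    {
        static void Main()
        {
            var doc = new HtmlWeb().Load("http://example.com/");   // placeholder URL

            // List link targets; SelectNodes returns null when nothing matches.
            var links = doc.DocumentNode.SelectNodes("//a[@href]");
            if (links != null)
                foreach (HtmlNode link in links)
                    Console.WriteLine(link.GetAttributeValue("href", ""));

            // You can read form fields the same way, but there's no built-in
            // notion of following a link or submitting a form.
            var inputs = doc.DocumentNode.SelectNodes("//input[@name]");
            if (inputs != null)
                foreach (HtmlNode input in inputs)
                    Console.WriteLine(input.GetAttributeValue("name", ""));
        }
    }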

Does such a tool already exist?

+2  A: 

Somebody built an add-on for the HTML Agility Pack (which I also love) that lets you do a bit of form tinkering:

http://apps.ultravioletconsulting.com/projects/uvcwebtransform/docs/class_html_agility_pack_1_1_add_ons_1_1_form_processor_1_1_form_processor.html

I read a review saying it's not WWW::Mechanize, but it's a great start. The source code is provided, so you could probably extend it without much trouble.
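I haven't dug into that add-on's API, so the sketch below is just the general idea rather than its actual classes: scrape the form out with the Agility Pack, tweak the field values, and post them back yourself (the URL, form id, and field names are made up).

    using System;
    using System.Collections.Specialized;
    using System.Net;
    using System.Text;
    using HtmlAgilityPack;

    class FormPostSketch
    {
        static void Main()
        {
            // By default the Agility Pack parses <form> as an empty element,
            // so its inputs wouldn't appear as children; drop that flag first.
            HtmlNode.ElementsFlags.Remove("form");

            var doc = new HtmlWeb().Load("http://example.com/login");                 // placeholder URL
            HtmlNode form = doc.DocumentNode.SelectSingleNode("//form[@id='login']"); // made-up form id
            if (form == null)
                return;

            // Start from the values already in the form, then fill in your own.
            var fields = new NameValueCollection();
            var inputs = form.SelectNodes(".//input[@name]");
            if (inputs != null)
                foreach (HtmlNode input in inputs)
                    fields[input.GetAttributeValue("name", "")] = input.GetAttributeValue("value", "");
            fields["username"] = "foo";   // hypothetical field names
            fields["password"] = "bar";

            // Submit to the form's action and read back the response.
            var action = new Uri(new Uri("http://example.com/"), form.GetAttributeValue("action", ""));
            using (var client = new WebClient())
            {
                byte[] result = client.UploadValues(action, "POST", fields);
                Console.WriteLine(Encoding.UTF8.GetString(result));
            }
        }
    }

Note that WebClient doesn't track cookies on its own, so for login-type flows you'd still need to manage a CookieContainer via HttpWebRequest.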

nathaniel
Interesting, thank you.
spoulson