After working with .NET's HttpWebRequest/Response objects, I'd rather shoot myself than use them to crawl through web sites. I'm looking for an existing .NET library that can fetch URLs and gives you the ability to follow links, extract/fill in/submit forms on the page, etc. Perl's LWP and WWW::Mechanize modules do this very well, but I'm working on a .NET project.
I've come across the HTML Agility Pack, which looks awesome, but it stops short of simulating link navigation and form submission.
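To show what I mean: the Agility Pack is great at picking links and form fields out of the markup (sketch below, made-up URL), but once you've found them, actually requesting the next page or POSTing the form is back to hand-rolled HTTP.

```
using System;
using HtmlAgilityPack;

class AgilityPackDemo
{
    static void Main()
    {
        // HtmlWeb.Load fetches and parses the page in one step.
        var web = new HtmlWeb();
        HtmlDocument doc = web.Load("http://example.com/");

        // Extracting links and form inputs with XPath is easy enough...
        var links = doc.DocumentNode.SelectNodes("//a[@href]");
        if (links != null)
            foreach (HtmlNode link in links)
                Console.WriteLine(link.GetAttributeValue("href", string.Empty));

        var inputs = doc.DocumentNode.SelectNodes("//form//input");
        if (inputs != null)
            foreach (HtmlNode input in inputs)
                Console.WriteLine(input.GetAttributeValue("name", string.Empty));

        // ...but there's no equivalent of $mech->follow_link or $mech->submit_form:
        // fetching the linked page or submitting the form is still left to me.
    }
}
```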
Does such a tool already exist?