I know similar questions have been asked, but I'm not sure about the answers (and I can't easily test all of them), so before I go crazy continuing to search, I want to ask: is there an easy way to crawl all the pages on a website and automatically check them for broken and invalid links? Preferably I'd like a solution that does not require an install or a compile, as I'm severely limited. Thanks.
A:
You could use Xenu's Link Sleuth. It does require an install, but it's lightweight and nothing needs to be compiled.
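
If even an install is off the table and Python happens to be available on the machine, here's a rough sketch of the same idea using only the standard library, so there's nothing to install or compile. The start URL is a placeholder, and it only follows plain <a href> links, so it won't see anything generated by JavaScript:

    # Minimal single-site link checker using only the Python standard library.
    # A rough sketch, not a polished tool: it crawls pages on one host,
    # follows <a href> links, and reports any URL that returns an HTTP
    # error or fails to connect.
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    START_URL = "http://example.com/"  # placeholder: the site to crawl

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def check(url):
        """Fetch a URL; return (html_body_or_None, error_or_None)."""
        try:
            req = Request(url, headers={"User-Agent": "link-checker"})
            with urlopen(req, timeout=10) as resp:
                if "text/html" in resp.headers.get("Content-Type", ""):
                    return resp.read().decode("utf-8", errors="replace"), None
                return None, None
        except HTTPError as e:
            return None, "HTTP %d" % e.code
        except URLError as e:
            return None, "unreachable (%s)" % e.reason

    host = urlparse(START_URL).netloc
    seen, queue = {START_URL}, [START_URL]
    while queue:
        url = queue.pop()
        body, error = check(url)
        if error:
            print("BROKEN: %s -> %s" % (url, error))
        elif body and urlparse(url).netloc == host:
            # Only parse pages on the starting host; off-site links
            # are checked for errors but not crawled further.
            parser = LinkParser()
            parser.feed(body)
            for link in parser.links:
                absolute = urljoin(url, link.split("#")[0])
                if absolute.startswith("http") and absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)

It won't match a dedicated tool like Xenu for speed or reporting, but it covers the basic "crawl and flag broken links" case with zero setup.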
Code Commander
2010-08-24 15:41:38
I looked at that, but it's still got an installer application.
Tom
2010-08-24 15:42:54
Well yeah, but you can easily install it somewhere and then just zip the program folder. Here, I just did that: http://www.2shared.com/fadmin/15876502/98e6a26b/Xenu.zip.html
gablin
2010-08-24 16:54:11
@gablin I'm blocked from most installations, but the zip helped. Thanks.
Tom
2010-08-24 17:59:27