I know similar questions have been asked, but I'm not sure about the answers (and I can't easily test all of them), so before I go crazy continuing to search, I want to ask: is there an easy way to automatically crawl all the pages on a website and check them for broken and invalid links? Preferably I'd like a solution that doesn't require an install or a compile, as I'm severely limited in what I can run. Thanks.

A: 

You could use Xenu's Link Sleuth. It does require an install, but it's lightweight and there's no compiling involved.

http://home.snafu.de/tilman/xenulink.html

Code Commander

Tom: I looked at that, but it's still got an installer application.

gablin: Well yeah, but you can easily install it somewhere and then just zip the program folder. Here, I just did that: http://www.2shared.com/fadmin/15876502/98e6a26b/Xenu.zip.html

Tom: @gablin I'm blocked from most installations, but the zip helped. Thanks.
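
For anyone in a similarly locked-down environment, another option is a short script that needs nothing beyond a stock Python installation. Below is a minimal, illustrative sketch of a same-site crawler that reports broken links using only the standard library; START_URL is a placeholder you'd swap for your own site, and this is a rough sketch rather than a hardened tool.

# Minimal same-site crawler that reports broken links.
# Standard library only, so nothing extra to install beyond Python itself.
# START_URL is a placeholder; replace it with the site you want to check.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

START_URL = "https://example.com/"  # hypothetical starting point

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def fetch(url):
    """Return (status, html_text_or_None); non-HTML bodies are skipped."""
    req = Request(url, headers={"User-Agent": "link-checker-sketch"})
    try:
        with urlopen(req, timeout=10) as resp:
            ctype = resp.headers.get("Content-Type", "")
            body = resp.read().decode("utf-8", "replace") if "html" in ctype else None
            return resp.status, body
    except HTTPError as e:
        return e.code, None
    except URLError:
        return None, None  # DNS failure, refused connection, etc.

def crawl(start):
    site = urlparse(start).netloc
    queue, seen = deque([start]), {start}
    while queue:
        url = queue.popleft()
        status, body = fetch(url)
        if status is None or status >= 400:
            print(f"BROKEN ({status}): {url}")
            continue
        if body is None:
            continue
        parser = LinkExtractor()
        parser.feed(body)
        for href in parser.links:
            absolute = urljoin(url, href).split("#")[0]
            # Only queue links that stay on the same site.
            if urlparse(absolute).netloc == site and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)

if __name__ == "__main__":
    crawl(START_URL)

Running it prints a line for each URL that returns a 4xx/5xx status or fails to connect. It only follows links on the starting domain, so it checks that off-site links resolve without wandering across the whole web.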