I need a script that can spider a website and return the list of all crawled pages in plain-text or a similar format, which I will submit to search engines as a sitemap. Can I use wget to generate a sitemap of a website, or is there a PHP script that can do the same?
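Something along these lines is what I have in mind (a rough sketch only; it assumes wget's --no-verbose spider log contains "URL:<address>" entries, which can vary between wget versions, and it uses http://www.example.com/ as a placeholder for the real site):

    # Crawl the site in spider mode (nothing is saved) and log every URL visited,
    # then pull the URLs out of the log into a plain-text list, one per line.
    wget --spider --recursive --no-verbose --output-file=wget.log http://www.example.com/
    grep -o 'URL: *[^ ]*' wget.log | sed 's/^URL: *//' | sort -u > sitemap.txt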
A:
You can use this Perl script to do the trick: http://code.google.com/p/perlsitemapgenerator/
sputnick
2010-10-16 12:58:35
It'll generate the sitemap by scanning the file system, but it won't "crawl". The sites I want to spider are dynamic.
Salman A
2010-10-16 13:26:41