I need a script that can spider a website and return the list of all crawled pages in plain text or a similar format, which I will then submit to search engines as a sitemap. Can I use wget to generate a sitemap of a website, or is there a PHP script that can do the same?

+1 A:

You can use this Perl script to do the trick: http://code.google.com/p/perlsitemapgenerator/

sputnick
It'll generate the sitemap by scanning the file system, but it won't "crawl". The sites I want to spider are dynamic.
Salman A
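
For a dynamic site that needs actual crawling rather than a filesystem scan, wget itself can produce the URL list: run it in spider mode, log what it visits, and extract the URLs from the log. A minimal sketch of that approach (example.com is a placeholder, and the exact "URL:" format in wget's log can vary between versions, so the sed pattern may need adjusting):

    # Crawl the site recursively without saving pages, logging every URL visited
    wget --spider --recursive --no-verbose --output-file=crawl.log http://example.com/

    # Pull the URLs out of the log and deduplicate them into a plain-text list
    sed -n 's/.*URL:[[:space:]]*\([^ ]*\).*/\1/p' crawl.log | sort -u > sitemap.txt

The resulting sitemap.txt is one URL per line, which is the plain-text sitemap format search engines accept.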