I have a site with a huge number (well, thousands or tens of thousands) of dynamic URLs, plus a few static URLs.

In theory, due to some cunning SEO linkage on the homepage, it should be possible for any spider to crawl the site and discover all the dynamic URLs via a spider-friendly search.

Given this, do I really need to worry about expending the effort to produce a dynamic sitemap index that includes all these URLs, or should I simply ensure that all the main static URLs are in there?

The actual way in which I would generate this isn't a concern - I'm just questioning the need to do it at all.

Indeed, the Google FAQ about this (and yes, I know they're not the only search engine!) recommends including URLs in the sitemap that might not be discovered by a crawl. Based on that, if every URL on your site is reachable from another, surely the only URL you really need as a baseline in your sitemap for a well-designed site is your homepage?

+1  A: 

In an SO podcast they talked about limits on the number of links you can include/submit in a sitemap (around 500 per page, with a page limit based on PageRank?) and how you would need to break them up over multiple pages.
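For reference, the sitemaps.org protocol itself allows up to 50,000 URLs per individual sitemap file, with a sitemap index file tying the pieces together. A minimal Python sketch of that chunk-and-index approach - the URL list, domain and file names below are hypothetical, not anything from the question:

    # Minimal sketch: split a URL list into sitemap files and write an index.
    # The domain and file names are hypothetical.
    from xml.sax.saxutils import escape

    URLS_PER_FILE = 50000  # sitemaps.org limit per individual sitemap file

    def write_sitemaps(urls, base_url="http://www.example.com"):
        """Write sitemap-N.xml files of at most URLS_PER_FILE URLs each,
        plus a sitemap-index.xml that references all of them."""
        sitemap_names = []
        for start in range(0, len(urls), URLS_PER_FILE):
            name = "sitemap-%d.xml" % (len(sitemap_names) + 1)
            with open(name, "w") as f:
                f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
                f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
                for url in urls[start:start + URLS_PER_FILE]:
                    f.write("  <url><loc>%s</loc></url>\n" % escape(url))
                f.write("</urlset>\n")
            sitemap_names.append(name)

        # The index file is the single URL you hand to the search engine.
        with open("sitemap-index.xml", "w") as f:
            f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
            f.write('<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
            for name in sitemap_names:
                f.write("  <sitemap><loc>%s/%s</loc></sitemap>\n" % (base_url, name))
            f.write("</sitemapindex>\n")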

"Given this, do I really need to worry about expending the effort to produce a dynamic sitemap index that includes all these URLs, or should I simply ensure that all the main static URLs are in there?"

I was under the impression that the sitemap wasn't necessarily about disconnected pages, but rather about increasing the crawling of existing pages. In my experience, when a site includes a sitemap, minor pages, even when prominently linked to, are more likely to appear in Google results. Depending on your site's PageRank, inbound links etc., this may be less of an issue.

Graphain
Yeah, this is one of the (understandable) pains with sitemaps - having to break them up based on either size or number of links. Clearly, if the datastore that your sitemap mirrors is weighty, it can be quite a load to keep such a thing up to date - so in that case, by focussing on good linking (after all, it's page content and link count that must matter most for search engine ranking), you should be able to avoid the pain. But is it an unnecessary gamble to assume this and skip the sitemap?
Andras Zoltan
+1  A: 

If there is more than one way to get to a page, you should pick a main URL for each page that contains the actual content, and put those URLs in the site map. I.e. the site map should contain links to the actual content, not every possible URL that leads to the same content.

Also consider putting canonical meta tags in the pages with this main URL, so that spiders can recognise a page even if it's reachable through different dynamic URLs.
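What counts as the "main" URL depends on the site, but often it just means stripping session/tracking parameters from the dynamic URL. A minimal Python sketch of the idea - the parameter names below are hypothetical examples, not anything specific to the question:

    # Sketch: reduce a dynamic URL to a single canonical form and build the
    # <link rel="canonical"> tag for the page head.
    # The "noise" parameter names are hypothetical examples.
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode
    from xml.sax.saxutils import escape

    NOISE_PARAMS = {"sessionid", "sort", "ref", "utm_source"}

    def canonical_url(url):
        """Strip non-content query parameters so every page has one main URL."""
        parts = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in NOISE_PARAMS]
        return urlunsplit((parts.scheme, parts.netloc, parts.path,
                           urlencode(kept), ""))  # drop the fragment too

    def canonical_tag(url):
        """The tag to emit in <head> on every variant of the page."""
        return '<link rel="canonical" href="%s" />' % escape(
            canonical_url(url), {'"': "&quot;"})

    # canonical_url("http://example.com/item?id=42&sessionid=abc&sort=price")
    #   -> "http://example.com/item?id=42"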

Spiders only spend a limited time crawling each site, so you should make it easy for them to find the actual content as soon as possible. A site map can be a great help here, as you can use it to point directly to the actual content so that the spider doesn't have to look for it.

We have had pretty good results using these methods, and Google now indexes 80-90% of our dynamic content. :)

Guffa