I have a site with about 150K pages in its sitemap. I'm using the sitemap index generator to make the sitemaps, but really, I need a way of caching it, because building the 150 sitemaps of 1,000 links each is brutal on my server.[1]

I COULD cache each of these sitemap pages with memcached, which is what I'm using elsewhere on the site... however, this is so many sitemaps that they would completely fill memcached, so that doesn't work.

What I think I need is a way to use the database as the cache for these, and to only generate them when there are changes (which, because of how the sitemap index works, means only the latest couple of sitemap pages ever change - the rest are always the same).[2] But, as near as I can tell, I can only use one cache backend with Django.

How can I have these sitemaps ready for when Google comes-a-crawlin' without killing my database or memcached?

Any thoughts?

[1] I've limited it to 1,000 links per sitemap page because generating the max, 50,000 links, just wasn't happening.

[2] for example, if I have sitemap.xml?page=1, page=2...sitemap.xml?page=50, I only really need to regenerate sitemap.xml?page=50 until it is full with 1,000 links; then I can cache it pretty much forever, and focus on page 51 until it's full, cache it forever, etc.
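To be concrete, here's roughly what I have in mind - a totally untested sketch, with made-up model/view names, using the stock contrib.sitemaps view and its 'p' page parameter:

    # models.py -- one row of raw XML per sitemap page
    from django.db import models

    class CachedSitemapPage(models.Model):
        page = models.PositiveIntegerField(unique=True)
        xml = models.TextField()
        is_full = models.BooleanField(default=False)  # set once the page holds 1,000 links

    # views.py -- serve full pages from the table, rebuild only the partial ones
    from django.http import HttpResponse
    from django.contrib.sitemaps.views import sitemap as stock_sitemap
    from myapp.models import CachedSitemapPage  # 'myapp' is a placeholder

    def cached_sitemap(request, sitemaps):
        page = int(request.GET.get('p', 1))
        try:
            row = CachedSitemapPage.objects.get(page=page, is_full=True)
            return HttpResponse(row.xml, mimetype='application/xml')
        except CachedSitemapPage.DoesNotExist:
            pass
        # Not cached as "full" yet: render normally, then store the XML so a
        # later hit can skip the expensive queries once the page fills up.
        response = stock_sitemap(request, sitemaps)
        row, _ = CachedSitemapPage.objects.get_or_create(page=page)
        row.xml = response.content
        row.is_full = response.content.count('<url>') >= 1000
        row.save()
        return response

Is something along these lines sane, or is there a better way?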

+1  A: 

I had a similar issue and decided to have Django write the sitemap files to disk in the static media and let the webserver serve them. I made the call to regenerate the sitemap every couple of hours, since my content wasn't changing more often than that, but how often you need to rewrite the files will depend on your content.

I used a Django custom management command with a cron job, but curl with a cron job is easier.

Here's how I use curl; I have Apache serve /sitemap.xml as a static file, not through Django:

curl -o /path/sitemap.xml http://example.com/generate/sitemap.xml
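
If you do want the custom-command route instead, the skeleton is roughly this (untested sketch - the sitemaps import, output directory and filename pattern are placeholders for whatever your project uses):

    # management/commands/write_sitemaps.py
    import os
    from django.core.management.base import NoArgsCommand
    from django.template import loader
    from myproject.sitemaps import sitemaps  # the same dict you'd pass to the sitemap view

    SITEMAP_DIR = '/var/www/static/sitemaps'  # somewhere Apache serves directly

    class Command(NoArgsCommand):
        help = "Render every sitemap page to disk so the webserver can serve it statically."

        def handle_noargs(self, **options):
            for name, site in sitemaps.items():
                site = site() if callable(site) else site
                for page in site.paginator.page_range:
                    urls = site.get_urls(page)
                    xml = loader.render_to_string('sitemap.xml', {'urlset': urls})
                    path = os.path.join(SITEMAP_DIR, 'sitemap-%s-%d.xml' % (name, page))
                    open(path, 'w').write(xml.encode('utf-8'))

The cron entry is then just "python manage.py write_sitemaps" every couple of hours.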
dar
I'm working on something similar now. Do you have a code example?
mlissner
A: 

I am working on the same thing using Google App Engine - you can read my post on how I plan to serve sitemaps dynamically - any comments welcome.

spidee
Just checked out your post: http://stackoverflow.com/questions/2819192/google-app-engine-sitemap-creation-for-a-social-network - looks like a similar problem. Don't think I'm any help though, since I've never used App Engine.
mlissner
+1  A: 

Okay - I have found some more info on this and on what Amazon are doing with their 6 million or so URLs.

Amazon simply make a map for each day and add to it:

  1. new URLs
  2. updated URLs

So they end up with loads of sitemaps, but the search bot will only look at the latest ones, as their updated dates are recent. I was under the impression that one should refresh a map and not include a URL more than once, and I think that is true; Amazon get around it because their sitemaps are more of a log. A URL may appear again in a later sitemap - because it has been updated - but Google won't look at the older maps, as they are out of date, unless of course it does a major re-index. This approach makes a lot of sense: all you do is build a new map - say, each day - of new and updated content and ping Google with it, so Google only needs to index those new URLs.

This log approach is a cinch to code: all you need is a static data-store model that stores the XML data for each map. Your cron job can build a map - daily or weekly - and store the raw XML page in a blob field or what have you. You can then serve the pages straight from a handler, and the index map too.
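In App Engine terms the skeleton is roughly this (untested, names made up) - the daily cron builds the XML and saves a SitemapChunk, and the handler just streams the stored blob:

    from google.appengine.ext import db, webapp
    from google.appengine.ext.webapp.util import run_wsgi_app

    class SitemapChunk(db.Model):
        # key_name is the map's file name, e.g. 'sitemap-2010-05-12.xml'
        xml = db.TextProperty(required=True)           # raw <urlset> built by the daily cron job
        created = db.DateTimeProperty(auto_now_add=True)

    class SitemapHandler(webapp.RequestHandler):
        def get(self, name):
            chunk = SitemapChunk.get_by_key_name(name)
            if chunk is None:
                self.error(404)
                return
            self.response.headers['Content-Type'] = 'application/xml'
            self.response.out.write(chunk.xml)

    application = webapp.WSGIApplication([(r'/sitemaps/(sitemap-.*\.xml)', SitemapHandler)])

    def main():
        run_wsgi_app(application)

    if __name__ == '__main__':
        main()

The index map can just be another stored chunk that lists those URLs with their dates.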

I'm not sure what others think, but this sounds like a very workable approach and a load off one's server - compared to rebuilding a huge map just because a few pages may have changed.

I have also considered that it may be possible to crunch a week's worth of maps into a weekly map, and four weeks of maps into a monthly one - so you end up with monthly maps, a map for each week in the current month, and then a map for the last 7 days. Assuming the dates are all maintained, this will reduce the number of maps and tidy up the process - I'm thinking in terms of reducing 365 daily maps for the year down to 12.

Here is a PDF on sitemaps and the approaches used by Amazon and CNN:

http://www2009.org/proceedings/pdf/p991.pdf

spidee