I implemented a simple sitemap class using Django's default sitemaps app. Since generating it was taking a long time, I added manual caching:

from django.contrib.sitemaps import Sitemap

# get_cache/set_cache and CACHE_SITEMAP_SHORT_REVIEWS are this project's
# memcached-backed caching helpers
class ShortReviewsSitemap(Sitemap):
    changefreq = "hourly"
    priority = 0.7

    def items(self):
        # try to retrieve from cache
        result = get_cache(CACHE_SITEMAP_SHORT_REVIEWS, "sitemap_short_reviews")
        if result is not None:
            return result

        result = ShortReview.objects.all().order_by("-created_at")

        # store in cache
        set_cache(CACHE_SITEMAP_SHORT_REVIEWS, "sitemap_short_reviews", result)

        return result

    def lastmod(self, obj):
        return obj.updated_at

The problem is that memcached allows objects of at most 1MB. This one was bigger than 1MB, so storing it in the cache failed:

>7 SERVER_ERROR object too large for cache

The problem is that Django has an automated way of deciding when it should divide the sitemap file into smaller ones. According to the docs (http://docs.djangoproject.com/en/dev/ref/contrib/sitemaps/):

You should create an index file if one of your sitemaps has more than 50,000 URLs. In this case, Django will automatically paginate the sitemap, and the index will reflect that.

What do you think would be the best way to enable caching of sitemaps?

  • Hacking into the Django sitemaps framework to restrict a single sitemap's size to, say, 10,000 records seems like the best idea. Why was 50,000 chosen in the first place? Google's advice? A random number?
  • Or maybe there is a way to let memcached store bigger objects?
  • Or perhaps, once saved, the sitemaps should be made available as static files? That would mean that instead of caching with memcached, I'd have to manually store the results in the filesystem and retrieve them from there the next time the sitemap is requested (perhaps cleaning the directory daily in a cron job).

All of those seem very low-level, and I'm wondering if an obvious solution exists...

+4  A: 

The 50k is not a hardcoded parameter. :)

You can use this class instead of django.contrib.sitemaps.GenericSitemap:

from django.contrib.sitemaps import GenericSitemap

class LimitGenericSitemap(GenericSitemap):
    limit = 2000
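
For context, here is a minimal sketch of how such a class could be wired up with the sitemap index view, so each section stays well under the memcached limit. The ShortReview model and URL layout are assumptions taken from the question, not Oduvan's actual code, and the urls.py style assumes a recent Django:

# urls.py (sketch)
from django.contrib.sitemaps import GenericSitemap
from django.contrib.sitemaps.views import index, sitemap
from django.urls import path

from myapp.models import ShortReview  # hypothetical app/model from the question

class LimitGenericSitemap(GenericSitemap):
    limit = 2000  # paginate every 2,000 URLs instead of the default 50,000

sitemaps = {
    "shortreviews": LimitGenericSitemap(
        {"queryset": ShortReview.objects.all(), "date_field": "updated_at"},
        priority=0.7,
    ),
}

urlpatterns = [
    # the index lists one entry per section page
    path("sitemap.xml", index, {"sitemaps": sitemaps}),
    # the index view looks this URL up by name to build the per-section links
    path("sitemap-<section>.xml", sitemap, {"sitemaps": sitemaps},
         name="django.contrib.sitemaps.views.sitemap"),
]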
Oduvan
This was phenomenally helpful. For a working version of this, see my code here: http://bitbucket.org/mlissner/legal-current-awareness/src/dc66d2268bec/alert/alertSystem/sitemap.py
mlissner
+1  A: 

Assuming you don't need all those pages in your sitemap, reducing the limit to get the file size down will work fine, as described in the previous answer.

If you do want a very large sitemap and do want to use memcached, you could split the content up into multiple chunks, store them under individual keys, and then put them back together again on output. To make this more efficient, memcached supports the ability to get multiple keys at the same time, although I'm not sure whether the Django client supports this capability yet.
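
To illustrate the chunking idea, here is a minimal sketch assuming Django's cache framework in front of memcached; cache.get_many is part of Django's cache API, while the helper names and chunk size are my own:

import pickle

from django.core.cache import cache

# stay safely under memcached's ~1MB item limit (the key and item
# overhead count against it too)
CHUNK_SIZE = 1000 * 1000

def cache_set_large(key, value, timeout=3600):
    # pickle the value and split it into sub-1MB chunks
    blob = pickle.dumps(value, pickle.HIGHEST_PROTOCOL)
    chunks = [blob[i:i + CHUNK_SIZE] for i in range(0, len(blob), CHUNK_SIZE)]
    cache.set(key, len(chunks), timeout)  # the chunk count lives at the base key
    for i, chunk in enumerate(chunks):
        cache.set("%s:%d" % (key, i), chunk, timeout)

def cache_get_large(key):
    count = cache.get(key)
    if count is None:
        return None
    keys = ["%s:%d" % (key, i) for i in range(count)]
    parts = cache.get_many(keys)  # one round trip for all chunks
    if len(parts) != count:
        return None  # a chunk was evicted, so treat the whole entry as a miss
    return pickle.loads(b"".join(parts[k] for k in keys))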

For reference, the 1MB limit is a feature of memcached and has to do with how it stores data: http://code.google.com/p/memcached/wiki/FAQ#What_is_the_maximum_data_size_you_can_store?_(1_megabyte)

Garethr
A: 

I have about 200,000 pages on my site, so I had to have the index no matter what. I ended up doing the above hack, limiting the sitemap to 250 links, and also implementing a file-based cache.

The basic algorithm is this:

  • Try to load the sitemap from a file on disk.
  • If that fails, generate the sitemap in memory.
  • If the generated sitemap is full (it contains the 250 links set above), save it to disk, then return it.

The end result is that the first time a sitemap is requested, it's generated and, if it's complete, saved to disk. The next time it's requested, it's simply served from disk. Since my content never changes, this works very well. If I do want to change a sitemap, though, it's as simple as deleting the file(s) from disk and waiting for the crawlers to come regenerate things.
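
For illustration, here is a rough sketch of that approach at the view level. It assumes a recent Django, where the stock sitemap view returns a lazily rendered TemplateResponse and paginates via the documented "p" query parameter; the cache directory and wrapper are mine, and the only-save-if-complete check from the algorithm above is omitted for brevity:

import os

from django.contrib.sitemaps import views as sitemap_views
from django.http import HttpResponse

CACHE_DIR = "/var/tmp/sitemap-cache"  # illustrative location

def cached_sitemap(request, sitemaps, section=None):
    page = request.GET.get("p", "1")
    path = os.path.join(CACHE_DIR, "sitemap-%s-p%s.xml" % (section or "all", page))

    # 1. try to serve previously rendered XML straight from disk
    if os.path.exists(path):
        with open(path, "rb") as f:
            return HttpResponse(f.read(), content_type="application/xml")

    # 2. on a miss, fall back to the stock sitemap view and render it
    response = sitemap_views.sitemap(request, sitemaps, section=section)
    response.render()

    # 3. persist the result; deleting the file later forces regeneration
    os.makedirs(CACHE_DIR, exist_ok=True)
    with open(path, "wb") as f:
        f.write(response.content)
    return response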

The code for the whole thing is here, if you're interested: http://bitbucket.org/mlissner/legal-current-awareness/src/tip/alert/alertSystem/sitemap.py

Maybe this will be a good solution for you too.

mlissner