I have a Django view that gets part of its data from an external website, which I fetch and parse using urllib2/BeautifulSoup.

This operation is fairly expensive, so I cache the result with the low-level cache API for about 5 minutes. However, each user who hits the site after the cached data expires gets a delay of a few seconds while I go back to the external site to fetch and parse the new data.
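Roughly, the relevant part looks like this (a simplified sketch, not my exact code; the URL, cache key, and the parse_soup helper are placeholders):

```python
import urllib2
from BeautifulSoup import BeautifulSoup
from django.core.cache import cache

CACHE_KEY = 'external-site-data'
CACHE_TIMEOUT = 5 * 60  # roughly 5 minutes

def get_external_data():
    data = cache.get(CACHE_KEY)
    if data is None:
        # Cache miss: this request pays the few-second penalty while
        # we fetch and parse the external page.
        html = urllib2.urlopen('http://example.com/source-page').read()
        soup = BeautifulSoup(html)
        data = parse_soup(soup)  # placeholder for my actual parsing
        cache.set(CACHE_KEY, data, CACHE_TIMEOUT)
    return data
```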
Is there any way to load the new data lazily so that no user will ever get that kind of delay? Or is this unavoidable?
Please note that I am on a shared hosting server, so keep that in mind in your answers.
EDIT: thanks for the help so far. However, I'm still unsure how to accomplish this with the Python script I will be calling. A basic test I did shows that the Django cache is not global: if I access it from an external script, it does not see the cached data that the framework is using. Suggestions?
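To illustrate, a standalone script along these lines sees nothing in the cache (a sketch; 'myproject.settings' and the cache key are placeholders for my actual names):

```python
# refresh_cache.py -- run outside the web server, e.g. from cron;
# 'myproject.settings' stands in for my real settings module.
import os
os.environ.setdefault('DJANGO_SETTINGS_MODULE', 'myproject.settings')

from django.core.cache import cache

CACHE_KEY = 'external-site-data'

# With the default local-memory backend each process has its own cache,
# so this prints None even right after a page view has populated it.
print cache.get(CACHE_KEY)
```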
Another EDIT: come to think of it, this is probably because I am still using the local-memory cache backend, which is per-process. I suspect that if I move the cache to memcached, the database, or anything shared between processes, this will be solved.
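For example, something like this in settings.py should give every process (and the external script) one shared cache. This assumes Django 1.3 or later for the CACHES dict (older versions use the single CACHE_BACKEND string instead), and the table name is arbitrary:

```python
# settings.py -- swap the local-memory backend for the database cache
# so the web processes and the external script see the same data.
CACHES = {
    'default': {
        'BACKEND': 'django.core.cache.backends.db.DatabaseCache',
        'LOCATION': 'external_data_cache',
    }
}
```

After that, `python manage.py createcachetable external_data_cache` creates the table. Memcached would work the same way via its backend, but the database cache seems easier to get going on shared hosting.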