I am looking for a good explanation of why one would use sliding expiration when caching data in a Web application. It seems as if you would always want to cache content using absolute expiration so the data is forced to be refreshed. With sliding expiration you risk having your data cached indefinitely. Is it only useful for caching static data? What is a common scenario where sliding expiration is helpful? Thanks.
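For reference, the two options correspond to the two time parameters of the cache insert call. Below is a minimal sketch assuming ASP.NET's System.Web.Caching.Cache (the API this terminology comes from); the key name and the 10-minute windows are invented for illustration.

```csharp
// Sketch only, assuming ASP.NET's System.Web.Caching.Cache; the key name and
// the 10-minute windows are invented.
using System;
using System.Web;
using System.Web.Caching;

public static class ExpirationExamples
{
    public static void CacheWithAbsoluteExpiration(object report)
    {
        // Absolute: the item is dropped 10 minutes after insertion, no matter how
        // often it is read in the meantime, so it is guaranteed to be refreshed.
        HttpRuntime.Cache.Insert(
            "report", report, null,
            DateTime.UtcNow.AddMinutes(10), Cache.NoSlidingExpiration);
    }

    public static void CacheWithSlidingExpiration(object report)
    {
        // Sliding: the 10-minute idle window restarts on every access, so the item
        // lives as long as it keeps being used and is evicted once it goes idle.
        HttpRuntime.Cache.Insert(
            "report", report, null,
            Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(10));
    }
}
```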

+2  A: 

Some things might be okay to be cached indefinitely, but only if you have enough memory to keep them.
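As a sketch of what that looks like, assuming the ASP.NET Cache (the key name, the helper parameter, and the priority choice are just illustrative):

```csharp
// Sketch, assuming the ASP.NET Cache; "lookupTables" is an invented key.
using System.Web;
using System.Web.Caching;

public static class StaticDataCache
{
    public static void CacheIndefinitely(object lookupTables)
    {
        // No absolute and no sliding expiration: the item can stay cached for the
        // lifetime of the application. NotRemovable also opts it out of eviction
        // under memory pressure, so only use it if you really do have the memory.
        HttpRuntime.Cache.Insert(
            "lookupTables", lookupTables, null,
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration,
            CacheItemPriority.NotRemovable, null);
    }
}
```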

Joel Coehoorn
+1  A: 

Yes, the data could be cached indefinitely, which, as you say, only makes sense if it's static.

This is useful when you have an expensive piece of data (e.g. it takes a long time to calculate or retrieve) that is going to be useful for an unknown length of time, but where most accesses will be close together.

A fictional example: you might write a tax-time application to be used by an accounting firm that handles many corporate customers. The application needs to pull down details of each company from the tax department. It makes sense that all the actions for a single company would be done around the same time (say, generating pay slips for all of its employees), so if you download these details once and cache them with a sliding expiration, they will stick around only as long as you are still working with that company, which is an unknown length of time.
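A sketch of that pattern, assuming the ASP.NET Cache; the key format, the 30-minute window, and the fetch helper are all invented:

```csharp
// Sketch, assuming the ASP.NET Cache; key format, window length and
// FetchFromTaxDepartment() are invented for illustration.
using System;
using System.Web;
using System.Web.Caching;

public static class CompanyDetailsCache
{
    public static object GetCompanyDetails(string companyId)
    {
        string key = "company-details:" + companyId;
        object details = HttpRuntime.Cache[key];
        if (details == null)
        {
            // Expensive to fetch, cheap to reuse: a sliding window keeps the data
            // around while the firm is still working on this company and lets it
            // quietly expire once they move on.
            details = FetchFromTaxDepartment(companyId);
            HttpRuntime.Cache.Insert(
                key, details, null,
                Cache.NoAbsoluteExpiration, TimeSpan.FromMinutes(30));
        }
        return details;
    }

    // Hypothetical stand-in for the slow call to the tax department's service.
    private static object FetchFromTaxDepartment(string companyId)
    {
        return new object();
    }
}
```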

Of course, the standard caching rules still apply: if memory is low, the cached data could be cleared, and so on.

Tetraneutron
+2  A: 

If you carefully implement updating the cache, sliding expiration makes sense. As long as you keep making use of the cached data, the system will try to keep it in memory so your lookups hit the cache (though availability isn't guaranteed even before the item expires; it depends on memory pressure and a few other factors).

So if you take care to invalidate the item whenever any kind of update takes place, there's no risk of having your data cached indefinitely. Also, caching only static data doesn't make sense, as there are plenty of scenarios where you would cache large amounts of data that really isn't static. I'd say that if you can statistically expect three requests for an item between updates, cache that item and you'll gain in performance.
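A sketch of that invalidation step, assuming the ASP.NET Cache; the key name and the save helper are invented:

```csharp
// Sketch, assuming the ASP.NET Cache; "price-list" and SaveToDatabase() are invented.
using System.Web;

public static class PriceListCache
{
    private const string Key = "price-list";

    public static void UpdatePriceList(object newPriceList)
    {
        SaveToDatabase(newPriceList);

        // Remove the cached copy the moment the underlying data changes, so a
        // sliding expiration can never keep a stale version alive indefinitely.
        HttpRuntime.Cache.Remove(Key);
    }

    // Hypothetical stand-in for whatever actually persists the data.
    private static void SaveToDatabase(object newPriceList) { }
}
```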