views: 485 · answers: 5

Hi!

I'm having some trouble getting my cache to work the way I want.

The problem: Retrieving the requested data is very time consuming. With standard ASP.NET caching, some users will take the "hit" of retrieving the data whenever the cached item expires. That is not acceptable.

The solution?: It is not critical that the data is 100% current. I would like to serve the old, invalidated data while updating the cached data on another thread, making the new data available for future requests. I reckon the data also needs to be persisted in some way, so that the first user after an application restart can be served without taking the "hit".

I've built a solution that does roughly the above, but I'm wondering if there is a "best practice" way to do this, or if there is a caching framework out there that already supports this behaviour?

+1  A: 

You could listen for when the cached item is removed and do the processing then:

public void RemovedCallback(String key, Object value, CacheItemRemovedReason reason)
{
    // Put the item back in the cache (so others can use it until you have finished grabbing the new data)

    // Spawn a thread to go get up-to-date data

    // Overwrite the old data with the new result...
}

In Global.asax:

protected void Application_Start(object sender, EventArgs e)
{
     // Spawn worker thread to pre-load critical data
}
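For reference, here is a minimal sketch of how the callback might be wired up, assuming HttpRuntime.Cache and a hypothetical LoadData() standing in for the expensive retrieval (this is just one way to put the pieces together, not code from the answer):

using System;
using System.Threading;
using System.Web;
using System.Web.Caching;

public static class DataCache
{
    private const string Key = "expensive-data";

    // Hypothetical stand-in for the slow retrieval described in the question.
    private static object LoadData()
    {
        Thread.Sleep(5000); // simulate the expensive call
        return DateTime.UtcNow;
    }

    public static void Insert(object value)
    {
        HttpRuntime.Cache.Insert(
            Key,
            value,
            null,                          // no dependencies
            DateTime.UtcNow.AddMinutes(5), // absolute expiration
            Cache.NoSlidingExpiration,
            CacheItemPriority.Default,
            RemovedCallback);              // fires when the item is removed/expires
    }

    private static void RemovedCallback(string key, object value, CacheItemRemovedReason reason)
    {
        // Put the stale item straight back so current requests keep being served...
        Insert(value);

        // ...and refresh it on a background thread for future requests.
        ThreadPool.QueueUserWorkItem(_ => Insert(LoadData()));
    }
}

Application_Start could then call DataCache.Insert(LoadData()) once (or on a spawned thread) to pre-warm the cache so the first real visitor never takes the hit.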

Ohh... I have no idea if this is best practice, I just thought it would be slick. Good luck!

BigBlondeViking
Great idea, I didn't think of this. But with this solution the first user visiting the site will still take the "hit" of updating the cache. Also, since I'm using inline delegates for fetching the data, I don't know how to re-run the delegate inside the RemovedCallback.
hakksor
Sorry, I forgot to address that part. In your Global.asax, on Application_Start, spawn a thread that makes "fake" requests for the needed data. This way it's either already being worked on or has already finished by the fist request.
BigBlondeViking
haha fist request... ( I am a horrible typer )
BigBlondeViking
+3  A: 

There are tools that do this, for example Microsoft's ISA Server (which may be a bit expensive / overkill).

You can cache it in memory using Enterprise Library Caching. Let your users read from the cache, and have other pages that update the cache; those pages should be called as regularly as you need to keep the data up to date.
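A rough sketch of that split using the Caching Application Block (the "report" key and the default cache manager configured in web.config are assumptions):

using Microsoft.Practices.EnterpriseLibrary.Caching;

public class ReportCache
{
    private readonly ICacheManager _cache = CacheFactory.GetCacheManager();

    // Called by the pages your users hit: read-only, never triggers the slow load.
    public object GetReport()
    {
        return _cache.GetData("report"); // null until the refresh page has run
    }

    // Called by the separate "refresh" page / scheduled request.
    public void RefreshReport(object freshData)
    {
        _cache.Add("report", freshData);
    }
}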

Shiraz Bhaiji
EL caching is pretty frictionless if your requirements aren't too complicated.
borisCallens
Sounds like a good idea; however, I wanted to do this without relying on a service/scheduled task.
hakksor
You can do this without the service/scheduled task, but then the first user asking for the content will take the "hit"
Shiraz Bhaiji
+1 for the Enterprise Library Caching recommendation. Had never heard of that.
Rafe Lavelle
A: 

Yeah, you could just cache the most frequently accessed data when your app starts, but that still means the first user to trigger that would "take the hit", as you say (assuming an in-proc cache, of course).

A: 

What I do in this situation is use a CacheTable in the database to cache the latest data, and run a background job (with a Windows service; in a shared environment you can also use threads) that refreshes the data in the table.

There is only a very small possibility of showing the user a blank screen. I eliminate this by also caching via the ASP.NET cache for 1 minute.

I don't know if it's bad design, but it's working great, without a problem, on a heavily used web site.
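Roughly, the read path might look like the sketch below; the CacheTable schema, the "report" key, and the connection handling are invented for illustration:

using System;
using System.Data.SqlClient;
using System.Web;
using System.Web.Caching;

public static class CachedReport
{
    public static string Get(string connectionString)
    {
        // First level: ASP.NET cache, held for one minute.
        var cached = HttpRuntime.Cache["report"] as string;
        if (cached != null)
            return cached;

        // Second level: the CacheTable that the background job keeps fresh.
        string data;
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "SELECT Payload FROM CacheTable WHERE CacheKey = 'report'", conn))
        {
            conn.Open();
            data = (string)cmd.ExecuteScalar();
        }

        HttpRuntime.Cache.Insert(
            "report", data, null,
            DateTime.UtcNow.AddMinutes(1),   // the 1-minute layer mentioned above
            Cache.NoSlidingExpiration);

        return data;
    }
}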

ercu
+1  A: 

I created my own solution with a Dictionary/Hashtable in memory as a duplicate of the actual cache. When a call came in requesting an object from the cache and it wasn't there but was present in the in-memory copy, the stale in-memory object was returned and a new thread was fired to update the object in both memory and the cache, using a delegate method to fetch the data.
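Something along these lines, presumably; this sketch swaps in a ConcurrentDictionary for the plain Dictionary/Hashtable for thread safety, and the fetch delegate is whatever retrieval logic the caller supplies:

using System;
using System.Collections.Concurrent;
using System.Threading;
using System.Web;
using System.Web.Caching;

public static class StaleWhileRevalidateCache
{
    // Duplicate of the cache that never expires, so stale data is always available.
    private static readonly ConcurrentDictionary<string, object> Backup =
        new ConcurrentDictionary<string, object>();

    public static object Get(string key, Func<object> fetch, TimeSpan ttl)
    {
        var fresh = HttpRuntime.Cache[key];
        if (fresh != null)
            return fresh;

        object stale;
        if (Backup.TryGetValue(key, out stale))
        {
            // Serve the stale copy immediately and refresh in the background.
            ThreadPool.QueueUserWorkItem(_ => Store(key, fetch(), ttl));
            return stale;
        }

        // Nothing cached yet: this caller takes the hit once.
        var value = fetch();
        Store(key, value, ttl);
        return value;
    }

    private static void Store(string key, object value, TimeSpan ttl)
    {
        Backup[key] = value;
        HttpRuntime.Cache.Insert(key, value, null,
            DateTime.UtcNow.Add(ttl), Cache.NoSlidingExpiration);
    }
}

Persisting the backup dictionary to disk or a database on each write would cover the restart case mentioned in the question; that part is left out of the sketch.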

hakksor