views: 640
answers: 4

I want to cache custom data in an ASP.NET application. I am putting lots of data into it, such as List&lt;object&gt; collections and other objects.

Is there a best practice for this? If I use static data and w3wp.exe dies or gets recycled, the cache will need to be filled again.

The database is also being updated by other applications, so a background thread would be needed to make sure the cache holds the latest data.

Update 1:

Just found this, which probably helps me:

http://www.codeproject.com/KB/web-cache/cachemanagementinaspnet.aspx?fid=229034&df=90&mpp=25&noise=3&sort=Position&view=Quick&select=2818135#xx2818135xx

Update 2:

I am using DotNetNuke as the application ( :( ). I have enabled persistent caching and now the whole application feels sluggish.

For example, a MultiView takes about 3 seconds to swap views...

Update 3:

http://stackoverflow.com/questions/115126/strategies-for-caching-on-the-web

Linked from that question, I am using the DotNetNuke caching method, which in turn uses the ASP.NET Cache object; it also has file-based caching.

I have a helper:

CachingProvider.Instance().Add( _
    label & "|" & key, _
    newObject, _
    Nothing, _
    Cache.NoAbsoluteExpiration, _
    Cache.NoSlidingExpiration, _
    CacheItemPriority.NotRemovable, _
    Nothing)

That call adds the objects to the cache; is this correct? I want to keep them cached as long as possible. I have a thread which runs every x minutes to update the cache, but I have noticed the cache is getting emptied; I check for a "CacheFilled" sentinel object in the cache.

As a test I've told the worker process not to recycle, etc., but it still seems to clear out the cache. I have also changed the DotNetNuke caching setting from "heavy" to "light", but I think that only applies to module caching.
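One way to find out *why* the items disappear is to register a removal callback on a marker item. This is a sketch against the plain ASP.NET `HttpRuntime.Cache` rather than the DotNetNuke provider; the "CacheFilled" key mirrors the sentinel described above, and the trace call is a placeholder for whatever logging is already in place:

```vbnet
Imports System.Web
Imports System.Web.Caching

Public Class CacheDiagnostics
    ' Insert a marker item and log the reason whenever ASP.NET evicts it.
    Public Shared Sub AddMarker()
        HttpRuntime.Cache.Insert("CacheFilled", True, Nothing, _
            Cache.NoAbsoluteExpiration, Cache.NoSlidingExpiration, _
            CacheItemPriority.NotRemovable, AddressOf OnRemoved)
    End Sub

    Private Shared Sub OnRemoved(ByVal key As String, ByVal value As Object, _
                                 ByVal reason As CacheItemRemovedReason)
        ' Reason will be Removed, Expired, Underused, or DependencyChanged.
        ' Underused means memory pressure; Removed usually means an explicit
        ' Remove or an Insert over the same key.
        System.Diagnostics.Trace.WriteLine( _
            "Cache item '" & key & "' removed: " & reason.ToString())
    End Sub
End Class
```

If the callback never fires at all, the whole AppDomain is being torn down (recycle, web.config touch, bin folder change) rather than individual items being evicted.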

+2  A: 

Also have a look at the Microsoft Enterprise Library Caching Application Block, which allows you to write custom expiration policies, custom backing stores, etc.: http://msdn.microsoft.com/en-us/library/cc309502.aspx

You can also check out "Velocity", which is available at http://code.msdn.microsoft.com/velocity

This will be useful if you wish to scale your application across servers...

rajesh pillai
+1  A: 

There are lots of articles about the Cache object in ASP.NET and how to make it use SqlDependency and other types of cache expiration, so there is no need to write your own. Using the Cache is recommended over Session or any of the other collections people used to cram lots of data into.
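A minimal sketch of the SQL-dependency approach mentioned above, using `SqlCacheDependency` so the cached data is invalidated when another application updates the table. The connection-string name, table, and query are illustrative, and query notifications also require a call to `SqlDependency.Start(connectionString)` at application startup (e.g. in `Application_Start`):

```vbnet
Imports System.Configuration
Imports System.Data
Imports System.Data.SqlClient
Imports System.Web
Imports System.Web.Caching

Public Class ProductCache
    ' Connection string name "Main" is an assumption.
    Private Shared ReadOnly ConnString As String = _
        ConfigurationManager.ConnectionStrings("Main").ConnectionString

    ' Cache a result set until the underlying table changes.
    Public Shared Function GetProducts() As DataTable
        Dim products As DataTable = _
            TryCast(HttpRuntime.Cache("Products"), DataTable)
        If products Is Nothing Then
            Using conn As New SqlConnection(ConnString)
                Dim cmd As New SqlCommand( _
                    "SELECT ProductId, Name FROM dbo.Products", conn)
                ' The dependency ties the cache entry to this command's
                ' result set via SQL Server query notifications.
                Dim dep As New SqlCacheDependency(cmd)
                products = New DataTable()
                conn.Open()
                products.Load(cmd.ExecuteReader())
                HttpRuntime.Cache.Insert("Products", products, dep)
            End Using
        End If
        Return products
    End Function
End Class
```

When another application writes to `dbo.Products`, the dependency fires and the entry is dropped, so the next call reloads fresh data; this removes the need for the polling thread described in the question.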

sliderhouserules
+1  A: 

Cache and Session can lead to sluggish behaviour, but sometimes they're the right solutions: the rule of right tool for right job applies.

Personally, I've often created collections in pseudo-static singletons for the kind of role you describe (typically to avoid I/O overhead, such as storing a compiled XslTransform), but it's very important to keep in mind that this kind of cache is fragile. Design for it to (a) file-watch or otherwise monitor whatever it's supposed to cache, where appropriate, and (b) recreate/populate itself on use; it should expect to get flushed frequently.

Essentially I recommend it as a performance crutch, but don't rely on it for anything requiring real persistence.
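The "fragile cache that repopulates itself on use" pattern above can be sketched roughly like this, using a compiled stylesheet as in the example; the stylesheet path is a placeholder:

```vbnet
Imports System.Web
Imports System.Xml.Xsl

Public Class TransformCache
    Private Shared ReadOnly SyncRoot As New Object()
    Private Shared _transform As XslCompiledTransform

    ' Lazily (re)load the compiled stylesheet. If the static field is
    ' wiped out by an app-domain recycle, the next caller simply
    ' rebuilds it, so losing the cache is an expected, non-fatal event.
    Public Shared ReadOnly Property Transform() As XslCompiledTransform
        Get
            If _transform Is Nothing Then
                SyncLock SyncRoot
                    If _transform Is Nothing Then
                        Dim t As New XslCompiledTransform()
                        t.Load(HttpContext.Current.Server.MapPath( _
                            "~/App_Data/report.xslt"))
                        _transform = t
                    End If
                End SyncLock
            End If
            Return _transform
        End Get
    End Property
End Class
```

The double-checked lock keeps concurrent first requests from compiling the stylesheet twice; a file watcher that sets `_transform = Nothing` on change would cover point (a) above.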

annakata
+2  A: 

You are looking for either out of process caching or a distributed caching system of some sort, based upon your requirements. I recommend distributed caching, because it is very scalable and is dedicated to caching. Someone else had recommended Velocity, which we have been evaluating and thoroughly enjoying. We have written several caching providers that we can interchange while we are evaluating different distributed caching systems without having to rebuild. This will come in handy when we are load testing the various systems as part of the final evaluation.

In the past, our legacy application cached a random assortment of items. There were DataTables, DataViews, Hashtables, Arrays, etc., and there was no logic to what was used at any given time. We have started to move to caching just our domain-object collections (the objects are POCOs). Using generic collections is nice, because we know that everything is stored the same way. It is very simple to run LINQ operations on them, and if we need a specialized "view" to be stored, the system is efficient enough that we can store a specific collection of objects.
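Querying a cached generic collection with LINQ, as described, looks roughly like this; the `Customer` type and its fields are purely illustrative:

```vbnet
Imports System.Collections.Generic
Imports System.Linq

Public Class Customer
    Public Name As String
    Public IsActive As Boolean
End Class

Public Module CustomerQueries
    ' Build a specialized "view" from a cached List(Of Customer)
    ' without another database round trip.
    Public Function ActiveByName(ByVal cached As List(Of Customer)) _
            As List(Of Customer)
        Return cached.Where(Function(c) c.IsActive) _
                     .OrderBy(Function(c) c.Name) _
                     .ToList()
    End Function
End Module
```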

We have also put an abstraction layer in place that brokers calls between the DAL and the caching model. Calls through this layer check for a cache hit or miss. If there is a hit, it returns from the cache. If there is a miss and the call should be cached, it attempts to cache the data after retrieving it. The immediate benefit of this system is that in the event of a hardware or software failure on the machines dedicated to caching, we are still able to retrieve data from the database without a true outage. Of course, the site will perform more slowly in this case.
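That broker layer amounts to the classic cache-aside pattern. A rough sketch follows; `ICacheProvider` and the loader delegate are placeholder abstractions, not the poster's actual types:

```vbnet
' Placeholder abstraction over whichever cache is plugged in
' (ASP.NET Cache, Velocity, etc.).
Public Interface ICacheProvider
    Function Fetch(ByVal key As String) As Object
    Sub Store(ByVal key As String, ByVal value As Object)
End Interface

Public Class CachedRepository(Of T)
    Private ReadOnly _cache As ICacheProvider
    Private ReadOnly _loadFromDb As Func(Of String, T)

    Public Sub New(ByVal cache As ICacheProvider, _
                   ByVal loader As Func(Of String, T))
        _cache = cache
        _loadFromDb = loader
    End Sub

    ' Cache-aside: try the cache first; on a miss or a cache outage,
    ' fall back to the database and repopulate the cache if possible.
    Public Function GetItem(ByVal key As String) As T
        Try
            Dim hit As Object = _cache.Fetch(key)
            If hit IsNot Nothing Then Return DirectCast(hit, T)
        Catch ex As Exception
            ' Cache server down: degrade to direct database access.
        End Try

        Dim item As T = _loadFromDb(key)
        Try
            _cache.Store(key, item)
        Catch ex As Exception
            ' Ignore cache write failures; the data is still served.
        End Try
        Return item
    End Function
End Class
```

Because the interface hides the concrete cache, providers can be swapped while evaluating different distributed caching systems, as the answer describes.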

Another thing to consider with distributed caching systems is that since they are out of process, multiple applications can use the same cache. There are some interesting possibilities there, involving sharing a database between applications, real-time manipulation of data, etc.

joseph.ferris