We have a website running .NET 2.0 and have started using the ASP.NET HttpRuntime.Cache to store the results of frequent data lookups and cut down on database access.

Snippet:

 
lock (locker)
{
    // Check and populate the cache under the lock so only one thread
    // runs GetSomeDataToCache() for a given key.
    if (HttpRuntime.Cache[cacheKey] == null)
    {
        HttpRuntime.Cache.Insert(cacheKey, GetSomeDataToCache(), null, DateTime.Today.AddDays(1), Cache.NoSlidingExpiration);
    }
    return ((SomeData)HttpRuntime.Cache[cacheKey]).Copy();
}

We are pessimistically locking every time we want to look at the cache. However, I've seen various blogs around the web suggesting you check the cache value first and only lock if it's missing, to avoid the overhead of taking the lock on every read. That doesn't seem right to me, since another thread may have written to the cache after the check.

So finally, my question is: what is the "right" way to do this? Are we even using the right synchronization primitive? I am aware of ReaderWriterLockSlim, but we're running .NET 2.0.
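For reference, the pattern those blogs usually describe is check-lock-check: only take the lock when the first lookup misses, then re-check inside the lock so a concurrent insert isn't repeated. A minimal sketch, reusing the hypothetical locker, cacheKey, GetSomeDataToCache() and SomeData.Copy() names from the snippet above:

// Assumes: using System.Web; using System.Web.Caching;
SomeData cached = (SomeData)HttpRuntime.Cache[cacheKey];
if (cached == null)
{
    lock (locker)
    {
        // Re-check inside the lock: another thread may have inserted the
        // item between the first lookup and acquiring the lock.
        cached = (SomeData)HttpRuntime.Cache[cacheKey];
        if (cached == null)
        {
            cached = GetSomeDataToCache();
            HttpRuntime.Cache.Insert(cacheKey, cached, null,
                DateTime.Today.AddDays(1), Cache.NoSlidingExpiration);
        }
    }
}
return cached.Copy();

With this shape the lock is only taken on a cache miss, and keeping the value in a local variable avoids the second indexer lookup on the hot path.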

+2  A: 

As far as I know, the Cache object is thread-safe, so you wouldn't need the lock.

Darin Dimitrov
A: 

The Cache object in .NET is thread-safe, so locking is not necessary. Reference: http://msdn.microsoft.com/en-us/library/system.web.caching.cache.aspx.

mc2thaH
A: 

Your code is probably making you think the item will stay cached for a day and that your last line will always hand the data back, but that's not the case: an item can be removed from the cache at any time (for example when ASP.NET scavenges memory), so the second lookup can return null and the final .Copy() call would throw. As others said, the cache operations themselves are synchronized, so you shouldn't need to lock at that point.

Take a look here for the proper way of doing it.
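The linked example is not preserved here, so as an illustration only: the point above is commonly addressed by reading the cache once into a local variable, recomputing if the item has been evicted, and never touching the indexer a second time. Names and Insert arguments follow the question's snippet; note that while individual Cache calls are thread-safe, check-then-insert is not atomic, so two threads may occasionally both compute the value and the second Insert simply overwrites the first.

// Read once into a local so the value cannot disappear between the
// check and the return; fall back to recomputing on a miss or eviction.
SomeData cached = HttpRuntime.Cache[cacheKey] as SomeData;
if (cached == null)
{
    cached = GetSomeDataToCache();
    HttpRuntime.Cache.Insert(cacheKey, cached, null,
        DateTime.Today.AddDays(1), Cache.NoSlidingExpiration);
}
return cached.Copy();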