Hello there, I use the Cache in a web service method like this:

var pblDataList = (List<blabla>)HttpContext.Current.Cache.Get("pblDataList");
if (pblDataList == null)
{
    var PBLData = dc.ExecuteQuery<blabla>(@"SELECT blabla");

    pblDataList = PBLData.ToList();

    HttpContext.Current.Cache.Add("pblDataList", pblDataList, null,
        DateTime.Now.Add(new TimeSpan(0, 0, 15)), Cache.NoSlidingExpiration,
        CacheItemPriority.Normal, null);
}

I wonder: is it thread-safe? The method is called by multiple requesters, and more than one requester may hit the second line at the same time while the cache is empty, so all of them will retrieve the data and add it to the cache. The query takes 5-8 seconds. Would a surrounding lock statement around this code prevent that? (I know multiple queries will not cause an error, but I want to be sure only one query runs.)

+2  A: 

You are correct. The retrieving and adding operations are not being treated as an atomic transaction. If you need to prevent the query from running multiple times, you'll need to use a lock.

(Normally this wouldn't be much of a problem, but in the case of a long running query it can be useful to relieve strain on the database.)
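A minimal sketch of such a lock, assuming the question's `dc` and `blabla` types; the method name and the static lock object are my own additions, not part of the question:

```csharp
// Hypothetical sketch: a static lock plus a double-check of the cache,
// so only the first requester runs the 5-8 second query.
private static readonly object CacheLock = new object();

public List<blabla> GetPblDataList()
{
    var pblDataList = (List<blabla>)HttpContext.Current.Cache.Get("pblDataList");
    if (pblDataList == null)
    {
        lock (CacheLock)
        {
            // Re-check inside the lock: another thread may have
            // populated the cache while this one was waiting.
            pblDataList = (List<blabla>)HttpContext.Current.Cache.Get("pblDataList");
            if (pblDataList == null)
            {
                pblDataList = dc.ExecuteQuery<blabla>(@"SELECT blabla").ToList();

                HttpContext.Current.Cache.Add("pblDataList", pblDataList, null,
                    DateTime.Now.Add(new TimeSpan(0, 0, 15)), Cache.NoSlidingExpiration,
                    CacheItemPriority.Normal, null);
            }
        }
    }
    return pblDataList;
}
```

The second `Get` inside the lock is the important part: without it, every thread that was queued on the lock would still run the query in turn after the first one finished.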

Greg
This really depends. In a high-throughput environment, the additional load on the database might not be as big a deal as the request-throttling that results from this kind of locking. In *most* cases I'd say that you're right, but each case has to be evaluated individually.
Aaronaught
Good point. I slightly rephrased my answer because of this.
Greg
A: 

I believe the Add should be thread-safe - i.e. it won't error if Add gets called twice with the same key, but obviously the query might execute twice.

Another question, however, is whether the data is thread-safe. There is no guarantee that each List<blabla> is isolated - that depends on the cache provider. The in-memory cache provider stores the objects directly, so there is a risk of collisions if any thread edits the data (adds/removes/swaps items in the list, or changes properties of one of the items). With a serializing provider, however, you should be fine. Of course, that then demands that blabla is serializable...
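One way to guard against shared-list mutation with the in-memory provider (my own sketch, not from the answer) is to hand each caller its own copy of the cached list:

```csharp
// Hypothetical sketch: return a shallow copy so callers cannot mutate
// the List<blabla> instance stored in the in-memory cache. The items
// themselves are still shared, so a deep copy (e.g. via serialization)
// would be needed if callers also change item properties.
var cached = (List<blabla>)HttpContext.Current.Cache.Get("pblDataList");
return cached == null ? null : new List<blabla>(cached);
```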

Marc Gravell
+1  A: 

The cache object is thread-safe, but HttpContext.Current will not be available from background threads. This may or may not apply here; it's not obvious from your code snippet whether you are actually using background threads, but in case you are now, or decide to at some point in the future, you should keep this in mind.

If there's any chance that you'll need to access the cache from a background thread, then use HttpRuntime.Cache instead.
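For illustration (a sketch; only the cache key is taken from the question), the same lookup via HttpRuntime.Cache:

```csharp
// HttpRuntime.Cache refers to the same underlying cache as
// HttpContext.Current.Cache, but is reachable from threads that have
// no current HttpContext (e.g. timers or thread-pool work items).
var pblDataList = (List<blabla>)HttpRuntime.Cache.Get("pblDataList");
```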

In addition, although individual operations on the cache are thread-safe, sequential lookup/store operations are obviously not atomic. Whether or not you need them to be atomic depends on your particular application. If it could be a serious problem for the same query to run multiple times, i.e. if it would produce more load than your database is able to handle, or if it would be a problem for a request to return data that is immediately overwritten in the cache, then you would likely want to place a lock around the entire block of code.

However, in most cases you would really want to profile first and see whether or not this is actually a problem. Most web applications/services don't concern themselves with this aspect of caching because they are stateless and it doesn't matter if the cache gets overwritten.

Aaronaught
I don't use background threads; it's just web service calls. Thanks, by the way.
Burak SARICA