views:

31

answers:

1

Hi Guys,

I have an ASP.NET MVC 2 Web Application (.NET 4, C#) where users can search for locations.

The page is implemented with an auto-complete box, similar to many websites (Google, YouTube, etc.).

Now, each AJAX call to the server results in a stored procedure call to the database (whilst efficient, this could result in a lot of round-trips for slow typers).

I'm wondering how I can create a strategy to cache the results of, say, the last 100 searches.

I can't use OutputCache, as the call is made via AJAX on the client side. I need to cache the output of the stored procedure (the list of matched locations for a given query text).

In other words, a lot of people will search for "New York", or "San Francisco", and this data only changes via a manual admin change (where we could invalidate the cache manually, for example).

So, how could I cache the last 100 searches? I was hoping for FIFO-like functionality, where if the cache already had 100 searches, the oldest one gets thrown away and everything gets moved down.

I want the code to be something like this:

public ICollection<MatchedLocation> FindLocations(string queryText)
{
    // Check the last 100 searches.. How?
    string cacheKey = queryText;
    var matchedLocations = cachedPersistence.Get(cacheKey);

    if (matchedLocations == null)
    {
        // Not in the cache: call the db
        matchedLocations = dbPersistence.GetLocations(queryText);

        // Add to cache
        cachedPersistence.Add(cacheKey, matchedLocations);
    }
    else
    {
        // Found in Cache! Awesome!
    }

    return matchedLocations;
}

I'm thinking the obvious choice would be a .NET Queue?

But I've never used one before, so any advice? How would I implement concurrency for the get/set? Would I need to use a fully-locked singleton? Has anyone used a Queue for this purpose? What other options do we have? I would almost need a custom queue to limit the number of items it holds.
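Something like this rough sketch is what I have in mind (just an illustration, assuming a plain lock around a Queue that tracks insertion order plus a Dictionary for lookups; the class and method names are made up):

using System.Collections.Generic;

// Bounded FIFO cache sketch: once it holds 'capacity' entries,
// adding a new key evicts the oldest one.
public class RecentSearchCache<TValue>
{
    private readonly int _capacity;
    private readonly object _sync = new object();
    private readonly Queue<string> _order;                    // insertion order, for FIFO eviction
    private readonly Dictionary<string, TValue> _items;       // key -> cached value

    public RecentSearchCache(int capacity)
    {
        _capacity = capacity;
        _order = new Queue<string>(capacity);
        _items = new Dictionary<string, TValue>(capacity);
    }

    public bool TryGet(string key, out TValue value)
    {
        lock (_sync)
        {
            return _items.TryGetValue(key, out value);
        }
    }

    public void Add(string key, TValue value)
    {
        lock (_sync)
        {
            if (_items.ContainsKey(key))
            {
                _items[key] = value;          // already cached: just refresh the value
                return;
            }

            if (_items.Count >= _capacity)
            {
                // FIFO eviction: throw away the oldest search
                string oldest = _order.Dequeue();
                _items.Remove(oldest);
            }

            _order.Enqueue(key);
            _items[key] = value;
        }
    }

    public void Clear()
    {
        lock (_sync)
        {
            _order.Clear();
            _items.Clear();
        }
    }
}

FindLocations would then do a TryGet first, fall back to dbPersistence.GetLocations(queryText) on a miss, and Add the result, so it would slot in where cachedPersistence sits in the pseudocode above. Clear() would cover the manual invalidation after an admin change.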

Thanks for your help.

A: 

If you only want to cache 100 items, you could use a Dictionary with a corresponding Dictionary of last-used times. And you can use a reader-writer lock instead of a full-blown lock, which allows multiple readers.

With the code below, two threads could potentially enter EnterWriteLock for the same value. The penalty would be two DB calls, which might not be an issue. You can avoid this by doing another TryGetValue inside the write lock (double-checked locking) if necessary; a sketch of that follows the code.

using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;

class Cache
{
    static readonly Dictionary<string, ICollection<MatchedLocation>> _cache = new Dictionary<string, ICollection<MatchedLocation>>(100);
    static readonly Dictionary<string, DateTime> _cacheTimes = new Dictionary<string, DateTime>(100);
    static readonly ReaderWriterLockSlim _lock = new ReaderWriterLockSlim();

    public ICollection<MatchedLocation> FindLocations(string queryText)
    {
        _lock.EnterUpgradeableReadLock();
        try
        {
            ICollection<MatchedLocation> result;
            if (_cache.TryGetValue(queryText, out result))
            {
                return result;
            }
            else
            {
                _lock.EnterWriteLock();
                try
                {
                    // evict the oldest cache entry once we reach the 100-item limit
                    if (_cache.Count >= 100)
                    {
                        // could be more efficient http://code.google.com/p/morelinq/ - MinBy
                        string key = _cacheTimes.OrderBy(item => item.Value).First().Key;
                        _cacheTimes.Remove(key);
                        _cache.Remove(key);
                    }
                    // add new item (dbPersistence is your data access class from the question)
                    result = dbPersistence.GetLocations(queryText);
                    _cache[queryText] = result;
                    _cacheTimes[queryText] = DateTime.UtcNow;
                }
                finally
                {
                    _lock.ExitWriteLock();
                }
                return result;
            }
        }
        finally
        {
            _lock.ExitUpgradeableReadLock();
        }
    }
}
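A minimal sketch of that double check, assuming the same fields as the class above: the write-lock section looks the key up again before touching the database, so a result written by a concurrent thread is reused.

_lock.EnterWriteLock();
try
{
    // Double check: another thread may have populated this key
    // between our TryGetValue above and acquiring the write lock.
    if (!_cache.TryGetValue(queryText, out result))
    {
        if (_cache.Count >= 100)
        {
            string key = _cacheTimes.OrderBy(item => item.Value).First().Key;
            _cacheTimes.Remove(key);
            _cache.Remove(key);
        }

        result = dbPersistence.GetLocations(queryText);
        _cache[queryText] = result;
        _cacheTimes[queryText] = DateTime.UtcNow;
    }
}
finally
{
    _lock.ExitWriteLock();
}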
Mikael Svenson
Interesting approach. I'll try this out, run some tests and get back to you. Thanks
RPM1984
You might want to add a time-based expiration job in order to get updated DB data once in a while. Another approach is to use the Cache object in ASP.NET, keep a counter and use expiry triggers. I think it all depends on how much control you want over the cache, and how much data you need to cache to make the solution fly. Interesting read at http://social.msdn.microsoft.com/Forums/en-US/velocity/thread/2ea7a5bc-987b-4395-ab4f-f03890a3c0a0 which also talks about Velocity.
Mikael Svenson
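For what it's worth, the ASP.NET Cache approach mentioned in the comment above could look roughly like this (a sketch only; the 10-minute absolute expiration, the key prefix, and the class name are illustrative choices, not anything from the answer):

using System;
using System.Collections.Generic;
using System.Web;

public static class LocationCacheViaAspNet
{
    // Stores the result under a prefixed key and lets ASP.NET expire it after 10 minutes.
    public static void Store(string queryText, ICollection<MatchedLocation> locations)
    {
        HttpRuntime.Cache.Insert(
            "locations:" + queryText,
            locations,
            null,                                               // no cache dependency
            DateTime.UtcNow.AddMinutes(10),                     // absolute expiration
            System.Web.Caching.Cache.NoSlidingExpiration);      // no sliding expiration
    }

    public static ICollection<MatchedLocation> Lookup(string queryText)
    {
        return HttpRuntime.Cache["locations:" + queryText] as ICollection<MatchedLocation>;
    }
}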