tags:
views: 17
answers: 1

Consider the following data model:

Suppose I have a table called "SuperAwesomeData" where each record maps to an instance of an object called "SuperAwesomeData", which is retrieved using the primary key of the "SuperAwesomeData" table. My question is: what caching strategy would work best for managing individual records? I still need to be able to request a "SuperAwesomeData" record via its primary key.

+2  A: 

Well, give SQL enough memory and you'll likely find it's caching stuff for you anyway. Other than that, a basic caching idea will work for you - create a caching entity for your table (or business object, preferably) and simply use something like a dictionary to provide key-value associations.
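A minimal sketch of that idea, keyed by primary key. `load_from_db` is a hypothetical stand-in for whatever data-access call actually fetches a `SuperAwesomeData` record:

```python
class SuperAwesomeDataCache:
    """Dictionary-backed cache for records looked up by primary key."""

    def __init__(self, loader):
        self._loader = loader  # function: primary key -> record
        self._cache = {}       # primary key -> cached record

    def get(self, pk):
        # Serve from the dictionary when possible; otherwise hit the loader
        # (i.e. the database) and remember the result.
        if pk not in self._cache:
            self._cache[pk] = self._loader(pk)
        return self._cache[pk]

    def invalidate(self, pk):
        # Drop a single record so the next get() reloads it.
        self._cache.pop(pk, None)


# Example usage with a fake loader that records how often it is called.
calls = []

def load_from_db(pk):
    calls.append(pk)
    return {"id": pk}

cache = SuperAwesomeDataCache(load_from_db)
cache.get(1)
cache.get(1)  # second call is served from the dictionary, not the loader
```

The point of the wrapper is that callers keep the same "give me a record by primary key" interface the question asks for, while the dictionary quietly absorbs repeat lookups.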

Then all you need to do is work in some cache invalidation or lifespan and you're sorted. The caching layer usually hovers around the business layer, as the business logic can decide if what's in memory is suitable for you, or stale.
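The lifespan idea can be sketched by stamping each cached entry with the time it was stored and reloading anything older than a time-to-live. The 30-second default here is an arbitrary illustrative value, and the injectable `clock` is just a convenience for testing:

```python
import time

class TtlCache:
    """Cache whose entries expire after a fixed lifespan (TTL)."""

    def __init__(self, loader, ttl_seconds=30.0, clock=time.monotonic):
        self._loader = loader
        self._ttl = ttl_seconds
        self._clock = clock
        self._cache = {}  # primary key -> (stored_at, record)

    def get(self, pk):
        now = self._clock()
        entry = self._cache.get(pk)
        # Reload when the entry is missing or has outlived its TTL;
        # this is the "is what's in memory stale?" decision.
        if entry is None or now - entry[0] > self._ttl:
            self._cache[pk] = (now, self._loader(pk))
        return self._cache[pk][1]


# Example usage with a controllable fake clock.
fake_now = [0.0]
loads = []

def loader(pk):
    loads.append(pk)
    return {"id": pk}

ttl_cache = TtlCache(loader, ttl_seconds=10.0, clock=lambda: fake_now[0])
ttl_cache.get(1)      # first access loads from the "database"
fake_now[0] = 5.0
ttl_cache.get(1)      # within TTL: served from cache
fake_now[0] = 20.0
ttl_cache.get(1)      # past TTL: reloaded
```

In a real business layer the staleness rule might be richer than a flat TTL, but the shape is the same: the cache owns the decision of when memory is good enough and when to go back to the database.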

Don't re-invent anything, there are lots of caching solutions around that provide caching infrastructure: ASP.NET Cache, memcached, AppFabric...

Caching is a little gem when it comes to improving performance, because all it consumes is memory - which is getting cheaper all the time. However, like anything performance-related, don't assume you need it until you actually do - i.e., database accesses are slow, the network is slow, you have millions of users accessing the same data, etc.

Profile your code first!

Adam
I stand corrected. :) I have encountered situations in the past where a client application wasn't using caching and was executing 20,000+ queries (each of which was served more or less instantly), at which point the network became the overriding factor in the extraordinary slowness of that particular app.
Will A
Totally agree with the performance optimization point - don't do it unless you see a problem.
niaher