views: 35 | answers: 2
I have a Dictionary<int, string> cached (for 20 minutes) that has ~120 ID/Name pairs for a reference table. I iterate over this collection when populating dropdown lists, and I'm pretty sure this is faster than querying the DB for the full list each time.
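For context, the dropdown population is roughly this (a sketch assuming a Web Forms DropDownList; GetEmployeeTypes is just an illustrative name for the helper that returns the cached dictionary):

    // Sketch: iterate the cached Dictionary<int, string> to fill a dropdown.
    // Uses System.Collections.Generic and System.Web.UI.WebControls.
    Dictionary<int, string> employeeTypes = GetEmployeeTypes();  // returns the cached lookup

    ddlEmployeeType.Items.Clear();
    foreach (KeyValuePair<int, string> kvp in employeeTypes)
    {
        // ListItem(text, value): display the name, post back the ID.
        ddlEmployeeType.Items.Add(new ListItem(kvp.Value, kvp.Key.ToString()));
    }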

My question is more about whether it makes sense to use this cached dictionary when displaying records that have a foreign key into this reference table.

Say this cached reference table is an EmployeeType table. If I were to query and display a list of employee names and types, should I query for EmployeeName and EmployeeTypeID and use my cached dictionary to look up each EmployeeTypeID's name as the record is displayed, or is it faster to just have the DB return EmployeeName and JOIN to get the EmployeeType string, bypassing the cached dictionary altogether?
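To make the comparison concrete, the two options look roughly like this (the query text and the QueryEmployees/Display helpers are just illustrative):

    // Option 1: select only the foreign key, then resolve the name from the cached dictionary.
    Dictionary<int, string> employeeTypes = GetEmployeeTypes();  // cached lookup
    foreach (var row in QueryEmployees("SELECT EmployeeName, EmployeeTypeID FROM Employee"))
    {
        string typeName = employeeTypes[row.EmployeeTypeID];  // in-memory lookup per record
        Display(row.EmployeeName, typeName);
    }

    // Option 2: let the database resolve the name and skip the dictionary entirely, e.g.
    //   SELECT e.EmployeeName, t.Name AS EmployeeType
    //   FROM Employee e
    //   JOIN EmployeeType t ON t.EmployeeTypeID = e.EmployeeTypeID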

I know both will work, but I'm interested in which will perform faster. Thanks for any help.

+2  A: 

I know you won't like the answer, but common sense dictates that you do the easiest thing first and, if it's too slow, then remedy it.

I'll explain myself. As a matter of fact, if you cache it, it'll probably be faster, since you won't be hitting the database every time you load the page. But the gain might not be noticeable for what you're doing (i.e. you might have some other bottleneck that makes that gain insignificant), defeating the purpose of caching in the first place.

The only way, again, is to do the easiest thing (no caching), and only if you're not happy should you then go the extra bit.

JohnIdol
+2  A: 

Optimization 101 says don't do it unless you need to: http://stackoverflow.com/questions/2473666/tips-for-optimizing-c-net-programs/2473875#2473875

But yes, if this really is a totally static lookup for the lifetime of the application AND it takes up very little RAM, then caching it would seem fairly harmless, and a Dictionary lookup from RAM will be faster than a trip to the database.

As for the second part, you might as well let the database do the join; it probably has that table in RAM already, and the increase in network payload would be small.

But again, if you don't need to do it, don't do it! The danger here is that you do this one, then another, then another; the code grows ever more complex, and RAM fills up with things you think you might need but which are in fact used rarely, leaving less space for the OS/ORM/DB to do its work. Let the compiler, ORM and database decide what to keep in memory instead - they have a much bigger team focused on optimization!

Hightechrider
Yeah, I think the JOIN is probably better. I disagree that RAM filling up is a concern, though. You can add as much to the cache as you want; it will never affect performance, because ASP.NET won't cache an item unless the server has the memory to do it without adverse effects, and it will remove cached items on its own to keep performance up if necessary. I don't totally agree with Optimization 101 either. I think there's a balance: maybe if the optimization were complex, but not for something as trivial as caching. Plus, is it really wise to wait until there are problems to come up with a good design?
Sorry, I missed the point that you were caching this in the ASP.NET cache; often people put entirely static lookups like this in static variables for the lifetime of the app. Would you want this lookup to drop out of memory if the server is idle? Caching is 'trivial' only for totally static data; otherwise it's often the cause of numerous bugs.
Hightechrider
Yes, I would want it to drop if the server is idle, and it will (after 20 minutes). When the program accesses this data, it checks whether the value is in the cache; if so, it returns that, and if not, it runs the query to get the data and adds it to the cache again.
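Roughly, that access pattern is the usual cache-aside check; a sketch (the cache key and the LoadEmployeeTypesFromDb helper are just illustrative, and this assumes the ASP.NET cache via HttpRuntime.Cache in System.Web):

    // Cache-aside: return the cached dictionary if present, otherwise reload it
    // from the database and put it back in the cache for 20 minutes.
    private Dictionary<int, string> GetEmployeeTypes()
    {
        const string cacheKey = "EmployeeTypes";

        var types = HttpRuntime.Cache[cacheKey] as Dictionary<int, string>;
        if (types == null)
        {
            types = LoadEmployeeTypesFromDb();  // illustrative: runs the reference-table query
            HttpRuntime.Cache.Insert(
                cacheKey,
                types,
                null,                                   // no cache dependency
                DateTime.UtcNow.AddMinutes(20),         // absolute expiration: 20 minutes
                System.Web.Caching.Cache.NoSlidingExpiration);
        }
        return types;
    }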