So I'm trying to introduce some caching into my ASP.NET application. New data for the user is returned from the DB in the form of large DataSets. Whenever a user requests data, I insert the DataSet into HttpRuntime.Cache. At the moment I'm setting the caching time to 2-3 hours, and these are really large DataSets that I put in the cache quite frequently, under different keys. What I'm worried about are the memory implications of doing this. Will ASP.NET take care of excessive data in the cache and remove it? Also, when a cached item is removed by ASP.NET, or by me via Cache.Remove(), is only the reference removed, or is the DataSet itself garbage collected from memory? Is there a scenario in which DataSets might be 'removed' from the cache but still exist in memory, creating performance issues? Is there a way to explicitly 'garbage collect' them if that is the case?
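The pattern described above might look something like the following sketch. The key scheme, expiration window, and `LoadFromDatabase` helper are assumptions for illustration, not code from the question:

```csharp
// Hypothetical sketch of the caching pattern described in the question:
// a per-user cache key and a 2-hour absolute expiration.
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class UserDataCache
{
    public static DataSet GetUserData(string userId)
    {
        string key = "UserData_" + userId;  // assumed key scheme
        var ds = HttpRuntime.Cache[key] as DataSet;
        if (ds == null)
        {
            ds = LoadFromDatabase(userId);  // placeholder for the real DB call
            HttpRuntime.Cache.Insert(
                key,
                ds,
                null,                         // no cache dependency
                DateTime.UtcNow.AddHours(2),  // absolute expiration
                Cache.NoSlidingExpiration);
        }
        return ds;
    }

    private static DataSet LoadFromDatabase(string userId)
    {
        // Stand-in for the actual data-access code.
        return new DataSet();
    }
}
```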
views: 668, answers: 1
+5
A:
Items put in the cache are not guaranteed to stay there: the framework evicts items if it runs low on memory. You can specify a priority to indicate which items should be evicted first.
As for whether the memory will be freed: as long as the items are managed objects and you are not keeping a reference to them elsewhere in the application, the garbage collector will free the memory after the cache timeout expires (or after you manually remove the item from the cache).
Of course, removing an item from the cache doesn't guarantee that the physical memory is freed immediately; that only happens the next time the garbage collector runs.
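A minimal sketch of the priority mechanism mentioned above, with a removal callback added so evictions can be observed. The helper names are illustrative; `Cache.Insert` with a `CacheItemPriority` and a `CacheItemRemovedCallback` is the relevant framework API:

```csharp
// Sketch: insert a cache item with low priority (evicted first under
// memory pressure) and a callback that reports why it was removed.
using System;
using System.Web;
using System.Web.Caching;

public static class CacheHelper
{
    public static void InsertLowPriority(string key, object value)
    {
        HttpRuntime.Cache.Insert(
            key,
            value,
            null,                          // no cache dependency
            DateTime.UtcNow.AddHours(2),   // absolute expiration
            Cache.NoSlidingExpiration,
            CacheItemPriority.Low,         // scavenged first when memory is low
            OnRemoved);
    }

    private static void OnRemoved(string key, object value,
                                  CacheItemRemovedReason reason)
    {
        // reason is Expired, Removed (explicit Cache.Remove),
        // Underused (scavenged under memory pressure), or DependencyChanged.
        // Once this callback returns and nothing else references the object,
        // it becomes eligible for garbage collection.
    }
}
```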
Tetraneutron
2009-05-28 05:44:23
So is there a way to explicitly free memory occasionally before it brings me down?
Daud
2009-05-28 06:31:53
You don't need to. If memory gets low and there are items in the cache, items will automatically be removed, at which point, since memory is low, the garbage collector will kick in.
Tetraneutron
2009-05-28 07:33:13
Ok. So the .NET Framework guarantees that caching will never create memory issues.
Daud
2009-05-28 08:11:48