Hi,

I have an objectdatasource that will return a potentially large collection (up to 200,000 records) that are bound and paged in a gridview. I am using default paging and caching on the objectdatasource. The data being returned is only updated weekly so stale data is not an issue. The paging in this solution was also faster than when I created a solution using custom paging.
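A minimal markup sketch of the setup described above; the type, method, and control names here are assumptions, not the poster's actual code, and the cache duration simply mirrors the weekly refresh cycle:

```aspx
<%-- Hypothetical names: ReportRepository/GetAllRecords are placeholders --%>
<asp:ObjectDataSource ID="WeeklyDataSource" runat="server"
    TypeName="ReportRepository"
    SelectMethod="GetAllRecords"
    EnableCaching="true"
    CacheDuration="604800" />  <%-- seconds in one week; data refreshes weekly --%>

<asp:GridView ID="ReportGrid" runat="server"
    DataSourceID="WeeklyDataSource"
    AllowPaging="true"
    PageSize="50" />
```

With EnableCaching="true", the ObjectDataSource serves repeat selects from the ASP.NET cache, and the GridView pages over the cached collection in memory, which is why this can outperform round-tripping custom paging queries to the database.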

My questions are: Is caching a record set this large acceptable to you? If not, why? Are there any performance counters that you use to see the impact on memory that your cached data is creating?

Thanks!

+2  A: 

To answer your questions:

1) Yes, caching a large data set is OK, particularly if generating the data set is more expensive than caching it. The fact that this is fairly static data also makes it a good candidate.
2) As for performance counters, that somewhat depends on the caching mechanism you use. If you use Enterprise Library's Caching Application Block, for example, it has counters built in. As for general counters, watch the memory counters: working set, private bytes, etc.
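One way to watch those memory counters from code is with `System.Diagnostics.PerformanceCounter`; a rough sketch, assuming an IIS worker process instance named "w3wp" (the instance name on your machine may differ):

```csharp
using System;
using System.Diagnostics;

class MemoryWatch
{
    static void Main()
    {
        // "Process" category with "Private Bytes" / "Working Set" are
        // standard Windows counters; "w3wp" is an assumed instance name.
        var privateBytes = new PerformanceCounter("Process", "Private Bytes", "w3wp");
        var workingSet   = new PerformanceCounter("Process", "Working Set", "w3wp");

        Console.WriteLine("Private Bytes: {0:N0}", privateBytes.NextValue());
        Console.WriteLine("Working Set:   {0:N0}", workingSet.NextValue());
    }
}
```

Comparing these readings before and after the data set is cached gives a rough measure of how much memory the cached collection is costing you.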

JoshBerke