I'm working on an ASP.NET application that has the following requirements:
- Once every 15 minutes, perform a fairly expensive query that returns around 20,000 items (not from a database). The result has around 10 columns, all short strings, except for one that stores up to 3,000 characters (usually far less).
- Process the resulting DataTable with various sorting and filtering, then store the top 100 items in additional DataTables (a rough sketch of this step follows the list).
- Display this aggregated information to potentially tens of thousands of people.
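To make the second step concrete, here is roughly how I picture building one of the top-100 tables. This is only a sketch; the method and parameter names are placeholders, not code I already have:

```csharp
// Sketch: sort/filter the full 20,000-row result set and keep the top 100 rows.
// "filter" and "sort" are standard DataTable expression strings.
private DataTable BuildTop100(DataTable source, string filter, string sort)
{
    DataTable top = source.Clone();               // same schema, no rows
    DataRow[] rows = source.Select(filter, sort); // filter + sort in one pass
    for (int i = 0; i < rows.Length && i < 100; i++)
    {
        top.ImportRow(rows[i]);                   // copy the row into the small table
    }
    return top;
}
```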
This seems like an excellent candidate for caching (System.Web.Caching), especially since we may want to support additional "on the fly" filtering, for example filtering the table down to only the rows relevant to a specific user.
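My rough plan is something like the sketch below, with a 15-minute absolute expiration matching the refresh interval. The cache key and the "UserName" column are placeholders I made up for illustration:

```csharp
using System;
using System.Data;
using System.Web;
using System.Web.Caching;

public static class TopItemsCache
{
    // Store the processed table with a 15-minute absolute expiration so a stale
    // copy simply drops out and is rebuilt on the next request.
    public static void CacheTopItems(DataTable topItems)
    {
        HttpRuntime.Cache.Insert(
            "TopItems",                      // placeholder cache key
            topItems,
            null,                            // no cache dependency
            DateTime.UtcNow.AddMinutes(15),  // absolute expiration
            Cache.NoSlidingExpiration);
    }

    // On-the-fly, per-user filtering against the cached table.
    // "UserName" is a hypothetical column in my schema.
    public static DataView GetItemsForUser(string userName)
    {
        DataTable topItems = HttpRuntime.Cache["TopItems"] as DataTable;
        if (topItems == null)
            return null;                     // caller would rebuild and re-cache here

        DataView view = new DataView(topItems);
        view.RowFilter = "UserName = '" + userName.Replace("'", "''") + "'";
        return view;
    }
}
```

Because the DataView is just a filtered view over the cached table, each request avoids copying rows, though building the view still has some per-call cost.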
However, before getting started I would like to understand:
- Whether there are any best practices for storing such a large DataTable in the Cache.
- Whether anyone has experience they can share, for example with similarly large tables you have cached.
- Any traps to watch out for along the way.
Thanks in advance.