Hi,
I'd like to ask your expert advice on a workable architecture in C#.
I have a C# service that responds to requests from local users on the LAN, fetches packets of data from the internet, and crunches that data into arrays held in a structure. Each data request takes about 2 seconds and returns 4000 bytes. There could be tens of thousands of requests per day.
To speed everything up and reduce bandwidth, I need to cache the results of the data crunching so that second and subsequent accesses are served instantly to any other user on the LAN (there could be more than 50 users).
Constraints:
- The underlying data never changes, i.e. I don't have to worry about "dirty" data (great!).
- The data I want to cache is a rather complex structure containing nested arrays of DateTime, doubles, etc. It is computed, with a lot of math, from the raw data fetched from the internet.
- I can't use more than 100MB of memory no matter how much data is cached (i.e. the cache must be size limited).
- I can't index the cached data by a numerical index; I have to key it by a combination of a date ("YYYY-MM-DD") and a unique ID string ("XXXXXXXX").
- It has to be fast, i.e. it has to serve most of its responses from RAM.
- The data in the cache must be persisted to disk every 24 hours.
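To make the key and size constraints concrete, here is roughly the shape I have in mind, sketched with the built-in `MemoryCache` from System.Runtime.Caching, which (as I understand it) enforces an approximate memory cap via `cacheMemoryLimitMegabytes` and evicts older entries when the limit is approached. All type and member names below are illustrative, not real code from my service:

```csharp
using System;
using System.Collections.Specialized;
using System.Runtime.Caching;

// Illustrative placeholder for the crunched structure described above.
class CrunchedResult { /* nested arrays of DateTime, double, etc. */ }

class CrunchedResultCache
{
    // Cap the cache at roughly 100 MB; MemoryCache trims entries
    // when the configured limit is approached (the limit is approximate).
    private readonly MemoryCache _cache = new MemoryCache(
        "crunched",
        new NameValueCollection { { "cacheMemoryLimitMegabytes", "100" } });

    // Composite key: "YYYY-MM-DD|XXXXXXXX"
    private static string MakeKey(DateTime date, string id)
    {
        return date.ToString("yyyy-MM-dd") + "|" + id;
    }

    public CrunchedResult GetOrAdd(DateTime date, string id,
                                   Func<CrunchedResult> crunch)
    {
        string key = MakeKey(date, id);
        var cached = (CrunchedResult)_cache.Get(key);
        if (cached != null)
            return cached;                  // served from RAM

        var result = crunch();              // ~2 s fetch + math on a miss
        // The underlying data never changes, so no expiration policy is
        // needed; eviction is driven by the memory limit alone.
        _cache.Set(key, result, new CacheItemPolicy());
        return result;
    }
}
```

The composite string key sidesteps the "no numerical index" constraint, since any dictionary-style cache is happy with a string key.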
Here are my options at the moment:
- Cache the data in the server class in private fields (e.g. a private List or Dictionary), then serialize it to disk occasionally.
- Use a database.
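For the first option, the daily persistence part could be as simple as a timer that flushes the dictionary to disk every 24 hours. A minimal sketch, assuming a serializer that can handle the nested structure (I've used System.Text.Json here purely for illustration; the names are hypothetical):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Text.Json;
using System.Threading;

// Illustrative placeholder; public properties so the serializer sees them.
class CrunchedResult
{
    public DateTime[] Timestamps { get; set; }
    public double[] Values { get; set; }
}

class PersistentCache
{
    private readonly Dictionary<string, CrunchedResult> _items =
        new Dictionary<string, CrunchedResult>();
    private readonly object _gate = new object();
    private readonly Timer _flushTimer;

    public PersistentCache(string path)
    {
        // Flush once every 24 hours, starting 24 hours from now.
        _flushTimer = new Timer(_ => Flush(path), null,
            TimeSpan.FromHours(24), TimeSpan.FromHours(24));
    }

    public void Put(string key, CrunchedResult value)
    {
        lock (_gate) { _items[key] = value; }
    }

    public bool TryGet(string key, out CrunchedResult value)
    {
        lock (_gate) { return _items.TryGetValue(key, out value); }
    }

    private void Flush(string path)
    {
        string json;
        lock (_gate) { json = JsonSerializer.Serialize(_items); }
        File.WriteAllText(path, json);  // fine for a sketch; a real
                                        // service would write-then-rename
}
}
```

This sketch doesn't enforce the 100 MB cap (the MemoryCache approach above does), so presumably the two ideas would need to be combined, which is part of what I'm unsure about.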
I'm interested in your expert opinion.