We have millions and millions of records in a SQL table, and we run really complex analytics on that data to generate reports.
As the table grows and more records are added, the computation time keeps increasing, and users have to wait a long time before the webpage loads.
We were thinking of using a distributed cache like AppFabric to load the data into memory when the application starts, and then running our reports off that in-memory data. This should improve response time somewhat, since the data would be served from memory rather than disk.
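To make the idea concrete, here is a rough sketch (not AppFabric's actual API; the names and TTL value are made up) of the pattern we had in mind, except caching the *computed report results* rather than the raw rows:

```python
import time

class ReportCache:
    """In-memory cache of computed report results with a TTL,
    so the expensive query runs once per expiry window."""

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._entries = {}  # key -> (expires_at, value)

    def get_or_compute(self, key, compute_fn):
        entry = self._entries.get(key)
        now = time.time()
        if entry and entry[0] > now:
            return entry[1]           # fresh cache hit, no query
        value = compute_fn()          # the heavy SQL/aggregation runs here
        self._entries[key] = (now + self.ttl, value)
        return value

calls = []
def expensive_report():
    calls.append(1)                   # stand-in for the slow analytics query
    return {"total_sales": 12345}

cache = ReportCache(ttl_seconds=300)
first = cache.get_or_compute("daily-sales", expensive_report)
second = cache.get_or_compute("daily-sales", expensive_report)
# the second call is served from memory; expensive_report ran only once
```

Is caching results like this, instead of caching the table itself, the more common practice?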
Before we take the plunge and implement this, I wanted to find out what others are doing and what the best techniques and practices are for loading data into memory, caching, etc. Surely you don't just load an entire table with hundreds of millions of records into memory...??
I was also looking into OLAP / data warehousing, which might give us better performance than caching.
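The OLAP-style alternative, as I understand it, would be to pre-aggregate into summary tables so reports never scan the raw rows. A minimal sketch of that idea using SQLite (table and column names are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (day TEXT, region TEXT, amount REAL)")
rows = [("2024-01-01", "east", 10.0),
        ("2024-01-01", "west", 20.0),
        ("2024-01-02", "east", 5.0)]
cur.executemany("INSERT INTO sales VALUES (?, ?, ?)", rows)

# Periodically roll the detail rows up into a small summary table;
# reports then query the summary instead of the full fact table.
cur.execute("""
    CREATE TABLE sales_daily AS
    SELECT day, region, SUM(amount) AS total, COUNT(*) AS n
    FROM sales
    GROUP BY day, region
""")

cur.execute("SELECT total FROM sales_daily "
            "WHERE day = '2024-01-01' AND region = 'west'")
total = cur.fetchone()[0]
# the report reads one summary row instead of scanning every sale
```

Would a scheduled rollup like this (or a proper cube) be the better long-term fix compared to an in-memory cache?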