I need to extract some management information (MI) from data which is updated in overnight batches. I will be using aggregate functions to generate the MI from tables with hundreds of thousands and potentially millions of rows. The information will be displayed on a web page.
The critical factor here is the efficiency of SQL Server's handling of aggregate functions.
I am faced with two choices for generating the data:
- Write stored procedures/views that aggregate the raw data on demand and are called every time someone accesses the page
- Create summary tables that are refreshed daily and act as a cache for the MI (both options are sketched below)
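
To make the two options concrete, here is a rough sketch of the kind of thing I have in mind. The table and column names (`dbo.Sales`, `SaleDate`, `Amount`) and the object names (`vw_DailySalesMI`, `SalesMICache`, `usp_RefreshSalesMICache`) are placeholders for illustration, not my real schema:

```sql
-- Option 1: a view (or proc) that aggregates the raw data on every page hit.
CREATE VIEW dbo.vw_DailySalesMI
AS
SELECT
    CAST(SaleDate AS date) AS SaleDay,
    COUNT(*)               AS SaleCount,
    SUM(Amount)            AS TotalAmount
FROM dbo.Sales
GROUP BY CAST(SaleDate AS date);
GO

-- Option 2: a summary table refreshed once a day after the overnight batch,
-- so the page only ever reads pre-aggregated rows.
CREATE TABLE dbo.SalesMICache
(
    SaleDay     date           NOT NULL PRIMARY KEY,
    SaleCount   int            NOT NULL,
    TotalAmount decimal(18, 2) NOT NULL
);
GO

CREATE PROCEDURE dbo.usp_RefreshSalesMICache
AS
BEGIN
    SET NOCOUNT ON;

    -- Rebuild the cache from scratch each night.
    TRUNCATE TABLE dbo.SalesMICache;

    INSERT INTO dbo.SalesMICache (SaleDay, SaleCount, TotalAmount)
    SELECT
        CAST(SaleDate AS date),
        COUNT(*),
        SUM(Amount)
    FROM dbo.Sales
    GROUP BY CAST(SaleDate AS date);
END;
GO
```

With option 2, the refresh procedure would presumably run as a scheduled job immediately after the overnight batch completes, so the web page only ever queries the small pre-aggregated table.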
What is the best approach to take?