I need to extract some management information (MI) from data which is updated in overnight batches. I will be using aggregate functions to generate the MI from tables with hundreds of thousands and potentially millions of rows. The information will be displayed on a web page.
The critical factor here is the efficiency of SQL Server's handling of aggregate functions.
I am faced with two choices for generating the data:

  1. Write stored procedures/views that generate the information from the raw data on demand, called every time someone accesses a page (a rough sketch of this is shown after the question)
  2. Create tables that are refreshed daily and act as a cache for the MI

What is the best approach to take?
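
For concreteness, option 1 would be something along these lines. The table dbo.Transactions, its columns, and the procedure name are hypothetical placeholders, and the date type assumes SQL Server 2008 or later:

    -- Option 1: aggregate the raw data on demand, every time the page is requested.
    -- dbo.Transactions and its columns are hypothetical stand-ins for the real tables.
    CREATE PROCEDURE dbo.usp_GetDailyMI
        @ReportDate date
    AS
    BEGIN
        SET NOCOUNT ON;

        SELECT
            t.Region,
            COUNT(*)      AS TransactionCount,
            SUM(t.Amount) AS TotalAmount,
            AVG(t.Amount) AS AverageAmount
        FROM dbo.Transactions AS t
        WHERE t.TransactionDate = @ReportDate
        GROUP BY t.Region;
    END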

+2  A: 

Cache the values during your nightly load if the data doesn't change throughout the day. It will make retrieval much faster. I'm a big fan of summary tables when necessary. In your case, they're necessary!
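
A minimal sketch of that pattern, again using the hypothetical dbo.Transactions source table and a made-up dbo.DailyMISummary cache table that is rebuilt as the last step of the overnight batch:

    -- Summary (cache) table, rebuilt once per night after the batch load completes.
    -- All object and column names here are hypothetical.
    CREATE TABLE dbo.DailyMISummary
    (
        ReportDate       date         NOT NULL,
        Region           nvarchar(50) NOT NULL,
        TransactionCount int          NOT NULL,
        TotalAmount      money        NOT NULL,
        CONSTRAINT PK_DailyMISummary PRIMARY KEY (ReportDate, Region)
    );
    GO

    -- Run this as the final step of the overnight batch.
    CREATE PROCEDURE dbo.usp_RefreshDailyMISummary
    AS
    BEGIN
        SET NOCOUNT ON;

        TRUNCATE TABLE dbo.DailyMISummary;

        INSERT INTO dbo.DailyMISummary (ReportDate, Region, TransactionCount, TotalAmount)
        SELECT
            t.TransactionDate,
            t.Region,
            COUNT(*),
            SUM(t.Amount)
        FROM dbo.Transactions AS t
        GROUP BY t.TransactionDate, t.Region;
    END

The web page then reads dbo.DailyMISummary with a trivial SELECT instead of aggregating hundreds of thousands of raw rows on every request.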

One thing you may want to look into, since you own SQL Server, is Analysis Services. If you create a multidimensional database (a cube), these aggregations all happen automatically, and you can drill down and across your data to find numbers at the speed of thought instead of trying to write reports that capture all of those numbers. Spend 10 minutes watching the intro video and I think you'll gain a real appreciation for SSAS's power.

Eric
Thanks. Will have a look at Analysis Services.
macleojw
+1  A: 

It sounds to me like an Analysis Services cube would actually be the best fit for your problem. The cube processing can be run after the data loads occur to aggregate the data for later use.

However, you could also use an indexed view, which, if designed correctly and used in conjunction with the NOEXPAND table hint, can provide a significant performance increase.
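
Roughly, and sticking with the hypothetical dbo.Transactions table from the earlier sketches (with a non-nullable Amount column, which indexed views require for SUM), such a view might look like:

    -- Indexed view sketch; dbo.Transactions and its columns are hypothetical.
    -- SCHEMABINDING, COUNT_BIG(*) (when grouping) and a unique clustered index
    -- are required before SQL Server will materialize the view.
    CREATE VIEW dbo.vw_DailyMI
    WITH SCHEMABINDING
    AS
    SELECT
        t.TransactionDate,
        t.Region,
        COUNT_BIG(*)  AS TransactionCount,
        SUM(t.Amount) AS TotalAmount
    FROM dbo.Transactions AS t
    GROUP BY t.TransactionDate, t.Region;
    GO

    CREATE UNIQUE CLUSTERED INDEX IX_vw_DailyMI
        ON dbo.vw_DailyMI (TransactionDate, Region);
    GO

    -- On editions other than Enterprise, the NOEXPAND hint is needed so the
    -- optimizer reads the view's index instead of expanding the view definition.
    SELECT TransactionDate, Region, TransactionCount, TotalAmount
    FROM dbo.vw_DailyMI WITH (NOEXPAND)
    WHERE TransactionDate = '20240101';

SQL Server keeps the view's clustered index up to date as the base table changes, so the aggregates are precomputed without a separate nightly refresh step.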

SQL 2005 Indexed Views

SQL 2008 Indexed Views

Jonathan Kehayias