I don't have much experience with databases, so I don't know which approach is better for long-term performance, which is better practice, and so on.

Here is my (hypothetical) case: imagine you have a database of customer information and history of purchase orders for each. You want to keep track of how much each customer is buying. I can think of two ways of calculating this:

1) Just do a SUM() every time it is needed. This is an easy solution, but the concern is that this database could be 20 years old, with tens of thousands of rows for each customer. As more customer purchases are added to the database, the SUM() operation will take longer to calculate (see the first sketch below).

2) Keep the sum cached in the customer information table, and every time a purchase is added (or updated, deleted, etc.), update this cache (see the trigger sketch below). That way, no matter how many purchase orders there are, the calculation time won't grow. The downside is that this is a less flexible solution (it only gives the sum over all rows; what about a per-month sum, or other intervals?), and the cached value could somehow get out of sync with the actual total (technically that shouldn't happen, but it might).
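
For concreteness, here is a minimal sketch of option 1. The schema is hypothetical: an orders table with customer_id and amount columns.

    -- Hypothetical schema: orders(id, customer_id, amount, order_date).
    -- Option 1: compute the total on demand.
    SELECT SUM(amount) AS total_purchases
    FROM orders
    WHERE customer_id = 42;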
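
And a sketch of option 2, assuming the same hypothetical schema plus a cached total_purchases column on customers. The trigger syntax is PostgreSQL-flavored; other databases differ.

    -- Option 2: maintain the cached total incrementally on every
    -- insert, update, or delete, so each write does O(1) extra work.
    CREATE FUNCTION sync_customer_total() RETURNS trigger AS $$
    BEGIN
        IF TG_OP IN ('DELETE', 'UPDATE') THEN
            UPDATE customers
            SET total_purchases = total_purchases - OLD.amount
            WHERE id = OLD.customer_id;
        END IF;
        IF TG_OP IN ('INSERT', 'UPDATE') THEN
            UPDATE customers
            SET total_purchases = total_purchases + NEW.amount
            WHERE id = NEW.customer_id;
        END IF;
        RETURN NULL;  -- return value is ignored for AFTER triggers
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER orders_total_sync
    AFTER INSERT OR UPDATE OR DELETE ON orders
    FOR EACH ROW EXECUTE FUNCTION sync_customer_total();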

So what should I be doing here? I know I shouldn't be storing anything I can calculate from what's already in the database, but part of me doesn't like the fact that this calculation will get slower over time, and I do see a certain elegance in option 2.

+8  A: 

In database terms, there is no elegance in option 2 - it would be considered a hack that you might use as a last resort if your database gets to be truly humongous - not likely to happen for a novice setting it up for the first time (but possible).

There would be a lot of work maintaining the totals, and you'd be forever dealing with the question: "Why doesn't the detail add up to the total?"

Go for option 1 until you've proven you can't, which will in most cases be a long time.

le dorfier
You can use Oracle's materialized views or SQL Server's indexed views. They do the caching exactly as you described: seamlessly, silently, and automatically.
S.Lott
Yup, when the time comes. But baby steps first.
le dorfier
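
For reference, a minimal sketch of the materialized-view approach S.Lott mentions, in Oracle-flavored syntax (table and column names are hypothetical). The database maintains the aggregate itself, avoiding the hand-rolled cache from option 2.

    -- A materialized view log is required for fast (incremental) refresh.
    CREATE MATERIALIZED VIEW LOG ON orders
        WITH ROWID (customer_id, amount) INCLUDING NEW VALUES;

    CREATE MATERIALIZED VIEW customer_totals
        REFRESH FAST ON COMMIT
    AS
    SELECT customer_id,
           SUM(amount)   AS total_purchases,
           COUNT(amount) AS amount_count,  -- required for fast refresh of SUM
           COUNT(*)      AS order_count
    FROM orders
    GROUP BY customer_id;
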
+1  A: 

Almost always 1.

How often will you be querying the total for 20 years' worth of history? If the answer is often, and the performance is poor, then you can think about optimisation or OLAP.

I suspect you're worrying about optimisation too early. This is the sort of thing databases are designed for - let them worry about caching.

Draemon
+4  A: 

What you're describing in option #2 is a case of premature optimization. Using the SUM() of all purchases is going to work for a really long time (years). When (and if) you start to see its performance degrade, you can add indexes or a totals table to your database to speed things up. Don't complicate things when a simple solution exists.
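
To illustrate the indexing route with the same hypothetical orders table: a composite index on (customer_id, amount) lets most databases answer the SUM() from the index alone, without touching the table.

    -- Hypothetical names; a covering index for the per-customer total.
    CREATE INDEX idx_orders_customer_amount
        ON orders (customer_id, amount);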

Of course, the real test is to try both approaches with 20 years of made-up data and see if there's any real difference. I suspect there isn't.
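
A rough way to run that experiment, sketched in PostgreSQL-flavored SQL (generate_series; names are hypothetical):

    -- Insert roughly 20 years of hourly orders for one customer...
    INSERT INTO orders (customer_id, amount, order_date)
    SELECT 42,
           round((random() * 500)::numeric, 2),
           TIMESTAMP '1989-01-01' + n * INTERVAL '1 hour'
    FROM generate_series(1, 175000) AS n;

    -- ...then measure the on-demand total.
    EXPLAIN ANALYZE
    SELECT SUM(amount) FROM orders WHERE customer_id = 42;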

Bill the Lizard
A: 

Kudos for thinking ahead, but the question that begs to be asked is: will your sales data stay in the transactional database for 20 years?

There comes a point where it's going to be a lot easier to move that data into a data warehouse, and just keep your frontline database current.

If this is a new project, be more concerned with making it work, and getting people to use it. Worry about scalability when you've crossed those bridges.

seanb
A: 

Use option 1. Later, if performance becomes poor, you can identify the specific bottlenecks and resolve them with approaches like option #2, materialized views, or several other techniques.

JosephStyons
A: 

Thank you for all the quick responses. Man, you guys are quick!

I suspect you're worrying about optimisation too early.

Worry about scalability when you've crossed those bridges.

I hear you. I just wanted to consider my options before I potentially start heading down a path I'll regret later.

Emrah
+1  A: 

I'll just add that another possibility is to create roll-up tables. For instance, when tracking page hits, it's not necessarily very useful to know that IP such-and-such accessed page1.php at 14:42:04 on 11/19/2008, but you might want to track daily stats for page1.php. In that case, at the end of each day you could run a procedure to sum up the hits for each page and create a record in a summary table, which can in turn be heavily indexed. Your reporting can then run against that table (see the sketch below). In addition to speeding up reporting, this can also speed up writing the original records, since you don't need to worry about table locking or maintaining indexes on the raw table.
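
A minimal sketch of such a roll-up, with hypothetical names and generic SQL (interval syntax varies by database):

    -- Raw hits land in page_hits(page, ip, hit_time).
    CREATE TABLE daily_page_stats (
        page     VARCHAR(255) NOT NULL,
        stat_day DATE         NOT NULL,
        hits     INT          NOT NULL,
        PRIMARY KEY (page, stat_day)
    );

    -- End-of-day job: roll up yesterday's hits per page.
    INSERT INTO daily_page_stats (page, stat_day, hits)
    SELECT page, CAST(hit_time AS DATE), COUNT(*)
    FROM page_hits
    WHERE hit_time >= CURRENT_DATE - INTERVAL '1' DAY
      AND hit_time <  CURRENT_DATE
    GROUP BY page, CAST(hit_time AS DATE);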

That said, good indexes can go a long way for reporting; and as others here have advised, it's best to go with the easier, even if less optimal, solution until (if ever) it becomes a problem.

Soldarnal