tags:

views: 407

answers: 3

We're in the process of upgrading one of our SQL Server instances from 2000 to 2005. I installed the performance dashboard (http://www.microsoft.com/downloads/details.aspx?FamilyId=1d3a4a0d-7e0c-4730-8204-e419218c1efc&displaylang=en) for access to some high-level reporting. One of the reports shows missing (recommended) indexes. I think it's based on some system views that are maintained by the query optimizer.
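For reference, in SQL Server 2005 the optimizer's index suggestions are exposed through the missing-index dynamic management views, which the dashboard report appears to draw from. A sketch of a query against them (the weighting in the ORDER BY is just one common heuristic, not an official formula):

```sql
-- Missing-index suggestions recorded by the optimizer since the last
-- instance restart, roughly ordered by potential benefit.
SELECT  d.statement          AS table_name,
        d.equality_columns,
        d.inequality_columns,
        d.included_columns,
        s.user_seeks,
        s.avg_total_user_cost,
        s.avg_user_impact
FROM    sys.dm_db_missing_index_details      AS d
JOIN    sys.dm_db_missing_index_groups       AS g
        ON d.index_handle = g.index_handle
JOIN    sys.dm_db_missing_index_group_stats  AS s
        ON g.index_group_handle = s.group_handle
ORDER BY s.user_seeks * s.avg_user_impact DESC;
```

Note that these DMVs are reset on service restart, so the numbers only reflect the workload since then.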

My question is: what is the best way to decide when to take an index recommendation? I know that it doesn't make sense to apply all of the optimizer's suggestions. I see a lot of advice that basically says to try the index, keep it if performance improves, and drop it if performance degrades or stays the same. I'm wondering if there is a better way to make the decision, and what best practices exist on this subject.

A: 

You're best off researching the most common types of queries that run against your database and creating indexes based on that research.

For example, if there is a table which stores website hits, which is written to very often but hardly ever read from, then don't index the table at all.

If, however, you have a list of users which is accessed more often than it is written to, then I would first create a clustered index on the column that is accessed the most, usually the primary key. I would then create indexes on commonly searched columns and on those used in ORDER BY clauses.
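The pattern above could look like this, using a hypothetical Users table (table and index names are made up for illustration):

```sql
-- Clustered index on the primary key; nonclustered indexes on the
-- columns commonly used in WHERE and ORDER BY clauses.
CREATE TABLE dbo.Users (
    UserId    INT IDENTITY(1,1) NOT NULL,
    UserName  NVARCHAR(50)      NOT NULL,
    CreatedOn DATETIME          NOT NULL,
    CONSTRAINT PK_Users PRIMARY KEY CLUSTERED (UserId)
);

-- Supports lookups such as: WHERE UserName = @name
CREATE NONCLUSTERED INDEX IX_Users_UserName
    ON dbo.Users (UserName);

-- Supports listings such as: ORDER BY CreatedOn DESC
CREATE NONCLUSTERED INDEX IX_Users_CreatedOn
    ON dbo.Users (CreatedOn);
```

Keep in mind each nonclustered index adds write overhead, which is why the read/write ratio of the table matters.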

GateKiller
+3  A: 

The advice you got is right. Try them all, one by one.

There is NO substitute for testing when it comes to performance. Unless you prove it, you haven't done anything.
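One way to make that testing concrete is to capture I/O and timing statistics for a representative query before and after creating the candidate index. A sketch, assuming a hypothetical Users table and index; run it against a test server, not production:

```sql
-- Report logical reads and CPU/elapsed time for each statement.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

-- 1. Baseline: run the query and note the logical reads.
SELECT UserId, UserName
FROM   dbo.Users
WHERE  UserName = N'some.user';

-- 2. Create the candidate index suggested by the report.
CREATE NONCLUSTERED INDEX IX_Users_UserName
    ON dbo.Users (UserName);

-- 3. Rerun the same query and compare the numbers.
SELECT UserId, UserName
FROM   dbo.Users
WHERE  UserName = N'some.user';

-- 4. Drop the index if the improvement doesn't justify the
--    extra write and maintenance cost.
DROP INDEX IX_Users_UserName ON dbo.Users;
```

Comparing the actual execution plans before and after tells you the same story graphically (table/clustered index scan vs. index seek).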

Stu
+3  A: 
SQLMenace