How does one performance-tune a SQL query?
- What tricks/tools/concepts can be used to improve the performance of a SQL query?
- How can the benefits be quantified?
- What does one need to be careful of?
What tricks/tools/concepts can be used to improve the performance of a SQL query?
- Using indexes? How do they work in practice? (See the covering-index sketch after this list.)
- Normalised vs denormalised data? What are the performance vs design/maintenance trade-offs?
- Pre-processed intermediate tables, created with triggers or batch jobs? (See the summary-table sketch below.)
- Restructuring the query to use temp tables, subqueries, etc.? (See the temp-table sketch below.)
- Splitting complex queries into multiple queries and UNIONing the results? (See the UNION sketch below.)
- Anything else?
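A minimal sketch of the index question, in T-SQL with a hypothetical dbo.Orders table: a "covering" index lets the engine answer a query entirely from the index, turning a scan of the whole table into a seek.

```sql
-- CustomerId is the seek key; INCLUDE carries the selected columns in the
-- index leaf pages, so the base table is never touched for this query.
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
    ON dbo.Orders (CustomerId)
    INCLUDE (OrderDate, TotalAmount);

SELECT OrderDate, TotalAmount
FROM dbo.Orders
WHERE CustomerId = 42;   -- index seek rather than a full scan
```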
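For the pre-processed-table idea, one common shape (all names hypothetical) is a summary table rebuilt by a nightly batch job, trading data freshness for cheap reads:

```sql
CREATE TABLE dbo.DailySales
(
    SaleDate   date  NOT NULL PRIMARY KEY,
    OrderCount int   NOT NULL,
    Revenue    money NOT NULL
);

-- Nightly job: aggregate yesterday's orders once, so reporting queries
-- read one small row per day instead of re-aggregating dbo.Orders.
INSERT INTO dbo.DailySales (SaleDate, OrderCount, Revenue)
SELECT CAST(OrderDate AS date), COUNT(*), SUM(TotalAmount)
FROM dbo.Orders
WHERE OrderDate >= DATEADD(day, -1, CAST(GETDATE() AS date))
  AND OrderDate <  CAST(GETDATE() AS date)
GROUP BY CAST(OrderDate AS date);
```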
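For restructuring with temp tables: materialise an expensive intermediate result once and join to it, instead of repeating it as a correlated subquery (names again hypothetical):

```sql
-- Computed once; could also be indexed if the join is large.
SELECT CustomerId, SUM(TotalAmount) AS Revenue
INTO #CustomerRevenue
FROM dbo.Orders
GROUP BY CustomerId;

SELECT c.Name, r.Revenue
FROM dbo.Customers AS c
JOIN #CustomerRevenue AS r ON r.CustomerId = c.CustomerId
WHERE r.Revenue > 10000;

DROP TABLE #CustomerRevenue;
```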
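And the UNION trick, which often helps when a single WHERE clause has an OR across different columns; the OR can stop the optimiser using either index, while each branch below can seek on its own (assuming indexes on both columns exist):

```sql
SELECT OrderId FROM dbo.Orders WHERE CustomerId = 42
UNION   -- UNION ALL if duplicates are acceptable; it skips the de-duplication sort
SELECT OrderId FROM dbo.Orders WHERE SalesRepId = 7;
```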
How can performance be quantified?
- Reads (logical and physical)? (See the SET STATISTICS sketch after this list.)
- CPU time?
- The "% query cost" shown in the execution plan when different versions are run together in one batch?
- Anything else?
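A sketch of measuring reads and CPU time per statement in SQL Server; dbo.Orders is a stand-in for a real table:

```sql
SET STATISTICS IO ON;    -- reports logical/physical reads per table in the Messages tab
SET STATISTICS TIME ON;  -- reports parse/compile and execution CPU + elapsed times

SELECT COUNT(*)
FROM dbo.Orders
WHERE OrderDate >= '20240101';

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;
```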
What does one need to be careful of?
- Time taken to generate execution plans? (stored procs vs inline queries)
- Stored procs being forced to recompile
- Testing on small data sets (do the queries scale linearly, quadratically, etc.?)
- Results of previous runs being cached (see the cache-clearing sketch after this list)
- Optimising the "normal case" but harming the "worst case"
- What is "parameter sniffing"? (See the stored-proc sketch below.)
- Anything else?
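On cached results from previous runs: between timed runs on a test server (never production) the caches can be cleared so earlier runs don't flatter later ones. A sketch:

```sql
CHECKPOINT;              -- flush dirty pages so the next command can drop everything
DBCC DROPCLEANBUFFERS;   -- empty the buffer pool: the next run reads from disk (cold timings)
DBCC FREEPROCCACHE;      -- discard cached plans: the next run pays the compile cost too
```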
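And on parameter sniffing: SQL Server compiles a stored proc's plan for the first parameter value it sees, and reuses that plan even when later values would suit a very different one. A sketch of one mitigation (proc and table names are made up):

```sql
CREATE PROCEDURE dbo.GetCustomerOrders
    @CustomerId int
AS
BEGIN
    SELECT OrderId, OrderDate, TotalAmount
    FROM dbo.Orders
    WHERE CustomerId = @CustomerId
    -- Without this hint, the plan "sniffed" for the first @CustomerId is
    -- cached and reused; if that customer has 3 rows and another has
    -- 3 million, the reused plan can be badly wrong. RECOMPILE trades a
    -- compile per call for a plan tailored to each value.
    OPTION (RECOMPILE);
END;
```

`OPTION (OPTIMIZE FOR UNKNOWN)` is the other common lever: compile once for an "average" value from the statistics instead of recompiling on every call.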
Note to moderators:
This is a huge question; should I have split it into multiple questions?
Note to responders: because this is a huge question, please reference other questions/answers/articles rather than writing lengthy explanations.