I want to write a query in a stored proc with many filters but I want to avoid dynamic SQL.

Say my parameters are nullable (@filter1, @filter2, @filter3...). One way I might solve this is:

SELECT col1, col2, col3
FROM table
WHERE col1 = ISNULL(@filter1, col1)
AND col2 = ISNULL(@filter2, col2)
AND col3 = ISNULL(@filter3, col3)

The result is that each filter is applied only when the corresponding parameter is not null. The questions are: 1) Is this a good practice? 2) Will the optimizer optimize the col1 = col1 comparisons out, or will this affect query performance?

A: 

ISNULL can hurt index usage, so I wouldn't say it's ideal, but if you need the functionality described above, I'm not sure there is a way around it.

Can you look at your execution plan to see if the indexes you would expect to be used are actually being used?
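
For example, one way to capture the actual plan for a given call from T-SQL (the procedure name and parameter values here are hypothetical):

-- Returns the actual execution plan as XML alongside the results
SET STATISTICS XML ON;
EXEC dbo.SearchTable @filter1 = 42, @filter2 = NULL, @filter3 = NULL;
SET STATISTICS XML OFF;

In SSMS, "Include Actual Execution Plan" (Ctrl+M) shows the same information graphically.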

Abe Miessler
This construct often works OK. COALESCE can be the killer because of how it treats data types.
gbn
Interesting. Any recommended reading on ISNULL vs. COALESCE?
Abe Miessler
A: 

1) Is this a good practice? 2) Will the optimizer optimize the col1 = col1 comparisons out, or will this affect query performance?

Yes, it's a good practice.

Some RDBMSes will optimize it out, some won't. None will if you're calling it as a prepared statement.

Don't prematurely optimize; odds are, for most things, the difference in costs will be negligible, or if not, can be made negligible with appropriate indices.

Concentrate on writing code that clearly expresses what you're doing. In my opinion, this idiom is clear and concise.

tpdi
I'm a big proponent of not trying to prematurely optimize, but in this case the performance impact can often be huge and indexes are unlikely to help because the query is potentially doing something different every time. This type of functionality is usually system-wide (i.e. you need to do dynamic searches on many different tables), so it's good to know your general approach before you've written 50 SPs that you then have to rewrite.
Tom H.
This is not premature optimization; it is designing for performance, which should be done on every database. When there are known differences in performance between techniques, the best-performing one should be chosen from the start. Databases are notoriously difficult to refactor, and it is too late once performance has become an issue. Premature optimization doesn't mean no optimization.
HLGEM
+3  A: 

Erland Sommarskog put together an excellent article on this type of problem (Dynamic Search Conditions in T-SQL, http://www.sommarskog.se/dyn-search.html). I strongly advise reading through it.

Tom H.
A: 

If you expect this table to grow to any substantial size, this is not a good idea: the single cached execution plan has to serve every combination of parameters, and the optimizer sucks at dealing with situations like this because it can't easily tell at compile time what the execution path will be.

You would be much better off just generating a query on the client side with the correct filters in the where clause instead of trying to write a single catch-all query.
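
The same idea can also live in the procedure itself as parameterized dynamic SQL via sp_executesql, which builds only the predicates that are actually needed while still getting a cached plan per filter combination. A minimal sketch; dbo.MyTable stands in for the question's table, the parameter types are assumed to be int, and the inline DECLARE and += require SQL Server 2008+:

DECLARE @sql nvarchar(max) = N'SELECT col1, col2, col3 FROM dbo.MyTable WHERE 1 = 1';

-- Append a predicate only for the filters that were supplied
IF @filter1 IS NOT NULL SET @sql += N' AND col1 = @filter1';
IF @filter2 IS NOT NULL SET @sql += N' AND col2 = @filter2';
IF @filter3 IS NOT NULL SET @sql += N' AND col3 = @filter3';

-- Values stay parameterized, so each filter combination caches its own plan
EXEC sp_executesql @sql,
    N'@filter1 int, @filter2 int, @filter3 int',
    @filter1, @filter2, @filter3;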

Donnie
A: 

In my experience (by running some benchmarks on large tables) the following:

(col1 = @filter1 OR @filter1 IS NULL)

is much faster than:

col1 = ISNULL(@filter1, col1)
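
Spelled out against the question's query, the pattern looks like this (a sketch, using the question's placeholder names):

SELECT col1, col2, col3
FROM table
WHERE (col1 = @filter1 OR @filter1 IS NULL)
AND (col2 = @filter2 OR @filter2 IS NULL)
AND (col3 = @filter3 OR @filter3 IS NULL)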
Jani
+1  A: 

About optimizing the conditions: what you must realize is that a compiled plan has to satisfy any variable value. So when the plan is generated, SQL Server must create an access plan that works when @filter1 is NULL and also works when @filter1 is not NULL. The result is almost always a scan.
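
One common way out is to request a statement-level recompile, so the plan is compiled for the actual parameter values rather than for every possible value. A sketch, taking the OR/IS NULL form from the previous answer; OPTION (RECOMPILE) is available from SQL Server 2005 onward and trades compile-time CPU for a better-fitting plan:

SELECT col1, col2, col3
FROM table
WHERE (col1 = @filter1 OR @filter1 IS NULL)
AND (col2 = @filter2 OR @filter2 IS NULL)
AND (col3 = @filter3 OR @filter3 IS NULL)
OPTION (RECOMPILE);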

The article linked by Tom H. goes into this in much more detail.

Remus Rusanu