I've been battling this one for a while now. I have a stored proc that takes in 3 parameters that are used to filter. If a specific value is passed in, I want to filter on that. If -1 is passed in, give me all.

I've tried it the following two ways:

First way:

SELECT field1, field2...etc  
FROM my_view  
WHERE  
parm1 = CASE WHEN @PARM1 = -1 THEN parm1 ELSE @PARM1 END
AND parm2 = CASE WHEN @PARM2 = -1 THEN parm2 ELSE @PARM2 END
AND parm3 = CASE WHEN @PARM3 = -1 THEN parm3 ELSE @PARM3 END

Second Way:

SELECT field1, field2...etc  
FROM my_view  
WHERE  
(@PARM1 = -1 OR parm1 = @PARM1)  
AND (@PARM2 = -1 OR parm2 = @PARM2)  
AND (@PARM3 = -1 OR parm3 = @PARM3)

I read somewhere that the second way will short-circuit and never evaluate the second part if the first is true. My DBA said it forces a table scan. I have not verified this, but it seems to run slower in some cases.

The main table that this view selects from has somewhere around 1.5 million records, and the view proceeds to join on about 15 other tables to gather a bunch of other information.

Both of these methods are slow, taking me from instant results to anywhere from 2 to 40 seconds, which in my situation is completely unacceptable.

Is there a better way that doesn't involve breaking it down into each separate case of specific vs. -1?

Any help is appreciated. Thanks.

A: 

No other way I can think of than doing:

WHERE
(@MyCaseParameter IS NULL OR MyCase = @MyCaseParameter) AND ....

The second one is simpler and more readable to other developers, if you ask me.
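As a rough, hypothetical sketch of that pattern applied to the question (the procedure name dbo.SearchMyView is made up; columns and parameters reuse the question's placeholders, with NULL rather than -1 meaning "give me all"):

    -- Hypothetical procedure name; NULL for a parameter means "don't filter on it".
    CREATE PROCEDURE dbo.SearchMyView
        @PARM1 int = NULL,
        @PARM2 int = NULL,
        @PARM3 int = NULL
    AS
    SELECT field1, field2
    FROM my_view
    WHERE (@PARM1 IS NULL OR parm1 = @PARM1)
      AND (@PARM2 IS NULL OR parm2 = @PARM2)
      AND (@PARM3 IS NULL OR parm3 = @PARM3)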

JonH
It's also slow as molasses. Don't use it.
Joel Coehoorn
@Joel do you have anything to back that up? Not that I doubt you, I'm just looking for some hard data one way or the other.
Jonas
I have two indicators that the second method may force a table scan, and that is not a good thing with 1.5 million records.
IronicMuffin
@Joel: This notion of "slow as molasses" is completely false. Are you stating that dynamically building a WHERE clause inside of your client app is better? I beg to differ... it is clean, CORRECT code, and his DBA should be prompted for that.
JonH
+3  A: 

I read somewhere that the second way will short-circuit and never evaluate the second part if the first is true. My DBA said it forces a table scan.

You read wrong; it will not short-circuit. Your DBA is right; it will not play well with the query optimizer and will likely force a table scan.

The first option is about as good as it gets. Your options for improving things are dynamic SQL, or a long stored procedure with every possible combination of filter columns so that you get independent query plans. You might also try the WITH RECOMPILE option, but I don't think it will help you.
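For illustration, a rough sketch of what the dynamic SQL route could look like, reusing the placeholder names from the question and assuming the parameters are integers (this is one common shape of the technique, not the only one):

    -- Build the WHERE clause only from the filters actually supplied.
    DECLARE @sql nvarchar(4000)
    SET @sql = N'SELECT field1, field2 FROM my_view WHERE 1 = 1'

    IF @PARM1 <> -1 SET @sql = @sql + N' AND parm1 = @PARM1'
    IF @PARM2 <> -1 SET @sql = @sql + N' AND parm2 = @PARM2'
    IF @PARM3 <> -1 SET @sql = @sql + N' AND parm3 = @PARM3'

    -- Passing the values as parameters keeps a reusable plan per filter combination
    -- and avoids SQL injection.
    EXEC sp_executesql @sql,
        N'@PARM1 int, @PARM2 int, @PARM3 int',
        @PARM1, @PARM2, @PARM3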

Joel Coehoorn
Damn. Beat me to it. +1 :)
DVK
The first option will not necessarily return the same results as the second one. If you have rows in your table with NULL values in those columns, they will NOT be returned by the "option 1" query. For example: SELECT * FROM Table WHERE NullableColumn = NullableColumn
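To see that concretely, a small made-up repro (table and data invented for illustration); the NULL row disappears because NULL = NULL evaluates to UNKNOWN, not TRUE:

    DECLARE @t TABLE (NullableColumn int NULL)
    INSERT INTO @t VALUES (1)
    INSERT INTO @t VALUES (NULL)

    SELECT * FROM @t                                        -- returns 2 rows
    SELECT * FROM @t WHERE NullableColumn = NullableColumn  -- returns 1 row; the NULL row is filtered out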
G Mastros
All three parms are NOT NULL, so that's not an issue in this case. Looks like I might be stuck with option 1.
IronicMuffin
Thank you all for the fast responses and great advice.
IronicMuffin
+1  A: 

If you pass in a null value when you want everything, then you can write your where clause as

   Where colName = IsNull(@Parameter, colName)

This is basically the same as your first method... it will work as long as the column itself is not nullable... NULL values IN the column will mess it up slightly.

The only approach to speed it up is to add an index on the columns being filtered on in the WHERE clause. Is there one already? If not, adding one should result in a dramatic improvement.
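For example, something along these lines (the table name and column list are placeholders; the index goes on the base table the view reads from, not on the view itself):

    -- Hypothetical composite index covering the three filter columns.
    CREATE INDEX IX_MyTable_parm1_parm2_parm3
        ON dbo.MyTable (parm1, parm2, parm3)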

Charles Bretana
Do you think this will speed it up at all? It seems like it's basically the same operation but checking null instead of -1. Is IsNull more efficient than a CASE?
IronicMuffin
@IronicMuffin (I like your moniker): No, you're exactly right, this is equivalent to your first method...
Charles Bretana
+3  A: 

If you are running SQL Server 2005 or above, you can use IFs to make multiple versions of the query, each with the proper WHERE clause so an index can be used, as sketched below. Each query plan will be placed in the plan cache.
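A minimal sketch with a single optional parameter (reusing the question's placeholder names); with three parameters the branches multiply quickly, which is where the article below comes in:

    -- Each branch is a simple, index-friendly query.
    IF @PARM1 = -1
        SELECT field1, field2 FROM my_view
    ELSE
        SELECT field1, field2 FROM my_view WHERE parm1 = @PARM1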

Also, here is a very comprehensive article on this topic:

Dynamic Search Conditions in T-SQL by Erland Sommarskog

It covers all the issues and methods involved in writing queries with multiple optional search conditions.

Here is the table of contents:

   Introduction
      The Case Study: Searching Orders
      The Northgale Database
   Dynamic SQL
      Introduction
      Using sp_executesql
      Using the CLR
      Using EXEC()
      When Caching Is Not Really What You Want
   Static SQL
      Introduction
      x = @x OR @x IS NULL
      Using IF statements
      Umachandar's Bag of Tricks
      Using Temp Tables
      x = @x AND @x IS NOT NULL
      Handling Complex Conditions
   Hybrid Solutions – Using both Static and Dynamic SQL
      Using Views
      Using Inline Table Functions
   Conclusion
   Feedback and Acknowledgements
   Revision History
KM
+1 for linking anything by Mr Sommarskog
Joel Coehoorn
Bookmarked... thank you. We use a mix of 2000/2005 servers; unfortunately, this one is 2000.
IronicMuffin
The article covers numerous ways to handle variable search conditions; the IF approach is just one.
KM