I am using many dynamic queries in my stored procedures because my filter is not fixed, so I take @Filter as a parameter and pass it into the procedure.

DECLARE @query varchar(8000)
DECLARE @Filter varchar(1000)

SET @query = 'SELECT * FROM Person.Address WHERE 1=1 AND ' + @Filter

EXEC(@query)

The filter can contain any field from the table for comparison.

Will this affect my performance? Is there an alternative way to achieve this kind of thing?

A: 

As long as no joins have to be added dynamically depending on a parameter's value, and the only dynamic part is the WHERE clause, this could just as well be a static query that accepts all possible parameters. You then have the following scenarios:

In case you want to check values that can be anything (negative/null/zero/positive/empty strings/etc.), an auxiliary parameter is needed, something like @significant_param1, alongside the original value of @param1.

[...]
WHERE
    (@significant_param1 = 0 OR (@param1 IS NULL AND field1 IS NULL) OR @param1 = field1)
    AND (@significant_param2 = 0 OR (@param2 IS NULL AND field2 IS NULL) OR @param2 = field2)
    --etc
[...]

This is the most universal clause I could come up with. Basically, it checks the @significant_param value: if the parameter should be taken into account, it will be 1, so the first part of the condition is false and the verification of the parameter takes place. Within that verification, if @param is null, you are looking for all rows where field is null, and you cannot express that as @param = field, because NULL never compares equal to NULL. Otherwise, the regular comparison of non-null values takes place.
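
Put together, a minimal sketch of this pattern as a stored procedure (the procedure name and the City/AddressLine2 columns are illustrative assumptions against the question's Person.Address table):

CREATE PROCEDURE dbo.SearchAddress
    @significant_param1 bit = 0,
    @param1 nvarchar(30) = NULL,  -- compared against City
    @significant_param2 bit = 0,
    @param2 nvarchar(60) = NULL   -- compared against AddressLine2
AS
SELECT *
FROM Person.Address
WHERE
    (@significant_param1 = 0 OR (@param1 IS NULL AND City IS NULL) OR @param1 = City)
    AND (@significant_param2 = 0 OR (@param2 IS NULL AND AddressLine2 IS NULL) OR @param2 = AddressLine2)

-- e.g. filter on City only, ignore AddressLine2:
-- EXEC dbo.SearchAddress @significant_param1 = 1, @param1 = N'Seattle'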

If, on the other hand, the values in field can't be null (or can't be negative, and so on), you don't need @significant_param, because you can adopt a convention instead, for example: if @param is null, then this parameter is not significant (in the previous case, a null @param would instead have meant searching for all null values). Then you could use the following:

[...]
WHERE
    field1 = CASE WHEN @param1 IS NULL THEN field1 ELSE @param1 END --first way, with a CASE expression
    AND (@param2 IS NULL OR field2 = @param2) --second way, with boolean logic
[...]
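
The same variant as a minimal procedure sketch (again, the procedure name and the City/PostalCode columns are illustrative assumptions):

CREATE PROCEDURE dbo.SearchAddressSimple
    @param1 nvarchar(30) = NULL,  -- City; NULL means "do not filter"
    @param2 nvarchar(15) = NULL   -- PostalCode; NULL means "do not filter"
AS
SELECT *
FROM Person.Address
WHERE
    City = CASE WHEN @param1 IS NULL THEN City ELSE @param1 END
    AND (@param2 IS NULL OR PostalCode = @param2)

-- e.g. filter on postal code only:
-- EXEC dbo.SearchAddressSimple @param2 = N'98101'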
Alexander
You have to be really careful with this construct. Yes, it's an almost universal WHERE clause, but it's not, as they say, SARGable. That is, it's not going to be able to use your indexes effectively for processing your WHERE conditions. This is fine for filtering small subsets of data, but for a search of a large table, this is going to cause the database optimizer to scan.
Dave Markle
Nice answer and nice comment too, but I agree with Dave Markle that this is not an efficient way to solve this problem.
KuldipMCA
Yes, it's partially true, but note also that you can't build dynamic queries larger than 8000 characters from inside MSSQL. I say PARTIALLY true because, in my experience, I had such algorithms on large tables (hundreds of millions of records) that WERE performing quite badly, because the tables weren't indexed on the fields that were needed. As soon as replication to another server was set up, I was able to run the Database Engine Tuning Advisor, which proposed complex indexes and statistics, after which performance improved by a factor of ~10. I say trying doesn't hurt.
Alexander
A: 

There really is nothing wrong with dynamic queries, per se. But the way you're intending to go about it is scary. It sort of implies that your parameters will be part of @Filter, which is just asking for a SQL injection attack. It also means that your query plan will not likely be reused, which can result in high CPU and low throughput due to excess query recompilations.

You need to make sure that the dynamic SQL you generate is properly parameterized. You also need to make sure that when you access it using your ADO.NET code (or whatever data access technology you may be using), you use a SqlParameter (or equivalent) object.
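
On the T-SQL side, parameterized dynamic SQL might look like the following sketch (the @city filter is a hypothetical example): the statement text can still be built dynamically, but the values travel as parameters via sp_executesql instead of being concatenated into the string.

DECLARE @query nvarchar(4000)
DECLARE @city nvarchar(30)

SET @city = N'Seattle'
SET @query = N'SELECT * FROM Person.Address WHERE City = @city'

-- The value is passed as a parameter, not spliced into the SQL text,
-- so it cannot inject SQL and the cached plan can be reused:
EXEC sp_executesql @query, N'@city nvarchar(30)', @city = @city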

Dave Markle
+1  A: 

For performance, the only question is whether the database can reuse an existing plan or not.

In simple terms, you can think of it as the database caching the query plan with the SQL statement text as the key. As soon as the statement text changes, it will not be found in the cache and a new plan must be generated.

So a generated dynamic statement like

"SELECT * FROM table WHERE param = @paramvalue"

has a better chance of being in the cache than

"SELECT * FROM table WHERE param = '" + variable + "'"

You should also add the schema name to table names in the query (e.g. dbo.table); otherwise the plan will not be reused when the query is executed by logins with different default schemas.
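
Combining both points in a minimal sketch (dbo.SomeTable and @paramvalue are placeholder names), the statement text stays byte-for-byte identical across executions, so one cached plan can serve them all:

DECLARE @sql nvarchar(4000)
SET @sql = N'SELECT * FROM dbo.SomeTable WHERE param = @paramvalue'

-- Only the parameter value changes between calls; the cache key
-- (the statement text, schema-qualified) does not:
EXEC sp_executesql @sql, N'@paramvalue int', @paramvalue = 42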

adrianm
+1, it should be noted that SQL Server makes an attempt to convert queries of the latter type to parameterised queries internally for execution plan caching
Paolo