I have a query which joins 6 tables, produces 800,000 rows, and inserts them into a table.

I'm running this query on SQL Server 2005 Standard on an 8-core machine with no other workload running. The SQL Server service uses only one CPU core while executing this query (pegging that core at 100%), and as a result the query runs for almost 4 minutes.

How could I make my query use all the available CPUs?

Rebuilding the statistics did not help. There have been times when this query used all the CPUs and ran considerably faster.

(The query also uses two scalar T-SQL functions, but this should not be a problem, since it was not before. These functions use only their parameters in their calculations and do not access any tables in the database, so they should not prevent parallelization of the query.)
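
For illustration only, the query has roughly the following shape; every table, column, and function name below is a placeholder, not a real object from my database:

    -- Hypothetical sketch of the query's shape; real object names differ.
    INSERT INTO dbo.TargetTable (Col1, Col2, CalcA, CalcB)
    SELECT  t1.Col1,
            t2.Col2,
            dbo.fnEvalFormulaA(t3.Param1, t4.Param2),  -- scalar T-SQL UDF
            dbo.fnEvalFormulaB(t5.Param3, t6.Param4)   -- scalar T-SQL UDF
    FROM    dbo.Table1 t1
            JOIN dbo.Table2 t2 ON t2.Key1 = t1.Key1
            JOIN dbo.Table3 t3 ON t3.Key2 = t2.Key2
            JOIN dbo.Table4 t4 ON t4.Key3 = t3.Key3
            JOIN dbo.Table5 t5 ON t5.Key4 = t4.Key4
            JOIN dbo.Table6 t6 ON t6.Key5 = t5.Key5;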

(Could turning on "read committed snapshot isolation" for the database cause this behavior? The query ran fine before I enabled it, and it has been running on a single CPU ever since.)
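
For reference, that setting is enabled per database; the database name below is a placeholder:

    -- Enable read committed snapshot isolation (placeholder database name).
    -- The ALTER waits until no other connections are using the database.
    ALTER DATABASE MyDatabase SET READ_COMMITTED_SNAPSHOT ON;

    -- Check whether it is currently on.
    SELECT name, is_read_committed_snapshot_on
    FROM sys.databases
    WHERE name = 'MyDatabase';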

+2  A: 

Have a look at the query plan; it will probably give you some insight into what's happening. You could also post the query and its plan (in text mode) here for others to analyze.
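
On SQL Server 2005, a text-mode plan can be captured like this (the query itself is omitted):

    -- Estimated plan as text, without executing the statement.
    -- SHOWPLAN_TEXT must be the only statement in its batch.
    SET SHOWPLAN_TEXT ON;
    GO
    -- ... the INSERT ... SELECT goes here ...
    GO
    SET SHOWPLAN_TEXT OFF;
    GO

    -- Alternatively, SET STATISTICS PROFILE ON returns the actual plan
    -- (with real row counts) after the query executes.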

Try forcing another transaction isolation level (SET TRANSACTION ISOLATION LEVEL READ COMMITTED) and comparing query plans - it will tell you whether changing the default isolation caused the problem.
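
Forcing the level for the session is a one-liner:

    -- Set the session's isolation level before running the query,
    -- then compare the plan with the one produced under the database default.
    SET TRANSACTION ISOLATION LEVEL READ COMMITTED;

    -- (Note: with READ_COMMITTED_SNAPSHOT on, READ COMMITTED still uses row
    -- versioning; the READCOMMITTEDLOCK table hint forces locking reads.)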

Just in case, you might want to check the max degree of parallelism option (it could be set as a server-wide option or as a MAXDOP hint for a particular query).
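
Both places can be checked roughly like this (the SELECT is only a stand-in to show where the hint goes):

    -- Server-wide setting; 0 means SQL Server may use all available CPUs.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max degree of parallelism';

    -- Per-query override via a hint (stand-in query):
    SELECT name
    FROM sys.objects
    OPTION (MAXDOP 8);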

VladV
Thanks. There is no MAXDOP hint on the query, nor a server-wide setting for it. (Some other queries do run with a parallel plan.)
treaschf
Interestingly, when I commented out the function calls in the query, it ran in 2 seconds, but still with a non-parallel plan. It seems the optimizer considers the query too cheap to be worth parallelizing.
treaschf
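
If the plan's estimated cost really is low, the server-wide "cost threshold for parallelism" setting (default 5) is what decides whether a parallel alternative is even considered; it can be checked like this:

    -- Advanced option; plans cheaper than this (in cost units) stay serial.
    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'cost threshold for parallelism';
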
Generally, calling a scalar function in a large query is not a wise decision: the function is called once for each row, and the overhead of hundreds of thousands of function calls is significant. Is it possible to replace the function calls with precomputed values or a lookup table?
VladV
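
A minimal sketch of the precomputation idea, with invented object names (note also that scalar T-SQL UDFs by themselves generally force a serial plan in SQL Server 2005): compute the function results once per distinct parameter combination, then join them in instead of calling the UDF per output row.

    -- Hypothetical: precompute one function's results for the distinct
    -- parameter combinations that actually occur.
    SELECT DISTINCT
           t3.Param1,
           t4.Param2,
           dbo.fnEvalFormulaA(t3.Param1, t4.Param2) AS ResultA
    INTO   #FormulaA
    FROM   dbo.Table3 t3
           JOIN dbo.Table4 t4 ON t4.Key3 = t3.Key3;

    -- The big INSERT ... SELECT then joins to #FormulaA on (Param1, Param2)
    -- instead of calling dbo.fnEvalFormulaA for each of the 800,000 rows.
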
Not really, as the functions evaluate formulas written in Polish notation. One thing I could do, though: after rewriting the functions in C#, the query still runs on a single CPU, but it takes only 20 seconds instead of 4 minutes.
treaschf
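
For reference, a CLR scalar function is still called once per row, but with far less call overhead than a T-SQL UDF; the T-SQL side of wiring one up looks roughly like this (assembly, path, class, and function names are all placeholders):

    -- CLR integration must be enabled on the server first.
    EXEC sp_configure 'clr enabled', 1;
    RECONFIGURE;
    GO
    -- Hypothetical assembly containing the C# formula evaluator.
    CREATE ASSEMBLY FormulaFunctions
    FROM 'C:\assemblies\FormulaFunctions.dll';  -- placeholder path
    GO
    -- Hypothetical T-SQL wrapper for the C# method.
    CREATE FUNCTION dbo.fnEvalFormulaA_Clr (@p1 float, @p2 float)
    RETURNS float
    AS EXTERNAL NAME FormulaFunctions.[FormulaFunctions.Evaluator].EvalFormulaA;
    GO
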
Thank you for your answer. It seems there is no easy way to make this query run with a parallel plan. Maybe I could force one by using the OPTION (USE PLAN N'xml_plan') clause, but I'm not going to hand-write a parallel XML plan for this query at this time.
treaschf
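
For completeness, the USE PLAN approach needs a captured XML plan from a run that did go parallel; the sketch below shows only the mechanics, with the plan body left out:

    -- Capture the actual XML plan for a run that produced the desired
    -- (parallel) plan.
    SET STATISTICS XML ON;
    -- ... run the query and copy the ShowPlanXML output ...
    SET STATISTICS XML OFF;

    -- Then attach it to the query (plan XML elided; it is pasted in verbatim):
    -- SELECT ... OPTION (USE PLAN N'<ShowPlanXML ...> ... </ShowPlanXML>');
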
A: 

Refer to this and this.

Hope this helps

HotTester
He said some other queries run with a parallel plan, so licensing is unlikely to be the problem.
VladV
Yes, the license is not the problem in this case. Thank you for your answer.
treaschf