Hello, I have a SqlDataAdapter that is being populated with 21 rows of data (4 columns). The sproc that drives it returns in a couple of seconds in SQL Management Studio, but the .Fill() call takes 5 minutes.

    ArrayList ret = new ArrayList();
    SqlCommand cmd = base.GetStoredProc("usp_dsp_Stuff"); // Returns immediately in SSMS.
    cmd.CommandTimeout = 3600; // 3600 s = 60 min - debug only
    base.AddParameter(ref cmd, "@Param1", ParameterDirection.Input, SqlDbType.BigInt, 8, 19, 0, theParam1);
    base.AddParameter(ref cmd, "@Param2", ParameterDirection.Input, SqlDbType.BigInt, 8, 19, 0, theParam2);
    base.AddParameter(ref cmd, "@Param3", ParameterDirection.Input, SqlDbType.Char, 1, 'C');
    SqlDataAdapter da = new SqlDataAdapter(cmd);
    DataTable dt = new DataTable();
    da.Fill(dt); // Takes 5 minutes.

Any ideas?

Thanks in advance! -Chris

A: 

Fill() can sometimes be slow because .NET is analysing the data that comes back from the procedure.

Use SQL Profiler to work out what SQL .NET is actually sending when Fill() executes.

If it is sending a lot of SET statements, such as

    SET CONCAT_NULL_YIELDS_NULL ON
    SET CURSOR_CLOSE_ON_COMMIT OFF
    SET IMPLICIT_TRANSACTIONS OFF

and so on, then putting those same SET statements into your stored procedure may speed things up.
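
For example, a minimal sketch (the procedure name is taken from the question above; which options to set, and to what values, depends entirely on what Profiler shows your connection sending):

    ALTER PROCEDURE usp_dsp_Stuff
        @Param1 BIGINT,
        @Param2 BIGINT,
        @Param3 CHAR(1)
    AS
    BEGIN
        -- Mirror the session options the client API sets, so the procedure
        -- runs under the same options as the application connection.
        SET CONCAT_NULL_YIELDS_NULL ON;
        SET CURSOR_CLOSE_ON_COMMIT OFF;
        SET IMPLICIT_TRANSACTIONS OFF;

        -- ... original body of the procedure ...
    END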

codeulike
+1  A: 

Thank you for the help. The solution was to add WITH (NOLOCK) hints to the joins the sproc was using:

    FROM category_tbl c INNER JOIN dbo.categoryItem_LNK cl WITH (NOLOCK) ON c.categoryid = cl.categoryid

I don't know why we were only seeing degradation when using the SqlDataAdapter, but this change solved the problem right away.

Thanks again, Chris

Chris
That worked for me as well; I was getting a timeout even when pulling just 10 records from a table-valued function.
McLovin
A: 

I hate to break the news, but (NOLOCK) isn't a solution; it just creates a new problem. Locks in a SQL database are your friend.

If locking (or worse, blocking) was causing it to be slow, compare the connection options used when running through SSMS with the ones used by your application. Use SQL Profiler to see how the code is being executed.

If any of those columns are large objects, keep in mind that SSMS retrieves only a few hundred characters by default; the extra data returned to your application could be a factor.
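
As a rough illustration of that comparison (an assumption on my part, not something spelled out above: this uses sys.dm_exec_sessions, available on SQL Server 2005 and later, and requires VIEW SERVER STATE), you can inspect each connection's SET options directly. ARITHABORT is a classic difference: SSMS turns it on by default while the .NET client leaves it off, which alone can produce different plans:

    -- Compare session-level SET options between the SSMS connection and
    -- the application's connection (run this from a third session).
    SELECT session_id,
           program_name,
           arithabort,              -- ON by default in SSMS, OFF from ADO.NET
           concat_null_yields_null,
           quoted_identifier,
           ansi_nulls
    FROM sys.dm_exec_sessions
    WHERE program_name IS NOT NULL;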

Bruce
A: 

Bad query plans and parameter sniffing. For a stored proc, especially one where the parameters wildly change the number of rows read, a bad execution plan compiled from the first set of incoming parameters is the likely cause. It doesn't happen in SQL Server Management Studio because SSMS runs with different SET options.

This thread sums up your issue nicely: http://social.msdn.microsoft.com/Forums/en-US/transactsql/thread/9fd72536-f714-422a-b4c9-078e2ef365da/

This is a typical case of parameter sniffing. Your application most likely runs with different SET options (set by the client API) and uses a different execution plan than the one created in SSMS. When your procedure is invoked for the first time via your application, it creates an execution plan based on the parameters passed. However, this execution plan may not be good for another set of parameters, which can result in poor performance when executed with those other parameters. See the following for more details and different solutions: http://pratchev.blogspot.com/2007/08/parameter-sniffing.html

Here is more on the internals of plan caching and query plan reuse:
http://technet.microsoft.com/en-us/library/cc966425.aspx
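
One common workaround from those references, sketched against the hypothetical usp_dsp_Stuff and the tables mentioned above (the filter column and WHERE clause are invented for illustration): copy the parameters into local variables, which the optimizer cannot sniff, so it compiles a generic plan instead of one tailored to the first caller's values. OPTION (RECOMPILE) on the problem statement is an alternative.

    ALTER PROCEDURE usp_dsp_Stuff
        @Param1 BIGINT,
        @Param2 BIGINT,
        @Param3 CHAR(1)
    AS
    BEGIN
        -- The optimizer cannot sniff local variables, so the plan is built
        -- from average density statistics rather than the first caller's values.
        DECLARE @p1 BIGINT, @p2 BIGINT, @p3 CHAR(1);
        SELECT @p1 = @Param1, @p2 = @Param2, @p3 = @Param3;

        SELECT c.categoryid
        FROM dbo.category_tbl c
        INNER JOIN dbo.categoryItem_LNK cl ON c.categoryid = cl.categoryid
        WHERE cl.itemid = @p1; -- hypothetical filter column
    END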

debugme