I am trying to run many different queries against a result set that is very expensive to create. To get performance gains I wish to use a temp table and just run the many queries against that temp table.

Seems pretty standard. Yet I am struggling to share this temp table between dynamic SQL queries. As I understand it, each SqlCommand object executes in its own thread, so the temp table is in a different scope - making it inaccessible from the querying thread.

I tried using a global temporary table and that works great, but it doesn't seem ideal.

How can I share a local temporary table between dynamic SQL queries?

My intent:

using (var conn = new SqlClient.SqlConnection("..."))
{
    conn.Open();

    // Creation involves many table joins in reality
    String creationScript = "SELECT * INTO #MyTemp FROM FooTable";
    SqlCommand createTempTbl = new SqlCommand(creationScript, conn);
    createTempTbl.ExecuteNonQuery();

    String query1 = "SELECT * FROM #MyTemp where id=@id";
    SqlCommand query1Comm = new SqlCommand(query1, conn);
    query1Comm.Parameters.Add("@id", ...);

    String query2 = "SELECT * FROM #MyTemp where name=@name";
    SqlCommand query2Comm = new SqlCommand(query2, conn);
    query2Comm.Parameters.Add("@name", ...);

    // And so on the queries go

} // Now want #MyTemp to be destroyed
+1  A: 

You could try using a global temporary table (ie, use ##MyTemp rather than #MyTemp in your queries), with this caveat:

Global temporary tables are automatically dropped when the session that created the table ends and all other tasks have stopped referencing them. The association between a task and a table is maintained only for the life of a single Transact-SQL statement. This means that a global temporary table is dropped at the completion of the last Transact-SQL statement that was actively referencing the table when the creating session ended.


EDIT: Oops, missed the fact that you've already tried global temp tables.

How about moving all of your logic into a single stored procedure which creates/populates the temp table and then runs the queries and returns multiple resultsets to the client code?

LukeH
The question says that the poster already tried that...
Mitch Wheat
I *really* don't want to go the sproc route just yet; I'm trying to test my BL. And I need to keep the result set strictly at the DB - I went down the other road and memory got hit hard.
vanslly
@vanslly, I'm not sure what you mean about "hav[ing] the resultset strictly at the DB", but if you don't want to use a sproc at this time then the only realistic option I can think of is a global temp table.
LukeH
@Luke, assuming the connection is shared only by the temp table and its queries on a per-instance basis, would the global table 'die' after conn is closed?
vanslly
@vanslly, I think that's the case, but it's a long time since I've played with global temp tables and I can't remember for sure (and I don't have access to SQL Server right now to test). Maybe execute a "DROP TABLE ##MyTemp" just before closing the connection, to make certain.
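The session behaviour Luke describes can be sketched in plain T-SQL (table and column names here are made up for illustration):

```sql
-- Session 1: create and populate the global temp table
SELECT * INTO ##MyTemp FROM FooTable;

-- Any session can now read it while session 1 is still alive:
SELECT COUNT(*) FROM ##MyTemp;

-- Drop explicitly before the connection goes back to the pool;
-- a pooled (still-open) session could otherwise keep ##MyTemp alive.
DROP TABLE ##MyTemp;
```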
LukeH
+1  A: 

What is missing from your question is the lifecycle of the created table. If you will have it sticking around for a while, then it is not quite a temp table, it is a work table that you populate and use. I would not use a temp table at all, just a regular table that gets created by the SELECT INTO and used by everyone else until it gets dropped (if ever).

Otávio Décio
As I tried to indicate with the intention code, I wish the table to persist *only* as long as it takes to complete all the queries that share a connection. The number of consecutive queries can easily exceed 65k, hence the need for the temp table.
vanslly
A: 

An alternative that I've used successfully is to create a work table in TempDb, and use it as if it were a global temp table (e.g., "TempDb.dbo.MyTable"). Remember that user tables in TempDb are dropped when SQL Server restarts.
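A minimal sketch of this approach (the work-table name is hypothetical); note the three-part name so any connection can reach it:

```sql
-- Create a work table directly in TempDb; it survives until it is
-- dropped or until SQL Server restarts (TempDb is rebuilt on startup).
SELECT * INTO TempDb.dbo.MyWorkTable FROM FooTable;

-- Any connection can query it via its three-part name:
SELECT COUNT(*) FROM TempDb.dbo.MyWorkTable;

-- Clean up when finished:
DROP TABLE TempDb.dbo.MyWorkTable;
```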

Jesse
A: 

I know it's a while since this one was posted but the answer, I believe, is quite simple.

I surmise you are using the MS Enterprise Library to access the database, which explains why the temp table doesn't exist between commands: the Enterprise Library EXPLICITLY closes the connection to the DB (puts it back in the pool) when each command finishes, UNLESS you put the commands inside a transaction. If you use ADO.NET directly (opening the connection, building and executing the commands, then closing the connection) you do not get this problem, since it's up to you when the connection closes (which is more risky). Here is some code written using the MS Enterprise Library and a transaction (sorry, VB.NET):

' Get a reference to the database
Dim sqlNET As New Sql.SqlDatabase("*Your Connection String Here...*")

' Specify the transaction options
Dim oTranOpt As TransactionOptions = New TransactionOptions
' What type of isolation the transaction should have (V. Important):
oTranOpt.IsolationLevel = IsolationLevel.ReadUncommitted ' This one doesn't place locks on the DB but allows dirty reads
' How long the transaction has to complete before implicitly failing (h,m,s):
oTranOpt.Timeout = New TimeSpan(0, 0, 30)

' Start the scope of the transaction
Using oTranScope As TransactionScope = New TransactionScope(TransactionScopeOption.Required, oTranOpt)

    ' Create the connection to the DB. Not absolutely necessary (EL will create one), but best to do so.
    Using Conn As Common.DbConnection = sqlNET.CreateConnection

        ' Create a temp table
        sqlNET.ExecuteNonQuery(CommandType.Text, "SELECT * INTO #MyTemp FROM FooTable")

        ' Get results from the table, e.g.
        Dim intCount As Integer = sqlNET.ExecuteScalar(CommandType.Text, "Select Count(*) from #MyTemp")

        MsgBox(intCount)

        ' Flag the transaction as successful (causes a commit when the scope is disposed)
        oTranScope.Complete()

    End Using ' Disposes the connection

End Using ' If this point is reached without hitting oTranScope.Complete, the transaction is rolled back and locks, if any, are released.

If you were to take out the transaction scope, the code would fail on the Select count(*) as the table no longer exists. Specifying the scope keeps the connection open between command calls.
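For completeness, the same idea in plain ADO.NET (C#, matching the question's code) is a sketch like the following: as long as every command shares one explicitly opened SqlConnection, the local temp table stays in scope between commands. The connection string and table names are placeholders:

```csharp
using System;
using System.Data.SqlClient;

class TempTableDemo
{
    static void Main()
    {
        using (var conn = new SqlConnection("Server=.;Database=Foo;Integrated Security=true"))
        {
            conn.Open(); // keep this one session open for all commands

            // #MyTemp lives in this session until the connection closes
            new SqlCommand("SELECT * INTO #MyTemp FROM FooTable", conn)
                .ExecuteNonQuery();

            // A later command on the SAME open connection still sees #MyTemp
            var count = new SqlCommand("SELECT COUNT(*) FROM #MyTemp", conn);
            Console.WriteLine(count.ExecuteScalar());
        } // closing the connection drops #MyTemp automatically
    }
}
```

The crucial point is the single `conn.Open()` call: with the Enterprise Library (outside a transaction), each command gets its own pooled connection, so the temp table created by one command is gone by the next.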

I hope this helps someone.

Neil.

Neilo