I have a report which, on execution, connects to the database with the my_report_user username. There can be many end-users of the report, and on each execution a new connection to the database is made with my_report_user (there is no connection pooling).

I have a result set which I think can be created just once (maybe on the first run of the report), so that other report executions can simply reuse it. Basically, each report execution should check whether this result set (stored as a temp table) exists or not. If it does not exist, create that result set; otherwise just reuse what's available (a sketch of this check-then-create pattern follows the question).

Should I use local temp tables (#) or global temp tables (##)?

Has anyone tried something like this? If so, please let me know what I should watch out for (near-simultaneous report runs, etc.).

EDIT: I am using SQL Server 2005
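
For illustration, here is a minimal T-SQL sketch of the check-then-create pattern the question describes, using a global ##temp table. The table name, the source query (dbo.SalesDetail), and the date filter are hypothetical, and the answers below explain why this approach is fragile (the table's lifetime is tied to connections, and near-simultaneous first runs can race).

```sql
-- Hypothetical check-then-create sketch; ##CachedReportData and dbo.SalesDetail
-- are illustrative names only.
IF OBJECT_ID('tempdb..##CachedReportData') IS NULL
BEGIN
    -- First run: build the shared result set once.
    SELECT OrderID, OrderDate, Amount
    INTO ##CachedReportData
    FROM dbo.SalesDetail
    WHERE OrderDate >= '20091001';
END

-- Every run (first or not) reads from the cached copy.
SELECT OrderID, OrderDate, Amount
FROM ##CachedReportData;
```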

+4  A: 

Neither

If you want to cache result sets under your own control, then you cannot use temp tables of any kind. You should use ordinary user tables, stored either in tempdb or in your own result-set cache database.

Temp tables, both #local and ##shared, have a lifetime controlled by the connection(s). If your application disconnects, the temp table is deleted, and that does not work well with what you describe.

The really difficult problem will be populating these cached result sets under concurrent runs without mixing things up (ending up with result sets containing duplicate items from concurrent report runs that each believed it was the 'first' run).

As a side note, SQL Server Reporting Services already does this out of the box. You can cache and share datasets, and you can cache and share reports; it already works and has been tested for you.

Remus Rusanu
"The real difficult prolem will be to populate these cached result sets under concurent runs without mixing things up" I am also worried about this thing. Thanks for the help, btw.
peakit
You can use app locks. Use a resource name that identifies the report you're viewing, e.g. "report sales north west fiscal october 2009", or "report id 42". 1. sp_getapplock 'shared', 'session'. 2. Check the data set; if present, use it for the report, then release the app lock and exit. 3. If not present, release the app lock, then acquire it back as exclusive (X): sp_getapplock 'exclusive', 'session'. 4. Check again whether the report data exists; if not, create it and use it, then release the X lock and exit. 5. If the report data exists after the X lock is acquired, release the X lock and go back to step 1. (A sketch of this sequence appears after this comment thread.)
Remus Rusanu
I appreciate your knowledge. But there is still a chance that the result set gets cleared by the engine (when there are no active connections to that ## temp table), which defeats my caching purpose. What say? Though the locking mechanism you gave may hold good for solving the concurrency...
peakit
That's why I say don't use ##tables of any kind.
Remus Rusanu
Thanks for the help, Remus!
peakit
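
To make the sequence Remus outlines concrete, here is a minimal T-SQL sketch of that double-checked app-lock pattern, written as a stored procedure. The procedure name, the cache table dbo.ReportCache_42, the resource string, and the source query are all hypothetical; error handling (checking sp_getapplock's return code, lock timeouts) is omitted, and step 5 is condensed to reading under the exclusive lock rather than looping back to step 1.

```sql
-- Sketch of the double-checked app-lock pattern described in the comments.
-- All object names and the source query are hypothetical.
CREATE PROCEDURE dbo.usp_GetCachedReport
AS
BEGIN
    DECLARE @resource sysname;
    SET @resource = N'report id 42';          -- one resource name per cached report

    -- Step 1: shared lock while we look for the cached result set.
    -- (A production version should check sp_getapplock's return code.)
    EXEC sp_getapplock @Resource = @resource, @LockMode = 'Shared',
                       @LockOwner = 'Session', @LockTimeout = 30000;

    IF OBJECT_ID('dbo.ReportCache_42', 'U') IS NOT NULL
    BEGIN
        -- Step 2: cache hit - read it and release the shared lock.
        SELECT * FROM dbo.ReportCache_42;
        EXEC sp_releaseapplock @Resource = @resource, @LockOwner = 'Session';
        RETURN;
    END

    -- Step 3: cache miss - swap the shared lock for an exclusive one.
    EXEC sp_releaseapplock @Resource = @resource, @LockOwner = 'Session';
    EXEC sp_getapplock @Resource = @resource, @LockMode = 'Exclusive',
                       @LockOwner = 'Session', @LockTimeout = 30000;

    -- Step 4: check again; another session may have built it in the meantime.
    IF OBJECT_ID('dbo.ReportCache_42', 'U') IS NULL
    BEGIN
        SELECT OrderID, OrderDate, Amount     -- hypothetical source query
        INTO dbo.ReportCache_42
        FROM dbo.SalesDetail
        WHERE OrderDate >= '20091001';
    END

    -- Step 5 (condensed): read under the exclusive lock, then release it.
    SELECT * FROM dbo.ReportCache_42;
    EXEC sp_releaseapplock @Resource = @resource, @LockOwner = 'Session';
END
```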
A: 

It sounds like you are moving into OLAP territory now. Reading up on data warehousing will definitely help you.

Raj More
+1  A: 

I find #temp tables can be useful in certain scenarios, but not as a best practice. I have yet to find a valid use for global ##temp tables, either in my own work or in the work of anyone else who has written about them. The only case I can think of is BCP or another external process that needs to build a temporary data store and then retrieve it in some subsequent step. In that case I would prefer to use a permanent table with some kind of key and a background process to handle cleanup (see the sketch after this answer).

Aaron Bertrand
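
As a concrete illustration of the permanent-table alternative Aaron mentions, here is a hypothetical sketch: each load is keyed by a batch identifier so concurrent processes do not mix rows, and a scheduled job (SQL Server Agent, for example) purges old batches. The table, columns, source query, and one-day retention window are all assumptions.

```sql
-- Hypothetical permanent staging table keyed per batch, plus a cleanup statement
-- intended to run from a scheduled background job.
CREATE TABLE dbo.ReportStaging
(
    BatchID   uniqueidentifier NOT NULL,
    CreatedAt datetime         NOT NULL DEFAULT GETDATE(),
    OrderID   int              NOT NULL,
    Amount    money            NOT NULL,
    CONSTRAINT PK_ReportStaging PRIMARY KEY (BatchID, OrderID)
);
GO

-- Each process writes and reads only its own BatchID.
DECLARE @batch uniqueidentifier;
SET @batch = NEWID();

INSERT dbo.ReportStaging (BatchID, OrderID, Amount)
SELECT @batch, OrderID, Amount
FROM dbo.SalesDetail;                      -- hypothetical source table

SELECT OrderID, Amount
FROM dbo.ReportStaging
WHERE BatchID = @batch;
GO

-- Background cleanup: remove batches older than one day.
DELETE dbo.ReportStaging
WHERE CreatedAt < DATEADD(day, -1, GETDATE());
```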