views: 258

answers: 4

I'm currently getting the following error message when executing an ~26 MB .sql file on SQL Server 2005:

Msg 701, Level 17, State 123
There is insufficient system memory to run this query.

I'm working with 4 GB of RAM, 64-bit Windows 7 Ultimate, and a Core2Duo T6400 (2 GHz)...

Is there a way to execute it without receiving this message (maybe by forcing SQL Server to use the swap file?), or a way to execute it in parts (say, 100 queries at a time)?

The file is basically a CREATE TABLE followed by thousands of INSERT statements, and I have a lot of these files (I converted .DBF files to SQL queries using ABC DBF Converter).

Any ideas will be very much appreciated!

+4  A: 

You could break the file up into several batches, e.g. by adding a GO statement after every thousand inserts:

INSERT db..table ( field list ) VALUES ...
INSERT db..table ( field list ) VALUES ...
GO
INSERT db..table ( field list ) VALUES ...
...
INSERT db..table ( field list ) VALUES ...
GO

Another way might be to use bulk uploading, e.g. the bcp utility.
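
For instance, a minimal bcp invocation might look like the sketch below (the server, database, table, and file names are placeholders; -T assumes Windows authentication, and -c with -t, assumes a comma-separated character-mode file):

REM load comma-separated rows from a text file into the target table
bcp MyDb.dbo.MyTable in C:\Data\MyTableData.txt -S myserver -T -c -t,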

Mark
Until it's edited, he means ADDING a `GO` statement after every thousand inserts.
Alex Bagnolini
Thanks Alex - updated as suggested
Mark
I already used GO statements to solve the problem, but I don't want to execute in batches, because if any error occurs I'll have problems rolling back the transaction...
Tufo
Then insert from each batch into another table, e.g. in tempdb, and then populate the main table with an INSERT ... SELECT FROM tempdb..x, as sketched below.
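
A rough sketch of that approach (the staging table and its two columns are illustrative placeholders):

-- load each GO-separated batch of inserts into a staging table first
CREATE TABLE tempdb..x ( id INT, name VARCHAR(50) )
GO
INSERT tempdb..x ( id, name ) VALUES ( 1, 'a' )
INSERT tempdb..x ( id, name ) VALUES ( 2, 'b' )
GO

-- then populate the main table in a single atomic, logged statement
BEGIN TRAN
INSERT db..table ( id, name )
SELECT id, name FROM tempdb..x
COMMIT TRAN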
Mark
A: 

In addition to sprinkling GO statements every so many records, if you are concerned about the whole thing either running or rolling back, then wrap it in a transaction like so:

SET XACT_ABORT ON  -- any run-time error aborts and rolls back the whole transaction
BEGIN TRAN

INSERT ...
INSERT ...
INSERT ...
...
GO
INSERT ...
INSERT ...
INSERT ...
GO

-- commit only if the transaction survived every batch
IF @@TRANCOUNT > 0 COMMIT TRAN

With XACT_ABORT set to ON, any INSERT statement that fails will roll back the entire transaction.

Thomas
+1  A: 

You could add DBCC commands in between your SQL queries, like:

DBCC FREESYSTEMCACHE ('ALL')
GO
DBCC FREESESSIONCACHE
GO
DBCC FREEPROCCACHE
GO

This will help release memory. Also, Microsoft released a hotfix to address this issue in SQL Server 2005 (look here). Try installing the hotfix/service pack.

Jojo Sardez
+5  A: 

This question actually seems to come up every so often here. Mark has the correct (and most commonly employed) answer, but let me try to add what I can to make this clearer.

The error message is a little misleading. SQL Server tells you that it doesn't have enough memory to run the query, but what it really means is that it doesn't have enough memory to parse the query.

When it comes to running the query, SQL Server can use all it wants - gigabytes if necessary. Parsing is another story; the server has to build a parse tree and there is only a very limited amount of memory available for that. I've never found the actual limit documented anywhere but for a typical batch full of INSERT statements, it can't handle more than a few MB at a time.

So I am sorry to tell you this, but you cannot make SQL Server execute this script exactly as it is written. No way, no how, no matter what settings you tweak. You do, however, have three options to work around it:

  1. Use GO statements. GO is used by SSMS and various other tools as a batch separator. Instead of a single parse tree being generated for the entire script, individual parse trees are generated for each segment of the batch separated by GO. This is what most people do, and it is still very simple to make the script transactionally safe, as others have demonstrated, so I won't repeat that here.

  2. Instead of generating a massive script to insert all of the rows, keep the data in a text file (e.g. comma-separated). Then import it using the bcp utility. If you need this to be "scriptable" - i.e. the import needs to happen in the same script/transaction as the CREATE TABLE statement - then use BULK INSERT instead. Although BULK INSERT can be a minimally-logged operation, believe it or not, it can still be placed within a BEGIN TRAN / COMMIT TRAN block.

  3. If you really, really want the INSERT to be a fully logged operation, and don't want the insertions to happen in batches, then you can use OPENROWSET to open up a text file, Excel file, etc. as an ad-hoc "table", and then insert this into your newly-created table (see the sketch just below this list). I'm normally loath to ever recommend the use of OPENROWSET, but as this is clearly an administrative script, it's not really a major problem.
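
For completeness, here is a sketch of option #3 (this assumes a format file describing the text file's columns; the file paths and names are placeholders):

BEGIN TRAN

CREATE TABLE MyTable (...)

-- OPENROWSET(BULK ...) exposes the text file as an ad-hoc rowset;
-- this INSERT ... SELECT into the new table is a fully logged operation
INSERT INTO MyTable
SELECT *
FROM OPENROWSET(
    BULK 'C:\Scripts\Data\MyTableData.txt',
    FORMATFILE = 'C:\Scripts\Data\MyTable.fmt'
) AS src

COMMIT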


Previous comments suggest that you're uncomfortable with #1, although that may just be because of an incorrect assumption that it can't be done in a single transaction, in which case see Thomas's answer. But if you're dead-set on going another way, I suggest going with #2, creating a text file and using BULK INSERT. An example of a "safe" script would be:

BEGIN TRAN

BEGIN TRY

    CREATE TABLE MyTable (...)

    BULK INSERT MyTable
    FROM 'C:\Scripts\Data\MyTableData.txt'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\r\n',
        BATCHSIZE = 1000,
        MAXERRORS = 1
    )

    COMMIT

END TRY

BEGIN CATCH

    ROLLBACK

END CATCH

Hopefully this helps put you on the right track. I'm pretty sure this covers all of your available "in the box" options - beyond these, you'd have to start writing actual application programs or shell scripts to do the work, and I don't think that level of complexity is really warranted here.

Aaronaught