Hello Everyone,

I have a SQL script with INSERT statements that inserts thousands of rows (approx. 12,000). When I try running the script in SSMS, it throws an 'Out of memory' exception after a while.

"An error occurred while executing batch. Error message is: Exception of type 'System.OutOfMemoryException' was thrown."

I am running SQL Server 2008 on Vista with 3 GB of RAM.

Any thoughts or pointers would be appreciated!

+1  A: 

You may need to insert the odd COMMIT so as not to fill up the transaction log.
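
For example (a minimal sketch; the table name and columns are hypothetical), you could wrap each group of inserts in an explicit transaction and commit it before starting the next:

  BEGIN TRANSACTION;
  INSERT INTO MyTable VALUES (...);
  INSERT INTO MyTable VALUES (...);
  -- ... more inserts ...
  COMMIT;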

paxdiablo
@paxdiablo: Do you mean that the rows need to be split into multiple transactions?
pencilslate
Yes, just as general advice, not as an answer to your specific problem (which is why I made it a community wiki). And only if they're actually in a transaction to begin with. I know DB2 auto-starts transactions if you don't do it manually; it may be that SQL Server doesn't do that, instead treating each insert as a single unit of work.
paxdiablo
+2  A: 

System.OutOfMemoryException is a CLR exception, not a SQL Server exception. SQL Server would raise error 701, and besides, it would not run out of memory simply from executing some inserts.

The fact that you get a CLR exception indicates that the problem is perhaps in SSMS itself. Make sure your script does not return spurious result sets and messages to SSMS. Also, try executing the script from sqlcmd.

Remus Rusanu
His error is correct. SQL Server Management Studio (SSMS) is running out of memory, and it does run .NET code internally.
Jason Short
+1  A: 

You will have to split up the commands. The easiest way is to add a GO every 10 lines or so.

Basically, SSMS is trying to load all your text into a single SqlCommand.CommandText and execute it. That won't work.

You need to get it to batch them. GO is an easy split point in SSMS where it will take up to that point and execute it, then continue.

  -- statements 1 through 10
  INSERT INTO MyTable VALUES (...)
  INSERT INTO MyTable VALUES (...)
  ...
  GO

  -- statements 11 through 20
  INSERT INTO MyTable VALUES (...)
  INSERT INTO MyTable VALUES (...)

That will be run as two separate SqlCommand batches against the database.
If you need them all run in a single transaction, you will probably have to write a command-line app that loads each line and executes it within a transaction. I don't think you can split transactions across executions within SSMS.

You could also build an SSIS package, but that is a LOT of work, and I don't recommend it unless you need to repeat this process every so often.

Jason Short
A: 

Make sure you use SET NOCOUNT ON;

Instead of:

  INSERT ... VALUES (...)
  INSERT ... VALUES (...)
  INSERT ... VALUES (...)

You could alternatively try building your script using the following, which works as a single implicit transaction (avoiding the need to run COMMIT between each insert):

  INSERT ... SELECT ...
  UNION ALL  SELECT ...
  UNION ALL  SELECT ...
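
For example (a minimal sketch; dbo.MyTable and its columns are hypothetical), the combined form might look like:

  SET NOCOUNT ON;
  INSERT INTO dbo.MyTable (Id, Name)
  SELECT 1, 'alpha'
  UNION ALL SELECT 2, 'beta'
  UNION ALL SELECT 3, 'gamma';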

However, if the overall script is still too big for SSMS to handle (I imagine you are inserting some lengthy strings), then your only options are, as the others have suggested, to split up your script or use a different tool.

Aaron Bertrand
+1  A: 

The fix turned out to be running the insert statements in batches. I had to do a little experimenting to identify the maximum number of inserts I could run in a single transaction.

So, the fix that worked for me was to split the insert statements into smaller groups (5,000 each) and wrap each group in a transaction with a commit. I repeated the same for all the groups, and it ran without the 'out of memory' exception.
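
In outline, the pattern described above might look like this (table name and statements hypothetical):

  BEGIN TRANSACTION;
  -- approx. 5,000 INSERT statements
  INSERT INTO MyTable VALUES (...);
  ...
  COMMIT;
  GO

  BEGIN TRANSACTION;
  -- the next approx. 5,000 INSERT statements
  INSERT INTO MyTable VALUES (...);
  ...
  COMMIT;
  GO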

But on the downside, it took 2 hours to insert the 50,000 rows, which took me by surprise. Was there a way I could have done it in less time?

I appreciate all your comments, specifically paxdiablo and Jason Short for pointing to the solution, and everyone else for reasoning out the cause.

pencilslate
A: 

Try using sqlcmd

http://msdn.microsoft.com/en-us/library/ms170572.aspx
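
For example, a sketch of the invocation (server, database, and file names are hypothetical):

  sqlcmd -S myserver -d MyDatabase -i inserts.sql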

Mike