views: 183
answers: 3

I have about 50 T-SQL files; some are 30MB, but some are as large as 700MB. I thought about executing them manually, but if a file is bigger than about 10MB, SQL Server Management Studio throws an out-of-memory exception.

Any ideas?

+2  A: 

You can try the sqlcmd command-line tool, which may have different memory limits: http://msdn.microsoft.com/en-us/library/ms162773.aspx

Usage example:

sqlcmd -U userName -P myPassword -S MYPCNAME\SQLEXPRESS -i myCommands.tsql
Eric Tuttleman
+3  A: 

If you have that much data, wouldn't it be a lot easier and smarter to have it in, e.g., a CSV file and then bulk import it into SQL Server?

Check out the BULK INSERT command; it allows you to load large data volumes into SQL Server quickly and efficiently, and it handles this much better than a huge SQL file!

The command looks something like:

BULK INSERT dbo.YourTableName
   FROM 'yourfilename.csv'
   WITH ( FIELDTERMINATOR =';',
          ROWTERMINATOR =' |\n' )

or whatever format you might have to import.

marc_s
Of course it's more efficient and a better strategy, but sometimes you're given a file of SQL statements. If you're passing data around between DBMSs, it's possibly the most portable format. So you can either show some impressive regex-fu and turn it into a CSV file, or deal with what you already have.
Eric Tuttleman
+2  A: 

Maybe this is too obvious, but...did you consider writing a program to loop through the files and call SqlCommand.ExecuteNonQuery() for each batch? It's almost trivial (see the sketch after the list below).

Less obvious advantages:

  • You can monitor the progress of the feed (which is going to take some time)
  • You can throttle it (in case you don't want to swamp the server)
  • You can add a little error handling in case there are problems in the input files
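For what it's worth, here is a minimal sketch of that loop, assuming ADO.NET (System.Data.SqlClient) and scripts that use the standard GO batch separator; the connection string and folder path are placeholders, not anything from the question. Streaming each file line by line keeps the 700MB scripts out of memory:

// A minimal sketch, assuming ADO.NET and scripts split on the standard "GO"
// separator. Connection string and script folder are hypothetical placeholders.
using System;
using System.Data.SqlClient;
using System.IO;
using System.Text;

class ScriptRunner
{
    static void Main()
    {
        string connectionString = @"Server=MYPCNAME\SQLEXPRESS;Database=TargetDb;Integrated Security=true;";
        string scriptFolder = @"C:\scripts";   // hypothetical folder holding the ~50 .sql files

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            foreach (string file in Directory.GetFiles(scriptFolder, "*.sql"))
            {
                Console.WriteLine("Running " + file);
                var batch = new StringBuilder();

                // Stream the script line by line so a 700MB file is never
                // loaded into memory all at once.
                using (var reader = new StreamReader(file))
                {
                    string line;
                    while ((line = reader.ReadLine()) != null)
                    {
                        // GO is a client-side batch separator, not T-SQL, so it
                        // must be handled here rather than sent to the server.
                        if (line.Trim().Equals("GO", StringComparison.OrdinalIgnoreCase))
                        {
                            Execute(connection, batch.ToString());
                            batch.Length = 0;
                        }
                        else
                        {
                            batch.AppendLine(line);
                        }
                    }
                }

                // Run whatever follows the last GO.
                Execute(connection, batch.ToString());
            }
        }
    }

    static void Execute(SqlConnection connection, string sql)
    {
        if (sql.Trim().Length == 0)
            return;

        using (var command = new SqlCommand(sql, connection))
        {
            command.CommandTimeout = 0;   // no timeout; these batches can run for a long time
            command.ExecuteNonQuery();    // wrap in try/catch to add per-batch error handling
        }
    }
}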
egrunin
+1, this may take a long time, but unless the OP can edit those big files so that bulk load will work on them, this is most likely the only solution.
KM