I have a big SQL file that does not fit into memory and that needs to be executed against Microsoft SQL Server 2008. It seems that the sqlcmd.exe tool always loads it into memory first, which is impossible in this case. Any ideas?

Unfortunately, I cannot split the script because it is generated by Red Gate's excellent SQL Data Compare. The entire script is one big transaction and I want to leave it that way. I had never thought that having a gigantic script is unusual, because having a lot of data is common in the database world. The script is 3 GB in size.

+1  A: 

What/who created the SQL script? Get whatever created the file to split the script up into logical chunks, by either transaction or statement (depending on how the file is structured). If the source can't do this, then whip up a script to split the file up logically.
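
If the one-big-transaction requirement can be relaxed, a minimal sketch of such a splitter (Python, assuming the generated script uses GO batch separators; file and directory names are placeholders) could look like this:

    import os

    def split_sql_file(path, out_dir, batches_per_chunk=500):
        """Stream a huge .sql file and rewrite it as smaller chunk files,
        cutting only on lines that contain nothing but the GO separator."""
        os.makedirs(out_dir, exist_ok=True)
        chunk_no, batch_count = 0, 0
        out = open(os.path.join(out_dir, "chunk_%04d.sql" % chunk_no), "w", encoding="utf-8")
        with open(path, "r", encoding="utf-8", errors="replace") as src:
            for line in src:                      # reads line by line, never the whole file
                out.write(line)
                if line.strip().upper() == "GO":  # end of one batch
                    batch_count += 1
                    if batch_count >= batches_per_chunk:
                        out.close()
                        chunk_no, batch_count = chunk_no + 1, 0
                        out = open(os.path.join(out_dir, "chunk_%04d.sql" % chunk_no), "w", encoding="utf-8")
        out.close()

    split_sql_file("sync_script.sql", "chunks")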

Welbog
+1  A: 

If it is that big, the script is either too complex or is repetitive. In either case, as others have suggested, the only sensible thing is to break it down into manageable chunks.

Is this a one-off exercise or a regular event?

CJM
+1  A: 

I've had this problem before where the script had an enormous XML string that was being used with OpenXML. The actual SQL was rather minimal, updating some values in a table.

I ended up inserting the data (in chunks) into a temporary table until all the info that was in the XML was stored. Then I ran my update statement.

Added later after more data got posted:

You may want to select large chunks in the tool and have SQL Data Compare generate the scripts in chunks. That way you still get transactions, one per chunk. You can select large sections by simply highlighting a range and hitting the space bar.

wcm
+2  A: 

RedGate's SQL Compare has an option to execute the statements directly, instead of generating a SQL script and executing it later. Is there a reason this wouldn't work - in other words, is there a reason you require a SQL script and can't use the application's "synchronize now" functionality?

rwmnau
That's a good point (Redgate makes some awesome products), but he may not have update access in his environment. That's the position I'm in: I need to generate scripts and pass them on to DBAs to run.
wcm
Touché. In that case, is it necessary to update every single table in a single transaction? Can you instead break the operation into 3-4 different scripts, meaning run the Red Gate tool on only 1/4 of your tables each time (picking tables so you balance the size of each script)? Though this isn't a pure transaction for your entire update, each piece is a transaction, so your data won't be left in a "damaged" state if a script fails, and you could even request that the DBAs run them simultaneously during a maintenance window, so users won't see partially updated data.
rwmnau
You could make the argument that the DBA could run SQL Data Compare for him. The question for me is whether SQL Data Compare will run into the same issue. I mean, isn't Red Gate just running the script for you in the same way that Management Studio runs the script? I'm not saying don't try it; it's a good idea.
wcm
wcm is right: I cannot update the database from the machine SQL Data Compare runs on, and I doubt that will change very soon. Is it really that hard to stream a script instead of executing it all at once? By the way, the script is for one gigantic table, not for many.
We own a copy of Red Gate Data Compare, so I just did a compare and had the tool synchronize the databases while running SQL Profiler, and it looks like the tool actually opens the transaction, runs the statements in batches, and then commits at the end. Since it submits smaller batches at a time, it should run just fine on a DBA's workstation. Is this a viable option?
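
For what it's worth, the same pattern could be reproduced outside the tool. A rough Python sketch with pyodbc (the module, connection string and file name are assumptions, not anything used in this thread, and it assumes the generated script does not already contain its own BEGIN TRANSACTION/COMMIT): read the file batch by batch and execute everything inside one open transaction, so the 3 GB file never has to sit in memory at once.

    import pyodbc

    def run_script_streamed(path, conn_str):
        conn = pyodbc.connect(conn_str, autocommit=False)  # one transaction for the whole run
        cur = conn.cursor()
        batch = []

        def flush():
            sql = "".join(batch)
            if sql.strip():
                cur.execute(sql)
                while cur.nextset():   # drain results so errors in later statements surface
                    pass
            del batch[:]

        with open(path, "r", encoding="utf-8", errors="replace") as src:
            for line in src:
                if line.strip().upper() == "GO":  # GO is a client-side separator, never sent to the server
                    flush()
                else:
                    batch.append(line)
            flush()                               # the last batch may not end with GO
        conn.commit()                             # uncommitted work is rolled back if an execute raises and the connection closes
        conn.close()

    run_script_streamed("sync_script.sql",
                        "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes")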
rwmnau
Is there one batch per table being updated? If so, since he is only updating one table, this won't work for him.
wcm
A: 

1-800-redgate-support.....

or

  • break up transaction script into smaller files
  • set database in single user mode
  • full backup of database
  • run each smaller script file; if there is a failure: restore the backup, fix the script, try again (see the sketch below)
  • back out of single user mode, all done
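
A minimal sketch of that run-and-restore loop, assuming the chunk files were produced as in the splitter sketched earlier and that the full backup already exists at the path shown (server name, database name and paths are placeholders):

    import glob
    import subprocess
    import sys

    SERVER, DATABASE = "myserver", "mydb"
    BACKUP = r"D:\backups\mydb_before_sync.bak"    # full backup taken beforehand

    def sqlcmd(args):
        # -E = trusted connection, -b = exit with a non-zero code on any SQL error
        return subprocess.call(["sqlcmd", "-S", SERVER, "-E", "-b"] + args)

    for chunk in sorted(glob.glob("chunks/chunk_*.sql")):
        print("running", chunk)
        if sqlcmd(["-d", DATABASE, "-i", chunk]) != 0:
            print("chunk failed, restoring backup", file=sys.stderr)
            sqlcmd(["-d", "master",
                    "-Q", "RESTORE DATABASE [%s] FROM DISK = N'%s' WITH REPLACE" % (DATABASE, BACKUP)])
            sys.exit(1)
    print("all chunks applied")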
KM
+1  A: 

I ran into this problem a few months ago. I generate sync scripts with SQL Data Compare on a weekly and monthly basis for several of our catalog databases, and they are routinely larger than 500 MB. My solution was writing a VBScript that chops the update script into batches of 50 to 1000 commands. The problem with this approach is losing the ability to roll back all changes if something breaks halfway through your database update.

A: 

I just want to let everybody know that none of the solutions proposed here will work; even Red Gate does not have a way to resolve it (unless you write your own splitter, as user Matthew did).

Breaking the transactions up into smaller files is not an option within Red Gate SQL Data Compare: you can specify the size of the transactions, but you cannot split the script into several different files according to that transaction size.

In addition, as someone pointed out, once you divide the script into transactions of whatever size you desire, you lose the ability to roll back all the changes.

What it comes down to is this: whether you split the transactions or not, the script file gets very large. In my case, no text editor out there can open the 1.2 GB .sql file that my Red Gate SQL Data Compare generated. I have tried Vim, LTF, etc.; none of them manages to open it.

Continuing from my message above, I guess the question is: what are we supposed to use or do to open a large 1+ GB script file? I'm counting on sqlcmd being able to run it anyway.
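
One low-tech option is not to "open" it in an editor at all but to stream it. A tiny Python sketch (the file name is a placeholder) that prints the first and last few lines and counts the GO batch separators without ever holding the file in memory:

    from collections import deque

    PATH = "sync_script.sql"
    head, tail, batches = [], deque(maxlen=10), 0

    with open(PATH, "r", encoding="utf-8", errors="replace") as f:
        for i, line in enumerate(f):          # streams the file, never loads it whole
            if i < 10:
                head.append(line)
            tail.append(line)
            if line.strip().upper() == "GO":
                batches += 1

    print("".join(head))
    print("...")
    print("".join(tail))
    print("GO batches:", batches)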