tags:

views: 161

answers: 6

I have a SQL script which is extremely large (about 700 megabytes). I am wondering if there is a good way to reduce the size of the script?

I know there are code minimizers for JavaScript and am looking for one to use with SQL scripts.

I am not looking to improve the performance of the SQL script. I am only trying to make the file smaller: removing excess whitespace, keeping name qualification to a minimum, and so on.
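(For illustration, the whitespace-stripping part can be sketched in a few lines of Python. This is not from the thread; it spares single-quoted string literals but ignores comments and bracketed identifiers, so treat it as a starting point only.)

```python
import re

def minify_sql(sql: str) -> str:
    """Collapse runs of whitespace outside single-quoted string literals.

    A naive sketch: it does not handle comments (-- or /* */)
    or bracketed [identifiers], so verify the output before use.
    """
    # Split on string literals so their contents are left untouched.
    parts = re.split(r"('(?:[^']|'')*')", sql)
    out = []
    for i, part in enumerate(parts):
        if i % 2:  # odd indices are the captured string literals
            out.append(part)
        else:      # collapse whitespace everywhere else
            out.append(re.sub(r"\s+", " ", part))
    return "".join(out).strip()

print(minify_sql("SELECT  col1 ,\n       col2\nFROM   t WHERE name = 'a  b'"))
# → SELECT col1 , col2 FROM t WHERE name = 'a  b'
```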

If I attempt to load the file in SQL Server Management Studio I get this error.

Not enough storage is available to process this command. (Exception from HRESULT: 0x80070008) (mscorlib)

+1  A: 

gzip should do.
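(As an aside, SQL scripts compress extremely well because they are so repetitive. A quick Python sketch, not part of the answer, demonstrates the idea:)

```python
import gzip

# Repetitive SQL text, like a script full of INSERT statements,
# shrinks dramatically under gzip.
script = ("INSERT INTO t (id, name) VALUES (1, 'example');\n" * 10_000).encode()

compressed = gzip.compress(script)
print(len(script), len(compressed))  # the compressed copy is far smaller

# Round-trip to confirm nothing is lost.
assert gzip.decompress(compressed) == script
```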

troelskn
He wants it to still be a script (at least that's how I interpret the question)
Lou Franco
why would he want that?
troelskn
+1  A: 

SQL is much harder to shrink: field names, table names, and commands need to stay what they are. You also wouldn't want to rewrite the statements as something shorter, because that could have implications for performance.

Depending on the DBMS that you use, it may allow short names for commands, and then there might be a converter.

Lou Franco
A: 

What's your goal here? Are you finding that SQL Server gives you memory errors when you try to run very large statements in one hit? Try putting in some GO statements; that always helps.

Nick Kavadias
+3  A: 

What about breaking your script into several small files, and calling those files from a single master script?

This link describes how to do it from a stored procedure.

Or you can do it from a batch file like this:

REM =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
REM Define widely-used variables up here so they can be changed in one place
REM Search for "sqlcmd.exe" and make sure this path is valid for you
REM =-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=-=
set sqlcmd="C:\Program Files\Microsoft SQL Server\100\Tools\Binn\sqlcmd.exe"
set uname="your_uname_here"
set pwd="your_pwd_here"
set database="your_db_name_here"
set server="your_server_name_here"

%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script1.sql"
%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script2.sql"
%sqlcmd% -S %server% -d %database% -U %uname% -P %pwd% -i "c:\script3.sql"

pause

I like the batch file approach myself, because it is easier to tinker with, and you can schedule it as a Windows job.

Make sure the .BAT file is in a folder with appropriate security restrictions, since it contains your credentials in plain text.
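(To produce the smaller files in the first place, one option — a sketch, not part of the answer — is to split the big script at its GO batch separators. This assumes GO appears alone on its own line, the usual SSMS convention, and streams the file so a 700 MB script never has to fit in memory:)

```python
def split_sql_script(path: str, out_prefix: str, batches_per_file: int = 500) -> int:
    """Split a large script into pieces at lines containing only GO.

    Returns the number of files written (out_prefix1.sql, out_prefix2.sql, ...).
    """
    file_no, batch_count, out = 0, 0, None
    with open(path, encoding="utf-8") as src:
        for line in src:
            if out is None:
                file_no += 1
                out = open(f"{out_prefix}{file_no}.sql", "w", encoding="utf-8")
            out.write(line)
            if line.strip().upper() == "GO":
                batch_count += 1
                if batch_count >= batches_per_file:
                    out.close()
                    out, batch_count = None, 0
    if out is not None:
        out.close()
    return file_no
```

Each piece can then be fed to sqlcmd exactly as in the batch file above.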

JosephStyons
Breaking the file up sounds like the right thing to do from a structural perspective; unless the script is full of INSERT statements to load tables, 700MB seems awfully large for a cohesive file.
grantwparks
A: 

Compressing the SQL file will give you the best reduction in size.

Minifying the SQL text will only save a few bytes or kilobytes per megabyte; it is not worth it.

A better approach is to create a function that unzips and reads the file. That gives the most benefit, I'd guess.

Today, file size shouldn't be a problem. Dial-up connection? Floppy disks?

Ismael
+3  A: 

What's in this 700 MB script?! I would hope that there are some similarities/repetitions that would allow you to shorten the file.

Just some guesses:

  • Instead of inserting a million records with individual INSERT statements, use a bulk loading tool
  • Instead of updating individual records one by one, batch updates to the same value into a single statement (e.g. UPDATE tab SET col = 1 WHERE id IN (...) instead of individual updates)
  • Long manipulations can be defined as stored procedures (created before running the script), so the script only has to call the stored proc
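(The second suggestion can be mechanized. Given many (id, value) updates, group them by value and emit one statement per group — a Python sketch with hypothetical table and column names, and integer ids assumed so no SQL escaping is done:)

```python
from collections import defaultdict

def batch_updates(updates, table="tab", col="col"):
    """Collapse individual (id, value) updates into one UPDATE per value.

    Sketch only: the table/column names are illustrative, and ids are
    assumed to be integers (values are not escaped for SQL).
    """
    by_value = defaultdict(list)
    for row_id, value in updates:
        by_value[value].append(row_id)
    return [
        f"UPDATE {table} SET {col} = {value} "
        f"WHERE id IN ({', '.join(str(i) for i in ids)});"
        for value, ids in sorted(by_value.items())
    ]

for stmt in batch_updates([(1, 1), (2, 1), (3, 2), (4, 1)]):
    print(stmt)
# → UPDATE tab SET col = 1 WHERE id IN (1, 2, 4);
# → UPDATE tab SET col = 2 WHERE id IN (3);
```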

Of course, splitting the script up into smaller portions and calling each one from a simple batch file would work too. But I'd be a little worried about performance (how long does the execution take?!) and would look for some faster ways.

IronGoofy