I have a database table on a development server that is now fully populated after I set it running with an import routine for a CSV file containing 1.4 million rows.

I ran the Database Publishing Wizard on the table, and now I have a 286MB SQL script on my local machine. The problem is, I can't figure out how to run it. If I load it into SQL Server Management Studio Express I get an alert window that says "The operation could not be completed".

Any ideas on how I can get this SQL script to run?

+9  A: 

Use the sqlcmd tool to execute the file:

sqlcmd -S myServer\instanceName -i C:\myScript.sql
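
If the server needs SQL authentication, or you want the output captured to a log file, a variant along these lines should work (the username, password, and paths here are placeholders to adapt):

sqlcmd -S myServer\instanceName -U myUser -P myPassword -i C:\myScript.sql -o C:\myScript.log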
Gulzar
Using this method I get the error "Sqlcmd: Error: Scripting error." Is this to do with the size of the file?
tags2k
First, check the connectivity; maybe you have to pass a username and password (the -U and -P options). Can you also try with a smaller file? I have run huge scripts without any issues.
Gulzar
If I take out all but 15 INSERT statements, it works. If I put the other 1,410,472 statements back in it fails with that error message.
tags2k
Are all these statements inside a transaction? Maybe it is timing out; try setting the timeout to 0 (infinite) before you run this.
Gulzar
I am also a bit curious to know what text editor you are using to edit that file :)
Gulzar
I'm not at work just now, so I will let you know the result tomorrow. Thanks for your help so far. By the way, I'm just using TextPad and a bit of patience :)
tags2k
There is some activity going on here discussing large-file text editors: http://stackoverflow.com/questions/222853/notepad-replacement-for-viewing-very-large-text-files
Gulzar
No problem. Let me know the outcome; it looks like you might have to go with Michael Haren's suggestion of splitting up the file.
Gulzar
Yeah, it made no difference. I'll go with Michael's file splitting solution, but thanks for your help anyway!
tags2k
I managed to insert 400MB with sqlcmd. Thanks, great tip!
Andersson
+1  A: 

Running something that large inside a single transaction is not a good idea. Therefore, I'd recommend breaking up the file into smaller, more manageable chunks.
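
Once the file is split, running the pieces is straightforward from a cmd prompt. This is just a sketch assuming the chunks have been saved under a hypothetical C:\chunks folder (use %%f instead of %f inside a batch file):

for %f in (C:\chunks\*.sql) do sqlcmd -S myServer\instanceName -i "%f"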

Another option is to look at some of the other ways to import CSV data directly.
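
For example, SQL Server's BULK INSERT can load the CSV in one statement once the data is ready. A rough sketch, where the table name, file path, and FIRSTROW setting are assumptions to adapt to your data:

BULK INSERT dbo.MyTable
FROM 'C:\data\import.csv'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', FIRSTROW = 2);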

Michael Haren
A: 

Why not just use DTS to import the CSV file directly?

Aheho
A bunch of processing needs to be done on the data first, which I don't want running on the live server. Therefore, I need to be able to process the data into a simple SQL script to minimize the work required by that box.
tags2k
A: 

I second Gulzar's comment: I just imported a 400MB .sql script using sqlcmd without a hitch. It worked brilliantly when nothing else would.

The above usage, along with the Microsoft Database Publishing Wizard, works a treat when you're desperately trying to move something with only script and command prompt access.

Doug
A: 

Thanks Gulzar, I just used your method to restore a 600MB SQL DB file.

Alex Ferdinansyah
This answer should be a comment.
jcollum
A: 

Thanks, I loaded the 2.2 GB script.

Dhillon