I'm trying to bulk load a lot of data (5.5 million rows) into an SQLite database file. Loading via individual INSERTs seems to be far too slow, so I'm trying to use the sqlite3 command-line tool and its .import command.

It works perfectly if I enter the commands by hand, but I can't for the life of me work out how to automate it from a script (a .bat file or Python script; I'm working on a Windows machine).

The commands I issue at the command line are these:

> sqlite3 database.db
sqlite> CREATE TABLE log_entry ( <snip> );
sqlite> .separator "\t"
sqlite> .import logfile.log log_entry

But nothing I try will get this to work from a bat file or python script.

I've been trying things like:

sqlite3 "database.db" .separator "\t" .import logfile.log log_entry

echo '.separator "\t" .import logfile.log log_entry' | sqlite3 database.db

Surely I can do this somehow?

+6  A: 

Create a text file with the lines you want to enter into the sqlite3 command-line program, like this:

CREATE TABLE log_entry ( <snip> );
.separator "\t"
.import logfile.log log_entry

and then just call:

sqlite3 database.db < commands.txt

(A Python version of the same call is sketched below.)

Joey
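
Since the question mentions automating this from a Python script, here is a minimal sketch of the same stdin-redirect approach driven from Python. It assumes the command file above is saved as commands.txt, that sqlite3.exe is on the PATH, and Python 3.5+ for subprocess.run:

import subprocess

# Feed the command file to the sqlite3 shell on stdin,
# equivalent to: sqlite3 database.db < commands.txt
with open("commands.txt", "rb") as script:
    subprocess.run(["sqlite3", "database.db"], stdin=script, check=True)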
+5  A: 

Create a separate text file containing all the commands you would normally type into the sqlite3 shell app:

CREATE TABLE log_entry ( <snip> );
.separator "\t"
.import /path/to/logfile.log log_entry

Save it as, say, impscript.sql.

Create a batch file which calls the sqlite3 shell with that script:

sqlite3.exe yourdatabase.db < /path/to/impscript.sql

Call the batch file.

On a side note: when importing, make sure to wrap the INSERTs in a single transaction! That alone can give you a speedup on the order of 10,000% (roughly 100x); see the sketch after this answer.

Mihai Limbășan
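
As an aside on the transaction tip: if you do the load from Python instead of the sqlite3 shell, the standard-library sqlite3 module gives the same effect by running executemany() inside a single transaction. This is only a sketch; the column names (col1, col2) and the two-column, tab-separated layout of logfile.log are assumptions for illustration, not taken from the question:

import sqlite3

def rows(path):
    # Yield one tuple per tab-separated line of the log file.
    with open(path, encoding="utf-8") as f:
        for line in f:
            yield tuple(line.rstrip("\n").split("\t"))

con = sqlite3.connect("database.db")
con.execute("CREATE TABLE IF NOT EXISTS log_entry (col1 TEXT, col2 TEXT)")  # hypothetical columns

# The connection context manager commits once on exit, so all
# 5.5 million INSERTs happen inside a single transaction.
with con:
    con.executemany("INSERT INTO log_entry VALUES (?, ?)", rows("logfile.log"))
con.close()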