views:

64

answers:

2

I tried playing around with .import, but it seems limited to CSV and delimited files. Is it possible to import a gzip file? Or at least pipe from the command line?

Also, can I skip some unwanted columns, like MySQL's "LOAD DATA INFILE" does?

+1  A: 

You can create a named pipe. It will act like a normal file, while another process decompresses the data into it on the fly; SQLite will know nothing about it.

It turns out the example on Wikipedia uses gzip: http://en.wikipedia.org/wiki/Named_pipe
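
For illustration, a minimal sketch of that approach (the pipe path, database path, and table name mytable are assumptions; .separator is set because .import splits on "|" by default):

mkfifo /tmp/records.fifo                      # create the named pipe (FIFO)
zcat my_records.csv.gz >/tmp/records.fifo &   # decompress into it in the background
sqlite3 /path/to/database.sqlite3 <<'EOF'
.separator ,
.import /tmp/records.fifo mytable
EOF
rm /tmp/records.fifo                          # remove the pipe when done

SQLite reads the pipe like an ordinary file, so the uncompressed data never touches the disk.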

Louis-Philippe Huberdeau
For some reason, I've always had problems with named pipes; if there's going to be a complicated file operation with a lot of seeking back and forth through it, it's prone to messing up.
amphetamachine
A: 

You could write a parser for the data that converts it into a series of SQL statements. Perl is a good language for that, and it can even handle gzipped files.
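
A rough sketch of that idea as a pipeline (the file, the table name mytable, and the dropped column are assumptions; the naive comma split would need Text::CSV for CSV with quoted commas):

zcat my_records.csv.gz |
perl -ne 'chomp;
          my @f = split /,/, $_, -1;   # naive split on commas
          splice @f, 2, 1;             # e.g. drop an unwanted 3rd column
          my $q = chr(39);             # a single-quote character
          s/$q/$q$q/g for @f;          # double embedded quotes for SQL
          print "INSERT INTO mytable VALUES (",
                join(",", map { $q.$_.$q } @f), ");\n";' |
sqlite3 /path/to/database.sqlite3

This also covers the column-skipping part of the question, since the script controls exactly which fields end up in each INSERT.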

Are you running this on a *nix OS? If so, you could create a temporary file to hold the decompressed data (mytable below stands for your destination table, and .separator is set because .import splits on "|" by default):

tf="$(mktemp)" &&                    # temporary file for the decompressed data
zcat <my_records.csv.gz >"$tf" &&
sqlite3 /path/to/database.sqlite3 <<EOF
.separator ,
.import $tf mytable
EOF
rm -f "$tf"                          # clean up the temporary file
amphetamachine
Well, I use Ubuntu, but I'd prefer "on the fly" to creating a temporary file, since the data I'm working with is pretty huge when uncompressed.
Tg