I came across the .import command for doing this (bulk insert), but is there a query version of it that I can execute using sqlite3_exec()?

I would just like to copy a small text file contents into a table.

That is, a query version of the command below:

".import demotab.txt mytable"

Answer:

SQLite's performance doesn't benefit from a bulk-insert statement. Simply performing the INSERTs individually (but within a single transaction!) provides very good performance.

You might benefit from increasing SQLite's page cache size; that depends on the number of indexes and/or the order in which the data is inserted. If you don't have any indexes, the cache size is unlikely to matter much for a pure insert load.
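For reference, the page cache can be adjusted with a PRAGMA before the load starts; the sketch below is illustrative only, and the 64 MB figure is an arbitrary example rather than a recommendation:

    /* Illustrative only: a negative cache_size value is interpreted in KiB,
       so this requests roughly a 64 MB page cache for this connection. */
    sqlite3_exec(db, "PRAGMA cache_size = -65536;", NULL, NULL, NULL);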

Be sure to use a prepared statement rather than re-parsing the query in the innermost loop. It's extremely important to wrap the statements in a transaction, since this avoids the need for the filesystem to sync the database to disk after every insert: after all, a partially written transaction is rolled back atomically anyway, so all fsync()'s are deferred until the transaction commits.
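To make that concrete, here is a minimal sketch of the pattern using the C API; the file name, table name, and two-column tab-separated layout are assumptions for illustration, not part of the original question:

    /* Sketch: bulk-load a tab-separated text file into a table using one
       prepared statement inside one transaction. Table/column names are
       placeholders. */
    #include <stdio.h>
    #include <string.h>
    #include <sqlite3.h>

    int bulk_load(sqlite3 *db, const char *path)
    {
        FILE *f = fopen(path, "r");
        if (!f) return SQLITE_ERROR;

        sqlite3_stmt *stmt = NULL;
        int rc = sqlite3_prepare_v2(db,
            "INSERT INTO mytable(col1, col2) VALUES (?1, ?2)", -1, &stmt, NULL);
        if (rc != SQLITE_OK) { fclose(f); return rc; }

        /* One transaction around all inserts: the fsync happens once at COMMIT. */
        sqlite3_exec(db, "BEGIN", NULL, NULL, NULL);

        char line[1024];
        while (fgets(line, sizeof line, f)) {
            line[strcspn(line, "\n")] = '\0';
            char *tab = strchr(line, '\t');      /* split "col1<TAB>col2" */
            if (!tab) continue;
            *tab = '\0';

            sqlite3_bind_text(stmt, 1, line,    -1, SQLITE_TRANSIENT);
            sqlite3_bind_text(stmt, 2, tab + 1, -1, SQLITE_TRANSIENT);
            sqlite3_step(stmt);                  /* SQLITE_DONE on success */
            sqlite3_reset(stmt);                 /* reuse the same query plan */
            sqlite3_clear_bindings(stmt);
        }

        sqlite3_exec(db, "COMMIT", NULL, NULL, NULL);
        sqlite3_finalize(stmt);
        fclose(f);
        return SQLITE_OK;
    }

Calling sqlite3_reset() instead of preparing a fresh statement each time is what keeps query compilation out of the inner loop; sqlite3_exec() is only used for the BEGIN/COMMIT bookkeeping.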

Finally, indexes will limit your insert performance, since maintaining them is somewhat expensive. If you're really dealing with a lot of data and you start with an empty table, it may be beneficial to create the indexes after inserting the data - though this isn't a huge factor.
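Along those lines, a small sketch of deferring index creation until after the load; the index and column names are made up for illustration:

    /* Sketch: build the index in one pass after the bulk load has committed.
       "idx_mytable_col1" and "col1" are placeholder names. */
    sqlite3_exec(db, "CREATE INDEX IF NOT EXISTS idx_mytable_col1 ON mytable(col1);",
                 NULL, NULL, NULL);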

Oh, and you might want to get one of those Intel X25-E SSDs and ensure you have an AHCI controller ;-).

I'm maintaining an app with SQLite databases containing about 500,000,000 rows (spread over several tables), much of which was bulk-inserted using plain old begin-insert-commit: it works fine.

Eamon Nerbonne
Thanks for the detailed explanation, Eamon. You rock!