I am considering using SQLite in a desktop application to persist my model. I plan to load all data into model classes when the user opens a project and write it back when the user saves. I will write all of the data, not just the delta that changed (since it is hard for me to tell what changed).

The data may contain thousands of rows that I will need to insert. I am afraid that inserting many rows consecutively will be slow (and a preliminary test confirms it).

Are there any optimization best practices / tricks for such a scenario?

EDIT: I am using System.Data.SQLite for .NET.

A: 

Like Nick D said: if you are going to be doing lots of inserts or updates at once, put them in a transaction. Without an explicit transaction, SQLite wraps every single INSERT in its own implicit transaction and syncs to disk after each one, which is exactly what makes consecutive inserts so slow. You'll find the results to be worlds apart. I would suggest re-running your preliminary test within a transaction and comparing the results.
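
For reference, here is a minimal sketch of what that might look like with System.Data.SQLite. The Item model class, the Items table, and its columns are invented for the example; the point is the single explicit transaction wrapped around one reused, parameterized command:

    using System.Collections.Generic;
    using System.Data;
    using System.Data.SQLite;

    // Hypothetical model class, standing in for your real model.
    class Item
    {
        public int Id;
        public string Name;
    }

    static class ProjectSaver
    {
        // Write all rows inside a single explicit transaction instead of
        // letting SQLite open one implicit transaction (and do one disk
        // sync) per INSERT.
        public static void SaveItems(string dbPath, IEnumerable<Item> items)
        {
            using (var connection = new SQLiteConnection("Data Source=" + dbPath))
            {
                connection.Open();

                using (var transaction = connection.BeginTransaction())
                using (var command = new SQLiteCommand(
                    "INSERT INTO Items (Id, Name) VALUES (@id, @name)",
                    connection, transaction))
                {
                    var idParam = command.Parameters.Add("@id", DbType.Int32);
                    var nameParam = command.Parameters.Add("@name", DbType.String);

                    foreach (var item in items)
                    {
                        idParam.Value = item.Id;
                        nameParam.Value = item.Name;
                        command.ExecuteNonQuery(); // statement is parsed once, reused per row
                    }

                    transaction.Commit(); // single sync to disk happens here
                }
            }
        }
    }

Reusing one parameterized command inside the transaction also spares SQLite from re-parsing the SQL for every row, which compounds the speedup.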

My Other Me