views: 1852
answers: 4

I am working with a larger than average SQLite database (for use on both Windows and Linux) and am looking to maximize the performance I get out of it. The database will be installed on commodity hardware along with an SQLite GUI. The users I am delivering this to are SQL savvy but are unlikely to undertake their own optimizations (creation of indexes, setting of pragmas, etc.), so I am keen to get as much out-of-the-box performance as possible (to ensure maximum usage of the data).

One issue is that Windows seems to throttle the execution of queries much more than Linux; another is that I am less familiar with SQLite's approach to indexing (compared to other databases such as Postgres).
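For context, the kind of thing I have in mind is baking the tuning into whatever code opens the database, rather than expecting users to set pragmas themselves. A minimal sketch using Python's built-in sqlite3 module (the file name and pragma values are placeholders, not measured recommendations for this workload):

    import sqlite3

    def open_tuned(path="catalog.db"):               # "catalog.db" is a made-up example name
        conn = sqlite3.connect(path)
        conn.execute("PRAGMA cache_size = 10000")    # larger page cache (value is in pages)
        conn.execute("PRAGMA temp_store = MEMORY")   # keep temp tables/indexes in RAM
        conn.execute("PRAGMA synchronous = NORMAL")  # fewer fsyncs; slight durability trade-off
        return conn

    conn = open_tuned()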

+4  A: 

Have you read the SQLite Optimization FAQ (a few years old, but still seems useful)?

I don't think 1 GB is particularly large, even for SQLite. It can certainly handle much larger databases stably.

Dan
I haven't used SQLite, but this seems like a good document to get started with.
Jeff
+1  A: 

You should have a look at "scheduled tasks" and a script that optimizes the database every night, perhaps even re-creating your indexes. Doing so on a regular basis can save you a ton of time and manual labour.
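A rough sketch of what such a script might look like, assuming Python's built-in sqlite3 module and a placeholder database path; it could be wired up to cron on Linux or Task Scheduler on Windows:

    import sqlite3

    DB_PATH = "catalog.db"   # placeholder path

    def nightly_maintenance(path=DB_PATH):
        # isolation_level=None -> autocommit, which VACUUM requires
        conn = sqlite3.connect(path, isolation_level=None)
        try:
            conn.execute("ANALYZE")   # refresh the statistics the query planner uses
            conn.execute("REINDEX")   # rebuild all indexes from scratch
            conn.execute("VACUUM")    # defragment the file and reclaim free pages
        finally:
            conn.close()

    if __name__ == "__main__":
        nightly_maintenance()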

Mojah
+2  A: 

An update - we've had the most success with more intelligent ordering of data at import time, aggressive indexing (more than doubling the size of the db file in the process), tuning settings in Windows to get XP to behave more like a server than a desktop, changing the order of joins (you can't rely solely on the optimizer), and measuring progress with a performance test suite (just a harness for running queries and taking measurements).
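For what it's worth, a bare-bones harness of the kind described above could be as simple as the sketch below (Python's built-in sqlite3 module; the query and database path are invented examples). Run each query a few times, keep the numbers, and regressions show up quickly:

    import sqlite3
    import time

    QUERIES = {
        # invented example query; replace with the real workload
        "orders_by_customer": "SELECT * FROM orders WHERE customer_id = 42",
    }

    def benchmark(path="catalog.db", runs=3):
        conn = sqlite3.connect(path)
        for name, sql in QUERIES.items():
            timings = []
            for _ in range(runs):
                start = time.perf_counter()
                conn.execute(sql).fetchall()   # fetchall() forces the full result set
                timings.append(time.perf_counter() - start)
            print(f"{name}: best {min(timings):.3f}s over {runs} runs")
        conn.close()

    benchmark()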

The experience has been fun and the end users are happy and are able to work with the data.

wioota
+2  A: 

If you are doing large imports of data, I found the most efficient way was to:

  1. Order your inserted data, preferably in primary key sequence
  2. Use prepared statements (doh)
  3. Drop any indexes
  4. Insert the data in bulk, wrapped in transactions (say 10,000 records in a hit)
  5. Add your indexes back

Also remember that SQLite's query optimizer has (at least historically) been unable to use indexes for OR terms in the WHERE clause. You may be able to tweak things by applying De Morgan's laws to rewrite the WHERE clause so it uses ANDs instead.
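A rough sketch of steps 2-5 using Python's built-in sqlite3 module; the table, columns and index name are invented for illustration, and rows is assumed to be an iterable of tuples already sorted by primary key (step 1):

    import sqlite3

    def bulk_load(path, rows, batch_size=10000):
        conn = sqlite3.connect(path)
        cur = conn.cursor()
        cur.execute("DROP INDEX IF EXISTS idx_items_name")   # step 3: drop secondary indexes
        insert = "INSERT INTO items (id, name) VALUES (?, ?)"
        batch = []
        for row in rows:
            batch.append(row)
            if len(batch) == batch_size:
                with conn:                           # step 4: one transaction per batch
                    cur.executemany(insert, batch)   # step 2: executemany reuses the prepared statement
                batch = []
        if batch:                                    # flush the final partial batch
            with conn:
                cur.executemany(insert, batch)
        cur.execute("CREATE INDEX idx_items_name ON items (name)")   # step 5: add indexes back
        conn.commit()
        conn.close()

Wrapping each batch in a single transaction is what makes the difference: without it, SQLite commits (and syncs to disk) after every individual insert.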

Hey man, do you have a reference or quick sample for the 4th point (insert the data in bulk with transactions)?
Jhonny D. Cano -Leftware-