I'm deploying a client application to several thousand Windows machines. Each machine will use a SQLite database as a local datastore, ultimately writing data to a remote server. So the SQLite db won't really grow beyond a few MB over time, since data will be added and then later deleted. SQLite is supposed to be zero-administration, but are there any tasks I need to run occasionally, such as ANALYZE / update statistics / check for consistency, etc.? Or can I assume that once the SQLite db is there, it can be used for a couple of years with no worries and no corruption?

If there are such tasks, I'll build processes into my app to run them occasionally, but at present I'm not sure if that's necessary/recommended.

+3  A: 

SQLite has a VACUUM command, which will reclaim the space left by deleted entries and reduce fragmentation & file size. The ratio of deletions to insertions really dictates how frequently this should be done. It's generally quick and not always necessary.
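A minimal sketch of running VACUUM (plus an occasional integrity check) from Python's built-in `sqlite3` module; the database filename and table are hypothetical placeholders:

```python
import sqlite3

# Open (or create) the local datastore; "app.db" is a placeholder name.
conn = sqlite3.connect("app.db")

# Simulate some churn: insert a batch of rows, then delete them all.
conn.execute("CREATE TABLE IF NOT EXISTS queue (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany("INSERT INTO queue (payload) VALUES (?)",
                 [("row %d" % i,) for i in range(1000)])
conn.execute("DELETE FROM queue")
conn.commit()  # VACUUM must run outside an open transaction

# VACUUM rebuilds the database file, compacting the free pages
# left behind by the deleted rows.
conn.execute("VACUUM")

# An occasional consistency check; returns the single row ('ok',)
# when the file is healthy.
result = conn.execute("PRAGMA integrity_check").fetchone()
print(result)
conn.close()
```

Note that VACUUM cannot run inside a transaction, hence the `commit()` beforehand.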

The best thing to do is set up some scripts to simulate your expected activity in production. Take snapshots of the file size and graph them in your favorite spreadsheet. If after 10 years of simulated activity there isn't a problem, then your usage patterns don't need VACUUM.
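A rough sketch of such a simulation, assuming a hypothetical `sim.db` file and insert/delete pattern; it measures the file size before and after VACUUM so you can see how much dead space your workload actually accumulates:

```python
import os
import sqlite3

DB = "sim.db"  # hypothetical test database
if os.path.exists(DB):
    os.remove(DB)  # start each simulation from a fresh file

conn = sqlite3.connect(DB)
conn.execute("CREATE TABLE data (id INTEGER PRIMARY KEY, blob TEXT)")

# Simulated production activity: each round inserts a batch of rows,
# then deletes 90% of them, leaving free pages behind in the file.
for _ in range(10):
    conn.executemany("INSERT INTO data (blob) VALUES (?)",
                     [("x" * 512,) for _ in range(200)])
    conn.execute("DELETE FROM data WHERE id % 10 != 0")
    conn.commit()

size_before = os.path.getsize(DB)
conn.execute("VACUUM")
size_after = os.path.getsize(DB)
conn.close()

print("before VACUUM: %d bytes, after: %d bytes" % (size_before, size_after))
```

Log the before/after sizes per round over a long run; if the pre-VACUUM size stays flat at your workload's scale, you can likely skip VACUUM entirely.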

Pestilence