I need to parse a fairly large XML file (varying between about a hundred kilobytes and several hundred kilobytes), which I'm doing using Xml#parse(String, ContentHandler). I'm currently testing this with a 152KB file.

During parsing, I also insert the data in an SQLite database using calls similar to the following: getWritableDatabase().insert(TABLE_NAME, "_id", values). All of this together takes about 80 seconds for the 152KB test file (which comes down to inserting roughly 200 rows).
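Roughly, the parsing code is shaped like this (a simplified sketch: "entry", "title" and the "entries" table are placeholders, not my actual schema, and I pass the database in rather than calling getWritableDatabase() each time):

import android.content.ContentValues;
import android.database.sqlite.SQLiteDatabase;
import android.util.Xml;
import org.xml.sax.Attributes;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.DefaultHandler;

void parseAndInsert(String xml, final SQLiteDatabase db) throws SAXException {
    Xml.parse(xml, new DefaultHandler() {
        private final StringBuilder text = new StringBuilder();

        @Override
        public void startElement(String uri, String localName, String qName, Attributes attrs) {
            text.setLength(0);  // start collecting this element's text content
        }

        @Override
        public void characters(char[] ch, int start, int length) {
            text.append(ch, start, length);
        }

        @Override
        public void endElement(String uri, String localName, String qName) {
            if ("entry".equals(localName)) {
                ContentValues values = new ContentValues();
                values.put("title", text.toString());
                // one insert per parsed row, roughly 200 times per file
                db.insert("entries", "_id", values);
            }
        }
    });
}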

When I comment out all insert statements (but leave in everything else, such as creating ContentValues etc.) the same file takes only 23 seconds.

Is it normal for the database operations to have such a big overhead? Can I do anything about that?

+5  A: 

You should batch the inserts inside a single transaction.

Using the SQLiteDatabase transaction API:

db.beginTransaction();
try {
    for (ContentValues entry : listOfEntries) {
        db.insert(TABLE_NAME, "_id", entry);
    }
    db.setTransactionSuccessful();  // without this, endTransaction() rolls back
} finally {
    db.endTransaction();            // commits if marked successful
}

This sped up inserts in my apps enormously: without an explicit transaction, SQLite wraps every single insert in its own implicit transaction and commits it to disk, which is what makes row-by-row inserts so slow.

WarrenFaith
Inserting all ContentValues in this way only takes a second or two. Thanks a lot!
benvd
I forgot how to do this, rechecked the answer, and spent a while figuring it out again... So, for posterity: db.beginTransaction(); db.insertStuff(); db.setTransactionSuccessful(); db.endTransaction();
benvd
A: 

If the table has an index on it, consider dropping the index before inserting the records and re-creating it after you've committed them.
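For example, assuming a (hypothetical) index named idx_entries_title on the entries table:

db.execSQL("DROP INDEX IF EXISTS idx_entries_title");            // hypothetical index name
// ... run the batched inserts inside a transaction here ...
db.execSQL("CREATE INDEX idx_entries_title ON entries (title)");  // rebuild it once at the end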

kaD'argo