My application currently needs to upload a large amount of data to a database server (SQL Server) and also store it locally in a SQLite database (local cache).

I have always used transactions when inserting data into a database for speed. But now that I am working with something like 20k rows or more per insert batch, I am worried that transactions might cause issues. Basically, what I don't know is whether transactions have a limit on how much data you can insert under them.

What is the correct way to use transactions when inserting large numbers of rows into a database? Do you, for instance, begin/commit every 1000 rows?

+1  A: 

I don't see any problem doing this, but if there are any constraint or referential integrity errors you will probably have to insert them all again, and the table stays locked until the transaction is committed. Breaking the load into smaller batches, while logging the activity of each batch, will help.
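As a minimal sketch of that batching idea on the SQLite side (using Python's sqlite3 module; the cache_table(id, payload) schema, file name, and batch size of 1000 are assumptions for illustration):

```python
import logging
import sqlite3

logging.basicConfig(level=logging.INFO)

BATCH_SIZE = 1000  # hypothetical batch size; tune for your workload


def insert_in_batches(db_path, rows):
    """Insert rows in batches, committing and logging after each batch."""
    conn = sqlite3.connect(db_path)
    try:
        cur = conn.cursor()
        for start in range(0, len(rows), BATCH_SIZE):
            batch = rows[start:start + BATCH_SIZE]
            cur.executemany(
                "INSERT INTO cache_table (id, payload) VALUES (?, ?)",
                batch,
            )
            conn.commit()  # one transaction per batch; a failure only loses this batch
            logging.info("committed rows %d-%d", start, start + len(batch) - 1)
    finally:
        conn.close()


# Example usage with dummy data:
# insert_in_batches("local_cache.db", [(i, "data") for i in range(20000)])
```

If a batch fails, you only re-insert that batch rather than the whole 20k rows, and the log tells you where to resume.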

When dealing with that many rows, a better option would be to bulk insert them into the target with BCP, or even to use an SSIS package to do this.
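A hedged sketch of driving bcp from the application, assuming a pre-exported character-format data file; the server, database, table, and file names here are placeholders:

```python
import subprocess

# Hypothetical names: MyDb.dbo.TargetTable, data.dat, and the server instance.
# -T uses integrated authentication, -c loads character-format data,
# -b controls how many rows bcp commits per batch.
subprocess.run(
    [
        "bcp", "MyDb.dbo.TargetTable", "in", "data.dat",
        "-S", r".\SQLEXPRESS",
        "-T",
        "-c",
        "-b", "5000",
    ],
    check=True,
)
```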

Baaju
+1  A: 

No, there is no such limit. Contrary to what you might believe, SQLite writes pending transactions to the database file, not to RAM, so you should not run into any limit on the amount of data you can write under a single transaction.
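For illustration, a minimal sketch that inserts 20k rows under one SQLite transaction in Python; the database file name and cache_table(id, payload) schema are made up for the example:

```python
import sqlite3

conn = sqlite3.connect("local_cache.db")  # hypothetical local cache file
with conn:  # everything inside runs in a single transaction, committed on success
    conn.executemany(
        "INSERT INTO cache_table (id, payload) VALUES (?, ?)",
        ((i, "data") for i in range(20000)),
    )
conn.close()
```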

See the SQLite docs for this information: http://sqlite.org/docs.html

Follow the link "Limits in SQLite" for implementation limits like these.

Follow the link "How SQLite Implements Atomic Commit" for how transactions work

Savui