views: 415
answers: 6
I have some code I use to transfer values from table1 to table2; the two tables sit in different databases.

It's slow when I have 100,000 records; it takes forever to finish, 10+ minutes.

(Windows Mobile smartphone)

What can I do?

cmd.CommandText = "insert into " + TableName + " select * from sync2." + TableName;
cmd.ExecuteNonQuery();

EDIT

The problem is not resolved; I am still looking for answers.

A: 

I think you could use a DataTable (or was it a DataSet? sorry, still a beginner in .NET), .Fill() the reader's results into it, and then use a BulkCopy or BulkInsert operation to push it all out into a table in another database (connection).
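A rough sketch of what that might look like, assuming a System.Data.SQLite-style provider. Note that the SQLite providers don't ship a BulkCopy class, so the write-back side below falls back to a parameterized insert inside a transaction; sourceConnectionString and destConnectionString are placeholders, and TableName is the variable from the question:

using System.Data;
using System.Data.SQLite;

// Sketch of the DataTable idea: read everything into memory, then write it out.
// On a memory-constrained device you may need to do this in chunks.
var table = new DataTable();
using (var source = new SQLiteConnection(sourceConnectionString))
using (var adapter = new SQLiteDataAdapter("select * from " + TableName, source))
{
    adapter.Fill(table);                     // pull the source rows into the DataTable
}

using (var dest = new SQLiteConnection(destConnectionString))
{
    dest.Open();
    using (var tx = dest.BeginTransaction())
    using (var insert = dest.CreateCommand())
    {
        // build "insert or replace into T values (?,?,...)" from the column count
        var placeholders = new System.Text.StringBuilder();
        for (int i = 0; i < table.Columns.Count; i++)
        {
            placeholders.Append(i == 0 ? "?" : ",?");
            insert.Parameters.Add(insert.CreateParameter());
        }
        insert.CommandText = "insert or replace into " + TableName + " values (" + placeholders + ")";

        foreach (DataRow row in table.Rows)
        {
            for (int i = 0; i < table.Columns.Count; i++)
                insert.Parameters[i].Value = row[i];
            insert.ExecuteNonQuery();        // one insert per row, all inside one transaction
        }
        tx.Commit();
    }
}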

Bandi-T
Can you show me a code example for this?
Pentium10
A: 

One suggestion that may help (although you'd need to profile this) would be to call your insert or replace command a single time for multiple records, with multiple values. This batches the calls to the DB, which could speed things up.

Also, it sounds like you're copying from one DB to another - if the records are not going to exist in the new DB, you can use INSERT instead of INSERT OR REPLACE, which is (at least theoretically) faster.

Neither of these is going to be dramatic, however - this is likely going to still be a slow operation.
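For reference, a sketch of the multi-value form being described. The multi-row VALUES syntax needs SQLite 3.7.11 or newer, so it may not be available on the device, and the column names and parameters below are made up for illustration:

// Hypothetical: carry several rows in one statement to cut down on per-statement overhead.
// Requires SQLite 3.7.11+; col1/col2 and the @-parameters are placeholders.
cmd.CommandText =
    "insert or replace into " + TableName + " (col1, col2) values " +
    "(@a1, @b1), (@a2, @b2), (@a3, @b3)";
cmd.Parameters.AddWithValue("@a1", row1Col1);   // values taken from the source rows
// ... add the remaining parameters the same way, then execute once per batch:
cmd.ExecuteNonQuery();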

Reed Copsey
As far as I know, SQLite doesn't support multiple values. Actually there are two scenarios: the first time, the records won't exist, so a pure insert is better; but the second time they will, and I need the INSERT OR REPLACE.
Pentium10
+1  A: 

As far as I can tell, you are using two SQL statements per row - one to insert it, and then another when you update the entire table. Do you need to update the entire table, or just the rows you are inserting? If it's only the new rows, you could just insert them with dirty set to 0 in the first place.

You can also try changing your ad-hoc insert statement into a prepared/compiled/parametrized statement. In some database engines that provides a small speed boost.

There are a few options for improvement listed in the SQLite FAQ, here.

Finally, have you figured out what your bottleneck is? I don't know much about the performance of mobile phone applications - is your profiling showing that you are CPU bound or "disk" bound, whatever passes for a disk on that particular phone?
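If the dirty flag can be set at insert time, the second pass over the table goes away entirely. A hedged sketch, with a and b standing in for the real column list (which the question's code copies with select *):

// Hypothetical: copy the rows with dirty already forced to 0, so no
// follow-up UPDATE over the whole table is needed. a and b are placeholder columns.
cmd.CommandText =
    "insert or replace into " + TableName + " (a, b, dirty) " +
    "select a, b, 0 from sync2." + TableName;
cmd.ExecuteNonQuery();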

Peter Recore
After I'm done with the process I have to set dirty = 0 for all records. I don't know if parameterized queries will help me, because the table schema is variable and I don't know the columns in advance. On mobile, both CPU and disk are bottlenecks, probably disk more.
Pentium10
A: 

Instead of executing the statements individually, you could build up the string and then execute it all at once as a batch; alternatively, look into batch updates.

Alternatively, you could do this all as one statement, which would probably be more efficient; something like this is what I mean:

http://forums.mysql.com/read.php?61,15029,15029

I hope this helps.

Burt
Can you please be more precise, and please stick with SQLite? Batch updates are rare on SQLite.
Pentium10
You could get rid of the loop by doing something like: insert into ??? select ??? from ???
Burt
He is dealing with two different databases, not two tables in the same database.
Peter Recore
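For what it's worth, the one-statement copy does work across two database files in SQLite once the second file is attached under an alias; the sync2. prefix in the question suggests this is already the setup. A minimal sketch, with a made-up file path:

// Attach the second database file under the alias "sync2", then copy in one statement.
// The file path is a placeholder.
cmd.CommandText = "attach database '\\Storage Card\\sync.db3' as sync2";
cmd.ExecuteNonQuery();
cmd.CommandText = "insert or replace into " + TableName + " select * from sync2." + TableName;
cmd.ExecuteNonQuery();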
A: 

Well, you're using the query processor for each insert. It's going to be much better to use a command, add Parameters to it, Prepare it, and then just set the Parameters in your loop.

I don't know if SQLite supports TableDirect commands. If it does, that would be way, way faster (it's a few orders of magnitude faster in SQL Compact, for example). It would certainly be worth a try.

ctacke
According to a note on that page: TableDirect is only supported by the .NET Framework Data Provider for OLE DB, and multiple-table access is not supported when CommandType is set to TableDirect.
Pentium10
Can you show me a code example for the parameters?
Pentium10
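A rough sketch of the parameterized, prepared approach described above, assuming a System.Data.SQLite-style provider. Here conn is the open destination connection, sourceRows stands for however the source rows are read, and the columns a and b are placeholders for the real schema:

// Sketch: one prepared, parameterized insert reused for every row.
// conn, sourceRows, and the columns a/b are placeholders for the real code.
using (var tx = conn.BeginTransaction())
using (var insert = conn.CreateCommand())
{
    insert.CommandText = "insert or replace into " + TableName + " (a, b) values (@a, @b)";
    insert.Parameters.Add(new SQLiteParameter("@a"));
    insert.Parameters.Add(new SQLiteParameter("@b"));
    insert.Prepare();                          // compile the statement once

    foreach (DataRow row in sourceRows)
    {
        insert.Parameters["@a"].Value = row["a"];
        insert.Parameters["@b"].Value = row["b"];
        insert.ExecuteNonQuery();              // re-executes the compiled statement
    }
    tx.Commit();                               // one commit for the whole batch
}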
+2  A: 

1] You can set the following parameters in your connectionString

string connectionString = @"Data Source=E:\myDB.db3; New=true; Version=3; PRAGMA cache_size=20000; PRAGMA page_size=32768; PRAGMA synchronous=off";

This has its own limitations; check this link for details.

The above will ultimately increase the cache size (cache_size & page_size), but you could lose some data in case of a forced shutdown of your system (synchronous=off).
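If the provider in use doesn't honour PRAGMA settings in the connection string, the same settings can usually be issued as ordinary commands right after opening the connection; a small sketch, with dbConnection being the open connection as in point 2 below:

// Alternative: send the pragmas as a command once the connection is open.
// synchronous=OFF trades durability for speed, and page_size only takes
// effect on a newly created (or vacuumed) database file.
using (var pragma = dbConnection.CreateCommand())
{
    pragma.CommandText = "PRAGMA synchronous = OFF; PRAGMA cache_size = 20000; PRAGMA page_size = 32768;";
    pragma.ExecuteNonQuery();
}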

2] You can wrap your insert statements inside a transaction like this

dbTransaction = dbConnection.BeginTransaction();   // one transaction around the whole batch
dbCommand.Transaction = dbTransaction;             // enlist the insert command in it
// your individual insert statements here
dbTransaction.Commit();                            // a single commit instead of one per insert

3] There is one more trick that is well explained on this page. Check it out.

Hope that helps
Cheers!

Veer