views: 202

answers: 3

I have a piece of code that executes about 500,000 inserts on a database. It's currently done in a loop, calling the PreparedStatement's executeUpdate on each iteration. Would it be faster to add all 500,000 inserts to a batch and call executeBatch just once?

+3  A: 

Yes, it will be much faster. Make sure you turn autoCommit off first, otherwise you get no performance benefit.

Charles Ma
+1  A: 

Using a PreparedStatement in combination with the batch update facility yields the most efficient results (example from the Sun JDBC documentation):

// turn off autocommit
con.setAutoCommit(false);

PreparedStatement stmt = con.prepareStatement(
    "INSERT INTO employees VALUES (?, ?)");

stmt.setInt(1, 2000);
stmt.setString(2, "Kelly Kaufmann");
stmt.addBatch();

stmt.setInt(1, 3000);
stmt.setString(2, "Bill Barnes");
stmt.addBatch();

// submit the batch for execution
int[] updateCounts = stmt.executeBatch();

// commit explicitly, since auto-commit was disabled
con.commit();
grigory
A: 

500,000 rows is far too many to add in a single batch. Remember that those records are kept in memory and sent all at once. Add them in batches of a few thousand instead. I didn't notice any improvement between 1,000 and 10,000 rows per batch (with MySQL), but I presume other factors count as well.
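The chunking described above can be sketched as follows. This is a minimal illustration, not code from the question: the `employees` table, the column values, and the `BATCH_SIZE` of 1,000 are all assumptions, and you'd tune the batch size for your own driver and database.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class ChunkedInsert {
    // A few thousand rows per executeBatch() keeps driver memory bounded;
    // 1,000 is an arbitrary but typical choice.
    static final int BATCH_SIZE = 1000;

    // true when 0-based row index i completes a chunk of BATCH_SIZE rows
    static boolean chunkComplete(int i) {
        return (i + 1) % BATCH_SIZE == 0;
    }

    // Inserts all names, flushing the batch every BATCH_SIZE rows instead
    // of accumulating all of them and calling executeBatch() once.
    static void insertAll(Connection con, String[] names) throws SQLException {
        con.setAutoCommit(false); // required for the batching to pay off
        try (PreparedStatement stmt = con.prepareStatement(
                "INSERT INTO employees VALUES (?, ?)")) {
            for (int i = 0; i < names.length; i++) {
                stmt.setInt(1, i);
                stmt.setString(2, names[i]);
                stmt.addBatch();
                if (chunkComplete(i)) {
                    stmt.executeBatch(); // send this chunk, freeing driver memory
                }
            }
            stmt.executeBatch(); // flush the final, possibly partial, chunk
        }
        con.commit();
    }
}
```

One transaction still covers all the rows here; only the wire transfer is chunked. If you also commit per chunk, a failure partway leaves earlier chunks persisted, so decide which behavior you need.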

adrian.tarau
It could be quite feasible for a small example with a prepared statement, but it really depends; I agree...
grigory