views: 415

answers: 2

I've noticed the following behavior.

I have a file of about 3 MB containing several thousand rows. I split each row and add the values to a prepared statement (about 250,000 statements in total).

What I do is:

prepareStatement()
for each row {
  addBatch()
  every 200 rows {
    executeBatch()
    clearBatch()
  }
}

at the end

commit()

Memory usage climbs to around 70 MB, though without an out-of-memory error. Is it possible to get the memory usage down while keeping the transactional behavior (if one insert fails, all fail)? I was able to lower the memory by committing along with each executeBatch and clearBatch, but that causes a partial insert of the total set.

+1  A: 

You could insert all rows into a temp table with the same structure and, if everything is fine, let the database move them into the target table using: insert into target (select * from temp). If the import into the temp table fails, you haven't changed anything in your target table.

EDIT: fixed syntax
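A minimal sketch of this staging approach, assuming a hypothetical `temp_rows` staging table with two columns and a `target` table of the same structure (names and column layout are placeholders, not from the question):

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.List;

public class StagedImport {
    // SQL that moves the staged rows into the real table in one statement
    // (table names are placeholders for your schema).
    static String copySql(String target, String temp) {
        return "INSERT INTO " + target + " SELECT * FROM " + temp;
    }

    static void importViaTemp(Connection con, List<String[]> rows) throws SQLException {
        con.setAutoCommit(false);
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO temp_rows (col1, col2) VALUES (?, ?)")) {
            int n = 0;
            for (String[] row : rows) {
                ps.setString(1, row[0]);
                ps.setString(2, row[1]);
                ps.addBatch();
                if (++n % 200 == 0) {   // flush every 200 rows to cap memory
                    ps.executeBatch();
                    ps.clearBatch();
                }
            }
            ps.executeBatch();          // flush the remainder
            try (Statement st = con.createStatement()) {
                st.executeUpdate(copySql("target", "temp_rows"));
            }
            con.commit();               // only now does the target table change
        } catch (SQLException e) {
            con.rollback();             // staging failed: target table untouched
            throw e;
        }
    }
}
```

Because the copy into `target` happens in a single statement inside the same transaction, a failure anywhere leaves `target` untouched, while the periodic flush keeps the batch buffer small.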

stacker
A: 

You could also use the JDBC 2.0 "batch processing" feature.

  1. Disable auto-commit on your connection: connection.setAutoCommit(false)
  2. Add batches to your statement: statement.addBatch(sql_text_here)
  3. Once your batches are all loaded, execute them: statement.executeBatch()
  4. Commit: connection.commit()
  5. Catch exceptions and roll back as necessary: connection.rollback()

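The steps above can be sketched as follows. The table and column names in `toInsertStatements` are placeholders, and (as the question already does) real code should prefer a PreparedStatement over concatenated SQL text:

```java
import java.sql.Connection;
import java.sql.SQLException;
import java.sql.Statement;
import java.util.ArrayList;
import java.util.List;

public class BatchRunner {
    // Builds one INSERT per row; table and column names are placeholders.
    // Concatenating values into SQL is shown only to match addBatch(sql_text);
    // a PreparedStatement with ? parameters is the safer choice.
    static List<String> toInsertStatements(List<String[]> rows) {
        List<String> sql = new ArrayList<>();
        for (String[] r : rows) {
            sql.add("INSERT INTO target (col1, col2) VALUES ('"
                    + r[0] + "', '" + r[1] + "')");
        }
        return sql;
    }

    static void runAll(Connection con, List<String> sqlStatements) throws SQLException {
        con.setAutoCommit(false);                 // 1. take manual control of the transaction
        try (Statement st = con.createStatement()) {
            for (String sql : sqlStatements) {
                st.addBatch(sql);                 // 2. queue each statement
            }
            st.executeBatch();                    // 3. send the whole batch
            con.commit();                         // 4. make it permanent
        } catch (SQLException e) {
            con.rollback();                       // 5. undo everything on failure
            throw e;
        }
    }
}
```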
More on exception handling for rollback... here is a typical rollback exception handler:

  catch( BatchUpdateException bue )
  {
    bError = true;
    aiupdateCounts = bue.getUpdateCounts(); // results for each statement up to the failure

    // BatchUpdateException is an SQLException; walk the chain for the details
    SQLException SQLe = bue;
    while( SQLe != null)
    {
      // log SQLe.getMessage(), SQLe.getSQLState(), SQLe.getErrorCode()

      SQLe = SQLe.getNextException();
    }
  } // end BatchUpdateException catch
  catch( SQLException SQLe )
  {
    // handle non-batch SQL errors here

  } // end SQLException catch
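The array returned by getUpdateCounts() shows how far the batch got before failing. As a sketch (the helper name is mine, not part of the JDBC API): success entries are non-negative row counts or Statement.SUCCESS_NO_INFO (-2), while Statement.EXECUTE_FAILED (-3) marks a failed statement:

```java
import java.sql.Statement;

public class BatchResults {
    // Counts entries that represent a successful statement: a non-negative
    // row count, or SUCCESS_NO_INFO (-2). EXECUTE_FAILED (-3) is a failure.
    static int successfulUpdates(int[] updateCounts) {
        int ok = 0;
        for (int c : updateCounts) {
            if (c >= 0 || c == Statement.SUCCESS_NO_INFO) ok++;
        }
        return ok;
    }
}
```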

Read up here: http://java.sun.com/developer/onlineTraining/Database/JDBC20Intro/JDBC20.html#JDBC2015

Tim Drisdelle
Isn't that what he described he does presently, except using prepared statements?
Anonym
Partially, yes (steps 1-4). But step 5 will allow a complete rollback of the transaction if it fails.
Tim Drisdelle
Rollback isn't relevant to the problem.
skaffman
OP states: "Is it possible get the memory usage down? and have the transactional behavior (if one fails all fails.)" so yes, rollback of the transaction is relevant to the problem.
Tim Drisdelle