views: 228
answers: 4

I need to populate a database with thousands of entries on a daily basis, but at the moment my code inserts each record into the database one at a time.

Do While lngSQLLoop < lngCurrentRecord
    lngSQLLoop = lngSQLLoop + 1
    ' One INSERT statement, and one database call, per record
    sql = "INSERT INTO db (key1, key2) VALUES ('value1', 'value2');"
    result = bInsertIntoDatabase(sql, True)
    If result = False Then lngFailed = lngFailed + 1
Loop

This works, but takes about 5 seconds for each 100 entries. Would there be a more efficient way to put this into the database? I've tried

INSERT INTO db (key1, key2) VALUES ('value1-1', 'value2-1'), ('value1-2', 'value2-2'), ('value1-3', 'value2-3');

but this fails with a "missing semicolon (;)" error, suggesting it doesn't like the values being listed like that. Is there a way to do this from VBA?

+3  A: 

The multi-row VALUES (...), (...) syntax only works in SQL Server 2008 and later.

But you're in luck: you can batch these by simply concatenating your SQL statements and batching the calls to bInsertIntoDatabase.
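
For example, a minimal sketch of that batching (assuming bInsertIntoDatabase will pass a string of several semicolon-separated statements through to the database in one call; the batch size of 100 is arbitrary):

sql = ""
Do While lngSQLLoop < lngCurrentRecord
    lngSQLLoop = lngSQLLoop + 1
    ' Append each statement to the batch instead of sending it immediately
    sql = sql & "INSERT INTO db (key1, key2) VALUES ('value1', 'value2');"
    ' Send the batch every 100 statements, and again at the end of the loop
    If lngSQLLoop Mod 100 = 0 Or lngSQLLoop = lngCurrentRecord Then
        ' Note: a failure here means the whole batch failed, not a single row
        If bInsertIntoDatabase(sql, True) = False Then lngFailed = lngFailed + 1
        sql = ""
    End If
Loop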

The only downside to this approach is that if one statement in the batch fails, so will every subsequent statement in the batch.

So, if failure is a regular issue (say, from key collisions), you would need another approach. One solution (sketched in SQL after the list) is to:

  • Insert batches into a temporary table first (without unique indexes, thus avoiding failures initially)
  • Do a final insert into the main table with a WHERE clause that prevents an error
  • Get the result count and subtract from the total number of records in the temporary table to get the number of failures.
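
A rough SQL sketch of that approach (the staging-table name and the collision rule in the WHERE clause are placeholders):

-- 1. Batch the raw inserts into a staging table that has no unique indexes
INSERT INTO tmpStaging (key1, key2) VALUES ('value1', 'value2');

-- 2. Copy across only the rows that will not collide with existing keys
INSERT INTO db (key1, key2)
SELECT s.key1, s.key2
FROM tmpStaging AS s
WHERE NOT EXISTS (SELECT 1 FROM db WHERE db.key1 = s.key1);

-- 3. Failures = rows in tmpStaging minus rows affected by step 2
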
richardtallent
+1, works perfectly. But don't make the mistake I once did: `sql = sql + "INSERT ...;"` in a loop. String concatenation gets very slow as the string grows (http://support.microsoft.com/kb/170964). Write your own little buffer, or search for a VBA string class with a pre-allocated buffer.
stephan
@stephan: good point. Using an array of strings and concatenating with Join() is an old VBA trick for string-concatenation performance.
richardtallent
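
A minimal sketch of the array-plus-Join() trick mentioned in the comments above (the statement text is a placeholder):

Dim parts() As String
Dim i As Long

ReDim parts(1 To lngCurrentRecord)
For i = 1 To lngCurrentRecord
    ' Build each statement as an array element instead of growing one string
    parts(i) = "INSERT INTO db (key1, key2) VALUES ('value1', 'value2');"
Next i

' Join() concatenates the whole array in a single pass
sql = Join(parts, "")
result = bInsertIntoDatabase(sql, True)
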
A: 

If the source of your data can be accessed via a database driver (like ODBC) and your database framework supports heterogeneous queries, you should be able to do something like:

INSERT INTO targetDBtable (key1, key2)
SELECT key1, key2 FROM sourceDBtable;
fupsduck
A: 

Using .AddNew and .Update with an updatable recordset seems fast: on my system it takes about 0.25 seconds to add 10,000 records with no errors, or 1.25 seconds to add 10,000 records when all 10,000 fail.
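
A minimal sketch of that approach with a DAO recordset (assuming an Access/Jet back end; the database path, table, and field names are placeholders):

Dim dbs As DAO.Database
Dim rs As DAO.Recordset
Dim i As Long

' Assumes a DAO reference; from Excel, open the .mdb/.accdb directly
Set dbs = DBEngine.OpenDatabase("C:\data\mydb.mdb")
Set rs = dbs.OpenRecordset("db", dbOpenTable)

For i = 1 To lngCurrentRecord
    rs.AddNew               ' start a new row
    rs!key1 = "value1"
    rs!key2 = "value2"
    rs.Update               ' commit the row (errors can be trapped here to count failures)
Next i

rs.Close
dbs.Close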

Charles Williams
A: 

Save the data to a CSV file first and then use Access's TransferText method (on the DoCmd object) to load it into the Access table in one go. Remember to delete the CSV file afterwards.

Even if you're running the code from Excel, you can still execute the TransferText method in Access via Automation.
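
A rough sketch of doing that from Excel via late-bound Automation (file paths, database path, and table name are placeholders; assumes Access is installed):

Dim accApp As Object

Set accApp = CreateObject("Access.Application")
accApp.OpenCurrentDatabase "C:\data\mydb.mdb"

' acImportDelim = 0; True means the CSV's first row holds the field names
accApp.DoCmd.TransferText 0, , "db", "C:\data\import.csv", True

accApp.CloseCurrentDatabase
accApp.Quit
Set accApp = Nothing

Kill "C:\data\import.csv"    ' delete the CSV afterwards, as noted above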

Marcus from London