I have a script that retrieves a huge amount of data from a table. I want to create a mysqldump-style output to insert the data into another database table with different fields. I want the format phpMyAdmin produces, where it repeats the INSERT INTO table VALUES (values1), (values2), ... (values100); statement once it reaches a certain number of value sets, depending on what you set.

For example: if I have 550 data sets and I want the data divided into chunks of 100, I will end up with 6 INSERT INTO queries:

INSERT INTO tablename VALUES(value1), (value2), .... (value100);
INSERT INTO tablename VALUES(value101), (value102), .... (value200);
INSERT INTO tablename VALUES(value201), (value202), .... (value300);
INSERT INTO tablename VALUES(value301), (value302), .... (value400);
INSERT INTO tablename VALUES(value401), (value402), .... (value500);
INSERT INTO tablename VALUES(value501), (value502), .... (value550);
A: 

While fetching the rows, increment a counter, and when it hits a certain value, have it create a new insert statement?

Some code that might not be correct (I haven't written PHP in a LONG time), but you will probably get the idea:

$i = 0;
$insertstatements = array();
$currentinsertstatement = '';

while ($temp = mysql_fetch_assoc($result)) {
  // build the "(value1, value2, ...)" part from the current row
  $insertpart = "(value_xxx)";
  if ($i % 100 == 0) {
    // starting a new chunk: save the previous statement (unless this is the first row)
    if ($i != 0) $insertstatements[] = $currentinsertstatement;
    $currentinsertstatement = "INSERT INTO tablename VALUES " . $insertpart;
  } else {
    // somewhere in the middle of the current insert statement
    $currentinsertstatement .= ", " . $insertpart;
  }
  $i++;
}
// save the last (possibly partial) statement
if ($i > 0) {
  $insertstatements[] = $currentinsertstatement;
}
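
If it helps, a rough follow-up sketch (my addition, not part of the original answer) that runs the collected statements with the same legacy mysql_* API used above:

foreach ($insertstatements as $statement) {
  mysql_query($statement) or die(mysql_error()); // execute each chunked INSERT
}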
drvdijk
A: 

If you're using mysqldump and wish to output multiple rows in a single INSERT, then you need to use the --extended-insert option (described in the mysqldump documentation).

I'm not sure it's possible to tell mysqldump to include a specific number of rows in each INSERT statement it generates. Rather, you can set net_buffer_length (although changing it is not generally recommended), which caps the length of each generated statement, so the actual number of rows per INSERT may vary depending on the data in each row.
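
For example, something along these lines (a sketch; the database and table names are placeholders, and --net_buffer_length is in bytes):

mysqldump --extended-insert --net_buffer_length=16384 mydatabase mytable > dump.sql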

Sliff
A: 

You should definitely use transactions for huge inserts, if your storage engine supports them (like InnoDB):

BEGIN;
INSERT INTO tablename VALUES...
INSERT INTO tablename VALUES...
COMMIT;

If something goes wrong, you can safely ROLLBACK the last operation, restart your script, etc.
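
A rough sketch of that flow in PHP (my illustration, not part of the original answer), assuming a mysqli connection in $db, InnoDB tables, and the chunked statements collected in $insertstatements:

mysqli_report(MYSQLI_REPORT_ERROR | MYSQLI_REPORT_STRICT); // make mysqli throw on failure
$db->begin_transaction();
try {
    foreach ($insertstatements as $sql) {
        $db->query($sql);
    }
    $db->commit(); // all chunks are written together
} catch (mysqli_sql_exception $e) {
    $db->rollback(); // nothing from this batch is kept, so the script can simply be rerun
    throw $e;
}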

Brian Clozel
A: 

You could use array_chunk(), something like:

$toInsert = array( '(values1)', '(values2)', '(values3)', '(values4)' ); //etc.

$sqlStart = 'INSERT INTO tablename (field1, field2, field3, etc) VALUES ';
foreach (array_chunk($toInsert, 100) as $insertSet) {
    $sql = $sqlStart . implode(', ', $insertSet);
    //execute $sql
}

Are you actually doing much with the data, though? You might be able to do it all in SQL with INSERT INTO table (field1, field2) SELECT somefield, somefield2 FROM another_table.
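
For instance (a sketch with made-up table and column names):

INSERT INTO target_table (field1, field2)
SELECT somefield, somefield2
FROM another_table
WHERE some_condition = 1;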

Tom Haigh
Thanks for this. Do you have any idea how to use an auto_increment primary key when the values contain no value for the primary key?
christian
I'm not sure what you mean. If you omit the primary key from the INSERT it will be generated for you, but you will probably need to specify the list of fields you are inserting so that you can indicate which ones to omit.
Tom Haigh
Thanks again. I need to include the field names and omit the primary key field.
christian
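
For reference, a minimal sketch of the kind of statement described in the comments above, with made-up column names; the auto_increment id column is simply left out of the field list so MySQL assigns it:

INSERT INTO tablename (field1, field2, field3) VALUES
  ('a1', 'b1', 'c1'),
  ('a2', 'b2', 'c2');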