views: 53
answers: 4

Hi

I have an array with 30000 plus entries that need to go into a MySQL table.

What is the best practice from here? Let's say [0], [1] and [2] would map to the columns 'title', 'type' and 'customer' in the database.

Is it to add key names matching the column names in the table and call some "magic" function? Or build the query manually...

Array
(
    [0] => Array
        (
            [0] => 2140395946
            [1] => 1SAP
            [2] => 0041451463
        )

    [1] => Array
        (
            [0] => 2140411607
            [1] => 2SAP
            [2] => 0041411940
        )

    [2] => Array
        (
            [0] => 2140706194
            [1] => 4SAP
            [2] => 0041411943
        )
etc. etc.

UPDATE - based on answers

Thanks for the answers.

The solution would normally be to manually build the SQL string so that all rows can be inserted in one statement:

INSERT INTO `tx_opengate_table` (`machine`, `customer`, `type`)
VALUES
  ('m123', 'dfkj45', 'A'),
  ('m137', 'kfkj49', 'A'), "repeat this line for each entry in the array"
  ... ... ...
  ('m654321', '34dgf456', 'C4') "end without a comma, or remove the last comma"
;
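
For completeness, a minimal sketch of how that string could be built in PHP. The connection variable $db (mysqli), the use of $lines for the array shown above, and the index-to-column mapping are assumptions for illustration only:

// Build one multi-row INSERT from the array above.
// Assumption: $db is a mysqli connection, $lines is the array shown above,
// and [0] => machine, [1] => type, [2] => customer.
$values = array();
foreach ($lines as $row) {
    $values[] = "('" . $db->real_escape_string($row[0]) . "', '"
              . $db->real_escape_string($row[1]) . "', '"
              . $db->real_escape_string($row[2]) . "')";
}
$sql = "INSERT INTO `tx_opengate_table` (`machine`, `type`, `customer`) VALUES "
     . implode(",\n  ", $values);
$db->query($sql);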

Special for TYPO3

I happen to be doing this in the CMS TYPO3, and I just came across a function that was added not long ago:

//Insert new rows
$table = 'tx_opengate_stuff';
$fields = array('machine','type','customer');
$lines = "array as given above";
$GLOBALS['TYPO3_DB']->exec_INSERTmultipleRows($table,$fields,$lines);
+2  A: 

Magic function? I'm guessing you mean some sort of DB abstraction layer? IMHO that would just double your work.

Just build the query manually, looping through the array and INSERTing the values as you go.

martin blank
+1  A: 

You will have to build the query manually if you want the best performance for this operation. If you iteratively add everything using PDO or some abstraction layer, you will end up with 30000+ insert queries.

Use foreach to iterate over the array, build one extended INSERT query that does all the work at once, and just send it to the server.

Pelle ten Cate
Doing extended inserts here is a bad idea. Ever seen what happens when you fill up the MySQL buffer? Perhaps doing extended inserts for 10 at a time isn't a bad idea, but this depends on data length.
Brad
You should always be aware of the buffer size while doing a bulk insert in MySQL. That doesn't make it a bad idea. I never had this issue doing it with 100 - 1000 records at a time without setting it. You can configure the buffer's length by setting the bulk_insert_buffer_size value. It defaults to 8388608, so whether you are going to have a problem really depends on the sizes of the field values. Clearly, when I tried to insert 1000 rows at once, my average length of the values for one row was less than 830 chars. Looking at the question, I don't expect one row to be longer than 40 chars.
Pelle ten Cate
+3  A: 

I would say just build it yourself. You can set it up like this:

$query = "INSERT INTO x (a,b,c) VALUES ";
foreach ($arr as $item) {
  $query .= "('".$item[0]."','".$item[1]."','".$item[2]."'),";
}
$query = rtrim($query, ","); // remove the extra comma
//execute query

Don't forget to escape quotes if it's necessary.

Also, be careful that there's not too much data being sent at once. You may have to execute it in chunks instead of all at once.
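
A rough sketch of that chunked approach, with escaping added; the mysqli connection in $db and the batch size of 1000 are only illustrative:

// Escape values and insert in batches of 1000 rows so that
// no single statement grows too large.
foreach (array_chunk($arr, 1000) as $chunk) {
    $values = array();
    foreach ($chunk as $item) {
        $values[] = "('" . $db->real_escape_string($item[0]) . "','"
                  . $db->real_escape_string($item[1]) . "','"
                  . $db->real_escape_string($item[2]) . "')";
    }
    $db->query("INSERT INTO x (a,b,c) VALUES " . implode(",", $values));
}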

Drackir
Super. That is it. I guess it just should be $item[0], $item[1] and $item[2], and not the $arr array, that goes into the $query string.
Tillebeck
Yeah, sorry. I fixed it.
Drackir
+1  A: 
$statement = "INSERT INTO table (title, type, customer) VALUES ";
foreach( $data as $row) {
   $statement .= ' ("' . explode($row, '","') . '")';
}
Jon Snyder
Great. Something like this. I guess you would use "implode" to concat the $row array into a string, right? Otherwise I misunderstood ;-)
Tillebeck
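
For reference, a corrected sketch of that snippet (implode instead of explode, plus commas between the row groups), keeping the same $data array and column names:

$parts = array();
foreach ($data as $row) {
    $parts[] = '("' . implode('","', $row) . '")';
}
$statement = "INSERT INTO table (title, type, customer) VALUES " . implode(", ", $parts);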