views: 13820
answers: 5

I'm passing a large dataset into a MySQL table via PHP using INSERT commands, and I'm wondering if it's possible to insert approximately 1000 rows at a time via a query, other than appending each value onto the end of a mile-long string and then executing it. I am using the CodeIgniter framework, so its functions are also available to me.

A: 

You could prepare the query for inserting one row using the mysqli_stmt class, and then iterate over the array of data. Something like:

$stmt = $db->stmt_init();
$stmt->prepare("INSERT INTO mytbl (fld1, fld2, fld3, fld4) VALUES (?, ?, ?, ?)");
foreach ($myarray as $row) {
    $stmt->bind_param('idsb', $row['fld1'], $row['fld2'], $row['fld3'], $row['fld4']);
    $stmt->execute();
}
$stmt->close();

where 'idsb' specifies the types of the data you're binding (integer, double, string, blob).
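For readers outside PHP, the same prepare-once, bind-per-row pattern exists in Python's DB-API as executemany(). A minimal sketch, using the built-in sqlite3 module so it runs standalone (the table and sample rows are made up for illustration):

```python
import sqlite3

# In-memory database so the example is self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE mytbl (fld1 INTEGER, fld2 REAL, fld3 TEXT, fld4 BLOB)")

rows = [
    (1, 2.5, "alpha", b"\x00\x01"),
    (2, 3.5, "beta",  b"\x02\x03"),
]

# executemany() prepares the statement once and binds each tuple in turn --
# the same idea as looping over bind_param()/execute() with mysqli_stmt.
conn.executemany(
    "INSERT INTO mytbl (fld1, fld2, fld3, fld4) VALUES (?, ?, ?, ?)", rows
)
conn.commit()

print(conn.execute("SELECT COUNT(*) FROM mytbl").fetchone()[0])  # → 2
```

Note that all the per-row executes happen inside a single transaction before commit(); doing the same in MySQL (disabling autocommit around the loop) is a large part of what makes the prepared-statement approach fast.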

Espresso_Boy
A: 

Well, you don't want to execute 1000 query calls, but doing this is fine:

$stmt= array( 'array of statements' );
$query= 'INSERT INTO yourtable (col1,col2,col3) VALUES ';
foreach( $stmt AS $k => $v ) {
  $query.= '(' .$v. ')'; // NOTE: you'll have to change to suit
  if ( $k !== sizeof($stmt)-1 ) $query.= ', ';
}
$r= mysql_query($query);

Depending on your data source, populating the array might be as easy as opening a file and dumping the contents into an array via file().

bdl
It's cleaner if you move that if above the query and change it to something like if ($k>0).
cherouvim
@cherouvim... umm, thanks? This post is from over a year ago...
bdl
@bdl: Yep. The information stays forever though :)
cherouvim
@cherouvim... Well, you're right about that. Thanks for your input. As I'm re-reading the example I provided, I'm failing to see your point. Care to elaborate (via pastebin, etc.)? Thanks.
bdl
@bdl: http://pastebin.com/Rv7EaZYH
cherouvim
A: 

You could always use MySQL's LOAD DATA:

LOAD DATA LOCAL INFILE '/full/path/to/file/foo.csv'
INTO TABLE `footable`
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\r\n'

to do bulk inserts rather than using a bunch of INSERT statements.

vezult
I had looked into that, but I need to manipulate the data before inserting it. It's been given to me as a Cartesian product of a 1400-by-1400 set of int values, many of which are zero. I need to convert that to a many-to-many relationship using an intermediary table to save space, hence the need for inserts as opposed to a bulk load.
toofarsideways
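The dense-to-sparse conversion described in the comment above can be sketched as follows (a hypothetical Python illustration, not the asker's actual code): walk the matrix and keep only the non-zero cells as (row, col, value) triples, which is exactly what the intermediary many-to-many table would store.

```python
def dense_to_triples(matrix):
    """Convert a dense 2-D list of ints into (row, col, value) triples,
    skipping zeros -- these become the rows of the intermediary table."""
    return [
        (i, j, v)
        for i, row in enumerate(matrix)
        for j, v in enumerate(row)
        if v != 0
    ]

matrix = [
    [0, 3, 0],
    [1, 0, 0],
    [0, 0, 7],
]
print(dense_to_triples(matrix))  # → [(0, 1, 3), (1, 0, 1), (2, 2, 7)]
```

The resulting triples can then be fed to any of the multi-row insert techniques in the answers here.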
+11  A: 

Assembling one INSERT statement with multiple rows is much faster in MySQL than one INSERT statement per row.

That said, it sounds like you might be running into string-handling problems in PHP, which is really an algorithm problem, not a language one. Basically, when working with large strings, you want to minimize unnecessary copying; primarily, this means avoiding concatenation. The fastest and most memory-efficient way to build a large string, such as one inserting hundreds of rows at once, is to take advantage of the implode() function and array assignment.

$sql = array(); 
foreach( $data as $row ) {
    $sql[] = '("'.mysql_real_escape_string($row['text']).'", '.$row['category_id'].')';
}
mysql_real_query('INSERT INTO table (text, category) VALUES '.implode(',', $sql));

The advantage of this approach is that you don't copy and re-copy the SQL statement assembled so far with each concatenation; instead, PHP does the copying once, inside the implode() call. This is a big win.
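For reference, the string the implode() call assembles is a single multi-row INSERT of the form below (the values shown are illustrative); MySQL parses and executes it as one statement, which is where the speedup over row-at-a-time inserts comes from:

```sql
INSERT INTO table (text, category)
VALUES ('first row', 1),
       ('second row', 1),
       ('third row', 2);
```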

If you have lots of columns to put together, and one or more are very long, you could also build an inner loop to do the same thing and use implode() to assign the values clause to the outer array.

staticsan
Thanks for that! Btw you're missing a closing bracket on the end of the function if anyone is planning on copying it. mysql_real_query('INSERT INTO table VALUES (text, category) '.implode(','. $sql));
toofarsideways
Thanks! Fixed. (I often do that...)
staticsan
I also just noticed (me being silly too): the implode(','. $sql)); should be implode(',', $sql));
toofarsideways
and the query should really be 'INSERT INTO table (text, category) VALUES '.implode(',', $sql) ... sigh, 4am coding leads to terrible debugging :(
toofarsideways
the use of implode rocks.
cherouvim
@staticsan: Thanks a lot! Superb Technique! Had a really significant speedup!
Legend
A: 

Hi,

a bulk insert always uses all columns, so in this case you must edit your CSV or Excel file first. You can use dbTube.org to configure your bulk Excel import.

Greetings,

Andreas

FreeGroup