
I am running into the

DBD::mysql::st execute failed: Got a packet bigger than 'max_allowed_packet' bytes

error when trying to make a large insert using Perl & MySQL. I know that increasing the max_allowed_packet setting in my.cnf could fix this, but is it possible to tell DBI (or DBD::mysql, since my app really only needs to work with MySQL) to use smaller packets? Is it even possible to break up a large insert into smaller packets?

I don't have full control over the database server, since this needs to run in a shared hosting environment, so if I could handle this without requesting a global change to the server, that would be ideal.

Thanks!

+1  A: 

Try appending something like ";max_allowed_packet=1MB" to the first argument (the DSN) of DBI->connect.

If that doesn't work, you could add a ";mysql_read_default_file=/somewhere/my.cnf" option to the DSN, pointing to a my.cnf file that contains the proper client configuration.
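
A minimal sketch combining both suggestions (database name, host, credentials, and the config path are placeholders for illustration; this assumes your DBD::mysql build honors these DSN options):

    use strict;
    use warnings;
    use DBI;

    # mysql_read_default_file makes the client library read the [client]
    # section of the named file, where max_allowed_packet can be set
    # without touching the server's global configuration.
    my $dsn = "DBI:mysql:database=mydb;host=localhost"
            . ";mysql_read_default_file=$ENV{HOME}/.my.cnf";
    my $dbh = DBI->connect($dsn, 'username', 'password',
                           { RaiseError => 1, AutoCommit => 1 });

with a ~/.my.cnf along the lines of:

    [client]
    max_allowed_packet=1M

Note this only adjusts the client side; the effective limit is still capped by whatever max_allowed_packet the server enforces.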

Leon Timmermans
A: 

If that doesn't work, one way to sidestep the issue is to first INSERT the largest chunk of the string that fits in a packet (roughly 1MB minus the length of the INSERT statement itself), then append the remaining data in chunks of up to 1MB with UPDATEs, like this:

my $data = "whatever data needs inserting";
my $ins_sth = $dbh->prepare("insert into table_name (datacol) values (?)");
# MySQL treats || as logical OR by default, so use CONCAT to append
my $upd_sth = $dbh->prepare("update table_name set datacol = concat(datacol, ?) where id = ?");
my $max_size = 900_000; # really max_allowed_packet minus a few hundred bytes of statement overhead
my $pos = 0;
$ins_sth->execute(substr($data, $pos, $max_size));
my $id = $dbh->{mysql_insertid};
$pos += $max_size;
while ( $pos < length($data) ) {
  $upd_sth->execute(substr($data, $pos, $max_size), $id);
  $pos += $max_size;
}

I didn't test this code, but you should get the idea.

Mathieu Longtin