tags:

views: 253
answers: 4

What's the best way to dump a MySQL table to a file in Perl?

I've been using:

open( my $fh, '>', $filename ) or die "Can't open $filename: $!";
my $sth = $dbh->prepare('SELECT * FROM table');
$sth->execute();
while ( my $row = $sth->fetchrow_arrayref ) {
    print $fh join( '|', @$row ), "\n";
}
close $fh;
+5  A: 

Can you shell out to mysqldump? That's what it's there for...
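A minimal sketch of what that shell-out might look like; `me`, `mydb`, and `mytable` are placeholder names, and you would substitute your own credentials:

```shell
# Dump one table's schema and data to a file.
# -p prompts for the password; the names here are placeholders.
mysqldump --user=me -p mydb mytable > mytable.sql
```

From Perl you would typically wrap this in `system()` and check its return status rather than ignoring failures.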

Greg
+3  A: 

It depends on what you really want. Do you want to preserve schema information and database metadata? What about column names? On the other hand, your current method should work fine for data storage as long as the schema and column order don't change, but you should consider the case of a record containing the "|" character: escape that value appropriately, and apply the corresponding logic when you read and parse the file back. You might want to look into Text::CSV for a fast, reliable, and flexible implementation that does most of the work for you in both directions, both writing and reading the file.
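A sketch of the Text::CSV approach, assuming the Text::CSV module is installed and `$sth` is an already-executed statement handle as in the question; the filename is a placeholder:

```perl
use strict;
use warnings;
use Text::CSV;

# Use "|" as the separator, but let Text::CSV quote/escape any field
# that itself contains the separator, so the file parses back cleanly.
my $csv = Text::CSV->new( { sep_char => '|', binary => 1, eol => "\n" } )
    or die Text::CSV->error_diag;

open my $fh, '>', 'table.dump' or die "open: $!";
while ( my $row = $sth->fetchrow_arrayref ) {
    $csv->print( $fh, $row );
}
close $fh;
```

Reading the file back is the mirror image: `$csv->getline($fh)` returns each row as an arrayref with the escaping undone.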

codehead
+1  A: 

As already said, it depends on what you want to do. If the purpose is to back up the data, you should consider mysqlhotcopy (if you are using MyISAM tables), which copies the data/index files directly. It is much faster than manually dumping data (e.g. I get a 2.5 GB backup in approximately 3 minutes).
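A sketch of the invocation, assuming the credentials, database name, and backup path are placeholders; mysqlhotcopy must run on the database host itself, since it copies the table files:

```shell
# Copy the MyISAM table files for database "mydb" to a backup directory.
# User, password, and paths here are placeholders for your own.
mysqlhotcopy --user=me --password=secret mydb /var/backups/mysql
```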

Anon
+1  A: 

It depends on why you need the dump and on the size and content of the data. Assuming what you want is not a backup (which obviously calls for a dedicated tool rather than Perl), I would go with something like the following. It preserves your column names and makes the data easier, in some respects, to slurp into other programs or hand off than a CSV.

use strict;
use warnings;
use XML::Simple;
...
# Collect each row as a hash keyed by column name, tag it with its
# 0-based position, then emit the whole set as XML.
my @rows;
while ( my $h = $sth->fetchrow_hashref() ) {
    $h->{_ROWNUM} = scalar @rows;    # row count *before* the push
    push @rows, $h;
}
print XMLout( \@rows );
hpavc