views: 2780

answers: 4

I'd like to avoid mysqldump since that outputs in a form that is only convenient for mysql to read. CSV seems more universal (one file per table is fine). But if there are advantages to mysqldump, I'm all ears. Also, I'd like something I can run from the command line (linux). If that's a mysql script, pointers to how to make such a thing would be helpful.

+2  A: 

If you can cope with table-at-a-time, and your data is not binary, use the -B option to the mysql command. With this option it'll generate TSV (tab-separated) output, which can be imported into Excel, etc., quite easily:

% echo 'SELECT * FROM table' | mysql -B -uxxx -pyyy database
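If you need true CSV on the client side, the tab-separated output can be post-processed. Here's a sketch using awk (not part of the original answer; it assumes, as mysql -B's batch mode does, that embedded tabs and newlines in the data have already been escaped as \t and \n):

```shell
# Sketch: convert tab-separated output (as produced by mysql -B) into quoted CSV.
tsv_to_csv() {
  awk 'BEGIN { FS = "\t"; OFS = "," }
       { for (i = 1; i <= NF; i++) {
           gsub(/"/, "\"\"", $i)   # double any embedded quotes
           $i = "\"" $i "\""       # wrap every field in quotes
         }
         print
       }'
}

# In practice: echo 'SELECT * FROM table' | mysql -B -uxxx -pyyy database | tsv_to_csv
# Demonstrated here on a sample row:
printf '1\tDoe, Jane\t"quoted"\n' | tsv_to_csv
# → "1","Doe, Jane","""quoted"""
```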

Alternatively, if you've got direct access to the server's file system, use SELECT INTO OUTFILE which can generate real CSV files:

SELECT * INTO OUTFILE 'table.csv'
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
FROM table
Alnitak
thanks! your second "table" should be "database", right? no option for CSV instead of TSV that you know of?
dreeves
duh - yes, it should have read 'database'. No, there's no option for CSV, this is the best I know of without using MySQL's built-in 'select into outfile', which _can_ do CSV, but writes the files on the server, not the client.
Alnitak
oh, well, I don't mind scp'ing the dumped files afterwards...
dreeves
+2  A: 

In MySQL itself, you can specify CSV output like:

SELECT order_id,product_name,qty
FROM orders
INTO OUTFILE '/tmp/orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n'

From http://www.tech-recipes.com/rx/1475/save-mysql-query-results-into-a-text-or-csv-file/
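To get one CSV file per table, as the question asks, the statement above can be generated for every table. A sketch (the table names here are placeholders; in practice the list would come from SHOW TABLES, and the output directory must be writable by the mysqld server process, since INTO OUTFILE writes on the server):

```shell
# Sketch: emit one SELECT ... INTO OUTFILE statement per table.
# Placeholder table list; real one: echo 'SHOW TABLES' | mysql -B -N -uxxx -pyyy database
for t in orders customers; do
  cat <<EOF
SELECT * INTO OUTFILE '/tmp/$t.csv'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM $t;
EOF
done
# Pipe the generated statements into:  mysql -uxxx -pyyy database
```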

Paul Tomblin
+1  A: 

Check out mk-parallel-dump which is part of the ever-useful maatkit suite of tools. This can dump comma-separated files with the --csv option.

This can do your whole db without specifying individual tables, and you can specify groups of tables in a backupset table.

Note that it also dumps table definitions, views and triggers into separate files. In addition to providing a complete backup in a more universally accessible form, it is also immediately restorable with mk-parallel-restore.

Paul Dixon
+2  A: 

If you really need a "backup" then you also need the database schema: table definitions, view definitions, stored procedures and so on. A backup of a database isn't just the data.

The value of the mysqldump format for backup is specifically that it is very EASY to use it to restore mysql databases. A backup that isn't easily restored is far less useful. If you are looking for a method to reliably back up mysql data so you can restore it to a mysql server, then I think you should stick with the mysqldump tool.

Mysql is free and runs on many different platforms. Setting up a new mysql server that I can restore to is simple. I am not at all worried about not being able to setup mysql so I can do a restore.

I would be far more worried about a custom backup/restore based on a fragile format like CSV/TSV failing. Are you sure that all the quotes, commas, and tabs in your data would get escaped correctly and then parsed correctly by your restore tool?
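The fragility is easy to demonstrate (a sketch, not from the original answer): a naive tool that splits on every comma miscounts the fields of a perfectly valid quoted CSV row.

```shell
# A valid 3-field CSV row, with an embedded comma inside the quoted second field:
row='1,"Doe, Jane",NY'
# A naive splitter that ignores quoting sees 4 fields instead of 3:
echo "$row" | awk -F, '{ print NF }'   # prints 4
```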

If you are looking for a method to extract the data, then see the several methods in the other answers.

Zoredache
thanks, very helpful! you convinced me. i do have other reasons to want a quick way to get a csv dump via a command line command though. i'm holding out for that for the "accepted answer". (tho i could probably piece it together at this point from the current answers, i guess w/ a mysql script).
dreeves
I can see lots of value in having both a backup using the native method and also extracts for other purposes.
Zoredache
mk-parallel-restore can restore a CSV backup created with mk-parallel-dump, so you can get the best of both worlds. In fact, mk-parallel-restore does a better job by ensuring any triggers you have defined are restored last.
Paul Dixon