Hi,

I need to export a table from phpMyAdmin to a comma-delimited text file. I think there is some code for this, but I can't seem to find it. The one example I found doesn't work.

I need to export this table so I can organize it, sort it, and add another column to hold additional data.

How can I do this, please?

Many thanks

+2  A: 

Select the database and click the 'Export' tab. Select the table to export. Use CSV as the format.

TheGrandWazoo
+2  A: 

In phpMyAdmin, if I recall correctly, there is an Export tab. If you click on that, you can select the table(s) you want to export and the format you would like to export the data in: CSV, .zip, .gzip, etc.

jaywon
Yes, you are right. However, I forgot a small detail: I want to export the table sorted by ID. Can I do this? Currently the order is a bit random. For instance, you get 1-19, then 5325, then 22, 23, and so on...
Chris
OK, I managed to sort all the columns in Excel.
Chris
A: 

As mentioned before, you can use the Export tab for this. But if you want to export a tricky query, or one with a very particular ordering, you can do the following:

Go to the view, or write your SQL query.
At the bottom of the results page you will find an Export link.
If you click on that, you will export the query results (and their order)
in any format you choose.
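
For example, to get the table sorted by ID as asked in the comments above, you could run a query like this first and then use the Export link on its results (a minimal sketch; `table_name` and `id` are placeholders for your actual table and column):

 SELECT * FROM `table_name`
 ORDER BY id;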
Kristoffer
A: 

Execute this statement:

SELECT * FROM `table_name`
 INTO OUTFILE '/tmp/myTable.csv'  -- a file path, e.g. /tmp/myTable.csv (Linux) or C:/myTable.csv (Windows)
 FIELDS TERMINATED BY ','
 ENCLOSED BY '"'
 LINES TERMINATED BY '\n';
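
Note that INTO OUTFILE writes the file on the database server itself (not on your client machine), requires the FILE privilege, and fails if the target file already exists.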
sami
A: 

Sami's SQL statement is good. I need to export a table with 5,000,000 records into 50 files of 100,000 records each. How can I do this? I can export the first 100,000 by adding LIMIT 0, 100000 to the command, and I do get 100,000 records in the file, but if I add further statements with LIMIT 100001, 200000 etc., only the first 100,000 records are returned.

How do I break up this command so that each group of 100,000 records goes into a differently named file, without any duplicate entries?

thanks, Eddie

Eddie
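
For what it's worth, LIMIT takes an offset and a row count, not a start and end position, so the second chunk should be LIMIT 100000, 100000 rather than LIMIT 100001, 200000. Each chunk also needs its own OUTFILE name, since the statement fails if the file already exists, plus a fixed ORDER BY so the chunks don't overlap or duplicate rows. A minimal sketch (the file paths, `table_name`, and `id` are placeholders):

 SELECT * FROM `table_name`
 ORDER BY id
 LIMIT 0, 100000
 INTO OUTFILE '/tmp/myTable_part01.csv'
 FIELDS TERMINATED BY ','
 ENCLOSED BY '"'
 LINES TERMINATED BY '\n';

 SELECT * FROM `table_name`
 ORDER BY id
 LIMIT 100000, 100000
 INTO OUTFILE '/tmp/myTable_part02.csv'
 FIELDS TERMINATED BY ','
 ENCLOSED BY '"'
 LINES TERMINATED BY '\n';

Repeat with the offset increased by 100,000 for each subsequent file.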