tags:
views: 631
answers: 6

How do I insert, for example, 100,000 rows into a MySQL table with a single query?

A: 

Some tools do have an option, a kind of combo box, that lets you choose the number of rows you want to insert. But 100,000 is a large amount, so I would rather recommend a script that does it for you.

Ada
A: 

I believe the MySQL admin tools have the ability to import from CSV files and the like. Otherwise, script it.

It really depends on your data format.

Karl
A: 

You can't, as far as I know. You will need a loop.

Antony Carthy
+1  A: 

You can do a batch insert with the INSERT statement, but your query can't be bigger than (slightly less than) max_allowed_packet.

For 100k rows, depending on the size of the rows, you'll probably exceed this.
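
If you want to check or raise that limit, the variable is visible at runtime. A minimal sketch, assuming you have the privileges to change it; the exact values depend on your server configuration:

SHOW VARIABLES LIKE 'max_allowed_packet';
-- e.g. 4194304 bytes (4 MB) on older default configurations
SET GLOBAL max_allowed_packet = 67108864;  -- 64 MB; needs admin privilege, applies to new connections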

One way would be to split it up into several chunks. This is probably a good idea anyway.

Alternatively you can use LOAD DATA INFILE (or LOAD DATA LOCAL INFILE) to load from a tab-delimited (or other delimited) file. See docs for details.

LOAD DATA isn't subject to the max_allowed_packet limit.
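
A minimal sketch of that approach, assuming a tab-delimited file /tmp/rows.tsv and a hypothetical table my_table(id, col_a, col_b):

LOAD DATA LOCAL INFILE '/tmp/rows.tsv'
INTO TABLE my_table
FIELDS TERMINATED BY '\t'   -- tab-delimited fields
LINES TERMINATED BY '\n'
(id, col_a, col_b);         -- requires local_infile enabled on client and server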

MarkR
+2  A: 
insert into $table values (1, 'a', 'b'), (2, 'c', 'd'), (3, 'e', 'f');

That performs a single insertion of 3 rows. Continue extending the VALUES list as needed to reach 100,000. I do blocks of ~1,000 that way when doing ETL work, as sketched below.
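
A sketch of that chunking pattern; the table and column names here are hypothetical, and a script would generate the value lists:

-- chunk 1: up to ~1,000 value tuples per statement
insert into etl_target (id, col_a, col_b) values
  (1, 'a', 'b'),
  (2, 'c', 'd');
-- chunk 2, and so on until all 100,000 rows are in
insert into etl_target (id, col_a, col_b) values
  (1001, 'e', 'f'),
  (1002, 'g', 'h');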

If your data is sitting statically in a file, transforming it and using LOAD DATA INFILE will be the best method, but I'm guessing you're asking this because you're doing something similar.

Also note what somebody else said about the max_allowed_packet size limiting the length of your query.

Autocracy
A: 

Try using LOAD_FILE(), or convert your data into an XML file and then use the load and extract functions to load the data into the MySQL database.

This is a one-query option, and the fastest.

I am doing the same: I had files of around 1.5 GB, with millions of rows, and I have used both options in my case.
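
If the load-and-extract route above refers to MySQL's XML support, the statement would be LOAD XML (available in MySQL 5.5+); the file name and table here are hypothetical:

LOAD XML LOCAL INFILE '/tmp/rows.xml'
INTO TABLE my_table
ROWS IDENTIFIED BY '<row>';  -- each <row> element becomes one table row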

Ashok Gupta