views:

254

answers:

2

I have a text file of around 1.5GB, about 1.3 crore rows. When I load this text file into a MySQL table with the same structure,

it loads only 62 lakh records, shows 65K warnings, and finishes. What is the reason? Why does it load only half of the records?

Do I need to increase a buffer size? If yes, then how?

Or is there some other reason?

Can you guys help me??

A: 

We can't help you, because you've told us about the warnings but haven't told us what the warnings are. That's like saying "please help, I have a compiler error" without saying what the error is.

You can help yourself, by reading the warnings. The warnings tell you why the rows failed to load. No, you probably don't need to increase the buffer size; you probably have a type mismatch or a bad conversion.
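For example, you can inspect the warnings immediately after the load. A minimal sketch, assuming a tab-delimited file; `mytable` and the file path are placeholders:

```sql
-- Load the file, then immediately inspect what went wrong.
-- 'mytable' and '/path/to/data.txt' are placeholders.
LOAD DATA LOCAL INFILE '/path/to/data.txt'
INTO TABLE mytable
FIELDS TERMINATED BY '\t'
LINES TERMINATED BY '\n';

-- Total number of warnings produced by the last statement:
SHOW COUNT(*) WARNINGS;

-- The warning messages themselves (truncated values, bad conversions,
-- rows with the wrong number of columns, etc.):
SHOW WARNINGS LIMIT 20;
```

The warning messages usually name the exact column and row number that failed to convert, which is far more useful than the raw warning count.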

(Incidentally, "crore" and "lakh" are non-standard measurements in most English-speaking countries other than India, and won't be understood by most English speakers; try to use standard English measures when communicating in English.)

tpdi
OK, will be more careful next time.
Ashok Gupta
A: 

The problem is that when you're importing a lot of records into MySQL, it doesn't always behave nicely. It just zooms through the data, spewing out warnings for the records that have a problem. You can try `SHOW WARNINGS;`, but that only shows you the most recent warnings.
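The server only keeps a limited number of warnings per statement (64 by default in older MySQL versions, controlled by the `max_error_count` system variable), so one option is to raise that limit before loading; a sketch, assuming a session-level change is sufficient:

```sql
-- Raise the cap on stored warnings for this session before the load.
-- Default is 64 in older MySQL versions; 65535 is the maximum.
SET SESSION max_error_count = 65535;

-- ... run LOAD DATA INFILE here ...

-- SHOW WARNINGS can now list far more than the last handful:
SHOW WARNINGS LIMIT 100;
```

This doesn't fix the bad rows, but it lets you see enough warnings to spot the pattern (for instance, every failure being in the same column).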

Buffer size should not be an issue here, unless you have a lot of large BLOB columns. If that's the case, the error log might say something about the InnoDB log file size.
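You can check the current setting from SQL; note that in the MySQL versions of this era the log file size is not changeable at runtime, so adjusting it means editing the server configuration and restarting:

```sql
-- Check the current InnoDB log settings. These are read-only at runtime
-- in older MySQL versions; to change them, edit my.cnf (e.g. set
-- innodb_log_file_size) and restart the server.
SHOW VARIABLES LIKE 'innodb_log%';
```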

If the error log doesn't help, I would suggest that you try to split your input into several smaller chunks so that you can dig through them to find the records that are causing the problem.
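Splitting can be done with the standard `split` utility; a sketch using a small generated sample in place of the real 1.5GB file (filenames and chunk size are illustrative):

```shell
# Generate a small sample file standing in for the real 1.5GB dump.
seq 1 10 > data.txt

# Split into fixed-size chunks by line count; for a file this size you
# would use something like -l 1000000 instead of -l 4.
split -l 4 data.txt chunk_

# Each chunk (chunk_aa, chunk_ab, ...) can then be loaded on its own, e.g.:
#   mysql -e "LOAD DATA LOCAL INFILE 'chunk_aa' INTO TABLE mytable" mydb
ls chunk_*
```

Loading chunk by chunk narrows the problem down to a few hundred thousand rows at a time, which makes the offending records much easier to find.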

I don't know what crore and lakh means, so I can't comment on the numbers.

Was the file exported by the same MySQL server that you're trying to import it into, or by another version of MySQL? There could be some version incompatibilities.

Nils-Anders Nøttseter