tags:

views:

46

answers:

2

Hi

I have encountered a problem while inserting large amounts of text into my MySQL database.

Can I please get some pointers (what column types to use, and any other issues I need to know about) on how to achieve this? The code works fine with small amounts.

Thanks in advance.

Clarification: the text blocks have around 7,000 characters.

The problem I'm encountering is that the PHP app reports the data has been saved, but when I look at the database, the record hasn't been stored.

I have changed the column in question to LONGTEXT, but that doesn't seem to do the trick.

Thanks

A: 

In short, use LONGBLOB or LONGTEXT, but you had better describe the problem more precisely.
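To illustrate (the table and column names here are placeholders, not from the question) — switching the column to a larger text type, and checking the server's packet limit, since a single INSERT larger than max_allowed_packet can fail even when the column type is big enough:

```sql
-- Hypothetical table/column names, for illustration only.
-- TEXT holds up to 65,535 bytes; MEDIUMTEXT up to ~16 MB; LONGTEXT up to ~4 GB.
ALTER TABLE articles MODIFY body LONGTEXT;

-- A single statement must also fit within the server's packet limit:
SHOW VARIABLES LIKE 'max_allowed_packet';
```

Note that 7,000 characters fits comfortably in a plain TEXT column (65,535 bytes), so if the insert still fails, the cause may be elsewhere, e.g. a query error the PHP code isn't checking for.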

Yasen Zhelev
Will Do, Thanks
Stanley Ngumo
A: 

The question is a bit vague, but if you're using InnoDB and can bulk load, here are a few tips:

sort your data file into the primary key order of the target table 
(InnoDB uses clustered primary keys)

A typical LOAD DATA INFILE sequence I use:

truncate <table>;

set autocommit = 0;

load data infile <path> into table <table>...

commit;

Other optimisations you can use to boost load times:

set unique_checks = 0;
set foreign_key_checks = 0;
set sql_log_bin = 0;
split the CSV file into smaller chunks
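Putting the settings above together with the load sequence, a full bulk-load session might look like this (table name, file path, and CSV format clauses are hypothetical, adjust to your data):

```sql
-- Hypothetical names/paths; sql_log_bin requires the SUPER privilege
-- and should be skipped if replicas need to see the load.
set unique_checks = 0;
set foreign_key_checks = 0;
set sql_log_bin = 0;
set autocommit = 0;

truncate my_table;

load data infile '/tmp/chunk_01.csv'
  into table my_table
  fields terminated by ',' enclosed by '"'
  lines terminated by '\n';

commit;

-- Restore the checks once the load is done.
set unique_checks = 1;
set foreign_key_checks = 1;
```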

as far as datatypes are concerned:

use the smallest datatype you can (e.g. TINYINT UNSIGNED vs. INT)

as you're dealing with textual data, favour VARCHAR over TEXT where possible
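A hypothetical schema sketch illustrating both points (names are made up for the example):

```sql
-- Smallest datatype that covers the range; VARCHAR for short, bounded
-- strings and a TEXT type only where the content genuinely needs it.
create table article (
  id      int unsigned not null auto_increment,  -- not bigint unless needed
  status  tinyint unsigned not null,             -- 0-255 range, 1 byte vs 4
  title   varchar(255) not null,                 -- varchar over text
  body    mediumtext not null,                   -- ~7,000+ character blocks
  primary key (id)
) engine=innodb;
```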

f00