views:

41

answers:

3

I'm reading a file with upwards of ~500,000 lines, with columns separated by |, which I'm parsing and trying to insert into the database through the CLI. Is there a better way to read it in so I can use it?

Currently I'm inserting it as :

$fd = fopen($txtFileName, "r");
while (($buffer = fgets($fd)) !== false) {
    $lines[] = $buffer;
}
fclose($fd);

$i = 0;

# header keys
$t = $lines[1];

$keys = explode('|', $t);

However, I'm starting to run out of memory with the larger files. Any help would be appreciated. Thank you!

A: 

Sure.
Insert it into the database immediately; do not store it in an array.
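A minimal sketch of that idea: execute a prepared INSERT for each line as it is read, so nothing accumulates in memory. The in-memory SQLite database, the `items` table, and its columns are stand-ins for illustration only; swap in your own PDO DSN, table, and column list.

```php
<?php
// Sketch: insert each row as it is read instead of buffering the whole
// file into $lines. The DSN, table, and columns below are hypothetical.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE items (name TEXT, qty TEXT)');

// Stand-in for the real file: a small pipe-delimited sample.
$txtFileName = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($txtFileName, "name|qty\nwidget|3\ngadget|7\n");

$stmt = $pdo->prepare('INSERT INTO items (name, qty) VALUES (?, ?)');

$fd = fopen($txtFileName, 'r');
fgetcsv($fd, 0, '|');                         // skip the header row
while (($row = fgetcsv($fd, 0, '|')) !== false) {
    $stmt->execute($row);                     // memory use stays flat
}
fclose($fd);

echo $pdo->query('SELECT COUNT(*) FROM items')->fetchColumn(); // prints 2
```

`fgetcsv()` with a `'|'` separator also replaces the manual `explode('|', ...)` from the question.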

Col. Shrapnel
+3  A: 

Do you really need to have all of the data in memory before you start processing it? Generally it's best to read a line, do some processing (calculate aggregate statistics or do DB updates, for example) and then discard it and move on to the next line.

If you really need to do this with everything in memory, then I'd respectfully suggest that PHP may not be the right tool for the job if you do not have a lot of memory available on your system.

Gian
Thank you, yes. I should have done this from the beginning rather than reading it all in. Cheers!
Frederico
+1  A: 

Why not read it line by line? Read the headers first, then read a line, insert the values, read a line, insert the values etc. Then your memory requirements will be tiny.
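That headers-first pattern might look like this, keeping the `explode('|', ...)` from the question and pairing each data line with the header keys. `process_row()` is a hypothetical stand-in for the actual INSERT logic:

```php
<?php
// Sketch: read the header line once, then combine each subsequent line
// with those keys before handing it off. process_row() is hypothetical.
function process_row(array $row): void {
    // e.g. run a prepared INSERT here
}

// Stand-in for the real file: a small pipe-delimited sample.
$txtFileName = tempnam(sys_get_temp_dir(), 'demo');
file_put_contents($txtFileName, "name|qty\nwidget|3\ngadget|7\n");

$fd = fopen($txtFileName, 'r');
$keys = explode('|', rtrim(fgets($fd), "\r\n")); // header keys, read once

while (($line = fgets($fd)) !== false) {
    $values = explode('|', rtrim($line, "\r\n"));
    $row = array_combine($keys, $values);  // ['name' => 'widget', ...]
    process_row($row);                     // insert, then let $row go
}
fclose($fd);
```

Only one line is held in memory at a time, so the 500,000-line file costs no more than a 10-line one.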

Depending on how you're accessing the database you may want to batch the inserts, of course - but a batch of 100 or even 1000 will still have a relatively small memory cost, compared with loading in the whole file in one go.
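One way to batch, sketched with an in-memory SQLite database: wrap every `$batchSize` inserts in a single transaction so the database commits once per batch rather than once per row. Table and column names are illustrative only.

```php
<?php
// Sketch: commit inserts in batches of $batchSize inside transactions.
// The DSN, table, and sample data below are hypothetical.
$pdo = new PDO('sqlite::memory:');
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE items (name TEXT, qty TEXT)');
$stmt = $pdo->prepare('INSERT INTO items (name, qty) VALUES (?, ?)');

// Sample data standing in for lines already split on '|'.
$rows = [['a', '1'], ['b', '2'], ['c', '3'], ['d', '4'], ['e', '5']];

$batchSize = 2;
$pending = 0;
$pdo->beginTransaction();
foreach ($rows as $row) {
    $stmt->execute($row);
    if (++$pending === $batchSize) {       // commit every $batchSize rows
        $pdo->commit();
        $pdo->beginTransaction();
        $pending = 0;
    }
}
$pdo->commit();                            // flush the final partial batch

echo $pdo->query('SELECT COUNT(*) FROM items')->fetchColumn(); // prints 5
```

Only the current batch is uncommitted at any moment, so memory and transaction-log cost stay bounded regardless of file size.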

Jon Skeet