views:

37

answers:

5

I am trying to read in a simple flat file with 738,627 records in it; a sample of the file looks like this:

#export_date  genre_id  application_id  is_primary
#primaryKey:genre_id  application_id
#dbTypes:BIGINT  INTEGER  INTEGER  BOOLEAN
#exportMode:FULL
1276678802857  6000  281731735  0
1276678802857  6000  281826146  1
1276678802857  6000  282537230  1
1276678802857  6000  282778557  0
1276678802857  7019  377859111  0
1276678802857  7019  377877124  0
1276678802857  7019  377911623  0
1276678802857  7019  377948259  0
1276678802857  7019  377962380  0
1276678802857  7019  378051684  0
#recordsWritten:738627

The relevant part of my PHP code looks like this:

 ini_set("memory_limit","40M"); 
$fp1 = fopen('genre_application','r');
if (!$fp) {echo 'ERROR: Unable to open file.'; exit;}

$loop = 0;
while (!feof($fp1)) {
  $loop++;
    $line = stream_get_line($fp1,128,$eoldelimiter); //use 2048 if very long lines
if ($line[0] === '#') continue;  //Skip lines that start with # 
    $field[$loop] = explode ($delimiter, $line);
list($export_date, $genre_id, $application_id, $is_primary ) = explode($delimiter, $line);

// does application_id exist? 
$application_id = mysql_real_escape_string($application_id); 
$query = "SELECT * FROM jos_mt_links WHERE link_id='$application_id';"; 
$res = mysql_query($query); 
if (mysql_num_rows($res) > 0) { 
 echo $application_id . "application id" . $link_id . "\n";
} else 
{
// no, application_id doesn't exist 
echo $loop . "\n";
}

} //close reading of genre_application file
fclose($fp1);

The last output on my screen is as follows, so it's not even getting through the first 100,000 records. Is there a way to prevent the script from running out of memory?

81509
81510
81511
81512
81513
81514
81515
81516
PHP Fatal error:  Allowed memory size of 41943040 bytes exhausted (tried to allocate 14 bytes) in /var/www/vhosts/smartphonesoft.com/httpdocs/fred/xmlfeed/test/text_to_mysql.php on line 156
+2  A: 

You seem to be storing every line in an array initialized outside the script you quote:

 $field[$loop] = explode ($delimiter, $line);

Why? This array is bound to grow with every iteration of the loop until the 40 MB limit is hit.

I think it'd work if you removed that, or changed it to a mere $field = ....
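
A minimal sketch of the loop without the accumulating array (assuming $delimiter and $eoldelimiter are set elsewhere in your script, as they appear to be):

    $loop = 0;
    while (!feof($fp1)) {
        $loop++;
        $line = stream_get_line($fp1, 128, $eoldelimiter);
        if ($line === false || $line[0] === '#') continue; // skip comment/header lines

        // explode() once per line; the variables are simply overwritten on the
        // next iteration instead of piling up in $field[$loop], so memory stays flat.
        list($export_date, $genre_id, $application_id, $is_primary) = explode($delimiter, $line);

        // ... run the per-record lookup query exactly as before ...
    }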

Pekka
+1 Beaten to the punch
Mark Baker
A: 

You're populating $field[$loop] for every line read from the file, so this array grows with every iteration of the loop. Do you actually need to keep all of that in memory?

Mark Baker
A: 

You kind of answer your own question: you are trying to store 738,627 arrays of text in memory. Is there a reason you keep all the exploded lines in $field?

R. Hill
A: 

You can raise the amount of available memory like this: ini_set("memory_limit","64M"); ... just put that at the top of your PHP file. If it's still not enough memory, bump it up to 128 or 256 megs.

In your specific case, though, you're not using the available memory efficiently... see Pekka's answer.
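
If you do raise the limit, it's worth logging the peak usage once the script finishes so you can pick a sensible value rather than guessing. memory_get_peak_usage() is a standard PHP function; the 256M figure below is only an example:

    ini_set('memory_limit', '256M');   // raise the ceiling for this one script

    // ... process the file ...

    // Report how much memory was actually needed, to choose a realistic limit.
    echo 'Peak memory: ' . memory_get_peak_usage(true) . " bytes\n";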

no
A: 

Where you have...

$res = mysql_query($query);

Shouldn't you have something like the following, so the previous result set is freed before the next query is run?

if (isset ($res))
{
    mysql_free_result ($res);
}

$res = mysql_query($query);
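
A rough sketch of how that would sit inside the question's loop (keeping everything else as posted):

    while (!feof($fp1)) {
        $line = stream_get_line($fp1, 128, $eoldelimiter);
        if ($line === false || $line[0] === '#') continue;
        list($export_date, $genre_id, $application_id, $is_primary) = explode($delimiter, $line);

        // Free the result set from the previous iteration before issuing the
        // next query, so result sets don't accumulate across 738,627 rows.
        if (isset($res)) {
            mysql_free_result($res);
        }
        $res = mysql_query("SELECT * FROM jos_mt_links WHERE link_id='"
            . mysql_real_escape_string($application_id) . "'");
        // ... act on mysql_num_rows($res) as before ...
    }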
Brian Hooper