I am trying to read in a simple flat file with 738,627 records in it. A sample of the file looks like this:
#export_date  genre_id  application_id  is_primary
#primaryKey: genre_id  application_id
#dbTypes: BIGINT  INTEGER  INTEGER  BOOLEAN
#exportMode:FULL
127667880285760002817317350
127667880285760002818261461
127667880285760002825372301
127667880285760002827785570
127667880285770193778591110
127667880285770193778771240
127667880285770193779116230
127667880285770193779482590
127667880285770193779623800
127667880285770193780516840
#recordsWritten:738627
My relevant PHP code looks like this:
ini_set("memory_limit","40M");
$fp1 = fopen('genre_application','r');
if (!$fp1) {echo 'ERROR: Unable to open file.'; exit;}
$loop = 0;
while (!feof($fp1)) {
    $loop++;
    $line = stream_get_line($fp1, 128, $eoldelimiter); // use 2048 if very long lines
    if ($line === false || $line === '') continue; // skip EOF result and blank lines
    if ($line[0] === '#') continue; // skip lines that start with #
    $field[$loop] = explode($delimiter, $line);
    list($export_date, $genre_id, $application_id, $is_primary) = explode($delimiter, $line);
    // does application_id exist?
    $application_id = mysql_real_escape_string($application_id);
    $query = "SELECT * FROM jos_mt_links WHERE link_id='$application_id';";
    $res = mysql_query($query);
    if (mysql_num_rows($res) > 0) {
        echo $application_id . " application id\n";
    } else {
        // no, application_id doesn't exist
        echo $loop . "\n";
    }
} // close reading of genre_application file
fclose($fp1);
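For reference, here is a minimal sketch of the same read loop that does not keep a script-lifetime array of every parsed row (the growing $field array is my guess at what eats the memory). It assumes the file is tab-delimited, as the four-column #dbTypes header suggests; the temp file and sample rows are stand-ins for the real genre_application file:

```php
<?php
// Minimal sketch, assuming a tab-delimited file; writes sample data so it runs standalone.
$file = tempnam(sys_get_temp_dir(), 'genre');
file_put_contents($file,
    "#export_date\tgenre_id\tapplication_id\tis_primary\n" .
    "1276678802\t8576000\t281731735\t0\n" .
    "1276678802\t8576000\t281826146\t1\n");

$fp1 = fopen($file, 'r');
if (!$fp1) { echo 'ERROR: Unable to open file.'; exit; }

$loop = 0;
$ids  = array();
while (($line = fgets($fp1)) !== false) {
    $loop++;
    if ($line === '' || $line[0] === '#') continue; // skip comment/header lines
    // Parse the line into locals that are overwritten on the next pass --
    // no $field[$loop] = ... accumulation, so memory stays flat.
    list($export_date, $genre_id, $application_id, $is_primary) =
        explode("\t", rtrim($line, "\n"));
    $ids[] = $application_id; // stand-in for the per-row database lookup
}
fclose($fp1);
unlink($file);
echo implode("\n", $ids), "\n";
```

The per-row work (here just collecting the id) would be the mysql lookup; the point is only that each line's data goes out of scope once the row has been handled.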
The last output on my screen is as follows, so it's not even getting through the first 100,000 records. Is there a way to prevent the script from running out of memory?
81509
81510
81511
81512
81513
81514
81515
81516
PHP Fatal error: Allowed memory size of 41943040 bytes exhausted (tried to allocate 14 bytes) in /var/www/vhosts/smartphonesoft.com/httpdocs/fred/xmlfeed/test/text_to_mysql.php on line 156
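My assumption is that the $field array, which holds the exploded copy of every line for the whole run of the script, is what exhausts the limit. A tiny standalone demonstration of how that pattern grows memory with line count:

```php
<?php
// Demonstration only: keeping every exploded row in a script-lifetime array
// makes memory usage grow linearly with the number of lines read.
$field  = array();
$before = memory_get_usage();
for ($i = 0; $i < 100000; $i++) {
    // Same shape as $field[$loop] = explode($delimiter, $line);
    $field[$i] = explode("\t", "1276678802\t8576000\t281731735\t0");
}
$after  = memory_get_usage();
echo "holding 100000 rows grew memory by ",
     round(($after - $before) / 1048576, 1), " MB\n";
```

Dropping that array (or calling unset() on entries once they are processed) should keep usage constant regardless of file size.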