I am parsing an XML file with PHP's simplexml_load_file() and inserting the extracted data into a MySQL MyISAM table. The problem is that the code works most of the time, but fails with a 500 internal server error here and there. The XML file I am trying to process is big (around 50 MB), and it yields around 25,000 rows in the table when the script succeeds. When I get the error, the script has inserted anything from a few rows to a few thousand rows before dying.
Anyway, here is the code. I would appreciate any insight into what is going wrong, or an alternative approach, maybe batch processing or something like that (I've put a rough, untested sketch of what I mean after the code).
<?php
include ("myconnection.php");

// clear out the old rows before re-importing
mysql_query("DELETE FROM mytable") or die(mysql_error());
echo "data deleted, now insert: <br /><br />";

// the path to the feed (a local copy of the file):
$feed = 'cachy/copy.xml';
echo "myfeed: ".$feed;
echo "<br />";

// Load the feed (simplexml_load_file() returns false on failure)
$xml = simplexml_load_file($feed);
if ($xml !== false)
{
    echo "<br />Success! feed available!<br /><br />";
}
else
{
    echo "<br />Couldn't fetch the content<br /><br />";
    die;
}
//ini_set("memory_limit","256M");
//set_time_limit(120);
function clean($input)
{
    $input = trim($input);
    $input = htmlentities($input, ENT_QUOTES);
    // mysql_escape_string() is deprecated; use the connection-aware version
    $input = mysql_real_escape_string($input);
    // note: escapeshellcmd() is meant for shell commands, not SQL data,
    // and will backslash-escape characters in whatever gets stored
    $input = escapeshellcmd($input);
    return $input;
}
// insert data from the feed; note that only product_name goes through
// clean() -- the other fields are interpolated into the query unescaped
foreach ($xml->xpath('//product') as $products)
{
    $product_name = clean($products->product_name);
    mysql_unbuffered_query("INSERT INTO mytable (onsaledate, onsaletime, eventdate, eventtime, buyat_short_deeplink_url, product_name, level1, level2, VenueName, VenueDMAID) VALUES (\"$products->OnsaleDate\",\"$products->OnsaleTime\",\"$products->EventDate\",\"$products->EventTime\",\"$products->buyat_short_deeplink_url\",\"$product_name\",\"$products->level1\",\"$products->level2\",\"$products->VenueName\",\"$products->VenueDMAID\")") or die(mysql_error());
}
mysql_close($myConnection);
echo "records inserted my man!";
die;
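For reference, this is roughly what I had in mind by batch processing (an untested sketch, assuming the feed is just a flat list of sibling <product> elements): stream the file with XMLReader so the whole 50 MB is never in memory at once, and group rows into multi-row INSERT statements instead of one query per row. The 500-row batch size is a guess, and I've left out the htmlentities() step for brevity.

<?php
// Rough, untested sketch: stream the feed with XMLReader so the whole
// 50 MB file is never in memory at once, and insert rows in batches.
include ("myconnection.php");

$batchSize = 500;     // rows per INSERT -- just a guess, tune as needed
$batch     = array();

// escape one field for the query (uses the connection from myconnection.php)
function esc($value)
{
    return mysql_real_escape_string(trim((string) $value));
}

// send one multi-row INSERT for the accumulated rows
function insert_batch($rows)
{
    if (count($rows) == 0) return;
    mysql_query("INSERT INTO mytable (onsaledate, onsaletime, eventdate, eventtime, buyat_short_deeplink_url, product_name, level1, level2, VenueName, VenueDMAID) VALUES " . implode(",", $rows)) or die(mysql_error());
}

$reader = new XMLReader();
$reader->open('cachy/copy.xml') or die('could not open feed');

// skip ahead to the first <product> element
while ($reader->read() && $reader->name !== 'product');

while ($reader->name === 'product') {
    // expand just this one <product> node into a SimpleXML object
    $p = simplexml_load_string($reader->readOuterXml());

    $batch[] = sprintf("('%s','%s','%s','%s','%s','%s','%s','%s','%s','%s')",
        esc($p->OnsaleDate), esc($p->OnsaleTime), esc($p->EventDate),
        esc($p->EventTime), esc($p->buyat_short_deeplink_url),
        esc($p->product_name), esc($p->level1), esc($p->level2),
        esc($p->VenueName), esc($p->VenueDMAID));

    if (count($batch) >= $batchSize) {
        insert_batch($batch);
        $batch = array();
    }

    $reader->next('product');   // jump straight to the next sibling <product>
}

insert_batch($batch);           // flush any leftover rows
$reader->close();
mysql_close($myConnection);
echo "done";

Would something along these lines be more reliable, or is there a better way?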
Thanks in advance for your help!