Hi there, I run a price comparison data engine, and as we collect so much data I'm running into pretty serious performance issues. We generate one XML file per product, and within the product data is an entry for each online shop we grab data from, with its price, link, description, etc.
We have multiple feed parsers/scrapers which collect the price information for each product. The product data is uploaded to a MySQL DB, and a PHP script on the server then generates the XML for every product.
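For reference, the per-product generation is roughly like this (a simplified sketch; the real schema is normalised across several tables, and the table, column and path names here are illustrative):

<?php
// Simplified sketch of the current per-product XML generation via PHP DOM.
// Table, column and path names are illustrative, not the real schema.
$productId = 123; // example product id

$pdo  = new PDO('mysql:host=localhost;dbname=prices', 'user', 'pass');
$stmt = $pdo->prepare('SELECT shop_name, price, url, description FROM offers WHERE product_id = ?');
$stmt->execute([$productId]);

$doc  = new DOMDocument('1.0', 'UTF-8');
$root = $doc->createElement('product');
$root->setAttribute('id', (string) $productId);
$doc->appendChild($root);

foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $offer) {
    $shop = $doc->createElement('shop');
    $shop->setAttribute('name', $offer['shop_name']);

    foreach (['price' => $offer['price'], 'link' => $offer['url'], 'description' => $offer['description']] as $tag => $value) {
        $el = $doc->createElement($tag);
        $el->appendChild($doc->createTextNode((string) $value)); // text node escapes &, < etc.
        $shop->appendChild($el);
    }
    $root->appendChild($shop);
}

$doc->save("/var/www/xml/product-$productId.xml");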
The problem we're running into is that for 10,000 products, the XML generation takes almost 25 minutes! The DB is completely normalised and I'm producing the XML via PHP DOM.
The XML generation process doesn't take into account whether any of the data has actually changed, and this is the problem I'm facing. What is the most efficient way of skipping generation for XML files whose data hasn't changed?
Do I use a flag system? But doesn't that result in more DB lookups, which may increase the DB overhead? The current queries only take ~0.1 seconds per product (though I realise that alone works out to roughly 17 minutes across 10,000 products).
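Concretely, the flag approach I'm considering is something like this (the last_updated column and the regenerateXml() helper are hypothetical):

<?php
// Sketch of the flag idea: scrapers bump products.last_updated whenever a price
// changes, and the generator skips any product whose XML on disk is already newer.
// Column, path and helper names are placeholders.
$pdo  = new PDO('mysql:host=localhost;dbname=prices', 'user', 'pass');
$rows = $pdo->query('SELECT id, UNIX_TIMESTAMP(last_updated) AS updated FROM products');

foreach ($rows as $row) {
    $file = "/var/www/xml/product-{$row['id']}.xml";

    // Skip regeneration when the file on disk already reflects the latest data.
    if (is_file($file) && filemtime($file) >= (int) $row['updated']) {
        continue;
    }
    regenerateXml($pdo, (int) $row['id'], $file); // the per-product DOM build above
}

This swaps per-product change checks for a single query over the whole products table, but I'm not sure whether that's the right trade-off.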
Also, what happens if only one price for one shop changes within an XML file? It seems wasteful to rewrite the entire file because of that, but surely a preg_replace would be just as time-consuming?
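For example, the kind of in-place patch I mean would be something like this (just a sketch against the simplified layout above; the shop name and new price are made up):

<?php
// Sketch of patching one shop's price in place with preg_replace instead of
// regenerating the whole document. Assumes the simplified <shop name="...">
// layout from the earlier example; shop name and price are illustrative.
$file = '/var/www/xml/product-123.xml';
$xml  = file_get_contents($file);

$shop    = preg_quote('ExampleShop', '/');
$patched = preg_replace(
    "/(<shop name=\"$shop\">.*?<price>)[^<]*(<\\/price>)/s",
    '${1}19.99${2}',
    $xml,
    1 // only the first match
);

file_put_contents($file, $patched);

Though as far as I can tell this still reads and rewrites the whole file, so I doubt it saves much over just regenerating via DOM.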
Thanks for your time, really appreciated!