Hi,
I'm dealing with large XML files (several megabytes) on which I have to run various kinds of checks. However, I have a problem: memory and time usage grow very quickly. I've tested it like this:
$xml = new SimpleXMLElement($string);
$sum_of_elements = 0.0;
foreach ($xml->xpath('//Amt') as $amt) {
    $sum_of_elements += (double)$amt;
}
Measuring with the microtime() and memory_get_usage() functions, I get the following results when running this code:
- 5 MB file (7480 Amt elements):
- execution time 0.69 s
- memory usage grows from 10.25 MB to 29.75 MB
That's still quite OK. But with a slightly bigger file, memory and time usage grow much more:
- 6 MB file (8976 Amt elements):
- execution time 8.53 s
- memory usage grows from 10.25 MB to 99.25 MB
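For reference, the measurement is roughly this (a sketch; $string holds the file contents already read into memory):

// Wrap the loop with microtime() and memory_get_usage()
$start_time = microtime(true);
$start_mem  = memory_get_usage();

$xml = new SimpleXMLElement($string);
$sum_of_elements = 0.0;
foreach ($xml->xpath('//Amt') as $amt) {
    $sum_of_elements += (double)$amt;
}

printf("time: %.2f s\n", microtime(true) - $start_time);
printf("memory: %.2f MB -> %.2f MB\n",
    $start_mem / 1048576, memory_get_usage() / 1048576);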
The problem seems to be in looping over the result set. I've also tried a for loop instead of foreach, but with no difference. Without the loop, memory usage does not grow nearly as much.
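In case it's relevant, I've also started looking at a streaming approach with XMLReader, which reads the file node by node instead of building the whole tree in memory (an untested sketch; 'file.xml' is a placeholder path):

// Stream the document and sum Amt elements without loading it all at once
$reader = new XMLReader();
$reader->open('file.xml');

$sum_of_elements = 0.0;
while ($reader->read()) {
    if ($reader->nodeType === XMLReader::ELEMENT
            && $reader->localName === 'Amt') {
        // readString() returns the text content of the current node
        $sum_of_elements += (double)$reader->readString();
    }
}
$reader->close();

But I'd still like to understand why the SimpleXML version behaves the way it does.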
Any idea where the problem could be?