Hi everybody,
in a PHP program, I sequentially read a bunch of files (with file_get_contents), gzdecode them, json_decode the result, analyze the contents, throw most of it away, and store about 1% in an array.
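Roughly, the loop looks like this (a simplified sketch, not my actual code; $filenames, extractRelevantPart() and $results are placeholder names):

    <?php
    $results = array();

    foreach ($filenames as $filename) {
        $raw     = file_get_contents($filename);  // read the gzipped file
        $json    = gzdecode($raw);                // decompress
        $decoded = json_decode($json, true);      // decode into an associative array

        // keep only the ~1% that is actually needed
        $results[$filename] = extractRelevantPart($decoded);

        // drop references to the large intermediate strings/arrays
        unset($raw, $json, $decoded);
    }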
Unfortunately, with each iteration (I iterate over an array containing the filenames), some memory seems to be lost (according to memory_get_peak_usage, about 2-10 MB each time). I have double- and triple-checked my code and I am not storing unneeded data in the loop (the data I keep hardly exceeds 10 MB overall), but I am frequently overwriting data (strings in an array). Apparently, PHP does not free the memory correctly, so it uses more and more RAM until it hits the limit.
Is there any way to force a garbage collection? Or, at least, to find out where the memory is going?
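I have seen gc_collect_cycles() mentioned; would something along these lines be the right approach? (Just a rough sketch of what I have in mind; I'm not sure it even helps here.)

    <?php
    foreach ($filenames as $filename) {
        // ... read / gzdecode / json_decode / analyze as above ...

        $freed = gc_collect_cycles();  // force collection of cyclic garbage (PHP 5.3+)
        printf(
            "%s: freed %d cycles, usage %.1f MB, peak %.1f MB\n",
            $filename,
            $freed,
            memory_get_usage(true) / 1048576,
            memory_get_peak_usage(true) / 1048576
        );
    }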
Thanks in advance, Dmitri