I am writing a PHP script (to be run from the command line) to parse hundreds of large JSON files, all of which live in a single directory. Initially I read the files one by one and parsed them in the same script, but it quickly ran out of memory. The other way I can think of is to have two scripts: one reads the directory and gets the list of files, then calls a second script once per file, passing the file name to be parsed as an argument. Is there any other way to do it?

Also, is there any way to parallelize this?
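
For reference, here is roughly what I had in mind for the two-script approach (a sketch: the worker name `parse_one.php` and the directory path are made up):

    <?php
    // driver.php - sequential version: each file is parsed in its own
    // PHP process, so everything json_decode allocates is returned to
    // the OS when the worker exits.
    foreach (glob('/path/to/json/*.json') as $file) {
        passthru('php parse_one.php ' . escapeshellarg($file), $status);
        if ($status !== 0) {
            fwrite(STDERR, "worker failed on $file\n");
        }
    }

And a crude parallel variant of the same idea (assumes the `pcntl` extension is available, CLI only):

    <?php
    // driver_parallel.php - keep up to 4 workers running at once.
    $maxWorkers = 4;
    $running = 0;
    foreach (glob('/path/to/json/*.json') as $file) {
        if ($running >= $maxWorkers) {
            pcntl_wait($status); // block until one worker exits
            $running--;
        }
        $pid = pcntl_fork();
        if ($pid === 0) {
            // Child: replace this process with the worker.
            // PHP_BINARY (PHP >= 5.4) is the path to the CLI binary.
            pcntl_exec(PHP_BINARY, ['parse_one.php', $file]);
            exit(1); // reached only if pcntl_exec fails
        }
        $running++;
    }
    while ($running-- > 0) {
        pcntl_wait($status); // reap the remaining workers
    }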

+1  A: 

Try unsetting variables once you are done with them; that should free the memory allocated to them.

Edit: better yet, from what I've read, assigning null to those variables frees the memory faster and more efficiently:

$myNoLongerUsedVar = null;
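
For instance, applied to a loop over the files it would look something like this (a sketch; the path and the processing step are placeholders):

    foreach (glob('/path/to/json/*.json') as $file) {
        $json = file_get_contents($file);
        $data = json_decode($json, true);
        $json = null;        // release the raw string as soon as it is decoded
        // ... work with $data here ...
        $data = null;        // release the decoded structure before the next file
        gc_collect_cycles(); // PHP >= 5.3: reclaim memory held in reference cycles
    }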
aularon
I tried both, but it is still running out of memory.
@user187809 When does your script give this error? When calling `json_decode`, or while you are looping over and working with the resulting array? Also, how large is your `memory_limit`, and how big are the files you are trying to read?
aularon