views: 138
answers: 2
Hello,

I need to preload a large amount of database data into an array in a PHP web page (which requires raising the memory limit to 1024 MB via ini_set()) in order to export the database in CSV format. I know this is not always the right choice, but the fact is that I need to do it, so here is the main technical question:

How can I be sure that all the preloaded data is freed once my CSV generation is done, so that server performance stays optimal?
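For context, the pattern I'm describing looks roughly like this (the connection details and table name are just placeholders):

<?php
// Raise the memory limit so the whole result set fits in an array.
ini_set('memory_limit', '1024M');

// Placeholder connection details and table name.
$pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$rows = $pdo->query('SELECT * FROM big_table')->fetchAll(PDO::FETCH_ASSOC);

// Write the preloaded array out as CSV.
$out = fopen('php://output', 'w');
foreach ($rows as $row) {
    fputcsv($out, $row);
}
fclose($out);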

+1  A: 

Perhaps unset(). But actually, the memory is freed as the PHP script ends.

Havenard
So if the user clicks a new link on that page, the memory will automatically be freed, right?
Amadeus45
As soon as the page is loaded, the PHP script dies and the PHP garbage collector frees all the memory it used. Keep in mind that your script is not necessarily still active while the user is viewing the page, unless you've programmed it to be.
Havenard
+3  A: 

I know you say that you need to do it the way you described, but I'll throw out this option anyway. Have you considered using SELECT INTO OUTFILE syntax? MySQL can generate the CSV file for you (if you're not using MySQL, there's probably similar functionality you can find):

SELECT * INTO OUTFILE '/tmp/result.txt'
  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
  LINES TERMINATED BY '\n'
  FROM test_table;

You could do an fpassthru() after that if you wanted to dump out the file.
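If you go that route, a minimal sketch might look like this (the path matches the query above; the download headers are optional):

<?php
// Send the file MySQL just wrote straight to the browser as a CSV download.
header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="result.csv"');

$fp = fopen('/tmp/result.txt', 'rb');
fpassthru($fp);
fclose($fp);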

If you can't get around loading your entire data set into memory, then you can explicitly unset() your array to be sure that it is cleared once you are done with it. As Havenard mentioned, however, if you are loading this from a web page, your web server will automatically reclaim the memory when the PHP script finishes executing, as the web thread shuts down the PHP environment.
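A rough sketch of that explicit cleanup (here $data just stands in for the preloaded result set):

<?php
// $data stands in for the hypothetical array of preloaded rows.
$data = range(1, 100000);

echo memory_get_usage(), PHP_EOL;   // memory used while the array is alive

unset($data);          // explicitly release the array once the CSV is written
gc_collect_cycles();   // optionally force a collection cycle (PHP 5.3+)

echo memory_get_usage(), PHP_EOL;   // noticeably lower after the unset()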

zombat
Thanks for your suggestion.
Amadeus45