tags:
views: 43
answers: 2

I am exporting data to CSV. After 25000 records, memory is exhausted. Increasing the memory limit works as a stopgap.

If I have 100000 rows, can I write it as 4 processes: write the first 25000 rows, then the next 25000, and so on?

Is this possible in a CSV export? Will it have any advantage, or is it the same as exporting the whole data in one go?

Does multiprocessing or parallel processing have any advantage here?

A: 

The problem is that if you fork the process, you have to worry about cleaning up its children, and you're still using the same total amount of memory. Ultimately you're limited by the machine's memory, but if you don't want to conditionally increase PHP's memory_limit based on the number of iterations, then forking may be the way to go.

If you compiled PHP with --enable-pcntl and --enable-sigchild, you're good to go - otherwise, you won't be able to fork the process. One workaround would be to have a master script that delegates the execution of other scripts, but if you're using backticks, shell_exec(), or exec() (or anything similar), it starts to get sloppy and you'll have to take a lot of steps to ensure that your commands cannot be tainted/exploited.
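
A minimal sketch of that forking approach, assuming pcntl is available: each child exports one 25000-row slice to its own file (exportChunk() is a hypothetical helper here), and the parent reaps every child with pcntl_waitpid() so none are left as zombies.

$chunks = 4;          // 100000 rows split into 4 slices of 25000
$children = [];

for ($i = 0; $i < $chunks; $i++) {
    $pid = pcntl_fork();
    if ($pid === -1) {
        die('Could not fork');
    } elseif ($pid === 0) {
        // Child: export only its own slice, then exit
        exportChunk($i * 25000, 25000, "export_part_$i.csv"); // hypothetical helper
        exit(0);
    }
    $children[] = $pid; // Parent: remember the child PID
}

// Parent: wait for every child so none of them become zombies
foreach ($children as $pid) {
    pcntl_waitpid($pid, $status);
}
// The four part files can then be concatenated into one CSV.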

mway
pcntl is enabled. What is that sigchild? I didn't find that; I will look.
zod
When you're using signal handlers, `--enable-sigchild` allows the parent to receive the SIGCHLD signal, e.g. telling it that the child has exited, so your processes don't become zombies.
mway
A: 

Well, this depends on how you're generating the CSV.

Assuming that you're doing it as the result of a database query (or some other import), you could try streaming instead of building and then returning.

Basically, you turn off output buffering first:

while (ob_get_level() > 0) {
    ob_end_flush(); // Flush and disable each nested output buffer
}

Then, when you're building it, echo it out row by row:

foreach ($rows as $row) {
    // Quote each field manually; this assumes the values contain no double quotes
    echo '"' . $row[0] . '","' . $row[1] . '"' . "\n";
}

That way, you're not using too much memory in PHP.
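
If you'd rather not quote the fields by hand, a variant of the same idea (just a sketch, assuming $rows is an iterable result set) is to open php://output and let fputcsv() do the escaping while still writing one row at a time:

$out = fopen('php://output', 'w'); // Write straight to the response body
foreach ($rows as $row) {
    fputcsv($out, $row);           // fputcsv quotes/escapes each field for you
}
fclose($out);

Memory stays flat because each row is sent as soon as it's formatted, rather than being appended to a growing string.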

You could also write the data to a temporary file, and then stream that file back:

$file = tmpfile();     // Temporary file; removed automatically on fclose()
foreach ($rows as $row) {
    fputcsv($file, $row);
}
rewind($file);         // Go back to the start before reading it out
fpassthru($file);      // Sends the file to the client
fclose($file);
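
With either approach, if the CSV is meant to be downloaded, the response headers have to be sent before the first CSV byte goes out (a sketch; the filename is just an example):

header('Content-Type: text/csv');
header('Content-Disposition: attachment; filename="export.csv"');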

But again, it all depends on what you're doing. It sounds to me like you're building the CSV in a string (which is eating all your memory). That's why I suggested these two options...

ircmaxell