Hi all,

I have a report page that deals with ~700k records from a database table. I can display this on a web page using paging to break up the results. However, my PDF/CSV export functions rely on processing the entire data set at once, and I'm hitting my 256MB memory limit at around 250k rows.

I don't feel comfortable increasing the memory limit, and I haven't got the ability to use MySQL's SELECT ... INTO OUTFILE to just serve a pre-generated CSV. The only way I can see to build the export in Drupal is something like:

$form = array();
$table_headers = array();
$table_rows = array();

// Fetch the entire dataset and build one giant array of rows.
$data = db_query("a query to get the whole dataset");
while ($row = db_fetch_object($data)) {
  $table_rows[] = $row->some_attribute;
}

$form['report'] = array('#value' => theme('table', $table_headers, $table_rows));
return $form;

Is there a way of getting around what is essentially appending to a giant array of arrays? At the moment I don't see how I can offer any meaningful report pages with Drupal due to this.

Thanks

A: 

At the moment you store everything in the $table_rows array.

Can't you flush parts of the report while you're reading it from the database (e.g. every few thousand rows) to free some of the memory? I can't see why the whole CSV has to be written in one go.

middus
+1  A: 

If you are generating a PDF or CSV you shouldn't use Drupal's native theme functions anyway. What about writing each row to the output inside your while loop, as in the sketch below? That way only one record is in memory at a given time.
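
A minimal sketch of that idea for the CSV case, assuming a Drupal 6 page callback; the function name and the some_attribute column are placeholders:

// Hypothetical export callback: stream the CSV straight to the browser
// instead of building a giant array first.
function mymodule_export_csv() {
  drupal_set_header('Content-Type: text/csv');
  drupal_set_header('Content-Disposition: attachment; filename="report.csv"');

  $out = fopen('php://output', 'w');
  fputcsv($out, array('Some attribute')); // header row

  $result = db_query("a query to get the whole dataset");
  while ($row = db_fetch_object($result)) {
    // Only the current row is held in PHP's memory at any time.
    fputcsv($out, array($row->some_attribute));
  }
  fclose($out);
  exit;
}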

chiborg
Yeah, that's what I mean: write to the output inside your while loop.
middus
A: 

I don't feel comfortable increasing the memory limit

Increasing the memory limit doesn't mean that every PHP process will use that amount of memory. However, you could exec the CLI version of PHP with a custom memory limit - but that's not the right solution either....

and I haven't got the ability to use MySQL's SELECT ... INTO OUTFILE to just serve a pre-generated CSV

Then don't save it all in an array - write each line to the output buffer as you fetch it from the database (IIRC the entire MySQL result set is buffered outside the limited PHP memory). Or write it directly to a file, then do a redirect when the file is completed and closed - see the sketch below.
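
A minimal sketch of the file-then-redirect variant, assuming Drupal 6, a public files directory, and a placeholder column name:

// Hypothetical export callback: write the CSV into Drupal's files directory
// one row at a time, then redirect the browser to the finished file.
function mymodule_export_csv_file() {
  $path = file_directory_path() . '/report.csv';
  $fp = fopen($path, 'w');
  fputcsv($fp, array('Some attribute')); // header row

  $result = db_query("a query to get the whole dataset");
  while ($row = db_fetch_object($result)) {
    fputcsv($fp, array($row->some_attribute)); // one row at a time
  }
  fclose($fp);

  // Redirect to the completed file (assumes public downloads).
  drupal_goto(file_create_url($path));
}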

C.

symcbean
Some good suggestions here, thanks. I didn't realise PHP copies arrays by value each time - I'll see if I can combine by-reference assignment with buffering to the filesystem to get what I need.
jo
+2  A: 

With such a large dataset, I would use Drupal's Batch API, which allows time-intensive operations to be broken into batches. It is also better for users because it will give them a progress bar with some indication of how long the operation will take.

Start the batch operation by opening a temporary file, then append new records to it on each batch pass until done. The final page can do the final processing to deliver the data as CSV or convert it to PDF. You'd probably want to add some cleanup afterwards as well. A rough sketch follows the link below.

http://api.drupal.org/api/group/batch/6
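
A rough Drupal 6 sketch of that pattern; the function names, the {mytable} query, and the 500-row chunk size are assumptions rather than a drop-in solution:

function mymodule_start_export() {
  $batch = array(
    'title' => t('Building report'),
    'operations' => array(
      array('mymodule_export_batch_step', array()),
    ),
    'finished' => 'mymodule_export_batch_finished',
  );
  batch_set($batch);
  batch_process('mymodule/report/download');
}

function mymodule_export_batch_step(&$context) {
  if (!isset($context['sandbox']['progress'])) {
    $context['sandbox']['progress'] = 0;
    $context['sandbox']['max'] = db_result(db_query("SELECT COUNT(*) FROM {mytable}"));
    $context['sandbox']['file'] = file_directory_temp() . '/report.csv';
  }

  // Append the next chunk of rows to the temporary file.
  $fp = fopen($context['sandbox']['file'], 'a');
  $result = db_query_range("SELECT some_attribute FROM {mytable}", $context['sandbox']['progress'], 500);
  while ($row = db_fetch_object($result)) {
    fputcsv($fp, array($row->some_attribute));
    $context['sandbox']['progress']++;
  }
  fclose($fp);

  // Tell the Batch API how far along we are (1 means finished).
  $context['finished'] = $context['sandbox']['progress'] / $context['sandbox']['max'];
}

function mymodule_export_batch_finished($success, $results, $operations) {
  // Final processing goes here: serve the CSV, or convert it to PDF, then clean up.
}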

Nathan Rambeck
A: 

You should add paging to that query with pager_query() and break the results into 50-100 per page (see the short example after the link below). That should help a lot. You say you want to use paging, but I don't see it in the code.

Check this out: http://api.drupal.org/api/function/pager_query/6
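
For reference, a minimal pager_query() usage in Drupal 6 might look like the following; the table and column names are placeholders:

// Page through the result set, 50 rows at a time.
$result = pager_query("SELECT some_attribute FROM {mytable}", 50);
while ($row = db_fetch_object($result)) {
  $table_rows[] = check_plain($row->some_attribute);
}
$output  = theme('table', $table_headers, $table_rows);
$output .= theme('pager', NULL, 50);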

Kevin
A: 

Another thing to keep in mind is that in PHP 5 (before 5.3), assigning an array to a new variable or passing it to a function copies the array and does not create a reference. You may be creating many copies of the same data, and if none are unset or go out of scope they cannot be garbage collected to free up memory. Where possible, using references to perform operations on the original array can save memory:

function doSomething($arg) {
  $internal = array();
  foreach ($arg as $var) {
    // A new copy is created here internally: 3 copies of the data exist.
    $internal[] = doSomethingToValue($var);
  }
  return $internal;
  // $arg goes out of scope and can be garbage collected: 2 copies exist.
}

$var = array();
// A copy is passed to the function: 2 copies of the data exist.
$var2 = doSomething($var);
// $var2 will be a reference to the same object in memory as $internal,
// so only 2 copies still exist.

If $var is set to the return value of the function, the old value can be garbage collected, but not until after the assignment, so more memory will still be needed for a brief time:

function doSomething(&$arg) {
  foreach ($arg as &$var) {
    // Operations are performed on the original array data:
    // only two copies of an array element exist at once, not the whole array.
    $var = doSomethingToValue($var);
  }
  unset($var); // not needed here, but good practice in large functions
}

$var = array();
// A reference is passed to the function: 1 copy of the data exists.
doSomething($var);
GApple
A: 

The way I approach such huge reports is to generate them outside the web request with the PHP CLI / Java / C++ / C# (e.g. run from a crontab), combined with MySQL's unbuffered query option.
Once the file/report has been written to disk, you can just give a link to it...
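
A minimal sketch of that approach as a standalone PHP CLI script, assuming the old mysql extension and placeholder credentials, paths, and column names:

// Standalone CLI script (run from cron): stream rows with an unbuffered query
// and write them straight to a CSV file on disk.
$link = mysql_connect('localhost', 'user', 'password'); // placeholder credentials
mysql_select_db('mydb', $link);

$fp = fopen('/var/www/files/report.csv', 'w'); // placeholder path
$result = mysql_unbuffered_query('SELECT some_attribute FROM mytable', $link);
while ($row = mysql_fetch_object($result)) {
  fputcsv($fp, array($row->some_attribute)); // only one row in PHP memory at a time
}
fclose($fp);
mysql_close($link);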

Itay Moav