My web host has a "process killer" which terminates any process running longer than about 5 minutes, so my download script can't run for that long.

I thought I'd use fread/print for a few seconds, just to catch the case where the user aborts the download right away, and then use fpassthru to dump the rest of the file.

ob_end_clean();
$file = fopen($this->m_path, 'rb'); // open for binary read
// $startByte, $bytesSent and $bytesToSend come from the partial-download (Range) handling
fseek($file, $startByte);
$cnt = 0;
while (!feof($file) && (!connection_aborted()) && ($bytesSent<$bytesToSend) )
{
    $cnt++;
    if($cnt > 25) // to simulate breaking the loop after a certain time has passed
        break;
    set_time_limit(0);
    $buffer = fread($file, 32*1024);
    print($buffer);
    //flush();
    $bytesSent += strlen($buffer);
}
flush(); // has no effect on fpassthru

// --- insert point for the code below --

if(!feof($file))
{
    fpassthru($file);
}
fclose($file);

The problem is that fpassthru keeps the script running for the whole download. I've tried using output buffering, which makes the script exit quickly, but then the download gets cut off after about 8 minutes (I'm sending a 30 MB file at 15 kB/s).
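For reference, the output-buffering variant looked roughly like this (a sketch from memory; the exact calls may have differed):

ob_start(); // buffer the output instead of streaming it directly
if (!feof($file))
{
    fpassthru($file); // dumps the rest of the file into the buffer
}
fclose($file);
ob_end_flush(); // hand the whole buffer to the web server at once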

One thing that makes the script exit quickly and lets the download complete is if I insert the following at the "insert point" above:

echo("X"); // or pretty much any string literal
flush();

It works every time, but of course the extra bytes corrupt the file, as expected.

If I insert something like this instead:

echo fread($file, 16);
flush();

it doesn't work (fpassthru doesn't allow the script to exit).

It feels like I've tried everything: closing and reopening the file, sleeping, calling flush() multiple times. It makes little sense to me why it suddenly works when I print a string literal.

Does anyone have any ideas?

PHP 5.2.11, Apache 2.2.9

EDIT:

I ended up getting my web host to extend the timeout period to 40 minutes, so I can serve the files with fread-echo and exit cleanly with time to spare. The files I serve are relatively small, and resuming is supported for the few visitors whose transfers get cut off.
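For completeness, a minimal sketch of the fread-echo approach with resume support ($path and the header values are illustrative placeholders; real code needs stricter validation of the Range header):

$path = '/path/to/file.bin'; // placeholder path
$size = filesize($path);
$start = 0;

// Honor a simple "bytes=N-" Range header so clients can resume.
if (isset($_SERVER['HTTP_RANGE']) && preg_match('/bytes=(\d+)-/', $_SERVER['HTTP_RANGE'], $m))
{
    $start = (int)$m[1];
    header('HTTP/1.1 206 Partial Content');
    header('Content-Range: bytes '.$start.'-'.($size - 1).'/'.$size);
}

header('Content-Type: application/octet-stream');
header('Content-Length: '.($size - $start));

$fp = fopen($path, 'rb');
fseek($fp, $start);
while (!feof($fp) && !connection_aborted())
{
    echo fread($fp, 32*1024);
    flush();
}
fclose($fp);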

I also made (and rejected) a redirect-based solution where I set the headers I wanted in .htaccess, but it would have forced me to disallow partial downloads for most clients, since I had problems counting the bandwidth used.

+1  A: 

I see two immediate solutions to your problem:

  1. Get another host that doesn't kill long-running processes
  2. Instead of letting the script manage the download, let Apache do it: just redirect to the actual file after updating your counters and such, with header('Location: '.$actualFile, true, 303). The 303 status means "See Other", which prevents POST data from being posted again to the next URL. After that you can just exit; (a minimal sketch follows below).
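A minimal sketch of that flow (update_download_counter(), $fileId and $actualFile are placeholders for your own bookkeeping and file path):

update_download_counter($fileId); // placeholder: record the download / bandwidth
header('Location: '.$actualFile, true, 303); // 303 "See Other"
exit; // Apache serves the file itself from here on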
Kris
Thanks for answering. I've already confirmed that redirecting to the file works. It doesn't give me the control I want, but it's the backup plan if I can't get the script to work otherwise.
mrdeus
You could also implement ranging (HTTP Range requests); check out http://www.thomthom.net/blog/2007/09/php-resumable-download-server/ for examples. That way your script never sends the whole file at once.
Kris
It doesn't show in the code above, but I have already implemented support for partial downloads. That doesn't help, though, if the requested segment can't be downloaded within the time limit.
mrdeus