In our application, authentication is handled via a set of Controller Plugins that validate the user, etc.

I want to serve a large (video) file only to authenticated users. The obvious way to do this is via readfile() in the controller, but I'm finding it hits the PHP memory limit; presumably the output from the controller is buffered somewhere.

How can I turn off buffering just for this one controller?

EDIT: Thanks for all the useful tips about flushing any existing output buffering. I guess I was specifically looking for a way of doing this within the framework, though.

A: 

I don't think you can, actually. As far as I know, PHP buffers all output before sending it to the requester.

You could increase the memory limit using ini_set().

Nicky De Maeyer
Not true, PHP has functions for controlling the output buffer.
Ben James
Yes, you can control it, but you can't not buffer it...
Nicky De Maeyer
True, you can't turn it off completely, but you can definitely prevent it from reaching the memory limit.
Ben James
+1  A: 

Consider using an external script to output the file, and stream it to the browser using PHP's passthru function.

If on a Linux-based system, you could try something like passthru("cat video_file.flv");
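A hedged sketch of that approach (the path is a placeholder; wrapping it in escapeshellarg() guards against shell injection if the filename ever comes from user input):

```php
<?php
// Stream the file through an external process so PHP never holds
// the whole file in memory. Path is hypothetical.
$path = '/var/media/video_file.flv';
passthru('cat ' . escapeshellarg($path));
```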

However, a better practice is to avoid streaming from within PHP altogether and issue the client a 301 HTTP redirect to the URL of the actual static resource, so that the webserver can handle serving it directly.
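Issuing such a redirect from PHP might look like this (the static URL is a made-up example):

```php
<?php
// Hand the client off to the webserver's static file handling.
// The URL is hypothetical; this only works if exposing the
// resource's direct URL is acceptable.
header('Location: http://static.example.com/media/video_file.flv', true, 301);
exit;
```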

Tyson
That means the resource would be publicly accessible to unauthenticated people, as long as they know the URL.
Bart van Heukelom
+1  A: 

Interesting problem... You could try:

// ...
public function largeFileAction()
{
    // this clears all active output buffers
    while (ob_get_level()) {
        ob_end_clean();
    }
    readfile('path/to/large/file');
    exit(); // to prevent further request handling
}
//  ...
Stefan Gehrig
A: 
$handle = fopen('/path/to/file', 'rb'); // 'b' = binary-safe mode
$chunk_size = 8192;

while ($chunk = fread($handle, $chunk_size)) {
    echo $chunk;
    ob_flush();
    flush();
}
fclose($handle);

This will probably need some tweaking, such as adding correct headers and reading in binary mode if necessary, but the basic idea is sound. I have used this method successfully to send 50+ MB files, with a 16 MB PHP memory limit.
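Fleshed out with the headers the answer mentions, such a loop might look like this (the content type, filename, and path are placeholders):

```php
<?php
$path = '/path/to/large/file.flv';

header('Content-Type: video/x-flv');           // adjust to the real type
header('Content-Length: ' . filesize($path));
header('Content-Disposition: attachment; filename="file.flv"');

$handle = fopen($path, 'rb');                  // 'b' = binary-safe mode
while (!feof($handle)) {
    echo fread($handle, 8192);
    if (ob_get_level()) {
        ob_flush();                            // flush PHP's output buffer, if any
    }
    flush();                                   // push output on to the client
}
fclose($handle);
```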

Ben James
Thanks, was that specifically in a Zend Framework environment?
Ciaran McNulty
Sorry, this is not Zend-specific; it is simply the only way I know to prevent a large file from filling the memory limit (aside from using a method that doesn't tie up a PHP process, of course).
Ben James
I think this is not Zend-specific, as long as the framework doesn't make any attempts to control the output buffer. I have managed to output 700 MB files this way in a 32 MB environment (700 MB was the limit only because of the execution time).
Pekka
+1  A: 

OK, I might be totally wrong here, but I think I read somewhere that output buffering has to be enabled for Zend_Layout and the placeholder helpers to work, so you'd have to disable them for the downloadAction (you probably aren't going to need them for serving the file anyway).

Would something like this achieve what you want to do?

class DownloadController extends Zend_Controller_Action
{
    public function downloadAction()
    {
        $this->_helper->layout()->disableLayout();
        $this->_helper->viewRenderer->setNoRender(true);
        // authenticate user if not done elsewhere already
        header( /* ... the usual stuff ... */);
        readfile(/* some path outside webroot */);
        exit;
    }
}
Gordon
+1  A: 

As Tyson wrote, your best choice (if you have full control over the server) is to validate the user's credentials and redirect them (302 temporary redirect) to a URL where they can download the file.

To prevent reuse of these URLs, we use Lighttpd and its mod_secdownload, which lets you generate a hash that is valid only for a specified amount of time.
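Generating such a URL from PHP might look like this. All values are placeholders, and the token scheme shown is the classic MD5 one from lighttpd 1.4 - check your lighttpd version's mod_secdownload documentation before relying on it:

```php
<?php
// Hypothetical values; they must match lighttpd's
// secdownload.secret and secdownload.uri-prefix settings.
$secret  = 'shared-secret';
$prefix  = '/dl/';
$relPath = '/videos/video_file.flv';

$ts    = dechex(time());                 // current time as lowercase hex
$token = md5($secret . $relPath . $ts);  // MD5 over secret + path + timestamp

header('Location: ' . $prefix . $token . '/' . $ts . $relPath, true, 302);
exit;
```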

nginx has X-Accel-Redirect and Apache has mod_xsendfile.
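With nginx, for example, the handoff might look like this (the location name and paths are assumptions, not something from the answer):

```php
<?php
// PHP side: after authenticating, let nginx stream the file.
// Assumes a matching internal location in nginx.conf, e.g.:
//
//   location /protected/ {
//       internal;              # not reachable by direct request
//       alias /var/media/;     # real files live here
//   }
//
header('Content-Type: video/x-flv');
header('X-Accel-Redirect: /protected/video_file.flv');
exit;
```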

If you decide to implement a separate lightweight web server, there are other benefits as well (mainly lower memory consumption while serving static files, and faster response times).

If you decide to go this route, you will either have to add another IP address to the server and bind Apache to one address and the other server (lighty or nginx) to the other, because both web servers listen on port 80. Changing the port for one of the servers is not a good idea, because a lot of people do not have access to higher ports.

If adding another IP address is not an option, you can install nginx on port 80 and use it as a reverse proxy, passing the dynamic requests on to Apache (listening on another port) while serving all of the static files itself.
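A minimal sketch of such a reverse-proxy setup (all ports and paths are assumptions):

```nginx
# nginx on :80 serves static files itself and proxies everything
# else to Apache listening on :8080 (ports/paths are hypothetical).
server {
    listen 80;
    root /var/www/static;

    location / {
        try_files $uri @apache;
    }

    location @apache {
        proxy_pass http://127.0.0.1:8080;
        proxy_set_header Host $host;
    }
}
```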

Goran Jurić