I have a PHP file that acts as a gatekeeper for all the files I want people with sufficient privileges to download.

The code I use to send the file to the user is:

header('Content-Description: File Transfer');
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $public_filename . '"');
header('Content-Transfer-Encoding: binary');
header('Expires: 0');
header('Cache-Control: must-revalidate, post-check=0, pre-check=0');
header('Pragma: public');
header('Content-Length: ' . $f_filesize);
readfile($file_path);

Most files are fairly large: 400 MB to 10 GB.

What would be a good way to do this while keeping the true locations and filenames secret, so people can't just link to the files directly but have to go through my download.php?file=ID gatekeeper?

Thanks

EDIT: I'm not asking how to do user authentication; all that is done. I'm just asking whether my way of doing it is a good idea on a large scale. It seems like it could cause memory problems if I keep reading 10 GB files.

+1  A: 

You'll want to somehow authenticate them (an HTML form, HTTP basic auth, whatever), then set a session flag, which your download.php script can check. Note that this doesn't prevent people from downloading the file, then distributing it themselves.
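
For example, a minimal sketch of that check at the top of download.php (the session key name here is an assumption, not part of the original setup):

session_start();

// Hypothetical flag set by the login step; deny the download otherwise.
if (empty($_SESSION['can_download'])) {
    header('HTTP/1.1 403 Forbidden');
    exit('Access denied.');
}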

You should configure your web server so the real files are not directly accessible.

It's not going to cause memory problems per se. readfile does not read the file into memory. However, using PHP will create overhead. You can eliminate some of this delay by using X-Sendfile.
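
A rough sketch of the X-Sendfile variant, reusing the variable names from the question and assuming Apache with mod_xsendfile installed and enabled:

// With mod_xsendfile, PHP only emits headers; Apache streams the file itself.
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $public_filename . '"');
header('X-Sendfile: ' . $file_path); // absolute path, can be outside the docroot
exit;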

Matthew Flaschen
+1  A: 

You're already doing that: the $public_filename is what you want it called, and the readfile($file_path) part is the file itself; its location isn't made public. Beyond that, it could live above the document root.

Dan Heberden
I understand, but what I'm getting at is: is it a good idea to have PHP read 10 GB files to pass to users?
Yegor
A: 
  1. Put the files somewhere that is not accessible via HTTP.
  2. Create a database table of file IDs with file paths.
  3. Link to the files via file ID (as you noted above, download.php?fileID=0000; see the sketch below).
  4. ???
  5. Profit.

As someone who did this previously (many years ago), you need to consider the memory impact this will have on your server. The readfile function was not available back then; since it outputs directly rather than loading the whole file into memory, you may not need to do anything special for memory considerations.
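
A minimal sketch of step 3, assuming a PDO connection in $pdo and a hypothetical files(id, path, public_name) table:

// download.php?fileID=...: resolve the ID to a path stored outside the webroot.
$id = (int) $_GET['fileID'];

$stmt = $pdo->prepare('SELECT path, public_name FROM files WHERE id = ?');
$stmt->execute(array($id));
$row = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$row) {
    header('HTTP/1.1 404 Not Found');
    exit;
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $row['public_name'] . '"');
header('Content-Length: ' . filesize($row['path']));
readfile($row['path']);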

Nate Pinchot
+4  A: 

OK, having PHP send files of around 400 MB–10 GB is not good. You need to somehow let whatever webserver you're using actually serve the files.

This really comes down to how secure you need it to be. The easiest solution that comes to mind (but far from the most secure) is using symbolic links with long random names that point to the original file. After a certain time the symbolic links expire and are removed. Each user gets their own symbolic link (or "token") to the file they're downloading. I'm not sure how this plays out in a Windows environment, but on Unix it's fairly straightforward.

Here's some pseudo code:

// Create a one-off token and redirect the user to a webserver-served symlink.
if ($user->isAllowedToDownload($file)) {
    $token = md5($user->name . $file->name . time() . $someGoodRandomValue);
    symlink($file->path, $download_path . $token); // expose the real file under a random name
    header('Location: ' . $download_url . $token);
    exit;
}

Then you need a cron job that cleans out old symbolic links. You also need to make sure the webserver is set to follow symbolic links, preferably only for the folder where these download tokens are created.
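
A rough sketch of such a cleanup script (the token directory and the one-hour cutoff are assumptions), to be run from cron every few minutes:

// cleanup.php: remove download tokens (symlinks) older than one hour.
$download_path = '/var/www/public/tokens/'; // hypothetical token directory

foreach (glob($download_path . '*') as $link) {
    // lstat() inspects the link itself rather than its target.
    $info = lstat($link);
    if (is_link($link) && time() - $info['mtime'] > 3600) {
        unlink($link);
    }
}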

So when the user requests, say, domain.com/download?file=bigfile.mp4, a symbolic link is created in the webserver's public space that points to the real file outside the webserver's public space. The user gets redirected to something like domain.com/getFile/ab739babec890103bdbca72, which in turn causes the webserver to serve the file. Now it's very hard for users to guess the URL for a file, and that's the "security".

0scar
I think you can even unlink the file (here, the link) after a few seconds; it will still work for the current download. Just as you can create a file, open it, delete it, then read it.
Aif
Didn't know that, cool! You'd have to check how that works out with users aborting and resuming downloads though. (With files this big I think we can assume downloads will get interrupted?)
0scar
You can also store the token in a cookie and have rewrite rules that transform the cookie into a file path. You can achieve nice URLs with original file names and fall back to the PHP script if the cookie is not set or the symlink no longer exists.
Skirmantas
Deleting the symlink during the download process stops the download, at least for me. I'm running lighttpd.
Yegor
A: 

Your method will cause memory problems; however, it is possible to read and output the file in chunks. You will need to call flush() after you echo each chunk of the file. You can also make resumable downloads work with a little more effort. Still, this is a CPU-hungry approach.
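
A minimal sketch of the chunked approach, reusing $file_path from the question (the 8 KB chunk size is an arbitrary choice):

// Stream the file in small chunks instead of a single readfile() call.
$handle = fopen($file_path, 'rb');
while (!feof($handle)) {
    echo fread($handle, 8192); // send 8 KB at a time
    flush();                   // push the chunk out to the client
}
fclose($handle);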

The easier and better solution is to use the X-Sendfile header, supported by both Apache and lighttpd through their modules. All you have to do is specify the file name in a header, similar to this:

header('X-Sendfile: filename-on-your-file-system');

Link for lighttpd:

http://redmine.lighttpd.net/projects/lighttpd/wiki/X-LIGHTTPD-send-file

Skirmantas
A: 

I found this article a while ago:

http://lakin.weckers.net/code/web/apache-mod-rewrite-secure-downloads/

Sabeen Malik