views:

914

answers:

3

I've got files on Amazon's S3. They are named with a unique ID so there are no duplicates. I am accessing them using an authorized URL. I need to be able to pass them through to the browser, but I need to rename them. Right now I'm using fopen, but it is downloading the file to my server before serving the file to the browser. How can I have the files 'pass through' my server to the browser? Or how do I buffer the download - downloading a small chunk to my server and pass that to the browser while downloading the next chunk?

Also - I would really like to use CloudFront, but they don't offer authenticated URLs. I believe I can use cURL to send credentials with the request - can I do this sort of 'pass through' file serving with cURL?

Thanks!

+1  A: 

Have you tried using http_get, with request_options to specify the httpauth and httpauthtype? Although I don't remember whether this method assumes a valid string type, which might not work well for binary data.

If that is successful, then you should be able to provide the correct MIME type and write out to the browser.
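A minimal sketch of that approach, assuming the pecl_http (v1) extension is installed; the URL, credentials, and filename below are placeholders:

```php
<?php
// Sketch only: assumes the pecl_http (v1) extension, which provides
// http_get() and the HTTP_AUTH_BASIC constant.
$options = array(
    'httpauth'     => 'myuser:mypass',   // placeholder credentials
    'httpauthtype' => HTTP_AUTH_BASIC,
);

$response = http_get('https://example.com/file.bin', $options, $info);

if ($info['response_code'] == 200) {
    // http_get() returns the raw response message; strip the headers
    // off, then relay the body under our own headers.
    $body = substr($response, strpos($response, "\r\n\r\n") + 4);
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="friendly-name.bin"');
    echo $body;
}
```

Note this buffers the whole response in memory before sending it, so it suits smaller files.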

Joe
Thanks, Joe. I'll look into it.
Corey Maass
+2  A: 

I'm not familiar with how S3 works, so I don't know if this solution is possible. But couldn't you simply redirect the user's browser to the file? If I understand correctly, S3 allows you to create web URLs for any of the files in your bucket. So if, say, these are paid downloads, then you could have S3 generate a temporary URL for that download and then remove that once the user has downloaded it.

If that is not an option, you can try these PHP Classes:

  • HTTP protocol client - A class that implements requests to HTTP resources (used by the below stream wrapper). Allows requests to be streamed.
  • gHttp - An HTTP stream wrapper that lets you treat remote HTTP resources as files, using functions like fopen(), fread(), etc.
  • Amazon S3 Stream Wrapper - An Amazon S3 stream wrapper by the same developer as gHttp. Also allows remote resources to be accessed like ordinary files via fopen('s3://...').
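Whichever wrapper you use, the pass-through itself can be a plain fopen()/fread() loop, so only a small chunk is buffered on your server at any time (a sketch; the URL and filenames are placeholders):

```php
<?php
// Sketch: relay a remote file to the browser in 8 KB chunks, so the
// whole file is never held on the server. URL/names are placeholders.
$src = fopen('http://example.com/remote-file', 'rb');
if ($src !== false) {
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="new-name.ext"');
    while (!feof($src)) {
        echo fread($src, 8192);  // send each chunk as it arrives
        flush();                 // push it out to the browser
    }
    fclose($src);
}
```

The Content-Disposition header is what renames the download on the browser side, independent of the remote file's name.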


Edit:

This page has the info on how to "pre-authenticate" a request by encoding the authentication key in the URL. It's under the section titled: Query String Request Authentication Alternative.

// I'm only implementing the parts required for GET requests.
// POST uploads will require additional components.
function getStringToSign($req, $expires, $uri) {
    return "$req\n\n\n$expires\n$uri";
}

function encodeSignature($sig, $key) {
    $sig = utf8_encode($sig);
    // S3 expects the raw binary HMAC-SHA1 digest (fourth argument true),
    // not the default hex string, before base64 encoding.
    $sig = hash_hmac('sha1', $sig, $key, true);
    $sig = base64_encode($sig);
    return urlencode($sig);
}

$expires = strtotime('+1 hour');
$stringToSign = getStringToSign('GET', $expires, $uri);
$signature = encodeSignature($stringToSign, $awsKey);

$url .= '?AWSAccessKeyId='.$awsKeyId
       .'&Expires='.$expires
       .'&Signature='.$signature;

Then just redirect the user to $url, and they should be able to download the file. The signature is produced by a one-way hash (HMAC-SHA1), so there's no risk of your AWS Secret Access Key being exposed.
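Putting it together, the redirect itself is a one-liner (a sketch; the bucket name, object key, and credential variables are placeholders):

```php
<?php
// Sketch: assumes the signing helpers above; bucket/key are made up.
$uri     = '/my-bucket/unique-id-12345';      // resource being signed
$expires = strtotime('+1 hour');

$stringToSign = getStringToSign('GET', $expires, $uri);
$signature    = encodeSignature($stringToSign, $awsKey);

header('Location: https://s3.amazonaws.com' . $uri
     . '?AWSAccessKeyId=' . $awsKeyId
     . '&Expires=' . $expires
     . '&Signature=' . $signature);
exit;
```

The signed URL stops working after the Expires timestamp, so the link is only temporarily public.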

Calvin
The one worry I would have with redirection is that you need to verify that your security isn't passed along. If the redirection requires authorization, you are giving that to the client.
Joe
Well, that's why a temporary public URL is generated. That way no authentication info is passed, and the URL is gone once the user has received the file (or after a certain time limit).
Calvin
I don't think this addresses my renaming problem. If I move two files to a public folder to allow direct download, I'd have to give them friendly names there. It's possible they have the same name, and then I'm stuck.
Corey Maass
Is that how S3 works? I thought the web urls were generated dynamically and completely detached from the filesystem? As I understand it, you can have multiple web urls per hosted file.
Calvin
@Calvin - Yes, you create authenticated URLs for S3. You can have as many as you need. But I think your solution is the right one - to redirect, because then I can redirect to a CloudFront URL. My next question is: can I redirect with authentication info in the headers?
Corey Maass
Sort of. Basically, you just encode the headers into the URL. S3 supports this type of authentication for GET requests.
Calvin
A: 

Have you tried just using readfile("http://username:password@host/filename.ext")?

That will pass the data straight through to the output buffer; however, if the Content-Type matters, you need to set that first.

Using a URL as the argument to readfile also requires that PHP is built with URL wrapper support (allow_url_fopen enabled).
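A sketch of that approach, combined with the headers needed to rename the file for the browser (the URL and filenames are placeholders):

```php
<?php
// Sketch: stream a remote file through to the browser under a new name.
// Requires allow_url_fopen = On; URL and names below are placeholders.
$remoteUrl    = 'http://username:password@example.com/unique-id-12345';
$friendlyName = 'report.pdf';

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . $friendlyName . '"');

// readfile() writes the body to the output buffer in chunks rather
// than loading the whole file into memory first.
readfile($remoteUrl);
```

Because the browser takes the filename from the Content-Disposition header, the S3 object's unique ID never needs to change.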

jishi