On my site, users can input links to files and I can stream the download process to them through my server. I use a system like this:

header('Cache-control: private');
header('Content-Type: application/octet-stream'); 
header('Content-Length: ' . $r[2]);
header('Content-Disposition: attachment; filename="' . $theName . '"');

flush();

$file = fopen($fileName, "rb");
while(!feof($file))
{
    print fread($file, 10240);  
    flush();
    sleep(1);
}
fclose($file);

The thing is, my users' downloads go pretty slowly (600 kB/s). The server this is hosted on is on a 1 Gbit port, so they should be maxing out their own internet connections many times over.

I'm wondering if there is a better way to do this sort of thing, maybe with cURL? I don't have much experience with cURL, but I'd appreciate any feedback.

Thanks.

+2  A: 

Use readfile.

If you were to insist on your approach, then it would be much more efficient this way, without flush and sleep:

while(!feof($file)) {
   print fread($file, 10240);  
}

Here's why:

  • using flush() you prevent PHP's normal output buffering from working, and
  • using sleep(1) you effectively cap the transfer speed by pausing for 1 second after every 10 KiB chunk.
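For reference, here's a minimal sketch of the readfile() approach, reusing the headers from the question. The $fileName and $theName values are placeholders standing in for whatever the question's script already sets:

```php
<?php
// Sketch: serve a file with readfile() instead of a manual fread() loop.
// $fileName and $theName are hypothetical placeholders for the question's variables.
$fileName = '/path/to/stored/file';   // path of the file on disk
$theName  = 'download.bin';           // filename the client should see

header('Cache-Control: private');
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($fileName));
header('Content-Disposition: attachment; filename="' . $theName . '"');

// readfile() reads the file and writes it straight to the output buffer,
// letting PHP and the web server handle chunking at full speed.
readfile($fileName);
```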
chronos
+1  A: 

I'm not really sure what you're trying to do here. I don't see why you need that loop, and I especially don't see why you need the sleep(). You should just use readfile() or something like that instead of that loop; it would be much more efficient.

Also how do you think curl will help you?

Sabeen Malik
I didn't notice the sleep, +1
Pekka
+1  A: 

If it's not the one-second sleep() that @Sabeen Malik pointed out, it is most likely due to a server-side restriction imposed by your web provider (e.g. using mod_throttle or mod_bandwidth) or the web provider you are fetching data from.
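One way to check for a server-side cap is to fetch the file directly with curl and look at the average speed it reports (the URL below is a placeholder for the script's actual download link):

```shell
# Fetch the file, discard the body, and report the average download speed.
# %{speed_download} is curl's built-in write-out variable (bytes per second).
curl -s -o /dev/null -w 'avg speed: %{speed_download} bytes/sec\n' \
     'http://yoursite.example/download.php?file=123'
```

If the speed curl reports matches the ~600 kB/s your users see, the bottleneck is on the server side rather than in the PHP loop.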

Pekka