<?php

$filename = './get/me/me_' . rand(1, 100) . '.zip';

header("Content-Length: " . filesize($filename));
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=foo.zip');

readfile($filename);
?>

Hi, I have this simple code that forces a random file download. My problem is that if I call the script two or more times from the same browser, the second download won't start until the first is completed or interrupted, so I can only download one file at a time. Do you have any clue?

+3  A: 

Just guesses. There could be different reasons.

First, your server could restrict the number of parallel connections or child processes, but I guess this isn't the problem.

Second, it is more likely that the client restricts the number of connections. A "normal" browser opens only two connections at a time to a given server; modern browsers allow up to 8 (?) connections. This is a simple restriction in order to avoid the problems that could occur with slow servers.

One workaround could be to place every download on a "virtual" subdomain.

Give it a try!
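A minimal sketch of that workaround (the `dlN.example.com` host names and `download.php` script name are hypothetical, to be replaced with your own DNS entries): pick a random sub-domain when generating each link, so the browser treats the downloads as going to different hosts.

```php
<?php
// Hypothetical sketch: spread download links across several sub-domains
// that all point at the same server, so the browser's per-host
// connection limit does not serialise the downloads.
$subdomains = array('dl1', 'dl2', 'dl3', 'dl4');
$host = $subdomains[array_rand($subdomains)] . '.example.com';
$url  = 'http://' . $host . '/download.php';
echo '<a href="' . htmlspecialchars($url) . '">Download</a>' . "\n";
?>
```

This assumes either individual DNS entries for each sub-domain or a catch-all (wildcard) record, as mentioned in another answer below.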

Ralf
A: 

As far as I can find, there is no php configuration setting that restricts max downloads or anything like that - besides, such a configuration is outside the scope of php.

Therefore, I can come to only two conclusions:

  • The first is that this is browser behaviour; see if the problem is repeated across multiple browsers (let me know if it is). The HTTP spec does say that only two connections to the same domain should be active at any one time, but I wasn't aware that this affected file downloads as well as page downloads. A way of getting round such a limitation is to allocate a number of sub-domains to the same site (or create a catch-all sub-domain DNS entry), and when generating a link to the download, select a random sub-domain to download from. This should work around the multiple-request issue if it is a browser problem.

  • A second and much more unlikely option (and this only applies if you are using Apache) is that your MaxKeepAliveRequests configuration option is set to something ridiculously low and KeepAlives are enabled. However, I highly doubt that is the issue, so I suggest investigating the browser possibility.
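If you want to rule that second possibility out, these are the relevant directives in Apache's httpd.conf (the values shown here are Apache's defaults, not a recommendation):

```
# httpd.conf keep-alive settings worth checking
KeepAlive On
MaxKeepAliveRequests 100    # 0 means unlimited; a very low value cuts connections short
KeepAliveTimeout 5
```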

Are you getting an error message from the browser when the second download is initiated, or does it just hang? If it just hangs, this suggests it is a browser issue.

Kazar
+1  A: 

I'd further investigate Ralf's suggestion about the server restrictions and start with checking the logfiles to ensure that the second request is received by the server at all. Having that knowledge, you can eliminate one of the possibilities and at least see which side the problem resides on.

From the client's browser (you didn't mention which one it is): if it's Firefox, try installing the Live HTTP Headers extension to see what happens to the requests you send and whether the browser receives any response from the server side.

bth
+3  A: 

This may be related to PHP's session handling.

Using the default session handler, when a PHP script opens a session it locks it. Subsequent scripts that need to access it have to wait until the first script is finished with it and unlocks it (which happens automatically at shutdown, or via session_write_close()). This will manifest as the script not doing anything until the previous one finishes, in exactly the way you describe.

Clearly you aren't starting the session explicitly, but there's a config flag that causes the session to start automatically: session.auto_start - http://www.php.net/manual/en/session.configuration.php

Either use phpinfo() to determine whether this is set to true, or look in your config. You could also try adding session_write_close() at the top of the script and see if the issue goes away.
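A quick way to check that setting without wading through phpinfo() output is ini_get(), which returns the current value as a string ("" or "0" means off):

```php
<?php
// Check whether sessions are started automatically for every request.
// ini_get() returns the setting's current value as a string,
// or false if the option does not exist.
$autoStart = ini_get('session.auto_start');
echo 'session.auto_start is ' . ($autoStart ? 'ON' : 'off') . "\n";
?>
```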

benlumley
Found this old answer by searching, and session_write_close(); was exactly what I needed. Just had to be sure to call it right before the download starts. Thanks!
Greg W
A: 

Hello,

Just to say that session_write_close(); solved the problem for me too.

I was using session_destroy(); (which worked), but that was not much good when I needed to keep the session data :)

All you need to do is place session_write_close(); just before you start streaming the file data. Example:

<?php
$filename = './get/me/me_' . rand(1, 100) . '.zip';

session_write_close();

header("Content-Length: " . filesize($filename));
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename=foo.zip');

readfile($filename);
?>

Sky
