views: 84

answers: 4

Trying to force-download a file with PHP using the usual:

header("Content-type: $type" );
header("Content-Disposition: attachment; filename=$name");
header('Content-Length: ' . filesize($path));

And it works for files somewhere below 32 MB. For bigger ones it just returns a zeroed file.

Obviously there's some kind of limit, but what sets it? Using Apache 2.2.11 and PHP 5.3.0.

A: 

Inside php.ini you will see the setting.

I can't remember the option name off the top of my head, but I will look inside my php.ini now and try to find it.

Just remove it and it will work.

Added:

Okay, someone please correct me if I'm wrong, but is it

memory_limit

Laykes
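
For reference, a minimal sketch of checking and raising that setting at runtime (the 256M value is only an example, not a recommendation):

    echo ini_get('memory_limit');     // current limit, e.g. "32M", as set in php.ini
    ini_set('memory_limit', '256M');  // raise it for this request only
    // ini_set('memory_limit', '-1'); // or lift the limit entirely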
What? PHP doesn't have a "max download" setting of any type. Please take the time to at least research your answer so you can actually _provide one_.
hobodave
I think the problem is that he is generating the file himself, hence the memory_limit above. Please correct me if I am wrong, as I want to know this also.
Laykes
There's not enough information present to determine that. memory_limit has to do with how much memory the script can use. This would only come into play if you attempted to read the entire file into memory, or enough of it to exceed the memory_limit. Sending headers doesn't do this.
hobodave
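
To illustrate the distinction ($path is hypothetical): only the second line below reads file data into memory and is therefore bounded by memory_limit.

    header('Content-Length: ' . filesize($path)); // sends a header only; memory_limit is irrelevant here
    echo file_get_contents($path);                // reads the whole file into RAM; bounded by memory_limit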
That's what I would infer anyway. Maybe I am wrong; hopefully the OP can clarify. And of course PHP doesn't have this setting, because Apache is providing the file for download, but if PHP is generating it, you will potentially hit problems.
Laykes
Regardless, memory_limit does not explicitly limit file download size.
hobodave
I have one of my own scripts which creates a file. Is memory_limit an issue here? It's basically joining lots of files together.
Laykes
I think there's more than enough information to reach this conclusion. ~32 MB is a fairly common memory limit for PHP, and as Laykes says, PHP is generating the file after all.
Oli
@Oli: Laykes can't possibly know that, he's not the questioner here.
hobodave
Thank you guys for your responses. Actually my memory_limit is set to 128 MB. What I do is read the file in 1 MB chunks, instantly echoing each one and flushing the buffer (I just thought that's how the logic should go for big files). And I'm still hitting the problem I described above.
jayarjo
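
A sketch of the kind of loop described here; $path is hypothetical, and the ob_flush()/flush() pair is one common way of "flushing the buffer":

    $fh = fopen($path, 'rb');
    while (!feof($fh)) {
        echo fread($fh, 1024 * 1024); // 1 MB per chunk, as described
        if (ob_get_level() > 0) {
            ob_flush();               // push the chunk out of PHP's output buffer...
        }
        flush();                      // ...and out of the server-side write buffers
    }
    fclose($fh);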
I guess it was an issue with the headers, but after I fixed it, another issue reared its head: the PHP framework was buffering the output. So your advice was also useful. Thanks!
jayarjo
+3  A: 

It seems like you're loading the entire file into RAM before sending it down to the recipient. You'll want to look into PHP Streams to be able to send the full file contents without having to read it all into RAM first: http://php.net/streams

MightyE
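
A minimal stream-based sketch along those lines (assuming the Content-Type/Content-Disposition/Content-Length headers have already been sent and $path is the hypothetical file on disk):

    // Copy the file straight to the output stream without building the whole thing in RAM.
    $in  = fopen($path, 'rb');
    $out = fopen('php://output', 'wb');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);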
I'm using the fopen, fread($fh, $chunksize), and fclose functions. Aren't they safe for such cases?
jayarjo
The easiest form is to use `readfile`: http://php.net/manual/en/function.readfile.php
deceze
That's a good point, deceze; I'm accustomed to piping data from a database or other source, so I fall back to streams even when there's a simpler solution available.
MightyE
jayarjo, it depends on what you're doing with it after you fread() it. If you're just doing `while (!feof($fp)) { echo fread($fp, 1024); }` then this is probably fine (assuming output buffering is disabled), but if you're concatenating it to another string, you'll still run out of memory eventually.
MightyE
Yes, that's what I was doing. Glad to know now that it is right, since I was doing it for the first time :)
jayarjo
+2  A: 

I eventually stumbled on this post: http://w-shadow.com/blog/2007/08/12/how-to-force-file-download-with-php/. Adding all the headers recommended there and also using:

 ob_end_clean(); // turn off output buffering to decrease CPU usage

before any output has helped. No more limits are observable; files download completely, even huge ones.

jayarjo
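
The exact header set is in the linked post; a rough sketch of the overall shape (header values here are generic placeholders, not a transcription of that post), with the buffer cleared before any output:

    // Drop any active output buffers so data goes straight to the client.
    while (ob_get_level() > 0) {
        ob_end_clean();
    }

    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . basename($path) . '"');
    header('Content-Length: ' . filesize($path));

    readfile($path); // streams the file to the client without loading it all into memory
    exit;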
Using `@` error suppression is inadvisable as it is slow and can lead to debugging nightmares.
Justin Johnson
Got it removed, thanks for pointing this out :)
jayarjo
A: 

You may also need set_time_limit(0); so large downloads aren't cut short by the script execution time limit.

steelbytes