views:

445

answers:

4

I've been having trouble getting a PHP and JavaScript upload script to accept large file uploads on Dreamhost. I realize you are supposed to edit php.ini to change post_max_size and the memory limit, but it isn't behaving as it should.

The only way I have ever successfully uploaded a large file was switching to Dreamhost PS and setting the memory limit as high as the file itself (1 GB), but there has to be a more cost-effective way, otherwise how do sites like YouTube survive? I get I/O errors if I do not have all this memory available.

Could anyone help? I've struggled with this for over a month.

A: 

You usually can't increase the maximum upload file size on a shared hosting webspace. Run phpinfo() to see what the exact size limit is. Anything beyond that is probably not going to work on that webspace without an upgrade.

Don't confuse upload_max_filesize and memory_limit, though. memory_limit controls how much RAM one instance of your PHP script is allowed to use, and has nothing to do with file uploads.
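As a rough guide, these are the php.ini directives involved; the values below are illustrative assumptions, not recommendations:

```ini
; Maximum size of one uploaded file
upload_max_filesize = 1100M

; Size limit of the entire POST body; must be >= upload_max_filesize
post_max_size = 1200M

; Only matters if your script reads the file into memory;
; PHP streams the upload to a temp file, so this can stay modest
memory_limit = 128M

; Give large transfers time to finish
max_execution_time = 300
max_input_time = 600
```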

Pekka
Thanks Pekka! I am using Dreamhost Private Server, which does allow modification of php.ini, and I have confirmed my settings with phpinfo(). I have post_max_size and memory_limit set appropriately. memory_limit was recommended to be higher than post_max_size on this page: http://www.developershome.com/wap/wapUpload/wap_upload.asp?page=php2 and stats show that the script is indeed using tons of memory. The problem seems to be that PHP is storing it all in memory. Someone else suggested a CGI script would be more efficient; perhaps that is the solution.
Ted Avery
The actual handling of the uploaded file (writing it to a temporary location) is done before your PHP script comes into play, so memory limits should not be an issue there. Can you post some actual numbers (file sizes, limits, etc.)? Also, what exactly is your script doing? Can you post the receiving script? What errors exactly are you getting?
Pekka
A: 

Looks like the answer was to write a Perl script instead. Using Perl, I don't see even a blip in the server's memory usage.

Ted Avery
A: 

Editing php.ini is ultimately the solution. You have to change upload_max_filesize and probably post_max_size, and you may also want to increase the script's max_execution_time. These directives control your upload limits. You also want to place a copy of the modified php.ini in the directory where you want the changes to apply.

Reference: PHP Core Configuration Directives

Try creating/copying php.ini into the directory, then visiting info.php to check the details. (info.php is just a new PHP file with the contents: <?php phpinfo(); ?>)

Also, never leave an info.php file on your server; I change its extension on my sites.

The Mirage
A: 

I agree that with changes to php.ini you can choose just how large a file you want to allow. With no limit on size, you are opening yourself up to a world of problems. Even in Perl, you should abort if the number of bytes exceeds a certain amount. Depending on how tech-savvy your users are, you may end up with more than you bargained for, and you wouldn't want to crash your whole web server because one user uploaded a 200 GB file.
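A minimal sketch of such a server-side cap in PHP, assuming a form field named "userfile" and a hypothetical 100 MB limit (both are assumptions; adjust to your setup):

```php
<?php
// Hypothetical cap: 100 MB. The field name "userfile" is an assumption.
$maxBytes = 100 * 1024 * 1024;

$err = isset($_FILES['userfile']) ? $_FILES['userfile']['error'] : UPLOAD_ERR_NO_FILE;
if ($err !== UPLOAD_ERR_OK) {
    // UPLOAD_ERR_INI_SIZE / UPLOAD_ERR_FORM_SIZE mean PHP itself already
    // rejected the upload for exceeding the configured limits
    header('HTTP/1.1 400 Bad Request');
    exit('Upload failed, error code ' . $err);
}

if ($_FILES['userfile']['size'] > $maxBytes) {
    header('HTTP/1.1 413 Request Entity Too Large');
    exit('File exceeds the allowed size');
}

// Only now move the file out of PHP's temp directory
move_uploaded_file(
    $_FILES['userfile']['tmp_name'],
    '/path/to/uploads/' . basename($_FILES['userfile']['name'])
);
```

Rejecting the request before calling move_uploaded_file() keeps oversized files confined to PHP's temp area, where they are cleaned up automatically when the request ends.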

Jeff Kalbfleisch