I'm working on an application that allows the upload and storage of large files on a web server. Currently I'm using PHP to handle files POSTed over HTTP. My php.ini is set with:
upload_max_filesize = 100M
post_max_size = 100M
memory_limit = 128M
max_input_time = 6000
max_execution_time = 6000
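To rule out a stray per-directory override (a vhost or .htaccess value could silently cap something at 50M without my knowing), I'm dumping what PHP actually sees at runtime with something like this:

// Quick diagnostic: the effective values can differ from php.ini
// if a vhost, .htaccess, or ini_set() call overrides them
foreach (array('upload_max_filesize', 'post_max_size', 'memory_limit',
               'max_input_time', 'max_execution_time') as $setting) {
    echo $setting . ' = ' . ini_get($setting) . "\n";
}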
There doesn't seem to be any Apache LimitRequestBody directive set. I use APC to track file upload progress. For some reason the upload always halts at exactly 50M.
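For reference, this is roughly how I poll APC for progress (it assumes apc.rfc1867 = 1 in php.ini, and that file_key is the same unique value the form posts in its APC_UPLOAD_PROGRESS field):

// progress.php - polled via AJAX while the upload runs
$key    = $_GET['file_key'];             // matches the form's APC_UPLOAD_PROGRESS value
$status = apc_fetch('upload_' . $key);   // default apc.rfc1867_prefix is 'upload_'
if ($status !== false) {
    echo round($status['current'] / $status['total'] * 100) . '%';
}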
I know HTTP isn't the most efficient protocol for file uploads, but this app needs to be user-friendly, and I understand there are firewall issues with FTP.
I'm wondering if anyone could suggest what is halting my upload at exactly 50M? It must be some sort of configuration setting.
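To narrow it down, I've been thinking of adding a diagnostic like this to the receiving script, since $_FILES carries an error code when PHP itself rejects the upload, whereas an empty $_FILES suggests the body was cut off before PHP processed it:

// Diagnostic: log the raw request size and PHP's own upload error code
error_log('CONTENT_LENGTH=' . (isset($_SERVER['CONTENT_LENGTH']) ? $_SERVER['CONTENT_LENGTH'] : 'n/a'));
if (isset($_FILES['file'])) {
    error_log('upload error code=' . $_FILES['file']['error']); // e.g. UPLOAD_ERR_INI_SIZE
} else {
    error_log('no $_FILES entry - body was likely truncated or rejected upstream');
}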
Additionally, is there some alternative to JavaScript/PHP over HTTP that I should consider for file uploads? I've looked into Java applets and Flash, and I may end up using SWFUpload, but if it's a server config that's causing my upload to fail over HTTP, I don't see how a Java applet or Flash uploader would get around that.
I should note that I'm hoping to come up with a solution that will eventually allow me to upload very large files, up to 1 GB.
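One idea I'm toying with for the 1 GB case is slicing the file client-side and appending the pieces server-side, so no single request ever approaches the size limits. Purely a sketch; the filename, chunk, and is_last_chunk fields are just names I'd have the client send:

// Hypothetical chunk receiver: the client posts the file in pieces,
// and each piece is appended until the last one arrives
$name = basename($_POST['filename']);          // client-sent, sanitized
$part = '/var/uploads/' . $name . '.part';
$out  = fopen($part, 'ab');                    // append, binary
$in   = fopen($_FILES['chunk']['tmp_name'], 'rb');
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}
fclose($in);
fclose($out);
if (isset($_POST['is_last_chunk'])) {
    rename($part, '/var/uploads/' . $name);    // assemble the final file
}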
For now, I use very simple PHP to receive the file:
$uploaddir  = '/' . basename($_POST['upload_directory']) . '/'; // basename() strips any ../ path traversal
$uploadfile = $uploaddir . basename($_FILES['file']['name']);
if (is_uploaded_file($_FILES['file']['tmp_name'])) {
    if (move_uploaded_file($_FILES['file']['tmp_name'], $uploadfile)) {
        // some success code;
    }
}
There's obviously a little more to it than that, but that's the gist of how I handle the upload.