views: 170

answers: 3
What is the best way to let users upload large files from their web browser to a server? I'm talking 200 MB+, possibly up to a few gigabytes. I have been thinking of a few possible solutions to the problem (not tried them yet), and these are basically the options I came up with. Server download speed will not be a problem, but the user's connection possibly could be.

Having some sort of applet on the client side, written in Java or Flash, which sends the file in parts (is this even possible with an applet?) to a PHP or other script on the server, along with a checksum and some other info about the file. On the server, all the parts plus the info file are saved in a temporary directory whose unique name is based on the checksum of the file and the IP of the user. When the last chunk has been sent, the applet signals the server that it's finished, and the server puts the file together in the right location. If a chunk doesn't match the checksum for that part, the server sends a response to the applet telling it to re-upload that chunk. I don't know how important the checksum checking is since it's all TCP packets; someone with more insight might be able to answer that. (A rough sketch of what I mean is at the end of the question.)

This is probably the worst way: changing the settings on your server to allow huge file uploads via a plain file input, and handling it like a normal transfer.

Use an upload manager (a standalone application) which does pretty much the same thing as the applet I mentioned above.

Pros of the first option: it would most likely be rather secure, you could show progress, possibly resume an upload if the IP hasn't changed, and do a threaded upload of the chunks. The con is that the user needs Flash or Java for it to work. The pro of the second option is that it will work for pretty much everyone, but the cons are big: there is no way to resume an interrupted upload, and if anything goes wrong, the whole file has to be re-uploaded. For the third option the pros are pretty much the same as for the first, but the con is that the user has to download an application to their computer and run it, and the application has to be compatible with their computer and OS.

Another way may be a combination of the two: say, an applet for bigger (or more) files, and a simple input field restricted to maybe 10-20 MB max, for smaller files and for compatibility.

There are probably other, much smarter ways to tackle this, and that's why I'm asking for advice here on SO.
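To make the first option more concrete, here is a rough, untested sketch in plain Java of what the client-side chunking could look like (the same logic would live inside the applet). The upload.php endpoint, the file/chunk/md5/done parameter names, the "OK" reply and the 1 MB chunk size are all made up for illustration; the server script would store each chunk in the checksum-based temp directory and stitch the chunks together when it gets the final signal.

    import java.io.*;
    import java.net.HttpURLConnection;
    import java.net.URL;
    import java.net.URLEncoder;
    import java.security.MessageDigest;

    public class ChunkedUploader {
        private static final int CHUNK_SIZE = 1024 * 1024; // 1 MB per chunk (arbitrary)

        public static void upload(File file, String baseUrl) throws Exception {
            byte[] buffer = new byte[CHUNK_SIZE];
            InputStream in = new FileInputStream(file);
            try {
                int index = 0;
                int read;
                while ((read = in.read(buffer)) > 0) {
                    String checksum = md5Hex(buffer, read);
                    // Re-send the chunk until the server confirms the checksum matches.
                    // A real client would cap the retries instead of looping forever.
                    while (!sendChunk(baseUrl, file.getName(), index, buffer, read, checksum)) {
                    }
                    index++;
                }
                // Tell the server all chunks are in so it can assemble the file.
                signalDone(baseUrl, file.getName(), index);
            } finally {
                in.close();
            }
        }

        // POSTs one chunk; the (hypothetical) upload.php replies "OK" if the checksum matched.
        private static boolean sendChunk(String baseUrl, String name, int index,
                                         byte[] data, int length, String checksum) throws Exception {
            URL url = new URL(baseUrl + "/upload.php?file=" + URLEncoder.encode(name, "UTF-8")
                    + "&chunk=" + index + "&md5=" + checksum);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setDoOutput(true);
            conn.setRequestMethod("POST");
            conn.setRequestProperty("Content-Type", "application/octet-stream");
            OutputStream out = conn.getOutputStream();
            out.write(data, 0, length);
            out.close();
            BufferedReader reader = new BufferedReader(new InputStreamReader(conn.getInputStream()));
            String reply = reader.readLine();
            reader.close();
            return "OK".equals(reply);
        }

        // Plain GET that tells the server the upload is complete.
        private static void signalDone(String baseUrl, String name, int totalChunks) throws Exception {
            URL url = new URL(baseUrl + "/upload.php?file=" + URLEncoder.encode(name, "UTF-8")
                    + "&done=1&chunks=" + totalChunks);
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.getInputStream().close(); // fire the request, ignore the body
        }

        private static String md5Hex(byte[] data, int length) throws Exception {
            MessageDigest md = MessageDigest.getInstance("MD5");
            md.update(data, 0, length);
            StringBuilder sb = new StringBuilder();
            for (byte b : md.digest()) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        }
    }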

A: 

Well, I don't know what access you have to your server, but you could create an .htaccess file that allows huge uploads for just one page/file. As for chunking, I don't think you can do that with the Flash player. (You can't access the raw bytes from the client side, anyway.)
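Something like the following is a minimal sketch of what I mean, assuming PHP runs as an Apache module (mod_php) and AllowOverride permits these directives; the exact limits are placeholders:

    # .htaccess in the directory that contains only the upload script
    php_value upload_max_filesize 2048M
    # post_max_size should be a bit larger than upload_max_filesize
    php_value post_max_size       2100M
    php_value max_execution_time  3600
    php_value max_input_time      3600

If PHP runs as CGI/FastCGI instead, these values would have to go into php.ini (or a per-directory equivalent), and Apache's own LimitRequestBody might also need raising.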

George Edison
+2  A: 

Your best bet is to use a Java applet to do it. I hate saying "Use Java" because it's such an awful solution (who actually likes visiting a page and seeing the Java logo?). On the up side, though, you can break files up into chunks and upload them simultaneously. It's cross-platform, and once it's running, it can be damn fast. Since it runs in the JVM, you're also not putting stress on the browser like Flash would. You could also add progress indicators and the like with little difficulty.

On top of that, I'm assuming that since the user is allowed to post such huge files, you're somehow ensuring that they're valid users (so it's not just some idiot uploading, say, a 1 GB file of random letters) and that they aren't just attacking or gumming up your site. With Java you can also perform some light client-side validation to make sure that they're legitimate users. Not that you can't do that with Flash, but it seems more intuitive to do it in Java. You could also encrypt the files, or compress them with GZip or Deflate to help save bandwidth.
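As a trivial illustration of that last point, compressing a file on the client before upload is just a matter of wrapping the output stream; the file names here are made up:

    import java.io.*;
    import java.util.zip.GZIPOutputStream;

    public class GzipBeforeUpload {
        public static void main(String[] args) throws IOException {
            // Compress bigfile.dat into bigfile.dat.gz before handing it to the uploader.
            InputStream in = new FileInputStream("bigfile.dat");
            OutputStream out = new GZIPOutputStream(new FileOutputStream("bigfile.dat.gz"));
            byte[] buffer = new byte[8192];
            int read;
            while ((read = in.read(buffer)) > 0) {
                out.write(buffer, 0, read);
            }
            in.close();
            out.close(); // closing the GZIPOutputStream also writes the gzip trailer
        }
    }

The server would then have to gunzip the file after reassembly, so it's a trade-off between bandwidth and a bit of extra work on both ends.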

Good luck

mattbasta
+1 for the java rant :D
knittl
Thanks for the answer. I'm not too keen on running Java applets on websites either, but in this case it seems like the best solution. Obviously the system will only be accessible to a closed group of trusted users, and there will be some sort of validation as well, probably on both the server and client side. The reason for having both server- and client-side validation is that a skilled user could probably sniff the data and create their own unrestricted client. But it's not going to be some sort of public upload-whatever-you-want site. Upvoted and accepted.
Hultner
+1  A: 

For a limited circle of users, an idea I'm toying with - but haven't implemented yet - is using Rightload as a tool for clients to upload files. It's a great free (but apparently not Open Source) "right-click" FTP uploader for Windows that is pretty easy to set up. It also seems easy to ship pre-defined XML profiles for the user's FTP server along with it.

Presumably, this is a more stable solution for large files than a browser-based upload.

Pekka
Interesting solution. Creating a custom FTP client shouldn't be impossible either. The problem is cross-system compatibility and accessibility.
Hultner