I know questions of this type exist on SF, but they are very specific; I need a generic suggestion. I need a feature for uploading user files that could be larger than 1 GB. This feature will be an add-on to the existing file-upload feature in the application, which caters to smaller files. Here are some of the options:

  1. Use HTTP and a Java applet. Send the files in chunks and join them at the server (a rough client-side sketch follows this list). But how do I throttle the network usage?
  2. Use HTTP and a Flex application. Is it better than an applet with respect to browser compatibility and other environment issues?
  3. Use FTP, or rather SFTP, instead of HTTP as the protocol, for a faster upload process.
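
For option 1, here is a rough sketch of what I have in mind on the client side. The /upload endpoint, parameter names, and 1 MB chunk size are made up for illustration; they are not part of any existing API:

import java.io.*;
import java.net.HttpURLConnection;
import java.net.URL;

public class ChunkedUploader {
    private static final int CHUNK_SIZE = 1024 * 1024; // 1 MB per chunk (arbitrary)

    public static void upload(File file) throws IOException {
        byte[] buffer = new byte[CHUNK_SIZE];
        InputStream in = new FileInputStream(file);
        try {
            int chunkIndex = 0;
            int read;
            while ((read = in.read(buffer)) > 0) {
                // One POST per chunk; the server reassembles by index.
                URL url = new URL("http://example.com/upload?name="
                        + file.getName() + "&chunk=" + chunkIndex++);
                HttpURLConnection conn = (HttpURLConnection) url.openConnection();
                conn.setDoOutput(true);
                conn.setRequestMethod("POST");
                OutputStream out = conn.getOutputStream();
                out.write(buffer, 0, read);
                out.close();
                if (conn.getResponseCode() != 200) {
                    throw new IOException("chunk " + (chunkIndex - 1) + " failed");
                }
                conn.disconnect();
            }
        } finally {
            in.close();
        }
    }
}

Pausing between chunks (or shrinking the chunk size) would be one crude way to throttle from the client side, but I would rather understand the proper way to do it.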

Please suggest.

Moreover, I have to make sure that this upload process doesn't hamper the work of other users, or in other words doesn't eat up other users' bandwidth. Are there any mechanisms at the network level to throttle such processes?

Ultimately the customer wanted FTP as an option, but I think the answer about handling the files programmatically is also cool.

A: 

For sending files to a server, unless you have to use HTTP, FTP is the way to go. As for throttling, I am not completely sure, at least not programmatically.

Personally, though, it seems that limiting the upload speed would be better accomplished on the server side.
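
If you do go with FTP, the upload itself is straightforward with a library such as Apache Commons Net. A minimal sketch; the host, credentials, and file name are placeholders:

import java.io.FileInputStream;
import java.io.InputStream;
import org.apache.commons.net.ftp.FTP;
import org.apache.commons.net.ftp.FTPClient;

public class FtpUpload {
    public static void main(String[] args) throws Exception {
        FTPClient ftp = new FTPClient();
        ftp.connect("ftp.example.com");          // placeholder host
        ftp.login("user", "secret");             // placeholder credentials
        ftp.enterLocalPassiveMode();             // friendlier to client-side firewalls
        ftp.setFileType(FTP.BINARY_FILE_TYPE);   // binary mode: no line-ending mangling

        InputStream in = new FileInputStream("bigfile.bin");
        try {
            ftp.storeFile("bigfile.bin", in);    // streams the file to the server
        } finally {
            in.close();
            ftp.logout();
            ftp.disconnect();
        }
    }
}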

Noctrine
+8  A: 

Use whatever client-side language you want (a Java app, Flex, etc.), and push to the server with HTTP PUT (not supported by Flex) or POST. In the server-side Java code, regulate the flow of bytes in your input-stream loop. A crude, simple sample snippet that limits bandwidth to an average of no more than 10 KB/second:

InputStream is = request.getInputStream();
OutputStream os = new FileOutputStream(new File("myfile.bin"));
byte[] payload = new byte[10240];
int bytesRead;

try {
    // Read at most 10 KB per pass, then sleep one second, capping
    // average throughput at roughly 10 KB/second.
    while ((bytesRead = is.read(payload)) >= 0) {
        if (bytesRead > 0)
            os.write(payload, 0, bytesRead);

        Thread.sleep(1000); // sleep() is static; calling it via currentThread() is misleading
    }
} catch (InterruptedException e) {
    Thread.currentThread().interrupt(); // restore the interrupt flag
} finally {
    os.close();
    is.close();
}

(With more complexity one could regulate the bandwidth of a single stream more accurately, but it gets complicated once socket buffers and such come into play. "Good enough" is usually good enough. A sketch of one refinement follows.)
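
As a sketch of that refinement, not production code: track wall-clock time and sleep only off the surplus, rather than a flat second per buffer. Note the long byte counter, which also sidesteps the 2 GB int issue mentioned below:

// Copies is to os at no more than targetBytesPerSecond on average.
void copyThrottled(InputStream is, OutputStream os, long targetBytesPerSecond)
        throws IOException, InterruptedException {
    byte[] payload = new byte[10240];
    long totalBytes = 0; // long, so counts past 2 GB are fine
    long start = System.currentTimeMillis();
    int bytesRead;

    while ((bytesRead = is.read(payload)) >= 0) {
        os.write(payload, 0, bytesRead);
        totalBytes += bytesRead;

        // How long should this many bytes have taken at the target rate?
        long expectedMillis = (totalBytes * 1000) / targetBytesPerSecond;
        long actualMillis = System.currentTimeMillis() - start;
        if (expectedMillis > actualMillis)
            Thread.sleep(expectedMillis - actualMillis);
    }
}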

My application does something similar to the above: we regulate both upstream (POST and PUT) and downstream (GET) bandwidth. We accept files in the hundreds of MB every day and have tested up to 2 GB. (Beyond 2 GB there are the pesky Java int primitive issues to deal with.) Our clients are both Flex and curl. It works for me, and it can work for you.

While FTP is great and all, you can avoid many (but not all) firewall issues by using HTTP.

Stu Thompson
+1  A: 

If you want to reduce bandwidth, you may want to send the data compressed (unless it is compressed already). This may cut the data volume by a factor of 2-3, depending on what you are sending.
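
As a sketch of how that could look with the JDK's built-in gzip streams; the connection and request objects are assumed from the HTTP answer above:

import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

// Client side: wrap the connection's output stream so bytes are
// compressed as they are written.
OutputStream os = new GZIPOutputStream(connection.getOutputStream());

// Server side: wrap the request's input stream so the same bytes are
// decompressed as they are read.
InputStream is = new GZIPInputStream(request.getInputStream());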

Peter Lawrey
A: 

For an example of good practice in uploading large files, and the various ways of tackling it, have a look at flickr.com (you may have to sign up to get to the uploader page).

They provide various options, including HTTP form upload, a Java desktop client, and some kind of JavaScript-driven gadget that I can't quite figure out. They don't seem to use Flash anywhere.

skaffman