The application is simple: an HTML form that posts to a Perl script. The problem is that our customers sometimes upload very large files (> 500 MB) and their internet connections can be unreliable.

Is there any way to resume a failed transfer like in WinSCP or is this something that can't be done without support for it in the client?

+3  A: 

AFAIK, it must be supported by the client. Basically, the client and the server need to negotiate which parts of the file (likely defined as parts in a "multipart/form-data" POST) have already been uploaded, and then the server code needs to be able to merge the newly uploaded data with what it already has.
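To make that concrete, here is a minimal sketch of what the server-side "merge" could look like in Perl. Everything in it is an assumption for illustration: the client is presumed to send each piece as a normal multipart/form-data POST with made-up fields named upload_id, offset and chunk, and the server simply appends to a partial file when the offset matches what it already has.

    #!/usr/bin/perl
    # Hypothetical chunk receiver - field names and paths are made up.
    use strict;
    use warnings;
    use CGI;

    my $q       = CGI->new;
    my $id      = $q->param('upload_id');      # sanitize this in real code!
    my $offset  = $q->param('offset') || 0;
    my $fh_in   = $q->upload('chunk') or die "no chunk uploaded";
    my $partial = "/var/spool/uploads/$id.part";

    my $have = -e $partial ? -s $partial : 0;
    if ($offset != $have) {
        # Client and server disagree; tell the client where to resume from.
        print $q->header(-status => '409 Conflict', -type => 'text/plain');
        print $have;
        exit;
    }

    open my $out, '>>', $partial or die "open: $!";
    binmode $out;
    binmode $fh_in;
    my $buf;
    print {$out} $buf while read($fh_in, $buf, 64 * 1024);
    close $out;

    # Report the new size so the client knows the next offset to send.
    print $q->header(-type => 'text/plain');
    print -s $partial;

The client side (Java, Flash, Gears or plain JavaScript) would keep POSTing chunks and, after a failure, ask the server for the current size before continuing.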

The best solution is custom uploader code, usually implemented in Java, though I think this may be possible in Flash as well. You might even be able to do it via JavaScript - see the two sections with examples below.


Here's an example of how Google did it with YouTube: http://code.google.com/apis/youtube/2.0/developers_guide_protocol_resumable_uploads.html

It uses a "308 Resume Incomplete" HTTP response, in which the server sends a header like Range: bytes=0-408 to indicate how much of the file has already been uploaded.
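In practice the client asks the server how much it already has, parses that Range header, and resumes from the next byte. Here is a rough Perl sketch of just that status/resume step - the URL, the Content-Range status query and the header parsing are illustrative only, not the exact YouTube API:

    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTTP::Request;

    my $ua = LWP::UserAgent->new;

    # Ask the server how much of the (hypothetical) 500 MB upload it has.
    my $req = HTTP::Request->new(PUT => 'http://example.com/upload/session-123');
    $req->header('Content-Range' => 'bytes */524288000');
    my $res = $ua->request($req);

    my $next_byte = 0;
    if ($res->code == 308) {
        # e.g. "Range: bytes=0-408" means bytes 0..408 arrived; resume at 409.
        if (($res->header('Range') || '') =~ /bytes=0-(\d+)/) {
            $next_byte = $1 + 1;
        }
    }
    print "resume the upload at byte $next_byte\n";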


For additional ideas on the topic:

  1. http://code.google.com/p/gears/wiki/ResumableHttpRequestsProposal

  2. Someone implemented this using Google Gears on the client side and PHP on the server side (the latter you can easily port to Perl):

    http://michaelshadle.com/2008/11/26/updates-on-the-http-file-upload-front/

    http://michaelshadle.com/2008/12/03/updates-on-the-http-file-upload-front-part-2/

DVK
A: 

It's a shame that your clients can't upload via FTP, since FTP already supports resuming transfers. There is also "chunked transfer encoding" in HTTP; I don't know which Perl modules might already support it.
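For what it's worth, I believe LWP can stream a request body from a code reference, in which case it sends it with chunked transfer encoding (no Content-Length). A rough sketch, with a made-up URL and chunk size - and note that chunked encoding only streams the upload, it does not by itself give you resume:

    use strict;
    use warnings;
    use LWP::UserAgent;
    use HTTP::Request;

    open my $fh, '<', 'bigfile.bin' or die "open: $!";
    binmode $fh;

    my $req = HTTP::Request->new(POST => 'http://example.com/cgi-bin/upload.pl');
    $req->content(sub {
        # Return the next piece of the body; an empty string ends the request.
        my $n = read($fh, my $buf, 64 * 1024);
        return $n ? $buf : '';
    });

    my $res = LWP::UserAgent->new->request($req);
    print $res->status_line, "\n";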

Snake Plissken