Hi all,

My website allows users to upload photographs, which I store on Amazon S3. I store the original upload as well as an optimized image and a thumbnail. I want to allow users to export all of their original versions when their subscription expires, so I'm thinking the following problems arise:

  1. The volume of data could be large (possibly around 10GB).
  2. How to manage the download process - e.g. if it gets interrupted, knowing where to resume from, and how to verify that files downloaded successfully.
  3. Whether to download individual files, or zip everything up and download it as one file or a series of smaller zipped files.

Are there any tools out there that I can use for this? I have seen Fzip, which is an ActionScript library for handling zip files. I have an EC2 instance running that handles file uploads, so I could use it for downloads as well - e.g. copy the files from S3 to EC2, zip them there, download the zip to the user via a Flash downloader, then use Fzip to uncompress the archive to the user's hard drive.
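As a rough illustration of the EC2-side step in that workflow, here's a minimal Python sketch that bundles a set of already-downloaded originals into one archive (the S3-to-EC2 copy itself would be done with an AWS SDK such as boto3; the function and file names here are just placeholders):

```python
import os
import zipfile

def zip_originals(file_paths, zip_path):
    """Bundle a user's original files into one zip archive on the server.

    file_paths: local paths to the originals (already copied down from S3).
    zip_path:   where to write the resulting archive.
    Photographs are already compressed, so ZIP_STORED avoids wasting CPU
    recompressing them.
    """
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_STORED) as zf:
        for path in file_paths:
            # store each file under its base name inside the archive
            zf.write(path, arcname=os.path.basename(path))
    return zip_path
```

The user then downloads a single archive (or a few of them) instead of thousands of individual files.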

Has anyone come across a similar service / solution?

All input appreciated, thanks!

A: 

I have not dealt with this problem directly but my initial thoughts are:

  1. Flash or possibly jQuery could be leveraged for a homegrown solution: have the client send back information on what it has received, and store that information in a database log. You might also consider using BitTorrent as a mediator - your users could download a free torrent client, and you could investigate a server-side tracker (maybe RivetTracker or PHPBTTracker). I'm not sure how fine-grained these get, but at the very least, since you are assured you are dealing with a single user, once they become a seeder you can wipe the old file and begin on the next.

  2. Break the archive into 2GB chunks to accommodate users with FAT32 drives, which can't handle files larger than ~4GB. Break it down to 1GB chunks if space on the server is limited, keeping a benchmark in a database record of what's already been zipped from S3.
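The chunking idea in point 2 can be sketched as a simple greedy pass that groups files into archives no larger than a size limit (the names and the 2GB figure are just illustrative; a single file bigger than the limit would additionally need splitting, which this sketch doesn't do):

```python
def plan_chunks(files, limit_bytes=2 * 1024**3):
    """Greedily group (name, size) pairs into chunks whose total size
    stays under limit_bytes, so each resulting zip fits on FAT32."""
    chunks, current, current_size = [], [], 0
    for name, size in files:
        # start a new chunk when adding this file would exceed the limit
        if current and current_size + size > limit_bytes:
            chunks.append(current)
            current, current_size = [], 0
        current.append(name)
        current_size += size
    if current:
        chunks.append(current)
    return chunks
```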

Fzip is cool, but I think it's more for client-side archiving. PHP has ZIP and RAR libraries (http://php.net/manual/en/book.zip.php) you can use to round up the files server-side. I think any solution you find will require you to manage security on your own, by keeping database records of who's got what along with download keys. Not doing so may lead to people leeching your resources as a free file-delivery system.
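A minimal sketch of that database bookkeeping, using SQLite here only for brevity (the table and column names are made up; in practice this would live in the site's existing database, keyed to the user and a download token):

```python
import sqlite3

def init_db(conn):
    # one row per (user, file); delivered flips to 1 once confirmed
    conn.execute("""CREATE TABLE IF NOT EXISTS downloads (
        user_id INTEGER, file_key TEXT, delivered INTEGER DEFAULT 0,
        PRIMARY KEY (user_id, file_key))""")

def register_files(conn, user_id, keys):
    # tick-list of everything the user is entitled to export
    conn.executemany(
        "INSERT OR IGNORE INTO downloads (user_id, file_key) VALUES (?, ?)",
        [(user_id, k) for k in keys])

def mark_delivered(conn, user_id, key):
    # called when the client reports a successful download of this file
    conn.execute(
        "UPDATE downloads SET delivered = 1 WHERE user_id = ? AND file_key = ?",
        (user_id, key))

def remaining(conn, user_id):
    # what still needs sending if the export was interrupted
    rows = conn.execute(
        "SELECT file_key FROM downloads WHERE user_id = ? AND delivered = 0",
        (user_id,))
    return [r[0] for r in rows]
```

The same records double as the access-control check: no row, no download.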

Good luck!

clumsyfingers
Thanks, some great advice here. Yes, I agree - my thoughts were coming round to thinking I would have to manage the "what's been downloaded by whom" info in my database, which is probably as easy as anything since I already have details of all the original files in a table anyway, so it would just be a case of ticking them off, so to speak. Flash sending back info on what it has received would work well, I think. Thanks!