views: 46 · answers: 2

I have a web app in which users can upload an avatar under 700 KB. That's the only part of the application dealing with image uploads (so I won't be dealing with an exceptionally heavy load). I was wondering what the best way is to go about this. Currently I'm using Paperclip, and I want to store all of the images on Amazon S3.

Option 1: The user uploads the image to my web server. The image is processed asynchronously using delayed_job and is then uploaded to S3. The image is then deleted from my web server.

Option 2: The user uploads the image directly to S3. A background process on my web server downloads the image, processes it, and uploads the newly created thumbnails back to S3.
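To make the question concrete, here's a minimal pure-Ruby sketch of Option 2's background step. The `ProcessAvatarJob` name, key layout, and the `storage`/`resizer` collaborators are all stand-ins I've made up; in practice they'd be an S3 client and something like MiniMagick, and the job would be enqueued with delayed_job.

```ruby
# Sketch of Option 2's background step: download the original the
# browser put on S3, build each thumbnail style, upload it back.
class ProcessAvatarJob
  def initialize(storage, resizer)
    @storage  = storage   # stand-in for an S3 client
    @resizer  = resizer   # stand-in for MiniMagick etc.
  end

  def perform(user_id)
    original = @storage.get("avatars/#{user_id}/original")
    { :thumb => "48x48", :medium => "128x128" }.each do |style, geometry|
      data = @resizer.resize(original, geometry)
      @storage.put("avatars/#{user_id}/#{style}", data)
    end
  end
end
```

The point of injecting the collaborators is just to show the data flow; the real job would also clean up any temp files it writes.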

Option 1 seems to use less bandwidth overall. Option 2 seems to spare my application the resources of handling the upload itself.

Am I correct in these assumptions? Which option is the fastest and most resource-friendly? Or is there another way to go about this?

Also, I was wondering whether sites like Twitter, Facebook, and Posterous process images asynchronously. Whenever I upload a profile pic on Twitter, or images on Facebook and Posterous, they seem to be processed instantly.

Thanks!

Tim

+1  A: 

If by "processing the image" you mean creating thumbnails or other image conversions, you can do that with Paperclip alone. If processing is a heavy task that Paperclip by itself can't handle, then I would go with option 1; it's cheaper, since you don't access S3 multiple times. Just make sure you have enough hard disk space for all the images during processing.
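For reference, thumbnail generation in Paperclip is just a matter of declaring styles on the attachment. The model name, geometry strings, and credentials path below are illustrative, not prescriptive:

```ruby
# Paperclip generates each style automatically when the file is
# uploaded; "48x48#" crops to a square, "128x128>" only shrinks.
class User < ActiveRecord::Base
  has_attached_file :avatar,
    :styles => { :thumb => "48x48#", :medium => "128x128>" },
    :storage => :s3,
    :s3_credentials => "#{Rails.root}/config/s3.yml"
end
```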

Faisal
+1  A: 

Your best bet is to have the user upload directly to S3. The d2s3 plugin works well for this.

In your d2s3 callback, create a delayed_job event to download the full-size images and save the thumbnails to S3 with Reduced Redundancy Storage. If you wire everything up correctly, you can automatically regenerate the thumbnails if they're ever lost from S3.
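A rough sketch of that wiring, assuming delayed_job; the controller action, `ThumbnailJob`, and route names are hypothetical, and the action is whatever you point d2s3's callback at:

```ruby
# Hypothetical callback action that d2s3 redirects to after the
# browser has uploaded the original straight to S3.
class AvatarsController < ApplicationController
  def d2s3_callback
    # params[:key] is the S3 key of the freshly uploaded original
    Delayed::Job.enqueue ThumbnailJob.new(current_user.id, params[:key])
    redirect_to account_path
  end
end

# The job downloads the original, builds the thumbnails, and writes
# them back with Reduced Redundancy Storage (cheap, and safe to use
# because the job can regenerate them from the original if lost).
class ThumbnailJob < Struct.new(:user_id, :key)
  def perform
    # e.g. fetch the original with your S3 client, resize, then store
    # each thumbnail with the "x-amz-storage-class" header set to
    # "REDUCED_REDUNDANCY"
  end
end
```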

jelder