I'm in a situation where I need to push image storage for a number of websites out to a service that can scale indefinitely (S3, CloudFiles, etc.). Up until this point we've been able to let our users generate custom thumbnail sizes on the fly using the Python Imaging Library (PIL) with some help from sorl-thumbnail in Django.
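For reference, the on-the-fly generation boils down to something like this (a simplified sketch; the real code goes through sorl-thumbnail, which also caches the result):

```python
from PIL import Image  # plain "import Image" on older PIL installs

def make_thumbnail(src_path, dest_path, max_size):
    """Resize a source image down to fit within max_size (w, h)."""
    img = Image.open(src_path)
    img.thumbnail(max_size)  # shrinks in place, preserving aspect ratio
    img.save(dest_path)

# make_thumbnail('photos/1234.jpg', 'thumbs/1234_200x200.jpg', (200, 200))
```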
By moving our images to something like S3, we lose the ability to quickly create thumbnails on the fly. We have three options:
1. Do it slowly by downloading the source from S3 and creating the thumbnail locally (a rough sketch follows this list).
   Con: it's slow and bandwidth-intensive.
2. Do it upfront by creating a pre-determined set of thumbnail sizes (à la Flickr) and pushing them all to S3.
   Con: it limits the sizes that can be generated and stores lots of files that will never be used.
3. Let the browser resize using the height/width attributes on the img tag.
   Con: extra bandwidth is spent downloading larger-than-necessary files.
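To make #1 concrete, the download-and-resize path would look something like this (a sketch assuming boto3 and PIL/Pillow; the bucket and key names are placeholders):

```python
import io

import boto3
from PIL import Image

s3 = boto3.client('s3')

def thumbnail_from_s3(bucket, key, max_size):
    """Pull the full-size source from S3 and resize it locally."""
    buf = io.BytesIO()
    s3.download_fileobj(bucket, key, buf)  # full-size download on every request
    buf.seek(0)
    img = Image.open(buf)
    fmt = img.format or 'JPEG'
    img.thumbnail(max_size)  # shrinks in place, preserving aspect ratio
    out = io.BytesIO()
    img.save(out, format=fmt)
    return out.getvalue()

# thumb = thumbnail_from_s3('my-bucket', 'photos/1234.jpg', (200, 200))
```

(#2 would essentially be the same resize run over a preset list of sizes at upload time.)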
At this point, #3 looks like a simple solution to the problem with few drawbacks. Some quick tests and data from this website suggest that the quality isn't as bad as expected (we could ensure the aspect ratio is maintained).
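For #3, we'd compute the height/width attributes server-side from the stored original dimensions, so the aspect ratio is always preserved. A minimal sketch (the `constrain` helper and the numbers are just illustrative):

```python
def constrain(orig_w, orig_h, max_w, max_h):
    """Fit (orig_w, orig_h) inside (max_w, max_h) without distortion."""
    scale = min(max_w / float(orig_w), max_h / float(orig_h), 1.0)  # never upscale
    return int(round(orig_w * scale)), int(round(orig_h * scale))

w, h = constrain(1000, 750, 200, 200)  # -> (200, 150)
# rendered as: <img src="https://s3.../photo.jpg" width="200" height="150">
```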
Any suggestions on other options or drawbacks we might not be taking into consideration?
Note: the images are digital photos and are only used for display on the web. Sizes range from 50 to 1000 pixels in height/width.