I actually just implemented authorized S3 URLs in my Ruby on Rails 3 application with Paperclip. Let me share how I accomplished this.
What I did, and what you probably want, is quite easy to implement.
Let me give you an example:
FileObject model
class FileObject < ActiveRecord::Base
  has_attached_file :attachment,
    :path           => "files/:id/:basename.:extension",
    :storage        => :s3,
    :s3_permissions => :private,
    :s3_credentials => File.join(Rails.root, 'config', 's3.yml')
end
FileObjectsController controller
class FileObjectsController < ApplicationController
  def download
    @file_object = FileObject.find(params[:id])
    redirect_to(@file_object.attachment.expiring_url(10))
  end
end
I believe this is quite straightforward. You add the Paperclip attachment to the FileObject model and then have an action (download, for example) in the FileObjectsController. This way you can do some application-level authorization from within your controller with a before_filter or something, as sketched below.
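Here is a minimal sketch of such a filter; current_user and the user_id column are assumptions about your own app, not something Paperclip provides, and the download action stays exactly as above:

class FileObjectsController < ApplicationController
  before_filter :authorize_download!, :only => :download

  private

  # Load the file and refuse the request unless it belongs
  # to the signed-in user.
  def authorize_download!
    @file_object = FileObject.find(params[:id])
    head :forbidden unless @file_object.user_id == current_user.id
  end
end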
The expiring_url() method (provided by Paperclip) on @file_object.attachment generates a signed Amazon S3 URL, which makes the otherwise private file accessible only while that signature is valid. The first argument of expiring_url() is an integer: the number of seconds after which you want the provided URL to expire.
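A quick way to see it in action is from the Rails console; the second, optional argument is the Paperclip style, and :original is the default:

file = FileObject.find(3)
file.attachment.expiring_url(10)             # signed URL, valid for about 10 seconds
file.attachment.expiring_url(600, :original) # longer window and/or a specific style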
In my application it is currently set to 10 seconds (@file_object.attachment.expiring_url(10)), so when a user requests a file, they always have to go through my application, for example at myapp.com/file_objects/3/download, to get a fresh valid URL from Amazon, which they then use immediately to download the file, since the download action issues a redirect_to. So roughly 10 seconds after the user hits the download action the link has already expired, while the user is (still) happily downloading the file and it remains protected from any non-authorized users.
I have even tried expiring_url(1), so that the URL expires almost immediately after it is generated. That worked for me locally, but I have never used it in production; you can try it too. I settled on 10 seconds to give the server a short window to respond. It works great so far, and I doubt anyone will hijack someone's URL within 10 seconds of it being created, let alone know what the URL is.
An extra security measure I took is to generate a secret key for every file on create, so my URLs always look like this:
has_attached_file :attachment,
  :path => "files/:id/:secret_key/:basename.:extension"
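Note that :secret_key is not a built-in Paperclip interpolation, so you have to register one yourself. Here is a minimal sketch of how that could look, assuming a secret_key column on file_objects that gets filled in before create:

# config/initializers/paperclip_secret_key.rb
Paperclip.interpolates :secret_key do |attachment, style|
  attachment.instance.secret_key
end

# app/models/file_object.rb
class FileObject < ActiveRecord::Base
  before_create :generate_secret_key

  private

  # 20 random hexadecimal characters, e.g. "f5039a57acc187b36c2d"
  def generate_secret_key
    self.secret_key = SecureRandom.hex(10)
  end
end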
This way every URL has its own unique secret_key in its path, making it harder to hijack within the time the URL is accessible. Mind you that, while the URL to your file remains the same, the accessibility comes from the additional, expiring parameters that Amazon S3 provides:
http://s3.amazonaws.com/mybucket/files/f5039a57acc187b36c2d/my_file.pdf?AWSAccessKeyId=AKIAIPPJ2IPWN5U3O1OA&Expires=1288526454&Signature=5i4%2B99rUwhpP2SbNsJKhT/nSzsQ%3D
Notice this part, which is the expiring, signed portion that makes the file temporarily accessible:
my_file.pdf?AWSAccessKeyId=AKIAIPPJ2IPWN5U3O1OA&Expires=1288526454&Signature=5i4%2B99rUwhpP2SbNsJKhT/nSzsQ%3D
That's what it's all about. And this part changes with every request for your file made through the download action.
Hope this helps!