I am currently developing a Rails application that tries to copy/move videos from one bucket to another in S3. However, I keep getting a proxy error 502 from my Rails application. The Mongrel log says "failed to allocate memory." Once this error occurs, the application dies and we must restart it.

+3  A: 

It sounds like your code is reading the entire resource into memory, and that is what runs your application out of memory. A naïve way to copy (and from your description, you're doing something like this already) is to download the file and upload it again; if you do that, at least download it to a local file rather than into memory. However, Amazon's engineers have thought ahead and provide APIs that handle this specific case.

If you're using something like the RightAWS gem, you can use its S3Interface like so:

require 'right_aws'
# Credentials are placeholders; s3 is an S3Interface instance.
s3 = RightAws::S3Interface.new('YOUR_ACCESS_KEY_ID', 'YOUR_SECRET_ACCESS_KEY')
# Copies key1 from bucket b1 to key1_copy in bucket b2:
s3.copy('b1', 'key1', 'b2', 'key1_copy')

And if you're using the naked S3 HTTP interface, see http://docs.amazonwebservices.com/AmazonS3/2006-03-01/index.html?UsingCopyingObjects.html for a solution that uses only HTTP to copy one object from one bucket to another.
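
For completeness, here is a minimal sketch of that HTTP-only copy in Ruby: a PUT request carrying the x-amz-copy-source header, signed with the older Signature Version 2 scheme. The bucket names, keys, and credentials are placeholders:

require 'net/http'
require 'openssl'
require 'base64'
require 'time'

access_key  = 'YOUR_ACCESS_KEY_ID'       # placeholder credentials
secret_key  = 'YOUR_SECRET_ACCESS_KEY'
source      = '/b1/key1'                 # /source_bucket/source_key
dest_bucket = 'b2'
dest_key    = 'key1_copy'

date = Time.now.httpdate
# Signature Version 2 string-to-sign for a bodyless PUT copy request.
string_to_sign = "PUT\n\n\n#{date}\nx-amz-copy-source:#{source}\n/#{dest_bucket}/#{dest_key}"
signature = Base64.encode64(OpenSSL::HMAC.digest(OpenSSL::Digest.new('sha1'), secret_key, string_to_sign)).chomp

http = Net::HTTP.new('s3.amazonaws.com', 443)
http.use_ssl = true
request = Net::HTTP::Put.new("/#{dest_bucket}/#{dest_key}")
request['Date'] = date
request['x-amz-copy-source'] = source
request['Authorization'] = "AWS #{access_key}:#{signature}"

response = http.request(request)
puts "#{response.code}: #{response.body}"  # 200 with a CopyObjectResult document on success

The important part is the x-amz-copy-source header: S3 performs the copy server-side, so the video bytes never pass through your Rails process at all.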

antifuchs
Can I pass arguments to s3.copy(...) to specify the permissions of the new file?
deb
A: 

Try streaming the files instead of loading the whole file into memory and then working with it.

For example, if you're using the aws-s3 gem, do not use:

data = open(file).read                  # reads the entire file into memory as one String
S3Object.store file_name, data, BUCKET

Use the following instead:

S3Object.store file_name, open(file), BUCKET  # pass the IO object so the gem can stream the upload

I'm not sure how exactly to "stream-download" the file, though.
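
The gem does have an S3Object.stream call that yields the object in chunks, so something along these lines should work for a streamed download (the key name and output file are placeholders, and the connection is assumed to be established as in the snippets above):

require 'aws/s3'
include AWS::S3

# Write the object to a local file chunk by chunk instead of buffering the whole body.
open('video.mp4', 'wb') do |file|
  S3Object.stream('video.mp4', BUCKET) do |chunk|
    file.write chunk
  end
end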

Mantas