Plainly: is that possible without having to read the remote resource onto a local server and then PUT it to S3's servers?

So, in a sense, instead of the transfer looking like this:

S3<--(PUT DATA)--LOCAL<--(REQUEST DATA)--REMOTE_URL

it ends up looking like this:

S3<--(PUT DATA BY URL)--LOCAL

S3<--(REQUEST DATA)--REMOTE_URL
A: 

Not possible.

Amazon doesn't offer a pull service for S3, and I haven't seen anyone else advertising one either. (It's not a terrible business idea though.)

That having been said, there are a ton of tools to help with this!

A lot of people use something like s3fs together with their favorite backup utility (e.g., cron + rsync).
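For instance, the cron + rsync half of that setup might look like the fragment below. The mount point, source path, and schedule are all assumptions for illustration, not anything prescribed by s3fs:

```
# Mount the bucket once (e.g., at boot):
#   s3fs mybucket /mnt/s3
#
# /etc/crontab entry: mirror /var/data to the mount every night at 02:30
30 2 * * *  root  rsync -a --delete /var/data/ /mnt/s3/backup/
```

Because s3fs exposes the bucket as an ordinary filesystem, rsync never knows S3 is involved.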

Lately, I've had great success with boto and some custom Python scripts. I like this approach because it integrates nicely with whatever other services you're running on the box, and it can give you status updates.
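A minimal sketch of such a script is below. It assumes boto3 (the modern successor to the boto library mentioned above); the bucket name, URL, and the `pull_to_s3`/`key_for` helpers are my own placeholders, not anything from the answer. Note the data still flows through the local machine — this is the S3<--LOCAL<--REMOTE_URL path, just automated:

```python
# Sketch: "pull" a remote URL into S3 by streaming it through this box.
# Assumes boto3 and configured AWS credentials; names are placeholders.
import posixpath
from urllib.parse import urlparse


def key_for(url):
    # Derive an S3 key from the URL's path; fall back to "index" for "/".
    name = posixpath.basename(urlparse(url).path)
    return name or "index"


def pull_to_s3(url, bucket, key=None):
    # Stream the remote resource through this machine into S3.
    import boto3  # deferred import so key_for() works without AWS set up
    from urllib.request import urlopen

    s3 = boto3.client("s3")
    with urlopen(url) as body:
        # upload_fileobj reads the stream in chunks, so the whole file
        # never has to fit on the local relay's disk or in memory.
        s3.upload_fileobj(body, bucket, key or key_for(url))
```

From here it is easy to bolt on logging or notifications, which is the "status updates" advantage over a plain cron + rsync setup.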

I've written a shell script that starts up an EC2 instance, connects via SSH, has the EC2 box download data from an FTP site to its local disk, and then uploads the new data to S3.
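That relay can be sketched roughly as below. The hostname, FTP URL, bucket, and the use of the AWS CLI on the instance are all assumptions; the script deliberately does nothing unless you fill in EC2_HOST:

```shell
#!/bin/sh
# Rough sketch of the EC2-as-relay workflow described above.
# EC2_HOST, FTP_URL, and BUCKET are placeholders (assumptions).
EC2_HOST="${EC2_HOST:-}"
FTP_URL="${FTP_URL:-}"
BUCKET="${BUCKET:-}"

if [ -z "$EC2_HOST" ]; then
    echo "EC2_HOST not set; dry run only"
else
    # Pull the data from the FTP site onto the EC2 box's local disk...
    ssh "$EC2_HOST" "wget -q '$FTP_URL' -O /tmp/payload"
    # ...then push it to S3 from inside Amazon's network, where the
    # EC2-to-S3 hop is fast (assumes the AWS CLI is configured there).
    ssh "$EC2_HOST" "aws s3 cp /tmp/payload s3://$BUCKET/payload"
fi
```

The point of the relay is that only the small control traffic touches your own connection; the bulk transfer happens between the FTP site, EC2, and S3.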

Best,

Zach

Developer, LongTail Video

zach at longtail