boto

SimpleDB query performance improvement using boto

Hello, I am trying to use SimpleDB in the following way. I want to keep 48 hours' worth of data in SimpleDB at any time and query it for different purposes. Each domain holds 1 hour's worth of data, so at any time there are 48 domains in SimpleDB. As new data is constantly uploaded, I delete the oldest domain and create a new domai...
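The hourly-rotation scheme described above can be sketched as follows. This is a minimal sketch, not the asker's actual code: the `events_YYYYMMDD_HH` domain-naming convention and the function names are assumptions, and `rotate()` expects a boto SimpleDB connection such as one from `boto.connect_sdb()`.

```python
from datetime import datetime, timedelta

def domain_name_for(ts):
    # Hypothetical naming scheme: one SimpleDB domain per hour of data.
    return ts.strftime("events_%Y%m%d_%H")

def rotation_for(now):
    # Domain to create for the current hour, and the 48-hour-old domain to drop.
    return domain_name_for(now), domain_name_for(now - timedelta(hours=48))

def rotate(sdb_conn, now):
    # sdb_conn is a boto SimpleDB connection, e.g. boto.connect_sdb(key, secret).
    create, drop = rotation_for(now)
    sdb_conn.create_domain(create)
    sdb_conn.delete_domain(drop)
```

Keeping the window arithmetic in `rotation_for()` separate from the SimpleDB calls makes the rotation logic easy to check without touching AWS.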

Comparing uncompressed local files to compressed files stored on Amazon S3?

We put hundreds of image files on Amazon S3 that our users need to synchronize to their local directories. To save storage space and bandwidth, we zip the files stored on S3. On the user's end, a Python script runs every 5 minutes to get a current list of files and download new/updated files. My question is what's...
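One complication with this setup is that zipping changes the bytes, so the S3 object's ETag/MD5 will never match a checksum of the uncompressed local file. A common workaround is to store the original file's MD5 as user metadata at upload time and compare against that. A sketch, assuming a metadata name like `uncompressed-md5` that the uploader would have to set:

```python
import hashlib

def local_md5(path, chunk_size=65536):
    # MD5 of the uncompressed local file, read in chunks to bound memory use.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def needs_download(key, local_path):
    # key is a boto S3 Key; "uncompressed-md5" is an assumed metadata name
    # that the uploader sets with key.set_metadata() before uploading.
    remote = key.get_metadata("uncompressed-md5")
    try:
        return remote != local_md5(local_path)
    except IOError:
        return True  # no local copy yet
```

With this in place the sync script only downloads and unzips files whose stored checksum differs from the local one.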

Python S3 using boto: AttributeError: 'str' object has no attribute 'connection'

I have a connection that works, as I can list buckets, but I'm having issues when trying to add an object. conn = S3Connection(awskey, awssecret) key = Key(mybucket) key.key = p.sku key.set_contents_from_filename(fullpathtofile) I get the error: AttributeError: 'str' object has no attribute 'connection'. The error is in the file: /us...
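The error suggests that `Key()` was given the bucket's *name* (a str) rather than a Bucket object. A sketch of the likely fix, with boto 2.x-style imports kept inside the function and the function name itself being an assumption:

```python
def upload_sku_file(awskey, awssecret, bucket_name, sku, fullpathtofile):
    from boto.s3.connection import S3Connection
    from boto.s3.key import Key

    conn = S3Connection(awskey, awssecret)
    # Key() expects a Bucket *object*; passing the bucket's name as a str
    # is what produces "'str' object has no attribute 'connection'".
    bucket = conn.get_bucket(bucket_name)
    key = Key(bucket)
    key.key = sku
    key.set_contents_from_filename(fullpathtofile)
```

The one-line change is `conn.get_bucket(mybucket)` before constructing the `Key`.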

Problem uploading image file to Amazon S3 in Django using BOTO Library

Hi there, I am a total beginner to programming and Django, so I'd appreciate help that a beginner can get his head around! I was following a tutorial showing how to upload images to an Amazon S3 account with the boto library, but I think it is for an older version of Django (I'm on 1.1.2 and Python 2.6.5) and something has changed. I get an ...
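For reference, a minimal sketch of pushing a Django upload to S3 with boto. The function name is hypothetical, and `uploaded_file` stands in for the file-like `UploadedFile` object a view gets from `request.FILES`:

```python
def save_upload_to_s3(uploaded_file, awskey, awssecret, bucket_name):
    from boto.s3.connection import S3Connection
    from boto.s3.key import Key

    # uploaded_file is a Django UploadedFile, e.g. request.FILES['image'].
    conn = S3Connection(awskey, awssecret)
    bucket = conn.get_bucket(bucket_name)
    key = Key(bucket)
    key.key = uploaded_file.name
    uploaded_file.seek(0)  # rewind in case the file was already read
    key.set_contents_from_file(uploaded_file)
```

Since Django's uploaded files are file-like, `set_contents_from_file` works on them directly; the `seek(0)` guards against a handler having consumed the stream first.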

Django Boto S3 Access

I can't figure this out. Here's what I want to happen... I have an application whose users upload files to S3 using boto and Django. I want those files to be private and only accessible through my app using my API credentials. So if a user uploads a photo via my app, the only way he or anyone else can download it is via his accoun...
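The usual pattern for this is to keep every object private and have the app hand out short-lived signed URLs via boto's `generate_url`. A sketch, with the function name and the 300-second default being assumptions:

```python
def temporary_download_url(awskey, awssecret, bucket_name, key_name, expires=300):
    from boto.s3.connection import S3Connection

    conn = S3Connection(awskey, awssecret)
    # query_auth=True (the default) yields a signed URL that expires after
    # `expires` seconds, so the object itself can stay private in the bucket.
    return conn.generate_url(expires, 'GET', bucket=bucket_name, key=key_name)
```

The app would check that the requesting user owns the photo, then redirect him to the signed URL; anyone without a fresh URL gets S3's access-denied response.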

Amazon SQS region from EC2 Instance.

If I create an SQS queue from an EC2 instance without specifying the region in the API call, in which region will the queue be created? When I run boto.sqs.regions() from a non-EC2 machine, I get 4 regions: [RegionInfo:us-east-1, RegionInfo:eu-west-1, RegionInfo:us-west-1, RegionInfo:ap-southeast-1]. From an EC2 machine in the As...
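To take the guesswork out of it, the region can be pinned explicitly. A sketch using `boto.sqs.connect_to_region` (available in later boto 2.x releases; the function name below is an assumption):

```python
def create_queue_in_region(region_name, queue_name, awskey, awssecret):
    import boto.sqs

    # Without an explicit region, boto falls back to its default SQS
    # endpoint (us-east-1), regardless of where the EC2 instance runs.
    conn = boto.sqs.connect_to_region(region_name,
                                      aws_access_key_id=awskey,
                                      aws_secret_access_key=awssecret)
    return conn.create_queue(queue_name)
```

Passing e.g. "ap-southeast-1" keeps the queue in the same region as the instance, avoiding cross-region latency.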

How to get the version of an object in an Amazon S3 bucket using boto?

After uploading a file by doing this: key = Key(bucket) key.set_contents_from_file(fh) I expect it to return some kind of information for the uploaded file, but it doesn't. I want to maintain a list of all versions of a file. Is there a way to get the latest version key as soon as I upload it? FYI, I'm using boto 1.9b ...
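A sketch of what this would look like, with the caveat that it assumes versioning has been enabled on the bucket and a boto release new enough to record the version: S3 returns an x-amz-version-id header on upload, and later boto versions store it on the key object (this attribute may not exist in 1.9b).

```python
def upload_and_report_version(bucket, name, fh):
    from boto.s3.key import Key

    key = Key(bucket)
    key.key = name
    key.set_contents_from_file(fh)
    # With versioning enabled on the bucket, S3 returns x-amz-version-id;
    # later boto releases record it on the key object after the upload.
    return key.version_id
```

Collecting the returned version IDs as files are uploaded gives the per-file version list without a separate listing call.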

How do I use Avro to process a stream that I cannot seek?

I am using Avro 1.4.0 to read some data out of S3 via the Python Avro bindings and the boto S3 library. When I open an avro.datafile.DataFileReader on the file-like objects returned by boto, it immediately fails when it tries to seek(). For now I am working around this by reading the S3 objects into temporary files. I would like to be a...
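A lighter-weight alternative to temporary files is to copy the object into an in-memory buffer, since `io.BytesIO` supports `seek()` while boto's streaming file objects do not. A sketch (the helper name is an assumption; this trades memory for seekability, so it suits objects that fit in RAM):

```python
import io

def seekable_copy(key):
    # Pull the whole S3 object into an in-memory buffer; io.BytesIO
    # supports seek(), which the stream returned by boto does not.
    return io.BytesIO(key.get_contents_as_string())
```

The buffer can then be handed to the reader, e.g. `DataFileReader(seekable_copy(key), DatumReader())`.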

How to compile python code that uses boto to access S3?

I'm trying to compile a simple Python program that uploads files to an S3 bucket using the boto package into a single, redistributable .exe file. I'm open to any compilation method. So far I've tried both bbfreeze and py2exe, and both yield the same results. The code in question that causes trouble looks like this: import boto #...sni...
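Freezers often miss modules that boto pulls in dynamically, so one thing worth trying is forcing the whole package to be bundled. A sketch of a py2exe setup.py under that assumption; "s3upload.py" is a stand-in for the actual script name:

```python
# setup.py -- minimal py2exe configuration; bundling boto as a whole
# package works around freezers missing dynamically imported modules.
from distutils.core import setup
import py2exe  # registers the py2exe command with distutils

setup(
    console=["s3upload.py"],
    options={"py2exe": {"packages": ["boto"]}},
)
```

bbfreeze has an equivalent knob (its `includes` option) if py2exe is not the chosen route.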

Is it possible to read a file from S3 in Google App Engine using boto?

I want to manipulate a pickled Python object stored in S3 in Google App Engine's sandbox. I use the suggestion in boto's documentation: from boto.s3.connection import S3Connection from boto.s3.key import Key conn = S3Connection(config.key, config.secret_key) bucket = conn.get_bucket('bucketname') key = bucket.get_key("picture.jpg") fp...