I want to manipulate a pickled python object stored in S3 in Google App Engine's sandbox. I use the suggestion in boto's documentation:

from boto.s3.connection import S3Connection
from boto.s3.key import Key

conn = S3Connection(config.key, config.secret_key)
bucket = conn.get_bucket('bucketname')
key = bucket.get_key("picture.jpg")
fp = open("picture.jpg", "wb")  # binary mode for image data
key.get_file(fp)

but this requires me to write to a file, which apparently is not allowed in the GAE sandbox.

How can I get around this? Thanks much for any help

+2  A: 

You can write the data into a StringIO buffer instead of a file, then store it in a blob:

import StringIO

from boto.s3.connection import S3Connection
from boto.s3.key import Key
from google.appengine.ext import db

class Data(db.Model):
    image = db.BlobProperty(default=None)

conn = S3Connection(config.key, config.secret_key)
bucket = conn.get_bucket('bucketname')
key = bucket.get_key("picture.jpg")
fp = StringIO.StringIO()  # in-memory buffer stands in for a file
key.get_file(fp)

data = Data(key_name="picture.jpg")
data.image = db.Blob(fp.getvalue())
data.put()
Shay Erlichmen
thanks, this works. Using StringIO was a great idea.
rd108
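
A minimal standalone sketch of the buffer trick in the answer above (using Python 3's `io.BytesIO`; the `StringIO` module is its Python 2 equivalent): an in-memory buffer exposes the same interface as a file, so code that "writes to a file" never touches the filesystem.

```python
import io

# The buffer supports the same write()/getvalue() calls as an open file,
# so key.get_file(fp) can write into it inside the sandbox.
fp = io.BytesIO()
fp.write(b"picture bytes")   # stands in for key.get_file(fp)
contents = fp.getvalue()     # what ends up stored in db.Blob(...)
assert contents == b"picture bytes"
```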
+2  A: 

You don't need to write to a file or a StringIO at all. You can call key.get_contents_as_string() to return the key's contents as a string. See the boto documentation for the Key class.

Nick Johnson
thanks Nick. This works, and without having to import the StringIO module. I think for obvious reasons that makes it a better solution. For anyone following along at home, I changed the pickle.load(content) to pickle.loads(content) to work with unpickling a string-like, rather than file-like, object.
rd108
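
The `pickle.load` vs `pickle.loads` distinction the comment describes, sketched as a round trip (with `pickle.dumps` standing in for the string that `key.get_contents_as_string()` would return):

```python
import io
import pickle

obj = {"answer": 42}
blob = pickle.dumps(obj)  # stands in for key.get_contents_as_string()

# pickle.loads takes the raw string/bytes directly...
assert pickle.loads(blob) == obj

# ...while pickle.load expects a file-like object, so the
# string would first have to be wrapped in a buffer.
assert pickle.load(io.BytesIO(blob)) == obj
```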