Google itself provides two solutions:

http://code.google.com/appengine/docs/python/tools/uploadingdata.html

One of these is new and experimental and requires you to run a separate command (and separately enter your username and password) for each kind of data you want to back up.

The other requires you to write out the structure of all your kinds twice, information that is already implicit in your models file. (We've got 25 different kinds of data, so I'm sensitive to this kind of duplication. It also means every future change will have to be made in three places.)
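For context, the duplication looks roughly like this: for each kind you define an Exporter (for download) and a Loader (for restore), re-declaring each property the model file already declares. A minimal sketch for a hypothetical `Greeting` kind (the kind name and properties are illustrative, not from the question, and the App Engine SDK must be on the Python path):

```python
# Sketch of a bulkloader Exporter/Loader pair for a hypothetical
# "Greeting" kind. Note that the property list below duplicates
# what the model file already declares -- once per class.
from google.appengine.tools import bulkloader

class GreetingExporter(bulkloader.Exporter):
    def __init__(self):
        # (property name, converter to string, default if missing)
        bulkloader.Exporter.__init__(self, 'Greeting',
                                     [('author',  str, None),
                                      ('content', str, None),
                                      ('date',    str, None)])

class GreetingLoader(bulkloader.Loader):
    def __init__(self):
        # (property name, converter from string)
        bulkloader.Loader.__init__(self, 'Greeting',
                                   [('author',  str),
                                    ('content', str),
                                    ('date',    str)])

# The bulkloader tool discovers these module-level lists.
exporters = [GreetingExporter]
loaders = [GreetingLoader]
```

With 25 kinds, that is 25 such pairs, each of which must be kept in sync with the models by hand.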

Then there's Aral Balkan's solution (google for "gaebar"), but his code on GitHub hasn't been updated in about a year, and he additionally tells people to modify App Engine internals, which seems risky since they change with every release.

I think I'm leaning towards Google's non-experimental solution, but they all seem pretty bad.

+1  A: 

Have a look at AppRocket, an open-source replication engine that synchronizes the Google App Engine datastore with a MySQL database. The project seems to be active.

jbochi
A: 

What's wrong with the --dump functionality? Yes, you have to download each kind separately, but that's going to be the case with any solution. If you just want backups, it fits your requirements exactly.
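For what it's worth, the per-kind invocation looks roughly like this (the app id, kind name, and remote_api path are placeholders; it also prompts for, or caches, your credentials):

```shell
# Dump one kind at a time; repeat for each kind you want backed up.
python bulkloader.py --dump --app_id=yourapp \
    --url=http://yourapp.appspot.com/remote_api \
    --kind=Greeting --filename=greeting.dump

# Restore later from the same file:
python bulkloader.py --restore --app_id=yourapp \
    --url=http://yourapp.appspot.com/remote_api \
    --kind=Greeting --filename=greeting.dump
```

Unlike the Loader/Exporter approach, this needs no per-kind code, only a per-kind command.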

Nick Johnson
Why must I enter my username and password separately for each kind?
You don't need to - just use the cookie caching functionality, which should be enabled by default.
Nick Johnson