I have a fairly large SQLite database that I'd like to import into my Google App Engine Python app.

I've created my models using the App Engine API; they are close to, but not quite identical to, the existing schema. I've written an import script that loads the data from SQLite and creates/saves new App Engine objects, but the App Engine environment blocks me from accessing the sqlite library. The script is only meant to run against my local App Engine instance, and from there I hope to push the data up to Google.

Am I approaching this problem the wrong way, or is there a way to import the sqlite library while running in the local instance's environment?

A: 

According to Google, you're doing it backwards: the app should pull the data from your side, where you have more flexibility in converting it to the new model anyway.

msw
+2  A: 

I would make suitable CSV files from the SQLite data in a separate script, then use the bulk loader to push the data from the CSV files up to App Engine.
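
For the first step, a minimal sketch of dumping one table to CSV with nothing but the standard library (the database path, table, and column names here are placeholders for your own schema):

    import csv
    import sqlite3

    conn = sqlite3.connect('legacy.db')                       # placeholder path
    cursor = conn.execute('SELECT name, email FROM person')   # placeholder table/columns

    # 'wb' mode because this targets the Python 2 SDK era of the csv module.
    with open('person.csv', 'wb') as f:
        writer = csv.writer(f)
        for row in cursor:
            writer.writerow(row)

    conn.close()

The resulting file can then be uploaded with the SDK's bulk loader (appcfg.py upload_data plus a loader config that maps the CSV columns onto your model's properties).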

Alex Martelli
A: 

I haven't had any trouble importing pysqlite2, reading the data, then transforming it and writing it to App Engine using remote_api.

What error are you seeing?
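
Roughly, that approach looks like the sketch below. It is run as a plain local Python script (not inside the sandbox), assumes the remote_api handler is enabled for your app, and uses placeholder model, table, app id, and hostname values:

    import getpass
    import sqlite3  # or: from pysqlite2 import dbapi2 as sqlite3

    from google.appengine.ext import db
    from google.appengine.ext.remote_api import remote_api_stub


    class Person(db.Model):
        # Stand-in for whatever model you've already defined.
        name = db.StringProperty()
        email = db.StringProperty()


    def auth_func():
        return raw_input('Email: '), getpass.getpass('Password: ')

    # App id, handler path, and hostname are placeholders for your app's values.
    remote_api_stub.ConfigureRemoteApi(
        'your-app-id', '/_ah/remote_api', auth_func,
        'your-app-id.appspot.com')

    # Read rows from SQLite, transform as needed, and save them as datastore entities.
    conn = sqlite3.connect('legacy.db')
    for name, email in conn.execute('SELECT name, email FROM person'):
        Person(name=name, email=email).put()
    conn.close()

Pointing the hostname at your local dev server instead of appspot.com lets you test the import locally first.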

dar