First things first, I'd just try a simple:
sqlite3 sqlitedb .dump | psql postgresdb
At that point, just test it out. Write some test scripts in Django to output a set of sample records for each application, and then do a diff to ensure they're identical. If they are, then your conversion is probably fine.
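As a rough illustration of the kind of test script I mean, here is a minimal sketch assuming you have both databases configured as Django DATABASES aliases (the alias names "sqlite" and "postgres" and the choice of model are placeholders, not anything from your project):

    # compare_dbs.py -- sample a few rows from each database and report mismatches
    from django.contrib.auth.models import User

    def sample_rows(alias, model, limit=100):
        """Return field dicts for the first `limit` rows, ordered by primary key."""
        return list(model.objects.using(alias).order_by("pk").values()[:limit])

    def compare(model):
        old_rows = sample_rows("sqlite", model)
        new_rows = sample_rows("postgres", model)
        for old, new in zip(old_rows, new_rows):
            if old != new:
                print("Mismatch on pk %s: %r vs %r" % (old.get("id"), old, new))

    compare(User)  # repeat for each model you care about

Run it against a handful of models from each app; if everything matches, you're probably in good shape.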
If that doesn't work...
I'd recommend against using Django to dump and load the data, since I'm guessing it's not optimized to do so.
Instead, I'd create a second version of your app with the correct PostgreSQL database settings, run syncdb to create all the tables, then copy the data over from SQLite to PostgreSQL using some other tool.
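Roughly, the settings would look something like this (engine string is the pre-1.9, syncdb-era PostgreSQL backend; names and credentials are placeholders):

    DATABASES = {
        "default": {  # the existing SQLite database
            "ENGINE": "django.db.backends.sqlite3",
            "NAME": "app.db",
        },
        "postgres": {  # the new target database
            "ENGINE": "django.db.backends.postgresql_psycopg2",
            "NAME": "appdb",
            "USER": "appuser",
            "PASSWORD": "secret",
            "HOST": "localhost",
        },
    }

Then something like `python manage.py syncdb --database=postgres` should build the empty schema on the PostgreSQL side before you move any data.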
The thing is, most of the problems when converting data over are in the table definitions, etc. Those seem to be the most idiosyncratic. If you can generate a SQL script that is a dump of just the table contents, it should be pretty standard SQL INSERT INTO commands.
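One way to get a data-only dump like that, sketched with Python's built-in sqlite3 module (the filenames are placeholders):

    import sqlite3

    conn = sqlite3.connect("app.db")
    with open("data.sql", "w") as out:
        for line in conn.iterdump():
            # iterdump() emits the schema too; keep only the row data
            if line.startswith("INSERT INTO"):
                out.write(line + "\n")

You can then feed data.sql to psql against the schema that syncdb already created.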
Honestly, I can't see why there would be foreign key problems. Assuming that SQLite is creating accurate foreign keys (and why wouldn't it?), there's no reason they wouldn't copy over correctly. Really, foreign keys aren't a special form of data: the UserProfile.user_id field is no more likely to contain an incorrect value than the UserProfile.photo field is. If the foreign key concern is that the fields themselves aren't correctly identified as foreign key fields (i.e. no constraints), then creating the database with syncdb first will address that problem.
As for truncation: as I understand it, PostgreSQL throws a hard error if data is about to be truncated. I don't know if that is the case with SQLite or if it just truncates silently. Either way, again assuming SQLite isn't somehow munging the data on export, the fields should already contain data of the proper length for the columns they're going into. The only thing I can think of that might affect this is character encoding, so make sure that the PostgreSQL fields have the same encoding as the SQLite tables do, at least during the import.
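A quick sanity check on the encoding side might look like this, sketched with psycopg2 (connection details are placeholders):

    import psycopg2

    conn = psycopg2.connect(dbname="appdb", user="appuser", host="localhost")
    cur = conn.cursor()
    cur.execute("SHOW server_encoding")
    print("PostgreSQL server encoding:", cur.fetchone()[0])  # ideally UTF8

    # SQLite stores text as UTF-8 by default, so pinning the client encoding
    # to UTF8 for the import session keeps both sides consistent.
    cur.execute("SET client_encoding TO 'UTF8'")

If the server encoding isn't UTF8 you may want to create the target database with it explicitly before running the import.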