I am working on a hobby app that will contain a whole slew of basically-hardcoded data, as well as dynamic user data once I deploy it. I want to be able to update the hardcoded data locally (more UPDATEs than INSERTs) and then export it to the server. In other words, I need to dump the data to a file and import it in such a way that new rows (which will be relatively few) are INSERTed, and existing rows (as identified by the PK) are UPDATEd.

Clearly, new rows can't be INSERTed on the server itself, or the PKs could clash and later imports would issue erroneous UPDATEs; that is an acceptable limitation. However, I cannot DELETE the rows to be UPDATEd, let alone drop and recreate the synchronised tables, because the user-accessible tables have FK constraints pointing at the "static" tables.
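For concreteness, the relevant part of the schema looks roughly like this (hypothetical table and column names, heavily simplified):

```sql
-- "Static" reference data, maintained locally and synchronised to the server
CREATE TABLE static_data (
    id    integer PRIMARY KEY,
    name  text NOT NULL,
    value text
);

-- Dynamic user data, created only on the server, referencing the static rows
CREATE TABLE user_data (
    id        serial PRIMARY KEY,
    static_id integer NOT NULL REFERENCES static_data (id),
    payload   text
);
```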
Unfortunately, this seems to be surprisingly difficult to do. Google and the Postgres mailing lists tell me that the MySQL-like ON DUPLICATE KEY UPDATE feature "will be in a new version" (stale information; is it there yet?), along with a suggestion that "pg_loader can do it", with no indication of how.
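For reference, the construct I mean is the MySQL one below (using my hypothetical table from above); whether Postgres has, or will soon have, an equivalent is exactly what I'm asking:

```sql
-- MySQL syntax, shown only to illustrate the semantics I'm after:
-- insert the row, or update it in place if the PK already exists
INSERT INTO static_data (id, name, value)
VALUES (1, 'foo', 'bar')
ON DUPLICATE KEY UPDATE
    name  = VALUES(name),
    value = VALUES(value);
```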
In the worst case, I suppose I can come up with a home-brewed solution (dump a data file, then write a custom import script that checks for PK conflicts and issues INSERT or UPDATE statements as appropriate), but that seems an awfully clumsy solution to a problem that others have surely run into before me.
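If it comes to that, what I have in mind is roughly the following (a sketch only, reusing the hypothetical static_data table and an assumed dump file name; meant to be run through psql so that \copy loads the file client-side):

```sql
BEGIN;

-- Stage the dumped rows in a scratch copy of the static table
CREATE TEMP TABLE static_data_import (LIKE static_data);
\copy static_data_import FROM 'static_data.dump'

-- UPDATE rows whose PK already exists on the server
UPDATE static_data s
SET    name  = i.name,
       value = i.value
FROM   static_data_import i
WHERE  s.id = i.id;

-- INSERT rows whose PK is new
INSERT INTO static_data (id, name, value)
SELECT i.id, i.name, i.value
FROM   static_data_import i
WHERE  NOT EXISTS (SELECT 1 FROM static_data s WHERE s.id = i.id);

COMMIT;
```

(I realise this isn't safe against concurrent writers inserting into static_data between the UPDATE and the INSERT, but since I'm the only one who ever writes to these tables, that shouldn't matter in practice.)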
Any thoughts?