I'm trying to push a large database (1.6 GB across 8 tables) to Heroku via db:push and am running into strange issues.
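For reference, the push command I'm running looks roughly like this (app name and MySQL connection details are placeholders):

heroku db:push mysql://user:password@localhost/mydb --app myapp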

It keeps failing, at different points in the transfer process, with:


HTTP CODE: 500 Taps Server Error: PGError: ERROR: duplicate key value violates unique constraint "letters_pkey"


letters is a large table (1.3 million records), but there aren't any duplicates in the primary key column in the local db (MySQL). What's more, if I rerun, it fails at a different point each time. The first time, about 398,000 of the 1.3 million records made it, then 49,000, then 98,000.
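To double-check the local data, I ran something like this (assuming the primary key column is id; the database name is a placeholder), and it returns no rows:

mysql -e "SELECT id, COUNT(*) AS n FROM letters GROUP BY id HAVING n > 1" mydb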

My theory is that there's a network hiccup, and maybe taps isn't recovering gracefully and ends up trying to rewrite a row it has already inserted?

So I've tried tinkering with the "stream_state" (last_fetched and filter) and rerunning with '--resume-filename push_201009162245.dat', but it immediately throws the same error again. (And I'm not totally sure what I'm doing mucking with that .dat file, as I can't find docs for its schema anywhere.)
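For completeness, the resume invocation looks like this (same placeholder connection details as above); the .dat file appears to be a plain-text dump of the taps session state, but I haven't found anything official describing its format:

heroku db:push mysql://user:password@localhost/mydb --app myapp --resume-filename push_201009162245.dat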

Stumped and hoping someone can help!