I have a table of Products and one of ProductGroups to categorize them. Now the ProductGroups will be removed and their data merged into the Products table.

I wrote a migration that loops through the existing ProductGroup records and adds them as Products, so we don't have to re-enter hundreds of groups manually. Afterwards the product_groups table is dropped.

Then I removed the ProductGroup model, controllers and so on and committed the changes.

This all worked fine in development, but obviously if I now update the production application it will first apply the file system changes (removing the ProductGroup model etc.) and the migration will fail because the model no longer exists.

What is the best way to solve this? Should I not have put the data transfer in a migration? It feels like a catch-22.

+2  A: 

I'm not sure I fully understand the problem, but if you want to do everything in one migration, it might look like this:

class UpdateProducts < ActiveRecord::Migration
  def self.up
    add_column :products, :columns, :types  # placeholders: substitute the real column names and types

    Product.reset_column_information

    Product.all.each do |product|
      group = ProductGroup.find_by_id(product.product_group_id)
      # update_attributes saves the record, so no separate save call is needed;
      # drop the group's id so it doesn't overwrite the product's primary key
      product.update_attributes(group.attributes.except("id")) if group
    end

    system("ruby script/destroy scaffold product_group")

    drop_table :product_groups
  end

  def self.down
    remove_column :products, :columns  # placeholder, as above
    create_table :product_groups
  end
end
Bohdan Pohorilets
Thanks for the suggestion, that's actually more or less what I ended up doing: migrated the data first and then updated the code, removing the ProductGroup model. But if someone else had applied my commit without knowing what I did, they'd certainly have run into trouble. Or if someone else were to update their repo later on. Thankfully that wasn't an issue this time (it's a solo project), but I'd love to know a better solution for the future.
PeterD
That's it. I hadn't considered shelling out via system from a migration, but that's exactly what I wanted to do, thanks!
PeterD