Hi all,

I have thousands of database schemas to update (add a column, update some rows, and insert 3 rows into 2 different tables).

I have 2 different approaches to do so:

1) Prefix each table name with its schema name

# database A01be91a86

UPDATE A01be91a86.ACTION SET CODE_QUALIFICATION....
ALTER TABLE A01be91a86.IMPRESSION ADD COLUMN NAME.....

# database blabla
....
....
# thousands of databases

# database zfc982251d

UPDATE zfc982251d.ACTION SET CODE_QUALIFICATION....
ALTER TABLE zfc982251d.IMPRESSION ADD COLUMN NAME.....

2) Connect to each schema before running its updates

# database A01be91a86

connect A01be91a86
UPDATE ACTION SET CODE_QUALIFICATION....
ALTER TABLE IMPRESSION ADD COLUMN NAME.....

# database blabla
....
....
# thousands of databases

# database zfc982251d

connect zfc982251d
UPDATE ACTION SET CODE_QUALIFICATION....
ALTER TABLE IMPRESSION ADD COLUMN NAME.....

The goal is to minimize the total execution time of the whole script.

What is the best approach? The first one or the second one? Or maybe there is a third one I didn't think of?

Thank you guys

A: 

Premature optimization is the root of all evil...

How long does it currently take, and how long does it need to take?

Zak
It's a SaaS application. The less time, the better. And I have literally tens of thousands to update. It's far from premature.
Thomas
A: 

I think the first might be quicker, but it will be a bigger script and therefore harder to write. Why not restore a backup of the database somewhere and try it out?

Adam Butler
Don't worry, I don't write it by hand; a Ruby script is responsible for that. Switching from the first approach to the second is a one-line change. The databases are customer databases (financial stuff here), so I don't want to mess with them too much :-)
Thomas
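
For illustration, a minimal Ruby sketch of such a generator, assuming the schema list is already in memory. The SET value and the column type below are made-up placeholders, not the real migration, and flipping QUALIFY is the one-line change between the two approaches:

# Minimal sketch of the generator described above.
# Placeholders: the schema list, the SET value, and the column type
# are invented for illustration; substitute the real migration.
QUALIFY = true  # true => approach 1 (schema-qualified names), false => approach 2 (connect per schema)

schemas = %w[A01be91a86 blabla zfc982251d]  # in reality, tens of thousands of entries

statements = [
  "UPDATE %sACTION SET CODE_QUALIFICATION = 'placeholder';",
  "ALTER TABLE %sIMPRESSION ADD COLUMN NAME VARCHAR(255);"
]

schemas.each do |schema|
  puts "# database #{schema}"
  puts "connect #{schema}" unless QUALIFY  # approach 2 switches schema instead of qualifying names
  prefix = QUALIFY ? "#{schema}." : ""
  statements.each { |stmt| puts format(stmt, prefix) }
  puts
end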