At work we are using Oracle (ick) for a client's database (this is a migration from a previous company's project, so the database design was set in stone, unfortunately). This has not caused any major headaches (save for the usual Oracle vs. MSSQL differences), until now: we're at a point where five separate developers are developing five separate enhancements for the system, each using a different branch in Perforce.

Can anyone suggest a good way to keep track of version changes in Oracle: mainly Stored Procedures but also (possibly) changes to the table architecture?

At the moment we're simply attaching 'Create or replace' scripts to FogBugz cases, but there have been times when this has caused changes to go unnoticed, which quickly propagates into a large headache!

N.B. it's not possible to have more than one copy of the database itself, as it's just far too big.

+2  A: 

How about checking the scripts for the procedures into SVN, tagging each check-in with the FB case number so that the changes can be seen directly in FB?

That's what we do for our scripts (mostly MSSQL), but instead of FB we use Jira.

no_one
+6  A: 

A very similar question was asked before http://stackoverflow.com/questions/1717048/database-source-control-with-oracle/1721653#1721653 which I answered then.

Summary - DDL and Stored Procedures are code which should be managed in a version control tool just like Java, C# or whatever.

Stephen ODonnell
Well, that post wasn't popping up in the search: the search on this site can be pretty temperamental, it seems! That was one option we were considering, just checking the scripts we put into FB into Perforce as well, but that means we have to set up some sort of procedure for replicating the database (i.e. make a lot of create scripts for tables), and that could take a while (lots of tables!)
Ed Woodcock
Back when I set up the process for our Oracle application, the initial create table statements were a pain. One idea is to draw a line in the sand and generate a single schema definition file, then moving forward have 'migration' scripts that modify the database. That way a clean install involves running the schema definition and then each migration in turn.
Stephen ODonnell
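To make the migration idea concrete, here's a minimal sketch of what one such script might look like. The table, column, and `schema_version` bookkeeping table are all invented for illustration; any migration-numbering convention of your own would do:

```sql
-- migrations/0001_add_invoice_status.sql
-- Hypothetical incremental change applied on top of the baseline schema.
ALTER TABLE invoices ADD (status VARCHAR2(20) DEFAULT 'OPEN' NOT NULL);

-- Record that this migration has run, so a clean install can replay
-- the baseline definition plus each migration exactly once, in order.
INSERT INTO schema_version (version, applied_on) VALUES (1, SYSDATE);
COMMIT;
```

Each script then gets checked into version control alongside the PL/SQL, so the database history lives in the same place as the code history.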
+2  A: 

You don't say which version of Oracle you're using, and I suspect it is unlikely to be the very latest. But if you are using 11gR2 then you should read up on the Editions feature.

Edition-based redefinition allows us to manage different versions of PL/SQL objects inside the database. It doesn't replace proper version control, but supplements it by allowing different versions of a schema to co-exist in the same instance. It's very cool. Find out more.
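For the curious, the basic moves look something like this (11gR2 syntax; the edition, user, and procedure names are made up for the sketch):

```sql
-- Create a new edition as a child of the default edition.
CREATE EDITION release_2 AS CHILD OF ora$base;

-- Let a developer use it, then switch the session into it.
GRANT USE ON EDITION release_2 TO dev_user;
ALTER SESSION SET EDITION = release_2;

-- Recompiling a procedure now creates a new version visible only in
-- release_2; sessions still on ora$base keep seeing the old one.
CREATE OR REPLACE PROCEDURE calc_totals AS
BEGIN
  NULL;  -- new implementation goes here
END;
/
```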

APC
unfortunately we're on 11gR1, so this isn't an option, but it gave me a convincing case for why we might upgrade one day :)
Ed Woodcock
A: 

As APC said, the Editions feature would work well for this if you have the latest and greatest version of Oracle.

In any case, the other linked answer shows the way to go - have all PL/SQL maintained in version control and keep all DDL changes as patches, which are also entered into version control.

A few practicalities that might affect your case.

If you have a relatively unified database structure and rapidly changing PL/SQL packages then an option is to have one schema holding the tables and the main branch of PL/SQL packages, and give every developer a separate schema for their branch of PL/SQL packages. All tables in the main schema are synonymed over to each developer's schema.

So, you have multiple PL/SQL versions running against a single data store. As developer branches come together these are checked into the main branch and compiled into the main application schema.
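Setting that up is mostly mechanical. A sketch of generating the synonyms for one developer schema (the `MAINAPP` and `DEV_ED` schema names are examples):

```sql
-- Run as a privileged user: point every table in the main schema
-- at a private synonym in the developer's schema.
BEGIN
  FOR t IN (SELECT table_name FROM all_tables WHERE owner = 'MAINAPP') LOOP
    EXECUTE IMMEDIATE
      'CREATE OR REPLACE SYNONYM dev_ed.' || t.table_name ||
      ' FOR mainapp.' || t.table_name;
  END LOOP;
END;
/
```

Re-running it after a table is added to the main schema keeps each developer schema in sync.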

I find it easier to have all PL/SQL packages in each development schema, not just the ones currently under development, but you can make it work either way.

Clearly you still need at least 2 databases so your production environment is safe from all these shenanigans.

Another option if this just doesn't work for you is to give each developer their own database in which to tinker. You said that size is a prohibitive factor here, but you could use the features in Data Pump to restrict the number of rows that are transferred from the main database to each developer database.

For example:

To export only a 5% sample of the rows in every table...

$ expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=dev.dmp SAMPLE=5

To sample 5% of a specific table...

$ expdp scott/tiger DIRECTORY=dpump_dir DUMPFILE=dev.dmp SAMPLE=mybigtable:5

In this way each developer can work with the same database structure but you don't have the same storage issues.

Hope this helps.

Nick Pierpoint