I feel that my shop has a hole because we don't have a solid process in place for versioning our database schema changes. We do a lot of backups so we're more or less covered, but it's bad practice to rely on your last line of defense in this way.

Surprisingly, this seems to be a common thread. Many shops I have spoken to ignore this issue because their databases don't change often, and they basically just try to be meticulous.

However, I know how that story goes. It's only a matter of time before things line up just wrong and something goes missing.

Are there any best practices for this? What are some strategies that have worked for you?

+70  A: 

The databases themselves? No.

The scripts that create them, including static data inserts, stored procedures and the like? Of course. They're text files, included in the project, and checked in and out like everything else.

Of course, in an ideal world your database management tool would do this for you; until then, you just have to be disciplined about it.
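For illustration only (the file, table, and column names here are hypothetical), one such checked-in script might look like this:

-- schema/OrderStatus.sql -- a plain text file, checked in and out like any other source file
CREATE TABLE OrderStatus (
    StatusCode  VARCHAR(10)  NOT NULL PRIMARY KEY,
    Description VARCHAR(100) NOT NULL
);

-- the static data inserts live right next to the DDL, so the whole
-- database can be rebuilt from what is in source control
INSERT INTO OrderStatus (StatusCode, Description) VALUES ('OPEN', 'Order received');
INSERT INTO OrderStatus (StatusCode, Description) VALUES ('CLOSED', 'Order fulfilled');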

blowdart
With MySQL Workbench you can keep all of that in a structured (XML) file that can be opened and handled with a GUI. Since XML is just text, yes, it can be versioned without having to type a single SQL statement.
levhita
+5  A: 

I do, by saving create/update scripts and a script that generates sample data.

Paco
+163  A: 

Must read: Get your database under version control. Check out the series of posts by Scott Allen.

When it comes to version control, the database is often a second or even third-class citizen. From what I've seen, teams that would never think of writing code without version control in a million years-- and rightly so-- can somehow be completely oblivious to the need for version control around the critical databases their applications rely on. I don't know how you can call yourself a software engineer and maintain a straight face when your database isn't under exactly the same rigorous level of source control as the rest of your code. Don't let this happen to you. Get your database under version control.

Gulzar
I follow very closely a methodology described in the referenced articles. You don't need to implement every level, and there are variations which will work equally well. The system is flexible, easily customizable, allows for fine-grained control over schema and data changes, and works very well as a best practice for database source control. The part which can be tricky, and adds almost as much security as the rest of the process, is a tool to help manage the scripts. It can be as simple as file concatenation, or as complex as automated deployments. First get src ctrl, then think about a tool.
ulty4life
+1  A: 

Yes ... our databases are designed in ERwin and the DDLs for each version are automatically generated. The ERwin files are kept in our source code control system (actually, so are our engineering documents).

Steve Moyer
+18  A: 

I absolutely love Rails ActiveRecord migrations. They abstract the DDL into Ruby scripts which can then be easily versioned in your source repository.

However, with a bit of work, you could do the same thing. Any DDL changes (ALTER TABLE, etc.) can be stored in text files. Keep a numbering system (or a date stamp) for the file names, and apply them in sequence.

Rails also has a 'version' table in the DB that keeps track of the last applied migration. You can do the same easily.
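A minimal sketch of that idea in plain SQL (the table, column, and file names are only illustrative, and DDL syntax varies slightly by dialect): each numbered script applies its change and bumps the version, and whatever applies the scripts skips any file at or below the recorded version.

-- schema_version holds the number of the last migration applied
CREATE TABLE schema_version (
    version    INT       NOT NULL,
    applied_at TIMESTAMP NOT NULL
);
INSERT INTO schema_version (version, applied_at) VALUES (0, CURRENT_TIMESTAMP);

-- 004_add_middle_name.sql -- run only when the recorded version is below 4
ALTER TABLE customers ADD COLUMN middle_name VARCHAR(50);
UPDATE schema_version SET version = 4, applied_at = CURRENT_TIMESTAMP;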

Matt Rogish
A: 

We have a weekly SQL dump into a Subversion repo. It's fully automated, but it's a REALLY beefy task.

You'll want to limit the number of revisions, because it really chews up disk space after a while!

Oli
+5  A: 

Yes, we do it by keeping our SQL as part of our build -- we keep DROP.sql, CREATE.sql, USERS.sql, VALUES.sql and version control these, so we can revert back to any tagged version.

We also have ant tasks which can recreate the db whenever needed.

Plus, the SQL is then tagged along with your source code that goes with it.

Dustin
+6  A: 

YES, I think it is important to version your database. Not the data, but the schema for certain.

In Ruby On Rails, this is handled by the framework with "migrations". Any time you alter the db, you make a script that applies the changes and check it into source control.

My shop liked that idea so much that we added the functionality to our Java-based build using shell scripts and Ant. We integrated the process into our deployment routine. It would be fairly easy to write scripts to do the same thing in other frameworks that don't support DB versioning out-of-the-box.

Pete
+2  A: 

I source control the database schema by scripting out all objects (table definitions, indexes, stored procedures, etc.). As for the data itself, I simply rely on regular backups. This ensures that all structural changes are captured with proper revision history, but doesn't burden the database each time data changes.

Ben Hoffstein
+9  A: 

The best practice I have seen is creating a build script to scrap and rebuild your database on a staging server. Each iteration was given a folder for database changes, and all changes were scripted as "Drop... Create" pairs. This way you can roll back to an earlier version at any time by pointing the build at the folder for the version you want.

I believe this was done with NAnt/CruiseControl.

Sara Chipps
+1  A: 

We use replication and clustering to manage our databases, as well as backups. We use Serena to manage our SQL scripts and configuration implementations. Before a configuration change is made, we perform a backup as part of the change management process. This backup satisfies our rollback requirement.

I think it all depends on scale. Are you talking about enterprise applications that need offsite backups and disaster recovery? A small workgroup running an accounting application? Or everywhere in between?

Robert S.
+2  A: 

Hey Brian,

I have everything necessary to recreate my DB from bare metal, minus the data itself. I'm sure there are lots of ways to do it, but all my scripts and such are stored off in subversion and we can rebuild the DB structure and such by pulling all that out of subversion and running an installer.

itsmatt
+5  A: 

Yes. Code is code. My rule of thumb is that I need to be able to build and deploy the application from scratch, without looking at a development or production machine.

Stu Thompson
+2  A: 

At our business we use database change scripts. When a script is run, its name is stored in the database and it won't run again unless that row is removed. Scripts are named based on date, time and code branch, so controlled execution is possible.

Lots and lots of testing is done before the scripts are run in the live environment, so "oopsies" only happen, generally speaking, on development databases.
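A rough sketch of that bookkeeping (the table and script names here are invented): the runner checks the table before executing a script and records the script once it succeeds.

-- one row per change script that has already been applied
CREATE TABLE applied_change_scripts (
    script_name VARCHAR(255) NOT NULL PRIMARY KEY,
    applied_at  TIMESTAMP    NOT NULL
);

-- the tool runs 20081015_1430_trunk_add_invoice_index.sql only if no row
-- with that name exists, then records it so it never runs twice
INSERT INTO applied_change_scripts (script_name, applied_at)
VALUES ('20081015_1430_trunk_add_invoice_index.sql', CURRENT_TIMESTAMP);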

Wes P
+3  A: 

The most successful scheme I've ever used on a project has combined backups and differential SQL files. Basically we would take a backup of our db after every release and do an SQL dump, so that we could also create a blank schema from scratch if we needed to. Then any time you needed to make a change to the DB, you would add an alter script to the sql directory under version control. We would always prefix a sequence number or date to the file name, so the first change would be something like 01_add_created_on_column.sql, and the next script would be 02_added_customers_index. Our CI machine would check for these and run them sequentially on a fresh copy of the db that had been restored from the backup.

We also had some scripts in place that devs could use to re-initialize their local db to the current version with a single command.
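For a concrete feel of the numbered scripts mentioned above (the table and column names are made up here), each file is just the raw DDL for one change:

-- 01_add_created_on_column.sql -- applied by CI, in filename order,
-- against a fresh restore of the last release backup
ALTER TABLE customers ADD COLUMN created_on TIMESTAMP;

-- 02_added_customers_index.sql
CREATE INDEX idx_customers_created_on ON customers (created_on);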

Mike Deck
A: 

I suppose it depends on what you mean by 'your databases'.

Data should be backed up regularly as part of a maintenance schedule (i.e. frequent log file backups, regular data file backups, and preferably offsite backups).

The schema scripts should be under source control. I prefer the baseline and incremental change script approach, with all database changes (data and schema) scripted, versioned and in source control. This approach also means that you can rebuild databases to specific versions as part of automated build processes.

Mitch Wheat
+1  A: 

We have our Create/Alter scripts under source control. As for the database itself, when you have hundreds of tables and a lot of data being processed every minute, versioning the entire database would be a CPU and disk killer. That's why backups are still, in my opinion, the best way to control your data.

David
A: 

We insist upon change scripts and a master data definition script. These are checked into CVS along with any other source code. The PL/SQL (we are an Oracle shop) is also source controlled in CVS. The change scripts are repeatable and can be passed to everyone on the team. Basically, just because it is a database, there is never an excuse not to code it and use a source control system to track the changes.

dacracot
+19  A: 

You should never just log in and start entering "ALTER TABLE" commands to change a production database. The project I'm on has a database on every customer site, so every change to the database is made in two places: a dump file that is used to create a new database on a new customer site, and an update file that is run on every update, which checks your current database version number against the highest number in the file and updates your database in place. So, for instance, the last couple of updates:

if [ $VERSION \< '8.0.108' ] ; then
  psql -U cosuser $dbName << EOF8.0.108
    BEGIN TRANSACTION;
    --
    -- Remove foreign key that shouldn't have been there.
    -- PCR:35665
    --
    ALTER TABLE     migratorjobitems
    DROP CONSTRAINT migratorjobitems_destcmaid_fkey;
    -- 
    -- Increment the version
    UPDATE          sys_info
    SET             value = '8.0.108'
    WHERE           key = 'DB VERSION';
    END TRANSACTION;
EOF8.0.108
fi

if [ $VERSION \< '8.0.109' ] ; then
  psql -U cosuser $dbName << EOF8.0.109
    BEGIN TRANSACTION;
    --
    -- I missed a couple of cases when I changed the legacy playlist
    -- from reporting showplaylistidnum to playlistidnum
    --
    ALTER TABLE     featureidrequestkdcs
    DROP CONSTRAINT featureidrequestkdcs_cosfeatureid_fkey;
    ALTER TABLE     featureidrequestkdcs
    ADD CONSTRAINT  featureidrequestkdcs_cosfeatureid_fkey
    FOREIGN KEY     (cosfeatureid)
    REFERENCES      playlist(playlistidnum)
    ON DELETE       CASCADE;
    --
    ALTER TABLE     ticket_system_ids
    DROP CONSTRAINT ticket_system_ids_showplaylistidnum_fkey;
    ALTER TABLE     ticket_system_ids
    RENAME          showplaylistidnum
    TO              playlistidnum;
    ALTER TABLE     ticket_system_ids
    ADD CONSTRAINT  ticket_system_ids_playlistidnum_fkey
    FOREIGN KEY     (playlistidnum)
    REFERENCES      playlist(playlistidnum)
    ON DELETE       CASCADE;
    -- 
    -- Increment the version
    UPDATE          sys_info
    SET             value = '8.0.109'
    WHERE           key = 'DB VERSION';
    END TRANSACTION;
EOF8.0.109
fi

I'm sure there is a better way to do this, but it's worked for me so far.

Paul Tomblin
We do at least work off of a development database and then push changes to production in a coordinated way. :) Thanks for the input, this is an interesting approach.
Brian MacKay
We do a similar thing except we put each "if version" in a separate file and have a tool that runs the files in order.
jwanagel
We're also working on a similar thing, except SQL scripts are installed (new install or upgrade) along with app files, and the location and date and time of script execution are logged.
Si
I too have written something almost exactly like this, but for Jet (e.g. MS Access) databases. We're currently using DB Ghost for SQL Server, which does a lot of this for you.
Kenny Evitt
+4  A: 

I typically build an SQL script for every change I make, and another to revert those changes, and keep those scripts under version control.

Then we have a means to create a new up-to-date database on demand, and can easily move between revisions. Every time we do a release, we lump the scripts together (takes a bit of manual work, but it's rarely actually hard) so we also have a set of scripts that can convert between versions.

Yes, before you say it, this is very similar to the stuff Rails and others do, but it seems to work pretty well, so I have no problems admitting that I shamelessly lifted the idea :)
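A small sketch of the paired scripts described above, assuming one apply file and one revert file per change (the file, table, and column names are invented here):

-- 0042_add_discount_column.sql (apply)
ALTER TABLE orders ADD COLUMN discount DECIMAL(5,2);

-- 0042_add_discount_column.revert.sql (revert)
ALTER TABLE orders DROP COLUMN discount;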

Dan
+1  A: 

We're in the process of moving all the databases to source control. We're using SQL Compare to script out the database (a Professional Edition feature, unfortunately) and putting the result into SVN.

The success of your implementation will depend a lot on the culture and practices of your organization. People here believe in creating a database per application. There is also a common set of databases used by most applications, causing a lot of inter-database dependencies (some of them circular). Putting the database schemas into source control has been notoriously difficult because of the inter-database dependencies our systems have.

Best of luck to you; the sooner you try it out, the sooner you'll have your issues sorted out.

Min
+3  A: 

We do source control all our database objects. And just to keep developers honest (because you can create objects without them being in source control), our DBAs periodically look for anything not in source control, and if they find anything, they drop it without asking whether it is OK.

HLGEM
Wow, that's a really brutal policy! I'm not 100% sure that's in the best interests of anyone except the DBAs. :)
Brian MacKay
On the other hand everything is in source control.
HLGEM
+6  A: 

I'm ashamed, but we don't do this at all. My project's database is 30GB so we've been lazy about it. But thanks to stacko, it's going on my list right now. Good job, website! "Better programming through shame"

Kevin Conner
Thanks for the honesty Kevin. :)
Brian MacKay
+1  A: 

I have used the dbdeploy tool from ThoughtWorks at http://dbdeploy.com/. It encourages the use of migration scripts. Each release, we consolidated the change scripts into a single file to ease understanding and to allow DBAs to 'bless' the changes.

David Medinets
+1  A: 

I always check my database structure dumps into source control. Full database dumps however I normally just compress and put away for storage.

Adam Gibbins
A: 

I version control the create script, and I use the svn version tag within it. Then, whenever I get a version that is going to be used, I create a script in a dbpatches/ directory named as the version to roll up to. The job of that script is to modify a current database without destroying the data. dbpatches/, for example, might have files named 201, 220, and 240. If the database is currently at level 201, apply patch 220, then patch 240.

DROP TABLE IF EXISTS `meta`;
CREATE TABLE `meta` (
  `property` varchar(255),
  `value` varchar(255),
  PRIMARY KEY (`property`)
) ENGINE=InnoDB DEFAULT CHARSET=latin1;
INSERT INTO `meta` VALUES ('version', '$Rev: 240 $');

Don't forget to test your code before considering a patch good. Caveat emptor!

A: 

We maintain DDL (and sometimes DML) scripts generated by our ER tool (PowerAMC).

We have a bunch of shell scripts which rename the scripts, starting with a number, on the trunk branch. Each script is committed and tagged with the Bugzilla number.

These scripts are then at need merged within the release branches along with the application code.

We have a table recording the scripts and their status. Each script is executed in order and recorded in this table on each install by the deploying tool.

tal
+2  A: 

I use SQL CREATE scripts exported from MySQL Workbench; then, using its "Export SQL ALTER" functionality, I end up with a series of create scripts (numbered, of course) and the alter scripts that can apply the changes between them.

3. Export SQL ALTER script: Normally you would have to write the ALTER TABLE statements by hand now, reflecting the changes you made to the model. But you can be smart and let Workbench do the hard work for you. Simply select File -> Export -> Forward Engineer SQL ALTER Script… from the main menu.

This will prompt you to specify the SQL CREATE file the current model should be compared to.

Select the SQL CREATE script from step 1. The tool will then generate the ALTER TABLE script for you and you can execute this script against your database to bring it up to date.

You can do this using the MySQL Query Browser or the mysql client. Voila! Your model and database have now been synchronized!

Source: MySQL Workbench Community Edition: Guide to Schema Synchronization

All these scripts, of course, are kept under version control.

levhita
+1  A: 

My team versions our database schema as C# classes with the rest of our code. We have a homegrown C# program (<500 lines of code) that reflects over the classes and creates SQL commands to build, drop and update the database. After creating the database we run sqlmetal to generate a LINQ mapping, which is then compiled in another project that is used to generate test data. The whole thing works really well because data access is checked at compile time. We like it because the schema is stored in a .cs file which is easy to track and compare in Trac/SVN.

+16  A: 

Check out LiquiBase for managing database changes using source control.

killdash10
We use this for our projects as well - works great so far.
agartzke
+2  A: 

This has always been a big annoyance for me too - it seems like it is just way too easy to make a quick change to your development database, save it (forgetting to save a change script), and then you're stuck. You could undo what you just did and redo it to create the change script, or write it from scratch of course, though that's a lot of time spent writing scripts.

A tool that I have used in the past that has helped with this is SQL Delta. It will show you the differences between two databases (SQL Server/Oracle, I believe) and generate all the change scripts necessary to migrate A->B. Another nice thing it does is show all the differences in database content between the production (or test) DB and your development DB. Since more and more apps store configuration and state that is crucial to their execution in database tables, it can be a real pain to have change scripts that remove, add, and alter the proper rows. SQL Delta shows the rows in the database just like they would look in a diff tool - changed, added, deleted.

An excellent tool. Here is the link: http://www.sqldelta.com/

Sam Schutte
+3  A: 

Yes, always. You should be able to recreate your production database structure with a useful set of sample data whenever needed. If you don't, over time the minor changes made to keep things running get forgotten, and then one day you get bitten, big time. It's insurance that you might not think you need, but the day you do, it's worth the price ten times over!

AndrewB
+1  A: 

RedGate software makes some great tools that will help you version your database. Also, try to have your devs build their own isolated local databases for dev work rather than relying on a "dev server" which may be down at any given time.

Karthik Hariharan
Thanks for your kind words about our software. We've just released SQL Source Control, which makes it even easier to share changes amongst isolated development databases. I hope you get a chance to try this out and let me know if it works for you. This assumes you're using TFS or SVN.
David Atkinson
+2  A: 

RedGate is great, we generate new snapshots when database changes are made (a tiny binary file) and keep that file in the projects as a resource. Whenever we need to update the database, we use RedGate's toolkit to update the database, as well as being able to create new databases from empty ones.

RedGate also makes data snapshots; while I haven't personally worked with them, they are just as robust.

Tom Anderson
Red Gate's SQL Source Control has been developed to address this problem, so please take a look and let us know if it does or doesn't meet your requirements. The advantage of SQL Source Control above SQL Compare is that it integrates with SSMS and therefore doesn't require a separate tool to be loaded to log different schema versions. [I'm a product manager at Red Gate]
David Atkinson
+2  A: 

FYI This was also brought up a few days ago by Dana ... Stored procedures/DB schema in source control

Robert Paulson
+1  A: 

I have used RedGate SQL Compare Pro for schema synchronization with a script folder, then I commit all my updates to version control. It works great.

A: 

Your project team can have a DBA to whom every developer forwards their create, alter, delete, and insert/update (for master data) SQL statements. The DBA runs those queries and, on successfully making the required update, adds those statements to a text file or a spreadsheet. Each addition can be labeled as a savepoint. In case you need to revert to a particular savepoint, just drop everything and run the queries up to the labelled savepoint. This approach is just a thought... a bit of fine-tuning would make it work for your development environment.
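As a sketch only, the DBA's running script file might look like this, each labelled savepoint marking a point you can rebuild up to (the labels, tables, and statements are invented):

-- changes.sql, maintained by the DBA; to revert to savepoint 002,
-- drop everything and re-run the file only down to that marker
-- SAVEPOINT 001
CREATE TABLE products (product_id INT PRIMARY KEY, name VARCHAR(100));
-- SAVEPOINT 002
ALTER TABLE products ADD COLUMN price DECIMAL(10,2);
-- SAVEPOINT 003
INSERT INTO products (product_id, name, price) VALUES (1, 'Widget', 9.99);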

A: 

Any database interface code absolutely should go into version control (Stored Procedures, Functions, etc).

For structure and data, it is a judgement call. I personally keep a clean structural template of my databases around, but don't store them in version control, due to the size. But storing it in version control can be very beneficial, even for just having a history.

pearcewg
A: 

A big problem, often overlooked, is that larger web-based systems require a transitional period or a bucket-testing approach to making new releases. This makes it essential to have both rollback and a mechanism for supporting the old and new schema in the same DB. This requires a scaffolding approach (popularized by the Agile DB folks). In this scenario, lack of process in DB source control can be a total disaster. You need old schema scripts, new schema scripts and a set of intermediate scripts, as well as a tidy-up once the system is fully on the new version (or rolled back).

Rather than having scripts to recreate schema from scratch, what is required is a state based approach, where you need scripts purely to move the DB into the state you require, both forward and back, from version to version. Your DB becomes a series of state scripts, which can be easily source controlled and tagged along with the rest of the source.

+3  A: 

There has been a lot of discussion about the database model itself, but we also keep the required data in .SQL files.

For example, in order to be useful your application might need this in the install:

INSERT INTO Currency (CurrencyCode, CurrencyName) 
VALUES ('AUD', 'Australian Dollars');

INSERT INTO Currency (CurrencyCode, CurrencyName) 
VALUES ('USD', 'US Dollars');

We would have a file called currency.sql under subversion. As a manual step in the build process, we compare the previous currency.sql to the latest one and write an upgrade script.
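The hand-written upgrade script can be kept idempotent so it is safe to run against databases that already contain some of the rows; a rough sketch (the new row is invented, and some dialects need a FROM DUAL or similar on the SELECT):

-- upgrade step: add the new reference row only if it is not already present
INSERT INTO Currency (CurrencyCode, CurrencyName)
SELECT 'EUR', 'Euros'
WHERE NOT EXISTS (SELECT 1 FROM Currency WHERE CurrencyCode = 'EUR');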

WW
We keep the required data in a database (who would have thunk?), then use our tools to generate these insert/update scripts to keep the reference data in sync between dev, QA, production, etc. It's so much easier to manage the data and the changes this way. The scripts are all controlled by our version/config tools.
Karen Lopez
+4  A: 

The new Database projects in Visual Studio provide source control and change scripts.

They have a nice tool that compares databases and can generate a script that converts the schema of one into the other, or updates the data in one to match the other.

The db schema is "shredded" to create many, many small .sql files, one per DDL command that describes the DB.

+tom


Additional info 2008-11-30

I have been using it as a developer for the past year and really like it. It makes it easy to compare my dev work to production and generate a script to use for the release. I don't know if it is missing features that DBAs need for "enterprise-type" projects.

Because the schema is "shredded" into sql files the source control works fine.

One gotcha is that you need a different mindset when you use a db project. The tool gives you a "db project" in VS, which is just the SQL, plus an automatically generated local database which has the schema and some other admin data -- but none of your application data -- plus your local dev db that you use for app data dev work. You are rarely aware of the automatically generated db, but you have to know it's there so you can leave it alone :). This special db is clearly recognizable because it has a GUID in its name.

The VS DB Project does a nice job of integrating db changes that other team members have made into your local project/associated db, but you need to take the extra step of comparing the project schema with your local dev db schema and applying the mods. It makes sense, but it seems awkward at first.

DB Projects are a very powerful tool. They not only generate scripts but can apply them immediately. Be sure not to destroy your production db with it. ;)

I really like the VS DB projects and I expect to use this tool for all my db projects going forward.

+tom

Tom A
Interesting. So is this usable for enterprise-type projects?
Brian MacKay
I have been using it as a developer for the past year and really like it. It makes it easy to compare my dev work to production and generate a script to use for the release. I don't know if it is missing features that DBAs need for "enterprise-type" projects. I will add more to my entry. +tom
Tom A
+2  A: 

Here is a sample poor man's solution: a trigger implementing tracking of changes on db objects (via DDL statements) in a SQL Server 2005/2008 database. It also contains a simple example of how to enforce the usage of a required <Version> XML tag in the source code of each SQL command run on the database, plus tracking of the current db version and type (dev, test, qa, fb, prod). One could extend it with additional required attributes. The code is rather long - it creates the empty database, the needed tracking table structure, the required db functions and the populating trigger, all running under a [ga] schema.

USE [master]
GO

/****** Object:  Database [DBGA_DEV]    Script Date: 04/22/2009 13:22:01 ******/
CREATE DATABASE [DBGA_DEV] ON  PRIMARY 
( NAME = N'DBGA_DEV', FILENAME = N'D:\GENAPP\DATA\DBFILES\DBGA_DEV.mdf' , SIZE = 3072KB , MAXSIZE = UNLIMITED, FILEGROWTH = 1024KB )
 LOG ON 
( NAME = N'DBGA_DEV_log', FILENAME = N'D:\GENAPP\DATA\DBFILES\DBGA_DEV_log.ldf' , SIZE = 6208KB , MAXSIZE = 2048GB , FILEGROWTH = 10%)
GO

ALTER DATABASE [DBGA_DEV] SET COMPATIBILITY_LEVEL = 100
GO

IF (1 = FULLTEXTSERVICEPROPERTY('IsFullTextInstalled'))
begin
EXEC [DBGA_DEV].[dbo].[sp_fulltext_database] @action = 'enable'
end
GO

ALTER DATABASE [DBGA_DEV] SET ANSI_NULL_DEFAULT OFF 
GO

ALTER DATABASE [DBGA_DEV] SET ANSI_NULLS OFF 
GO

ALTER DATABASE [DBGA_DEV] SET ANSI_PADDING ON 
GO

ALTER DATABASE [DBGA_DEV] SET ANSI_WARNINGS OFF 
GO

ALTER DATABASE [DBGA_DEV] SET ARITHABORT OFF 
GO

ALTER DATABASE [DBGA_DEV] SET AUTO_CLOSE OFF 
GO

ALTER DATABASE [DBGA_DEV] SET AUTO_CREATE_STATISTICS ON 
GO

ALTER DATABASE [DBGA_DEV] SET AUTO_SHRINK OFF 
GO

ALTER DATABASE [DBGA_DEV] SET AUTO_UPDATE_STATISTICS ON 
GO

ALTER DATABASE [DBGA_DEV] SET CURSOR_CLOSE_ON_COMMIT OFF 
GO

ALTER DATABASE [DBGA_DEV] SET CURSOR_DEFAULT  GLOBAL 
GO

ALTER DATABASE [DBGA_DEV] SET CONCAT_NULL_YIELDS_NULL OFF 
GO

ALTER DATABASE [DBGA_DEV] SET NUMERIC_ROUNDABORT OFF 
GO

ALTER DATABASE [DBGA_DEV] SET QUOTED_IDENTIFIER OFF 
GO

ALTER DATABASE [DBGA_DEV] SET RECURSIVE_TRIGGERS OFF 
GO

ALTER DATABASE [DBGA_DEV] SET  DISABLE_BROKER 
GO

ALTER DATABASE [DBGA_DEV] SET AUTO_UPDATE_STATISTICS_ASYNC OFF 
GO

ALTER DATABASE [DBGA_DEV] SET DATE_CORRELATION_OPTIMIZATION OFF 
GO

ALTER DATABASE [DBGA_DEV] SET TRUSTWORTHY OFF 
GO

ALTER DATABASE [DBGA_DEV] SET ALLOW_SNAPSHOT_ISOLATION OFF 
GO

ALTER DATABASE [DBGA_DEV] SET PARAMETERIZATION SIMPLE 
GO

ALTER DATABASE [DBGA_DEV] SET READ_COMMITTED_SNAPSHOT OFF 
GO

ALTER DATABASE [DBGA_DEV] SET HONOR_BROKER_PRIORITY OFF 
GO

ALTER DATABASE [DBGA_DEV] SET  READ_WRITE 
GO

ALTER DATABASE [DBGA_DEV] SET RECOVERY FULL 
GO

ALTER DATABASE [DBGA_DEV] SET  MULTI_USER 
GO

ALTER DATABASE [DBGA_DEV] SET PAGE_VERIFY CHECKSUM  
GO

ALTER DATABASE [DBGA_DEV] SET DB_CHAINING OFF 
GO

EXEC [DBGA_DEV].sys.sp_addextendedproperty @name=N'DbType', @value=N'DEV' 
GO

EXEC [DBGA_DEV].sys.sp_addextendedproperty @name=N'DbVersion', @value=N'0.0.1.20090414.1100' 
GO



USE [DBGA_DEV]
GO
/****** Object:  Schema [ga]    Script Date: 04/22/2009 13:21:29 ******/
CREATE SCHEMA [ga] AUTHORIZATION [dbo]
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'Contains the objects of the Generic Application database' , @level0type=N'SCHEMA',@level0name=N'ga'
GO
/****** Object:  Table [ga].[tb_DataMeta_ObjChangeLog]    Script Date: 04/22/2009 13:21:40 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [ga].[tb_DataMeta_ObjChangeLog](
 [LogId] [int] IDENTITY(1,1) NOT NULL,
 [TimeStamp] [timestamp] NOT NULL,
 [DatabaseName] [varchar](256) NOT NULL,
 [SchemaName] [varchar](256) NOT NULL,
 [DbVersion] [varchar](20) NOT NULL,
 [DbType] [varchar](20) NOT NULL,
 [EventType] [varchar](50) NOT NULL,
 [ObjectName] [varchar](256) NOT NULL,
 [ObjectType] [varchar](25) NOT NULL,
 [Version] [varchar](50) NULL,
 [SqlCommand] [varchar](max) NOT NULL,
 [EventDate] [datetime] NOT NULL,
 [LoginName] [varchar](256) NOT NULL,
 [FirstName] [varchar](256) NULL,
 [LastName] [varchar](50) NULL,
 [ChangeDescription] [varchar](1000) NULL,
 [Description] [varchar](1000) NULL,
 [ObjVersion] [varchar](20) NOT NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'The database version as written in the extended prop of the database' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'TABLE',@level1name=N'tb_DataMeta_ObjChangeLog', @level2type=N'COLUMN',@level2name=N'DbVersion'
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'dev , test , qa , fb or prod' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'TABLE',@level1name=N'tb_DataMeta_ObjChangeLog', @level2type=N'COLUMN',@level2name=N'DbType'
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'The name of the object as it is registered in the sys.objects ' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'TABLE',@level1name=N'tb_DataMeta_ObjChangeLog', @level2type=N'COLUMN',@level2name=N'ObjectName'
GO
EXEC sys.sp_addextendedproperty @name=N'MS_Description', @value=N'' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'TABLE',@level1name=N'tb_DataMeta_ObjChangeLog', @level2type=N'COLUMN',@level2name=N'Description'
GO
SET IDENTITY_INSERT [ga].[tb_DataMeta_ObjChangeLog] ON
INSERT [ga].[tb_DataMeta_ObjChangeLog] ([LogId], [DatabaseName], [SchemaName], [DbVersion], [DbType], [EventType], [ObjectName], [ObjectType], [Version], [SqlCommand], [EventDate], [LoginName], [FirstName], [LastName], [ChangeDescription], [Description], [ObjVersion]) VALUES (3, N'DBGA_DEV', N'en', N'0.0.1.20090414.1100', N'DEV', N'DROP_TABLE', N'tb_BL_Products', N'TABLE', N' some', N'<EVENT_INSTANCE><EventType>DROP_TABLE</EventType><PostTime>2009-04-22T11:03:11.880</PostTime><SPID>57</SPID><ServerName>YSG</ServerName><LoginName>ysg\yordgeor</LoginName><UserName>dbo</UserName><DatabaseName>DBGA_DEV</DatabaseName><SchemaName>en</SchemaName><ObjectName>tb_BL_Products</ObjectName><ObjectType>TABLE</ObjectType><TSQLCommand><SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE"/><CommandText>drop TABLE [en].[tb_BL_Products] --<Version> some</Version>&#x0D;
</CommandText></TSQLCommand></EVENT_INSTANCE>', CAST(0x00009BF300B6271C AS DateTime), N'ysg\yordgeor', N'Yordan', N'Georgiev', NULL, NULL, N'0.0.0')
INSERT [ga].[tb_DataMeta_ObjChangeLog] ([LogId], [DatabaseName], [SchemaName], [DbVersion], [DbType], [EventType], [ObjectName], [ObjectType], [Version], [SqlCommand], [EventDate], [LoginName], [FirstName], [LastName], [ChangeDescription], [Description], [ObjVersion]) VALUES (4, N'DBGA_DEV', N'en', N'0.0.1.20090414.1100', N'DEV', N'CREATE_TABLE', N'tb_BL_Products', N'TABLE', N' 2.2.2 ', N'<EVENT_INSTANCE><EventType>CREATE_TABLE</EventType><PostTime>2009-04-22T11:03:18.620</PostTime><SPID>57</SPID><ServerName>YSG</ServerName><LoginName>ysg\yordgeor</LoginName><UserName>dbo</UserName><DatabaseName>DBGA_DEV</DatabaseName><SchemaName>en</SchemaName><ObjectName>tb_BL_Products</ObjectName><ObjectType>TABLE</ObjectType><TSQLCommand><SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE"/><CommandText>CREATE TABLE [en].[tb_BL_Products](&#x0D;
 [ProducId] [int] NULL,&#x0D;
 [ProductName] [nchar](10) NULL,&#x0D;
 [ProductDescription] [varchar](5000) NULL&#x0D;
) ON [PRIMARY]&#x0D;
/*&#x0D;
<Version> 2.2.2 </Version>&#x0D;
&#x0D;
*/&#x0D;
</CommandText></TSQLCommand></EVENT_INSTANCE>', CAST(0x00009BF300B62F07 AS DateTime), N'ysg\yordgeor', N'Yordan', N'Georgiev', NULL, NULL, N'0.0.0')
INSERT [ga].[tb_DataMeta_ObjChangeLog] ([LogId], [DatabaseName], [SchemaName], [DbVersion], [DbType], [EventType], [ObjectName], [ObjectType], [Version], [SqlCommand], [EventDate], [LoginName], [FirstName], [LastName], [ChangeDescription], [Description], [ObjVersion]) VALUES (5, N'DBGA_DEV', N'en', N'0.0.1.20090414.1100', N'DEV', N'DROP_TABLE', N'tb_BL_Products', N'TABLE', N' 2.2.2 ', N'<EVENT_INSTANCE><EventType>DROP_TABLE</EventType><PostTime>2009-04-22T11:25:12.620</PostTime><SPID>57</SPID><ServerName>YSG</ServerName><LoginName>ysg\yordgeor</LoginName><UserName>dbo</UserName><DatabaseName>DBGA_DEV</DatabaseName><SchemaName>en</SchemaName><ObjectName>tb_BL_Products</ObjectName><ObjectType>TABLE</ObjectType><TSQLCommand><SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE"/><CommandText>drop TABLE [en].[tb_BL_Products] &#x0D;
</CommandText></TSQLCommand></EVENT_INSTANCE>', CAST(0x00009BF300BC32F1 AS DateTime), N'ysg\yordgeor', N'Yordan', N'Georgiev', NULL, NULL, N'0.0.0')
INSERT [ga].[tb_DataMeta_ObjChangeLog] ([LogId], [DatabaseName], [SchemaName], [DbVersion], [DbType], [EventType], [ObjectName], [ObjectType], [Version], [SqlCommand], [EventDate], [LoginName], [FirstName], [LastName], [ChangeDescription], [Description], [ObjVersion]) VALUES (6, N'DBGA_DEV', N'en', N'0.0.1.20090414.1100', N'DEV', N'CREATE_TABLE', N'tb_BL_Products', N'TABLE', N' 2.2.2 ', N'<EVENT_INSTANCE><EventType>CREATE_TABLE</EventType><PostTime>2009-04-22T11:25:19.053</PostTime><SPID>57</SPID><ServerName>YSG</ServerName><LoginName>ysg\yordgeor</LoginName><UserName>dbo</UserName><DatabaseName>DBGA_DEV</DatabaseName><SchemaName>en</SchemaName><ObjectName>tb_BL_Products</ObjectName><ObjectType>TABLE</ObjectType><TSQLCommand><SetOptions ANSI_NULLS="ON" ANSI_NULL_DEFAULT="ON" ANSI_PADDING="ON" QUOTED_IDENTIFIER="ON" ENCRYPTED="FALSE"/><CommandText>CREATE TABLE [en].[tb_BL_Products](&#x0D;
 [ProducId] [int] NULL,&#x0D;
 [ProductName] [nchar](10) NULL,&#x0D;
 [ProductDescription] [varchar](5000) NULL&#x0D;
) ON [PRIMARY]&#x0D;
/*&#x0D;
<Version> 2.2.2 </Version>&#x0D;
&#x0D;
*/&#x0D;
</CommandText></TSQLCommand></EVENT_INSTANCE>', CAST(0x00009BF300BC3A69 AS DateTime), N'ysg\yordgeor', N'Yordan', N'Georgiev', NULL, NULL, N'0.0.0')
SET IDENTITY_INSERT [ga].[tb_DataMeta_ObjChangeLog] OFF
/****** Object:  Table [ga].[tb_BLSec_LoginsForUsers]    Script Date: 04/22/2009 13:21:40 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [ga].[tb_BLSec_LoginsForUsers](
 [LoginsForUsersId] [int] IDENTITY(1,1) NOT NULL,
 [LoginName] [nvarchar](100) NOT NULL,
 [FirstName] [varchar](100) NOT NULL,
 [SecondName] [varchar](100) NULL,
 [LastName] [varchar](100) NOT NULL,
 [DomainName] [varchar](100) NOT NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO
SET IDENTITY_INSERT [ga].[tb_BLSec_LoginsForUsers] ON
INSERT [ga].[tb_BLSec_LoginsForUsers] ([LoginsForUsersId], [LoginName], [FirstName], [SecondName], [LastName], [DomainName]) VALUES (1, N'ysg\yordgeor', N'Yordan', N'Stanchev', N'Georgiev', N'yordgeor')
SET IDENTITY_INSERT [ga].[tb_BLSec_LoginsForUsers] OFF
/****** Object:  Table [en].[tb_BL_Products]    Script Date: 04/22/2009 13:21:40 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
SET ANSI_PADDING ON
GO
CREATE TABLE [en].[tb_BL_Products](
 [ProducId] [int] NULL,
 [ProductName] [nchar](10) NULL,
 [ProductDescription] [varchar](5000) NULL
) ON [PRIMARY]
GO
SET ANSI_PADDING ON
GO
/****** Object:  StoredProcedure [ga].[procUtils_SqlCheatSheet]    Script Date: 04/22/2009 13:21:37 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE PROCEDURE [ga].[procUtils_SqlCheatSheet]                

as                 
set nocount on                 

--what was the name of the table with something like role                
/*                
SELECT * from sys.tables where [name] like '%POC%'                
*/                
-- what are the columns of this table                 
/*                
select column_name , DATA_TYPE , CHARACTER_MAXIMUM_LENGTH,  table_name  from Information_schema.columns where table_name='tbGui_ExecutePOC'                
*/                

-- find proc        
--what was the name of procedure with something like role                
/*                
select * from sys.procedures where [name] like '%ext%'                
exec sp_HelpText procName        
*/                
/*                
exec sp_helpText procUtils_InsertGenerator                
*/                
--how to list all databases in sql server                 
/*                
SELECT database_id AS ID, NULL AS ParentID, name AS Text FROM sys.databases ORDER BY [name]                
*/                

--HOW-TO LIST ALL TABLES IN A SQL SERVER 2005 DATABASE                
/*                
SELECT TABLE_NAME FROM [POC].INFORMATION_SCHEMA.TABLES                
WHERE TABLE_TYPE = 'BASE TABLE'                  
AND TABLE_NAME <> 'dtproperties'                  
ORDER BY TABLE_NAME                


*/                
--HOW-TO ENABLE XP_CMDSHELL START                
-------------------------------------------------------------------------                
-- configure verbose mode temporarily                 
-- EXECUTE sp_configure 'show advanced options', 1                 
-- RECONFIGURE WITH OVERRIDE                 
--GO                 


--ENABLE xp_cmdshell                 
-- EXECUTE sp_configure 'xp_cmdshell', '1'                 
-- RECONFIGURE WITH OVERRIDE                 
-- EXEC SP_CONFIGURE 'show advanced option', '1';                 
-- SHOW THE CONFIGURATION                 
-- EXEC SP_CONFIGURE;                 


--turn show advance options off                 
-- GO                 
--EXECUTE sp_configure 'show advanced options', 0                 
-- RECONFIGURE WITH OVERRIDE                 
-- GO                

--HOW-TO ENABLE XP_CMDSHELL END                
-------------------------------------------------------------------------                

--HOW-TO IMPLEMENT SLEEP                 
-- sleep for 10 seconds                 
-- WAITFOR DELAY '00:00:10' SELECT * FROM My_Table                

/* LIST ALL PRIMARY KEYS                 

SELECT                 
  INFORMATION_SCHEMA.TABLE_CONSTRAINTS.TABLE_NAME AS TABLE_NAME,                
  INFORMATION_SCHEMA.KEY_COLUMN_USAGE.COLUMN_NAME AS COLUMN_NAME,                 
  REPLACE(INFORMATION_SCHEMA.TABLE_CONSTRAINTS.CONSTRAINT_TYPE,' ', '_') AS CONSTRAINT_TYPE                 
FROM                 
  INFORMATION_SCHEMA.TABLE_CONSTRAINTS                 
  INNER JOIN INFORMATION_SCHEMA.KEY_COLUMN_USAGE ON                 
  INFORMATION_SCHEMA.TABLE_CONSTRAINTS.CONSTRAINT_NAME =                 
  INFORMATION_SCHEMA.KEY_COLUMN_USAGE.CONSTRAINT_NAME                 
WHERE                 
  INFORMATION_SCHEMA.TABLE_CONSTRAINTS.TABLE_NAME <> N'sysdiagrams'                 
ORDER BY                 
  INFORMATION_SCHEMA.TABLE_CONSTRAINTS.TABLE_NAME ASC                

*/                

--HOW-TO COPY TABLE AND THE WHOLE TABLE DATA , COPY TABLE FROM DB TO DB                
--==================================================START                 
/*                
use Poc_Dev                
go                
drop table tbGui_LinksVisibility                

use POc_test                
go                
select *                 
INTO [POC_Dev].[ga].[tbGui_LinksVisibility]                
from [POC_TEST].[ga].[tbGui_LinksVisibility]                


*/                
--HOW-TO COPY TABLE AND THE WHOLE TABLE DATA , COPY TABLE FROM DB TO DB                
--====================================================END                
--=================================================== SEE TABLE METADATA START                
/*                



SELECT c.name AS [COLUMN_NAME], sc.data_type AS [DATA_TYPE], [value] AS                 
[DESCRIPTION] , c.max_length as [MAX_LENGTH] , c.is_nullable AS [OPTIONAL]                 
, c.is_identity AS [IS_PRIMARY_KEY] FROM sys.extended_properties AS ep                 
INNER JOIN sys.tables AS t ON ep.major_id = t.object_id                 
INNER JOIN sys.columns AS c ON ep.major_id = c.object_id AND ep.minor_id                 
= c.column_id                 
INNER JOIN INFORMATION_SCHEMA.COLUMNS sc ON t.name = sc.table_name and                 
c.name = sc.column_name                 
WHERE class = 1 and t.name = 'tbGui_ExecutePOC' ORDER BY SC.DATA_TYPE                


*/                
--=================================================== SEE TABLE METADATA END               
    /*            
select * from Information_schema.columns                
select table_name , column_name from Information_schema.columns where table_name='tbGui_Wizards'                
    */            


--=================================================== LIST ALL TABLES AND THEIR DESCRIPTOINS START                
/*                

SELECT T.name AS TableName, CAST(Props.value AS varchar(1000)) AS                
TableDescription                
FROM sys.tables AS T LEFT OUTER JOIN                
(SELECT class, class_desc, major_id, minor_id,                
name, value                
FROM sys.extended_properties                
WHERE (minor_id = 0) AND (class = 1)) AS                
Props ON T.object_id = Props.major_id                
WHERE (T.type = 'U') AND (T.name <> N'sysdiagrams')                
ORDER BY TableName                
*/                
--=================================================== LIST ALL TABLES AND THEIR DESCRIPTOINS START                

--=================================================== LIST ALL OBJECTS FROM DB START                
/*                


use DB                
--HOW-TO LIST ALL PROCEDURE IN A DATABASE                
select s.name from sysobjects s where type = 'P'                
--HOW-TO LIST ALL TRIGGERS BY NAME IN A DATABASE                
select s.name from sysobjects s where type = 'TR'                
--HOW-TO LIST TABLES IN A DATABASE                 
select s.name from sysobjects s where type = 'U'                
--how-to list all system tables in a database                
select s.name from sysobjects s where type = 's'                
--how-to list all the views in a database                
select s.name from sysobjects s where type = 'v'                


*/                

/*                
Similarly you can find out other objects created by user, simple change type =                 

C = CHECK constraint                 

D = Default or DEFAULT constraint                 

F = FOREIGN KEY constraint                 

L = Log                 

FN = Scalar function                 

IF = In-lined table-function                 

P = Stored procedure                 

PK = PRIMARY KEY constraint (type is K)                 

RF = Replication filter stored procedure                

S = System table                 

TF = Table function                 

TR = Trigger                 

U = User table ( this is the one I discussed above in the example)                

UQ = UNIQUE constraint (type is K)                 

V = View                 

X = Extended stored procedure                
*/                



--=================================================== HOW-TO SEE ALL MY PERMISSIONS START                


/*                

SELECT * FROM fn_my_permissions(NULL, 'SERVER');                
USE poc_qa;                
SELECT * FROM fn_my_permissions (NULL, 'database');                
GO                

*/                
--=================================================== HOW-TO SEE ALL MY PERMISSIONS END                

/*               
--find table               

use poc_dev               
go               
select s.name from sysobjects s where type = 'u'  and s.name like '%Visibility%'              
select * from tbGui_LinksVisibility              

*/              

/* find cursor              

use poc      
go        
DECLARE @procName varchar(100)        
DECLARE @cursorProcNames CURSOR        
SET @cursorProcNames = CURSOR FOR        
select name from sys.procedures where modify_date > '2009-02-05 13:12:15.273' order by modify_date desc       

OPEN @cursorProcNames        
FETCH NEXT        
FROM @cursorProcNames INTO @procName        
WHILE @@FETCH_STATUS = 0        
BEGIN        

set nocount off;        
exec sp_HelpText @procName --- or print them        
-- print @procName        

FETCH NEXT        
FROM @cursorProcNames INTO @procName        
END        
CLOSE @cursorProcNames        

select @@error    

*/              


/* --  SEE STORED PROCEDURE EXT PROPS            

SELECT ep.name as 'EXT_PROP_NAME' , SP.NAME , [value] as 'DESCRIPTION' FROM sys.extended_properties as ep left join sys.procedures as sp on sp.object_id = ep.major_id where sp.type='P'            


-- what the hell I ve been doing lately on sql server 2005 / 2008
select o.name , 
(SELECT [definition] AS [text()] FROM sys.all_sql_modules where sys.all_sql_modules.object_id=a.object_id FOR XML PATH(''), TYPE) AS Statement_Text
, a.object_id, o.modify_date from sys.all_sql_modules a left join sys.objects o on a.object_id=o.object_id order by 4 desc



-- GET THE RIGHT LANG SCHEMA START 
DECLARE @template AS varchar(max)
SET @template = 'SELECT * FROM {object_name}'

DECLARE @object_name AS sysname

SELECT @object_name = QUOTENAME(s.name) + '.' + QUOTENAME(o.name)
FROM sys.objects o
INNER JOIN sys.schemas s
    ON s.schema_id = o.schema_id
WHERE o.object_id = OBJECT_ID(QUOTENAME(@LANG) + '.[TestingLanguagesInNameSpacesDelMe]')

IF @object_name IS NOT NULL
BEGIN
    DECLARE @sql AS varchar(max)
    SET @sql = REPLACE(@template, '{object_name}', @object_name)
    EXEC (@sql)
END
-- GET THE RIGHT LANG SCHEMA END 

--  SEE STORED PROCEDURE EXT PROPS end*/             
set nocount off
GO
EXEC sys.sp_addextendedproperty @name=N'AuthorName', @value=N'Yordan Georgiev' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'PROCEDURE',@level1name=N'procUtils_SqlCheatSheet'
GO
EXEC sys.sp_addextendedproperty @name=N'ProcDescription', @value=N'TODO:ADD HERE DESCRPIPTION' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'PROCEDURE',@level1name=N'procUtils_SqlCheatSheet'
GO
EXEC sys.sp_addextendedproperty @name=N'ProcVersion', @value=N'0.1.0.20090406.1317' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'PROCEDURE',@level1name=N'procUtils_SqlCheatSheet'
GO
/****** Object:  UserDefinedFunction [ga].[GetDbVersion]    Script Date: 04/22/2009 13:21:42 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION [ga].[GetDbVersion]()
RETURNS VARCHAR(20)
    BEGIN
  RETURN convert(varchar(20) , (select value from sys.extended_properties where name='DbVersion' and class_desc='DATABASE') )
    END
GO
EXEC sys.sp_addextendedproperty @name=N'AuthorName', @value=N'Yordan Georgiev' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'FUNCTION',@level1name=N'GetDbVersion'
GO
EXEC sys.sp_addextendedproperty @name=N'ChangeDescription', @value=N'Initial creation' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'FUNCTION',@level1name=N'GetDbVersion'
GO
EXEC sys.sp_addextendedproperty @name=N'CreatedWhen', @value=N'getDate()' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'FUNCTION',@level1name=N'GetDbVersion'
GO
EXEC sys.sp_addextendedproperty @name=N'Description', @value=N'Gets the current version of the database ' , @level0type=N'SCHEMA',@level0name=N'ga', @level1type=N'FUNCTION',@level1name=N'GetDbVersion'
GO
/****** Object:  UserDefinedFunction [ga].[GetDbType]    Script Date: 04/22/2009 13:21:42 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION [ga].[GetDbType]()
RETURNS VARCHAR(30)
    BEGIN
  RETURN convert(varchar(30) , (select value from sys.extended_properties where name='DbType' and class_desc='DATABASE') )
    END
GO
/****** Object:  Default [DF_tb_DataMeta_ObjChangeLog_DbVersion]    Script Date: 04/22/2009 13:21:40 ******/
ALTER TABLE [ga].[tb_DataMeta_ObjChangeLog] ADD  CONSTRAINT [DF_tb_DataMeta_ObjChangeLog_DbVersion]  DEFAULT ('select ga.GetDbVersion()') FOR [DbVersion]
GO
/****** Object:  Default [DF_tb_DataMeta_ObjChangeLog_EventDate]    Script Date: 04/22/2009 13:21:40 ******/
ALTER TABLE [ga].[tb_DataMeta_ObjChangeLog] ADD  CONSTRAINT [DF_tb_DataMeta_ObjChangeLog_EventDate]  DEFAULT (getdate()) FOR [EventDate]
GO
/****** Object:  Default [DF_tb_DataMeta_ObjChangeLog_ObjVersion]    Script Date: 04/22/2009 13:21:40 ******/
ALTER TABLE [ga].[tb_DataMeta_ObjChangeLog] ADD  CONSTRAINT [DF_tb_DataMeta_ObjChangeLog_ObjVersion]  DEFAULT ('0.0.0') FOR [ObjVersion]
GO
/****** Object:  DdlTrigger [trigMetaDoc_TraceDbChanges]    Script Date: 04/22/2009 13:21:29 ******/
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
create trigger [trigMetaDoc_TraceDbChanges]
on database
for create_procedure, alter_procedure, drop_procedure,
create_table, alter_table, drop_table,
create_function, alter_function, drop_function , 
create_trigger , alter_trigger , drop_trigger 
as

set nocount on

declare @data xml
set @data = EVENTDATA()
declare @DbVersion varchar(20)
set @DbVersion =(select ga.GetDbVersion())
declare @DbType varchar(20)
set @DbType =(select ga.GetDbType())
declare @DbName varchar(256)
set @DbName = @data.value('(/EVENT_INSTANCE/DatabaseName)[1]', 'varchar(256)')
declare @EventType varchar(256)
set @EventType = @data.value('(/EVENT_INSTANCE/EventType)[1]', 'varchar(50)')
declare @ObjectName varchar(256)
set @ObjectName  = @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'varchar(256)')
declare @ObjectType varchar(25)
set @ObjectType = @data.value('(/EVENT_INSTANCE/ObjectType)[1]', 'varchar(25)')
declare @TSQLCommand varchar(max)
set @TSQLCommand = @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'varchar(max)')
declare @opentag varchar(4)
set @opentag= '&lt;'
declare @closetag varchar(4) 
set @closetag= '&gt;'
declare @newDataTxt varchar(max) 
set @newDataTxt= cast(@data as varchar(max))
set @newDataTxt = REPLACE ( REPLACE(@newDataTxt , @opentag , '<') , @closetag , '>')
-- print @newDataTxt
declare @newDataXml xml 
set @newDataXml = CONVERT ( xml , @newDataTxt)
declare @Version varchar(50)
set @Version = @newDataXml.value('(/EVENT_INSTANCE/TSQLCommand/CommandText/Version)[1]', 'varchar(50)')

-- if we are dropping take the version from the existing object 
if  ( SUBSTRING(@EventType , 0 , 5)) = 'DROP'
set @Version =( select top 1 [Version]  from ga.tb_DataMeta_ObjChangeLog where ObjectName=@ObjectName order by [LogId] desc)



declare @LoginName varchar(256) 
set @LoginName = @data.value('(/EVENT_INSTANCE/LoginName)[1]', 'varchar(256)')
declare @FirstName varchar(50)
set @FirstName= (select [FirstName] from [ga].[tb_BLSec_LoginsForUsers] where [LoginName] = @LoginName)
declare @LastName varchar(50)
set @LastName  = (select [LastName] from [ga].[tb_BLSec_LoginsForUsers] where [LoginName] = @LoginName)
declare @SchemaName sysname 
set @SchemaName = @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'sysname');
--declare @Description xml 
--set @Description = @data.query('(/EVENT_INSTANCE/TSQLCommand/text())')




--print 'VERSION IS ' + @Version
--print @newDataTxt
--print cast(@data as varchar(max))


-- select column_name from information_schema.columns where table_name ='tb_DataMeta_ObjChangeLog'
insert into [ga].[tb_DataMeta_ObjChangeLog]
(
[DatabaseName] ,
[SchemaName],
[DbVersion] ,
[DbType],
[EventType],
[ObjectName],
[ObjectType] ,
[Version],
[SqlCommand] ,
[LoginName] ,
[FirstName],
[LastName] 
)

values(

@DbName,
@SchemaName,
@DbVersion,
@DbType,
@EventType, 
@ObjectName, 
@ObjectType , 
@Version,
@newDataTxt, 
@LoginName , 
@FirstName , 
@LastName
)
GO
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
DISABLE TRIGGER [trigMetaDoc_TraceDbChanges] ON DATABASE
GO
/****** Object:  DdlTrigger [trigMetaDoc_TraceDbChanges]    Script Date: 04/22/2009 13:21:29 ******/
Enable Trigger [trigMetaDoc_TraceDbChanges] ON Database
GO
YordanGeorgiev
A: 

Yes, of course. We generate dumps of our PostgreSQL schemas whenever there's a change and check them in. It's already saved us many times, and I've only been at my job a few months.

cookiecaper
+1  A: 

Funny, I was thinking: why has no one built version control into the database itself?

There have been many times in doing database management when it would have been nice to be able to do a check-in so I could roll back to a previous revision.

Yes, transactions do this to a certain extent in a temporary or short-term way, but I can still see the benefit of having VCS right in the database and being able to roll back to a previous revision if a database management path doesn't work out.

drfrog
+2  A: 

We version and source control everything surrounding our databases:

  • DDL (create and alters)
  • DML (reference data, codes, etc.)
  • Data Model changes (using ERwin or ER/Studio)
  • Database configuration changes (permissions, security objects, general config changes)

We do all this with automated jobs using Change Manager and some custom scripts. We have Change Manager monitoring these changes and notifying when they are done.

Karen Lopez
+1  A: 

Dear friends,

You are talking about coding and finding the best way to do source control, but there's a much easier way!

SQL tools are what we do, and we have the first and best third-party tool for SQL version control.

This tool is a unique solution that revolutionizes the way version control and change management are done for SQL Server. It shifts the responsibility for versioning from the users to the software. It's a lightweight, easy-to-use tool that runs in the background, keeps track of all your database schema and data changes over time, enables full review of a database's history and full rollback to any point in time, and can optionally push changes into Subversion or SourceSafe.

Itamar
+3  A: 

I use SchemaBank to version control all my database schema changes:

  • From day 1, I imported my db schema dump into it.
  • I started to change my schema design using a web browser (because they are SaaS / cloud-based).
  • When I want to update my db server, I generate the change (SQL) script from it and apply it to the db. In SchemaBank, they require me to commit my work as a version before I can generate an update script. I like this kind of practice, so that I can always trace back when I need to.

Our team rule is to NEVER touch the db server directly without storing the design work first. But it happens; somebody might be tempted to break the rule for the sake of convenience. We would import the schema dump into SchemaBank again, let it do the diff, and bash someone if a discrepancy is found. Although we could generate the alter scripts from it to bring our db and schema design back in sync, we just hate that.

By the way, they also let us create branches within the version control tree so that I can maintain one for staging and one for production. And one for coding sandbox.

A pretty neat web-based schema design tool with version control and change management.

Leigh Pyle
A: 

Sadly, I've seen more than one team developing PL/SQL programs (stored procedures in Oracle) - sometimes tens of thousands of LOC - just by editing the code in TOAD (a database tool), without even saving the source to files (except for deployment). Even if the database is backed up regularly (I wouldn't take that for granted, though), the only way to retrieve an old version of a stored procedure is to restore the whole database, which is many GB in size. And of course, concurrent changes to one file sometimes lead to lost work when more than one developer works on the same project.

ammoQ
A: 

I believe that every DB should be under source control, and developers should have an easy way to create their local database from scratch. Inspired by Visual Studio for Database Professionals, I've created an open-source tool that scripts MS SQL databases and provides an easy way of deploying them to your local DB engine. Try http://dbsourcetools.codeplex.com/ . Have fun, - Nathan.

A: 

I use ActiveRecord Migrations. This Ruby gem can be used outside of a Rails project and there are adapters to handle most databases you'll come across. My tip: if you are able to run your project off Postgres, you get transactional schema migrations. That means you don't end up with a broken database if a migration only half-applies.
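The point about transactional migrations is that on PostgreSQL even DDL statements can run inside a transaction, so a failed migration leaves nothing behind. Roughly (the table and index names here are invented):

BEGIN;
ALTER TABLE accounts ADD COLUMN closed_at TIMESTAMP;
-- if this second statement fails, the ALTER above is rolled back too,
-- so the database is never left half-migrated
CREATE INDEX idx_accounts_closed_at ON accounts (closed_at);
COMMIT;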

jds
A: 

One of Kira's prime use cases is database upgrades: you explicitly specify the schema outside the database as code, and it can then manage the database and upgrade it to any version from any version.

MathGladiator
A: 

Yes, we source control our sql scripts too with subversion. It's a good practice and you can recreate the schema with default data whenever needed.

Guilherme Melo
+3  A: 

If your database is SQL Server, we might have just the solution you're looking for. SQL Source Control 1.0 has now been released.

http://www.red-gate.com/products/SQL_Source_Control/index.htm

This integrates into SSMS and provides the glue between your database objects and TFS or SVN. The 'scripting out' happens transparently (it uses the SQL Compare engine under the hood), which should make it so straightforward to use that developers won't be discouraged from adopting the process. If this doesn't meet your needs, we'd love to know why.

David Atkinson
+1 just because I'm a fan of redgate. We actually do use SQL Compare and SQL Data Compare, although I'm considering giving it up for the comparison features that are built into VS2010.
Brian MacKay
That's terrible! :( Anything we could do to change your mind? I'd love to get your thoughts on SQL Source Control if you get a chance to download it.
David Atkinson
+1  A: 

I think most people use SQL scripts that they put under version control (typically an initial create script and then a stack of alter scripts). Well, this is one way to deal with "what we have" and reach the goal of controlling the versions of our databases.

But this process is time-consuming and you lose focus on your data model: you have to build the scripts yourself, or generate them by comparing a database with a reference, without any guarantee that the two compared databases actually correspond to the initial state and target state of your upgrade... I think this is why so few organizations consider databases in their lifecycle management (but I would like to hear from you on this).

We have built a product allowing you to focus on your database model, as the model itself is under version control (as structured entities, not as scripts). This approach allows you to compare any entity with another and provides the ability to dynamically generate any SQL script that can transform a given version into another. As a result, you version control your whole database and the product automatically generates any upgrade script you need from any version delta.

In addition, neXtep designer is a whole database development environment based on Eclipse, so you'll have access to native editors, dependency management, data model diagrams, delivery generation, install automation, etc. It is free, it is GPL, and it supports Oracle, MySQL and PostgreSQL (DB2 under development).

You can learn more about this product by having a look at the wiki at:

http://www.nextep-softwares.com

Edited : Updated the link which was incorrect, sorry for this.

Christophe Fondacci
Sounds interesting! The link does not work however..
Wouter
sorry, there was a TYPO in the link. I fixed this in the previous post.
Christophe Fondacci
Looks promising but upon syncing with db, it shows a crapload of beng_* and dgbm_* and similar databases I know nothing about. Using mysql on localhost.
Raveren
Well, I think you are trying to sync the neXtep repository database itself. As soon as you have set up a repository connection and successfully logged in, you can work with your own databases or start a new data design from scratch: select "create workspace" and either select an existing db or start with an empty workspace for a new design. Have a look at the tutorials: http://www.nextep-softwares.com/wiki/index.php/NeXtep_Designer_tutorials
Christophe Fondacci