I am currently creating a master DDL script for our database. Historically we have used backup/restore to version our database and have not maintained any DDL scripts. The schema is quite large.

My current thinking:

  • Break the script into parts (possibly as separate scripts):

    1. table creation
    2. add indexes
    3. add triggers
    4. add constraints
  • Each script would get called by the master script.

  • I might need a script to drop constraints temporarily for testing.
  • There may be orphaned tables in the schema; I plan to identify suspect tables (see the query sketch below).
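
For the orphaned-table check, a rough starting point on SQL 2000 might be a query like this (it uses the old system tables since the INFORMATION_SCHEMA views don't expose row counts; treat every hit as a suspect to review, not something safe to drop):

    -- User tables that sit on neither side of any foreign key, with approximate
    -- row counts from sysindexes (SQL 2000 system tables; sketch only).
    SELECT  o.name      AS TableName,
            MAX(i.rows) AS ApproxRows
    FROM    dbo.sysobjects o
    JOIN    dbo.sysindexes i ON i.id = o.id AND i.indid IN (0, 1)
    WHERE   o.xtype = 'U'
      AND   OBJECTPROPERTY(o.id, 'IsMSShipped') = 0
      AND   NOT EXISTS (SELECT 1 FROM dbo.sysforeignkeys fk
                        WHERE fk.fkeyid = o.id OR fk.rkeyid = o.id)
    GROUP BY o.name
    ORDER BY ApproxRows, o.name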

Any other advice?

Edit: Also, if anyone knows of good tools to automate part of the process, please share; we're using MS SQL 2000 (old, I know).

A: 

What you have there seems pretty good. For large enough databases, my company has on occasion broken it down even further, to the individual object level, so that each table/index/... has its own file. It can be useful or it can be overkill; it really depends on how you are using it.

@Justin

By domain is almost always sufficient. I agree that there are some complexities to deal with when doing it this way, but they should be easy enough to handle.

I think this method provides a little more separation (which in a large database you will come to appreciate) while still remaining pretty manageable. We also write Perl scripts that do a lot of the processing of these DDL files, so that might be a good way to handle it.

Adam Lerman
+1  A: 

Invest the time to write a generic "drop all constraints" script, so you don't have to maintain it. (A cursor over "Select * From Information_Schema.Table_Constraints" and "Select * From Information_Schema.Referential_Constraints" does the trick).
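
A minimal sketch of that approach, limited to foreign keys (Referential_Constraints doesn't carry the table name, so Table_Constraints does the filtering here; extend the WHERE clause if you also want check, unique or primary key constraints gone):

    -- Generate and run ALTER TABLE ... DROP CONSTRAINT for every foreign key.
    -- Sketch only: since the master DDL has a separate "add constraints" step,
    -- that script can put the constraints back afterwards.
    DECLARE @sch sysname, @tbl sysname, @con sysname, @sql nvarchar(4000)

    DECLARE con_cur CURSOR FOR
        SELECT tc.TABLE_SCHEMA, tc.TABLE_NAME, tc.CONSTRAINT_NAME
        FROM   INFORMATION_SCHEMA.TABLE_CONSTRAINTS tc
        WHERE  tc.CONSTRAINT_TYPE = 'FOREIGN KEY'

    OPEN con_cur
    FETCH NEXT FROM con_cur INTO @sch, @tbl, @con

    WHILE @@FETCH_STATUS = 0
    BEGIN
        SET @sql = N'ALTER TABLE [' + @sch + N'].[' + @tbl + N'] DROP CONSTRAINT [' + @con + N']'
        EXEC sp_executesql @sql
        FETCH NEXT FROM con_cur INTO @sch, @tbl, @con
    END

    CLOSE con_cur
    DEALLOCATE con_cur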

Stu
+1  A: 

@Adam

Or how about just by domain -- a useful grouping of related tables in the same file, but separate from the rest?

The only problem is when some domains (in this somewhat legacy system) are tightly coupled. Plus, you have to maintain the dependencies between your different sub-scripts.

Justin Standard
A: 

I previously organised my DDL code as one file per entity and made a tool that combined these into a single DDL script.

My former employer used a scheme where all table DDL was in one file (stored in Oracle syntax), indexes in another, constraints in a third and static data in a fourth. A change script was kept in parallel with this (again in Oracle). The conversion to SQL Server was manual. It was a mess. I eventually wrote a handy tool that converted Oracle DDL to SQL Server syntax (it worked 99.9% of the time).

I have recently switched to using Visual Studio Team System for Database Professionals. So far it works fine, but there are some glitches if you use CLR functions within the database.

vzczc
+2  A: 

I think the basic idea is good.

The nice thing about building all the tables first and then adding all the constraints is that the tables can be created in any order. When I've done this I had one file per table, which I put in a directory called "Tables", and then a script which executed all the files in that directory. Likewise I had a folder of constraint scripts (which did foreign keys and indexes too), which were executed after the tables were built.
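
A small illustration with two hypothetical tables (the file and folder names are just for the sake of the example): the CREATE TABLE files stand alone, so they can run in any order, and the cross-table pieces live in the constraints folder.

    -- Tables\Orders.sql
    CREATE TABLE dbo.Orders (
        OrderId   int      NOT NULL CONSTRAINT PK_Orders PRIMARY KEY,
        OrderDate datetime NOT NULL
    )
    GO

    -- Tables\OrderLine.sql  (no reference to Orders yet, so creation order doesn't matter)
    CREATE TABLE dbo.OrderLine (
        OrderLineId int         NOT NULL CONSTRAINT PK_OrderLine PRIMARY KEY,
        OrderId     int         NOT NULL,
        Sku         varchar(20) NOT NULL
    )
    GO

    -- Constraints\OrderLine.sql  (run after every table exists)
    ALTER TABLE dbo.OrderLine
        ADD CONSTRAINT FK_OrderLine_Orders FOREIGN KEY (OrderId)
            REFERENCES dbo.Orders (OrderId)
    GO

    CREATE INDEX IX_OrderLine_OrderId ON dbo.OrderLine (OrderId)
    GO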

I would separate out the build of the triggers and stored procedures and run these last. The point about these is that they can be run and re-run against the database without affecting the data, which means you can treat them just like ordinary code. You should include "if exists...drop" statements at the beginning of each trigger and procedure script to make them re-runnable.
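
For example, a re-runnable script for a hypothetical trigger and procedure might look like this (object names are made up; OBJECT_ID works fine on SQL 2000):

    -- Triggers\trg_Orders_Audit.sql
    IF OBJECT_ID('dbo.trg_Orders_Audit') IS NOT NULL
        DROP TRIGGER dbo.trg_Orders_Audit
    GO

    CREATE TRIGGER dbo.trg_Orders_Audit ON dbo.Orders
    AFTER UPDATE
    AS
    BEGIN
        SET NOCOUNT ON
        -- audit logic goes here
    END
    GO

    -- Procedures\usp_GetOrder.sql
    IF OBJECT_ID('dbo.usp_GetOrder') IS NOT NULL
        DROP PROCEDURE dbo.usp_GetOrder
    GO

    CREATE PROCEDURE dbo.usp_GetOrder @OrderId int
    AS
        SELECT OrderId, OrderDate FROM dbo.Orders WHERE OrderId = @OrderId
    GO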

So the order would be

  1. table creation
  2. add indexes
  3. add constraints

Then

  1. add triggers
  2. add stored procedures

On my current project we are using MSBuild to run the scripts. There are some extension targets you can get for it which allow you to call SQL scripts. In the past I have used Perl, which was fine too (and batch files... which I would not recommend; they're too limited).

Mat Roberts
I also have a similar environment where I use MSBuild to control the script execution. It also allows me to include data-loading scripts where appropriate. For example, I can load sample data into a test or demo copy of the database.
bobs
+1  A: 

If you are looking for an automation tool, I have often worked with EMS SQLManager, which allows you to automatically generate a DDL script from a database.

Data inserts into reference tables might be mandatory before putting your database online. These can even be considered part of the DDL script. EMS can also generate data-insert scripts from existing databases.
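
If the reference data does go in with the scripts, a minimal re-runnable sketch (table and values are made up) looks like this, so it can be executed on every deploy without duplicating rows:

    IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusCode = 'NEW')
        INSERT INTO dbo.OrderStatus (StatusCode, Description) VALUES ('NEW', 'Newly created order')

    IF NOT EXISTS (SELECT 1 FROM dbo.OrderStatus WHERE StatusCode = 'SHIP')
        INSERT INTO dbo.OrderStatus (StatusCode, Description) VALUES ('SHIP', 'Shipped')
    GO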

The need for indexes might not be properly estimated at the DDL stage. You will just need to declare them for primary/foreign keys; other indexes should be created later, once views and queries have been defined.

Philippe Grondier
A: 

Hello, there is a neat tool that will iterate through the entire SQL Server instance and extract all the table, view, stored procedure and UDF definitions to the local file system as SQL scripts (text files). I have used it with 2005 and 2008; I'm not sure how it will work with 2000, though. Check out http://www.antipodeansoftware.com/Home/Products

Darryl