Problem:

There are many different databases, each populated directly by many different applications (without any common application layer). Data can be accessed only through stored procedures (by policy).

Task:

The application needs to track changes in these databases and react with minimal delay.

Possible solutions:

1) Create a trigger for each table in each database that writes events into a single table. The application will watch this table through SqlDependency (see the sketch after this list).

2) Watch each table in each database through SqlDependency.

3) Create a trigger for each table in each database that notifies the application via a managed (SQLCLR) extension.
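
A rough sketch of option 1 (table, column, and trigger names are hypothetical):

    -- Central events table, one row per change.
    CREATE TABLE dbo.ChangeEvents (
        EventId    INT IDENTITY(1,1) PRIMARY KEY,
        TableName  SYSNAME  NOT NULL,
        Operation  CHAR(1)  NOT NULL,  -- 'I', 'U' or 'D'
        OccurredAt DATETIME NOT NULL DEFAULT GETUTCDATE()
    );
    GO

    -- One trigger like this per monitored table (dbo.Orders is a placeholder).
    CREATE TRIGGER trg_Orders_Change ON dbo.Orders
    AFTER INSERT, UPDATE, DELETE
    AS
    BEGIN
        SET NOCOUNT ON;
        -- Statements that touched no rows still fire the trigger; skip them.
        IF NOT EXISTS (SELECT * FROM inserted) AND NOT EXISTS (SELECT * FROM deleted)
            RETURN;
        INSERT INTO dbo.ChangeEvents (TableName, Operation)
        SELECT N'dbo.Orders',
               CASE WHEN EXISTS (SELECT * FROM inserted)
                     AND EXISTS (SELECT * FROM deleted) THEN 'U'
                    WHEN EXISTS (SELECT * FROM inserted) THEN 'I'
                    ELSE 'D'
               END;
    END;
    GO

The application would then register a SqlDependency on a query such as SELECT EventId, TableName, Operation FROM dbo.ChangeEvents (query notifications require an explicit column list and two-part table names).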

Which is the best way?

A: 

Have you considered using SQL Profiler? Using a filter, you can select only update operations and write them to a log.
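
For example, a server-side trace (the scriptable counterpart of the Profiler GUI) could be set up roughly like this; the file path is a placeholder, and the IDs used are the standard trace IDs for the SQL:BatchCompleted event and the TextData/SPID/StartTime columns:

    DECLARE @TraceID INT;
    DECLARE @maxfilesize BIGINT;
    SET @maxfilesize = 50;

    -- Create the trace, writing to a file (path is a placeholder; .trc is appended).
    EXEC sp_trace_create @TraceID OUTPUT, 0, N'C:\traces\updates', @maxfilesize, NULL;

    -- Capture event 12 (SQL:BatchCompleted): columns 1 = TextData, 12 = SPID, 14 = StartTime.
    EXEC sp_trace_setevent @TraceID, 12, 1, 1;
    EXEC sp_trace_setevent @TraceID, 12, 12, 1;
    EXEC sp_trace_setevent @TraceID, 12, 14, 1;

    -- Keep only batches whose text contains UPDATE (comparison operator 6 = LIKE).
    EXEC sp_trace_setfilter @TraceID, 1, 0, 6, N'%UPDATE%';

    -- Start the trace.
    EXEC sp_trace_setstatus @TraceID, 1;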

Dewfy
And how does it affect performance? How can I get the parameters from the update statements? By parsing the query? Or is there another way?
Yauheni Sivukha
Profiler can be started on any computer in the network, which reduces the processor-usage pitfall. Parameters are also included in the log.
Dewfy
+3  A: 

This can be an extensive topic. First of all: which SQL Server version are you using?

If you are using SQL 2008, Change Data Capture (CDC) is the tool of choice. This new feature enables you to monitor EVERY change made to databases within SQL 2008, including DDL changes as well as changes to the data. Check an introduction here.
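
A minimal sketch of enabling CDC and reading the captured changes (dbo.Orders is a placeholder table):

    -- Enable CDC for the database (requires sysadmin).
    EXEC sys.sp_cdc_enable_db;

    -- Enable CDC for a single table; a capture instance named dbo_Orders is created.
    EXEC sys.sp_cdc_enable_table
         @source_schema = N'dbo',
         @source_name   = N'Orders',
         @role_name     = NULL;

    -- Read every change captured so far for that table.
    DECLARE @from_lsn BINARY(10), @to_lsn BINARY(10);
    SET @from_lsn = sys.fn_cdc_get_min_lsn('dbo_Orders');
    SET @to_lsn   = sys.fn_cdc_get_max_lsn();

    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_Orders(@from_lsn, @to_lsn, N'all');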

If you are using a version older than SQL 2008 and you are allowed to modify the DDL of the database, option 3 would be the one of choice (of the ones you described). I would not recommend it, though, since there are other things to consider: what happens when a transaction rolls back, or when triggers are deactivated, for example during bulk inserts?

It will be quite a challenge to make your solution work properly in all of these cases.

Another way you could go is to watch the transaction log file. This is by far the best, but also the most complex, way of doing it, since there is almost no documentation on the proprietary log format, and it is bound to a specific version of SQL Server. In return, it gives you zero-impact monitoring of the chosen databases.
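
For illustration, the undocumented fn_dblog function exposes part of the active log; it is unsupported, and its output columns and operation names vary between SQL Server versions:

    -- Undocumented and unsupported: use for exploration only.
    SELECT [Current LSN], Operation, Context, [Transaction ID], AllocUnitName
    FROM   fn_dblog(NULL, NULL)   -- NULL, NULL = the whole active log
    WHERE  Operation IN (N'LOP_INSERT_ROWS', N'LOP_MODIFY_ROW', N'LOP_DELETE_ROWS');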

Yet another approach is to create a copy of the data that is to be monitored and periodically check for differences. This has the benefit that NO changes to the source databases have to be made, and it also avoids the transaction and bulk-insert issues, since at the latest on the next monitoring run you will discover the changes.
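
A rough sketch of the comparison, assuming a hypothetical Orders table and a snapshot copy of it in a separate monitoring database:

    -- Rows inserted or changed since the last snapshot.
    SELECT * FROM SourceDb.dbo.Orders
    EXCEPT
    SELECT * FROM MonitorDb.dbo.Orders_Snapshot;

    -- Rows deleted since the last snapshot.
    SELECT * FROM MonitorDb.dbo.Orders_Snapshot
    EXCEPT
    SELECT * FROM SourceDb.dbo.Orders;

    -- Refresh the snapshot for the next monitoring run.
    TRUNCATE TABLE MonitorDb.dbo.Orders_Snapshot;
    INSERT INTO MonitorDb.dbo.Orders_Snapshot
    SELECT * FROM SourceDb.dbo.Orders;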

The performance impact is rather minimal, since it only requires a consecutive primary-index read of the tables to be monitored, and that is by far the most optimized way of interacting with a database. This approach will, however, require quite a development effort. I should know, since this has been my prime focus for the last few years. Check here ;)

(I hope linking is OK in this case, since it's on topic; otherwise I'll remove it.)

ntziolis
Thanks a lot! I had never heard about the CDC feature of SQL 2008, and it seems that this is exactly what I need.
Yauheni Sivukha
It's extremely powerful, since a lot of systems do not allow changes to the DDL (e.g. all Microsoft Dynamics systems crash if you change the DDL directly at the database level, such as by adding triggers). CDC is totally transparent to the system consuming the DB; that's what makes it so great. In fact, since it gives you ALL the information that is also available to the DBMS, you would be able to create a custom transaction log. At the same time, the performance impact is minimal, since nothing extra is created; (previously) internal information is simply exposed properly.
ntziolis