I am trying to build an audit history by adding triggers to my tables and inserting rows into my Audit table. I have a stored procedure that makes the inserts a bit easier because it saves code: I don't have to write out the entire insert statement; instead I execute the stored procedure with a few parameters for the columns I want to insert.

I am not sure how to execute a stored procedure for each of the rows in the "inserted" table. I think maybe I need to use a cursor, but I'm not sure. I've never used a cursor before.

Since this is an audit, I need to compare each column's old value to its new value to see if it changed. If it did change, I will execute the stored procedure that adds a row to my Audit table.
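From reading the docs, I think the shape would be something like this (the table, key, and procedure names are just placeholders, and I haven't tested it):

```sql
-- Sketch only: dbo.MyTable(Id, Name) and dbo.AddAuditRow are placeholders.
DECLARE @Id INT, @OldName VARCHAR(100), @NewName VARCHAR(100);

DECLARE audit_cursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT i.Id, d.Name, i.Name
    FROM inserted i
    JOIN deleted d ON d.Id = i.Id;  -- an UPDATE trigger sees both tables

OPEN audit_cursor;
FETCH NEXT FROM audit_cursor INTO @Id, @OldName, @NewName;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Only call the procedure if the value actually changed (NULL-safe).
    IF (@OldName <> @NewName)
       OR (@OldName IS NULL AND @NewName IS NOT NULL)
       OR (@OldName IS NOT NULL AND @NewName IS NULL)
        EXEC dbo.AddAuditRow
            @TableName  = 'MyTable',
            @KeyValue   = @Id,
            @ColumnName = 'Name',
            @OldValue   = @OldName,
            @NewValue   = @NewName;

    FETCH NEXT FROM audit_cursor INTO @Id, @OldName, @NewName;
END

CLOSE audit_cursor;
DEALLOCATE audit_cursor;
```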

Any thoughts?

+1  A: 

If your database needs to scale past a few users this will become very expensive. I would recommend looking into 3rd party database auditing tools.

zodeus
+3  A: 

I would trade space for time and not do the comparison. Simply push the new values to the audit table on insert/update. Disk is cheap.

Also, I'm not sure what the stored procedure buys you. Can't you do something simple in the trigger like:

insert into dbo.mytable_audit
select *, getdate(), getdate(), 'create'
from inserted

Here the trigger runs on insert, and you are adding created time, last updated time, and modification type fields. For an update it's a little trickier, since you'll need to supply an explicit column list because the created time shouldn't be set again:

insert into dbo.mytable_audit (col1, col2, ...., last_updated, modification)
select *, getdate(), 'update'
from inserted
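Put together, a minimal trigger along these lines might look like the sketch below (table and column names are illustrative; it assumes the audit table's `created` column is nullable or has a default):

```sql
-- Illustrative sketch: assumes dbo.mytable(col1, col2) and a matching
-- dbo.mytable_audit(col1, col2, created, last_updated, modification).
CREATE TRIGGER dbo.mytable_audit_trigger
ON dbo.mytable
AFTER INSERT, UPDATE
AS
BEGIN
    SET NOCOUNT ON;

    IF NOT EXISTS (SELECT * FROM deleted)
        -- INSERT: stamp both created and last_updated.
        INSERT INTO dbo.mytable_audit
            (col1, col2, created, last_updated, modification)
        SELECT col1, col2, GETDATE(), GETDATE(), 'create'
        FROM inserted;
    ELSE
        -- UPDATE: leave created out of the column list so it isn't set.
        INSERT INTO dbo.mytable_audit
            (col1, col2, last_updated, modification)
        SELECT col1, col2, GETDATE(), 'update'
        FROM inserted;
END
```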

Also, are you planning to audit only successes or failures as well? If you want to audit failures, you'll need something other than triggers I think since the trigger won't run if the transaction is rolled back -- and you won't have the status of the transaction if the trigger runs first.

I've actually moved my auditing to my data access layer and do it in code now. It makes it easier to do both success and failure auditing, and (using reflection) it's pretty easy to copy the fields to the audit object. The other thing it allows me to do is record the user context, since I don't give the actual user permissions to the database and run all queries using a service account.

tvanfosson
I agree. Disk is cheap. Just save the new (inserted) values each time. The old values will be in the previous row in the audit table.
MikeW
+1  A: 

There is already a built-in function, UPDATE(), which tells you whether a column was changed (but it answers over the entire set of inserted rows, not per row).

You can look at some of the techniques in Paul Nielsen's AutoAudit triggers which are code generated.

What it does is check both:

IF UPDATE(<column_name>)
    INSERT Audit (...)
    SELECT ...
    FROM Inserted
    JOIN Deleted
        -- AutoAudit does not support multi-column primary keys,
        -- but the technique can be done manually
        ON Inserted.KeyField = Deleted.KeyField
        AND NOT (Inserted.<column_name> = Deleted.<column_name>
                 OR COALESCE(Inserted.<column_name>, Deleted.<column_name>) IS NULL)

But it audits each column change as a separate row. I use it for auditing changes to configuration tables; I am not currently using it for tables with heavy change volume. (In most transactional systems I've designed, rows in high-activity tables are typically immutable: you don't have a lot of UPDATEs, just a lot of INSERTs, so you wouldn't even need this kind of auditing. For instance, orders or ledger entries are never changed, and shopping carts are disposable; neither would get this kind of auditing.) On tables with a low volume of changes, like customer, this kind of auditing works well.

Cade Roux
+1  A: 

Jeff, I agree with zodeus: a good option is to use a 3rd-party tool. I have used Audit Database, a free web tool that generates audit triggers (you do not need to write a single line of T-SQL code).

Another good tool is Apex SQL Audit, but it's not free.

I hope this helps you, F. O'Neill

Site is www.auditdatabase.com