views: 159
answers: 3

I work on a system based on a SQL Server database that has many years of development behind it. It is not huge in data volume (a few GB) but it has a lot of complexity (hundreds of tables, hundreds of stored procedures).

I want to start by cleaning out the stuff that isn't used any more. We have a weekly/monthly/quarterly/annual cycle, which means some things will not be used for a year. However, if there is a short list of things that have not been used for a few months, we might be able to decide by inspection whether they are still useful.

My intention is to start logging access to all the database objects. What I'd like is a log that I can turn into a list of things that aren't being used. I have a few ideas, but I'd like some expert help.

Edit: to clarify, I would like to see access to the tables/views as well as to the stored procedures and functions.

A: 

SQL Profiler would be my suggestion, but you'd need to be careful having it running, because a trace does introduce a performance overhead.
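
If the overhead worries you, a lighter option is to script a server-side trace with the sp_trace_* procedures instead of running the Profiler GUI. A rough sketch only; the output path, database name, and the choice of events/columns below are placeholders rather than a full recipe:

    -- Sketch of a server-side trace writing to rolling files on disk.
    DECLARE @TraceID int, @maxfilesize bigint, @on bit;
    SET @maxfilesize = 100;  -- MB per rollover file
    SET @on = 1;

    -- 2 = TRACE_FILE_ROLLOVER; the path is a placeholder
    EXEC sp_trace_create @TraceID OUTPUT, 2, N'D:\Traces\object_usage', @maxfilesize, NULL;

    -- Event 12 = SQL:BatchCompleted; capture TextData (1), StartTime (14), DatabaseName (35)
    EXEC sp_trace_setevent @TraceID, 12, 1, @on;
    EXEC sp_trace_setevent @TraceID, 12, 14, @on;
    EXEC sp_trace_setevent @TraceID, 12, 35, @on;

    -- Keep only events from the database of interest (logical AND, comparison 0 = equal)
    EXEC sp_trace_setfilter @TraceID, 35, 0, 0, N'YourDatabase';

    -- 1 = start the trace
    EXEC sp_trace_setstatus @TraceID, 1;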

Neil Barnwell
Yes, that was my concern. The database server isn't heavily loaded, but I think that running a trace on BatchCompleted for a year might produce a lot of data.
No More Hacks
+1  A: 

This is a similar question, specifically about logging when stored procedures were used: http://stackoverflow.com/questions/1207274/how-do-i-log-the-frequency-and-last-used-time-for-a-stored-procedure

Iain Hoult
Ah yes, I didn't see that one. I have some code that will tag all the stored procs with a logging statement. However, I'd also like to log table and view access. I have considered adding logging triggers to tables and views for insert/update/delete, but I don't know how to trap the "select" access.
No More Hacks
+1  A: 

You can check sys.dm_db_index_usage_stats: the last update is retained in the last_user_update column and the last SELECT in one of last_user_seek, last_user_scan or last_user_lookup. Note that the counters are reset at SQL Server startup, so you need to run your application and do a thorough test of every feature to get relevant results.
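
For example, something along these lines (just a sketch) lists each user table with its most recent recorded reads and writes; tables that have not been touched since the last restart simply have no rows in the DMV:

    -- Last recorded write and reads per user table in the current database.
    -- Objects with no activity since the last restart show NULLs.
    SELECT o.name,
           MAX(u.last_user_update) AS last_write,
           MAX(u.last_user_seek)   AS last_seek,
           MAX(u.last_user_scan)   AS last_scan,
           MAX(u.last_user_lookup) AS last_lookup
    FROM sys.objects AS o
    LEFT JOIN sys.dm_db_index_usage_stats AS u
           ON u.object_id = o.object_id
          AND u.database_id = DB_ID()
    WHERE o.type = 'U'
    GROUP BY o.name
    ORDER BY o.name;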

For stored procedures you should start a server trace monitoring the SP:Starting event. After your tests have run (again exercising every feature of the product), stop the trace and use SQL aggregate functions to count the distinct occurrences of procedure names in TextData in the trace file. You can read the trace with fn_trace_gettable.
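
Once the trace file is on disk, a query along these lines gives you the counts; the file path is a placeholder, and it assumes TextData and StartTime were among the captured columns:

    -- Count SP:Starting (event class 42) occurrences per procedure text in the trace.
    -- DEFAULT as the second argument reads all rollover files.
    SELECT CAST(TextData AS nvarchar(4000)) AS procedure_text,
           COUNT(*)       AS executions,
           MAX(StartTime) AS last_seen
    FROM sys.fn_trace_gettable(N'D:\Traces\sp_usage.trc', DEFAULT)
    WHERE EventClass = 42
    GROUP BY CAST(TextData AS nvarchar(4000))
    ORDER BY executions DESC;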

Remus Rusanu
I had forgotten the dynamic views, so you definitely get a +1 for that. I'm not sure it does quite the right thing, though. It is very useful for finding the _most_ used sprocs/tables/indexes, but I'm not sure it finds every use of every table. I'm not certain though, so don't shoot!
No More Hacks
That's what I mean by "thorough test". If your test battery exercises *every* feature of the product, at the end you'll have collected *every* used table/index/procedure. I understand that in the real world, coming up with a test that exercises everything an application does is not trivial, if it's possible at all.
Remus Rusanu