Quick synopsis:

The guys on my team have been working on a production database (SQL Server 2005). We've added various things such as constraints, triggers, etc.

Now we've found that someone or something has been rolling back our changes at various times. The problem is we all share a common admin login (dumb, yeah I know, we're fixing this). It's causing tons of frustration; at this point we just want to find out whodunnit.

How would you go about tracking down the guilty party?

NOTE: I'm NOT looking for a way to fix this, that's already being done. I'm looking for a way to track down the culprit.

+1  A: 

By adding user-level security like you should have.

ctacke
+10  A: 

Stay away from production databases. Create your scripts and email them to the DBA in charge (if you don't have one, get one). Developers with access to a production database are a recipe for disaster - I don't have it and don't want to have it.

Otávio Décio
Yep... I'd say even follow this rule for testing databases as well. Keep a dev database that the devs can mess around in.
TM
I agree, we call that "acceptance" or "QA" database, where the DBA can apply the patches and make sure it will work before applying to prod. Developers ideally have their own db sandbox to do whatever they want.
Otávio Décio
I agree. Devs should never be allowed near production boxes (web or database). If it works in staging but not in production, then the infrastructure team had better figure out why staging is not exactly like production.
Chris Lively
+1  A: 

Can you cross-reference roll-back times with the whereabouts of people on the team?

Or alternatively - just ask everyone?

Tom Ritter
+2  A: 

I'll assume that you have an audit log with change data capture-esque features. This will keep track of the who, what, and when of each change.

Are the rollbacks intermittent or consistent? Any chance you have auto-commit turned off and forgot to commit your changes?

There can't be that many people who have sufficient permissions to do such a thing. Find out who can do it and ask. Better than any technology you can put in place.

A hacker? It would have to be somebody on the inside. If someone outside your firewall has access to that database, you need to talk to your network people.

Try adding a monitor to that URL and port to see what requests come through.

duffymo
A: 

Asking everyone isn't useful; people lie and/or don't know they're screwing this up. We assume it's malicious but hope it's not.

eviljack
+2  A: 

The thing you are going to have to watch out for is that if someone is maliciously altering the database, and they have admin access, you have to assume they are smart enough to cover their tracks. At this point you can stop further damage, but if the attacker is any good at all, you'll either blame the wrong person because the log files will have been altered, or all the evidence pointing to the right person will be gone.

The best way to do it is to ensure that no one has direct admin access to the production database. We have a system set up so that no account has administrative access by default, and everyone has their own accounts. No one gets to use the SA account.

Someone has to grant the account access, and it is automatically removed 24 hours after being granted. Ideally, the person who grants access shouldn't be the one who gets administrative access to the database. That way two people always have to be involved to make changes to the system.
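As a rough sketch of how such a time-limited grant could work on SQL Server 2005 (the database name, login, and job name below are all hypothetical), the grant and a one-time SQL Agent job that revokes it can go in the same script:

-- All names below are hypothetical; adjust to your environment.
USE ProdDb;
EXEC sp_addrolemember 'db_owner', 'DOMAIN\jsmith';   -- temporary elevated access

-- Schedule the revoke roughly 24 hours from now via a one-time SQL Agent job.
DECLARE @when DATETIME, @d INT, @t INT;
SET @when = DATEADD(HOUR, 24, GETDATE());
SET @d = CONVERT(INT, CONVERT(CHAR(8), @when, 112));   -- yyyymmdd
SET @t = DATEPART(HOUR, @when) * 10000
       + DATEPART(MINUTE, @when) * 100
       + DATEPART(SECOND, @when);                      -- hhmmss

EXEC msdb.dbo.sp_add_job
     @job_name = 'Revoke temp access for jsmith',
     @delete_level = 1;                                -- job deletes itself after it succeeds
EXEC msdb.dbo.sp_add_jobstep
     @job_name = 'Revoke temp access for jsmith',
     @step_name = 'Revoke db_owner',
     @subsystem = 'TSQL',
     @database_name = 'ProdDb',
     @command = 'EXEC sp_droprolemember ''db_owner'', ''DOMAIN\jsmith'';';
EXEC msdb.dbo.sp_add_jobschedule
     @job_name = 'Revoke temp access for jsmith',
     @name = 'Run once',
     @freq_type = 1,                                   -- one-time schedule
     @active_start_date = @d,
     @active_start_time = @t;
EXEC msdb.dbo.sp_add_jobserver
     @job_name = 'Revoke temp access for jsmith';

Putting the grant and the scheduled revoke in one script means temporary access can't quietly become permanent.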

Ideally, two people should always be involved in making changes. That way the second person can verify what the first does. (It's easy to make mistakes at 10 at night after working several hours).

People will counter this by saying that sometimes they "need" to be able to make quick changes. In most places this is not the case. It may take an extra 10 minutes to get a second person involved and explain the situation; it will take years to repair a reputation once someone has stolen or altered corporate data.

Kevin
I disagree with the premise that having admin access automatically makes a person smart enough to cover their tracks. The OP has already stated that everyone has admin access by default. Therefore the knowledge or intelligence of the person can't be ascertained.
Chris Lively
+4  A: 

The problem you're tracking down is obviously a symptom and not the cause. Since it's a SQL Server 2005 database, there should be a 'Default' trace that runs out of the box. It's very lightweight, but it does include some object creation and deletion events. You can view it from the sys.traces view using the following query:

SELECT *
FROM sys.traces
WHERE id = 1

It rolls over after only a few MB, so its usefulness will depend on how much activity there is on the server.
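As a rough sketch (assuming the default trace is still enabled), you can read the contents of the trace file itself with sys.fn_trace_gettable and filter for object creation, alteration, and deletion events, which also record the login and host that issued them:

-- Read the default trace file and list recent object create/alter/drop events,
-- along with the login and host that issued them.
DECLARE @path NVARCHAR(260);
SELECT @path = path FROM sys.traces WHERE is_default = 1;

SELECT t.StartTime, t.LoginName, t.HostName, t.ApplicationName,
       t.DatabaseName, t.ObjectName, e.name AS EventName
FROM sys.fn_trace_gettable(@path, DEFAULT) AS t
JOIN sys.trace_events AS e ON t.EventClass = e.trace_event_id
WHERE e.name IN ('Object:Created', 'Object:Altered', 'Object:Deleted')
ORDER BY t.StartTime DESC

Since everyone is sharing the same admin login, the LoginName column won't tell you much by itself, but HostName and ApplicationName will usually narrow it down to a machine.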

Presumably, the real cause is not having your changes scripted and in version control.

I agree with the other posters who mentioned that all changes to a production database should be made only by an admin, not by individual developers.

Mitch Wheat
A: 

Wow, you've got a real problem then. If you can't trust your own people...

Time to turn off all the IDs except one. Make sure that person knows what they're doing and doesn't lie.

duffymo
+1  A: 

SQL Server 2005 added DDL triggers (in addition to the DML triggers it already had), so you can track who's modifying the data structure as well as the data itself.
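As a minimal sketch (the table and trigger names are made up; adjust what gets logged to taste), a database-scoped DDL trigger can record each DDL statement together with the login and host that ran it:

-- Illustrative audit table; adjust columns as needed.
CREATE TABLE dbo.DDLAudit
(
    EventTime   DATETIME      NOT NULL DEFAULT GETDATE(),
    LoginName   SYSNAME       NOT NULL,
    HostName    NVARCHAR(128) NULL,
    EventType   NVARCHAR(100) NULL,
    ObjectName  NVARCHAR(256) NULL,
    TSQLCommand NVARCHAR(MAX) NULL
);
GO

-- Fires for every database-level DDL statement (CREATE/ALTER/DROP, etc.).
CREATE TRIGGER trgAuditDDL
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    DECLARE @e XML;
    SET @e = EVENTDATA();  -- XML describing the statement that fired the trigger
    INSERT INTO dbo.DDLAudit (LoginName, HostName, EventType, ObjectName, TSQLCommand)
    VALUES (SUSER_SNAME(),
            HOST_NAME(),
            @e.value('(/EVENT_INSTANCE/EventType)[1]',  'NVARCHAR(100)'),
            @e.value('(/EVENT_INSTANCE/ObjectName)[1]', 'NVARCHAR(256)'),
            @e.value('(/EVENT_INSTANCE/TSQLCommand/CommandText)[1]', 'NVARCHAR(MAX)'));
END;
GO

With a shared admin login, SUSER_SNAME() will be the same for everyone, so HOST_NAME() is what actually points at the machine the change came from.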

A: 

In addition to what you've already received in responses, my vote is that it's nobody; you're simply mistaken about how you're using the system.

Now, don't get me wrong, I'm not talking about incompetence here. What I do mean, though, is that there may well be scripts that are running periodically, and someone rightly mentioned that auto-commit may be off when you think it's on, so someone's getting fooled.

I also believe you are asking for trouble by mixing ANY development work in the production environment. Disk space is CHEAP - a terabyte is less than $300 these days! You don't need whiz-bang performance for development work in most circumstances...

Richard T
+1  A: 

If you're fixing it -- and by "fixing it" I mean locking down the production database and following some of the other practices mentioned here -- then don't worry about finding the culprit. It was probably accidental anyway and when you lock it down someone will start wondering why something doesn't work.

Tracking down the user who did it won't solve anything. If it was malicious, they'll lie and say it was accidental.

The root cause is the security on the database, so the group at fault is the one that allowed the database to be so susceptible.

Austin Salonen