views: 735

answers: 4

In my environment I have a database on MS SQL 2000 that is being hit by hundreds of users at any time. There are also lots of intensive reports using Reporting Services 2005 hitting the same database. The problem appears when many reports run at the same time while people are using the database concurrently with the reports: we see so much blocking that the system eventually starts timing out on any transaction made while it is in that state. Is there a global way to minimize the blocking so the transactions can continue to flow? Thanks.

+1  A: 

Use optimistic locking if updates are not happening often and the database is mainly used for reporting.

SQL Server has quite a pessimistic locking default.

A look into SQL Server Table Hints might get you started.
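
For example, a minimal sketch of what that could look like for a reporting query (the table and column names are just placeholders):

    -- Read without taking shared locks; may return uncommitted ("dirty") rows
    SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED

    SELECT OrderID, CustomerID, TotalDue    -- placeholder columns
    FROM dbo.Orders                         -- placeholder table
    WHERE OrderDate >= '20080101'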

Tomalak
I'm currently using the NOLOCK hint and the READ UNCOMMITTED isolation level.
+1  A: 

The reports can use WITH(NOLOCK).

Other possibilities are having the reports run off a read-only replica of the database or running off a datawarehouse version of the database which is optimized for the reporting needs.
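
Something like this, assuming a typical report query (the names are made up):

    -- Per-table hint: the report reads without requesting shared locks,
    -- so it neither blocks writers nor waits on them (at the cost of dirty reads)
    SELECT o.OrderID, o.OrderDate, c.CustomerName
    FROM dbo.Orders o WITH (NOLOCK)
    JOIN dbo.Customers c WITH (NOLOCK) ON c.CustomerID = o.CustomerID
    WHERE o.OrderDate >= '20080101'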

Cade Roux
The problem we have with that workaround is that most of the reports must show up-to-date data in real time, which makes replication a pain to keep up with.
Your definition of up to date may mean that the people writing data have to wait - after all, they might invalidate the report...
Cade Roux
+1  A: 

Since you are already using NOLOCK hints and READ UNCOMMITTED isolation level for your reports, the investigation needs to turn to the transactional queries coming in. This may get deep. Perhaps applications are keeping transactions open too long. It may also be the case that you have a lot of table scans or range scans in some of the other query volume, and those may be holding shared locks for long-running transactions. Those shared locks will block your writers.

You need to start looking at sp_lock to see what kinds of locks are outstanding and what locks the blocked queries are trying to obtain, and then examine the queries that are blocking the requestors.
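
As a rough starting point (the SPID at the end is just an example):

    -- Who is blocked and who is doing the blocking (check the BlkBy column)
    EXEC sp_who2

    -- Outstanding locks; rows with Status = WAIT are the blocked requests
    EXEC sp_lock

    -- Once you have a blocking SPID, see what that connection last ran
    DBCC INPUTBUFFER (55)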

This will help you if you are unfamiliar with SQL Server locking: Understanding SQL Server 2000 Locking

Also, perhaps you could describe your disk subsystem. It may be undersized.

Pittsburgh DBA
A: 

Thanks everyone for your support. What we did to mitigate the problem was to create a new database with a log shipping procedure that runs every hour to keep it in sync with the real one. The reports that do not need real-time data were pointed to that database, and the ones that do need real-time data were restricted so only a few people can access them. The drawbacks of this method are that the data can be up to one hour out of sync and that we needed a new server dedicated to that purpose. Also, when the log shipping procedure runs, every connection is dropped for a very short period of time, which can be a problem for really long procedures or reports. After this I will review the queries from the reports so I can understand what can be optimized. Thanks, and I will recommend the site to the whole IT department.
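
For anyone interested, the hourly refresh boils down to a transaction log restore into the standby copy, roughly like the sketch below (the database name and paths are placeholders). The restore needs exclusive access to the reporting copy, which is why connections get dropped for a moment.

    -- Any connections to the standby copy have to be cleared first:
    -- the restore requires exclusive access to the database.

    -- Apply the latest log backup, leaving the copy readable
    -- (read-only standby) between restores
    RESTORE LOG ReportsStandby
        FROM DISK = 'D:\LogShip\ReportsStandby.trn'          -- placeholder path
        WITH STANDBY = 'D:\LogShip\ReportsStandby_undo.dat'  -- placeholder undo file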