views:

112

answers:

3

Hi, I am having a problem with my SQL Server 2005 database. The database must handle 1000 inserts per second constantly. This is proving very difficult when the database must also handle reporting on the data, and therefore indexing. It seems to slow down after a couple of days, achieving only 300 inserts per second; by 10 days it is almost non-functional.

The requirement is to store 14 days' worth of data. So far I can only manage 3 or 4 days before everything falls apart. Is there a simple solution to this problem?

I was thinking that I could replicate the primary database, letting the new database act as the reporting database and hold the 14 days' worth of data, then truncate the primary database daily. Would this work?

A: 

If the server has multiple hard drives, I would try splitting the database (or even individual tables) into partitions.
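
As a rough sketch of that idea, a daily range-partitioned table in SQL Server 2005 (Enterprise Edition is required for table partitioning; all table, column, and filegroup names here are illustrative, not from the question) might look like:

```sql
-- Sketch: one partition per day, so old days can be switched out cheaply.
CREATE PARTITION FUNCTION pfDaily (datetime)
AS RANGE RIGHT FOR VALUES
    ('2009-01-01', '2009-01-02', '2009-01-03');  -- one boundary per day

-- In practice, map partitions to filegroups on separate drives
-- instead of ALL TO ([PRIMARY]).
CREATE PARTITION SCHEME psDaily
AS PARTITION pfDaily
ALL TO ([PRIMARY]);

CREATE TABLE dbo.SensorReadings (
    ReadingTime datetime NOT NULL,
    Value       float    NOT NULL
) ON psDaily (ReadingTime);
```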

ZippyV
A: 

Yeah, you don't need to copy a database over and then truncate/delete the live database on the fly. My guess is that the slowness is because your transaction logs are growing like crazy.

I think you are trying to say that you want to "shrink" the database periodically. If you are using the FULL recovery model, backing up the transaction log once in a while should let it shrink back down to a normal size.
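
A minimal sketch of that maintenance step, assuming the FULL recovery model this answer describes (the database and logical log file names are illustrative):

```sql
-- Back up the log so its space can be reused internally...
BACKUP LOG MyDatabase TO DISK = N'D:\Backups\MyDatabase_log.trn';

-- ...then, if the file itself must be made smaller, shrink it
-- (target size here is 1024 MB).
DBCC SHRINKFILE (MyDatabase_log, 1024);
```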

djangofan
The database's recovery model is set to SIMPLE. I already shrink the database every 30 mins and I reorganise the indexes once a day.
gjrwebber
OK, I was just curious. Your clustering/mirroring wish is beyond my ability to answer.
djangofan
A: 

It is unlikely you will want reporting running against a database capturing 1000 records per second. I'd suggest two databases, one handling the constant stream of inserts and a second reporting database that only loads records at an interval, either by querying the first for a finite set since the last load or by caching the incoming data and loading it separately.

However, reporting in near real time against a database capturing 86 million rows per day and carrying approximately 1.2 billion rows will require significant planning and hardware. Further, on the back end, as you reach day 14 and start to remove old data, you will put additional load on the database. If you can run with minimal logging, that will help the primary system, but the reporting system, with its indexing demands, will require some pretty significant performance consideration.
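
The interval-based load between the two databases suggested above could be sketched like this (the database names, the watermark table, and the column names are all illustrative assumptions):

```sql
-- Sketch: pull only rows newer than the last load into the reporting
-- database, tracking progress in a one-row watermark table.
DECLARE @last datetime, @now datetime;
SELECT @last = LastLoaded FROM Reporting.dbo.LoadWatermark;
SET @now = GETDATE();

INSERT INTO Reporting.dbo.SensorReadings (ReadingTime, Value)
SELECT ReadingTime, Value
FROM Capture.dbo.SensorReadings
WHERE ReadingTime > @last
  AND ReadingTime <= @now;

UPDATE Reporting.dbo.LoadWatermark SET LastLoaded = @now;
```

Capturing `@now` once and using it in both the load and the watermark update avoids missing rows that arrive while the load runs.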

Joe Skora
This is what I have in mind; however, implementing it is completely new to me. Do you think it would be technically possible to have the database handling the inserts do no indexing, then replicate that database to another (the reporting DB), which would add the indexes? If the problem is the indexes, this way I wouldn't have to truncate any tables; after the 14 days my delete scripts would run on the primary database, and the deletes would be replicated to the reporting DB.
gjrwebber
Without all the details, I think you will need an index on the date column in the main database, both to query the raw data into the reporting database and to delete old data after 14 days. Assuming you are saving the raw data elsewhere (in case of failure), disabling logging on the main database will reduce the load of adding rows.
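
The date index and 14-day purge described above might look like the following sketch (table and index names are illustrative; `DELETE TOP (n)` is available from SQL Server 2005 onward):

```sql
-- Index the date column so range queries and purges can seek, not scan.
CREATE INDEX IX_SensorReadings_ReadingTime
    ON dbo.SensorReadings (ReadingTime);

-- Purge rows older than 14 days in small batches to limit lock and
-- transaction-log pressure on a busy table.
WHILE 1 = 1
BEGIN
    DELETE TOP (10000) FROM dbo.SensorReadings
    WHERE ReadingTime < DATEADD(day, -14, GETDATE());
    IF @@ROWCOUNT = 0 BREAK;
END
```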
Joe Skora