views:

66

answers:

1

We have an application (written in C#) that stores live stock market prices in a database (SQL Server 2005). It inserts about 1 million records in a single day. Now we are adding more market segments to it, and the number of records will roughly double (2 million/day).

Currently the average insertion rate is about 50 records per second, with a maximum of 450 and a minimum of 0.

To check certain conditions I have used Service Broker (as an asynchronous trigger) on my price table. It is running fine at the moment (about 35% CPU utilization).

Now I am planning to create an in-memory dataset of current stock prices, on which we would like to do some simple calculations.
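To make the idea concrete, here is a minimal sketch of such an in-memory cache on the C# side. The Tick shape and the choice to keep only the latest price per symbol are assumptions for illustration, not part of the actual schema:

    using System;
    using System.Collections.Generic;
    using System.Linq;

    // Hypothetical tick shape; the field names are illustrative, not the real schema.
    public class Tick
    {
        public string Symbol;
        public decimal Price;
        public DateTime Time;
    }

    // Keeps only the latest price per symbol and answers simple aggregate
    // questions in memory, so the database is not queried for every check.
    public class PriceCache
    {
        private readonly Dictionary<string, Tick> _latest = new Dictionary<string, Tick>();
        private readonly object _sync = new object();

        public void Update(Tick tick)
        {
            lock (_sync) { _latest[tick.Symbol] = tick; }
        }

        public decimal? MaxPrice()
        {
            lock (_sync)
            {
                return _latest.Count == 0 ? (decimal?)null
                                          : _latest.Values.Max(t => t.Price);
            }
        }

        public decimal? AveragePrice()
        {
            lock (_sync)
            {
                return _latest.Count == 0 ? (decimal?)null
                                          : _latest.Values.Average(t => t.Price);
            }
        }
    }
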

Currently I am using an XML batch insertion method (OPENXML in a stored proc).
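For context, this is roughly how such an XML batch is typically handed to an OPENXML-based stored procedure from C#; the procedure and parameter names here are placeholders, not the actual ones:

    using System.Data;
    using System.Data.SqlClient;

    public static class PriceBatchWriter
    {
        // Sends one XML document containing many price rows to a stored procedure
        // that shreds it with OPENXML and inserts the rows in a single statement.
        public static void InsertPriceBatch(string connectionString, string pricesXml)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            using (SqlCommand cmd = new SqlCommand("dbo.usp_InsertPriceBatch", conn)) // placeholder name
            {
                cmd.CommandType = CommandType.StoredProcedure;
                cmd.Parameters.Add("@pricesXml", SqlDbType.Xml).Value = pricesXml;
                conn.Open();
                cmd.ExecuteNonQuery();
            }
        }
    }
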

I would like to know members' different views on this.

Please describe your way of dealing with such a situation.

A: 

Your question is about reading, but the title implies writing?

When reading, consider (but don't blindly use) temporary tables to cache data if you're going to do some processing. However, by "simple calculations" I assume you mean aggregates like AVG, MAX, etc.?

It would generally be insane to drag the data around, cache it in the client, and aggregate it there.

If batch uploads:

  • SQLBulkCopy or similar into a staging table
  • A single set-based write from staging to the final table (see the sketch below)

If single upload, just insert it
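A minimal sketch of that staging pattern, assuming a staging table dbo.PriceStaging with the same columns as a final dbo.Price table (all table and column names here are placeholders):

    using System.Data;
    using System.Data.SqlClient;

    public static class BatchFlusher
    {
        // Bulk-loads a batch into a staging table, then moves it to the final
        // table in one set-based write. A transaction spanning both steps and
        // error handling are omitted for brevity.
        public static void FlushBatch(string connectionString, DataTable batch)
        {
            using (SqlConnection conn = new SqlConnection(connectionString))
            {
                conn.Open();

                // 1. SqlBulkCopy into the staging table.
                using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.PriceStaging";
                    bulk.WriteToServer(batch);
                }

                // 2. Single write from staging to the final table, then clear staging.
                string move =
                    "INSERT INTO dbo.Price (Symbol, Price, TradeTime) " +
                    "SELECT Symbol, Price, TradeTime FROM dbo.PriceStaging; " +
                    "TRUNCATE TABLE dbo.PriceStaging;";

                using (SqlCommand cmd = new SqlCommand(move, conn))
                {
                    cmd.ExecuteNonQuery();
                }
            }
        }
    }
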

A million rows a day is a rounding error for what SQL Server (or Oracle, MySQL, DB2, etc.) is capable of.

Example: 35k transactions (not rows) per second

gbn
Thanks for the reply. The simple conditions are just checks like >, <, = etc., used for calculating AVG, MAX, MIN and so on; I am still using stored procs for that. The other, more complex calculations keep SQL Server under pressure. We have a UI for adding conditions, so there are plenty of dynamic SQL statements with GROUP BY clauses that need to be executed every 2 seconds. This keeps the server very busy and sometimes CPU usage goes to 100%, at which point we remove some conditions. Now I am trying to move some of the conditions into my application, where I check them directly against the received data (a rough sketch of that idea follows below).
Manjoor
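To illustrate that last point, a hedged sketch of evaluating user-defined threshold conditions directly against received ticks in the application, instead of re-running dynamic GROUP BY queries every 2 seconds. The Condition shape is an assumption, not the actual UI model:

    using System;
    using System.Collections.Generic;

    // Hypothetical user-defined condition: a symbol, a comparison operator and a threshold.
    public class Condition
    {
        public string Symbol;
        public string Op;        // ">", "<" or "="
        public decimal Threshold;
    }

    public static class ConditionChecker
    {
        // Checks every condition against a freshly received price in memory,
        // so SQL Server is not asked to do this work on every tick.
        public static IEnumerable<Condition> Matches(IEnumerable<Condition> conditions,
                                                     string symbol, decimal price)
        {
            foreach (Condition c in conditions)
            {
                if (!string.Equals(c.Symbol, symbol, StringComparison.OrdinalIgnoreCase))
                    continue;

                bool hit = (c.Op == ">" && price > c.Threshold)
                        || (c.Op == "<" && price < c.Threshold)
                        || (c.Op == "=" && price == c.Threshold);

                if (hit)
                    yield return c;
            }
        }
    }
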