I am working on a system that receives a huge volume of incoming events (2,000-5,000 records per minute) which must be recorded in the database (SQL Server 2000). We use an ODBC connection to the database.

Currently we open a single connection at application startup and push all data through it, since opening a new connection for every record insertion is not feasible at this data rate. The challenge we face is this: when there is a network failure or the connection is intermittent, the data doesn't get pushed into the database (we get a CDBException when the network fails). By then it is too late to restore the connection, because more events keep pouring in. Nor is it practical to check the validity of the connection before each insert, because that check is itself time-consuming.

I thought of a method to overcome this issue.

Use a cache/buffer to hold the incoming alarms/events and push them to the database every 'n' seconds and/or whenever the buffer reaches a certain size.
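A minimal sketch of what I have in mind, assuming C++11 threads (the `EventBuffer` name and the flush callback are illustrative; in the real system the callback would perform the batched ODBC insert and handle CDBException):

```cpp
#include <chrono>
#include <condition_variable>
#include <functional>
#include <mutex>
#include <string>
#include <thread>
#include <vector>

// Events accumulate in memory; a background thread hands them to a flush
// callback either when the buffer reaches maxSize or every flush interval,
// whichever comes first. Class name and callback type are assumptions.
class EventBuffer {
public:
    using FlushFn = std::function<void(std::vector<std::string>&&)>;

    EventBuffer(size_t maxSize, std::chrono::milliseconds interval, FlushFn flush)
        : maxSize_(maxSize), interval_(interval), flush_(std::move(flush)),
          worker_(&EventBuffer::run, this) {}

    ~EventBuffer() {
        {
            std::lock_guard<std::mutex> lk(m_);
            stop_ = true;
        }
        cv_.notify_all();
        worker_.join();
        if (!buf_.empty()) flush_(std::move(buf_));  // final drain on shutdown
    }

    void push(std::string event) {
        std::lock_guard<std::mutex> lk(m_);
        buf_.push_back(std::move(event));
        if (buf_.size() >= maxSize_) cv_.notify_all();  // wake the flusher early
    }

private:
    void run() {
        std::unique_lock<std::mutex> lk(m_);
        while (!stop_) {
            cv_.wait_for(lk, interval_,
                         [this] { return stop_ || buf_.size() >= maxSize_; });
            if (!buf_.empty()) {
                std::vector<std::string> batch;
                batch.swap(buf_);
                lk.unlock();          // do the (slow) insert outside the lock
                flush_(std::move(batch));
                lk.lock();
            }
        }
    }

    size_t maxSize_;
    std::chrono::milliseconds interval_;
    FlushFn flush_;
    bool stop_ = false;
    std::vector<std::string> buf_;
    std::mutex m_;
    std::condition_variable cv_;
    std::thread worker_;
};
```

Because the flush runs outside the lock, a slow or failing insert never blocks `push()`, so events keep accumulating in memory while the connection is being restored.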

Will this solution prove helpful? Is there any alternate approach to solve the problem?

Thanks.

+1  A: 

I was about to suggest using a cache or buffer to hold the incoming events and batch insert them but I see you've thought of that already.

You should take a look at MSMQ (Microsoft Message Queuing), which could be exactly what you're looking for: it queues messages locally and delivers them to the server when the connection is available.

Rob Stevenson-Leggett
+1  A: 

I'd have the client write to a file with the date and time in its name. After N records are written, stop writing to that file (start writing to a new file) and have BULK INSERT (Transact-SQL) load that "completed" file. You could add a cleanup routine to delete files that have been processed into the database.
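A rough sketch of the client side, assuming C++ (the `RotatingWriter` name, the sequence-numbered file naming, and the table/path in the BULK INSERT comment are all illustrative):

```cpp
#include <fstream>
#include <ios>
#include <string>
#include <vector>

// Append records to the current file; after maxRecords rows, close it and
// start a new one. Each closed file is "completed" and ready for a separate
// job to load, e.g. (hypothetical table and path):
//   BULK INSERT dbo.Events FROM 'C:\spool\events_0.csv'
//       WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
class RotatingWriter {
public:
    RotatingWriter(std::string prefix, size_t maxRecords)
        : prefix_(std::move(prefix)), maxRecords_(maxRecords) {}

    void write(const std::string& record) {
        if (!out_.is_open()) openNext();
        out_ << record << '\n';
        if (++count_ >= maxRecords_) {       // file completed: hand off to BULK INSERT
            out_.close();
            completed_.push_back(current_);
        }
    }

    const std::vector<std::string>& completedFiles() const { return completed_; }

private:
    void openNext() {
        current_ = prefix_ + "_" + std::to_string(seq_++) + ".csv";
        out_.open(current_, std::ios::trunc);
        count_ = 0;
    }

    std::string prefix_;
    size_t maxRecords_;
    size_t count_ = 0;
    int seq_ = 0;
    std::string current_;
    std::ofstream out_;
    std::vector<std::string> completed_;
};
```

The nice property for your failure scenario is that the disk spool survives network outages: files simply pile up locally and the BULK INSERT job catches up once the database is reachable again.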

KM