I am working on a system that receives a huge volume of incoming events (2,000-5,000 records per minute) which have to be recorded in the database (SQL Server 2000). We are using an ODBC connection to the database.
Currently we open a single connection at application startup and push all the data through it, since opening a new connection for every record insertion is not feasible given the volume of data being pumped in. The challenge we face is this: when there is a network failure or the connection is intermittent, the data doesn't get pushed into the database (we do get a CDBException when the network fails, of course). But by then it is too late to restore the connection, because more events keep pouring in. Nor is it possible to check the validity of the connection before each insert, because that check is itself time-consuming.
I thought of a method to overcome this issue:
Use a cache/buffer to hold the incoming alarms/events and push them into the database every 'n' seconds and/or once the buffer reaches a certain size.
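For what it's worth, here is a rough sketch of the buffering idea in standard C++ (the `Event` struct, the `sink` callback, and all sizes/intervals are placeholders I made up for illustration, not our real code). The sink is where the actual ODBC batch insert would go; on failure the batch is kept and retried later, so events survive a short outage:

```cpp
#include <chrono>
#include <condition_variable>
#include <cstddef>
#include <functional>
#include <mutex>
#include <string>
#include <thread>
#include <vector>

// Placeholder event record; the real one would carry the alarm fields.
struct Event { std::string payload; };

// Thread-safe buffer: flushes when it holds maxSize events or every
// flushInterval, whichever comes first. If the sink fails (e.g. the
// ODBC insert throws / returns false), the batch is put back and
// retried after one interval instead of being lost.
class EventBuffer {
public:
    EventBuffer(std::size_t maxSize,
                std::chrono::milliseconds flushInterval,
                std::function<bool(const std::vector<Event>&)> sink)
        : maxSize_(maxSize), interval_(flushInterval),
          sink_(std::move(sink)), worker_(&EventBuffer::run, this) {}

    ~EventBuffer() {
        { std::lock_guard<std::mutex> lk(m_); stop_ = true; }
        cv_.notify_all();
        worker_.join();
    }

    void push(Event e) {
        std::lock_guard<std::mutex> lk(m_);
        pending_.push_back(std::move(e));
        if (pending_.size() >= maxSize_) cv_.notify_all();
    }

private:
    void run() {
        std::unique_lock<std::mutex> lk(m_);
        while (!stop_) {
            cv_.wait_for(lk, interval_, [this] {
                return stop_ || pending_.size() >= maxSize_;
            });
            if (pending_.empty()) continue;
            std::vector<Event> batch;
            batch.swap(pending_);            // take the batch, free the lock
            lk.unlock();
            bool ok = sink_(batch);          // one batched insert per flush
            lk.lock();
            if (!ok) {
                // Connection down: put the batch back in order, then back
                // off one interval so a dead link is not retried in a loop.
                pending_.insert(pending_.begin(), batch.begin(), batch.end());
                cv_.wait_for(lk, interval_, [this] { return stop_; });
            }
        }
        if (!pending_.empty()) sink_(pending_);  // final drain on shutdown
    }

    std::size_t maxSize_;
    std::chrono::milliseconds interval_;
    std::function<bool(const std::vector<Event>&)> sink_;
    std::vector<Event> pending_;
    std::mutex m_;
    std::condition_variable cv_;
    bool stop_ = false;
    std::thread worker_;   // declared last so it starts fully initialized
};
```

One caveat I already see: during a long outage the buffer grows without bound, so the real version would probably need a cap plus a spill-to-disk or drop policy.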
Will this solution prove helpful? Is there an alternative approach to solving the problem?
Thanks.