I have a Windows Service running on .NET 2.0 (NOT UPGRADABLE TO 3/3.5). The service uses a System.Timers.Timer with an X-second interval to poll for items in a database table.

The Timer.Elapsed event handling code is wrapped in a lock statement to prevent the next Elapsed event from running if the previous one has not finished. For each item, Process A is executed and then the item is deleted from the database. This is the current "production" code.

A second process (Process B) needs to be executed, but only for some of the items (which I can easily isolate using a condition) and only after they have been handled by Process A. The resource used to run Process B only accepts one connection at a time, so having multiple threads for Process B executions is not an option. If I were to run Process B in my Timer.Elapsed event handler along with Process A, it would greatly delay the handler and make it more likely that an Elapsed event will not finish executing in time for the next one, making the system less responsive, etc.

I think it would be best to have a new, single thread execute Process B (another service would, I think, be overkill). Process A would queue items for Process B, so if Process B takes longer than Process A to process its items, the next Process A run would simply add more items to Process B's queue.

I'm aware that using a database for IPC is not exactly great practice, but given that I can't use WCF IPC (which would probably have been my approach) AND that I'm already using a database here...

Would it be considered bad practice to keep using the database as my IPC mechanism, flagging items once Process A finishes so that Process B can select and process them (including the delete)? Or should I still try to use some sort of direct IPC between the threads?

It is VERY important that I'm guaranteed that the items in the queue will persist if the system crashes or the services are shut down.

    // This is roughly how my Elapsed event handler would look
    private readonly object _syncRoot = new object();

    private void timer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
    {
        // Lock on a private object (not "this") so outside code
        // can't accidentally contend on the same lock.
        lock (_syncRoot)
        {
            // Load items from DB
            // foreach item:
            //     Run Process A
            //     if (item.Member == condition)
            //         Queue item for Process B (DB = re-flag, IPC = send message)
            //     If using IPC, delete the item from the DB
        }
    }
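The Process B side would then be a dedicated thread draining the flagged items. A rough sketch of what I have in mind (stopRequested and pollInterval are placeholders for the service's shutdown flag and polling delay):

    // Single background thread (System.Threading) that drains
    // items flagged by Process A, one at a time.
    private void ProcessBWorker()
    {
        while (!stopRequested)
        {
            // Load items from DB where the "ready for B" flag is set
            // foreach item:
            //     Run Process B (one connection only, so strictly sequential)
            //     Delete the item from the DB
            Thread.Sleep(pollInterval); // or block on an event/semaphore
        }
    }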
+2  A: 

Instead of using the lock in timer_Elapsed, can you just disable/enable the timer? Disable the timer at the start, then re-enable it in a finally block. Then you don't have to worry about overlapping events.
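A minimal sketch of that pattern, assuming `timer` is the service's System.Timers.Timer field:

    private void timer_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
    {
        timer.Stop(); // no further Elapsed events while we work
        try
        {
            // Load items from the DB and run Process A on each one...
        }
        finally
        {
            timer.Start(); // restart even if processing threw an exception
        }
    }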

If that's not an option and the timer must fire every interval regardless of processing, then flag the item in the DB and use two timers: one for Process A and one for Process B.

MattH
+1  A: 

On the contrary, using a database as your inter-process communication channel is a terrific practice here, precisely because of the requirement you mention at the end of your question:

> It is VERY important that I'm guaranteed that the items in the queue will persist if the system crashes or the services are shut down.

Just make sure that the flagging happens within the same transactional scope as any other data changes made by Process A.
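A minimal sketch of that, using plain ADO.NET (System.Data.SqlClient, available in .NET 2.0); the WorkItems table, Status column, and local variables are made up for illustration:

    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();
        using (SqlTransaction tx = conn.BeginTransaction())
        {
            // ... Process A's own data changes, all enlisted in tx ...

            // Flag the item for Process B inside the same transaction,
            // so a crash can't lose the item or process it twice.
            using (SqlCommand cmd = new SqlCommand(
                "UPDATE WorkItems SET Status = 'ReadyForB' WHERE Id = @id",
                conn, tx))
            {
                cmd.Parameters.AddWithValue("@id", itemId);
                cmd.ExecuteNonQuery();
            }

            tx.Commit();
        }
    }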

You might even consider breaking this into two services and using a MessageQueue (MSMQ) between the logical processes A and B.
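If you go that route, a transactional MSMQ queue gives you the same persistence guarantee across crashes and restarts. A rough sketch with System.Messaging (the queue path is made up):

    // One-time setup: create a transactional queue (messages survive restarts).
    const string path = @".\private$\processB";
    if (!MessageQueue.Exists(path))
        MessageQueue.Create(path, true); // true = transactional

    // Process A side: enqueue the item's key.
    using (MessageQueue queue = new MessageQueue(path))
    {
        queue.Send(itemId.ToString(), MessageQueueTransactionType.Single);
    }

    // Process B side: blocking receive on its own thread.
    using (MessageQueue queue = new MessageQueue(path))
    {
        queue.Formatter = new XmlMessageFormatter(new Type[] { typeof(string) });
        using (Message msg = queue.Receive(MessageQueueTransactionType.Single))
        {
            string id = (string)msg.Body;
            // ... run Process B for this item ...
        }
    }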

Jeff Sternal