I am investigating the design of a work queue processor where the QueueProcessor retrieves a Command Pattern object from the Queue and executes it in a new thread.

I am trying to get my head around a potential Queue lockup scenario where nested Commands may result in a deadlock.

For example:

A FooCommand object is placed onto the queue which the QueueProcessor then executes in its own thread.

The executing FooCommand places a BarCommand onto the queue.

Assuming that the maximum number of allowed threads is only 1, the QueueProcessor is now deadlocked: the FooCommand waits indefinitely for the BarCommand to complete, but the BarCommand can never start because the only worker thread is occupied by the FooCommand itself.
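To make the scenario concrete, here is a rough sketch of the kind of code I mean (ICommand, CommandQueue and Enqueue are just placeholder names, not my real API):

using System.Threading;

// Placeholder sketch only: with a maximum of 1 worker thread, the single worker
// is busy running FooCommand, so BarCommand never starts and the wait never ends.
class FooCommand : ICommand
{
    private readonly CommandQueue queue;
    public FooCommand(CommandQueue queue) { this.queue = queue; }

    public void Execute()
    {
        // Hand BarCommand to the queue and get something to wait on.
        WaitHandle barDone = queue.Enqueue(new BarCommand());

        // Blocks forever: the only worker is running *this* command.
        barDone.WaitOne();
    }
}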

How can this situation be managed? Is a queue object the right object for the job? Are there any checks and balances that can be put into place to resolve this issue?

Many thanks. (The application uses C# on .NET 3.0.)

+1  A: 

For simple cases like this, an additional monitoring thread that can spin off more threads on demand is helpful.

Basically, every N seconds check whether any jobs have finished; if not, add another thread.

This won't necessarily handle even more complex deadlock problems, but it will solve this one.
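A rough sketch of that watchdog, assuming your queue exposes a completed-job counter and a way to add workers (those member names, and checkIntervalMs, are invented here for illustration):

using System.Threading;

int checkIntervalMs = 5000;   // "every N seconds"

// Keep a reference to the timer so it isn't garbage collected.
// If nothing finished while work is still pending, assume the workers
// are blocked and add one more thread.
Timer watchdog = new Timer(delegate(object state)
{
    if (queue.JobsCompletedSinceLastCheck == 0 && queue.PendingJobCount > 0)
    {
        queue.AddWorkerThread();
    }
    queue.ResetCompletedCounter();
}, null, checkIntervalMs, checkIntervalMs);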

My recommendation for the heavier problem is to restrict waits to newly spawned processes; in other words, you can only wait on something you started yourself. That way you never get deadlocks, since wait cycles are impossible in that situation.

Guvante
+2  A: 

You could redesign things so that FooCommand doesn't use the queue to run BarCommand but runs it directly (sketched below). Alternatively, you could split FooCommand into two: have the first half stop immediately after queueing BarCommand, and have BarCommand queue the second half of FooCommand after it has done its own work.
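For the first option, the nested call is just an ordinary method call, something like (names are only illustrative):

public void Execute()
{
    // ... Foo's work up to the point where Bar is needed ...

    // Run Bar inline on this thread; nothing is queued, so nothing can deadlock.
    new BarCommand().Execute();

    // ... the rest of Foo's work, which can now use Bar's outcome ...
}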

Khoth
+2  A: 

Queuing implicitly assumes an asynchronous execution model. By waiting for the command to exit, you are working synchronously.

Maybe you can split the command up into three parts: FooCommand1, which executes until the BarCommand has to be sent; BarCommand itself; and finally FooCommand2, which continues after BarCommand has finished. These three commands can be queued separately. Of course, BarCommand should make sure that FooCommand2 gets queued.
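A rough sketch of that hand-off, with the queue field and command types named purely for illustration:

// No command ever waits on the queue; control flows forward instead.
class FooCommand1 : ICommand
{
    public void Execute()
    {
        // ... first half of Foo's work ...
        queue.Enqueue(new BarCommand());          // no wait here: hand off and return
    }
}

class BarCommand : ICommand
{
    public void Execute()
    {
        int result = DoBarWork();                 // ... Bar's work ...
        queue.Enqueue(new FooCommand2(result));   // Bar queues Foo's continuation itself
    }
}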

jan.vdbergh
+1  A: 

If you are building the Queue object yourself there are a few things you can try:

  1. Dynamically add new service threads. Use a timer and add a thread if the available thread count has been zero for too long.
  2. If a command is trying to queue another command and wait for the result, then you should synchronously execute the second command in the same thread (a rough sketch follows below). If the first thread simply waits for the second you won't get a concurrency benefit anyway.
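For option 2, a sketch of what that check might look like inside your queue class (the member names here are invented, not from any library):

public TResult ExecuteAndWait<TResult>(Command<TResult> command)
{
    if (IsCurrentThreadAWorker())     // called from inside a queued command?
    {
        command.Execute();            // run it inline: no extra thread, no chance of deadlock
        return command.Result;
    }

    Enqueue(command);                 // normal path: schedule and block until done
    command.WaitForCompletion();
    return command.Result;
}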
hfcs101
+1  A: 

I assume you want to queue BarCommand so it is able to run in parallel with FooCommand, but FooCommand will need Bar's result at some later point. If this is the case then I would recommend using Future<T> from the Parallel Extensions library.

Bart DeSmet has a good blog entry on this. Basically you want to do:


public void FooCommand()
{
    Future<int> BarFuture = new Future<int>( () => BarCommand() );

    // Do Foo's Processing - Bar will (may) be running in parallel

    int barResult = BarFuture.Value;

    // More processing that needs barResult
}

With libraries such as the Parallel Extensions available, I'd avoid "rolling your own" scheduling.

mancaus