I've had the following code in my application for some years and have never seen an issue from it.

while ((PendingOrders.Count > 0) || (WaitHandle.WaitAny(CommandEventArr) != 1))
{
    lock (PendingOrders)
    {
       if (PendingOrders.Count > 0)
       {
           fbo = PendingOrders.Dequeue();
       }
       else
       {
           fbo = null;
       }
    }

    // Do Some Work if fbo is != null
}

Where CommandEventArr is made up of the NewOrderEvent (an auto reset event) and the ExitEvent (a manual reset event).

But I'm not sure if this is thread-safe (assuming N producer threads that all lock the queue before enqueuing, and one consumer thread that runs the code above). Also, we can assume that the Queue.Count property just returns an instance Int32 field from the Queue class (without volatile, Interlocked, a lock, etc.).

What's the usual pattern used with a Queue and an AutoResetEvent to fix this and do what I'm trying to do with the code above?

(Edited to change the question slightly after it was correctly pointed out that Queue.Count could do anything and it's implementation-specific.)

A: 

You should use only the WaitAny for that, and ensure that it gets signaled for every new order added to the PendingOrders collection:

while (WaitHandle.WaitAny(CommandEventArr) != 1)
{
   lock (PendingOrders)
   {
      if (PendingOrders.Count > 0)
      {
          fbo = PendingOrders.Dequeue();
      }
      else
      {
          fbo = null;

          //Only if you want to exit when there are no more PendingOrders
          return;
      }
   }

   // Do Some Work if fbo is != null
}
iCe
If two items are queued while one is executing, this will only dequeue one of them and then wait for another to be queued.
Hightechrider
That's exactly the issue that led me to use the Count > 0 check in the first place. If some other thread enqueues an item and sets the event while I'm in the while loop, I will miss it.
Michael Covelli
+2  A: 

Using manual events...

ManualResetEvent[] CommandEventArr = new ManualResetEvent[] { NewOrderEvent, ExitEvent };

while ((WaitHandle.WaitAny(CommandEventArr) != 1))
{
    lock (PendingOrders)
    {
        if (PendingOrders.Count > 0)
        {
            fbo = PendingOrders.Dequeue();
        }
        else
        {
            fbo = null;
            NewOrderEvent.Reset();
        }
    }
}

Then you need to ensure a lock on the enqueue side as well:

    lock (PendingOrders)
    {
        PendingOrders.Enqueue(obj);
        NewOrderEvent.Set();
    }
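Assembled into a self-contained sketch of the pattern above (the `OrderPump` class, the `Consume`/`Exit` wiring, and the string payloads are illustrative additions, not part of the original answer):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// The NewOrderEvent MRE stays set while the queue is non-empty and is reset,
// under the lock, only when the consumer finds the queue empty. Any number of
// producers may call Enqueue; a single consumer runs Consume.
class OrderPump
{
    static readonly Queue<string> PendingOrders = new Queue<string>();
    static readonly ManualResetEvent NewOrderEvent = new ManualResetEvent(false);
    static readonly ManualResetEvent ExitEvent = new ManualResetEvent(false);

    public static void Enqueue(string order)
    {
        lock (PendingOrders)
        {
            PendingOrders.Enqueue(order);
            NewOrderEvent.Set();
        }
    }

    public static void Exit()
    {
        ExitEvent.Set();
    }

    public static List<string> Consume()
    {
        var processed = new List<string>();
        var commandEventArr = new WaitHandle[] { NewOrderEvent, ExitEvent };

        // WaitAny returns the lowest signaled index, so pending orders
        // (index 0) win over the exit request (index 1) until the queue drains.
        while (WaitHandle.WaitAny(commandEventArr) != 1)
        {
            string fbo;
            lock (PendingOrders)
            {
                if (PendingOrders.Count > 0)
                {
                    fbo = PendingOrders.Dequeue();
                }
                else
                {
                    fbo = null;
                    NewOrderEvent.Reset();
                }
            }

            if (fbo != null)
                processed.Add(fbo);  // "Do Some Work" stands in here
        }
        return processed;
    }
}
```

Note that even if ExitEvent is set while orders are still queued, the consumer drains them all before exiting, because WaitAny prefers the lower index.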
csharptest.net
This does look like it would work. Let me think about it some more, but that does look right to me. Thanks!
Michael Covelli
+1: I have only given this a cursory look, but I believe this solution is indeed safe. The reason this code is easier to understand is because it does not try any of those fancy lock-free strategies (which are **huge** red flags).
Brian Gideon
I should clarify that this is safe only if there is a single thread doing the enqueueing. Having two or more threads enqueueing may lead to a situation where the queue has an item but the ARE is not set.
Brian Gideon
Why does N threads enqueueing cause any issues (assuming they all lock the queue before enqueueing)?
Michael Covelli
@Michael: Because two enqueueing threads could call Set on the ARE before a single dequeueing thread is released. That leaves the queue with 2 items, but only one dequeueing thread is released before the ARE is reset. That means only 1 item is dequeued.
Brian Gideon
Right. But then the Count > 0 would make the while loop continue to dequeue the second item, right? (I should have said before that I am assuming only one consumer thread).
Michael Covelli
@Michael: In your code...yes (probably). But, in this answer's code...no.
Brian Gideon
@Brian Ah, I see what you're saying. Without a Count > 0 type of check then this code does have an issue. If two threads enqueue at the same time, we would only get the first item.
Michael Covelli
The event is only reset when the count reaches zero. That is why I'm using an MRE instead of an ARE. You can safely have any number of threads enqueue and dequeue, assuming of course that the runtime commits changes to the queue's memory structure prior to unlock. This has, AFAIK, always been the case.
csharptest.net
@csharptest.net You're right, I wasn't reading your code carefully. I think this looks thread-safe for any number of readers.
Michael Covelli
+1  A: 

Looks quite thread-safe to me; the WaitAny() will simply complete immediately because the event is already set. That's not a problem.

Don't break threading sync that works. But if you want a better mousetrap then you could consider Joe Duffy's BlockingQueue in this magazine article. A more general version of it is available in .NET 4.0, System.Collections.Concurrent.BlockingCollection.
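For reference, a minimal sketch of what the .NET 4.0 BlockingCollection approach looks like (the class, method names, and order payloads below are illustrative assumptions, not from the question):

```csharp
using System;
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Threading.Tasks;

class BlockingCollectionDemo
{
    // BlockingCollection wraps a ConcurrentQueue by default and handles all
    // the signaling internally; no events or explicit locks are needed.
    public static List<string> Run()
    {
        var pendingOrders = new BlockingCollection<string>();
        var processed = new List<string>();

        var consumer = Task.Factory.StartNew(() =>
        {
            // Blocks until an item arrives; the enumeration completes once
            // CompleteAdding has been called and the collection is drained.
            foreach (var fbo in pendingOrders.GetConsumingEnumerable())
                processed.Add(fbo);
        });

        // Any number of producer threads may call Add concurrently.
        pendingOrders.Add("order-1");
        pendingOrders.Add("order-2");

        pendingOrders.CompleteAdding();  // plays the role of the ExitEvent
        consumer.Wait();
        return processed;
    }
}
```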

Hans Passant
But since this is an AutoResetEvent, if the other thread sets the event before I have a chance to wait on it, then won't I have an issue? I agree that if it was a ManualResetEvent then it's a different story.
Michael Covelli
Whaaa...it's in 4.0? How come I didn't get the memo? Ya learn something new everyday.
Brian Gideon
@Michael: that's not how ARE works, the Wait call resets it on exit. It doesn't reset on entry, then waits.
Hans Passant
Ah, I think you're right and I'm mistaken about the nature of ARE. Are you saying that an ARE won't get reset until at least one thread somewhere waits on it?
Michael Covelli
You sure it's thread safe? There's the potential for one thread to be invoking `Queue<T>.Dequeue()` and another invoking `Queue<T>.Count` on the same queue at the same time... that *might* be thread safe depending on the implementations of those methods, but I know that concurrent read-writes definitely cause problems on other collections, like the `Dictionary<TKey, TValue>`.
Aaronaught
@Michael: That is exactly right. But, to be very pedantic, the code still is not safe. See my answer for an explanation.
Brian Gideon
Agreed, Count could be read as 0 when it is already set to 1, *before* the if statement. The Wait call fixes it though. Even if it wouldn't, the while() takes care of it.
Hans Passant
@Aaronaught I should have specified that I may have multiple threads enqueuing orders (producers) but only one thread dequeuing them (consumer). So only one thread is dequeuing and looking at Count. The other threads never look at Count or dequeue; they only lock the queue and enqueue items.
Michael Covelli
@Hans Agreed, even if my one consumer thread sees a 0 for the count when it's "really" 1 (because the 0 is cached and it hasn't gotten the 1 on that thread yet or something), then the event will fix it. The Count > 0 is only in there for when someone enqueues M items rapidly and sets the event M times. I think that I still do need that check in there, right? ARE won't keep track of the number of times it's set, only whether it is set or not, right?
Michael Covelli
@Michael, yes, calling Set() on ARE more than once would be a problem if you counted them off. You don't.
Hans Passant
My concern would not be that `Count` returns the wrong value if it's executed concurrently. That is exactly the point of a double-checked lock. My concern is that it could literally crash. It happens with some of the other collections (excluding the .NET 4.0 concurrent collections). If this particular DCL is stable, then it's only because of the implementation details of the `Queue<T>` (in other words, lucky).
Aaronaught
@Aaronaught I understand that if Queue<T> implementation of Count was changed to iterate through the items or do something else then there might be an issue. But do you think there still is one if the question is changed so that you can assume that Queue<T> has one instance member Count that is not locked or volatile or interlocked or anything?
Michael Covelli
@Michael: As a rule, I simply don't assume anything about the internal workings of some component, or base any decisions on characteristics that are not guaranteed to be static. If the documentation says it's not thread-safe, then I synchronize all access (not just write access). You wouldn't necessarily expect a `Dictionary` to throw an exception if one thread is adding a value and another thread is looking up a different value, and yet it does (sometimes). If you happen to be able to get away with a double-checked lock here... then you're lucky.
Aaronaught
Well, nothing is guaranteed but death and taxes when you use a property or method of a thread-unsafe object in a thread-unsafe manner. But Count does misbehave predictably: a property that doesn't mutate the object state cannot corrupt it. It can only return the wrong value. Duffy doesn't like to take the risk though; impossible to argue with. I would argue about changing something that has worked well for several years.
Hans Passant
@Hans Thanks for your help at thinking about this. The consensus seems to be that it's a bad idea to rely on implementation details that may change, even if this code happens to work. I guess I should rely on the link that you sent me to re-work this. Thanks!
Michael Covelli
+2  A: 

You are correct. The code is not thread-safe. But not for the reason you think.

The AutoResetEvent is fine. Though, only because you acquire a lock and retest PendingOrders.Count. The real crux of the matter is that you are calling PendingOrders.Count outside of a lock. Because the Queue class is not thread-safe your code is not thread-safe...period.

Now in reality you will probably never have a problem with this, for two reasons. First, the Queue.Count property is almost certainly designed to never leave the object in a half-baked state. After all, it will probably just return an instance variable. Second, the lack of a memory barrier on that read will not have a significant impact in the broader context of your code. The worst thing that will happen is that you will get a stale read on one iteration of the loop; the acquired lock will then implicitly create a memory barrier and a fresh read will take place on the next iteration. I am assuming here that there is only one thread queueing items. Things change considerably if there are 2 or more.

However, let me make this perfectly clear. You have no guarantee that PendingOrders.Count will not alter the state of the object during its execution. And because it is not wrapped in a lock, another thread could initiate an operation on it while it is still in that half-baked state.
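An alternative that avoids the unsynchronized Count read entirely is the classic Monitor.Wait/Pulse blocking queue. A minimal sketch follows; the class and member names are mine, not from the question:

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// A minimal blocking queue built on Monitor.Wait/Pulse. Every access to the
// inner queue, including its Count, happens under the same lock, so there is
// no unsynchronized read to reason about.
public class BlockingQueue<T>
{
    private readonly Queue<T> _items = new Queue<T>();
    private bool _closed;

    public void Enqueue(T item)
    {
        lock (_items)
        {
            _items.Enqueue(item);
            Monitor.Pulse(_items);  // wake one waiting consumer
        }
    }

    public void Close()
    {
        lock (_items)
        {
            _closed = true;
            Monitor.PulseAll(_items);  // release every blocked consumer
        }
    }

    // Blocks until an item is available; returns false once the queue has
    // been closed and drained.
    public bool TryDequeue(out T item)
    {
        lock (_items)
        {
            while (_items.Count == 0 && !_closed)
                Monitor.Wait(_items);  // releases the lock while waiting

            if (_items.Count > 0)
            {
                item = _items.Dequeue();
                return true;
            }

            item = default(T);
            return false;
        }
    }
}
```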

Brian Gideon
Indeed, Queue.Count appears to just return an int field in the current implementation, and it's not marked as volatile.
Hightechrider
Hi Brian, as to your first point, I agree. An arbitrary Queue<T> implementation could do anything in the Count property, including iterating through all the elements. I should change the question to say that we can assume that Queue<T>.Count just returns an instance variable (a non-volatile, non-interlocked read).
Michael Covelli
I agree that I could get a stale read. That is, the real value of Count is 1, but my read of Count in the while loop condition returns 0. But I think that would still be okay. The Count > 0 check is only there to catch the case where the producers enqueue M orders at the same time, so that I don't just process one of the orders. I think you're right that the memory barrier of the lock will save me. Just trying to think through whether there is any case in practice where there would be an issue. (Including having N producer threads, assuming each locks the queue before enqueuing.)
Michael Covelli
@Michael: Yeah, you very well could be right. In reality you might not have an issue. It is very difficult to think about. Even threading experts would admit this is an incredibly difficult problem. Your best option is to get a correct implementation of a BlockingQueue and use that. .NET 4.0 comes with one if you happen to be using the new version of the framework.
Brian Gideon