I have a Queue object that I need to ensure is thread-safe. Would it be better to use a lock object like this:

lock (myLockObject)
{
    // do stuff with the queue
}

Or is it recommended to use Queue.Synchronized like this:

Queue.Synchronized(myQueue).whatever_i_want_to_do();

From reading the MSDN docs, it sounds like I should use Queue.Synchronized to make it thread-safe, but then the docs give an example using a lock object. From the MSDN article:

To guarantee the thread safety of the Queue, all operations must be done through this wrapper only.

Enumerating through a collection is intrinsically not a thread-safe procedure. Even when a collection is synchronized, other threads can still modify the collection, which causes the enumerator to throw an exception. To guarantee thread safety during enumeration, you can either lock the collection during the entire enumeration or catch the exceptions resulting from changes made by other threads.

If calling Synchronized() doesn't ensure thread-safety, what's the point of it? Am I missing something here?

A: 

It seems clear to me that using a lock(...) {...} block is the right answer.

To guarantee the thread safety of the Queue, all operations must be done through this wrapper only.

If other threads access the queue without using .Synchronized(), then you'll be up a creek - unless all your queue access is locked up.
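
For example (a rough sketch with illustrative names), the wrapper only helps if every piece of code uses it:

// Create the synchronized wrapper once and hand out only that reference.
Queue rawQueue = new Queue();
Queue sharedQueue = Queue.Synchronized(rawQueue);

// Fine: this call goes through the wrapper.
sharedQueue.Enqueue("item");

// Not fine: any code still holding rawQueue bypasses the wrapper, and the
// "all operations through this wrapper" guarantee no longer holds.
rawQueue.Enqueue("item");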

Greg Hurlman
The wrapper the article is talking about is the Synchronized() wrapper, not the lock() wrapper.
Jon Tackabury
Right - but using it in one thread doesn't make the other threads compliant. The point is that if you rely on .Synch() only, you'll have an easier time screwing yourself.
Greg Hurlman
+13  A: 

There's frequently a tension between demands for 'thread-safe collections' and the requirement to perform multiple operations on the collection in an atomic fashion.

So Synchronized() gives you a collection which won't smash itself up if multiple threads add items to it simultaneously, but it doesn't magically give you a collection that knows that during an enumeration, nobody else must touch it.

As well as enumeration, common operations like "is this item already in the queue? No, then I'll add it" also require synchronisation whose scope is wider than a single operation on the queue.
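
For instance (a rough sketch - syncRoot is just whatever shared object you lock on, and it has to be the same object everywhere the queue is touched):

lock (syncRoot)
{
    // The check and the add must sit inside one lock, otherwise another
    // thread can enqueue the same item between the two calls.
    if (!queue.Contains(item))
    {
        queue.Enqueue(item);
    }
}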

Will Dean
+11  A: 

Personally I always prefer locking. It means that you get to decide the granularity. If you just rely on the Synchronized wrapper, each individual operation is synchronized but if you ever need to do more than one thing (e.g. iterating over the whole collection) you need to lock anyway. In the interests of simplicity, I prefer to just have one thing to remember - lock appropriately!
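
For example, iteration has to happen entirely inside one lock, so the per-method Synchronized wrapper buys you nothing there (a rough sketch - queue and syncRoot are assumed shared fields):

lock (syncRoot)
{
    // Hold the lock for the whole enumeration so no other thread can
    // modify the queue and invalidate the enumerator.
    foreach (object item in queue)
    {
        Console.WriteLine(item);
    }
}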

Jon Skeet
Maybe add something about lock-free structures? They don't warrant a new reply.
Jonathan C Dickinson
@Jonathan: Not sure what there is to say about lock-free structures in this particular case. I generally don't bother trying to go lock-free - it involves a *lot* of head-scratching, and I can't remember *ever* being in a synchronization-bottleneck situation.
Jon Skeet
+10  A: 

There's a major problem with the Synchronized methods in the old collection library, in that they synchronize at too low a level of granularity (per method rather than per unit-of-work).

There's a classic race condition with a synchronized queue, shown below: you check Count to see whether it is safe to dequeue, but the Dequeue call then throws an exception indicating the queue is empty. This occurs because each individual operation is thread-safe, but the value of Count can change between when you query it and when you use the value.

object item;
if (queue.Count > 0)
{
    // at this point another thread dequeues the last item, and then
    // the next line will throw an InvalidOperationException...
    item = queue.Dequeue();
}

You can safely write this using a manual lock around the entire unit-of-work (i.e. checking the count and dequeueing the item) as follows:

object item;
lock (queue)
{
    if (queue.Count > 0)
    {
        item = queue.Dequeue();
    }
}

Since you can't even safely dequeue anything from a synchronized queue without extra locking, I wouldn't bother with it and would just use manual locking.

.NET 4.0 should have a whole bunch of properly implemented thread-safe collections, but that's still nearly a year away unfortunately.
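
For reference, the System.Collections.Concurrent types coming in .NET 4.0 combine the emptiness check and the removal into a single atomic call, so the race above can't occur; roughly:

ConcurrentQueue<string> queue = new ConcurrentQueue<string>();
queue.Enqueue("work item");

string item;
if (queue.TryDequeue(out item))
{
    // The item was removed atomically; no separate Count check is needed.
}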

Greg Beech
+2  A: 

This way we don't need to lock the queue just to find out it was empty. The unlocked Count check is only a fast-path hint, so the count has to be re-checked once the lock is held:

object item;
// Unsynchronized fast-path check: it may be stale, so it only tells us
// whether it's worth taking the lock at all.
if (queue.Count > 0)
{
    lock (queue)
    {
        // Re-check under the lock; another thread may have emptied the
        // queue in the meantime.
        if (queue.Count > 0)
        {
            item = queue.Dequeue();
        }
    }
}
Leonidius