I need to implement a producer/consumer bounded queue, multiple consumers against a single producer.

I have a Push function that adds an item to the queue and then checks the count against maxSize: if we have reached it, return false; otherwise return true.

In the following code, _vector is a List&lt;T&gt;, and onSignal basically consumes an item asynchronously.

Do you see issues with this code?

public bool Push(T message)
{
    bool canEnqueue = true;

    lock (_vector)
    {
        _vector.Add(message);
        if (_vector.Count >= _maxSize)
        {
            canEnqueue = false;
        }
    }

    var onSignal = SignalEvent;
    if (onSignal != null)
    {
        onSignal();
    }

    return canEnqueue;
}
A: 

I know you said single-producer, multiple-consumer, but it's worth mentioning anyway: if your queue is almost full (say 24 out of 25 slots), then if two threads Push at the same time, you will end up exceeding the limit. If there's even a chance you might have multiple producers at some point in the future, you should consider making Push a blocking call, and have it wait for an "available" AutoResetEvent which is signaled after either an item is dequeued or after an item is enqueued while there are still slots available.
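As a minimal sketch of the blocking variant suggested above (using Monitor on the queue's own lock rather than an AutoResetEvent; all names here are illustrative, not from the question's code):

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Illustrative sketch only: a Push that blocks while the queue is full,
// which stays correct even if multiple producers appear later.
public class BoundedQueue<T>
{
    private readonly Queue<T> _queue = new Queue<T>();
    private readonly int _maxSize;

    public BoundedQueue(int maxSize) { _maxSize = maxSize; }

    public int Count { get { lock (_queue) return _queue.Count; } }

    public void Push(T message)
    {
        lock (_queue)
        {
            // Wait for a free slot; Monitor.Wait releases the lock while
            // sleeping and reacquires it before re-checking the condition.
            while (_queue.Count >= _maxSize)
            {
                Monitor.Wait(_queue);
            }
            _queue.Enqueue(message);
            // Wake any thread blocked on this monitor (consumers waiting
            // for an item share the same lock object).
            Monitor.PulseAll(_queue);
        }
    }
}
```

The `while` loop (rather than `if`) matters: after waking, the thread must re-check the condition, since another producer may have filled the freed slot first.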

The only other potential issue I see is the SignalEvent. You don't show us the implementation of that. If it's declared as public event SignalEventDelegate SignalEvent, then you will be OK because the compiler automatically adds a SynchronizedAttribute. However, if SignalEvent uses a backing delegate with add/remove syntax, then you will need to provide your own locking for the event itself, otherwise it will be possible for a consumer to detach from the event just a little too late and still receive a couple of signals afterward.

Edit: Actually, that is possible regardless; more importantly, if you've used a property-style add/remove delegate without the appropriate locking, it is actually possible for the delegate to be in an invalid state when you try to execute it. Even with a synchronized event, consumers need to be prepared to receive (and discard) notifications after they've unsubscribed.
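To illustrate the race being described (a minimal sketch; the snapshot-before-null-test pattern is the one from the question, the surrounding class is illustrative):

```csharp
using System;

public class Producer
{
    // Field-like event: the compiler generates thread-safe add/remove
    // accessors, so no extra locking is needed for subscription itself.
    public event Action SignalEvent;

    public void Raise()
    {
        // Snapshot the delegate. A consumer that unsubscribes after this
        // line runs will still be invoked one last time, which is why
        // handlers must be prepared to ignore signals received after
        // detaching.
        var handler = SignalEvent;
        if (handler != null)
        {
            handler();
        }
    }
}
```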

Other than that I see no issues - although that doesn't mean that there aren't any, it just means I haven't noticed any.

Aaronaught
Aaronaught, thanks for your prompt feedback. No multiple producers in the foreseeable future, but then max size is also a loose suggestion ... I don't need to strictly enforce it. SignalEvent is declared as `internal event Action SignalEvent;`. I'm not familiar with the SynchronizedAttribute ... will check into it. I'm building this on top of retlang; it really helps in terms of providing a general framework for multithreaded apps.
lboregard
@lboregard: You don't need to worry about `SynchronizedAttribute` if you don't use custom `add`/`remove` methods on the delegate - the compiler generates them for you. It's only when you override the default behaviour that you need to implement custom locking. This seems to be largely undocumented but it is fairly well-known.
Aaronaught
A: 

The biggest problem I see there is the use of List&lt;T&gt; to implement a queue; there are performance issues doing this, as removing the first item is O(n): all the remaining elements have to be copied down.

Additional thoughts: you're raising the signal even if you didn't add data, and the use of events itself can have issues with threading (there are some edge cases even when you capture the value before the null test - plus it is possibly more overhead than using the Monitor to do the signalling).

I would switch to a Queue&lt;T&gt;, which won't have this problem - or better, use a pre-rolled implementation, for example the one in "Creating a blocking Queue in .NET?", which does exactly what you describe and supports any number of both producers and consumers. It uses the blocking approach, but a "try" approach would be:

public bool TryEnqueue(T item)
{
    lock (queue)
    {
        if (queue.Count >= maxSize) { return false; }
        queue.Enqueue(item);
        if (queue.Count == 1)
        {
            // wake up any blocked dequeue
            Monitor.PulseAll(queue);
        }
        return true;
    }
}
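For completeness, here is a sketch of the consumer side that the `PulseAll` above wakes (hypothetical, in the same style; the answer's linked question has a full implementation):

```csharp
// Blocks until an item is available. Assumes the same `queue` field
// used by TryEnqueue above.
public T Dequeue()
{
    lock (queue)
    {
        // Sleep until TryEnqueue pulses after adding to an empty queue;
        // re-check in a loop in case another consumer got the item first.
        while (queue.Count == 0)
        {
            Monitor.Wait(queue);
        }
        return queue.Dequeue();
    }
}
```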

Finally - don't you "push" to a stack, not a queue?

Marc Gravell
Marc, thanks for the reply. I didn't mention that I need to be able to add elements at the top of the "queue", so a straight Queue&lt;T&gt; won't help me. This should happen infrequently, but I'd appreciate any suggestion that could help performance. I'm aware of the blocking queue implementations based on monitors, but I'm using retlang as the base framework to manage the synchronization between threads and I'm still learning its intricacies; I will revisit the use of such a blocking queue.
lboregard