I'm writing a program where one thread needs to push items onto a queue, and one or more threads pop items off the queue and process them. To avoid running out of memory, I'd like the producer thread to sleep when the queue gets full. Some items have a higher priority than others, so I'd like those to be processed first. If two items have the same priority, I'd like the one that was added first to be processed first.

I want to display the top 100 items or so in a WPF DataGrid, so it needs to be accessed by a UI thread as well. Would be nice if it could notify the UI thread that there's been an update as well, i.e., implements IObservable.

Is there a container class that will do all this?

For bonus points, I'm pretty sure the entire queue doesn't need to be locked both when enqueuing and dequeuing.

.NET 4 implementations are fine.

+2  A: 

If you're on .NET 4 you should seriously consider the Task Parallel Library with a custom scheduler like the example QueuedTaskScheduler. I'm not sure it meets all your requirements, but it would be a good start.

Sean Fausett
-1... does not solve ANY problem the user mentioned.
TomTom
I may have been off the mark with my answer, but I strongly disagree that it could not solve the problem. The data processing on dedicated threads is effectively being scheduled by the ordering of the data in the collection. A task is a higher level abstraction that could encapsulate the data and its processing without dealing directly with threads - which has many benefits. In such a scenario the scheduling could be achieved via the task scheduler instead.
Sean Fausett
+1  A: 

You are out of luck looking for a container - you have to implement one yourself. Be careful with priorities - sorting gets slow fast. What I do is use a queue class I implemented myself that internally uses multiple arrays (one per coded priority - low, middle, high). This way I never sort. Avoid locks if you can (assuming multi-core) and go for spin locks (SpinLock in .NET 4.0); they are faster and carry less overhead in a queue scenario.

TomTom
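A rough sketch of the multi-queue idea TomTom describes (type and member names are illustrative, not his actual code; ConcurrentQueue<T> already avoids full locks internally, so no explicit spin locks are shown here):

```csharp
using System.Collections.Concurrent;

public enum Priority { High, Middle, Low }

// One lock-free queue per coded priority, so nothing is ever sorted:
// dequeuing just scans the queues from High down to Low.
public class ThreePriorityQueue<T>
{
    private readonly ConcurrentQueue<T>[] _queues =
    {
        new ConcurrentQueue<T>(), // High
        new ConcurrentQueue<T>(), // Middle
        new ConcurrentQueue<T>()  // Low
    };

    public void Enqueue(T item, Priority priority)
    {
        _queues[(int)priority].Enqueue(item);
    }

    public bool TryDequeue(out T item)
    {
        // Highest-priority queue that has an item wins; FIFO within a queue.
        foreach (var queue in _queues)
            if (queue.TryDequeue(out item))
                return true;
        item = default(T);
        return false;
    }
}
```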
I'd never resort the whole collection every insert... that's just crazy. I'd use multiple arrays (or queues) as you suggested, or some kind of tree structure.
Mark
Tree structures are slower than multiple queues (though more efficient when the number of possible priorities is high - which rarely makes sense). Tree rebalances are expensive, and in a queue you never need to search anyway.
TomTom
+1  A: 

What I've done in the past is wrap multiple ConcurrentQueue<T> collections into one -- sort of what TomTom suggests. This is pretty reasonable when the number of priorities you intend to have is low. In some cases it may even be sufficient to have two: high and low. Then your TryDequeue method just looks something like this:

public bool TryDequeue(out T item)
{
    return _highItems.TryDequeue(out item) || _lowItems.TryDequeue(out item);
}

This isn't exactly a comprehensive answer to your question, but maybe it can help you get started.

Dan Tao
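To also cover the blocking-when-full requirement from the question, the two-queue idea can be combined with a pair of semaphores - a sketch under assumed names (BoundedTwoLevelQueue, capacity handling, and the bool priority flag are all mine, not Dan Tao's code):

```csharp
using System;
using System.Collections.Concurrent;
using System.Threading;

// Two priority levels plus semaphores: producers sleep when the
// structure is full, consumers sleep when it is empty.
public class BoundedTwoLevelQueue<T>
{
    private readonly ConcurrentQueue<T> _highItems = new ConcurrentQueue<T>();
    private readonly ConcurrentQueue<T> _lowItems = new ConcurrentQueue<T>();
    private readonly SemaphoreSlim _freeSlots;                              // producers wait when full
    private readonly SemaphoreSlim _availableItems = new SemaphoreSlim(0);  // consumers wait when empty

    public BoundedTwoLevelQueue(int capacity)
    {
        _freeSlots = new SemaphoreSlim(capacity, capacity);
    }

    public void Enqueue(T item, bool highPriority)
    {
        _freeSlots.Wait(); // blocks the producer while the queue is full
        (highPriority ? _highItems : _lowItems).Enqueue(item);
        _availableItems.Release();
    }

    public T Dequeue()
    {
        _availableItems.Wait(); // blocks the consumer while the queue is empty
        T item;
        // The semaphore guarantees at least one item across the two queues,
        // because producers always enqueue before releasing it.
        if (!_highItems.TryDequeue(out item) && !_lowItems.TryDequeue(out item))
            throw new InvalidOperationException("unreachable");
        _freeSlots.Release();
        return item;
    }
}
```

A bounded BlockingCollection<T> wrapping a custom IProducerConsumerCollection<T> would be another .NET 4 route to the same behavior.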
I don't like limiting the number of possible priorities though. I want to use a computed value for the priority, and I don't know the range beforehand.
Mark
@Mark: Sometimes people do this with something like a `SortedList<TPriority, Queue<T>>` internally; but as TomTom has already mentioned, the cost of unlimited priorities can end up not being worth it. What about a compromise: have a computed value that gets translated to a predefined priority based on where it falls within a certain range?
Dan Tao
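The bucketing compromise Dan Tao suggests might look like this (the band boundaries and names below are invented purely for illustration):

```csharp
public static class PriorityBuckets
{
    // Map an unbounded computed priority onto a small fixed set of
    // buckets; lower bucket number = dequeued first. The thresholds
    // here are arbitrary example values.
    public static int ToBucket(double computedPriority)
    {
        if (computedPriority >= 100.0) return 0; // highest bucket
        if (computedPriority >= 10.0)  return 1; // middle bucket
        return 2;                                // everything else
    }
}
```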
@Dan: Yeah... I'm thinking maybe that's what I should do. I hate putting arbitrary restrictions on things, but I guess I'm creating far too much work for myself if I don't. I was halfway through implementing it w/ the `SortedDictionary<TPriority, ConcurrentQueue<TValue>>` when I realized I'd have to lock the structure both when enqueuing and dequeuing so that I could manage the keys, which takes away some of the concurrency benefit :(
Mark
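For reference, the sorted-dictionary-of-queues approach Mark describes would look roughly like this (a sketch, not his code; names are mine) - the single lock around key management is exactly what costs the concurrency:

```csharp
using System.Collections.Generic;

// Unlimited priorities via a sorted dictionary of per-priority queues.
// Every operation takes one lock, because buckets must be added and
// removed as keys come and go.
public class LockedPriorityQueue<TPriority, TValue>
{
    private readonly SortedDictionary<TPriority, Queue<TValue>> _buckets =
        new SortedDictionary<TPriority, Queue<TValue>>();
    private readonly object _sync = new object();

    public void Enqueue(TPriority priority, TValue value)
    {
        lock (_sync) // needed to find or add the per-priority bucket
        {
            Queue<TValue> bucket;
            if (!_buckets.TryGetValue(priority, out bucket))
                _buckets.Add(priority, bucket = new Queue<TValue>());
            bucket.Enqueue(value);
        }
    }

    public bool TryDequeue(out TValue value)
    {
        lock (_sync) // needed to find and possibly remove the first bucket
        {
            using (var e = _buckets.GetEnumerator())
            {
                if (!e.MoveNext())
                {
                    value = default(TValue);
                    return false;
                }
                // SortedDictionary enumerates in ascending key order, so the
                // first bucket is the lowest key (here: highest priority).
                var pair = e.Current;
                value = pair.Value.Dequeue();
                if (pair.Value.Count == 0)
                    _buckets.Remove(pair.Key);
                return true;
            }
        }
    }
}
```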