views: 83
answers: 3

It seems there are lots of improvements in .NET 4.0 related to concurrency that might rely on concurrent priority queues. Is there a decent priority queue implementation in the framework that is available for reuse?

A: 

Check Thread-safe Collections in .NET Framework 4 and Their Performance Characteristics, but AFAIK there is no ready-to-use priority queue. None of the new thread-safe collections maintains order, but you can build your own on top of them. See @Steven's answer.

Nick Martyshchenko
+3  A: 

You may need to roll your own. A relatively easy way would be to have an array of regular queues, one per priority level, ordered from highest to lowest priority.

Basically, you would insert into the queue for the appropriate priority. Then, on the consumer side, you would go down the list, from highest to lowest priority, checking to see if the queue is non-empty, and consuming an entry if so.
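
A minimal sketch of that idea, assuming .NET 4.0's ConcurrentQueue<T> and a fixed number of priority levels (the type and member names are illustrative, not a definitive implementation):

using System.Collections.Concurrent;

// One lock-free queue per priority level; level 0 is the highest priority.
public class SimplePriorityQueue<T>
{
    private readonly ConcurrentQueue<T>[] queues;

    public SimplePriorityQueue(int priorityLevels)
    {
        queues = new ConcurrentQueue<T>[priorityLevels];
        for (int i = 0; i < priorityLevels; i++)
            queues[i] = new ConcurrentQueue<T>();
    }

    // Producer side: insert into the queue for the appropriate priority.
    public void Enqueue(int priority, T item)
    {
        queues[priority].Enqueue(item);
    }

    // Consumer side: scan from highest to lowest priority and take the
    // first available item; returns false if every queue is empty.
    public bool TryDequeue(out T item)
    {
        for (int i = 0; i < queues.Length; i++)
        {
            if (queues[i].TryDequeue(out item))
                return true;
        }
        item = default(T);
        return false;
    }
}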

Steven Sudit
+1, I forgot to mention how to implement it because I thought it was pretty straightforward :) Without that my answer isn't complete.
Nick Martyshchenko
It's not much of an implementation. It could probably be improved by having a ManualResetEvent that's signaled after inserting into *any* of the queues, so that the consumer can poll once, then wait until the signal fires. In practice, it would be best for that wait to be finite and relatively short (perhaps a quarter of a second), so that a missed signal only delays the consumer briefly instead of stalling it.
Steven Sudit
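
A sketch of that poll-then-wait pattern, shown with a single ConcurrentQueue<T> for brevity (with the array of queues above, every producer would Set the same event after enqueuing into any of them; the names are illustrative):

using System.Collections.Concurrent;
using System.Threading;

public class SignalledQueue<T>
{
    private readonly ConcurrentQueue<T> queue = new ConcurrentQueue<T>();
    private readonly ManualResetEvent itemAvailable = new ManualResetEvent(false);

    public void Enqueue(T item)
    {
        queue.Enqueue(item);
        itemAvailable.Set();          // signal after inserting
    }

    public T Dequeue()
    {
        T item;
        while (true)
        {
            itemAvailable.Reset();
            if (queue.TryDequeue(out item))
                return item;
            // Finite, relatively short wait (a quarter of a second) so a
            // missed signal only delays the consumer briefly.
            itemAvailable.WaitOne(250);
        }
    }
}
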
Paw's implementation locks unnecessarily, but the overall structure is a good starting point.
Steven Sudit
+1  A: 

Maybe you can use my own implementation of a PriorityQueue. It implements a lot more than the usual push/pop/peek: features I added whenever I found the need for them. It also uses locks for concurrency.

Comments on the code are much appreciated :)

using System.Collections.Generic;
using System.Linq;

public class PriorityQueue<T> where T : class
{
    private readonly object lockObject = new object();
    private readonly SortedList<int, Queue<T>> list = new SortedList<int, Queue<T>>();

    public int Count
    {
        get
        {
            lock (this.lockObject)
            {
                return list.Sum(keyValuePair => keyValuePair.Value.Count);
            }
        }
    }

    public void Push(int priority, T item)
    {
        lock (this.lockObject)
        {
            if (!this.list.ContainsKey(priority))
                this.list.Add(priority, new Queue<T>());
            this.list[priority].Enqueue(item);
        }
    }
    public T Pop()
    {
        lock (this.lockObject)
        {
            if (this.list.Count > 0)
            {
                KeyValuePair<int, Queue<T>> first = this.list.First();
                T obj = first.Value.Dequeue();
                if (first.Value.Count == 0)
                    this.list.Remove(first.Key);
                return obj;
            }
        }
        return null;
    }
    public T PopPriority(int priority)
    {
        lock (this.lockObject)
        {
            if (this.list.ContainsKey(priority))
            {
                T obj = this.list[priority].Dequeue();
                if (this.list[priority].Count == 0)
                    this.list.Remove(priority);
                return obj;
            }
        }
        return null;
    }
    public IEnumerable<T> PopAllPriority(int priority)
    {
        List<T> ret = new List<T>();
        lock(this.lockObject)
        {
            if (this.list.ContainsKey(priority))
            {
                while(this.list.ContainsKey(priority) && this.list[priority].Count > 0)
                    ret.Add(PopPriority(priority));
                return ret;
            }
        }
        return ret;
    }
    public T Peek()
    {
        lock (this.lockObject)
        {
            if (this.list.Count > 0)
                return this.list.First().Value.Peek();
        }
        return null;
    }
    public IEnumerable<T> PeekAll()
    {
        List<T> ret = new List<T>();
        lock (this.lockObject)
        {
            foreach (KeyValuePair<int, Queue<T>> keyValuePair in list)
                ret.AddRange(keyValuePair.Value.AsEnumerable());
        }
        return ret;
    }
    public IEnumerable<T> PopAll()
    {
        List<T> ret = new List<T>();
        lock (this.lockObject)
        {
            while (this.list.Count > 0)
                ret.Add(Pop());
        }
        return ret;
    }
}
Paw Baltzersen
Aside from not following .NET conventions, it looks correct but slow. The slowness comes from locking everything, whereas the .NET 4.0 concurrent queue is lockless. See: http://www.codethinked.com/post/2010/02/04/NET-40-and-System_Collections_Concurrent_ConcurrentQueue.aspx
Steven Sudit
In any case, +1 for the overall structure.
Steven Sudit
What do you mean by not following .NET conventions?
Paw Baltzersen
I was thinking about making the Queue inside the SortedList a ConcurrentQueue and then only locking when adding or removing an entry in the SortedList itself, not when working on the Queues. But I'd still have to lock when checking whether a given priority is present in the SortedList, so that didn't help much.
Paw Baltzersen
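
In sketch form, that hybrid might look like the following (illustrative only; as the last comment notes, the SortedList itself still needs a lock whenever a priority level is looked up, added, or removed, so only the inner enqueue becomes lock-free):

using System.Collections.Concurrent;
using System.Collections.Generic;

public class HybridPriorityQueue<T> where T : class
{
    private readonly object lockObject = new object();
    private readonly SortedList<int, ConcurrentQueue<T>> list =
        new SortedList<int, ConcurrentQueue<T>>();

    public void Push(int priority, T item)
    {
        ConcurrentQueue<T> queue;
        lock (this.lockObject)        // still needed to find or add the level
        {
            if (!this.list.TryGetValue(priority, out queue))
            {
                queue = new ConcurrentQueue<T>();
                this.list.Add(priority, queue);
            }
        }
        queue.Enqueue(item);          // lock-free once the queue reference is known
    }

    public T Pop()
    {
        lock (this.lockObject)        // still needed to scan the levels in order
        {
            // Empty levels are left in place to keep the sketch short.
            foreach (KeyValuePair<int, ConcurrentQueue<T>> pair in this.list)
            {
                T item;
                if (pair.Value.TryDequeue(out item))
                    return item;
            }
        }
        return null;
    }
}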