Dear ladies and gentlemen,

My headache is this: under load, my server application exceeds the maximum number of open database connections. So I figure I need a task queue (a.k.a. service bus) for database write access - a queue that db write requests can be posted to, and that dedicated threads will read from and execute.

I was wondering if there are ready-made components out there that do just that. My requirements are:

  1. Multiple threads can write to the queue with minimal blocking.
  2. More than one thread can read from the queue to execute the posted write requests. In the most constrained case this number is 1, but it may be more, for instance 10% of the open db connection limit (a rough sketch of what I have in mind follows below).
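
To make that concrete (IDbWriteQueue and WriteRequest are just placeholder names, not an existing library I know of):

// Placeholder sketch of the component I'm after, not a real API.
public interface IDbWriteQueue
{
    // Called from many request-handling threads; should block as little as possible.
    void Post(WriteRequest request);
}

public class WriteRequest
{
    public string CommandText { get; set; }
    // parameters, target table, etc. would go here
}

// Internally the component would run a small, fixed pool of worker threads
// (e.g. maxOpenConnections / 10) that dequeue requests and execute them
// against the database, so the connection limit is never exceeded.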

Any ideas?

Thanks.

P.S.

I have noticed this, and this, but neither seems relevant to me.

A: 

.NET 4.0 offers several thread-safe data structures.
ConcurrentQueue<T> should be what you want.

It lacks a blocking dequeue and an event signaling that something has been enqueued, so you need to implement such a signal yourself if you need one.
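
If you do need a blocking dequeue, a minimal sketch of one way to add such a signal is to pair ConcurrentQueue<T> with a SemaphoreSlim (SignalledQueue is just an illustrative name, not a framework type):

using System.Collections.Concurrent;
using System.Threading;

// Illustrative wrapper: Dequeue blocks until an item is available.
public class SignalledQueue<T>
{
    private readonly ConcurrentQueue<T> _queue = new ConcurrentQueue<T>();
    private readonly SemaphoreSlim _signal = new SemaphoreSlim(0);

    public void Enqueue(T item)
    {
        _queue.Enqueue(item);
        _signal.Release(); // wake one waiting consumer
    }

    public T Dequeue()
    {
        _signal.Wait(); // blocks until a matching Release has occurred
        T item;
        _queue.TryDequeue(out item); // guaranteed to find an item here
        return item;
    }
}

As the comment below points out, BlockingCollection<T> in the same namespace already packages this behaviour, so rolling your own is rarely worth it.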

CodeInChaos
No, don't implement signalling yourself. Look at the BlockingCollection in the same namespace.
Henk Holterman
+1  A: 

In-process, on .NET 4, your starting point should almost certainly be either:

  1. BlockingCollection<T>, or
  2. a parallel loop such as Parallel.ForEach, passing a ParallelOptions instance with MaxDegreeOfParallelism set (see the sketch below).
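
For the second option, a minimal sketch of how MaxDegreeOfParallelism caps concurrent database writes (GetPendingWrites and ExecuteWrite are placeholder names for your own code):

using System.Threading.Tasks;

var options = new ParallelOptions { MaxDegreeOfParallelism = 10 };

// At most 10 of these delegates run concurrently, so at most 10
// database connections are in use for writes at any one time.
Parallel.ForEach(GetPendingWrites(), options, request =>
{
    ExecuteWrite(request);
});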
Richard
+3  A: 

Use a BlockingCollection<T>, which implements the producer/consumer pattern. It's thread-safe, so you can access it from multiple threads simultaneously. Here's an example of how it would be used:

var collection = new BlockingCollection<MyClass>();

// start a new background task to fill the queue
Task.Factory.StartNew(() => {
    while(thereIsStillStuffToStoreInTheDatabase)
    {
        var item = GetNextItem();
        collection.Add(item);
    }
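    // signal consumers that no more items will be added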
    collection.CompleteAdding();
});

// GetConsumingEnumerable() blocks while empty until the producer
// has more items or it signals that adding is complete
foreach(var item in collection.GetConsumingEnumerable())
{
    // store item in database
}

Note that the above implementation could also be performed in reverse, with a background Task consuming from the collection and the main worker thread filling it.
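
A minimal sketch of that reversed arrangement, using the same placeholder names as above:

var collection = new BlockingCollection<MyClass>();

// background task drains the queue and stores each item in the database
var consumer = Task.Factory.StartNew(() => {
    foreach(var item in collection.GetConsumingEnumerable())
    {
        // store item in database
    }
});

// the main worker thread fills the queue
while(thereIsStillStuffToStoreInTheDatabase)
{
    collection.Add(GetNextItem());
}
collection.CompleteAdding();

// optionally wait for the consumer to finish draining the queue
consumer.Wait();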

Joseph Albahari has an excellent writeup on Tasks and parallelism. You can read all about it, and specifically about BlockingCollection, here.

Nathan Ridley
Just for the link to Albahari's site you deserve +1.
mark