views:

56

answers:

3

I have a basic queue of tasks executing (a C# WinForms app that talks to 3 separate systems). Everything is great until one of the web services decides not to respond with its usual speed.

I'm not interested in speeding things up by multi-threading the jobs, but now that it's running in production, I can see the benefit of having at least two threads running jobs: if one blocks and it's an anomaly, the other will keep truckin', and if both block, then probably any number of threads would, and I'll just deal with it.

So, the question is: is what I just described a common pattern? Is there a name for it, and/or a good reference or framework — anything to help me not re-invent any wheels?

Additions based on Comments/Answers

The tasks can be run simultaneously. I chose not to pursue a multi-threaded design for the purpose of speed, but I am now considering it to attain consistent performance in the face of infrequent task latency.

My assumption is that every once in a while a call to a web service takes disproportionately longer to complete while still being considered non-exceptional. This has a non-negligible impact on the total run time of, say, N jobs, if the average execution time is 1 second (including a host of disparate web service calls) and 0.0001% of the time a web service takes 15 seconds to respond.

Is a thread pool just another way of saying "spin up worker threads and manage their state manually"? Or is there something that can help me manage the complexity? I worry that the chance of introducing bugs grows out of proportion to the benefit in this case...

I think I am looking for something similar to a thread pool, but one that only spins up additional threads when latency is detected.

If anyone can give me more info on what one of the comments refers to as a work-stealing thread, that sounds promising.

The reason I didn't use the BackgroundWorker component is that it seems to be built for the case where you know how many workers you want, and I'd ideally like to keep the design flexible.

PS: Thanks again!

+1  A: 

It sounds like what you might be looking for here is the BackgroundWorker: a class that neatly encapsulates launching a worker thread, monitoring progress, and receiving results. See http://msdn.microsoft.com/en-us/library/system.componentmodel.backgroundworker.aspx. It's quick and easy to use: just wire up the DoWork, ProgressChanged and RunWorkerCompleted events and then start it.
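A minimal sketch of that wiring (the loop body here is a stand-in for your web-service calls; in a WinForms app, RunWorkerCompleted is raised on the UI thread, so you could update controls there directly):

```csharp
using System;
using System.ComponentModel;
using System.Threading;

class Program
{
    static void Main()
    {
        var done = new AutoResetEvent(false);
        int result = 0;

        var worker = new BackgroundWorker();
        worker.DoWork += (s, e) =>
        {
            // Runs on a thread-pool thread; the UI stays responsive.
            int sum = 0;
            for (int i = 0; i < 10; i++)
                sum += i; // stand-in for one unit of real work
            e.Result = sum;
        };
        worker.RunWorkerCompleted += (s, e) =>
        {
            // In WinForms this event fires on the UI thread.
            result = (int)e.Result;
            done.Set();
        };

        worker.RunWorkerAsync();
        done.WaitOne(); // console-app stand-in for the message loop
        Console.WriteLine(result); // prints 45
    }
}
```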

Mark
Yeah, I agree BackgroundWorker is very convenient - I just used it to bring my GUI back from the dead... :)
Gabriel
+3  A: 

It depends on how important the order of the queue items is, and how important it is that an item is completed before the next one is processed. If one item must be completely processed before the next one, then basically you are stuck. If not, you may decide to implement a simple pool of worker threads. If .NET 4.0 is an option, I would recommend using the Parallel Extensions for that, and especially the AsParallel() and AsOrdered() methods.
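For example (a sketch, with Process standing in for one of your web-service calls): AsParallel() fans the work out across the thread pool, and AsOrdered() guarantees results come back in source order even though items finish in any order.

```csharp
using System;
using System.Linq;

class Program
{
    // Stand-in for a call to one of the web services.
    static int Process(int id) => id * 2;

    static void Main()
    {
        var jobs = Enumerable.Range(1, 8);

        // Items execute concurrently; AsOrdered() preserves the
        // original ordering of the results.
        var results = jobs
            .AsParallel()
            .AsOrdered()
            .Select(id => Process(id))
            .ToArray();

        Console.WriteLine(string.Join(",", results)); // 2,4,6,8,10,12,14,16
    }
}
```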

Johann Blais
+1, but you didn't mention that the Parallel Extensions implement work-stealing queues, which is the real answer to this question.
Gabe Moothart
+1  A: 

The producer-consumer pattern may be the best fit for this, and using a queue (either Queue<T> wrapped in a lock or the new ConcurrentQueue<T>) is a good approach. It also gives you a place to recycle web service requests that fail due to timeouts or dropped connections.

If you want to use more than the default maximum of two simultaneous web connections per host, add this to your app.config (replacing "10" with your new maximum):

<configuration>
  <system.net>
    <connectionManagement>
      <add address="*" maxconnection="10"/>
    </connectionManagement>
  </system.net>
</configuration>
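The programmatic equivalent, if you'd rather not touch app.config, is to set ServicePointManager.DefaultConnectionLimit before issuing any requests:

```csharp
using System;
using System.Net;

class Program
{
    static void Main()
    {
        // Same effect as the <connectionManagement> entry above:
        // raise the per-host connection limit from the default of 2.
        ServicePointManager.DefaultConnectionLimit = 10;
        Console.WriteLine(ServicePointManager.DefaultConnectionLimit); // prints 10
    }
}
```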
ebpower
Thanks, that's an angle I didn't think of. Conceptually, do you mean the consumer executes the logic that effectively monitors for latency and grabs additional items?
Gabriel
@Gabriel - I prefer to create a consumer supervisor that monitors the queue and launches consumer workers. The workers communicate success or failure back to the supervisor, which decides whether to requeue the request.
ebpower