I have what I assume is a pretty common threading scenario:
- I have 100 identical jobs to complete
- All jobs are independent of each other
- I want to process a maximum of 15 jobs at a time
- As each job completes, a new job will be started until all jobs have been completed
If you assume that each job fires an event when it completes (I'm using the BackgroundWorker class), I can think of a few ways to pull this off, but I'm not sure what the "right" solution is. I was hoping some of you gurus out there could point me in the right direction.
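For context, each job's wiring looks roughly like this (just a sketch; DoOneJob and StartJob are placeholder names, not my real code, and Job_Completed is the handler shown in the solution sketches below):

```csharp
using System.ComponentModel;

// Sketch of how one job is wired up. DoOneJob() stands in for the real work;
// Job_Completed is the completion handler discussed in the solutions below.
static void StartJob()
{
    var worker = new BackgroundWorker();
    worker.DoWork += (sender, e) => DoOneJob();     // runs on a worker thread
    worker.RunWorkerCompleted += Job_Completed;     // fires when the job finishes
    worker.RunWorkerAsync();
}

static void DoOneJob()
{
    // placeholder for the actual work of a single job
}
```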
SOLUTION 1: Have a while(keepWaiting) { Thread.Sleep(1000); } loop in my Main() function. The code in the Job_Completed event handler would set keepWaiting = false when A) no jobs remain to be queued and B) all queued jobs have completed. I have used this solution before and while it works, it feels a little "odd" to me.
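Roughly what I mean (a sketch; keepWaiting, StartInitialJobs, JobsRemainToBeQueued and RunningJobCount are placeholder names I'm using for illustration):

```csharp
using System.ComponentModel;
using System.Threading;

// Solution 1 sketch: Main() polls a flag that the completion handler clears.
// StartInitialJobs/JobsRemainToBeQueued/RunningJobCount are placeholders.
static volatile bool keepWaiting = true;

static void Main()
{
    StartInitialJobs();              // placeholder: kicks off the first 15 workers

    while (keepWaiting)
        Thread.Sleep(1000);          // poll once a second until everything is done
}

static void Job_Completed(object sender, RunWorkerCompletedEventArgs e)
{
    if (JobsRemainToBeQueued())
        StartJob();                  // keep 15 jobs in flight
    else if (RunningJobCount() == 0)
        keepWaiting = false;         // A) nothing left to queue, B) nothing running
}
```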
SOLUTION 2: Use Application.Run() in my Main() function. Similarly, the code in the Job_Completed event handler would call Application.Exit() when A) no jobs remain to be queued and B) all queued jobs have completed.
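In other words, something like this (same placeholders as above; assumes a reference to System.Windows.Forms):

```csharp
using System.ComponentModel;
using System.Windows.Forms;

// Solution 2 sketch: run a message loop with no form; the completion handler
// shuts it down once everything has finished. Placeholder names as above.
static void Main()
{
    StartInitialJobs();          // kick off the first 15 workers
    Application.Run();           // blocks here, pumping messages, until Application.Exit()
}

static void Job_Completed(object sender, RunWorkerCompletedEventArgs e)
{
    if (JobsRemainToBeQueued())
        StartJob();
    else if (RunningJobCount() == 0)
        Application.Exit();      // ends the Application.Run() loop in Main()
}
```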
SOLUTION 3: Use the ThreadPool, queue up all 100 jobs, let them run 15 at a time (SetMaxThreads), and somehow wait for them all to complete.
In all of these solutions, the basic idea is that a new job would be started every time another job is completed, until there are no jobs left. So, the problem is not only waiting for existing jobs to complete, but also waiting until there are no longer any pending jobs to start. If ThreadPool is the right solution, what is the correct way to wait on the ThreadPool to complete all queued items?
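The best I've come up with for the "wait" part is counting down and signalling an event from the last work item, something like this (a sketch; DoOneJob is a placeholder, and I don't know if this is the idiomatic approach):

```csharp
using System.Threading;

// Solution 3 sketch: throttle with SetMaxThreads, then block Main() on an
// event that the last completed work item signals.
static void Main()
{
    const int totalJobs = 100;
    int remaining = totalJobs;
    var allDone = new ManualResetEvent(false);

    ThreadPool.SetMaxThreads(15, 15);    // cap worker threads at 15
                                         // (I believe this can't go below the processor count)

    for (int i = 0; i < totalJobs; i++)
    {
        ThreadPool.QueueUserWorkItem(state =>
        {
            DoOneJob();                  // placeholder for the real work
            if (Interlocked.Decrement(ref remaining) == 0)
                allDone.Set();           // the last job to finish wakes up Main()
        });
    }

    allDone.WaitOne();                   // block until every queued item has completed
}
```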
I think my overriding confusion here is that I don't understand exactly HOW events are able to fire from within my Main() function. Apparently they do, I just don't understand the mechanics of it from a Windows message loop point-of-view. What is the correct way to solve this problem, and why?