Let me preface this with the disclaimer that I am very new to multithreading and may be missing something obvious.

I'm currently using the code below to process all the files in a directory. My question is whether it would ever be possible for a thread to finish, decrement numFilesLeft, and find it equal to 0 because the next file hasn't been queued as a work item yet, rather than because all the files have been processed. If this is possible, what would be the standard way to make sure it doesn't occur?

Thank you for your time.

List<Bar> bars = new List<Bar>();
int numFilesLeft = 0;
ManualResetEvent isWorkDone = new ManualResetEvent(false);

foreach (string dirName in Directory.GetDirectories(@"c:\Temp"))
{
    foreach (string file in Directory.GetFiles(dirName))
    {
        string temp = file;
        Interlocked.Increment(ref numFilesLeft);
        ThreadPool.QueueUserWorkItem(delegate
        {
            try
            {
                List<Bar> results = Process(File.ReadAllText(temp));
                if (results.Count > 0)
                {
                    lock (bars) bars.AddRange(results);
                }
            }
            finally
            {
                if (Interlocked.Decrement(ref numFilesLeft) == 0)
                {
                    isWorkDone.Set();
                }
            }
        });
    }
}

isWorkDone.WaitOne();
isWorkDone.Close();
+4  A: 

Yes, it is possible. The usual trick is to add one extra count for the operation of queuing the items itself:

List<Bar> bars = new List<Bar>();
int numFilesLeft = 0;
ManualResetEvent isWorkDone = new ManualResetEvent(false);

Interlocked.Increment(ref numFilesLeft);   // extra count held by the queuing loop itself
try
{
    foreach (string dirName in Directory.GetDirectories(@"c:\Temp"))
    {
        foreach (string file in Directory.GetFiles(dirName))
        {
            string temp = file;
            Interlocked.Increment(ref numFilesLeft);
            ThreadPool.QueueUserWorkItem(delegate
            {
                try
                {
                    ...
                }
                finally
                {
                    if (Interlocked.Decrement(ref numFilesLeft) == 0)
                    {
                        isWorkDone.Set();
                    }
                }
            });
        }
    }
}
finally
{
    // release the loop's own count; only now can the total ever reach 0
    if (Interlocked.Decrement(ref numFilesLeft) == 0)
    {
        isWorkDone.Set();
    }
}

...
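For what it's worth, if .NET 4 or later is available, CountdownEvent packages up the same idea (an extra count held by the producer, one Signal per completed item). A minimal sketch, reusing the Process and Bar types from the question:

List<Bar> bars = new List<Bar>();

using (CountdownEvent pending = new CountdownEvent(1))     // the initial 1 is the queuing loop's own count
{
    foreach (string dirName in Directory.GetDirectories(@"c:\Temp"))
    {
        foreach (string file in Directory.GetFiles(dirName))
        {
            string temp = file;
            pending.AddCount();                             // one count per queued file
            ThreadPool.QueueUserWorkItem(delegate
            {
                try
                {
                    List<Bar> results = Process(File.ReadAllText(temp));
                    if (results.Count > 0)
                    {
                        lock (bars) bars.AddRange(results);
                    }
                }
                finally
                {
                    pending.Signal();                       // decrement; cannot reach 0 before the loop's own Signal
                }
            });
        }
    }
    pending.Signal();                                       // release the count held by the queuing loop
    pending.Wait();                                         // returns only when every work item has signalled
}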
Remus Rusanu
Thanks. That seems like a nice solution.
Ryan
A: 

1.

My question is if it would ever be possible for a thread to finish, decrement numFilesLeft, and find it equal to 0 because the next item hasn't been added as a work item and not because all the files have been processed?

Yes. Because you don't control exactly when each queued work item runs, one of them can finish and decrement the counter to 0 before the loop has queued (and counted) the next file.

2.

If this is possible what would be the standard way to make sure it doesn't occur?

You can use an array of ManualResetEvent instead, and have the main thread wait for all of the work items to complete.

// Assume that you know the number of files in the directory.
var fileCount = 10;

ManualResetEvent[] waitHandles = new ManualResetEvent[fileCount];

Enumerate through the files and queue each work item like you have done, but create a ManualResetEvent for each file and pass it as the state object to the work item.

......
waitHandles[i] = new ManualResetEvent(false);
ThreadPool.QueueUserWorkItem(ProcessFile, waitHandles[i]);
.....

Inside the ProcessFile() method, cast the state object back to a ManualResetEvent.

void ProcessFile(object stateInfo)
{
    var waitHandle = (ManualResetEvent)stateInfo;

    // Do your work here

    // Finished: signal the handle by calling Set()
    waitHandle.Set();
}

In the main thread, wait for all of them:

 WaitHandle.WaitAll(waitHandles);

That ensures all the files have been processed before the main thread moves on.
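Putting those pieces together, a minimal sketch of this approach might look like the following (assuming the same Process method from the question and a flat file list; the handle and file name are captured in the delegate rather than passed as state, just to keep the sketch short):

string[] files = Directory.GetFiles(@"c:\Temp");
ManualResetEvent[] waitHandles = new ManualResetEvent[files.Length];

for (int i = 0; i < files.Length; i++)
{
    waitHandles[i] = new ManualResetEvent(false);    // unsignalled until its work item finishes
    string file = files[i];
    ManualResetEvent handle = waitHandles[i];
    ThreadPool.QueueUserWorkItem(delegate
    {
        try
        {
            Process(File.ReadAllText(file));
        }
        finally
        {
            handle.Set();                            // signal even if Process throws
        }
    });
}

WaitHandle.WaitAll(waitHandles);                     // accepts at most 64 handles per call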

Hope that helps.

DrakeVN
This is actually similar to what I used before, but I ran into a limitation of WaitAll, which can only wait on 64 handles at a time.
Ryan
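For reference, the usual way around that 64-handle limit is to wait on the handles in batches rather than all at once. A minimal sketch, assuming the waitHandles array from the answer above:

// WaitHandle.WaitAll accepts at most 64 handles per call, so wait in chunks of 64.
const int batchSize = 64;
for (int i = 0; i < waitHandles.Length; i += batchSize)
{
    int count = Math.Min(batchSize, waitHandles.Length - i);
    ManualResetEvent[] batch = new ManualResetEvent[count];
    Array.Copy(waitHandles, i, batch, 0, count);
    WaitHandle.WaitAll(batch);
}

The single-counter approach above sidesteps the limit entirely, since only one event is ever waited on.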