I have an application that transfers files across the network (FTP, etc.). I am launching these tasks using the new Task functionality in .NET 4.0.
Aside from a problem I have not yet figured out, where the application hangs when run in Debug or Release from the IDE, it has been deployed to a test server and has not crashed. I haven't been able to narrow this down, but there are some differences between the software running on the dev server and on the qa/prod server.
I just observed the issue a short while ago, after the application had been running (as a console app) without issue for about 4 days: the process's handle count was around 13,000 and steadily trending upward. I thought it would decrease, but it hasn't yet.
Some notes from the performance monitor:
- Gen 0, 1, and 2 collections are occurring
- The Gen 2 heap size is large but does eventually shrink; for example, while typing this post it fluctuated between 27 MB and 32 MB
Process Explorer from Sysinternals shows, in its 'Handles' window, that the majority of the handles are threads.
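To track that trend alongside the Process Explorer observations, a small diagnostic loop can log the process's handle and thread counts over time. This is just a sketch; the interval, iteration count, and console output are my choices, not part of the application:

```csharp
using System;
using System.Diagnostics;
using System.Threading;

class HandleMonitor
{
    static void Main()
    {
        Process current = Process.GetCurrentProcess();
        for (int i = 0; i < 3; i++)
        {
            current.Refresh(); // discard cached values and re-read from the OS
            Console.WriteLine("{0:u} handles={1} threads={2}",
                DateTime.UtcNow, current.HandleCount, current.Threads.Count);
            Thread.Sleep(1000);
        }
    }
}
```

A handle count that climbs while the thread count stays flat would point at unclosed kernel objects rather than live threads.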
The threading in the application is laid out as follows (the application is launched as a console app for this test phase but will eventually be converted to a service). Some notable highlights:

The program is launched from Program.Main():
```csharp
static readonly ManualResetEvent resetEvent = new ManualResetEvent(false);

static void Main()
{
    TheApp theApp = new TheApp(connectionstring, resetEvent);
    resetEvent.WaitOne(); // run indefinitely
}
```
A System.Threading.Timer acts as the producer thread:

```csharp
_MyTimer.Change(Timeout.Infinite, Timeout.Infinite);

// Call the database to get records
Task t = Task.Factory.StartNew(delegate
{
    // Launch process to gather files, add to BlockingCollection _MainQ
});

_MyTimer.Change(5000, Timeout.Infinite);
```
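The suspend/work/re-arm pattern above (Change to Timeout.Infinite, do the work, Change back to a one-shot delay) prevents overlapping callbacks. A self-contained sketch of just that pattern, with an illustrative class name and a tick counter added only so the behavior is observable:

```csharp
using System;
using System.Threading;

class Producer
{
    private readonly Timer _timer;
    public int TickCount; // illustrative only: lets us observe that ticks occurred

    public Producer()
    {
        // Create the timer disabled, then fire once immediately; the callback
        // re-arms itself when it finishes, so callbacks can never overlap.
        _timer = new Timer(OnTick, null, Timeout.Infinite, Timeout.Infinite);
        _timer.Change(0, Timeout.Infinite);
    }

    private void OnTick(object state)
    {
        _timer.Change(Timeout.Infinite, Timeout.Infinite); // suspend
        try
        {
            TickCount++;
            // ... gather records and enqueue work here ...
        }
        finally
        {
            _timer.Change(5000, Timeout.Infinite); // re-arm as a one-shot in 5 s
        }
    }
}
```

The finally block guarantees the timer is re-armed even if the work throws, which the original snippet does not do.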
Another System.Threading.Timer acts as the consumer thread. Each work item has a queue of queues to transfer. For example, WorkItem A connects to System A. Since we want to move a lot of files in parallel, we want to transfer 1000 files at 100 files per thread. This yields 10 queues of 100 files each; each of these threads processes its 100 files concurrently, yielding the 1000-file transfer more efficiently than 1 thread doing 1000, or 2 doing 500, etc.

```csharp
private void ProcessQueue(object state)
{
    _Consumer.Change(Timeout.Infinite, Timeout.Infinite);
    WorkItem nextItem = null;
    try
    {
        // Attempt to get an item from the queue
        nextItem = _MainQ.Take();
    }
    catch (InvalidOperationException) { }

    if (nextItem != null)
    {
        Task workItemTask = Task.Factory.StartNew(delegate
        {
            Task[] tasks = new Task[nextItem.WorkQueue.Count];
            int ctr = 0;
            while (nextItem.WorkQueue.Count > 0)
            {
                List<Transfer> transferQueueItem = nextItem.WorkQueue.Dequeue();
                tasks[ctr] = Task.Factory.StartNew(delegate
                {
                    this.Transfer(nextItem.Header, transferQueueItem);
                }, TaskCreationOptions.AttachedToParent);
                ctr++;
            }
            try
            {
                // Wait for activity
                Task.WaitAll(tasks);
            }
            catch (AggregateException ax)
            {
                throw ax;
            }
            finally
            {
                for (int i = 0; i < tasks.Length; i++)
                    tasks[i].Dispose();
                CloseTaskInDB((ITask)nextItem.Header);
            }
        });
    }
    _Consumer.Change(10000, Timeout.Infinite);
}
```
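The 1000-files-into-10-queues split described above amounts to chunking a list into fixed-size batches. A minimal sketch of that partitioning step, with the class name, element type, and chunk size being my illustrative choices:

```csharp
using System;
using System.Collections.Generic;

static class QueuePartitioner
{
    // Split a list into successive chunks of at most chunkSize items,
    // returned as a queue of lists ready to be dequeued one per Task.
    public static Queue<List<T>> Chunk<T>(IList<T> items, int chunkSize)
    {
        var queues = new Queue<List<T>>();
        for (int i = 0; i < items.Count; i += chunkSize)
        {
            var chunk = new List<T>();
            for (int j = i; j < Math.Min(i + chunkSize, items.Count); j++)
                chunk.Add(items[j]);
            queues.Enqueue(chunk);
        }
        return queues;
    }
}
```

With 1000 files and a chunk size of 100, this yields 10 queues of 100 items each, and each queue can then be handed to one Task as in ProcessQueue above.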
I must have pored over everything related to IDisposable a hundred times, and I am confident that I am disposing of all objects where necessary, including the third-party components, any SQL connections, etc.
I am using three third-party components, all of which are thread-safe: UniObjects for .NET (latest version) and nSoftware's IPWorks FTP and SFTP components.
Am I simply worried about nothing? Will these handles eventually get cleaned up? It has been about 4 days, and I don't want to assume anything dire unless the app runs out of memory or crashes because of this. I would have guessed that, since the threads are managed, they are subject to the same finalization rules as any other managed object.