views: 220

answers: 9
Coming from C/C++ a long time ago, I still have the habit of making sure all resources are cleaned up correctly. I always ensure Dispose is called on IDisposable classes and implement the Dispose pattern in my own classes that contain disposable objects.

However, in my environment I'm more or less the only one doing this. Others don't see the point and think it just makes my code harder to read.

They just create database connections, open streams, etc. without calling Close or Dispose. Sometimes they set a local or member variable to "Nothing" at the end of a method (guess their background).

My problem is that their code works just as well as mine. Code that over time creates thousands of database connection objects just works.

So, ignoring any arguments about code correctness, following guidelines, etc., does IDisposable really matter?

Has anyone actually run out of resources from not Disposing objects?

Edit: Thanks for all the replies. It's interesting to see that some people have had problems when not Disposing. It seems to be rare, though, and I assume the GC does a good job of keeping resource usage down under normal conditions.

Neither my colleagues nor I will change our behavior because of this, but it feels good to be right.

+23  A: 

Yes. I've maxed out the number of Oracle cursors, for example, when looping over a connection object because I forgot to close the command's data reader - and that was only 100 iterations on a single connection, with possibly hundreds of connections doing the same thing at the same time.

Your fellow developers should be taught to use the using() { ... } syntax if they can't be bothered to close unmanaged resources themselves. It is good practice anyway, and you should use it too, since you might otherwise forget to put your Dispose() calls in a finally {} block and fail to clean up when an unhandled exception is thrown.
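
As a rough illustration (the provider, connection string, and query below are placeholders, not from the original question), nesting using blocks around the connection, command, and reader guarantees Dispose runs on every path, including when an exception is thrown mid-loop:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;   // any ADO.NET provider follows the same pattern

class ReaderExample
{
    static void DumpNames(string connectionString)
    {
        using (IDbConnection conn = new SqlConnection(connectionString))
        {
            conn.Open();

            for (int i = 0; i < 100; i++)
            {
                using (IDbCommand cmd = conn.CreateCommand())
                {
                    cmd.CommandText = "SELECT name FROM people";
                    using (IDataReader reader = cmd.ExecuteReader())
                    {
                        while (reader.Read())
                            Console.WriteLine(reader.GetString(0));
                    }   // reader closed every iteration, so cursors don't pile up
                }
            }
        }   // conn.Dispose() runs here even if an exception escapes the loop
    }
}
```

Each using block is just compiler shorthand for a try { ... } finally { obj.Dispose(); }, so it is exactly the "Dispose in a finally clause" pattern with less code to get wrong.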

If you can't win their hearts, change their minds: create tests that break their code by maxing out the resources they're not cleaning up, then show that the "fix" is simple and easy and makes their code far more scalable. Or just show it to your boss and tell them this will let him/her sell the product as a new version with more scalability built in :) Your fellow developers will hopefully be instructed to do this all the time in the future, and you'll be held in higher regard too.

Mike Atlas
+1 for writing tests to break bad practice.
Paul Alexander
+1  A: 

Not disposing (or closing) database connections will eventually bite you, yeah. I've seen that happen.

kekekela
@Anon-Downvoter: "Has anyone actually run out of resources from not Disposing objects?" My response is a direct answer to that question and doesn't require an essay - at least explain your flawed reasoning.
kekekela
+4  A: 

Some of these resources, like handles, are limited for the entire system, so if your application doesn't release them, other applications or even the OS may suffer. Have a look at Mark Russinovich's latest article in the "Pushing the Limits of Windows" series for examples.

Brian Rasmussen
A: 

I had a case whose details I unfortunately cannot remember, but it involved some kind of layered streams. The lower-level file stream was sometimes closed before the upper-level text formatter was flushed, which caused the last output written to the formatter to be lost.
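
A minimal sketch of how that kind of bug typically looks (the file name and text are placeholders): if the underlying FileStream is closed before the StreamWriter on top of it is flushed, the buffered tail of the output is lost, whereas stacked using blocks dispose the writer (which flushes it) before the stream:

```csharp
using System.IO;

class LayeredStreamExample
{
    static void WriteReport(string path)
    {
        using (var file = new FileStream(path, FileMode.Create))
        using (var writer = new StreamWriter(file))
        {
            writer.WriteLine("last line of output");
        }   // writer is disposed (and therefore flushed) before the FileStream closes
    }
}
```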

Anders Abel
+3  A: 

Yes, I also ran into an issue with Connection objects to an Oracle database not getting disposed.

Mike Atlas's issue above is bad, but at least it was clear what was going wrong. The issue we ran into was that, from time to time under heavy load, the site would start throwing errors when we tried to open a connection, but by the time we looked at the system it had all cleared up (because the garbage collector had collected the objects and freed up the connection pool). It was very difficult to reproduce, until I was looking through the code and noticed that a connection was not being closed in the event of an error; changing this to a using statement fixed the whole issue.
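
A hypothetical sketch of that kind of bug (the method and table names are made up): if the connection is only closed at the end of the method, any exception before that line leaves the connection for the garbage collector instead of returning it to the pool, while a using block releases it on every path:

```csharp
using System.Data;
using System.Data.SqlClient;

class ConnectionLeakExample
{
    // Leaky version: if ExecuteNonQuery throws, Close is never reached and the
    // pooled connection is only freed when the GC eventually finalizes it.
    static void SaveOrderLeaky(string connectionString)
    {
        var conn = new SqlConnection(connectionString);
        conn.Open();
        var cmd = conn.CreateCommand();
        cmd.CommandText = "INSERT INTO Orders DEFAULT VALUES";
        cmd.ExecuteNonQuery();          // an exception here skips the cleanup below
        cmd.Dispose();
        conn.Close();
    }

    // Fixed version: Dispose runs on every path, including exceptions.
    static void SaveOrderFixed(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = conn.CreateCommand())
        {
            conn.Open();
            cmd.CommandText = "INSERT INTO Orders DEFAULT VALUES";
            cmd.ExecuteNonQuery();
        }
    }
}
```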

The short answer is that if an object takes the effort to implement IDisposable, it's there for a reason, so ALWAYS dispose of it when you are done, ideally with a using statement. Don't get clever or tricky, disposing sometimes but skipping it when you don't think you need to. Just do what works every time.

The shorter, and more satisfying answer, is that you are right, and your coworkers are morons who don't know what they are doing.

Mike Mooney
+4  A: 

Yes, it matters. When an object implements IDisposable, it is explicitly stating that it holds resources that need to be released when the object is no longer needed.

Most implementations will still clean up their resources when the object is finalized, but finalization is not deterministic and can't be relied on for resource management.

Simply wrapping the variable declaration in a using(...) block makes it easy to dispose of things properly.
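
For classes that own disposable members, the usual approach is the standard Dispose pattern, roughly as sketched below (the class and field names are made up for illustration):

```csharp
using System;
using System.IO;

class LogFile : IDisposable
{
    private readonly FileStream _stream;
    private bool _disposed;

    public LogFile(string path)
    {
        _stream = new FileStream(path, FileMode.Append);
    }

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this);   // deterministic cleanup done, skip the finalizer
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed) return;
        if (disposing)
        {
            // deterministic path: release managed resources we own
            _stream.Dispose();
        }
        // (release any unmanaged handles here, if the class held them directly)
        _disposed = true;
    }

    ~LogFile()
    {
        // non-deterministic fallback; runs whenever the GC gets around to it
        Dispose(false);
    }
}
```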

Paul Alexander
A: 

Not disposing database-related IDisposable objects is a reliable and efficient way to generate OutOfMemoryExceptions.

DataSet implements IDisposable, but I've read that it is not necessary to call Dispose because the objects that need disposing for a DataSet are only created at design time (by the Visual Studio designer). I've never seen an OOM from un-Disposed DataSets (just OOM from enormous DataSets).

MatthewMartin
A: 

Besides the obvious cases (already mentioned) of resources running out, another benefit of IDisposable is that, since it guarantees Dispose() is called when a using block exits, you can use it for all kinds of things, not just "perform an operation with an OS resource".

In this way, it's like a poor man's substitute for Ruby blocks, or for one small use case of Lisp macros.
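
For example (a sketch of the idea, not code from the answer), a small IDisposable type can be used purely to run something when a scope exits, such as timing a block:

```csharp
using System;
using System.Diagnostics;

class TimedScope : IDisposable
{
    private readonly string _label;
    private readonly Stopwatch _watch = Stopwatch.StartNew();

    public TimedScope(string label) { _label = label; }

    public void Dispose()
    {
        _watch.Stop();
        Console.WriteLine("{0}: {1} ms", _label, _watch.ElapsedMilliseconds);
    }
}

class Program
{
    static void Main()
    {
        using (new TimedScope("load data"))
        {
            // ... work to be measured ...
        }   // Dispose fires here, even if the block throws
    }
}
```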

Ken
A: 

Yes, yes, yes, it matters.

I've been profiling an application recently that had never been profiled. It's just a Winforms application, no big deal, right?

Wrong.

By not implementing IDisposable and not unhooking event handlers, the application was leaking memory like a sieve.
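
The event-handler part typically looks something like the sketch below (the form, settings class, and event are hypothetical): as long as the subscription exists, the long-lived publisher keeps the form and everything it references alive, so the handler has to be removed in Dispose:

```csharp
using System;
using System.Windows.Forms;

class AppSettings
{
    public event EventHandler Changed;

    public void RaiseChanged()
    {
        var handler = Changed;
        if (handler != null) handler(this, EventArgs.Empty);
    }
}

class SettingsForm : Form
{
    private readonly AppSettings _settings;   // long-lived object, assumed to outlive the form

    public SettingsForm(AppSettings settings)
    {
        _settings = settings;
        _settings.Changed += OnSettingsChanged;   // leaks the form if never removed
    }

    private void OnSettingsChanged(object sender, EventArgs e) { /* refresh UI */ }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            _settings.Changed -= OnSettingsChanged;   // break the reference chain
        }
        base.Dispose(disposing);
    }
}
```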

The .NET Framework does not absolve you from cleaning up after yourself, it just makes it less likely that you'll break something if you don't.

Spend an hour and profile your application with ANTS Profiler (it has a free trial). If you don't see any memory leaks, then continue on your way. If you do, it's because you were relying on the .NET Framework to be your crutch.

George Stocker
I know the theory, but are memory leaks really a big problem on the desktop? (Server applications are a different matter.) With gigabytes of RAM available, it will take a long time before it affects performance.
adrianm
Yes, it matters. If you have a multi-form desktop application whose forms aren't disposed, any data they hold is also kept alive. All the user has to do is keep using your application for it to get slower and slower as memory is paged to and from disk. If you don't care about your application, fine. Just let me know which one it is so I don't buy it.
George Stocker