Why do people think that a memory leak in .NET is not the same as any other leak?
A memory leak occurs when you acquire a resource and never let it go, and you can do that in both managed and unmanaged code.
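To make that concrete, here is a minimal sketch of a purely managed leak: nothing unmanaged is involved, yet the memory can never be reclaimed because a long-lived static list keeps every object reachable. The `Cache` class and its `Entries` list are made-up names for illustration.

```csharp
using System;
using System.Collections.Generic;

// Hypothetical long-lived cache: anything added here stays rooted,
// so the GC can never collect it, no matter how often it runs.
static class Cache
{
    public static readonly List<byte[]> Entries = new List<byte[]>();
}

class Program
{
    static void Main()
    {
        for (int i = 0; i < 100; i++)
        {
            // Each 1 MB buffer remains reachable via the static list --
            // it is only released when the process exits.
            Cache.Entries.Add(new byte[1024 * 1024]);
        }
        Console.WriteLine(Cache.Entries.Count);
    }
}
```

The GC is doing its job correctly here: the objects are still reachable, so by definition they are not garbage. The leak is in the program's logic, not in the collector.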
In .NET, as in other programming platforms, there are garbage collection and other mechanisms meant to minimize the situations that make your application leak.
But the best way to prevent memory leaks is to understand your underlying memory model and how things work on the platform you are using.
Believing that the GC and other magic will clean up your mess is the short road to memory leaks, and they will be difficult to find later.
When coding unmanaged, you normally make sure to clean up; you know that the resources you take hold of are your responsibility to clean up, not the janitor's.
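The same discipline applies when you touch unmanaged memory from .NET itself. A minimal sketch using `System.Runtime.InteropServices.Marshal`: the GC knows nothing about this allocation, so if you forget `FreeHGlobal`, the memory is leaked for the life of the process.

```csharp
using System;
using System.Runtime.InteropServices;

class Program
{
    static void Main()
    {
        // Allocate 1 KB of unmanaged memory; the GC will never touch it.
        IntPtr buffer = Marshal.AllocHGlobal(1024);
        try
        {
            Marshal.WriteByte(buffer, 0, 42);
            Console.WriteLine(Marshal.ReadByte(buffer, 0));
        }
        finally
        {
            // Our responsibility, not the GC's: release it explicitly.
            Marshal.FreeHGlobal(buffer);
        }
    }
}
```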
In .NET, on the other hand, a lot of people think the GC will clean up everything. Well, it does some of it for you, but you need to make sure that it actually does. .NET wraps lots of things, so you do not always know whether you are dealing with a managed or an unmanaged resource, and you need to know what you are handling. Fonts, GDI resources, Active Directory, databases, etc., are typical things to look out for.
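For those wrapped resources, .NET's answer is `IDisposable` and the `using` statement. A minimal sketch: `NativeHandle` here is a made-up stand-in for something like a `Font`, a GDI handle, or a database connection, but the pattern (`Dispose` releases the resource deterministically, `using` guarantees it runs) is the real one.

```csharp
using System;

// Hypothetical wrapper around a scarce or unmanaged resource.
class NativeHandle : IDisposable
{
    public bool Released { get; private set; }

    public void Dispose()
    {
        if (!Released)
        {
            Released = true; // release the underlying resource here
        }
        GC.SuppressFinalize(this);
    }
}

class Program
{
    static void Main()
    {
        NativeHandle handle;
        using (handle = new NativeHandle())
        {
            // work with the resource
        }
        // 'using' guarantees Dispose ran, even if an exception was thrown.
        Console.WriteLine(handle.Released); // True
    }
}
```

Waiting for the finalizer (and therefore the GC) to release such resources is exactly the kind of "magic will clean up" thinking that leads to leaked handles long before memory pressure triggers a collection.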
In managed terms I will put my neck on
the line to say it does go away once
the process is killed/removed.
I see lots of people have this idea, and I really hope it will end. You cannot ASK the user to terminate your app to clean up your mess!
Take a look at a browser (IE, FF, etc.): open, say, Google Reader, leave it running for a few days, and look at what happens.
If you then open another tab in the browser, surf to some site, and close the tab that hosted the page that made the browser leak, do you think the browser will release the memory? Not so with IE, at least. On my computer, IE will easily eat a gigabyte of memory in a short time (about 3-4 days) if I use Google Reader. Some news pages are even worse.