Hi All, I read somewhere that there are special cases where memory leaks are needed and are actually a good thing in some rare circumstances, but I can't remember what it was exactly!
Does anyone know or remember such an argument?
Thanks, Adi
Wow, I'd love to know where you heard that. A memory leak is just that: unintended leaking of memory, which wastes resources. I can't see any reason for wanting this behaviour unless you were building software that you wanted your rivals to use (as the Americans did when they let the Russians steal pipeline control software)!
I've never heard of a case where memory leaks are a good thing.
In modern operating systems they aren't nearly as bad as they were 5-10 years ago. Back then, OSes were much worse at policing programs, and leaked memory could be permanently lost until the OS was rebooted.
These days, that isn't really an issue. So the most common "memory leaks" occur when things are not cleaned up properly in the error and exit paths of a program. When the program is about to terminate, it is usually okay to let the memory stay un-freed, because when the process is torn down by the OS all of that will be freed anyway.
While the program is running, I can think of no reason that you'd want to have memory leaks.
They're good if you're a hosting company that get to charge $$$++ for memory upgrades to servers.
The only thing that even comes close that I can think of is when you want to test your code's handling of out-of-memory conditions. This is important on embedded systems that don't have swap space, where running out of memory is a fatal condition.
If you want to get close to 100% code coverage with automated unit testing, you will have to figure out some way to make a specific allocation request fail from an automated unit test (which is easier said than done).
Once you have that, you need to write a unit test for each allocation request to make sure the out-of-memory failure is handled correctly for that allocation (a lot of work).
The only situations I can think of where a memory leak would be desirable are as follows:
(they're all a little convoluted though apart from the 1st)
An academic situation where a memory leak would be a good learning exercise in a programming class, including tracking down and preventing leaks.
A profiling/testing scenario where we would like to see how our software behaves under leaky circumstances.
Writing a leaky application to test/debug a profiling application with.
@Adi Barda:
In the long run, issues with the memory consumed by leaked objects will outweigh memory fragmentation issues.
Memory leaks are never "OK". Sometimes you need to allocate a piece of memory and not deallocate it because it will be needed later, but that isn't a memory leak. Saying that leaking memory avoids memory fragmentation is like saying it's better to skip work entirely without calling in than to show up late.