I feel like developers talk about memory leaks, but when you ask them what that means, many have no idea. To prevent these situations, let's agree on one definition.

Please no Wikipedia definitions...

What is your best definition of a memory leak and what is the best way to prevent them?

+1  A: 

Definition: Failure to release memory after allocation

Mozilla has a great page on tools for tracking down memory leaks.
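
To make that concrete, here is a minimal C++ sketch of the failure (the function name is purely illustrative):

    #include <cstdlib>

    void leak_example()                                  // hypothetical name
    {
        int *p = (int *)std::malloc(100 * sizeof(int)); // allocate
        p[0] = 42;                                      // use
        // missing std::free(p): when p goes out of scope the block
        // is still allocated, but nothing can release it anymore
    }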

Brian Gianforcaro
That's only a leak if it's memory you don't need any more.
Paul Tomblin
True, definitely an oversight on my part. I noticed you are also in Rochester, NY... small world.
Brian Gianforcaro
+2  A: 

The process in which memory resources are allocated but not properly released once no longer required, often introduced through bad coding practices.

There are built in ways in some languages to help prevent them, although the best way to avoid them is through diligent observation of code execution paths and code reviews. Keeping methods short and singularly purposed helps to keep resource usage tightly scoped and less prone to get lost in the shuffle, as well.

joseph.ferris
+1  A: 

Memory that is not deallocated when it is no longer needed, and is no longer "reachable". For instance, in unmanaged code, if I use "new" to instantiate an object, but I don't use "delete" when I'm done with it (and my pointer has gone out of scope or something).

The best way to prevent them probably depends on who you ask and what language you are using. Garbage collection is a good solution for it, of course, but there may be some overhead associated with it, which isn't a big deal unless performance is your primary concern. Garbage collection may not always be available, again depending on the language you are using.

Alternatively, you can make sure you have the appropriate deletes and/or destructors in place. There are a lot of methods and tools to detect memory leaks as well, but which ones apply will depend on the language and/or IDE you are using.
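
As a sketch of the "appropriate deletes/destructors" idea, assuming C++14, a smart pointer makes the delete automatic (names are illustrative):

    #include <memory>

    void no_leak_example()                     // hypothetical name
    {
        auto p = std::make_unique<int[]>(100); // owns the allocation
        p[0] = 42;
        // no explicit delete needed: the unique_ptr's destructor
        // releases the memory automatically when p goes out of scope
    }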

mgroves
A: 

There are two ways a memory leak may be defined.

First, if data is not freed when there are no longer any references to it, that data is unreachable (unless you have some corrupt pointer, or you read past the data in a buffer, or something). Basically, if you don't free/delete data allocated on the heap, it becomes unusable and simply wastes memory.

There may be cases where a pointer is lost but the data is still accessible. For example, if you store the pointer in an int, or store an offset to the pointer (using pointer arithmetic), you can still get the original pointer back.
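
A contrived sketch of that situation (names are illustrative; the round trip through an integer is implementation-defined but works on common platforms):

    #include <cstdint>
    #include <cstdlib>

    int *hide_and_recover()                        // hypothetical name
    {
        int *p = (int *)std::malloc(sizeof(int));
        std::uintptr_t hidden = (std::uintptr_t)p; // pointer "lost" into an integer
        p = nullptr;                               // no pointer variable refers to the block now
        return (int *)hidden;                      // ...but the original address is recoverable;
                                                   // the caller is still responsible for freeing it
    }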

In this first definition, data is handled by garbage collectors, which keep track of the number of references to the data.

Second, memory is essentially leaked if it is not freed/deleted after its last use. It may still be referenced, and immediately freeable, but the mistake has been made of not freeing it. There may be a valid reason (e.g. in the case where a destructor has some weird side effect), but that indicates bad program design (in my opinion).

This second type of memory leaking often happens when writing small programs which use file IO. You open the file, write your data, but don't close it once you're done. The FILE* may still be within scope, and easily closeable. Again, there may be some reason for doing this (such as locking write access by other programs), but to me that's a flag of bad design.
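
A minimal sketch of that pattern (the function and file names are made up for illustration):

    #include <cstdio>

    void log_message(const char *msg)      // hypothetical name
    {
        std::FILE *f = std::fopen("log.txt", "a");
        if (f == nullptr)
            return;
        std::fprintf(f, "%s\n", msg);
        // missing std::fclose(f): the FILE structure, its buffer, and
        // the OS file handle all stay allocated until the program exits
    }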

In this second definition, data is not handled by garbage collectors, unless the compiler/interpreter is smart (or dumb) enough to know it won't be used any longer, and that freeing the data won't cause any side effects.

strager
Yes, I've hit multiple programs that don't close a file handle when they should. I've even had a several-message exchange with one developer who thought write/rename/close was acceptable programming since you get away with it on a Windows file system.
Loren Pechtel
+4  A: 

Allocated memory that cannot be used because the reference to it has been lost.
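
A minimal sketch of losing the only reference (names are illustrative):

    #include <cstdlib>

    void lose_the_reference()              // hypothetical name
    {
        char *buf = (char *)std::malloc(1024);
        buf = (char *)std::malloc(1024);   // overwrites the only pointer to the
                                           // first block; that 1 KB can never be
                                           // used or freed again
        std::free(buf);                    // frees only the second block
    }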

truppo
I like it, but I don't think it's quite accurate. Even in gc systems, sometimes things are called "leaks" because although the memory is still reachable, it "shouldn't be", because it's no longer needed.
Steve Jessop
I think this is the best one - references to memory can be lost even in languages with Garbage collection.
Foredecker
You don't have to lose the reference to the memory to have a leak. If you don't need the memory any more but you fail to deallocate it, even if you keep a reference to it around, it is a memory leak in my book.
Robert Gamble
I think there is clearly a difference between memory not needed/used, and memory leaked.
truppo
I think there is clearly a difference between unrecoverable memory and a memory leak, the former is a subset of the latter, not the definition of the latter.
Robert Gamble
But what you are saying is that this program is leaking memory:int main(){ int* p = malloc(sizeof(int)); *p = 4711; printf("hi %d\n", *p); printf("bye!\n"); free(p); return 0;}...since at line 8, we are not needing *p anymore and thus should free it...
truppo
sorry for the mangled formatting
truppo
It's okay, you can't really format comments, what part of that exactly is "line 8"?
Robert Gamble
printf("hi %d\n", *p); printf("bye!\n"); free(p);The second printf.
truppo
I wouldn't call it a leak but it does bring to light a gray area: memory that isn't freed before the program terminates is clearly a leak (keep in mind there are OSes, esp. embedded, that don't free memory when an app terminates) but ...
Robert Gamble
exactly when before termination it must be freed to not be a leak isn't clear cut.
Robert Gamble
Removing the "free()" call would of course create a leak. But you and some others claim "unused" memory is also a leak. According to that principle, my simple example is leaking memory since it did not free *p right after the last use of it. My point being that unused memory != leaked memory.
truppo
Okay, how about the flip side: If I allocate a new buffer every time I open a file in my program, the buffer is never used after the file is closed and I never have more than one file open, but I never free any of the buffers until right before the program terminates, is the program leaking memory?
Robert Gamble
No. It uses unnecessary memory (which is always bad), but it never leaks as long as you have the references.
truppo
Okay, this is where we disagree. I consider it a memory leak because the memory footprint of the program needlessly increases for every file I open which wouldn't be the case if I freed the memory so it was available for the next time I opened a file.
Robert Gamble
Yes, I think we can agree on that we disagree :)
truppo
Since this is subjective and I see the merit in your reasoning (even though I'm not on board), I am removing my down vote.
Robert Gamble
I think it differs at the point of development: an original developer who doesn't understand how to free memory properly is one thing; a decent developer who frees memory properly but still sees his memory benchmark going up is quite another.
chakrit
A: 

edit: This answer is wrong. I'm leaving it up as an example of how easy it is to be mistaken about something you think you know very well. Thank you to everyone who pointed out my mistake.

A memory leak is: A programming error. Your software borrows some memory from the system, uses it, and then fails to return it to the system when it has finished. This means that that particular chunk of memory can never be used by any other programs until the system is rebooted. Many such leaks could use up all of the available memory, resulting in a completely useless system.

To prevent memory leaks, practice RAII, and always test your software. There are plenty of tools available for this task.

e.James
I am unaware of any modern OS that doesn't return all of an app's memory to the system free list immediately upon termination.
Just Some Guy
I am with Just Some Guy. All of a program's memory is returned to the system when the process is killed. Remains of the data are there, maybe including cache data from opened files, but that's all handled by the OS anyway, not the program itself.
strager
You guys have obviously never worked on embedded devices.
Robert Gamble
How many of them are constantly loading and unloading apps?
Just Some Guy
How many of them never exit, and thus whether or not the memory is freed when they do is moot?
Robert Gamble
This is simply wrong.
titaniumdecoy
Wow. And I thought I knew this answer cold! Thank you all for waking me up to the fact that I was completely wrong about this. I'm off to do some reading.
e.James
A: 

Memory Leak: Failing to free memory that you no longer need before either:

  • The program terminates
  • Additional memory is allocated

Best way to prevent Memory Leaks: Free memory as soon as it is no longer needed.
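
A sketch of the "free as soon as it is no longer needed" rule, showing how it keeps peak usage down (names and sizes are illustrative):

    #include <cstdlib>

    void process_two_buffers()                      // hypothetical name
    {
        char *a = (char *)std::malloc(1024 * 1024); // 1 MB for the first task
        /* ... use a ... */
        std::free(a);                               // freed as soon as it is no longer needed

        char *b = (char *)std::malloc(1024 * 1024); // peak usage stays at 1 MB; without
        /* ... use b ... */                         // the free above it would be 2 MB
        std::free(b);
    }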

Robert Gamble
Freeing memory before termination is a non-issue. It's memory lost while the program is still running that causes problems.
Just Some Guy
You, as the programmer of a program, can't free memory allocated by your program after said program terminates. It's simply beyond your control. So the first point is redundant. It doesn't matter if you alloc more memory after you've started a leak, either.
strager
On the first point, if you don't free the memory before the application terminates you stand to create a memory leak on the vast majority of computers in existence (hint: the world isn't a PC).
Robert Gamble
On the second note, if I allocate 1 MB of memory, free it, then allocate 1 MB of memory, my program is using 1 MB of memory. If I don't need the first MB and don't free it before I allocate the second MB, my program is now using 2 MB of memory where it should be using 1, a leak of 1 MB.
Robert Gamble
Also note that on many systems, including many "modern" OSes, a program cannot give memory back to the system until it terminates so if I have 2 MB allocated and give one back, it is still unusable by any other process on the system.
Robert Gamble
+16  A: 

There are two definitions (at least for me):

Naive one: Failure to release unreachable memory, which can no longer be allocated by any process during the execution of the allocating process. This can mostly be cured by using GC techniques or detected by automated tools.

Subtle one: Failure to release reachable memory which is no longer needed for your program to function correctly. It is nearly impossible to detect with automated tools or by programmers who are not familiar with the code. While technically it is not a leak, it has the same implications as the naive one. This is not just my own idea: you can come across projects written in garbage-collected languages that still mention fixing memory leaks in their changelogs.
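
A sketch of the subtle kind, where everything stays reachable but memory still grows (names are illustrative):

    #include <string>
    #include <vector>

    std::vector<std::string> history;   // global: always reachable

    void handle_request(const std::string &request)  // hypothetical name
    {
        history.push_back(request);  // every request is kept forever; nothing is
                                     // ever removed, so memory grows without bound
                                     // even though no reference is ever lost
    }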

artificialidiot
It is possible to find memory leaks in the category "Subtle" quite easily with http://www.eclipse.org/mat.
kohlerm
A: 

All the definitions given here fail to address one borderline case:

You have a singleton that allocates memory upon creation, and this memory is normally held as long as the program is running, even though the current use is done and it's unknown whether it will ever be used again. This is generally done because of the overhead of recreating it.
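
A rough sketch of the pattern being described (names are illustrative):

    #include <cstdlib>

    struct Cache                         // hypothetical singleton
    {
        char *data;
        Cache() : data((char *)std::malloc(1024 * 1024)) {}
    };

    Cache &get_cache()
    {
        static Cache *instance = nullptr; // created on first use...
        if (instance == nullptr)
            instance = new Cache();
        return *instance;                 // ...and held until the process exits;
                                          // no code anywhere ever deletes it
    }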

By the "fail to free when done with it" standard this would be considered a leak and I've seen leak-reporting tools call such things leaks as the memory was still in use. (And in fact the code may not contain code capable of cleaning the object up.)

However, I have encountered code of this nature in compiler libraries before, even when the cost of recreating the object isn't all that great.

Leak or not?

Loren Pechtel
Singletons are a memory leak by definition - this is just one of the reasons you should never use them
1800 INFORMATION
This is missing the _unintentional_ aspect of the memory leak. And a singleton could have a release method that frees the allocated memory.
philippe
A: 

w:

In computer science, a memory leak is a particular type of unintentional memory consumption by a computer program where the program fails to release memory when no longer needed. This condition is normally the result of a bug in a program that prevents it from freeing up memory that it no longer needs.

eed3si9n
A: 

Here are some techniques for preventing / detecting memory leaks:

  1. Consider your algorithm in terms of memory consumption. Other respondents have mentioned the fact that you don't have to lose the pointer to an allocated item to leak memory. Even if your implementation contains zero pointer bugs, you can still effectively leak memory if you hold onto allocated items long after you actually need them.

  2. Profile your application. You can use memory debugger tools like Valgrind or Purify to find leaks. (A minimal Valgrind example follows after this list.)

  3. Black-box testing. Watch what happens to your compiled code after you feed it large data sets, or allow it to run for long periods of time. See if its memory footprint has a tendency to grow without limit.
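
As an illustration of point 2, here is a tiny program Valgrind would flag, with an invocation shown in the comments (the exact report wording may vary by Valgrind version):

    // leaky.cpp -- compile with:  g++ -g leaky.cpp -o leaky
    // run under Valgrind with:    valgrind --leak-check=full ./leaky
    #include <cstdlib>

    int main()
    {
        void *p = std::malloc(64); // never freed; Valgrind reports it
        (void)p;                   // as definitely lost (64 bytes in 1 block)
        return 0;
    }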

mseery