I've got a desktop program. Most Operating Systems run the program in its own address space.

When the program exits, I believe most operating systems free the memory allocated by the program and return it to the system's pool of free memory for reuse.

What I am not sure of, is if the program has a memory leak, will the memory "leaked" also be returned for reuse, or is it lost until the machine is rebooted?

This is a followup to the question I asked earlier today: Do Small Memory Leaks Matter Anymore?, and a couple of comments there mentioned that program memory is freed when a program finishes. If leaks are freed when a program is done, then it definitely puts less pressure on me to rigorously rid my program of the tiniest leaks.

Specifically, I'm a Windows programmer and I need to know what happens (memory lost or memory freed) for Windows 98, 2000, XP, Vista and 7. But I'd also like to hear what happens on Mac and Unix machines.


Clarification: I am talking about non-growing leaks. These are leaks of a constant size that happen once when a program is run.

I do realize that leaks that are continually growing in a program are severe and must be fixed.

Either way, the question isn't about whether or not leaks must be fixed. It is whether or not the Operating Systems give you back the leaked memory when the program ends.

+4  A: 

A memory leak simply means your program has allocated memory that it then loses track of. The operating system still considers this memory to be part of the program's address space, so it will be reclaimed when the program finishes.

All modern operating systems use a mechanism called virtual memory to keep track of a program's memory.

I learned about virtual memory in quite good detail in CS3231.

Basically, the operating system can put chunks of an application's memory anywhere in physical memory, while maintaining a mapping of where each chunk belongs.

From the application's point of view, it gets a full address space (4 GB on a 32-bit OS, something massive on a 64-bit one) and can keep allocating until it hits that limit, even if physical memory is smaller than the limit (this requires the OS to store some of the contents of memory on disk, usually in a swap file).

This is facilitated by hardware on the CPU, a module usually called the MMU (Memory Management Unit); there is usually also a TLB (Translation Lookaside Buffer) to speed up virtual memory operations.

There is another page on Memory Protection that explains a bit more of the inner workings of virtual memory.

barkmadley
phoebus
Is that true on all operating systems? When did OSes switch from a shared memory space?
lkessler
It's true on all modern operating systems that aren't running on restricted devices (low-end 8-bit embedded controllers, for example). The primary factor that enabled this capability was virtual memory and hardware memory management units (MMUs). Any system can do it, but the VM/MMU subsystems make it a LOT easier. And "protected" memory is not a requirement to pull this off. The older Mac OS, for example, didn't have memory protection (one process could stomp on the memory image of another), but processes didn't leak when they exited either (they could still leak internally, of course).
Will Hartung
+2  A: 

Windows will free your process's memory after it terminates, but while it runs, a leak still has some impact on your application's performance and reliability (depending on its size).

In some ways small leaks are worse than big ones, since they cause a creeping degradation in your software until its inevitable death, possibly taking hours of your users' work with it.

If you do know of memory leaks, I suggest you hunt them down and get rid of them; no part of the OS or your programming language will do that for you effectively. There are some very good tools for pinpointing leaks.

Alon
+2  A: 

As a professional in my field, I find the idea of someone not caring whether they are doing their job well to be abhorrent. You should strive to do your job well, and that will translate into you writing better programs. Allowing or ignoring memory leaks as "unimportant" means you are more likely to do the same with other things you consider "unimportant", like documentation, or performance, or user-friendliness.

A laissez-faire attitude breeds problems. So I consider memory leaks to be a sign of how poorly you do your job.

However, having said all that, there are very real reasons not to ignore memory leaks. For instance, you don't know how long your user will run your program. It could be 5 minutes or 5 weeks. Memory leaks build up, and use more and more resources until other things start to fail.

Another issue to keep in mind is that users are not only running your program. They are running other apps as well, and the more resources your app uses, the fewer that are available for their other programs. Ignoring memory leaks is basically being selfish and not caring if your users are having problems.

Mystere Man
I am not ignoring memory leaks. I am finding and fixing 98% of them. But the last 2% are very difficult to solve. Most are not my mistakes, but are in third party software or even in the routines that come with my development tool (Delphi). What is left over is not major, but it is also not zero.
lkessler
Perhaps we have a different definition of what a "leak" is. A "leak" is when you lose memory, like a leaky bucket loses water. If an application has a one-time allocation that it never frees, that's not really a "leak", but in most cases it's still bad practice. Lots of frameworks, particularly MFC and VCL, have unfreed memory they leave for the OS, because the allocations are made early in the app and stick around for the life of the program. Those aren't "leaks", but they should be avoided where possible.
Mystere Man
Yes, I think we do. I'm talking about the one-time allocations. I'll add a clarification to the question.
lkessler
A: 

What happens on an OS that does not have a memory management unit?

norhayati bt mohd yaakub