Hi,
Why does a process's memory usage drop when you minimize its window? I noticed this while running a Flash application in IE: it was using around 200 MB of memory, but when I minimized IE it came down to about 5 MB. I saw similar behaviour with Outlook and other executables.
I googled and found that Windows trims the working set of a process when its top-level window is minimized.
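For reference, here is a minimal sketch, assuming the minimize-time trim is the same one a process can request for itself via the documented SetProcessWorkingSetSize API (passing -1 for both bounds asks the OS to remove as many pages as possible from the working set):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* (SIZE_T)-1 for both the minimum and maximum size asks the OS to
       trim as many pages as possible from this process's working set. */
    if (SetProcessWorkingSetSize(GetCurrentProcess(),
                                 (SIZE_T)-1, (SIZE_T)-1))
        printf("Working set trimmed; check this process in Task Manager.\n");
    else
        printf("SetProcessWorkingSetSize failed: %lu\n",
               (unsigned long)GetLastError());

    getchar(); /* keep the process alive so the trimmed size is observable */
    return 0;
}
```

Watching the process in Task Manager before and after this call shows a drop similar to what minimizing a window produces.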
Is this documented by Microsoft? What part of the working set is actually removed? Is that part paged out to virtual memory (the page file)?
What are the performance implications of this?
Does this behaviour exist on Linux too?