tags:

views:

369

answers:

8

Applications like Microsoft Outlook and the Eclipse IDE consume a lot of RAM, as much as 200MB. Is it OK for a modern application to consume that much memory, given that a few years back we had only 256MB of RAM? Also, why is this happening? Are we taking resources for granted?

+4  A: 

http://en.wikipedia.org/wiki/Moore%27s_law

also:

http://en.wikipedia.org/wiki/Wirth%27s_law

uzbones
Gates's Law says it well.
chappar
Wirth's law is so right - when I was doing mainframe work in the 80s, any screen that took longer than 500 millis to return was in need of optimization. And look at how long some web pages take to load now!
Don Branson
+2  A: 

There are a couple of things you need to think about.

1/ Do you have 256M now? I wouldn't think so - my smallest memory machine is 2G so a 200M application is not much of a problem.

2a/ That 200M you talk about might not be "real" memory. It may just be address space, in which case it might not all be in physical memory at once. Some bits may only be pulled into physical memory when you choose to do esoteric things.

2b/ It may also be shared with other processes (such as a DLL). That means it could be held in physical memory as a single copy but be present in the address space of many processes, so the usage is amortized over those many processes. Both 2a and 2b depend on where your figure of 200M actually came from (which I don't know and, running Linux, I'm unlikely to find out without you telling me :-).

3/ Even if it is physical memory, modern operating systems aren't like the old DOS or Windows 3.1 - they have virtual memory where bits of applications can be paged out (data) or thrown away completely (code, since it can always reload from the executable). Virtual memory gives you the ability to use far more memory than your actual physical memory.
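The gap between address space (point 2a) and resident memory can be seen directly. Here is a minimal sketch, assuming a Linux/Unix system where `ru_maxrss` is reported in kilobytes; it reserves 256MB of address space with an anonymous `mmap`, which barely changes the resident set, then dirties one byte per page so the kernel must actually back it with RAM:

```python
import mmap
import resource

def max_rss_kb():
    # Peak resident set size; reported in KB on Linux (bytes on macOS)
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = max_rss_kb()

# Reserve 256MB of anonymous address space: this grows the process's
# virtual size, but the pages are not backed by physical RAM yet
buf = mmap.mmap(-1, 256 * 1024 * 1024)
reserved = max_rss_kb()

# Dirty one byte per page so the kernel must actually allocate RAM
for offset in range(0, len(buf), mmap.PAGESIZE):
    buf[offset] = 1
touched = max_rss_kb()

print(f"after reserving: +{reserved - before} KB resident")
print(f"after touching:  +{touched - before} KB resident")
```

A tool that reports total address space would show +256MB immediately after the `mmap`; a tool that reports resident memory shows almost nothing until the pages are touched.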

paxdiablo
When Windows Task Manager says Mem Usage is 200MB, what memory does it refer to? Does it also include the memory of the DLLs which the application is using?
chappar
See http://stackoverflow.com/questions/588882/unmanaged-vc-applications-memory-consumption-on-windows-server - another member posted a response there about 'Process Explorer', which would show you that information.
uzbones
I am not talking about physical or virtual memory; I am talking about the total memory usage of an application. Virtual memory is a way for an app to use a lot of memory within limited physical memory. So, if the physical memory is small, it slows down the machine, as the OS has to do a lot of swapping in and out.
chappar
http://www.intellipool.se/forum/blog/server_monitoring/index.php?showentry=1 shows that the working set (Mem Usage in the Processes tab in Task Manager) may not reflect what the app is using - there's a 'private bytes' counter that more accurately shows this, though I still don't know if that's virtual or physical.
paxdiablo
Process Explorer (SysInternals) seems like a much better tool for finding this out.
paxdiablo
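For the Linux side mentioned earlier in this answer, the same breakdown those Windows tools show is exposed in `/proc/<pid>/status`. A small sketch (Linux-only; the field names `VmSize` and `VmRSS` are real `/proc` fields, roughly analogous to total address space and Task Manager's "Mem Usage"):

```python
import re

# Parse the Vm*/Rss* counters from /proc/self/status:
# VmSize ~ total address space, VmRSS ~ resident memory,
# RssFile/RssShmem ~ the shareable portion (file mappings, shared libs)
def memory_counters():
    counters = {}
    with open("/proc/self/status") as f:
        for line in f:
            m = re.match(r"(Vm\w+|Rss\w+):\s+(\d+) kB", line)
            if m:
                counters[m.group(1)] = int(m.group(2))
    return counters

c = memory_counters()
print(f"VmSize (address space): {c['VmSize']} kB")
print(f"VmRSS  (resident):      {c['VmRSS']} kB")
```

Even for a bare Python interpreter, `VmSize` is typically several times larger than `VmRSS`, which is exactly the gap the comments above are arguing about.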
+1  A: 

A few years ago 256MB was the norm for a PC, and Outlook consumed about 30-35MB of memory, around 10% of the available memory. Now PCs have 2GB or more as the norm, and Outlook consumes 200MB of memory; that's about 10% also.

The 1st conclusion: as more memory becomes available, applications use more of it.

The 2nd conclusion: no matter what time frame you pick, there are applications that are true memory hogs (like Outlook) and applications that are very efficient memory-wise.

The 3rd conclusion: the memory consumption of an app doesn't go down over time; otherwise 640K would have been enough even today.
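The roughly-constant share claimed above checks out arithmetically (taking 35MB of 256MB then, 200MB of 2048MB now):

```python
# Outlook's share of total RAM, then and now, per the figures above
then_share = 35 / 256     # ~13.7%
now_share = 200 / 2048    # ~9.8%
print(f"then: {then_share:.1%}, now: {now_share:.1%}")
```

Both land in the ballpark of 10%, so the absolute number grew about 6x while the proportion stayed roughly flat.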

Pop Catalin
+4  A: 

Is it acceptable when most people have 1 or 2 gigabytes of RAM on their PCs?

Think of this - although your 200MB is small and nothing to worry about given a 2GB limit, everyone else also has apps that take masses of RAM. Add them together and you find that the 2GB I have very quickly gets all used up. End result: your app appears slow and resource-hungry, and takes a long time to start up.

I think people will start to rebel against resource-hungry applications unless they get 'value for RAM'. You can see this starting to happen on servers, as virtualised systems gain popularity - people are complaining about resource requirements and the corresponding server costs.

As a real-world example, I used to code with VC6 on my old 512MB 1.7GHz machine, and things were fine - I could open 4 or 5 copies along with Outlook, Word and a web browser, and my machine was responsive.

Today I have a dual-processor 2.8GHz server box with 3GB RAM, but I cannot realistically run more than 2 copies of Visual Studio 2008; they both take ages to start up (as all that RAM still has to be copied in and set up, along with all the other startup costs we now have), and even Word takes ages to load a document.

So if you can reduce memory usage you should. Don't think that you can just use whatever bloated framework/library/practice you want with impunity.

gbjbaanb
+1  A: 

Many modern apps will take advantage of the existence of more memory to cache more. Some, like Firefox and SQL Server, have explicit settings for how much memory they will use. In my opinion, it's foolish not to use available memory - what's the point of having 2GB of RAM if your apps all sit around at 10MB, leaving 90% of your physical memory unused? Of course, if your app does use caching like this, it had better be good at releasing that memory if page-file thrashing starts, or allow the user to limit the cache size manually.

You can see the advantage of this by running a decent-sized query against SQL Server. The first time you run the query, it may take 10 seconds. But when you run that exact query again, it takes less than a second - why? The query plan was only compiled the first time and cached for later use. The database pages that needed to be read were only loaded from disk the first time - the second time, they were still cached in RAM. If done right, the more memory you use for caching (until you run into paging), the faster you can re-access data. You'll see the same thing in large documents (e.g. in Word and Acrobat) - when you scroll to new areas of a document, things are slow, but once they've been rendered and cached, things speed up. If you don't have enough memory, that cache starts to get overwritten, and going back to the old parts of the document gets slow again.
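The cold-vs-warm pattern described above is easy to reproduce with any memoizing cache. A toy sketch (the `run_query` function and its 50ms sleep are illustrative stand-ins for plan compilation and disk reads, not SQL Server's actual behavior):

```python
import functools
import time

# First execution pays a one-time cost (simulated by the sleep);
# repeat executions of the same query string hit the in-memory cache
@functools.lru_cache(maxsize=128)
def run_query(sql):
    time.sleep(0.05)            # simulate compiling the plan + reading pages
    return sql.strip().upper()  # pretend this is the result set

t0 = time.perf_counter()
run_query("select * from orders")
cold = time.perf_counter() - t0

t0 = time.perf_counter()
run_query("select * from orders")   # same query: served from cache
warm = time.perf_counter() - t0

print(f"cold: {cold * 1000:.1f}ms, warm: {warm * 1000:.3f}ms")
```

The `maxsize` bound plays the role of the explicit memory limit mentioned earlier: once the cache is full, the least recently used entries are evicted, which is the "cache starts to get overwritten" effect in miniature.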

Eclipse
+1  A: 

If you can make good use of the RAM, it is your responsibility to use it.

Mo Flanagan
A: 

Software is a gas: it expands to fill its container.

epatel
A: 

It completely depends on the application.

Daniel Earwicker