views: 430

answers: 8

I started to write a large application in C# for the first time in my life. I wrote sample module to test the idea behind my software. This module contained several dozen C# dictionaries and lists of objects that each had several members and properties.

I was shocked that after initializing the core objects it ended up utilizing about 40 MB of RAM.

I tested further and found that more than 30 MB was allocated during object initialization, but I was under the impression that, given the size of my objects, no more than a few hundred kilobytes should have been consumed.

Have I done something wrong or is .NET naturally memory intensive compared to native code applications?

+2  A: 

It's not always about the language, it's usually about how it is used.

Good code can use memory efficiently in any language.

Bad code will use memory inefficiently in every language.

Chris Ballance
Well, only half true. The language/runtime in this case *does* indeed matter quite a bit. -1
Ed Swangren
Correct, if you care about a few hundred K of memory, C is often a better choice.
Chris Ballance
What's not true about my answer? (updated slightly for emphasis)
Chris Ballance
I call BS on it mattering 99% of the time. Who cares if your LoB application that has some dictionaries and UI elements takes 100 MB of RAM when your secretary's PC has 2+ GB?
Rex M
+12  A: 

How did you determine how much memory was used? .NET applications are known to eagerly reserve more memory than they need as long as there is plenty of memory available, and to release memory back to the system if it starts to run short.
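To see the gap this answer describes, you can compare the size of the managed heap with the working set that Task Manager roughly reports. A minimal sketch (the exact numbers will vary by machine and framework version):

```csharp
using System;
using System.Diagnostics;

class MemoryCheck
{
    static void Main()
    {
        // Bytes actually in use on the managed heap (after forcing a collection).
        long managedBytes = GC.GetTotalMemory(true);

        // Roughly what Task Manager shows: all memory the OS has mapped to the
        // process, including reserved-but-unused GC segments, the CLR itself,
        // and JIT-compiled code.
        long workingSet = Process.GetCurrentProcess().WorkingSet64;

        Console.WriteLine("Managed heap: {0} KB", managedBytes / 1024);
        Console.WriteLine("Working set:  {0} KB", workingSet / 1024);
    }
}
```

For small programs the working set is typically several times larger than the managed heap, which is why the Task Manager number looks alarming on its own.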

I think you may get some pointers in this MSDN article.

Fredrik Mörk
By using Task Manager - thanks for pointing out that it may not be a valid number
PiotrK
+2  A: 

The .NET runtime has a certain large overhead - we've found that even simple applications will tend to use much more memory than similar applications written in C++. Fortunately this overhead is quickly dissipated in the noise as the size of the overall code increases. The second factor is garbage collection: the garbage collector runs "whenever", so by comparison to C++, memory allocations are not typically freed right away, but rather when the collector decides it needs to do so.

1800 INFORMATION
+5  A: 

In the days where we have several GB of RAM on a machine, the idea that a user-facing app of the kind you would generally build in C# should only use a few hundred KB of RAM is backwards. Any unused RAM on your system is wasted.

With that in mind, C# allocates memory using a very different strategy from C++: larger chunks are allocated and freed less often.

That said, 40 MB seems a little high. I'm used to something closer to 10-14 MB for very simple console apps. Maybe there's a form taking up RAM, or maybe I'm on a different version of the framework (2.0).

Joel Coehoorn
"Any unused ram on your system is wasted." Any RAM that my app isn't hogging can perhaps be put to better use by another app or the OS.
Tom Juergens
But it is not being 'hogged', it is being allocated by the runtime in a greedy fashion. The runtime can also return it to the OS if needed.
Ed Swangren
One of the main reasons linux performs better than windows is because it uses all unused memory as disk cache, greatly improving read speeds (which is very necessary because of the *nix idiom of using empty files as flags). If your app hogs this memory, it can't do that.
rmeador
Windows has prefetch/caching strategies as well. Perhaps of a different nature, but it definitely utilizes surplus RAM. For example, Windows 7 is currently using 2 GB of my RAM and applications like Visual Studio load instantly.
JulianR
+2  A: 

I'd say that unless you have a very good reason to worry about RAM usage, don't. As with any optimization, you should only optimize where there is a customer/end-user need to do so. I don't believe users are so constrained by memory these days (unless you're talking about an embedded system) that they're going to notice or care much about a missing 38 MB.

jasonh
If I see 40 MB used after writing a single module, and the application will have, let's say, 20-30 such modules times the full data set, then, well - I am concerned that I won't fit in 512 megabytes of RAM
PiotrK
@PiotrK: Yes, if it were that simple, but it is not. .NET applications are managed by the runtime. The amount of memory used does not directly reflect what you are doing with your code, your assumptions are incorrect.
Ed Swangren
PiotrK: but you won't. Memory use doesn't scale linearly with the number or size of modules. You take a big hit for the .NET runtime, but you take that hit only once. You might see that with one module you're consuming 40MB, but every subsequent module might add only another couple of MB (depending on how much memory it allocates of course).
itowlson
In addition to what others have said, physical memory isn't the only limitation here. Windows will page out your applications or portions of it as it sees fit, not necessarily when you fill up physical memory.
jasonh
+7  A: 

Using Task Manager to look at your memory usage is likely to be horrendously inaccurate.

Instead, grab a proper memory profiling tool (dotTrace is very good, and has a 10 day trial) and take the time to see your actual memory consumption.

The sorts of things I've seen in my own code include

  • Underestimating how much memory I'm actually using (not counting separate objects properly, not allowing for lists to have "spare" capacity)
  • Keeping references to transient objects that I don't need
  • Not allowing for transient objects created during operation that haven't yet been garbage collected
  • Not allowing for unallocated memory - memory that has been claimed from the OS (and therefore shows as part of the process in Task Manager) but which hasn't been allocated to any one object yet
  • Not allowing for thread stacks (each thread gets a stack of its own; .NET applications are always multi-threaded as the framework may create several threads of its own).
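The "spare" capacity point above is easy to demonstrate: List&lt;T&gt; grows its internal array by doubling, so it usually holds more memory than its element count suggests. A small sketch (exact capacities depend on the framework version's growth policy):

```csharp
using System;
using System.Collections.Generic;

class CapacityDemo
{
    static void Main()
    {
        var list = new List<int>();
        for (int i = 0; i < 1000; i++)
            list.Add(i);

        // Capacity grows by doubling, so it is usually larger than Count;
        // for 1000 items it will typically be 1024.
        Console.WriteLine("Count: {0}, Capacity: {1}", list.Count, list.Capacity);

        list.TrimExcess(); // release the spare capacity back to the heap
        Console.WriteLine("Capacity after TrimExcess: {0}", list.Capacity);
    }
}
```

Multiply that slack across several dozen dictionaries and lists and the "missing" memory adds up quickly.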
Bevan
A: 

Doing a quick calculation, you are probably allocating around 340K objects (after subtracting the usual ~12 MB hit for a single console app on 32-bit, yikes Steve!) and paying roughly 8 bytes of runtime/System.Object overhead per object, plus then some for the other usual suspects, on every reference type - courtesy of the bloated runtime tech copyright @ Sun.com.

Try to design for value types / structs if at all possible. If you cannot, tough; tell your users not to run more than 10 .NET apps or their machine will feel slower than a C64. If it makes you feel better, try WPF or Silverlight and feel the ~100 MB penalty for a few buttons and flashy animations.
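For what it's worth, the per-object overhead being argued about in this answer and its comments can be measured rather than asserted. A rough sketch comparing an array of structs with an array of equivalent class instances (the figures vary by runtime version and bitness, so no exact numbers are claimed):

```csharp
using System;

struct PointStruct { public int X, Y; } // value type: fields stored inline in the array
class PointClass { public int X, Y; }   // reference type: per-object header plus a
                                        // reference slot per array element

class OverheadDemo
{
    static void Main()
    {
        const int N = 100000;

        long before = GC.GetTotalMemory(true);
        var structs = new PointStruct[N];            // one contiguous block, no headers
        long afterStructs = GC.GetTotalMemory(true);

        var classes = new PointClass[N];             // N references...
        for (int i = 0; i < N; i++)
            classes[i] = new PointClass();           // ...plus N heap objects
        long afterClasses = GC.GetTotalMemory(true);

        Console.WriteLine("struct array: {0} KB", (afterStructs - before) / 1024);
        Console.WriteLine("class array:  {0} KB", (afterClasses - afterStructs) / 1024);

        GC.KeepAlive(structs);
        GC.KeepAlive(classes);
    }
}
```

Whether that difference justifies designing around structs is exactly what the comments below dispute; the Framework Design Guidelines recommend structs only for small, immutable, value-like types.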

(I can already feel the downvotes)

rama-jka toti
This is terrible, inaccurate advice. -1
Ed Swangren
So bad that I have to say it twice.
Ed Swangren
Sure, make sure you jump to the CLR execution engine designers blog and read the entries on why they opted out for the value type technique, actually array allocation so they can avoid this type of overhead in whopping 2 apps they realised since 2001. Btw, on that team, most were done, ran away, Google mothership and it just keeps paving the way..
rama-jka toti
And btw, on 64-bit this overhead gets bigger Ed.. feel it, measure it, love it for the fact you will be creating most bloated apps in history of mankind..
rama-jka toti
-1. See "Choosing Between Class and Struct" in section 4.2 of the book Framework Design Guidelines 2nd Ed. Microsoft recommends against defining structs in all but a few cases.
TrueWill
And lastly, if you are against value types you are reading too many Joe Duffy and Eric style weblogs that fail to understand you are running on the stack-based VM and that abstracting the stack is asking for WPF app of 400MB usage instead of the common 60MB for most trivial functionality. Stack and value types are that much of an implementation detail that Redmond needs a bit of, err, detoxing.. The question is clearly about primitive types, integral one in fact.. Oh wtf, live the hype..
rama-jka toti
Yeah go on and read that book again and especially how they are against interfaces and more for abstract classes.. Btw, run FXCop and make sure it satisfies the Abrams designs that lead to proliferation of reference types which is exactly the problem CLR execution guys worked on to beat Java and beat it in its own JVM incarnation before Sun crap kicked off.. Find a better reference, seriously.. it's nice and copies Elements of Java style hype.. and so on. My advice, be yourself and read between the lines, and beyond Redmond.
rama-jka toti
And lastly, value types are about the only thing that initiates you to be careful about immutability (oh, the Duffy and Abrams irony..). Please, read the EE designer advice, not book sellers, as about the only good thing that book has is Miguel's name on it (which is a surprise). The other guys, and the series editor, designed WPF and WCF, you know, the bloat.. Too young to..
rama-jka toti
And what you are both asking, listening to such great advice, is that either you lose control of design or adopt Java style int is a reference type.. oh dear.
rama-jka toti
Before I forget, read that book again and listen to the heuristic they chose for the size, as well as read up on the inlining choices they are making *for you*. That type of design never ever worked, and if you need hand-holding by Richter's rule of 16 bytes, then you must be just blindly reading and not critically thinking. Simple.
rama-jka toti
A: 

If you are using Task Manager to look at the memory, it can be misleading. Working Set is the amount of virtual memory mapped to the process, not necessarily how much your application is actually using - this is especially true in .NET's garbage-collected environment. As your program allocates memory, the .NET CLR/GC will usually request more memory than it actually needs from the OS so that it can efficiently allocate that memory to managed objects in your program in the future.

A quick and dirty way (very dirty) to see if this is affecting you is to set the Process.MaxWorkingSet property to 0. This is similar to using SetProcessWorkingSetSize in Win32 to trim the number of pages mapped to the process. If you immediately see a drop in the memory usage then you know what is going on. However, as soon as you start allocating memory again via the GC/CLR it will go back up - and usually that is a good thing. Really, you shouldn't worry about it; give the GC a chance to do it the right way.
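The Win32 variant mentioned above can be sketched via P/Invoke. This is a diagnostic trick only, not something to ship; passing -1 for both sizes asks Windows to trim as many pages from the working set as it can:

```csharp
using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class TrimWorkingSet
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr process, IntPtr min, IntPtr max);

    static void Main()
    {
        var p = Process.GetCurrentProcess();
        Console.WriteLine("Working set before: {0} MB", p.WorkingSet64 / 1024 / 1024);

        // (-1, -1) means "trim whatever you can" rather than setting hard limits.
        SetProcessWorkingSetSize(p.Handle, (IntPtr)(-1), (IntPtr)(-1));

        p.Refresh();
        Console.WriteLine("Working set after:  {0} MB", p.WorkingSet64 / 1024 / 1024);
        // The number climbs back up as soon as the GC allocates again,
        // which is expected and usually fine.
    }
}
```

If the "after" number drops sharply, the bulk of what Task Manager was showing was reserved-but-idle pages, not live objects.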

To both optimize your program's memory usage and get a better idea of how memory allocation works in the CLR, I suggest you start experimenting with dotTrace (my preference) or ANTS Profiler (whose maker incidentally publishes a cool video on this topic here). CLR Profiler is interesting too; it is a bit dated these days, but it is free.

scott