I have a long-running console app that works through millions of iterations. I want to benchmark whether memory usage increases linearly as the number of iterations increases.
What would be the best way to go about this?
I think I really only need to be concerned with peak memory usage during a run, right? Basically, I need to work out the maximum number of iterations I can run on this hardware given the memory available on the server.
I am going to set up a batch of runs, log the results over different iteration sizes, and then graph them to identify a memory usage trend that can be extrapolated for any given hardware.
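This is roughly what I have in mind for each run (just a sketch; `RunIteration` and the CSV file name are placeholders for my actual code): the iteration count comes in on the command line so a batch file can sweep different sizes, and each run appends its peak working set to a CSV for graphing.

```csharp
using System;
using System.Diagnostics;
using System.IO;

class Program
{
    static void Main(string[] args)
    {
        // Number of iterations for this run, supplied by the batch script.
        int iterations = int.Parse(args[0]);

        for (int i = 0; i < iterations; i++)
        {
            RunIteration(i); // placeholder for the app's real per-iteration work
        }

        // Record the peak working set for this run so the batch results
        // can be graphed as peak memory vs. iteration count.
        var proc = Process.GetCurrentProcess();
        proc.Refresh();
        File.AppendAllText("memtrend.csv",
            $"{iterations},{proc.PeakWorkingSet64}{Environment.NewLine}");
    }

    static void RunIteration(int i)
    {
        // stand-in for the real work
    }
}
```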
I'm looking for advice on the best way to implement this: which .NET methods and classes to use, or whether I should use external tools instead. This article http://www.itwriting.com/dotnetmem.php suggests I should profile my own app via code to factor out the shared memory the .NET runtime uses across other apps on the box.
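For example, would sampling something like this at checkpoints inside a run be the right way to separate the managed heap from the whole-process figures? (Again just a sketch based on my understanding: GC.GetTotalMemory should only count the managed heap, while the Process counters include the runtime and everything else in the process.)

```csharp
using System;
using System.Diagnostics;

static class MemorySnapshot
{
    // Sample both managed-heap and whole-process figures at a checkpoint.
    public static void Log(int iteration)
    {
        // Managed heap only; forcing a collection first so the number
        // reflects live objects rather than garbage awaiting GC.
        long managed = GC.GetTotalMemory(forceFullCollection: true);

        var proc = Process.GetCurrentProcess();
        proc.Refresh(); // re-read the cached process counters

        Console.WriteLine(
            $"iter={iteration} managed={managed} " +
            $"private={proc.PrivateMemorySize64} workingSet={proc.WorkingSet64}");
    }
}
```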
Thanks