views: 640

answers: 8

I'm looking for a profiler for my C# application, which is being developed in Visual Studio 2008. I'd like something inexpensive (open source preferred) that can integrate into VS2008. I found the Visual Studio Profiler, but I don't know how to use it. I installed the stand-alone version, which depends on Visual Studio (not so stand-alone, I guess?), but nothing ever shows up in the Tools menu as their walkthrough says it will.

+6  A: 

The Visual Studio Profiler is part of Team System only. It is not included in Visual Studio Professional.

There is a free .NET profiler called NProf, but it's not at a stable release yet and can be rather volatile. There are also some excellent commercial profilers, such as ANTS Profiler from Red Gate; however, these are not low-cost.

Jeff Yates
I've never used ANTS Profiler, but you can't go wrong with anything made by Red Gate unless you're on a budget.
TheTXI
If I remember correctly, Red Gate has a 14-day trial version. I would definitely advise you to try it; I've recommended it to others and they ended up buying a license.
bastijn
+9  A: 

Here's a list of open-source .NET profilers.

I have used and like ANTS Profiler from Red Gate, but it does cost money (well worth it, IMHO).

gbc
+1 for ANTS profiler - well worth every penny it costs
marc_s
Another +1 for ANTS profiler...definitely worth the money.
jrista
Agreed, ANTS is very good.
miguel
+3  A: 

I have used AQtime with great success.

As already mentioned, ANTS is also a good option.

Dana Holt
+1  A: 

There is some discussion of profilers for .NET in this Stack Overflow thread. I have used the CLR Profiler some, and it has helped me take care of a few performance issues before, so it could be worth a try. Microsoft has published a guide on how to use the CLR Profiler.

Fredrik Mörk
+4  A: 

My recommendation is dotTrace. It isn't free; a personal license costs 170 EUR.

http://www.jetbrains.com/profiler/index.html

MicTech
We use dotTrace at our company. It's very easy to use and very helpful. I recommend it :)
Beatles1692
I use its trial version too, while it lasts :) and I love its simplicity and how it handles multi-language projects. http://stackoverflow.com/questions/906915/c-code-performance/907676#907676
Daniel Daranas
I used it and it works very well for performance profiling, as long as your solution is not too big. I tried to run 5000 unit tests in profiling mode and ran out of memory (and I have 8 GB in my dev PC), so dotTrace is resource-intensive.
crauscher
+2  A: 

If you just want to do memory profiling, the .NET Memory Profiler is excellent. It has a trial period and a small cost after that -- well worth it. If you want to spend some money, DevPartner Studio is very good.

JP Alioto
Right now I'm actually just looking for execution time per method, but memory profiling may come later.
Malfist
I can vouch for .NET Memory Profiler. Who says you can't have memory leaks in .NET!
Mark Lindell
They're not memory leaks in the traditional sense, but yes, you can have them. GC isn't perfect, even in Java.
Malfist
@Malfist: True. Where I find it very useful is identifying build-ups over time (collections that don't get cleared, for example). So it's more along the lines of finding places where your code incorrectly prevents GC than problems with the GC itself (see the sketch below). :)
JP Alioto
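
To illustrate the kind of build-up described above, here is a minimal, hypothetical sketch (all the names are invented for illustration): a static event keeps a reference to every subscriber, so the GC can never collect them, even though nothing else uses them.

    using System;

    // Hypothetical illustration: a static event roots every subscriber,
    // so Listener instances and their buffers are never collected.
    static class Ticker
    {
        public static event EventHandler Tick = delegate { };
        public static void RaiseTick() { Tick(null, EventArgs.Empty); }
    }

    class Listener
    {
        private readonly byte[] buffer = new byte[1024 * 1024]; // 1 MB each

        public Listener()
        {
            Ticker.Tick += OnTick; // subscribed, never unsubscribed
        }

        void OnTick(object sender, EventArgs e) { GC.KeepAlive(buffer); }
    }

    class Program
    {
        static void Main()
        {
            for (int i = 0; i < 100; i++)
                new Listener(); // each instance stays rooted via Ticker.Tick

            GC.Collect();
            // Roughly 100 MB is still reachable; a memory profiler traces it
            // straight back to Ticker.Tick.
            Console.WriteLine("{0:N0} bytes still in use", GC.GetTotalMemory(true));
        }
    }

A memory profiler pointed at this program would show the Listener instances held alive through Ticker.Tick; unsubscribing (for example in a Dispose method) is the usual fix.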
+3  A: 

Check out the EQATEC Profiler; it's free and works pretty well. It also works for ASP.NET and .NET CF.

jspru
Wow, this looks good. I will try it out and see if it can compete with ANTS. +1 from me.
tobsen
+1  A: 

For performance tuning, as opposed to memory diagnostics, there's a simple way to do it.

It's counterintuitive, but all you have to do is run the program under the IDE, and while it's being slow, pause it several times, examining the call stack to see why it's doing whatever it's doing. Chances are excellent that multiple samples will show it doing something that you could eliminate. The time saved is roughly equal to the fraction of samples that contained the code you fixed.

It is "quick and dirty", but unlike most profilers, it pinpoints the actual statements needing attention, not just the functions containing them. It also gives directly a rough estimate of the speedup you can expect by fixing them. It is not confused by recursion, and it avoids the call-tree difficulty that a problem might be small in any branch, but could be big by being spread over many brances.

I take some number of samples N, usually no more than 20. If there is a hotspot or a rogue method call somewhere mid-stack, taking some fraction F of the execution time, then the number of samples that will show it is NF +/- sqrt(NF(1-F)). If N = 20 and F = 0.15, for example, the number of samples that will show it is 3 +/- 1.6, so I have an excellent chance of finding it.

Often F is more like 0.5, in which case the number of samples showing it is 10 +/- 2.2, so it will not be missed.
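
To make that arithmetic concrete, here is a trivial sketch (the class and method names are mine) that just evaluates the formula for a given N and F:

    using System;

    // Evaluates the estimate above: out of N stack samples, a problem costing
    // fraction F of the run time shows up N*F +/- sqrt(N*F*(1-F)) times.
    class SampleMath
    {
        static void Report(int n, double f)
        {
            double mean = n * f;
            double sd = Math.Sqrt(n * f * (1 - f));
            Console.WriteLine("N={0}, F={1}: {2:F1} +/- {3:F1} samples", n, f, mean, sd);
        }

        static void Main()
        {
            Report(20, 0.15); // ~3.0 +/- 1.6
            Report(20, 0.50); // ~10.0 +/- 2.2
        }
    }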

Notice this has absolutely nothing to do with how fast the code is, or how often it runs. If optimizing it will save you a certain percentage of time, that determines what percentage of samples will display it for you.

Usually there are multiple places to optimize. If problem 1 has F1 = 0.5 and problem 2 has F2 = 0.1, then fixing problem 1 (doubling the program's speed) usually increases F2 by that factor, to 0.2. So you can sample again and be sure of finding problem 2. In this way, you can knock down a succession of problems until the code is practically optimal.
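
For anyone who would rather automate the pausing than break in by hand, here is a rough sketch of the same idea on .NET Framework. PoorMansSampler is an invented helper name, and Thread.Suspend/Resume are deprecated (they can deadlock if the suspended thread holds a lock the sampler needs), so treat this strictly as a diagnostic hack:

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.Threading;

    // Rough sketch: periodically suspend a worker thread, record which
    // methods are on its stack, and tally how many samples contained each.
    static class PoorMansSampler
    {
        public static Dictionary<string, int> Sample(Thread target, int samples, int intervalMs)
        {
            var hits = new Dictionary<string, int>();
            for (int i = 0; i < samples; i++)
            {
                Thread.Sleep(intervalMs);
    #pragma warning disable 0618 // Suspend/Resume are obsolete
                target.Suspend();
                StackTrace trace = new StackTrace(target, false);
                target.Resume();
    #pragma warning restore 0618
                // Count each method at most once per sample, so hits / N
                // estimates the fraction of samples (the F above).
                var seen = new HashSet<string>();
                foreach (StackFrame frame in trace.GetFrames() ?? new StackFrame[0])
                {
                    var method = frame.GetMethod();
                    if (method == null) continue;
                    seen.Add(method.DeclaringType + "." + method.Name);
                }
                foreach (string key in seen)
                {
                    int count;
                    hits.TryGetValue(key, out count);
                    hits[key] = count + 1;
                }
            }
            return hits;
        }
    }

Sorting the dictionary by count, descending, ranks methods by the fraction of samples in which they appeared, which is exactly the F described above.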

Mike Dunlavey
Haha, have fun, see you in a few decades! What happens if you have millions of calls?
leppie
Was that downvote because you don't think it will work, or just because you don't "like" it?
Mike Dunlavey
This method is great at finding the large inefficiencies in code, and it saves ramping up on a profiler (although that is a one-time cost). The problem with this method is when you have 10 separate issues that each contribute 1% of unnecessary run time. The total savings would be a pretty nice 10% speed increase, but finding any one of the slowdowns would be very time-consuming: even taking 100 samples, a 1% issue shows up only 1 +/- 0.995 times. A profiler that lets you sort function calls would get you to the correct function very quickly.
Robert Gowland
@Robert: My experience is that issues have something of a log-normal distribution, ranging from big to small. Whichever one you fix makes the rest of them bigger on a percentage basis, so something that was small to start with gets bigger as you solve the others. Anyway, if there really is only room in the code for a 10% speedup, it's already pretty near optimal. I've seen factors like 10-40 times.
Mike Dunlavey
@Robert: When I get to the point of diminishing returns, it's not because it's not telling me what's taking time. It's because to fix what's taking time requires major refactoring or redesign.
Mike Dunlavey
... sorry to add another answer. "A profiler that lets you sort function calls would get you to the correct function very quickly." I keep waiting for a profiler that would do that, specifically, to rank function CALLS (not whole functions) by their fractional residence time on the stack. (I'm about to build such a UI, as I have in the past, because on projects like ours each sample can be 30 layers deep.)
Mike Dunlavey