views:

244

answers:

6

Hi!

In AQTime for Delphi, the areas-and-triggers mechanism is advertised as a very fast way to get to the trouble spots. But it seems to me that, especially when the areas to profile contain a lot of code, execution slows down dramatically even when profiling is NOT on.

For example, if I want to profile a specific routine late in the program flow but don't know what is called from it, I would set only that routine as a trigger, set the initial status for threads to Off, and then choose "Full check by Routines/Lines". However, when I do this, program execution slows down heavily long before the trigger routine has ever been hit.

For example, if the "preparation flow" takes around 5 minutes without AQTime, then when I run it under AQTime with profiling disabled, it has already been running for 30 minutes and is still going, even though I know the trigger has not yet been reached.

I know I can try to work around this by reducing the number of routines/lines profiled, but that is not really a good solution for me, since I'd like to profile all of them once the actual trigger routine is reached.

Another, often better, workaround is to start the application without AQTime and then use Attach to Process after the "preparation flow" has finished. But this works well only when execution pauses in the GUI at a suitable place, or otherwise provides a window for doing the attaching, and that is not always the case.

Any comments on why this is so, and is there anything else I can do besides reducing the code in the areas or attaching to the process later?

+3  A: 

AQTime is an instrumenting profiler. At runtime, it essentially surrounds every method (or line, depending upon how you've configured the options) you've chosen to profile with its own code, somewhat like this:

begin
    DoStuff();
end;

...becomes:

begin
    AQTimeEnter('MethodName');
    try
        DoStuff();
    finally
        AQTimeLeave('MethodName');
    end;
end;

It does this directly in the executable, rather than by modifying your source, but the effect is essentially the same. When profiling is active, there is considerable overhead for these calls, as they fire quite a lot, and log a good bit of information.

When profiling is inactive, there is less overhead, because they log nothing. However, there is still some overhead for the method call itself, plus the try/finally block.
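The effect of those "empty" wrappers can be illustrated with a rough micro-benchmark. The sketch below is hypothetical: ProfilerEnter/ProfilerLeave are stand-ins for the no-op hooks that remain in place when profiling is disabled, and the loop counts are arbitrary; it only demonstrates that an extra call plus a try/finally frame per invocation adds measurable cost when the wrapped routine itself is trivial:

```delphi
program WrapperOverhead;
{$APPTYPE CONSOLE}

uses
  Windows, SysUtils;

var
  Counter: Integer = 0;

procedure ProfilerEnter(const AName: string);
begin
  // no-op: stands in for the disabled "enter" hook
end;

procedure ProfilerLeave(const AName: string);
begin
  // no-op: stands in for the disabled "leave" hook
end;

procedure DoStuff;
begin
  Inc(Counter); // trivial work, so the wrapper cost dominates
end;

procedure DoStuffWrapped;
begin
  ProfilerEnter('DoStuff');
  try
    DoStuff;
  finally
    ProfilerLeave('DoStuff');
  end;
end;

const
  N = 10000000;
var
  I: Integer;
  T0: Cardinal;
begin
  T0 := GetTickCount;
  for I := 1 to N do
    DoStuff;
  Writeln('plain:   ', GetTickCount - T0, ' ms');

  T0 := GetTickCount;
  for I := 1 to N do
    DoStuffWrapped;
  Writeln('wrapped: ', GetTickCount - T0, ' ms');
end.
```

With half a million lines instrumented, this per-call cost is paid everywhere, which is consistent with the slowdown you see even before the trigger fires.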

I don't know of anything you can do in AQTime to improve this other than profiling less. However, you could also try a sampling profiler, which has much lower overhead of its own but might miss calls to routines that execute quickly.

Craig Stuntz
Thanks, this is what I was assuming too. It just seems a bit surprising that the effect can be so big even when profiling at routine level (not line level). But I guess even try/finally blocks and extra method calls with no real content add up when there are lots of them.
Antti Suni
+4  A: 

Well, you could try my free profiler of course :-)
http://code.google.com/p/asmprofiler/

It supports both instrumenting and sampling profiling. It doesn't have all the functionality of AQTime, but at least it is free (and the performance loss is very slight when profiling is stopped).

André
Looks nice - I will check this out... Looks like a good candidate for the Delphi partner DVD.
Chris Thornton
Thanks - I could try this out also.
Antti Suni
+2  A: 

What do you mean by "Full check by Routines/Lines"? There's a big difference between Routines and Lines. Profiling by routines shouldn't slow down your app too much; it doesn't for me. Profiling by lines can be very slow, and I think that's what you're doing now.

In general, the idea is to profile by routines first, find the bottlenecks, and then profile those (and only those) routines by line.

Giel
I mean that more or less the same effect occurs with both. But my app is quite big (over half a million lines of code). Actually, I can't use full check by lines at all: when I try to start the app with it, profiling just stops automatically after a few seconds. This problem goes away if I reduce the areas radically, so apparently there is some upper limit on the number of lines, after which problems (other than performance) start to occur. I know the drill-down approach should be used, but with each run taking up to an hour or so, I wouldn't want to do too many of them.. :(
Antti Suni
A: 

Have you tried calling

AQtimeHelpers.EnableProfiling(false); 

at the start of your preparation flow, and then

AQtimeHelpers.EnableProfiling(True); 

afterwards?
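A minimal sketch of how that could look (AQtimeHelpers is the unit that ships with AQTime; RunPreparationFlow and RunMainWork are hypothetical placeholders for your own code):

```delphi
uses
  AQtimeHelpers; // ships with AQTime; add its folder to the search path

procedure RunEverything;
begin
  AQtimeHelpers.EnableProfiling(False); // don't collect results during preparation
  RunPreparationFlow;                   // hypothetical: the slow 5-minute setup
  AQtimeHelpers.EnableProfiling(True);  // start collecting from here on
  RunMainWork;                          // hypothetical: the code you want profiled
end;
```

Note that this controls result collection at runtime; the instrumentation itself stays in the executable, so it may not remove all of the wrapper overhead.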

stg
No - I'd prefer not to modify my code if I can avoid it, but it's good to know that something like this exists. I might try it at some point if the other workarounds fail - thanks!
Antti Suni
A: 

You have to use a drill-down approach. First profile by routine to identify which areas and routines need attention, profile only those, and switch to line profiling only when you need to pinpoint the exact line of code. If you enable full line profiling of the whole application, the instrumentation performed by AQTime will be so heavy, and the data collected so large, that it will slow your application down a lot.

ldsandon
I know the drill-down approach should be used, but with each run taking 15 minutes or so even without profiling, I wouldn't want to do too many of them.. :( But I'll have to bear with it, I guess.
Antti Suni
A: 

Thanks for all the answers and comments - I got several good points!

Antti Suni