views: 92
answers: 5
Hi, I am profiling C code in Microsoft VS 2005 on an Intel Core 2 Duo platform. I measure the time (secs:millisecs) consumed by my function, but I have some doubts about the accuracy of this measurement, because the operating system will not run my application continuously; it schedules other apps/services in between the execution of my code. (Although I have no major applications running while I do the profile run, Windows still has a lot of code of its own that it will run by preempting my app.) Because of all this, I believe the profiling number (the time taken by my app to run) is not accurate.

So my question is: is there any way to find out the operating system overhead, i.e. the scheduling overhead, on a typical Windows system (I run Windows XP)? For example, if my application says it ran for 60 milliseconds, how much of those 60 msec was really used by my app, and how much time did it sit idle because it was preempted by some other task scheduled by the OS?

or

At least, is there a ball-park figure for such OS overhead, based on experience you gained while doing something similar?

A: 

Suggestion

Try running on a multi-CPU system.

epatel
A: 

1 - Put some debug logging in your code (include timestamps of course), and run it outside of the debugger

2 - Run again in the debugger

3 - Repeat many times, to get statistically valid data.

4 - Compare.

If there is a significant difference in the average execution time of the standalone run vs. the debugger run, then you are right to be suspicious of the OS (or of the overhead of the debugger hooks themselves...). If there is no difference, then don't sweat it.

Edit0: Obviously the debug messages have some overhead of their own. You may want to leave those in the code even when you are running from the debugger. That way, both the standalone and the debugger are running the very same code.
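For concreteness, a minimal sketch of such timestamped logging in C on Windows might look like the following. TRACE_TS is a made-up macro name, and GetTickCount only has roughly 10-16 ms resolution, so it is just a coarse marker:

    #include <windows.h>
    #include <stdio.h>

    /* Hypothetical trace macro: prefixes each message with the tick count
       (milliseconds since boot), so consecutive log lines can be subtracted
       to get a rough elapsed time. */
    #define TRACE_TS(label) \
        fprintf(stderr, "[%10lu ms] %s\n", (unsigned long)GetTickCount(), (label))

    int main(void)
    {
        TRACE_TS("start of work");
        /* ... code being profiled goes here ... */
        TRACE_TS("end of work");
        return 0;
    }

Leaving the same macro active in both the standalone and the debugger runs keeps the logging overhead identical in the two cases.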

Edit1: I misunderstood the question. I thought your concern was that, while debugging, the OS might interrupt your app more frequently than in a normal run. If you want to know how much time your app actually spent working, just compare the elapsed time to the "CPU Time" column in the Task Manager.

Edit2: Compare the time returned by GetProcessTimes for your process to the actual execution time. The difference is the time spent by the CPU on somebody else.
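A rough sketch of that comparison in C might look like this; do_work is just a placeholder for the code being profiled, and GetTickCount is used for the wall-clock side even though its resolution is only about 10-16 ms:

    #include <windows.h>
    #include <stdio.h>

    /* Convert a FILETIME (100-nanosecond units) to milliseconds. */
    static double filetime_to_ms(const FILETIME *ft)
    {
        ULARGE_INTEGER u;
        u.LowPart  = ft->dwLowDateTime;
        u.HighPart = ft->dwHighDateTime;
        return (double)u.QuadPart / 10000.0;
    }

    /* Placeholder workload standing in for the function being profiled. */
    static void do_work(void)
    {
        volatile unsigned long sink = 0;
        long i;
        for (i = 0; i < 50000000; ++i)
            sink += (unsigned long)i;
    }

    int main(void)
    {
        FILETIME creation_time, exit_time, kernel_time, user_time;
        DWORD wall_start, wall_elapsed;
        double cpu_ms;

        wall_start = GetTickCount();
        do_work();
        wall_elapsed = GetTickCount() - wall_start;

        if (GetProcessTimes(GetCurrentProcess(), &creation_time, &exit_time,
                            &kernel_time, &user_time))
        {
            /* CPU time actually charged to this process (user + kernel). */
            cpu_ms = filetime_to_ms(&user_time) + filetime_to_ms(&kernel_time);
            printf("wall clock: %lu ms, CPU time: %.1f ms, not ours: %.1f ms\n",
                   (unsigned long)wall_elapsed, cpu_ms,
                   (double)wall_elapsed - cpu_ms);
        }
        return 0;
    }

The "not ours" component is the time lost to other processes and the OS. Keep in mind that on Windows XP the process times are accumulated at timer-tick granularity (roughly 10-15 ms), so for a function that only runs for tens of milliseconds the breakdown will be coarse.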

JosephStyons
+1  A: 

@Kogus: Even if I run it outside the debugger (as a standalone app from a command prompt), it could still be preempted by the OS, causing an incorrect measurement of the time consumed by my app.

Isn't it?

-AD

goldenmean
A: 

The best way of doing this is with a dedicated profiling tool. There are lots out there. I haven't used one for C for a few years, so someone else will hopefully be able to give better advice. As you are using Visual Studio 2005, this might be a good place to start: AQ, but I've never used it.

Nick Fortescue
+1  A: 

I think you are going to have some problems with granularity. See the similar questions GetLocalTime() API time resolution and Is gettimeofday() guaranteed to be of microsecond resolution?
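If the standard timers are too coarse, one common workaround on Windows is the high-resolution performance counter. A minimal sketch, where the loop is just a stand-in workload, might be:

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        LARGE_INTEGER freq, start, end;
        volatile unsigned long sink = 0;
        long i;

        QueryPerformanceFrequency(&freq);   /* counter ticks per second */
        QueryPerformanceCounter(&start);

        /* Stand-in workload; replace with the code being measured. */
        for (i = 0; i < 10000000; ++i)
            sink += (unsigned long)i;

        QueryPerformanceCounter(&end);

        printf("elapsed: %.3f ms\n",
               (double)(end.QuadPart - start.QuadPart) * 1000.0
                   / (double)freq.QuadPart);
        return 0;
    }

Note that this still measures wall-clock time, so it does not by itself separate your code's CPU time from time lost to preemption; it only reduces the measurement granularity.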

Also, you may want to take a look at the Windows Resource Kit Tools, which include timeit.exe (similar to time on Unix/Linux) to give you elapsed and process times.

David Schlosnagle