The title says it all.
This has never happened to me before. In Visual Studio, I have a piece of code that is executed 300 times; I time every iteration with the performance counter and then average the results. If I run the code under the debugger I get an average of 1.01 ms; if I run it without the debugger I get 1.8 ms.
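For reference, my timing loop looks roughly like this (a simplified C++ sketch; `doIteration()` is a placeholder standing in for the actual processing code):

```cpp
#include <windows.h>
#include <iostream>

// Placeholder for the real per-iteration work.
void doIteration()
{
    volatile double x = 0.0;
    for (int i = 0; i < 100000; ++i) x += i * 0.5;
}

int main()
{
    LARGE_INTEGER freq, start, stop;
    QueryPerformanceFrequency(&freq); // counter ticks per second

    const int kIterations = 300;
    double totalMs = 0.0;

    for (int i = 0; i < kIterations; ++i)
    {
        QueryPerformanceCounter(&start);
        doIteration();
        QueryPerformanceCounter(&stop);

        // Convert elapsed ticks to milliseconds and accumulate.
        totalMs += (stop.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
    }

    std::cout << "average: " << totalMs / kIterations << " ms\n";
}
```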
I closed all other apps, rebooted, and tried it many times: always the same timings.
I'm trying to optimize my code, but before diving into changes I want to be sure of my timings, so I have a baseline to compare against.
What can cause that strange behaviour?
Edit:
Some clarification:
I'm running the same compiled piece of code: the release build. The only difference is how it is launched (F5 vs. Ctrl-F5), so compiler optimizations should not be involved.
Since each measured time was very small, I changed the way I benchmark: I now time all 300 iterations together and then divide by 300. I get the same result.
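Concretely, the new measurement is a single counter read around the whole loop (continuing the sketch above, so per-call counter overhead stays out of the measured region):

```cpp
// Time all 300 iterations in one go, then divide by the
// iteration count to get the per-iteration average.
QueryPerformanceCounter(&start);
for (int i = 0; i < kIterations; ++i)
    doIteration();
QueryPerformanceCounter(&stop);

double avgMs = (stop.QuadPart - start.QuadPart) * 1000.0
             / freq.QuadPart / kIterations;
```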
About caching: the code does image cross-correlation with different images at each iteration. The processing steps do not depend on the data in the images, so I don't think caching is the problem.