I have an algorithm that I'm timing with System.Diagnostics.Stopwatch.
It works well, but one thing I have noticed is that the first time I run the algorithm it takes around 52 milliseconds, which is great. The second time I run it, however, it takes only a fraction of that time.
Is this due to the nature of .NET?
Each time I run the algorithm with a new set of data I re-initialise it. In other words, I create a new object rather than reuse the old reference, so I'm not sure why this still occurs. Normally I wouldn't care about something like this, but for this assignment I must measure the efficiency and speed of my algorithms, so it is important for me to understand why this is happening.
A simplified version of how I'm using the timer is below:
    using System.Diagnostics;

    class Algorithm
    {
        // One stopwatch shared by both methods, so the elapsed time accumulates.
        public Stopwatch Stopwatch { get; set; } = new Stopwatch();

        public void MethodA()
        {
            Stopwatch.Start();
            // Do work.
            Stopwatch.Stop();
        }

        public void MethodB()
        {
            Stopwatch.Start();
            // Do work.
            Stopwatch.Stop();
        }
    }
After both methods are called in my runner, I get the stopwatch and inspect the time.
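In my runner it looks roughly like this (the fixed run count and the names are just placeholders for how I actually feed in new data sets):

    using System.Collections.Generic;

    // A fresh Algorithm instance per run, mirroring how I re-initialise
    // with each new set of data; the stopwatch is read afterwards.
    var timings = new List<double>();

    for (int run = 0; run < 20; run++)          // placeholder for my real data sets
    {
        var algorithm = new Algorithm();        // new object, old reference not reused
        algorithm.MethodA();
        algorithm.MethodB();
        timings.Add(algorithm.Stopwatch.Elapsed.TotalMilliseconds);
    }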
The algorithm
The algorithm is tactical waypoint reasoning for computer-controlled AI opponents. I tried to keep it as simple as possible in the example above.
Results
19.7847
0.0443
0.0102
0.0159
0.0091
0.0073
0.0079
0.0079
0.0079
0.0079
0.0079
0.0079
0.0136
0.0079
0.0073
0.0079
0.0079
0.0079
0.0079
0.0073
...
Should I just ignore the first time the algorithm is run? Otherwise I'll end up with an average that is heavily skewed by that first value.
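For what it's worth, this is roughly what I would mean by ignoring the first run, building on the timings list from the runner sketch above:

    using System;
    using System.Linq;

    // The first (presumably JIT-affected) measurement skews the average upwards,
    // so one option is simply to drop it before averaging.
    Console.WriteLine($"Average of all runs:       {timings.Average():F4} ms");
    Console.WriteLine($"Average without first run: {timings.Skip(1).Average():F4} ms");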