What's a good algorithm for determining the remaining time for something to complete? I know how many lines there are in total and how many have already completed; how should I estimate the time remaining?
That really depends on what is being done... lines are not enough unless each individual line takes the same amount of time.
The best way (if your lines are not similar) would probably be to look at logical sections of the code, find out how long each section takes on average, and then use those average timings to estimate progress.
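For example, a rough Python sketch of that idea (my own illustration; the names and helpers are hypothetical) could keep a running average per section and sum the averages of the sections that have not run yet:

    section_times = {}    # section name -> list of past durations in seconds

    def record(section, duration):
        section_times.setdefault(section, []).append(duration)

    def estimate_remaining(sections_left):
        total = 0.0
        for name in sections_left:
            past = section_times.get(name)
            if past:
                total += sum(past) / len(past)   # average of that section's previous runs
        return total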
Why not?
(TimeTaken / linesProcessed) * LinesLeft = TimeLeft
TimeLeft will then be expressed in whatever unit of time timeTaken is.
Edit:
Thanks for the comment, you're right, this should be:
(TimeTaken / linesProcessed) * linesLeft = timeLeft
so we have:
(10 / 100) * 200 = 20 seconds left. Now 10 seconds go past:
(20 / 100) * 200 = 40 seconds left. Now 10 more seconds pass and we process 100 more lines:
(30 / 200) * 100 = 15 seconds left.
And now we all see why the copy file dialog jumps from 3 hours to 30 minutes. :-)
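As a minimal Python sketch of that same extrapolation (the function and variable names are just illustrative, and it assumes the processing rate so far is representative):

    import time

    def estimate_time_left(start_time, lines_processed, total_lines):
        # (TimeTaken / linesProcessed) * linesLeft, expressed in seconds
        if lines_processed == 0:
            return None                            # nothing processed yet, cannot estimate
        time_taken = time.monotonic() - start_time
        return (time_taken / lines_processed) * (total_lines - lines_processed)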
If you know the fraction completed (a value between 0 and 1), and you can simply assume that the time scales linearly, something like
timeLeft = timeSoFar * (1 - fractionDone) / fractionDone
might work.
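For instance, a tiny Python version (purely illustrative), where fraction_done is completion expressed as a value between 0 and 1:

    def time_left(time_so_far, fraction_done):
        # estimated total time is time_so_far / fraction_done; subtract what has already elapsed
        return time_so_far * (1.0 - fraction_done) / fraction_done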
There is no standard algorithm I know of. My suggestion would be:
- Create a variable to hold the percentage complete.
- Estimate the complexity of the task you wish to track (or at least a rough approximation of it).
- Increment the percentage from time to time, as you see fit given that complexity.
You have probably seen programs where the loading bar runs much faster at one point than at another. Well, that's pretty much because this is how they do it (though they probably just put the increments at regular intervals in the main wrapper).
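For illustration only (the section names and weights below are made up), that might look like this in Python: give each stage a weight for its expected share of the work and bump the percentage when the stage finishes.

    SECTION_WEIGHTS = {"load": 10, "parse": 25, "process": 55, "save": 10}   # should sum to 100

    percent_done = 0

    def section_finished(name):
        global percent_done
        percent_done += SECTION_WEIGHTS[name]
        print(f"{percent_done}% complete")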
It depends greatly on what the "something" is. If you can assume that the amount of time to process each line is similar, you can do a simple calculation:
TimePerLine = Elapsed / LinesProcessed
TotalTime = TimePerLine * TotalLines
TimeRemaining = TotalTime - Elapsed
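In Python that calculation (with the similar-cost assumption made explicit) could look like:

    def progress_estimates(elapsed, lines_processed, total_lines):
        # assumes each line takes roughly the same amount of time
        time_per_line = elapsed / lines_processed
        total_time = time_per_line * total_lines
        time_remaining = total_time - elapsed    # equivalently, lines_remaining * time_per_line
        return total_time, time_remaining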
Generally, you know three things at any point in time while processing:
- How many units/chunks/items have been processed up to that point in time (A).
- How long it has taken to process those items (B).
- The number of remaining items (C).
Given those items, the estimate of the remaining time (exact if the time per item is constant, an approximation otherwise) will be
B * C / A
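As a one-liner in Python, checked against the numbers from the earlier worked example (10 seconds for the first 100 lines, 200 lines left):

    def remaining_time(items_done_a, elapsed_b, items_left_c):
        return elapsed_b * items_left_c / items_done_a   # B * C / A

    print(remaining_time(100, 10.0, 200))                # -> 20.0 seconds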
Make sure to manage perceived performance.
Although all of the progress bars in the test took exactly the same amount of time to complete, two characteristics made users think the process was faster, even if it wasn't:
- progress bars that moved smoothly towards completion
- progress bars that sped up towards the end
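One way to get that smoother movement (my own addition here, not something taken from the study) is to smooth the measured rate before computing the estimate, for example with an exponential moving average; the smoothing factor below is an arbitrary choice:

    ALPHA = 0.3              # smoothing factor, chosen arbitrarily
    smoothed_rate = None     # lines per second

    def update_estimate(rate_sample, lines_left):
        global smoothed_rate
        if smoothed_rate is None:
            smoothed_rate = rate_sample
        else:
            smoothed_rate = ALPHA * rate_sample + (1 - ALPHA) * smoothed_rate
        return lines_left / smoothed_rate    # estimated seconds remaining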
Where time$("ms") represents the current time in milliseconds since 00:00:00.00, and lof represents the total lines to process, and x represents the current line:
if Ln>0 then
Tn=Tn+time$("ms")-Ln 'grand total of all laps
Rn=Tn*(lof-x)/x^2 'estimated time remaining in seconds
end if
Ln=time$("ms") 'start lap time (current time)