Possible Duplicate:
Estimating/forecasting download completion time

We've all seen the download time running estimate that initially says something like "7 days", but keeps dropping wildly (e.g. "23 hours", "45 minutes", "1 min. 50 sec", etc) with each successive estimation as the chunks are downloaded.

To avoid these initial (alarming) estimates, there are techniques one could try like suppressing display of the first n estimates, or waiting for the delta between estimates to drop below some threshold before you start displaying them, but these don't seem like a general, robust solution. There are corner cases involving too few samples, or samples that actually are wildly varying...

I think I recall a general solution for this kind of thing in mathematics (statistics?) that reduced or eliminated these wild values.

Does anyone know?

Edit:

OK, looks like this has already been asked and answered:

A: 

If you have the size of the file, how much of it is downloaded, and the expected download speed

  • from previous files
  • from previous samples
  • from a dropdown the user picks from
  • from a speed test

you could provide improved estimates.
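As a minimal sketch of this idea (function names and the seeding strategy are my own, not from the answer): combine the known file size and progress with a prior speed estimate, so the very first displayed ETA is already plausible instead of wild.

```python
def estimate_eta(total_bytes, downloaded_bytes, speed_bytes_per_sec):
    """Return the estimated seconds remaining, or None if speed is unknown.

    speed_bytes_per_sec can come from any of the sources listed above:
    previous files, previous samples, a user-picked value, or a speed test.
    """
    if speed_bytes_per_sec is None or speed_bytes_per_sec <= 0:
        return None
    remaining = total_bytes - downloaded_bytes
    return remaining / speed_bytes_per_sec

# Example: 100 MB file, 25 MB done, prior estimate of 5 MB/s
eta = estimate_eta(100_000_000, 25_000_000, 5_000_000)
```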

kiwicptn
A: 

Use a filter for calculating the speed; a moving average can be good enough:

S_filtered = S_filtered_previous*(1-x) + S_current*x

where x is the inverse of the effective number of samples being averaged; try values between 0.1 and 0.01 (i.e. averaging over roughly 10 to 100 samples).
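This is an exponential moving average. A small sketch (the class name and seeding with the first sample are my additions, not part of the answer):

```python
class SpeedFilter:
    """Exponentially smoothed speed estimate.

    Each update blends the previous filtered value with the newest raw
    sample; x controls responsiveness (smaller x = smoother, slower).
    """

    def __init__(self, x=0.1):
        self.x = x
        self.value = None

    def update(self, sample):
        if self.value is None:
            self.value = sample  # seed with the first raw sample
        else:
            self.value = self.value * (1 - self.x) + sample * self.x
        return self.value

f = SpeedFilter(x=0.1)
f.update(100.0)  # seeded: 100.0
f.update(200.0)  # 100*0.9 + 200*0.1 = 110.0 (spike damped)
```

A momentary speed spike only nudges the estimate by a factor of x, which is exactly what suppresses the wildly jumping ETAs described in the question.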

ralu