I have experimented with a technique that involves requesting a web page and calculating the bit-rate as the number of bytes received divided by the elapsed time. You can of course average multiple data points, but is this as accurate a bit-rate estimate as can be made?
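For concreteness, here is a minimal sketch of what I mean (Python, with a placeholder test URL): time a full download, convert bytes to bits per second, and average a few samples.

```python
import time
import urllib.request

# Placeholder URL -- substitute any reasonably large, uncached resource.
TEST_URL = "https://example.com/"

def measure_bitrate(url, samples=3):
    """Estimate bit-rate (bits/second) by timing full downloads of `url`."""
    rates = []
    for _ in range(samples):
        start = time.monotonic()
        with urllib.request.urlopen(url) as response:
            data = response.read()              # read the entire body
        elapsed = time.monotonic() - start      # seconds for request + transfer
        rates.append(len(data) * 8 / elapsed)   # bytes -> bits, divided by time
    return sum(rates) / len(rates)              # simple average of the samples

if __name__ == "__main__":
    print(f"~{measure_bitrate(TEST_URL) / 1e6:.2f} Mbit/s")
```

Note that the elapsed time here includes DNS lookup, connection setup, and server latency, not just transfer time, which is part of why I suspect this is inaccurate.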
Do any professional or less hackish techniques exist? (Or is this just one of the magical mysteries of the internet?)