views:

73

answers:

2

I have experimented with a technique that involves requesting a web page and calculating the bit-rate as bytes received divided by the time elapsed. You can average multiple data points, of course, but is this as accurate a bit-rate estimate as can be made?
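The technique described above can be sketched roughly as follows. This is a minimal illustration, not a production tool; the URL is a placeholder for any reasonably large, uncached resource, and the arithmetic is factored into a separate helper so it can be checked on its own:

```python
import time
import urllib.request

def bitrate_bps(num_bytes, elapsed_seconds):
    """Convert a byte count and elapsed time into bits per second."""
    return num_bytes * 8 / elapsed_seconds

def measure_bitrate(url):
    """Fetch `url` once and estimate download throughput in bits/sec.

    Note: this includes DNS lookup, TCP handshake, and server
    processing time in the measurement, so small payloads will
    understate the true line rate.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    elapsed = time.monotonic() - start
    return bitrate_bps(len(data), elapsed)
```

For example, 125,000 bytes transferred in one second works out to 1 Mbps.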

Do any professional or less hackish techniques exist? (Or is this just one of the magical mysteries of the internet?)

A: 

Assuming that you want to do this programmatically, perhaps one of the connection speed testing services has an API that you can use?

Speed Test

Jeepstone
I'm more interested in knowing how it's done than in plugging into a black box.
Paul
Speedtest.net performs a ping for its responsiveness/latency test. It also performs upstream/downstream data transfers and calculates up/download speeds from the time taken.
o.k.w
+1  A: 

This really depends what kind of speed information you are interested in. You must be aware of several things:

  1. The speed test may be skewed by caching
  2. CDN servers that intercept the request (assuming you are measuring HTTP response speed)
  3. The off chance of the traffic crossing a congested network
  4. DNS caching

My suggestion is to make requests to a fair number of distinct sites, to keep the result within an acceptable margin of error.

Test lots of them and take the average speed; that is what a client can generally expect. The maximum observed is an estimate of the best speed a client can get. The only minimum a client can be guaranteed is 0 Mbps.
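The summary described above could be computed like this (a small sketch; `bps_samples` is assumed to be a list of throughput measurements in bits per second, one per test site):

```python
def summarize_samples(bps_samples):
    """Summarize repeated throughput measurements (bits/sec).

    avg            -> what a client can generally expect
    max            -> best observed estimate of the client's ceiling
    guaranteed_min -> always 0: no network guarantees a floor
    """
    if not bps_samples:
        raise ValueError("need at least one sample")
    return {
        "avg": sum(bps_samples) / len(bps_samples),
        "max": max(bps_samples),
        "guaranteed_min": 0.0,
    }
```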

monksy