I am trying to work out how to calculate the latency of requests from a web app (JavaScript) to a .NET web service.
Currently I am essentially trying to sync client and server time; then, when hitting the web service, I can look at the offset, which would accurately show the 'up' latency.

The problem is that when you sync the times, you have to factor in the latency of the sync request itself. So currently I am timing the sync request (round trip) and dividing by 2 to estimate the 'up' latency, and then adjusting the sync accordingly.
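For reference, this is roughly what my current sync looks like. A minimal sketch, assuming a hypothetical `/sync` endpoint that returns the server's current time in milliseconds since the epoch:

```javascript
// Sketch of the current approach. The '/sync' endpoint and its response
// format are placeholders, not a real API.
async function estimateClockOffset() {
  const t0 = Date.now();                            // client time when request is sent
  const response = await fetch('/sync');
  const serverTime = Number(await response.text()); // server's timestamp in ms
  const t1 = Date.now();                            // client time when response arrives

  const roundTrip = t1 - t0;
  const upLatency = roundTrip / 2;                  // assumes symmetric latency -- the weak point

  // Offset between the server clock and the client clock,
  // corrected by the assumed 'up' latency.
  const offset = serverTime - (t0 + upLatency);
  return { roundTrip, upLatency, offset };
}
```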
This works on the assumption that latency is symmetrical, which it isn't. Does anyone know of a procedure that could determine specifically the up/down latency of a JS HTTP request to a .NET service? If it needs to involve multiple handshakes, that's fine; whatever is as accurate as possible.
Thanks!!