I am trying to debug an iPhone app that uses Bluetooth for communication, and I am wondering whether I can assume two devices agree on the time down to the millisecond. I am stamping each message with CFAbsoluteTimeGetCurrent() when it is sent from one device and subtracting that stamp from CFAbsoluteTimeGetCurrent() on the other device when it arrives (sketched below). Is it safe to assume the two clocks are in sync at that granularity? Is there another way to time the messages?
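For concreteness, here is a minimal Swift sketch of the approach; the payload framing and function names are illustrative, not my exact code:

```swift
import Foundation  // brings in CoreFoundation's CFAbsoluteTimeGetCurrent()

// Sender: prefix the outgoing message with the current absolute time.
// CFAbsoluteTime is a Double: seconds since Jan 1, 2001, 00:00 GMT.
func timestampedPayload(body: Data) -> Data {
    var stamp = CFAbsoluteTimeGetCurrent()
    var payload = Data(bytes: &stamp, count: MemoryLayout<CFAbsoluteTime>.size)
    payload.append(body)
    return payload
}

// Receiver: subtract the embedded stamp from the local clock. The result
// only equals the true transit time if the two clocks actually agree.
func apparentTransitTime(of payload: Data) -> CFTimeInterval {
    let stamp = payload.withUnsafeBytes { $0.load(as: CFAbsoluteTime.self) }
    return CFAbsoluteTimeGetCurrent() - stamp
}
```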

+1  A: 

Those timestamps depend on each device's own clock, so, no, I doubt you'd get millisecond accuracy.

A possible strategy is to estimate the two devices' clock offset with a series of initial network messages, then repeat the synchronization occasionally to (try to) account for drift and clock changes. Here's something that may work (just brainstorming here); a code sketch follows the list...

  1. A sends its current timestamp to B (send_time)
  2. B replies with its own current timestamp (response_time)
  3. A receives the response (receive_time) and estimates:
    • latency = 0.5 * (receive_time - send_time)
    • offset = response_time - send_time - latency
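In code, the estimate from one round trip might look like this (a minimal Swift sketch; it assumes the message takes about as long in each direction):

```swift
import Foundation

// One round-trip estimate of B's clock offset relative to A.
// Assumes the link latency is roughly symmetric.
func estimateClockOffset(sendTime: CFAbsoluteTime,
                         responseTime: CFAbsoluteTime,
                         receiveTime: CFAbsoluteTime)
        -> (latency: CFTimeInterval, offset: CFTimeInterval) {
    let latency = 0.5 * (receiveTime - sendTime)    // half the round trip
    let offset = responseTime - sendTime - latency  // B's clock minus A's clock
    return (latency, offset)
}

// A can then place a timestamp taken on B onto its own timeline:
//   timeOnA = timestampFromB - offset
```

Averaging the offset over several exchanges, and discarding exchanges with an unusually long round trip, should tighten the estimate.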
Daniel Dickison
+1  A: 

If the timing is for debugging purposes only, it probably isn't worth the effort, but if your app needs synchronized clocks, take a look at the NTP (Network Time Protocol) article on Wikipedia. You should be able to synchronize the two devices quite closely using some of those techniques; the core calculation is sketched below.
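As a rough illustration of the standard NTP calculation (a Swift sketch; the four timestamps follow the convention in the Wikipedia article, two read on each device):

```swift
import Foundation

// Classic NTP offset/delay estimate from four timestamps:
//   t0 = request sent (A's clock)    t1 = request received (B's clock)
//   t2 = reply sent (B's clock)      t3 = reply received (A's clock)
func ntpEstimate(t0: CFAbsoluteTime, t1: CFAbsoluteTime,
                 t2: CFAbsoluteTime, t3: CFAbsoluteTime)
        -> (offset: CFTimeInterval, roundTripDelay: CFTimeInterval) {
    let offset = ((t1 - t0) + (t2 - t3)) / 2  // B's clock minus A's clock
    let delay  = (t3 - t0) - (t2 - t1)        // time actually on the wire
    return (offset, delay)
}
```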

I don't know whether any NTP functionality is built into the iPhone through the BSD layer, but you can pick up open-source NTP client code somewhere if you need it.

Dustin Voss