I have the beginnings of a small multiplayer game that I'm writing in Python as a learning exercise. Currently the server runs at 10 fps, while the clients run at whatever rate they like. This works well to conserve bandwidth, but unless the client tells the server when its input happened, all input gets quantized to 100 ms intervals. How can I synchronize time between client and server so that I can make these corrections? A major hurdle here is that I'll need to determine ping times and compensate for them.
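
For concreteness, here is a rough sketch of the kind of correction I have in mind. The message fields are made up for illustration, and it assumes the server already has an estimate of each client's clock offset, which is exactly the part I don't know how to obtain:

```python
import json
import time

# Client side: tag each input event with the local time it happened.
def send_input(sock, payload):
    message = {"input": payload, "client_time": time.time()}
    sock.sendall(json.dumps(message).encode("utf-8"))

# Server side: translate the client timestamp into server time so the
# input can be applied at the moment it actually happened, instead of
# being quantized to the start of the next 100 ms server tick.
def to_server_time(message, clock_offset):
    # clock_offset is the estimated (server clock - client clock),
    # which has to come from some ping/offset handshake.
    return message["client_time"] + clock_offset
```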

A: 

This is a very interesting question. Unfortunately, there's no easy answer: you just have to understand the issue well and settle for a solution that is good enough for your application.

My first instinct was that the Network Time Protocol (NTP), which is used to set machine clocks from NTP servers, would address this issue. One of the issues addressed there concerns jitter buffers, which deal with packet delay variation; this is elaborated in RFC 3393, IP Packet Delay Variation Metric for IP Performance Metrics (IPPM).
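
To make the NTP idea concrete, here is a minimal sketch of the SNTP-style offset and round-trip calculation from the four classic timestamps; the socket plumbing for capturing t0 through t3 is omitted:

```python
def ntp_offset_and_delay(t0, t1, t2, t3):
    """Classic NTP/SNTP clock calculation.

    t0: client time when the request was sent
    t1: server time when the request arrived
    t2: server time when the reply was sent
    t3: client time when the reply arrived

    Returns (offset, delay): offset is the estimated difference
    (server clock - client clock); delay is the round-trip time.
    The estimate assumes the path is roughly symmetric; any
    asymmetry in transit time shows up directly as offset error.
    """
    offset = ((t1 - t0) + (t2 - t3)) / 2.0
    delay = (t3 - t0) - (t2 - t1)
    return offset, delay

# In practice you take several samples and keep the one with the
# smallest delay (or a filtered average); a single round trip is
# too noisy for smooth corrections.
```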

Ewan Todd
I will look into NTP; that may indeed provide some insight into my situation.
Alex
+1  A: 

I stumbled across an exceptionally good blog post by Glenn Fiedler on how to do distributed network physics in general (without traditional client prediction). I highly recommend it, along with the GDC slides Fiedler presented a couple of years ago. Good luck!

Jonas Byström
Thanks! I remember those blog posts from a long time ago, when the series was still incomplete. The final one on networking is great and should help a lot!
Alex