Assuming there are a number of load-balanced web servers in a web farm, is it safe to use the app/web server time in the application code for getting the system date/time or should we leave this responsibility to the database server?

Would there be a chance that the machine date/time settings on all servers in the web farm are out of sync?

If getting the date/time is the responsibility of the DBMS, how will this strategy work if we have load-balanced, clustered databases?
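For concreteness, one way to centralize on the database clock is to ask the DBMS for the current time instead of reading the local clock in application code. A minimal sketch, assuming a PostgreSQL backend accessed through psycopg2 (the connection string and the db_now helper are illustrative, not anything from the question):

    from datetime import datetime

    import psycopg2  # assumes PostgreSQL; any DB-API driver works similarly


    def db_now(dsn: str) -> datetime:
        """Return the current timestamp as reported by the database server.

        Every web server in the farm that calls this sees the same clock,
        regardless of how its own system time is set.
        """
        with psycopg2.connect(dsn) as conn:
            with conn.cursor() as cur:
                cur.execute("SELECT now()")  # server-side clock, not the app server's
                return cur.fetchone()[0]


    if __name__ == "__main__":
        # Illustrative connection string; replace with your own.
        print(db_now("dbname=app host=db.example.internal user=app"))

Note that this trades clock consistency for an extra round trip to the database on every call, which is exactly the lag concern raised in the answers below.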

+1  A: 

It's best just to have the same time set on all the servers so you don't have to worry about it; otherwise there is always confusion about where the time comes from.

If the clocks on the servers are regularly synced from a time server, they should stay within roughly 100 ms of each other, which is probably good enough, though obviously it depends on what exactly you are trying to do.

Nathan Reed
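To put a number on the "within 100 ms" claim above, each server can measure its clock offset against an NTP source. A small sketch using the ntplib package (the pool.ntp.org host and the 100 ms threshold are just example choices):

    import ntplib  # pip install ntplib


    def clock_offset_seconds(server: str = "pool.ntp.org") -> float:
        """Return this machine's clock offset from the given NTP server, in seconds."""
        client = ntplib.NTPClient()
        response = client.request(server, version=3)
        return response.offset


    if __name__ == "__main__":
        offset = clock_offset_seconds()
        print(f"offset: {offset * 1000:.1f} ms")
        if abs(offset) > 0.1:  # example threshold: 100 ms, as mentioned above
            print("warning: clock drift exceeds 100 ms; check NTP sync on this host")

Running this on every machine in the farm (e.g. from a monitoring job) gives you the actual worst-case deviation rather than an assumption.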
+3  A: 

You should have a Time Server (-:)

Seriously, the first approach is to make sure all servers use a protocol to sync their clocks. That will leave you with a known worst-case deviation. Only if that deviation is larger than you can tolerate (unlikely in a web app) will you need to engineer something special. Usually the database will be fine, but if it is clustered then you may need to appoint a dedicated server as the keeper of time.

But note that your accuracy will always be bounded by the maximum lag of the network.

Henk Holterman
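If you do end up appointing one machine as the keeper of time, the simplest version is a tiny HTTP endpoint that the other servers query instead of reading their own clocks. A sketch using only the Python standard library (the port and JSON shape are arbitrary choices for illustration, not anything prescribed in the answer):

    import json
    from datetime import datetime, timezone
    from http.server import BaseHTTPRequestHandler, HTTPServer


    class TimeHandler(BaseHTTPRequestHandler):
        """Serve this host's clock as JSON; other servers in the farm call
        this endpoint instead of using their own system time."""

        def do_GET(self):
            body = json.dumps(
                {"utc": datetime.now(timezone.utc).isoformat()}
            ).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)


    if __name__ == "__main__":
        # Arbitrary port for illustration; every call pays one network round
        # trip, which is the accuracy bound mentioned in the answer above.
        HTTPServer(("0.0.0.0", 8123), TimeHandler).serve_forever()

In practice this is rarely worth the extra hop; NTP on every machine plus reading the database clock for anything that must be globally ordered covers most web-app cases.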