views: 470

answers: 3

Is it time()? Is it time().substr(microtime(), 2, 2)? Is it time().substr(microtime(), 2, 3)?

I'm kind of lost with the following snippet.

function updateClock ( ) {
    var timeStamp = <?php echo time().substr(microtime(), 2, 2);?>;
    var currentTime = new Date ( );
    currentTime.setTime( timeStamp );

    ...
    ...
}

My goal is to use the server's time and start ticking from there in the client's browser window. The code above either returns the current client computer time or some time in 1973. I guess I'm not passing the right timestamp format to setTime()?

Thanks!

  • Multiply by 1000.

I tried that, but the web page still shows my local time after I upload js.php (which renders the JavaScript code) to my server. My server's time zone is roughly 12 hours off from mine. My guess is that PHP takes the client-side time into account when running time(). I mean, browsers do send the request time to Apache, right?

I copied the time() * 1000 value returned by the page running on my server and pasted it into a local page:

<script type="text/javascript">

var d = new Date();
d.setTime(1233760568000);
document.write(d);

</script>

And it is indeed my local time, hence the guess.

Is there any way to specify a time zone for time()?
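For what it's worth, time() returns Unix epoch seconds, which are time-zone independent; the pasted value only looks local because document.write(d) formats the Date in the browser's time zone. A minimal sketch, reusing the 1233760568000 value from the snippet above and only standard Date methods, showing the same instant rendered two ways:

<script type="text/javascript">

var d = new Date();
d.setTime(1233760568000);

// The stored instant is absolute; only the formatting differs.
document.write(d.toString());    // rendered in the browser's local time zone
document.write(d.toUTCString()); // the same instant, rendered as UTC

</script>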

+1  A: 

Date.setTime expects the number of milliseconds since 1970-01-01, while PHP's time() function yields the number of seconds since 1970-01-01. Therefore, you can just use

var timeStamp = <?php echo time() * 1000; ?>;

Due to latency (the browser needs to load the whole page before the JavaScript starts running), the time will usually be off by a second or two, though.
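Building on that, a minimal sketch of how the server-seeded clock could keep ticking on the client. The id="clock" element and the once-per-second setInterval are assumptions for illustration, not part of the original snippet:

// Seed a Date once with the server's epoch: PHP seconds * 1000 = JS milliseconds.
var serverTime = new Date(<?php echo time() * 1000; ?>);

function updateClock() {
    // Advance the seeded time locally; no further requests to the server.
    serverTime.setTime(serverTime.getTime() + 1000);
    // Hypothetical target element; render the instant as UTC here.
    document.getElementById('clock').innerHTML = serverTime.toUTCString();
}

setInterval(updateClock, 1000);

Note that toUTCString() renders the instant as UTC; showing the server's local wall-clock time would additionally require echoing the server's UTC offset (for example via PHP's date('Z'), which gives the offset in seconds).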

phihag
A: 

Multiply by 1000. JavaScript's setTime() expects milliseconds, while PHP's time() returns seconds.

Staale
A: 

Date.setTime() wants milliseconds since the Unix Epoch, and time() returns seconds since then. If absolute precision isn't required (and given your methodology, I don't think it is), just multiply the value you get from time() by 1000.

Edit: beaten twice--D'oh
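If millisecond precision is wanted on the PHP side, a sketch of an alternative to the substr(microtime(), ...) concatenation from the question, assuming microtime(true), which returns the epoch as a float of seconds:

var timeStamp = <?php echo round(microtime(true) * 1000); ?>; // epoch milliseconds from PHP
var currentTime = new Date();
currentTime.setTime(timeStamp);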

Jeremy DeGroot