views: 529
answers: 4

In Python, using calendar.timegm(), I get a 10-digit result for a Unix timestamp. When I put this into JavaScript's setTime() function, it comes up with a date in 1970. It evidently needs a Unix timestamp that is 13 digits long. Why is this? Are they both counting from the same date?

How can I use the same Unix timestamp between these two languages?

In Python:

In [60]: parseddate.utctimetuple()
Out[60]: (2009, 7, 17, 1, 21, 0, 4, 198, 0)
In [61]: calendar.timegm(parseddate.utctimetuple())
Out[61]: 1247793660

In Firebug:

>>> var d = new Date(); d.setTime(1247793660); d.toUTCString()
"Thu, 15 Jan 1970 10:36:55 GMT"
+1  A: 

Are you possibly mixing up seconds-since-1970 with milliseconds-since-1970?

Greg Hewgill
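For illustration, here is a short Python sketch of the seconds-versus-milliseconds mix-up, using the timestamp from the question:

import datetime

ts = 1247793660  # seconds since the epoch, from calendar.timegm()

# Interpreted as seconds, this is the expected date:
print(datetime.datetime.utcfromtimestamp(ts))           # 2009-07-17 01:21:00

# Interpreted as milliseconds (what setTime() assumes), the same
# number only reaches a couple of weeks past the epoch:
print(datetime.datetime.utcfromtimestamp(ts / 1000.0))  # mid-January 1970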
+1  A: 

The JavaScript Date constructor works with milliseconds, so you should multiply the Python Unix time by 1000.

var unixTimestampSeconds = 1247793660;             // seconds, from Python
var date = new Date(unixTimestampSeconds * 1000);  // Date expects milliseconds
CMS
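On the Python side, the matching step is just a multiplication before handing the value to the page. A minimal sketch, reusing parseddate from the question:

import calendar

seconds = calendar.timegm(parseddate.utctimetuple())  # 1247793660
js_milliseconds = seconds * 1000                      # 1247793660000 (13 digits)
# Pass js_milliseconds to new Date(...) or setTime(...) in the page's JavaScript.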
+8  A: 

timegm() is the inverse of Unix's gmtime(), and returns seconds since Jan 1, 1970.

JavaScript's setTime() method takes milliseconds since that date. You'll need to multiply your seconds by 1000 to convert to the format JavaScript expects.

Reed Copsey
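A quick round-trip check of that relationship; time.gmtime() is the documented inverse of calendar.timegm(), so scaling the millisecond value back down recovers the same UTC fields:

import time

seconds = 1247793660     # from calendar.timegm()
millis = seconds * 1000  # what JavaScript's setTime() expects

assert time.gmtime(millis // 1000)[:6] == (2009, 7, 17, 1, 21, 0)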
+3  A: 

Here are a couple of Python functions I use to convert to and from JavaScript timestamps and datetime objects.

import datetime
import time

def to_datetime(js_timestamp):
    # JavaScript timestamps are milliseconds; divide by 1000 to get seconds.
    return datetime.datetime.fromtimestamp(js_timestamp / 1000)

def js_timestamp_from_datetime(dt):
    # mktime() interprets the tuple as local time and returns seconds since the epoch.
    return 1000 * time.mktime(dt.timetuple())

In JavaScript you would do:

var dt = new Date();
dt.setTime(js_timestamp);
Ben Noland
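A small usage sketch of those two helpers; note that they go through local time (mktime()/fromtimestamp()), so the round trip assumes both conversions happen in the same timezone:

import datetime

now = datetime.datetime.now()         # naive local datetime
ms = js_timestamp_from_datetime(now)  # float milliseconds, ready for setTime()
back = to_datetime(ms)                # back to a datetime (microseconds are dropped)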