In ActionScript, the Unix timestamp in milliseconds is obtainable like this:
public static function getTimeStamp():uint
{
    var now:Date = new Date();
    return now.getTime();
}
The doc clearly states the following:
getTime():Number
    Returns the number of milliseconds since midnight January 1, 1970, universal time, for a Date object.
When I trace it, it returns the following: 824655597
So, 824655597 / 1000 / 60 / 60 / 24 / 365 ≈ 0.026 years, or roughly 9.5 days. This is obviously not correct, as it should be around 39 years. Question #1: What's wrong here?
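A sketch of the likely explanation, assuming the culprit is the uint return type: a uint is 32 bits wide and tops out at 4,294,967,295, while the millisecond count since 1970 is already over a trillion, so the value wraps around on return. Matching the return type to the doc's Number signature would look like this:

public static function getTimeStamp():Number
{
    // Date.getTime() returns a Number; keeping the return type as
    // Number avoids truncating the millisecond count to 32 bits.
    var now:Date = new Date();
    return now.getTime();
}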
Now, onto the PHP part: I'm trying to get the timestamp in milliseconds there as well. microtime() returns either a string (0.29207800 1246365903) or a float (1246365134.01), depending on the argument it is given. Because I thought timestamps were easy, I was going to do this myself. But now that I have tried, and seen this float, and combined that with my problems in ActionScript, I really have no clue. Question #2: How do I make it return the number of milliseconds in a Unix timestamp?
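A sketch of one possibility, assuming microtime(true) (PHP 5+) is acceptable: the float it returns is seconds since the epoch, so multiplying by 1000 gives milliseconds. The function name and the use of round() are just illustrative choices:

function getTimestampMs()
{
    // microtime(true) returns the epoch time in seconds as a float,
    // e.g. 1246365134.0132. Multiplying by 1000 yields milliseconds;
    // round() drops the sub-millisecond remainder. The result stays a
    // float on purpose: a 13-digit value overflows a 32-bit integer.
    return round(microtime(true) * 1000);
}

// Example usage:
// echo getTimestampMs(); // e.g. 1246365134013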
Timestamps should be so easy; I'm probably missing something, sorry about that. Thanks in advance.
EDIT: Answered the first question myself. See below. EDIT 2: Answered the second question myself as well. See below. Can't accept an answer within 48 hours.