views: 226
answers: 3

I was playing around with JavaScript, creating a simple countdown clock, when I came across this strange behavior:

var a = new Date(),
    now = a.getTime(),
    then = Date.UTC(2009, 10, 31),
    diff = then - now,
    daysleft = parseInt(diff / (24 * 60 * 60 * 1000));
console.log(daysleft);

The daysleft value is off by 30 days.

What is wrong with this code?

Edit: I changed the variable names to make the code clearer.

+7  A: 

The month is zero-based in JavaScript.

Days and years are one-based.

Go figure.

Eric J.
Ha. Thank you. That means I am not losing my mind after all.
picardo
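
For illustration, here is a minimal sketch of the zero-based month indexing described in the answer above. It assumes the question's target date was meant to be October 31, 2009, which would be month index 9:

var msPerDay = 24 * 60 * 60 * 1000;
// Month index 10 is November; day 31 rolls over to December 1, 2009 UTC.
var wrong = Date.UTC(2009, 10, 31);
// Month index 9 is October, giving the intended October 31, 2009 UTC.
var right = Date.UTC(2009, 9, 31);
console.log((wrong - right) / msPerDay); // 31 -- roughly the month-long error in the question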
A: 

As Eric said, this is due to months being in the 0-11 range.

This is common behavior: the same is true of Perl's localtime() results, and probably of many other languages.

It is likely inherited from Unix's localtime() call (see "man localtime").

The reason is that days and years are plain integers, while the month (as a number) is effectively an index into an array of month names, and arrays in most languages - especially C, where the underlying Unix call is implemented - start at 0.

DVK
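
To make the array-index picture concrete, a small sketch (the monthNames array here is just for illustration, not part of the Date API):

var monthNames = ["January", "February", "March", "April", "May", "June",
                  "July", "August", "September", "October", "November", "December"];
var d = new Date(2009, 9, 31);         // month index 9 is October
console.log(monthNames[d.getMonth()]); // "October"
console.log(d.getDate());              // 31 -- the day of month is one-based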
A: 

var date1 = new Date(); // year, month, day [, hrs] [, min] [, sec]
// Date.UTC is a static method (no "new") and expects the same zero-based month that getMonth() returns.
date1 = Date.UTC(date1.getFullYear(), date1.getMonth(), date1.getDate(),
                 date1.getHours(), date1.getMinutes(), date1.getSeconds());

var date2 = new Date();
date2 = date2.getTime();

alert(date1);
alert(date2);

peter