I was playing around with JavaScript, creating a simple countdown clock, when I came across this strange behavior:
var a = new Date(),
    now = a.getTime(),
    then = Date.UTC(2009, 10, 31),
    diff = then - now,
    daysleft = parseInt(diff / (24 * 60 * 60 * 1000));
console.log(daysleft);
The daysleft value is off by 30 days.
What is wrong with this code?
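For reference, here is the debugging sketch I'm using to inspect the intermediate values (it doesn't change the calculation; it just prints both timestamps as readable UTC dates along with the raw day difference):

var a = new Date(),
    now = a.getTime(),
    then = Date.UTC(2009, 10, 31);

// Print both instants as UTC strings to see which calendar dates are actually being compared
console.log(new Date(now).toUTCString());
console.log(new Date(then).toUTCString());

// Raw difference in days, before any truncation
console.log((then - now) / (24 * 60 * 60 * 1000));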
Edit: I changed the variable names to make them clearer.