Why does the equality operator return false in the first case?
var a = new Date(2010, 10, 10);
var b = new Date(2010, 10, 10);
alert(a == b); // <- returns false
alert(a.getTime() == b.getTime()); // returns true
Why?
http://stackoverflow.com/questions/492994/compare-2-dates-with-javascript
dates.compare(a,b)
The point is that comparing the two objects directly does not work as expected :/
Since dates are built-in objects, not primitives, an equality check compares object references.
In this case, a and b are not the same object, so the test fails.
You can see the same using
var a = new String("a");
var b = new String("a");
alert(a == b); //false
By using .getTime or .valueOf you are converting the object's value into a primitive, and primitives are always compared by value rather than by reference.
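For example, making the conversion explicit with .valueOf (same dates as above):

```javascript
var a = new Date(2010, 10, 10);
var b = new Date(2010, 10, 10);

// valueOf returns the millisecond timestamp (a number),
// so this comparison is by value, not by reference.
var sameByValue = a.valueOf() === b.valueOf(); // true
```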
If you want to do a comparison by value of two dates there is also a more obscure way to do this
var a = new Date(2010, 10, 10);
var b = new Date(2010, 10, 10);
alert(+a == +b); //true
In this case the unary +
operator forces the JavaScript engine to call the object's valueOf
method - and so it is two primitives that are being compared.
If you create two clocks, and set them both to the same time, you have two clocks.
If you change the time in one clock, it will not change the time in the other clock.
To compare or sort Dates, subtract one from the other. The value of a Date object, used in a mathematical expression, is its millisecond timestamp.
function compareDates(a, b) { return a - b; }
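That comparator plugs straight into Array.prototype.sort; the date values here are made up for illustration:

```javascript
function compareDates(a, b) { return a - b; }

var dates = [
  new Date(2012, 0, 1),
  new Date(2010, 10, 10),
  new Date(2011, 5, 15)
];

// sort() calls the comparator; the subtraction coerces each
// Date to its millisecond timestamp via valueOf.
dates.sort(compareDates); // earliest date first
```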