views:

4484

answers:

16

Why does this JavaScript return 108 instead of 2008? It gets the day and month correct, but not the year.

var myDate = new Date();
var year = myDate.getYear();  // year is 108, not 2008?

+5  A: 

It must return the number of years since the year 1900.
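
A minimal sketch to confirm this, assuming the current year is 2008 and a getYear() that follows the original specification:

var myDate = new Date();     // run some time in 2008

myDate.getYear();            // 108  -- years since 1900
myDate.getYear() + 1900;     // 2008 -- adding 1900 recovers the full year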

yjerem
A: 

Y2K issue?

BCS
A: 

The number you get is the number of years since 1900. Don't ask me why.

Nils Pipenbrinck
+3  A: 

Use date.getFullYear().

This is (as correctly pointed out elsewhere) a Y2K thing. Netscape (written before 2000) originally returned, for example, 98 from getYear(). Rather than roll over to 00, it instead returned 100 for the year 2000. Then other browsers came along and did it differently, and everyone was unhappy as incompatibility reigned.

Later browsers supported getFullYear as a standard method to return the complete year.
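
A minimal sketch of the two methods side by side, assuming a browser that kept the original getYear() behaviour (older IE returns the full year from getYear() as well):

new Date(1998, 0, 1).getFullYear();   // 1998 -- always the complete year
new Date(2008, 0, 1).getFullYear();   // 2008

new Date(1998, 0, 1).getYear();       // 98  -- the original Netscape behaviour
new Date(2000, 0, 1).getYear();       // 100 -- rather than rolling over to 00
new Date(2008, 0, 1).getYear();       // 108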

Dan
A: 

It is returning the 4-digit year minus 1900, which may have made sense 9+ years ago but seems pretty silly now. Java's java.util.Date also does this.

+1  A: 

It's dumb. It dates to pre-Y2K days, and now just returns the number of years since 1900 for legacy reasons. Use getFullYear() to get the actual year.

joelhardi
A: 

As others have said, it returns the number of years since 1900. The reason it does that is that when JavaScript was invented in the mid-90s, that behaviour was both convenient and consistent with the date-time APIs of other languages, particularly C. And, of course, once the API was established it couldn't be changed, for backwards-compatibility reasons.

+32  A: 

It's a Y2K thing, only the years since 1900 are counted.

There are potential compatibility issues now that getYear() has been deprecated in favour of getFullYear() - from quirksmode:

To make the matter even more complex, date.getYear() is deprecated nowadays and you should use date.getFullYear(), which, in turn, is not supported by the older browsers. If it works, however, it should always give the full year, ie. 2000 instead of 100.

Your browser gives the following years with these two methods:

* The year according to getYear(): 108
* The year according to getFullYear(): 2008

There are also implementation differences between Internet Explorer and Firefox, as IE's implementation of getYear() was changed to behave like getFullYear() - from IBM:

Per the ECMAScript specification, getYear returns the year minus 1900, originally meant to return "98" for 1998. getYear was deprecated in ECMAScript Version 3 and replaced with getFullYear().

Internet Explorer changed getYear() to work like getFullYear() and make it Y2k-compliant, while Mozilla kept the standard behavior.
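
A minimal sketch (not part of the quoted sources) to reproduce that comparison in your own browser:

var now = new Date();
// In 2008 the first value is 108 in Firefox/WebKit but 2008 in IE,
// which changed getYear() to behave like getFullYear().
alert("getYear(): " + now.getYear() + "\ngetFullYear(): " + now.getFullYear());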

ConroyP
A: 

BTW, different browsers might return different results, so it's better to skip this function altogether and always use getFullYear().

Milan Babuškov
+11  A: 

Since getFullYear doesn't work in older browsers, you can use something like this:

Date.prototype.getRealYear = function()
{
    // Prefer getFullYear() when available; otherwise fall back to
    // the spec behaviour of getYear() (years since 1900).
    if (this.getFullYear)
        return this.getFullYear();
    else
        return this.getYear() + 1900;
};

JavaScript prototypes can be used to extend existing objects, much like C# extension methods. Now we can just do this:

var myDate = new Date();
myDate.getRealYear();
// Outputs 2008
FlySwat
Maybe something like Date.prototype.getRealYear = Date.prototype.getFullYear ? Date.prototype.getFullYear : function() { return this.getYear() + 1900; };
valya
+1  A: 

This question is so old that it makes me weep with nostalgia for the dotcom days!

That's right, Date.getYear() returns the number of years since 1900, just like Perl's localtime(). One wonders why a language designed in the 1990s wouldn't account for the century turnover, but what can I say? You had to be there. It sort of made sense at the time (like pets.com did).

Before 2000, one might have been tempted to fix this bug by appending "19" to the result of getYear(), resulting in the "year 19100" bug. Others have already answered this question sufficiently (add 1900 to the result of getYear()).
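
A minimal sketch of that trap, assuming the year is 2008 so getYear() returns 108:

var d = new Date();        // assume the year is 2008

"19" + d.getYear();        // "19108" -- string concatenation: the "year 19100" bug
1900 + d.getYear();        // 2008    -- numeric addition: the correct fix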

Maybe the book you're reading about JavaScript is a little old?

Thanks for the blast from the past!

jjohn
The reason the language (Perl) didn't account for it is that it was blindly copying a C design decision that had been made far earlier. (20 years?)
skiphoppy
+1  A: 

Check the docs. It's not a Y2K issue -- it's a lack of a Y2K issue! This decision was made originally in C and was copied into Perl, apparently into JavaScript, and probably into several other languages. That long ago, it was apparently still felt desirable to use two-digit years, but remarkably, whoever designed that interface had enough forethought to realize they needed to think about what would happen in the year 2000 and beyond, so instead of just providing the last two digits, they provided the number of years since 1900. You could use the two digits if you were in a hurry or wanted to be risky. Or, if you wanted your program to continue to work, you could add 1900 to the result and use full-fledged four-digit years.

I remember the first time I did date manipulation in Perl. Strangely enough I read the docs. Apparently this is not a common thing. A year or two later I got called into the office on December 31, 1999 to fix a bug that had been discovered at the last possible minute in some contract Perl code, stuff I'd never had anything to do with. It was this exact issue: the standard date call returned years since 1900, and the programmers treated it as a two-digit year. (They assumed they'd get "00" in 2000.) As a young inexperienced programmer, it blew my mind that we'd paid so much extra for a "professional" job, and those people hadn't even bothered to read the documentation. It was the beginning of many years of disillusionment; now I'm old and cynical. :)

In the year 2000, the annual YAPC Perl conference was referred to as "YAPC 19100" in honor of this oft-reported non-bug.

Nowadays, in the Perl world at least, it makes more sense to use a standard module for date-handling, one which uses real four-digit years. Not sure what might be available for JavaScript.

skiphoppy
hehe... reminds me of the Dilbert cartoon... "have you read the story about a Spanish guy named Manual?" ;)
Adhip Gupta
+2  A: 

You should, as pointed out, never use getYear(), but instead use getFullYear().

The story is, however, not as simple as "IE implements getYear() as getFullYear()". These days Opera and IE treat getYear() as it was originally specified for dates before 2000, but treat it as getFullYear() for dates from 2000 onwards, while WebKit and Firefox stick with the old behavior.

This outputs 99 in all browsers:

javascript:alert(new Date(917823600000).getYear());

This outputs 108 in FF/WebKit, and 2008 in Opera/IE:

javascript:alert(new Date().getYear());
Arve
A: 

var date_object = new Date();
var year = date_object.getYear();
if (year < 2000) {
    year = year + 1900;   // normalise offset-from-1900 values to the full year
}

A: 

I am using date.getUTCFullYear(); it works without problems.
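
One caveat to keep in mind (a minimal sketch, using only standard Date methods): the UTC year can differ from the local year right around New Year, depending on the timezone:

// 02:00 UTC on 1 January 2009
var d = new Date(Date.UTC(2009, 0, 1, 2, 0, 0));

d.getUTCFullYear();   // 2009 everywhere
d.getFullYear();      // 2008 in timezones west of UTC-02:00 (e.g. UTC-03:00), where it is still 31 December locally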

D3vito