Why does JavaScript getYear() return 108?
Why does this JavaScript return 108 instead of 2008? It gets the day and month correct, but not the year.
myDate = new Date();
year = myDate.getYear();
// year is 108?
It's a Y2K thing: only the years since 1900 are counted.
There are potential compatibility issues now that getYear() has been deprecated in favour of getFullYear().
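For example, on an engine that follows the spec, the two methods differ only by the 1900 offset (a minimal sketch; the exact values obviously depend on the current date):
var myDate = new Date();     // some date in 2008
myDate.getYear();            // 108  (years since 1900)
myDate.getFullYear();        // 2008 (full four-digit year)
myDate.getYear() + 1900;     // 2008 on spec-compliant engines (IE differs, see below)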
- from quirksmode:
To make the matter even more complex, date.getYear() is deprecated nowadays and you should use date.getFullYear(), which, in turn, is not supported by the older browsers. If it works, however, it should always give the full year, ie. 2000 instead of 100.
Your browser gives the following years with these two methods:
* The year according to getYear(): 108
* The year according to getFullYear(): 2008
There are also implementation differences between Internet Explorer and Firefox, as IE's implementation of getYear() was changed to behave like getFullYear().
- from IBM:
Per the ECMAScript specification, getYear returns the year minus 1900, originally meant to return "98" for 1998. getYear was deprecated in ECMAScript Version 3 and replaced with getFullYear().
Internet Explorer changed getYear() to work like getFullYear() and make it Y2K-compliant, while Mozilla kept the standard behavior.
Since getFullYear doesn't work in older browsers, you can use something like this:
Date.prototype.getRealYear = function()
{
    // Prefer getFullYear() where it exists; otherwise fall back to
    // getYear(), which per the spec returns years since 1900.
    if (this.getFullYear)
        return this.getFullYear();
    else
        return this.getYear() + 1900;
};
JavaScript's prototype mechanism can be used to extend existing objects, much like C# extension methods. Now we can just do this:
var myDate = new Date();
myDate.getRealYear();
// Returns 2008
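One caveat: the fallback above assumes getYear() returns years since 1900. On an implementation that follows IE's behaviour and already returns the full year from getYear() (and that somehow lacks getFullYear()), adding 1900 would give the wrong answer. A more defensive variant is sketched below; the helper name fullYearOf and the 1000 cutoff are just illustrative assumptions:
function fullYearOf(date)
{
    // Use getFullYear() when available.
    if (date.getFullYear)
        return date.getFullYear();

    // Otherwise normalize getYear(): small values are "years since 1900"
    // (e.g. 108), large values are already full years (e.g. 2008).
    var year = date.getYear();
    return year < 1000 ? year + 1900 : year;
}

fullYearOf(new Date()); // 2008 either way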
Check the docs. It's not a Y2K issue -- it's a lack of a Y2K issue! This decision was made originally in C and was copied into Perl, apparently JavaScript, and probably several other languages. That long ago it was apparently still felt desirable to use two-digit years, but remarkably whoever designed that interface had enough forethought to realize they needed to think about what would happen in the year 2000 and beyond, so instead of just providing the last two digits, they provided the number of years since 1900. You could use the two digits, if you were in a hurry or wanted to be risky. Or if you wanted your program to continue to work, you could add 1900 to the result and use full-fledged four-digit years.
I remember the first time I did date manipulation in Perl. Strangely enough I read the docs. Apparently this is not a common thing. A year or two later I got called into the office on December 31, 1999 to fix a bug that had been discovered at the last possible minute in some contract Perl code, stuff I'd never had anything to do with. It was this exact issue: the standard date call returned years since 1900, and the programmers treated it as a two-digit year. (They assumed they'd get "00" in 2000.) As a young inexperienced programmer, it blew my mind that we'd paid so much extra for a "professional" job, and those people hadn't even bothered to read the documentation. It was the beginning of many years of disillusionment; now I'm old and cynical. :)
In the year 2000, the annual YAPC Perl conference was referred to as "YAPC 19100" in honor of this oft-reported non-bug.
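The same mistake is easy to reproduce in JavaScript: string-concatenating a literal "19" with the years-since-1900 value looks right up to 1999 and then falls apart (a sketch of the bug, not something to copy):
var d = new Date();

// Wrong: treats getYear() as a two-digit year.
// In 1998 this gives "1998", in 2000 it gives "19100", in 2008 "19108".
var wrong = "19" + d.getYear();

// Right: use the full four-digit year directly.
var right = d.getFullYear(); // 2008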
Nowadays, in the Perl world at least, it makes more sense to use a standard module for date-handling, one which uses real four-digit years. Not sure what might be available for JavaScript.