JavaScript Date.UTC() function is off by a month?
I was playing around with JavaScript, creating a simple countdown clock, when I came across this strange behavior:
var a = new Date(),
    now = a.getTime(),
    then = Date.UTC(2009, 10, 31),
    diff = then - now,
    daysleft = parseInt(diff / (24 * 60 * 60 * 1000));
console.log(daysleft);
The number of days left is off by 30 days.
What is wrong with this code?
Edit: I changed the variable names to make the code clearer.
Solution 1:
The month is zero-based in JavaScript (0 = January, 11 = December).
Days and years are one-based.
Go figure.
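For example, here is a minimal sketch (the intended target date of October 31, 2009 is an assumption) showing what the month index actually selects:

// Months are zero-based: 9 = October, 10 = November.
// Date.UTC(2009, 10, 31) asks for "November 31", which rolls over to December 1.
var wrong = Date.UTC(2009, 10, 31); // December 1, 2009 (UTC)
var right = Date.UTC(2009, 9, 31);  // October 31, 2009 (UTC)
console.log(new Date(wrong).toUTCString());
console.log(new Date(right).toUTCString());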
UPDATE
The reason it is this way, according to the creator of JavaScript, is:
JS had to "look like Java" only less so, be Java's dumb kid brother or boy-hostage sidekick. Plus, I had to be done in ten days or something worse than JS would have happened.
http://www.jwz.org/blog/2010/10/every-day-i-learn-something-new-and-stupid/#comment-1021
Solution 2:
As Eric said, this is due to months being in the 0-11 range.
This is common behavior: the same is true of Perl's localtime() results, and probably of many other languages.
It is likely inherited from Unix's localtime() call (see "man localtime").
The reason is that days and years are plain integers, while a numeric month is effectively an index into an array of month names, and in most languages, especially C, where the underlying Unix call is implemented, arrays start at index 0.
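As a rough sketch of that array-index view (the month-name array here is just for illustration, not part of the Date API):

// getUTCMonth() returns 0-11, so it can be used directly as an array index.
var monthNames = ['January', 'February', 'March', 'April', 'May', 'June',
                  'July', 'August', 'September', 'October', 'November', 'December'];
var d = new Date(Date.UTC(2009, 9, 31));
console.log(monthNames[d.getUTCMonth()]); // "October"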