Why does (0 < 5 < 3) return true?
I was playing around in jsfiddle.net and I'm curious why this returns true:
if (0 < 5 < 3) {
    alert("True");
}
So does this:
if (0 < 5 < 2) {
    alert("True");
}
But this doesn't:
if (0 < 5 < 1) {
    alert("True");
}
Is this quirk ever useful?
Order of operations causes (0 < 5 < 3) to be interpreted in JavaScript as ((0 < 5) < 3): the < operator is left-associative, so the comparisons are evaluated left to right. (0 < 5) produces true, and in (true < 3) the boolean true is coerced to the number 1, which gives (1 < 3), and that is true.

This is also why (0 < 5 < 1) returns false: (0 < 5) returns true, which is coerced to 1, resulting in (1 < 1), which is false.
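To see the steps in the console, here is a minimal sketch of the evaluation, plus the check the question probably intended (the variable x is purely illustrative):

console.log(0 < 5 < 3);     // true
console.log((0 < 5) < 3);   // true (same parse: < is left-associative)
console.log(true < 3);      // true (true is coerced to the number 1)
console.log(1 < 3);         // true

// To test whether a value lies between 0 and 3, chain the comparisons with &&:
var x = 5;
if (0 < x && x < 3) {
    alert("True");          // never fires, because 5 is not less than 3
}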
My guess is because 0 < 5 is true, and true < 3 gets cast to 1 < 3, which is true.
Probably because true is assumed as 1, so

0 < 5 < 3 --> true < 3 --> 1 < 3 --> true
Because true < 3, because true == 1.
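A quick console check backs this up (a minimal sketch of the coercions involved):

console.log(true == 1);     // true: loose equality coerces true to 1
console.log(Number(true));  // 1: the same ToNumber conversion < applies to booleans
console.log(true < 3);      // true, evaluated as 1 < 3
console.log(false < 1);     // true, since false coerces to 0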
As to your question whether this quirk is ever useful: I suppose there could be some case where it would be useful (if condensed code is what you are after), but relying on it will (most likely) severely reduce the readability of your code.

It's kind of like using pre/post increment/decrement as part of a bigger expression. Can you determine at a glance what this code's result is?
int x = 5;
int result = ++x + x++ + --x;
Note: with this code you can sometimes even get different results depending on the language and compiler (in C and C++, for example, modifying x more than once in a single expression like this is undefined behavior).
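For comparison, JavaScript specifies left-to-right operand evaluation, so the equivalent expression does have one defined result; it is just still painful to work out. A sketch, with x and result as illustrative names:

var x = 5;
var result = ++x + x++ + --x;
// ++x: x becomes 6, the term is 6
// x++: the term is 6, then x becomes 7
// --x: x becomes 6, the term is 6
console.log(result);        // 18, but nobody should have to compute that by eye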
It's a good idea to make life easy for yourself and the next guy who will read your code. Clearly write out what you actually want to have happen rather than relying on side effects like the implicit conversion of booleans.