How does !!~ (not not tilde/bang bang tilde) alter the result of a 'contains/included' Array method call?
If you read the comments on the jQuery inArray page, there's an interesting declaration:
!!~jQuery.inArray(elm, arr)
Now, I believe a double exclamation point converts the result to type boolean, with the value of true. What I don't understand is the use of the tilde (~) operator in all of this.
var arr = ["one", "two", "three"];
if (jQuery.inArray("one", arr) > -1) { alert("Found"); }
Refactoring the if statement:
if (!!~jQuery.inArray("one", arr)) { alert("Found"); }
Breakdown:
jQuery.inArray("one", arr) // 0
~jQuery.inArray("one", arr) // -1 (why?)
!~jQuery.inArray("one", arr) // false
!!~jQuery.inArray("one", arr) // true
I also noticed that if I put the tilde in front, the result is -2.
~!!~jQuery.inArray("one", arr) // -2
I don't understand the purpose of the tilde here. Can someone please explain it or point me towards a resource?
There's a specific reason you'll sometimes see ~ applied in front of $.inArray.
Basically,
~$.inArray("foo", bar)
is a shorter way to do
$.inArray("foo", bar) !== -1
$.inArray returns the index of the item in the array if the first argument is found, and returns -1 if it's not found. This means that if you're looking for a boolean answer to "is this value in the array?", you can't test the result's truthiness directly: -1 is a truthy value, and when $.inArray returns 0 (a falsy value), the item was actually found in the first element of the array.
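For example, a quick sketch of the pitfall (using a throwaway bar array):

var bar = ["foo", "two", "three"];
if ($.inArray("foo", bar)) { /* does NOT run: index 0 is falsy */ }
if ($.inArray("baz", bar)) { /* DOES run: index -1 is truthy */ }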
Applying the ~ bitwise NOT operator causes -1 to become 0, and causes 0 to become -1. Thus, not finding the value in the array and applying the bitwise NOT results in a falsy value (0), while all other indexes produce non-zero numbers, which represent a truthy result.
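In other words, ~n evaluates to -(n + 1), so only the "not found" result of -1 maps to the falsy 0:

~-1 // 0  (not found -> falsy)
~0  // -1 (found at index 0 -> truthy)
~1  // -2 (truthy)
~2  // -3 (truthy)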
if (~$.inArray("foo", ["foo",2,3])) {
// Will run
}
And it'll work as intended.
!!~expr evaluates to false when expr is -1, otherwise true.

It is the same as expr != -1, only broken*.
It works because JavaScript bitwise operations convert their operands to 32-bit signed integers in two's complement format. Thus !!~-1 is evaluated as follows:
-1 = 1111 1111 1111 1111 1111 1111 1111 1111b // two's complement representation of -1
~-1 = 0000 0000 0000 0000 0000 0000 0000 0000b // ~ is bitwise not (invert all bits)
!0 = true // ! is logical not (true for falsy)
!true = false // duh
A value other than -1 will have at least one bit set to zero in its 32-bit representation; inverting it will create a truthy (non-zero) value; and applying the ! operator twice to a truthy value returns boolean true.
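Tracing !!~0 the same way shows the opposite outcome:

0   = 0000 0000 0000 0000 0000 0000 0000 0000b // 32-bit representation of 0
~0  = 1111 1111 1111 1111 1111 1111 1111 1111b // -1, a truthy value
!-1 = false // -1 is truthy, so logical NOT yields false
!false = true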
When used with .indexOf() and we only want to check whether the result is -1 or not:
!!~"abc".indexOf("d") // indexOf() returns -1, the expression evaluates to false
!!~"abc".indexOf("a") // indexOf() returns 0, the expression evaluates to true
!!~"abc".indexOf("b") // indexOf() returns 1, the expression evaluates to true
* !!~8589934591 evaluates to false, so this abomination cannot be reliably used to test for -1: 8589934591 is 2^33 - 1, whose low 32 bits are all ones, so the conversion to a 32-bit integer turns it into -1 before ~ is applied.
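To see the failure concretely:

~8589934591   // 0: only the low 32 bits survive the ToInt32 conversion, yielding -1, and ~-1 is 0
!!~8589934591 // false, even though the original number is nowhere near -1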
The tilde operator isn't actually part of jQuery at all - it's a bitwise NOT operator in JavaScript itself.
See The Great Mystery of the Tilde(~).
You are getting strange numbers in your experiments because you are performing a bitwise operation on a number, and JavaScript converts the operands of bitwise operators to 32-bit signed integers in two's complement representation. Two's complement is indeed how those integers are laid out in binary, and in that representation ~n works out to -(n + 1).
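This also explains the -2 in the question: !!~jQuery.inArray("one", arr) evaluates to true, and the leading ~ converts true to the 32-bit integer 1 before inverting it. A minimal illustration:

~true // -2: true converts to 1, and ~1 === -(1 + 1)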