Why is string "11" less than string "3"? [duplicate]

if ('11' < '3') alert('true');

It's obvious that it's not comparing them by length but by encoding instead. However, I don't understand how it works. I need some explanation :-)


Strings are compared lexicographically, i.e. character by character, until they differ or there are no characters left to compare. The first character of '11' is less than the first character of '3'.

> '11' < '3'
true
> '31' < '3'
false
> '31' < '32'
true
> '31' < '30'
false
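
The comparison can be pictured as a simple loop over the characters. Here is a minimal sketch of the idea (lexLess is a hypothetical helper for illustration, not what the engine actually runs):

// Sketch of lexicographic "less than": compare UTF-16 code units
// left to right; the first differing character decides the result.
function lexLess(a, b) {
  const len = Math.min(a.length, b.length);
  for (let i = 0; i < len; i++) {
    if (a[i] !== b[i]) {
      return a.charCodeAt(i) < b.charCodeAt(i);
    }
  }
  // One string is a prefix of the other: the shorter one is "less".
  return a.length < b.length;
}

lexLess('11', '3'); // true, because '1' (49) is less than '3' (51)
lexLess('31', '3'); // false, '3' is a prefix of '31'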

With letters, since b is not less than a, 'abc' is not less than 'aaa'; but since c is less than d, 'abc' is less than 'abd'.

> 'abc' < 'aaa'
false
> 'abc' < 'abd'
true
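
The "less than" here is decided by each character's UTF-16 code unit, which you can inspect with charCodeAt:

> 'a'.charCodeAt(0)
97
> 'b'.charCodeAt(0)
98
> 'abc'.charCodeAt(1) < 'aaa'.charCodeAt(1)
false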

You can explicitly convert strings to numbers:

> +'11' < '3'
false
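
Converting just one side is enough, because the relational operators switch to numeric comparison as soon as either operand is a number. Other explicit conversions such as parseInt work the same way:

> parseInt('11', 10) < '3'
false
> +'11' < +'3'
false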

By default, JavaScript compares two strings by each character's ordinal value, much like strcmp() does in C.
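
In this case, the relevant ordinals are the code units of the first characters, which you can inspect with charCodeAt:

'1'.charCodeAt(0) // 49
'3'.charCodeAt(0) // 51
// 49 < 51, so the comparison stops at the first character and '11' < '3' is true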

To make your comparison work, you can cast either side to a number to tell the interpreter you intend a numeric comparison:

Number('11') < '3' // false
+'11' < '3' // false, using unary + to coerce '11' to a number

'11' < Number('3') // false
'11' < +'3' // false

In many programming languages, strings are compared lexicographically, i.e. in alphabetical order.


It compares character by character, so the following will be false:

if ('41' < '3') alert('true');

That's because 4 is not less than 3. So essentially, your original comparison boils down to this:

if ('1' < '3') alert('true'); // true