Why is string "11" less than string "3"?

By default, JavaScript compares two strings by each character's ordinal value, much like how strcmp() works in C.
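
You can see the ordinal values involved here with charCodeAt:

'1'.charCodeAt(0) // 49
'3'.charCodeAt(0) // 51

Since 49 is less than 51, the comparison is decided by the very first character, and '11' < '3' is true.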

To make the comparison work as you intend, convert either side to a number so the interpreter knows you want a numeric comparison:

Number('11') < '3' // false
+'11' < '3' // false, using unary + to coerce '11' to a number

'11' < Number('3') // false
'11' < +'3' // false
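
parseInt is another option; it parses in a given radix and, unlike Number, stops at the first non-digit character (shown here only as a side note, since Number and unary + are the usual choices):

parseInt('11', 10) < parseInt('3', 10) // false
parseInt('11px', 10) // 11 (parsing stops at the first non-digit)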

Strings are compared lexicographically, i.e. character by character until a differing character is found or one string runs out of characters to compare. The first character of '11' ('1') is less than the first character of '3' ('3'), so '11' < '3' is true.

> '11' < '3'
true
> '31' < '3'
false
> '31' < '32'
true
> '31' < '30'
false
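
Roughly, the rule works like this (a simplified sketch, not the engine's actual code; lessThan is just an illustrative name, and it ignores details such as surrogate pairs):

function lessThan(a, b) {
  const len = Math.min(a.length, b.length);
  for (let i = 0; i < len; i++) {
    if (a[i] !== b[i]) {
      // decided by the first differing character's ordinal value
      return a.charCodeAt(i) < b.charCodeAt(i);
    }
  }
  // all compared characters are equal: a proper prefix is the lesser string
  return a.length < b.length;
}

lessThan('11', '3') // true
lessThan('31', '3') // false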

If we use letters, the same rule applies: since 'b' is not less than 'a', 'abc' is not less than 'aaa'; but since 'c' is less than 'd', 'abc' is less than 'abd'.

> 'abc' < 'aaa'
false
> 'abc' < 'abd'
true

You can explicitly convert strings to numbers:

> +'11' < '3'
false
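
If you need this in more than one place, a tiny helper makes the intent explicit (a sketch; numericLessThan is just an illustrative name):

const numericLessThan = (a, b) => Number(a) < Number(b);

numericLessThan('11', '3') // false
numericLessThan('3', '11') // true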