This is more of a language-design question than a programming question.
The following is an excerpt from JLS 15.19 Shift Operators:
If the promoted type of the left-hand operand is int, only the five lowest-order bits of the right-hand operand are used as the shift distance. If the promoted type of the left-hand operand is long, then only the six lowest-order bits of the right-hand operand are used as the shift distance.
This behavior is also specified in C#, and while I'm not sure whether it's in the official JavaScript spec (if there is one), it also holds there based on my own tests at least.
The consequence is that the following is true:
(1 << 32) == 1
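For example, here is a small Java check (just a sketch, with a class name I made up) showing the count being reduced to its low 5 bits for int and its low 6 bits for long:

public class ShiftMask {
    public static void main(String[] args) {
        // For int, only the low 5 bits of the count are used: 32 & 0x1F == 0
        System.out.println(1 << 32);   // prints 1, not 0
        System.out.println(1 << 33);   // same as 1 << 1, prints 2

        // For long, only the low 6 bits are used: 64 & 0x3F == 0
        System.out.println(1L << 64);  // prints 1, not 0
    }
}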
I understand that this specification is probably "inspired" by the fact that the underlying hardware only uses 5 bits of the count operand when shifting 32-bit values (and 6 bits for 64-bit), and I can understand such behavior being specified at the JVM level for example, but why would high-level languages such as C# and Java retain this rather low-level behavior? Shouldn't they provide a more abstract view beyond the hardware implementation and behave more intuitively? (Even better if they could take a negative count to mean a shift in the OTHER direction!)
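Just to make the "more intuitive" alternative concrete, here is a hypothetical Java helper (the name shiftLeft and the exact semantics are my own, not anything from the language or a library): out-of-range counts shift all bits out and give 0, and a negative count shifts the other way (using a logical right shift):

class IntuitiveShift {
    // Hypothetical helper illustrating the behavior the question wishes for;
    // this is NOT how Java's << actually works.
    static int shiftLeft(int value, int count) {
        if (count >= 32 || count <= -32) {
            return 0;                 // every bit is shifted out
        }
        if (count < 0) {
            return value >>> -count;  // negative count means shift right
        }
        return value << count;
    }

    public static void main(String[] args) {
        System.out.println(shiftLeft(1, 32));  // 0, unlike Java's (1 << 32) == 1
        System.out.println(shiftLeft(8, -2));  // 2, negative count shifts right
    }
}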