Given Java's "write once, run anywhere" paradigm, and the fact that the Java tutorials give explicit bit sizes for all the primitive data types without the slightest hint that these depend on the platform, I would say that, yes, an int is always 32 bits.
But are there any caveats? The language spec defines the value range, but says nothing about the internal representation, and I guess it probably shouldn't. However, I have some code that does bitwise operations on int variables and assumes a 32-bit width, and I was wondering whether that code is safe on all architectures.
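For illustration, here is a small sketch of the kind of 32-bit assumptions such bitwise code typically makes (the class name and values are my own, not from any particular codebase):

```java
public class IntWidthDemo {
    public static void main(String[] args) {
        // Integer.SIZE reports the bit width of int as defined by the platform's JVM.
        System.out.println(Integer.SIZE);

        // Two's-complement representation: -1 is all ones.
        int x = -1;
        System.out.println(Integer.toBinaryString(x)); // 32 ones

        // Unsigned right shift by 28 extracts the top 4 bits,
        // which only gives 15 here if int is exactly 32 bits wide.
        System.out.println(x >>> 28); // 15
    }
}
```

If int were wider or narrower than 32 bits, the shift distance 28 and the mask implied by `>>> 28` would extract different bits, which is exactly the kind of breakage the question is about.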
Are there good in-depth resources for this type of question?