When 64-bit processors came out, it wasn't too big a deal. Sure, C++ people had to deal with the fact that their 32-bit pointer math didn't work on 64-bit machines, but that's what you get for not using sizeof (or for not using Java!). In most cases, though, our languages already had 64-bit primitives, so all the compiler needed to do was emit the new 64-bit instructions.
What about the future? What happens if Intel decides it would be a great idea to come out with 128-bit processors? How will languages adapt?
I see a few outcomes:
1. We add new 128-bit primitives (and hence new keywords), possibly breaking existing code.
2. We silently widen our existing primitives to 64 and 128 bits, the way C++ implementations are allowed to. (This won't work for Java, where int is defined as exactly 32 bits; see the sketch after this list.)
3. We stay with 64-bit forever.
4. There is a paradigm shift in new languages, where primitive types no longer have fixed sizes.
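To see why #2 is off the table for Java: the language spec pins int at exactly 32 bits on every platform, and plenty of existing code depends on the exact two's-complement wrap-around at that width. A minimal sketch:

```java
public class IntIsExactly32Bits {
    public static void main(String[] args) {
        // The Java Language Specification defines int as exactly 32 bits
        // on every platform, unlike C++'s implementation-defined int.
        System.out.println(Integer.SIZE);      // 32
        System.out.println(Integer.MAX_VALUE); // 2147483647

        // Lots of existing code (hash functions, checksums, RNGs) relies
        // on the exact two's-complement wrap-around at 32 bits:
        int x = Integer.MAX_VALUE;
        System.out.println(x + 1);             // -2147483648
    }
}
```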
I am leaning towards #3, but hoping for #4.
An example of outcome #4: integers become real mathematical integers, with no bounds at all, and floating-point types let you ask for however many bits of precision you want. The compiler or runtime would then pick the right instructions for whatever hardware is available.
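Java's java.math.BigInteger is already exactly this kind of unbounded integer, just as a library type rather than a primitive. A small sketch of what outcome #4 would feel like:

```java
import java.math.BigInteger;

public class UnboundedIntegers {
    public static void main(String[] args) {
        // 2^200 is far beyond any 64-bit (or 128-bit) primitive,
        // but a true mathematical integer simply keeps growing.
        BigInteger big = BigInteger.valueOf(2).pow(200);
        System.out.println(big);

        // Ordinary arithmetic with no overflow anywhere; the library
        // uses however many machine words the value needs.
        System.out.println(big.multiply(big).add(BigInteger.ONE));
    }
}
```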
How would you store such an integer? Well, it would be variable length, like a String. How would you write fixed-length bytes out? You would probably need a byte primitive that ranges from 0 to 255, so that you can convert these boundless integers to fixed-size byte arrays.
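BigInteger also hints at the storage answer: it already serializes to a variable-length byte array and round-trips back from raw bytes, which is essentially the String-like representation described above. A sketch of that round trip:

```java
import java.math.BigInteger;
import java.util.Arrays;

public class VariableLengthStorage {
    public static void main(String[] args) {
        BigInteger n = BigInteger.valueOf(2).pow(100);

        // toByteArray() returns the minimal two's-complement,
        // big-endian encoding; its length grows with the value.
        byte[] bytes = n.toByteArray();
        System.out.println(bytes.length + " bytes: " + Arrays.toString(bytes));

        // Round trip: rebuild the unbounded integer from raw bytes.
        BigInteger back = new BigInteger(bytes);
        System.out.println(n.equals(back)); // true
    }
}
```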
What do you think is going to happen?