I'm going through a bunch of C++ interview questions to make sure there's nothing obvious I don't know. So far I haven't found anything I didn't already know, except this:
long value;
//some stuff
value &= 0xFFFF;
The question asks "what's wrong with this code?" and hints that it has something to do with target architectures.
Unless the answer is just "value isn't initialized", I can't see any problem. As far as I can tell, it's just masking the two least significant bytes of the value, and long is guaranteed to be at least 2 bytes, so there's no problem there.
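To check that I'm reading the masking right, here's a minimal sketch of what I understand that line to do (the sample value is my own for illustration, not part of the original snippet):

#include <cstdio>

int main() {
    long value = 0x12345678L;    // arbitrary sample value, not from the original code
    value &= 0xFFFF;             // keep only the low 16 bits (two bytes)
    std::printf("%lx\n", value); // prints 5678
    return 0;
}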
Could it possibly be that long might only be 2 bytes on the target architecture, and you might be losing the sign bit? Or perhaps that the 0xFFFF is an int, and int is only 2 bytes?
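For what it's worth, this is the sort of quick probe I'd compile on the target to see which of those guesses applies (entirely my own check, not part of the question):

#include <cstdio>

int main() {
    // sizes in bytes; all of these can vary with the target architecture
    std::printf("sizeof(long)   = %zu\n", sizeof(long));
    std::printf("sizeof(int)    = %zu\n", sizeof(int));
    std::printf("sizeof(0xFFFF) = %zu\n", sizeof(0xFFFF)); // the literal takes the smallest type that can hold it
    return 0;
}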
Thanks in advance.