hi,
often in code that uses permissions checking, i see some folks use hex 0x0001 and others use 0x00000001. these both look equivalent to decimal 1, if i'm not mistaken.
why use one over the other, or is it just a matter of preference?
What's going on here is that this is a bitmask, and the convention is to pad the literal with leading zeros out to the full width of the mask. I would furthermore guess that the width of the mask was widened at some point to make room for more specialized permissions.
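A minimal sketch in C of what such padded flag definitions tend to look like (the names and bit assignments here are hypothetical, not taken from your code):

    #include <stdio.h>

    /* Hypothetical permission flags, padded with leading zeros to the
     * full 32-bit width of the mask so each bit position lines up visually. */
    #define PERM_READ    0x00000001
    #define PERM_WRITE   0x00000002
    #define PERM_EXECUTE 0x00000004
    #define PERM_ADMIN   0x00000080

    int main(void)
    {
        unsigned int perms = PERM_READ | PERM_WRITE;

        /* Test a single bit with bitwise AND. */
        if (perms & PERM_WRITE) {
            printf("write permission granted\n");
        }
        return 0;
    }

Padding every constant to the same width makes it easy to scan a column of flags and see which bit each one occupies.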
Assuming this is C, C++, Java, C#, or something similar, they are the same. Writing 0x0001 suggests a 16-bit field to the reader and 0x00000001 suggests a 32-bit one, but the actual type and value of the literal are determined by the language's rules at compile time, not by how many digits you write. It's a question of coding style and makes no difference in the compiled code.
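A quick C sketch (assuming a typical platform where int is 32 bits) showing that the two spellings are the same constant:

    #include <stdio.h>

    int main(void)
    {
        /* Both literals have the value 1; the leading zeros change neither
         * the value nor the type the compiler assigns to the constant. */
        unsigned int a = 0x0001;
        unsigned int b = 0x00000001;

        printf("a == b: %d\n", a == b);                         /* prints 1 */
        printf("sizeof 0x0001: %zu\n", sizeof 0x0001);          /* sizeof(int), typically 4 */
        printf("sizeof 0x00000001: %zu\n", sizeof 0x00000001);  /* sizeof(int), typically 4 */
        return 0;
    }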