This has to do with a question I read yesterday: http://stackoverflow.com/questions/2274428/how-to-determine-how-many-bytes-an-integer-needs
Anyway, the part that I have a question about is this:
I'm looking for the most efficient way to calculate the minimum number of bytes needed to store an integer without losing precision.
e.g.
int: 10 = 1 byte
int: 257 = 2 bytes
My question is: why does 10 require 1 byte, and why does 257 require 2? From what I understand, you can represent 10 as 1010, which is 4 bits, and 257 as 100000001, which is 9 bits. Does it have to do with word size? Is it that you can't have just 4 bits, so you need the whole byte, and you can't have just 9 bits, so you need two whole bytes?
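(For concreteness, here's my understanding of the calculation in a short Python sketch. The function name `min_bytes` is mine, not from the linked question; the idea is just that memory is addressed in whole bytes, so you count the bits and round up to the next multiple of 8.)

```python
import math

def min_bytes(n: int) -> int:
    """Smallest number of whole bytes that can store non-negative n.

    This is a hypothetical helper illustrating the rounding-up idea,
    not code from the linked question.
    """
    if n == 0:
        return 1  # zero still occupies one byte of storage
    bits = n.bit_length()       # 10 -> 4 bits, 257 -> 9 bits
    return math.ceil(bits / 8)  # can't store a fraction of a byte

print(min_bytes(10))   # 4 bits rounds up to 1 byte
print(min_bytes(257))  # 9 bits rounds up to 2 bytes
```

So 10 fits in 4 bits, but since storage is byte-granular, 4 bits still costs 1 byte; 257 needs 9 bits, which is more than one byte, so it costs 2.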