Hello, I am having a hard time understanding this and hopefully someone can correct me. A BYTE is defined as 0 to 2^7, which would be 128, which is 8 bits, correct? But that can't be right, because I am storing a value of 255 in a BYTE. Any kick in the right direction would be helpful.
A:
An unsigned byte holds 2^8 = 256 distinct values, i.e. 0 to 255. If you have to store a sign, then you need to sacrifice one bit for it, which leaves a range of -2^7 to 2^7 - 1, i.e. -128 to +127.
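As a minimal C sketch of the ranges (assuming BYTE is a typedef for unsigned char, as in the Windows headers; that typedef is an assumption here, not something from the question):

    #include <stdio.h>
    #include <limits.h>

    typedef unsigned char BYTE;   /* assumption: BYTE is an unsigned 8-bit type */

    int main(void)
    {
        BYTE b = 255;             /* fits: unsigned range is 0..255 (2^8 values)   */
        signed char s = -128;     /* signed range is -128..127 (-2^7 .. 2^7 - 1)   */

        printf("unsigned byte range: 0..%u\n", (unsigned)UCHAR_MAX); /* 0..255     */
        printf("signed byte range: %d..%d\n", SCHAR_MIN, SCHAR_MAX); /* -128..127  */
        printf("b = %u, s = %d\n", (unsigned)b, (int)s);
        return 0;
    }

So storing 255 in a BYTE works precisely because it is unsigned: all 8 bits go to the magnitude, and only the signed variant gives up a bit for the sign.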
vulkanino
2010-09-14 14:51:32
+1: I may say "byte" like an American, but I've accidentally come to think of them as "octets". I think this mental and written notation clarifies thinking about them; this is not my own invention: http://en.wikipedia.org/wiki/Octet_(computing)
msw
2010-09-14 14:58:59
yeah, but please don't say Mega-Octet and Giga-Octet!
vulkanino
2010-09-14 15:13:49