views: 100
answers: 2

In programming, when we say "the 7th least significant bit", is there a standard for whether that means bit 7 or bit 6 (if we start counting from bit 0)?

Because if we say "the 2nd least significant bit", that sounds like bit 1 (again counting from bit 0), so if 2nd means bit 1, then 7th should mean bit 6, not bit 7.
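For concreteness, under that reading a helper like this would test the Nth least significant bit (the name `nth_lsb` is invented for illustration):

```c
#include <stdio.h>

/* Illustrative helper (name invented): test the Nth least
 * significant bit, counting ordinals from 1, so the "2nd LSB"
 * is bit index 1 and the "7th LSB" is bit index 6. */
static int nth_lsb(unsigned int value, int n)
{
    return (int)((value >> (n - 1)) & 1u);
}

int main(void)
{
    unsigned int x = 0x40;         /* binary 0100 0000: only bit 6 set */
    printf("%d\n", nth_lsb(x, 7)); /* prints 1: the 7th LSB is bit 6  */
    return 0;
}
```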

+2  A: 

The convention I've always used is that bits are numbered 0 through n-1 for an n-bit number, with bit 0 the lowest-order bit; the "1st bit" is bit 0, the "2nd bit" is bit 1, and so on.
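A quick illustration of that convention (the value here is arbitrary):

```c
#include <stdio.h>

int main(void)
{
    /* Bit i of an n-bit number carries the weight 2^i, so
     * numbering from 0 keeps the index and the exponent in step. */
    unsigned char byte = 0xA5;     /* binary 1010 0101, arbitrary value */
    for (int i = 0; i < 8; i++)
        printf("bit %d = %d (weight %u)\n", i, (byte >> i) & 1, 1u << i);
    return 0;
}
```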

GregS
That way, when the bits' value is interpreted as an integer, a `1` in bit *n* corresponds to a value of 2^n.
mobrule
This is becoming increasingly the norm, but you do still see examples of bit numbering in the opposite direction, i.e. bit 0 = MSB. There may be some loose correlation with endianness.
Paul R
@Paul R: Yes, I have seen this in processor manuals and it never fails to confuse me.
GregS
+4  A: 

A standard? Like an ISO standard? No, although quite a few standards do start counting bits at b0. But in English terms, the second least significant bit is one removed from the (first) least significant bit, so that would be b1.

So the seventh would be b6. In an octet, the most significant bit, b7, would be the eighth least significant bit.
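A quick sketch in C, just to make the octet case concrete:

```c
#include <stdio.h>

int main(void)
{
    /* An octet with only its most significant bit set: that bit
     * is b7 in 0-based terms, i.e. the eighth least significant bit. */
    unsigned char msb_only = 0x80; /* binary 1000 0000 */
    printf("b7 = %d\n", (msb_only >> 7) & 1); /* prints b7 = 1 */
    return 0;
}
```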

For what it's worth, I don't think I've ever heard the phrase "the 7th least significant bit" in my entire 30-odd-year working life. It's always been bN (where N ranges from 0 to the number of bits minus one), or just the least or most significant bit (not even "second most significant").

paxdiablo