+4  A: 

You can regard a string of bits as a set, with a 1 in a given position representing membership of the corresponding element. The bit count therefore gives you the cardinality (size) of the set.
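
A minimal sketch of the idea in C, assuming GCC/Clang's __builtin_popcount as the bit counter (any popcount routine would do):

#include <stdio.h>

int main(void)
{
    unsigned set = 0;

    set |= 1u << 3;   /* add element 3 to the set */
    set |= 1u << 5;   /* add element 5 */
    set |= 1u << 3;   /* adding an element twice changes nothing */

    /* the population count is the cardinality of the set */
    printf("size = %d\n", __builtin_popcount(set));   /* prints size = 2 */
    return 0;
}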

Practical applications include compression, cryptography and error-correcting codes. See e.g. wikipedia.org/wiki/Hamming_weight and wikipedia.org/wiki/Hamming_distance.
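
As a concrete illustration of the Hamming-distance connection: the distance between two words is just the population count of their XOR. A sketch in C (again assuming __builtin_popcount):

#include <stdio.h>

/* Hamming distance = number of bit positions in which a and b differ */
static int hamming_distance(unsigned a, unsigned b)
{
    return __builtin_popcount(a ^ b);
}

int main(void)
{
    printf("%d\n", hamming_distance(0x2Cu, 0x2Bu));   /* 00101100 vs 00101011 -> 3 */
    return 0;
}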

Paul R
A: 

If you're rolling your own parity scheme, you might want to count the number of 1 bits. (In general, of course, I'd rather use somebody else's.) If you're emulating an old computer and want to keep track of how fast it would have run on the original, some machines had multiplication instructions whose speed varied with the number of 1 bits.
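
For instance, the parity bit for a byte is just the low bit of its population count; a small sketch in C (the use of __builtin_popcount is my assumption, not part of the answer):

#include <stdio.h>

/* Even-parity bit: 1 when the byte has an odd number of 1 bits,
   so that the byte plus its parity bit always carries an even count. */
static unsigned parity_bit(unsigned char byte)
{
    return __builtin_popcount(byte) & 1u;
}

int main(void)
{
    printf("%u\n", parity_bit(0x2C));   /* 00101100 has 3 one bits -> parity bit 1 */
    return 0;
}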

I can't think of any time I've wanted to do it over the past ten years or so, so I suspect this is more of a programming exercise than a practical need.

David Thornley
You can calculate parity directly in fewer operations than a full population count takes (unless your CPU has `POPCNT` or similar).
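
For example, the usual XOR-folding trick gets the parity of a 32-bit word in a handful of shifts and XORs without ever forming the full count; a sketch (the 32-bit width is an assumption):

#include <stdint.h>
#include <stdio.h>

/* Each step folds the upper half of the remaining bits onto the lower
   half with XOR, so the low bit ends up as the XOR of all 32 bits,
   i.e. the parity. */
static uint32_t parity32(uint32_t x)
{
    x ^= x >> 16;
    x ^= x >> 8;
    x ^= x >> 4;
    x ^= x >> 2;
    x ^= x >> 1;
    return x & 1u;
}

int main(void)
{
    printf("%u\n", (unsigned)parity32(0x2C));   /* 3 one bits -> parity 1 */
    return 0;
}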
Paul R
A: 

Somewhat ironically, it's useful as an interview question precisely because it requires some detailed low-level thinking and doesn't seem to be taught as a standard algorithm in computer science courses.

Paul Nathan
A: 

Some people like to use bitmaps to indicate presence/absence of "stuff".

There's a simple hack that isolates the least-significant 1 bit in a word by converting it, together with all the bits below it, into a field of ones; you can then find the bit number by counting the 1-bits.

countbits(x XOR (x - 1)) - 1

Watch it work.

Let x =     00101100
Then x-1 =  00101011
x XOR x-1 = 00000111

Which has 3 bits set, so bit 2 was the least-significant 1-bit in the original word.
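
A runnable version of the trick, as a sketch in C with __builtin_popcount standing in for countbits (that substitution is an assumption):

#include <stdio.h>

/* Index (counting from 0) of the least-significant 1 bit of x.
   x XOR (x - 1) sets that bit and every bit below it, so its
   population count is the bit index plus one.  Not meaningful
   for x == 0, where there is no set bit. */
static int lowest_set_bit(unsigned x)
{
    return __builtin_popcount(x ^ (x - 1)) - 1;
}

int main(void)
{
    printf("%d\n", lowest_set_bit(0x2Cu));   /* 00101100 -> bit 2 */
    return 0;
}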

John R. Strohm