Am I missing something painfully obvious? Or does just nobody in the world actually use java.util.BitSet?
The following test fails:
import static org.junit.Assert.assertEquals; // assuming JUnit 4

import java.util.BitSet;
import org.junit.Test;

@Test
public void testBitSet() throws Exception {
    BitSet b = new BitSet();
    b.set(0, true);
    b.set(1, false);
    assertEquals(2, b.length()); // fails: b.length() is actually 1
}
It's really unclear to me why I don't end up with a BitSet of length 2 holding the binary value 10. I peeked at the source for java.util.BitSet, and on casual inspection it doesn't seem to distinguish between a bit that has been explicitly set to false and a bit that has never been set to any value at all.
(Note that explicitly setting the size of the BitSet in the constructor has no effect, e.g.:
BitSet b = new BitSet(2);
still gives the same result.)
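For reference, here is a minimal standalone sketch of what I'm observing (plain java.util.BitSet, no JUnit; the class name BitSetLengthDemo is just for illustration). The comments show the values I get, which suggest that length() only tracks the highest true bit while size() just reports allocated capacity:

import java.util.BitSet;

public class BitSetLengthDemo {
    public static void main(String[] args) {
        BitSet b = new BitSet();
        b.set(0, true);                       // records a true bit at index 0
        b.set(1, false);                      // setting a bit to false leaves no trace

        System.out.println(b.length());       // prints 1: index of highest true bit + 1
        System.out.println(b.size());         // prints 64: allocated capacity in bits
        System.out.println(b.cardinality());  // prints 1: number of true bits

        BitSet c = new BitSet(2);             // constructor argument is only an initial capacity
        System.out.println(c.length());       // prints 0: no true bits yet
        System.out.println(c.size());         // prints 64: capacity rounds up to one 64-bit word
    }
}

In other words, length() appears to be tied to the highest bit that is currently true rather than to anything I've explicitly touched, which would explain the result but isn't what I expected.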