I need to represent an array of integers using BitSet. Can somebody explain the logic required to do this?
First thought:
use BigInteger, e.g. BigInteger.valueOf(value), and call toString(2) on it to get the binary representation; then create a BitSet from that String (don't know how to do it without analyzing the string, however).
--
I didn't read it right. That method only helps you create an array of BitSets, not a single BitSet that contains the whole array.
I don't know how to turn an array of integers into one BitSet. I guess you will need some kind of delimiters, but how to make a good delimiter in binary - that's a good question.
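In case it is useful, here is a minimal sketch of the string-analyzing step for a single value, assuming non-negative ints; the class name BinaryStringToBitSet and the chosen bit order (bit 0 = least significant) are just my illustration, not a standard recipe.

import java.math.BigInteger;
import java.util.BitSet;

class BinaryStringToBitSet {
    // Sketch: build a BitSet for one non-negative int by scanning its binary string.
    static BitSet fromInt(int value) {
        String binary = BigInteger.valueOf(value).toString(2); // e.g. 42 -> "101010"
        BitSet bits = new BitSet(binary.length());
        for (int i = 0; i < binary.length(); i++) {
            // The leftmost character is the most significant bit.
            if (binary.charAt(i) == '1') {
                bits.set(binary.length() - 1 - i);
            }
        }
        return bits;
    }

    public static void main(String[] args) {
        System.out.println(fromInt(42)); // prints {1, 3, 5}
    }
}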
I think the logic would be: run through the integer array, test every bit of each int, and set the corresponding bit in the BitSet, e.g. bitset.set(array_pos * 32 + bit_pos). A sketch of that is below.
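A minimal sketch of that packing idea, assuming 32 bits per int and that bit 0 of each value maps to the lowest index of its 32-bit slot; the names packIntsIntoBitSet and unpackBitSet are made up for illustration, not anything from the standard library.

import java.util.Arrays;
import java.util.BitSet;

public class IntArrayBitSetDemo {

    // Pack each 32-bit int of the array into one BitSet:
    // bit i of values[pos] ends up at BitSet index pos * 32 + i.
    static BitSet packIntsIntoBitSet(int[] values) {
        BitSet bits = new BitSet(values.length * Integer.SIZE);
        for (int pos = 0; pos < values.length; pos++) {
            for (int bit = 0; bit < Integer.SIZE; bit++) {
                if ((values[pos] & (1 << bit)) != 0) {
                    bits.set(pos * Integer.SIZE + bit);
                }
            }
        }
        return bits;
    }

    // Reverse direction: rebuild the int array from the packed BitSet.
    static int[] unpackBitSet(BitSet bits, int count) {
        int[] values = new int[count];
        for (int pos = 0; pos < count; pos++) {
            for (int bit = 0; bit < Integer.SIZE; bit++) {
                if (bits.get(pos * Integer.SIZE + bit)) {
                    values[pos] |= (1 << bit);
                }
            }
        }
        return values;
    }

    public static void main(String[] args) {
        int[] input = { 5, -1, 1024 };
        BitSet packed = packIntsIntoBitSet(input);
        int[] roundTrip = unpackBitSet(packed, input.length);
        System.out.println(Arrays.toString(roundTrip)); // [5, -1, 1024]
    }
}

Because every value gets a fixed 32-bit slot, no delimiter is needed, and order and duplicates survive the round trip (at the cost of 32 bits per element regardless of magnitude).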
You can represent a set of integers using BitSet, but not an arbitrary array: you will lose information about order and repetitions. Basically, set the nth bit of the BitSet if and only if n appears in your set of integers.
BitSet bitSet = new BitSet();
int[] setOfInts = new int[] { /* Your array here */ };
for (int n : setOfInts) {
    bitSet.set(n);
}
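Note that this only works for non-negative values, because BitSet.set(n) throws IndexOutOfBoundsException for a negative index. After the loop you can test membership and read the distinct values back in ascending order (Java 8+), continuing from the snippet above:

boolean hasSeven = bitSet.get(7);                 // was 7 in the array?
int[] distinctSorted = bitSet.stream().toArray(); // distinct values, ascending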