I don't quite understand the different meanings of size() and length() in BitSet. Check the code below:
Java
import java.util.BitSet;

public class Sandbox {
    public static void main(String[] argv) {
        BitSet bitSet1 = new BitSet(16);
        bitSet1.set(0);
        bitSet1.set(8);
        bitSet1.set(15);
        displayBitSet(bitSet1);
    }

    static void displayBitSet(BitSet bitSet) {
        for (int i = 0; i < bitSet.size(); i++) {
            boolean bit = bitSet.get(i);
            System.out.print(bit ? 1 : 0);
        }
        System.out.println(" " + bitSet.size() + " " + bitSet.length());
    }
}
Output is:
1000000010000001000000000000000000000000000000000000000000000000 64 16
I thought I would get something like
1000000010000001 16 16
Where do these trailing zeroes come from? Can someone explain this to me? Thanks!
Answer
The answer is quite simple: the BitSet(int nbits) constructor only guarantees a bit set large enough to hold the requested number of bits; internally it rounds the capacity up to a whole number of 64-bit long words. In your case that is 64 bits, which is what size() reports. length(), on the other hand, returns the index of the highest set bit plus one, which is 16 here. See the JavaDoc.
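To make the distinction concrete, here is a small sketch (class name BitSetDemo is mine) that prints size(), length(), and for comparison cardinality(), and then iterates only up to length() so the trailing zeroes disappear:

```java
import java.util.BitSet;

public class BitSetDemo {
    public static void main(String[] args) {
        BitSet bits = new BitSet(16); // requests room for 16 bits
        bits.set(0);
        bits.set(8);
        bits.set(15);

        // size(): allocated capacity, rounded up to whole 64-bit words
        System.out.println("size = " + bits.size());               // 64
        // length(): index of the highest set bit, plus one
        System.out.println("length = " + bits.length());           // 16
        // cardinality(): how many bits are actually set to true
        System.out.println("cardinality = " + bits.cardinality()); // 3

        // Looping to length() instead of size() avoids the trailing zeroes
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < bits.length(); i++) {
            sb.append(bits.get(i) ? '1' : '0');
        }
        System.out.println(sb); // 1000000010000001
    }
}
```

So if you want to display only the meaningful bits, bound your loop with length() (or cardinality() if you only care how many are set), not size().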