Say there's an array of 1024 bits that are all zeros:
example: [0,0,0,0,0,0,0,...]
Then I overwrite 20 zeros with ones at completely random positions:
example: [0,1,0,0,0,0,0,...]
What is the theoretical minimum number of bits needed to encode the locations of these 20 randomly placed ones, assuming a perfect encoder?
I know there are communication theory equations that will tell me this, but I want to double check my calculations.
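For reference, this is the calculation I'm trying to double-check. My assumption is that the minimum is the log2 of the number of possible patterns, i.e. log2(C(1024, 20)), since all position sets are equally likely. A quick Python sketch of that arithmetic:

```python
from math import comb, log2, ceil

n, k = 1024, 20

# Number of distinct patterns: ways to choose k set-bit positions out of n
patterns = comb(n, k)

# Information content of one pattern, assuming all patterns are equally likely
min_bits = log2(patterns)

print(min_bits)        # ~138.7 bits
print(ceil(min_bits))  # 139 bits if a whole-bit, fixed-length code word is required
```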
Harder bonus question: Show me the code for an algorithm that implements an encoding that approaches this minimum limit.
Bonus bonus: What if the flips were byte-level instead of bit-level, i.e. entire bytes flipped? Same result?