I have been tossing around a conceptual idea for a machine (as in a Turing machine) and I'm wondering if any work has been done on this or related topics.
The idea is a machine that takes an entropy stream and gives out random symbols in any range without losing any entropy.
I'll grant that is a far from rigorous description, so I'll give an example: say I have a generator of random symbols in the range 1 to n, and I want to be able to ask for a symbol in any given range, first 1 to 12 and then 1 to 1234. (To keep it practical, I'll only consider deterministic machines where, given the same input stream and requests, it will always give the same output.) One necessary constraint is that the output contains at least as much entropy as the input. However, the constraint I'm most interested in is that the machine only reads in as much entropy as it spits out.
E.g. if asked for tokens in the ranges 1 to S1, 1 to S2, ..., 1 to Sm, it would only consume ceiling(sum(i = 1 to m, log(Si)) / log(n)) input tokens.
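As a sanity check on that bound, here is a small sketch in Python (the six-sided source and the two request sizes are just made-up numbers matching the example above, not part of any particular construction):

```python
import math

def min_input_tokens(requests, n):
    """Lower bound on the number of base-n input symbols needed to
    serve one uniform draw over 1..S for each S in `requests`:
    ceil(sum(log S_i) / log n)."""
    total_bits = sum(math.log2(s) for s in requests)
    return math.ceil(total_bits / math.log2(n))

# Source emits symbols in 1..6 (a die); we request 1..12, then 1..1234.
# Entropy needed: log2(12) + log2(1234) ≈ 13.85 bits; each die roll
# carries log2(6) ≈ 2.58 bits, so the bound is ceil(5.36) = 6 rolls.
print(min_input_tokens([12, 1234], 6))  # → 6
```

An ideal machine in the sense above would hit this count exactly (up to the ceiling), no matter how the requests are interleaved.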
This question asks how to do this conversion while satisfying the first constraint, but does very badly on the second.
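To make "does very badly" concrete, the usual rejection-sampling answer wastes entropy on every retry. A quick back-of-the-envelope calculation (my own illustration, assuming a fair-bit source and the simplest draw-k-bits-and-reject scheme):

```python
import math

def expected_bits_rejection(S):
    """Expected fair-coin flips for the naive rejection sampler:
    draw k = ceil(log2 S) bits, retry whenever the value is >= S.
    Each attempt costs k bits and succeeds with probability S / 2^k,
    so the expected total is k * 2^k / S."""
    k = math.ceil(math.log2(S))
    return k * (2 ** k) / S

# For a draw from 1..12: 4 * 16/12 ≈ 5.33 expected bits consumed,
# versus an entropy content of only log2(12) ≈ 3.58 bits.
print(expected_bits_rejection(12))
print(math.log2(12))
```

The gap (here roughly 1.75 bits per draw) is the entropy the hypothetical machine would be required not to lose.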