I'm compiling a lookup table with 133,784,560 entries, each holding a value in the range 0–7,462. The maximum value, 7,462, fits in 13 bits, which gives a packed table of roughly 207 MB. Widening each entry to 16 bits grows the table by about 50 MB.
The extra size is not significant by today's standards, but it would be nice to keep the table as lean as possible.
Once the LUT is loaded into memory, how much overhead is there in reading a 13-bit packed value compared to reading a 16-bit one? I assume some intermediate bitwise operations are needed to extract the 13-bit field into a type the CPU can work with directly, or am I wrong?
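To make the comparison concrete, here is a minimal sketch of what the two reads might look like. The function names (`get13`, `get16`) and the LSB-first, little-endian packing scheme are my own assumptions for illustration, not an existing API; the 13-bit read needs a byte-offset computation, an unaligned load, a shift, and a mask, while the 16-bit read is a single aligned load:

```c
#include <stdint.h>
#include <string.h>

/* Hypothetical packed table: entry i occupies 13 bits starting at
 * absolute bit offset 13*i, packed LSB-first into a little-endian
 * byte buffer. A 13-bit field spans at most 3 bytes, so a 4-byte
 * load always covers it (the buffer needs a few bytes of padding
 * at the end to keep that load in bounds). */
static uint16_t get13(const uint8_t *table, uint64_t i) {
    uint64_t bit  = i * 13;          /* absolute bit offset of entry i */
    uint64_t byte = bit >> 3;        /* byte containing the first bit */
    uint32_t word;
    memcpy(&word, table + byte, 4);  /* unaligned 4-byte load */
    return (uint16_t)((word >> (bit & 7)) & 0x1FFF); /* shift + mask */
}

/* 16-bit version: one aligned load, no shifting or masking. */
static uint16_t get16(const uint16_t *table, uint64_t i) {
    return table[i];
}
```

So yes, the packed layout costs a multiply, a shift, a mask, and (often) an unaligned load per lookup. Whether that matters less than the extra ~50 MB of cache and memory-bandwidth pressure from the 16-bit table is exactly the trade-off in question, and it is hard to call without benchmarking both on the target machine.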
Every clock cycle counts: this LUT will drive a brute-force analysis program running billions of lookups. Should I just stick with the slightly larger table?