A lot of database schemas seem to follow this convention:
(2^n)-1 for large fields:
varchar(511)
varchar(255)
varchar(127)
...then (2^n) for smaller ones:
varchar(64)
varchar(32)
varchar(16)
varchar(8)
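A concrete sketch of what I mean (the table and column names here are hypothetical, just to show the pattern in actual DDL):

```sql
-- Hypothetical table following the convention above.
CREATE TABLE users (
    bio        VARCHAR(511),  -- (2^9)-1
    email      VARCHAR(255),  -- (2^8)-1
    full_name  VARCHAR(127),  -- (2^7)-1
    city       VARCHAR(64),   -- 2^6
    username   VARCHAR(32),   -- 2^5
    country    VARCHAR(16),   -- 2^4
    initials   VARCHAR(8)     -- 2^3
);
```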
I understand why the (2^n)-1 sizes are used: presumably so that the declared length plus a one-byte length prefix adds up to a power of two (e.g. varchar(255) plus its length byte occupies exactly 256 bytes). What I don't understand is why it isn't considered necessary to continue the trend down to the small fields, e.g.
varchar(63)
varchar(31)
varchar(15)
varchar(7)
Is there a reason for this, or is it just that the returns have diminished too far at those sizes?