Is there a guideline for estimating the amount of memory consumed by a BigDecimal?

Looking for something similar to these guidelines for estimating String memory usage.
If you dig into the internals of BigDecimal, you'll see that it uses a compact representation if the significand is <= Long.MAX_VALUE. Hence, the memory usage can vary depending on the actual values you're representing.
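As a rough way to see which case a given value falls into, you can check whether the unscaled value fits in a long via its bit length. This is only a sketch of the heuristic, not the exact internal test BigDecimal performs:

    import java.math.BigDecimal;
    import java.math.BigInteger;

    public class CompactCheck {
        // Heuristic (an approximation of the internal check): if the unscaled
        // value fits in a signed 64-bit long, BigDecimal can keep it in its
        // compact long field instead of allocating a BigInteger.
        static boolean likelyCompact(BigDecimal bd) {
            return bd.unscaledValue().bitLength() <= 63;
        }

        public static void main(String[] args) {
            System.out.println(likelyCompact(BigDecimal.valueOf(123456789L)));   // true
            System.out.println(likelyCompact(
                new BigDecimal(new BigInteger("12345678901234567890123456789")))); // false
        }
    }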
If you look at the fields in the source for BigDecimal, there are:
BigDecimal:
long intCompact +8 bytes
int precision +4 bytes
int scale +4 bytes
String stringCache +?
BigInteger intVal +?
BigInteger:
int bitCount +4 bytes
int bitLength +4 bytes
int firstNonzeroIntNum +4 bytes
int lowestSetBit +4 bytes
int signum +4 bytes
int[] mag +?
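If you want to verify the actual layout (including object headers and padding) on your own JVM rather than counting fields by hand, a tool such as OpenJDK's JOL can print it. A minimal sketch, assuming the jol-core library is on the classpath:

    import java.math.BigDecimal;
    import org.openjdk.jol.info.ClassLayout;
    import org.openjdk.jol.info.GraphLayout;

    public class BigDecimalLayout {
        public static void main(String[] args) {
            // Field-by-field layout of BigDecimal itself (header, offsets, padding)
            System.out.println(ClassLayout.parseClass(BigDecimal.class).toPrintable());

            // Total retained size of a concrete instance, including the
            // BigInteger and int[] it references (if inflated)
            BigDecimal big = new BigDecimal("12345678901234567890123456789.12345");
            System.out.println(GraphLayout.parseInstance(big).totalSize() + " bytes");
        }
    }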
The comment for stringCache is "Used to store the canonical string representation, if computed", so assuming you don't call toString we can count it as zero bytes. That gives (8+4+4) = 16 bytes plus a BigInteger for BigDecimal, and (4+4+4+4+4) = 20 bytes plus mag for BigInteger, so 36 bytes in total plus the magnitude. As far as I can tell, the magnitude is stored with the minimum number of bits needed to represent the full integer, so a number n needs about log2(n) bits, which rounds up to whole bytes (and, internally, to whole ints in the mag array). So in general you should be using about:
36 + Ceiling(log2(n)/8.0) bytes
(Note that this doesn't include any of the object header overhead that your linked String example accounts for, but it should give you a good general idea.)
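As a quick translation of that estimate into code (the estimateBytes helper and the 36-byte constant are just this answer's back-of-the-envelope numbers, not anything the JDK guarantees):

    import java.math.BigDecimal;
    import java.math.BigInteger;

    public class BigDecimalSizeEstimate {
        // Rough estimate from the field counts above: 16 bytes of BigDecimal
        // fields + 20 bytes of BigInteger fields + the magnitude in bytes.
        // Ignores object headers, references, padding and the array header.
        static long estimateBytes(BigDecimal bd) {
            BigInteger unscaled = bd.unscaledValue().abs();
            long magnitudeBytes = (unscaled.bitLength() + 7) / 8; // Ceiling(log2(n)/8)
            return 36 + magnitudeBytes;
        }

        public static void main(String[] args) {
            System.out.println(estimateBytes(new BigDecimal("3.14159")));                    // small n -> ~39
            System.out.println(estimateBytes(new BigDecimal("12345678901234567890.12345"))); // larger n -> ~47
        }
    }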