What type of variable that can contain 1,000,000,000 (a decimal number) takes the most memory space?
- int in C
- string in C
- string in Java (which uses Unicode)
I honestly don't really want to answer this directly, so you can have a look over here:
Also, it might be useful to know how to convert between binary and decimal:
A Java String. Under the hood, a Java String consists of an object with 3 fields, one of which points to a separate array object containing the characters. Plus, of course, Java Strings are composed of 16-bit characters.
If you are worried about memory usage over all other criteria, don't use Java. But for most applications, memory usage is the least of your concerns.
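To put rough numbers on the other two options from the question, here is a minimal C sketch (assuming a typical platform with an 8-bit `char` and a 32-bit `int`); the Java String, with its object header, three fields, and separate 16-bit `char` array, is larger than either:

```c
#include <stdio.h>

int main(void)
{
    /* Option 1: a C int holding the value directly.
       Typically 4 bytes on modern platforms (but see below). */
    int as_int = 1000000000;
    printf("int:    %zu bytes\n", sizeof as_int);

    /* Option 2: a C string holding the decimal digits.
       10 digit characters plus the terminating '\0' = 11 bytes
       with 8-bit chars. */
    char as_string[] = "1000000000";
    printf("char[]: %zu bytes\n", sizeof as_string);

    return 0;
}
```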
It is worth noting that 1,000,000,000 can be represented using a Java `int`, which will be the same size as a C signed or unsigned (32-bit) integer. Furthermore, a C `int` is not necessarily big enough to represent 1,000,000,000. On some platforms, `int` is 16 bits, and this is allowed by the C standard.
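As a hedged illustration, portable C code can check this at compile time using `INT_MAX` from `<limits.h>` (the standard only guarantees `INT_MAX >= 32767`):

```c
#include <limits.h>
#include <stdio.h>

int main(void)
{
    /* A 16-bit int is legal per the C standard and cannot hold
       1,000,000,000, so check before relying on it. */
#if INT_MAX >= 1000000000
    int big = 1000000000;
    printf("int is big enough: %d (INT_MAX = %d)\n", big, INT_MAX);
#else
    long big = 1000000000L;   /* long is guaranteed to be at least 32 bits */
    printf("int is too small; using long: %ld\n", big);
#endif
    return 0;
}
```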
The C standard doesn't state many storage requirements. As it is, you could have:

- `int`s that take 32 bytes to store anything (see @nonnb's comment)
- wide strings (`wchar_t[]`) that use UCS-4/UTF-32 characters (as all GNU implementations do, apparently)
- 32-bit `char`s (which would have to be on a system with 32-bit bytes)
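If you want to see what a particular implementation actually chose, a small sketch like the following prints the relevant implementation-defined sizes (output varies; on a typical GNU/Linux build you would see 8-bit bytes, a 4-byte `int`, and a 4-byte `wchar_t`):

```c
#include <limits.h>
#include <stdio.h>
#include <wchar.h>

int main(void)
{
    /* All of these are implementation-defined; the standard only
       constrains them (e.g. CHAR_BIT >= 8, INT_MAX >= 32767). */
    printf("CHAR_BIT        = %d bits per byte\n", CHAR_BIT);
    printf("sizeof(int)     = %zu bytes\n", sizeof(int));
    printf("sizeof(wchar_t) = %zu bytes\n", sizeof(wchar_t));
    return 0;
}
```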