Why am I finding that the sizes of int and long int are both shown as 4 bytes? Does that mean int is by default a long int?
The size of int is neither guaranteed to be 4 nor to be equal to the size of long int. In other words: it's completely implementation-defined.
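A quick way to see what your implementation actually chose is to print the sizes yourself. A minimal sketch (the numbers it prints will of course vary by platform):

    #include <stdio.h>

    int main(void)
    {
        /* %zu is the printf conversion for size_t (C99 and later) */
        printf("sizeof(int)      = %zu\n", sizeof(int));
        printf("sizeof(long int) = %zu\n", sizeof(long int));
        return 0;
    }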
No, int and long are not necessarily the same size, even though that happens to be the case with your compiler. The C standard defines a minimum range for each of these data types, but it is up to the implementation what the actual size is. For example, some systems have 2-byte ints and 4-byte longs, while others may have 4-byte ints and 8-byte longs.
The only guarantee made by the standard is that sizeof(long) >= sizeof(int).
In the old days of 16-bit processors, it wasn't uncommon for int to be 2 bytes.
sizeof(short) <= sizeof(int) <= sizeof(long)
That's all you can count on. The rest is completely up to the implementation. In the olden days of DOS, 16-bit compilers usually had sizeof(int) == sizeof(short) == 2. On 32-bit systems, sizeof(int) is usually equal to sizeof(long) == 4. As a rule of thumb, int is the type the processor can work with the fastest. No rule without exceptions, though...
EDIT: Removed the second rule, sizeof(short) < sizeof(long), which is NOT part of the C standard. On some platforms, sizeof(short) may actually be equal to sizeof(long).
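If your code depends on that ordering, you can also turn it into a compile-time check. A minimal sketch, assuming a C11 compiler (which provides _Static_assert); drop these at file scope in any .c file:

    /* Fails to compile if the standard's ordering were ever violated */
    _Static_assert(sizeof(short) <= sizeof(int),  "short is wider than int?");
    _Static_assert(sizeof(int)   <= sizeof(long), "int is wider than long?");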
The only guarantees the Standard mandates are the following (read each expression below as sizeof(type) rather than just type):
char <= short <= int <= long <= long long
so you can have
char == short == int == long == long long /* Cray?? */
char < short < int == long < long long /* Windows 32 bit */
char < short < int < long == long long /* Linux 64 bit */
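To see which of those layouts your own platform picked, you can compare the sizes at run time. A minimal sketch:

    #include <stdio.h>

    int main(void)
    {
        /* Report which of the equalities above hold on this platform */
        if (sizeof(int) == sizeof(long))
            puts("int == long        (the typical 32-bit case)");
        if (sizeof(long) == sizeof(long long))
            puts("long == long long  (the typical 64-bit Linux case)");
        if (sizeof(char) == sizeof(long long))
            puts("char == long long  (the exotic Cray-like case)");
        return 0;
    }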
The C standard makes the following guarantees about the conversion ranks of the standard integer types and their precisions:
_Bool (1) < char (8) <= short (16) <= int (16) <= long (32) <= long long (64)
In the case of _Bool, the precision is exact, whereas the others are minimum values. There may also be additional, so-called extended integer types.
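You can compare those guaranteed minimum precisions against what your implementation actually provides via the limits in limits.h. A minimal sketch (the exact values printed depend on your platform):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* The standard fixes only lower bounds; these are the actual limits */
        printf("SHRT_MAX  = %d\n",   SHRT_MAX);   /* >= 32767 (16 bits) */
        printf("INT_MAX   = %d\n",   INT_MAX);    /* >= 32767 (16 bits) */
        printf("LONG_MAX  = %ld\n",  LONG_MAX);   /* >= 2147483647 (32 bits) */
        printf("LLONG_MAX = %lld\n", LLONG_MAX);  /* >= 9223372036854775807 (64 bits) */
        return 0;
    }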
A char is the smallest addressable unit of memory, so sizeof(char) == 1, regardless of its precision (which is given by CHAR_BIT). The sizes of the other standard integer types are implementation-defined. I suspect sizeof(_Bool) will also be 1, but I couldn't find anything in the standard which actually guarantees this...
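Both points are easy to inspect on a given implementation. A minimal sketch (on most mainstream platforms this prints CHAR_BIT = 8 and sizeof(_Bool) = 1, though, as said, the latter isn't explicitly pinned down):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        printf("CHAR_BIT      = %d\n",  CHAR_BIT);      /* bits in a char, >= 8 */
        printf("sizeof(char)  = %zu\n", sizeof(char));  /* always 1, by definition */
        printf("sizeof(_Bool) = %zu\n", sizeof(_Bool)); /* implementation-defined */
        return 0;
    }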
On modern 32-bit operating systems, int and long are generally both 32 bits; this model is called ILP32 (int, long, and pointer all 32 bits). On 64-bit operating systems, LP64 is common, meaning long and pointers are 64 bits while int stays at 32 bits. long long is 64 bits just about everywhere, which may or may not be the same size as long.
There are usually macros you can test for, like _ILP32 and _LP64, that will be defined if your environment uses that set of types.
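For example, GCC and Clang typically predefine macros such as __LP64__/_LP64 or __ILP32__/_ILP32 on matching targets. A minimal sketch (exactly which macros are defined is compiler- and platform-specific, so verify against your toolchain's documentation):

    #include <stdio.h>

    int main(void)
    {
    #if defined(__LP64__) || defined(_LP64)
        puts("LP64: long and pointers are 64 bits, int is 32 bits");
    #elif defined(__ILP32__) || defined(_ILP32)
        puts("ILP32: int, long and pointers are all 32 bits");
    #else
        puts("No LP64/ILP32 macro found; fall back to checking sizeof yourself");
    #endif
        return 0;
    }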