It seems all of them take 4 bytes of space, so what's the difference?
First of all, the sizes of int and long are implementation-defined, so on your compiler an int and a long might be the same size, but this isn't universal across compilers.

As for the difference between unsigned long and long:
Assuming 4 bytes, a long has the range of -2,147,483,648 to 2,147,483,647. An unsigned long has the range of 0 to 4,294,967,295.
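A quick way to see these limits on your own machine is to print the constants from limits.h (a minimal C sketch; the exact values depend on your platform):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        /* These constants reflect the current compilation environment. */
        printf("long:          %ld to %ld\n", LONG_MIN, LONG_MAX);
        printf("unsigned long: 0 to %lu\n", ULONG_MAX);
        return 0;
    }

On a platform where long is 4 bytes, this prints exactly the ranges quoted above.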
One other difference is with overflow. For a signed type, overflow is undefined behavior. But for an unsigned type, overflow is guaranteed to "wrap around" modulo 2^N.
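Here is a minimal sketch of the unsigned wrap-around (the signed case is left in a comment because actually evaluating it is undefined behavior):

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        unsigned long u = ULONG_MAX;
        u += 1;                       /* guaranteed to wrap around to 0 */
        printf("wrapped: %lu\n", u);  /* prints 0 */

        /* By contrast, this would be undefined behavior:
         *   long s = LONG_MAX;
         *   s += 1;
         */
        return 0;
    }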
Well, the difference between unsigned long and long is simple -- the range. A signed long goes from (on a typical 32-bit system) about -2.1 billion (-2^31) to about +2.1 billion (2^31 - 1), while an unsigned long goes from 0 to about 4.2 billion (2^32 - 1).
It so happens that on many compilers and operating systems (including, apparently, yours), int is also a 32-bit value. But the C++ standard doesn't determine maximum widths for any of these types, only minimum widths. On some systems, int is 16 bits. On some systems, long is 64 bits. A lot of it depends on the processor architecture being targeted, and what its base word size is.
The header limits.h exists to define the maximum capacity of the various types under the current compilation environment, and stdint.h exists to provide environment-independent types of guaranteed width, such as int32_t.
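As a rough illustration of the difference between the two headers (a sketch assuming a hosted C environment):

    #include <stdio.h>
    #include <limits.h>
    #include <stdint.h>

    int main(void)
    {
        /* limits.h: widths vary with the compilation environment */
        printf("int:  %d to %d\n", INT_MIN, INT_MAX);
        printf("long: %ld to %ld\n", LONG_MIN, LONG_MAX);

        /* stdint.h: int32_t is exactly 32 bits wherever it exists */
        int32_t exact = INT32_MAX;
        printf("int32_t max: %ld\n", (long)exact);  /* long is at least 32 bits */
        return 0;
    }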
The C language specification allows the implementation of int and long types to vary from one platform to another within a few constraints. This variability is a headache for cross-platform code, but it is also an asset because it enables the informed programmer to balance their design goals between native processor speed and full numeric range on hardware architectures that don't offer both.
In general, "int" is supposed to map a machine register size of the target CPU architecture's machine, so that loading, storing, and operating on the int type data should translate directly into operations that use the target processor's native registers.
Int can be less than the machine register size in the interest of saving memory space (big ints take up twice as much RAM as little ints). It's common to see int as a 32 bit entity even on 64 bit architectures where compatibility with older systems and memory efficiency are high priorities.
"long" can be the same size or larger than "int" depending on the target architecture's register sizes. Operations on "long" may be implemented in software if the target architecture doesn't support values that large in its native machine registers.
CPU chips designed for power efficiency or embedded devices are where you will find distinctions between int and long these days. Compilers for general-purpose CPUs like the one in your desktop or laptop PC generally treat int and long as the same size because the CPU efficiently uses 32-bit registers. On smaller devices such as cell phones, the CPU may be built to handle 16-bit data more naturally and have to work hard to handle 32-bit or larger data.
Fewer bits per register means fewer circuits required on the chip, fewer data lines to move data in and out of the chip, lower power consumption and smaller chip die size, all of which make for a lower cost (in $ and in watts) device.
In such an architecture, you will most likely find int to be 16 bits in size and long to be 32 bits in size. There may also be a performance penalty associated with using longs, caused either by wait states incurred loading the 32 bits in multiple reads across a 16-bit data bus, or by implementing long operations (addition, subtraction, etc.) in software if the native hardware doesn't support such operations.
As a general rule, the only thing you can assume about ints and longs is that the range of int will always be contained within the range of long on any architecture. You should also assume that someday your code will be recompiled for a different architecture where whatever relationship you currently see between int and long no longer holds.
This is why you should be careful to keep ints separate from longs even in everyday mundane coding. They may be completely assignment compatible today because their implementation details for your current hardware platform coincide, but that coincidence is not guaranteed across all platforms.
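One defensive pattern (a sketch, not the only way) is to check against INT_MAX before narrowing a long to an int, so the code fails loudly instead of silently truncating when recompiled on a platform with a smaller int:

    #include <stdio.h>
    #include <limits.h>

    int main(void)
    {
        long big = 100000L;  /* always representable: long is at least 32 bits */

        if (big > INT_MAX || big < INT_MIN) {
            /* Taken on platforms where int is only 16 bits */
            printf("value does not fit in an int on this platform\n");
        } else {
            int narrow = (int)big;  /* safe: range was checked first */
            printf("fits in int: %d\n", narrow);
        }
        return 0;
    }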