Hi, this was an interview question! Would the size of an integer depend upon the compiler or the processor?

A: 

Yes, I found that the size of int in Turbo C was 2 bytes, whereas in the MSVC compiler it was 4 bytes.

Basically the size of int is the size of the processor registers.
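
For example, a quick check like this (a minimal sketch; the value printed is implementation-defined) prints 2 when built with Turbo C and 4 with 32-bit MSVC:

```c
#include <stdio.h>

int main(void)
{
    /* sizeof(int) is implementation-defined: 2 bytes under Turbo C,
       4 bytes under 32-bit MSVC, possibly something else elsewhere. */
    printf("sizeof(int) = %lu\n", (unsigned long)sizeof(int));
    return 0;
}
```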

Ashish
"Basically the size of int is the size of the processor registers." - This is incorrect, see other answers.
hobodave
+6  A: 

Yes, it would. Did they mean "which would it depend on: the compiler or the processor"? In that case the answer is basically "both." Normally, int won't be bigger than a processor register (unless that's smaller than 16 bits), but it could be smaller (e.g. a 32-bit compiler running on a 64-bit processor). Generally, however, you'll need a 64-bit processor to run code with a 64-bit int.

Jerry Coffin
You don't *need* a 64-bit processor; it'll just be slower without one.
Brendan Long
+6  A: 

Yes, it depends on both the processor (more specifically, the ISA, or instruction set architecture, e.g., x86 vs. x86-64) and the compiler, including its programming model. For example, on 16-bit machines, `sizeof(int)` was 2 bytes. 32-bit machines use 4 bytes for int. Historically, int was considered the native size of the processor, i.e., the size of a register. But 32-bit computers became so popular, and such a huge amount of software was written for the 32-bit programming model, that it would have been very confusing if 64-bit computers had used 8 bytes for int. Both Linux and Windows keep int at 4 bytes; they differ in the size of long.

Please take a look at the 64-bit programming models: LP64 for most *nix systems and LLP64 for Windows.

Such differences are quite troublesome when you write code that should work on both Windows and Linux. So I always use `int32_t` or `int64_t`, rather than `long`, via stdint.h.
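
As a sketch of the difference (expected sizes noted in comments; your output depends on the data model):

```c
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    /* 4 bytes under both LP64 (Linux) and LLP64 (Windows) */
    printf("int:     %lu\n", (unsigned long)sizeof(int));
    /* 8 bytes under LP64, but only 4 under LLP64 */
    printf("long:    %lu\n", (unsigned long)sizeof(long));
    /* exactly 4 and 8 bytes on every platform that provides them */
    printf("int32_t: %lu\n", (unsigned long)sizeof(int32_t));
    printf("int64_t: %lu\n", (unsigned long)sizeof(int64_t));
    return 0;
}
```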

minjang
A small typo: you mentioned that `sizeof(int)` is 16 on a 16-bit machine, but it is more likely to be 2.
dreamlax
Thanks! 16 bits, or 2 bytes. Corrected.
minjang
If you need a type that is "at least 32 bits", then `long` suffices, and shouldn't cause a problem if it's *too* long. The major exception is when you're directly reading from or writing to an on-disk or network format.
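For example, a hypothetical on-disk record header (the names here are illustrative, not from any real format) is exactly where fixed-width types are needed:

```c
#include <stdint.h>

/* Hypothetical file-format header: each field must be exactly this
   wide on every platform, so the fixed-width types are the right
   tool here rather than int or long. (Endianness is a separate
   concern this sketch ignores.) */
struct record_header {
    uint32_t magic;       /* exactly 32 bits */
    uint32_t payload_len; /* exactly 32 bits */
    uint64_t timestamp;   /* exactly 64 bits */
};
```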
caf
+14  A: 

The answer to this question depends on how far from practical considerations we are willing to get.

Ultimately, in theory, everything in C and C++ depends on the compiler and only on the compiler. Hardware/OS is of no importance at all. The compiler is free to implement a hardware abstraction layer of any thickness and emulate absolutely anything. There's nothing to prevent a C or C++ implementation from implementing the int type of any size and with any representation, as long as it is large enough to meet the minimum requirements specified in the language standard.
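
Those minimum requirements are the only portable guarantee, and they can be written down as a compile-time check (a sketch assuming a C11 compiler, which added `_Static_assert`):

```c
#include <limits.h>

/* The standard guarantees only minimum ranges, not sizes: int must
   cover at least -32767..32767, i.e., be at least 16 bits wide.
   Everything beyond that is the implementation's choice. */
_Static_assert(INT_MAX >= 32767, "int must hold at least 32767");
_Static_assert(INT_MIN <= -32767, "int must hold at least -32767");
```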

Yet in reality C and C++ are intended to be efficient. That immediately means that any meaningful implementation has to observe certain efficiency considerations imposed by the underlying hardware. In that sense, the size of basic types will depend on the hardware, i.e. each basic type will be based on some representation immediately (or almost immediately) supported by the hardware.

In other words, a specific C or C++ implementation for a 64-bit hardware/OS platform is absolutely free to implement int as a 71-bit 1's-complement signed integral type that occupies 128 bits of memory, using the other 57 bits as padding bits that are always required to store the birthdate of the compiler author's girlfriend. This implementation will even have certain practical value: it can be used to perform run-time tests of the portability of C/C++ programs. But that's where the practical usefulness of that implementation would end. Don't expect to see something like that in a "normal" C/C++ compiler.

AndreyT
Some of the other answers given are perfectly valid and correct, but I like this one best because it tries to give a deeper and more well-rounded explanation.
John Y
I'll just quote the C++ standard for completeness: "Plain ints have the natural size suggested by the architecture of the execution environment." It's up to the compiler writer to decide what "natural" means.
Mike Seymour
A: 

The size of a data type depends on the processor, because the compiler wants the CPU to be able to access the data efficiently. For example, if the processor is 32-bit, the compiler will not choose a 2-byte int (it is expected to choose 4 bytes), because accessing data that doesn't fill a full word can cost an additional CPU cycle, which is wasteful. If the compiler chooses 4 bytes for int, the CPU can access all 4 bytes in one shot, which speeds up your application.
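
A small sketch of the idea (assuming a C11 compiler, for `_Alignof`; the exact numbers depend on your platform):

```c
#include <stdio.h>

int main(void)
{
    /* On a typical 32-bit or 64-bit target, int is 4 bytes and
       4-byte aligned, so the CPU can fetch it in a single access. */
    printf("sizeof(int)  = %lu\n", (unsigned long)sizeof(int));
    printf("alignof(int) = %lu\n", (unsigned long)_Alignof(int));
    return 0;
}
```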

Thanks

mahesh