views:

442

answers:

9

In my C# source code I may have declared integers as:

int i = 5;

or

Int32 i = 5;

In the currently prevalent 32-bit world they are equivalent. However, as we move into a 64-bit world, am I correct in saying that the following will become the same?

int i = 5;
Int64 i = 5;
+21  A: 

No. The C# specification rigidly defines that int is an alias for System.Int32 with exactly 32 bits. Changing this would be a major breaking change.
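A minimal sketch (not part of the original answer) that confirms the alias on any conforming C# compiler:

```csharp
using System;

class AliasCheck
{
    static void Main()
    {
        // int and System.Int32 are one and the same type
        Console.WriteLine(typeof(int) == typeof(Int32)); // True

        // sizeof(int) is fixed at 4 bytes by the language specification
        Console.WriteLine(sizeof(int)); // 4
    }
}
```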

Jon Skeet
Moreover, even on x64, as far as I know, only pointers and sizes (size_t) are 64 bits wide, whereas the default integer is still 32 bits wide.
Asaf R
Asaf: In some languages that depends on the compiler. For example, in C++ 64-bit gcc defines long as 64 bits, while Visual C++ keeps long at 32 bits. C# is not such a language, the decision is standardized and not left to the compiler vendor.
Ben Voigt
A: 

Yes, as Jon said, and unlike the 'C/C++ world', Java and C# aren't dependent on the system they're running on. They have strictly defined lengths for byte/short/int/long and single-/double-precision floats, equal on every system.

zappan
In the C/C++ world, ints are dependent on the *compiler* rather than the underlying hardware. Most 64-bit C++ compilers still use 32-bit ints, but pointers will be 64-bit instead of 32.
Roddy
And in the C world, they actually ran out of confidence in the whole scheme at C99 and decided not to make long any longer but to add long long instead. At some level this is an acknowledgement that types with unpredictable (over extended time-scales) sizes are a problem.
Will Dean
+1  A: 

No matter whether you're using the 32-bit version or 64-bit version of the CLR, in C# an int will always mean System.Int32 and long will always mean System.Int64.

Mark Cidade
+10  A: 

int is currently always synonymous with Int32 on all platforms.

It's very unlikely that Microsoft will change that in the near future, as it would break lots of existing code that assumes int is 32 bits.

BlueRaja - Danny Pflughoeft
Word to that! MS considers backwards compatibility important. Consistency in the language is one of the reasons I <3 C# over C++.
P.Brian.Mackey
@P.Brian.Mackey I'm with you. Thank the maker that we finally have a C-flavored language that was bold enough to actually define its fundamental datatypes.
Detmar
-1 .... that would be an explicit language change which is EXTREMELY unlikely. Exactly this "predetermined" size is one plus of C#.
TomTom
@TomTom: ...that's what I said.
BlueRaja - Danny Pflughoeft
What about an option in the compiler to compile to 32- or 64-bit integers?
Xavier Poinas
+35  A: 

The int keyword in C# is defined as an alias for the System.Int32 type, and this is (judging by the name) meant to be a 32-bit integer. Looking at the specifications:

CLI specification section 8.2.2 (Built-in value and reference types) has a table with the following:

  • System.Int32 - Signed 32-bit integer

C# specification section 8.2.1 (Predefined types) has a similar table:

  • int - 32-bit signed integral type

This guarantees that both System.Int32 in CLR and int in C# will always be 32-bit.
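As an illustrative sketch (not taken from either specification), the guaranteed 32-bit signed range can be observed directly:

```csharp
using System;

class SpecCheck
{
    static void Main()
    {
        // The 32-bit signed range guaranteed by both specifications
        Console.WriteLine(int.MinValue); // -2147483648
        Console.WriteLine(int.MaxValue); // 2147483647

        // int.MaxValue and Int32.MaxValue are literally the same constant
        Console.WriteLine(int.MaxValue == Int32.MaxValue); // True
    }
}
```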

Tomas Petricek
System.Int64 will be happy to know this.
Will
A: 

My best guess is that it will be changed right around the same time that Internet Explorer X is fully standards compliant.

jMerliN
How mature. /sarcasm
Chris Charabaruk
I'm sure even the guys at Microsoft would chuckle at that one. But it's nice to see stackoverflow can attract people who don't have a technical sense of humor.
jMerliN
Just because they might laugh too, doesn't make it any less childish. Besides, now that they're actually trying to improve standards compliance, this comes off as a blow below the belt. Maybe it would be acceptable back in the IE6 days, maybe even IE7, but not now. Grow up.
Chris Charabaruk
This would be funny if it wasn't an answer. No, wait, it will never be funny.
Will
It's a legitimate answer. We're talking about a company that wholly ignores standards. Asking when this company is going to standardize the name of something in their proprietary platform to the hardware standard.. is ludicrous in itself, and what I've posited here assumes a point in time where standards compliance becomes a goal, which is certainly a good guess as to when such a change might happen.
jMerliN
Oh I see, you're trapped in 2005.
Chris Charabaruk
Should've known better than to argue with a C#/.NET fanboy. By definition they can't think logically or reason properly. I'll just continue to laugh at the ones that apply for my jobs.
jMerliN
@jMerliN only thing worse than a fanboi is a hater. Also, I hope you're not using jQuery anymore. That evil standards-ignoring M$ is all up in Resig's code, now...
Will
@Will "Embrace, extend, extinguish." Welcome to the cycle of technological death, brought to you by yours truly, M$. Incoming: You need jQuery.NET, it's fully compatible with IE.NET, the only "good" browser, duh!
jMerliN
+4  A: 

I think what you may be confused by is that int is an alias for Int32, so it will always be 4 bytes, but IntPtr is supposed to match the word size of the CPU architecture, so it will be 4 bytes on a 32-bit system and 8 bytes on a 64-bit system.
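A small sketch of that difference (illustrative only; IntPtr.Size depends on whether the process runs as 32-bit or 64-bit, so no fixed output is shown for it):

```csharp
using System;
using System.Runtime.InteropServices;

class PointerSize
{
    static void Main()
    {
        // Always 4: int is Int32 regardless of architecture
        Console.WriteLine(Marshal.SizeOf(typeof(int))); // 4

        // 4 in a 32-bit process, 8 in a 64-bit process
        Console.WriteLine(IntPtr.Size);
    }
}
```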

Brian Gideon
+10  A: 

Will sizeof(testInt) ever be 8?

No, sizeof(testInt) is an error. testInt is a local variable. The sizeof operator requires a type as its argument. This will never be 8 because it will always be an error.

VS2010 compiles a C# managed integer as 4 bytes, even on a 64-bit machine.

Correct. I note that section 18.5.8 of the C# specification defines sizeof(int) as being the compile-time constant 4. That is, when you say sizeof(int) the compiler simply replaces that with 4; it is just as if you'd said "4" in the source code.

Does anyone know if/when the time will come that a standard "int" in C# will be 64 bits?

Never. Section 4.1.4 of the C# specification states that "int" is a synonym for "System.Int32".

If what you want is a "pointer-sized integer" then use IntPtr. An IntPtr changes its size on different architectures.
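A sketch of the points above (illustrative, not from the original answer; the commented-out line is the compile error described, using the question's `testInt` variable):

```csharp
using System;

class SizeofDemo
{
    static void Main()
    {
        int testInt = 5;

        // Compile error: the sizeof operator takes a type, not a variable.
        // Console.WriteLine(sizeof(testInt));

        // The compiler replaces sizeof(int) with the constant 4.
        Console.WriteLine(sizeof(int)); // 4

        // IntPtr changes size with the architecture of the running process
        Console.WriteLine(IntPtr.Size >= 4); // True

        Console.WriteLine(testInt); // 5
    }
}
```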

Eric Lippert