tags:
views: 431
answers: 11
Hi, in my C# app I would like to know whether it is really important to use short for smaller numbers and int for bigger ones. Does the memory consumption really matter?

A: 

That all depends on how you are using them and how many you have. Even if you only have a few in memory at a time, the data type in your backing store might still drive the choice.

Christopherous 5000
+1  A: 

This is entirely relative to the amount of memory you can afford to waste. If you aren't sure, it probably doesn't matter.

Joel Clark
+2  A: 

Only you can be the judge of whether the memory consumption really matters to you. In most situations it won't make any discernible difference.

In general, I would recommend using int/Int32 where you can get away with it. If you really need to use short, long, byte, uint etc in a particular situation then do so.

LukeH
+3  A: 

For C# apps that aren't trying to mirror some sort of structure from a file, you're better off using ints or whatever your native format is. The only other time it might matter is if using arrays on the order of millions of entries. Even then, I'd still consider ints.

Michael Dorgan
+16  A: 

Unless you are packing large numbers of these together in some kind of structure, it will probably not affect the memory consumption at all. The best reason to use a particular integer type is compatibility with an API. Other than that, just make sure the type you pick has enough range to cover the values you need. Beyond that for simple local variables, it doesn't matter much.
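To make the "packing large numbers of these together" point concrete, here is a minimal C# sketch (the array size is an arbitrary example) comparing the per-element cost of short and int:

```csharp
using System;

class SizeDemo
{
    static void Main()
    {
        // These sizes are fixed by the C# specification.
        Console.WriteLine(sizeof(short)); // 2
        Console.WriteLine(sizeof(int));   // 4

        // A lone local variable saves you almost nothing, but large
        // collections multiply the 2-byte difference by the element count:
        const int count = 1_000_000;
        long shortBytes = (long)count * sizeof(short); // ~2 MB of element data
        long intBytes = (long)count * sizeof(int);     // ~4 MB of element data
        Console.WriteLine(intBytes - shortBytes);      // 2000000
    }
}
```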

recursive
+6  A: 

The simple answer is that it's not really important.

The more complex answer is that it depends.

Obviously you need to choose a type that will hold your data without overflowing, and even if you're only storing smaller numbers, choosing int is probably the most sensible thing to do.

However, if your application loads a lot of data or runs on a device with limited memory then you might need to choose short for some values.
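As an illustration of the overflow half of that trade-off, here is a small sketch (values chosen arbitrarily) showing what happens when a short runs out of range:

```csharp
using System;

class OverflowDemo
{
    static void Main()
    {
        // In the default unchecked context, short arithmetic wraps silently.
        short s = short.MaxValue;   // 32767
        s += 1;
        Console.WriteLine(s);       // -32768: wrapped around with no warning

        // An explicit checked block turns the wrap into an exception instead.
        try
        {
            checked
            {
                int i = int.MaxValue;
                i += 1;
            }
        }
        catch (OverflowException)
        {
            Console.WriteLine("overflow detected");
        }
    }
}
```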

ChrisF
A: 

Memory consumption based on the type of integers you are storing is probably not an issue in a desktop or web app. In a game or a mobile device app, it may be more of an issue.

However, the real reason to differentiate between the types is the range of numbers you need to store. If you have really big numbers, you may need to use long to store them.

Zachary Yates
+1  A: 

The answer is: it depends. The question of whether memory matters is entirely up to you. If you are writing a small application with minimal storage and memory requirements, then no. If you are Google, storing billions and billions of records on thousands of servers, then every byte can cost real money.

Wade Tandy
+1  A: 

There are a few cases where I really bother choosing:

  1. When I have memory limitations
  2. When I do bitshift operations
  3. When I care about x86/x64 portability

Every other case is int all the way.

Edit: About x86/x64

In the x86 architecture an int is 32 bits, but in x64 an int is 64 bits.

If you write "int" everywhere and move from one architecture to another, it might lead to problems. For example, you have a 32-bit API that exports a long. You cast it to an int and everything is fine. But when you move to x64, all hell breaks loose.

The int is defined by your architecture, so when you change architecture you need to be aware of potential problems.

Eric
I don't understand "When I care about x86/x64 portability". Can you explain this?
Conrad Frix
I don't know about C#, but in C an `int` is 32 bits on amd64 and x86. It's the `long` that changes size.
Ken Bloom
The x86 vs x64 argument is wrong in C# - `int` is an alias for `System.Int32` (see [MSDN](http://msdn.microsoft.com/en-us/library/ya5y69ds.aspx) ) If you want the platform native size then I think you need to use IntPtr.
Sam
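Sam's point is easy to verify at runtime. A small sketch, assuming .NET 4 or later for Environment.Is64BitProcess:

```csharp
using System;

class BitnessCheck
{
    static void Main()
    {
        // In C#, int is an alias for System.Int32 and is always 4 bytes,
        // in both 32-bit and 64-bit processes.
        Console.WriteLine(sizeof(int));   // 4 everywhere

        // IntPtr tracks the native pointer size of the running process:
        // 4 in a 32-bit process, 8 in a 64-bit process.
        Console.WriteLine(IntPtr.Size);

        Console.WriteLine(Environment.Is64BitProcess);
    }
}
```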
A: 

The context of the situation is very important here. You don't need to guess at whether it is important or not, though; we are dealing with quantifiable things. We know that we save 2 bytes by using a short instead of an int.

What do you estimate the largest number of instances in memory at a given point in time will be? If there are a million, then you are saving ~2 MB of RAM. Is that a large amount? Again, it depends on the context: if the app is running on a desktop with 4 GB of RAM, you probably don't care too much about the 2 MB.

If there will be hundreds of millions of instances in memory, the savings get pretty big, but in that case you may simply not have enough RAM to deal with it, and you may have to store the structure on disk and work with parts of it at a time.
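That back-of-envelope estimate can be written out directly. A sketch in which the instance count is a made-up placeholder to be replaced with your own estimate:

```csharp
using System;

class SavingsEstimate
{
    static void Main()
    {
        // Hypothetical instance count; substitute your own estimate.
        long instances = 1_000_000;

        // short (2 bytes) vs int (4 bytes): 2 bytes saved per instance.
        long bytesSaved = instances * (sizeof(int) - sizeof(short));

        Console.WriteLine(bytesSaved);                      // 2000000
        Console.WriteLine(bytesSaved / (1024.0 * 1024.0));  // ~1.9 (MB)
    }
}
```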

PhilB
A: 

Int32 will be fine for almost anything. Exceptions include:

  • if you have specific needs where a different type is clearly better. Example: if you're writing a 16-bit emulator, Int16 (aka short) would probably be better to represent some of the internals
  • when an API requires a certain type
  • one time, I had an invalid int cast and Visual Studio's first suggestion was to verify my value was less than infinity. I couldn't find a good type for that without using the pre-defined constants, so I used ulong since that was the closest I could come in .NET 2.0 :)
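For the emulator case in the first bullet, a narrow type can even do some of the work for you. A hypothetical sketch of a 16-bit register modeled with ushort:

```csharp
using System;

class EmulatorRegister
{
    static void Main()
    {
        // Hypothetical 16-bit register: ushort wraps at 65536 in an
        // unchecked context, matching real 16-bit hardware behavior.
        ushort reg = ushort.MaxValue;   // 65535
        unchecked { reg++; }
        Console.WriteLine(reg);         // 0: wrapped like real hardware
    }
}
```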
Dinah