tags:
views: 109
answers: 5

I wanted to declare a very large array. I found that the maximum size of an array is size_t, which is defined as UINT_MAX.

So I wrote the code like this:

int arr[UINT_MAX];

When I compile this, it says "overflow in array dimension".

But when I write it like this:

size_t s = UINT_MAX;
int arr[s]; 

it compiles properly. What's the difference?

A: 
size_t s = UINT_MAX;
int arr[s];  

won't compile unless you declare s as const. Also note that UINT_MAX is only the theoretical maximum size of an array; in practice you won't be able to declare an array more than a few million elements in size, because static, automatic, and indeed any kind of memory is limited.
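
A minimal sketch of the const requirement, with the sizes shrunk so the arrays actually fit (names are illustrative):

#include <cstddef>

int main() {
    const std::size_t cs = 100;  // constant expression: valid as an array bound in standard C++
    int a[cs];                   // OK
    a[0] = 1;

    std::size_t s = 100;         // not a constant expression
    // int b[s];                 // ill-formed in standard C++; g++ accepts it only as a VLA extension
    (void)s;

    return 0;
}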

Armen Tsirunyan
A: 
size_t s = UINT_MAX;
int arr[s];

means that arr is a variable length array (VLA). I don't think it is allowed by the C++ standard. I would expect a warning if compiled with

g++ -ansi -pedantic -std=c++98

Also, think about it: arr needs UINT_MAX * sizeof(int) bytes. That's quite big!
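
For example, this is the kind of code that triggers the warning (the exact message varies by g++ version, but it says that ISO C++ forbids variable length arrays):

#include <cstddef>

void f(std::size_t n) {
    int arr[n];    // VLA: not standard C++, warned about under g++ -ansi -pedantic -std=c++98
    arr[0] = 0;
}

int main() {
    f(10);
    return 0;
}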

ArunSaha
The g++ compiler allows VLAs as an extension; C99 also allows them.
ameen
You are correct. I suspect the compiler is allowing C99 features to sneak past, but it is not portable code.
Shmoopty
You see, this was my point in this thread http://stackoverflow.com/questions/3916608/how-important-is-standards-compliance - standards compliance would be great for learners :)))
Armen Tsirunyan
The question is tagged C++. So, I thought of C++ only, not C99. However, some C++ compilers might allow it, unless standard compliance is *enforced* (see my edit). Talking about standards, a good discussion is here: http://stackoverflow.com/questions/3916608/how-important-is-standards-compliance I have posted some similar thoughts there.
ArunSaha
A: 

What compiler are you using? On VC++, I get an error in both cases (after correcting s to be const). Even if it did compile, it would result in undefined behaviour, because UINT_MAX * sizeof(int) certainly won't fit in your process's address space, and furthermore the size computation itself would overflow and yield the wrong value.
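
A small sketch of how the size computation itself overflows, assuming a 32-bit target where size_t is 32 bits and sizeof(int) == 4:

#include <climits>
#include <cstddef>
#include <iostream>

int main() {
    // UINT_MAX * sizeof(int) is about 16 GB, but on such a target the
    // multiplication is done in a 32-bit size_t and wraps modulo 2^32.
    std::size_t bytes = static_cast<std::size_t>(UINT_MAX) * sizeof(int);
    std::cout << bytes << '\n';   // prints 4294967292, not 17179869180
    return 0;
}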

casablanca
I used the g++ compiler.
ameen
Did you try running the program?
casablanca
+1  A: 

You are delaying the error.

You are asking for about 16GB* of contiguous memory in both cases, which is impossible on a 32-bit machine.

Your first attempt is hard-coding the size, and your compiler was nice enough to tell you in advance that it will not succeed.

Your second attempt is using a variable for the size, which has bypassed the compiler warning, but it will still fail when you attempt to run the program.

*On typical architectures
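
A sketch of the two attempts side by side; the offending lines are commented out so the file itself still compiles:

#include <climits>
#include <cstddef>

// int fixed[UINT_MAX];       // first attempt: rejected at compile time
                               // ("overflow in array dimension")

void attempt2() {
    std::size_t n = UINT_MAX;
    // int vla[n];             // second attempt: g++ accepts this as a VLA extension,
                               // but the ~16GB allocation fails when the program runs
    (void)n;
}

int main() {
    attempt2();
    return 0;
}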

Shmoopty
You mean 4GB of memory.
Nathan Ernst
`UINT_MAX` is usually around 4 billion, and an `int` is 4 bytes. 4 billion times 4 bytes is 16GB.
Peter Alexander
@Peter: An `int` isn't necessarily 4 bytes.
GMan
Read: *"On typical architectures"*
Peter Alexander
+3  A: 

First error: size_t is not necessarily unsigned int, so its maximum value can differ from that of unsigned int (UINT_MAX); moreover, in C++ you should use std::numeric_limits to get information about the limits of a type.

#include <cstddef>
#include <limits>

std::size_t s = std::numeric_limits<std::size_t>::max();

Second error: you won't ever get an array that big. Since size_t is required to be able to express the size of the largest possible object, it should be big enough to describe an object as large as the whole address space available to the application; but actually allocating such an object would require dedicating the entire address space to it, which is infeasible. Moreover, you're requesting an array of ints that big, i.e. UINT_MAX * sizeof(int) bytes, which would probably be about 4 times the whole address space: clearly nonsense. And by the way, sizeof(arr) wouldn't even be able to express the size of such an object, and in general pointers couldn't reach the top of that array. The compiler detects these faults and stops you from doing that.

Moreover, I infer that you're trying to allocate that array on the stack, which is usually much, much smaller than the total memory available to the application; in general it's not a good idea to allocate big arrays there (you should use the heap for that).

Third error: allocating all that memory doesn't make sense. If you have big memory requirements, you should allocate on the heap, not on the stack, and allocate just the memory you actually need, so you play along well with the OS and the other applications (this last consideration does not apply if you're working on an embedded system where yours is the only application running).

The second snippet shouldn't even work in standard C++: if that array is allocated on the stack, it is a variable length array (VLA), which is available in C99 but has been rejected for both the current and the next C++ standard, so you're relying on a nonstandard extension. Moreover, with a VLA the allocation happens at runtime (VLAs in general don't have a fixed size), so the check is not obvious for the compiler to perform; although I suppose an optimizer could easily spot that the size is effectively constant, turn the VLA into a regular array, and then fail for the same reasons stated above.

Long story short: it makes no sense to allocate all that memory (which you couldn't even address), especially on the stack. Use the heap and allocate just what you need. If you have special requirements, you should investigate the virtual memory facilities provided by your OS.
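
A minimal sketch of the heap-based alternative, with a size picked purely for illustration:

#include <cstddef>
#include <vector>

int main() {
    // Allocate only what is actually needed, on the heap; std::vector
    // releases the memory automatically when it goes out of scope.
    std::size_t needed = 10 * 1000 * 1000;   // illustrative size, not UINT_MAX
    std::vector<int> data(needed);           // ~40 MB on typical architectures
    data[0] = 42;
    return 0;
}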

Matteo Italia
+1 for mentioning the usually smaller size of the stack.
Thomas Matthews