I was wondering if there is some standardised way of getting type sizes in memory at the pre-processor stage, i.e. in macro form. sizeof() does not cut it, because the preprocessor runs before the compiler knows anything about types, so it can't be evaluated in an #if.
If there isn't a standardised method, are there conventional methods that most IDEs use anyway?
Are there any other methods that anyone can think of to get such data?
I suppose I could do a two-stage build kind of thing: compile and run a test program, then feed its output back into the IDE, but that's not really any easier than #defining them in myself.
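For concreteness, a minimal sketch of the kind of test program I mean, assuming its output gets redirected into a header (type_sizes.h is just a name I made up) that the real build then includes:

#include <climits>
#include <iostream>

// Prints a header full of size macros; redirect the output to a
// file (e.g. type_sizes.h) and #include it in the second build stage.
int main()
{
    std::cout << "#define CHAR_SIZE  " << sizeof(char)  * CHAR_BIT << '\n';
    std::cout << "#define SHORT_SIZE " << sizeof(short) * CHAR_BIT << '\n';
    std::cout << "#define INT_SIZE   " << sizeof(int)   * CHAR_BIT << '\n';
    std::cout << "#define FLOAT_SIZE " << sizeof(float) * CHAR_BIT << '\n';
}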
Thoughts?
EDIT:
I just want to be able to swap code around with
#ifdef / #endif
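Something like this, say. Here I'm testing INT_MAX from <climits>, since value macros like that are the only size-related things I know of that #if can actually evaluate (my_int32 is just a made-up name):

#include <climits>

// Swap in a different typedef depending on how wide int is.
// INT_MAX is an ordinary macro, so #if can test it.
#if INT_MAX >= 2147483647
typedef int my_int32;
#else
typedef long my_int32;   // long is guaranteed to be at least 32 bits
#endif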
Was it naive of me to think that an IDE or underlying compiler might define that information in some macro? Sure, the pre-processor doesn't get information about any actual machine-code generation, but the IDE and the compiler do, and they invoke the pre-processor and can declare things to it in advance.
EDIT FURTHER
What I imagined was something like this:
The C++ committee adds a requirement to the standard that, for every type (perhaps only those native to C++), the compiler has to provide a header file, included by default, that declares the size in memory each native type uses, like so:
#define CHAR_SIZE 8
#define INT_SIZE 32
#define SHORT_INT_SIZE 16
#define FLOAT_SIZE 32
// etc.
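With such a header included by default, the #ifdef swapping I'm after would just fall out of #if tests on those macros, e.g.:

// Hypothetical: INT_SIZE comes from the imagined mandatory header.
#if INT_SIZE == 32
// code path that can assume a 32-bit int
#else
// portable fallback
#endif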
Is there a flaw in this process somewhere?
EDIT EVEN FURTHER
To get around the multi-platform build-stage problem, perhaps this standard could mandate that a simple program like the one shown by lacqui be compiled and run by default; that way, the machine that gets the type sizes will be the same machine that compiles the code in the second, 'normal' build stage.
Apologies:
I've been using 'Variable' instead of 'Type'.