views: 329

answers: 4

I am working on an embedded system, so memory is precious for me.

One issue that keeps recurring is that I run out of memory when attempting to compile a program for it. This is usually fixed by limiting the number of typedefs, etc., that can take up a lot of space.

There is a macro generator that I use to create a file with a lot of #defines in it. Some of these are simple values, others are boundary checks, e.g.:

#define SIGNAL1 (float)0.03f
#define SIGNAL1_ISVALID(value) ((value >= 0.0f) && (value <= 10.0f))

Now, I don't use all of these defines. I use some, but certainly not the majority. I have been told that they don't actually take up any memory if they are not used, but I was unsure on this point. I'm hoping that by cutting out the unused ones I can free up some extra memory (but again, I was told this is pointless).

Do unused #defines take up any memory space?

+13  A: 

No, #defines take up no space unless they are used. #defines work like find/replace: whenever the compiler sees the left half, it will replace it with the right half before it actually compiles.

So, if you have:

float f = SIGNAL1;

The compiler will literally interpret the statement:

float f = (float)0.03f;

It will never see the SIGNAL1, it won't show up in a debugger, etc.

Paul Betts
Chuck
@Chuck You and I know that, but I was trying to make the answer easier to grok - thanks for mentioning this though!
Paul Betts
Very good point as well.
espais
Not strictly true about not showing up in a debugger. E.g: http://developer.apple.com/DOCUMENTATION/DeveloperTools/gdb/gdb/gdb_10.html (But usually true, and easier to grok the answer if you assume this ;)
benno
A: 

Well, yes and no.

No, unused #defines won't increase the size of the resulting binary.

Yes, all #defines (whether used or unused) must be known by the compiler when building the binary.

From your question it's a bit ambiguous how you use the compiler, but it almost sounds as though you are trying to build directly on an embedded device; have you tried a cross-compiler? :)

Christoffer
`#define`s are never known by the compiler. They are only known by the preprocessor, which uses them for textual substitution and `#if`/`#ifdef` statements. Once we're out of the preprocessor and into the actual compiler, `#define`s are all replaced, and no longer exist (and if they aren't replaced, they become syntax errors or worse, missing symbols to the linker).
Chris Lutz
P.S. Sorry about the random fixed-width fonts. Now I understand how the regex that SO uses to match `code` blocks works.
Chris Lutz
Ah yes, but in a "loose" context like the one in this question (where the OP doesn't specify/care whether it's the compiler or preprocessor that runs out of memory) we can assume the preprocessing is part of the compilation process.
Christoffer
A: 

Unused #defines don't take up space in the resulting executable. They do take up memory in the compiler itself whilst compiling.

nos
A: 

This is usually fixed by limiting the number of typedefs, etc that can take up a lot of space.

You seem somewhat confused, because typedefs do not take up space at runtime. They are merely aliases for data types. Now you may have instances of large structures (typedef'd or otherwise), but it is the instance that takes space, not the type definition. I wonder what 'etc' might cover in this statement.

Macro instances are replaced in the source code with their definition, and code generated accordingly, an unused macro does not result in any generated code.

Things that take up space are:

  1. Executable code (functions/member functions)
  2. Data instantiation (including C++ object instances)
  3. The amount of space allocated to the stack (or stacks in a multi-threaded system)

What is left is typically available for dynamic memory allocation (RAM), or is unused, or is reserved for non-volatile storage (Flash/EPROM).

Reducing memory usage is primarily a case of selecting/designing efficient data structures, using appropriate data types, and efficient code and algorithm design. It is best to target the area that will get the greatest benefit. To see the size of objects and code in your application, get the linker to generate a map file. That will tell you which are the largest functions, as well as the sizes of global and static objects.

Source file text length is not a good guide to code size. Much C code is declarative (typically header files are entirely declarative) and does not generate memory-occupying code or data.

An embedded system does not necessarily imply small memory, so you should specify. I have worked on systems with 64Mb RAM and 2Mb Flash, and even that is modest compared with many systems. A typical micro-controller with on-chip resources, however, will generally have much less (especially SRAM, which takes up a lot of chip area). Also, whether your system uses a Harvard or von Neumann architecture is relevant here, since in a Harvard architecture data and code spaces are separate, so we need to know which it is you are short of. If von Neumann, the code/data usage is still relevant if the code is running from ROM, or if it is copied from ROM to RAM at run-time (i.e. different types of memory, even if they are in the same address space).

Clifford