views:

77

answers:

3

I have a program for some scientific simulation stuff, and as such it needs to run quickly.

When I started out, I was somewhat lazy and decided to defer support for inputting the constants until later; I just used #define macros for them all.

The problem is that when I tried changing that, it got a lot slower. For example, changing

#define WIDTH 8
// ... code

to

#define WIDTH width
int width;
// ...
int main(int argc, char *argv[]) {
    width = atoi(argv[1]);
    // ... code

resulted in something that used to take 2 seconds taking 2.8. That's just one of about a dozen constants, and I can't really afford even that. Also, there is probably some math involving these constants that currently gets compiled away.

So my question is whether there's some way (a bash script?) of compiling the constants I want to use into the program at run time. It's OK if any machine that needs to run this has to have a compiler on it. It currently compiles with a standard (quite simple) Makefile.
This would also allow for -march=native, which should help a little.

I suppose my question is also whether there's a better way of doing this entirely...

+4  A: 

At least if I understand your question correctly, what I'd probably do is something like:

#ifndef WIDTH
#define WIDTH 8
#endif

(and likewise for the other constants you want to be able to modify). Then, in your makefile(s), add options to pass the correct definitions to the compiler when necessary; if you wanted to change WIDTH, you'd have something like:

CFLAGS = -DWIDTH=12

and when you compile the file, this would be used as the definition for WIDTH, but if you didn't define a value in the makefile, the default in the source file would be used.
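
For instance, a minimal GNU-make sketch of that arrangement (prog and prog.c are placeholder names, not from the question):

CFLAGS = -O2
ifdef WIDTH
CFLAGS += -DWIDTH=$(WIDTH)
endif

prog: prog.c
	$(CC) $(CFLAGS) -o prog prog.c

Invoked as plain `make`, the default in the source is used; `make WIDTH=12` overrides it.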

Jerry Coffin
That is almost what I'd like to do, but it would be preferable to be able to do something like `./compile-run 32 32 5 .1`, which would substitute those values into the makefile so it can pass them in. Or something like that.
zebediah49
At least with most versions of `make` you can do that. The normal syntax is oriented toward the name, not the order, so you'd use something like `make WIDTH=8`, and that value would be visible in the makefile as `$(WIDTH)`.
Jerry Coffin
Ah, that sounds about right. I assume there's some way of checking (in the Makefile) whether `$(WIDTH)` is set?
zebediah49
Yes -- with GNU make you can do things like `ifeq ($(WIDTH),)\nWIDTH=8\nendif` (where the "\n" means a newline in the makefile).
Jerry Coffin
I would probably use `ifneq ($(WIDTH),)\nFLAGS2=$(FLAGS1) -DWIDTH=$(WIDTH)\nendif` in that case, but that seems pretty good.
zebediah49
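
Putting those comments together, a minimal sketch of such a `compile-run` wrapper (assuming GNU make; WIDTH/HEIGHT/POINTS/STEP and `prog` are placeholder names for whatever the Makefile actually uses):

#!/bin/sh
# Forward positional arguments to make as named variables, then run.
make WIDTH="$1" HEIGHT="$2" POINTS="$3" STEP="$4" && ./prog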
+1  A: 

You could store the relevant defines in a separate header file, constants.h:

#ifndef CONSTANTS_H
#define CONSTANTS_H

#define WIDTH 8
...other defines...

#endif

If you take care that the header is included only once, then you can even omit the include guards and have a small file with only the relevant stuff. I would go this way if the program is used by others who need to change the constants. If you're the only one using it, then Jerry's method is just fine.

EDIT:

Reading your comment: this separate header could easily be generated by a little tool invoked from the makefile before compilation.
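
A hedged sketch of that as a GNU make rule (the file and variable names are placeholders):

WIDTH ?= 8

# Regenerate the header on every run so `make WIDTH=12` always takes effect.
constants.h: FORCE
	printf '#define WIDTH %s\n' '$(WIDTH)' > constants.h
FORCE:

prog: prog.c constants.h
	$(CC) -O2 -o prog prog.c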

Secure
+1  A: 

The difference is that when the macro is just an integer literal, the compiler can often calculate a bunch of the math at compile time. A trivial example: if you had

int x = WIDTH * 3;

the compiler would actually emit:

int x = 24;

no multiply there. If you change WIDTH to a variable, the compiler can't do that, because WIDTH could then be any value. So there is almost certainly going to be some difference in speed (how much depends on the circumstances, and it is often so little that it doesn't matter).
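
To make that concrete, a small self-contained sketch (`index_of` is a hypothetical function, not from the question):

#define WIDTH 8

/* WIDTH is a literal power of two here, so the compiler can fold and
 * strength-reduce: row * WIDTH becomes row << 3, and subexpressions
 * built only from constants are evaluated at compile time. With a
 * runtime `int width`, it must emit an actual multiply instead. */
int index_of(int row, int col) {
    return row * WIDTH + col;
}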

I recommend making variables of the things that genuinely need to be variable, and then profiling to find the hot spots in the code. Almost always, it's the algorithm that slows you down the most. Once you find out which blocks of code you are spending the most time in, you can figure out ways to make that part faster.
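
For the profiling step, one minimal route is gcc's -pg instrumentation plus gprof (assuming a gcc/binutils toolchain):

gcc -O2 -pg prog.c -o prog   # -pg adds gprof instrumentation
./prog                       # writes gmon.out in the current directory
gprof prog gmon.out          # the flat profile shows where the time goes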

The only real solution would be to have a separate header file with the constants, which a script could generate before compiling the program. Or, if there aren't too many, just pass them directly to gcc. This of course sacrifices up-front (compile) time for runtime speed. I do wonder, though: if a difference of 0.8 seconds at runtime is unaffordable, how is compiling the program (which will surely take more than a second) affordable?

The script could be something as simple as this:

#!/bin/sh

echo "#define WIDTH $1" > constants.h
echo "#define HEIGHT $2" >> constants.h
gcc prog.c -o prog && ./prog

where prog.c includes constants.h -- or something like this (with no extra header):

#!/bin/sh
gcc -DWIDTH="$1" -DHEIGHT="$2" prog.c -o prog && ./prog
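
Since the question mentions -march=native, either variant could also pass optimization flags, e.g. (a sketch):

#!/bin/sh
gcc -O2 -march=native -DWIDTH="$1" -DHEIGHT="$2" prog.c -o prog && ./prog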
Evan Teran
The problem isn't 0.8 seconds; it's what happens when I go from a 5,000-point, 5,000-step run to a 500,000-point, 50,000-step run. Assuming it scales linearly, that would end up being 800 seconds, which isn't quite so insignificant any more. Creating a constants.h file does make some sense for this, actually.
zebediah49
Also, my hot spots seem to be some fairly simple (and unavoidable) calculations that just get run a couple million times a second.
zebediah49