views: 135 · answers: 2

Which is more efficient for the C code I am working with: moving my code directly into the existing C program, or keeping it in a separate .c file and #including a header so the program calls into it?

When this is compiled into an .exe, how does incorporating the code into the original source compare with having a header file and a separate .c file?

I am not sure how many lines of code the program I would incorporate this into has, but my code is only about a thousand lines.

Thanks, DemiSheep

+10  A: 

There's no difference in efficiency between keeping code in a single .c file or in multiple .c files linked into the same executable, since once the executable is created, chances are it will contain the same binary code whichever method you choose. This can be easily verified with a binary disassembler, of course.

What a single .c file can change, however, is the speed of compilation, which may be faster for a single file than for a bunch of files that have to be linked together. IIRC, one of the reasons SQLite's preferred source distribution method is a single huge "amalgamated C file" is compilation speed.


That said, you should really not concern yourself with issues of this kind. Do break your programs into separate modules, each with a clean interface and a separate implementation file.

Eli Bendersky
Eli's answer is also true when you use libraries (ok, apart from a very small load-time overhead).
Mario The Spoon
@Mario: you mean dynamic libraries, of course (.so / .dll). Static linking has similarly no effect on code execution speed.
Eli Bendersky
The bit about compilation speed is rubbish. If you split your code into multiple source files, you will generally be able to compile much more quickly because you only need to compile the source files you have changed. This is accepted practice and has been since the dawn of `make`.
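The incremental-build point can be sketched with a minimal Makefile (file names hypothetical): after touching only `counter.c`, `make` recompiles `counter.o` and relinks, leaving `main.o` alone.

```make
# Hypothetical two-module project.
CC     = gcc
CFLAGS = -Wall -O2
OBJS   = main.o counter.o

app: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

# Each object depends on its own source plus the shared header;
# editing one .c file triggers exactly one recompile before relinking.
main.o:    main.c    counter.h
counter.o: counter.c counter.h
```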
JeremyP
@JeremyP: I "appreciate" your candid language. Nevertheless, keep in mind that sometimes (for example, when building a downloaded app from source) you compile the whole code base at once, not only the parts that changed. This is why some code-bases are distributed as a single .c file: all the burden of the linker is saved.
Eli Bendersky
And sqlite3 is not distributed as a single huge source file to make the compile faster. It's done to make it easier for you to include it in your source tree and apparently to make it run faster (the compiler can optimise better if it has the complete source code in one unit, allegedly). I'm sceptical on the second point.
JeremyP
@JeremyP: I totally agree that this is a much better reason. Yes, gcc is weak with cross-object optimizations, so the code may be optimized better this way. However, newer compilers like clang/llvm actually do include cross-object optimization (the `-O4` flag), which would render such a structure unnecessary.
Eli Bendersky
For compilation speed, usually the problem is a big include-tree of header files, for which precompiled headers are a solution. For execution speed, don't worry about compiler optimizations before you avoid these pitfalls: http://stackoverflow.com/questions/2679186/most-hazardous-performance-bottleneck-misconceptions/2679514#2679514
Mike Dunlavey
A: 

I know this already has a good answer saying there is no difference, but I wanted to add this:

The #include directive tells the preprocessor to treat the contents of a specified file as if those contents had appeared in the source program at the point where the directive appears.

I would have written it myself, but the MSDN docs say it quite nicely.

Adam Butler