views: 156
answers: 3

Hi

My question is very simple. I am working on old legacy code where most of the functions are implemented in the header files only. As far as I know, the compiler converts functions implemented in headers into inline functions.

I wanted to know: if I move these implementations into .cxx files, what will the benefits be?

Thanks in Advance

~SS

+1  A: 

The purpose of a header file is to contain definitions that may be included in multiple source files. Code placed in a header file is #included into a source file, which is then compiled. As far as the compiled code is concerned, it is the same as if the entire contents of that header were part of the source.

Putting code in a header or a source (.cxx) file is a matter of convention. The compiler will not alter behavior based on where the code lives.

brainiac
+2  A: 

Good idea: migrate, and don't go macro/inline bananas.

In earlier eras, the procedure-call machine operation was relatively expensive, and so the macro-processing feature of C and C++, and inline functions, were occasionally important.

As time went on, individual machine operations ceased to be a performance concern.

In fact, the situation inverted itself somewhat. Expanding a procedure into inline code everywhere served little purpose in minimizing procedure-call machine ops that the CPU had already optimized, but it did blow the cache, wasting precious inner-level cache space on zillions of copies of the same function.

I say: define real functions. The CPU will deal with that just fine.

Your application appears to be an artifact of an era when exploding code into straight-line execution was popular. I think part of this dates from the original VAX-11/780, which was the first popular 32-bit Unix system and was the definition of a 1-MIPS CPU for some time, yet whose overblown, microcoded `calls` instruction required 17 µs to execute. Yes, 17x that of a normal op. This isn't true any more, but the BSD coding style that reasonably tried to cope with VAX oddness still persists to this day. (Hey, I'm a BSD fan, but we don't need everything in macros today.)

So, to answer your exact question: you will get better cache performance by migrating those routines into individually linked .cxx modules as you have proposed. Go ahead and do it.

DigitalRoss
+3  A: 

To the compiler there is no difference between input that comes from a .cxx file and input that comes from a .h file; the text gets compiled into the same translation unit.

The main reason why we typically do not put code inside header files is to avoid duplicate objects, which conflict at the linker level when a given header is used by multiple .cxx files.

Maybe you are confusing inline functions with macros. In the case of macros, which are essentially a pre-processing feature, there is no risk of link-time conflicts, even when the same headers are included in multiple translation units.

It is of course possible to define functions in headers (or elsewhere) in a way that instructs the compiler to systematically inline calls to the function, and in such cases there is also no conflict at link time. However, this requires special syntax (the `inline` keyword); it is not implied merely by the fact that the code comes from an include file rather than a .cpp file (as the question suggests).

Now, to answer the question per se: moving all this code out of header files and into .cpp files should not have much of an impact on either binary size or performance.
If the function definitions in the header files were not explicitly inlined, there must have been only one user of each header file per exe/dll produced (otherwise there would have been duplicate-symbol errors at link time), and therefore the binary size should not change in either direction.
With regard to performance, given the general performance gains in hardware, even if formerly inlined functions are now called normally, the difference should generally go unnoticed, with the possible exception of particularly tight loops where the logic iterates a great many times.

mjv