I've heard that enabling Link-Time Code Generation (the /LTCG switch) can be a major optimization for large projects with lots of libraries to link together. My team is using it in the Release configuration of our solution, but the long build time is a real drag: one change to one file that no other file depends on triggers another 45 seconds of "Generating code...". Release is certainly much faster than Debug, but we might achieve the same speed-up by disabling LTCG and just leaving /O2 on.

Is it worth it to leave /LTCG enabled?

+3  A: 

It is hard to say, because that depends mostly on your project - and of course the quality of the LTCG provided by VS2005 (which I don't have enough experience with to judge). In the end, you'll have to measure.

However, I wonder why the extra duration of the release build is such a problem for you. You should only hand out reproducible, stable, versioned binaries built from reproducible or archived sources. I've rarely seen a reason for frequent, incremental release builds.

The recommended setup for a team is this: Developers typically create only incremental debug builds on their machines. Building a release should be a complete build from source control to redistributable (binaries or even setup), with a new version number and labeling/archiving the sources. Only these should be given to in-house testers / clients.

Ideally, you would move the complete build to a separate machine, or maybe a virtual machine on a good PC. This gives you a stable environment for your builds (includes, 3rd party libraries, environment variables, etc.).

Ideally, these builds should be automated ("one click from source control to setup"), and should run daily.
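
A minimal sketch of such a one-click script, assuming a VS2005 solution named MySolution.sln (the name is a placeholder) and the Visual Studio command-line tools on the PATH:

    rem nightly_build.cmd -- hypothetical one-click release build
    rem Step 1: fetch clean, labeled sources (command depends on your SCM).
    rem Step 2: full rebuild of the Release configuration:
    devenv MySolution.sln /Rebuild Release
    rem Step 3: package the resulting binaries / run your setup project.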

peterchen
A: 

I also don't see problems with extra compilation time using link-time code generation with the release build. I only build my release version once per day (overnight), and use the unit-test and debug builds during the day.

Ryan Ginstrom
A: 

I know the guys at Bungie used it for Halo 3; the only con they mentioned was that it sometimes messed up their deterministic replay data.

Have you profiled your code and determined the need for this? We actually run our servers almost entirely in debug mode, but special-case a few files that profiled as performance critical. That's worked great, and has kept things debuggable when there are problems.
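
One way to do that special-casing in MSVC (a sketch; the file and function names are made up) is to override optimization per file in the project settings, or to force it on around hot code with #pragma optimize, so the rest of the debug build stays at /Od:

    // hot_path.cpp -- hypothetical performance-critical file
    #pragma optimize("gt", on)   // speed optimizations from here on, even
                                 // though the rest of the Debug build is /Od
    void ProcessSamples(float* samples, int count)
    {
        for (int i = 0; i < count; ++i)
            samples[i] *= 0.5f;
    }
    #pragma optimize("", on)     // restore the command-line settings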

Not sure what kind of app you're making, but breaking up data structures to correspond to the way they are processed in code (for better cache locality) was a much bigger win for us.
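
For instance (a made-up illustration of that kind of restructuring), splitting an array-of-structs into a struct-of-arrays keeps the fields a hot loop actually touches packed together in cache:

    // Array-of-structs: a loop that only reads x/y/z still drags the
    // cold fields through the cache, line by line.
    struct ParticleAoS { float x, y, z; int flags; char debugName[32]; };
    ParticleAoS aos[10000];

    // Struct-of-arrays: hot data is contiguous, cold data is elsewhere.
    struct ParticlesSoA {
        float x[10000], y[10000], z[10000]; // touched every frame
        int   flags[10000];                 // rarely touched
        char  debugName[10000][32];         // almost never touched
    };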

+1  A: 

LTCG allows the linker to do the actual code generation, so it can do more optimization, such as inlining.

If you don't use LTCG, the compiler is the only component in the build process that can inline a function, i.e. replace a call to a function with the body of the function, which is usually a lot faster. The compiler will only do so for functions where this actually yields an improvement.

It can therefore only inline functions whose bodies it can see. If a function in one cpp file calls another function that is not implemented in the same cpp file (or in an included header), the compiler doesn't have the body of the callee and therefore cannot inline it.

But if you use LTCG, it's the linker that does the inlining, and it has all the functions in all of the cpp files of the entire project, minus referenced lib files that were not built with LTCG. This gives the linker (which effectively becomes the compiler) a lot more to work with.
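
A minimal two-file sketch of what that means (file and function names are invented for illustration):

    // util.cpp
    int add(int a, int b) { return a + b; }

    // main.cpp
    int add(int a, int b);   // declaration only; the body is in util.cpp
    int main()
    {
        // Compiling main.cpp alone, the compiler sees just the declaration,
        // so this stays a real call. With /GL on both files and /LTCG at
        // link time, the linker has both bodies and can inline add() here.
        return add(2, 3);
    }

    // Build with LTCG from the command line:
    //   cl /O2 /GL util.cpp main.cpp /link /LTCG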

But it also makes your build take longer, especially with incremental changes. You might want to turn LTCG on only in a special build configuration that you ship to your customers (and your testers should be testing that same build).

Note that LTCG is not the same as profile-guided optimization.

Dave Van den Eynde
A: 

I've found the downsides are longer compile times and that the .obj files produced in that mode (with LTCG turned on) can be really massive. For example, .obj files that might otherwise be 200-500 KB were about 2-3 MB. It just happened to me that compiling a bunch of projects in my chain led to a 2 GB folder, the bulk of which was .obj files.