I heard some people complaining about including the Windows header file (windows.h) in a C++ application and using it. They mentioned that it is inefficient. Is this just an urban legend, or are there real, hard facts behind it? In other words, if you believe it is efficient or inefficient, please explain how that can be, with facts.

I am no C++ Windows programming guru. Detailed explanations would really be appreciated.


*Edit: I want to know about both compile time and execution time. Sorry for not mentioning it.

A: 
  • Just including the header without using it will not have any effect on runtime efficiency.
  • It will, however, affect compilation time.
Betamoo
This is not true...it has macros in it that are *very* obnoxious, like "min" and "max".
Zan Lynx
@Zan I do not think that these kinds of macros cause huge problems with compilation time..
Betamoo
@Betamoo: they sure do when you have _member_ functions `min` or `max` (or any other name that clashes with one of Win32 macros) anywhere! That's why it makes _C++_ programmers cry, and not C ones.
Pavel Minaev
@Pavel I think we are concerned with compilation time, not readability and name-conflict issues...
Betamoo
I usually `#undef min` `#undef max` straight after including windows.h in my projects :) but of course there are still problems with functions like "CreateFile" or "CreateWindow" and other things like that which you might want to use as member function names...
Dean Harding
@Betamoo: What kind of compile time are you talking about if the program cannot compile due to name clashes? Give yourself another try with another question.
Viet
I guess it's a question of precise wording. Your answer currently reads "will not have any effects in runtime", which, if read literally, is provably false due to macro substitution potentially changing semantics of the program. At least I think that's what Zan originally referred to in his comment.
Pavel Minaev
+3  A: 

If you precompile it, then the compilation speed difference is barely noticeable. The downside to precompiling is that you can only have one precompiled header per project, so people tend to make a single "precompiled.h" (or "stdafx.h") and include windows.h, boost, the STL and everything else they need in there. Of course, that means you end up including windows.h stuff in every .cpp file, not just the ones that need it. That can be a problem in cross-platform applications, but you can get around it by doing all your Win32-specific stuff in a static library (that has windows.h precompiled) and linking to that in your main executable.
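
A minimal sketch of what that single shared header might look like (the name precompiled.h is illustrative; Visual Studio's wizard-generated default is stdafx.h):

// precompiled.h -- the one precompiled header for the whole project (illustrative name)
#pragma once

#define WIN32_LEAN_AND_MEAN   // trim the less commonly used parts of the Win32 headers
#include <windows.h>

// other heavy, rarely changing headers typically go here as well
#include <string>
#include <vector>

Every .cpp file then starts with #include "precompiled.h"; under MSVC the header is built once with /Yc and reused by the other translation units with /Yu.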

At runtime, the stuff in windows.h is about as bare-metal as you can get in Windows. So there's really no "inefficiencies" in that respect.

I would say that most people doing serious Windows GUI work are using a third-party library (Qt, wxWidgets, MFC, etc.), which is typically layered on top of the Win32 API defined in windows.h (for the most part), so, as I said, on Windows the stuff in windows.h is basically the bare metal.

Dean Harding
+1  A: 

There are multiple places where efficiency comes into play.

Including <windows.h> will substantially increase compile times and bring in many symbols and macros. Some of these symbols or macros may conflict with your code. So from this perspective, if you don't need <windows.h>, bringing it in is inefficient at compile time.
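
As a small sketch of such a conflict (the min/max macros also came up in the comments above): unless you define NOMINMAX before the include, windows.h defines function-like min and max macros that mangle perfectly ordinary C++ before the compiler ever sees it.

#define NOMINMAX        // ask windows.h not to define the min/max macros
#include <windows.h>
#include <algorithm>

int largest(int a, int b)
{
    // without NOMINMAX, the max macro would expand here and this line would not compile
    return std::max(a, b);
}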

The increased compile time can be mitigated somewhat by using precompiled headers, but this also brings with it a little more codebase complexity (you need at least two more files for the PCH) and some headaches unique to PCHs. Nonetheless, for large Windows projects, I usually use a PCH. For toy or utility projects, I typically don't, because it's more trouble than it's worth.

Efficiency also comes into play at runtime. As far as I know, if you #include <windows.h> but don't use any of its facilities, it will have no effect on the runtime behavior of your program, at least as far as calling extra code and that kind of thing goes. There may, however, be other runtime effects that I'm not aware of.

As for the big white-elephant question, "Is Windows efficient?", I'll not go into that here other than to say this: using Windows is much like anything else, in that how efficient or inefficient it is depends mostly on you and how well you know how to use it. You'll get as many different opinions on this as people you ask, however, ranging from "Winblowz sucks" to "I love Windows, it's awesome." Ignore them all. Learn to code in Windows if you need and want to, and then make up your own mind.

John Dibling
+3  A: 

windows.h is not a "code library". It's a header file, and doesn't contain any executable code as such (save for macro definitions, but those still aren't compiled - their expansions are, if and when you use them).

As such, looking at it strictly from a performance perspective, merely including it affects only compilation time. That effect is rather significant, though: for example, with the Platform SDK headers that come with VS2010, #include <windows.h> expands to ~2.4 MB of code, and all of that code has to be parsed and processed by the compiler.
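
If you want to measure that for yourself, one rough way (a sketch; test.cpp is just an illustrative file name) is to preprocess a file that contains nothing but the include and look at the size of the output:

// test.cpp -- nothing but the include, to see how much text it expands to
#include <windows.h>

With MSVC, running cl /P test.cpp writes the preprocessed output to test.i; its size is roughly the amount of text the compiler must parse in every translation unit that includes windows.h.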

Then again, if you use precompiled headers (and you probably should in this scenario), it wouldn't affect you.

Pavel Minaev
+1  A: 

In general, including windows.h is a necessity: if you need Windows functions, you have to include it. I think what you're referring to is (among other things) nested inclusion of windows.h. That is, you include a .h file that itself includes windows.h, and you also include windows.h in your .cpp file. This leads to inefficiencies, of course, so you have to study carefully which .h files are included by each .h file in your code, and avoid including, say, windows.h n times indirectly.

Diego Sevilla
Read the top of windows.h. It will not re-add its definitions on nested inclusion. There is code like #ifndef _WINDOWS_H_ ... (all the definitions) ... #endif
Erik Hermansen
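
(A minimal sketch of the include-guard pattern that comment describes; the guard macro name here is illustrative, not necessarily the one windows.h actually uses:)

#ifndef SOME_HEADER_GUARD      // first inclusion: the guard is not yet defined
#define SOME_HEADER_GUARD

/* ... all the declarations ... */

#endif // any later, nested inclusion skips straight past the body
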
+1  A: 

As has been noted, #including windows.h slows down compile time. You can use precompiled headers, or do a good job of isolating the Windows calls to only the modules that need them, to help with that.

Also, you can add these preprocessor definitions before the windows.h include, like so:

#define WIN32_LEAN_AND_MEAN   // excludes rarely used APIs such as Cryptography, DDE, RPC, Shell and Winsock
#define VC_EXTRALEAN          // trims even more of the less common APIs
#include <windows.h>

These will reduce the number of definitions pulled in from windows.h and its sub-included header files. You may find later on that you need to remove the lean-and-mean defines, but try them first and wait until the compiler complains about a missing definition.

The namespace conflicts are a legitimate gripe, but technically they have nothing to do with efficiency, unless you count the efficiency of your personal use of time. Considering how many thousands of definitions will be thrown into your namespace, conflicts are bound to occur at some point, and that can be severely irritating. Just follow the practice of isolating your Windows calls into modules and you will be fine. For this, put #include <windows.h> in the .cpp file, not the .h file.
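
A small sketch of that isolation (file and function names are purely illustrative): the header exposes plain C++, and windows.h is included only in the .cpp file, so nothing else in the project inherits its macros.

// file_utils.h -- no windows.h here, so clients never see its macros
#pragma once
#include <string>

bool file_exists(const std::string& path);

// file_utils.cpp -- the only translation unit that includes windows.h
#define WIN32_LEAN_AND_MEAN
#include <windows.h>
#include "file_utils.h"

bool file_exists(const std::string& path)
{
    // GetFileAttributesA returns INVALID_FILE_ATTRIBUTES if the path does not exist
    return GetFileAttributesA(path.c_str()) != INVALID_FILE_ATTRIBUTES;
}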

I see no basis for thinking that the runtime performance of the executable will be impacted by including windows.h. You are only adding a large number of definitions to the context used by the compiler. You aren't even putting all those definitions into your compiled code--only the allocations, function calls, and references that arise from the definitions actually used in your source code (.cpp).

Another argument could be made that the Windows API types and functions are inherently wasteful of resources or perform inefficiently. For example, if you want to create a file, there is some monstrous structure to pass to the Windows API. Still, I think most of this is penny-wise/pound-foolish thinking. Evaluate Windows API performance problems case by case and make replacements for inefficient code where possible and valuable.
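
To make that concrete (a sketch; whether the ceremony below is "monstrous" enough to matter is exactly the case-by-case judgement suggested above), creating a file through the raw Win32 API looks roughly like this:

#include <windows.h>

int main()
{
    // seven parameters, most of which are defaulted or ignored in the common case
    HANDLE h = CreateFileW(L"example.txt",        // file name (illustrative)
                           GENERIC_WRITE,         // desired access
                           0,                     // share mode: no sharing
                           nullptr,               // security attributes
                           CREATE_ALWAYS,         // creation disposition
                           FILE_ATTRIBUTE_NORMAL, // flags and attributes
                           nullptr);              // template file handle
    if (h != INVALID_HANDLE_VALUE)
        CloseHandle(h);
}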

Erik Hermansen