I noticed this interesting behaviour of the g++ compiler: if I add the -O3 flag, I get

otsu.cpp:220: warning: ‘x’ may be used uninitialized in this function

However, when I do not use optimization and instead use the debug flag -g, I get no warnings at all. Now, I trust the compiler more when the -g flag is on; however, I'm wondering whether this is well-defined behaviour that should be expected?

For clarity, the code that causes this is something along these lines:

int x; //uninitialized


getAValueForX( &x ); // function makes use of x,
                     // but x is uninitialized

where

void getAValueForX( int *x )
{
    *x = 4;
}

though the real code is obviously more complex.

+1  A: 

My compiler flags:

CFLAGS=-W -Wall\
 -Wno-non-template-friend\
 -Wold-style-cast\
 -Wsign-promo\
 -Wstrict-null-sentinel\
 -Woverloaded-virtual
# -Weffc++

-Weffc++ can be really annoying, so sometimes I try it but generally I keep it turned off. Try these - and others in the manual - and let's see what we see.

Notinlist
+13  A: 

That's expected. Enabling optimization causes extra data-flow analysis to run, and that is how gcc finds the uninitialized variables. It's in the manual:

. . . these warnings depend on optimization

http://gcc.gnu.org/onlinedocs/gcc/Warning-Options.html

Kyle Butt
+1 for the manual link; explains it all. Which is probably why your answer was so much shorter than mine ;)
Clifford
+3  A: 

This is actually very common with gcc. And yes, this is to be expected.

As far as I understand, in order to optimize, the compiler generates a lot of metrics and transforms the code (or, more precisely, the representation it has of the code) in a way that allows the detection of uninitialized or unused variables, for example (there are a few other warnings like that; I don't remember the full list).

Producing the same warnings without optimization would require performing all this analysis and then throwing it away. That would slow down compilation significantly for no good purpose (especially as, in a debug build, the compiler is not supposed to rearrange the code).

PierreBdR
+1  A: 

Yes, this is well-defined behavior. When GCC's optimizer is not enabled, it doesn't do certain types of execution-path checking (so as to avoid the performance penalty of doing these kinds of checks). Certain situations, such as use of uninitialized variables, can only be detected when these extra checks are performed. Hence, with -O0, GCC is unable to warn about these conditions.

Dan Moulding
+1  A: 

Well, the fact that the compiler can move things around when optimizing can cause issues and lead to undefined behaviour (as the manual states below); I think it would be helpful to see the actual code to try to make sense of this.

The shortcuts taken by optimized code may occasionally produce surprising results: some variables you declared may not exist at all; flow of control may briefly move where you did not expect it; some statements may not be executed because they compute constant results or their values were already at hand; some statements may execute in different places because they were moved out of loops.

bahree
It has nothing to do with whether things get moved around. With optimization disabled, the compiler never even performs the analysis that could detect uninitialized variables. Optimization won't ever *cause* a variable to not be initialized.
Rob Kennedy
+3  A: 

The code flow analysis that the optimiser performs allows it to detect potential problems that normal (and faster) compilation cannot detect. The problem was always there, the compiler just didn't check for it.

Even when this warning is issued, it may in fact not be a problem, given how the function is actually used in practice. The compiler assumes that all possible values of its argument types (and of any external variables used within the function) may occur in all possible combinations, leading to at least one path where the variable is used without being assigned a value. Your actual usage will have a far more restrictive set of possible states, so that path may never occur in practice. The simple solution is just to initialise the variable, if only to quiet the compiler; it will cost you nothing.

I always use the optimiser as a form of poor-man's static analysis, even when I ultimately do not intend to use it in the production code. Equally, I often use more than one compiler for the same reason. Some compilers perform checks that others don't, or they generate differently worded messages for the same errors, which often helps in interpreting some of the more obtuse messages.

Quote:

I trust the compiler more when the -g flag is on

While it is true that if a compiler has a bug it is likely to be in the optimiser (it being the most complex part), for a mature compiler such as GCC this would be a very rare find. Conversely, people often find that their working code fails when optimised; most often the code was always flawed (perhaps it relied on undefined or compiler-defined behaviour), and the optimiser has merely exposed that flaw. So if you find your code breaking under optimisation, suspect the code before the compiler; Occam's Razor applies.

Clifford
+1 I find trying a variety of compilers can offer an interesting perspective on your code, and call attention to implicit assumptions you've made that might not hold somewhere else. This is often neglected.
asveikau
A: 

I see the same issue with the MSVC 6 compiler. Initializing the variable in question removes the possibility of a bad path from the compiler's viewpoint.

EvilTeach