I'm quite happy that, in C, things like this are bad code:

(var_a == var_b) ? TRUE : FALSE

However, what's the best way of dealing with this:

/* Header stuff */
#define INTERESTING_FLAG 0x80000000
typedef short int BOOL;

void func(BOOL);

/* Code */
int main(int argc, char *argv[])
{
        unsigned long int flags = 0x00000000;

        ... /* Various bits of flag processing */

        func(flags & INTERESTING_FLAG); /* func never receives a non-zero value
                                         * as the top bits are cut off when the
                                         * argument is cast down to a short 
                                         * int
                                         */
}

Is it acceptable (for whatever value of acceptable you're using) to have (flags & FLAG_CONST) ? TRUE : FALSE?

A: 

You could avoid this in a couple of different ways:

First off

void func(unsigned long int);

would take care of it...

Or

if(flags & INTERESTING_FLAG)
{
  func(true);
}
else
{
  func(false);
}

would also do it.

EDIT: (flags & INTERESTING_FLAG) != 0 is also good. Probably better.
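
For example (a sketch reusing the question's names), the call would then be:

func((flags & INTERESTING_FLAG) != 0); /* != 0 yields 0 or 1, which fits in
                                        * a short int without losing the flag */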

David Oneill
+4  A: 

Set your compiler flags as anally as possible, to warn you of any cast that loses bits, and treat warnings as errors.
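
For instance (an illustrative sketch, assuming GCC or a compiler with equivalent options), building with -Wconversion -Werror makes the question's call fail to compile instead of silently dropping the flag bit:

/* gcc -Wconversion -Werror rejects the implicit narrowing conversion below */
typedef short int BOOL;

void func(BOOL);

void process(unsigned long flags)   /* process() is just an illustrative name */
{
    func(flags & 0x80000000UL);     /* unsigned long narrowed to short int */
}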

Thomas
+1. This doesn't tell you what to do about it, but it is important.
David Oneill
That tells me about the problem, but it doesn't solve it…
me_and
+6  A: 

I would in either case call func with (flags & INTERESTING_FLAG) != 0 as the argument, to indicate that a boolean parameter is required and not the arithmetic result of flags & INTERESTING_FLAG.

jarnbjo
Vatine
@vatine: That does have the disadvantage that comparisons to non-zero values are (very marginally) less efficient than comparisons to zero, as a comparison to zero can generally be implemented as a single machine operation. Doesn't make much of a difference (and with a good compiler it may well make no difference at all), but still…
me_and
@me_and: No difference at all with the two compilers I just tried. It's a pretty simple transform for a compiler to get.
Stephen Canon
+5  A: 

I'd prefer (flags & CONST_FLAG) != 0. Better still, use the _Bool type if you have it (though it's often disguised as bool).
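
A minimal sketch of the _Bool approach (assuming a C99 compiler and <stdbool.h>; process() is just an illustrative name):

#include <stdbool.h>

void func(bool b);

void process(unsigned long flags)
{
    /* conversion to bool yields 1 for any non-zero value, so the high
     * flag bit is not lost */
    func(flags & 0x80000000UL);
}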

Jerry Coffin
Does `_Bool`/`bool` map all non-zero values to `true`, then? I've not heard of it before…
me_and
@me_and:yes, a cast to _Bool/bool makes all non-zero values true, and zero false. It was added in C99, but since C++ also has `bool`, a fair number of C compilers include it as well, even though they don't implement (even close to) all of C99.
Jerry Coffin
Accepting: has a good pre-C99 option (that I'm probably going to have to use; I need to be able to build on non-C99 compilers, sadly), and has the ideal standards-supplied solution too :D
me_and
A: 

This is partially off topic:

I'd also create a helper function that makes the purpose of the check obvious to the reader, so you don't fill your code with explicit flag checking all over the place. Typedef'ing the flag type would also make it easier to change the flag's type and implementation later.

Modern compilers support the inline keyword, which can get rid of the performance overhead of a function call.

#include <stdbool.h>   /* for bool (C99) */

typedef unsigned long int flagtype;
...
inline bool hasInterestingFlag(flagtype flags) {
   return ((flags & INTERESTING_FLAG) != 0);
}
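
Usage might then look like this (sketch, reusing the question's names):

func(hasInterestingFlag(flags));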
Laserallan
+1  A: 

This may not be a popular solution, but sometimes macros are useful.

#define to_bool(x) (!!(x))

Now we can safely have anything we want without fear of overflowing our type:

func(to_bool(flags & INTERESTING_FLAG));

Another alternative might be to define your boolean type to be an intmax_t (from stdint.h) so that it's impossible for a value to be truncated into falseness.
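
A sketch of that alternative (assuming <stdint.h> is available):

#include <stdint.h>

typedef intmax_t BOOL; /* widest signed integer type, so a flag bit from a
                        * narrower unsigned type is not truncated to zero */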

While I'm here, I want to say that you should be using a typedef for defining a new type, not a #define:

typedef short Bool; // or whatever type you end up choosing

Some might argue that you should use a const variable instead of a macro for numeric constants:

const unsigned long INTERESTING_FLAG = 0x80000000UL;

Overall there are better things you can spend your time on, but using a macro where a typedef belongs is a bit silly.

Chris Lutz
Good spot. I used `typedef` in my code, and completely forgot about its very existence when writing this question!
me_and
+4  A: 

Some people don't like it, but I use !!.

i.e.

!!(flags & CONST_FLAG)

(not as a to_bool macro as someone else suggested, just straight in the code).

If more people used it, it wouldn't be seen as unusual, so start using it!!

tony
Chris Lutz
A: 

Do you have anything against

 flags & INTERESTING_FLAG ? TRUE : FALSE

?

Bernd Jendrissek
A: 

This is why you should only use values in a "boolean" way when those values have explicitly boolean semantics. Your value does not satisfy that rule, since it has pronounced integer semantics (or, more precisely, bit-array semantics). In order to convert such a value to boolean, compare it to 0:

func((flags & INTERESTING_FLAG) != 0); 
AndreyT