views: 705
answers: 7

Ever since I started using Java, it has been very aggravating that it doesn't support implicit conversions from numeric types to booleans, so you can't write things like:

if (flags & 0x80) { ... }

instead you have to go through this lunacy:

if ((flags & 0x80) != 0) { ... }

It's the same with null and objects. Every other C-like language I know, including JavaScript, allows it, so I thought Java was just moronic, but I've just discovered that C# is the same (at least for numbers; I don't know about null/objects): http://msdn.microsoft.com/en-us/library/c8f5xwh7(VS.71).aspx

Microsoft changed this from C++ on purpose, so why? Clearly I'm missing something. Why change (what I thought was) the most natural thing in the world into something longer to type? What on Earth is wrong with it?
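For reference, a minimal Java sketch of the explicit tests the language requires (the class and method names are my own, for illustration only):

```java
public class FlagCheck {
    // Java requires an explicit comparison; `if (flags & 0x80)` won't compile,
    // because the condition of an `if` must have type boolean.
    public static boolean highBitSet(int flags) {
        return (flags & 0x80) != 0;
    }

    // Same story for references: an explicit null check, never `if (obj)`.
    public static boolean isPresent(Object obj) {
        return obj != null;
    }
}
```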

+25  A: 

Both Java and C# abandoned implicit conversions to booleans to reduce the chance of programmer error.

For example, many programmers would accidentally write:

if( x = 5 ) { ... }

instead of:

if( x == 5 ) { ... }

These of course result in completely different behavior: the first statement performs an assignment (which, in C, evaluates to 5 and is therefore always true), while the second performs a comparison. In the past, developers would sometimes write such comparisons in reverse ("Yoda conditions") to avoid the pitfall, since:

if( 5 = x ) { ... } // doesn't compile.

Now, in C#, you can still create implicit conversion operators to bool for your own types - although it is rarely advisable, since most developers don't expect it:

public class MyValue
{
   public int Value { get; set; }

   public static implicit operator bool( MyValue mb )
   {
       return mb.Value != 0;
   }
}

MyValue x = new MyValue() { Value = 10 };
if( x ) { ... } // perfectly legal, compiler applies implicit conversion
LBushkin
+32  A: 

For clarity. It makes the following mistake simply illegal:

int x = ...;

if (x = 0)  // assigns 0 to x and always evaluates to false in C
   ....     // never executed

Note: most modern C/C++ compilers will give a warning (but not an error) on this straightforward pattern, but many variations are possible. It can creep up on you.

Henk Holterman
In theory, in C# it's possible that the type of `x` implements implicit conversion operators from `int` and to `bool`, in which case this would compile in C#. If you qualify that `x` is a built in numeric type (like `int`), then your example is absolutely correct.
LBushkin
@LBushkin, you're right, I'll add it. But assuming a custom conversion is a little far fetched here.
Henk Holterman
Many of the Underhanded C submissions feature this little chestnut tucked away somewhere in the code.
Jason
+7  A: 

Maybe they felt that being more explicit was more in line with a strongly typed language.

JohnB
+1  A: 

Even the most experienced programmers have problems with an implicit conversion to boolean. I for one appreciate this little feature.

ChaosPandion
+6  A: 

Implicit conversion of any int value (such as (flags & 0x80)) to a boolean implies a language defined mapping from an int value to a boolean. C did this, and caused a huge amount of confusion and a lot of programmer error. There is no good reason why a zero int value ALWAYS means true (or false) and a lot of good reasons why you might want to leave the decision to the programmer. For these reasons implicit conversion to boolean has been abandoned by most modern languages.

If typing seven extra characters every time you do a bit test constitutes 'lunacy', you may be in the wrong profession. And if you are doing bit tests on ints extremely frequently, you might want to consider whether you are prematurely optimizing to save memory.
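If the extra characters really grate, the comparison can be written once in a small helper; a sketch, with names of my own choosing:

```java
public final class Bits {
    private Bits() {}  // static utility class, not meant to be instantiated

    // Centralizes the `!= 0` test the language insists on.
    public static boolean anySet(int flags, int mask) {
        return (flags & mask) != 0;
    }
}
```

Call sites then read `if (Bits.anySet(flags, 0x80))`, which is arguably clearer than the raw expression anyway.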

DJClayworth
Such "lunacy" is why we have the `++` operator, which used to optimize to INCR instructions but now just serves as a typing shortcut.
Barry Brown
I agree. Implicit conversions to booleans can be useful, but declaring that some arbitrary values are false and some other equally arbitrary values are true is not the way. In Ruby and many Lisps for example, *everything* is true, except for `false` and `nil`. That's a *simple* and obvious rule. "Some numbers are true and some are false, unless you write your own number type in which case they are neither true nor false" is not exactly a simple rule. Also, why should numbers be true or false but not, say, matrices? Or `User`s?
Jörg W Mittag
+2  A: 

Some programming languages do no automatic coercion at all. An integer, for example, can only be compared to another integer; assignment to a non-integer variable results in an error. Such is the hallmark of a strongly-typed language.

That Java does any coercion is a convenience for you and breaks the strong-typing model.

Mapping the entire range of integers -- or the even larger range of floats -- onto the two boolean values is fraught with disagreement over arbitrary assignment of "truthness" and "falseness".

  • What values map onto false and true? If you're C, only zero maps to false and all other values are true. If you're the bash shell, it's reversed.
  • How should negative values be mapped?

When you try to automatically convert a double to an integer, Java flags this as a "possible loss of precision" error. By analogy, converting a number to a boolean should also count as a loss of precision. Instead, Java chose not to support it syntactically.
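To illustrate the analogy: narrowing a `double` to an `int` in Java requires an explicit cast that makes the information loss visible at the call site. A sketch (the class name is my own):

```java
public class Narrowing {
    public static int truncate(double d) {
        // int i = d;      // does not compile: possible lossy conversion
        return (int) d;    // explicit cast: truncates toward zero
    }
}
```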

Barry Brown
I know of no debate of how to map `int` to `bool`. Could you provide a link?
Henk Holterman
There's no official debate, per se. But since some languages have different ideas of how to do the conversion, there's clearly no standard.
Barry Brown
+7  A: 

You've got it backward.
It's actually C that does not support booleans, so `if` (and every other conditional statement) actually expects an int value, not a boolean. An int value of 0 is then treated as false and any other value as true.

Some people find this behavior ambiguous, and as others have pointed out, it can lead to many errors. Because of this, Java's designers opted to support only boolean types in conditional statements. And when Microsoft decided to implement MS-Java (AKA C#), they borrowed this design principle.

If you don't like it, you can program in a variety of languages that do not have this restriction.

Alexander Pogrebnyak
+1 for MS-Java (AKA C#)
Willi