views: 159
answers: 2
It looks like the expression tree compiler is supposed to match the C# specification in most behaviors, but unlike C# it has no support for conversion from decimal to any enum-type:

using System;
using System.Linq.Expressions;

class Program
{
  static void Main()
  {
    Func<decimal, ConsoleColor> converter1 = x => (ConsoleColor) x;
    ConsoleColor c1 = converter1(7m); // fine

    Expression<Func<decimal, ConsoleColor>> expr = x => (ConsoleColor) x;

    // System.InvalidOperationException was unhandled
    // No coercion operator is defined between types
    // 'System.Decimal' and 'System.ConsoleColor'.

    Func<decimal, ConsoleColor> converter2 = expr.Compile();

    ConsoleColor c2 = converter2(7m);
  }
}

Other rarely used explicit C# conversions, like double -> enum-type, exist and work as described in the C# specification, but decimal -> enum-type does not. Is this a bug?
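A minimal sketch contrasting the two behaviors (value 7 corresponds to ConsoleColor.Gray); the decimal case throws when the lambda is converted to an expression tree, so it is wrapped in a try/catch here:

```csharp
using System;
using System.Linq.Expressions;

class Contrast
{
    static void Main()
    {
        // double -> enum-type: accepted, and the tree compiles fine.
        Expression<Func<double, ConsoleColor>> fromDouble = x => (ConsoleColor)x;
        Console.WriteLine(fromDouble.Compile()(7.0)); // Gray

        // decimal -> enum-type: the same cast fails when the
        // expression tree is constructed at runtime.
        try
        {
            Expression<Func<decimal, ConsoleColor>> fromDecimal =
                x => (ConsoleColor)x;
        }
        catch (InvalidOperationException e)
        {
            Console.WriteLine(e.Message);
        }
    }
}
```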

+3  A: 

Not a real answer yet, I'm investigating, but the first line is compiled as:

Func<decimal, ConsoleColor> converter1 = x => (ConsoleColor)(int)x;

If you try to create an expression from the previous lambda, it will work.
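For instance, a sketch of that working form, with the intermediate cast to the enum's underlying type written out explicitly:

```csharp
using System;
using System.Linq.Expressions;

class Workaround
{
    static void Main()
    {
        // Spelling out the cast through int, as the compiler does for
        // the plain delegate, lets the expression tree build and compile.
        Expression<Func<decimal, ConsoleColor>> expr =
            x => (ConsoleColor)(int)x;

        Func<decimal, ConsoleColor> converter = expr.Compile();
        Console.WriteLine(converter(7m)); // Gray
    }
}
```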

EDIT : In the C# spec, §6.2.2, you can read:

An explicit enumeration conversion between two types is processed by treating any participating enum-type as the underlying type of that enum-type, and then performing an implicit or explicit numeric conversion between the resulting types. For example, given an enum-type E with an underlying type of int, a conversion from E to byte is processed as an explicit numeric conversion (§6.2.1) from int to byte, and a conversion from byte to E is processed as an implicit numeric conversion (§6.1.2) from byte to int.

So explicit conversions between decimal and enum-types are handled by going through the underlying type; that's why you get the nested casts (decimal to int, then int to the enum-type). But I can't see why the compiler doesn't process the lambda body the same way in both cases.
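Building the tree by hand shows the same split: a sketch, assuming the Expression factory methods reject the direct coercion but accept the two-step form the spec describes:

```csharp
using System;
using System.Linq.Expressions;

class ManualConvert
{
    static void Main()
    {
        var x = Expression.Parameter(typeof(decimal), "x");

        // A direct decimal -> enum Convert node: the factory throws
        // "No coercion operator is defined between types ...".
        try
        {
            Expression.Convert(x, typeof(ConsoleColor));
        }
        catch (InvalidOperationException)
        {
            Console.WriteLine("no direct coercion");
        }

        // Going through the underlying type, as §6.2.2 describes, works.
        var viaInt = Expression.Convert(
            Expression.Convert(x, typeof(int)), typeof(ConsoleColor));

        var f = Expression.Lambda<Func<decimal, ConsoleColor>>(viaInt, x)
            .Compile();
        Console.WriteLine(f(7m)); // Gray
    }
}
```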

Romain Verdier
The compiler probably emits the nested cast in another pass. In this case, it just creates a Convert node which fails at runtime. Whether it's a compiler bug that should emit the nested conversion or an Expression API bug that should understand decimal to enum conversion is left to the reader. I for one think that it's csc's responsibility to emit the proper convert node.
Jb Evain
I agree. In fact you get the error on the "expr = lambda" line. So the compiler doesn't even try to emit the additional Convert node or anything else; it considers the lambda body invalid, which it is not, according to the C# spec.
Romain Verdier
For the `double -> enum-type` conversion, csc doesn't emit a `double -> int` Convert, just a direct `double -> enum-type` one, and the expression tree compiler understands this fine...
ControlFlow
ControlFlow: that's because decimal is not a traditional value type. It converts to int not by a standard conversion but by an operator emitted by the compiler. Romain: actually I think that the compiler emits the node decimal -> enum thinking that it is valid, and the expression tree factory method bails out.
Jb Evain
+12  A: 

It is probably a bug, and it is probably my fault. Sorry about that.

Getting decimal conversions right was one of the hardest parts of getting the expression tree code correct in the compiler and the runtime, because decimal conversions are actually implemented as user-defined conversions in the runtime, but treated as built-in conversions by the compiler. Decimal is the only type with this property, and therefore there is all kinds of special-purpose gear in the analyzer for these cases. In fact, there is a method called IsEnumToDecimalConversion in the analyzer to handle the special case of nullable enum to nullable decimal conversion; quite a complex special case.

Odds are good that I failed to consider some case going the other way, and generated bad code as a result. Thanks for the note; I'll send this off to the test team and we'll see if we can get a repro going. Odds are good that if this does turn out to be a bona fide bug, this will not be fixed for C# 4 initial release; at this point we are taking only "user is electrocuted by the compiler" bugs so that the release is stable.

Eric Lippert
I didn't know humans were harmed in the making of the C# language :)
Joan Venge
"decimal conversions are actually implemented as user-defined conversions in the runtime, but treated as built-in conversions by the compiler": What does this mean, and why was it done this way?
Brian
@Brian: When you do a representation-changing conversion, say int to double, there is an IL instruction that does exactly that conversion. When you do decimal to double, we actually generate code to call a method to do the conversion; there is no built-in-to-the-CLR conversion instruction for decimals. But from a *language* perspective we want the decimal conversions to *appear* to be built-in-to-the-language conversions; we have different rules for built-in and user-supplied conversions. So we have to build some special scenery to hide what is happening behind the scenes with decimals.
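That difference is visible in the Expression API itself: a sketch showing that a Convert node between primitive types carries no method, while a decimal conversion resolves to one of Decimal's user-defined operators (op_Explicit):

```csharp
using System;
using System.Linq.Expressions;

class ConvertMethods
{
    static void Main()
    {
        var i = Expression.Parameter(typeof(int), "i");
        var d = Expression.Parameter(typeof(decimal), "d");

        // int -> double is a built-in representation change: no method.
        Console.WriteLine(
            Expression.Convert(i, typeof(double)).Method == null); // True

        // decimal -> int resolves to the operator defined on Decimal.
        Console.WriteLine(
            Expression.Convert(d, typeof(int)).Method.Name); // op_Explicit
    }
}
```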
Eric Lippert
OK, that makes sense. Thank you.
Brian