Investigating a bug, I discovered it was due to this weirdness in C#:

sbyte[] foo = new sbyte[10];
object bar = foo;
Console.WriteLine("{0} {1} {2} {3}",
        foo is sbyte[], foo is byte[], bar is sbyte[], bar is byte[]);

The output is "True False True True", while I would have expected "bar is byte[]" to return False. Apparently bar is both a byte[] and an sbyte[]? The same happens for other signed/unsigned pairs like Int32[] vs UInt32[], but not for, say, Int32[] vs Int64[].
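
For example, the same pattern with int/uint:

    int[] baz = new int[10];
    object qux = baz;
    Console.WriteLine(qux is uint[]);  // True, same surprise
    Console.WriteLine(qux is long[]);  // False, as expected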

Can anyone explain this behavior? This is in .NET 3.5.

+1  A: 

Surely the output is correct. bar "is" both an sbyte[] and a byte[] because it is compatible with both: since bar is typed merely as object, it "could be" either signed or unsigned.

"is" is defined as "expression can be cast to type".

cdm9002
But by that argument, since `bar` is of type `object`, the base type of every other type, `bar is <any given type>` would return true, and that is not the case. Why can you cast an `sbyte[]` to a `byte[]` just because it happens to pass through a reference of type `object`?
Fredrik Mörk
+8  A: 

Ran the snippet through Reflector:

sbyte[] foo = new sbyte[10];
object bar = foo;
Console.WriteLine("{0} {1} {2} {3}", new object[] { foo != null, false, bar is sbyte[], bar is byte[] });

The C# compiler is optimizing the first two comparisons (foo is sbyte[] and foo is byte[]). As you can see, they have been optimized to foo != null and a constant false, respectively.
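
If memory serves, the compiler even warns about both tests at compile time (a sketch; the warning numbers are from memory):

    sbyte[] foo = new sbyte[10];
    bool a = foo is sbyte[];  // warning CS0183: expression is always of the
                              // provided type; compiled as (foo != null)
    bool b = foo is byte[];   // warning CS0184: expression is never of the
                              // provided type; compiled as false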

Benjamin Wegman
+4  A: 

Also interesting:

    sbyte[] foo = new sbyte[] { -1 };
    var x = foo as byte[];    // doesn't compile
    object bar = foo;
    var f = bar as byte[];    // succeeds
    var g = f[0];             // g = 255
Ben M
I'm missing something here. Isn't this what you'd expect? What is the oddity?
cdm9002
Not that `g = 255`, which is expected, but that `bar as byte[]` doesn't return null.
Ben M
Right -- so now that you've read my answer, you can deduce that the same kind of thing is happening here. With the first one, we know at compile time that this violates the rules of C#. With the second one, we don't know that. So we have to either (1) emit a method that implements all the rules of the C# language to do the cast, or (2) use the CLR cast rules, which are subtly different from C#'s rules for a tiny percentage of bizarre cases. We chose (2).
Eric Lippert
Yes -- it made sense after reading your explanation.
Ben M
+53  A: 

UPDATE: I've used this question as the basis for a blog entry, here:

http://blogs.msdn.com/ericlippert/archive/2009/09/24/why-is-covariance-of-value-typed-arrays-inconsistent.aspx

See the blog comments for an extended discussion of this issue. Thanks for the great question!


You have stumbled across an interesting and unfortunate inconsistency between the CLI type system and the C# type system.

The CLI has the concept of "assignment compatibility". If a value x of known data type S is "assignment compatible" with a particular storage location y of known data type T, then you can store x in y. If not, then doing so is not verifiable code and the verifier will disallow it.

The CLI type system says, for instance, that subtypes of reference type are assignment compatible with supertypes of reference type. If you have a string, you can store it in a variable of type object, because both are reference types and string is a subtype of object. But the opposite is not true; supertypes are not assignment compatible with subtypes. You can't stick something only known to be object into a variable of type string without first casting it.
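
In code, a minimal sketch:

    string s = "hello";
    object o = s;           // fine: string is assignment compatible with object
    // string t = o;        // doesn't compile: the reverse direction needs a cast
    string t = (string)o;   // OK with an explicit cast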

Basically "assignment compatible" means "it makes sense to stick these exact bits into this variable". The assignment from source value to target variable has to be "representation preserving". See my article on that for details.

http://blogs.msdn.com/ericlippert/archive/2009/03/19/representation-and-identity.aspx

One of the rules of the CLI is "if X is assignment compatible with Y, then X[] is assignment compatible with Y[]".

That is, arrays are covariant with respect to assignment compatibility. This is actually a broken kind of covariance; see my article on that for details.

http://blogs.msdn.com/ericlippert/archive/2007/10/17/covariance-and-contravariance-in-c-part-two-array-covariance.aspx
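
The classic demonstration of the brokenness (my sketch, not from the article):

    string[] strings = new string[1];
    object[] objects = strings;    // legal: arrays are covariant
    objects[0] = new object();     // compiles, but throws
                                   // ArrayTypeMismatchException at run time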

That is NOT a rule of C#. C#'s array covariance rule is "if X is a reference type implicitly convertible to reference type Y, then X[] is implicitly convertible to Y[]". That is a subtly different rule, and hence your confusing situation.

In the CLI, uint and int are assignment compatible. But in C#, the conversion between int and uint is EXPLICIT, not IMPLICIT, and these are value types, not reference types. So in C#, it's not legal to convert an int[] to a uint[].
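
For instance:

    int[] ints = new int[] { 1, 2, 3 };
    uint[] uints = ints;    // doesn't compile: no such conversion in C#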

But it IS legal in the CLI. So now we are faced with a choice.

1) Implement "is" so that when the compiler cannot determine the answer statically, it actually calls a method which checks all the C# rules for identity-preserving convertibility. This is slow, and 99.9% of the time matches what the CLR rules are. But we take the performance hit so as to be 100% compliant with the rules of C#.

2) Implement "is" so that when the compiler cannot determine the answer statically, it does the incredibly fast CLR assignment compatibility check, and live with the fact that this says that a uint[] is an int[], even though that would not actually be legal in C#.

We chose the latter. It is unfortunate that C# and the CLI specifications disagree on this minor point but we are willing to live with the inconsistency.
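
As a footnote of my own (not part of the original answer): you can watch the CLI rule win by laundering the array through object, and if you need the stricter C#-style answer at run time, comparing runtime types gives an exact-type check:

    int[] ints = new int[] { 1, 2, 3 };
    object o = ints;
    uint[] uints = (uint[])o;     // succeeds at run time: the CLR allows it
    Console.WriteLine(uints[0]);  // 1
    Console.WriteLine(o is uint[]);                   // True (CLI rules)
    Console.WriteLine(o.GetType() == typeof(uint[])); // False (exact type)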

Eric Lippert
Great read. Thanks for the answer.
Ben M
Hi Eric, out of curiosity: did you guys just decide to accept this inconsistency, or was it not foreseen? Just wondering.
Joan Venge
Deleted my post in deference to a much better and more in-depth answer.
LBushkin
Thanks for that clarification.
Fredrik Mörk
@Joan: I do not know; that was before my time. Remember, C# and the CLR were evolving at the same time and all kinds of decisions were being made on the basis of incomplete information about what the language and runtime rules were going to be. My _suspicion_ is that this one simply "fell through the cracks", and by the time we realized it, it was too late. That's just a guess though. Nothing about this issue appears in the initial language design notes archives from 1999.
Eric Lippert
Thanks Eric.
Joan Venge
Out of curiosity, why not fix this in the spec? I understand that it would be a breaking change, but since in practice Visual C# is effectively _the_ C# compiler, and everyone else follows suit even in spec deviations, wouldn't it make sense to reconcile this now to put the matter to rest?
Pavel Minaev
Good question. Obviously we prefer the spec and the implementation to be the same. On points where they differ, we prefer to be in the situation where the spec says what we would like to be true. Should we make the spec specify a language feature we don't like and do not want, just so that we can make the implementation consistent? (And of course, that would entail removing all the existing compile-time checks that enforce the desired semantics.) Given those choices, we'd rather stay inconsistent. That's the least of all the evils.
Eric Lippert
Also, Eric, does it bother you when problems like these get stuck in the .NET platform? I ask because I always wonder about backwards compatibility vs. continuously improving the architecture. If you were allowed to ignore backwards compatibility, do you think the whole .NET platform would be infinitely superior? When we integrate our code with 3rd-party systems at each iteration, we have to do a lot of work to "upgrade". Why shouldn't the same apply if someone wants to jump from .NET 2.0 to 3.5? Thanks.
Joan Venge
Thanks for the detailed explanation Eric!
Clusterflock
If we never had to worry about any backwards compatibility problems then yes, we could fix problems and make improvements that were breaking changes with far less expense. But that's contrary to fact. The fact is that breaking changes massively increase upgrade costs. We seek to make your upgrade costs lower, to encourage upgrading to better products. Sometimes the benefit of taking a breaking change is worth the cost, but you can't just wish the costs away.
Eric Lippert
Thanks Eric. It's always inspiring to get a glimpse of a great mind like yours.
Joan Venge