This is just a guess, but I think the reason might have to do with the fact that const values are put in metadata (which has subtle consequences all its own) when compiled. I wonder if maybe the compiler has some issues figuring out how to transform a var into metadata.
In Richter's CLR via C# (page 177):

"Defining a constant causes creation of metadata. When code refers to a constant symbol, compilers look up that symbol in the metadata of the assembly that defines that constant, extract the constant's value, and embed the value in the emitted IL code."
He goes on to note that this is why you can't take a reference to (get the memory address of) a constant. To make this a bit more explicit, in pseudo C#, if assembly A defines a const:
//Assembly A, Class Widget defines this:
public const decimal Pi = 3.14m;
then you have a consumer of A:
//somewhere in the Program.exe assembly
decimal myCircleCircum = 2 * Widget.Pi;
the resulting compiled IL of Program.exe would do something like this pseudocode:
// pseudo-IL just to illustrate what happens to the const
myCircleCircum = 2 * 3.14m;
Note that the consuming assembly has no idea that the decimal 3.14 had any relationship to assembly A at all--to Program.exe it is just a literal value. This, to me, is a reasonable way for the C# compiler to act--after all, assembly A declared explicitly that Pi is a constant (meaning the value is, once and for all, Pi = 3.14). But I'd venture to guess that 99% of C# developers do not understand the ramifications of this and might change Pi to 3.1415 on a whim.
Constants have a really poor cross-assembly versioning story (again, this comes from Richter), because a consumer of assembly A will not see a change if assembly A's constant changes (i.e. assembly A is recompiled with a new value). This can cause bugs that are really hard to track down for consumers of assembly A--so much so that I ban my team from using constants. Their slight perf gain is not worth the subtle bugs they can cause.
You can really only ever use a constant if you know that the value will never, ever change--and even with something as seemingly constant as Pi, you can't say for sure that you won't want your precision to change in the future.
If assembly A defines:
const decimal Pi = 3.14m;
then you build it and other assemblies consume it. If you then change assembly A to:
const decimal Pi = 3.1415m;
and rebuild assembly A, the consumer of assembly A will still have the old value, 3.14! Why? Because the original 3.14 was defined as a constant, which means consumers of assembly A were told that the value would never change--so they are free to bake that value of Pi into their own compiled output (if you rebuild the consumer of assembly A, it will then pick up the new value of Pi). Again, I don't see this as a problem with the way CSC handles constants--it's just that developers probably don't expect that a constant can be changed safely in some circumstances but not in others.

Safe: no consumer ever references your assembly by .dll alone (i.e. every consumer builds from source EVERY TIME). Unsafe: consumers have no idea when the source code of the assembly that defines the const has changed. The .NET documentation should probably make it much clearer that "constant" means you can't safely change the value in the source code once other assemblies have compiled against it.
For that reason, I'd strongly suggest not using constants and instead just making the value a static readonly field. How many values can you really say for certain are truly going to be constant for ever and always?
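To make that concrete, here is a minimal sketch of the readonly alternative (the Widget and WidgetLib names just continue the running example above and aren't anything special):

// In assembly A (e.g. WidgetLib.dll):
public static class Widget
{
    // public const decimal Pi = 3.14m;        // const: baked into each consumer when that consumer compiles
    public static readonly decimal Pi = 3.14m; // readonly: looked up in WidgetLib.dll at run time
}

// In a consumer (Program.exe) that was compiled against the 3.14 build of assembly A:
class Program
{
    static void Main()
    {
        // With const, this keeps printing 6.28 until Program.exe itself is recompiled.
        // With static readonly, it reflects whatever value the deployed WidgetLib.dll carries.
        System.Console.WriteLine(2 * Widget.Pi);
    }
}

The trade-off is that a readonly field can't be used where the language demands a compile-time constant (switch case labels, attribute arguments, default parameter values), but for any value you might ever want to revise, that's usually a price worth paying.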
The only real reason to use const over readonly, in my mind, is if something might have performance implications... but if you are running into that, I'd wonder whether C# is really the correct language for your problem. In short, to me, it is almost never a good idea to use constants. There are very few times where the tiny perf improvement is worth the potential problems.
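For what it's worth, here is a rough sketch of where that tiny perf difference comes from (the MathBits and MaxItems names are made up purely for illustration):

public static class MathBits
{
    public const int MaxItems = 100;              // embedded as a literal in every consumer's IL
    public static readonly int MaxItemsCap = 100; // read via a static field load (ldsfld) at run time
}

class Demo
{
    static void Main()
    {
        int a = MathBits.MaxItems * 2;    // the compiler can fold this to 200 at compile time
        int b = MathBits.MaxItemsCap * 2; // computed at run time from the field's current value
        System.Console.WriteLine(a + " " + b);
    }
}

In practice the JIT handles static readonly fields well anyway, so the difference rarely matters.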