I want to define a Nibble type.
If the user sets its value higher than 0xF, it should generate a compiler error.
Is this possible?
No, this is not possible with the C# compiler.
It has no preprocessor that can test constant values the way the C/C++ preprocessor can.
The best you can do is throw an exception at runtime.
Edit: You could always try running the C# code (with some minor alterations) through the C/C++ preprocessor and emit an #error directive.
Edit:
Seeing this is language-agnostic: yes, you can easily do it in any language that supports some kind of macro expansion or compile-time evaluation, e.g. Scheme, Lisp, C/C++, etc.
If in your case the USER
is a DEVELOPER,
you can do this with a macro like this:
#if YOUR_VALUE > 0xF
#error YOUR_ERROR_MESSAGE
#endif
But in some development environments you may have problems with comparisons in an #if
statement, because its functionality was cut down to defined/undefined checks only.
You could use an enum with the [Flags] attribute. This way you are allowed to use bitwise operations on its members:
[Flags]
enum Nibble
{
    _0,
    _1,
    // ...
    _A,
    _F,
};
byte b = (byte)(Nibble._1 | Nibble._A);
You could also create a struct nibble with an implicit conversion operator from int to nibble. But this would produce a runtime error, not a compile-time error.
If you want static checking, have a look at the C# 4.0 Code Contracts API.
It looks like the easiest way to achieve what you want is a subrange type. Languages which support subrange types are pretty much all languages in the Algol 68 line of succession (Algol 68, Pascal, Modula-2, Oberon, Component Pascal) and their cousins and derivatives (Turbo Pascal, Borland Pascal, FreePascal, Delphi, Kylix, Object Pascal), as well as Ada. I believe that you can implement subrange types in C++ using some heavy template-fu. You can probably also implement them in languages with more expressive type systems, such as Scala, Haskell, ML, Agda, Epigram, Guru.
I have no idea why more languages don't support subrange types. They are obviously useful, easy to use, easy to understand, and easy to implement.
Another possibility might be Fortress. In Fortress, the various fixed-width integer types are not actually built into the language; they are user-defined. So there is no reason why you should not be able to build your own user-defined fixed-width integer type.