I'm developing a system that deals with lots of conversions between semantically different values that share the same primitive .NET type (double/string/int). This makes it easy to get confused about which 'semantic type' you are using, either by not converting or by converting too many times. Ideally I'd like the compiler to issue a warning/error if I try to use a value where it doesn't semantically make sense.
Some examples to indicate what I'm referring to:
- Angles may be in units of degrees or radians, yet both are represented by `double`.
- Vector positions may be in local or global coordinates, yet both are represented by a `Vector3D` struct.
- Imagine a SQL library that accepts various query parameters as strings. It would be good to have a way of enforcing that only clean strings could be passed in, where the only way to get a clean string is to pass through some SQL-injection-prevention logic.
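To make the first failure mode concrete, here is a small sketch (the `HeightAfterRotation` helper is my own invented example, not an existing API) showing how a bare `double` lets a degrees value flow into code that expects radians, with no complaint from the compiler:

```csharp
using System;

// Expects an angle in radians, but a bare double cannot say so.
static double HeightAfterRotation(double angleRadians) => Math.Sin(angleRadians);

double angleInDegrees = 90.0;

// Compiles and runs happily, but is wrong: we passed degrees where
// radians were expected (Math.Sin(90.0) is roughly 0.894, not 1.0).
Console.WriteLine(HeightAfterRotation(angleInDegrees));

// The correct call needs a manual conversion the compiler cannot enforce.
Console.WriteLine(HeightAfterRotation(angleInDegrees * Math.PI / 180.0));
```

Nothing in the signature distinguishes the two calls; only a semantic type for the parameter could make the first one fail to compile.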
I believe F# has a compile-time solution for this (called units of measure). I'd like to do something similar in C#, although I don't need the dimensional analysis that F#'s units of measure offer.
I believe C++ could achieve this with a strong-typedef idiom, since a plain `typedef` only introduces an alias rather than a distinct type (though I'm not a C++ expert).
The obvious solution is to wrap the double/string/whatever in a new type that gives the compiler the information it needs. I'm curious whether anyone has an alternative solution. If you do think wrapping is the only/best way, please go into some of the downsides of the pattern (and any upsides I haven't mentioned). I'm especially concerned about the runtime performance of abstracted primitive numeric types in my calculations, so whatever solution I settle on must be lightweight in terms of both memory allocation and call dispatch.
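For reference, here is a minimal sketch of what the wrapping pattern could look like for the angle case (the `Degrees`/`Radians` names and conversion members are my own invention, not an established library API). Declaring each wrapper as a `readonly struct` keeps values heap-allocation-free, and the one-liner members are small enough for the JIT to inline:

```csharp
using System;

// Hypothetical wrapper type for angles measured in degrees.
// A readonly struct over a single double: no heap allocation,
// same size as the raw value it wraps.
public readonly struct Degrees
{
    public double Value { get; }
    public Degrees(double value) => Value = value;

    // Crossing the semantic boundary requires an explicit conversion.
    public Radians ToRadians() => new Radians(Value * Math.PI / 180.0);
}

// Hypothetical wrapper type for angles measured in radians.
public readonly struct Radians
{
    public double Value { get; }
    public Radians(double value) => Value = value;

    public Degrees ToDegrees() => new Degrees(Value * 180.0 / Math.PI);
}
```

With this in place, a method declared as `double HeightAfterRotation(Radians angle)` rejects a `Degrees` argument at compile time. The visible cost is boilerplate: any arithmetic operators or comparisons you need have to be re-declared on each wrapper, since the wrapper deliberately does not inherit `double`'s operations.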