No, you can't. If you look at the language specification, every enum type is implicitly final and can't be extended. This makes sense if you understand an enum as a finite, final set of values.
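A minimal sketch of what that means in practice (the `Digit` type here is just an illustration): the compiler treats an enum as final, so any attempt to subclass it is rejected at compile time.

```java
// A simple enum: the compiler makes it implicitly final,
// so the (commented-out) subclass below would not compile.
enum Digit {
    ZERO, ONE, TWO
}

// class HexDigit extends Digit { }  // error: cannot inherit from final 'Digit'

public class Demo {
    public static void main(String[] args) {
        // An enum gives you a fixed, known set of values:
        System.out.println(Digit.values().length); // prints 3
    }
}
```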
There is a difference between a numeral (a syntactic artifact), which could be binary, decimal, hex, or whatever, and the actual number, the semantic entity that the numeral represents in a given context (the base system).
In your example, what you need is:

- enums specifying the syntactic numeral tokens (the decimal digits and the alphabetic symbols that represent legal hexadecimal digits), and
- classes specifying the behavior (or grammar) required for syntactically representing a number as a numeral, using those token enums.
That is, you have tokens (or symbols) and a grammar/behavior that indicates whether a stream of tokens represents a number under a given base.
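The split could be sketched like this (all names here are illustrative, not from your code): an enum supplies the fixed token set, and a separate class supplies the grammar/behavior that decides whether a token stream is a valid numeral under a given base.

```java
// The tokens: a closed, final set of symbols, each with its digit value.
enum NumeralToken {
    D0('0', 0), D1('1', 1), D2('2', 2), D3('3', 3), D4('4', 4),
    D5('5', 5), D6('6', 6), D7('7', 7), D8('8', 8), D9('9', 9),
    A('A', 10), B('B', 11), C('C', 12), D('D', 13), E('E', 14), F('F', 15);

    final char symbol;
    final int value;

    NumeralToken(char symbol, int value) {
        this.symbol = symbol;
        this.value = value;
    }

    static NumeralToken of(char c) {
        for (NumeralToken t : values()) {
            if (t.symbol == Character.toUpperCase(c)) return t;
        }
        throw new IllegalArgumentException("not a token: " + c);
    }
}

// The grammar/behavior: a stream of tokens represents a number
// under a base only if every token's value is legal in that base.
class Numeral {
    static boolean isValid(String text, int base) {
        if (text.isEmpty()) return false;
        for (char c : text.toCharArray()) {
            try {
                if (NumeralToken.of(c).value >= base) return false;
            } catch (IllegalArgumentException e) {
                return false;
            }
        }
        return true;
    }

    static int toNumber(String text, int base) {
        int n = 0;
        for (char c : text.toCharArray()) {
            n = n * base + NumeralToken.of(c).value;
        }
        return n;
    }
}
```

So `Numeral.isValid("FF", 16)` holds while `Numeral.isValid("FF", 10)` does not: the token set never changes, only the grammar's judgment of it.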
But that's a bit of a tangent (and, as you said, it was just an example for example's sake). Going back to extending enums...
... you can't, and you shouldn't. Enums are not meant to represent things that can be extended. They represent a constant set of constant values. Some things are simply not meant to be inherited from.
Also, don't fall into the trap of extending for extension's sake, or of trying to force a structure onto your code or model.
It might seem to make sense to make one set of values an extension of another. More often than not, it doesn't. Use inheritance for reusing behavior or complex structure, not for data that has little to no structure and no reusable behavior associated with it.