By default, the first field of an enum type is given the value 0. So, if uriType does not contain the MyUriType.ForParse flag, then uriType & MyUriType.ForParse actually equals 0, which counterintuitively evaluates to true when you compare it to MyUriType.ForParse for equality (because MyUriType.ForParse is also 0).
If you break it down to bitwise arithmetic, the expression you're evaluating is:

({something} & 0) == 0

...which will always evaluate to true, no matter what the "something" is.
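As a minimal sketch of the problem (the member list and the FlagsDemo scaffolding are assumptions; only MyUriType.ForParse and the uriType variable appear in the original), a zero-valued flag makes the check pass no matter what value is tested:

using System;

// Hypothetical reconstruction: no explicit values are given, so the
// first member, ForParse, defaults to 0.
[Flags]
public enum MyUriType
{
    ForParse,        // 0
    ForDownload,     // 1
    ForSomethingElse // 2
}

public static class FlagsDemo
{
    public static void Main()
    {
        MyUriType uriType = MyUriType.ForDownload;

        // (1 & 0) == 0, so this prints True even though uriType does
        // not contain the ForParse flag.
        Console.WriteLine((uriType & MyUriType.ForParse) == MyUriType.ForParse);
    }
}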
Normally, when you define a Flags enum, you should explicitly specify a value for each field:
[Flags]
public enum MyUriTypes
{
    None = 0,
    ForParse = 1,
    ForDownload = 2,
    ForSomethingElse = 4,
    ForAnotherThing = 8
}
Each value should be a power of 2 so that the flags don't conflict: each power of 2 sets a single distinct bit, and every successive value is a binary shift-left of the previous one.
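Equivalently, you can write those powers of 2 as shift expressions, which makes the one-bit-per-flag pattern explicit (this is the same enum as above, just a different notation):

[Flags]
public enum MyUriTypes
{
    None             = 0,
    ForParse         = 1 << 0, // 1
    ForDownload      = 1 << 1, // 2
    ForSomethingElse = 1 << 2, // 4
    ForAnotherThing  = 1 << 3  // 8
}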
It's also customary to name it as a plural, so that people who use the enum know that it is a Flags enum and can/should hold multiple values.
If you define your enum this way, your test code will now evaluate to false if uriType does not have the ForParse bit set.
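For example (reusing the hypothetical uriType variable from above, now declared as a MyUriTypes), the test could look like this:

// With ForParse = 1, the bitwise AND only equals ForParse when that
// bit is actually set in uriType.
if ((uriType & MyUriTypes.ForParse) == MyUriTypes.ForParse)
{
    // uriType includes the ForParse flag
}

// Equivalent shorthand, available since .NET 4:
if (uriType.HasFlag(MyUriTypes.ForParse))
{
    // uriType includes the ForParse flag
}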