I want to check if a reference type is null. I see two options (_settings is of reference type FooType):
if (_settings == default(FooType)) { ... }
and
if (_settings == null) { ... }
How do these two perform differently?
There's no difference. The default value of any reference type is null.
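A minimal illustration, assuming FooType is any class (the names come from the question):

// assuming something like: class FooType { }
FooType _settings = default(FooType);
bool viaNull = _settings == null;                // true
bool viaDefault = _settings == default(FooType); // also true; both express the same null comparison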
My understanding is that they are no different; the distinction only matters when you are dealing with value types.
There is no difference, but the second one is more readable. The best place to use default is when you are dealing with generics; common code there is return default(T);
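For example, a minimal sketch of a generic helper (the method name FirstOrDefaultValue is just illustrative) where default(T) is the natural way to express "no value":

using System.Collections.Generic;

static T FirstOrDefaultValue<T>(IEnumerable<T> items)
{
    // returns the first element, or default(T) if the sequence is empty:
    // null for reference types, zero/false/etc. for value types
    foreach (var item in items)
        return item;
    return default(T);
}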
I would definitely go with the explicit check against null, because if the type of _settings ever changes, the default(FooType) check would have to change with it. At minimum it would force a code change, breaking the open/closed principle.
if( _settings == null ) {...}
This IMO is safer and cleaner.
As has been mentioned, there is no difference... but you might want to use default(<type>) anyway, to handle the cases where the type is not a reference type. Typically this only comes up in generics, but it's a good habit to form for the general case.
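As a rough sketch of that habit (the helper name IsDefault is hypothetical):

using System.Collections.Generic;

static bool IsDefault<T>(T value)
{
    // value == null is simply false whenever T is a non-nullable value type;
    // comparing against default(T) via EqualityComparer<T> works for both
    // reference types (null) and value types (zero/false/etc.).
    return EqualityComparer<T>.Default.Equals(value, default(T));
}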