views: 95
answers: 6

I want to check if a reference type is null. I see two options (_settings is of reference type FooType):

if (_settings == default(FooType)) { ... }

and

if (_settings == null) { ... }

Do these two behave any differently?

+6  A: 

There's no difference. The default value of any reference type is null.
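
A minimal sketch illustrating that (FooType is stubbed out here as an empty class, standing in for the question's type):

using System;

class FooType { }

class Demo
{
    static void Main()
    {
        // For a reference type, default(...) is simply the null reference.
        FooType settings = default(FooType);          // same as: FooType settings = null;
        Console.WriteLine(settings == null);          // True
        Console.WriteLine(default(FooType) == null);  // True
    }
}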

Stephen Cleary
+1  A: 

My understanding is that they are not different; the distinction only matters when you are dealing with value types.
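
For example (a small illustrative sketch of what default means for value types):

using System;

class ValueTypeDefaults
{
    static void Main()
    {
        // For value types, default(...) is the zero-initialized value, not null.
        Console.WriteLine(default(int));                        // 0
        Console.WriteLine(default(bool));                       // False
        Console.WriteLine(default(DateTime) == DateTime.MinValue); // True

        int i = default(int);
        // A non-nullable int can never be null, so the default check is the
        // meaningful one here; comparing i to null would always be false.
        Console.WriteLine(i == default(int));                   // True
    }
}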

Chuck Conway
+1  A: 

They're not different, but I think

if (_settings == null) { ... }

is clearer.

Mau
+1  A: 

There is no difference, but the second one is more readable. The best place to use default is when you are dealing with generics; return default(T); is a common idiom there.
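
For instance, a hypothetical generic helper (the name FirstMatchOrDefault and its signature are made up here for illustration):

using System;
using System.Collections.Generic;

static class SearchHelpers
{
    // Returns the first matching item, or default(T) when nothing matches:
    // null for reference types, the zeroed value for value types.
    public static T FirstMatchOrDefault<T>(IEnumerable<T> items, Predicate<T> match)
    {
        foreach (T item in items)
        {
            if (match(item))
                return item;
        }
        return default(T);
    }
}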

Andrey
A: 

I would definitely go with the explicit check against null, because if the type of _settings ever changes, the default(FooType) check has to change with it; at minimum that means touching this code, which breaks the open/closed principle.

if (_settings == null) { ... }

This IMO is safer and cleaner.

Jerod Houghtelling
A: 

As has been mentioned, there is no difference... but you might want to use default(<type>) anyway, to handle the cases where the type is not a reference type. Typically this only comes up in generics, but it's a good habit to form for the general case.
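
One way that habit might look in a generic context (an illustrative helper, not anything from the question's code):

using System.Collections.Generic;

static class GenericChecks
{
    // True when 'value' is the default for T: null for reference types,
    // the zero-initialized value for value types. EqualityComparer<T>.Default
    // is used because '==' is not available on an unconstrained T.
    public static bool IsDefault<T>(T value)
    {
        return EqualityComparer<T>.Default.Equals(value, default(T));
    }
}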

Randolpho