Is there an actual need for NULL or not? In most of the OO languages I have programmed in, there has always been a way to set a variable to a NULL value, which led to all sorts of funny problems.
What are your thoughts?
It's possible to design a language that doesn't have a NULL; instead, uninitialised values point to a singleton dummy object that doesn't actually do anything. You could compare pointers against the reference of this dummy object, and calls to methods on the object would result in no action or a runtime error.
This technique is hard to implement language-wide in statically typed languages like C++ or Java, because the dummy object would have to satisfy every type.
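At the application level, the same idea shows up as the classic Null Object pattern. A minimal Java sketch, assuming a hypothetical Logger interface:

```java
// A minimal sketch of the singleton-dummy idea, assuming a hypothetical
// Logger interface. Calls on the null object simply do nothing.
interface Logger {
    void log(String message);
}

final class NullLogger implements Logger {
    static final NullLogger INSTANCE = new NullLogger();
    private NullLogger() {}
    @Override public void log(String message) { /* deliberate no-op */ }
}

class Service {
    // References start out pointing at the dummy singleton, never at null.
    private Logger logger = NullLogger.INSTANCE;

    void setLogger(Logger logger) { this.logger = logger; }

    void doWork() {
        // Safe even when no real logger was ever assigned; compare against
        // NullLogger.INSTANCE if you need to know whether one was set.
        logger.log("working");
    }
}
```

The catch, and the reason this is hard to do for a whole statically typed language, is that you need a separate dummy implementation like NullLogger for every type.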
NULL is a little like God: if it didn't exist, we would wind up having to create it. Something has to represent the value of a reference that is unassigned (whether because it was never assigned or because it was cleared at some point). The only alternative is to use an object that effectively substitutes for NULL. The problem with that is that if you did all that to avoid the NullPointerException, you've simply replaced it with an UnexpectedObject exception or a ClassCastException or whatnot.
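To make that trade-off concrete, here is a minimal Java sketch; the MISSING sentinel is a hypothetical stand-in, not a standard constant. Forgetting to check for the sentinel doesn't remove the failure, it just renames it:

```java
import java.util.HashMap;
import java.util.Map;

public class SentinelDemo {
    // Hypothetical sentinel object used in place of null.
    static final Object MISSING = new Object();

    public static void main(String[] args) {
        Map<String, Object> settings = new HashMap<>();

        Object value = settings.getOrDefault("timeout", MISSING);

        // The cast below throws ClassCastException, where an unchecked
        // null would have thrown NullPointerException later instead.
        String timeout = (String) value;
        System.out.println(timeout);
    }
}
```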
In languages with garbage collection where variables are actual storage locations (as opposed to Python's labels), the NULL value is required to allow memory to be freed in a clean manner before the end of the variable's scope.
Also, even many algorithms written in pseudo code make use of the special NULL value, as in the linked-list sketch below. It pops up literally everywhere. It is a central concept in computer science.
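A minimal Java sketch of one such ubiquitous use: null as the conventional end-of-list marker, just as pseudo code for list algorithms tends to use it:

```java
// null as the conventional end-of-list marker.
class Node {
    int value;
    Node next; // null marks the end of the list

    Node(int value, Node next) {
        this.value = value;
        this.next = next;
    }
}

class Traverse {
    public static void main(String[] args) {
        Node head = new Node(1, new Node(2, new Node(3, null)));
        // The loop terminates precisely because the last node's next is null.
        for (Node n = head; n != null; n = n.next) {
            System.out.println(n.value);
        }
    }
}
```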
There are normally two special values. Some languages handle both, some handle only one and throw an error for the other, and some merge the two. Those two values are Null and Undefined.
Undefined means trying to use a variable that flat out doesn't exist. Null means trying to use a variable that exists but has no value.
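Java is a sketch of a language that handles only one of the two: a variable that doesn't exist at all is a compile-time error rather than an Undefined value, while Null is a real runtime value:

```java
public class SpecialValues {
    public static void main(String[] args) {
        // "Undefined": the variable doesn't exist. Java rejects this at
        // compile time instead of giving it a special value:
        // System.out.println(missing); // error: cannot find symbol

        // "Null": the variable exists but holds no value.
        String name = null;
        System.out.println(name);          // prints "null"
        System.out.println(name.length()); // throws NullPointerException
    }
}
```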
Null can be useful because it is a guaranteed value that indicates that something is wrong, or outside of the domain/range of possible answers. Take Java, for instance:
If you did not have null, what if you did a lookup in a HashMap for something that didn't exist in the Map? What would you return? If you returned an Object, how would you know whether that Object meant nothing was there or was what was actually in the Map? Workarounds could include creating your own NON_EXIST constant Object, but that's essentially the same thing as what null already is anyway. Another workaround might be throwing an Exception, but now you're looking at major performance impacts.
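This is exactly how HashMap.get behaves; a minimal sketch:

```java
import java.util.HashMap;
import java.util.Map;

public class LookupDemo {
    public static void main(String[] args) {
        Map<String, Integer> ages = new HashMap<>();
        ages.put("alice", 30);

        // get() returns null for an absent key, so the caller can tell
        // "no entry" apart from a real value without an exception.
        Integer age = ages.get("bob");
        if (age == null) {
            System.out.println("bob is not in the map");
        }
    }
}
```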
Again, it's the idea of having a guaranteed allowable value that you can use. It is always available to you, and it's always disjoint from the set of "real values" that would normally be returned from an operation you're performing. Null is therefore intentionally used to mean something special.
Like I hinted before, there are also optimizations that can be done here, because Null is a special value. In languages like Java, if you originally referenced an Object solely through one variable and then set that variable to null, you remove that reference and allow Java's Garbage Collector to collect the now unreferenced data. If instead you let that variable sit around forever, that hunk of memory, which may never be used again, will continue to hold resources. This is a contrived example, but it proves a point, and in some resource-intensive Java programs you will see these explicit assignments to null.
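A minimal sketch of that explicit assignment; the names and allocation sizes here are purely illustrative:

```java
import java.util.ArrayList;
import java.util.List;

public class ReleaseEarly {
    public static void main(String[] args) {
        // A contrived, memory-heavy allocation (~100 MB of buffers).
        List<byte[]> buffers = new ArrayList<>();
        for (int i = 0; i < 100; i++) {
            buffers.add(new byte[1_000_000]);
        }
        process(buffers);

        // Dropping the only reference makes the data eligible for garbage
        // collection now, instead of at the end of this long-lived method.
        buffers = null;

        // ...imagine a long-running remainder of the method here...
    }

    static void process(List<byte[]> data) {
        System.out.println("processing " + data.size() + " buffers");
    }
}
```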