Immutability
Simply put, memory is immutable when it is not modified after being initialised.
Programs written in imperative languages such as C, Java and C# may manipulate in-memory data at will. An area of physical memory, once set aside, may be modified in whole or part by a thread of execution at any time during the program's execution. In fact, imperative languages encourage this way of programming.
Writing programs in this way has been incredibly successful for single-threaded applications. However, as modern application development moves towards multiple concurrent threads of execution within a single process, a world of potential problems and complexity is introduced.
When there is only one thread of execution, you can imagine that this single thread 'owns' all of the data in memory and can therefore manipulate it at will. However, there is no implicit concept of ownership when multiple executing threads are involved.
Instead, this burden falls upon the programmer, who must go to great pains to ensure that in-memory structures are in a consistent state for all readers. Locking constructs must be used in careful measure to prohibit one thread from seeing data while it is being updated by another thread. Without this coordination, a thread would inevitably consume data that was only halfway through being updated. The outcome of such a situation is unpredictable and often catastrophic. Furthermore, making locking work correctly in code is notoriously difficult and, when done badly, can cripple performance or, in the worst case, cause deadlocks that halt execution irrecoverably.
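As a minimal sketch of the coordination described above (the MutablePoint type and its members are illustrative, not taken from any particular codebase), a lock ensures that no reader ever sees one coordinate updated while the other is still stale:

```csharp
public sealed class MutablePoint
{
    private readonly object _sync = new object();
    private int _x;
    private int _y;

    // Writers take the lock so both coordinates are updated as one unit.
    public void MoveTo(int x, int y)
    {
        lock (_sync)
        {
            _x = x;
            _y = y;
        }
    }

    // Readers take the same lock to obtain a consistent snapshot.
    public void CopyTo(out int x, out int y)
    {
        lock (_sync)
        {
            x = _x;
            y = _y;
        }
    }
}
```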
Using immutable data structures alleviates the need to introduce complex locking into code. When a section of memory is guaranteed not to change during the lifetime of a program then multiple readers may access the memory simultaneously. It is not possible for them to observe that particular data in an inconsistent state.
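By contrast, here is a hedged sketch of an immutable equivalent (again, the names are illustrative). Once the constructor has run, the object can never change, so any number of threads may read it concurrently without locks:

```csharp
public sealed class ImmutablePoint
{
    public ImmutablePoint(int x, int y)
    {
        X = x;
        Y = y;
    }

    public int X { get; }   // get-only: assignable only in the constructor
    public int Y { get; }

    // 'Modifying' the point means building a new instance; threads holding
    // a reference to the old one are unaffected.
    public ImmutablePoint WithX(int x) => new ImmutablePoint(x, Y);
}
```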
Many functional programming languages, such as Lisp, Haskell, Erlang, F# and Clojure, encourage immutable data structures by their very nature. It is for this reason that they are enjoying a resurgence of interest as we move towards increasingly complex multi-threaded application development and many-core computer architectures.
State
The state of an application can simply be thought of as the contents of all the memory and CPU registers at a given point in time.
Logically, a program's state can be divided into two:
- The state of the heap
- The state of the stack of each executing thread
In managed environments such as C# and Java, one thread cannot access the stack of another. Therefore, each thread 'owns' the state of its stack. The stack can be thought of as holding local variables and parameters of value type (struct), along with references to objects. These values are isolated from outside threads.
However, data on the heap is shareable amongst all threads, hence care must be taken to control concurrent access. All reference-type (class) object instances are stored on the heap.
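A brief sketch of the value-type/reference-type distinction this relies on (the two point types are invented for illustration): assigning a struct copies the whole value, whereas assigning a class copies only a reference to the shared heap instance.

```csharp
public struct PointStruct { public int X; }
public class  PointClass  { public int X; }

public static class ValueVersusReference
{
    public static void Main()
    {
        var s1 = new PointStruct { X = 1 };
        var s2 = s1;                    // value type: an independent copy
        s2.X = 99;
        System.Console.WriteLine(s1.X); // 1 -- the original is untouched

        var c1 = new PointClass { X = 1 };
        var c2 = c1;                    // reference type: both variables refer to one heap object
        c2.X = 99;
        System.Console.WriteLine(c1.X); // 99 -- visible to anything holding the reference
    }
}
```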
In OOP, the state of an instance of a class is determined by its fields. These fields are stored on the heap and so are accessible from all threads. If a class defines methods that allow fields to be modified after the constructor completes, then the class is mutable (not immutable). If the fields cannot be changed in any way, then the type is immutable. It is important to note that declaring a field readonly does not, by itself, make the type immutable if the object the field refers to is itself mutable. For example, in C#, even when a field of type List<object> is declared readonly, the contents of the list may still be modified at any time.
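A small sketch of that pitfall (the class is illustrative): readonly only stops the field from being re-assigned; it does nothing to stop the list it refers to from being mutated.

```csharp
using System.Collections.Generic;

public class NotActuallyImmutable
{
    private readonly List<object> _items = new List<object>();

    public void Add(object item)
    {
        // _items = new List<object>();  // would not compile: the field is readonly
        _items.Add(item);                // compiles fine, and mutates shared state
    }

    public int Count => _items.Count;
}
```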
By defining a type as being truly immutable, its state can be considered frozen and therefore the type is safe for access by multiple threads.
In practice, it can be inconvenient to define all of your types as immutable. Modifying a value on an immutable type can involve a fair bit of memory copying. Some languages make this process easier than others, but either way the CPU ends up doing extra work. Many factors determine whether the time spent copying memory outweighs the cost of lock contention.
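For example, in recent versions of C# (9 and later), records make this copy-on-change pattern concise; the sketch below is illustrative and the Person type is invented for the example.

```csharp
public record Person(string Name, string Address, int Age);

public static class CopyOnChange
{
    public static void Main()
    {
        var original = new Person("Ada", "12 Crescent Road", 36);

        // No in-place mutation: a new Person is allocated and the unchanged
        // members are copied across.
        var older = original with { Age = 37 };

        System.Console.WriteLine(original.Age); // 36 -- the original is untouched
        System.Console.WriteLine(older.Age);    // 37
    }
}
```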
A lot of research has gone into the development of immutable data structures such as lists and trees. When using such structures, say a list, the 'add' operation will return a reference to a new list with the new item added. References to the previous list do not see any change and still have a consistent view of the data.
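The System.Collections.Immutable package for .NET provides persistent collections along these lines; the sketch below assumes that package is referenced.

```csharp
using System;
using System.Collections.Immutable;

public static class PersistentListDemo
{
    public static void Main()
    {
        ImmutableList<int> first = ImmutableList.Create(1, 2, 3);

        // 'Add' does not modify 'first'; it returns a new list that shares
        // most of its internal structure with the original.
        ImmutableList<int> second = first.Add(4);

        Console.WriteLine(first.Count);  // 3 -- existing readers see no change
        Console.WriteLine(second.Count); // 4
    }
}
```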