On why a vector<T> cannot be correctly converted to a vector<const T>, even if T can be converted to const T
This is a common, recurring problem in programming, whether with constness or with inheritance (a container of derived objects cannot be converted to a container of base objects, even though the contained elements themselves can be). The problem is that each element can be converted on its own, but the container as a whole cannot be without breaking the type system.
If you were allowed to write vector<const T>& vr = my_vector_of_T, then you would be allowed to add elements through vr, and those elements would be constant by definition. But those same elements would still be aliased in my_vector_of_T as non-const elements, and could be modified through that interface, breaking the constness guarantees of the type system.
In the particular case of a vector<int> being converted to a vector<const int>, chances are you would not notice really weird effects (besides adding an element to a vector<const int> and watching the supposedly constant element change over time). Still, remember that in most cases, given two related types T1 and T2, trying to apply the same relationship to containers of T1 and T2 will break the type system.