It can be safe to have a public base class with a non-virtual destructor, but behaviour is undefined if someone allocates an instance of your class with new, refers to it through a std::vector<...>*, and then deletes it using that pointer without casting it back to a pointer to your class. So users of your class must know not to do that. The surest way to stop them is not to give them the opportunity, hence the compiler warning.
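For illustration, here is a minimal sketch of that trap, with a hypothetical class LimitedVec deriving publicly from std::vector<int>:

#include <vector>

// Hypothetical class that publicly derives from std::vector (don't do this).
struct LimitedVec : std::vector<int> {
    int limit = 0; // extra state that std::vector's destructor knows nothing about
};

int main() {
    std::vector<int>* p = new LimitedVec; // refer to the object through a base pointer
    delete p; // undefined behaviour: std::vector's destructor is non-virtual
}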
To deal with this issue without having to impose such odd conditions on your users, the best advice is that for public base classes in C++, the destructor should be either public and virtual, or protected and non-virtual (http://www.gotw.ca/publications/mill18.htm, guideline #4). Since the destructor of std::vector is neither, it shouldn't be used as a public base class.
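A minimal sketch of those two alternatives, with made-up class names:

#include <memory>

// Option 1: public and virtual destructor -- deleting through a base pointer is fine.
struct PolymorphicBase {
    virtual ~PolymorphicBase() = default;
};

// Option 2: protected and non-virtual destructor -- outside code holding a base
// pointer simply cannot delete through it, so the problem never arises.
class MixinBase {
protected:
    ~MixinBase() = default;
};

struct Derived1 : PolymorphicBase {};
struct Derived2 : MixinBase {};

int main() {
    std::unique_ptr<PolymorphicBase> p = std::make_unique<Derived1>(); // ok
    // MixinBase* q = new Derived2;
    // delete q; // would not compile: ~MixinBase() is protected
}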
If all you want is to define some additional operations on vectors, then that's what free functions are for in C++. What's so great about the '.' member-call syntax anyway? Most of <algorithm> consists of additional operations on vector and other containers.
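For example, the "push_back with a limit" behaviour can be a free function (push_back_checked is a name made up for this sketch), with no inheritance involved:

#include <cstddef>
#include <stdexcept>
#include <vector>

// Free function: a push_back that enforces a caller-supplied size limit.
void push_back_checked(std::vector<int>& v, int value, std::size_t max_size) {
    if (v.size() + 1 > max_size)
        throw std::runtime_error("limit exceeded");
    v.push_back(value);
}

int main() {
    std::vector<int> v;
    push_back_checked(v, 3, 1);    // fine
    // push_back_checked(v, 3, 1); // would throw: limit exceeded
}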
If you want to create, for example, a "vector with a maximum size limit", which will provide the entire interface of vector with modified semantics, then actually C++ does make that slightly inconvenient, compared with languages where inheritance and virtual calls are the norm. Simplest is to use private inheritance, and then for the member functions of vector that you don't want to change, bring them into your class with using:
#include <vector>
#include <iostream>
#include <stdexcept>

class myvec : private std::vector<int> {
    size_t max_size;
public:
    myvec(size_t m) : max_size(m) {}
    // ... other constructors

    // Modified function: enforce the limit, then delegate to the base class.
    void push_back(int i) {
        check(size() + 1);
        std::vector<int>::push_back(i);
    }
    // ... other modified functions

    // Unmodified functions are brought in unchanged with using-declarations.
    using std::vector<int>::operator[];
    // ... other unmodified functions

private:
    void check(size_t newsize) {
        if (newsize > max_size) throw std::runtime_error("limit exceeded");
    }
};

int main() {
    myvec m(1);
    m.push_back(3);
    std::cout << m[0] << "\n";
    m.push_back(3); // throws std::runtime_error ("limit exceeded")
}
You still have to be careful, though. The C++ standard doesn't guarantee which functions of vector call each other, or in what ways. Where those internal calls do occur, there's no way for my vector base class to call the replacement in myvec, so my changed functions simply won't apply -- that's non-virtual functions for you. I can't just redefine resize() in myvec and be done with it; I have to redefine every function that changes the size and make them all call check (directly or by calling each other).
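To make that concrete, a continuation of the sketch above (here called myvec2, with only a couple of the necessary wrappers shown) would look something like this:

#include <cstddef>
#include <stdexcept>
#include <vector>

class myvec2 : private std::vector<int> {
    std::size_t max_size;
    void check(std::size_t newsize) {
        if (newsize > max_size) throw std::runtime_error("limit exceeded");
    }
public:
    explicit myvec2(std::size_t m) : max_size(m) {}

    // Every size-increasing function needs its own wrapper that calls check().
    void push_back(int i)             { check(size() + 1); std::vector<int>::push_back(i); }
    void resize(std::size_t n)        { check(n);          std::vector<int>::resize(n); }
    void assign(std::size_t n, int v) { check(n);          std::vector<int>::assign(n, v); }
    // ... and likewise for insert, emplace_back, operator=, swap, and the rest

    using std::vector<int>::size;
    using std::vector<int>::operator[];
};

int main() {
    myvec2 v(2);
    v.resize(2);    // ok
    // v.resize(3); // would throw: exceeds the limit
}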
You can deduce from restrictions in the standard that some things are impossible: for example, operator[] can't change the size of the vector, so in my example I'm safe to use the base-class implementation, and I only have to redefine functions which might change the size. But the standard won't necessarily provide guarantees of that kind for all conceivable derived classes.
In short, std::vector is not designed to be a base class, and hence it may not be a very well-behaved base class.
Of course, if you use private inheritance then you can't pass a myvec to a function that requires a vector. But that's because it isn't a vector: its push_back function doesn't even have the same semantics as vector's, so we're on dodgy ground with the Liskov Substitution Principle, and more importantly non-virtual calls to vector's functions ignore our replacements. That's OK if you do things the way the standard libraries anticipate -- use lots of templates, and pass iterators rather than collections. It's not OK if you want virtual function calls, because quite aside from the fact that vector doesn't have a virtual destructor, it doesn't have any virtual functions.
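For example, an iterator-based function template doesn't care what container the iterators come from, so it works with myvec too, provided myvec also exposes begin() and end() with using-declarations (not shown in the class above):

#include <iostream>
#include <numeric>
#include <vector>

// Generic over iterators, not over a container type: any range of ints will do,
// whether it comes from std::vector, myvec, or something else entirely.
template <typename Iter>
int sum(Iter first, Iter last) {
    return std::accumulate(first, last, 0);
}

int main() {
    std::vector<int> v{1, 2, 3};
    std::cout << sum(v.begin(), v.end()) << "\n"; // prints 6
}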
If you actually want dynamic polymorphism with standard containers (that is, you want to write vector<int> *ptr = new myvec(1);), then you're entering "thou shalt not" territory. The standard libraries can't really help you.