Polymorphism. Polymorphism. Polymorphism. :)
The virtual functionality is what makes C++ object-oriented. It's one of the major reasons you're using C++ in the first place. Never think twice about using virtual if your design calls for it, and do not redesign your model simply to avoid virtuals.
Would you think twice about accessing a structure field, even though there is an added cost to jump to the field's offset from the structure's base address? Of course you wouldn't, if the design calls for it. Would you think twice about passing callbacks, event listeners, functors, or any other "logical" address that requires a jump to reach the actual code? Of course you wouldn't, if the design calls for it.
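To make that analogy concrete, here's a minimal sketch (the Shape, Circle, and report names are mine, purely illustrative): a virtual call and a function-pointer callback are the same kind of indirection, just spelled differently.

    #include <cstdio>

    // A virtual call: one indirection, through the vtable.
    struct Shape {
        virtual ~Shape() = default;
        virtual double area() const = 0;
    };

    struct Circle : Shape {
        double r;
        explicit Circle(double radius) : r(radius) {}
        double area() const override { return 3.14159265 * r * r; }
    };

    // A callback: one indirection, through a function pointer.
    // The same kind of jump you already accept without a second thought.
    using Callback = void (*)(double);

    void report(double value) { std::printf("area = %f\n", value); }

    void measure(const Shape& s, Callback cb) {
        cb(s.area());  // one vtable jump, one function-pointer jump
    }

    int main() {
        Circle c(2.0);
        measure(c, report);
    }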
On the flip side, there's no point in making a class member virtual if the design does not call for it, just as there's no need to pass around functors or create structs unnecessarily. The decision whether to use virtual is part of good OO design and implementation.
Performance
With respect to the so-called performance cost: first, this is a very old concern. In early C++ implementations, the overhead of virtual calls could actually be measured without incredibly contrived code. As others have mentioned, today's compilers and hardware have largely made that debate obsolete.
Second, vector multiplication and similarly contrived examples are misleading. They appear to measure the difference between virtual calls and non-virtual calls, but they do not. They measure the difference between billions of virtual calls and billions of non-virtual calls to functions that do next to nothing. Is there real-world code that may be susceptible to this problem? It's certainly possible. When you find it, will the solution be to scapegoat the use of virtual in general? Clearly not. The solution is to optimize that exceptionally performance-sensitive code. As part of such an optimization, removing the virtuals would be wise, but it won't buy you much: if you've got code that performance-sensitive, you'll need to optimize a heck of a lot more than the virtuals.
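For what that kind of optimization can look like: if a profiler ever does point at a virtual call in a hot loop, you can often hand the compiler enough information to devirtualize it, for instance with final. A minimal sketch, using a made-up Accumulator interface:

    #include <cstddef>

    struct Accumulator {
        virtual ~Accumulator() = default;
        virtual void add(double x) = 0;
    };

    // 'final' promises the compiler that no class overrides add() below
    // this point, so calls through a SumAccumulator& or SumAccumulator*
    // can be turned into direct (even inlined) calls.
    struct SumAccumulator final : Accumulator {
        double total = 0.0;
        void add(double x) override { total += x; }
    };

    double sum(SumAccumulator& acc, const double* data, std::size_t n) {
        for (std::size_t i = 0; i < n; ++i)
            acc.add(data[i]);  // static type is final: no vtable needed
        return acc.total;
    }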
Third, it's easily measurable, which is great, because you don't need to take our word for it. You can benchmark the difference yourself, with your compiler, on your target architecture, and assure yourself that whatever difference exists is negligible for your code.
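A minimal benchmark sketch along those lines (the class names are made up; compile with optimizations, e.g. -O2, and treat the numbers with the usual microbenchmarking skepticism; the optimizer may even devirtualize the "virtual" loop, which is itself a data point):

    #include <chrono>
    #include <cstdio>

    struct Base {
        virtual ~Base() = default;
        virtual long long get(int i) const { return i; }
    };
    struct Derived : Base {
        long long get(int i) const override { return i + 1; }
    };
    struct Plain {
        long long get(int i) const { return i + 1; }
    };

    // Time a callable and return elapsed nanoseconds.
    template <typename F>
    long long time_ns(F&& f) {
        auto t0 = std::chrono::steady_clock::now();
        f();
        auto t1 = std::chrono::steady_clock::now();
        return std::chrono::duration_cast<std::chrono::nanoseconds>(t1 - t0).count();
    }

    int main() {
        constexpr int N = 100000000;
        Derived d;
        Base* b = &d;   // dispatch through the base pointer
        Plain p;
        long long acc = 0;

        long long tv = time_ns([&] { for (int i = 0; i < N; ++i) acc += b->get(i); });
        long long tn = time_ns([&] { for (int i = 0; i < N; ++i) acc += p.get(i); });

        // Print acc so the optimizer can't delete the loops entirely.
        std::printf("virtual: %lld ns, non-virtual: %lld ns (acc=%lld)\n", tv, tn, acc);
    }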