views:

311

answers:

4

If a compiler doesn't "support" RTTI, does that mean the compiler cannot handle class hierarchies that have virtual functions in them? Or have I been misunderstanding the literature about how RTTI isn't portable, and the issues lie elsewhere?

Thank you all for your comments!

+1  A: 

The only part of RTTI that is unportable is the format of strings returned from type_info::name().

Even this has a fighting chance, so long as you can find a c++filt tool for your compiler that converts (demangles) such a string back into a valid C++ type name.
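
For example, on GCC or Clang you can demangle in-process instead of shelling out to c++filt. This is a minimal sketch, assuming an Itanium-ABI toolchain; the cxxabi.h header is compiler-specific, not standard C++:

    #include <cxxabi.h>
    #include <cstdlib>
    #include <iostream>
    #include <typeinfo>
    #include <vector>

    int main() {
        const char* mangled = typeid(std::vector<int>).name();
        int status = 0;
        // __cxa_demangle mallocs the result; a nonzero status means failure.
        char* demangled = abi::__cxa_demangle(mangled, nullptr, nullptr, &status);
        std::cout << (status == 0 ? demangled : mangled) << '\n';
        std::free(demangled);
        return 0;
    }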

Potatoswatter
+9  A: 

RTTI is not needed for virtual functions.

It is mainly used for dynamic_cast and typeid.
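
A minimal sketch of the two operations that actually need RTTI (the class names are made up for illustration); plain virtual dispatch works even when RTTI is disabled:

    #include <iostream>
    #include <typeinfo>

    struct Animal { virtual ~Animal() {} };  // polymorphic: has a virtual function
    struct Dog : Animal { void bark() { std::cout << "woof\n"; } };

    int main() {
        Animal* a = new Dog;
        // typeid on a polymorphic object inspects the dynamic type at runtime.
        std::cout << typeid(*a).name() << '\n';
        // dynamic_cast checks the dynamic type; returns nullptr on mismatch.
        if (Dog* d = dynamic_cast<Dog*>(a))
            d->bark();
        delete a;
        return 0;
    }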

Thomas
Isn't that typeid?
MSN
Been C#ing for too long, fixed. Thanks!
Thomas
A: 

If a compiler doesn't "support" RTTI, does that mean the compiler cannot handle class hierarchies that have virtual functions in them?

Generally, all modern C++ compilers support RTTI... so forget about that worry.

Or have I been misunderstanding the literature about how RTTI isn't portable, and the issues lie elsewhere?

RTTI today is portable and works fine on any modern compiler... However, some special cases can arise.

On ELF platforms (Linux), when you load libraries dynamically (i.e., via dlopen) and try to dynamic_cast an object between the library and the executable, the cast may fail if you do not pass the correct flags when linking the executable (-rdynamic).
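
Here is a sketch of that failure mode; the file and symbol names are hypothetical. Without -rdynamic, the executable does not export its type_info symbols, so the library and the executable can end up with distinct type_info objects for the same class, and the cast across the boundary fails:

    // main.cpp -- link with:  g++ -rdynamic main.cpp -ldl -o app
    // Without -rdynamic, the dynamic_cast below may return nullptr.
    #include <dlfcn.h>
    #include <iostream>
    #include "plugin.h"  // hypothetical header: defines Base and Derived

    int main() {
        void* lib = dlopen("./libplugin.so", RTLD_LAZY);
        if (!lib) { std::cerr << dlerror() << '\n'; return 1; }
        // Hypothetical factory exported by the plugin with C linkage.
        auto create = reinterpret_cast<Base* (*)()>(dlsym(lib, "create_object"));
        Base* b = create();
        // This cast compares type_info objects across the dlopen boundary.
        if (dynamic_cast<Derived*>(b))
            std::cout << "cast succeeded\n";
        else
            std::cout << "cast failed (type_info not unified)\n";
        return 0;
    }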

In almost every other case... it just works.

Artyom
+3  A: 

This is probably way more of an answer than you were looking for, but here goes:

"RTTI is not portable" means that if you use compiler A to build dynamic library A, and compiler B to build application B that links against library A, then you cannot use RTTI, because the RTTI implementations of compilers A and B are different. Virtual functions are affected only because the virtual function mechanism may not be binary compatible either.

This issue was very important in the mid-90s, but it is now obsolete. Not because compilers have all become binary compatible with each other, but rather the opposite: C++ developers have come to recognize that C++ libraries must be delivered as source code, not as linkable binaries. For those who view C++ as an extension of C, this is very discomforting, but for more modern programmers, who grew up in an open source environment, it is nothing special at all.

What changed between the mid-90s and now is the attitude toward what constitutes valuable intellectual property and what doesn't. To wit: there is actually a patent registered with the USPTO on "expression templates." Even its holder realizes that the patent is unenforceable.

C-style "header and binary" libraries were long seen as a way to obfuscate valuable source code. More and more, businesses came to recognize that the obfuscation was more self-defeating than protective: there is very little code out there that meets "valuable IP" status. Most people buy libraries not because of the special IP they contain, but because buying is cheaper than rolling their own. In fact, expertise in applying IP is far more valuable than the IP itself. But if no one cares about this IP because they don't know about it, then it is not worth very much.

This is how open source works: IP is freely distributed, and in return the distributors earn consultancy fees for applying that IP. Those who can profitably figure it out for themselves, well, good on them. But that is not the norm. What actually happens is that a developer comes to understand the IP and sells their employer on buying the product that implements it. Indeed, whole "developer communities" are founded on this premise.

To make a long story short: binary (and consequently RTTI) compatibility went the way of the dinosaur once the open source movement took off and, concurrently, C++ template libraries became the norm. C++ libraries long ago became "source distributable only," like Perl, Python, JavaScript, etc. To make your C++ compiler work with all the source you compile with it, make sure that RTTI is turned on (indeed, all standard C++ features, like exceptions), and that all the C++ libraries you link were compiled with the same options you used to compile your application.
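
As a concrete illustration of what those options control (the flags shown are the common GCC/Clang and MSVC spellings; the file itself is a made-up example), the following compiles with RTTI on but not with it off, while the virtual call works either way:

    // Build with RTTI on:   g++ example.cpp          (or MSVC: cl /GR /EHsc example.cpp)
    // Build with RTTI off:  g++ -fno-rtti example.cpp (or: cl /GR- /EHsc example.cpp)
    #include <iostream>
    #include <typeinfo>

    struct Base { virtual void f() { std::cout << "Base::f\n"; } virtual ~Base() {} };
    struct Derived : Base { void f() override { std::cout << "Derived::f\n"; } };

    int main() {
        Base* b = new Derived;
        b->f();  // virtual dispatch: works with or without RTTI
        // Rejected by GCC/Clang under -fno-rtti; MSVC warns under /GR-.
        std::cout << typeid(*b).name() << '\n';
        if (dynamic_cast<Derived*>(b)) { /* likewise rejected under -fno-rtti */ }
        delete b;
        return 0;
    }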

There is one (and only one) compiler I know of that does not enable RTTI by default, and that is because there are other, legacy ways to do the same thing. To read about these, pick up a copy of Don Box's excellent work "Essential COM."

Lance Diduck
uhm.... Wow. Actually that explains a whole lot. I've always wondered why CS literature invariably describes how separating headers and code lets "ISVs" build library binaries that can be sold and reused, yet there seemed to be a lack of libraries that you didn't have to compile yourself. I wish I could mark two answers, but I need to remember not to ask too many questions in one post. Thanks for your insight!
Joshua
Although I have to ask: how is your fifth paragraph related to open vs. closed source? Couldn't the same thing be done in a closed-source system? Or is the point that in an open source system there are more opportunities for participation, producing a more robust "consulting" network (and that this is one particular reason why open source works)?
Joshua
There are still a few places, like Dinkumware and RogueWave, that license proprietary C++ source code (i.e., closed source). What is different is that few attempt to protect IP by handing customers C++ headers and binaries. There are a few (like Numerix) that do use this model, at least they did a few years ago. The problem with it, of course, is that you must maintain a build for every possible customer configuration and cannot deliver template code. This typically limits your competitiveness far more than if your competitor just stole your code.
Lance Diduck