When I first saw the upcoming C++0x standard I was delighted, and it's not that I'm pessimistic, but thinking about it now I feel somewhat less hopeful.

Mainly because of three reasons:

  1. a lot of boost bloat (which must cause hopeless compile times?),
  2. the syntax seems lengthy (not as Pythonic as I initially might have hoped), and
  3. I'm very interested in portability and other platforms (iPhone, Xbox, Wii, Mac); isn't there a very real risk that the "standard" will take a long time to become portable enough?

I suppose #3 is less of a risk, given the lessons learned from templates over the previous decade; however, the devil's in the details.

Edit 2 (trying to be less whimsical): Would you say it's safe for a company to transition to C++0x in the first effective year of the standard, or would that carry great risk?

+9  A: 

As with most of C++, you pay for only what you use. So if you don't want the "boost bloat" of useful tracking pointers, thread libraries, etc., then you don't have to pay for their compilation.

I'm very sure that portability will be addressed in the design, especially since a lot of it is based on existing portable code from projects like boost. Both GCC and Microsoft VC already implement much of C++0x, as you'll see from their respective current prototype versions.

Simon Steele
Boost is Bloat! Purely. It taints my favorite language.
NTDLS
@NTDLS Well then we'll have to respectfully disagree. You can pry my boost library from my cold dead hands.
Simon Steele
A: 

What's the question here? Of course it's going to take years before compilers on esoteric platforms implement these features. Don't count on being able to use the new features for 3, maybe 5 years. 'He who then lives, then worries', as we say in Dutch.

Roel
My main concern is that moving ahead with C++0x will cause big trouble in six months' time, due to build times, unreadable code and lack of portability.
Jonas Byström
That's still a *concern*, not a question.
Tyler McHenry
+3  A: 

I would actually consider #3 the biggest risk in the short term. AFAIK, the standard introduces new syntax in a couple of areas (lambdas) and changes the meaning of existing keywords (auto, for instance). Code using these features can only be portable if the compiler for every platform you deploy to supports them.

Sure, this will happen at some point. But adding a new feature to a compiler is no small feat and takes quite a bit of time. My fear is that it will take too long for these features to be supported in the main compilers, which would keep a programmer from being both an early adopter and portable.
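For illustration, a minimal sketch of the two syntax changes in question (a made-up example; a pre-C++0x compiler will reject both of the marked lines outright):

    #include <algorithm>
    #include <cstddef>
    #include <vector>

    // Counts the positive elements of a vector. This only compiles where the
    // compiler supports C++0x lambdas and the new type-deducing 'auto'.
    std::size_t count_positive(const std::vector<int>& v)
    {
        auto limit = 0;  // 'auto' now deduces int; in C++98 it was a storage-class specifier
        return std::count_if(v.begin(), v.end(),
                             [limit](int x) { return x > limit; });  // new lambda syntax
    }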

JaredPar
This is true, but the standard takes a very long time to settle as well, and compiler implementers generally make their changes during that time. See my comment about GCC and VC already supporting most of these changes.
Simon Steele
The problem will be the same as the one we already lived through 10 years ago: different compilers supporting different subsets of the standard, with different interpretations of the ambiguous parts. Portability and evolution are somewhat antinomic. A possible strategy is to standardize on one compiler -- g++, como? -- for uniform language support on all platforms, but that has other costs.
AProgrammer
Aren't they taking better care not to make the same mistake this time around?
Jonas Byström
Well, that's the main point of having it be a standard - you have to balance competing interests from different parties and produce something that almost everyone can agree on. I think it does help - but obviously it doesn't work 100%. Having a reference implementation might do something, but no one implementation supports all the OSes and platforms that people want to develop C++ code for.
Greg Rogers
@Jonas, practically, what do you propose? Implementing everything that is new takes time, so implementations provide it piecewise. Some things are easier in one implementation than in others. Some things may already exist as extensions (say, long long...). And obviously in g++, what gets done first is what is wanted by those able to put in the resources to do it.
AProgrammer
I have no proposal; I just want to know whether C++0x is worth the trouble for the next couple of years. I have no doubt that the new standard will make it a better language in the long run.
Jonas Byström
+12  A: 
  1. You pay for only what you use. If you don't need a complex template feature, don't #include the headers it's defined in, and you won't have to deal with it.
  2. Lambda functions should reduce the verbosity of STL algorithms a good bit, and auto variables will help with code like std::map<foo, std::shared_ptr<std::vector<bar> > >::const_iterator... (see the sketch after this list).
  3. Yes, it will take a while. Many of the new features are indeed in boost, and if you want portability that's what you should be using for at least a few years after the standard is implemented. Fortunately there are only two compilers that cover those platforms you mentioned: g++ and Microsoft's C++ compiler. Once they get support, it's just a matter of time before the embedded toolchains get rebuilt with the new versions. Unfortunately, possibly a lot of time...
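A minimal sketch of point 2, assuming foo and bar are placeholder types standing in for whatever the map actually holds:

    #include <map>
    #include <memory>
    #include <vector>

    typedef int foo;     // placeholder types, purely for illustration
    typedef double bar;

    void walk(const std::map<foo, std::shared_ptr<std::vector<bar> > >& m)
    {
        // C++98: the full iterator type has to be spelled out
        // std::map<foo, std::shared_ptr<std::vector<bar> > >::const_iterator it = m.begin();

        // C++0x: the compiler deduces it from the initializer
        auto it = m.begin();
        (void)it;  // silence unused-variable warnings in this toy example
    }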
bdonlan
Lambdas didn't look as slick as I had hoped; to me they looked like longer but fewer lines. Perhaps when I get used to them, I'll be able to cut some code.
Jonas Byström
It's a hell of a lot better than boost::lambda, that's for sure :)
bdonlan
Hehe, safe to say that no-one will disagree there! :)
Jonas Byström
Wii doesn't use either of the compilers you mentioned.
Andrew Khosravian
It does if you're doing homebrew ;)
bdonlan
+5  A: 
  1. C++ is always going to have hopeless compile times; its whole philosophy is to do things once (i.e. at compile time) so you don't have to repeat them at runtime and lose performance. And as others have said, don't include a library if you don't need it!
  2. C++ will never be very Pythonic because of its aim of being backwards compatible. The verbosity comes from being an old language that had many things added on as it evolved. As others have said, lambdas and auto variables will greatly reduce the verbosity as well.
  3. This is a problem with any big change to a language, but I think it's widely agreed that the changes will make the language a lot easier to use, so it should get adopted quickly.
Charles Ma
+8  A: 

Edit: do I (and others like me) have to keep a very close eye on build times, unreadable code and lack of portability and do massive prototyping to ensure that it's safe to move on with the new standard?

Yes. But you have to do all these things with the current standard as well. I don't see that it is getting any worse with C++0x.

C++ build times have always sucked. There's no reason why C++0x should be slower than it is today, though. As always, you only include the headers you need. And each header has not grown noticeably bigger, as far as I can tell.

Of course, Concepts were one of the big unknowns here, and it was feared that they would slow down compile times dramatically, which was one of the many reasons why they were cut.

C++ easily becomes unreadable if you're not careful. Again, nothing new there. And again, C++0x offers a lot of tools to help minimize this problem. Lambdas aren't quite as concise as in, say, Python or SML, but they're a hell of a lot more readable than the functors we're having to define today.
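For instance, a rough before/after sketch (IsNegative and strip_negatives_* are just illustrative names):

    #include <algorithm>
    #include <vector>

    // Today: a named functor, defined away from the call site
    struct IsNegative {
        bool operator()(int x) const { return x < 0; }
    };

    void strip_negatives_98(std::vector<int>& v)
    {
        v.erase(std::remove_if(v.begin(), v.end(), IsNegative()), v.end());
    }

    // C++0x: the predicate lives right at the call site
    void strip_negatives_0x(std::vector<int>& v)
    {
        v.erase(std::remove_if(v.begin(), v.end(),
                               [](int x) { return x < 0; }),
                v.end());
    }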

As for portability, C++ is a minefield already. There are no guarantees given for integer type sizes, nor for string encodings. In both cases, C++0x offers the tools to fix this (with Unicode-specific char types, and integers of a guaranteed fixed size).

The upcoming standard nails down a number of issues that currently hinder portability.
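For example, a minimal sketch of those tools (assuming a C++0x standard library that ships <cstdint> and the new character types):

    #include <cstdint>

    std::int32_t  small_id = 0;   // exactly 32 bits on platforms that provide such a type
    std::uint64_t big_id   = 0;   // exactly 64 bits, likewise
    char16_t      c16 = u'x';     // UTF-16 code unit
    char32_t      c32 = U'x';     // UTF-32 code unit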

So overall, yes, the issues you mention are real. They exist today, and they will exist in C++0x. But as far as I can see, C++0x lessens the impact of these problems. It won't make them worse.

You're right, it'll take a while for standards-compliant compilers to become available on all platforms. But I think it'll be a quicker process than it was with C++98.

All the major compiler vendors seem very keen on C++0x support, which wasn't really the case last time around (probably because back then, it was mostly a matter of adjusting and fixing the pre-standard features they already implemented, so it was easier to claim that your pre-standard compiler was "sort of almost nearly C++98-compliant").

I think on the whole, the C++ community is much more standard-focused and forward-looking than a decade ago. If you want to sell your compiler, you're going to have to take C++0x seriously.

But there's definitely going to be a period of several years from when the standard is released until fully (or mostly) compliant compilers are available.

jalf
That's somewhat reassuring, but will this include changes to the old part of the C++ compiler, such as making int a fixed size? If not, you'd conclude that every compiler will have its own set of tweakable settings, which might even make things worse ("3rd party lib this-and-that requires those settings when compiling for C++0x, but these settings for old-school code on this particular platform").
Jonas Byström
No, these are new datatypes. An int will work as it always has, but you'll also have int64_t and similar types that guarantee a specific size. Likewise, char and wchar_t will work as they always have, but char16_t is guaranteed to use UTF-16 encoding and be at least 16 bits wide. Nothing significant should change in the "old" parts of the language. A few bug fixes here and there, but nothing that should cause problems. They take backwards compatibility pretty seriously.
jalf
+1  A: 

One of the advantages of standardizing things is that compilers can take shortcuts. E.g., the Boost template circus needed to implement is_X<U> disappears if the library can simply hand off to __compiler__is_X<U>. This could easily save two orders of magnitude of compile time, sometimes three.
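A rough sketch of what such a hand-off can look like; __is_enum is an intrinsic that current GCC and MSVC happen to provide, whereas a library-only trait needs layers of template machinery to detect the same thing:

    // Intrinsic-backed trait: the whole detection problem collapses
    // to a single compiler query instead of a pile of SFINAE tricks.
    template <typename T>
    struct is_enum {
        static const bool value = __is_enum(T);  // compiler intrinsic (GCC/MSVC)
    };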

MSalters