In the past week I've been following a discussion where Ross Ihaka wrote:
> I’ve been worried for some time that R isn’t going to provide the base that we’re going to need for statistical computation in the future. (It may well be that the future is already upon us.) There are certainly efficiency problems (speed and memory use), but there are more fundamental issues too. Some of these were inherited from S and some are peculiar to R.
He then went on to elaborate. The discussion started at Xi'an's Og and was then picked up in comments on reddit, statalgo, DecisionStats, columbia.edu, Hacker News, the r-help mailing list, and perhaps elsewhere.
As someone who isn't a computer scientist, I am trying to understand what to make of this.
- Is R so flawed that it would be better to rewrite it from scratch than to fix it? Searching on stackoverflow, I came across When to rewrite a code base from scratch and Under what situation should code be rewritten from scratch? (based on Joel Spolsky's article Things You Should Never Do); both threads argue that only a very(!) extreme case justifies rewriting a code base. But is this the case with R?
- Can R be patched in a way that fixes these problems, so that it becomes "the stat language of the future"?
- What about the social aspect of this? R already has a large user base. If R were to "die", is it realistic to expect all of those users to move to a new language?
I don't think this question is subjective, but since it involves so many uncertainties, I decided to mark it as community wiki.