Is it just me or are most colleges teaching Java instead of C++ these days? I feel like I've been missing out, having had zero classes that teach or use C++ at all.
It's been that way for quite a while; many of the Universities here in Australia teach Java over C++ because it's more mainstream - and modern. Mind you, when I went to Uni I got to do two semesters of COBOL and only one of C, so...
Mainstream means that having learned Java, it's a skill you can take out into the real world and apply to a corporate job. C++ is rarely used in greenfield development nowadays, replaced by Java J2EE.
Modern is desirable because things like resource management - memory handling et al. - are the kinds of things that ought to be handled by the language, not the programmer. Programmers ought to be solving problems, not ensuring that every new is matched by a delete. That kind of argument is persuasive to the lecturers who set syllabuses.
So I put it all down to pragmatism.
Colleges believe that Java is easier to learn and easier to teach. There is also a push to give students "real world" experience, and in the "real world" Java programming ability, regardless of real value, is generally considered to be more valuable than C++ programming ability.
I don't claim that this common belief is right, but it is prevalent among project managers and interviewers.
I'd imagine better corporate uptake, better OO support and all round better exposure.
It depends on the university. I think teaching core concepts is the most important part though: Java is probably a better language to teach programming/OOP core concepts with. C++ has lots of idiosyncrasies that you have to learn about (or maybe it's less about idiosyncrasies and more that Java is 'pure OOP'?), which would get in the way of the teaching.
That said, the University I'm attending started with C as the basic language and uses C++ to teach OOP. Java's in there somewhere though.
I also think it's a case of Java being able to quickly bring forward the visual side of programming - getting an app that really does something visual for the student to experience. C/C++ and a console app just don't set most people's hearts buzzing. Granted, C and derived languages can do visuals pretty easily, but Java just does it faster.
In talking with teachers/professors at my school a lot of them said that the first CS course was to get kids hooked on programming, and a slick app is a really easy way to do that.
First of all, don't take a course that is only meant to teach you a language. Unless it's a completely different way of thinking from what you're used to, it'll be a big waste of time. Take courses that focus on algorithms and the like. If the material can't be applied to another language, then it's practically useless in the long run.
I imagine that the reason schools choose Java is because it frees them from having to talk about pointers. I could see that making the job much easier.
I think it's about levels of abstraction. Java is generally taught as an introductory course because as an introduction students do not need to know about things like memory management or pointers. Over a degree these abstractions are given more detail and students can begin to understand where abstractions leak.
I think that demand from industry has played the biggest role. That is to say, demand from industry, filtered through the university bureaucracy, and mixed with pressure to keep up graduation rates.
Java is the easy choice. It's not as hard to pick up as C, it can't be called 'too academic' the way functional languages often are, but it feels ever so slightly more hardcore than, say, PHP. Personally I wish that my first-year class (I'm in third) had been taught in Scheme, but I can just play with that on my own.
I think it also depends on what school you go to. Some schools that don't have a focus on technical skills and lean more towards the liberal arts, or even just generally larger colleges, might believe that Java is easier to learn and offers quicker gratification (applets and quick programs that do things).
I go to a college known for engineering. Here the intro to the CS degree is taught with Python. Then OO moves on with Java, then Assembly and C.
However, the only CS the vast majority take is taught with MATLAB.
I was in the business school in college and our first programming class was C++. This of course was not C++ for majors, but it still gave me a bad feeling that I wouldn't do well in the class.
To a business major, Java tends to feel more universal, and it teaches OOP concepts better than C++. But then again, in my C++ class we really only went through basic programming concepts.
My only Java class was in my senior year, when it was my "senior OOP" class. It was great at teaching me how to program the topics I learned in my Object Design class from my junior year.
I think it's a combination of two things:
1 - Most jobs out there right now are Java. Schools look better by producing Java experts because they enter jobs without needing to be trained in the language.
2 - It's easier to teach a concept like Operating Systems or HTTP Networking when you can just focus on the concepts and not have the students worrying about things like memory management.
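The second point is easy to make concrete. Here is a toy Java sketch of my own (the class and its fields are not from any course) showing the kind of HTTP-networking exercise that stays purely conceptual in Java: the student parses a request line and never touches a buffer, an allocation, or a free.

```java
// A tiny sketch of a conceptual networking exercise in Java: parse an
// HTTP request line. Strings are garbage-collected, so there is no
// buffer sizing or ownership to teach alongside the protocol itself.
public class RequestLine {
    final String method;
    final String path;
    final String version;

    RequestLine(String raw) {
        // An HTTP request line has exactly three space-separated parts,
        // e.g. "GET /index.html HTTP/1.1".
        String[] parts = raw.trim().split("\\s+");
        if (parts.length != 3) {
            throw new IllegalArgumentException("malformed request line: " + raw);
        }
        method = parts[0];
        path = parts[1];
        version = parts[2];
    }

    public static void main(String[] args) {
        RequestLine r = new RequestLine("GET /index.html HTTP/1.1");
        System.out.println(r.method + " " + r.path);
    }
}
```

The equivalent C exercise drags in `malloc`, string ownership, and off-by-one bugs before the student ever gets to think about HTTP, which is exactly the distraction this answer is describing.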
I finished college about 5 years ago; it was ALL Java except for a couple of specialized courses where C++ or C were used. For example, in Game Programming we used DirectX, in Graphics Programming we used OpenGL, etc.
I just finished four years at Virginia Tech as a Computer Science undergrad. My languages in coursework went as follows:
- Java in high school, as the AP Computer Science exam had gone to Java that year.
- Java for Intro to Object-Oriented Programming
- C++ for Object-Oriented Analysis and Design
- C++ for Data Structures (PR-quad trees, B-trees, 2-3 trees, binary search trees)
- C for OS (extending PintOS from Stanford with userland programs, a filesystem, virtual memory, and multi-threading and dynamic priority scheduling).
All things considered, I think it's a great way to go about it. I learned object-oriented programming from the ground up, without a lot of language syntax getting in the way (I'm looking at you, C++, and your pure virtual templated function syntax); I learned pointers and detailed memory management once I was comfortable with basic object-oriented principles, and learned to blow my foot off with C in OS.
Working now on OS X and iPhone applications using Objective-C and Cocoa, I feel completely comfortable handling high-level object abstractions in Cocoa, and handling memory with retain-release memory management in Obj-C.
Java is, in a sense, easier to learn because it hides pointers from the programmer. It allows newer programmers to easily write programs that use dynamic memory for things such as text, graphics, and input, unlike C++, where memory must be managed by hand or with a smart pointer, which beginners may not even know exists until they stumble across it.
A second reason is that companies are writing programs in Java because of the easier learning curve which produces a larger pool of programmers to hire. Java will also run on any machine that has a JVM - Windows, Mac, Linux, and many cell phones as well. Write it once, run it everywhere!
However, there is a niche where the lower-level capabilities of C++ come in handy - embedded systems programming. Many such devices have memory-mapped IO that cannot be accessed without using pointers, rendering the higher-level languages nearly unusable.
In the end though, it should really come down to using the correct tool for the problem.
The only languages I was "taught" in school were Pascal (this was 20 years ago) and 80x86 assembler. Every other language we used in school, we taught ourselves.
Java is a reasonable choice for an introductory programming course, but students who don't learn anything else are going to be at a disadvantage when they look for jobs after school.
Many universities also do not teach basic concepts such as pointer arithmetic and recursion. I was actually told by one of my professors not to use pointer arithmetic, but to use array index notation instead. What? Do they think that we cannot grasp these concepts, that they are too difficult? This is fine if all I want to do is go and write banking software, but what if I want to write the next programming language or a web server? I for one am very disappointed in the education that I have received thus far. If only I could go to MIT...
I've discussed this with my professors. I've discussed it with people in the industry. I've even discussed it online, and if that's not definitive, I don't know what is.
I believe it comes down to 3 things.
3. Industry pressure
There's a lot of industry pressure to produce students who are experienced with Java. Companies are using Java a lot these days, so they feel the best way to get great Java programmers is to have universities teach Java.
Unfortunately, the concentration on Java has been to the detriment of the students. By giving students so much exposure to Java, they've reduced the exposure to other languages. The result is that most people who graduate with a computer science degree are not proficient in any low-level language (C, C++), nor are they proficient in any theory-based language (Haskell, Lisp).
Also unfortunately, it turns out that the industry is wrong. What they actually want are good programmers (code grinders are not hard to come by). Good programmers can use Java without any problem. The concentration on teaching Java has simply made it less common for students to be exposed to non-Java concepts. And it's really hard to be a good programmer if all you know is Java. Hence, the concentration on Java may actually be causing a decline in the number of good programmers being graduated.
2. Belief that high-level is better
There's a fairly pervasive belief in computer science that it's better to teach students high-level concepts than to teach them low-level details. Memory is a low-level detail. Pointers are low-level. Strings are (sadly) low-level. Objects are high-level. Object-oriented programming has become the ultimate goal.
With this mindset, it's easy to see that it's more important to understand the "concept" of a linked list than to understand the implementation. Sure, we have students implement simple linked lists. But they do it in Java, where memory is free and pointers are crippled. We also give them two weeks and enough reference material to cut and paste 99% of it. The end result is that many students don't really understand linked lists, which is why you'll see them two years later fetching every element of a linked list via a for-loop and the get() method. They missed the memo that this is really expensive. You'll see these same people concatenating strings in a loop, instead of using a StringBuilder or a StringBuffer, because they really have no idea what's going on. Thank God we saved them from all the low-level details about how things really work.
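Both pitfalls described here are easy to demonstrate. The following Java sketch is my own illustration (the class name and loop bounds are arbitrary): calling get(i) on a LinkedList walks from the head on every call, so the indexed loop is O(n^2), while the iterator-based for-each walks the links once; likewise += on a String copies the whole string on every pass, while StringBuilder appends into a growable buffer.

```java
import java.util.LinkedList;
import java.util.List;

public class TraversalDemo {
    public static void main(String[] args) {
        List<Integer> list = new LinkedList<>();
        for (int i = 0; i < 10_000; i++) {
            list.add(i);
        }

        // Anti-pattern: get(i) walks the links from the head on each
        // call, so this innocent-looking loop is O(n^2) on a LinkedList.
        long slowSum = 0;
        for (int i = 0; i < list.size(); i++) {
            slowSum += list.get(i);
        }

        // Better: a for-each loop uses the list's iterator and walks
        // the links exactly once, O(n).
        long fastSum = 0;
        for (int value : list) {
            fastSum += value;
        }
        System.out.println(slowSum == fastSum);

        // Same story with strings: += in a loop copies the whole
        // string every iteration...
        String s = "";
        for (int i = 0; i < 100; i++) {
            s += i;
        }
        // ...while StringBuilder appends into a growable buffer.
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 100; i++) {
            sb.append(i);
        }
        System.out.println(s.equals(sb.toString()));
    }
}
```

The two versions of each task produce identical results; only the hidden cost differs, which is precisely what a student who never implemented the structure by hand has no intuition for.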
1. Declining enrollment
Computer science departments are scared to death about falling CS enrollment. They got really spoiled during the .com bubble, and forgot that they are, in essence, a science. Their enrollment isn't supposed to match the English department's. It's supposed to match Physics'.
Since the bubble burst, there's been an enormous push at all levels of academia to keep enrollment high and to keep the graduation rate nearly as high. This means that they've knowingly simplified the curricula to keep from scaring students off. They really, really want to seduce people into computer science, so they are doing everything they can to hide the ugly details of the field. That means beginning courses use easier languages. It means that the students aren't graded, or pushed, as hard. It means that they have dumbed it down, because they don't know what else to do. They are afraid to push people out of the field, because they can't get enough people to replace them. The end result is that the entire field gets watered down.
Frankly, I don't envy computer science departments. On the one hand, I do think the quality of programmers would rise if they'd just push really hard at first. It would weed out people early, and allow them to raise the bar for the rest of the students. On the other hand, I think if they pushed harder, they would cause enrollment to plummet. This would result in a backlash from the university, from the industry, and from the government. No one wants enrollment to drop, but no one has come up with a reliable way to increase enrollment.
So they teach Java, and pray that most students don't quit.
The reason that first comes to mind is that schools are out to teach CS rather than 'programming'.
Java was chosen at my former school because it provided an easy way to start doing some real work quickly with regards to data structures and algorithms, while abstracting away the messy details of how a computer actually works that might be too much all at once for a first year student.
If you're a professor, would you rather spend the bulk of your student's time trying to get them to really understand a concept such as node manipulation in a B-Tree, or would you prefer they rip their hair out on top of this once pointers become involved?
Java also provides a nice bridge to other similar syntax languages in classes that are 'closer to the metal', such as compilers or OS.
On top of all of this, the Java SDK is free for students to download, and there are popular tools such as Eclipse that are also free and provide a uniform experience regardless of what platform their home machine is running on.
One answer is that Java has a less severe learning curve than C++, so there aren't as many syntactical and other gotchas in the way before you can get to learning the concepts.
I'd also say that C/C++ now occupy the space assembly language occupied 10-15 years ago, and Java/C# now occupy the space where C/C++ were.
I recently had a friend in the business world ask me how he could learn to program. In response, I enthusiastically recommended Deitel and Deitel's "C++: How to Program". I believe that it is THE book to learn programming on. Think of my friend as Joe Student at University of So and So where the CS department doesn't believe in teaching the low level details. Is he at a disadvantage? Yes, but a recoverable one if he is willing to put in effort to go out of the box.
The disadvantaged need town criers, though, because they don't even know that they need to learn these things. The professors are telling them they should learn this and that, but not C++ and pointers. I remember reading a Java book once that declined to go into the details of some of the class algorithms with the excuse that "some very smart people wrote these things so that we don't have to"; were they saying that, as a reader of the book, I am not as smart as those people? So we have to cry out the need, and we have to provide learning forums to recover our science.
In response, to both my friend and posts like this, I've called up my local school district and offered to get involved at the grade and high school levels to teach Deitel and Deitel's C++ book to kids.
At University I was taught 1 year of Java, then a year of C++, then Java again in the 3rd year. Both are very similar.
I think Java is well suited for most general programming or software engineering courses. However, for courses on data structures (such as trees, linked lists etc.) that are made up of pointers, it seems irrational to teach them in a language which does not have pointers.
Fortunately some universities understand that (see CipherSwarm's post above).
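A mitigating point worth noting: Java's object references do stand in for pointers when teaching linked structures, even though they cannot be incremented or have arithmetic done on them. A minimal sketch of my own (the class and method names are not from any course):

```java
// A singly linked list built from object references, which play the
// role that pointers play in the C/C++ version of the same exercise.
public class IntList {
    private static class Node {
        int value;
        Node next; // a reference: Java's restricted pointer

        Node(int value, Node next) {
            this.value = value;
            this.next = next;
        }
    }

    private Node head;

    // Prepend in O(1) by rewiring a single reference, exactly as one
    // would rewire a pointer in C.
    public void push(int value) {
        head = new Node(value, head);
    }

    // Traverse by following references until null, the reference
    // analogue of following pointers until NULL.
    public int sum() {
        int total = 0;
        for (Node n = head; n != null; n = n.next) {
            total += n.value;
        }
        return total;
    }
}
```

What students cannot do in Java is pointer arithmetic, taking the address of a local variable, or freeing a node by hand, which is exactly the gap this answer is pointing at.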
As a few other people have commented, I personally believe the reason many universities are going with Java is that it is fairly easy to pick up and learn, allowing the professors to focus more on the theories and concepts behind our field. While I understand people arguing for universities to teach more languages, it takes something away when the student has to learn syntax. If they know the concepts and ideas, they should be able to pick up just about any language. I also think that students who want to be exposed to various languages can easily do so on their own. Just my two cents (although slightly slanted, since the university I attended mainly taught Java as well).
Here at the University of Colorado at Boulder, we use Python in the first half of the first CS course and C++ for most of the rest. I am glad that I am expected to deal with a systems-level language for the core courses like data structures and algorithms, having programmed in Basic, C, Java, C#, and others. Some more specialized classes like Operating Systems also deal a bit with x86 assembly.
I asked one of the professors about the language choices, and he talked about the following (paraphrased):
Python is very useful in the beginning class because it has an interactive interpreter, so students can see what happens with short lines or snippets of code, which helps them quickly learn the basics of programming, while avoiding complexities like compilation. As soon as they start getting comfortable with Python, they start using C++ for the second half of the first CS course, so that they are ready for the later courses.
Using a systems-level programming language teaches data structures and algorithms better because you can better understand how the code interacts with the actual computational hardware, and more so with using a bit of assembler code for students focusing on systems programming.
Dealing with pointers, memory addresses, and manually managing memory is viewed as a vital part of the CS curriculum to produce students who know the finer details of computing theory and practice.
The belief that it is easier to move from C++ to Java, C#, and so on than it is to go the other way around. Even though C++ is not the best language for many tasks, it is similar enough to most other languages that knowing it lets you pick them up quite easily.
I don't think C++ is any good. Beginners should be taught C. Then they can go and learn Java on their own if they want.
My first programming class was PDP-11 assembly. We learned the mechanics of memory management, pointers, and the difference between stacks and heaps. That class made you appreciate all the stuff going on when you make a function call: pushing the input parameters and the return address onto the stack, and so on.
Having spent a lot of time programming in both C++ and Java I would say that it is because people recognise, consciously or unconsciously, that Java has lower accidental complexity than C++. Put simply, it is easier to get more done with fewer lines of code and that helps when trying to teach new concepts.
I had everything in the first year in C at the University of Sao Paulo (USP, Brazil) in my basic data structures course (and some logic gate programming in hardware class). Then in the second year, VHDL (for hardware class) and more C (Data Structures II). Now I'm starting Java in my OOP class, which is great for teaching the concepts of OOP; C++ has many pitfalls that teach very bad practices if you are inexperienced. So I think Java is good for teaching OOP concepts.
I remember my first year: almost everybody in my class did not get pointers; they were completely lost. It took two months just for them to learn to build a simple list, but afterwards everything went smoothly, including binary trees.
At my university they teach concepts, not languages. How can you learn what a list is without pointers? How can you learn to deal with objects if you can slip back into the old paradigm by accident?
As a general tip I think when looking for colleges for Computer Science you should look less into what languages each of the classes are based on, and instead what those classes cover. Ask a recent graduate if they feel they could pick up any language within a short period of time, and if the answer is yes then they most likely got a proper CS education.
I attended Ohio University from 2000-2004 and with the exception of an entry level pascal class it was all C and C++. It was most important that we learned how to program, use good practices, and solve problems. Syntax of different languages should be of little concern once you have a Computer Science degree from a solid curriculum. I have been a Java engineer for 4 years now.
The AP exam.
High schools that provide the AP to their students have no choice. Colleges that accept the AP exam feel the need to use the language their incoming students already know.
There is a really good page about this: C++ in der Schule ("C++ in school"). Sadly, it is only in German, but if you can read it, it's worth reading.
Java instead of C++? Hopefully most colleges teach more than one language.
Java has its merits though. It's easier to work with, which allows people to focus a bit more on the stuff they're supposed to be learning, than on debugging segfaults and trying to get their code to compile. My college teaches Java in the OOP course, which I feel makes good sense. Java is built around OOP, while in C++, it is just one of a bucketful of options, all of which you have to master. They also teach SML before Java (as the very first course), as an intro to functional programming. And while we don't have a dedicated "C++ course", plenty of courses use it, so students tend to pick it up anyway.
I don't think it's an 'either or' question. Any decent CS college should treat languages as tools to be picked up when necessary, rather than trying to teach the One True Language. There are plenty of reasons why it's convenient for a CS student to know Java. It just shouldn't be the only language they teach.
When I taught CS, we taught Basic at the intro level, then Pascal. The main reason was ease of teaching. We shied away from C because, with very inexperienced programmers, there were too many things to explain and too many ways to shoot yourself in the foot. C++ suffers from the same issues.
That is not to say C or C++ are bad at all. Personally I prefer them. But as a teacher, do you really want to be trying to explain to 18-year-old Suzy or Sam the difference between a character array and character pointer, or what a null pointer is, or even what a pointer is? As a teacher you prefer a "nanny language".
I was in a somewhat unique situation when going through college. The first 2 years, my college was big on C++, so almost everything was taught using C++, and I really got to learn all the little details and mechanics of programming. Then starting my 3rd year, they decided to switch to Java due to industry pressure, so I had my last 2 years in a more high-level language, and got to learn things from that perspective as well.
It was somewhat awkward at the time, but looking back, I think it worked out well, as I sort of got the best of both worlds.
I have been taking classes that use C and C++ extensively. There are only 3 classes that use Java in the entire department: the intro class, an undergrad-level OOP class, and an undergrad-level DB class. After graduating, if I have to get a job that uses only high-level languages, I would probably end up quitting and going back to grad school.
Talking with the professor at my current institution of higher learning who pioneered our migration from C++ to Java several years ago was rather enlightening. His staunch belief is that desktop programming is dead and everything is moving to the web, where J2EE, JSP, etc. tend to be major players. Hence, students need to be preparing to write web-side code and Java apps instead of messing with C++, pointers, memory leaks, etc. After all, if it's a common data structure, Java already has it built in. At least, that's why we switched from C++ to Java.
As a semi-new user of both Java and C++, and of programming in general (and by new I mean 3-4 years of use), my opinion may stir up controversy among people who have been in the industry longer and have more experience.
The courses offered at my institution have also been taken over by Java (with the exception of anything dealing with hardware, assembly, and operating system design). The year before I arrived, there was a C++ programming course offered here, which I looked forward to taking. I arrived only to realize that they had canceled the class due to small numbers in our department and lack of faculty. Or what I like to call laziness... As of now, our Electrical Engineering department offers the C++ classes and our Computer Science department strictly focuses on the Java framework, with the above exceptions.
Throughout my time studying Java, I have asked numerous times why my department head and faculty have taken a route that restricts us to virtually only learning Java. I have gotten multiple responses that all have a common theme. For example, in one of my courses, "Enterprise Java", I asked my professor this very question and received the response, "The java.net package is very easy for students to use, and does a lot of the messy work for the programmer, unlike languages like C, C++, etc." I didn't really think much of what he said, so I let it go.
In another course I took, Data Structures and Algorithms, I asked the same question and got the response, "The Java library lets students use ADTs from the standard library, so they don't have to worry about all the small coding details."
I could keep typing examples of me asking the question, but I'm hoping you see the trend. In my opinion, although some libraries may be designed for certain functions or may be easier to use than others, this does not validate a reason to strictly use one language over another. I do believe that Java is very useful for web programming, and I do believe that its data structure definitions are very useful, but I don't think that these are reasons to teach the language.
Although I'm trying not to sound redundant with other responses, I feel that Java is well suited as an introductory language for certain reasons, but it would be very helpful to bring other languages into the introductory curriculum.
If nothing I have said has made any sense, then let me show you an example of why I wish I had been taught or was more familiar with C++ or C.
On the first day of my programming language design class, I was asked a question about a block of code that contained a reference pointer, and I could not answer it because I had no knowledge of pointers up to that point in my studies (3 years). Yet I am graduating soon, top of my class. Something is not right...
I think Java is a fun language to work with. In Java you don't need to clear memory for unused objects; the garbage collector is there to clean up. It actually reduces the work for a programmer.
From my point of view, I would choose another. Just imagine the first day, a hello world example.
In Java:
class HelloWorldApp {
    public static void main(String[] args) {
        System.out.println("Hello World!");
    }
}
In C:
#include <stdio.h>

int main()
{
    printf("Hello World!\n");
    return 0;
}
Think about what amount of knowledge you need to understand it. In Java, you need class definitions, method definitions, static methods, function calls, argument passing, arrays, dot syntax, string syntax... In C, you need the #include preprocessor, function definitions, function calls, string syntax, return...
My best first time language may be Python:
print "Hello World!"
It is simple enough to start with.
As a school student in the UK looking at universities to go to for my BS, I have talked to professors a bit about it, as well as to uni students.
The last thing I heard (from Southampton, which is in the top 10 in the UK, give or take) is that it was because Java is one of the standard industry languages. That sort of makes sense, given how few Haskell jobs there are compared to Java. Universities are also becoming a lot closer to industry: most intro lectures in CS departments have a "We are working with IBM, Microsoft, etc." slide, or recruiters come looking from company X.
I am not saying they are choosing Java solely because of industry pressure, but because it enables people walking out of their university to get a job. This is not just because they want their students to get a job; part of it is because it reflects on their rankings, which is important to attract students.
It doesn't mean they don't teach things like Haskell, C, Prolog, etc.; they just don't teach them first. Any competent person walking out of uni with a BS in CS should be able to pick up any language given to them without too much difficulty, so I don't think it matters what language you start people with. There will always be, and have always been, the bad, the average and the good. The good got a 1st and have no problems with pointers and recursion, which are the major things that Java doesn't teach.
Interestingly Newcastle (UK) is now starting with JavaScript next year, rather than Java. I haven't yet got around to asking why, but I would love to know the answer.
On the side that supports your argument that it is dumbing down students: I was talking to a 3rd-year student (I think she got a 2-1), who I don't think had done any programming before she entered university, and she said she had done her final project in Java because "it allowed her to create nice GUIs", which didn't say much for her experience in other languages. (It wasn't a bad university either.)
This article by Joel Spolsky, one of the creators of stack overflow, addresses your question pretty well:
I was at the University of Kent in 1997 when they switched to Java as the primary teaching language (from Modula-3). I believe Kent was the first university in the UK to do so (it subsequently became the first Sun-certified Java Campus in Europe). It should be noted that, in a gesture completely unrelated to the switch, Sun Microsystems donated a shedload of hardware to the computing laboratory.
FWIW, UC Berkeley (currently) and MIT (at least in the past) used functional programming languages, which I believe is the best way to go. I would argue, as would Abelson and Sussman (both experts in the field of computer science education), that the use of a language like Scheme (a sort-of dialect of LISP) allows for a strong understanding and strong knowledge retention. From my experience and that of my colleagues, students who complete the "Structure and Interpretation of Computer Programs" book by Abelson and Sussman are at a point where they can learn the basics of any programming language in a weekend, and a new paradigm in under a week (again, anecdotal, but I currently know 11 or so programming languages, and before I took that class I essentially knew none). Also, why do you believe that C or Pascal are of pedagogical value? Pascal has nice pedagogical features; C is rather abstruse for the beginning programmer IMO, and the syntactic hurdles can interfere with the learning curve. I would be very interested in your motivation for advocating the teaching of Pascal and C, as I do research in computer science and math education, and new pedagogical theories, whether from the layperson or the research scientist (in education), are always fascinating to me!
As a student from a Java school (mostly), I'd say any student that's only learned a high-level language is going to be fine, as long as he's only working in high-level languages.
To draw a terrible auto-mechanic analogy, you wouldn't take an engineer that's been working on tractor engines all his life, show him a Porsche engine and expect him to be able to service it (unless he happened to have a Porsche at home that he tinkers with).
(In other words, if you're hiring for a C/embedded/C++ position, don't expect a pure-Java student that doesn't know any low-level languages to be what you're looking for.)
On a side-note, there are definitely fields of study within CS where low-level languages are more of a hindrance than a help. "Why is my neural network segfaulting?" (We wasted more time in our AI course debugging segfaults than actually learning AI concepts.)
On the other hand, knowing what I do from my schooling (C/Prolog/Haskell, some C++) has made me a better programmer overall (via different ways of thinking about things, more knowledge of what the computer's actually doing, ability to program efficiently, etc.).
edit: If I would push for anything, it'd be for a bigger focus on 'pure' functional languages. Haskell is a complete brain-bender when you're learning it, but it will change the way you approach any programming question.
edit 2: @Breton - yeah, I got a bit off-topic there. The stated reason we were given for "Why Java?" was so they could focus on concepts and algorithms rather than getting stuck in the nitty-gritty of a lower-level language. The more cynical side of me suggests it was so they could spread the washout rate across 25% in first year and 25% in second year, rather than having 50% of people fail an 'entry-level' course (my university still jammed C down our throats from second year onwards).
Since I can't seem to comment on other answers, I will say this here. The link that Robert Harvey posted, at least from my brief scan, is so wrong as to be laughable in current educational circles. Anyone can be taught how to program; the evidence they seem to present in support of the proposition that many CANNOT learn programming is NOT solid in the least, and there is MUCH solid evidence against it.
Read Papert, read Harel, read Blikstein, read any constructivist education researcher on the subject, and it will become clear very quickly that, if the early statements in the link reflect the rest of the diatribe, it is utterly false. Elementary school children have been taught how to program, as have children at every other tier of the K-12 process.
(disclaimer: I am quite connected to this topic as it is what I research at UC Berkeley)
(double disclaimer: no offense meant to Robert; I do not criticize you, and I do not know your views on the subject. I do, however, criticize the writers of the paper, at least with what little I know of it.)
Given the historical context of the late 1990s, these decisions make sense. Java was the up-and-coming industry language, supported object orientation in a clean way, featured garbage collection, and so on. It's not a bad first language, and if you teach C/C++ in systems programming and add in at least a couple more advanced languages, you can probably strike the right balance between giving your students better chances in the current job market and preparing them for future developments.
You don't want to be a monoglot though, so if your school is mainly offering Java, try to take courses that use other languages (AI, systems programming, web, ...), or do some independent study, or even change universities.
I'm old enough to have used assembly first (on a 1 MHz 8080) and then Pascal (on a 2 MHz 8085). At that point, Pascal was both an academic and an industrial language. IMO, the market moved to C on the basis of benchmarks: run-time checks cost, and at a few MHz they matter. Schools still liked the academic roots of Pascal, and the run-time checks, but what were they going to do at the OO transition? I don't remember a widely supported object-oriented Pascal standard. Coming into this, Java had some obvious strengths. On the technical side, it was both object-oriented and multi-threaded. On the pragmatic side, it was freely available and fairly cross-platform. And there are jobs for it, as there were for Pascal way back when.
I think it makes perfect sense to teach Java.
What would the alternative object-oriented language be? C++? That's a tough row to hoe if you've never programmed before.
Would C# be considered a worthy alternative now? Why would that be any different from teaching Java?
Java's a perfectly defensible balance between rigor and preparedness for work after university.
That should not be the only language a student is exposed to. C, Lisp or Scheme, and a functional language like ML, Haskell, or F# certainly ought to be included in the curriculum. Python gives a nice mix of things and has some wonderful libraries; I would consider it an excellent choice.
I'd also say that exposure to a real data structures and algorithms class, numerical methods, compilers, operating systems, etc. should be an integral part of the curriculum.
The original question implies that someone is getting a computer science degree by simply learning the Java API and doing a few class hierarchies with Animal or Shape at the root. If that's the case I would object as well, but I don't believe it's true.
Didn't I hear Jeff Atwood say on a recent podcast that mashing up different bits was what modern programming was all about? Is Joel complaining about Jeff Atwood's skills because he's using .NET and not Scheme or ML or any of the harder languages that he had to suffer through back when he was going to Yale?
Joel's citation is five years old now. I'm not sure that it's 100% pertinent anymore. He sounds like he's turning into an old guy grousing about how easy young kids have it today.
Just my guess for the broader appeal of Java:
- Java doesn't suck like C, i.e. it is a real high-level language, has a garbage collector and no pointer arithmetic.
- Unlike Pascal, Java is object-oriented, which makes Pascal somewhat dated.
- Functional languages are nice for a number of things, but an object-oriented language is better suited for many practical problems.
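To make the garbage-collection point above concrete, here's a minimal Java sketch (the class and method names are mine, just for illustration): objects are allocated with `new`, but there is no matching `delete` or `free` anywhere; the runtime reclaims unreachable buffers automatically.

```java
import java.util.ArrayList;
import java.util.List;

public class NoDelete {
    // Allocate n heap buffers; the caller never has to free them.
    static List<int[]> buildBuffers(int n) {
        List<int[]> buffers = new ArrayList<>();
        for (int i = 0; i < n; i++) {
            buffers.add(new int[1024]); // allocated on the heap with new
        }
        return buffers; // no delete: the GC reclaims them once unreachable
    }

    public static void main(String[] args) {
        List<int[]> buffers = buildBuffers(1000);
        System.out.println("allocated " + buffers.size()
                + " buffers; none need explicit freeing");
    }
}
```

In C or C++ every one of those allocations would need a matching `free`/`delete`, and there is no pointer arithmetic here to get wrong either.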
Personal experience: they don't have professors who are experienced in C++ and are therefore not able to teach it.
I am getting ready to graduate with a BS in Software Engineering, and the only programming classes I have taken are Java I, Java II, VB.NET I, and VB.NET II. The only "C" exposure I had was a project I was thrown into at the last minute, after the other team members had already picked it (some stuff with OpenCV). Neither of my team members had any C++ experience either, and the project has been very difficult. I don't mind taking the Java classes, but I would really have liked to have some "C" experience, preferably C#. Why they taught us VB.NET over C# I will never know.
Java is better designed and has better support in terms of libraries, editors, and tools. Java works well for batch processing and for web development.
When I was in college, we were taught PL/I in our first semester. After that we were expected to pick up new languages on our own time. The professors would say, "We'll be using C this semester..." and that would be that. Except for C, all the languages we used back then have fallen by the wayside for the most part... COBOL, Fortran, PL/I, Pascal, BASIC, various assemblers...
If I were to recommend languages for new people to learn, they would be Ruby, Java, and C. A modern OO scripting language, a modern OO statically-typed language, and a decent language when it's time to get dirty. And if I could only pick 2, I'd probably drop Java out of the list.