views: 3877
answers: 32
Over the summer, I was fortunate enough to get into Google Summer of Code. I learned a lot (probably more than I've learned in the sum of all my university coursework). I'm really wondering why they don't teach a few of the things I learned sooner in school though. To name a few:

  • unit testing
  • version control
  • agile development

It seems to me that they spend a significant amount of time teaching other things like data structures and algorithms up front. While I still think those are very important to learn early on, why don't they teach more of these three before them? Or is it just my school that doesn't teach much of this stuff?

Don't get me wrong, I don't think it's desirable for universities to always teach the trendiest programming fads, but shouldn't my professors be teaching me something other than "draw a diagram before you start coding?"

+6  A: 

Oh god, don't get me started.

I once had the dean of CS at a reputable university tell me that object-oriented programming was just a 'fad', so they didn't offer any classes in passing fancies like C++.

As to why they don't teach these things: well, the college is there to teach you the fundamentals of the discipline, not necessarily the best practices of the industry.

Steven A. Lowe
Or to put it another way, universities see their role (rightly or wrongly) as providing academic education rather than vocational training. That's why many fresh graduates know very little about the craft of real-world programming (e.g. writing maintainable code).
Andrew Swan
And now all they teach (at least in the first couple of years) at many universities is Java. Ah, the irony.
Matthew Schinckel
When did he tell you that OOP was a fad? Until the advent of Java, OOP was closer to a fad than required knowledge.
drewster
@[drewster]: 1994, though I think you give Java far too much credit. OOP is a logical progression in programming-language evolution; to call it a "fad" at any stage of its history (much less in 1994) indicates a level of ignorance beyond the pale for a CS dean.
Steven A. Lowe
Regardless, OOP was not mainstream until Java became widespread. In hindsight, it's pretty clear that OOP design is important, but back in 1994 I don't think it was obvious. It was only in 1998 that C++ was standardized.
drewster
@[drewster]: LOL, well that's one way of looking at it. Another way of looking at the same time period is that OOP was not mainstream until universities began teaching it, with Java being the favorite college-class language. So there's a chicken-and-egg issue here ;-) In those days universities (with the exception of MIT and a very few others) were 10-15 years behind the market in what they taught, which was the point of the original question. Appalling lack of foresight should not be surprising, even from alleged centers of learning, but *total ignorance* is another matter!
Steven A. Lowe
What's with the false dichotomy between academic and real-world/practical? Almost every single idea you're using in your "real-world" work came from the academic community or was improved by it. Where do you think the lack of GOTO came from? Objects came from computing scientists in 1967. A lot of CS people weren't clear on the advantages of OOP, and it's still an undecided thing. Industry thinks it helps, but there are a lot of failed projects that prove otherwise.
omouse
@[omouse]: there's a difference between what academics invent and what universities routinely teach. Objects indeed came from computing scientists in 1967, but the concepts were not routinely taught for about 30 years!
Steven A. Lowe
+1  A: 

I think good CS programs should teach the fundamentals that will serve as the basis for all your future programming education. Development methodologies like Agile and version control tools are like fads; they come and go. Also, they tend to be used in industry settings and not academic ones, so I think it's rare for universities to cover things that you'll probably learn on the job. I'm not saying it's right, but that's probably the academic mentality.

Bullines
Sorry, but I don't see Agile and version control as fads any more than the assembly line or the invention of calculus was a fad. In the real world we are designing things that are fundamentally changing programming, but the universities have been so far out of touch with reality in their little lecture halls that they aren't aware we've moved forward.
Austin
+1  A: 

I agree with what you're saying. I just recently started working in the software development world and I've already started to learn about agile development, something that I was never taught in university.

The fact of the matter may be that university profs don't keep up with newer development techniques as much as they should. They may also feel that there are other, more important things in their curriculum.

Dave
+1  A: 

I learned all of those in university. Perhaps it depends on the courses you choose? My courses were very diverse (Software Design, UI Design, eCommerce, AI, Functional Programming, etc.). Software Design had exposure to design patterns and unit testing (one large project which involved various things). In UI Design we were a three-person group working on a project; we couldn't do anything without version control, so we got that. And agile development was something our professors continuously told us about, but they left it up to each group to use it.

I find that many university students took "easy" courses or courses which would give them a high GPA. Others focus on what they want to learn and are largely exploring to find what field would interest them. And then there are those who know exactly what they are interested in... which is good, except they tend not to diversify their courses.

Swati
The thing is that those classes are higher-level classes, at least at my school. I feel that those should be among the first things taught, or at least that they should be taught at an intermediate level.
Jason Baker
+6  A: 

Why not, indeed? My experience getting my CS degree was pretty much the same. The reason is that people who teach programming don't program, as far as I can tell. It's not required to teach that stuff for accreditation, the teachers aren't familiar with it, and students never develop projects of any significance as part of their coursework. There's no motivation to actually teach programming, as opposed to teaching CS theory or Java syntax.

Allen
+1  A: 

University lecturers don't know how to write software; they just research it, teach it, and occasionally bash out some code that only has to work until the paper is published.

It's only because of folks like Titus that we're getting any academics who truly grok programming - read his comments on that topic here.

When I was a student I read books in the library about Extreme Programming, and we discussed it briefly in classes - the same classes that demanded that we conform to the "Waterfall Model" of software development, where "compilation" is a step of its own.

All the best with your career; I hope you finish your degree - it's nice to have letters after your name. :)

Jerub
+5  A: 

Computer scientists think they are mathematicians, not engineers, and so they prefer teaching the maths parts over the engineering parts. Testing, version control and documentation aren't passing fads any more than they are in any other engineering discipline.

Martin Beckett
So we should only hire software engineers and not computer scientists? ;-)
Andrew Swan
+26  A: 

Because our teachers:

  1. Never tried unit testing,
  2. Don't know how to use version control and
  3. Haven't even heard of "agile development".

Students should take matters into their own hands. We did that, and turned out just fine, didn't we?

mislav
"We did that, and turned out just fine, didn't we?" - SOME of us... some were lost along the way because the teachers didn't do all that could.
Andrei Rinea
Well, whatever the teachers do, people will still complain. The sharp ones are always hungry for knowledge and _will_ turn out just fine.
jeffjose
Our teachers weren't software developers, and we weren't going for a degree in software development; we - largely - went for computer science, which is a different beast, focused more on theory than practice.
Dean J
+31  A: 

Leonardo da Vinci wrote,

Those who are enamored of practice without science are like a pilot who goes into a ship without rudder or compass and never has any certainty where he is going. Practice should always be based upon a sound knowledge of theory.

The good schools teach both theory (data structures, algorithms, etc.) as well as practice (unit testing, version control, etc.). This requires an appropriate mixture of faculty so that both sides of this coin can be properly taught. A faculty composed entirely of theoretical types with no real experience won't do. Similarly, a faculty composed entirely of practitioners will not do. You need a mix, and the good schools have that.

Alan
I agree with the main thrust of what you say, but I'd argue that the problem of managing multiple versions concurrently is a key theory element to understand. By contrast, I'd agree that the usage of tools like CVS and SVN to solve this problem firmly belongs in the realm of "practice".
Andrew Swan
But covering version control in more than a couple of lectures during a general "Intro to Software Engineering"-type class probably isn't necessary. Cover what it does, basic use, maybe a little on branching/merging.
Adam Jaskiewicz
I had such a class called "Team Software Project". It didn't cover version control, but it did cover UML, software development methodologies, requirements gathering, unit testing, etc.
Adam Jaskiewicz
+9  A: 

I taught these things when I was an Adjunct at the Oregon Institute of Technology. They are taught, just sparsely.

Scott Hanselman
You're an aberration. And thanks for being such a good one.
Even Mien
What was the title of the class?
Dean J
+6  A: 

Well, the thing with universities is that they need to teach things that really are universal. Something like agile development is still pretty new, and despite how much it's talked about on the Internet it's not being used everywhere, so teaching it to a whole class of students would potentially only benefit a few people who landed in agile shops.

Version control, however, is something that it's inexcusable to skip these days. It's something that everyone needs to understand; it's a tool almost as useful as a compiler, and CVS has been around for 20+ years. The concepts at least need to be understood by any programmer leaving a university. Fortunately, if you do any group work in university you may be lucky enough to land with someone who already knows about version control and convinces your group to use it. I know I'm glad that person was in my group.

Skipping unit testing is also pretty much as inexcusable. The only thing I'd say there is that the jury is still out on test-driven development, and always going for 100% code coverage can sometimes be more trouble than it's worth. But unit testing is extremely valuable and should be covered in a software engineering course. I'd imagine that some of this stuff is making its way into some universities but just hasn't reached all of them yet.
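For anyone who got through school without ever seeing one, here is a minimal sketch of the kind of test being discussed (JUnit 4 syntax; the Account class is invented purely for illustration):

    // Hypothetical example: a tiny class and a unit test that pins down its behaviour.
    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class AccountTest {

        // The class under test, defined inline so the sketch is self-contained.
        static class Account {
            private int balance;
            void deposit(int amount) { balance += amount; }
            int getBalance() { return balance; }
        }

        @Test
        public void depositIncreasesBalance() {
            Account account = new Account();
            account.deposit(100);
            // If a later change breaks deposit(), this assertion fails immediately.
            assertEquals(100, account.getBalance());
        }
    }

The point isn't the particular framework; it's that expected behaviour is written down as executable code that can be re-run after every change.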

William
Version control is not necessary in a university course. They might as well teach "how to use Visual Studio". Best to leave that for when you get a job. As for testing - unit testing isn't necessarily the best, but they should teach at least a little of all forms of testing practice.
gbjbaanb
@gbj agreed, I had no idea what version control was until I got a job, and I saw the benefits immediately and learned it in like a day. There are much more important things to teach in school IMO.
temp2290
+1  A: 

The main reason is that many (most?) universities consider themselves to have a different goal than a trade school. As such, they want to teach students how to learn, and the fundamental principles of the discipline. Additionally, algorithms and data structures apply to any programming language and are not dependent on specific tools (which may or may not still be in use by graduation).

In Computer Science, that means algorithms, data structures, computer theory, compiler theory, etc. The stuff that you're listing is less about understanding how to program and how to solve problems; it's about the practice of programming (which, incidentally, is also the title of an amazing book for anyone in college with the intention of working as a programmer). Now, much of this will not be used at an entry-level code-monkey position, leading some people to think it isn't useful. I disagree. I think it can be extremely useful. However, it doesn't mean that after you get your CS degree, you know everything you'll ever need to work as a programmer.

Which also isn't to say that the things you mention aren't useful. They are. You'll have trouble working as a programmer if you don't learn them, and I do think they should be taught in college, at least to a certain extent. I would look at teaching version control, unit testing, etc., the same way I would look at an undergraduate program in art teaching what paint brushes are and which ones should be used for various cases.

Christopher Cashell
+2  A: 

To answer why these things aren't the first things being taught: undergraduate programs typically train you to become a Master's student. Only once you start picking your own courses (which typically happens in later years) can you choose to learn about things used outside of academia. This is why they focus on algorithms, data structures, presenting you with unsolved problems, etc.

I personally think it is fine that they are doing this. Programming isn't as easy as many of us make it seem; many people struggle with it. I would rather these people first understand how a for loop works before figuring out the monster that Perforce is.

Swati
+90  A: 

The simplest answer to your question is that the fields of computer science and software development are both very new, and not very well understood. Although all scientific and engineering disciplines are advancing more rapidly in modern times, other fields have a lot more experience to draw on and there is a much broader shared understanding of how they work.

For example, despite recent advancements in materials science, civil engineers have known for about 2000 years how to build an arch that won't fall over, and this is something that can be taught and learned in university with relatively little controversy. Although I completely agree with you about the techniques that software developers should learn, this agreement is based on personal experience and informal reasoning. In order to be a socially accepted "best practice", we need quantitative data which can be very expensive to gather: how much does version control help? How does it help? Unit testing? We can reason about the effectiveness of various techniques, but actually proving that effectiveness conclusively would be very expensive. We'd need to run a complete, realistic software project from beginning to end, numerous times, with groups of programmers that have equivalent expertise, using different techniques. At the very least we'd need lots of data about existing projects which those projects would be unwilling to release.

Civil engineers have thousands of years of bridges to look at, with lots of information. Software developers, on the other hand, have only a few decades of information, most of which is kept secret, since there's little motivation for organizations to collate and publish information about their developers' effectiveness, even if they are collecting it (which most aren't).

There's also some confusion of fields. Software development, or software "engineering", is really a different thing from computer science. Software developers need a working knowledge of computer science, but working at the boundaries of algorithmic complexity or reasoning about parallelism isn't something that a working programmer will do every day; similarly, a real "computer scientist" will write tons of throw-away code that just doesn't work or doesn't do anything interesting, and won't benefit as much from the sort of rigor that an actual software product would.

The emergence of the internet and the open source community may provide enough data to start answering these questions conclusively, but even if the answers were available tomorrow, it will probably take 100 years for them to permeate international society to the point where everyone agrees on what should be taught in schools.

Finally there are some economic considerations. It has been a relatively short time since almost everyone involved in software development had cheap, easy access to dedicated machines to run whatever development tools they want. A few decades ago, completely dedicating a machine to just running your tests, or even housing an infinite history of source code, would have seemed frivolously expensive to a lot of people.

Glyph
Extremely well put.
javamonkey79
+1 outstanding answer.
dreftymac
+1 maybe one of the best answers I've ever read on SO to date.
SnOrfus
+1  A: 

All three things you mention (unit testing, version control, agile development) are taught to some degree in the Computing Science programme of the University of Groningen. Whether or not that is a good thing I will leave as an open question; but it is not true that no universities teach you the "practical stuff".

Thomas
+21  A: 

Computer science has always been somewhat contradictory: the part that's about computers isn't a science, and the part that's a science isn't about computers.

Universities tend to lean more toward the 'science' end (algorithms, data structures, compilers, etc.) because those things are much more 'timeless' than current industry best practices, which tend to evolve and change from year to year. Version control, for instance, has undergone amazing changes in the last 5 or 10 years, but big-O is still big-O, and hashing, b-trees, and recursion are still as useful as they were 40 years ago. Their idea is generally to give you enough foundations that you can then pick up tools like git and understand what it means when you're told that the underlying data structure is a directed acyclic graph of SHA-1 hashes, and that the developers have worked hard to optimize the number of syscalls so that it's I/O-bound.

Now, think about where you learned all the things you had to know to understand that last sentence - if the answer is 'university', they're doing an okay job.
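To make that git description a little more concrete, here is a rough sketch (an illustration of the idea only, not git's actual implementation) of a commit object addressed by the SHA-1 hash of its own content, with parent links forming the directed acyclic graph:

    import java.nio.charset.StandardCharsets;
    import java.security.MessageDigest;
    import java.util.List;

    // Simplified, hypothetical model of a content-addressed commit node.
    public class Commit {
        final String message;
        final List<String> parentIds; // SHA-1 ids of parent commits (the DAG edges)
        final String id;              // SHA-1 of this commit's own content

        Commit(String message, List<String> parentIds) throws Exception {
            this.message = message;
            this.parentIds = parentIds;
            // Identical content always hashes to the identical id.
            MessageDigest sha1 = MessageDigest.getInstance("SHA-1");
            sha1.update((message + parentIds).getBytes(StandardCharsets.UTF_8));
            StringBuilder hex = new StringBuilder();
            for (byte b : sha1.digest()) hex.append(String.format("%02x", b));
            this.id = hex.toString();
        }
    }

A merge is then just a commit with two parent ids, and because the id is derived from the content (including the parent ids), history can't change without every downstream id changing, which is exactly the kind of property the hashing and graph fundamentals prepare you to appreciate.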

pjz
+1  A: 

These are based on my limited experience in a CS program before I switched majors, and my experience as an intern at a large software company. Unit testing isn't taught because most of the programs that you have to create aren't large enough to need automated testing; you're guaranteed a specific set of inputs, so you can test everything manually. Teaching you how to automate testing may also interfere with the grading of your project, since most projects are graded with scripts that run automated tests, with a quick glance at the code to make sure you don't have int foo1; int foo2; and that you use proper indentation.

I don't know why version control wouldn't be taught, but part of it is probably the size of projects. I never had any project that was large enough for version control, and by large I mean over 1000 lines of code that took an entire semester to write. I guess they figure you'll teach it to yourself if you need it. Any group projects I had were supposed to be pair programming projects, and why use version control if you're both at the same computer?

I don't know why agile development wouldn't be taught, but it probably goes back to the same thing with program size. While agile development is common with new software that runs on personal computers and small servers, it is not generally used on systems such as IBM mainframes or in problem domains such as banking or medical, where documentation is king. It also probably has to do with the fact that agile development wasn't around 20 years ago, when a lot of professors were trained.

Jared
"Why use version control if you're both at the same computer?" I use version control even when I'm the only one at the computer! Otherwise how would you manage branches and patch releases, or even view a previous version of a file (back before your last change broke it)?
Andrew Swan
Ditto what Andrew said. I use SCM tools extensively, even though all of my work is done on my laptop and most of it is solo. Backup, revision control, branching and merging, patching old code: they are all reasons to use it, not just for source code but for any produced content.
Matthew Schinckel
There's no reason why you couldn't be graded on whether or not your code passes unit/acceptance tests.
Will
+2  A: 

I think the issue is that universities don't feel that they need to teach you to be a professional, but instead focus on the academic side of programming. I would have thought there should at least be reference to the latest methods and techniques that are used in the industry, as these things are of academic interest as well.

In our course, we were taught the Personal Software Process, which covered things like recording time spent on projects, good commenting, etc., but there was no mention of professional fundamentals like version control.

Deeksy
+5  A: 

It depends on the university. I graduated in 2003 from an Australian university. In that time we learnt UML, Unit Testing, XP (and other Agile Methodologies), along with all the formal stuff like Z, Algorithms and Data Structures, Operating Systems, etc.

They didn't cover unit testing in great detail, though; it got more of a passing mention in one lecture. It would have been great to have learnt how to write effective unit tests, rather than just "what is a unit test".

As far as version control is concerned, we were using it (CVS) in our programming projects from 2nd year onwards.

I do agree strongly with what Glyph said, too. CS is such an immature field, really only around for the last 50 years, that we don't know what we should be learning and what is only a passing fad. Give it 150 years, then things might settle down more. The number of failed real-world projects makes it obvious that this is an immature industry. Imagine if 80% of building projects failed!

rob_g
+1  A: 

They don't teach such topics because most schools are academic, not trade schools. That is, they are designed to teach ideas and theories, not to train you for a career. The entire concept of QA has nothing to do with computer science beyond passing a mathematical proof. Besides, QA practices and development workflows differ wildly from one development house to the next, so teaching them in school is a waste of your time and money.

Nathan Strong
+5  A: 

The simplest answer is that you are studying computer science, and the things you listed aren't really related to the academic field of computer science. Software development might be something that you do with computer science, something that builds upon the blocks of what you've learned ... but computer science and software development are not the same thing.

Classes that taught you version control, or how to write effective unit tests... that would be teaching you a trade, namely, (good) software development.

matt b
+2  A: 

I learned all that stuff freshman year, with the exception of agile development.

It's all about choosing the right school, IMHO. If you go top 10, you will learn all that stuff quickly.

As far as CS education in general goes, we're basically asking professors to teach so much (languages of every flavor, data structures, run-time efficiencies, how things actually work at the bit level). I'd like to raise the question: why don't the kids take it upon themselves to learn more about software engineering?

Alex Gartrell
+1  A: 

I think it depends on what type of Computer Science program you are in: there are ones that aim toward the research and science side, and there are ones that gear toward the implementation side. I specifically decided against certain schools that only had professors who had stayed in the academic world. If your professors haven't been "using" what they teach, it's all in their head, literally.

Plug: having taken a BS in Comp Sci and an MS in Soft Eng at DePaul University, I was mostly taught by instructors/professors who taught part time, which was fine by me because I would rather have them come in with an anecdote from the previous day and relate it to class. Also, this being a mostly commuter/part-time school, most of the students have jobs using what they are learning.

The process of learning still starts with all the theory, but then we usually get asked "how many of you actually use this in your job?" and the typical answer is "we use it, but in a stripped-down or simpler way", and then we go into the practical, real-world scenarios.

During my schooling, unit testing was always present. Even though they start you out on Java, they made us use ANT and JUnit for all projects, which was a good introduction to build configuration and unit testing.

And Extreme Programming was included in about 3 or 4 of the classes I took. I remember they all started out with the 12 different aspects, from pair programming to unit testing (see above). And now it seems like the focus is on Agile.

So the quick answer is yes, there are schools out there that have a more pragmatic approach than others.

Glennular
+5  A: 

Everything is a passing fad. You will learn more in your first year out of college than in all of your years in college. Computer science has nothing to do with computers.

College provides you with a toolbox full of tools. This is a screwdriver, that is a crescent wrench. You MIGHT get to use each tool once in college. It is when you enter the real world that you really find out what you have. You sort out the useful ones from the rest: which ones you want to leave at home on the workbench, just in case, and which ones you keep in your pocket every day.

TQM, ISO, CMM, Agile, etc. These are all fads; they will come and they will go, and none of the successful ones is more than just common sense. All successful engineers and companies use some flavor of common sense; that is what made them successful, and few needed a name for it. The problem is that you cannot sell common sense; a manager cannot prove their value to the company by training for and buying common sense without a catchy name. Put a name on it that their superiors have read in some news article or magazine, and the manager keeps their job and you keep yours. Very few of the companies that claim to follow these practices actually do. Most write a check to a consultant and get their annual and/or lifetime certificate to some club so that they can put a graphic on their website or a label on the box their product comes in. Many will argue that this is rare... been there, seen it, it happens. This is all part of business; you have to cut corners sometimes to stay profitable and keep the doors open and the lights on. The hardcore followers of all of these practices have argued that the last one was a fad and this one isn't, that the last one really was too expensive to follow and this one isn't, that the last one was fake because you just hired a consultant but this one is real. Like programming languages, these too will evolve.

Your ability to understand the realities of business, the university system, and your role in it is the key. Like anything in life, choose your battles. It's not the university's or the business's or the government's or anyone else's job to teach you what you need or want to know. It is your job to look out for number one. Likewise, you can't expect anyone else to provide you the time to do this; you have to do it. You will fall off the horse; you are not a victim; get up and get back on, no excuses, life is not fair, deal with it. Do take advantage of handouts, don't pretend to be independent. And certainly pay your dues; don't suck a company dry of handouts without giving them something (your best at the time?) in return.

Why do people think CMM or Agile or any of the others is a fad? Why do they think they are not? Why did the professor teach you to program that way? To avoid gotos, or to avoid constants, or to avoid this and that? Is it because it produces more reliable code? Better-performing code? Reduces human error? Or is it because it is easier to grade papers/programs, giving them more time to do research? Is it because they don't know how to program and are just following someone else's book on the subject? Did they teach you that you cannot have maintainable, reliable, high-performance code - that you cannot even "choose any two", because maintainability interferes with both reliability and high performance? Sometimes you sacrifice reliability for performance. Sometimes you don't care about reliability or performance; you just want to get from version 117.34.2 of yet another accounting software program to version 118.0.0. Your business model is built on selling version upgrades and tech support, and as far as software developers go, any old robot will do that can write the same code in the same way. Replace the burnt-out one with the fresh-out-of-college one and keep selling upgrades.

There are no universal answers to these questions; you have to find out what your opinion is, live with it, and defend it. Change your mind, live with it, and defend it.

Question everything...will I really get burned if I touch the hot pot on the stove? Will the psychological effects of being afraid cause more damage than just getting burned? Is there a safe way to test the answer without getting hurt?

When I could afford it I would buy and eventually melt down transistors, caps, resistors, etc. in my dorm room, all of which have a distinctive bad odor. It is far cheaper and easier to just buy an amp for your stereo than to try to build one the day after your first transistor class. Linus being the exception, of course - it's easier to just buy an operating system than write one... You can get more done, although what you learn in that time is different from what Linus learned.

The world inside and outside the university will adopt these formulas (CMM, Agile, etc.) for solving problems, and when the next one comes out they will drop them just as fast. You don't have to use version control to be successful; there are just as many successes with as without (well, actually, because of the age of the industry there are many more successes without version control thus far). Likewise, you can be successful with minimal testing (look at the really big names in the computer industry as examples). You can be successful by testing your own code, as well as by following the rule that you should never test your own code. You can be successful using emacs and you can be successful using vi. You have to decide what mix works for you and, if you are lucky, find a place to work that agrees with you. With time what works for you will change, from tools to languages to programming style to fears, version control, documentation, etc. You will get married and have children and decide you might want to hide in the corner of that big company with the big health insurance package and the boring job and enjoy your kids, instead of being the hotshot programmer at the small startup.

When you get out of college and into the real world, listen to, work with, and argue with the "old timers". They have decades to centuries of combined experience: traps they have fallen into that you might avoid and/or test on your own (maybe you realize you don't have to touch the hot pot to find out it will burn you). Most will have seen at least one or two of these fads come and go, and in particular how badly they were burned and what they did to recover from it. They know many different ways to test things, and the names of the testing styles that have come and gone as well. What works, what doesn't. Where the risk is and how to avoid wasting time on a tangent. As you mature and become the old timer, pass it forward. Pay for what you learned by trying to teach those who follow you. Remember to teach them HOW to fish, don't just give them a fish. And sometimes you have to let them fail before they will succeed; just keep them from getting burned too badly.

What I really wanted to say here is that right now we are in a rare situation where we can witness the evolution of a parallel universe (and perhaps influence it). Yes, computer science is a young science compared to, say, physics. But at the same time it has evolved many times over. Depending on where you work and who you work with, you may be able to observe hardware engineers. Programming languages in the hardware world are certainly not new, but they have not evolved as quickly as in the software world. Software had a few decades' head start. Hardware has always thought of software engineers as second-class citizens: our job is easy, their job is hard. (Note that I am actually both a hardware and a software engineer.) What is interesting is that right now they are still dealing with what we would consider elementary or infantile problems. Why would I need to use version control, I am the only one working on this chip. Your experience with gcc or other cheap compilers or free IDEs can't possibly compare with the expensive tools I use; if the company thought you were worthy enough to use them or even know how to use them, they would buy you a copy. And a long list of other excuses. I had the pleasure of learning both VHDL and Verilog and becoming productive in both within a week, on what was almost a dare from such a hardware engineer (despite my diploma saying electrical engineer, my job title is software engineer). I wanted to learn these languages, and when the tools were available to me I stayed at the office into the night and taught myself. From that point on, that engineer in particular realized that what I was saying was true: languages are just syntax, programming fundamentals are the same, the tools all do the same thing. It's apples and apples, not apples and oranges.

In general, though, it is still difficult to send the message that one of these two parallel industries has a lot more experience in languages, programming habits, source control, testing, tools, programming environments, etc. than the other. The problem I am trying to solve is taking the hardware designs as they are being developed and creating affordable functional simulators that we can tie in with a simulation (virtual machine) of the processor, so that we can start testing the hardware and developing the test and deliverable software long before we go to silicon. No, there is nothing "new" about this, but we have no mechanism to get the latest code or track changes in the code to see where we need to focus our time. No mechanism for tracking the documentation defining the user (programming) interface to the hardware. The one golden copy is in someone's email inbox in binary form and only changes when, well, it doesn't; you have to read the Verilog to find out what is going on. Wait, that Verilog is how old? That bug I spent all week on, you figured out three weeks ago and fixed? So do we just fly to some vacation spot and party for six months waiting for the hardware folks to finish their task and throw it over the wall to us, or do we take this opportunity to try to be patient and optimistic and teach them that there are common-sense methods that are not that intrusive, which allow them to do their job, back up their work, and share their stuff for peer review...

Remember that hardware engineers did leave college with a box of shiny new tools, just like you did. You learned 17 different programming languages, of which you may only use one; the rest of the languages you use in your career will be invented after you leave college. When they left college they could tell you what they knew about calculus and the theory of relativity, how many electrons are in each of the elements, and how to compute the charge around a Gaussian surface. But the bulk of their career is one, zero, AND, OR and NOT (hey, we have those in common; that's all you really need to know about computers - one, zero, AND, OR and NOT - whether you're a hardware or a software engineer). Granted, the fundamental laws of physics, calculus, and electrons are not going to change as fast as programming languages do. But the fundamentals of programming are the same across all languages and will continue to be into the future. Did you leave college knowing that, or did you leave thinking Java is different and better than C++ because of this and that and the other?

Like any other business, a university's job is to stay profitable. They have to hire the right academics to bring in both the right students and the right research dollars and the right kinds of research to make the university profitable. They have to offer the right classes to bring in the right students and produce the right graduates so that, as the decades pass, employers both near the university and hopefully far away will recognize that this university produces productive and profitable employees. (Yes, and sometimes you have to attract the right athletes in the right sport to get the right amount of TV time and the right amount of name recognition and sports revenue.) Some universities will teach C++ and Java, some never will. Some will invent CMM, some will teach Agile, and some will do neither. If the university has any value at all, there is something there for you to learn. They will not teach you everything there is to learn, but they will have something useful. Learn that something while you are there and collect a reasonable number of various forms of tools in your toolbox. Leave the university and get a job. If your toolbox sucks, maybe find another university and never mention the first. If it is an OK toolbox, use those tools and build some new ones on your own time. If it's a pretty good toolbox, say good things about that university and the good academics you learned this and that from, and pay the school back for what they gave you. Even though you didn't get every possible tool in the universal catalogue of university tools, you will walk away with a certain subset. Even if you don't graduate...

+1  A: 

Unit testing and version control were both taught in 2nd-year Computer Science courses where I went to university. Unit testing fell under the part of testing that also included the differences between white and black box testing, and a good chunk of the marks in 3rd-year programming assignments went for good error handling, which can easily come from unit testing.

Agile development may be rather hard to teach in an academic setting, I'd think. While I did learn about the Waterfall method in theory, I didn't get to see it in the field until after I graduated and moved into the real world, which can be quite different from academia; e.g. in 3rd year I did all the odd error cases and nearly passed an assignment where I never touched the heart of what the assignment tried to teach me about semaphores.

Also, how long has agile been around and which form of agile did you mean? There are many different implementations of it from what I've seen.

JB King
+1  A: 

I don't think agile programming is a fad, but at the same time I'd be hard-pressed to think of a way a teacher could give you projects that allow you to learn it. Unless they gave you project A (build something) and project B (expand on it). The problem is time and scope. In a 4-month course it would be hard.

Version control and unit testing methods are ever-changing and dependent on the language, or on the person who defines them.

Data structures and algorithms are something that can be worked on in a class setting. Honestly, too, they take quite a bit more effort to understand than unit testing and versioning. Try to remember that part of university is to teach you to teach yourself. College does not quite have the same mandate, or at least not to the same extent. IMHO.

baash05
Hmm, I thought college and university meant the same thing... not a native speaker though.
Depending on where you are (country-wise): in the US they are the same, in Canada they are different. I think in the States what I call college is actually called junior college. In Australia it's called TAFE (forgive spelling)... Not being a native speaker makes things like this very "fun".
baash05
+4  A: 

All of that can easily be covered (shallowly) in a single class on software development practices. It's not part of most CS curriculums, because that isn't what CS is about, though I do think some coverage of that stuff is useful. My school had such a class; it didn't cover version control, but it did cover UML, requirements gathering, development methodologies (various agile and waterfall), unit testing, integration testing, etc., and required us to work in teams of 4-5 to develop a project (a rather simple Clue rip-off in Java). If you felt the need for further Software Engineering classes, they were available as electives.

Despite never having version control mentioned once in any class that I took, most of my friends were using it for personal projects, class assignments, and so forth, so it isn't as if we weren't exposed to it. The people who didn't pick it up on their own got forced to use it by a classmate in the course of a team assignment.

University is meant to teach concepts and theories, because those are the things that are hard to pick up on your own. Version control is a tool, and pretty easy to pick up. Use it a bit, read some tutorials on the web, and you're all set. If you need lectures and homework assignments to figure out how to check something out of SVN, you are going to have a lot of trouble with the things that actually ARE difficult.

Remember that there are plenty of ways to learn stuff in college outside of classes; take advantage of that. You're paying a lot to attend the classes and use the facilities, so milk it for all it's worth and go to LUG and ACM meetings, participate in project teams (there's always some MEs building a robot who need a programmer), or get a job administering the Humanities department's server. Trashpick a computer from the loading dock of the Materials Engineering building, download a Linux iso with your fast dorm internet connection, and play around.

Adam Jaskiewicz
Very well said! Milk all you can.
Andrei Rinea
+1  A: 

Most university software projects have to fit within the confines of a single class, which means effectively a 5-6 week project involving between 1 and 4 reasonably inexperienced programmers. Unit testing and source control only become convincingly effective once you scale beyond that into longer-term projects involving more people. As a result, it's difficult to build such techniques into a class project in a way that doesn't just seem like a pointless requirement.

Shalmanese
+1  A: 

You've named three, some of which I don't think are nearly as important to an understanding of computer systems (e.g. version control). These things are part of a job, and you can become a good programmer/computer scientist without needing to know them.

Similarly for unit testing - why pick out unit testing? Surely usability testing, system testing, user acceptance testing and factory acceptance testing are more important? Well, they are, unless you consider your job complete once the code is shipped to the maintenance department :)

Think of the other concepts that I use daily, that would be of little use to a student coming to terms with the fundamentals of software and computer systems:

  • good commenting practices
  • standards compliance (not just international ones, but team coding standards)
  • documentation
  • change control (not necessarily the same as version control, which is about storing differences; this is more about what you changed and why)
  • usability development

The above are all "soft skills" which you don't need to write good code.

However, if you're missing the "hard" skills, like data structures and algorithms, then writing good code is next to impossible.

gbjbaanb
+1  A: 

Just like students, each college is different. Some colleges, or more accurately some professors, are resistant to change or are lazy. Fortunately most are not. Theories, concepts, history, etc. are important and vital to any CS curriculum, but so is preparing the student for their working environment. Not surprisingly, the community colleges in my area offer very current and applicable CS courses. Not so much the large, established and prestigious universities.

Matthew Sposato
+1  A: 

It's simply because data structures and algorithms constitute the core of computing and are thereby much more important. Unit testing, version control and agile methodology are but tools of the trade (and, if necessary, one is expected to pick them up on the job).

CaptainHastings