views:

409

answers:

11

You are the newly appointed Dean of Computer Science at a school that has no Computer Science program. With all your years of knowledge, you know a thing or two about what was worth learning, what was a waste of your time, and what you wish you knew sooner. For instance, Assembly language early on was ideal for me... but the lack of formal Design Pattern instruction was detrimental.

What is your ideal curriculum?

For instance, consider a 4 year program, 2 semesters per year, 3-4 courses per semester.

A: 

Find a school that you think has a good BS CS degree and look up the books that the profs use.

Arthur Thomas
+2  A: 

If I had a nickel for every time I pasted this :D

http://stackoverflow.com/questions/3088/best-ways-to-teach-a-beginner-to-program

http://stackoverflow.com/questions/1711/what-is-the-single-most-influential-book-every-programmer-should-read

http://stackoverflow.com/questions/4769/easiest-language-to-start-with

http://stackoverflow.com/questions/22873/language-bookstutorials-for-popular-languages

http://stackoverflow.com/questions/86103/books-and-tutorials-for-beginners-in-java-and-php

PS: It's always a good idea to search a bit before asking a Q :)

And from my experience, depending on what school you go to, college is NOT a waste of money.

Mostlyharmless
-1 IMHO a curriculum is not simply a pile of books. In other words, this question is different from the ones you list.
Péter Török
A: 

That's easy: GEB to get some feeling. Then SICP for the craft. And then just stick to the bibliography.

Roman Glass
+2  A: 

Sorry to say, but if I had a chance to hire two developers of equal "competency", one with a BA, and the other with home-schooling knowledge, I would pick the BA. I would actually never tell a high-school graduate to give it a go at self-taught software development, and I would strongly recommend a degree, even if it isn't in software development. There is so much to be gained, not just in what you learn, but in learning how to learn. There is a great article on Mental Floss that is the commencement speech of David Foster Wallace, and it sums it up nicely:

http://www.mentalfloss.com/blogs/archives/18395

I know it doesn't answer your question directly, but it is what it is.

hal10001
+1  A: 

@hal10001 Agreed. What you learn in college is not just programming, but GOOD programming PRACTICES (though admittedly, most do not do a very good job of this, but it is still better than nothing). What you also learn is how to solve problems and how to look at things. That's exactly why, even though most books are the same, some profs are simply better than others. Also, having finished a degree with decent grades says something about your abilities / potential.

Mostlyharmless
A: 

M.I.T. OpenCourseWare. I would pick and choose from the courses I found interesting there (in fact, I do).

Paul Reiners
A: 

What university gave me (M.Sc.) is knowledge about "what's out there". As a self-taught computer specialist, I would always be in a more narrow state of mind. Doing my own work, but less capable of knowing the big picture of others. Computer algorithms come to mind, and Case Based Reasoning. Also courses on usability. Plain programming can be self-taught but I doubt the rest of it.

akauppi
+5  A: 

Not sure I understand the point of the question, because the premise is flawed. This may sound nitpicky, but as someone who has taught undergraduate and graduate students, I can tell you that this scenario is highly unlikely (barring the death of a billionaire who bequeaths a large sum of money to a school without a CS dept and purposed toward the creation of one). Some scattered thoughts:

New departments don't start this way - usually, some other department will pick up on some need for teaching programming or "information science" or MIS (or whatever it's termed) that is somehow germane to the core curriculum of that department. Some number of students will be attracted to this stuff, and will put up with a core curriculum that they're not interested in so they can study the IT stuff. Eventually, the university realizes that there are enough such students that there is significant financial gain to be had from spawning this group of "studies" into its own department.

Lots of CS departments were formed out of Industrial Engineering departments in just this way. However, by the time they get to that point, there's an entire history involved in the evolution of a curriculum that was determined by some combination of enrollment boosters (i.e., the business case) and academic decisions made by a faculty committee interested in the science itself. At first this is ad-hoc and done on a class-by-class, semester-by-semester basis, but eventually the faculty figures out what material is good and how it fits together and so on, and the funding types figure out what material (and what teachers) will maintain or increase enrollment. This is a little bit idealized, because some decisions will be made for other reasons (e.g., say an instructor writes a textbook and the university gets a revenue-sharing arrangement in place, or perhaps some celebrity instructor --- à la Patterson, Knuth, Dijkstra --- really wants some particular text or other material). However, the ideal shouldn't be terribly far off from the big picture in reality.

Anyway, the point is that you hardly ever see the case that a curriculum is constructed from scratch, and if it were, I think the chances of it being very good are low. Further, a Dean like the one you presume in your question is non-existent. No one person knows all the best ways to learn. His experiences would be unique to him, and perhaps representative or perhaps not, but there'd be no way to know.

This is why good CS departments are good - they've already evolved their curriculum and faculty and staff to provide what turns out to be a very valuable education, and their reputations are established (by and large) when their graduates perform well in industry (I include academic work in this). That's why the best advice here is simply, "copy the curriculum of a good CS department", and make substitutions for particular needs.

Ben Collins
+4  A: 

year 1: cover all major programming paradigms in historical perspective, from machine code to assembly to lisp/fortran to algol to pascal to C to C++ to java/c# and beyond. Throw in some prolog and OPS5 for fun. Include history on the founders and movers of computer science from Babbage to Turing to Minsky and Dijkstra and Knuth to Fahlman to Goldberg to Hillis to whoever is 'in' that year - not for hero worship, but for a perspective on the field [the field is huge and it is easy to graduate thinking that scientific programming or business programming is all there is]

year 2: start with small-scale projects and emphasize modern methods (TDD, Agile/XP/SCRUM); get used to working in small teams. Have a class once a week where you have to debug a program in a language you don't know.
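To give students a first taste of the TDD mentioned above, a lab exercise can follow the test-first rhythm: write a failing test, then write just enough code to pass it. Here is a minimal sketch using Python's built-in unittest; the word_count exercise itself is a hypothetical example, not part of the answer:

```python
import unittest

def word_count(text):
    # The "green" step: the minimal implementation written only
    # after the tests below specified the desired behavior.
    return len(text.split())

# The "red" step: these tests are written first and fail until
# word_count is implemented.
class TestWordCount(unittest.TestCase):
    def test_empty_string_has_zero_words(self):
        self.assertEqual(word_count(""), 0)

    def test_counts_whitespace_separated_words(self):
        self.assertEqual(word_count("to be or not"), 4)

if __name__ == "__main__":
    unittest.main()
```

The point of the exercise is the cycle, not the function: students see the test fail, make it pass, then refactor with a safety net.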

year 3: work on a large-scale project with multiple teams all year long; have deadlines that if you miss you don't go home for holidays so you take it seriously. Include technical writing classes and other communications courses. Write a paper on a cutting-edge technology that interests you, even if it is just a review-of-the-field paper.

year 4: specialize in one area of computer science (evolutionary methods, numerical analysis, usability, etc) with a significant research paper and project, and also minor in something outside of computer science (business, arts, language, etc.) so you don't end up a one-dimensional robot ubergeek

Steven A. Lowe
+1, though I disagree with learning the history early. Most of my freshman/soph classes put emphasis on history/people and I just didn't care until junior/senior year when it all fit together better.
Andrew Coleson
However, the point should be made that this is an excellent Developer curriculum, which is quite different from a CS curriculum.
Andrew Coleson
@[Bremen]: I had the history freshman year, and it was stupid and pointless. I ended up skipping the class all quarter, then reading the book the night before the exam, and passing with a B! The history was useless because it was devoid of context. People plus paradigms would make more sense.
Steven A. Lowe
@[Bremen]: i left out the usual CS suspects and concentrated on what should be different.
Steven A. Lowe
A: 

Thread Title Grammar 101 :)

strongopinions
+1  A: 

I've been programming for decades, and spent 4 years teaching undergrad C.S. There is a definite disconnect between academia and industry.

Professors tend to feel that their job is to teach their favorite subject matter, and that it is the job of industry to teach programmers more practical stuff.

Students tend to feel that once they leave school their job is to be apostles of the latest dogma their professors handed down, and it takes years to break through this.

More specifically, I would suggest:

1) Have two tracks, one feeding C.S. grad school, and one feeding industry.

2) For each track, especially the industrial one, have a quality-control feedback loop.

For example, it does not serve industry to have applicants who can go on about NP-completeness or the glories of functional programming but do not know how to analyze a design problem and propose a range of options.

Mike Dunlavey