views: 812
answers: 11

I just finished my second year of Computer Science at a university with a very good reputation. I have been looking at internships and jobs for graduates in general, and it seems that going to college for CS is useless. They haven't taught me anything that employers are looking for (like Java EE or C#); all I have been taught is theory, which I'm sure most people forget by the end of the year.

My entire first year was in a programming language that has not been used in the real world for decades (hell, it was a subset of a language that has not been used for decades). Second year was also ridiculous theory and math that have nothing to do with the real world.

Even looking ahead at the courses for the next two years, the only useful one I can see is a course in databases. I know CS students should learn things on their own, and I have taught myself a couple of things, but when you are taking 5.5 courses a semester you don't have time for much else. What is even more frustrating is that only a small fraction of the courses are about CS (last semester I had to take 2 CS courses, 2 math courses, and 1.5 non-math/CS courses).

I am extremely pissed off and feel like going to college has been a waste of time and money. My question is: is there any point to a college CS education? Those of you who graduated with a CS degree, do you feel all the effort and money was for nothing?

+5  A: 

Yes, there's a point. Just don't expect to learn new technologies. Expect to learn how to learn. That is more important in the long run.

Igor Krivokon
When law students go to law school, they learn law-related material; nobody "teaches them how to learn" (a term I see thrown around a lot, though no one seems to know what it means). The school expects them to become lawyers, so it at least teaches a set of tools that lets them get a law-related job without any extra learning. The same is true for engineers, doctors, and accountants. To teach CS students few or no real-world skills is utterly ridiculous and nonsensical.
Law school is a trade school.
sharth
I know what "teach you how to learn" means... I'm sure many people do. Once you graduate and get some experience you will know what it means too.
cbp
I disagree with this "learn how to learn" business. You've spent 18 years "learning how to learn" by the time you get to college. An undergrad computer science curriculum teaches you a lot of concrete things. Just because they don't correspond 1:1 with the things that Joe Random Software Company is looking for on resumes doesn't mean that you have to bend over backwards to justify learning them; they are valuable in and of themselves. Do people study mathematics and complain that they won't need to know linear algebra to be an accountant?
mquander
A: 

I understand your frustration, but a degree with good grades from a reputable school is going to help you find a better job once you get it, so hang in there. Don't neglect studying, on your own, the various technologies and languages that help in the real world, of course...

Edit: BTW, I can't speak to this directly from personal experience, as my own degree is in EE. Thirty years ago it helped me (it was actually indispensable) in getting a good job designing chips, but I'm pretty sure the situation is similar in CS and other technical fields. (My daughter has a master's degree in Telecom Engineering -- advanced radio systems, essentially, such as ultra-wide-band carrier-less ones -- and she's now going for a PhD for a similar reason: these days, in that field, it seems you need a PhD to get the really good starting jobs.)

Alex Martelli
+3  A: 

Oh, going to university for CS isn't useless. It's not strictly necessary, but while you're sitting there thinking about the languages employers want that you haven't been taught, you're not noticing all the jobs out there that require a university degree in the first place. So there's that, at minimum.

That said, university is there to teach you how to think about computer programming at a higher level than just using a given language. When I graduated from CMU, I'd learned a fair bit about Pascal, which I've never used again, and C, which I've used indirectly through all the languages descended from it, but not directly. My first job was in a language I'd never used before, but I found it easy to learn because of my years in college.

And yeah, take some time outside of class to pick a technology that interests you and dig into it if you can. Not because job X wants that technology, but because it's interesting; you'll find it a lot easier to find the time and energy to pack another language into your head if it's not just another slog.

John Fiala
+1  A: 

Programming languages come and go, and you can't really expect a university to teach them all. That will be up to you to learn. However, I am pretty sure that any university offering a course in CS covers some of its fundamental aspects: things like object-oriented concepts, operating system concepts, compiler design, and so on. These things do not depend on any particular language. Since they are essential for building a good career, I don't think that doing a CS course at a university is useless.

Naveen
+18  A: 

It seems like you are confusing computer science with software engineering. Employers look for software engineers more than computer scientists, the same way construction companies look for civil engineers more than physicists.

I see nothing wrong with your CS curriculum, and only wish more were like it. I would suggest you switch to software engineering.

Also, universities shouldn't be trade schools. You are not at university to learn a trade; you are there to gain the knowledge necessary to do things that have not yet been done.

freespace
@freespace: That's not true with complete generality. In real life, many schools do not have a Software Engineering curriculum, but do have a good CS curriculum. Even if the school thinks of itself as turning out (unemployed) CS grads, what they're really doing is making Software Engineers who will have to learn Software Engineering on the job.
John Saunders
+13  A: 

I've got a post-graduate degree in Computer Science (from Canterbury University, New Zealand) and I find that it's highly relevant to what I do every day.

It's not that the specific skills I learnt during my degree are directly applicable, but rather that the degree taught me the fundamentals that underlie everything else, so I understand that there is no magic and that every layer of the stack can be understood.

A computer science degree isn't going to directly teach you the skills needed for the next five years. It's going to teach you how to teach yourself the skills you need over the next thirty.

If you want to learn to program in C# or Java or C++ or XYZ, go do a vocational course somewhere else.

If you want to learn what underlies all of those languages - as well as those that will be coming down the pipe in the next ten or fifteen years, go finish your Computer Science degree.

As someone else pointed out - don't confuse software engineering with computer science. The two are related, but different.

Bevan
+5  A: 

is there any point of a college CS education? Those who graduated with a cs degree do they feel all the effort and money was for nothing?

It was worth every cent.

Why? Because at university/college they do not teach you programming languages; they teach you to think. Just as math is not about using a calculator.

There are many topics every good programmer should know:

  • Math, including linear algebra (yes, I mean it)
  • Efficient data structures
  • Hardware structure
  • Various efficient algorithms
  • Computability (you should know what an NP-complete problem is, and what the halting problem is -- see the sketch below)

And these are just the basics; there is much more.
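
To make the computability point concrete (my illustration, not Artyom's): the classic halting-problem argument fits in a few lines of Python. The halts() oracle below is hypothetical -- the whole point of the sketch is that no correct implementation of it can exist.

    def halts(program, argument):
        """Hypothetical oracle: True iff program(argument) eventually halts.
        No correct implementation can exist; this stub is for illustration."""
        raise NotImplementedError

    def paradox(program):
        if halts(program, program):  # oracle claims program(program) halts...
            while True:              # ...so loop forever,
                pass
        else:                        # oracle claims it loops forever...
            return                   # ...so halt immediately.

    # Ask: does paradox(paradox) halt? If halts(paradox, paradox) returns
    # True, paradox loops forever; if it returns False, paradox halts.
    # Either answer contradicts the oracle, so no such halts() can be written.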

Yes, it is worth it.

Your employer does not look for knowledge of C++/C#/Java/Python. He looks for knowledge of programming using these languages -- each of which, by the way, you can learn in several months to half a year and master in a year or two (depending on your previous skills).

Artyom
As one of my mentors said, computer science is as much about computers as astronomy is about telescopes.
Barry Brown
A: 

I finished the second year of computer science last year and am now taking a year out to work in Spain. I have made several similar observations about my course, especially since I really suck at the maths and theoretical stuff.

So far, though, I have learned how to learn programming languages and frameworks (Java at university), how to think about databases, never to take an interest in systems programming (yes, it was taught that badly), and some basic principles of engineering that I have since had time to evolve into something useful. More importantly, I learned a great deal about critical thinking and how to solve problems.

I would say my education has given me a few opportunities to open doors and get internships etc. I am in Spain because despite low pay, it was a more exciting opportunity than my other offer - working on the contacts manager for Nokia phones. In this job, I have assembled a fair collection of small C# programs and taken the time to learn about concurrency, UI design, and other interesting things not covered well at university.

Is it worth £25,000? I can't honestly say yes or no to that, but I can say I would not trade the experience for anything. Most of the fun at uni comes from outside of class and occasionally meaningless coursework.

Also, consider this: Most employers require a degree and want some of the skills you mention. Obviously, they don't know what they are asking for, but you have a lot more credibility with the degree than you do with a couple of years experience in the field. Pathetic, I know, but often true nonetheless.

University is not required to do well, but it is a great opportunity to grow as a person, even if it is grossly overpriced (depending on the school). There are plenty of cases of dropouts becoming renowned in their fields - just look at Steve Jobs.

IanGilham
+10  A: 

Let's play a thought experiment and flip this around.

Suppose your university taught you the very latest technologies, methodologies, and tools -- the exact set of things that employers are looking for today. Where would this lead?

First, you'd be making a huge leap of faith that the things you learned in freshman year would still be marketable by the time you graduated three or four years later. Software technologies move very quickly. If you were taught, say, jQuery today, would that specific skill be useful in four years? To put this in perspective: was it useful four years ago?

How could your university possibly predict what will be the hottest technology at the time you graduate?

Second, let's say by some miracle both you and your university have timed things perfectly and you graduate with a set of skills and technologies that allow you to get a terrific first job right out of the gate. Because of your up-to-date skills, you can hit the ground running. A few years later, you start looking for another job. Will the skills you used in your first job be the same ones your next employer is seeking?

Maybe, but most likely not. And it's not just because technology marches relentlessly forward. Employers have different needs, different methodologies. Some use the latest tools and languages, others use ones that were popular ten or more years ago, and still others use ones that you've probably never heard of. Now you might be really steamed at your university.

Would you be willing to go back to school to learn the old stuff that you and your university disregarded and the new things that have come out since you graduated?

Or, as I suspect you might be thinking now, would you have rather learned that:

  • Despite their significant syntactic ("surface") differences, most programming languages in professional use are pretty much the same. The words and symbols are different, but the concepts are similar. If you learn two or three of them in school, you'll see the similarities among them and can apply that knowledge to the new ones that will become popular in the future.

  • Basic algorithms and data structures haven't changed in twenty years, so it doesn't matter what specific language you use to study them. If you study them using the universal language of mathematics, you'll be well prepared to implement them in any other language of your choosing (a sketch follows this list).
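
As a concrete illustration of both points (mine, not Barry's): binary search predates every language in today's job listings, and once you know the invariant it maintains, writing it in whatever language an employer happens to use is nearly mechanical. A minimal Python version:

    # Binary search: the idea is decades old and language-independent.
    # Invariant: if target is present, it lies within items[lo:hi].
    def binary_search(items, target):
        lo, hi = 0, len(items)
        while lo < hi:
            mid = (lo + hi) // 2
            if items[mid] < target:
                lo = mid + 1
            elif items[mid] > target:
                hi = mid
            else:
                return mid
        return -1  # not found

    assert binary_search([1, 3, 5, 8, 13], 8) == 3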

New technologies are never created out of thin air. They are always based on what came before. To be able to create something new, you must have a foundation in prior art. For example, to really understand Haskell (one of the hot languages du jour), you need to know about ML. But to understand ML, you probably need to study Lisp and Scheme. (Is Scheme the language your university used?) To really understand how Lisp/Scheme works, you'll need some assembly experience.

My point is this: if all you know is "new" things, you might not be able to learn yet newer things. But if you know about what came before, you can see how we got to where we are today and you'll be in a much better position to learn about where we're going tomorrow.

Barry Brown
I agree with you on some points, but your whole "to really understand" notion is baseless. You don't need ML to know Haskell, and you don't need assembly experience to know Scheme. A lot of my friends whose undergrad major was mathematics are a hell of a lot better at languages than me (I am an EE and worked with assembly for 4 years). They are not at any disadvantage.
kunjaan
I mean to *really* understand it -- to know how its lazy evaluation and its strong type-inference engine work -- so that you can create the Next Great Language that's based on Haskell.
Barry Brown
Very nice answer Barry. Really liked how you took the question and flipped it around.
Simucal
This course, http://www.ccs.neu.edu/course/csg711/, is one of the toughest in language design here at NEU, and strangely, assembly is not a prereq, nor is ML or Haskell or any language other than good old Scheme. I agree with you that foundations are very, very important, and the question is quite asinine, but the prereqs you mentioned are quite unbelievable.
kunjaan
A: 

I'm not entirely sure that universities can cover the breadth and scope of all the tools and languages you would need for job X, insofar as the tools you learn for that path may be drastically different from those for jobs Y, Z, etc.

From my experience, it would have been nice to have had some generic knowledge of common tools (version control, dependency management, etc.) rather than the latest /. article, but what are you going to do?

javamonkey79
+1  A: 

I realized in my Sophomore year at GaTech that I'd probably never directly use 90% of what I was required to learn for a degree. Despite this, the degree is worthwhile for getting past HR. Probably 80% of job listings out there will want a degree or equivalent experience (which is invariably longer). It won't make you a better programmer and, in fact, some of the best programmers I've met don't have degrees, but you still need to get through HR before you can wow the team you're applying to.

As such, I resolved to spend as little time on school as possible and learn as much real-world programming as I could on my own. In the end, I got down to full class attendance plus an average of roughly 10 hours per week outside of lecture doing schoolwork. Staying in school has the advantage that your time is accounted for on your resume, and if you don't care about your grades and aren't in a hurry to graduate, you have tons of free time to study whatever you like instead of studying for your courses.

I personally got involved in the Mozilla project and learned orders of magnitude more about software engineering than I did in my classes. It also got me interested in HCI/user experience and gave me time to work through that. Things worked out for me: I got a job as a front-end developer on a senior team at Akamai straight out of school, due to my (at the time) unusual proficiency with JavaScript.

Choose your academic load as a tool to motivate you to study topics outside your immediate path of interest. The useful academic topics for me were: Languages and Translation (C and compilers), Statistics, Graph Theory, Linear Algebra, Operating Systems, Systems and Controls, and Information Theory. Bits and pieces of these courses have proved themselves useful in a variety of contexts.

Karl Guertin