I have thought about this issue a lot. I got my bachelors in CS in May 2002 and am now thinking about a masters in CS (in fact I have applied somewhere for next semester and am waiting for a response...). I am bored of being pigeonholed into mostly SQL Server work, because companies and recruiters only want to hire you based on what you did at your last job.
So the question becomes: why do you want to go to grad school (for a masters)? I have poked around and found a variety of reasons:
1. career advancement [better job, more money, higher on the corporate ladder]
2. career change (construction worker to computer science) - since you are already a developer this doesn't apply, but it is often considered the single best reason for a masters
3. learn more about computer science (and/or test the waters for a PhD)
After reading this article I would add 4. upgrade your outdated skills.
There are other reasons, like wanting to avoid the real world after graduating college, that I won't cover here. But I think the most common reasons I have seen for seeking a masters are 1-3.
Career goals:
When I looked for jobs (2002-2003, 2005, 2006, and a couple of times in between), most listings asked for a masters plus 1 year of experience, or a bachelors and 5 years of experience. Once you have a bachelors and 5 years of experience you are set; the problem is that the experience they want is usually in the exact job offered. A general rule is that a masters counts as 5 years of progressive experience, so with a masters you can probably talk your way into some jobs that say "5 years experience" even when they don't add "or masters and 1 year." Going straight from college into a masters with no work experience will still make for a hard job search, because people are often leery of hiring someone with no experience, whether they hold a bachelors or a masters, and now, because of your masters, they have to pay you more. So a masters and no experience is not always an advantage over a bachelors and no experience. But it may let you land a more interesting first job if you are in the right area.
EXCEPTION: the MBA. If you want more money or to climb higher up the corporate ladder, an MBA can help. If you have an MBA and can speak the language of technology (presumably you can, since you are a developer), companies will probably be all over you.
To learn the deeper parts of computer science that you did not explore as an undergraduate:
You can learn it all from books, but they are long and many are quite dry. I have a giant stack of books that, if I read and understood them completely, would make me a computer science guru (the dragon book, parallel algorithms, artificial intelligence, information theory, etc.), but I am just not self-motivated enough to do it. The pressure of a class would help me here. I think this is a valid reason, and it is one of mine. As for testing the waters for a PhD, that one is pretty obvious, and it may in part be a reason I am doing grad school (though I'm not sure the super-specialization required for PhD research is for me). Maybe a career in research would be more fun and interesting than one in corporate America.

The honest truth is that you don't use most of your computer science undergraduate degree in most jobs (in fact, high school algebra and basic programming ability are often more than enough), let alone a graduate degree. It doesn't take much to slap together yet another database application talking to a web server. There are exceptions (game development, artificial-intelligence-type applications, etc.), but of the total pool of jobs, most I see are database applications with web interfaces (in Java/PHP/C#/C/C++/etc.).
To upgrade your skills to the latest and greatest:
I would say don't bother. Some masters programs (usually at smaller colleges) have classes in web app development and Java, taught from a "learn the libraries and some frameworks" perspective rather than a "learn the general programming concepts and only as much of the library as needed" perspective, just as some undergrad programs do. But the serious programs are more theoretical. That class on Unix is not about how to use Unix and the shell; it is a class on the design of the Unix operating system. That class on networking won't help you set up your home network or a corporate network, but it will teach you the limit theorems about data flow, along with the various network layers and protocols and how they work.
In general, most of the computer science curriculum is about the same as it was before. There are a few new theoretical developments, but you have to dive into most of them on your own, and the new material is often a new twist on older ideas. The core AI, compilers, operating systems, and algorithms curricula seem to be about the same as when I looked in 2002, and most of the material in those undergrad courses was invented well before 2002, or even 1990. Some long-standing books have new editions, but by and large the content is the same as in the previous editions. In smaller sub-fields there are more developments, but from what I can tell the core curriculum of most serious masters programs sticks to the general concepts.

Most of a compiler class is on parsing, code generation, lexing, grammars, etc., and none of that is new; JITs (just-in-time compilers) will be touched on but are not a major part of the course. The algorithms courses also seem mostly the same (P vs. NP, divide and conquer, etc.), and the mathematical foundations of most of the CS material are often even older than the CS itself. So the reality is that the core theory is mostly the same, and the core pieces of the computer are mostly the same, just faster and now with more cores. Even software engineering (at least as taught at my two colleges) seems similar (although agile methods are catching on now, and many of the books on them are quite readable). Keeping up cohesion within a module, reducing coupling among modules, and so on: my software engineering book from 1985 had those core principles, and I guarantee they weren't invented in 1985.
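To make the point concrete, here is divide and conquer exactly as an algorithms course would have taught it decades ago, as it does today. This is a minimal sketch in Python (my choice of language for the example, not anything specific to any program's curriculum):

```python
# Classic divide and conquer: merge sort, essentially unchanged
# since long before 2002.
def merge_sort(items):
    # Base case: a list of 0 or 1 elements is already sorted.
    if len(items) <= 1:
        return items
    mid = len(items) // 2
    # Divide: recursively sort each half.
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Conquer: merge the two sorted halves in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # -> [1, 2, 5, 5, 6, 9]
```

The recurrence behind it, T(n) = 2T(n/2) + O(n) giving O(n log n), is the same one every algorithms class still derives.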
So if you want to upgrade your skills, there is a good chance that:
a) grad school won't help you (I highly doubt you'll find a Ruby on Rails, PHP, or C# ASP.NET class), and
b) if it does help you, because while taking classes you research the practical skills on your own, wouldn't it have been easier and cheaper (grad school is darn expensive) to have picked up those practical skills without college taking up all your time?
***I will say that Paul Graham mentioned that studying hard problems is like weight lifting for your mind. You may never need to bench press 350 lbs in the course of a day, but lifting that 30 lb object sure is easier if you can lift 350 lbs instead of just 50. So a masters may make your job easier and improve your problem-solving and programming skills overall. But only if you actually do the work.