I've worked as a developer in the real world for almost a decade and have attended graduate school part-time for some of that time. I'm about to complete a master's degree, and I agree 100% with the up-voted poster.
In my experience, "real-world" experience is great for learning how to get things done, but all to often there is a lack of rigor, perspective, and eventually a kind of complacency with doing things as they've been done before. If there is no immediate incentive for learning or changing in the form of money or other market forces and without visionary leadership, it is too easy for these things to fall by the wayside. When it becomes necessary to change, often times outsiders are brought into the organization or there is a scrambling for the business to respond to a knowledge crisis. And by responding to a knowledge crisis, I'm not talking about buying the latest API reference for your language of choice. One of the founders on this site has emphasized in the past that he looks for developers that are smart and gets things done. In my experience, the get things done mentality in most businesses comes first, with perspective and longer-term thinking coming second. As a "smart" developer and a non-founder, how long can one develop bug-tracking software and forum-software?
In terms of pay, if you look at the salary trends on indeed.com, average salaries for a master's degree in computer science are actually lower. And in my own experience, your non-graduate-degree co-workers and management in the real world don't seem to value a graduate degree in computer science (unless you work at the likes of Google or Microsoft Research). That being said, I really feel that the exposure to more challenging problems that require careful thought over a period of time, along with the academic rigor, has given me a mental edge over my co-workers. Plus, I've met more genuinely brilliant people in academia than in the business world. Along with this exposure, however, I've also become more prone to boredom when learning the latest APIs to solve the same business problems over and over again.
I disagree with Mark Rogers that academia follows industry in our field; I think that claim reflects a one-sided perspective. The ideas behind the C# features I heard my co-workers rave about a few years ago, such as lambda expressions and reflection, have been around since the 1930s and the 1980s, respectively. Cloud computing, currently a top trend on Indeed, has existed in some form or another for several decades. Perhaps industry problems drive academic research, but I don't believe that academia follows industry in our field. I also want to point out that the revered book Code Complete, if you look at its citations, is loaded with references to the journals academics spend their careers publishing in. And those articles are not even the latest and greatest to come out of academia. As for the "best minds" comment: I'll concede that perhaps the best minds at getting things done under budget and on time are in the real world.
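To make that concrete, here is a minimal C# sketch (the class and variable names are purely illustrative) of the two features in question: a lambda expression, whose roots go back to the lambda calculus of the 1930s, and a bit of reflection, the 1980s idea I mentioned above:

```csharp
using System;
using System.Linq;

class AcademicRoots
{
    static void Main()
    {
        // Lambda expression: syntactic sugar over ideas from the 1930s lambda calculus.
        Func<int, int> square = x => x * x;
        Console.WriteLine(square(5)); // prints 25

        // Reflection: inspecting a type's members at runtime (the 1980s idea).
        // Here it simply lists the public methods visible on this class.
        var methodNames = typeof(AcademicRoots).GetMethods().Select(m => m.Name);
        Console.WriteLine(string.Join(", ", methodNames));
    }
}
```

Neither feature was invented by the language teams that shipped it; they packaged long-standing academic ideas for industry use, which is exactly my point.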
Overkill? No. Worthwhile? Maybe.
Positives:
- perspective
- rigor
- opportunity to work on new problems
- familiarity with established literature to solve old problems
- segue into a research or teaching career
Downsides:
- it costs money unless you have an assistantship or work for a progressive company
- immediate monetary pay-off is questionable
- if you take evening classes, you may forgo happy hour for a few nights per week for a long time