The traditional approach to teaching computer science focuses on standalone application development. That is how I (and I suspect most SO readers) learned to program, but I am wondering whether it is in fact the best way to learn to be a good developer.
I was wondering what others on SO think about this: is standalone development experience/theory a necessary first step in the education of a good developer? I realize this question is broad (and could be generalized even further if we treat the 'web layer' as just another presentation layer), but I am still curious whether anyone here started with web development and only later learned how to work with the back-end and how to write good class libraries and standalone applications.
'Classical' standalone application development makes up a shrinking percentage of software development projects, as the focus continues to shift toward web applications. While I don't expect the classical model to ever disappear completely, I see far more employment opportunities for web developers than for standalone application developers. If this employment picture persists, should we change the way we train developers? This question has interested me for a while, and I would welcome any thoughts, links, or pointers to resources that deal with this aspect of computer science education in greater depth.