views: 131

answers: 7

The traditional approach to teaching computer science focuses on standalone application development. That is how I (and, I suspect, most SO readers) learned to program, but I am wondering whether that is in fact the best way to learn to be a good developer.

I was wondering what others on SO think about this: is standalone development experience/theory a necessary first step in the education of a good developer? I realize this question is broad (and could be generalized further if we treat the 'web layer' as any presentation layer), but I am still curious whether anyone here started with web development and eventually learned how to work with the back-end and how to write good class libraries and standalone applications.

'Classical' standalone application development makes up a shrinking percentage of software development projects, as focus moves steadily toward web applications. While I don't expect the classical model to ever disappear completely, I see far more employment opportunities for web developers than for standalone application developers. If this employment picture persists, should we change the way we train developers? This question has interested me for a while, and I would welcome any thoughts, links, or pointers to resources that deal with this aspect of computer science education in greater depth.

+1  A: 

After learning to make web pages with HTML, I taught myself PHP as a next step and as my first exposure to programming. It's certainly possible to do, but I wouldn't say it was necessarily better or worse than what my classmates at university started out with in introductory courses.

If you want to teach a web developer, start with web development stuff. If you want to teach an application developer, start with standalone applications. If you want to teach a software engineer who can fill either role, it doesn't much matter where you start, so long as they know the start isn't the end.

Peter Leppert
+1  A: 

Many (even most?) CS or SE degrees now include some form of web development. I'm sure many people get into programming through HTML, then learn CSS and so on, and the recent rise in the popularity of the internet surely plays a role in this. That is certainly how I started.

In my experience, though, if you can write reasonably good 'standalone applications', you are capable of web development, whereas the inverse is not true as often (there are exceptions, of course).

To address your original question more simply: traditional, non-web programming is, in my opinion, more useful if you are just learning the trade. However, web development will become more important, because that's the way the world is moving.

rmx
A: 

There will be different challenges depending on your problem space; web application development has different complications compared with, say, Linux kernel development, or standalone app development (like MS Office or Quicken).

Any type of serious development will expose the programmer/developer to most language features and a slew of good practices.

I, personally, do not think standalone applications are a necessary first step in the education of a good developer. Principles you learn developing any type of application will be applicable to other types of application development. You just have to be flexible and realize that not everything translates over...

hvgotcodes
+7  A: 

I'm very much of the opinion that it's still best to start with console programs. Building a proper web application involves 90% of the knowledge a programmer will use to build a console app, and then a giant pile of other things too.

Learning to build a proper object model, deal with data structures and databases, design control flow structures, and take advantage of the characteristics of the important CS algorithms can all happen in a console environment, and they're all necessary skills for an expert programmer.
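To make that concrete, here is a minimal Python sketch (purely illustrative; the Student class and average_grade method are invented for the example) of the kind of console exercise that touches an object model, a data structure, and control flow with no web plumbing in sight:

    # A tiny console exercise: one class, one list, one loop.
    # The names (Student, average_grade) are invented for illustration.
    class Student:
        def __init__(self, name, grades):
            self.name = name                # a simple object model
            self.grades = list(grades)      # a basic data structure

        def average_grade(self):
            # plain control flow and arithmetic, no frameworks involved
            if not self.grades:
                return 0.0
            return sum(self.grades) / len(self.grades)

    students = [Student("Ada", [90, 85, 92]), Student("Grace", [88, 79])]
    for s in students:                      # iterate and print to the console
        print(f"{s.name}: {s.average_grade():.1f}")

Everything in it transfers directly once a web or GUI presentation layer is added later.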

Learning to do that while also dealing with networking issues, HTML and JavaScript, web application frameworks and presentation layers, etc. just confuses matters. The more moving pieces a novice programmer is required to deal with at once, the more likely it is that they will avoid ever learning exactly what's going on in favor of learning enough to make their code barely work.

In summary, standalone apps are the simplest possible environment to learn programming concepts, and therefore I think they're the best place to start for new programmers looking to move toward mastery.

Walter Mundt
+3  A: 

Neither is better or worse. But it is certainly more difficult to learn programming if you start with web programming, because you have to cover a whole bunch of concepts that have little to do with programming per se. Thus "classical" programming provides a gentler learning curve. In particular, languages like Python that have interactive command-line shells provide a very smooth on-ramp.
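As a sketch of what that on-ramp can look like, a beginner's first session in the Python interactive shell might go something like this (hypothetical transcript):

    >>> 2 + 3 * 4                 # immediate feedback, no build step
    14
    >>> greeting = "Hello, " + "world"
    >>> print(greeting)
    Hello, world
    >>> for i in range(3):        # experiments one small step at a time
    ...     print(i)
    ...
    0
    1
    2

There is no compiler, build system, browser, or server between the learner and the result of each line.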

Marcelo Cantos
+2  A: 

Web development is a pain. See: IE 6. See: browser incompatibilities. See: PHP.

The sole advantage is that it is "modern" and lets the learner see results in a browser instead of a console.

Give the student gcc and a copy of K&R, IMO.

Paul Nathan
A: 

I'm a "don't try to learn everything at once" kind of guy, so I think it would be best in general to start with stand-alone console programming. In a formal curriculum, I'd probably start with Python or maybe Scheme and add stuff onto that.

However, if somebody is learning on their own, I'd want to give them something they found fun to work on, and that might well be web apps. Almost anything will serve as a start, if it can be modified in a reasonably straightforward way. If I were guiding somebody doing that, I'd try to keep the environment reasonable, for example keeping network issues and IE 6 out of it.

There's also the dichotomy between somebody wanting to become a good programmer and somebody wanting to program stuff for a certain purpose. If somebody wants to do web programming, they're probably best off studying web programming first.

David Thornley