views: 1139
answers: 15

Is it only me, or is coding getting slower and slower until a product (a project) is finished?

This question came to me when I saw "Write a program in 30 minutes":

I thought it was impossible to write a decent piece of code that runs smoothly in that time.

As I get older I see that every project, even a simple one, requires a lot of additional work to get it into a state where everybody can use it. You need a lot of error-catching, UI design, and even text so your app can communicate with end users. Back in the old days (around 1985) it was possible to build something usable in a few weeks, maybe months. Today you will spend at least a man-year developing a robust application that performs a non-trivial task.
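To illustrate the kind of extra work I mean, here is a made-up sketch (the file name and default value are invented): the same task - reading one numeric setting - first as a quick 1985-style hack, then with the error-catching and end-user messages a shippable app needs today:

    def read_timeout_quick():
        # Works until the file is missing, unreadable, or malformed.
        return int(open("settings.txt").read())

    def read_timeout_robust(path="settings.txt", default=30):
        # Every extra line below is robustness and end-user
        # communication, not new functionality.
        try:
            with open(path, encoding="utf-8") as f:
                value = int(f.read().strip())
        except FileNotFoundError:
            print(f"Settings file '{path}' not found; using default of {default}.")
            return default
        except ValueError:
            print(f"Settings file '{path}' is malformed; using default of {default}.")
            return default
        if value <= 0:
            print(f"Timeout must be positive; using default of {default}.")
            return default
        return value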

Of course, complexity has increased, but I am worried about how application complexity scales with development time.

+1  A: 

IMHO, you're wrong.

High-level languages have made rapid development possible. This has counterbalanced the rise in application complexity, so it still takes a similar amount of time to develop a quality application.

FlySwat
The same amount of time is not the same amount of manpower. Today you are working with more people on a project to finish it in the same amount of time.
Peter Parker
You're working with more people, but you also sell to more people.
Joeri Sebrechts
+19  A: 

Coding has actually gotten much, much faster since 1985, thanks to vastly better tools, but now we demand more of our applications.

Steven A. Lowe
Yes, that is the point, but the ratio between demand and development speed is getting worse, I think.
Peter Parker
That's good for business!
Steven A. Lowe
I strongly disagree. See my answer to explain why.
David G
A: 

I think you have to take into account that the "non-trivial tasks" are somewhat more non-trivial than they were 20 years ago.

With increasing complexity comes a vastly increased time needed to finish a product.

Looking at coding aids like CodeRush, I think it would be quite wrong to say that coding has gotten slower over the years - the opposite is the case.

Grimtron
You are right, it is not slower, but project size in man-months is increasing faster than the complexity of the problems solved by the application, isn't it?
Peter Parker
Well, that's just because managers insist on hiring worthless programmers... you know who I'm talking about.
FlySwat
But this really depends on your framework, team, methodology, etc. I can see that new technologies demand some extra time to get used to them, but that is not much different from the old days - only that the technologies themselves offer much more flexibility.
Grimtron
@Grimtron: Sorry, but I think there is no framework or technology which will allow producing the same quality of application in half the time per developer. You are right, there are plenty of new technologies out there; however, the complexity of your application will also rise.
Peter Parker
@Peter Parker: "No technology will allow producing the same quality of application in half the time per developer" - what are you comparing here? Take MFC vs. .NET. I worked with the foundation classes for several years, and I would never go back after focusing on .NET, because of the productivity gains.
Grimtron
+2  A: 

OK, I see your point:

Essentially you say, "you may need more resources, but it will be paid for." However, if complexity continues to increase, you will need more developers on a project, and that does not scale well with regard to management, payment, etc.

If you think this through to the end, it means software projects will become more expensive and need more people to accomplish a task. It also means each person will get less money.

If you want a real-life example, look at the games industry: projects grew from one-person games in the '80s to 3-8-person games in the '90s; today an AAA title easily involves more than 100 people. With that many people, everyone inside (except maybe the CEOs) has more problems: less money, more work, and more responsibility.

Is this the same for all branches of the software industry?

Peter Parker
The games industry is not a good example - that's entertainment. That would be like saying movie crews have grown from a director, a camera operator, and 8 actors to thousands of actors, artists, and CGI programmers, and everybody gets less money - when in fact everyone gets more money and they sell more movies.
Steven A. Lowe
+1  A: 

I disagree that it takes a whole person-year to develop a non-trivial application.

I can only quote directly from my own experience, but most of our applications take only 1-3 person-months. Admittedly, we have built ourselves a very high-level framework which does a lot of the work for us, and we do limit ourselves to web-based business-process applications, which means there is a lot we can reuse from one to the next.

I think that by using the right tools, having smart people, and doing things the right way, you can develop software faster than in the past. No Silver Bullet is a good essay, but if you re-read it (as I did recently), it doesn't say you can't get an order-of-magnitude productivity improvement overall - it just tells you not to expect it from a single change or a single new tool.

Leigh Caldwell
A: 

Because programming is really, really difficult?!

kronoz
lol being marked down... does that mean programming is easy and all this effort I've gone to has been a waste?! Wowzers!
kronoz
+24  A: 

I think your question is well answered by Fred Brooks's famous essay, No Silver Bullet. You can get the essence of his argument from Wikipedia. The crux of the argument is that essential complexity - the problem the software is trying to solve - is why software is hard and slow. Accidental complexity is what new languages and IDEs solve.

I disagree with all the answers I've seen. Newer languages and improved tools improve productivity only a small amount. The only big jump in productivity I've seen in my career was going from assembler to C. Everything since then has been a slight increment. And even with that leap, we haven't solved the essential complexity of what applications are doing.

I think people often confuse the increase in their personal productivity, as they go from beginner to accomplished developer, with a general increase in productivity. People also go from solving simple problems to being involved in ever larger teams and applications, so they think applications in general are becoming more complex.

People were writing extremely complex applications in the 1950s, too. Most people don't know enough about the history of the industry to realize it.

David G
While you have some good points, if you compare the tools and libraries of today to those of 1985, the difference is far more than 'accidental'. Today: .NET, C#, Windows XP, SQL Server; 1985: DOS, C, CTree. The difference: an entire app framework, a GUI, and a database.
Steven A. Lowe
Have you read Brooks's essay? Even if all those things you mention make you 3 times more productive, they're still accidental. Also, in 1985 there were Unix and Oracle (and other) databases. Don't just look at PCs.
David G
Let's see - I read Brooks 26 years ago, so yes, I've read his essay. My understanding of its point was that the essential complexity of software development is the reason why the process will never be completely automated or 'solved'... see next comment.
Steven A. Lowe
... and not a reason to automatically discount the productivity gains from any tool. For example, the windowing invoice app with drill-down that I wrote in 1985 was done in C on DOS using CTree and Curses and took 3 months. The same app today would take about a week.
Steven A. Lowe
OK, I re-read Brooks to be certain of the terminology you are using. I would assert that the main benefit of modern tools has been to convert _some of_ what was once essential complexity into merely accidental complexity. GUIs are a prime example; today I don't have to write one, 20 years ago I did.
Steven A. Lowe
I guess we have to agree to disagree. I think the GUIs you no longer have to write are still accidental, not essential. Even if I'm wrong about that, GUI generators and IDEs have been available for 20 years. See IDMS products or (later) PowerBuilder as examples... see next comment.
David G
Of course, my views are influenced by the type of development I've done. I've worked on compilers, virtual machines, rules engines, workflow engines, etc. IDEs and GUIs and frameworks aren't much help in that world.
David G
I think that's the difference. If you are developing business-oriented apps - like I do - libs and frameworks make getting the foundations and all the peripherals very much easier. I just have to concentrate on getting the business logic right; everything else has been solved.
Mostlyharmless
@[David G]: I think we do now agree - when you had to write the GUI, it was essential; now it is accidental. Brooks's statement that no technology could offer an order-of-magnitude increase in productivity still applies, with the possible exception of code generators ;-)
Steven A. Lowe
+5  A: 

In the industry today, we have "solved" a lot of the framework issues we used to struggle with. We've got no end of tools for desktop GUIs, web frameworks, database access, and communications plumbing - take your pick. But the real problem lies in the complexity of the business problem you are trying to solve. No toolkit exists to solve that problem (thank god), nor will one ever. If it did, hypothetically speaking, then no company would have any real competitive advantage over another. Sure, there is software that solves particular problems common to most businesses (accounting packages, some ERP modules), but most toolkits address specific technology issues, not business problems.

As for a minimum of one man-year: it obviously depends on what you're trying to build. The key difference today, as opposed to the '80s when I started programming, is that we can spend more time on the business issues and less time changing the software every single flipping time we need to add support for yet another type of printer :-) Hence the business problems we are solving are getting bigger to fill that void.

Phil Bennett
Actually, I'd say that *because* we have no end of new frameworks, we never get to be masters of any of them before the rules change and we have to learn more stuff all over again.
gbjbaanb
+1  A: 

The amount of time dedicated to realizing an abstraction of a real-life situation (be it a video-game forest or a sales contract) is related to the amount of detail with which we want to replicate that real-life object.

In the 80s, we made do with a minimum of abstraction to get the job done.

Nowadays, an AAA game title uses much more detailed abstractions (programming things like wind, weather, and interactive blades of grass), and that requires dedication.

So, the greater the detail of the abstraction you're trying to replicate, the more time you need.

This doesn't account for wasteful details (rendering the boogers in your enemy's nose), or for tuning the accuracy of these abstractions (if a guerrilla's fart is traveling NW at 2 knots, and the wind is blowing SE at 13 knots, which way should the grass near his ass blow?).

Pete Karl II
A: 

In 1985, the software users were scientists and the app environments were fully under control.

Now you must build for somebody who doesn't need to know anything about the way it works, and ensure it will run in an environment with thousands of variables that aren't even related to the app's purpose.

Thanks to modern tech, dev is faster. But needs have changed; that's why it feels slower.

Writing a static web page versus a 'simple' blog engine is like building a hammer versus a hammer factory.

e-satis
A: 

Because you waste too much time on discussions on internet sites instead of actually coding?

Programmers are considered notoriously bad at estimating time to completion. It turns out, though, that we're not far off if we consider only the coding time - the problem is all the other crap in our workdays, from useless meetings to, yes, goofing off on the internet.

wnoise
A: 

This may sound cynical:

We as programmers must spend a lot of time trying to take extremely complex requirements, written by analysts who have no idea what we actually "do for a living that takes so much time", and then attempt to mold those requirements into modules that fit within the unnecessary third-party libraries we are required to use, because some consultant who was hired by another consultant convinced upper management that it was a good idea to spend a lot of money.

wow, that was a bit more harsh than I intended, and please pardon my grammar.

Causas
A: 

When my boss asks why development of 'feature A124#' took soooo long (and he doesn't do that much, I must admit, because he is also a good developer and we've been working together for quite some time), my typical answer is: "Because we already solved all the simple problems."

Demand has overtaken programming tools, so to solve harder and harder problems we have to work more and more. That's why we need better tools.

gabr
+2  A: 

Check out The Law of Leaky Abstractions by Joel Spolsky:

Ten years ago, we might have imagined that new programming paradigms would have made programming easier by now. Indeed, the abstractions we've created over the years do allow us to deal with new orders of complexity in software development that we didn't have to deal with ten or fifteen years ago, like GUI programming and network programming. And while these great tools, like modern OO forms-based languages, let us get a lot of work done incredibly quickly, suddenly one day we need to figure out a problem where the abstraction leaked, and it takes 2 weeks. And when you need to hire a programmer to do mostly VB programming, it's not good enough to hire a VB programmer, because they will get completely stuck in tar every time the VB abstraction leaks.

The Law of Leaky Abstractions is dragging us down.
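To make the idea concrete, here is a minimal illustration (my own example, not one from Joel's essay): floating-point numbers abstract the reals, but the binary representation underneath leaks through, and once it leaks you have to understand the layer below the abstraction:

    import math

    # 0.1 has no exact binary representation, so the abstraction leaks:
    print(0.1 + 0.2 == 0.3)              # False -- the leak shows through
    print(0.1 + 0.2)                     # 0.30000000000000004

    # Once it leaks, you must compare with a tolerance instead of
    # exact equality -- knowledge of what is underneath.
    print(math.isclose(0.1 + 0.2, 0.3))  # True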

dimitrisp
I agree, but as we "stack" technology on top of technology, do we create a situation where it becomes more difficult to get rid of abstraction leaks (since they may span more than one level of abstraction)?
Peter Parker
+1  A: 

Hi Peter,

I've felt that way too. For my part, I have to believe that nowadays we write more code because we want to make the development process fail-proof and maintainable - with even more code. That's why we:

  1. Separate the UI from data processing and data storage (and each of these layers adds complexity),
  2. Use frameworks which allow us to change the database without too much extra work (see the sketch after this list),
  3. Create automated tests and unit-test all the parts of the software,
  4. Implement patterns everywhere.
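Here is a minimal sketch of points 1-3 (all class and function names are invented for illustration): the business logic depends only on an abstract store, so the database behind it can be swapped, and unit tests can plug in an in-memory stand-in:

    from abc import ABC, abstractmethod

    class OrderStore(ABC):
        """Boundary between business logic and storage (point 1)."""
        @abstractmethod
        def save(self, order):
            ...

    class SqlOrderStore(OrderStore):
        def save(self, order):
            # Stand-in for a real database call; changing the database
            # (point 2) means replacing only this class.
            print(f"INSERT INTO orders VALUES ({order!r})")

    class InMemoryOrderStore(OrderStore):
        """Swap-in for unit tests (point 3): no database required."""
        def __init__(self):
            self.orders = []

        def save(self, order):
            self.orders.append(order)

    def place_order(order, store):
        # The business logic is identical whichever store is plugged in.
        store.save(order)

    place_order("42 widgets", SqlOrderStore())       # production path
    place_order("42 widgets", InMemoryOrderStore())  # test path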

What's more, nowadays we have even more ways of doing exactly the same thing - either by knowing better frameworks / libraries / components or by selecting the best algorithms - and making decisions like this takes time.

The time necessary to create, decide on, and execute all this stuff is considerable, but I want to believe it is well spent when, for instance, you are no longer required to start from zero because you can no longer maintain an application.

rshimoda