Background: the essay "The Rise of 'Worse is Better'" and the Wikipedia article on it

I read it ages ago, and looking back now, it seems to have influenced the way I approach software development. Though I'm not sure whether that was for better or worse. (-:

Do you agree that worse is better? How has it changed the way you approach development?

Does "worse" cost less in the long run?

Do you often say or hear "this is not the right thing"?

+1  A: 

I prefer to integrate some aspects of both approaches.

  • Completeness develops over time. Don't worry about releasing something incomplete.
  • Be honest about what's incomplete. Don't add non-functional stubs to conform to an external interface, even when that interface is a mandatory part of an external interoperability specification -- that's hiding incompleteness behind incorrectness. (See the first sketch after this list.)
  • Optimize later, when you really need it. Don't immediately throw out something powerful but slow like late binding; use techniques like code generation to win back the performance where and when it matters. (See the second sketch below.)
  • Solve systemic concerns systemically. Sometimes a requirement is not addressed by any one line of code, but by the way the pieces fit together, or by what the code avoids doing.
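
To make the second point concrete, here is a minimal Python sketch; the AuthProvider interface and its method names are hypothetical. An interface method you haven't implemented yet should fail loudly rather than pretend to succeed:

    from abc import ABC, abstractmethod


    class AuthProvider(ABC):
        """Hypothetical external interface our component must eventually satisfy."""

        @abstractmethod
        def authenticate(self, user: str, password: str) -> bool: ...

        @abstractmethod
        def revoke(self, user: str) -> None: ...


    class PartialAuthProvider(AuthProvider):
        def authenticate(self, user: str, password: str) -> bool:
            # The part that actually works today (placeholder logic for the sketch).
            return password == "letmein"

        def revoke(self, user: str) -> None:
            # Honest incompleteness: fail loudly instead of silently doing nothing,
            # which would hide incompleteness behind incorrectness.
            raise NotImplementedError("revoke() is not implemented yet")

Callers that hit revoke() get an unambiguous error they can plan around, rather than a silent no-op that corrupts their assumptions.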

The first two points are characteristic of the worse-is-better school, while the last two points are characteristic of the diamond-like jewel subschool.
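
And a minimal Python sketch of the third point: keep the flexible, late-bound code as the baseline and generate a specialized function only where measurement shows it matters. Record, sum_fields_late, and make_summer are illustrative names.

    import timeit


    class Record:
        def __init__(self, x, y):
            self.x = x
            self.y = y


    def sum_fields_late(objs, field_names):
        """Late-bound version: resolves every field by name on every access."""
        total = 0
        for obj in objs:
            for name in field_names:
                total += getattr(obj, name)
        return total


    def make_summer(field_names):
        """Generate a specialized summer once, removing per-access name lookups."""
        expr = " + ".join(f"obj.{name}" for name in field_names)
        src = f"def _summer(objs):\n    return sum({expr} for obj in objs)\n"
        namespace = {}
        exec(src, namespace)  # in real code you would cache and sanity-check this
        return namespace["_summer"]


    if __name__ == "__main__":
        data = [Record(i, i * 2) for i in range(10_000)]
        fields = ["x", "y"]
        sum_fields_fast = make_summer(fields)

        assert sum_fields_late(data, fields) == sum_fields_fast(data)
        print("late binding:", timeit.timeit(lambda: sum_fields_late(data, fields), number=100))
        print("generated:  ", timeit.timeit(lambda: sum_fields_fast(data), number=100))

The generated version is faster only because the string-based getattr lookups and the inner Python loop disappear; the late-bound version stays as the readable, flexible reference implementation.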

Jeffrey Hantin
A: 

This is only a tiny part of a larger fight in software engineering that will continue to rage for some time. Ultimately it comes down to the problem that software is abstract and we don't have good tools or techniques for judging different aspects of software and software development yet.

Many people are already aware of the problems with low-quality software, but the industry still struggles both to judge the quality of software objectively and to produce higher-quality software reliably (both abilities are essentially at the level of "art form" today).

"Worse is Better", however, is an attempt to point out the perils of over-engineering software. There was a recent stackoverflow podcast with Eric Sink where he said something that struck me as particularly meaningful. He was describing the differences between SVN and Team Foundation and how ridiculous it is that we talk about these two things with the same terminology, because they are vastly different (as different as an open pit mine excavator and a commuter car). This highlights a weakness in the terminology and mental models we use in this field. We are at a stage where we fundamentally lack the intellectual tools and models to discuss software well. Team Foundation is a good source control system, but it's massively the wrong tool at the scale of requirements that SVN satisfies.

This sort of problem exists everywhere in software, but usually at smaller scales where we don't appreciate it. TF vs. SVN is an orders-of-magnitude difference in engineering, but think about all the components of any given software system. It's exceedingly likely that in any given piece of software the vast majority of the components are either under-engineered or over-engineered by a substantial margin (though likely not by orders of magnitude in most cases). This has drastic consequences for the business of software: under-engineered components lead to products that deliver less value to customers, which can mean lower revenue or even bankruptcy, while over-engineered components often (though certainly not always) mean development costs that are higher than they need to be, or excessive implicit requirements imposed on the rest of the system.

Sometimes the right solution is just gaff tape and wire, and sometimes the right solution is a CNC-machined part made from a cast block of iridium-titanium alloy. Knowing the right level of engineering for a given problem is a key to business success.

Wedge