We have Amdahl's law, which basically states that if your program is 10% sequential, you can get at most a 10× speedup by parallelizing it, no matter how many processors you throw at it.
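
A quick numerical sketch of the law (the function name and the processor counts are just illustration):

```python
def amdahl_speedup(serial_fraction, n_processors):
    """Maximum overall speedup when only the parallel part scales."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / n_processors)

# With a 10% serial fraction the speedup creeps toward, but never reaches, 10x:
print(round(amdahl_speedup(0.10, 10), 2))    # 5.26
print(round(amdahl_speedup(0.10, 1000), 2))  # 9.91
```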

Another one is Wadler's law which states that

    In any language design, the total time spent discussing
    a feature in this list is proportional to two raised to
    the power of its position.

        0. Semantics
        1. Syntax
        2. Lexical syntax
        3. Lexical syntax of comments

My question is this: What are the most important (or at least significant / funny but true / sad but true) laws of Computer Science and programming?

I want named laws, not random theorems, so an answer should look something like

Surname's (law|theorem|conjecture|corollary...)

Please state the law in your answer, and not only a link.

Edit: The name of the law does not need to contain its inventor's surname. But I do want to know who stated (and perhaps proved) the law.

+13  A: 

Conway's Law: "...organizations which design systems ... are constrained to produce designs which are copies of the communication structures of these organizations."

+1 — I love Conway's Law and the inverse, where organizations can mold to mirror the structure of the software they produce. In my opinion, this is one of the least well-understood phenomena of software which management should definitely understand.
Quinn Taylor
I've always heard this as "four teams of people writing a compiler will produce a four-pass compiler".
+24  A: 

Law of Demeter/Principle of Least Knowledge:

An object should only talk to its "neighbors" and not tell its neighbors how to work or invoke methods on a neighbor's neighbor.

In other words (as Wikipedia puts it): you tell your dog to walk; you don't tell the dog's legs to walk.

Colin Burnett
+14  A: 

The Pareto principle (also known as the 80-20 rule, the law of the vital few and the principle of factor sparsity) states that, for many events, roughly 80% of the effects come from 20% of the causes.

Robert Gould
And the corollary to that rule: One cannot get away with implementing the most-used 20% of the features and still expect to have 80% of the userbase.
+24  A: 

There are no silver bullets by Frederick P. Brooks, Jr

There is no single development, in either technology or management technique, that by itself promises even one order-of-magnitude improvement in productivity, in reliability, in simplicity.

Try to find the name of the person to whom this is attributed, I know it's known, but can't remember it now
Robert Gould
Right it was Brooks! Thanks!
Robert Gould
It's in the article I linked, but now I also included it here
+67  A: 

Hofstadter's Law:

It always takes longer than you expect, even when you take into account Hofstadter's Law.

Jonathan Schuster
My favorite thing about this law is that it is recursive.
+43  A: 

Brooks' law:

Adding manpower to a late software project makes it later.

Kevin Beck
This was shown to be false by a group of MIT students.
Andrei Tanasescu
+12  A: 

Zawinski's Law

Every program attempts to expand until it can read mail. Those programs which cannot so expand are replaced by ones which can.

A reference to all such laws!
...except Microsoft Outlook
+11  A: 

Alan Turing's proof of the undecidability of the Halting Problem

"Given a program and its input, determine whether the program will complete or run forever."

Cannot be solved in the general case with arbitrary input
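
The heart of Turing's proof is a short self-referential construction; here is a sketch in Python, where `halts` stands for the hypothetical decider that the argument shows cannot exist:

```python
def halts(program, data):
    """Hypothetical total decider: True iff program(data) would halt.
    Turing's diagonal argument shows no such function can exist."""
    raise NotImplementedError("no general halting decider exists")

def trouble(program):
    # Do the opposite of whatever the decider predicts about program(program).
    if halts(program, program):
        while True:      # decider said "halts", so loop forever
            pass
    return               # decider said "loops forever", so halt immediately

# trouble(trouble) contradicts either answer halts could give,
# so the assumption that a working halts() exists must be false.
```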

Try to find out who stated the halting problem and/or who proved it. Then your answer will get upvotes since this is one of the pillars of computer science.
Added the name of the apparent originator (Martin Davis) of the Halting Problem
Robert Gould
If, as according to the link, Turing proved it in 1936, and Davis was born in 1928, did Davis really originate the problem? It seems he maybe just came up with the term.
Matt Lewis
See . Apparently he just came up with the term. "The earliest known use of the words "halting problem" is in a proof by Davis (p. 70–71, Davis 1958)"
Michael Myers
The Halting Problem is usually associated with Turing and his proof, so that's what I'll roll it back to.
+27  A: 

Wirth's law states that:

Software gets slower faster than hardware gets faster.

+2  A: 

Linear speedup theorem

Given any c > 0 and any Turing machine solving a problem in time f(n), there is another machine that solves the same problem in time cf(n)+n+2.

+4  A: 

Little's Law (queueing theory):

The long-term average number of customers in a stable system, L (known as the offered load), equals the long-term average arrival rate, λ, multiplied by the long-term average time a customer spends in the system, W. That is, L = λW.
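
A toy illustration of L = λW with made-up numbers (5 arrivals per minute, 2 minutes per customer in the system):

```python
arrival_rate = 5.0        # lambda: average customers arriving per minute
time_in_system = 2.0      # W: average minutes a customer spends inside
customers_in_system = arrival_rate * time_in_system  # L = lambda * W
print(customers_in_system)  # 10.0
```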

Matt J
You should divide this into two answers, as they are two different laws
Robert Gould
Good idea; done. :)
Matt J
+52  A: 

Ninety-ninety rule

The first 90% of the code takes 90% of the time. The remaining 10% takes the other 90% of the time.

Attributed to Tom Cargill and popularized by Jon Bentley. Is this superseded by Hofstadter's Law?

Should this be "The first 90% of the code takes 10% of the time..."?
@Jonas - No, it's right just the way it is :-)
No. That's the whole point - if people think they're 90% done, they're not even close.
The problem with Hofstadter's law is that it tends to put the blame on external factors. The ninety-ninety rule emphasizes that the root of the problem was a poor understanding of the work and estimation of its difficulty. By applying both, it tends to encourage first bringing expectations into line, then allowing slack in the schedule for unforeseen complications.
Documentation takes the 3rd 90% of the time.
@Carsten, What documentation? ;-)
Nathan Koop
+9  A: 

Godwin's Law:

As a [StackOverflow] discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches 1.

That's simple infinite monkeys reasoning. As any discussion grows longer, the probability of a comparison involving anything approaches 1.
Phil H
@Phil: True, but the probability of comparisons with Nazis/Hitler approaches 1 at an unusually high rate.
Steve S
@Phil H: Logic nazi!
Michael Myers
Actually, somebody's profile here on SO says: "As a StackOverflow discussion grows longer, the probability of Jon Skeet's name being mentioned approaches 1." I wish I could remember who it was. (It wasn't Jon Skeet.)
Michael Myers
Aha, here he is:
Michael Myers
@mmyers: you just proved the point!
+15  A: 

Peter Principle:

In a hierarchy, every employee tends to rise to his level of incompetence.

Otherwise known as Promotion to Mediocrity.
Phil H
+18  A: 

Knuth's optimization principle:

We should forget about small efficiencies, say about 97% of the time: premature optimization is the root of all evil.

*Please* do not cripple the quote like that! The full quote is much more nuanced than this little, oft misquoted snippet!
Jörg W Mittag
Still something missing from the quote? Can you please edit the answer to fix it?
This is the full quote, from the classic paper "Structured Programming with go to Statements" (1973), p. 268. Maybe Knuth is quoting another source (himself?) at this place without marking it explicitly as a quote. But then you should blame him for plagiarism ;-) Which other source do you have in mind?
+3  A: 

Not computer science or programming specifically, but certainly true:

Anything that can go wrong will go wrong.

I don't believe it's necessary to name this adage. For those few too ashamed to admit they don't know it, here is a link for you to anonymously follow to correct this gaping hole in your knowledge.

Chris Lutz
This is Murphy's law.
Christoffer Soop
And expect it to go wrong at the worst possible moment.
Also called Sod's Law in the UK.
Omar Kooheji
... and it will happen sooner than you think
Lasse V. Karlsen
+11  A: 

Asimov's three laws:

  1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
  2. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.
  3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
Rob Carr
I am Rob, I have interfaced with my spectrum. Behold!
Chris S
+6  A: 

Gustafson's Law ameliorates the parallelism doom-and-gloom of Amdahl's Law by stating that the problem size tends to increase in time, allowing linear application-level speedups even in the face of imperfect parallelization. The linked Wikipedia article has a much better explanation than I can muster, but here's an example:

Amdahl's Law approximately suggests: “ Suppose a car is traveling between two cities 60 miles apart, and has already spent one hour traveling half the distance at 30 mph. No matter how fast you drive the last half, it is impossible to achieve a 90 mph average before reaching the second city. Since it has already taken you 1 hour and you only have a distance of 60 miles total, going infinitely fast you would only achieve 60 mph. ”

Gustafson's Law approximately states: “ Suppose a car has already been traveling for some time at less than 90 mph. Given enough time and distance to travel, the car's average speed can always eventually reach 90 mph, no matter how long or how slowly it has already traveled. For example, if the car spent one hour at 30 mph, it could achieve this by driving at 120 mph for two additional hours, or at 150 mph for an hour, and so on. ”
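
The two laws can also be contrasted numerically; a sketch (the 10% serial fraction and 64 workers are arbitrary illustration values):

```python
def amdahl(serial_fraction, workers):
    # Fixed problem size: the serial part caps speedup at 1/serial_fraction.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / workers)

def gustafson(serial_fraction, workers):
    # Scaled problem size: the parallel part grows with the machine.
    return serial_fraction + (1.0 - serial_fraction) * workers

print(round(amdahl(0.10, 64), 2))     # 8.77  (can never exceed 10)
print(round(gustafson(0.10, 64), 2))  # 57.7
```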

Matt J
+25  A: 

The Dilbert Principle (corollary to the Peter Principle) - by Scott Adams, of course:

The most ineffective workers are systematically moved to the place where they can do the least damage: management.

Then of course, there's also the Dogbert Principle: People are idiots... But I didn't want to post that :-)
I like the one from Men In Black "A person is smart. People are stupid", though not named and not programming related
My personal favorite wisdom from Dilbert is what I call Wally's Rule: If you wait long enough, most problems take care of themselves.
@ferocious: I totally agree :-). @ted - Wally might add to that "or be passed to someone else"...
@AviD - He might, but that is more or less a direct quote from Wally. It was the last frame in a strip where Dilbert was working his tail off to prepare for a meeting that got canceled, while Wally slacked. I wish I could find it online.
I can assure you that the most ineffective workers are moved to management, but they can still do a hell of a lot of damage.
Stefano Borini
It's not a corollary to the Peter Principle, it's a different concept: the PP applies if a *good* employee gets promoted but sucks at his *new* work. Because of that he'll not be promoted any further. The DP applies if a *bad* employee gets promoted to a new post, because there he can do *less damage* (supposedly). The DP is not capped, i.e. there are no inherent limits to how far a dope can advance up the corporate ladder.
The Dilbert and Peter Principles are superseded (or at least contested) by the Gervais principle: "Sociopaths, in their own best interests, knowingly promote over-performing losers into middle-management, groom under-performing losers into sociopaths, and leave the average bare-minimum-effort losers to fend for themselves."
Christoffer Soop
+4  A: 
Just because John von Neumann revolutionized computing, doesn't mean that everything he wrote is related to computing. He also revolutionized game theory, economics and probably other fields I can't remember right now. This is a theorem from game theory, not computer science.
Jörg W Mittag
+2  A: 

Here is my favorite:

Murphy's Law

Simplified: "Whatever can go wrong, will go wrong"

However, there is a little more to it; see Wikipedia.

I like this more humanized version best: „If there's more than one possible outcome of a job or task, and one of those outcomes will result in disaster or an undesirable consequence, then somebody will do it that way.“

And of course Moore's law

famous interpretation: "The processing speed of computers will double every two years!"

stated similarly in 1975

Again, there's more to it: Wikipedia

should be 2 separate answers, especially 2 biggies like these!
Related to Murphy: Sod's Law: when it goes wrong, it will go wrong in the worst possible way.
Martin York
+10  A: 

Greenspun's Tenth Rule of Programming:

Any sufficiently complicated C or Fortran program contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of Common Lisp.

Firas Assaad
+8  A: 

Linus's Law:

Given enough eyeballs, all bugs are shallow.

Firas Assaad
Which is really Eric Raymond's version of Linus' Law... ;-)
Quinn Taylor
+1  A: 

The Last Responsible Moment for decision making rule:

The key is to make decisions as late as you can responsibly wait because that is the point at which you have the most information on which to base the decision.

Mr. Brownstone
+8  A: 

[Sorry couldn't resist]

Stack Overflow law of subjective questions:

Each question marked subjective is either closed within minutes
or collects a large number of upvotes (some even both).

and to whom should we attribute this law? To Gamecat or to the SO community? (perhaps to the community, since this is a community wiki :-)
+2  A: 

Norvig's Law: Any technology that surpasses 50% penetration will never double again (in any number of months).

Bill the Lizard
How do you define 50% penetration? Half of the people on Planet Earth?
Michael Myers
50% of your target market, I guess. Half of the people on Planet Earth would definitely be an upper bound (unless we make an astounding discovery).
Bill the Lizard
Oh? Man + Woman -> Man + Woman + Baby.
Loren Pechtel
@Loren: You make an excellent point. :)
Bill the Lizard
+5  A: 

The Law of Natural Selection: "Natural selection is the process where heritable traits that make it more likely for an organism to survive long enough to reproduce become more common over successive generations of a population. It is a key mechanism of evolution."

This applies to computer systems as well: systems that support the business functions directly related to earning capital are more likely to receive funding and therefore more likely to survive budget cuts. Hence, to survive the tumultuous nature of the software development industry, it is logical to concentrate on skills that support those types of applications.

Footnote: Those of us able to apply the aforementioned principle will be more likely to earn more money and thereby be in a better position to procreate. Natural Selection wins again!


This also explains why people in sales always make more money than those in development.
I wish this were even true all the time. Sometimes the department that makes the most money can be such a political whipping boy as to get nearly no staff, attention, and contain the people with the lowest salaries.
Trampas Kirk
+1  A: 

Sod's Law (akin to Murphy's):

When it goes wrong (and according to Murphy it will), it will go wrong in the worst possible way.

Martin York
+10  A: 

Abraham Maslow

If the only tool you have is a hammer, you treat everything like a nail.

So, programmers should learn several languages and learn how to use the strengths of each one effectively. It is no use to learn several languages if you do not respect their differences (Roberto Ierusalimschy, Programming in Lua)

+4  A: 

Principle of least astonishment

In user interface design, programming language design, and ergonomics, the principle (or rule or law) of least astonishment (or surprise) states that, when two elements of an interface conflict, or are ambiguous, the behaviour should be that which will least surprise the human user or programmer at the time the conflict arises.

+5  A: 

Edward V. Berard Law

Walking on water and developing software to specification are easy as long as both are frozen.

That may be true, but there is something to be said for agile methodologies, which do not depend on frozen specs to successfully complete a project.
+14  A: 

I do a lot of distributed programming, so one of my favorites is Segal's Law:

A man with a watch knows what time it is. A man with two watches is never sure.

The application to distributed programming is that you have to either arrange things so that your whole distributed system has only one clock driving things, or accept that processes/events using different clocks are going to run asynchronously. Two clocks will drift from each other. You can't expect two separate clocks (typically on two separate machines) to act in lock-step.


Judge Dredd's Law

"I am the Law"

David Plumpton
That was really funny for some reason
+2  A: 

Any sufficiently advanced technology is indistinguishable from magic.

Arthur C. Clarke, Profiles of The Future, 1961 (Clarke's third law)

The only way to discover the limits of the possible is to go beyond them into the impossible.

Arthur C. Clarke, Technology and the Future (Clarke's second law)

When a distinguished but elderly scientist states that something is possible, he is almost certainly right. When he states that something is impossible, he is very probably wrong.

Arthur C. Clarke, (Clarke's first law)

Michael Damatov
Any sufficiently advanced bug is indistinguishable from a feature. -- Rich Kulawiec
Bill the Lizard
+1  A: 

There is the "Heisenberg uncertainty principle", which in general states:

In quantum physics, the Heisenberg uncertainty principle states that certain pairs of physical properties, like position and momentum, cannot both be known to arbitrary precision.

Translated to software engineering, one application of this principle is the following: you cannot test and debug your own code without affecting it, since in order to do so you must add additional code, and therefore you are no longer testing the original system.

Artem Barger
+2  A: 

Software can only ever be two of the following:

  • fast
  • cheap
  • delivered on time
Name for this theorem?
A common variant of this rule replaces the last item with "reliable"
@jonik - no idea, I remember reading it somewhere once. I am happy to name it Amy's Theorem though
It's called the Project Triangle.

Zimmerman's maxim:

Anything written down in more than one place, is wrong in more than one place.

You will never have two copies of data stay the same, especially if you depend on a human to keep them the same.

David Zimmerman
So basically you're taking credit for the DRY principle? At least you have no problems with excessive modesty :)
+5  A: 

One of my Favourites is

"One in a million is next Tuesday"

Larry Osterman.

Basically it states that when dealing with computers, things happen so fast that even something that happens very rarely is going to happen within the next few days.

+2  A: 

I like Proebsting's Law:

Compiler Advances Double Computing Power Every 18 Years

+5  A: 

The Steve Rule:

In a random sample of programmers, there will be more named Steve than there will be females.

This can be refined to a more correct and culture-agnostic version:

In a random sample of programmers, the likelihood of there being a male name with more programmers bearing that name than there are female programmers approaches 1 as the sample size increases.

Michael Borgwardt
This is similar to a rebuttal of a major "creation-science" organization's attempt to get 500 scientists to sign a document stating that they didn't believe in evolution by getting 500 scientists named Steve (or derivatives thereof) to sign one stating that they did.
Chris Lutz
+2  A: 

Atwood's Law:

Any application that can be written in JavaScript, will eventually be written in JavaScript.

+1  A: 

Some less well known ones:

Wheeler's law: "All problems in computer science can be solved by another level of indirection"

Berry’s law. "The best way to go infinitely fast is to produce no code at all" - i.e. If something can be computed once, then do it at compile time.

Charles Ma
The quote from Wheeler continues: "But that usually will create another problem."
All problems in computer science can be solved by adding another level of indirection... except the problem of too many levels of indirection.

Not really coined as a law, but I think this quote from Eric Evans Domain-driven design is an important aspect of Brooks "no silver bullets" law:

"One way or another, creating distinctive software comes back to a stable team accumulating specialized knowledge and crunching it into a rich model. No shortcuts. No magic bullets."

Anders Lindahl
+1  A: 

Postel's Law, or the robustness principle:

Be liberal in what you accept, and conservative in what you send.

Not as humorous as many of the other ones mentioned, but quite insightful. It was, aptly, quoted in the computer networking textbook we used at uni. Apparently this was originally stated in RFC 791, "Internet Protocol," by Jon Postel, September 1981.

Differently worded variants abound (see e.g. RFC-793 and RFC-1122); a common one is: "Be conservative in what you do; be liberal in what you accept from others."

That is one of the reasons why HTML is such a disaster... Joel has an essay about this principle.
+2  A: 

Hanlon's razor:

"Never attribute to malice that which can be adequately explained by stupidity."

Or alternatively:

"Do not invoke conspiracy as explanation when ignorance and incompetence will suffice, as conspiracy implies intelligence."


Here's another one most programmers forget:

Kerckhoffs' principle of secure cryptography: A cryptosystem should be secure even if everything about the system, except the key, is public knowledge.


Pareto Law

I've read that the ninety-ninety rule is a plagiarism of the Pareto principle.


The Dilbert and Peter Principles are superseded (or at least contested) by the Gervais principle:

Sociopaths, in their own best interests, knowingly promote over-performing losers into middle-management, groom under-performing losers into sociopaths, and leave the average bare-minimum-effort losers to fend for themselves.

See The Gervais Principle, Or The Office According to “The Office”

Christoffer Soop