We all know the common laws that (seem to) rule computing, like Moore's law. But there are also the funkier ones:
Murphy's law: anything that can go wrong, will.
That's why, if you cover 99% of your code with unit tests, the remaining 1% breaks during the marketing presentation, on the day your mother-in-law comes for dinner while you have a headache. And there is no more toilet paper.
Parkinson's law: a task always takes as much time as you allocate to it.
If you give yourself 3 weeks to code a class, you will successively think:
- Week 1: I've got time...
- Week 2, day 1: what about trying this design pattern?
- Week 2, day 2: this will be easier with the new Eclipse plugin. God, my IDE is slow. Let's tweak that!
- Week 2, day 3: it's slow because of my computer. I'll reformat it. A quick hour and I'll be back on (fast) track!
- Week 2, day 4: this is fast, I should try the new framework. I've got the firepower!
- Week 2, day 5: OK, this framework sucks. I'll recode it.
- Week 3, day 1: hmm, what was I doing again? I'll redesign it; that will help me dive back in.
- ...
- Week 3, day 4: WTF?
- Week 3, day 5: I guess this will do. It has to.
If you give yourself 3 days:
- Day 1: OK, let's rock this!
- Day 2: WTF?
- Day 3: I guess this will do. It has to.
Fraysse's law: the greater the interest in a subject, the faster time goes.
This is why you start checking your email at 8 A.M. and stop at 12 (when you get hungry).
And there are also Solow's paradox, the Peter principle, its Dilbert variation, Pareto's law, Taylor's, Laborit's, Illich's, etc.
Do you know any others?