(AKA How To Sound Smart At Your Next Team Meeting)
Occam’s Razor
This widely known adage dates to William of Ockham, a fourteenth-century philosopher and friar. Occam’s Razor is often stated as:
“Among competing hypotheses, the one with the fewest assumptions should be selected.”
It’s no surprise that we can still recall an adage from 600+ years ago: it works that well. Occam’s Razor is so basic, so fundamental, that it should be the first thing we think of when deciding between two competing theories. I’d even go so far as to argue that in the vast majority of cases, simpler is better.
Hanlon’s Razor
Sometimes I feel like users are intentionally trying to piss me off. They push buttons they weren’t supposed to, find flaws that shouldn’t have been visible to them (since they weren’t visible to me), and generally make big swaths of my life more difficult than they would otherwise be.
I try to remember, though, that the vast majority of actions done by people which may seem malicious are not intentionally so. Rather, it’s because they don’t know any better. This is the crux of an adage known as Hanlon’s Razor, which states:
“Never attribute to malice what can be adequately explained by stupidity.”
Don’t assume people are malicious; assume they are ignorant, and then help them overcome that ignorance. Most people want to learn, not be mean for the fun of it.
The Pareto Principle
The last Basic Law of Software Development is the Pareto Principle. Romanian-American engineer Joseph M. Juran formulated this adage, which he named after an idea proposed by Italian economist and thinker Vilfredo Pareto. The Pareto Principle is usually worded as:
“80% of the effects stem from 20% of the causes.”
Have you ever been in a situation where your app has hundreds of errors, but when you track down just one of the problems, a disproportionate number of those errors just up and vanish? If you have (and you probably have), then you’ve experienced the Pareto Principle in action. Many of the problems we see, whether coding, dealing with customers, or just living our lives, share a small set of common root issues that, if solved or alleviated, can cause most or all of the problems we see to disappear.
In short, the fastest way to solve many problems at once is to find and fix their common root cause.
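To make the 80/20 split concrete, here’s a minimal sketch with an invented error log (the cause names and counts are hypothetical, not from any real system) showing how a couple of root causes can dominate:

```python
from collections import Counter

# Hypothetical error log: each entry is tagged with its root cause.
errors = (["null-config"] * 45 + ["bad-encoding"] * 35 +
          ["race-condition"] * 10 + ["disk-full"] * 6 + ["oom"] * 4)

counts = Counter(errors)
total = len(errors)

# Fixing just the two most common root causes wipes out 80% of the errors.
top_two = sum(n for _, n in counts.most_common(2))
print(f"{top_two}/{total} errors ({top_two / total:.0%}) share only 2 root causes")
# → 80/100 errors (80%) share only 2 root causes
```

The takeaway: before fixing errors one by one, tally them by root cause and attack the top of the list first.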
The Dunning-Kruger Effect
In 1999, researchers David Dunning and Justin Kruger observed a phenomenon that has come to be known as the Dunning-Kruger effect:
“Unskilled persons tend to mistakenly assess their own abilities as being much more competent than they actually are.”
What follows from this is a bias in which people who aren’t very good at their job believe they are, but aren’t skilled enough to recognize their own incompetence. Of all the laws in this list, the Dunning-Kruger effect may be the most powerful, if for no other reason than that it has been actively investigated in a formal setting by a real-life research team.
Linus’s Law
Author and developer Eric S. Raymond formulated this law, naming it after Linus Torvalds. Linus’s Law states:
“Given enough eyeballs, all bugs are shallow.”
In other words, if you can’t find the problem, get someone else to help. This is why concepts like pair programming work well in certain contexts; after all, more often than not, the bug is in your code.
Robustness Principle (AKA Postel’s Law)
One of the fundamental ideas in software development, particularly fields such as API design, can be concisely expressed by the Robustness Principle:
“Be conservative in what you do, be liberal in what you accept from others.”
This principle is also called Postel’s Law, after Jon Postel, the Internet pioneer who originally wrote it down as part of RFC 760. It’s worth remembering, if for no other reason than as a gentle reminder that often the best code is no code at all.
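In code, the principle often shows up at system boundaries. Here’s a small sketch (the function names and accepted spellings are my own illustration, not from any particular API) of a flag parser that is liberal about the boolean spellings it accepts but conservative about the single canonical form it emits:

```python
def parse_flag(value):
    """Liberal input: accept common spellings of booleans from other systems."""
    normalized = str(value).strip().lower()
    if normalized in {"1", "true", "yes", "y", "on"}:
        return True
    if normalized in {"0", "false", "no", "n", "off"}:
        return False
    raise ValueError(f"unrecognized flag: {value!r}")

def emit_flag(flag):
    """Conservative output: always send exactly one canonical form."""
    return "true" if flag else "false"

print(parse_flag(" YES "))          # → True
print(emit_flag(parse_flag("0")))   # → false
```

Accepting messy input keeps you interoperable; emitting only one form keeps everyone downstream sane.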
Eagleson’s Law
Ever been away from a project for a long time, then returned to it and wondered “what idiot wrote this crap?”, only to find out that the idiot was you?
Eagleson’s Law describes this situation quite accurately:
“Any code of your own that you haven’t looked at for six or more months might as well have been written by someone else.”
Remember that the next time you’re rejoining a project you’ve been away from for months. The code is no longer your code; it is someone else’s that you’ve now been tasked with improving.
The Peter Principle
One of the fundamental laws that applies to managers (in any field, not just software) is the Peter Principle, formulated by Canadian educator Laurence J. Peter:
“The selection of a candidate for a position is based on the candidate’s performance in their current role, rather than on abilities relevant to the intended role.”
The Peter Principle is often sarcastically reduced to “Managers rise to their level of incompetence.”
The problem revealed by the Peter Principle is that workers tend to get evaluated on how well they are currently doing, and their superiors assume that those workers would also be good at a different role, even though their current role and their intended role may not be the same or even similar. Eventually, such promotions place unqualified candidates in high positions of power, and in particularly bad cases you can end up with pointy-haired bosses at every step of an organization’s hierarchy.
The Dilbert Principle
Speaking of pointy-haired bosses, cartoonist Scott Adams (who publishes the comic strip Dilbert) proposed a negative variation of the Peter Principle, which he named the Dilbert Principle. The Peter Principle assumes that promoted workers are in fact competent in their current position; that is why they got promoted in the first place. By contrast, the Dilbert Principle assumes that the least competent people get promoted the fastest. It is usually stated like this:
“Incompetent workers will be promoted above competent workers to managerial positions, thus removing them from the actual work and minimizing the damage they can do.”
This can be phrased another way: “Companies are hesitant to fire people but also want to not let them hurt their business, so companies promote incompetent workers into the place where they can do the least harm: management.”
Hofstadter’s Law
Ever noticed that doing something always takes longer than you think? So did Douglas Hofstadter, who wrote a seminal book on cognitive science and self-reference called Gödel, Escher, Bach: An Eternal Golden Braid. In that book, he proposed Hofstadter’s Law:
“It always takes longer than you expect, even when you take into account Hofstadter’s Law.”
Always is the key word: nothing ever goes as planned, so you’re better off putting extra time into your estimates to cover something that will go wrong, because it unfailingly does.
The 90-90 Rule
Because something always goes wrong, and because people are notoriously bad at estimating their own skill level, Tom Cargill, an engineer at Bell Labs in the 1980s, proposed something that eventually came to be called the 90-90 rule:
“The first 90 percent of the code accounts for the first 90 percent of the development time. The remaining 10 percent of the code accounts for the other 90 percent of the development time.”
Perhaps this explains why so many software projects end up over budget and short on features.
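The arithmetic behind that joke is worth spelling out. With toy numbers of my own (not from the rule itself), the two “90 percents” stack up like this:

```python
# The 90-90 rule in schedule terms (hypothetical 100-day project):
planned_days = 100
first_90_percent = 0.9 * planned_days  # 90 days for "most" of the code
last_10_percent = 0.9 * planned_days   # another 90 days for the remaining 10%
actual = first_90_percent + last_10_percent

print(f"planned {planned_days} days, actual {actual:.0f} days "
      f"({actual / planned_days:.0%} of budget)")
# → planned 100 days, actual 180 days (180% of budget)
```

Two 90-percent phases add up to 180% of the original schedule, which is exactly the kind of overrun the rule is winking at.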
Parkinson’s Law
Possibly the most astute observation that can be applied to the art of estimation comes from British naval historian C. N. Parkinson. He jokingly proposed an adage, now called Parkinson’s Law, which was originally stated as:
“Work expands so as to fill the time available for its completion.”
Remember this next time you pad your estimates.
Sayre’s Law
Economist and professor Charles Issawi proposed an idea that came to be known as Sayre’s Law, named after Wallace Sayre, a fellow professor at Columbia University. Issawi’s formulation of this law reads:
“In any dispute the intensity of feeling is inversely proportional to the value of the issues at stake.”
In short, the less significant something is, the more passionately people will argue about it.
Parkinson’s Law of Triviality (AKA Bikeshedding)
Sayre’s Law segues directly into another law that applies to meetings, and here we again encounter the ideas of C.N. Parkinson. Parkinson’s Law of Triviality states:
“The time spent on any agenda item will be in inverse proportion to the sum of money involved.”
Parkinson imagined a situation in which a committee of people were tasked with designing a nuclear reactor. Said committee then spends a disproportionate amount of time designing the reactor’s bikeshed, since any common person will have enough life experience to understand what a bikeshed should look like. Clearly the “core” functions of the reactor are more important, but they are so complex that no average person will understand all of them intimately. Consequently, time (and opinions) are spent on ideas that everyone can comprehend, but which are clearly more trivial.
Law of Argumentative Comprehension
The last law is one I totally made up; I use it as shorthand for both Sayre’s Law and Parkinson’s Law of Triviality. I call it the Law of Argumentative Comprehension:
“The more people understand something, the more willing they are to argue about it, and the more vigorously they will do so.”
You’ll notice that many of the laws above don’t apply specifically to software, and this is intentional. Software is built for people to use and interact with, so many of these laws relate to dealing with people rather than code.
No pithy quote will ever replace the experience you gain every day by writing code, interacting with users, and generally getting better every day. Still, by keeping in mind these 15 laws of software development, you might just make yourself a better developer. Or at least a more knowledgeable one, and really, aren’t those the same thing?
Did I miss any laws that you consider fundamental to the process of creating software, or any of the activity that goes on around said process (e.g. estimations, meetings, etc.)? Share in the comments!