At various times over the decade I've been writing this column (and longer!), I've proposed -- based upon having worked in, or reported upon, the high-tech world for more than 30 years -- one new law or another to describe the forces at work in the modern world.
It's a natural human tendency to come up with some overarching rule that takes a bunch of largely inchoate experiences and gives them a simple coherence … whether it's entirely true or not. We are wired to like the surprise of seeing disorder suddenly have meaning, especially when the spin it gives is slightly sardonic. It enables us to say, when something goes wrong, "Well, everybody knows that blah-blah-blah."
The classic example of this is Murphy's Law. It's not really a law, of course, but more like a talisman to protect us against bad luck. "If anything can go wrong, it will" is just accurate enough as a description of daily life -- making us chuckle ruefully even as we wince -- that we keep it in our back pocket to pull out as needed to make the world a little less painful.
Murphy's Law has, of course, provoked hundreds of imitators, from the Jelly-Side Down rule to arcane descriptions of recurring disasters in the far reaches of scientific research.
The technology world seems to provoke a lot of Murphy-like laws. One possible explanation is that the world of electronic engineers and code writers is so precise and so empirical, the belief so strong that you can precisely quantify "luck" and failure rates, that when things do go haywire, when the program crashes or the device self-immolates, the only answer is to shrug and blame yet another pseudo-scientific law. It beats admitting that the world is irrational, chaotic and awash with plain, old bad luck.
Of course, there are some real laws in high tech, from the chilly perfection of Maxwell's equations down to the odd world of quantum mechanics. And, of course, there are those two celebrated "laws" to describe the behavior of high tech as it interacts with the world of human beings.
The first, and most famous, of these is Moore's Law of semiconductors. As I've noted many times before, Moore's Law is the single most important predictive tool in the modern world. The pace it sets -- the doubling of performance at the chip level every 24 months -- defines the world we live in better than any demographic or other sociological measure. Trillions of dollars have been made betting on Moore's Law (including billions by Gordon Moore himself at Intel Corporation), while no one has ever won betting against it. Meanwhile, all of us are likely to live under the regime of Moore's Law for the rest of our lives.
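That pace is easy to underestimate, because the doublings compound. A quick sketch (the function name and parameters here are mine, purely for illustration) shows what a 24-month doubling cadence adds up to:

```python
# Back-of-the-envelope arithmetic for Moore's Law's pace:
# performance doubles every 24 months, so improvement compounds exponentially.

def moore_factor(years: float, doubling_months: float = 24.0) -> float:
    """Cumulative improvement factor after `years` at the given doubling pace."""
    return 2 ** (years * 12 / doubling_months)

# Two years is one doubling; two decades is ten doublings --
# roughly a thousandfold improvement.
print(moore_factor(2))   # one doubling
print(moore_factor(20))  # ten doublings: 1024x
```

Ten doublings in twenty years is why a measure that sounds incremental ends up reshaping entire industries.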
Interestingly, Moore's Law, as Moore himself has often reminded us, is not really a law at all, but an implied contract within the semiconductor (and now the entire electronics) industry to keep pushing technology forward at that breakneck pace forever -- or until it crashes into the limitations of physics. And so far, so good.
The other increasingly important law is Metcalfe's Law of networks, which states that the value of any network increases by some large factor with the addition of each new node. Interestingly, it is a real law, and the explosive growth of the Internet is its proof. The problem is that nobody seems to agree on what that "factor" is. Bob Metcalfe's own formulation held that a network's value grows as the square of its number of nodes. Others who have followed have argued that the growth, while still dramatic, might not be that fast.
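The disagreement is easiest to see with numbers. Below is a minimal sketch comparing Metcalfe's quadratic formulation (value proportional to the number of possible pairwise connections) against a more conservative n log n model of the kind later critics have proposed; the function names are mine, for illustration only:

```python
import math

def metcalfe_value(n: int) -> float:
    """Metcalfe's formulation: value grows as n^2, here counted as
    the number of distinct pairwise connections among n nodes."""
    return n * (n - 1) / 2

def conservative_value(n: int) -> float:
    """A slower-growing alternative: value proportional to n * log(n)."""
    return n * math.log(n) if n > 1 else 0.0

# The gap between the two models widens rapidly as the network grows.
for n in (10, 100, 1000):
    print(n, metcalfe_value(n), round(conservative_value(n)))
```

At ten nodes the two estimates are close; at a thousand nodes the quadratic model values the network dozens of times more highly -- which is exactly why pinning down the "factor" matters so much to anyone trying to price a network.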