The world of high technology moves so fast and operates in so many diverse industries that it's easy to miss the forest for the trees.
I mean: What is the relationship between, say, Hewlett-Packard and Twitter, Facebook and Intel, or Google and today's newest iPhone apps? How do all of these diverse enterprises fit together in the ecology of the electronics industry? Who matters, who doesn't, who is winning and who is losing? Which represent the future, and which are the lumbering dinosaurs of the past?
You aren't going to find answers to those questions in electronics trade magazines -- they're too busy trying to cover all of the changes taking place in their own niche markets. And though you can sometimes get clues from the mainstream business press, it too often focuses on individuals and current corporate success stories -- and fails to look at the larger trends at work.
So, for once at least, let's turn the process upside down and use this column to look at the big picture in the technology world and then drill our way down. With luck, by the time we're done, we'll not only have a better understanding of where the electronics industry is now, but where it is going next and who the big winners are likely to be.
Okay, let's take the longest perspective of all: We are currently in the sixth decade or so of the Digital Age, which began in the 1950s with the general application of Boolean algebra (the representation of logical statements as 1's and 0's) to mainframe computers. The Digital Age supplanted the Analog Age (continuous measurement using real numbers, like those on a speedometer or pressure gauge), which had essentially been in effect since pre-history.
Digital computation has proven to be so powerful (it works well with electrical switches because it translates to 'on' and 'off' states) that it has increasingly supplanted most other kinds of measurement. Thus, the Digital Age could last centuries into the future. The only potential replacements may be a return to some kind of analog computation derived from biological systems, or some form of phase shifting at the quantum level, but the likelihood of either is still pretty low.
At the next lower level of focus, we are currently in the second phase of the Internet Era, the so-called Web 2.0. The Internet Era is the latest of about six such eras in the Digital Age: instruments, computers, semiconductors, personal computers, systems and software, and personal electronics and the Internet -- each building upon and incorporating the advances of the eras before it.
Thus, low-cost microprocessors (themselves based on the architecture of mainframe and minicomputers) and memory chips made possible the construction of small, low-cost personal computers like the Apple II and games like Atari's Pong -- which in turn created a vast demand for new operating systems (like Windows), networks, games and applications (like word processing).
Though the Internet has been around since the annus mirabilis of 1969, the Internet era really didn't begin until the widespread adoption of the World Wide Web in the early 1990s.
The first phase of the Internet era focused on tools (e-mail, etc.) and electronic commerce, and culminated in the dot-com bubble.