An unexpected announcement by Apple Inc. on Wednesday serves as yet another reminder that in the digital age, it all comes down to chips.
Other than the occasional earnings announcement from Intel, you don't hear much about the semiconductor industry these days. Thirty years ago, when I started covering the electronics industry, the news was all about chips and the extraordinary companies that made them. After all, that's how Silicon Valley got its name.
But the way to succeed as an entrepreneur with a new company is to always move toward the next higher level of integration -- that is, to the secondary market that has just been made possible by the efforts of the previous generation of new companies. That's where the profits are -- and, just as important, where the competition isn't.
And thus, in the last three decades, we have seen the electronics industry build one new industry shell around another -- chips to PCs to operating systems to applications software to browsers to Web 1.0 (e-commerce) to search engines to Web 2.0 (social networks), with a lot of other stuff in between -- all in a perpetual race to capture the next higher ground.
All of this is predictable and welcome; by creating a perpetually retreating frontier for entrepreneurs to chase, the electronics industry has rewarded us consumers with an unprecedented explosion of new products and services, and has produced and distributed an immense amount of new wealth around the world.
But something has been lost in this process as well: a sense of perspective. We are now so many layers removed from silicon that we tend to discount its importance. And just as we are right now discovering -- thanks to a global shortage of rice and other basic foods, as well as skyrocketing oil prices -- how much the world's economic health still depends on things as prosaic as farming and fuel, so too may we one day awaken to discover that the unexpected end of Moore's Law has cast us adrift in a future where innovation and progress become once again as rare as they were three centuries ago.
I always count myself lucky to have begun my career covering, and learning to understand, the dynamics of the semiconductor business, and to have known all of the industry pioneers: extraordinary figures such as Bob Noyce, Gordon Moore and Andy Grove of Intel, Jerry Sanders of AMD and Charlie Sporck of National Semiconductor. They were remarkable individuals, in some ways more like the oil wildcatters of old than the hip Web 2.0 heroes of today, and whether most people know it or not, their presence is still felt in every high-tech community around the world.
I learned two important lessons in those early days. The first is that, once you drill down past today's hot new widgets, past all of the code and hardware design, at the very heart of the electronics industry there is nothing more than a chemical business running an assembly line out of a factory. Now that factory, the semiconductor wafer fabrication plant, may be the most sophisticated production facility the world has ever seen; and the assembly line may be using lasers to cut grooves a few atoms wide on the surface of a sheet of silicon the size of a baby's fingernail … but it is still, in the end, a chemical factory.