Silicon Insider: Don't Forget the Microchip

An unexpected announcement by Apple Inc. on Wednesday serves as yet one more reminder that in the digital age, it all comes down to chips.

Other than the occasional earnings announcement from Intel, you don't hear much about the semiconductor industry these days. Thirty years ago, when I first started covering the electronics industry, the news was only about chips and the extraordinary companies that made them. After all, that's how Silicon Valley got its name.


But the way to be successful as an entrepreneur with a new company is to always move toward the next higher level of integration -- that is, to the secondary market that has just been made possible by the efforts of the previous generation of new companies. That's where the profits are -- and, just as important, where the competition isn't.

And thus, in the last three decades, we have seen the electronics industry build one new industry shell around another, chips to PCs to operating systems to applications software to browsers to Web 1.0 (e-commerce) to search engines to Web 2.0 (social networks), with a lot of other stuff in-between, all in a perpetual race to capture the next higher ground.

All of this is predictable and welcome; by creating a perpetually retreating frontier for entrepreneurs to chase, the electronics industry has rewarded us consumers with an unprecedented explosion of new products and services, as well as produced and distributed an immense amount of new wealth around the world.

But something has been lost in this process as well: a sense of perspective. We are now so many layers removed from silicon that we tend to discount its importance. And just as we are right now discovering -- thanks to a global shortage of rice and other basic foods, as well as skyrocketing oil prices -- how much the world's economic health still depends on things as prosaic as farming and fuel, so too may we one day awaken to discover that the unexpected end of Moore's Law has cast us adrift in a future where innovation and progress become once again as rare as they were three centuries ago.

I always count myself lucky to have begun my career covering, and learning to understand, the dynamics of the semiconductor business, and to have known all of the industry pioneers: extraordinary figures such as Bob Noyce, Gordon Moore and Andy Grove of Intel, Jerry Sanders of AMD and Charlie Sporck of National Semiconductor. They were remarkable individuals, in some ways more like the oil wildcatters of old than the hip Web 2.0 heroes of today, and whether most people know it or not, their presence is still felt in every high-tech community around the world.

I learned two important lessons in those early days. The first is that, once you drill down past today's hot new widgets, past all of the code and hardware design, at the very heart of the electronics industry there is nothing more than a chemical business that runs an assembly line out of a factory. Now that factory, the semiconductor wafer fabrication plant, may be the most sophisticated production facility the world has ever seen; and the assembly line may be using lasers to cut grooves two atoms wide on the surface of a sheet of silicon the size of a baby's fingernail … but it is still, in the end, a chemical factory.

The second lesson I learned is that everything in technology moves to the pace set by the semiconductor industry. No matter how far detached your business is from actual chips -- say, you're Max Levchin, designing tools to measure the usage of certain software widgets by teenagers on Facebook -- your long-term fate is inextricably bound with that of the semiconductor industry. If Intel and Samsung miss a beat in delivering their next generation of microprocessors, then no matter how far down the digital food chain you are, you will eventually feel the effects, just as surely as Pluto feels changes on the surface of the sun.

Unfortunately, most contemporary tech reporters know very little about the semiconductor industry, and don't care to learn much more. It is too old, too big and too hermetic in their eyes. It isn't sexy, and so they don't see much need to cover it. That's a pity, because I've always found that the single best indicator of whether the tech world is heading into a boom or a bust is to watch, still after all of these years, the book-to-bill ratio of new orders in the chip business. Add to that an understanding of the arrival date and the likely characteristics of the next generation of microprocessors, and you can pretty accurately predict not only the timing of the boom-bust cycles in electronics, but even their amplitude.
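The signal described above is simple arithmetic: divide the dollar value of new orders booked by the dollar value of orders billed (shipped). A ratio above 1.0 means demand is running ahead of supply -- a boom signal; below 1.0, orders are drying up. A minimal sketch, using hypothetical monthly figures rather than real industry data:

```python
# Illustrative sketch of the book-to-bill indicator the column describes.
# A ratio above 1.0 means chip makers are booking more new orders than
# they are billing (shipping): demand is outrunning supply.
# The monthly figures below are hypothetical, not actual industry data.

def book_to_bill(bookings: float, billings: float) -> float:
    """Ratio of new orders booked to orders billed (shipped)."""
    return bookings / billings

# (month, bookings in $M, billings in $M) -- made-up numbers
months = [
    ("Jan", 1_450.0, 1_300.0),
    ("Feb", 1_380.0, 1_350.0),
    ("Mar", 1_200.0, 1_400.0),
]

for month, bookings, billings in months:
    ratio = book_to_bill(bookings, billings)
    trend = "expansion" if ratio > 1.0 else "contraction"
    print(f"{month}: {ratio:.2f} -> {trend}")
```

In this made-up series the ratio slides from above 1.0 in January to well below it by March -- exactly the kind of early warning the column says the chip business provides.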

That very few mainstream tech reporters even cover the semiconductor industry anymore goes a long way toward explaining why the media are so often surprised by sudden, seemingly inexplicable shifts in the industries they cover. And if they can't see what's coming, they certainly can't alert their audiences. That explains a lot of the volatility in the tech world: It's not just the high company fatality rates caused by rampant entrepreneurship, it's also a singular lack of understanding by everyone involved of the underlying forces driving the entire process. Had they been watching the trajectory of processing equipment orders the chip companies were placing with semiconductor equipment makers such as Applied Materials (the ultimate coal-mine canaries), they might have seen these "unexpected" shifts coming from a long way off.

This lack of interest and understanding explains why several news announcements of the last few days failed to make much of a blip in the press.

The first one, which I referenced at the beginning of this column, was the announcement Wednesday by Apple that it would pay $278 million for low-power chip design company P.A. Semi. If it seems surprising that Apple, in everyone's eyes nowadays the ultimate consumer electronics/digital content company, would devote that kind of money to a chip company acquisition, consider the report that it was Steve Jobs himself who led the negotiations.

Like me, Steve is a Valley boy who grew up in the world of chips. People forget that though he may never have been a true computer nut as a kid, he did spend a lot of his early years working at a chip and components retailer. And so, even if the kids who work for him don't understand the fundamental importance of integrated circuits and microprocessors, Jobs most certainly does. And if you really want to understand where Apple and the consumer electronics industry are going in the next five years, then you should try to answer the question: Why would Jobs and Apple jump from the IBM "Power" chip architecture to the Intel x86 to ARM, and now back to the P.A. Semi "Power" architecture, in less than 10 years? Is, as CNet speculated, Apple going into game consoles? Servers? Has the company spotted some fundamental weakness in Intel's Atom, or in the Samsung ARM chip it uses in the iPhone? Or is Steve going back to his old proprietary, closed-system ways? Answer those questions and you may know the future of consumer electronics.

Oh, and the other news announcement? This one came out of the University of Manchester in the United Kingdom. There, scientists working with an amazing material derived from graphite, called graphene, have not only created the world's thinnest film, but have also used X-rays to carve into it the features of a transistor that is only one atom thick and 10 atoms wide. Better yet, it appears that unlike silicon transistors, which tend to suffer weird quantum effects as they shrink, these new graphene transistors actually seem to perform better the smaller they get.

In a world that understands technology as well as it should (or thinks it does), this breakthrough -- and the amazing number of advances just like it that have occurred at the atomic level in chips in the past few years -- would be front-page news. If this new chip fabrication technology can indeed be perfected and scaled into large-volume production, Moore's Law has just been given at least another 20 years. That means another human generation of continuous new innovation, life-changing products and ever-greater global average wealth.
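It's worth pausing on what "another 20 years" of Moore's Law actually means. Assuming the traditional cadence of transistor density doubling roughly every two years (the historical interval has wobbled between 18 and 24 months), the arithmetic is stark:

```python
# Rough arithmetic behind "another 20 years" of Moore's Law.
# Assumption: density doubles every two years (the cadence has
# historically varied between roughly 18 and 24 months).
years = 20
doubling_period = 2  # years per doubling (assumed)

doublings = years // doubling_period       # 10 doublings
density_gain = 2 ** doublings              # 2^10

print(f"{doublings} doublings -> {density_gain}x transistor density")
```

Ten more doublings is a thousandfold gain in transistor density -- which is why extending Moore's Law by two decades is front-page material, not wire-service filler.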

That's news that should be on the front page, with millions taking to the streets to cheer … not buried in an unread wire service story.

But then, who cares about chips anymore?

Tad's Tab: This week's pick is a strange site. The entire site is a Flash video of a woman holding a clock and singing in gibberish (supposedly Finnish). The hands of the clock are leeks and move about randomly while more leeks parade along the bottom of the screen. Although odd, the video is also strangely addictive.

This is the opinion of the columnist, and in no way reflects the opinion of ABC News.

Michael S. Malone is one of the nation's best-known technology writers. He has covered Silicon Valley and high-tech for more than 25 years, beginning with the San Jose Mercury News, as the nation's first daily high-tech reporter. His articles and editorials have appeared in such publications as The Wall Street Journal, the Economist and Fortune, and for two years he was a columnist for The New York Times. He was editor of Forbes ASAP, the world's largest-circulation business-tech magazine, at the height of the dot-com boom. Malone is the author or co-author of a dozen books, notably the best-selling "Virtual Corporation." Malone has also hosted three public television interview series, and most recently co-produced the celebrated PBS miniseries on social entrepreneurs, "The New Heroes." He has been the "Silicon Insider" columnist since 2000.