Oct. 6, 2005 -- Funny what a couple of years can do in high-tech. Soon after I started this column, I announced that the Age of the Personal Computer was coming to an end … and all I got were shouts of angry disagreement: How could I suggest such a thing? The personal computer is the centerpiece of the digital world, a basic tool in our daily lives -- how could it possibly go away?
That was then. Today I'll say it again: the PC is dead. And given the news of the last few months, and given the experiences in your own life, does anybody care to disagree with me now?
Twenty years hence, will anyone really care about the PC? Or, like another techno-obsession in our lifetimes, the pocket calculator, will it so recede into history that we'll forget we were ever really interested?
It's a timely topic, at least for me, because about the time you read this, I'll be sitting in a conference room at the local chamber of commerce discussing what Silicon Valley relics to put into a new museum.
This is a fairly odd task, but not particularly rare -- most towns of any size have some sort of museum dedicated to showing old school sweaters, and sepia-toned Kodaks of sturdy firemen and Fourth of July picnics and the time the locomotive derailed. Typically, a handful of aging locals (who, with little evidence, are assumed to have special knowledge on the subject) are asked to pick what historic items to collect and display.
What makes my task a little more challenging is that this is Sunnyvale, the very epicenter of Silicon Valley and the technology revolution. Throw in our adjoining municipal neighbors to the north and south, and suddenly the history of this place includes not just Ohlone Indians and prune orchards, but the integrated circuit, the microprocessor, the personal computer and the Internet; not to mention Hewlett-Packard, Fairchild, Intel, Apple, Sun, Netscape, Yahoo! and Google.
My task is to figure out which of these objects should be put on display, and which of these companies should be celebrated, in the room dedicated to the Silicon Valley interval in Sunnyvale's history.
There is no little irony here. For one thing, the very idea for a new Sunnyvale History Museum came from my late father almost 30 years ago. He had become head of the Sunnyvale Historical Society because he was steamed that the City Fathers had, a few years before, conspired to demolish the town's founding homestead, the Martin Murphy House.
My old man's notion, which has miraculously now been realized, was that the Murphy House should be replicated, with a modern infrastructure, as a museum to the city's past -- a repository, at least in part, for all those old Murphy relics gathering dust in a city storeroom. Roll forward to the 21st century, and the museum project, with several million bucks in the bank, is now only about a hundred grand from being ready to break ground.
And that's the second irony. In those 30 years, a lot of history has happened, arguably a lot more than in the centuries that preceded it. Sure, the Murphy-Townsend Party is one of the great unheralded stories in American history (they got over the Sierras without eating each other, took part in the Bear Flag Revolt that made California a future state, and then went back and helped rescue the Donner Party), but how does that compare with being Ground Zero of one of the biggest technological and cultural transformations in human history? Thus, outside of a few old-timers and a bunch of fourth-graders who are required by state law to study local history, there's probably not a whole lot of interest in seeing the Murphys' living room couch.
On the other hand, in a town where grandpa worked on multimeters for Bill and Dave, and mom and dad put in years at Intel on the Pentium project, and junior is currently a sponsored Counterstrike champion, there will no doubt be a whole lot of interest in the Silicon Valley room at the new museum.
That is, if I can get the exhibit right. And that won't be as easy as it sounds. Technology evolves so quickly that today's cutting-edge product is tomorrow's curbside garbage-pickup special. To see how quickly new tech can become old news, I merely need to visit the Computer History Museum a few miles away and look at the aisles of old bookcase-sized Burroughs and Univac mainframes, of interest only to graying geeks. Even lonelier is the Perham Collection of early radios and vacuum tubes, now sitting in warehouses in San Jose.
Five years ago, when the possibility of the Sunnyvale History Museum first became real, I knew exactly what the exhibit should focus upon: the personal computer as the culmination of the electronics revolution -- from the triode to the transistor to the IC to the microprocessor to the Mac, the long digital march that put a computer on Everyman's desk.
But now, doesn't the very idea of such an exhibit seem, well, anachronistic? Nobody cares about personal computers any more, except as those annoying boxes that waste too much desk space, but are still useful for accessing the Internet. They are like toasters -- appliances that remain invisible except when they burn the bread.
Even the people who make personal computers don't seem to like them anymore. IBM got out of the business. HP should do the same thing. And don't you think Steve Jobs would dump the Mac in a Cupertino Minute and focus exclusively on the iPod (and iVideo) if he could only figure out a way to deal with all of those loyal Mac owners? After all, he showed that he understood the fate of the personal computer before anyone else by turning it into a fashion object.
In fact, outside of a bunch of Asian cut-raters, the only guy who seems interested in building PCs these days is Michael Dell … and lately he's been pushing televisions.
The more you look, the more evidence you see that the PC is already at death's door. Intel, which just a year ago announced that PCs are still the future, is now backing away from expensive, high-powered chips and building cheaper multi-processors to stick into consumer electronics -- and moving employees across the Pacific to get closer to the game box makers. And have you read any Walter Mossberg computer reviews lately?
Meanwhile, just this week, Google and Sun Microsystems announced that they were teaming up to attack Microsoft Office by offering online word processing and other applications -- the latest installment in Google's product-per-week plan for the rest of 2005; all of it targeted at pulling users away from their dependence upon PCs and moving them into the world of "thin clients" (cell phones, cars, refrigerators, dumb laptops) and Web services. [No wonder Ballmer is acting crazier than usual these days -- he's the last guy left lobbying for the PC.] And let's not forget my favorite reverse indicator: a few days ago MIT's Media Lab announced the creation of the $100 laptop … and when has the Media Lab ever gotten the future right?
No, my sense is that the current generation of personal computers will be the last, or the next to the last, that we will ever own. It won't be a noisy or painful divorce. Rather, it will be more like the way we started using our cell phones in lieu of the home phone, even when we were at home. Or the day the VCR broke and we decided there was no real need to replace it, now that we mostly rented DVDs anyway. Leaving our PCs behind will be even easier when everything else either does the same stuff almost as well (Web-enabled refrigerators, cars) or a whole lot faster (cell phones, handheld devices, game players, etc., using Web services).
What that suggests is that, a generation from now, future visitors to the Sunnyvale History Museum may look upon a Sony Vaio with the same glazed eyes as we do some Bakelite-knobbed oscilloscope from 1956.
So, what will these future visitors want to see? Games, I think. The current interest in retro gaming (Super Mario, Pong, Pac Man) suggests that even the most primitive video games may have far more longevity than the most powerful contemporary computers. And that in turn suggests that the room I design may have to lean more heavily on the stories of Atari and Activision (to name two other Sunnyvale firms).
Still, we should have at least one personal computer. Maybe I can talk Wozniak into donating an Apple I motherboard. We'll caption it: "Ancestor of the Blackberry."
This work is the opinion of the columnist and in no way reflects the opinion of ABC News.
Michael S. Malone, once called "the Boswell of Silicon Valley" most recently was editor at large of Forbes ASAP magazine. He has covered Silicon Valley and high-tech for more than 20 years, beginning with the San Jose Mercury-News as the nation's first daily high-tech reporter. His articles and editorials have appeared in such publications as The Wall Street Journal, The Economist and Fortune, and for two years he was a columnist for The New York Times. He has hosted two national PBS shows: "Malone," a half-hour interview program that ran for nine years, and in 2001, a 16-part interview series called "Betting It All: The Entrepreneurs." Malone is best known as the author of a dozen books. His latest book, a collection of his best newspaper and magazine writings, is called "The Valley of Heart's Delight."