Silicon Insider: Tech Revolutions at Hand

Feb. 13, 2003 -- Every technology revolution takes twice as long as we expected, and half as long as we are prepared for.

If nobody else has already said that, call it Malone's Third Law. [The First, by the way, from about 1982, is: Whenever a company builds a new corporate headquarters, short the stock. The Second, from the mid-'90s, is: Any true technology revolution has its own underlying Law.]

Everybody who has lived in the industrialized world over the last 50 years understands what the Third Law means. My first encounter with this peculiar property of modern life came in the early 1960s, with color television.

In the dozen years following the late 1940s, television had gone from an intriguing novelty to the centerpiece of most American living rooms. In the process, it had hit American society like a tsunami. By 1960, we were still reeling from the cultural implications of Davy Crockett, the Today Show, the Friday Night Fights, the Quiz Show scandals, American Bandstand, Bishop Sheen, Liberace and Ed Sullivan. And coming over the horizon were the Nixon-Kennedy debates, the Cuban Missile Crisis and the Beatles.

In the early years of TV, pundits predicted that the technology would be a slowly accepted, and ultimately elevating, force in modern life. By the first episodes of I Love Lucy and Howdy Doody, that fantasy had long been abandoned. Now there was a headlong rush by everybody in America to get that Muntz or Philco box set up in the living room and an aerial on the roof and tune in to Dinah Shore.

So, what had been predicted to take a generation instead took only a decade. But what had also been predicted to be easy, the implantation of television programming into daily life, proved to be unbelievably complex, difficult, transformative and enduring. In some ways we are still trying to recover from what happened to us.

Ringing in the PC Age

Now, with the new decade, there was a whole new television technology appearing down at the local appliance store: color. I remember vividly the conversations of my parents and their suburban neighbors at cocktail parties and barbecues. The refrain was always the same: color TV sounds interesting, and it's probably beautiful to look at — but who needs it? Not at that price. We'll stick to black and white, thank you very much.

Then, amazingly, Disneyland, that mainstay of my Boomer childhood, became The Wonderful World of Color. I remember seeing it for the first time, in that wonderful saturated color of those early TVs, at my uncle's house in Enid, Okla. I stood transfixed. More importantly, so did my old man.

Three years later, we owned a color TV. So did everybody else. The first important program we watched was, ironically, one of the most monochromatic events of the century: John F. Kennedy's funeral.

From then on, the Sixties only grew more colorful: from A Hard Day's Night to Help to Magical Mystery Tour. Without color television, would the Sixties have been the Sixties? Would the psychedelic era have existed? Would we have pulled out of Vietnam if the blood hadn't been so red?

By then, I was already living the beginnings of another revolution. I saw the same acoustic-coupled computer terminal at the NASA-Ames Research Center that Steve Wozniak did. And I had the same thought: "Wouldn't it be cool to have this at home?" Luckily, Woz was a genius. Still, this was eight years before the Apple I, almost 10 before I saw my dream incarnate — the Apple II.

I was 12 when I first imagined a personal computer, 22 before I saw one, and 30 when I saw Jobs unveil the Macintosh. By then, the modern personal computer had taken nearly two-thirds of my life to be realized.

Yet, in retrospect, the whole PC revolution seems to have happened incredibly fast. Though we had a half-generation warning, we still weren't prepared for the arrival of the Personal Computer Age. In fact, even though this year's college freshmen were born the year of the Mac introduction, we are still struggling to deal with all of the implications of its arrival.

Finally, there was the Internet. I first watched somebody access the DARPAnet, at Xerox PARC if I remember right, sometime in the early 1980s. Once again, I thought it would be amazing to have such a research tool in my own house. But that was impossible: the Net was the province of giant institutions, and run on giant mainframe computers using dumb terminals. It cost users thousands — millions if you included the hardware — to subscribe. And the access coding was a nightmare. Besides, there didn't seem to be much on the Net that would be useful over time to us everyday folks.

I don't need to tell you what happened a dozen years later. The Internet, now armed with the right infrastructure and search tools, became an overnight sensation. Putting the average person on the Net had taken 20 years. Does anyone doubt that it will take 20 more for us to spin out and cope with all of its implications?

A New Digital Age

So, why this little exercise in nostalgia? Because of two little news items that appeared in the last couple of weeks.

With everything else going on in the world, you may not have noticed them. Yet they are vivid reminders that this law of technology revolutions still holds.

The first of these stories had to do with digital cameras. Remember when consumer digital photography was first announced? I think it was sometime in the late 1980s. Digital photography was going to be the next big thing. Then the first cameras started showing up on store shelves in the mid-1990s, and they were expensive and not very good.

Still, a few people bought them. Then, digital cameras got cheaper and better. And then, sometime in the last three years, we all bought one, though the purchase seemed less adventurous than inevitable, more anecdotal than part of a mass movement.

Now, consider the recent announcement by the Camera and Imaging Products Association: In 2002, the unit sales of digital cameras exceeded those of traditional cameras for the first time.

In other words, while we were busy elsewhere, the age of film photography just ended. Moreover, two traditional consumer boundaries — between data and imagery, and between still and moving pictures — have also just been erased.

We now live in the age of digital photography. Are we prepared for the implications of ubiquitous image gathering? Spy cameras on every corner? A world in which any image can be easily modified — and thus unbelievable? A return to a pre-literate world in which pictures have primacy over words in communication?

Of course we aren't. Digital photography took forever to get here. And now that it has, it seems as if all these new problems have been suddenly sprung upon us.

And that's nothing; consider the second piece of news. This one is out of UCLA: among Internet users, the Internet is now a more important source of information than television, radio, and magazines. For veteran Internet users, you can also add newspapers and books (BOOKS!) to that list.

So even as we were watching the Bubble burst and concluding that the Internet era was over, the Net quietly became the single most important information source in the industrialized world. It is even doing what was once thought impossible: cutting into television viewing time. Ponder that, and what it means to our society in the years to come.

I don't know about you, but even after 20 years' warning, I'm still not ready for what's coming.

Michael S. Malone, once called “the Boswell of Silicon Valley,” most recently was editor-at-large of Forbes ASAP magazine. His work as the nation’s first daily high-tech reporter at the San Jose Mercury-News sparked the writing of his critically acclaimed The Big Score: The Billion Dollar Story of Silicon Valley, which went on to become a public TV series. He has written several other highly praised business books and a novel about Silicon Valley, where he was raised. For more, go to Forbes.com.