Silicon Insider: The 'Third' Screen Revolution

The PC era is ending, not with a bang, but a ring tone.

There were two very interesting news items this week — one that received a lot of attention, and one that got very little — and for once, in tech, the weighting was appropriate.

The story you probably did notice was Google's announcement that next week it will release a new software platform, called Android, that will let developers create programs to make mobile phones easier to use and more efficient at working with the Web. Google backed this announcement by saying it will work closely with a major coalition of phone equipment makers known as the Open Handset Alliance.

The initial reaction to Google's announcement was one of disappointment: rumors had been flying around Silicon Valley for months that what was going to be announced was an actual Google Phone. Hungry early adopters drooled at the prospect of an all-out Apple iPhone vs. Google Phone war, one that would leave the traditional phone industry behind and shower us consumers with ever-cooler devices at ever-falling prices. Thus, Google's announcement was something of a letdown.

But once they got past their disappointment at not being able to hold in their hands an actual phone with the Google logo on it, the gurus of Silicon Valley began to realize that Android might be far bigger news in the long run. That's because, in introducing an open platform for wireless, Google had at last put into place the missing piece in the long-awaited breakout of the so-called "Third Screen" revolution.

In a wonderful alignment of the cosmos, the other, mostly overlooked big tech story of the week had to do with the "Second Screen" — personal computers.

Remember personal computers? Remember how they used to be the hottest thing in technology? Remember how we used to excitedly await the newest Macintosh or the latest update of Windows?

I'm only being partly facetious. For most of us, the PC era has been the electronics revolution, everything else playing a supporting role. It was personal computing that brought most of us into the electronics age, put us on the Internet, and was consistently one of our largest capital expenditures. And thanks to the relentless pace of Moore's Law, which had us replacing our machines every few years, these devices have seemed perpetually young and new.

Because of that, it's easy to forget that personal computing is now more than 30 years old. Woz and Jobs were building the Apple I during the Bicentennial year. There are kids now in graduate school who were born the year the Macintosh was introduced. And the original IBM PC was introduced in the first year of the Reagan administration.

In other words, in tech years, this is an ancient paradigm. Those of you who have read this column from the beginning may remember that five years ago, I predicted the end of the PC era, saying that microprocessors were now leaving the box and embedding their computing power into the larger world.

Needless to say, I was a bit optimistic. I had also not listened to one of my own laws, which states: Every technology revolution arrives slower than we anticipate, and faster than we are prepared for.
