Look at almost any personal computer today and it’s not hard to see a lineage that dates back to that clunky box introduced by IBM 20 years ago.
Sure, some computers have broken away from the boring beige plastic cover. Models from Apple Computer, which produced a “personal computer” years before IBM’s original PC, are notable examples. But for the most part, PCs have remained pretty much the same — on the outside at least: a keyboard, a space-eating display monitor and a box that holds all the “guts” of the computer.
In the next 20 years, however, that will all change.
Instead of one clunky box, we’ll have a multitude of devices. Some of them will be as small as a pack of cards or even a wristwatch, making them easy to carry wherever we go. Eventually, they may even be small enough to be embedded in our clothes or perhaps even under our skin.
Connected to a “new” Internet, all these devices will be able to find information and share it with other devices. And instead of requiring a keyboard or even a pen for giving commands and information, these devices will respond to our voice or even a simple look, perhaps even get to know our preferences, and act before we bother to tell them what to do.
Not Far-Fetched or Far Off
Such an evolution may seem far-fetched — especially given how little PCs have changed over the past two decades. But many computer visionaries believe that we’ll soon see devices that will make “computing” much more “personal.”
Chief among the factors driving this change is the development of ever faster, smaller and cheaper computer chips.
Since the introduction of the humble IBM PC, the power of microprocessors has roughly doubled every 18 months — a phenomenon noted by Intel co-founder Gordon Moore in the 1970s. And software programmers took advantage of this so-called Moore’s Law to produce even more powerful programs and expand the capability of computers.
Consider that the microprocessor of the first IBM PC was capable of running only simple text-based programs. In comparison, today’s PCs equipped with Pentium 4 processors are nearly 400 times more powerful and allow users to interact with colorfully realistic images.
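Moore’s Law strictly describes transistor counts rather than delivered application performance, which is why its projection and the 400-times figure above differ; a back-of-the-envelope sketch of the doubling arithmetic (assuming only the 18-month doubling period and the PC’s 20-year span) looks like this:

```python
def moores_law_factor(years, doubling_period_years=1.5):
    """Projected growth factor if capability doubles every `doubling_period_years`."""
    return 2 ** (years / doubling_period_years)

# Transistor-count projection over the PC's first 20 years:
factor = moores_law_factor(20)
print(f"{factor:,.0f}x")  # roughly 10,000x in raw transistor count
```

Delivered performance on real software, like the roughly 400-times figure cited above, grows more slowly than raw transistor counts because memory speed, software and system architecture also limit what users actually see.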
Speak and It Will Hear
But as the next generation of chips gives forthcoming computers even greater power, interacting with them will become more natural and easier, says Wilf Pinfold, director of the microprocessor research lab at Intel.
For example, Pinfold says it wouldn’t be too surprising to one day see a computer that reacts to our speech. Current so-called speech-recognition programs key in on just our voice, but future programs would watch our lips, too. “Speech interfaces can be trained so that when I’m facing the computer, it knows I am dictating to it,” says Pinfold. “But if I look away, it will stop.”
“We do believe that this common interface will be the future,” says Pinfold. “But it will need a much more powerful processor.”
And undoubtedly, such a brawny electronic brain could be developed within five or 10 years. “The fact is,” says Pinfold, “Moore’s Law will produce a more user-friendly world.”
And according to some, these new and better interfaces are just as important — if not more so — to the development of the next generation of smaller computing devices.
“We can make powerful chips smaller and smaller and put them inside a watch,” says David Bradley, one of the original 12 engineers who helped develop the original IBM PC. “But your fingers are still the same size.” And while having a watch that reacts to your voice commands is still pretty much the realm of science fiction, “We’re working on getting there,” he says.
Seek and It Will Find
Indeed, as further developments are made in interactive software and powerful chips, these next devices could become quite “personal.”
For example, shopping on the Internet for a large yellow sweater now requires hours of going from Web site to Web site. But Intel’s Pinfold says that the next generation of computing devices — and the networks they are connected to — will learn from our past shopping experiences and simplify the process. So finding that perfect sweater may be a simple matter of showing a picture of a sweater to the computer and instructing it to go find it.
“What we’re talking about here is moving away from an interface designed to allow us to communicate with a machine,” says Intel’s Pinfold, “to a machine that is more sensitive to us.” Already, some researchers say that such “artificial intelligence” technology is only three to five years away.
Whether or not we actually see such Star Trek-like products may ultimately depend on society in general rather than just technology. Just as people are beginning to reject obtrusive cell phone use in cars and restaurants today, “People walking around and talking to their wrists may not be socially acceptable,” says Bradley.
But he notes that it wasn’t too long ago that it was also socially unacceptable to type on a laptop during meetings. “Society changes to adopt new technology,” he says.