For nearly 30 years, the way we worked a computer was essentially unchanged: you sat looking at a screen, with a keyboard and a mouse in front of you. There were refinements, such as the small touch pads on many laptops, but only lately, with the multi-touch screens on iPads, iPhones and their competitors, have things really moved.
Now, people are pointing and flicking and making swishing motions at their screens.
How long until the next big change?
At the Consumer Electronics Show, a Swedish company called Tobii Technology has been showing off Gaze, a system that uses a webcam to track a user’s eyes and moves the things you look at. At CES, it demonstrated the system in an arcade game, a photo gallery and a screen of text (which scrolled up as you reached the bottom), but you can see the concept going in many directions if it proves widely practical.
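The scrolling demo can be pictured as a simple loop: when the estimated gaze point drops near the bottom of the viewport, the page advances. The sketch below is purely illustrative, assuming a hypothetical tracker that reports a vertical gaze coordinate; it is not Tobii's actual API.

```python
# A minimal sketch (not Tobii's actual API) of gaze-driven scrolling:
# when the user's estimated gaze point nears the bottom of the
# viewport, advance the scroll position. All numbers are illustrative.

VIEWPORT_HEIGHT = 600   # hypothetical window height in pixels
SCROLL_ZONE = 0.85      # gaze below 85% of the viewport triggers scrolling
SCROLL_STEP = 40        # pixels to scroll per triggering gaze sample

def update_scroll(scroll_offset: int, gaze_y: int, doc_height: int) -> int:
    """Return the new scroll offset for one gaze sample.

    gaze_y is the vertical gaze coordinate within the viewport,
    as a hypothetical eye tracker might report it.
    """
    if gaze_y >= VIEWPORT_HEIGHT * SCROLL_ZONE:
        # Scroll down, but never past the end of the document.
        max_offset = max(0, doc_height - VIEWPORT_HEIGHT)
        return min(scroll_offset + SCROLL_STEP, max_offset)
    return scroll_offset

# Simulated gaze samples drifting toward the bottom of the screen;
# the last three fall inside the scroll zone (y >= 510).
offset = 0
for gaze_y in (300, 450, 520, 560, 580):
    offset = update_scroll(offset, gaze_y, doc_height=2000)
print(offset)  # prints 120
```

A real system would smooth the gaze signal and rate-limit the scrolling, but the basic idea is just this threshold test.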
“Pointing at something by looking at it is intuitive, natural and immediate. Using a mouse to do the same thing is less so, as it involves an intermediate step of moving a mouse-pointer around,” says Henrik Eskilsson, the CEO of Tobii, on the company’s website. “Gaze is as natural and intuitive as touch, as precise as the mouse and more ergonomic and effortless than both.”
We’ll see how quickly this can happen; the company gave no timeline at CES. But it’s an intriguing marriage of motion sensing, facial recognition and other technologies. It could be useful to the disabled. It could also be useful to people who are simply rushed: look at an icon on your screen and more information appears. The company is working to have its technology included in Microsoft’s Windows 8 operating system.
Christina Bonnington of Wired wrote that the system is impressive, but there are natural flaws: “If you accidentally look off to the side, or at somewhere random on the page, you could navigate to somewhere you didn’t intend.”
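A standard way eye-tracking interfaces guard against the stray-glance problem is a dwell-time threshold: a target activates only after the gaze rests on it for a minimum interval. The sketch below illustrates that general technique; the names and timings are hypothetical, not something Tobii has described for Gaze.

```python
# A sketch of dwell-time filtering, a standard gaze-interface technique:
# a target activates only after the gaze stays on it continuously for a
# minimum time, so an accidental glance does not trigger navigation.
# All names and values here are illustrative, not Tobii's API.

DWELL_TIME = 0.5  # seconds the gaze must rest on a target to activate it

def dwell_select(samples, dwell_time=DWELL_TIME):
    """samples: sequence of (timestamp_seconds, target_id or None).

    Returns the first target the gaze rests on continuously for at
    least dwell_time seconds, or None if no target qualifies.
    """
    start = None    # when the gaze first landed on the current target
    current = None  # the target currently under the gaze
    for t, target in samples:
        if target is not None and target == current:
            if t - start >= dwell_time:
                return target
        else:
            # Gaze moved to a new target (or off all targets): restart the clock.
            current = target
            start = t
    return None

# A stray 0.1-second glance at "ad" is ignored; the steady look at
# "link" from t=0.3 to t=0.9 exceeds the threshold and activates it.
samples = [(0.0, "link"), (0.2, "ad"), (0.3, "link"), (0.6, "link"), (0.9, "link")]
print(dwell_select(samples))  # prints link
```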