The Year in Hardware

This year, researchers developed touch screens, self-driving vehicles, and more.

By ABC News
January 8, 2009, 12:19 AM

Dec. 26, 2007 -- At Apple's annual Macworld event last January, showman and CEO Steve Jobs unveiled the iPhone. Holding it onstage, Jobs tapped on its surface to type, flicked his finger to scroll through songs, and pinched his fingers to make pictures smaller. The crowd went wild.

But while the iPhone is the world's most prominent example of a multi-input touch screen, other equally innovative technologies came to prominence this year. Jeff Han, a researcher at New York University and founder of the startup Perceptive Pixel, believes that multi-input touch screens should be large--the size of a wall. Microsoft, for its part, unveiled a multitouch computing table that lets users manipulate virtual objects on its surface. And a Microsoft researcher, Patrick Baudisch, is working on touch-screen technology that's a few years away from consumers: a double-sided touch screen that lets a user see her fingers on the other side of a tablet PC or phone.
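The pinch gesture Jobs demonstrated boils down to tracking the distance between two fingertips and scaling the picture as that distance changes. The sketch below, written in TypeScript against the standard browser TouchEvent API, is a minimal illustration of that idea; the element id and the scaling math are illustrative assumptions, not Apple's implementation.

```ts
// Minimal pinch-to-zoom sketch using the standard browser TouchEvent API.
// The "photo" element id is a hypothetical example, not from any real app.
function fingerSpread(t1: Touch, t2: Touch): number {
  // Straight-line distance between two touch points, in CSS pixels.
  return Math.hypot(t1.clientX - t2.clientX, t1.clientY - t2.clientY);
}

const photo = document.getElementById("photo") as HTMLElement;
let startSpread = 0; // finger distance when the gesture began
let startScale = 1;  // image scale when the gesture began
let scale = 1;       // current image scale

photo.addEventListener("touchstart", (e: TouchEvent) => {
  if (e.touches.length === 2) {
    startSpread = fingerSpread(e.touches[0], e.touches[1]);
    startScale = scale;
  }
});

photo.addEventListener(
  "touchmove",
  (e: TouchEvent) => {
    if (e.touches.length === 2 && startSpread > 0) {
      // Fingers moving apart enlarge the picture; pinching together shrinks it.
      scale = startScale * (fingerSpread(e.touches[0], e.touches[1]) / startSpread);
      photo.style.transform = `scale(${scale})`;
      e.preventDefault(); // keep the browser from scrolling or zooming the page
    }
  },
  { passive: false } // needed so preventDefault() is honored during touchmove
);
```

Production multitouch systems add smoothing, rotation, and momentum on top of this, but the core of the gesture is the ratio of finger distances.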

When Apple did away with the keyboard on its phone, it also took away the tactile feedback that people experience when they press a button. Research suggests that smooth touch screens lead to more typing errors than a traditional keypad does, especially in bumpy environments such as a car or a train. Researchers such as Stephen Brewster at the University of Glasgow are exploring ways to add a tactile cue that lets a person know when a button on a smooth screen has been tapped. Techniques from this burgeoning field, called haptics, are also used to make virtual-reality experiences feel more real. Yoshinori Dobashi, at Hokkaido University in Japan, has simulated the feel of water. And one company is adding tactile feedback to a vest that can be worn while playing video games.
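As a rough illustration of what a tactile cue on a smooth screen might look like in software, here is a short TypeScript sketch that pulses the device's vibration motor when an on-screen key is touched. It uses the browser Vibration API as a modern stand-in; the key element id and pulse length are assumptions, and the research prototypes described above use their own actuator hardware rather than this API.

```ts
// Sketch of a tactile cue for an on-screen key, using the browser Vibration
// API as a stand-in for dedicated haptic hardware. The "key-a" element id and
// the 20 ms pulse length are illustrative choices.
const key = document.getElementById("key-a") as HTMLElement;

key.addEventListener("touchstart", () => {
  if ("vibrate" in navigator) {
    // A brief pulse mimics the "click" of a physical key; devices without a
    // vibration motor simply ignore the call.
    navigator.vibrate(20);
  }
});
```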

People are getting more and more accustomed to having their cell phones or laptops with them at all times. Useful as these gadgets are, they could be even more helpful if they automatically suggested things to do or gave directions to a nearby restaurant. This year, a number of products and research projects tried to make phones and other gadgets even smarter. Nokia, for instance, introduced a powerful Internet tablet with a Global Positioning System (GPS) chip. But not all gadgets have GPS capabilities. Google recently announced a technology that sidesteps the GPS issue and helps a person place himself on a map, within about 1,000 meters, using information from a cell-phone tower. Similarly, the German startup Plazes offers a service that, among other things, lets a person locate herself using a Wi-Fi signal. And what to do with all this location information? Researchers at the Palo Alto Research Center have developed a phone application that suggests things that the user might want to do--places to eat and shop, and things to see--based on location, time of day, past preferences, and even text-message conversations.
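Even a coarse fix of about 1,000 meters is enough to rank nearby places. As a minimal sketch of that step, the TypeScript below filters a list of candidate spots by great-circle (haversine) distance from the estimated position; the place names, coordinates, and 1,000-meter radius are illustrative, not any vendor's actual data or API.

```ts
// Sketch: suggest places within a radius of a coarse location estimate.
// Place names and coordinates below are made up for illustration.
interface Place {
  name: string;
  lat: number; // degrees
  lon: number; // degrees
}

// Great-circle (haversine) distance in meters between two lat/lon points.
function haversineMeters(lat1: number, lon1: number, lat2: number, lon2: number): number {
  const R = 6371000; // mean Earth radius in meters
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const dLat = toRad(lat2 - lat1);
  const dLon = toRad(lon2 - lon1);
  const a =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(lat1)) * Math.cos(toRad(lat2)) * Math.sin(dLon / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(a));
}

// Keep only places within radiusMeters of the estimated position.
function nearbyPlaces(lat: number, lon: number, places: Place[], radiusMeters = 1000): Place[] {
  return places.filter((p) => haversineMeters(lat, lon, p.lat, p.lon) <= radiusMeters);
}

// Example: a rough fix near downtown Palo Alto and two candidate restaurants.
const suggestions = nearbyPlaces(37.4443, -122.1598, [
  { name: "Corner Cafe", lat: 37.4467, lon: -122.16 },
  { name: "Far-Away Diner", lat: 37.7749, lon: -122.4194 },
]);
console.log(suggestions.map((p) => p.name)); // ["Corner Cafe"]
```

A real recommender would go further, weighing time of day and past preferences as the PARC researchers describe, but the distance filter is the piece that a rough cell-tower or Wi-Fi fix already makes possible.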