Researchers Develop Better Car Controls

July 15, 2003 -- If someday in the future you see a driver shake his fist, don't panic. It may not be a road-rage incident. He might just be changing the station on his car's radio.

Researchers at Carnegie Mellon University in Pittsburgh are developing a high-tech system to convert drivers' hand gestures into something more useful than impromptu communications with fellow highway pilots.

Make a few jabbing motions in the air, for instance, and you might adjust the car's electronic control system. A "twirl" of an index finger and the radio's volume goes up or down, depending on the direction of the twirl. "Wave" and incoming cell phone calls are automatically answered by a voice-mail system.

The "gesture interface" system is one of a few advanced control systems being studied by university scientists for the General Motors Collaboration Laboratory. The goal of the lab, a five-year, $8 million joint project with the Detroit automaker, is to help develop new technologies to make cars "smarter" and allow drivers to stay focused on the road.

"Computers are providing the smarts to make car driving easier," says Professor Asim Smailagic, a senior researcher on the gesture interface project. "Performing other tasks — taking a phone call, adjusting the radio or air conditioning — should not take away from the task of driving."

Watchful Eye

The workings of the system are fairly simple and cheap.

The scientists place a common computer camera on the floor of a Pontiac Montana minivan, roughly where a gear-shift lever would be positioned. The camera is attached to a standard laptop computer that contains special computer algorithms developed by Carnegie Mellon computer engineers over the past three years.

When a driver passes his hand above the camera, the computer program analyzes the images, looking for specific hand features — the fingers, the palm, the outline of a fist, and so on — and tracks the hand's motion across the camera's field of view. The program then translates that gesture into the appropriate commands.
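The article doesn't publish the lab's algorithms, but the pipeline it describes — find hand features in each frame, track the hand across the field of view, then map the motion to a command — can be sketched roughly as follows. Every function name, gesture, and threshold here is illustrative, not taken from the Carnegie Mellon system:

```python
# Illustrative sketch of a frame-by-frame gesture pipeline: detect a hand,
# track its path across the camera's view, and translate the path into a
# command. Names and gesture mappings are invented for illustration.

def classify_path(path):
    """Map a sequence of (x, y) hand positions to a command name."""
    if len(path) < 2:
        return None
    dx = path[-1][0] - path[0][0]
    dy = path[-1][1] - path[0][1]
    if abs(dx) > abs(dy):
        # Horizontal sweep: e.g. a "wave" sends an incoming call to voice mail
        return "answer_with_voicemail" if dx > 0 else "dismiss_call"
    else:
        # Vertical motion: e.g. raise or lower the radio volume
        return "volume_up" if dy < 0 else "volume_down"

def run_pipeline(frames, detect_hand):
    """Track the hand over a run of frames and emit one command."""
    path = [pos for pos in (detect_hand(f) for f in frames) if pos is not None]
    return classify_path(path)

# Example with a stand-in detector that just returns pre-computed positions,
# simulating a hand sweeping left to right:
positions = iter([(10, 50), (30, 50), (60, 50)])
command = run_pipeline(range(3), lambda f: next(positions))
# command == "answer_with_voicemail"
```

A real system would replace the stand-in detector with per-frame image analysis (finding fingers, palm, and fist outline, as the researchers describe), but the translate-motion-to-command step has this same shape.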

For now, the scientists have developed only a limited "vocabulary" of 15 gestures for their prototype vehicle — most of them dealing with "nonsafety-critical" car controls, such as the vehicle's radio. But the researchers say there's practically no limit to the gesture-recognition language and its use in controlling the car.

"We have concepts of how you can control [a car] through a whole language of gestures," says Ed Schlesinger, co-director of the lab. "I don't have to go fumbling for a knob or trying to find where a particular button is."

From Fighter Jets to Car Cockpits

Researchers are still testing the system and doubt driver gesturing will become an option in new model cars in the immediate future. But scientists at the lab say the system is just one of many "smart car" interfaces they are working on.

For example, researchers there are also developing new information display schemes to project vital information on the car's windshield. Such "heads-up" devices — similar to the displays used in fighter jets — would tell drivers what they need to know while keeping their eyes on the road.

Other researchers are working on voice-recognition systems and wireless communication systems that seamlessly connect the car's electronics to other mobile devices such as personal handheld computers and cell phones.

"The decades of the '80s and '90s were all about getting information technology into the office and then the home," says Schlesinger. "The coming decades — and we've seen this already — are all about mobility, having your communications and computation with you and not tethered to the home. And when you talk about mobility, the car is key to that."

More Than Just a Set of Wheels

The vision, says Schlesinger, is that by ganging these developing computer technologies together, cars will be able to assist drivers and help share the workload.

"We really want the car of the future to be an able companion, a thing that knows you, knows what you're up to, knows where you're going, and makes intelligent suggestions," says Schlesinger.

For example, researchers could develop a car computer system to automatically query a personal handheld computer for your day's schedule. If it knows a driver has a meeting downtown, the system would then be able to show the route to the city using the heads-up display.

More than that, if the driver encounters traffic, the system might suggest alternative routes. And if the new driving directions mean a longer trip, the car system might offer to place a cell phone call to the meeting's participants, advising them where you are and how late you'll be.
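The scenario above amounts to a chain of simple conditional checks over calendar, route, and traffic data. A toy sketch of that "context aware" logic — with the calendar, traffic feed, and phone all stubbed out as assumptions, since the article describes only the concept — might look like:

```python
# Toy sketch of the context-aware scenario described above. The calendar,
# traffic, and phone interfaces are stand-ins; in the researchers' vision
# they would come from the car's wireless links to handheld devices.

def assist(meeting, traffic_delay_minutes, reroute_extra_minutes):
    """Decide what the car should do given schedule and traffic context."""
    # Base behavior: show the route to the meeting on the heads-up display.
    actions = [f"show route to {meeting['location']} on heads-up display"]
    if traffic_delay_minutes > 0:
        # Traffic ahead: propose a way around it.
        actions.append("suggest alternative route")
        if reroute_extra_minutes > 0:
            # The detour still makes the driver late: offer to notify attendees.
            actions.append(
                f"offer to call attendees: running {reroute_extra_minutes} min late"
            )
    return actions

plan = assist({"location": "downtown"},
              traffic_delay_minutes=15,
              reroute_extra_minutes=10)
# plan lists all three actions: route display, reroute, and the courtesy call
```

The point of the sketch is that each "intelligent suggestion" is just the car combining data sources it already has access to — which is why Schlesinger frames the work as connecting the car to the driver's other mobile devices.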

Such "context aware" systems are possible, says Schlesinger, given the research work being done by the lab's scientists in collaboration with engineers from GM. But it will be at least five to 10 years before some of the technologies make it out of the lab and into the driver's seat.

Until then, when you see another driver make a hand gesture from behind the wheel, stay out of the way. It may not be an on-board computer he's communicating with.