In one promotional video, Google engineers seat a man who is almost completely blind behind the wheel of an autonomous Toyota Prius, which then drives him to do his shopping. To film the video, the car was accompanied by a police escort -- in the real world, it is still too soon for blind reliance on digital robotic driving.
What, then, is actually possible here and what is not? Designers in the automotive industry find Google's showy celebrations of its autonomous vehicles unsettling. Daimler developer Herrtwich, for example, finds it inappropriate to act as if computer-steered vehicles will soon be able to navigate through the fray of urban traffic. "City traffic is an utterly chaotic situation, and designing autonomous cars that can drive in it is not even one of our goals at this point," he says. "Autonomous driving in monotonous, steady highway traffic is a far more reasonable and feasible goal."
Still, even some conservative German designers take things considerably further. "The pace of development in electronics has often been underestimated," says Groesch, the industrial consultant. In an earlier project with Daimler and Bosch, Groesch worked on developing airbag controls. At first, it was widely believed to be impossible to develop sensors that would react quickly enough to deploy side airbags in time. Today, such airbags are an industry standard.

Reducing Mistakes to an Acceptable Level
That Groesch no longer considers autonomous driving a purely utopian vision is owed to one key piece of technology that has advanced by leaps and bounds: "Laser scanning," he explains, "is unbeatable at identifying traffic entering from the side."
Capable of up to 10 revolutions per second, these scanners fire 60 laser beams or more in a 360-degree arc around the vehicle. The beams are invisible to the eye and pose no danger to humans, but they strike objects and bounce back as pulses of light. From each pulse's round-trip time -- the interval between emission and the returning reflection -- the computer calculates the car's distance to the objects around it.
This allows the vehicle to establish a three-dimensional image of its surroundings to a distance of up to 100 meters (330 feet) -- a more comprehensive view than is possible with the human senses. Interpreting the information from the laser beams faster and better than the brain can interpret feedback from the eyes is only a matter of increasing computing power.
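The underlying time-of-flight calculation is simple enough to sketch. The snippet below is an illustration of the principle only, not code from any actual lidar system; the function name and values are my own, and a real scanner must also account for pulse timing jitter, beam angle, and sensor calibration.

```python
# Time-of-flight ranging: a laser pulse travels to an object and back,
# so the one-way distance is half the total path the light covers.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second, in vacuum

def distance_from_round_trip(delta_t_seconds: float) -> float:
    """Distance to an object, given a pulse's round-trip travel time."""
    return SPEED_OF_LIGHT * delta_t_seconds / 2.0

# A reflection arriving roughly 667 nanoseconds after emission
# corresponds to an object about 100 meters away -- the range
# mentioned above. Light covers the 200-meter round trip in well
# under a microsecond, which is why the electronics must time
# each pulse with nanosecond precision.
print(round(distance_from_round_trip(667e-9), 1))
```

Repeating this measurement for every beam at every rotation angle yields the cloud of distance points from which the computer assembles its three-dimensional image of the surroundings.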
It's more than likely that, within the foreseeable future, autonomous cars will no longer make a mistake every 80,000 kilometers, but perhaps only every couple of million kilometers. And someday they will outperform humans in every situation, even in chaotic city traffic.
The benchmark for developers in this field is ASIL D, under ISO 26262, an international standard applied to the safety of electronic and electric systems in automobiles. The standard stipulates that failures should be nearly eliminated -- a tall order when it comes to this sort of mobile technology. What happens when condensation or snow blurs a laser's lens, even if just for a few seconds? "Driving autonomously through Nevada on a sunny day is fairly easy," says Daimler developer Herrtwich. But in a flurry of snow in the mountains, the world looks very different for cameras and lasers.