The next war could be fought partly by unmanned aircraft that respond to spoken commands in plain English and then figure out on their own how to get the job done, even dodging enemy aircraft as they carry out their assignments.
This isn't conventional robotics, in which someone must be on hand to issue commands to an unmanned vehicle every step of the way. This is true autonomy: vehicles that can decide, much as a human pilot does, how to accomplish a task and then carry it out.
Engineers and scientists at several institutions and corporations are working on the project, chiefly under the sponsorship of the Defense Advanced Research Projects Agency. They have already demonstrated that the idea can work.
Last June, a Lockheed T-33 jet trainer successfully completed a series of assignments given by the pilot of another aircraft over Edwards Air Force Base in southern California. There was a pilot aboard the T-33, just in case something went wrong, but it turned out that he had nothing to do. Everything went according to plan, even when some assignments were changed at the last minute.
"That was a proof of concept," says Mario Valenti, a flight controls engineer for Boeing who is on leave to work on the project at the Massachusetts Institute of Technology, one of the lead institutions in the effort. "But this is obviously a system that is still in development."
Acts on Verbal Commands
Valenti sees many applications for the technology in the years ahead. One is a vastly improved air traffic control system, in which an aircraft approaching a busy airport would receive English-language commands from the controller in the tower, and would then figure out on its own the safest way to land at the assigned time while avoiding other planes.
But the first application will almost certainly be military, because it could greatly increase the effectiveness of pilotless aircraft in a combat setting. It doesn't take much imagination to picture a bunch of unmanned aircraft zeroing in on enemy targets while responding to brief verbal commands from pilots of other aircraft who remain safely behind.
The idea behind the project is to reduce the role of actual pilots by providing intelligent unmanned vehicles that can figure out for themselves that it's better to fly around a mountain than through it.
The program builds on research that enables computers to receive and act upon verbal commands, a hot item these days. The Teragram Corp., a software company in Cambridge, Mass., is developing the speech software for the unmanned vehicle project.
The researchers aren't interested in software that will accept simple commands like "turn right" or "turn left." Instead, they want the aircraft to respond to broad commands, like "go to a certain area and photograph a specific building." Then it would be up to the unmanned vehicle to figure out how to do that.
And that, of course, is no easy chore. The vehicle would have to know how to schedule certain tasks on its own, and then it would have to know how to carry them out safely and efficiently without any outside supervision. Those two components are a key part of work that is under way at MIT.
Even a simple command, like "go get the mail," is pretty complex for a robotic system, says Valenti, who has designed a "task scheduler" for the program.
"If I give you a task, like go get the mail, there are a number of steps that are required for you to do that," Valenti says. "First, you subconsciously might say you have to get up. And then you'll walk to the door, and maybe you'll have to go outside, and you'll open the door and walk to the mailbox, and there's your mail. The task scheduler provides the goals."
But how do you carry out those tasks? For a human, it's so easy we don't even have to think about it. But for an unmanned vehicle, it can be very challenging.
That's where the work of Tom Schouwenaars comes into play. He's working on his doctorate in aeronautics and astronautics at MIT, and he has developed something called a "trajectory planner." That's the software that translates the verbal command into a series of coded representations that the computer can understand.
Those codes would, for example, tell the robot to stand up first if it wants to go get the mail. They would also warn the robot that it is on the second floor, so if it wants to go outside to the mailbox, it must use the stairs first.
Of course, all of that requires that the computer have some knowledge about its setting, but knowing it's on the second floor eliminates the need for some human to remind it to use the stairs.
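The idea of expanding each goal against a model of the surroundings can be sketched in a few lines. This is a hypothetical illustration of the concept, not Schouwenaars' actual trajectory planner; the world model and rules here are invented for the example.

```python
# Toy "trajectory planner": expands one scheduled goal into concrete steps,
# consulting what the robot knows about its own situation -- e.g., being on
# the second floor means taking the stairs before going outside.
def plan(goal, state):
    """Expand a goal into concrete steps, based on the current world state."""
    steps = []
    if state["posture"] == "sitting":
        steps.append("stand up")
        state["posture"] = "standing"
    if goal == "go outside" and state["floor"] > 1:
        steps.append("take the stairs down")
        state["floor"] = 1
    steps.append(goal)
    return steps

world_state = {"floor": 2, "posture": "sitting"}
print(plan("go outside", world_state))
```

Because the planner carries its own model of the situation, no human has to remind it about the stairs; that is the jump in autonomy the researchers are after.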
"The level of autonomy has been increased by a lot," Schouwenaars says. "Here, there's no remote control."
Such "smart systems" could take over many tasks now performed by humans, the researchers say.
"You could have a robot around the house and ask it to do the dishes, or clean up the apartment, or wash the windows," Valenti says. The robot would do it because it would know how to translate those commands into specific tasks, and how to carry them out without breaking all the dishes.
All of this may sound a bit farfetched, but some of it is probably closer than we think, particularly for military purposes. DARPA officials have already indicated that some of the technology will be incorporated in the next generation of unmanned vehicles.
In fact, the key components are all around us. During the proof of concept demonstration over the California desert last June, a pilot aboard a Boeing F-15 fighter jet punched in written commands on a laptop computer, since the system was not yet capable of accepting verbal commands.
Those commands were received by a laptop aboard the T-33, which scheduled specific tasks and instructed the aircraft on how to carry them out while avoiding obstacles such as "no-fly zones."
It worked without a hitch, the researchers say.
So someday, the commercial jet carrying you home for the holidays may land at an airport under the exclusive command of an air traffic controller on the ground, who may have issued a single verbal command.
"There will still be a pilot aboard," says Schouwenaars says. "But he won't have much to do."
Of course, unless our confidence in computers improves considerably, that change will probably result in more people taking the bus.
Lee Dye's column appears weekly on ABCNEWS.com. A former science writer for the Los Angeles Times, he now lives in Juneau, Alaska.