DeM Banter: always interesting…can our “strategy” and doctrine keep up? What about our JAGs? Who writes, arbitrates, and establishes laws for such equipment and capability?
September 28, 2012
The US military’s current fleet of drones will soon be overtaken by a new wave of robots that will be faster, stealthier and smarter — operating virtually without human intervention, experts say.
The Pentagon is investing heavily in “autonomy” for robotic weapons, with researchers anticipating squadrons of drones in the air, land or sea that would work in tandem with manned machines — often with a minimum of supervision.
“Before they were blind, deaf and dumb. Now we’re beginning to make them see, hear and sense,” Mark Maybury, chief scientist for the US Air Force, told AFP.
Unmanned aircraft are now overseen by “pilots” on the ground, but as the drones become more sophisticated, the role of remote operators will become more hands-off.
Instead of being “in the loop,” humans will be “on the loop,” said Maybury, explaining that operators will be able to “dial in” when needed to give a drone direction for a specific task.
“We’re moving into more and more autonomous systems. That’s an evolutionary arc,” said Peter Singer, an expert on robotic weapons and author of “Wired for War.”
“So the role moves from being sort of the operator from afar, to more like the supervisor or manager, and a manager giving more and more of a leash, more and more independence,” he said.
Despite the dramatic advances in technology, the American military insists humans will remain in control when it comes to using lethal force.
But the next generation of increasingly capable drones will stretch man’s capacity to control robots in battle, generating unprecedented moral and legal quandaries.
“These (technological) responses that are driven by science, politics and battlefield necessity get you into areas where the lawyers just aren’t ready for it yet,” Singer told AFP.
Over the next decade, changes in computing power will enable teams of hi-tech drones to operate virtually on their own, or as “robotic wingmen” to piloted aircraft, said Werner Dahm, the Air Force’s former top scientist.
At a testing range in the Arizona desert, Apache helicopters are flying together with unmanned choppers in experiments the Pentagon believes will serve as an eventual model for future warfare.
“We’re not far away from having a single piloted Apache or other helicopter system and a larger number of unmanned systems that fly with that,” said Dahm, a professor of mechanical and aerospace engineering at Arizona State University.
“These require very high levels of machine reasoning. We’re much closer to that than most people realize,” Dahm told AFP.
The new technology has turned the US Air Force’s doctrine upside down. For decades, the military trained pilots to face an enemy “alone and unafraid,” flying deep into hostile territory to strike at a target and then return home.
Now the Air Force is planning for scenarios in which different tasks would be divided up among manned and unmanned “systems,” with drones jamming enemy air defenses, tracking targets and assessing bomb damage, while piloted warplanes oversee the launching of bombs and missiles.
Instead of the slow-flying turbo-prop Predator, future drones will likely more closely resemble their manned counterparts, with longer range, more powerful jet engines and the radar-evading stealth design that the bat-winged Sentinel drone has already pioneered.
But the biggest technical hurdle for Pentagon-funded scientists is delivering an iron-clad guarantee that more autonomous vehicles will not make a grievous mistake with potentially catastrophic consequences.
“You have to be able to show that the system is not going to go awry — you have to disprove a negative,” Dahm said. “It’s very difficult to prove that something won’t go wrong.”
One veteran robotics scientist, Ronald Arkin, a professor at the Georgia Institute of Technology, believes that countries will inevitably deploy independent robots capable of killing an enemy without a human pushing a button.
Arkin, who has worked on US defense programs for years, argues that robotic weapons can and should be designed as “ethical” warriors, with the ability to distinguish combatants from innocent civilians.
Without emotions to cloud their judgment and anger driving their actions, the robots could wage war in a more restrained, “humane” way, in accordance with the laws of war, Arkin said.
“It is not my belief that an unmanned system will be able to be perfectly ethical in the battlefield, but I am convinced that they can perform more ethically than human soldiers are capable of,” he wrote.