The Pentagon Worries That We Don't Trust Our Drones


How long can humans remain in the loop?

Trust is the foundation of any relationship, the experts tell us. Without some degree of trust in each other, human society becomes impossible.

But what happens if we don’t trust our drones?

That’s an issue that the U.S. military must tackle, according to a new study. The report from the Defense Science Board, the Pentagon’s group of scientific advisers, argues that establishing trust between man and machine is a must before the military can develop drones that think for themselves.

Autonomy—that is, machines that can act independently of human control—is the holy grail of the drone world. Older models such as the MQ-1 Predator are essentially remote-controlled aircraft that require a pilot on the ground to fly them with a joystick. Newer models, like the RQ-4 Global Hawk, are semi-autonomous; the Global Hawk flies a preprogrammed flight plan, with the pilots on the ground intervening as needed to change course or altitude. But otherwise, the airliner-sized unmanned spy plane flies itself.

There is good reason for the military to focus on autonomy. Drones that operate without constant human control can react faster and more flexibly to changing battlefield conditions. There is a big push now for human-machine teaming, in which a manned jet fighter might control a swarm of mini-drones. But for a single pilot to direct a dozen drones, those machines must function largely autonomously.

Autonomy also means drones can perform their mission even if communications with their ground station are jammed or interrupted. Today’s drone pilots are burning out from overwork, but autonomous drones would allow humans to function as supervisors (or babysitters), who just watch to make sure that the machines are working as they should.

But none of this works without trust. People who are freaked out at the prospect of self-driving cars will go ballistic over self-flying, missile-equipped drones. For the military, the soldiers operating those drones must have confidence they will work properly. And the politicians ordering those drones into action must trust that they’ll strike down the terrorist and not the innocent wedding party.

The Defense Science Board study lists a half-dozen barriers to trust between man and machine. Some are intrinsic to human biology and electronic engineering; for example, humans and machines have different sensors and perceive—as much as a machine can be said to perceive—the world in different ways.

Some barriers are matters of communication. Drones must be capable of communicating with humans, and they must behave predictably enough for their operators to know how they will react. “When something goes wrong, as it will sooner or later, autonomous systems must allow other machine or human teammates to intervene, correct, or terminate actions in a timely and appropriate manner,” the study notes.

Perhaps the biggest hurdle to trust is goals. “If humans and autonomous machines are to work effectively together, they need common goals and a mutual knowledge of those common goals,” the Defense Science Board concluded. “Many of the commercial aircraft accidents in the 1990s associated with automation occurred when the flight crew had one goal (e.g., staying on the glide slope during an approach) and the flight management computer had another (e.g., executing a go-around).”

One solution to this is smarter drones. “Significant autonomy capabilities will derive from a machine’s ability to infer the commander’s intent and to act adaptively in a non-pre-programmed fashion, and in doing so, being able to deal with unanticipated situations not foreseen by either the designer or the operator,” said the study.

Inferring intent? Acting outside their programming? If that doesn’t sound like Skynet from the Terminator movies, or a hundred other malevolent computers from science fiction, then I don’t know what does.

As the Pentagon always hastens to point out, drones are not and will never be fully autonomous. The decision to pull the trigger will always be made by a human. Color me skeptical, but as unmanned combat aircraft and ground vehicles replace their human counterparts, and are thrust into situations that demand instant reaction, I’ll be curious to see how long humans remain in the loop.

In the meantime, before we rush into the brave new world of autonomous drones, let’s see how well these drones function. Or whether they simply create a new set of problems and complications.

Trust isn’t bestowed. It’s earned.

Michael Peck is a contributing writer for the National Interest. He can be found on Twitter and Facebook.

Image: An MQ-9 Reaper refuels at Creech Air Force Base. Wikimedia Commons/U.S. Air Force