Russia’s new flagship Armata tank promises a quantum leap in automated targeting capabilities, but difficult technical and ethical questions remain.
From its advanced radar suite to on-site lavatories, the T-14 Armata main battle tank (MBT) is brimming with modern features, best-in-class performance specifications, and a proportionally impressive price tag. One of the T-14’s most ambitious additions is an unmanned turret with a 125-millimeter 2A82-1M smoothbore gun and autoloader compatibility. The potential benefits of unmanned turrets are well known. For one, they are a massive boon to personnel safety, removing the traditional need for crew members to sit next to the gun. Instead, the T-14’s three operators are tucked safely inside an armored capsule within the tank. The space savings from an unmanned turret can translate into a reduced profile and lower overall weight, though the T-14 is by no means a light MBT at a hefty total weight of fifty-five tons. As with any tank design choice, there are performance trade-offs to unmanned turrets. With no means for the commander to pop their head out of the vehicle, the crew is entirely reliant on the Armata’s sensor suite for situational awareness. This can cause disorientation in combat, though much depends on the Armata’s specific sensor implementation.
“The Armata crew does not need to aim accurately,” according to Rostec CEO Sergey Chemezov. “It only has to aim the gun roughly. Electronics will do all the rest: it will accurately determine the distance to the target and aim the gun at it. That is, the vehicle uses artificial intelligence elements that help the crew deliver fire.” It remains to be seen how effectively the T-14’s targeting algorithm, reportedly tested in Syria on the Uran-9 robot tank, performs across a wide range of battlefield scenarios. There is reportedly a completely unmanned T-14 model—that is, not just the turret but the entire vehicle is controlled remotely—although Moscow has no plans to put such a tank into serial production at this time.
The T-14’s unmanned turret has reignited a longstanding debate about artificial intelligence (AI) systems and the ethics of war. Chemezov noted that, although the Armata tank’s loading, targeting, and firing process is handled from start to finish by AI, the decision to shoot must still be made manually by the crew or a human remote operator. “Armata can be used both with a crew and without a crew—the robot will control the tank, it will choose the target itself. But whether a decision is made to shoot or not to shoot, a person still makes a decision to press the button,” he said.
Nevertheless, experts worry that the automation of a previously manual process can desensitize the operator to the moral weight of a potentially lethal decision. “The problem is that this classification risks that the role of the human will slowly shrink until it merely presses the red button of approval, but is not critically engaged with the process anymore,” Vrije Universiteit Brussel doctoral researcher Maaike Verbruggen told Popular Science. This can lead to a distorted decisionmaking process, where the operator passively agrees with potentially suboptimal AI decisions. “The role of the human is reduced to rubber-stamping the actions of the machine,” Verbruggen noted. There is also the looming concern that these types of operator approval schemes are a stepping stone to what some consider to be the endgame of AI targeting: fully autonomous weapons systems, authorized to take lethal action without human input.
To be sure, these concerns are by no means limited to the T-14 platform. As unmanned weapons systems become increasingly commonplace in coming years, so too will the ethical and practical questions surrounding their usage.
Mark Episkopos is a national security reporter for the National Interest.