Who Will Be in Charge of Future Tank Warfare? Robots or Humans?

July 30, 2021 Topic: Military Region: Americas Blog Brand: The Buzz Tags: Tanks, Weapons, War, Robot, Artificial Intelligence

Advanced, artificial-intelligence-enabled algorithms will increasingly be able to support long-range manned-unmanned teaming and information connectivity between manned vehicles and forward operating drones.

As the Army surges ahead with deliberations about a future tank platform, some may wonder whether the Army’s most lethal ground fighting machine will be a robot or a manned tank. Who or what will ultimately make determinations about where and when to attack? Which weapon to use? When to close with the enemy? Much of this may be decided by yet-to-exist technologies, yet the current prevailing consensus is that humans will ultimately remain in control of lethal force and other combat decisions. Robots, however, may lead the way.

One significant idea bearing upon these kinds of evaluations relates to a concept described to me by Gen. John Murray, the commander of Army Futures Command: should unmanned technology evolve enough to enable breakthrough levels of autonomy, a future tank might not need to be survivable because it would carry no soldiers. The technology for advanced algorithms needs to advance a bit more before these new levels of autonomy are operational, Murray said, but progress is rapid and continues to produce breakthroughs on a regular basis. Certain kinds of autonomy are already here and demonstrating great promise in Army testing and experimentation.

As far back as a year or two ago, Army Futures Command demonstrated that an armored robotic vehicle could “breach” an enemy tank ditch while soldiers operated several hundred meters behind it. The advantage is significant: the most dangerous, high-risk and life-threatening tasks can increasingly be handled by unmanned systems, freeing human decisionmakers to weigh improved methods of attack. Once a tank ditch is crossed or removed as an impediment, or the early rounds of incoming enemy fire are confronted, infantry and heavily armed manned vehicles can “close with the enemy” at a much greater tactical advantage, increasing the prospect for mission success.

This does not mean the unique attributes of human cognition will be replaced or rendered unnecessary, Murray said. Instead, advanced, artificial-intelligence-enabled algorithms will increasingly support long-range manned-unmanned teaming and information connectivity between manned vehicles and forward-operating drones.

Perhaps a manned Abrams tank will bring its heavy armor, survivability and firepower to war while performing command and control and making decisions for an ultra-high-tech future Optionally Manned Tank fighting alongside it. This might make a lot of sense given that a robotic tank could move faster, deploy more easily and surge forward into incoming fire with no risk to soldiers, all while being supported by human decisionmakers in a heavily armored tank operating nearby. Advances with the Abrams might favor this kind of conceptual thinking, especially if a new generation of forward-looking infrared sensors enables tanks to find, track and attack enemy targets at safe standoff distances before they are themselves seen.
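To make that teaming concept a bit more concrete, the sketch below imagines, in purely notional terms, how a manned command vehicle might task a robotic wingman while keeping any use of force under explicit crew approval. Every class, field and value here is an assumption made for illustration, not an actual Army interface or program of record.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Task:
    """Notional tasking message sent from a manned tank to a robotic wingman."""
    kind: str               # "move", "scout", "breach", or "fire"
    grid: str               # destination or objective, e.g. a grid reference
    approved_by_crew: bool  # lethal actions require explicit human approval

@dataclass
class RoboticWingman:
    """Unmanned vehicle that executes movement and reconnaissance tasks on its own."""
    task_log: List[Task] = field(default_factory=list)

    def accept(self, task: Task) -> bool:
        # The robot refuses any task implying lethal force without human sign-off.
        if task.kind == "fire" and not task.approved_by_crew:
            return False
        self.task_log.append(task)
        return True

# The manned tank surges the robot forward to breach while the crew stays back.
wingman = RoboticWingman()
wingman.accept(Task(kind="breach", grid="NV 1234 5678", approved_by_crew=True))
```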

Optimizing this kind of approach, which includes both manned-unmanned and even unmanned-unmanned teaming, requires an increasingly deliberate blending of the attributes and functions unique to humans with the tasks computers perform exponentially better.

“We use artificial intelligence to reduce the cognitive burden on the human to allow the machine to do what it does best and allow humans to do what they do best on the battlefield,” Maj. Gen. Ross Coffman, the director of Next Generation Combat Vehicles Cross Functional Team for Army Futures Command, said in an interview. “What is the essence of decisionmaking when it comes to assessing those items that are not exactly tangible? Where there is not a ones and zeros solution . . . and answers come through with experience and anticipation. Currently, humans are better at game theory than machines.”

When it comes to aggregating incoming sensor data, discerning targets, performing real-time analytics, organizing threat data and making recommendations to human decisionmakers in seconds, nothing can parallel artificial-intelligence-enabled systems. At the same time, as Coffman indicated, the less tangible or more subjective variables vital to wartime decisionmaking go well beyond the capacity of ones and zeros. Many elements of decisionmaking involve psychological nuance, emotional considerations about the level of risk to soldiers and interwoven factors such as how to respond to several new developments intersecting at once. In such circumstances, human reasoning and cognition place people in a far better position to make determinations. Warfare maneuvers involve degrees of risk-taking, damage assessment and even informed speculation, which can require decisions that exceed the analytical reach of mathematically oriented machines and advanced algorithms.
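The division of labor Coffman describes can be pictured as a simple human-in-the-loop pipeline: the machine aggregates and ranks sensor tracks in seconds, then defers the engagement decision to an operator. The following is a minimal, purely illustrative sketch; the data structures, confidence threshold and scoring rule are hypothetical and not drawn from any fielded Army system.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SensorReport:
    """Hypothetical fused track from forward drones and vehicle sensors."""
    track_id: str
    range_m: float      # distance to the contact in meters
    confidence: float   # 0.0-1.0 confidence that this is a valid threat
    threat_type: str    # e.g. "armor", "infantry", "unknown"

def rank_threats(reports: List[SensorReport]) -> List[SensorReport]:
    """Machine task: filter low-confidence contacts and order the rest so the
    closest, highest-confidence threats come first."""
    credible = [r for r in reports if r.confidence >= 0.6]
    return sorted(credible, key=lambda r: (r.range_m, -r.confidence))

def human_decision(candidate: SensorReport) -> bool:
    """Human task: the final engagement call stays with an operator. Here it is
    a console prompt; in practice it would be a crew station interface."""
    answer = input(f"Engage {candidate.threat_type} track {candidate.track_id} "
                   f"at {candidate.range_m:.0f} m? [y/N] ")
    return answer.strip().lower() == "y"

def recommend_and_defer(reports: List[SensorReport]) -> Optional[SensorReport]:
    """AI recommends, human decides: nothing is engaged without explicit approval."""
    for candidate in rank_threats(reports):
        if human_decision(candidate):
            return candidate
    return None
```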

Kris Osborn is the defense editor for the National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army—Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also has a Master’s Degree in Comparative Literature from Columbia University. 

Image: Reuters