Yuma Proving Grounds, Arizona: Gray Eagle drones armed with HELLFIRE missiles and GBU-69 glide bombs, along with 155mm artillery firing at ranges beyond 60 km, destroyed SA-22 enemy air defenses and armored ground combat vehicles, scoring direct hits on multiple T-72 tanks during the U.S. Army’s Project Convergence 2020.
The real story, however, according to senior Army leaders attending the service’s transformational combat experiment, was about data sharing, networked targeting, and a cutting-edge AI system called FIRESTORM.
“The bullet flying through the air and exploding is interesting, but that is not what is compelling about Project Convergence. It is everything that happens before the trigger is pulled. We did not come out here for a precision-fires exercise, what we came out here to do is increase the speed of information between sensing the target and passing that information to the effector,” Brig. Gen. Ross Coffman, Director, Next Generation Combat Vehicles Cross Functional Team, Army Futures Command, told reporters Sept. 23 at Yuma Proving Grounds, Ariz.
FIRESTORM uses advanced computer algorithms to gather radio data link feeds, video stream data, navigational and terrain specifics, weather conditions, target coordinates and precisely identified enemy location information. FIRESTORM then uses AI-enabled computer processing to perform near real-time data analytics and compare all of these variables against a vast or seemingly limitless database. The various information streams are pooled together and analyzed in relation to one another to organize the data and identify the optimal weapon or “effector” needed for that particular target.
“FIRESTORM is a computer brain that recommends the best shooter, updates the common operating picture and enemy and friendly situations. It ‘missions’ the effectors that we want to eradicate the enemy on the battlefield. As enemy targets were identified on the battlefield, FIRESTORM quickly paired those targets with the best shooter in position to put effects on this,” Coffman said.
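The “best shooter” pairing Coffman describes can be illustrated with a toy scoring loop: for each detected target, rank every available effector by range margin and readiness, and task the top candidate. Everything below — the field names, weights, and the `pick_shooter` function — is a hypothetical sketch for illustration; FIRESTORM’s actual logic has not been made public.

```python
import math

# Hypothetical "best shooter" pairing step: score every available
# effector against a detected target and pick the highest-scoring one.
# Field names, weights, and data are illustrative assumptions only.

def distance(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def score(shooter, target):
    """Return a pairing score, or None if the shooter cannot engage."""
    d = distance(shooter["pos"], target["pos"])
    if d > shooter["max_range_km"] or target["type"] not in shooter["effective_vs"]:
        return None  # out of range, or wrong effect for this target class
    # Prefer shooters with range margin to spare and ready ammunition.
    range_margin = 1.0 - d / shooter["max_range_km"]
    return range_margin + 0.1 * shooter["rounds_ready"]

def pick_shooter(shooters, target):
    scored = [(score(s, target), s["name"]) for s in shooters]
    scored = [(v, n) for v, n in scored if v is not None]
    return max(scored)[1] if scored else None

shooters = [
    {"name": "artillery", "pos": (0, 0), "max_range_km": 60,
     "rounds_ready": 4, "effective_vs": {"tank", "air_defense"}},
    {"name": "gray_eagle", "pos": (10, 5), "max_range_km": 40,
     "rounds_ready": 2, "effective_vs": {"tank"}},
]
print(pick_shooter(shooters, {"type": "tank", "pos": (20, 20)}))  # -> artillery
```

The point of the sketch is the shape of the problem, not the weights: the hard part FIRESTORM automates is evaluating every sensor-to-shooter pairing continuously, faster than a human fires-cell could.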
FIRESTORM can arrive at many analytical conclusions in an instant by weighing new information against previously compiled data. Machine learning happens when AI-enabled databases immediately assimilate new information that is entirely different from what is already in the database. The pace at which this new information is discerned, analyzed and integrated is the fundamental value added by AI.
Perhaps certain weapons, such as artillery, were proven effective for a certain range and target composition in particular weather conditions, at a particular altitude, against particular defenses and in a particular terrain configuration. The computer analyzes all of these variables, both individually and in relation to one another, against its database and pairs the right weapon to the particular target engagement. This entire process can now take place in seconds, representing an exponential leap beyond previously achieved benchmarks of roughly twenty minutes.
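One way to read “comparing variables against its database” is as a nearest-neighbor lookup over past engagement records: find the historical engagement most similar to the current one and recommend the weapon that worked then. The sketch below does exactly that; the features, records, and distance measure are all fabricated for illustration and are not drawn from any real system.

```python
# Illustrative sketch: recommend a weapon by finding the most similar
# past engagement in a (fabricated) history database. The features and
# records are assumptions purely for illustration.

HISTORY = [
    # (range_km, wind_kph, target_armor 0-1) -> weapon that worked
    ((55, 10, 0.9), "artillery"),
    ((8, 5, 0.9), "hellfire"),
    ((30, 20, 0.2), "glide_bomb"),
]

def dissimilarity(a, b):
    # Sum of squared per-feature differences (a crude distance metric).
    return sum((x - y) ** 2 for x, y in zip(a, b))

def recommend(situation):
    """Return the weapon used in the most similar recorded engagement."""
    return min(HISTORY, key=lambda rec: dissimilarity(rec[0], situation))[1]

print(recommend((50, 12, 0.8)))  # closest to the long-range artillery record
```

In this framing, “assimilating new information” is simply appending each fresh engagement outcome to `HISTORY`, so the next recommendation can draw on it immediately — which is what gives the seconds-long decision loop its claimed advantage over a twenty-minute manual process.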
“This is happening faster than any human could execute,” Coffman said.
However, the system must be adaptable to new enemy threats. Once enemies encounter certain systems, they of course immediately move to counter them, requiring developers to field improvements quickly.
“We need code writers who will need to change algorithms to adjust to new threats. We can't wait 24 hours, we will have to change instantaneously to targets. We need to make decisions at speed and get ahead of the enemies' decision cycle,” Lt. Gen. Charles Flynn, the Army’s Deputy Chief of Staff, G-3/5/7, told reporters.
Flynn further explained this “need for speed” in the context of the well-known Processing, Exploitation and Dissemination (PED) process, which gathers information, then distills and organizes it before sending carefully determined data to decision-makers. The entire process, long used for handling things like drone video feeds, has now been condensed into a matter of seconds, in part due to AI platforms like FIRESTORM. Advanced algorithms can, for instance, autonomously sort through hours of live video feeds, identify moments of potential significance to human controllers and transmit the often time-sensitive information.
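The autonomous video-triage step described above can be sketched as simple change detection over a frame stream: flag only the frames whose content departs sharply from a recent baseline, and route those timestamps to an analyst. Real PED pipelines would use learned object detectors; the single-number “frames,” window size, and threshold below are stand-ins chosen only to show the idea.

```python
# Toy change-detection pass over a video feed, with each "frame" reduced
# to a single activity value. The data and threshold are illustrative;
# real systems would run learned detectors on full imagery.

def flag_frames(activity, window=3, threshold=5.0):
    """Return indices of frames that deviate sharply from the recent baseline."""
    flagged = []
    for i in range(window, len(activity)):
        recent = sorted(activity[i - window:i])
        baseline = recent[window // 2]  # median resists being skewed by the spike itself
        if abs(activity[i] - baseline) > threshold:
            flagged.append(i)  # time-sensitive moment -> route to a human analyst
    return flagged

feed = [10, 10, 11, 10, 30, 10, 10, 10]  # sudden spike at index 4
print(flag_frames(feed))  # -> [4]
```

The payoff Flynn describes is exactly this filtering ratio: hours of feed reduced to a handful of flagged moments, computed at the tactical edge rather than shipped rearward for manual review.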
“In the early days we were doing PED away from the front lines, now it's happening at the tactical edge. Now we need writers to change the algorithms,” Flynn explained.
“Three years ago it was books and think tanks talking about AI. We did it today,” Army Secretary Ryan McCarthy said.
Kris Osborn is defense editor for the National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army—Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also has a master’s degree in Comparative Literature from Columbia University.