Thanks To Artificial Intelligence, The Age Of 'Hyperactive' Warfare Is Here
Quite simply put, outcomes in future wars will likely be determined by the "speed" of decision making.
(Washington, D.C.) Future warfare will be characterized by what the Commander of Army Futures Command calls a "hyperactive battlefield" -- a chaotic, fast-moving mix of complex variables in need of instant analysis as lives... and combat victory... hang in a delicate, hazy balance of uncertainty.
Quite simply put, outcomes in future wars will likely be determined by the "speed" of decision making. This phenomenon, described by Gen. John Murray, head of Army Futures Command, in an interview with Warrior, underscores some of the reasons why weapons developers are weaving Artificial Intelligence applications into nearly all major future systems.
"The ability to see understand, decide and act faster than an adversary in what is going to be a very hyperactive battlefield in the future I think would be number one when it comes to the fast application of AI," Murray said.
While Murray was careful to point out that AI-focused technologies come with limitations and a need for clear guidelines, he explained that fast-evolving uses of AI-empowered weapons systems can enable Army commanders to "see first, decide first, act first" -- and therefore destroy enemies faster. Essentially, AI-processed data can exponentially increase the speed at which humans can make decisions.
"There was a guy named John Boyd in the Air Force who came up with this concept called the OODA Loop, which means Observe, Orient, Decide and Act... if you can observe and get inside the OODA Loop of your adversary that means you can get to understanding and action faster. I actually think that is a great way to look at what I believe is the most logical and valuable use of AI for military applications," Murray said.
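The logic of "getting inside" an adversary's OODA loop can be illustrated with a toy simulation. The timings below are purely hypothetical and do not reflect any real system; the point is simply that the side with the shorter decision cycle completes more complete cycles -- and thus more actions -- inside the same engagement window.

```python
from dataclasses import dataclass

@dataclass
class Combatant:
    """A toy actor that repeatedly runs an Observe-Orient-Decide-Act cycle."""
    name: str
    cycle_seconds: float  # time to complete one full OODA loop

    def cycles_completed(self, window_seconds: float) -> int:
        # Number of complete decision cycles inside a fixed engagement window.
        return int(window_seconds // self.cycle_seconds)

# Hypothetical timings: AI-assisted analysis shortens the Orient and Decide phases.
ai_assisted = Combatant("AI-assisted commander", cycle_seconds=2.0)
unassisted = Combatant("Unassisted adversary", cycle_seconds=5.0)

window = 60.0  # one-minute engagement window
fast = ai_assisted.cycles_completed(window)
slow = unassisted.cycles_completed(window)
# The faster side repeatedly observes and acts before the slower side can
# respond -- the essence of operating inside an adversary's OODA loop.
```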
A quick look across the Army's portfolio of current high-priority modernization programs immediately reveals that essentially all of them are being developed with an eye to how AI will improve and impact performance; Army development of future combat vehicles, aircraft, long-range fires, infantry combat systems and, of course, warfare networks now all integrate AI-related technologies.
Interestingly, and in a somewhat paradoxical way, the prominent emergence of AI has underscored the uniquely dynamic and indispensable qualities of human cognition. While AI can process information and perform an increasing range of functions by analyzing and organizing data, Army technology developers consistently emphasize that humans need to remain the ultimate decision-makers when it comes to command and control. AI, developers explain, can assist or inform human decision-makers by analyzing data, presenting options quickly and performing otherwise overly complex, time-consuming or impossible tasks in seconds. AI, the Army thinking goes, will "assist," but not replace, human cognition and its many decision-making faculties.
AI & The Counter-Drone War
There are simply too many examples of AI-weapons integration to cite, yet one illustrative example of what Murray referred to as "see first, decide first, act first" can be found in the Army's current high-tech war on enemy drones.
The Army and Raytheon are now accelerating development and deployment of an upgraded counter-drone weapons system designed specifically to address close-in small drone threats. The integrated counter-drone system uses a mobile, 360-degree Ku-band ground radar called KuRFS in conjunction with a suite of specific countermeasures, called effectors. KuRFS can provide threat information for ground commanders, who can then opt to use laser countermeasures, electronic warfare (EW), High-Powered Microwave weapons or a kinetic energy interceptor missile-drone called Coyote Block 2. However, before any threat can be destroyed, it must first be identified or "seen."
KuRFS began as an Urgent Operational Need request from the Pentagon to address an immediate and pressing need to counter enemy drones, rockets, mortars and other airborne threats -- including lower flying helicopters, Raytheon developers said.
“A complete c-UAS (Counter UAS) solution needs to be able to automatically detect the intrusion of potentially multiple UASs, identify their type and possible payload, and track their movement consistently inside the monitored area. We have a holistic end to end kill chain which includes early warning,” James McGovern, Vice President, Business Development, Mission Systems & Sensors, Raytheon Integrated Defense, told Warrior.
While preparing to upgrade the counter-drone system with the Coyote Block 2, Raytheon and the Army are emphasizing new innovations, such as the application of AI and Machine Learning. The Coyote Block 2 system is already integrated into an elaborate command and control system tasked with organizing and transmitting time sensitive threat data for human decision makers under attack. Raytheon developers tell Warrior the firm is looking at some of the next steps with “data fusion,” involving the use of AI to analyze fast arriving data from otherwise disparate sensor systems to optimize the delivery of crucial decision-informing combat data.
McGovern described Block 2 as a "larger, optimized warhead with improved tracking detection, engine performance and warhead effectiveness." Equipped with an advanced seeker and small warhead, Coyotes can launch from a range of locations, including fixed sites and armored vehicles on the move.
By comparing approaching threat information against a vast database of compiled information, AI-enabled algorithms can perform the real-time analytics necessary to determine and present course-of-action options to a human decision-maker almost instantaneously.
For instance, AI and Machine Learning programs can analyze arriving threats against previous occasions wherein drones were used to attack in a variety of ways, factoring a wide array of variables such as previous speeds of approach, swarm techniques, weapons used and navigational factors such as weather obscurants or terrain details.
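One simple way to picture this kind of comparison against historical engagements is a nearest-neighbor lookup: a new radar track is labeled by the past cases it most closely resembles. The feature set and "history" below are invented for illustration and are in no way the Army's actual data or method.

```python
import math

# Hypothetical historical engagements: (approach speed m/s, number of drones,
# visibility on a 0-1 scale) -> observed attack profile. Illustrative data only.
HISTORY = [
    ((25.0, 1, 0.9), "single-reconnaissance"),
    ((40.0, 1, 0.8), "single-attack"),
    ((30.0, 8, 0.7), "swarm-attack"),
    ((35.0, 12, 0.4), "swarm-attack"),
    ((20.0, 1, 0.3), "single-reconnaissance"),
]

def classify_threat(features, k=3):
    """Label a new track by majority vote among the k nearest past cases."""
    scored = sorted(HISTORY, key=lambda rec: math.dist(rec[0], features))
    votes = [label for _, label in scored[:k]]
    return max(set(votes), key=votes.count)

# A fast-arriving, multi-drone track in poor visibility.
label = classify_threat((32.0, 10, 0.5))
```

In practice the features would be scaled and the model far richer, but the principle -- match the incoming threat against accumulated experience -- is the same.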
Therefore, by receiving and quickly analyzing electronic return signals or "pings" from a KuRFS radar, AI-empowered command and control applications could instantly present commanders with optimal response options, such as which effector, or "kill method," would be best. Perhaps weather complications would make a laser interceptor less effective. Perhaps an attack over an urban area would rule out "kinetic" or explosive defenses, given that fragments or debris could endanger nearby civilians. Perhaps EW weapons or a High-Powered Microwave would be the optimal method to jam or disable approaching drone swarms, or to interfere with the seeker or guidance system used by attacking aircraft. Finally, kinetic options, such as a Coyote Block 2 interceptor, could directly intercept and explode an approaching drone or use a proximity fuse for an "area" explosive effect to knock out small groups of drones.
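The effector trade-offs described above can be sketched as a simple rule table. The rules and thresholds here are illustrative stand-ins for the AI-driven analytics the article describes -- not an actual Army or Raytheon decision table.

```python
def recommend_effector(threat):
    """Rank countermeasure options against a hypothetical threat picture.

    threat: dict with 'drone_count', 'visibility' (0-1), 'over_urban_area'.
    Rules are illustrative only.
    """
    options = []
    # Broad-area, non-kinetic effects are favored against swarms.
    if threat["drone_count"] > 3:
        options.append("high-powered microwave / EW jamming")
    # Lasers degrade in rain, fog or dust obscurants.
    if threat["visibility"] >= 0.7:
        options.append("laser interceptor")
    # Kinetic interceptors risk falling debris over populated areas.
    if not threat["over_urban_area"]:
        options.append("kinetic interceptor (e.g., Coyote Block 2)")
    return options or ["escalate to human commander: no preferred effector"]

# A drone swarm approaching over a city in poor visibility.
swarm_over_city = {"drone_count": 8, "visibility": 0.4, "over_urban_area": True}
choices = recommend_effector(swarm_over_city)
```

Note that the human commander remains the decision-maker; the function only ranks options, mirroring the article's "assist, not replace" framing.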
All of these potential scenarios require the merging, analysis and organization of threat-specific sensor data - precisely presenting the kinds of predicaments AI applications could perform for humans -- at lightning speed. Ideally, Machine Learning technologies could receive and integrate previously unseen threat specifics of great relevance, merging them with existing data, performing near real-time analytics and rendering organized options for human commanders.
This progress, already well underway by Army and Raytheon developers, is articulated in an essay, "Deep Learning on Multi Sensor Data for Counter UAV Applications—A Systematic Review," published by the U.S. National Library of Medicine, National Institutes of Health.
Networked AI systems can, as described by the essay, be "utilized to process a large variety of data originating from many different sources because of their ability to discover high-level and abstract features that typical feature extraction methods cannot…. The utilization of deep learning methods in data fusion aspects can be of significant importance in addressing the critical issue of multi-sensory data aggregation."
Integrating accumulated sensor data can, according to the essay, pinpoint optimal sensor applications for specific threat scenarios. Precision renderings generated by KuRFS could quickly be fortified by Electro-Optical/Infrared sensors, laser ISR, acoustic applications or radio-frequency (RF) signals. Furthermore, the attributes of one sensor can compensate for the limitations of another, creating what Raytheon developers describe as a "common operating" picture.
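A minimal sketch of this kind of multi-sensor aggregation is to fuse each sensor's detection confidence in log-odds space, assuming the sensors err independently. The confidence numbers below are hypothetical, and real fusion systems are far more sophisticated, but the sketch shows how weak corroborating evidence from several sensors can produce a fused confidence higher than any single sensor's.

```python
import math

def fuse_detections(probs, prior=0.5):
    """Fuse independent per-sensor detection probabilities in log-odds space.

    Each sensor contributes evidence for or against 'target present';
    assumes the sensors' errors are conditionally independent.
    """
    logit = lambda p: math.log(p / (1.0 - p))
    total = logit(prior) + sum(logit(p) - logit(prior) for p in probs)
    return 1.0 / (1.0 + math.exp(-total))  # back to a probability

# Hypothetical confidences for one track: radar is fairly sure, the EO camera
# less so, and an acoustic sensor adds weak corroboration.
sensor_confidence = {"radar": 0.90, "eo_camera": 0.70, "acoustic": 0.60}
fused = fuse_detections(sensor_confidence.values())
```

Here the fused confidence exceeds the best single sensor's 0.90 -- the numerical counterpart of one sensor's attributes compensating for another's limitations.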
For instance, electro-optical cameras could bring additional detail to some of the electronic returns provided by KuRFS’ use of Doppler radar technology, an application which captures and analyzes speed, movement and other target-relevant details.
“The intrinsic movements of the targets could describe the rotation of rotor blades of a rotary wing UAV or of a helicopter, the propulsion turbine of a jet, the flapping of the wings of a bird, and can be statistically described by the radar m-D (Micro-Doppler) signature,” the NIH essay states.
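The micro-Doppler effect the essay describes can be sketched numerically: rotating blades phase-modulate the radar return, adding spectral sidebands spaced at the rotor rate around the airframe's bulk Doppler line. The signal model and all parameters below are an illustrative toy, not a real radar simulation.

```python
import numpy as np

fs, duration = 1000.0, 1.0   # sample rate (Hz) and dwell time (s)
t = np.arange(int(fs * duration)) / fs
f_body = 100.0               # bulk Doppler shift of the airframe (Hz)
f_rotor = 20.0               # blade rotation rate (Hz)
mod_index = 2.0              # depth of the blade-induced phase modulation

# Point target: a single Doppler line. Rotary-wing target: the same line plus
# micro-Doppler sidebands spaced at the rotor rate (toy model only).
point_target = np.exp(2j * np.pi * f_body * t)
rotary_wing = np.exp(1j * (2 * np.pi * f_body * t
                           + mod_index * np.sin(2 * np.pi * f_rotor * t)))

def strong_lines(signal, threshold=0.1):
    """Count spectral lines whose normalized magnitude exceeds the threshold."""
    spectrum = np.abs(np.fft.fft(signal)) / len(signal)
    return int(np.sum(spectrum > threshold))

n_point = strong_lines(point_target)   # just the line at f_body
n_rotary = strong_lines(rotary_wing)   # f_body plus rotor-rate sidebands
```

The extra spectral lines are the statistical fingerprint a classifier could use to tell a rotary-wing UAV from a bird or a point target.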
The essay further elaborates by stating that, for instance, radar might bring excellent precision detection; visual cameras could further distinguish target information and thermal sensors could, for example, detect the “heat signature” coming from an enemy drone’s engine.
Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army - Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel and The History Channel. He also has a Master's Degree in Comparative Literature from Columbia University.
Image: Flickr.