The Army and Raytheon are now accelerating the development and deployment of an upgraded counter-drone weapons system designed specifically to address close-in small drone threats. The integrated counter-drone system pairs a mobile, 360-degree Ku-band ground radar called KuRFS with a suite of specific countermeasures, called effectors. KuRFS provides threat information to ground commanders, who can then opt to use laser countermeasures, electronic warfare (EW), high-powered microwave weapons or a kinetic-energy interceptor missile-drone called the Coyote Block 2. However, before any threat can be destroyed, it must first be identified, or “seen.”
KuRFS began as a Pentagon Urgent Operational Need request to counter enemy drones, rockets, mortars and other airborne threats, including low-flying helicopters, Raytheon developers said.
“A complete c-UAS (Counter UAS) solution needs to be able to automatically detect the intrusion of potentially multiple UASs, identify their type and possible payload, and track their movement consistently inside the monitored area. We have a holistic end-to-end kill chain which includes early warning,” James McGovern, Vice President, Business Development, Mission Systems & Sensors, Raytheon Integrated Defense, told Warrior.
While preparing to upgrade the counter-drone system with the Coyote Block 2, Raytheon and the Army are emphasizing new innovations, such as the application of AI and machine learning. The Coyote Block 2 system is already integrated into an elaborate command-and-control system tasked with organizing and transmitting time-sensitive threat data for human decision-makers under attack. Raytheon developers tell Warrior the firm is looking at next steps with “data fusion,” using AI to analyze fast-arriving data from otherwise disparate sensor systems and optimize the delivery of crucial decision-informing combat data.
McGovern described Block 2 as a “larger, optimized warhead with improved tracking detection, engine performance and warhead effectiveness.” Equipped with an advanced seeker and small warhead, Coyotes can launch from a range of locations, including fixed sites and armored vehicles on the move.
By comparing approaching threat information against a vast database of compiled information, AI-enabled algorithms can perform the real-time analytics necessary to determine and present course-of-action options to a human decision-maker almost instantaneously.
For instance, AI and machine learning programs can analyze arriving threats against previous occasions in which drones attacked in various ways, factoring in a wide array of variables such as speed of approach, swarm techniques, weapons used and navigational factors such as weather obscurants or terrain details.
Therefore, by receiving and quickly analyzing electronic return signals, or “pings,” from a KuRFS radar, AI-empowered command-and-control applications could instantly present commanders with optimal response options, such as which effector, or “kill method,” would be best. Weather complications might make a laser interceptor less effective. An attack over an urban area might rule out “kinetic,” or explosive, defenses, given that fragments or debris could endanger nearby civilians. EW or high-powered microwave weapons might be the optimal method to jam or disable approaching drone swarms, or to interfere with the seeker or guidance system of an attacking aircraft. Lastly, a kinetic option such as the Coyote Block 2 interceptor could directly intercept and destroy an approaching drone, or use a proximity fuse for an “area” explosive effect to knock out small groups of drones.
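The decision logic described above can be imagined as a simple rules engine. The following sketch is purely illustrative: every threshold, field name and rationale here is a hypothetical invention for clarity, not a representation of any actual Army or Raytheon command-and-control software.

```python
# Illustrative sketch only: a simplified, rule-based effector-selection
# routine. All field names and thresholds are hypothetical; real c-UAS
# decision logic is far more complex and not publicly specified.

def select_effector(threat):
    """Rank countermeasure options for a detected drone threat."""
    options = []
    if threat["over_urban_area"]:
        # Kinetic intercepts risk debris falling on civilians below.
        options.append(("EW jamming", "no falling debris"))
        options.append(("high-powered microwave", "disables electronics"))
    elif threat["swarm_size"] > 1:
        # Area-effect weapons handle multiple targets at once.
        options.append(("high-powered microwave", "area effect vs. swarms"))
        options.append(("Coyote Block 2 (proximity fuse)", "area explosive"))
    elif threat["visibility_km"] < 2.0:
        # Obscurants such as fog or dust degrade laser effectiveness.
        options.append(("Coyote Block 2 (direct intercept)", "weather tolerant"))
    else:
        options.append(("laser interceptor", "low cost per shot"))
    return options

# Example: a two-drone swarm over open terrain in clear weather.
threat = {"over_urban_area": False, "swarm_size": 2, "visibility_km": 10.0}
for effector, rationale in select_effector(threat):
    print(f"{effector}: {rationale}")
```

In a real system these rules would be replaced or augmented by models trained on historical engagement data, which is where the machine learning described above comes in.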
All of these potential scenarios require the merging, analysis and organization of threat-specific sensor data, precisely the kinds of tasks AI applications could perform for humans at lightning speed. Ideally, machine learning technologies could receive previously unseen threat specifics of great relevance, merge them with existing data, perform near real-time analytics and render organized options for human commanders.
This progress, already well underway by Army and Raytheon developers, is well articulated in an essay called “Deep Learning on Multi Sensor Data for Counter UAV Applications—A Systematic Review,” published by the U.S. National Library of Medicine, National Institutes of Health.
Networked AI systems can, as described by the essay, be “utilized to process a large variety of data originating from many different sources because of their ability to discover high-level and abstract features that typical feature extraction methods cannot…. The utilization of deep learning methods in data fusion aspects can be of significant importance in addressing the critical issue of multi-sensory data aggregation.”
Integrating accumulated sensor data can, according to the essay, pinpoint optimal sensor applications for specific threat scenarios. Precision renderings generated by KuRFS could quickly be fortified by electro-optical/infrared sensors, laser ISR, acoustic applications or radio-frequency (RF) signals. Furthermore, attributes of one sensor can compensate for limitations of another, creating what Raytheon developers describe as a “common operating picture.”
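One minimal way to picture this kind of data fusion is combining independent detection confidences from several sensors into one fused track confidence. The sketch below uses a naive log-odds combination; the sensor names, probabilities and the fusion rule itself are all illustrative assumptions, not a description of how KuRFS or any fielded system actually works.

```python
# Hypothetical sketch of multi-sensor data fusion: each sensor reports an
# independent detection probability, combined here via naive log-odds.
# Sensor names and values are invented for illustration.
import math

def fuse_confidences(readings):
    """Combine per-sensor detection probabilities into one fused value."""
    log_odds = 0.0
    for sensor, p in readings.items():
        p = min(max(p, 1e-6), 1 - 1e-6)  # clamp to avoid infinities
        log_odds += math.log(p / (1 - p))
    return 1 / (1 + math.exp(-log_odds))

readings = {
    "KuRFS_radar": 0.80,    # strong Doppler return
    "EO_IR_camera": 0.65,   # partial visual match
    "acoustic": 0.55,       # weak rotor-noise signature
}
print(round(fuse_confidences(readings), 3))  # → 0.901
```

Note how three individually uncertain readings reinforce one another: the fused confidence (about 0.90) exceeds any single sensor's, which is the basic payoff of aggregating disparate sensor data.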
For instance, electro-optical cameras could bring additional detail to some of the electronic returns provided by KuRFS’ use of Doppler radar technology, an application which captures and analyzes speed, movement and other target-relevant details.
“The intrinsic movements of the targets could describe the rotation of rotor blades of a rotary wing UAV or of a helicopter, the propulsion turbine of a jet, the flapping of the wings of a bird, and can be statistically described by the radar m-D (micro-Doppler) signature,” the NIH essay states.
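The idea behind micro-Doppler classification can be sketched very roughly: the periodic modulation rate of a radar return (fast rotor blades versus slow wing flaps) hints at the target class. Everything below, from the frequency bands to the signal parameters, is invented for illustration and bears no relation to real radar processing pipelines, which operate on far richer spectral signatures.

```python
# Hypothetical micro-Doppler illustration: estimate the periodic modulation
# rate of a simulated return from upward zero crossings, then map it to a
# coarse target class. All frequency bands here are made up for clarity.
import math

def modulation_rate_hz(samples, sample_rate_hz):
    """Estimate the dominant periodic rate from upward zero crossings."""
    crossings = sum(1 for a, b in zip(samples, samples[1:]) if a < 0 <= b)
    duration_s = len(samples) / sample_rate_hz
    return crossings / duration_s

def classify(rate_hz):
    if rate_hz < 15:
        return "bird (wing flapping)"
    if rate_hz < 60:
        return "helicopter (main-rotor blade flash)"
    return "small UAV (fast rotor blades)"

# Simulate one second of a 120 Hz rotor-modulated return at 1 kHz sampling.
fs = 1000
signal = [math.sin(2 * math.pi * 120 * n / fs) for n in range(fs)]
print(classify(modulation_rate_hz(signal, fs)))  # → small UAV (fast rotor blades)
```

Real systems analyze the full micro-Doppler spectrum rather than a single rate, but the principle is the same: distinct mechanical motions leave distinct periodic fingerprints in the radar return.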
The essay further elaborates that, for instance, radar might bring excellent precision detection, visual cameras could further distinguish target information, and thermal sensors could detect the “heat signature” coming from an enemy drone’s engine.
Kris Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army - Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel and The History Channel. He also has a Master’s Degree in Comparative Literature from Columbia University.