Here's What You Need to Remember: Fast Event-based Neuromorphic Camera and Electronics (FENCE) is a low-power, event-based infrared focal plane array paired with a new class of digital signal processing and learning algorithms, enabling intelligent sensors that can handle more dynamic scenes.
Enemies planning to attack often deliberately move in short, unexpected spurts to elude detection, emerge from areas hidden from overhead view and often use decoys or dummies to confuse drones and surveillance planes. Insurgents, enemy vehicles, dismounted fighters and even some aircraft vary their patterns, change routines and regularly take specific steps to reduce their chances of being spotted by drones.
This is why there is so much work now taking Processing Exploitation Dissemination (PED) to a new level using artificial intelligence (AI) to sift through hours of video data and identify those critical moments of importance to commanders. The PED process, which regularly faces the challenging task of organizing massive volumes of incoming data from electro-optical/infrared cameras and infrared sensors, increasingly draws upon advanced computer algorithms to bounce new information against an existing database and perform analytics to support fast decisionmaking.
With a similar goal in mind, the Defense Advanced Research Projects Agency (DARPA) and several industry partners such as Raytheon Intelligence & Space, Northrop Grumman and BAE Systems are now fast-tracking a technological system engineered to find and transmit only images or pixels that have “changed” in order to pinpoint moments of relevance. This is critical, as a drone might have its surveillance cameras trained on a static area for hours on end, watching for enemy movements or any key developments of interest. How can that seemingly unmanageable volume of data be organized efficiently so that only data of value is transmitted to human decisionmakers?
The new DARPA system, called Fast Event-based Neuromorphic Camera and Electronics (FENCE), uses emerging technology designed to replicate certain crucial brain functions, a DARPA statement explained. FENCE draws upon advanced algorithms to only transmit pixels that have changed, therefore finding moments of potential relevance and avoiding any need to process or organize hours of static, unchanging images.
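The core idea of transmitting only changed pixels can be illustrated with a toy sketch. This is a hypothetical frame-differencing example, not DARPA's or Raytheon's actual method; real neuromorphic sensors perform this comparison asynchronously in silicon at each pixel rather than between full frames, and the function name and threshold here are assumptions for illustration.

```python
def change_events(prev_frame, curr_frame, threshold=10):
    """Toy model of event-based readout: report (x, y, polarity)
    only for pixels whose intensity changed by more than `threshold`.
    Polarity is +1 for brightening, -1 for dimming."""
    events = []
    for y, (prev_row, curr_row) in enumerate(zip(prev_frame, curr_frame)):
        for x, (p, c) in enumerate(zip(prev_row, curr_row)):
            if abs(c - p) > threshold:
                events.append((x, y, 1 if c > p else -1))
    return events

prev = [[0] * 4 for _ in range(4)]   # static 4x4 scene, all dark
curr = [row[:] for row in prev]
curr[1][2] = 200                     # a single pixel brightens
print(change_events(prev, curr))     # -> [(2, 1, 1)]
```

A static scene produces no events at all, which is the source of the data and power savings described below: the sensor stays quiet until something actually moves.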
FENCE is a low-power, event-based infrared focal plane array paired with a new class of digital signal processing and learning algorithms, enabling intelligent sensors that can handle more dynamic scenes.
“Neuromorphic refers to silicon circuits that mimic brain operation; they offer sparse output, low latency, and high energy efficiency,” said Dr. Whitney Mason, the program manager leading the FENCE program. “Event-based cameras operate under these same principles when dealing with sparse scenes, but currently lack advanced ‘intelligence’ to perform more difficult perception and control tasks.”
A Raytheon statement, which quotes Brad Tousley, vice president for Advanced Concepts Technology at Raytheon Intelligence & Space, says the new technology helps identify moments of critical combat relevance to save time and expedite mission efficiency.
“The technology would allow sensors to forego hours of data-intensive video streaming by reacting only when there is activity. The potential result is 100 times less data generated and 100 times less power used by the sensor compared to today’s IR sensors,” a Raytheon Intelligence & Space statement said.
It would make sense that an advanced algorithm might have this ability, as AI-enabled programs are engineered to recognize and catalog patterns, find anomalies and discern new input. All of the aggregated data can then be bounced off of a seemingly limitless amount of previously stored information to perform analyses, draw conclusions or simply highlight key points when patterns change or demonstrate some margin of difference.
Silicon circuits mimicking brain operations align, at least in concept, with cutting-edge research underway among scientists at the Army Research Lab, who explain that electrical impulses from the human brain can inform computerized sensors to instantly process data, or “moments of response,” identified by human eyes.
Kris Osborn is the defense editor for the National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army—Acquisition, Logistics & Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel, and The History Channel. He also has a Master’s Degree in Comparative Literature from Columbia University.
This article is being republished due to reader interest.