Get Ready for the U.S. Army’s War of the Future
Army Futures Command is further refining a cutting-edge combat system for soldiers that uses augmented-reality technology to change the equation for how infantry might navigate close-quarters combat while under enemy fire.
Here’s What You Need to Remember: An infantry soldier could be under enemy fire from multiple directions, altitudes, and ranges. If so, what are the limits of human perception that define how best to respond to the attack?
This question offers a window into an interesting technical and scientific challenge now being addressed by U.S. Army weapons developers, who are working with soldiers to test and adjust requirements for a high-tech sensor system able to “augment” and improve key characteristics of human vision and perception, such as angle, range and distance, or several of these variables at once.
The technology, called the Integrated Visual Augmentation System (IVAS), is a set of soldier goggles built with sensors to help soldiers operate beyond the limits otherwise imposed by human vision. The Army plans to field 200,000 of the headsets in 2021, a service statement said.
“IVAS is designed to enhance the lethality and survivability of the Army’s Close Combat Force through a combination of technologies and augmented reality capabilities delivered in the form of a Heads-Up Display device. It is a single platform that allows the Soldier to fight, rehearse, and train, because it leverages networked information sharing and mixed and augmented reality technologies,” an Army report said.
This most recent exercise, called Soldier Touchpoint 2, took place at Fort Pickett, Virginia, giving soldiers an opportunity to experiment with the system and provide feedback to the commanders and weapons developers in a position to make adjustments.
Earlier this year, the National Interest spoke with Dr. Bruce Jette, Assistant Secretary of the Army for Acquisition, Logistics and Technology, who explained how IVAS draws upon human-machine interface technology to connect some of the neurological processing of human vision with software developed to assist with depth perception, peripheral vision and other nuances of human sight.
“We don’t perceive distance with one eye, we just see larger or smaller - but if I can put it in both eyes I can get the object in 3D. To do that I need to have the sensing system to know where the eye is looking and focusing. The IVAS does that. It determines what you are looking at and what type of object you are looking at and focusing on to generate a 3D image in front of you. The good part about this is I don’t need all those heavy optics on my face,” Jette said.
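Jette’s point about perceiving distance with two eyes rests on the familiar geometry of stereo triangulation: the slight offset between what each eye (or camera) sees encodes range. The short sketch below illustrates only that general principle, not the IVAS software itself; the function and parameter names (depth_from_disparity, baseline_m, focal_px, disparity_px) are hypothetical.

```python
# Illustrative sketch only: a simplified triangulation model of binocular
# depth perception. This is not the Army's IVAS implementation; names and
# numbers here are assumptions chosen for the example.

def depth_from_disparity(baseline_m: float, focal_px: float, disparity_px: float) -> float:
    """Estimate distance to an object from the horizontal offset (disparity)
    between its position in a left and right view, using the standard
    stereo relation: depth = focal_length * baseline / disparity."""
    if disparity_px <= 0:
        raise ValueError("Disparity must be positive; zero disparity implies an object at infinity.")
    return focal_px * baseline_m / disparity_px

if __name__ == "__main__":
    # Example: a 6.5 cm eye-to-eye baseline, a 1,000-pixel focal length,
    # and a 13-pixel disparity place the object roughly 5 meters away.
    print(depth_from_disparity(baseline_m=0.065, focal_px=1000.0, disparity_px=13.0))
```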
Kris Osborn is the defense editor for the National Interest. Osborn previously served at the Pentagon as a Highly Qualified Expert with the Office of the Assistant Secretary of the Army for Acquisition, Logistics and Technology. Osborn has also worked as an anchor and on-air military specialist at national TV networks. He has appeared as a guest military expert on Fox News, MSNBC, The Military Channel and The History Channel. He also holds a master’s degree in Comparative Literature from Columbia University.