Simulated Systems Training Key for Building Military AV Trust


Research continues into the use of autonomous vehicles for military operations. The benefits are clear, but for these systems to reach their full potential, military personnel must trust the systems under their control. Simulated systems training can assist in that effort.



As technology evolves, robotic vehicles and drones stand to play an even larger role on the modern battlefield, from scouting landscapes before an attack to evacuating wounded soldiers. This equipment can reduce a frontline unit's exposure to danger, improve battlefield intelligence and engage enemy troops or vehicles in combat.

Autonomous vehicles (AVs) provide many military benefits, chief among them the ability to increase capabilities without increasing bodily risk. However, AVs intended for military scenarios, especially those incorporating weapons, must undergo rigorous testing before deployment.

Even the most advanced technologies on the battlefield have little to offer if operators do not know how to use them, or do not trust them. Recent research into the military's AI investments found a critical lack of examination of human-machine trust.

As the US Department of Defense (DOD) and other militaries across the globe expand their use of autonomous vehicles, they must also gather insight into how pairing robotics technology with humans will work. Cultivating soldiers' trust, enhancing their core competencies, and increasing their comfort level with AI-enabled systems is critical for mission success.

Here's how militaries can keep experimenting with and advancing robotic vehicles, and how simulation might evolve over the next decade:

Military Autonomous Vehicle R&D
The US Army’s Artificial Intelligence for Maneuver and Mobility (AIMM) Essential Research Program continues to develop robot combat vehicles designed to provide new combat capabilities. Fully autonomous vehicles used in multi-domain operations and diverse terrains, for example, will eliminate the need for soldiers to split their attention between operating remote-controlled vehicles and other mission-critical tasks.

Ongoing Challenges
The Army plans to implement these armed robots in the next decade, but many challenges remain, including evaluating how these autonomous vehicles can best support a fighting force. Preventing cyberattacks, developing appropriate maintenance schedules and identifying the chain of command for their control remain priorities, too. As automated vehicles become more prevalent in the military, they will become a target, especially given the proliferation of off-the-shelf technologies available for an enemy to use in retaliation.

Additional Issues
Another issue? Vehicle command structure. What happens if, for example, an officer tasked with commanding an autonomous vehicle is injured or killed during combat? Who assumes responsibility for those commands? The military is implementing specific safeguards for when the original person in charge becomes unavailable. Ideally, the vehicle will continue to make the same decisions a human would in these scenarios, which is essential for building soldiers' trust.

Also evolving is the approach to monitoring multiple automated vehicles. Unlike soldiers, these robots and vehicles cannot explain where they are or communicate situational awareness. If they are under fire, or not under fire, how can you get that status update without monitoring their cameras 24/7? Autonomous vehicles are not trained to report in the same way as human troops. Therefore, if 20, 30 or 40 of these vehicles are actively deployed on a battlefield, how do you identify a specific vehicle with issues, or route that information correctly so a soldier can resolve the problem?

Simulated Systems Training
Autonomous AI, much like real soldiers, requires training. In the civilian market, improvements in hardware and algorithms have driven great leaps in AI capability, and AI systems can consume huge amounts of verified training data. For example, most Internet users have encountered CAPTCHA-style security questions such as "click all the cars in the picture." CAPTCHA security enables websites to differentiate humans from bots, and the human answers it collects form one type of verified training data AI systems can use to better predict and respond to behavior.
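To illustrate how human answers become "verified" training data, here is a minimal sketch of crowd-label aggregation. The function name, label values and agreement threshold are hypothetical, chosen only for the example; real pipelines use more sophisticated consensus and quality controls.

```python
from collections import Counter

def verified_label(responses, min_agreement=0.7):
    """Aggregate several human responses to the same image tile
    (e.g., 'car' / 'not_car' clicks on a CAPTCHA grid) into one
    verified label. Returns the majority answer only when agreement
    meets the threshold; otherwise the sample is discarded."""
    counts = Counter(responses)
    label, votes = counts.most_common(1)[0]
    if votes / len(responses) >= min_agreement:
        return label
    return None  # no consensus, so do not use this tile for training

# Five users answered the same tile; four agreed it contains a car.
print(verified_label(["car", "car", "car", "not_car", "car"]))  # car
print(verified_label(["car", "not_car", "car", "not_car"]))     # None
```

Only tiles that clear the consensus bar enter the training set, which is what makes the resulting labels trustworthy enough to train a model on.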

While this kind of collection can create huge amounts of useful training data in the civilian market, the military domain differs. Some civilian-captured data can teach effective behavior to military tech. However, autonomous vehicles encountering the chaos of warfare equipped with only civilian data cannot perform to expectations. They have not had enough training in that type of intensified scenario.

A simulated environment or situation designed to mimic the real world can present scenarios like live fire, casualties, varying rules of engagement or enemy behavior to train AI how to respond appropriately. Best of all, these variations are limitless, since the system can perform repeated tests without time or spatial constraints.
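The "limitless variations" idea can be sketched as simple scenario randomization: each training run draws a fresh combination of conditions. The factor names and value sets below are hypothetical placeholders, not taken from any real simulation product.

```python
import random

# Hypothetical scenario factors a simulated environment might vary.
FACTORS = {
    "threat":     ["none", "small_arms", "indirect_fire", "ambush"],
    "casualties": [0, 1, 2, 5],
    "roe":        ["weapons_hold", "weapons_tight", "weapons_free"],
    "terrain":    ["urban", "wooded", "desert", "mountain"],
}

def sample_scenario(rng):
    """Draw one randomized scenario: every factor varies per run, so
    repeated tests cover combinations no live exercise could."""
    return {name: rng.choice(values) for name, values in FACTORS.items()}

# A seeded generator makes runs reproducible for debugging.
rng = random.Random(42)
batch = [sample_scenario(rng) for _ in range(10_000)]  # cheap, unconstrained repetitions
```

Because scenarios are generated rather than staged, the same AI agent can face thousands of distinct situations per day, which is the core advantage simulation holds over live training.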

The Importance of Trust
The next generation of ground vehicles and future robotic vehicle development will include equipment specifically designed for scenarios deemed too dangerous for human troops. Examples include:

  • Scouting missions where discovery could prove fatal.
  • Reconnaissance missions in terrains, like heavily wooded areas, where flying drones is impractical.
  • Convoys into enemy-held territory.

But soldiers must trust that these AI-powered vehicles will perform as intended. Training AI agents within a simulated environment shows the potential for developing that trust in a human-machine team.

No matter how much the Army comes to rely on AI and autonomous vehicles, even fully automated processes will always need a human in the loop. Now more than ever, military personnel require training in simulated environments to fully transition AI/autonomous vehicles to mission-ready status.

About the Author

Oliver Arup is the Senior Vice President of Product Management at BISim. He is responsible for the development and ongoing success of VBS, BISim's flagship software, as well as additional research and development projects within the company. He has over 18 years of experience in the simulations industry.

