Senior representatives from Team Orlando’s military services participated in the National Training and Simulation Association (NTSA) online webinar, “Preparing Our Military: Opportunities for the Future,” on May 17, 2023.


Dr. Linda Brent, owner and CEO of the ASTA Group, moderated the panel that addressed a variety of topics covered by the webinar’s title. Participants included Karen D. H. Saunders (SES), senior leader for U.S. Army Program Executive Office Simulation, Training and Instrumentation (PEO STRI); Navy Capt. Tim James, executive officer for Naval Air Warfare Center Training Systems Division; Heath Morton, training systems technical advisor for the Air Force Materiel Command’s Air Force Life Cycle Management Center; and John Taylor, deputy program manager for the Marine Corps’ Program Manager for Training Systems (PM TRASYS).


After opening comments, Brent began the discussion by asking the panel about the biggest challenges in implementing augmented reality/virtual reality (AR/VR) in training.


Morton began by addressing three issues the Air Force had identified. First, commercial AR/VR goggles were often manufactured in countries whose origin made it difficult to bring them into a classified environment. Second, finding goggles that didn't cause "sim sickness" due to lag or other performance issues could also be difficult. The third challenge he mentioned involved validating that the performance of new simulators was equal to, or better than, current simulators for realistic, immersive pilot training.


“From the Marine Corps’ perspective, we’re looking at conducting more front-end analysis of how the bubble of XR [extended reality] can enhance what we’re doing,” Taylor said. “In the meantime, [we must determine if XR] can be integrated, can it be integrated without causing any issues, and we have to ensure that the ‘switchology and knobalogy’ is not presenting ‘negative training’ [actions students would need to unlearn as soon as possible]. There should be no difference in employing a weapons system in real life versus AR/VR… and while we understand what that needs to be, there’s still more work to get there.”


Brent next introduced the topic of ever-changing threats and the speed required to deal with them, and asked what role industry can play in supporting those concerns. James responded that industry can be particularly useful in helping the military realize where it is in peer competition, and what will be necessary in the near future and the longer term. He qualified his comments by saying that the days of exponentially growing budgets were over.


“The norm going forward will be iteration speed, finding out ways to do more with what we already have, and how to train better with assets already deployed [and doing so] at the edge of the envelope,” James said. “Anything we can do with distributed training, connected training, rapidly integrating new capabilities into existing platforms, and getting the training tail closer to the materiel acquisition tail will be valuable. Our adversaries have less bureaucracy to deal with, so anything that helps us close the gap on getting things fielded faster, connected, interoperable, modular, open – any of those areas – would be greatly appreciated.”


Morton said the Air Force wanted systems to be more open, along with documentation that used model-based systems engineering so that engineers could quickly understand the technical baselines, and he emphasized the importance of being able to update those baselines. As technology heads toward more software-oriented baselines, changes to software can often be made rapidly. According to Morton, subject matter experts would be looking at how the Air Force effectively puts that speed on contract so that a change from an intelligence source to a simulator can happen in near real time.


When Brent asked the panel about “pain points” when trying to quickly adapt and integrate technologies into training or operational environments, Saunders said that PEO STRI’s pain points weren’t necessarily technology-based, although there were questions on how to integrate artificial intelligence (AI) and machine learning (ML) responsibly. PEO STRI’s challenges, according to Saunders, had more to do with issues such as agile requirements development, helping functionals understand how to take high-level requirements and work with STRI to iteratively develop capability, being able to test in an agile manner, and contracting.


In responding to Saunders, Brent added that requirements are a challenge on both the industry and government sides because industry must understand detailed requirements well enough to meet them. Brent went on to note that it was sometimes difficult, on both the government’s and the user’s sides, to distinguish “soft” requirements from “hard” ones and to articulate them clearly to all those concerned.


Taylor said that, speaking from his individual perspective, agile contracting, agile requirements, agile acquisition and agile resources could get in each other’s way. Pushing fast-developing factors into the system, Taylor said, could create problems with current other transaction agreements (OTAs), in addition to cybersecurity challenges.


“I don’t need just a prototype, I need an integrated prototype – that’s how I demonstrate utility,” Taylor said. “I have plenty of one-off systems now but getting them integrated and interoperable is the biggest challenge. Acquisition is a team sport, everybody’s got a role, and… we have to keep the aperture wide open in taking comments from all sides.”


James said that traditionally, and for too long, the focus was on getting more out of the equipment, instead of getting more out of the person.


“If we can get more data collected on training effectiveness and how individuals are interacting with the environment or their equipment in real time, that will help us take advantage of some of these new emerging technologies like AI, data science, and all those computing capabilities we didn’t have in the past… we can make more effective, real-time training,” James said. “It’s going to take some high-performance computers and some clever coding to get there, but we need to get to that level of training where we can do more than just do what the previous student did and later students get the cheat sheet to pass the test. We need to push each individual further in their training so they can master the speed that’s right for them and not just master the canned syllabus.”


When Brent turned the discussion to AI and its ethical use, James mentioned that the advent of AI both scared and excited him, emphasizing that adversaries aren’t constrained by the same ethical or bureaucratic processes as the U.S. military. He said he knew hostile forces were doing things in the AI world that American forces would not do because “[adversaries’] ethical red lines are further out.”


“We definitely need to be claiming that space, pushing ourselves, taking advantage of all the capabilities we can, and we need to be doing it quickly so that we can react in real time when we discover how our adversaries are using [AI] on the battlefield,” James said. “I’m scared of AI in cybersecurity and ‘smartbots’ trying to get around our systems, and I’m as worried about an attack vector as I am training effectiveness. Where I’m excited is the ability to do adaptive, real-time training, dynamic terrain generation, and virtual role players who change their behaviors.”


Saunders recognized the power of AI to help in the training domain, but noted that if the military is going to implement AI, it would need to understand what type of tool it is (there are many under the AI/ML umbrella), understand what the requirement is, and what that tool would be doing for the users.


She went on to say that going through a deliberate process would ensure the ethical and responsible implementation of AI into military systems, and that, from a test perspective, PEO STRI is looking at developing capabilities to counter adversarial AI systems.


Taylor agreed with the need to understand requirements and uses, and he voiced more specific concerns when it came to a trust factor, specifically between the AI and the humans interacting with it.


“[For example,] you’re listening to your GPS telling you to go somewhere, and intuitively, you know that’s not where you need to go,” Taylor said. “I think that’s going to be an issue going down range on the implementation of AI.”


The next major NTSA event will be the Training & Simulation Industry Symposium, to be held June 21-22, 2023, in Orlando, Florida. The next NTSA webinar is scheduled for August 2023.