TSIS Panel

The National Training and Simulation Association (NTSA) and National Defense Industrial Association (NDIA) Central Florida Chapter featured a number of important panels for industry at the Training and Simulation Industry Symposium, June 15-16.

The panels, “How ‘Price to Win’ Can Save Your Capture,” “Navigating the Uncertainties of Best Value Services Opportunities to Differentiate Your Offering,” and “Leveraging Artificial Intelligence and Machine Learning to Enhance Training Fidelity,” gave industry members insights from experienced leaders on best contract proposal practices and the benefits of artificial intelligence.

Industry Panel 1: How “Price to Win” Can Save Your Capture

The panel was moderated by Ricardo Lopez, Vice President, Competitive Analytics and Pricing Solutions for Lone Star Analysis, and Angela Alban, President and CEO of SIMETRI, who also served on the panel along with Catherine Emerick, Business Development Executive, Vertex; Ben Dils, Vice President Business Development, Cubic; and John Dorn, Vice President and Deputy General Manager, BAE Systems.

In contract proposals, a Price to Win (PTW) strategy takes into account pricing and non-pricing factors to develop a price that will likely win a government contract while also evaluating a company’s competition.

BAE Systems’ vice president, John Dorn, said, “One thing I think is really important about Price to Win is it is not necessarily a number. It is a process and a way of thinking. And it helps you to really position yourself for success.”

Lopez said another benefit of Price to Win is determining whether a proposal is a good idea at all. “Sometimes you as a leader have to say you really don’t want to move forward, keep your new business investment dollars because you know that your pipeline is out there, and you want to take those dollars and apply it to higher P [profit] ones. The Price to Win system does provide that view for you.”

“From our perspective,” Dils said, “I’d say it’s really helped with OTAs. As many of you know, the requirements aren’t always firm, so all you have to go off of is a notification about the customer budget. It’s really allowed us to model our options, and also be able to look at what they can afford versus what we’re able to offer.”

Emerick said that over two bids, she came to the conclusion that PTW helped her company “determine the value, the premium of what the government was willing to pay based on our capabilities.”

Alban recently hired someone at SIMETRI to support a PTW strategy. “I think, for me, it’s [Price to Win] a safe bet. It really highlighted how much we have not been doing and how lucky we had been prior to that. It’s been educational, but I think that if it’s a must win, you have to have the right answer.”

Alban’s response supported Lopez’s contention that the cost to conduct Price to Win is beneficial in the long run. “How much does it cost to lose? That is the real question,” Lopez said.

All of the panelists agreed that conducting Price to Win early and holding meaningful conversations with the government are key to the success of any contract strategy.

Dorn said, “Work early and bring your customer into the dialogue.” Dorn uses the “$100 R&D test” when speaking to government customers: “Of these 10 items, which one would you spend 100 dollars on?” It’s a litmus test that helps businesses determine the value of a contract versus what is stated in the requirements.

Lopez closed the panel discussion by offering a key statistic for those in the TSIS audience. “Companies that spend 70% of their R&D budget on a proposal before it’s released have a better chance of winning,” he said.


Industry Panel 2: Navigating the Uncertainties of Best Value Services Opportunities to Differentiate Your Offering

In this panel, members noted that the best opportunities are the ones where the mission, requirements, and ratings are well defined.

Moderator Angela Alban, President and CEO of SIMETRI, joined panelists Brian Serra, Vice President Acquisition Initiatives and Engagement for Cole Engineering Services; Lee Amuso, Vice President Enterprise Growth at Mag Aerospace; Ashley Dominguez, Vice President, Valiant Integrated Services; and Rod Duke, Co-CEO at Qualis Corporation.

Dominguez identified “early question and answer sessions between all of the stakeholders and industry,” as the key. “Provide opportunities for them to expand on the solutions.”

Getting the opportunity right, value, and the unknown were the words the panelists used to describe uncertainty concerns in best value services opportunities.

These concerns can be mitigated with meaningful communication, which also allows a company to demonstrate differentiation.

“I love what we do,” said Duke. “I know there are other companies that can do a good job out here, so how do you sell that company and add the value to be a trusted partner? It’s up to us to show the government that we’re the best choice.”

Pricing is also important. “Start with your price to execute,” said Dominguez, “then identify how to get to the price to win and identify the costs and risks.”

At the end of the day, “make sure when you’re working with your government partners that you can compel them to tell you what they really want, because we have to give them what’s on paper,” concluded Amuso.

Industry Panel 3: Leveraging Artificial Intelligence and Machine Learning to Enhance Training Fidelity

In this panel, members discussed leveraging artificial intelligence (AI) and machine learning to enhance training capabilities.

Angela Alban, President and CEO of SIMETRI, moderated as panelists Tim Woodard, Senior Solutions Architect at NVIDIA; Todd Griffith, Chief Technology Officer of Discovery Machine; Evan Oster, Scientist and Learning Solutions Architect at Aptima; and Nelson Lerma, Ph.D., Senior Product Manager AI & Simulations at Unity Technologies, discussed the need to use AI and machine learning to support realism in training for the warfighter.

Woodard began the conversation by noting the breakneck pace of advances within the computing and AI industries, stating that “everything from healthcare, climate science, autonomous vehicles, recommender systems, and many, many, many different areas are having AI applied.” These advancements are not limited to the training and simulation community.

The first question put to the panel was, “What are the ways in which you feel AI can be utilized in training systems?”

Evan Oster, representing Aptima, noted that Aptima has been using AI to “better model each person who’s participating in training and using it [AI] to personalize their training. With better models of those users or the trainees, we can really look at tailoring each of their experiences and have a more important view of how they’re doing, what they’re doing, and looking at it from a more holistic standpoint.”

Griffith added that Discovery Machine is “interested in things like creating virtual structures for training, creating intelligent teammates that can take the role of roleplayers, or creating intelligent adversaries or patterns of life that enable you to make more realistic training solutions.” According to Griffith, “One of the things that we’ve recognized is that we’re not going to get there just by leveraging the current machine learning approaches, we actually have to draw upon many approaches that have existed for years and years – including the knowledge acquisition from subject matter experts.”

“I think that raises an interesting point,” responded Woodard, “we often have a tendency as a new method becomes predominantly known in the community to throw out the knowledge that we had before. We can’t just ignore all the other methods and approaches. We have a great tool in our toolbox with AI, but it’s not the only one.”

Woodard explained that new technologies, such as the metaverse, have become buzzwords while other technologies have been thrown by the wayside.

The MS&T community is more familiar with the concept of the Digital Twin, as Lerma pointed out. “As you create that theoretical Digital Twin or the Metaverse and we enable the cool technologies for virtual and augmented reality, we bring that continuum all the way to training worlds. We can connect the operational world to the training world and make training systems more realistic with a lot more fidelity – both from a visual perspective to an operational state.”

“I think those Digital Twins are [starting] to come to life with all the creators out there that are making this possible,” Lerma continued.

“I think the notion of Digital Twins is extremely important,” added Griffith. “I actually like to talk about it as neuro-symbolic twins – things that we can add to a Digital Twin that include the expertise of subject matter experts.”

Griffith noted that people shouldn’t only ask what the Digital Twin mechanisms are; they should also ask what the purposes of all the subcomponents within that device are, how to deal with the causality and the purpose of that device, and how to leverage virtual instruction to take information to the training.

“I think representing Digital Twins is extremely important,” Griffith concluded, “but we have to take it one step further and actually capture knowledge from subject matter experts who understand those devices and then relay that information back to the training.”

As the conversation moved towards training fidelity, Lerma commented on realism, from realistic human behavior to realistic human faces that do not exist in the real world. As modeling becomes more and more realistic, the pieces come together to create the Metaverse and the Digital Twin.

“As we continue to approximate it, most of the creators are going to be able to model things a lot more realistically to give a better training experience,” Lerma said.

The panel also discussed the assumption that you can “sprinkle” AI onto a problem and the problem will be resolved, which can become a barrier to adoption because applying AI indiscriminately to problems is likely to fail. Woodard asked, “How can we overcome that barrier? Eliminate it? And what other barriers are there to adopting these kinds of technologies?”

Griffith explained that when the military “sprinkles” AI onto problems, it is working under the assumption that its AI performs at the same level as the AI owned by big tech companies. Those companies have orders of magnitude more data at their disposal, but they also face different problems than the military community does. “What we found,” explained Griffith, “is that you can’t just take that learning and automatically transfer it into a new domain. There are a number of techniques, but all of them require quite a bit of effort. There are things called transformers that can be applied to large data to try to transform it into the new domain, but to solve those kinds of problems that they’re looking for – that’s not actually statistical machine learning.”
