Experts representing the Department of Defense (DoD) and industry met on Nov. 15 to discuss different perspectives during a “mixology” panel at the Central Florida Tech Grove entitled, “The Metaverse and its Human Implications.”

Dr. Chris Libutti, Central Florida Tech Grove manager, hosted the event. Panel members included: Richard Boyd, co-founder and chairman of Ultisim; Ricardo Escobar, chief engineer for Common Synthetic Environment at U.S. Army Program Executive Office Simulation, Training and Instrumentation (PEO STRI); and Tyson Griffin, director of research and engineering for Naval Air Warfare Center Training Systems Division (NAWCTSD).

This panel was a follow-up discussion to Tech Grove’s first mixology event on Aug. 30, “What is the Metaverse?” In that first discussion, it was generally agreed that the “metaverse” (in its broadest definition) is considered the next step in the evolution of the internet, and in common use, the term typically refers to a collection of virtual, 3D worlds focused on social interaction.

However, the implications of what the metaverse could become, and how it could evolve to affect several common aspects of daily life, such as business and commerce, or education and training, are currently up for debate.

The discussion began with what is currently keeping the metaverse from becoming the fully functional, all-encompassing environment imagined in science fiction.

“It’s got every medium we’ve ever used before in human communication: text, graphics, moving images, sound, and other people to collaborate with,” Boyd said.

Boyd went on to describe the metaverse’s value in improving four areas: entertainment, design, simulation learning, and interface. He described the interface as the most challenging of these because there is currently no universal interface for engaging users.

Escobar said that PEO STRI was focused on transforming training for soldiers, specifically citing the Training Simulation Software/Training Management Tool as part of the underlying infrastructure that is crucial in building the Army’s simulated environment.

“One of the challenges of the metaverse becoming ubiquitous is the lack of a singular foundational infrastructure that companies can latch on to, and is also open to other companies,” Escobar said. “Having an open infrastructure might not be in certain companies’ best interests [due to competing commercial priorities].”

Griffin emphasized the human-centered aspect of the metaverse, commenting that everyone perceives physical reality in their own way, and that each person carries their own subjective understanding of everyone else’s expectations of the physical world.

“Different organizations have different purposes for how they want to use synthetic training,” Griffin said. “Every commander and service member has an individual expectation of what that digital environment needs to provide from a training perspective, and I don’t think we ever get to a common picture of how we deliver that to our service members.”

The conversation later shifted to human interactions with artificial intelligence (AI) and the increasingly complex AI entities that are able to pass as human. In the next seven to 10 years, Boyd expects those AIs to be practically indistinguishable from the real thing.

“AI machine learning is advancing extraordinarily quickly,” Boyd said. “In complex, deep conversations with AI entities, eventually people figure out that it’s not a real person, but that’s taking longer and longer, and in shorter interactions it’s almost impossible to tell what is a ‘bot’ and what isn’t. We’re making increasingly convincing ‘people,’ for good or for ill, and like any technology, it cuts both ways.”

When the topic of social behavior in the metaverse came up, the participants agreed that users should face real-world consequences for inappropriate conduct in tomorrow’s 3D virtual world. However, they had some differences of opinion on which entity should be responsible for monitoring and enforcing standards.

“If there are going to be real ramifications to what transpires in a synthetic environment, government has to take a significant role on that,” Escobar said. “They’re the ones with the hammer, but we’re far away from that today.”

“Society has to do it – you can’t legislate reality,” Griffin said. “The internet has amplified a lot of good aspects of good human behavior, but overamplified a lot of negative aspects of human behavior. You’d have to legislate government-level rules, regulations and laws for those requirements, but the decentralized aspects of the metaverse will require society to legislate itself through changes in cultural norms that will take place.”

“It comes down to self-governance and design,” Boyd said. “The anonymization on a new frontier makes it different, but those problems have been solved. In World of Warcraft [and other online communities] they’ve got self-governance in place, and it works because you can be put in ‘game jail’ for a long time or banned from servers based on behavior if people report you. The great thing about these game worlds is that everything is being recorded all the time… you can go back and ‘look at the tape’ to pronounce judgment, or have AIs pronounce judgment.”

The panel later acknowledged the physical limitations of the metaverse in applications of military training, as the technology currently stands.

Boyd related an experience developing virtual training with Marines. Certain aspects of the training, such as visuals through headsets and the use of realistic battlefield smells like that of dead bodies, improved the simulation’s realism, according to a percentage of the Marines in the testing group. However, the Marines also promptly broke several pieces of equipment they were required to wear while executing routine combat maneuvers, demonstrating the obvious restrictions of a simulated environment that kept them from being able to “train like you’ll fight.”

“[You have to determine] which technology is appropriate for the training: Is it augmentation, or is it a replacement?” Griffin said. “You look for cognitive-based training in the virtual environment, and then you switch over to do physical training. It’s a matter of finding that hybridization between the physical world and the virtual world… that’s where the metaverse holds some promise in melding the two.”

Before closing the formal discussion, the panelists’ final thoughts revolved around ethical concerns that AI and the metaverse might present.

“The ethical concerns are legion, but privacy left us a long time ago – there’s no such thing anymore, whether through games, on social media, or if you’re just using email,” Boyd said before asking the audience if anyone had location services on their cell phones. “All of you should have your hands up. Even if you think you turned it off, did you turn it off on every single app? If not, then I can get that data for about a dollar.”

Boyd went on to say that the most accurate data telling a person’s truest narrative is their purchase history. Such information is especially concerning to the banking industry, according to Boyd, due to the fines that financial institutions must pay for breaches of customer confidentiality.

The next mixology discussion is scheduled for Feb. 21, 2023. For more information on Tech Grove events, go to
