Collaboration, Open-Source Tools the Key to Creating a Flight Simulator During Antoinette Project
In May of this year, Epic Games partnered with Varjo, Meta Immersive Synthetics (MIS), and Brunner to showcase the Antoinette Project demo at the World Aviation Training Summit (WATS) in Orlando, Florida. The experiment demonstrated how Unreal Engine and partner technologies could produce a low-cost, highly deployable, extremely immersive, military-grade flight simulator.
Interviews with the major participants reveal the motivation, inner workings, and results of the Antoinette Project, and a pathway for Unreal Engine creators and developers who want to build simulators powered by the popular game engine.
We interviewed Seb Lozé, Unreal Engine Business Director for the Simulation Division; Niclas Colliander, Managing Director of Meta Immersive Synthetics (MIS); and John Burwell, Global Head of Simulation and Training at Varjo, about their participation in the project and its results.
Epic Games designed the Antoinette Project to fulfill several needs within the simulation industry. Tell us why the Antoinette Project was created and what challenges were solved.
Seb: Many industry participants and new entrants to the simulation community wanted guidance on where and how to start building a flight simulator with Unreal Engine. We knew we could meet that need by providing a comprehensive set of resources to support the creation of the next generation of flight simulators. With that, Antoinette Project was born.
The project’s name pays homage to La Société Antoinette, the French simulation pioneers. In 1906, they created the Antoinette Barrel, the first known device to demonstrate to pilots what they would sense when flying an airplane. We wanted to pay tribute to these pioneers in simulation.
With Unreal Engine as the creative platform, what else did you need to create the Antoinette Project flight simulator demo?
Seb: Rather than reinvent the wheel, we decided to work with some key players in the field to build a portable demo and create a reference to inspire our community of developers. In addition to the Unreal Engine platform, we needed companies that offered the physical platform, the visual platform, and integration of all of the devices and software. We chose Brunner, Varjo, and Meta Immersive Synthetics to support the Antoinette Project. These partner companies, while competitors in some respects, all have a goal of developing high-fidelity, cost-effective simulation training. I think we all see virtual and mixed reality as the future of simulation training in a wide range of areas, so providing a pathway for others to develop training like this is a great start in moving this forward.
The demo included NOR, a software framework from MIS, which enables developers to build training scenarios and serious flight simulation applications by leveraging all the expertise of Meta Aerospace. You can read more about this on the Unreal Engine blog.
It also included Brunner’s highly deployable 6DOF motion platform, which uses advanced motion-cueing algorithms and high-fidelity control loading units. The kinaesthetic cues given by the platform let the pilot feel the aircraft they’re sitting in. The feedback not only immerses the pilot more deeply, but also trains muscle memory and a feel for the aircraft’s movements and forces. Brunner worked on their Unreal Engine integration, leveraging the open-source motion-cueing interface developed by Epic Games. The plugin enables companies to develop their own integration for any motion-cueing solution they need.
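To give a flavor of what a motion-cueing algorithm does, the sketch below shows a classical washout idea in miniature: a high-pass filter passes transient acceleration cues through to the platform while "washing out" sustained acceleration, so the actuators can drift back toward neutral within their limited travel. All names and constants here are illustrative, not Brunner's or Epic's actual interface.

```python
import math

def make_washout(cutoff_hz: float, dt: float):
    """Return a stateful first-order high-pass filter mapping vehicle
    acceleration (m/s^2) to a platform motion cue."""
    rc = 1.0 / (2.0 * math.pi * cutoff_hz)
    alpha = rc / (rc + dt)
    prev_in = prev_out = 0.0

    def cue(accel: float) -> float:
        nonlocal prev_in, prev_out
        out = alpha * (prev_out + accel - prev_in)
        prev_in, prev_out = accel, out
        return out

    return cue

washout = make_washout(cutoff_hz=0.5, dt=0.01)
# A sustained 1 g push produces a strong initial cue that decays away,
# letting the actuators return toward their neutral position.
cues = [washout(9.81) for _ in range(500)]
```

Real washout filters work in six degrees of freedom and add tilt coordination to fake sustained acceleration with gravity, but the principle of filtering out what the platform cannot sustain is the same.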
Finally, the demo also featured the Varjo VR3 HMD, which produces high-resolution, out-the-window scenes in a very portable way. You can read more about it on the Varjo website.
Tell us more about the future of VR and MR for simulation training. What are the benefits?
John: The advent of these low-footprint simulators has a number of critical benefits. Most importantly, you can provide training at the point of need. Many of these devices can be shipped to pilot candidates, allowing them to train at home. This eliminates the need for travel and lets you train many tasks that are traditionally done in the big simulators.
Varjo received an Epic Games MegaGrant prior to this project. Did that play a role in how Varjo was able to participate in the Antoinette Project?
John: Epic Games awarded Varjo with a MegaGrant to further develop our mixed reality support for Unreal Engine. With the grant, Varjo moved to OpenXR, creating the Varjo XR-1 Developer Edition, which offers human-eye resolution visual fidelity, integrated depth-sensing, and low-latency video pass-through mixed reality. With OpenXR as the target interface, developers now have access to the industry’s most advanced enterprise-grade mixed reality features to support composing real and virtual environments for a wide variety of applications.
The support from Epic Games let us expand our delivery of mixed reality solutions for the most demanding enterprise VR/XR applications through Unreal Engine. Specifically on the Antoinette Project, Varjo’s OpenXR features include full support of photorealistic visual fidelity, eye tracking, and real-time chroma keying.
What did you see as the most significant aspect of the pilot simulation you created through the Antoinette Project?
John: What we’re seeing with the Antoinette Project is top-quality graphics and exceptional amounts of realism. That’s really important for pilots when you’re trying to create the suspension of disbelief that you need with these devices.
Virtual reality training fully immerses the pilot in a computer-generated environment, and it has been found suitable for training basic tasks such as cockpit familiarization, checklist training, and basic flight skills. Higher-fidelity mixed reality (XR) training, where the pilot can see and interact with physical hardware in their vicinity, has been found more suitable for training tasks that require development of muscle memory with complex switchology and systems. In virtual and mixed reality training, high visual fidelity is critical for pilots to see cockpit displays and objectives clearly.
Mixed reality or XR training can be achieved via advanced head-mounted displays with digital pass-through cameras that seamlessly blend video of the outside world with computer-generated content. Mixed reality simulation enables tactile feedback, so pilots can see their hands and feel the stick, throttle, buttons, and switches that populate the crew station while flying in a fully virtual environment. Unlike traditional displays, XR training setups can cut the cost of training devices by half to a full order of magnitude with little loss of fidelity. They also dramatically reduce the physical size of the resulting devices, saving space and reducing electrical and cooling needs.
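The blending of real and virtual imagery described above often relies on chroma keying, which Varjo's OpenXR features support. The toy sketch below shows the per-pixel decision: camera pixels close to a key color (for example, a green cockpit surround) are replaced with rendered content, while everything else, such as physical switches and the pilot's hands, stays visible from the camera feed. This is only an illustrative model, not Varjo's actual compositor.

```python
def chroma_key(camera_px, virtual_px, key=(0, 255, 0), threshold=120.0):
    """Return the rendered pixel where the camera pixel matches the key
    colour; otherwise keep the real camera image (e.g. cockpit hardware)."""
    dist = sum((c - k) ** 2 for c, k in zip(camera_px, key)) ** 0.5
    return virtual_px if dist < threshold else camera_px

# A green-screen pixel is replaced by the virtual out-the-window scene:
chroma_key((10, 240, 20), (128, 64, 200))   # -> (128, 64, 200)
# A grey cockpit switch stays visible from the camera feed:
chroma_key((90, 90, 90), (128, 64, 200))    # -> (90, 90, 90)
```

Production systems run this decision on the GPU per pixel per frame, with soft edges and depth-based compositing rather than a hard color threshold.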
The lower device and operational costs brought by VR and XR mean greater availability of training tools, which lets trainees get more reps and sets, encourages them to repeat tasks until they achieve mastery, and delivers all-around better scalability. Thanks to the portability and reduced size of immersive solutions, trainees can use VR and XR to complement traditional simulator training in scenarios where travel is not possible. When training organizations successfully implement immersive solutions, they can train more pilots at a faster pace, with more flexibility in the training, and at a lower cost than ever before.
MIS played a major role in the Antoinette Project as the software integrator. What can you tell us about MIS’ contribution?
Niclas: MIS provided our software, the NOR platform, which is an Unreal Engine-based simulation platform that we’ve been building for the last couple of years. And specifically for the Antoinette Project, we also used the Eurofighter Typhoon model.
NOR is a simulation engine based on Unreal Engine, but we build all the frameworks for simulating aircraft, ground vehicles, sensors, weaponry, things like that. We’re also developing our own planet-scale environment in which you can simulate things.
We started developing NOR in partnership with Epic. It offers high-fidelity flight simulation while also providing a rich world down at eye level, complete with dust, weather, sound, and other factors that impact the mission.
There are always challenges when integrating hardware and software, but there were no major ones on this project. MIS already had integration from working with the Varjo headsets, and the only new integration we had to do was with the Nova platform from Brunner. We hadn’t used that one before, but it was a fairly painless process, because we had experimented with motion platforms previously.
What kind of feedback did you receive when you unveiled the Antoinette Project at WATS 2022?
Niclas: The feedback on the Antoinette Project during the WATS show was very good, both from a graphical perspective and also from a usability perspective. People from all different kinds of branches have been complimentary, and that is very important. We knew a couple of retired pilots who did the demo, and they were very impressed. So, you know, when you can get pilots in there, and they give you the thumbs up, you’re doing something right! And the thing is, I’m a fighter pilot myself, so I’m pretty confident in what we do. But it’s always good to see that others like it too.
We are technically very hardware-agnostic. When it comes to HMDs, we just want to use the best there is. The Varjo headset is the highest-resolution headset out there right now. In air tactics, being able to read dials, gauges, and tactical screens in the aircraft is very important, and the dynamic foveated rendering in the Varjo headset makes that high resolution possible.
With dynamic foveated rendering, you track the eyes and only render at the higher resolution where you’re actually looking at any given moment. That makes it easier for the PC to keep up with the high resolution and enables high performance at the same time. This also helps keep people from getting sick: you need a high frame rate to avoid that, and if you’re pushing a lot of pixels at a higher resolution with a lot of graphical effects, that gets hard. Foveated rendering delivers higher performance at that higher pixel count, which is why the combination of high resolution and dynamic foveated rendering is key.
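The performance win Niclas describes comes from shading far fewer pixels. As a rough back-of-the-envelope model, suppose a small gaze-centred circle is rendered at full resolution and the rest of the view at a reduced scale; the panel size, fovea radius, and peripheral scale below are illustrative numbers, not Varjo's real parameters.

```python
import math

def foveated_pixel_cost(width: int, height: int,
                        fovea_radius_px: float,
                        peripheral_scale: float = 0.25):
    """Approximate shaded-pixel count with a full-resolution fovea and a
    periphery rendered at a reduced scale, vs. naive full resolution."""
    full = width * height
    fovea = min(math.pi * fovea_radius_px ** 2, full)
    periphery = (full - fovea) * peripheral_scale ** 2
    return fovea + periphery, full

# Illustrative per-eye panel and fovea size:
foveated, naive = foveated_pixel_cost(2880, 2720, fovea_radius_px=400)
# With these numbers the foveated budget is roughly an eighth of the
# naive full-resolution cost, which is the headroom that keeps frame
# rates high at high resolution.
```

Real implementations move the fovea with the measured gaze every frame and blend the resolution boundary smoothly, but the pixel-budget arithmetic is the core of the benefit.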
In general, NOR and Varjo are a pretty nice combination, because NOR, leveraging UE, has a very high graphical fidelity, and if you have a headset that doesn’t have high resolution, you kind of lose a lot of that to a blurry image in your headset. So that combination becomes not only a nice demo, but a pretty nice use case. We prefer using the Varjo headset because that means that people actually see how good things look in NOR.
In my opinion, in most use cases, VR or MR is going to take over from classical systems such as domes or screens. It’s already happening to some extent, but I think the adoption has been slower than people expected, both in the military space and obviously in recreational spaces. But for simulation training, VR and MR are going to take over. And I think we’re starting to see even the bigger OEMs move in the direction of VR.
The Antoinette Project was just a couple of companies coming together, putting their stuff together and then showing something really, really cool and fun. I think that’s probably the biggest message—that small players can come together to create a very effective simulation training tool.
Seb, you mentioned a comprehensive set of resources that Epic developed to help other creators and developers follow in the Antoinette Project’s footsteps. Can you talk about those resources?
Seb: The DIY tutorial illustrates how simple and fast it is to create a basic flight simulator. The tutorial provides instructions on how to connect input control devices for the pilot interface, such as keyboard and mouse, gamepad, joystick, or flight-specific control device. It also shows how to integrate an aircraft model from the Unreal Engine Marketplace, add accurate flight dynamics using the open-source JSBSim plugin for Unreal Engine, and simulate flying over real-world data using Cesium for Unreal or the Esri ArcGIS Maps SDK for Unreal Engine.
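To make the "accurate flight dynamics" step concrete, here is a deliberately tiny point-mass sketch of the kind of per-tick update a flight-dynamics model performs: turn forces into accelerations, then integrate the aircraft state. Real JSBSim models full six-degree-of-freedom rigid-body dynamics from detailed aero data; every name and number below is illustrative, not taken from the tutorial.

```python
from dataclasses import dataclass

G = 9.81  # gravitational acceleration, m/s^2

@dataclass
class Aircraft:
    mass: float      # kg
    velocity: float  # along-track speed, m/s
    vspeed: float    # vertical speed, m/s
    altitude: float  # m

def step(ac: Aircraft, thrust: float, drag: float, lift: float, dt: float) -> None:
    """Advance the aircraft state by one tick of dt seconds."""
    ac.velocity += (thrust - drag) / ac.mass * dt      # along-track acceleration
    ac.vspeed += (lift - ac.mass * G) / ac.mass * dt   # vertical acceleration
    ac.altitude += ac.vspeed * dt

plane = Aircraft(mass=1000.0, velocity=60.0, vspeed=0.0, altitude=500.0)
for _ in range(100):  # one second at a 100 Hz tick rate
    step(plane, thrust=3000.0, drag=2000.0, lift=1000.0 * G, dt=0.01)
# Thrust exceeds drag while lift balances weight, so the aircraft
# accelerates along track and holds altitude.
```

In an Unreal Engine project, a loop like this runs inside the plugin every frame, and the resulting position and attitude drive the aircraft actor over the Cesium or ArcGIS terrain.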
The Trends and Best Practices for Flight Simulation information paper provides guidelines for using Unreal Engine in this context. The paper also details Varjo’s support for the Antoinette Project, along with observations from Varjo’s Global Head of Simulation and Training, John Burwell. It also includes insights from users who are already using the engine to build solutions for their own needs.
The JSBSim plugin for Unreal Engine brings JSBSim, an open-source flight dynamics application, into the engine for flight simulation.