
The last TAK-tical mile

JUNE 2024 | Technology Report | Military Training

Freelance defence journalist and consultant

MVRsimulation, PAR Government, Battlespace Simulations, Samsung and Varjo have integrated their commercial technologies to produce a solution that demonstrates the art of the possible for both training and mission rehearsal: a simulation that can stimulate, and be visualised on, the real equipment used by a trainee immersed in a virtual environment.

Above: A VRSG real-time 3D MH-60 Black Hawk helicopter model with switch state animation showing personnel fast-roping onto the deck of a yacht in a virtual Tampa Bay. (Image: MVRsimulation)

There is a common military aphorism, ‘train hard, fight easy’, which neatly encapsulates the fact that the more time and effort you invest in training, the better prepared you are for the rigours and uncertainties of operations. The more realistic that training is, the more effective it will be, but delivering that high level of realism is a perennial challenge for those responsible.

Believable scenarios, accurate visual imagery and the opportunity to use the appropriate equipment all contribute to the creation of a convincing environment that supports realistic training.  

Every time training resources are unavailable or unusable, surrogate equipment fails to feel like the real thing, or visual effects fall short, realism is diminished and training value reduced. The challenge is to overcome these problems to achieve maximum realism and suspension of disbelief.


In the mix 

The advent of virtual reality has allowed immersion in different realistic virtual environments with easily changeable scenarios, enabling increased availability and access, and at greatly reduced cost. However, this is often achieved at the expense of other aspects of realism, notably the ability to see, handle and use real equipment while operating in the virtual world.  

Mixed reality has now changed that. Head-mounted displays (HMDs) with passthrough technology such as the Varjo XR series allow the user to handle either real or emulated equipment while remaining immersed in the virtual environment and responding to the scenario in the virtual world.  

The addition of simulated sensor and other data streams also enables real equipment to be stimulated to reflect activity within the virtual environment. 

Uniting these capabilities to create an effective solution requires manufacturers who each offer specific technologies and can then work together at pace to arrive at the desired outcome.  

A good example of this was demonstrated at May’s SOF Week 2024 event in Tampa, Florida, which showed how equipment and technology used to conduct everyday operations can be utilised while training in a virtual environment.

This joint internal R&D effort brought together the technologies and products of MVRsimulation, PAR Government, Battlespace Simulations, Samsung and Varjo.

The underlying concept was to create a virtual replication of a live demonstration conducted during SOFIC 2022 (SOF Week’s predecessor event), in which multiple assets, including helicopters and assault craft, conducted a hostage rescue from a luxury yacht, and to show how this could be used in a mixed-reality environment for training or mission rehearsal. In this case the trainee was an observer providing overwatch for the operation.


Building the city 

MVRsimulation’s Virtual Reality Scene Generator (VRSG) provided the virtual environment via a high-resolution replication of the Tampa Bay area, produced at short notice using MVRsimulation Terrain Tools, a plugin to Esri ArcGIS: construction of the 3D culture only began in March 2024. The terrain was built using the existing Tampa dataset in VRSG’s Continental US (CONUS) National Agriculture Imagery Program (NAIP) Southeast terrain drive.

This includes one-metre NAIP imagery and elevation data compiled from the ten-metre National Elevation Dataset (NED). Additionally, a 3D model of Tampa Bay was created using CyberCity 3D’s high-resolution model, textured using Esri CityEngine, and compiled into the terrain.  

More than 32,000 procedural buildings, generated from building footprints using CityEngine, were compiled into the area, and a number of geospecific models, including the Tampa Convention Center and the 345 Bayshore Boulevard condominium, were built and integrated to enhance the detail of the virtual urban environment.
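
The CityEngine workflow itself is grammar-driven and proprietary, but the core idea, extruding 2D footprints into simple 3D building blocks with real or estimated heights, is easy to illustrate. The Python sketch below is a stand-in using the shapely and trimesh libraries, not the pipeline used for the demonstration; the randomly sampled heights are an assumption in place of real attribute or LiDAR data.

```python
# Illustrative stand-in for footprint-driven procedural modelling (the
# actual demo pipeline used Esri CityEngine). Footprints become simple
# extruded blocks; heights are sampled where no attribute data exists.
import random

import trimesh
from shapely.geometry import Polygon

def extrude_footprints(footprints, seed=42):
    """footprints: list of (coords, height_m) tuples; height_m may be None."""
    rng = random.Random(seed)
    meshes = []
    for coords, height_m in footprints:
        poly = Polygon(coords)
        if not poly.is_valid or poly.area < 10.0:  # skip degenerate slivers
            continue
        h = height_m or rng.uniform(4.0, 30.0)     # assumed height range
        meshes.append(trimesh.creation.extrude_polygon(poly, height=h))
    return meshes

# A toy 20 m x 12 m footprint with a known 15 m height
blocks = extrude_footprints([([(0, 0), (20, 0), (20, 12), (0, 12)], 15.0)])
print(blocks[0].bounds)  # axis-aligned extent of the extruded block
```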

Integrated road vectors from OpenStreetMap were used to depict the highway infrastructure around downtown Tampa, and more than 4,000 tree models were distributed throughout the area for added realism. As the Tampa Bay shoreline was a key part of the simulated demonstration, bathymetric data at a three-metre resolution was incorporated from National Oceanic & Atmospheric Administration (NOAA) sources to create the underlying shore elevation. 

According to Garth Smith, President of MVRsimulation, the result was extraordinary in terms of realism for such a rapid terrain development cycle. Further enhancements, for example augmenting the buildings with textures from ground-level and low-altitude drone perspective images, would improve immersion significantly, and sub-inch aerial photography could be layered to provide additional detail.

Providing this greater detail and accuracy would be particularly useful if the integrated capability is to be used for mission rehearsal. 

Above: The Deployable Joint Fires Trainer used for the demo at SOF Week 2024, seen with a Samsung ATAK device pulling VRSG sensor feed from PAR Government’s Sit(x) TAK server, and the Varjo XR-4 headset. (Photo: MVRsimulation) 

A real-time 3D model representative of the luxury yacht that was the target of the operation was built for the demonstration and placed in the correct location. Other models involved in the simulation included MH-6 Little Bird and MH-60 Black Hawk helicopters (complete with switch state animation showing special operations forces (SOF) personnel fast-roping from the latter onto the deck of the yacht), SOF all-terrain vehicles, and Mk VI Special Operations Craft – Riverine (SOCR). 

All of these were in the existing VRSG library, which contains over 10,000 3D models of weapons, vehicles and equipment and is continually updated. It includes more than 90% of the models required by the US Combat Air Force Distributed Mission Operations (CAF DMO) list and 95% of those designated mandatory, with articulated parts, damage states and accurate physics-based infrared/thermal signatures. CAF DMO compliance enables distributed combat training exercises worldwide.

The scenario was generated by Battlespace Simulations’ Modern Air Combat Environment (MACE), a physics-based, many-on-many simulation and threat environment that can replicate a contested battlespace across different domains and in the electromagnetic spectrum.


Immersive impact 

The simulation was run on MVRsimulation’s Deployable Joint Fires Trainer (DJFT), originally designed for Joint Terminal Attack Controller (JTAC) training, with the trainee using the system’s observer station. The observer was immersed in the virtual Tampa Bay using a Varjo XR-4 Secure Edition mixed-reality HMD that provides a 360° view of the virtual environment for the wearer.

The HMD has a 115° horizontal field of view, ultra-low latency and dual 12-megapixel video pass-through at 90Hz with a depth of field of 30-80cm for mixed reality. It is also capable of providing wearer eye-tracking to support after-action review where needed. 

Above: VRSG virtual high-resolution terrain created for the demonstration includes the Tampa Convention Center, pictured with a 3D model of the 80ft yacht moored in the harbour. (Image: MVRsimulation)


The mixed-reality passthrough capability enabled the observer to use an emulated physical AN/PEQ-1 Special Operations Forces Laser Acquisition Marker (SOFLAM) and an infrared zoom laser illuminator/designator (IZLID), together with simulated communication devices and a GPS receiver, and to write in and read a logbook, all while continuing to engage with the virtual environment.

A significant advantage of the mixed-reality capability offered by the DJFT set-up is that interacting physically with actual and emulated equipment in the real world, while also being in the virtual simulation, provides firm points of reference for the user. This mitigates the motion sickness that can result from using a virtual reality-only HMD to provide the immersive experience. The low latency of the Varjo XR-4’s real-world passthrough reinforces this, as the wearer can physically orient themselves in the virtual environment.


Train as you fight 

For the SOF Week demonstration, the observer was able to view the tactical picture via an Android Team Awareness Kit (ATAK). ATAK is fast becoming the command and control (C2), situational awareness and battle management application of choice for both SOF and dismounted operators.  

In this case the ATAK application was hosted on a Samsung Galaxy S23 Tactical Edition end-user device, connected to PAR Government’s Sit(x) TAK server.

Sit(x) is a software-as-a-service solution hosted in the Federal Risk and Authorization Management Program (FedRAMP)-compliant Amazon GovCloud, delivering TAK Server capabilities. It supports federation with government-off-the-shelf (GOTS) TAK Servers and enables rapid sign-up and deployment. 
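
Under the hood, TAK clients and servers exchange Cursor-on-Target (CoT) events: small XML documents carrying a position, a type code and validity timestamps. As a rough illustration of the mechanics, not PAR Government’s implementation, the minimal Python sketch below builds a CoT event for a point in downtown Tampa and pushes it to a hypothetical TAK server over plain TCP.

```python
# Rough sketch of publishing a Cursor-on-Target (CoT) event, the XML
# message format TAK clients and servers exchange, over plain TCP.
# The hostname is a placeholder; a production Sit(x)/TAK Server
# connection would use TLS with client certificates.
import socket
from datetime import datetime, timedelta, timezone

def cot_event(uid, lat, lon, cot_type="a-f-G-U-C", stale_s=60):
    """Build a minimal CoT event for a point at (lat, lon)."""
    now = datetime.now(timezone.utc)
    iso = lambda t: t.strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    return (
        f'<event version="2.0" uid="{uid}" type="{cot_type}" how="m-g" '
        f'time="{iso(now)}" start="{iso(now)}" '
        f'stale="{iso(now + timedelta(seconds=stale_s))}">'
        f'<point lat="{lat}" lon="{lon}" hae="0.0" ce="10.0" le="10.0"/>'
        f'<detail><contact callsign="{uid}"/></detail>'
        f'</event>'
    )

# Hypothetical endpoint; 8087 is a commonly used TCP CoT input port.
with socket.create_connection(("takserver.example.com", 8087)) as sock:
    sock.sendall(cot_event("OVERWATCH-1", 27.9506, -82.4572).encode())
```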

As well as geospatial information, tactical C2, and MACE-generated 9-Line targeting communications and Link 16 data, TAK provides visualisation of full-motion video (FMV) feeds, enabling them to be overlaid on the geographical and tactical display just as they are in real-life missions.

During the demonstration, VRSG provided a simulated H.264 video data link sensor feed from the DJFT simulator’s instructor operator station, configured as an MQ-9 remotely piloted aircraft system (RPAS). VRSG-generated video feeds contain Motion Imagery Standards Board (MISB)-compliant KLV (Key-Length-Value) metadata, from which the range and co-ordinates of a designated target can be derived.

'This is a crucial aspect of achieving realistic simulated video data streams,' Smith said. 'KLV metadata enables the video imagery to be processed and geolocated and is fundamental to operational FMV usage. If the simulated video does not contain the metadata it cannot be processed in the same way, reducing realism and therefore detracting from the training value.'
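
To make this concrete, the sketch below decodes a few fields of a MISB ST 0601 local set, the KLV structure in question, in Python. It is a simplified illustration rather than VRSG’s or ATAK’s implementation: it handles only short-form lengths and four of the standard’s many tags, and assumes the KLV payload has already been demuxed from the video transport stream.

```python
# Simplified sketch of decoding a MISB ST 0601 KLV local set, the
# metadata structure embedded in the simulated VRSG video feed. Real
# parsers first demux the KLV from the MPEG-TS stream and handle
# long-form BER lengths and checksums; both are omitted here.
import struct

# 16-byte universal key identifying the ST 0601 UAS Datalink Local Set
UAS_LS_KEY = bytes.fromhex("060e2b34020b01010e01030101000000")

def _scale(raw, bits, lo, hi):
    """Map a signed integer of `bits` width onto the range [lo, hi]."""
    return raw * (hi - lo) / (2 ** bits - 2)

def parse_0601(payload):
    """Parse tag/length/value triplets from a 0601 local-set payload."""
    out, i = {}, 0
    while i < len(payload):
        tag, ln = payload[i], payload[i + 1]  # short-form lengths only
        val = payload[i + 2 : i + 2 + ln]
        if tag == 13:    # sensor latitude, int32 -> [-90, 90] degrees
            out["sensor_lat"] = _scale(struct.unpack(">i", val)[0], 32, -90, 90)
        elif tag == 14:  # sensor longitude, int32 -> [-180, 180] degrees
            out["sensor_lon"] = _scale(struct.unpack(">i", val)[0], 32, -180, 180)
        elif tag == 23:  # frame centre latitude (sensor point of interest)
            out["spi_lat"] = _scale(struct.unpack(">i", val)[0], 32, -90, 90)
        elif tag == 24:  # frame centre longitude
            out["spi_lon"] = _scale(struct.unpack(">i", val)[0], 32, -180, 180)
        i += 2 + ln
    return out

def parse_packet(packet):
    """Strip the universal key and short-form length, then parse."""
    assert packet[:16] == UAS_LS_KEY, "not an ST 0601 local set"
    return parse_0601(packet[17 : 17 + packet[16]])

# Round-trip check with a Tampa-area frame centre (tags 23 and 24)
raw = struct.pack(">i", int(27.9506 / 90 * (2 ** 31 - 1)))
raw += bytes([24, 4]) + struct.pack(">i", int(-82.4572 / 180 * (2 ** 31 - 1)))
print(parse_0601(bytes([23, 4]) + raw))  # ~27.9506, ~-82.4572
```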

Above: A Samsung S23 device displaying Battlespace Simulations’ MACE integrated with the trainee observer's scope sight as viewed within the Varjo XR-4 headset, pushed from the Sit(x) TAK Server. (Photo: MVRsimulation) 

In ATAK, the VRSG video stream is overlaid on the tactical map to ensure all parties in a mission are correlated on the correct target. Using the KLV metadata embedded in that stream, ATAK can display the aircraft's position marker as well as its sensor point of interest.  
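
The mechanics behind this can be sketched by combining the two snippets above: the fields decoded from the KLV stream become two CoT events, one for the aircraft’s position marker and one for its sensor point of interest (SPI). This reuses the hypothetical cot_event() and parse_0601() helpers from the earlier sketches; the uids and platform type string are illustrative assumptions, though 'b-m-p-s-p-i' is the CoT type conventionally used for SPI markers.

```python
# Tying the sketches together: convert fields decoded by parse_0601()
# into the two markers a TAK client plots for an FMV feed, reusing the
# cot_event() helper from the Sit(x) sketch. The uids and the platform
# type string are illustrative assumptions; 'b-m-p-s-p-i' is the CoT
# type conventionally used for a sensor point of interest marker.
def klv_to_markers(telemetry):
    platform = cot_event("MQ9-SIM-01", telemetry["sensor_lat"],
                         telemetry["sensor_lon"], cot_type="a-f-A-M-F-Q")
    spi = cot_event("MQ9-SIM-01.SPI", telemetry["spi_lat"],
                    telemetry["spi_lon"], cot_type="b-m-p-s-p-i")
    return platform, spi
```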

For the demonstration, the VRSG video downlink was delivered to the Samsung S23 ATAK device from the Sit(x) server over local wi-fi. However, because delivery can be achieved via Ethernet, wi-fi, cellular or mesh networks, in general usage the feed could be taken from a VRSG-simulated sensor operating anywhere, in the same way feeds are pulled from air assets operated from remote locations during real-world events.

The virtual VRSG sensor feed can therefore be viewed through the Sit(x) server on real, operational end-user devices, such as the S23 Tactical Edition.


Partnership potential

'The real motivation behind this partnership is to show a proof-of-concept demonstration that brings virtual and physical training right to the tactical edge,' Jeff 'Hollywood' Puckett, Director of Readiness at PAR Government, said. 'That isn’t something you can do well in the purely virtual space; you need to bring the tactile hardware in so that users really can train as they fight.

'That is why the mixed-reality space excels here, because there is no "learn how to train" phase on "dummy" equipment. Instead, users can be out in the field, training with their actual kit, using simulated sensor feeds and C2 data that is so realistic the end-user device – and hence the trainee – cannot tell the difference between that and the real thing.'

For PAR Government, getting the ATAK component onto the market is the catalyst for delivering that last mile of realism for the tactical training space.

'Personnel have their hardware, they just need the PAR Sit(x) server to pull the real and virtual sensor feeds via VRSG and the C2 data via MACE, and you can train all the way to the tactical edge,' Chad Wiley, Business Development Director of Innovative Commercial Solutions at the company, said.

'While we showed this at SOF Week on the DJFT, in reality all you would need is one laptop running MACE and VRSG, pushing sensor feeds from VRSG into TAK, and the end-user devices pulling the feeds via Sit(x) – that truly is enabling that "train as you fight" objective.'

For MVRsimulation’s part, the possibilities enabled by the proof-of-concept demonstration are exciting.

'Using the SOF Week event as a forum for this software integration effort not only allowed us to show how VRSG and the DJFT can support the training requirements of specialist users like the SOF community. It also gave us the opportunity to investigate additional ATAK integration by all parties, and begin to lay a roadmap for a potential "out of the box" ATAK device,' Smith noted.

'This would offer an extremely compelling technical capability for users of our DJFT or other on-the-edge training systems, and move us closer to our ultimate goal of increasing the realism, suspension of disbelief and resulting effect and value of the mixed-reality training tool.'
