Supporting transformational battlefield CONOPS | Military Training Technology Report 2024 | November | Shephard

Supporting transformational battlefield CONOPS: First Person View attack drone training  

December 2024 | Technology Report | Military Training

Freelance defence journalist and consultant

The vehicle suicide attack, whether by air, land or sea, is one of the most difficult to counter, as it requires either the complete disablement or the destruction of the vehicle before it gets close enough to its target to detonate its payload and inflict damage. An uncrewed vehicle is even more effective as it removes the human element, and therefore a degree of uncertainty, from the final act of destruction.

Above: A VRSG scene showing a 3D model of the FPV UAV simulator entity approaching its target. (All images: MVRsimulation)  

The use of uncrewed platforms as suicide weapons has increased dramatically during the war in Ukraine, notably at sea where the Ukrainians have scored a number of major successes against the Russian Black Sea Fleet, but even more so on land. Here Ukrainian forces are using first person view (FPV) drones carrying munitions as attack platforms to destroy individual targets, particularly armoured vehicles.

The particular advantage of these tactics is the extended range that one-way operations offer (as opposed to the drone needing to return to base) and the deep asymmetry of being able to destroy costly armoured assets with relatively cheap drones integrated with ordnance.

As an illustration of the growing importance of these systems, in early 2024 President Zelensky of Ukraine announced that "I have just signed a decree which will launch the creation of a separate branch within our armed forces - a drone systems force. This is not a question for the future. Rather, it must provide concrete results in the very near future... Drone systems have shown their effectiveness on land, in the skies and on the seas."

An FPV drone is equipped with a forward-facing camera that transmits a video feed to the operator, displayed either on a handheld screen or in a headset, mimicking the experience of a pilot in an aircraft cockpit. Originally emerging from the realm of civilian hobby drone racing, FPV drones have robust motors and frames built to withstand the demands of high-speed races. Users can learn to fly these drones with the help of simulators, which can also provide an experience similar to that of professional racing pilots who fly through spectacular and complex courses in arenas, stadiums and famous landscapes.

However, while these commercially developed simulators can offer training in the basic mechanics of flying an FPV drone, they often lack the authenticity of real-life in-flight dynamics; the game-style virtual environment in which training takes place is limited; and their closed-system design does not support networked training operations. In short, they do not reflect the requirements of the tactical operator.

 

Above: The full FPV UAV Simulator includes a laptop, handheld device, ATAK and 2D video display goggles. 

“A game-style drone simulator that’s simply a low-fidelity physics model with a forward-facing camera view you move around the battlefield is a simple system to produce, but insufficient fidelity will give users the wrong impression of how racing-style drones work on the battlefield,” Garth Smith, President, MVRsimulation said. “We knew from the beginning that we wanted an advanced physics-based flight model that matched real quadcopter dynamics, and that this was of equal importance to the requirement for adherence to open networking standards in order to support large, distributed simulation multi-participant exercises, and the ability to populate the geospecific virtual training environment with accurate models that reflect the reality of the battlespace.

“We identified these elements as the key to providing a highly realistic training solution for the operation of racing-style quadcopter attack drones carrying munitions on the contested battlefield.”

To fill this need, MVRsimulation partnered with Bihrle Applied Research, which developed a high-fidelity flight model that, paired with the capabilities of MVRsimulation’s Virtual Reality Scene Generator (VRSG), produced an FPV Unmanned Aerial Vehicle (UAV) Simulator replicating the tactile, visual and cognitive demands of operating agile UAVs in combat to engage enemy targets successfully in one-way attack scenarios.

Developed and brought to market in less than six months, the simulator highlights how innovative companies can leverage their agility to develop training capability in short timescales to support real world requirements.

“We are always driven by the needs of our operators, and we are extremely fortunate to have in Bihrle a partner that was able to deliver to a rapid development cycle and extend their advanced flight model to match the flight profile of a lightweight quadcopter so quickly,” Smith said. “Together we have made a high-fidelity simulator as a deliverable product that can evolve in capability based on customer needs.”


System enablers

The FPV UAV Simulator can be used as an ultra-low-footprint stand-alone training device for tactical operations, or networked with other simulators that operate on the VRSG infrastructure, enabling Large Scale Combat Operations (LSCO) training exercises.

The highly portable system (backpack carry-on or Pelican case) consists of a high-end gaming notebook running VRSG and a Republic of Gamers (ROG) handheld controller device providing 1080p resolution with configurable integrated pilot controls. VRSG simulates the front-facing camera view of the UAV, streaming the operator's FPV to the handheld device. The stream shows the UAV’s rotor blades and munition nose-cone so the user gets a realistic view of the scenario. The simulator is also supplied with 2D video display goggles, allowing the UAV operator the option to experience the FPV stream as a head-mounted (non-tracking) view.

Above: The handheld device shows the UAV's first-person view of VRSG's high-resolution virtual terrain, with configurable integrated pilot controls. 

VRSG is a Microsoft DirectX-based render engine that provides geospecific simulation as an image generator (IG) with game-quality graphics. It enables users to visualise geographically expansive and detailed virtual worlds at real-time, interactive frame rates on commercially available PCs, using Microsoft commercial standards. As a distributed interactive simulation (DIS)-based application, VRSG is fully interoperable with other DIS-compliant applications through DIS or the Common Image Generator Interface (CIGI); and as an executable-ready render engine, VRSG supports but does not require programming for use.
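To give a flavour of what DIS interoperability involves at the wire level, the sketch below packs the standard 12-byte header that begins every DIS PDU. This is a simplified illustration of the IEEE 1278.1 header layout using well-known constants (protocol version 7, Entity State PDU type 1), not VRSG code or a complete PDU implementation.

```python
import struct

def dis_pdu_header(pdu_type: int, timestamp: int, body_len: int,
                   exercise_id: int = 1, version: int = 7) -> bytes:
    """Pack the 12-byte DIS PDU header (big-endian, per IEEE 1278.1)."""
    HEADER_LEN = 12
    protocol_family = 1  # Entity Information/Interaction (Entity State family)
    return struct.pack(">BBBBIHH",
                       version, exercise_id, pdu_type, protocol_family,
                       timestamp,
                       HEADER_LEN + body_len,  # total PDU length in bytes
                       0)                      # trailing padding

# Header for an Entity State PDU (type 1) with an assumed 132-byte body:
header = dis_pdu_header(pdu_type=1, timestamp=0, body_len=132)
```

Every simulator on the network parses this header first to decide how to interpret the bytes that follow, which is what lets heterogeneous DIS-compliant applications share one exercise.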

FPV UAV training scenarios take place in VRSG's whole earth 3D geospecific terrain, supplemented with high-resolution training ground, airport or urban area insets. The terrain can be populated with real-time entities from VRSG’s 3D model library of more than 10,200 currently deployed military weapons and platforms. The majority of these model entities have articulated parts and accurate physics-based infrared/thermal signatures, and can be shown with different damage states and advanced animations, providing a view that reflects the reality of the battlefield. The model library is constantly updated both through MVRsimulation’s own initiatives and at customer request.

Users can also create and edit real-time 3D scenarios to play back in VRSG using MVRsimulation’s Scenario Editor (included with VRSG). Its game-level-editor-style interface allows users to add culture and moving models directly to 3D terrain to create dense 3D scenes and build pattern-of-life scenarios. Damage states for culture areas are also handled in Scenario Editor as effects.

Additionally, VRSG can mimic the reality of operating in EW-denied or degraded environments (facing inhibitors such as counter-UAS systems and network jamming) by degrading the video stream to varying degrees. This includes the option to realistically represent the reduction in signal strength as the range between the drone and the operator increases in the simulation.
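One way such range-based degradation can be modelled is to derive a link margin from free-space path loss and map it to a video-quality tier. The sketch below is purely illustrative and is not VRSG's implementation: the 5.8GHz frequency, transmit power, receiver sensitivity and tier thresholds are all assumed values chosen to be plausible for a hobby-class FPV link.

```python
import math

def link_margin_db(range_m: float, freq_hz: float = 5.8e9,
                   tx_power_dbm: float = 25.0,
                   rx_sensitivity_dbm: float = -90.0) -> float:
    """Remaining link margin after free-space path loss (FSPL).
    FSPL(dB) = 20*log10(d) + 20*log10(f) - 147.55, d in metres, f in Hz."""
    fspl = 20 * math.log10(range_m) + 20 * math.log10(freq_hz) - 147.55
    return tx_power_dbm - fspl - rx_sensitivity_dbm

def video_quality(range_m: float) -> str:
    """Map link margin to an illustrative degradation tier (thresholds assumed)."""
    margin = link_margin_db(range_m)
    if margin > 20:
        return "clean"
    if margin > 10:
        return "noisy"
    if margin > 0:
        return "breaking-up"
    return "lost"
```

A simulator could sample `video_quality` each frame and apply matching noise or dropout filters to the FPV stream, so the picture deteriorates naturally as the drone flies further from the operator.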

Above: The VRSG video feed on the handheld device can be down-rated to mimic operations in contested EW environments. 

“Essentially, our high-resolution terrain that closely replicates real-world locations, paired with highly accurate military platform models combine to deliver a sophisticated environment in which users can train to acquire, identify, prioritise and defeat the sort of ground targets they will face in real-world operations,” Smith said. “The beauty is in the simulator’s ability to deliver all this while simultaneously training users to navigate visually and pilot an agile drone at high speed – this latter is enabled by Bihrle’s advanced flight model.”


Bihrle advanced flight model 

Bihrle’s high-performance physics model is configured as a very lightweight quad-rotor racing drone UAV with front-facing FPV camera and attachable payload. It has been developed to replicate a high-performance UAV similar to those in active use on the modern battlefield.

Smith explained that when engaging Bihrle, “We requested a configuration for the drone model that reflects those commonly being deployed for this mission type: a very lightweight quadcopter with an RPG7 round attached. These drones are being produced in very large numbers and are likely changing often, so the important thing was to show a representation of the type of system in use.”

For Bihrle’s part, the work drew on its modular physics-based blade-element framework that has been used for full-scale rotorcraft training applications in Full Flight Simulators and Flight Training Devices.

“Our experience with configuring our rotorcraft flight model framework for a variety of aircraft sizes and types gave us confidence in our approach,” Nathan Graybeal, Director of Simulation Technology, Bihrle Applied Research, said. “We have previously developed and deployed models for heavy-lift tandem-rotor flight training devices, medium-lift coaxial rotor full flight simulators, down to small (8kg scale) quadrotor training systems.

“Every project has been an opportunity to refine our modular framework and carry forward lessons in applying the best approach to reach the desired outcome of the final integrated simulation.”   

The resulting high-fidelity model is hosted in Bihrle's DSix simulation environment. Flight model components include four independently modelled prop-rotors featuring ground effect, inflow and wake modelling, with accountability for atmospheric conditions, together with a 6-degree-of-freedom aerodynamic model of the airframe (chassis) and optionally attached payload. The model features comprehensive automatic flight control system emulation with rate-command, attitude-command and groundspeed-command control modes (with expansion support for satellite navigation modes); integrated avionics and IMU with support for drone startup procedures; an electronic speed controller (ESC) and individual motors for realistic rotor command response; and a Li-Poly battery model accounting for electrical load profile and accurate energy density.

Above: The FPV UAV Simulator is a low-footprint system consisting of a COTS handheld device and laptop running VRSG, with traditional viewing goggles.

The model’s detailed mass properties are based on geometric configuration and specifically modelled drone components, including an optionally attached configurable payload, while its environmental modelling capabilities are user-configurable for outside air temperature, steady and gusting wind speed and direction, and turbulence. Equations of motion with the WGS84 global ellipsoid model are configured for a 1080Hz integration step.

“The very light mass of a racing drone pushed us to operate at a much higher frame-rate than typically required for 6-DOF simulation of larger manned aircraft,” Graybeal explained. “Coupling that performance requirement with deployment to a small-form-factor (handheld) Windows PC is a real testament to the capability and flexibility of DSix as a flight model host.”

He added: “With a modular simulation framework we are able to provide an extensible solution for future enhancement of capabilities while providing a rigorously developed and tested flight model basis for an immersive experience today.”

 

To replicate a high-performance racing drone, similar to those in active use on the modern battlefield, the flight model specifications include: 

  • Airframe mass (without battery or payload installed): 0.61kg
  • Mass with default battery (1,500mAh Li-Poly): 0.85kg
  • Mass with default battery and payload (PG-7VL warhead): 3.05kg 
  • COTS 3-blade propeller with 6in diameter at 4.5in blade-pitch, capable of thrust-to-weight ratios higher than 15:1 (for bare airframe) 
  • ESC and motors providing 30,000rpm at maximum throttle 
  • User-configurable control sensitivity functions for each axis and flight control mode for accurate drone controller emulation 
  • Control response up to 360 degrees/second in rate-command mode 
  • User-configurable battery pack size (with resulting mass changes) for selectable mission duration 
  • User-configurable FPV camera angle to suit the specific mission profile 
  • Selectable payload (eg PG-7VL warhead) with collision and detonation detection due to contact with scenery, terrain, or entities
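The published figures above give a feel for why the light airframe demands such a high integration rate. The sketch below is a deliberately minimal single-axis (vertical-only) integration using the quoted mass, thrust-to-weight ratio and 1080Hz step; it omits drag and is nothing like Bihrle's blade-element model, serving only to illustrate fixed-timestep integration with these numbers.

```python
# Illustrative 1-DOF (vertical) sketch using the published figures:
# 0.85kg airframe+battery, >15:1 thrust-to-weight for the 0.61kg bare
# airframe, 1080Hz fixed integration step. No aerodynamic drag modelled.
G = 9.81
MASS_KG = 0.85                  # airframe + default 1,500mAh battery
MAX_THRUST_N = 15 * 0.61 * G    # 15:1 quoted against the bare airframe
DT = 1.0 / 1080.0               # integration step from the specification

def step(alt_m: float, vel_ms: float, throttle: float):
    """One semi-implicit Euler step of vertical motion under throttle [0..1]."""
    thrust = max(0.0, min(1.0, throttle)) * MAX_THRUST_N
    accel = thrust / MASS_KG - G
    vel_ms += accel * DT          # integrate acceleration first...
    alt_m += vel_ms * DT          # ...then position, for better stability
    return alt_m, vel_ms

# One simulated second of full throttle from rest (1,080 steps):
alt, vel = 0.0, 0.0
for _ in range(1080):
    alt, vel = step(alt, vel, 1.0)
```

Even this toy model shows accelerations near 10g at full throttle, which is why a timestep adequate for a heavy manned aircraft would visibly lag a racing drone's control response.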


Networked training 

In addition to stand-alone training, the FPV UAV Simulator can be networked with in-use air, ground and joint fires simulator systems in a shared virtual environment for LSCO training exercises.

In this case, all users in networked simulators and dispersed locations experience the same VRSG-provided terrain and 3D real-time model entities in the same scenario, allowing them to conduct collective training in the shared environment. The system can also integrate with semi-automated forces (SAF) software to see all platforms running on networked simulators as entities within the simulated battlefield.

Users on the network see the FPV UAV simulated entity on the network as a real-time 3D model representative of a quadcopter UAV equipped with an RPG7 munition. The UAV’s camera feed view can also be shared with any user on the network in real time, including to Android Team Awareness Kit (ATAK) devices, providing real-time rectified terrain with geolocated targets for increased situational awareness to all assets.

 

ATAK integration 

The ability to integrate with ATAK has been enabled by the addition of a secondary downward-facing camera, which captures simulated sensor imagery and streams it to an ATAK device.

This is enabled by the fact that VRSG streams high-definition H.264 video with embedded metadata in the key-length-value (KLV) data encoding standard, producing a video stream compliant with Motion Imagery Standards Board Standard (MISB ST) 0601.1/0601.9. An ATAK device receiving this simulated sensor feed cannot distinguish it from a real-world video feed; it simply rectifies the simulated feed as it would real imagery of terrain.
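The KLV encoding mentioned here can be sketched briefly. The snippet below builds two ST 0601-style local-set items (tag 13 sensor latitude, tag 14 sensor longitude, each mapped to a signed 32-bit integer over its degree range). It is a simplification for illustration only: the full standard wraps the payload in a 16-byte Universal Label key with BER lengths and a checksum, all omitted here, and the coordinates used are arbitrary example values.

```python
import struct

def klv_item(tag: int, value: bytes) -> bytes:
    """One local-set item: 1-byte tag, 1-byte length (short form), value."""
    assert tag < 128 and len(value) < 128  # short-form encoding only
    return bytes([tag, len(value)]) + value

def encode_latitude(deg: float) -> bytes:
    """ST 0601 maps sensor latitude to a signed 32-bit int over +/-90 deg."""
    return struct.pack(">i", round(deg / 90.0 * (2**31 - 1)))

def encode_longitude(deg: float) -> bytes:
    """Sensor longitude uses the same mapping over +/-180 deg."""
    return struct.pack(">i", round(deg / 180.0 * (2**31 - 1)))

# Tag 13 = sensor latitude, tag 14 = sensor longitude (per ST 0601):
payload = klv_item(13, encode_latitude(48.45)) + \
          klv_item(14, encode_longitude(35.04))
```

Because the metadata rides inside the H.264 transport stream itself, a receiver such as ATAK can geolocate every frame without any side channel.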

VRSG can determine the location of the UAV model in flight and display it on the map as an entity within the trainee’s ATAK device without the need for semi-automated forces software.
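ATAK typically ingests track positions as Cursor-on-Target (CoT) XML events, so an entity placed on the map as described above could be represented along the lines of the sketch below. This is an assumption about the mechanism rather than a description of VRSG's actual output, and the CoT type string, UID and error estimates are illustrative placeholders.

```python
import xml.etree.ElementTree as ET
from datetime import datetime, timedelta, timezone

def uav_cot_event(uid: str, lat: float, lon: float, hae_m: float) -> str:
    """Build a minimal Cursor-on-Target event placing a UAV track on the map."""
    now = datetime.now(timezone.utc)
    iso = lambda t: t.strftime("%Y-%m-%dT%H:%M:%S.%fZ")
    event = ET.Element("event", {
        "version": "2.0", "uid": uid,
        "type": "a-f-A-M-F-Q",   # assumed type string: friendly air, drone
        "time": iso(now), "start": iso(now),
        "stale": iso(now + timedelta(seconds=10)),  # expire stale tracks
        "how": "m-g",            # machine-generated, GPS-derived
    })
    ET.SubElement(event, "point", {
        "lat": f"{lat:.6f}", "lon": f"{lon:.6f}", "hae": f"{hae_m:.1f}",
        "ce": "10.0", "le": "10.0",  # circular/linear error estimates (m)
    })
    return ET.tostring(event, encoding="unicode")

xml = uav_cot_event("FPV-001", 48.45, 35.04, 120.0)
```

Pushing an event like this at a steady rate is enough for a map client to render and update a moving UAV icon, which is consistent with the article's point that no semi-automated forces software is needed.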

The result is a real-time feed that delivers situational awareness and intelligence, surveillance and reconnaissance (ISR) data to operators at the tactical level and to any other entity on the network.

“We did this to show our customers how the VRSG infrastructure can support evolving battlefield networking capabilities,” Smith said. “Once we had this FPV drone simulation environment it was simple to add the secondary downward-facing camera, allowing us to prototype theoretical use-cases for GPS data integration and show how you can easily add additional tactical battlefield situational awareness to a low-cost drone, and experiment with roadmaps toward new capabilities.”

 

Above: The first person camera view shows the munition nose and UAV rotor blades for heightened realism. 

 

Doctrine development 

Such a low-cost testbed platform is likely to find further uses. FPV drones, while already widely used in Ukraine, are still in their infancy as a standard battlefield weapon and are not yet part of most armies' regular inventories, although that will most likely change. The FPV UAV Simulator may offer a useful testbed for users to experiment with how to integrate this kind of small drone into their combat operations and to develop their tactics, techniques and procedures (TTPs).

This could include considering how Joint Terminal Attack Controllers (JTACs) would integrate these attack drones into their available assets when conducting close air support. VRSG is already in use on the USAF Joint Terminal Control Training and Rehearsal System (JTC TRS) programme of record, as well as on other fully accredited and in-use systems including MVRsimulation’s Deployable Joint Fires Trainer (DJFT).

“Essentially, with our FPV UAV Simulator, you could put a platoon of operators in a room and give them UAV assets and teach them how to coordinate attacks,” said Smith. “Things are moving so fast and integrating these systems is becoming more likely, so having a platform that behaves like the real thing is valuable for exploration and experimentation. In theory they could test it in different modular configurations and then say we want the UAV’s behaviour to change according to which modules are assembled on the battlefield.”

 

The FPV UAV Simulator is running demonstrations at I/ITSEC 2024 on booth #727.
