Thinking machines 

November 2023 | Technology report | UAS

Freelance defence journalist 

With applications in areas such as image recognition, navigation and decision-making, artificial intelligence is set to become a key enabler for battlefield UAS, both creating new mission sets and enhancing performance on existing ones.

Above: AI-enabled autonomy could allow UAS to operate in larger swarms, reducing the number of humans needed to supervise them. (Photo: US Army)

Artificial intelligence (AI) is set to have major significance for UAS operations, accelerating progress towards autonomy across a range of mission sets. Industry is pursuing AI-driven advances in sensors and other systems, with ambitious targets for the years ahead.  

There is a range of potential use cases for AI in UAS, perhaps most notably in areas such as navigation and sensing. Shaan Shaikh, associate director of the Center for Strategic and International Studies (CSIS), noted that the past decade has seen parallel advances both in UAS themselves and in AI and its sister discipline, machine learning (ML).

‘Over the past decade, we’ve seen UAS get smaller, fly faster and farther, hold heavier payloads and generally become more effective for both military and commercial purposes. In the same period, we’ve seen a lot of productive research in AI/ML, with significant outputs that are driving further investment,’ he said.  

‘I can’t predict specific areas of advancement, but given these trendlines, I believe we’ll see large, AI-enabled UAS fleets very soon.’ 

 

Sensory perception 

Shaikh said AI may or may not expand established mission sets for drones, but it will certainly empower UAS to prosecute their current missions more effectively.

For example, AI-powered image recognition technology will allow UAS to identify and track targets faster and more accurately, he said.

‘The same technology may allow UAS to analyse terrain and navigate to areas of interest without using GPS, making it harder for defenders to find them via passive sensors,’ Shaikh added. ‘AI-enabled autonomy can allow UAS to operate in larger swarms, with fewer personnel needed to coordinate and deconflict attacks.’ 
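As a rough, hypothetical illustration of the onboard image recognition Shaikh describes, the sketch below runs a generic, COCO-pretrained object detector over a single captured frame and keeps only high-confidence detections. The model choice, frame filename and confidence threshold are assumptions made for the example, not details of any system mentioned in this article.

```python
# Illustrative sketch only: a generic pretrained detector standing in for the
# kind of onboard image-recognition pipeline described above.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

# COCO-pretrained detector; a fielded system would use a mission-specific model
# and lighter-weight, hardware-accelerated inference at the edge.
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def detect_objects(frame_path: str, score_threshold: float = 0.5):
    """Return boxes, labels and scores for detections above the threshold."""
    image = to_tensor(Image.open(frame_path).convert("RGB"))
    with torch.no_grad():
        output = model([image])[0]          # single-frame inference
    keep = output["scores"] >= score_threshold
    return output["boxes"][keep], output["labels"][keep], output["scores"][keep]

boxes, labels, scores = detect_objects("frame_0001.jpg")   # hypothetical frame
print(f"{len(boxes)} candidate detections above threshold")
```

In practice, the per-frame detections would feed a tracker so targets can be followed between frames, and, as the vendors quoted below emphasise, the processing would need to run on low-power edge hardware rather than a workstation-class GPU.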

Dan Mor, director, products and solutions at Aitech Systems, said real-time response applications demand systems that can perform AI processing in sensors at the edge and for autonomous operations, ‘exponentially increasing computing requirements’.  

Aitech has developed an AI Small Form Factor (SFF) product line that provides computing and general-purpose graphics processing units (GPGPUs) for use in a range of military and aerospace systems, including UAS.  

Above: Aitech produces a range of AI Small Form Factor computing systems that can be used in UAS, including the A179 Lightning, pictured here. (Photo: Aitech) 

Edge computing using AI technology and communications is a leading focus of the military and aerospace market, Mor said. In UAS, he said, AI's main applications are object detection, classification and location determination and extraction, as well as target tracking and rapid, precise, real-time decision-making.

Meanwhile, OJ Seeland, product director for personal reconnaissance systems at Teledyne FLIR Defense, said the company is working on AI for UAS in numerous areas, such as voice control and automatic aided target recognition.

‘For AI to be truly relevant for UAS, we need to move more of it to the edge and enable the aircraft to make correct choices based on the situation it finds itself in,’ he said. The company expects to see both UAS and UGVs working together in much higher numbers to solve complex tasks, although Seeland said this is likely five to ten years away.  

There will be great value in the ability to command a swarm of drones to search an area that is GPS- and comms-denied, he added, giving soldiers real-time, detailed information about the battlefield. However, this would require UAVs to be able to fly inside buildings and small spaces, a major focus for the company's Black Hornet nano-drone.

‘In modern warfare very few soldiers and military assets just sit out in the open. Everything is hidden behind a treeline, in tunnels or in buildings. That’s where UAS with AI needs to go.’ 

 

Trending technology 

Several companies in the UAS sector have recently announced new products and systems that rely on AI. For example, Steadicopter’s Golden Eagle rotary-wing UAS is the first such system with precision hit capabilities, according to its developer.

One of the enabling technologies is an AI package that allows the classification and identification of targets, said Steadicopter CEO Itai Toren. The system was developed in response to customer demand, he explained: ‘We found out that there’s a great need out there for a system that can autonomously close the sensor-to-shoot cycle from an airborne platform.’

Above: Steadicopter’s Golden Eagle rotary-wing UAS is described as the first such system with precision hit capabilities. One of the enabling technologies behind this is an AI package. (Image: Steadicopter) 

In another example, SCD’s new Swift-EI shortwave IR sensor, which can be integrated in UAS, uses event-based imaging to classify particular types of lasers, said Shai Fishbein, the company’s VP for business development and marketing.  

This could enable the use of AI, he said, as the event-based imaging technique working ‘behind the scenes in picking out the data enables the next generation of AI-based systems, offering the multi-domain battlespace multispectral infrared imaging for better situational awareness, advanced automatic target detection and target handoff across platforms and forces, while increasing warrior lethality’.

Elma Electronic, meanwhile, produces integrated embedded computing hardware for sensor payloads and other systems, which can also be used to support AI in military UAS. Mark Littlefield, the company's director of system products, pointed to opportunities in the sensor domain to apply AI-based computing techniques, in everything from radar and EO/IR to electronic warfare and SIGINT.

‘AI techniques like deep learning are proving to be extremely powerful for solving this sort of problem, and modern hardware like GPGPUs and specialty system-on-chip devices designed specifically as deep learning engines is making it possible to do so.’ 

Littlefield expects new, more powerful and less power-hungry processing to emerge in the near future: ‘We are still at the very early days of the AI/deep learning revolution, so it is difficult to predict with certainty how the coming years will look, but it’s clear that deep learning-based AI will find application in interesting and surprising ways.’ 

Still, there is a range of challenges, he said. First, Littlefield noted that deep learning networks must be trained, which requires huge amounts of high-quality data. ‘Like all computing, garbage in results in garbage out,’ he warned.

Second, Littlefield said deep learning only works with what it has been trained on, and may not function correctly if presented with something novel. However, such challenges can be overcome by updating the network and providing additional training (a simple pattern for flagging such cases is sketched below).

Third, he pointed out that computational hardware for deep learning tends to be power-hungry and thus generates heat. ‘UAS platforms are often extremely SWAP-sensitive and cannot host such power and thermally demanding hardware. However, there are new, lower-power components becoming available that could be deployed on severely SWAP-constrained platforms.’
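As a loose sketch of the update-and-retrain loop mentioned above, the snippet below shows one common pattern: detections the network is confident about are used, while uncertain ones are queued for human labelling so they can later be folded back into the training data. The threshold and data structures are illustrative assumptions, not anything attributed to Elma Electronic.

```python
# Hedged illustration of handling novel inputs: confident detections are acted
# on, uncertain ones are queued so an analyst can label them and the network
# can be retrained. Threshold and queue names are assumptions for this sketch.
from collections import deque

review_queue = deque()    # frames awaiting human labelling for retraining
CONFIDENCE_FLOOR = 0.6    # below this, treat the detection as potentially novel

def triage_detection(frame_id: str, label: str, score: float) -> str:
    """Route a detection either to downstream use or to the retraining queue."""
    if score >= CONFIDENCE_FLOOR:
        return f"{frame_id}: using detection '{label}' (confidence {score:.2f})"
    review_queue.append((frame_id, label, score))
    return f"{frame_id}: low confidence ({score:.2f}), queued for analyst review"

print(triage_detection("frame_0042", "vehicle", 0.91))
print(triage_detection("frame_0043", "unknown", 0.34))
# Queued items would later be labelled and used to update (retrain) the network.
```

The design point is simply that the network's own confidence scores give a cheap signal for spotting inputs it was never trained on, which is what makes the additional training Littlefield describes practical to organise.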

 

Promising potential 

Computational power is a major bottleneck, Seeland believes. He noted that Teledyne FLIR is collaborating with Qualcomm, whose hardware boosts the capability of FLIR's software: ‘We need more computing with less power consumption.’

Mor said the advancement of AI in military UAS and more widely is an evolving and complex subject, with several technology trends expected to emerge in the near future with relevance for drones. In general, he noted that UAVs will ‘become more capable, reliable and autonomous, reducing the need for direct human control’. 

Above: There will be great value in the ability to command an AI-enabled swarm of drones such as Teledyne FLIR’s Black Hornet to search an area that is GPS- and comms-denied, giving soldiers real-time information about the battlefield. (Image: Teledyne FLIR Defense) 

He also said AI is likely to play a role in boosting cybersecurity and predictive maintenance efforts, as well as operationally: ‘AI systems will assist military commanders in decision-making by providing insights, simulations and recommendations based on data analysis. This can improve the efficiency and effectiveness of military operations.’ 

Among other trends, he pointed to a likely growing debate over the use of lethal autonomous weapons, as well as a focus on AI ethics and bias mitigation.

‘The development and deployment of military AI technologies will vary from one country to another, depending on their technological capabilities, military strategies, and policies,’ Mor said.  

‘These trends highlight the potential for increased automation and data-driven decision-making in military operations, but they also underscore the importance of ethical considerations and international cooperation in this field.’ 

Shaikh said there will continue to be political pushback against the use of AI in UAS, based partly on ‘fears of “Terminators” running wild’, and also on specific concerns, such as militaries enabling drones to identify and strike targets of interest autonomously.  

‘Technologically, we already have AI-enabled UAS. Yet wider adoption across UAS fleets will likely require a higher level of reliability and understanding of when or why AI makes strange decisions.’ 

Most people agree it is necessary to keep humans in the loop, particularly for weaponised systems, Shaikh said.  

‘Certainly, as AI advances and policy-makers and operators grow confident in these systems, they may expand oversight with one human supervisor monitoring, say, a dozen autonomous platforms. But a human will remain in the loop,’ Shaikh stressed. ‘The main questions today are where in the loop they’re operating, and how many loops they’re overseeing at once.’ 

Clearly, if the hurdles of computational power and levels of human oversight can be cleared, AI is going to feature heavily on board future uncrewed systems.
