Earlier this month, the Ukrainian military claimed to have shot down a Russian fighter jet with missiles fired from an unmanned naval drone. The development was emblematic of a conflict in which the battlefield has become a testing ground for new technologies, including drones and artificial intelligence. And those advances are hastening the day when machines that can kill people operate entirely autonomously.
Artificial intelligence can allow drones to operate with greater autonomy and is playing a growing role in the Ukraine conflict, which is frequently referred to as a drone war. Of the almost 2mn drones that Ukraine acquired in 2024, 10,000 were AI-enabled, according to a recent report by Kateryna Bondar, a fellow at the Center for Strategic and International Studies who previously worked as an adviser to the Ukrainian government.
The AI-enabled drones in Ukraine range hugely in appearance, capability, price and size. They run from cheap consumer drones embedded with a chip and software based on open-source AI, constructed in underground workshops, to sophisticated models manufactured by western companies such as US-based Anduril and Shield AI, or German start-up Helsing. But the basic principle is the same.
“In its simplest definition, an AI-enabled drone is a drone where certain core functionalities have been replaced by artificial intelligence, taking over from a human always having to have control over it”, explains Ned Baker, UK managing director and head of Ukraine operations at Helsing. The AI specialist and defence tech start-up — valued at €4.95bn in 2024, three years after being founded — announced in February the sale of 6,000 of its new AI-enabled HX-2 attack drones to Ukraine, following a previous order of 4,000 HF-1 drones.
AI has become particularly important in Ukraine due to the prevalence of electronic warfare systems that can block communications with a drone’s operator as well as GPS.
Faced with this, AI “can replace the functionality that is made impossible”, enabling drones to navigate, target, and communicate with other drones when “the link between operator and drone has become disrupted”, Helsing’s Baker says.
AI-enabled drones can use computer vision — the same technology that has been available in commercial drones for a decade for purposes such as skiers filming clips of themselves — to navigate and identify targets autonomously, Bondar explains. The array of AI technologies available can “make a drone a fully autonomous weapon system”, she says — although, as she stresses, no drones are yet being operated fully autonomously in Ukraine, with humans always kept “in the loop” to approve actions.
Certain weapons such as missiles have for decades had autonomous elements — but the difference with AI-enabled drones is “decision-making capabilities”, says Bondar. Whereas missiles follow “preprogrammed paths” and an “algorithm which was created by a human”, AI allows a drone to “actually fly and see and analyse and orientate . . . without any preprogrammed pathway”.
The technology comes with practical and ethical challenges. For one, while “defence manufacturers promise a lot”, the reality in battle is often very different, says Nick Reynolds, a research fellow at British defence think-tank RUSI. Real-world data is important, but data from the war in Ukraine is not publicly available — although some suppliers, such as Helsing, have some access through their government contracts. “If you don’t have the battlefield data, as a company, you’re going to really struggle”, he adds.
Another obstacle is the cost and complexity of AI technology, including the chips required, combined with the inherently disposable nature of military drones, explains Reynolds. Bondar cites the use of “primitive” tethered drones — which are attached to a long fibre-optic cable to bypass signal-jamming systems — as “proof that AI software is really hard”, requiring resources, time and expert knowledge.
The changing needs of those on the battlefield — as well as the constant innovations in AI — pose a further challenge. “Things change on the front line every two weeks”, Baker says. To help with this, Helsing releases fortnightly software updates that allow drone users to “access a new capability in the existing system” — Baker says “it’s the iPhone thing, but on the battlefield”.
The semi-autonomous use of AI drones in Ukraine — with a human always in the loop — is partly due to a high error rate, says Bondar. “People don’t trust machines yet,” she explains. Helsing’s Baker says it is more about “policy and ethical considerations”. “We’re going through a transitionary period on the battlefield at the moment”, Baker adds: “a combination of the AI and the human working in tandem . . . rather than it being 100 per cent human, which it would be two years ago, or 100 per cent AI, which it might be in two years’ time”.
The prospect of weapons acting with full autonomy, with no human oversight, has “very serious ethical implications”, Reynolds acknowledges, with a need for regulation to consider “how we mitigate unnecessary harm”, among other things. He fears, however, that the technology has already advanced beyond such considerations. Bondar agrees, describing regulation as “really hard”, given both the rapidly developing technology and the necessity-driven nature of warfare.
Regulatory debates are yet to fully materialise. In the meantime, AI-enabled drones will continue to be “on every major and non-major military power’s agenda”, says Baker. “AI on the battlefield will be easily as important as the advent of gunpowder, advent of machine guns, advent of tanks — but in completely unknown ways as of yet.”