Daily Management Review

AI-Enhanced Targeting Becomes Crucial as Ukrainian Drone Units Seek Battlefield Superiority


11/29/2025




On a cold morning near the eastern front, a Ukrainian drone pilot known by the call sign Mex watched his monitor intently as a small quadcopter disappeared into the haze over contested territory. The aircraft was tasked with striking a Russian armoured vehicle stationed nearly 20 kilometres away — a distance that, until recently, was virtually impossible to manage under the heavy electronic interference saturating the battlefield. Yet the strike succeeded. For Mex, the reason was clear: without an AI-assisted targeting system guiding the drone once communication links were jammed, the mission could not have been completed.
 
His assessment reflects the broader shift taking place across Ukraine’s armed forces. After nearly four years of war marked by intense electronic warfare, drone losses, and rapidly evolving Russian defences, Ukrainian drone units are increasingly depending on artificial intelligence to maintain an edge. What began as improvised battlefield innovation has matured into a structured technological push that is reshaping how drones navigate, identify targets, and strike under hostile conditions.
 
Across the front, pilots report the same pattern: Russian jamming has transformed the airspace into a zone of near-constant electronic distortion. Many drones fail before reaching their intended coordinates, cut off by disrupted GPS signals or severed radio links. AI-based autonomy is emerging as one of the few dependable tools that allows operators to penetrate this environment and deliver precision strikes when manual control becomes impossible.
 
Escalating Electronic Warfare Pushes Drones Toward Autonomy
 
One of the defining characteristics of the conflict has been the overwhelming presence of electronic warfare systems deployed by both sides. As each country manufactures millions of drones annually — ranging from cheap commercial models to custom-built strike platforms — electronic jamming operations have intensified. High-powered Russian systems in particular have expanded their range and density, creating a tangled landscape of disrupted frequencies.
 
Before Ukrainian units adopted AI-enhanced systems, drone operators relied heavily on direct manual control. Even a brief break in the signal link could send a drone spiralling down or drifting off course. But as drone warfare scaled up, the old model proved unsustainable. Pilots found themselves losing aircraft within seconds of takeoff when entering jamming zones. Critical missions were abandoned. High-value targets deep behind the lines became unreachable.
 
AI solutions were introduced to bridge this growing gap. By enabling drones to recognise, lock onto, and track targets using onboard vision-processing rather than remote commands, AI systems turned drones from vulnerable radio-controlled devices into semi-autonomous weapons capable of surviving in contested airspace. The goal was not full automation but enhanced resilience: if the drone lost communication with its pilot, it would keep flying toward the designated target based on what its onboard algorithms could interpret.
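The fallback behaviour described here — manual control while the link holds, onboard guidance once it drops — can be sketched as a tiny state machine. This is a hypothetical illustration (the `FlightController` class and its fields are invented for this sketch), not a description of any fielded system:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class LinkState(Enum):
    CONNECTED = auto()
    JAMMED = auto()

class GuidanceMode(Enum):
    MANUAL = auto()       # pilot steers directly over the radio link
    AUTONOMOUS = auto()   # onboard tracker steers toward the last lock

@dataclass
class FlightController:
    """Toy model of the 'keep flying toward the target' fallback."""
    mode: GuidanceMode = GuidanceMode.MANUAL
    target_lock: Optional[tuple] = None  # last pilot-designated target

    def update(self, link: LinkState, pilot_command=None):
        if link is LinkState.CONNECTED:
            self.mode = GuidanceMode.MANUAL
            if pilot_command is not None:
                self.target_lock = pilot_command  # refresh the lock
            return self.target_lock
        # Link jammed: fall back to onboard tracking of the last lock
        self.mode = GuidanceMode.AUTONOMOUS
        return self.target_lock

ctrl = FlightController()
ctrl.update(LinkState.CONNECTED, pilot_command=(48.5, 37.9))
heading = ctrl.update(LinkState.JAMMED)  # jamming begins mid-flight
print(ctrl.mode.name, heading)
```

The key design point is that the pilot's last designation survives the link loss, so jamming degrades the mission rather than ending it.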
 
This shift marked a fundamental change in Ukraine’s drone doctrine. Instead of relying solely on human reflexes and remote steering, units began incorporating layers of machine learning and image recognition to sustain offensive capabilities. AI became less a futuristic aspiration and more a practical necessity born from battlefield constraints.
 
Image-Based Target Tracking Offers Precision Under Jammed Conditions
 
The core of these systems lies in the drone’s ability to lock onto a visual target seen through its camera. Once an operator designates the object — a vehicle, bunker, trench, or artillery system — the AI software begins tracking that object’s characteristics: shape, colour patterns, movement signatures, thermal contrast. Even when communication is lost, the drone continues to follow the visual imprint, adjusting flight angles as needed.
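At its simplest, this kind of visual lock-on amounts to matching a stored image patch of the designated object against each new camera frame. The following is a minimal Python sketch using brute-force template matching on a synthetic frame; real systems use far more robust learned features, and every name here is invented for illustration:

```python
import numpy as np

def track_template(frame: np.ndarray, template: np.ndarray):
    """Locate a template in a frame by minimising the sum of squared
    differences — a bare-bones stand-in for the visual lock-on idea."""
    th, tw = template.shape
    best, best_pos = float("inf"), (0, 0)
    for y in range(frame.shape[0] - th + 1):
        for x in range(frame.shape[1] - tw + 1):
            patch = frame[y:y + th, x:x + tw]
            score = np.sum((patch - template) ** 2)
            if score < best:
                best, best_pos = score, (y, x)
    return best_pos

# Synthetic 20x20 frame with a bright 4x4 "vehicle" at row 12, col 5
frame = np.zeros((20, 20))
frame[12:16, 5:9] = 1.0
template = np.ones((4, 4))
print(track_template(frame, template))  # (12, 5)
```

Repeating this match on every frame, seeded from the previous position, is the essence of following a "visual imprint" once radio commands stop arriving.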
 
Pilots like Mex explain that these algorithms rely on internal “memory banks” of trained images. The database contains thousands of labelled visuals — cars, trucks, armoured vehicles, motorcycles, and various combat equipment — enabling the system to recognise objects even under suboptimal visibility. This capability becomes critical in environments with smoke, fog, dust, or rapid movement, where human operators struggle to maintain focus.
 
Accuracy improves as the drone approaches the target. If a pilot selects a point several kilometres away, the AI refines its trajectory independently, interpreting small deviations and adjusting in real time. This dramatically increases mission success rates, especially during long-distance strikes in electronically degraded zones.
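The "refines its trajectory as it closes in" behaviour can be illustrated with a basic proportional-steering loop: at each step the drone turns its heading a bounded fraction of the way toward the bearing of the target, so small deviations shrink with every correction. A hypothetical sketch, not the actual guidance law in use:

```python
import math

def steer_toward(pos, heading, target, gain=0.5, max_turn=math.radians(15)):
    """One steering step: rotate the heading toward the target bearing
    by a gain-scaled, rate-limited fraction of the bearing error."""
    desired = math.atan2(target[1] - pos[1], target[0] - pos[0])
    # Wrap the error into [-pi, pi) so the drone turns the short way
    error = (desired - heading + math.pi) % (2 * math.pi) - math.pi
    turn = max(-max_turn, min(max_turn, gain * error))
    return heading + turn

pos, heading, target = (0.0, 0.0), 0.0, (100.0, 60.0)
dists = []
for _ in range(40):  # 40 steps of 5 units each
    heading = steer_toward(pos, heading, target)
    pos = (pos[0] + 5 * math.cos(heading), pos[1] + 5 * math.sin(heading))
    dists.append(math.hypot(target[0] - pos[0], target[1] - pos[1]))
print(f"closest approach: {min(dists):.1f}")
```

Because each correction is bounded, momentary recognition glitches nudge the flight path rather than flipping it, which is why repeated small adjustments close the gap over a long approach.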
 
However, pilots caution that the technology is far from flawless. Weather, lighting, camouflage, and terrain complexity can confuse recognition software. In fast-changing environments such as trench networks or forested areas, the AI may misinterpret similar-looking objects or lose the target altogether. Despite these limitations, its reliability already surpasses traditional manual-only guidance in high-interference settings.
 
Rapid Technological Evolution Driven by Frontline Necessity
 
Ukraine’s embrace of AI-enabled drones is not the result of a single innovation but rather a cumulative evolution. Early in the war, pilots improvised with consumer electronics, attaching makeshift stabilisation systems and low-cost cameras. By late 2023, Ukrainian engineers and volunteer tech groups began producing custom firmware capable of recognising basic patterns. The turning point came when Russian jamming intensified in late 2024, accelerating efforts to integrate machine learning into frontline drone production.
 
Today, dozens of Ukrainian companies and military tech units develop or refine AI-assisted systems. The scale ranges from small teams creating lightweight guidance modules for individual drones to larger organisations designing multi-drone coordination tools. These tools allow several drones to approach a target from different angles even when communication links are partially disrupted, reducing interception risks.
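The multi-angle approach these coordination tools enable can be illustrated by spreading a group of drones evenly across an arc of bearings around the target, so a defender cannot face every axis at once. A toy sketch — the function and its parameters are invented for illustration:

```python
def approach_bearings(n: int, spread_deg: float = 90.0,
                      centre_deg: float = 0.0):
    """Distribute n approach bearings evenly across an arc centred on
    centre_deg — an illustrative stand-in for multi-drone deconfliction."""
    if n == 1:
        return [centre_deg]
    step = spread_deg / (n - 1)
    start = centre_deg - spread_deg / 2
    return [start + i * step for i in range(n)]

print(approach_bearings(3))  # [-45.0, 0.0, 45.0]
```

Each drone can be handed its bearing before launch, so the geometry holds even if the inter-drone links are later degraded.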
 
The rapid pace of development is driven by constant feedback from operators. Drone pilots test prototypes directly on the battlefield, often modifying algorithms daily based on observed failures or unexpected jamming patterns. This real-time iteration cycle has few historical precedents; the war has effectively turned Ukraine’s frontlines into an experimental ecosystem for tactical AI applications.
 
Yet the speed also introduces challenges. Standardisation remains limited, and software varies across units. Performance differences become apparent when complex weather conditions or terrain variations test the system’s recognition accuracy. Engineers emphasise that AI guidance complements, not replaces, skilled piloting — the human remains responsible for authorising every strike.
 
Ethical and Strategic Concerns Amid Rising Autonomy
 
The growing reliance on AI in Ukrainian drones has inevitably raised ethical and strategic concerns. International norms governing autonomous weapons are still largely undeveloped, and global frameworks have not kept pace with rapid battlefield innovation. Ukraine maintains a firm policy: while drones may navigate and track targets autonomously, the decision to strike always requires human approval.
 
This principle aligns with broader concerns about the risks of fully autonomous lethal systems. If a drone misidentifies a target due to faulty image processing, misconfigured databases, or battlefield confusion, the consequences could be severe. Ukrainian commanders argue that human oversight is essential not only for ethical legitimacy but also for tactical accuracy, as human judgement remains superior at interpreting unusual or ambiguous battlefield scenarios.
 
Nevertheless, the conflict is pushing the boundaries of what constitutes autonomy. As AI-driven drones operate farther from their pilots and as electronic warfare further erodes real-time communication, the line between assisted guidance and independent action grows thinner. Ukraine’s insistence on maintaining human control reflects both moral considerations and strategic caution: preserving accountability while preventing overdependence on unpredictable algorithms.
 
Increasing AI Deployment Signals a New Phase of Warfare
 
As Russia expands its own AI-enabled drone capabilities, the competition for technological advantage intensifies. Ukraine’s push to integrate AI into thousands of drones signifies a broader transformation in modern warfare — one where electronic dominance, machine learning, and autonomous systems increasingly shape tactical outcomes.
 
For pilots like Mex, AI is not an abstract concept. It is the difference between a drone crashing uselessly into a field and a precision strike hitting a critical enemy asset. The technology does not replace his judgment, but it empowers it, extending his reach into places where human control alone cannot survive.
 
In a war defined by innovation under pressure, AI-enabled drones have become one of Ukraine’s most vital tools — a fusion of necessity, ingenuity, and urgency that continues to reshape the battlefield every day.
 
(Source: www.globalbankingandfinance.com)