Stopping Proliferation of Autonomous Killer Drones
- Emil Knutsson
- Sep 9, 2023
- 7 min read
Updated: Sep 10, 2023

Screenshot from the 2017 video "Slaughterbots"
In the annals of modern warfare and defense technology, few developments have garnered as much attention—and concern—as the rapid evolution of drones. These unmanned aerial vehicles, once a mere supplementary tool in the military's arsenal, have swiftly metamorphosed into autonomous agents of power, propelled by advancements in artificial intelligence and decreasing production costs.
This article explores "killer drones": small autonomous quadcopter drones equipped with explosive devices and triggered by algorithms. Their capabilities bring both promise, in the form of increased precision and reduced human risk, and concern, presenting new ethical, strategic, and regulatory issues. It delves into the future threat of autonomous killer drones, the challenges they pose to traditional arms control, and the pressing need for international intervention in this uncharted territory.
While there have been significant advancements in AI-driven target tracking and drone technology, particularly since the start of the war in Ukraine, killer drones are not yet commonplace. In recent years, however, drones have become markedly more affordable and more autonomous. According to Grand View Research, investment in the commercial drone market is predicted to increase by 266% over the next decade.
As a result, we can anticipate further gains in drone autonomy, larger payload capacities, and expanded flight ranges, coupled with reduced costs and greater ease of use. Excluding explosive components, every aspect of these drones is being improved by the commercial sector, which means the effectiveness of killer drones will continue to advance.
At the dawn of the 21st century, the USA, with its colossal military budget, stood as the sole nation operating drones capable of executing precision strikes. Today, various non-state entities have employed this technology to conduct precision attacks, albeit on a smaller scale in terms of payload, range, and endurance.
Historically, the military sector has been the primary driver behind the development and proliferation of drone technology. From initial surveillance applications to more advanced precision strike capabilities, militaries worldwide have pushed the boundaries of what's possible with drones. Defense departments, in their pursuit of tactical and strategic advantages on the battlefield, have heavily invested in research and development, leading to rapid advancements in drone technologies. Key collaborations with defense contractors and tech firms have resulted in innovations like enhanced AI algorithms, longer flight durations, and more efficient target acquisition methods. Many of today's commercial drone applications have, in fact, drawn inspiration from military prototypes or have been direct adaptations of technology first introduced in military contexts.
Amidst this proliferation, the democratization of drone technology underscores a crucial shift. No longer confined to the dominion of superpowers with vast resources, drone capabilities are now accessible to smaller nations and even non-state actors. This decentralization not only alters the dynamics of modern conflict but also amplifies the urgency to establish international norms and controls over their usage.
With the emergence of more affordable and precise systems like the MQ-1 Predator, the U.S. demonstrated a greater inclination to use them, as noted in the Air & Space Power Journal. Being more precise and cost-effective while not jeopardizing human pilots, these drones became a preferred alternative to traditional aircraft bombing during the war on terror. Moreover, this technological evolution hints at a future where states might be more inclined to deploy cheap killer drones equipped with AI algorithms that select targets based on a given profile, such as age, gender, or ethnicity.
The potential use of autonomous weapons isn't inherently negative. Used responsibly, they could reduce civilian casualties in wars and counterterrorism operations thanks to their heightened precision. However, there's a compelling argument that allowing militaries to deploy killer drones could normalize their usage and speed up the technological arms race. This could compel states to delegate ever more decisions to machines, as human reactions would be too slow. Such a shift may lead to the outbreak of a "flash war." The term draws inspiration from the 2010 flash crash, in which trading algorithms on Wall Street interacted unpredictably, temporarily erasing roughly a trillion dollars of market value in mere minutes. The exact cause of the crash remains unclear. Similarly, we might inadvertently initiate a war driven by algorithmic interactions we cannot fully comprehend or control.
Building on these concerns, the rapidly falling barriers to entry, driven by the commercial sector, also raise alarms about non-state actors. A 2016 report from the Combating Terrorism Center, which documents the use of drones in combat by various non-state actors, states that groups such as ISIS managed to build combat drones from readily available consumer products. Given the advancements in autonomy, longer flight times, heavier payloads, and increasing accessibility, the prospects ahead are troubling. One could imagine a group like ISIS, known for its interest in drones, mounting a terrorist attack with consumer drones and open-source AI models for target acquisition, targeting victims of a specific ethnicity, age, or gender.
As the commercial sector persistently seeks to lower drone technology costs, the tight-knit relationship between private enterprises and military endeavors becomes evident. The plummeting costs of this technology extend its reach far beyond national military arsenals, putting it into the hands of rogue organizations and even individual actors. This democratization of lethal force amplifies the scale of potential threats exponentially. Compounding this issue is the inherent anonymity that autonomous drone attacks can provide, making the attribution of these acts increasingly challenging. The convergence of these factors not only escalates global security risks but also muddies the waters of accountability, creating a volatile environment rife with potential for misuse and misunderstanding.
In the ever-evolving landscape of warfare, AI-powered drones represent the latest intertwining of technology and strategy. Beyond their tactical advantages, they reflect a deeper societal change: a growing reliance on machines over human judgment, often seen in our daily lives and now magnified on the battlefield. But as the lines between machine-driven efficiency and human conscience blur, we grapple with fundamental questions about the direction and implications of technological progress. Are we paving a path to a safer future, or are we teetering on the brink of a new era of conflict, dictated more by cold algorithmic decisions than by human values? This delicate balance between progress and peril demands introspection, not just by nations but also by the industries that fuel these advancements.
Bridging the technological and ethical landscape of killer drones to the broader realm of arms control, it becomes clear that history offers valuable lessons. As with earlier arms, the push for controls and restrictions on drone usage can be seen as part of a continuum. Each new weapon technology has been met with global concern and, in many instances, collective efforts to regulate or prohibit their use. As we face the challenges posed by autonomous drones, it is worth revisiting these historical precedents, understanding their successes and shortcomings, to chart a viable path forward for the responsible management of modern warfare technology.
Navigating the challenges of AI arms control, however, requires tackling some foundational questions, the most immediate of which is defining what constitutes an "autonomous weapon." For instance, does a cruise missile that employs optical target recognition to engage sea vessels qualify? What about modern air-to-air missiles that leverage image recognition to circumvent countermeasures? Effective arms control requires universally agreed-upon definitions of what an autonomous weapon encompasses. By refining broad terms like "autonomous weapon" into more specific descriptors such as "killer drone" (which could be defined as a quadcopter under a certain weight, equipped with an explosive device that destroys the drone when detonated, and capable of selecting and engaging targets without a human), we can sidestep some of the ambiguity surrounding the broader term.
Building upon this foundation of clear definitions, we find that effective arms control measures often leverage past successes. As noted by the Center for New American Security, the 2008 ban on cluster munitions likely stemmed from the successful 1997 ban on antipersonnel mines. Similarly, the ban on chemical and biological weapons emerged from a longstanding sentiment against the use of poisons that dates back to ancient times.
An incremental approach is crucial in new fields like AI arms control. Starting with manageable steps, such as countering killer drones, doesn't require major players to relinquish significant military capabilities. The primary focus of the treaty should be to counteract the unchecked spread of drone technology driven by commercial incentives. By addressing the decentralization of drone technology, the treaty aligns with the security interests of major global actors. This alignment boosts the likelihood of successfully establishing an arms control treaty. Once in place, such a treaty can pave the way for more comprehensive and stringent control measures in the future.
The great challenge with killer drones lies in their deep integration with the commercial sector. Any regulation aimed at these drones could inadvertently impact commercial applications of drone technology. Drones serve a myriad of beneficial purposes, including healthcare, agriculture, and search and rescue operations. Therefore, treaty efforts designed to mitigate the risks of killer drones must carefully consider the broader implications for the entire drone market.
Innovative solutions are required, as past arms control measures have not grappled with the complexities of regulating software or the commercial realm. Potential strategies could include implementing traceable features in drone technology for both complete and incomplete drone hardware. Embracing open-source solutions might foster trust and transparency, enhancing the chances of successful regulation. Another consideration could be mandating treaty signatories to ensure their domestic firms contribute to a global, open-access database, documenting drone sales, quantities, and intended use.
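To make that last idea concrete, below is a minimal sketch of what a record in such an open-access sales database might look like, assuming a simple Python data model. The DroneSaleRecord and OpenRegistry names, the field choices, and the review threshold are hypothetical illustrations for this article, not part of any existing standard or treaty text.

```python
# Hypothetical sketch of a public drone-sales registry, as proposed above.
# All names, fields, and thresholds are illustrative assumptions.
from dataclasses import dataclass, field, asdict
from datetime import date
import json


@dataclass
class DroneSaleRecord:
    """One reported sale of complete or incomplete drone hardware."""
    manufacturer: str
    model: str
    quantity: int
    max_payload_kg: float       # declared payload capacity
    complete_airframe: bool     # False for kits or partial hardware
    buyer_country: str
    declared_use: str           # e.g. "agriculture", "search and rescue"
    sale_date: date
    hardware_id_prefix: str     # traceable serial-number prefix for the batch


@dataclass
class OpenRegistry:
    """Append-only, publicly queryable list of reported sales."""
    records: list[DroneSaleRecord] = field(default_factory=list)

    def report(self, record: DroneSaleRecord) -> None:
        """Add one reported sale to the registry."""
        self.records.append(record)

    def flag_unusual(self, quantity_threshold: int = 500) -> list[DroneSaleRecord]:
        """Return sales a reviewer might want to examine, e.g. very large batches."""
        return [r for r in self.records if r.quantity >= quantity_threshold]

    def to_json(self) -> str:
        """Serialize the registry so it can be published as open data."""
        return json.dumps(
            [{**asdict(r), "sale_date": r.sale_date.isoformat()} for r in self.records],
            indent=2,
        )


if __name__ == "__main__":
    registry = OpenRegistry()
    registry.report(DroneSaleRecord(
        manufacturer="ExampleCorp", model="AgriScout-2", quantity=1200,
        max_payload_kg=4.5, complete_airframe=True, buyer_country="SE",
        declared_use="agriculture", sale_date=date(2023, 9, 1),
        hardware_id_prefix="EC-AS2-2023"))
    print(len(registry.flag_unusual()), "sale(s) flagged for review")
```

An append-only, openly serializable registry along these lines would serve the transparency goal: anyone could audit reported sales, and unusually large or incomplete-hardware batches could be flagged for human review.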
In conclusion, the rapid evolution of drone technology, driven largely by advancements in AI and the commercial sector, is an inevitable march of progress. Yet, as with all significant technological leaps, it brings with it both promise and peril. The increasing autonomy and capabilities of drones, especially "killer drones," underscore a pressing need for international guidelines and controls. The dual challenge is to sustain technological advancement while safeguarding ethical and global security considerations. Tackling this challenge requires a holistic approach, involving not only nations and international bodies but also the commercial entities at the heart of this innovation. History teaches us that proactive measures, built upon the foundations of past successes, can shape the trajectory of technological warfare for the better. As we stand at this critical juncture, collective global action will determine whether drones become instruments of unchecked chaos or tools that, while powerful, are harnessed responsibly for the greater good.