By Meline Tarpinyan
Edited by Gabrielle Adams

In a memorandum titled “Unleashing U.S. Military Drone Dominance,” Secretary of Defense Pete Hegseth outlined the U.S. Army’s strategy to ramp up production of low-cost, AI-powered drones. The memorandum used notably strong language, emphasizing the need to harness American technological capabilities while removing restrictions on weapon lethality. The document reflects a broader trend: the U.S. government and other major world powers are racing to incorporate the latest technologies into their militaries, especially systems powered by artificial intelligence.
The cost-effectiveness and improved accuracy of AI drone technologies have prompted many militaries to develop their own versions of these systems. By integrating AI into warfare, governments can reduce the need to train, pay, and sustain soldiers, expanding their capabilities while spending less. Ukraine, for example, has revolutionized drone technology to cope with a severely constrained budget and a far smaller population in its defense against Russia. For just $500 each, Ukraine’s unmanned aerial vehicles (UAVs) can destroy enemy artillery worth millions of dollars, faster and more accurately than ever, while reducing the risk to its soldiers and its own artillery.
That said, UAV technology has raised ethical concerns about its potential to escalate conflict, given both its technological unreliability and the ambiguity surrounding accountability for human rights violations. When faced with situations absent from their training data, AI systems can malfunction and respond unpredictably, potentially harming civilians in the process. As human judgment is increasingly removed from battlefield decisions, the risk grows of larger-scale “flash wars”: conflicts in which military systems misinterpret data and trigger unintended escalations before humans can intervene. Furthermore, the UN warns that advances in AI warfare could produce autonomous drone swarms that disproportionately target specific areas, raising concerns under the principles of proportionality and precaution in international humanitarian law.
According to Kanaka Rajan, founder of the Kempner Institute for the Study of Natural and Artificial Intelligence at Harvard, tech companies working with military organizations have been unclear about the level of autonomy in their weapon systems. This lack of transparency raises concerns, as some systems may operate without human oversight to verify targets and make ethical decisions, putting the lives of individuals at the mercy of algorithms.
When human rights violations occur in this context, establishing accountability becomes increasingly difficult, as these weapons cannot be held responsible the way humans can. UN Special Rapporteur Christof Heyns stated that the use of lethal autonomous weapons “may be unacceptable because no adequate system of legal accountability can be devised for the actions of machines.” Without any form of international governance to check the recent surge in investment in and use of AI weaponry, the risk of unplanned warfare and larger-scale civilian loss is higher than ever.
In an age of heightened global tensions, technological rivalries, and rising fears of large-scale conflict, countries are competing to build superior military systems, relying less on human resources and more on artificial intelligence to conduct warfare. The growing reliance on AI and private innovation may bring unprecedented efficiency and capability to national defense, but, more worrisome, it challenges traditional frameworks of accountability, oversight, and ethics. Left unchecked, the development and use of lethal autonomous weapons could erode the moral and legal boundaries that have long governed armed conflict. Without clear international regulation and accountability, the race for technological superiority may lead not to greater security but to unprecedented instability and humanitarian strife.
