
Pentagon’s AI contracts spur autonomous weapons, surveillance debate

After Anthropic rejected and OpenAI accepted the Defense Department’s terms, an expert says the US military’s reliance on fluid domestic definitions, with no international law filling the gap, creates legal loopholes for mass surveillance and autonomous weapons

Firdevs Bulut Kartal  | 02.03.2026 - Update : 02.03.2026

ANKARA

Tensions are mounting between artificial intelligence (AI) companies and the US Department of Defense, fueling debate over the limits of military AI use, contractual safeguards, the potential role of tech firms in mass surveillance, and the broader implications of state oversight through such systems.

Anthropic CEO Dario Amodei recently said the company declined to accept the Pentagon’s proposed contract terms, citing insufficient safeguards related to its Claude AI model. He warned that removing certain protections could enable the model’s use in mass surveillance of Americans or its integration into fully autonomous weapons systems.

The Defense Department rejected those assertions. Spokesperson Sean Parnell said the military has no intention of surveilling US citizens or developing autonomous weapons without human involvement.

In a parallel development, OpenAI CEO Sam Altman announced that his company reached an agreement with the Pentagon. He said OpenAI’s models would operate on a classified network and that the deal includes explicit prohibitions on mass surveillance, as well as requirements for human oversight in military applications.

However, legal experts caution that referencing such principles in contracts may not provide sufficient real-world guarantees.

Mustafa Tuncer, an international law expert at Istanbul’s National Defense University, told Anadolu that the dispute stems from deep-seated legal and technical ambiguities. He noted that “mass surveillance” is loosely defined under US domestic law, making commitments in this area potentially uncertain.

He added that definitions of “autonomous weapons” derive from documents such as the US Defense Department’s 2023 directive, which could evolve over time, creating further uncertainty regarding the scope and durability of current assurances.

Tuncer emphasized that statements outside formal AI contracts do not alter legal liability if ethical or legal concerns arise later. While the US maintains a national framework governing autonomous systems, he said it remains unclear how binding these rules would be in active combat situations.

According to Tuncer, the Defense Department defines autonomous weapons systems as those capable, once activated, of selecting and engaging targets without further human intervention, though such systems may also include mechanisms to monitor and terminate their own operations if necessary.

He noted that international humanitarian law does not directly regulate the use of AI in warfare. Instead, existing rules governing new weapons are generally considered applicable to AI-powered and autonomous systems. Still, the development of legal oversight is progressing through varying national practices rather than a unified global framework.

“The core principles of international humanitarian law, such as distinction between military targets and civilians, remain binding,” Tuncer said. “But with AI integrated into military operations, it is unclear how these principles are implemented in practice and what concrete limitations apply.”

He further observed that the concept of “human oversight” remains broadly defined and subject to change under both US and international law. This legal fluidity, he warned, creates ongoing risks for technology companies collaborating with military institutions.

Tuncer concluded that contracts between tech firms and defense authorities must clearly define key terms -- including mass surveillance and autonomous weapons -- to protect agreements from potential shifts in future domestic legislation.

*Writing by Emir Yildirim
