A battle is brewing inside the Pentagon that could determine the future of American military strategy.
On Tuesday, Defense Secretary Pete Hegseth pledged to cut ties with Anthropic—one of the two AI providers authorized by the Pentagon for classified use—unless the company removed all safeguards from Claude by Friday. This comes after a January memo, in which Hegseth directed the department to only “utilize [AI] models free from usage policy constraints that may limit lawful military applications.” On Thursday, Anthropic CEO Dario Amodei refused Hegseth’s ultimatum.
In a statement published Thursday night, Amodei said Anthropic would not accommodate the Department of Defense’s request to remove the safeguards on its AI model because, “in a narrow set of cases, we believe AI can undermine, rather than defend, democratic values.”
In his letter, Amodei grants that “the Department of War, not private companies, makes military decisions.” However, Amodei refused to capitulate to Hegseth’s demands, saying that “frontier AI systems are simply not reliable enough to power fully autonomous weapons,” and “mass domestic surveillance is incompatible with democratic values.”
Amodei’s response is not surprising. He’s long warned that AI can be used nefariously and repeatedly advocated for regulation. Ironically, the very government Amodei trusted to ensure AI safety is now looking to weaponize the technology.
Unlike Amodei’s previous calls for government intervention, which would have insulated Anthropic from competition, this decision threatens Anthropic’s competitiveness.
On Tuesday, Hegseth threatened to label Anthropic a supply chain risk in the event of noncompliance. This “would ban all other DoD suppliers…from using Anthropic in their fulfillment of DoD contracts,” explains Dean Ball, a senior fellow at the Foundation for American Innovation who served as senior policy adviser for AI and emerging technology at the Office of Science and Technology Policy in 2025.
More disturbing still is Hegseth’s invocation of the Defense Production Act (DPA), which “confers upon the President a broad set of authorities to influence domestic industry in the interest of national defense.”
Among these authorities are Titles I, III, and VII. Title III grants the president the authority to subsidize certain industries via loans and purchase commitments, while Title VII allows the president to compel information from companies. Title I “is a more straightforwardly Soviet power,” says Ball, and gives the government the authority “to directly command the production of industrial goods.” With this power, the Defense Department “intends to…command Anthropic to make a version of Claude that can choose to kill people without any human oversight,” he says.
Hegseth’s demands vindicate Amodei’s mid-February warning to New York Times columnist Ross Douthat that AI can be used to undermine constitutional rights. He expressed particular concern about AI rendering public surveillance hyperlegible, empowering the government to efficiently parse and act on what is currently an overwhelming amount of data. This would “make a mockery of the Fourth Amendment by…finding technical ways around it,” he said.
On Thursday, Under Secretary of Defense Emil Michael blithely dismissed Amodei’s concerns: “Mass surveillance violating the 4th Amendment…is illegal which is why the @DeptofWar would never do it.” But an activity’s illegality does not mean the government won’t engage in it. In this case, it already has. Immigration and Customs Enforcement, for instance, has been leveraging AI-powered technology for domestic surveillance, explains Reason’s Autumn Billings.
It’s unclear how this situation will be resolved. Anthropic could forfeit its multimillion-dollar Pentagon contract, lose other business due to its designation as a supply chain risk, or even be nationalized by the feds. Still, Amodei can rest easy knowing that he has taken a stand for privacy and moral responsibility.