OpenAI is revising its hastily struck agreement to provide artificial‑intelligence tools to the U.S. Department of War after the company’s chief executive acknowledged that the arrangement appeared “opportunistic and sloppy.”
The contract sparked concerns that the San Francisco firm’s AI might be used for domestic mass surveillance. But its chief executive, Sam Altman, said on Monday night that the technology would be expressly barred from such use, and from deployment by defense‑department intelligence bodies such as the National Security Agency.
OpenAI, whose ChatGPT service counts over 900 million users, struck the deal almost immediately after the Pentagon’s prior AI supplier, Anthropic, was dismissed.
Anthropic had argued that “using these systems for mass domestic surveillance is incompatible with democratic values,” prompting President Donald Trump to label Anthropic “left‑wing nut jobs” and to order the federal government to cease using its technology.
Although OpenAI denied that the pact permitted surveillance, commentators invoked the 2013 Snowden revelations, when it emerged that the NSA had been collecting large volumes of phone and internet data.
The agreement provoked an online backlash, with users on X and Reddit urging a “delete ChatGPT” movement. One post read: “You’re now training a war machine. Let’s see proof of cancellation.”
Anthropic’s chatbot Claude surged to the top of Apple’s App Store rankings, overtaking ChatGPT, according to Sensor Tower data.
In a memo to staff reposted on X, the OpenAI chief said the original deal announced on Friday had been concluded too hastily after Anthropic’s removal.
“We shouldn’t have rushed to get this out on Friday,” Altman wrote. “The issues are extremely complex and require clear communication. We were genuinely trying to de‑escalate and avoid a far worse outcome, but it simply looked opportunistic and sloppy.”
When the deal was first announced, OpenAI claimed the contract contained “more guardrails than any previous agreement for classified AI deployments, including Anthropic’s.”
Nevertheless, the prospect of AI use by the U.S. military has alarmed nearly 900 workers at OpenAI and Google, who have signed an open letter urging their leaders to refuse Department‑of‑War requests for surveillance and autonomous killing capabilities.
Warning that the government was attempting to “divide each company with fear that the other will give in,” the signatories wrote: “We hope our leaders will set aside their differences and stand together to continue rejecting the DoW’s current demands for permission to use our models for domestic mass surveillance and autonomous killing without human oversight.”
The letter bears the signatures of 796 Google employees and 98 OpenAI staff. In a blog post announcing the DoW contract, OpenAI said one of its red lines was “no use of OpenAI technology to direct autonomous weapons.”