Agencies Prioritizing Monitoring Use Over Enforcing Immediate Cutoffs

Federal staffers are still using Anthropic's artificial intelligence models, despite President Donald Trump ordering agencies in late February to halt their use amid a feud between the Department of Defense and the company over its technology in military systems.
Trump issued the directive in a post on his social media platform, writing: "I am directing EVERY Federal Agency in the United States Government to IMMEDIATELY CEASE all use of Anthropic's technology." He added that agencies using the firm's products would have six months to phase them out.
Current and former federal employees tell ISMG the directive did not trigger any immediate or coordinated shutdown, and that internal communications in the weeks that followed focused more on gauging usage than enforcing a cutoff. Their accounts suggest that Trump's push lags operational realities, particularly within civilian agencies where AI tools are already embedded in research, coding and analytical workflows.
Staffers from agencies including the departments of State and Treasury said teams were still using Anthropic's popular Claude model, even as those agencies roll out official integrations with a version of OpenAI's ChatGPT. Agencies are also aiming to test Anthropic's Mythos system, an advanced model built to autonomously discover and help fix software vulnerabilities, with Politico reporting that the Department of Commerce's Center for AI Standards and Innovation is already evaluating its capabilities.
The continued use of Claude within civilian agencies contrasts with the administration's aggressive posture toward Anthropic, which has centered on concerns that the company retains too much control over how its models function once deployed in sensitive government environments. The Pentagon formally designated Anthropic a supply chain risk in early March, arguing that the company's ability to update or restrict its models post-deployment could undermine the reliability of systems used in national security operations (see: Pentagon Memo Blasted Anthropic for PR Campaign).
A federal appeals court in Washington has allowed the Pentagon to move forward with removing Anthropic's technology from military systems, even as parts of the policy face challenges in separate litigation, leaving the company effectively cut off from Pentagon work for now.
That posture has not translated cleanly across the civilian government, where officials are still working to understand how widely Anthropic tools are used, and what products, if any, could replace them.
Staffers who spoke with ISMG said internal communications following Trump's directive were aimed at establishing baseline visibility into usage, including which offices relied on Claude and for what kinds of work, rather than imposing immediate restrictions. Since then, the staffers said, no follow-up communication about a phase-out period has been formally relayed to teams still using Claude.
That approach reflects the practical challenge of unwinding AI adoption already underway across agencies, particularly as teams have already integrated specific AI tools into their drafting, coding, data analysis and other essential functions.
For now, the accounts from State and Treasury suggest that Anthropic's tools remain significantly embedded in day-to-day workflows. The departments of State and Treasury did not respond to requests for comment. The White House did not respond to a request for comment.