
The FHFA ordered Fannie Mae and Freddie Mac to sever ties with Anthropic, signaling how politics and regulation could shape the adoption of AI in housing finance.
Fannie Mae and Freddie Mac are severing ties with artificial intelligence company Anthropic following a dispute between the company and the federal government over restrictions on how the technology can be used.
The move was announced by Federal Housing Finance Agency Director Bill Pulte, who said the government-backed companies would terminate their use of Anthropic's products. FHFA regulates both mortgage giants.
Bill Pulte | Credit: X
The announcement did not indicate any immediate changes to mortgage underwriting or origination policies. But the latest developments highlight how the growing use of AI in housing finance may intersect with federal policy and national security concerns.
For lenders, proptech companies, and real estate professionals experimenting with AI tools, the episode is a reminder that the technology ecosystem supporting housing finance can be shaped by broader political and regulatory dynamics.
Federal Housing Corporation, Fannie Mae, and Freddie Mac are terminating all use of Anthropic products, including use of the Claude platform. https://t.co/5KxurW620h
— Pulte (@pulte) March 2, 2026
AI political flashpoints
The move appears to stem from a broader dispute between the federal government and Anthropic over restrictions the company places on how its AI technology can be used.
Anthropic has emphasized safeguards in the deployment of its models, including restrictions on surveillance and certain military applications. Federal officials objected to those restrictions, prompting the decision to cut ties with the company.
This development is notable because federal technology regulations have traditionally focused on foreign suppliers and cybersecurity risks, rather than U.S.-based AI developers.
The episode underscores the tension between national security priorities, AI governance, and the expanding use of artificial intelligence across regulated industries.
What it means for the mortgage industry
The immediate impact on the housing finance system is expected to be limited.
Fannie Mae and Freddie Mac use AI and machine learning tools across a range of functions. Moving away from Anthropic will likely mean shifting those capabilities to alternative vendors or internal systems.
Still, this episode may signal a new reality for mortgage lenders and proptech companies experimenting with generative AI. As the housing sector integrates artificial intelligence into loan processing, customer service, and fraud detection, regulators may play a larger role in determining what technology government-backed institutions can use.
The move mirrors trends in other regulated industries, such as defense, health care, and banking, where vendor selection is often shaped by federal security standards and procurement policies.
A sign of what’s to come?
The move also comes as the mortgage industry increasingly experiments with AI-powered tools.
Lenders are beginning to implement artificial intelligence for tasks such as document classification, underwriting assistance, customer communications, and compliance monitoring. Meanwhile, real estate brokerages and proptech startups are adopting generative AI for marketing, market analysis, and transaction workflows.
As the Federal Housing Finance Agency begins setting boundaries on which AI providers government-backed institutions can work with, its decisions could ripple throughout the broader housing ecosystem.
Vendors looking to sell AI tools to lenders, servicers, and housing finance institutions may find that technical performance is only part of the equation; regulatory alignment may prove just as important.
For the real estate industry, this episode offers an early glimpse into how the next phase of AI adoption will be shaped by policy as well as innovation.
Email Nick Pipitone
