
Anthropic Strikes $200 Billion Deal with Google for Cloud and Chip Access

Anthropic, the startup behind the Claude family of large language models, has sealed a five‑year agreement under which it will pay Google $200 billion for access to the tech giant's cloud infrastructure and custom AI chips. The Information first reported the deal earlier this month, and the figure confirms the partnership as one of the largest in the nascent AI‑cloud ecosystem.

The contract joins a string of similarly hefty arrangements, including Anthropic’s recent multi‑billion‑dollar pact with Amazon. Together, these deals are driving what analysts describe as a $2 trillion revenue backlog for the four major cloud providers—Google, Amazon, Microsoft, and Oracle—that have positioned themselves as early investors in the AI boom.

Google’s role goes beyond simply renting server space. The agreement grants Anthropic access to the company’s proprietary tensor processing units (TPUs), hardware designed specifically for AI workloads. By coupling cloud compute with specialized chips, Anthropic aims to accelerate the training and deployment of its Claude models while sidestepping the need to build its own data‑center hardware.

Industry forecasts highlight the scale of the financial commitment. Projections released last year estimated that OpenAI could spend as much as $45 billion on servers by 2026, while Anthropic’s own server costs were expected to hit $20 billion in the same period. The new Google deal, spread over five years, works out to roughly $40 billion a year, double Anthropic’s projected 2026 server spending.

Chipmakers are also feeling the ripple effect. NVIDIA, for example, has deepened its ties with OpenAI through direct investments and supply agreements, a pattern that mirrors Google’s approach with Anthropic. These intertwined relationships suggest that the AI boom is as much about hardware supply chains as it is about software breakthroughs.

Critics argue that the sheer size of these contracts may not be sustainable in the long run. Data centers consume vast amounts of electricity and rely on a finite supply of high‑performance memory chips, which have faced periodic shortages. As demand for AI compute continues to surge, prices for related components could rise, squeezing margins for both cloud providers and the startups that depend on them.

Nevertheless, the deal reflects a broader strategic bet by Google. By locking in Anthropic’s workload, the search‑engine giant secures a steady stream of revenue while ensuring its AI hardware remains in high demand. For Anthropic, the partnership provides the compute horsepower needed to compete with rivals like OpenAI, whose own agreements with Microsoft have already reshaped the competitive landscape.

“We’re looking at an ecosystem where the biggest cloud players are also the biggest investors,” said an industry analyst who asked to remain anonymous. “These contracts lock in both revenue and technological advantage, but they also tie the fate of AI startups to the health of the underlying hardware market.”

As AI models grow larger and more complex, the financial stakes of powering them will only intensify. Whether the current model of massive, multi‑year cloud‑chip contracts can endure remains an open question, but for now, Anthropic’s $200 billion pledge marks a milestone in the commercialization of artificial intelligence.


Source: Engadget
