TLDR
- Google will supply up to 1 million specialized AI chips (TPUs) to Anthropic in a deal worth tens of billions of dollars
- The arrangement will bring over 1 gigawatt of computing capacity online by 2026 to support Anthropic’s AI model training
- Anthropic’s annual revenue run rate is approaching $7 billion, with Claude powering over 300,000 businesses
- Google has invested $3 billion total in Anthropic, while Amazon has committed $8 billion as another major partner
- The deal allows Anthropic to maintain its multi-cloud strategy across Google, Amazon, and Nvidia infrastructure
Google and Anthropic announced a major cloud computing partnership on Thursday that will give the AI startup access to up to 1 million tensor processing units, specialized chips designed to accelerate machine learning workloads.
Google $GOOGL and Anthropic just officially announced their cloud partnership
Anthropic gets access to up to one million of Google’s custom-designed Tensor Processing Units, or TPUs.
The deal will add over a gigawatt of compute capacity by 2026, supporting Anthropic’s surging $7… pic.twitter.com/w5dCJDv7Up
— Evan (@StockMKTNewz) October 23, 2025
The deal is worth tens of billions of dollars. It represents one of the largest commitments in the AI hardware sector to date.
The TPUs will be deployed starting in 2026. They will bring more than a gigawatt of capacity online to support Anthropic’s computing needs.
Industry estimates place the cost of a 1-gigawatt data center at around $50 billion. About $35 billion of that amount typically goes toward chips.
Anthropic needs the computing power to train and run its Claude family of large language models. These AI systems compete with OpenAI’s GPT models and Google’s own Gemini AI.
The partnership deepens the existing relationship between Google and Anthropic. Google has already invested approximately $3 billion in the AI startup.
That includes $2 billion invested in 2023 and another $1 billion in early 2025. Google ranks as the third-largest cloud provider behind Amazon and Microsoft.
Anthropic’s Multi-Cloud Strategy
Anthropic maintains partnerships with multiple cloud providers rather than relying on a single vendor. The company runs its Claude models across Google’s TPUs, Amazon’s Trainium chips, and Nvidia GPUs.
Each platform handles specialized workloads like training, inference, and research. This approach allows Anthropic to optimize for price, performance, and power constraints.
The multi-cloud architecture proved its value during Monday’s AWS outage, when Claude continued operating without disruption thanks to its diversified infrastructure.
Amazon remains Anthropic’s largest investor with $8 billion committed to date. AWS is considered the company’s primary cloud provider.
Amazon built a custom supercomputer for Claude called Project Rainier. It runs on Amazon’s Trainium 2 chips.
Revenue Growth and Market Position
Anthropic’s annual revenue run rate now approaches $7 billion. The company serves more than 300,000 businesses worldwide.
That represents a 300-fold increase over the past two years. The number of large customers contributing over $100,000 in annual revenue has grown nearly sevenfold in the past year.
Claude Code, the company’s coding assistant tool, generated $500 million in revenue within two months of launch. Anthropic claims this makes it the fastest-growing product in history.
Rothschild & Co analyst Alex Haissl estimated that Anthropic added one to two percentage points to AWS growth in recent quarters. The contribution is expected to exceed five percentage points in the second half of 2025.
Anthropic recently closed a $13 billion funding round led by Iconiq Capital. The financing nearly tripled the company’s valuation to $183 billion.
Anthropic was founded by former OpenAI researchers. The company maintains control over its model weights, pricing, and customer data across all cloud partnerships.



