TLDR
- Arm Holdings has launched its first-ever in-house chip, the AGI CPU, targeting AI data centers
- Meta Platforms is the lead partner, with OpenAI, Cloudflare, SAP and SK Telecom also signed up as customers
- The chip is built on TSMC’s 3-nanometer process, with volume production set for the second half of 2026
- The move marks a major shift from Arm’s traditional IP-licensing model to competing in physical silicon
- Arm expects the chip to add billions in annual revenue; Wall Street forecasts $4.91 billion in revenue for the current fiscal year
Arm Holdings has unveiled its first-ever in-house chip, the AGI CPU, a data center processor purpose-built for agentic AI workloads. The announcement sent ARM stock up 1.43% on Tuesday.
JUST IN: $ARM competes for data center revenue with AI chip designed in partnership with $META.
— LuxAlgo (@LuxAlgo) March 24, 2026
CEO Rene Haas called it “a very pivotal moment for the company,” speaking to Reuters on the day of the launch event in San Francisco.
For over 35 years, Arm has operated as the so-called Switzerland of the chip industry — licensing its architecture to the likes of Apple, Nvidia, Qualcomm and Amazon, and collecting royalties on every unit shipped. The AGI CPU changes that model entirely.
The chip is designed for agentic AI, a fast-growing category where AI systems take actions on behalf of users with minimal human input. Unlike chatbot-style AI, agentic tasks demand heavy general compute — exactly where CPUs, not GPUs, shine.
Arm’s AGI CPU is priced to compete. The company won’t disclose exact figures, but analyst Patrick Moorhead of Moor Insights predicts it will run in the thousands of dollars per unit. Mohamed Awad, Arm’s head of cloud AI, told CNBC it would be “competitively priced.”
Meta Signs On as Lead Partner
Meta Platforms is the launch customer, a stamp of approval that carries real weight. Meta is currently spending up to $135 billion on capital expenditures this year and is building out multiple gigawatts of AI data center capacity.
Meta software engineer Paul Saab, who worked on the project from its start in 2023, said the chip offers “a lot more flexibility in our software stack and in our supply chain.” He added that the intent was always to make it openly available, not just to Meta internally.
Moorhead put the potential upside bluntly: “Let’s say they get 5% of Meta’s $115 to $135 billion capex going into the future. That is a game changer on the top line for them.”
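As a rough sanity check on Moorhead's scenario, the arithmetic is simple. The 5% share is his hypothetical, not guidance from Arm or Meta; the capex range is the one quoted above:

```python
# Illustrative arithmetic only: 5% of Meta's projected $115-135 billion
# annual capex, per Moorhead's hypothetical scenario.
capex_low, capex_high = 115e9, 135e9  # Meta capex range from the article
share = 0.05                          # Moorhead's assumed 5% share

low, high = capex_low * share, capex_high * share
print(f"5% of Meta capex: ${low / 1e9:.2f}B to ${high / 1e9:.2f}B per year")
```

That works out to roughly $5.75–6.75 billion a year, which is why even a modest slice of Meta's spending would exceed the $4.91 billion Wall Street currently expects Arm to book for the entire fiscal year.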
Beyond Meta, seven other customers have committed to the chip, including OpenAI, Cloudflare, SAP and SK Telecom. Around 50 partners signaled support ahead of the launch.
Built in Austin, Made in Taiwan
Arm spent $71 million and roughly 18 months building out three new lab rooms at its Austin, Texas campus for chip development. The team working on it has grown to over 1,000 people.
The chip is manufactured on TSMC’s 3-nanometer process in Taiwan. It consists of two pieces of silicon that work as one unit. Up to 64 AGI CPUs — around 8,700 cores — can fit into a single air-cooled rack.
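The article gives rack-level figures but not a per-chip core count; a back-of-envelope calculation from those numbers (both of which are approximate, per the article) suggests what it implies:

```python
# Back-of-envelope from the article's rack figures. The per-CPU core
# count is inferred, not stated by Arm.
cpus_per_rack = 64       # "up to 64 AGI CPUs" per air-cooled rack
cores_per_rack = 8_700   # "around 8,700 cores" per rack (approximate)

cores_per_cpu = cores_per_rack / cpus_per_rack
print(f"~{cores_per_cpu:.0f} cores per AGI CPU")
```

That implies on the order of 135–140 cores per chip, consistent with a large many-core server CPU rather than a GPU-style accelerator.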
Awad said the chip delivers “two times the performance-per-watt” of a comparable x86 rack.
Volume production is planned for the second half of 2026. Arm says test chips are already back and working as expected, and additional chip designs are in the pipeline at 12- to 18-month release intervals.
Wall Street currently expects Arm to post revenue of $4.91 billion for the current fiscal year, with earnings of $1.75 per share, per LSEG estimates.