TLDR
- Nvidia launched the Groq 3 LPU, a chip built specifically for AI inference workloads, at GTC 2026
- The new LPX server rack packs 128 Groq 3 processors and can pair with the Vera Rubin NVL72 for up to 35x higher throughput per megawatt
- Nvidia also unveiled the standalone Vera CPU rack, putting it in direct competition with Intel and AMD in the data center
- The Vera CPU is built for agentic AI tasks — things like browsing the web or pulling data from files
- Nvidia reported data center revenue of $193.5 billion in fiscal 2026, up from $116.2 billion the year before
Nvidia took the stage at GTC 2026 in San Jose on Monday with a lineup of new chips and server systems that push the company well beyond its GPU roots.
The headline act was the Groq 3 language processing unit — or LPU. Nvidia agreed to license technology from Groq and brought on founder Jonathan Ross, president Sunny Madra, and other team members as part of a $20 billion deal struck in December.
The Groq 3 is built for inference — the part of AI that happens after training. Every time a user sends a prompt to a chatbot and gets a reply, that’s inference. It’s a fast-growing part of the AI market, and one where dedicated chips can hold an edge over general-purpose GPUs.
Nvidia VP of hyperscale and HPC Ian Buck said the Groq 3’s memory is faster than the memory in Nvidia’s GPUs, even though the GPUs carry more of it. The plan is to combine both strengths.
That’s the idea behind the new LPX server rack — a system housing 128 Groq 3 LPUs. Paired with the Vera Rubin NVL72 rack, Nvidia says customers can get 35x higher throughput per megawatt and 10x more revenue opportunity. The company says it’s aimed at trillion-parameter models and million-token context windows.
Vera CPU Takes On Intel and AMD
The other big reveal was the Vera CPU rack. The Vera chip has previously been discussed as part of the Vera Rubin superchip — a combination of one Vera CPU and two Rubin GPUs. Now Nvidia is spinning Vera out as a standalone processor.
The new rack combines 256 liquid-cooled Vera chips into a single system. Nvidia describes it as the best CPU for agentic AI — the kind of AI that autonomously browses websites, pulls data from files, or carries out multi-step tasks on a user’s behalf.
“We’ve designed a new kind of CPU, the Olympus core, engineered by NVIDIA for AI execution,” Buck said. Vera also plays a role in data mining, personalization, and context analysis that feeds into AI models.
This puts Nvidia directly against Intel and AMD in the data center CPU market — a space the two companies have dominated for years.
Last month, Nvidia announced a deal with Meta to deploy its prior-generation Grace CPUs at scale — the largest-ever such deployment. The Vera launch builds on that momentum.
More Hardware From the GTC Floor
Nvidia also showed off the Bluefield-4 STX storage rack and the Spectrum-6 SPX networking rack, rounding out a full suite of data center hardware.
Hyperscalers including Amazon, Google, Meta, and Microsoft are set to collectively spend $650 billion on AI infrastructure this year.
Nvidia posted data center revenue of $193.5 billion in fiscal 2026, up from $116.2 billion in fiscal 2025.
Analysts covering NVDA hold a consensus Strong Buy rating based on 38 Buy and one Hold recommendation over the past three months, with an average price target of $273.61.