STOCK TITAN

NVIDIA Launches Vera CPU, Purpose-Built for Agentic AI

Rhea-AI Impact
(Neutral)
Rhea-AI Sentiment
(Positive)
Tags
AI

NVIDIA (NVDA) launched the Vera CPU on March 16, 2026, a processor purpose-built for agentic AI and reinforcement learning that the company says is 50% faster and delivers 2x energy efficiency versus traditional rack-scale CPUs. Vera features 88 Olympus cores, LPDDR5X memory (up to 1.2 TB/s bandwidth), and NVLink-C2C with 1.8 TB/s coherent CPU–GPU bandwidth. NVIDIA also introduced a liquid-cooled Vera rack (256 CPUs) supporting >22,500 concurrent CPU environments and said Vera is in full production with partner availability in the second half of 2026.


Positive

  • Performance claim: 50% faster than traditional rack-scale CPUs
  • Energy efficiency: 2x the efficiency of traditional CPUs
  • Core architecture: 88 Olympus cores with Spatial Multithreading
  • Memory bandwidth: LPDDR5X delivering up to 1.2 TB/s
  • CPU–GPU coherency: NVLink-C2C with 1.8 TB/s bandwidth
  • Scale: liquid-cooled Vera rack with 256 CPUs supporting >22,500 concurrent environments

Negative

  • None.

News Market Reaction – NVDA


On the day this news was published, NVDA gained 2.19%, reflecting a moderate positive market reaction.

Data tracked by StockTitan Argus on the day of publication.

Key Figures

  • Efficiency vs CPUs: 2x (Vera vs traditional rack-scale CPUs)
  • Performance vs CPUs: 50% faster (Vera vs traditional rack-scale CPUs)
  • Rack CPUs: 256 (liquid-cooled Vera CPUs per rack)
  • Concurrent environments: 22,500+ (concurrent CPU environments per Vera rack)
  • CPU–GPU bandwidth: 1.8 TB/s (coherent bandwidth via NVLink-C2C)
  • Olympus cores: 88 (custom NVIDIA-designed CPU cores)
  • Memory bandwidth: 1.2 TB/s (LPDDR5X memory subsystem)
  • Latency improvement: 5.5x lower latency (Redpanda Kafka-compatible workload tests vs other systems)

Market Reality Check

Last close: $183.19
Volume: 56,193,337, at 0.3x the 20-day average of 186,899,209, indicating lighter trading before the launch news (low volume).
Technical: the price of $180.25 is trading above the 200-day MA of $177.64 and sits well above the 52-week low of $86.62 but below the 52-week high of $212.19.
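The 0.3x figure above is simple arithmetic on the two volume numbers; a minimal sketch (the figures come from this page, the helper function is illustrative):

```python
def volume_ratio(day_volume: int, avg_volume: int) -> float:
    """Return the day's volume as a multiple of its 20-day average."""
    return day_volume / avg_volume

# Figures quoted above: day volume vs 20-day average volume.
ratio = volume_ratio(56_193_337, 186_899_209)
print(f"{ratio:.1f}x")  # 0.3x, the "low volume" reading
```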

Peers on Argus

NVDA was up 2.19% while key peers like AVGO (-1.18%), TSM (-0.86%), AMD (-0.81%), MU (-4.49%) and NXPI (-1.91%) traded lower, pointing to a stock-specific move rather than a sector-wide rally.

Previous AI Reports

5 past events · Latest: Mar 11 (Positive)
  • Mar 11: AI cloud partnership (Positive, +0.7%). NVIDIA invests $2B in Nebius to scale full-stack AI cloud infrastructure.
  • Mar 03: AI conference preview (Positive, -1.3%). GTC 2026 announcement with 30,000+ attendees and broad AI stack focus.
  • Feb 17: Meta AI partnership (Positive, +1.6%). Multiyear deal with Meta to codesign AI infrastructure using Grace and Vera.
  • Feb 03: Industrial AI platform (Positive, -2.8%). Long-term partnership with Dassault Systèmes to build industrial AI platform.
  • Jan 26: AI factory expansion (Positive, -0.6%). Expanded CoreWeave collaboration and $2B investment to build AI factories.
Pattern Detected

Recent AI-related announcements have produced mixed or slightly negative average moves, with more divergences than alignments between positive AI news and short-term price reaction.

Recent Company History

Over recent months, NVIDIA has focused heavily on AI infrastructure and partnerships. AI-tagged news includes large investments in partners like Nebius and CoreWeave, multiyear collaborations with Meta and Dassault Systèmes, and the GTC 2026 conference announcement. These events highlight a strategy centered on scaling AI factories, deploying Grace and Vera CPUs, and expanding GPU-based systems across hyperscalers and industrial platforms. Price reactions to these AI updates have been inconsistent, with several positive strategic announcements followed by flat or negative one-day moves around Jan–Mar 2026.

Historical Comparison


This Vera CPU launch fits a series of AI infrastructure announcements that have averaged a modest -0.5% one‑day move, with several positive AI deals historically met by muted or negative reactions.
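The -0.5% average can be reproduced from the five one-day moves listed under Previous AI Reports; a minimal sketch using only figures from this page:

```python
# One-day moves (%) for the five prior AI-tagged events on this page.
moves = [0.7, -1.3, 1.6, -2.8, -0.6]

avg_move = sum(moves) / len(moves)
print(f"{avg_move:.1f}%")  # -0.5%
```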

AI‑tagged history shows a progression from strategic AI factory partnerships and hyperscaler collaborations featuring Grace and Vera to this formal launch of Vera as a purpose‑built CPU for agentic AI workloads.

Market Pulse Summary

Analysis

This announcement introduces NVIDIA Vera as a CPU tailored for agentic AI, with features like 88 custom cores, up to 1.2 TB/s memory bandwidth and 1.8 TB/s CPU‑GPU interconnect. It builds on earlier AI partnerships involving Vera and positions the chip at the center of AI factory architectures. Investors may track hyperscaler and cloud deployments, real‑world latency gains such as the reported 5.5x improvement, and future disclosures on how Vera contributes to data center revenue growth.

Key Terms

agentic AI
"the world’s first processor purpose-built for the age of agentic AI and reinforcement"
Agentic AI refers to computer systems that can make their own decisions and take actions without needing someone to tell them what to do each time. It's like giving a robot a degree of independence to solve problems or achieve goals on its own, which matters because it could change how we work and interact with technology in everyday life.
reinforcement learning
"purpose-built for the age of agentic AI and reinforcement learning — delivering results"
A type of artificial intelligence that learns by trial and error, receiving feedback from its actions to favor choices that lead to better outcomes. Think of it like a salesperson learning which pitches close deals by trying different approaches and keeping the ones that work. For investors, reinforcement learning matters because it can power smarter trading systems, optimize business operations, or improve products—potentially boosting efficiency and profits while also introducing model and execution risks.
NVIDIA MGX
"The new Vera rack is built using the NVIDIA MGX™ modular reference architecture"
NVIDIA MGX is a modular hardware design and deployment blueprint for building GPU-accelerated servers and data center systems, letting manufacturers mix and match graphics processing units and networking components like building blocks. It matters to investors because MGX can speed up tasks such as artificial intelligence training, cloud computing and data analytics, driving demand for GPUs and related server gear; think of it as a standardized chassis that makes it faster and cheaper for data centers to add high-performance computing power.
LPDDR5X
"memory subsystem, now built on LPDDR5X memory and delivering up to 1.2 TB/s"
LPDDR5X is a modern, low-power type of volatile memory, originally developed for smartphones, tablets and other energy-sensitive devices and increasingly used in power-efficient server CPUs, that stores working data temporarily while applications run. Think of it as faster, more efficient short-term memory for a device: its speed and lower power draw can improve performance and energy efficiency, so adoption trends affect makers of chips and devices and can signal competitiveness in hardware design.
Apache Kafka
"Redpanda recently tested NVIDIA Vera running Apache Kafka-compatible workloads and saw"
Apache Kafka is an open-source software tool companies use to move and store streams of information in real time, like a high-speed conveyor belt that carries messages, transactions, and sensor readings from one system to another. Investors care because it enables firms to react instantly to market data, customer behavior, or operational problems, improving decision speed, product reliability, and the ability to create new revenue streams or cut costs.
DPU
"SuperNIC cards and NVIDIA BlueField®-4 DPUs for accelerated networking, storage and security"
A data processing unit (DPU) is a programmable processor, typically built into a network card, that offloads infrastructure tasks such as networking, storage and security from a server's main CPU. Think of it as a dedicated traffic manager for the data center: by handling data movement and protection in hardware, it frees CPU cores for application work. It matters to investors because DPUs are a growing share of data center spending and a key piece of NVIDIA's BlueField product line.

AI-generated analysis. Not financial advice.

NVIDIA Vera CPU Delivers the Highest Performance and Energy Efficiency for Data Processing, AI Training and Agentic Inference at Scale

News Summary:

  • The NVIDIA Vera CPU delivers results with twice the energy efficiency of, and 50% faster performance than, traditional rack-scale CPUs.
  • Customers collaborating with NVIDIA to deploy Vera CPU include Alibaba, ByteDance, Meta and Oracle Cloud Infrastructure, along with CoreWeave, Lambda, Nebius and Nscale.
  • Manufacturing partners already adopting the Vera CPU include Dell Technologies, HPE, Lenovo and Supermicro, along with ASUS, Compal, Foxconn, GIGABYTE, Pegatron, Quanta Cloud Technology (QCT), Wistron and Wiwynn.

SAN JOSE, Calif., March 16, 2026 (GLOBE NEWSWIRE) -- GTC -- NVIDIA today launched the NVIDIA Vera CPU, the world’s first processor purpose-built for the age of agentic AI and reinforcement learning — delivering results with twice the energy efficiency of, and 50% faster performance than, traditional rack-scale CPUs.

As reasoning and agentic AI advance, scale, performance and cost are increasingly driven by the infrastructure supporting the models that plan tasks, run tools, interact with data, execute code and validate results.

The NVIDIA Vera CPU builds on the success of the NVIDIA Grace™ CPU, enabling organizations of all sizes and across industries to build AI factories that unlock agentic AI at scale. With the highest single-thread performance and bandwidth per core, Vera is a new class of CPU that delivers higher AI throughput, responsiveness and efficiency for large-scale AI services such as coding assistants, as well as consumer and enterprise agents.

Leading hyperscalers collaborating with NVIDIA to deploy Vera include Alibaba, CoreWeave, Meta and Oracle Cloud Infrastructure, as well as global system makers Dell Technologies, HPE, Lenovo, Supermicro and others. This broad adoption establishes Vera as the new CPU standard for the AI workloads that matter most for developers, startups, public-private institutions and enterprises — helping democratize access to AI and accelerating innovation.

“Vera is arriving at a turning point for AI. As intelligence becomes agentic — capable of reasoning and acting — the importance of the systems orchestrating that work is elevated,” said Jensen Huang, founder and CEO of NVIDIA. “The CPU is no longer simply supporting the model; it’s driving it. With breakthrough performance and energy efficiency, Vera unlocks AI systems that think faster and scale further.”

Configurable for Every Data Center
NVIDIA announced a new Vera CPU rack integrating 256 liquid-cooled Vera CPUs to sustain more than 22,500 concurrent CPU environments, each running independently at full performance. AI factories can quickly deploy and scale to tens of thousands of simultaneous instances and agentic tools in a single rack.
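The per-rack environment count lines up with the core count: 256 CPUs with 88 Olympus cores each gives 22,528 independent hardware environments, consistent with the "more than 22,500" figure. A back-of-the-envelope sketch using only numbers from the release:

```python
cpus_per_rack = 256   # liquid-cooled Vera CPUs per rack
cores_per_cpu = 88    # Olympus cores per Vera CPU

environments = cpus_per_rack * cores_per_cpu
print(environments)   # 22528, i.e. "more than 22,500 concurrent CPU environments"
```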

The new Vera rack is built using the NVIDIA MGX™ modular reference architecture, supported by 80 ecosystem partners worldwide.

As part of the NVIDIA Vera Rubin NVL72 platform, Vera CPUs are paired with NVIDIA GPUs through NVIDIA NVLink™-C2C interconnect technology, with 1.8 TB/s of coherent bandwidth — 7x the bandwidth of PCIe Gen 6 — for high-speed data sharing between CPUs and GPUs. Additionally, NVIDIA introduced new reference designs that use Vera as the host CPU for NVIDIA HGX™ Rubin NVL8 systems, coordinating data movement and system control for GPU-accelerated workloads.
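As a sanity check on the "7x the bandwidth of PCIe Gen 6" comparison, dividing the quoted NVLink-C2C figure by seven recovers roughly 257 GB/s, close to the ~256 GB/s aggregate of a PCIe 6.0 x16 link. A back-of-the-envelope sketch (only the 1.8 TB/s and 7x figures come from the release):

```python
nvlink_c2c_tb_s = 1.8   # coherent CPU-GPU bandwidth quoted in the release
pcie_multiple = 7       # "7x the bandwidth of PCIe Gen 6"

implied_pcie_gb_s = nvlink_c2c_tb_s / pcie_multiple * 1000
print(f"{implied_pcie_gb_s:.0f} GB/s")  # 257 GB/s
```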

Vera systems partners are providing both dual- and single-socket CPU server configurations, optimized for workloads such as reinforcement learning, agentic inference, data processing, orchestration, storage management, cloud applications and high-performance computing.

Across all configurations, Vera systems integrate NVIDIA ConnectX® SuperNIC cards and NVIDIA BlueField®-4 DPUs for accelerated networking, storage and security, which are critical for agentic AI. This enables customers to optimize for their specific workloads while maintaining a single software stack across the NVIDIA platform.

Designed for Agentic Scaling
By combining high-performance, energy-efficient CPU cores, a high-bandwidth memory subsystem and the second-generation NVIDIA Scalable Coherency Fabric, Vera enables faster agentic responses under the extreme utilization conditions common for agentic AI and reinforcement learning.

Vera features 88 custom NVIDIA-designed Olympus cores, delivering high performance for compilers, runtime engines, analytics pipelines, agentic tooling and orchestration services. Each core can run two tasks, using NVIDIA Spatial Multithreading, to deliver consistent, predictable performance — ideal for multi-tenant AI factories running many jobs at once.

To further enhance energy efficiency, Vera introduces the second generation of NVIDIA’s low-power memory subsystem, now built on LPDDR5X memory and delivering up to 1.2 TB/s of bandwidth — twice the bandwidth and at half the power compared with general-purpose CPUs.

Widespread Ecosystem Support
Cursor, an innovator in AI-native software development, is adopting NVIDIA Vera to boost performance for its AI coding agents.

“We’re excited to use NVIDIA Vera CPUs to improve overall throughput and efficiency so we can deliver faster, more responsive coding agent experiences for our customers,” said Michael Truell, cofounder and CEO of Cursor. 

Redpanda, a leading streaming data and AI platform, is using Vera to dramatically boost performance.

“Redpanda recently tested NVIDIA Vera running Apache Kafka-compatible workloads and saw dramatically better performance than other systems we’ve benchmarked, delivering up to 5.5x lower latency,” said Alex Gallego, founder and CEO of Redpanda. “Vera represents a new direction in CPU architecture, with more memory and less overhead per core, enabling our customers to scale real-time streaming workloads further than ever and unlock new AI and agentic applications.”

National laboratories planning to deploy Vera CPUs include Leibniz Supercomputing Centre, Los Alamos National Laboratory, Lawrence Berkeley National Laboratory's National Energy Research Scientific Computing Center and the Texas Advanced Computing Center (TACC).

“At TACC, we recently tested NVIDIA’s Vera CPU platform as we prepare for deployment in our upcoming Horizon system — and running six of our scientific applications, we saw impressive early results,” said John Cazes, director of high-performance computing at TACC. “Vera’s per-core performance and memory bandwidth represent a giant step forward for scientific computing, and we look forward to bringing Vera-based nodes to our CPU users on Horizon later this year.”

Leading cloud service providers planning to deploy Vera CPUs include Alibaba, ByteDance, Cloudflare, CoreWeave, Crusoe, Lambda, Nebius, Nscale, Oracle Cloud Infrastructure, Together.AI and Vultr.

Leading infrastructure providers adopting Vera CPUs include Aivres, ASRock Rack, ASUS, Compal, Cisco, Dell, Foxconn, GIGABYTE, HPE, Hyve, Inventec, Lenovo, MiTAC, MSI, Pegatron, Quanta Cloud Technology (QCT), Supermicro, Wistron and Wiwynn.

Availability
NVIDIA Vera is in full production and will be available from partners in the second half of this year.

Watch the GTC keynote from Huang and explore sessions.

About NVIDIA
NVIDIA (NASDAQ: NVDA) is the world leader in AI and accelerated computing.

For further information, contact:
Alex Shapiro
Corporate Communications
NVIDIA Corporation
press@nvidia.com

Certain statements in this press release including, but not limited to, statements as to: the CPU driving the model; the benefits, impact, performance, and availability of NVIDIA’s products, services, and technologies; expectations with respect to NVIDIA’s third party arrangements, including with its collaborators and partners; expectations with respect to technology developments; expectations with respect to AI and related industries; and other statements that are not historical facts are forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended, which are subject to the “safe harbor” created by those sections based on management’s beliefs and assumptions and on information currently available to management and are subject to risks and uncertainties that could cause results to be materially different than expectations. Important factors that could cause actual results to differ materially include: global economic and political conditions; NVIDIA’s reliance on third parties to manufacture, assemble, package and test NVIDIA’s products; the impact of technological development and competition; development of new products and technologies or enhancements to NVIDIA’s existing product and technologies; market acceptance of NVIDIA’s products or NVIDIA’s partners’ products; design, manufacturing or software defects; changes in consumer preferences or demands; changes in industry standards and interfaces; unexpected loss of performance of NVIDIA’s products or technologies when integrated into systems; NVIDIA’s ability to realize the potential benefits of business investments or acquisitions; and changes in applicable laws and regulations, as well as other factors detailed from time to time in the most recent reports NVIDIA files with the Securities and Exchange Commission, or SEC, including, but not limited to, its annual report on Form 10-K and quarterly reports 
on Form 10-Q. Copies of reports filed with the SEC are posted on the company’s website and are available from NVIDIA without charge. These forward-looking statements are not guarantees of future performance and speak only as of the date hereof, and, except as required by law, NVIDIA disclaims any obligation to update these forward-looking statements to reflect future events or circumstances.

© 2026 NVIDIA Corporation. All rights reserved. NVIDIA, the NVIDIA logo, BlueField, ConnectX, NVIDIA Grace, NVIDIA HGX, NVIDIA MGX, and NVLink are trademarks and/or registered trademarks of NVIDIA Corporation in the U.S. and other countries. Other company and product names may be trademarks of the respective companies with which they are associated. Features, pricing, availability and specifications are subject to change without notice.

A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/b7249351-9d2a-4beb-8bef-7515770c18a9


FAQ

What performance gains does the NVIDIA Vera CPU (NVDA) claim?

The Vera CPU (NVDA) claims 50% faster performance and 2x energy efficiency versus traditional rack-scale CPUs. According to the company, these gains stem from 88 Olympus cores, LPDDR5X memory and NVLink-C2C coherency designed for agentic AI workloads.

When will NVIDIA Vera (NVDA) be available and from which partners?

Vera (NVDA) is in full production with partner availability planned in the second half of 2026. According to the company, hyperscalers, cloud providers and system makers including Alibaba, Meta, Oracle Cloud, Dell, HPE and Lenovo will offer Vera-based systems.

How does NVIDIA Vera (NVDA) support agentic AI and reinforcement learning?

Vera (NVDA) targets agentic AI with high single-thread performance, LPDDR5X memory and Spatial Multithreading for multi‑tenant workloads. According to the company, these elements enable faster agentic responses, higher throughput for coding agents, and lower latency for reinforcement learning tasks.

How does NVIDIA Vera (NVDA) integrate with GPUs and system networking?

Vera (NVDA) connects to NVIDIA GPUs via NVLink-C2C providing 1.8 TB/s coherent bandwidth and pairs with HGX/Rubin platforms. According to the company, systems also include ConnectX SuperNICs and BlueField-4 DPUs to accelerate networking, storage and security.

Which customers reported early Vera benchmarking or deployment interest for NVDA?

Customers and partners reported early gains: the company cites Redpanda seeing up to 5.5x lower latency in streaming tests and Cursor noting improved throughput for coding agents. According to NVIDIA, national labs and cloud providers ran early tests with positive results.
NVIDIA Corporation

NASDAQ:NVDA

View NVDA Stock Overview

NVDA Rankings

NVDA Latest News

NVDA Latest SEC Filings

NVDA Stock Data

4.38T
23.27B
Semiconductors
Semiconductors & Related Devices
Santa Clara, United States