STOCK TITAN

Akamai to Deploy Thousands of NVIDIA Blackwell GPUs to Create One of the World’s Most Widely Distributed AI Platforms

Rhea-AI Impact: Moderate
Rhea-AI Sentiment: Neutral
Tags: AI

On March 3, 2026, Akamai (NASDAQ: AKAM) announced the deployment of thousands of NVIDIA Blackwell GPUs to build a globally distributed AI inference platform.

The platform routes inference workloads to optimized, localized GPU clusters across Akamai's global fabric of over 4,400 locations, targeting lower latency and reduced data egress versus centralized clouds.

According to the company, the stack can reduce latency by up to 2.5x and cut AI inference costs by as much as 86% versus traditional hyperscaler infrastructure.


Positive

  • Deployment of thousands of NVIDIA Blackwell GPUs
  • Global footprint of over 4,400 locations
  • Latency reductions of up to 2.5x
  • Estimated AI inference cost savings up to 86%
  • Enables localized fine-tuning and post-training optimization

Negative

  • None.

Market Reaction – AKAM

Since News: +5.59%
Last Price: $103.10
Day Range: $95.37 – $103.72
Valuation Impact: +$791M
Market Cap: $14.95B
Relative Volume: 0.6x
Momentum Alerts: 57 (data 15-min delayed)

Following this news, AKAM has gained 5.59%, reflecting a notable positive market reaction. Our momentum scanner has triggered 57 alerts so far, indicating high trading interest and price volatility. The stock is currently trading at $103.10. This price movement has added approximately $791M to the company's valuation.
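The reported +$791M valuation impact can be cross-checked with simple arithmetic. A minimal sketch, assuming the $14.95B market cap shown is the post-move value (the pre-move cap below is back-computed, not a quoted figure):

```python
# Back-of-the-envelope check of the reported +$791M valuation impact.
# Assumption: the $14.95B market cap is the value after the +5.59% move.
post_move_cap = 14.95e9   # market cap after the move, in dollars
move = 0.0559             # price change since the news

pre_move_cap = post_move_cap / (1 + move)
valuation_impact = post_move_cap - pre_move_cap

print(f"Pre-move cap: ${pre_move_cap / 1e9:.2f}B")
print(f"Impact:       ${valuation_impact / 1e6:.0f}M")  # ≈ $791M
```

The result lands within rounding of the widget's +$791M figure, which suggests the site computes the impact as the prior market cap times the percentage move.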

Data tracked by StockTitan Argus (15 min delayed).

Key Figures

  • Latency barrier: 56% of organizations cite latency as the main barrier to AI at scale
  • Latency reduction: up to 2.5x with Akamai Inference Cloud vs. traditional hyperscalers
  • AI inference savings: up to 86% vs. traditional hyperscaler infrastructure
  • Global locations: 4,400+ Akamai distributed cloud and edge network locations worldwide
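Read as ratios, the headline claims translate to concrete numbers as follows. The baselines here are hypothetical, chosen only for illustration; just the 2.5x and 86% figures come from the announcement:

```python
# Illustrative translation of the headline ratios. Baseline numbers are
# hypothetical; only the 2.5x and 86% figures come from the release.
baseline_latency_ms = 250.0   # hypothetical hyperscaler response time
baseline_cost = 100_000.0     # hypothetical monthly inference bill

improved_latency_ms = baseline_latency_ms / 2.5   # "up to 2.5x" lower latency
improved_cost = baseline_cost * (1 - 0.86)        # "up to 86%" cost savings

print(f"{improved_latency_ms:.0f} ms")  # 100 ms
print(f"${improved_cost:,.0f}")         # $14,000
```

In other words, a "2.5x latency reduction" is a 60% cut in response time, and the cost claim would shrink a hypothetical $100K inference bill to $14K, both at the upper bound of the company's "up to" phrasing.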

Market Reality Check

Last Close: $97.64
Volume: 4,481,353 vs. 20-day average 5,142,972 (relative volume 0.87x), not indicating outsized trading interest.
Technical: at $97.64 the stock trades above its 200-day MA of $82.62, reflecting a pre-existing upward trend into this AI announcement.
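The relative-volume figure follows directly from the two volume numbers quoted. A minimal sketch of the same arithmetic:

```python
# Relative volume = session volume / 20-day average volume.
volume = 4_481_353         # reported session volume
avg_volume_20d = 5_142_972 # reported 20-day average volume

rel_volume = volume / avg_volume_20d
print(f"Relative volume: {rel_volume:.2f}x")  # 0.87x, below average
```

Values near 1.0x indicate normal turnover; the 0.87x here is why the widget flags the session as "normal vol" rather than outsized interest.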

Peers on Argus


AKAM fell 0.76% while key software/infra peers like SAIL (+5.29%), OKTA (+3.56%), FFIV (+2.36%), and RBRK (+5.27%) traded higher. Momentum scanner only flagged MDB moving down, pointing to a company-specific reaction rather than a broad sector move.

Previous AI Reports

Same-type pattern: 5 past AI events (latest: Nov 05, Positive)

  • Nov 05, AI platform traction (Positive, +1.4%): Early traction for Akamai Inference Cloud after NVIDIA GTC debut.
  • Oct 28, AI platform launch (Positive, +0.3%): Launch of Akamai Inference Cloud with NVIDIA Blackwell infrastructure.
  • Apr 29, AI security product (Positive, +1.0%): Firewall for AI unveiled to protect AI apps from advanced threats.
  • Mar 27, Inference performance boost (Positive, +0.1%): Akamai Cloud Inference launched touting better throughput, latency, and costs.
  • Mar 18, Edge AI partnership (Positive, -0.9%): Strategic partnership with VAST Data to enhance edge AI inference.
Pattern Detected

AI-related announcements have historically produced modest share price moves for AKAM, with an average move of 0.39% and mostly mild positive reactions.

Recent Company History

Over the past year, Akamai has steadily expanded its AI and inference strategy, launching offerings such as Akamai Cloud Inference and Akamai Inference Cloud with NVIDIA Blackwell hardware and partnerships like VAST Data. These initiatives emphasized edge inference, latency reduction, and cost-efficient deployment. Today’s Blackwell GPU expansion and distributed AI platform fit into that same trajectory of scaling inference-optimized infrastructure across Akamai’s global edge footprint.

Historical Comparison


Past AI announcements for AKAM (5 events) moved the stock an average of 0.39%, typically modest gains, suggesting the market has treated such AI build-outs as incremental rather than transformational.

AI news has progressed from launching and securing inference services to broader edge partnerships and now scaling NVIDIA Blackwell-based capacity across Akamai’s distributed cloud.

Market Pulse Summary

Analysis

The stock is up +5.6% following this news. A strong positive reaction aligns with Akamai’s ongoing push into AI inference, building on prior launches like Akamai Inference Cloud and edge partnerships. Historically, AI-tagged news moved the stock an average of 0.39%, so a much larger gain would mark a break from past behavior. Investors would need to weigh capacity expansion benefits against execution risk and the possibility that enthusiasm for new GPU deployments could moderate once initial optimism fades.

Key Terms

latency
"56 percent of organizations cite latency as the primary barrier"
Latency is the delay between a request and its response. For AI workloads, it is the time between sending data to a model and receiving its output; high latency makes real-time applications such as fraud checks, robotics, or interactive assistants impractical, which is why placing inference compute close to users and data matters.
inference
"intelligently routes AI inference workloads to optimized compute resources"
Inference is the stage in which a trained AI model is applied to new data to produce predictions or responses, as opposed to training, in which the model learns from data. Because inference runs continuously at production scale, its latency and cost largely determine the economics of deployed AI services.
large language models (LLMs)
"Localized Fine-Tuning: Optimization of Large Language Models (LLMs) on-site"
Large language models (LLMs) are advanced computer programs trained on massive amounts of text to generate, summarize, translate and understand human-like language. For investors they matter because LLMs can act like a very fast, experienced research assistant—automating customer service, speeding product development and cutting costs—while also creating new revenue opportunities and regulatory, accuracy and ethical risks that can affect a company’s profits and reputation.
agentic AI
"foundational infrastructure for physical and agentic AI where decisions must happen"
Agentic AI refers to computer systems that can make their own decisions and take actions without needing someone to tell them what to do each time. It's like giving a robot a degree of independence to solve problems or achieve goals on its own, which matters because it could change how we work and interact with technology in everyday life.
edge network
"global edge network, which has over 4,400 locations worldwide"
A network of small data centers and computing resources placed close to users and devices rather than in one central facility, so processing and storage happen near the “edge” of the internet. Investors care because edge networks reduce delays and cut bandwidth costs for cloud services, streaming, and connected devices, which can boost revenue, lower operating expenses, and enable new products—like choosing a neighborhood grocery instead of a distant supermarket for faster access.

AI-generated analysis. Not financial advice.

CAMBRIDGE, Mass., March 03, 2026 (GLOBE NEWSWIRE) -- Akamai (NASDAQ: AKAM) announced the acquisition of thousands of NVIDIA® Blackwell GPUs to bolster its global distributed cloud infrastructure. The deployment creates a unified platform for AI R&D, fine-tuning, and post-training optimization that intelligently routes AI inference workloads to optimized compute resources across Akamai's massive global network. The architecture is designed to support rapid inference by reducing the latency and data egress issues associated with centralized data centers.

While the first wave of AI focused on model training in centralized hubs, the industry has reached a tipping point where inference matters as much as training. The MIT Technology Review recently reported that 56 percent of organizations cite latency as the primary barrier preventing AI deployment at scale. By treating the globe as a single, low-latency backplane, Akamai is bridging this gap and providing the foundational infrastructure for physical and agentic AI where decisions must happen at the speed of the real world.

“While hyperscalers continue to push the boundaries of AI training, Akamai is focused on meeting the unique demands of the inference era,” said Adam Karon, Chief Operating Officer and General Manager, Cloud Technology Group, Akamai. “Centralized AI factories remain essential for building models, but bringing those models to life at scale requires a decentralized nervous system. By distributing inference-optimized compute across our global fabric, Akamai isn’t just adding capacity. We’re providing the scale, at minimal latency, that is required to move AI from the laboratory to the street corner and the hospital bed – where the work happens, where the data lives, and where the ROI is realized.”

Akamai’s adoption of Blackwell GPUs advances Akamai’s vision for a globally distributed AI compute grid built for the inference era. By extending AI processing beyond centralized AI factories to high-density distributed infrastructure, Akamai allows AI to interact with physical systems — from autonomous delivery and smart grids to surgical robotics and critical fraud prevention — without the geographic or cost limitations of traditional cloud architecture.

The integration of NVIDIA Blackwell AI infrastructure enables:

  • Predictable, High-Performance Inference: Processing AI workloads on dedicated GPU clusters to generate rapid responses.
  • Localized Fine-Tuning: Optimization of Large Language Models (LLMs) on-site to support data privacy and regional compliance needs.
  • Post-Model Training: Fine-tuning and adapting foundation models on proprietary data to improve accuracy for specific tasks.

This announcement follows Akamai’s recent initiatives to expand its AI inference and generalized compute capabilities. In October 2025, the company announced Akamai Inference Cloud, redefining where and how AI is used by bringing AI inference closer to users and devices.

By providing tools for platform engineers and developers to build and run AI applications and data-intensive workloads closer to end users, Akamai delivers highly efficient throughput while reducing latency by up to 2.5x and saving businesses as much as 86% on AI inference with NVIDIA AI infrastructure, compared to traditional hyperscaler infrastructure.

The platform combines NVIDIA RTX PRO™ Servers, featuring NVIDIA RTX PRO™ 6000 Blackwell Server Edition GPUs, and NVIDIA BlueField®-3 DPUs with Akamai's distributed cloud computing infrastructure and global edge network, which has over 4,400 locations worldwide.

Akamai has seen strong demand for its initial deployment of NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs and will continue to add GPU capacity as part of its cloud infrastructure strategy.

About Akamai

Akamai is the cybersecurity and cloud computing company that powers and protects business online. Our market-leading security solutions, superior threat intelligence, and global operations team provide defense in depth to safeguard enterprise data and applications everywhere. Akamai’s full-stack cloud computing solutions deliver performance and affordability on the world’s most distributed platform. Global enterprises trust Akamai to provide the industry-leading reliability, scale, and expertise they need to grow their business with confidence. Learn more at akamai.com and akamai.com/blog, or follow Akamai Technologies on X and LinkedIn.

Contacts
Akamai Media Relations
akamaipr@akamai.com


FAQ

What did Akamai (AKAM) announce on March 3, 2026 about NVIDIA Blackwell GPUs?

Akamai announced deployment of thousands of NVIDIA Blackwell GPUs to power a global inference platform. According to the company, the GPUs form distributed, inference-optimized clusters across its edge network to reduce latency and data egress versus centralized data centers.

How will Akamai's AKAM deployment affect AI inference latency and costs?

Akamai says the deployment can reduce latency up to 2.5x and lower inference costs up to 86%. According to the company, these gains come from localized GPU clusters and optimized routing across its distributed cloud and edge network.

How extensive is Akamai's global infrastructure supporting the AKAM AI platform?

The AI platform will run across Akamai's global fabric of over 4,400 locations. According to the company, that footprint enables localized inference and fine-tuning closer to users and data to meet latency and compliance needs.

What AI use cases does Akamai say the AKAM GPU deployment targets?

Akamai targets inference-driven use cases like autonomous delivery, smart grids, surgical robotics, and fraud prevention. According to the company, distributed inference supports real-world, low-latency decisions where data and ROI reside.

Will Akamai (AKAM) support model fine-tuning and post-training on its distributed platform?

Yes. Akamai says the platform enables localized fine-tuning and post-training adaptation of foundation models on proprietary data. According to the company, this supports data privacy, regional compliance, and improved task-specific accuracy.
Akamai Technologies Inc

NASDAQ:AKAM

AKAM Stock Data

Market Cap: 14.26B
Shares Outstanding: 141.58M
Sector: Software - Infrastructure
Industry: Services - Business Services, NEC
Country: United States
City: Cambridge