Akamai Inference Cloud Transforms AI from Core to Edge with NVIDIA
Akamai (NASDAQ:AKAM) launched Akamai Inference Cloud on October 28, 2025, a distributed edge AI inference platform built with NVIDIA Blackwell infrastructure to deliver low-latency, real-time AI inference from core to edge.
The platform pairs NVIDIA RTX PRO Servers, RTX PRO 6000 Blackwell GPUs, BlueField DPUs, and NVIDIA AI Enterprise software with Akamai's global edge network of over 4,200 locations. Initial availability targets 20 locations with expanded rollout planned. Use cases include agentic AI, streaming inference for real-time decisioning, and physical AI for autonomous systems, with an orchestration layer routing tasks between edge and centralized AI factories.
- Built on NVIDIA Blackwell GPUs and BlueField DPUs
- Leverages Akamai's global edge network of >4,200 locations
- Edge-native orchestration to route inference between edge and core locations
- Targets low-latency, real-time agentic and physical AI workloads
- Initial availability limited to 20 global locations
- Platform depends on NVIDIA stack for core acceleration and security
Insights
Launch of Akamai Inference Cloud with NVIDIA expands low-latency AI inference to the global edge, with commercial rollout beginning immediately.
Akamai pairs its global edge network with NVIDIA Blackwell infrastructure to push inference from core data centers to edge locations. The platform targets low-latency, agentic, and physical AI use cases by placing inference closer to users and devices and tying edge nodes to centralized AI factories.
The proposition depends on deploying GPUs, DPUs, and software across distributed sites and integrating NVIDIA's stack at scale. Execution risks include hardware deployment speed, orchestration complexity, and real-world latency and security performance versus central clouds. Key concrete items to watch are the pace of the expanded rollout beyond the initial 20 locations.
Provides scalable, secure, and low-latency AI inference globally to power the wave of agentic and physical AI
Akamai Inference Cloud enables intelligent, agentic AI inference at the edge, close to users and devices. Unlike traditional systems, this platform is purpose-built to provide low-latency, real-time edge AI processing on a global scale. The launch leverages Akamai's expertise in globally distributed architectures and NVIDIA Blackwell AI infrastructure to radically rethink and extend the accelerated computing needed to unlock AI's true potential.
The next generation of AI applications, from personalized digital experiences and smart agents to real-time decision systems, demands that AI inference be pushed closer to the user, providing instant engagement where users interact and making smart decisions about where to route requests. Agentic workloads increasingly require low-latency inference, local context, and the ability to scale globally in an instant. Built to power this transformation, Akamai Inference Cloud is a distributed, generative edge platform that places the NVIDIA AI stack closer to where data is created and decisions need to be made.
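The latency pressure on multi-step agentic workloads can be illustrated with back-of-the-envelope arithmetic; the round-trip and inference times below are illustrative assumptions, not Akamai or NVIDIA benchmarks:

```python
def workflow_latency_ms(steps: int, rtt_ms: float, inference_ms: float) -> float:
    """Total wall-clock latency for an agent that chains `steps` sequential
    inference calls, each paying one network round trip plus model time."""
    return steps * (rtt_ms + inference_ms)

# An 8-step agentic workflow against a distant central region (~120 ms RTT)
central = workflow_latency_ms(8, 120, 50)  # 1360 ms
# The same workflow against a nearby edge location (~10 ms RTT)
edge = workflow_latency_ms(8, 10, 50)      # 480 ms
```

Because every sequential call pays the network round trip again, shaving the RTT compounds across the workflow, which is the core argument for moving inference to the edge.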
"The next wave of AI requires the same proximity to users that allowed the internet to scale to become the pervasive global platform that it is today," said Dr. Tom Leighton, Akamai CEO and co-founder. "Akamai solved this challenge before - and we're doing it again. Powered by NVIDIA AI infrastructure, Akamai Inference Cloud will meet the intensifying demand to scale AI inference capacity and performance by putting AI's decision-making in thousands of locations around the world, enabling faster, smarter, and more secure responses."
"Inference has become the most compute-intensive phase of AI — demanding real-time reasoning at planetary scale," said Jensen Huang, founder and CEO, NVIDIA. "Together, NVIDIA and Akamai are moving inference closer to users everywhere, delivering faster, more scalable generative AI and unlocking the next generation of intelligent applications."
Akamai Inference Cloud redefines where and how AI is used by bringing intelligent, agentic AI inference close to users and devices. The platform combines NVIDIA RTX PRO Servers, featuring NVIDIA RTX PRO 6000 Blackwell Server Edition GPUs, NVIDIA BlueField-3 DPUs, and NVIDIA AI Enterprise software with Akamai's distributed cloud computing infrastructure and global edge network, which has over 4,200 locations worldwide. Akamai Inference Cloud will drive Akamai's vision for highly scalable, distributed AI performance worldwide by leveraging NVIDIA's latest technologies, including the recently announced NVIDIA BlueField-4 DPU, to further accelerate and secure data access and AI inference workloads from core to edge.
Akamai has teamed up with NVIDIA to boldly bring inference where inference has never gone before, charting new AI frontiers with Akamai Inference Cloud:
- Extending enterprise AI Factories to the edge to enable smart commerce agents and personalized digital experiences - AI Factories are powerhouses that orchestrate the AI lifecycle from data ingestion to creating intelligence at scale. Akamai Inference Cloud extends AI Factories to the edge, decentralizing data and processing and routing requests to the best model using Akamai's massively distributed edge locations. This will enable smart agents to adapt instantly to user location, behavior and intent, and act autonomously to negotiate, purchase, and optimize transactions in real time.
- Enabling Streaming Inference and Agents to provide instant financial insights and perform real-time decisioning - AI agents require multiple sequential inference calls to complete complex tasks; if each call incurs a network delay, the cumulative latency erodes user engagement and makes the experience feel sluggish, or too slow to meet machine-to-machine latency requirements. Akamai Inference Cloud's edge-native architecture delivers virtually instant responses, enabling AI agents to operate with human-like responsiveness across multi-step workflows. This can be useful in detecting fraud, accelerating secure payments, and enabling high-speed decisions at the industrial edge.
- Enabling Real-Time Physical AI to operate beyond human-level responsiveness - Physical AI systems like autonomous vehicles, industrial robots, and smart city infrastructure require millisecond-precision decision-making to safely interact with the physical world. Akamai Inference Cloud is designed to enable physical AI to process sensor data, make safety decisions, and coordinate actions at the speed of the physical world—helping transform everything from factory floors and delivery drones to surgical robots and autonomous transportation networks into responsive, intelligent systems that can operate safely alongside humans.
- Accelerating Time to Value - Orchestrating complex, distributed AI workloads across multiple cloud regions requires specialized skills and teams. Akamai Inference Cloud's intelligent orchestration layer automatically routes AI tasks to optimal locations—routine inference executes instantly at the edge through NVIDIA's NIM microservices, while sophisticated reasoning leverages centralized AI factories, all managed through a unified platform that abstracts away infrastructure complexity.
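The routing idea behind the orchestration layer can be sketched as a simple policy: keep latency-sensitive routine inference at the nearest edge node and send heavy multi-step reasoning to a centralized AI factory. The function, field names, and thresholds below are hypothetical illustrations, not Akamai's actual API:

```python
def route_inference(task: dict, edge_has_capacity: bool = True) -> str:
    """Hypothetical routing policy: latency-sensitive routine inference runs
    at the nearest edge node (e.g. a NIM microservice at a nearby location),
    while sophisticated multi-step reasoning goes to a centralized AI factory.
    """
    if (
        task.get("kind") == "routine"
        and task.get("latency_budget_ms", float("inf")) < 100
        and edge_has_capacity
    ):
        return "edge"
    return "core"

print(route_inference({"kind": "routine", "latency_budget_ms": 40}))    # edge
print(route_inference({"kind": "reasoning", "latency_budget_ms": 40}))  # core
```

A real orchestrator would also weigh model availability, data locality, and cost, but the edge-versus-core split above captures the decision the press release describes.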
Akamai Inference Cloud is available today in 20 initial locations around the globe, with an expanded rollout underway.
About Akamai
Akamai is the cybersecurity and cloud computing company that powers and protects business online. Our market-leading security solutions, superior threat intelligence, and global operations team provide defense in depth to safeguard enterprise data and applications everywhere. Akamai's full-stack cloud computing solutions deliver performance and affordability on the world's most distributed platform. Global enterprises trust Akamai to provide the industry-leading reliability, scale, and expertise they need to grow their business with confidence. Learn more at akamai.com and akamai.com/blog, or follow Akamai Technologies on X and LinkedIn.
Akamai Statement Under the Private Securities Litigation Reform Act
This press release contains statements that are not statements of historical fact and constitute forward-looking statements for purposes of the safe harbor provisions under The Private Securities Litigation Reform Act of 1995, including, but not limited to, statements about Akamai Inference Cloud, its anticipated capabilities, scalability, performance, global deployment plans and the expected benefits to Akamai, current and prospective customers and end users. Each of the forward-looking statements is subject to change as a result of various important factors, many of which are beyond Akamai's control, including, but not limited to: Akamai's inability to achieve the expected performance or benefits of Akamai Inference Cloud; Akamai's capabilities failing to meet expectations, including due to defects, security breaches, delays in performance, challenges in leveraging NVIDIA's technologies, or other similar problems; effects of competition, including pricing pressure and changing business models; changes in customer or user preferences or demands; macroeconomic trends and uncertainties, including industry volatility, the effects of inflation, fluctuating interest and foreign currency exchange rates, supply chain and logistics costs, constraints, changes or disruptions; defects or disruptions in Akamai's or NVIDIA's products or IT systems, including cyber-attacks, data breaches or malware; changes to economic, political and regulatory conditions in
Contacts
Akamai Media Relations
akamaipr@akamai.com
Akamai Investor Relations
invrel@akamai.com
View original content to download multimedia: https://www.prnewswire.com/news-releases/akamai-inference-cloud-transforms-ai-from-core-to-edge-with-nvidia-302597280.html
SOURCE Akamai Technologies