
NVIDIA DGX Cloud Lepton Connects Europe’s Developers to Global NVIDIA Compute Ecosystem

NVIDIA announced the expansion of DGX Cloud Lepton, an AI platform featuring a global compute marketplace, with new cloud providers joining the network. Mistral AI, Nebius, Nscale and other providers are contributing NVIDIA Blackwell and other NVIDIA-architecture GPUs to the marketplace, and AWS and Microsoft Azure will be the first large-scale cloud providers to participate. Hugging Face is integrating DGX Cloud Lepton into its new Training Cluster as a Service offering, expanding AI researchers' access to compute for model training. NVIDIA also partnered with leading European VCs to offer up to $100,000 in marketplace credits to eligible portfolio companies. The platform simplifies access to GPU resources within specific regions, supports data governance and sovereign AI requirements, and integrates with NVIDIA's software suite. Early-access customers include Basecamp Research, EY and others using the platform for strategic AI initiatives. For cloud providers, the platform includes management software that monitors GPU health in real time and automates root-cause analysis, helping ensure reliable access to computing.
Positive
  • Integration with major cloud providers including AWS and Microsoft Azure expands compute resource availability
  • Partnership with European VCs offering up to $100,000 in credits to startups accelerates AI ecosystem growth
  • New management software features for real-time GPU health monitoring and automated analysis reduce downtime
  • Integration with Hugging Face's Training Cluster as a Service enhances accessibility for AI researchers
Negative
  • None.

Insights

NVIDIA strengthens its AI ecosystem by expanding DGX Cloud Lepton with major cloud providers and strategic partnerships across Europe.

NVIDIA's expansion of the DGX Cloud Lepton platform represents a significant strategic move to strengthen its AI infrastructure ecosystem in Europe. By adding Mistral AI, AWS, Microsoft Azure, and several other cloud providers to its marketplace, NVIDIA is effectively creating a networked AI compute fabric that spans regions while addressing data sovereignty concerns – a critical consideration for European businesses and regulators.

The integration with Hugging Face's new Training Cluster as a Service is particularly noteworthy. This partnership creates a streamlined pathway for AI researchers to access high-performance computing resources for model training, removing significant technical barriers. By connecting Hugging Face's extensive community of AI developers with NVIDIA's compute resources, the company is positioning itself at the critical intersection of model development and infrastructure, further cementing its essential role in the AI ecosystem.

The venture capital partnerships with Accel, Elaia, Partech and Sofinnova Partners – offering up to $100,000 in GPU capacity credits to portfolio companies – demonstrate a strategic investment in nurturing the European AI startup ecosystem. This approach creates a pipeline of companies building on NVIDIA's technology stack, driving future demand and establishing NVIDIA's architecture as the standard for AI development in Europe.

What makes this announcement particularly impactful is how it addresses one of the most significant bottlenecks in AI development today: access to specialized GPU compute. By aggregating resources from multiple providers into a unified marketplace with integrated software tools, NVIDIA is creating a more accessible and efficient path for developers to build and scale AI applications, potentially accelerating AI adoption across industries in Europe.

  • Mistral AI, Nebius, Nscale, Firebird, Fluidstack, Hydra Host, Scaleway and Together AI — Along With AWS and Microsoft Azure — Bring Compute Resources to DGX Cloud Lepton Marketplace to Meet AI Demand
  • Hugging Face Integrates DGX Cloud Lepton Into Training Cluster as a Service, Expanding AI Researcher Access to Scalable Compute for Model Training
  • NVIDIA and Leading European Venture Capitalists Offer Marketplace Credits to Portfolio Companies to Accelerate Startup Ecosystem

PARIS, June 11, 2025 (GLOBE NEWSWIRE) -- NVIDIA GTC Paris at VivaTech -- NVIDIA today announced the expansion of NVIDIA DGX Cloud Lepton™ — an AI platform featuring a global compute marketplace that connects developers building agentic and physical AI applications — with GPUs now available from a growing network of cloud providers.

Mistral AI, Nebius, Nscale, Firebird, Fluidstack, Hydra Host, Scaleway and Together AI are now contributing NVIDIA Blackwell and other NVIDIA architecture GPUs to the marketplace, expanding regional access to high-performance compute. AWS and Microsoft Azure will be the first large-scale cloud providers to participate in DGX Cloud Lepton. These companies join CoreWeave, Crusoe, Firmus, Foxconn, GMI Cloud, Lambda and Yotta Data Services in the marketplace.

To make accelerated computing more accessible to the global AI community, Hugging Face is introducing Training Cluster as a Service. This new offering integrates with DGX Cloud Lepton to seamlessly connect AI researchers and developers building foundation models with the NVIDIA compute ecosystem.

NVIDIA is also working with leading European venture capital firms Accel, Elaia, Partech and Sofinnova Partners to offer DGX Cloud Lepton marketplace credits to portfolio companies, enabling startups to access accelerated computing resources and scale regional development.

“DGX Cloud Lepton is connecting Europe’s developers to a global AI infrastructure,” said Jensen Huang, founder and CEO of NVIDIA. “With partners across the region, we’re building a network of AI factories that developers, researchers and enterprises can harness to scale local breakthroughs into global innovation.”

DGX Cloud Lepton simplifies the process of accessing reliable, high-performance GPU resources within specific regions by unifying cloud AI services and GPU capacity from across the NVIDIA compute ecosystem onto a single platform. This enables developers to keep their data local, supporting data governance and sovereign AI requirements.

In addition, by integrating with the NVIDIA software suite — including NVIDIA NIM™ and NeMo™ microservices and NVIDIA Cloud Functions — DGX Cloud Lepton streamlines and accelerates every stage of AI application development and deployment, at any scale. The marketplace works with a new NIM microservice container, which includes support for a broad range of large language models, including the most popular open LLM architectures and more than a million models hosted publicly and privately on Hugging Face.
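
For illustration, deployed NIM microservices typically expose an OpenAI-compatible HTTP API. The minimal sketch below shows how a developer might query such an endpoint once a container is running; the endpoint URL and model name are hypothetical placeholders, not details from this announcement.

```python
# Minimal sketch: querying a deployed NIM microservice through its
# OpenAI-compatible chat-completions endpoint. The URL and model name
# below are illustrative placeholders, not values from this announcement.
import requests

NIM_ENDPOINT = "http://localhost:8000/v1/chat/completions"  # hypothetical local deployment
payload = {
    "model": "meta/llama-3.1-8b-instruct",  # example open LLM; substitute the model you deployed
    "messages": [
        {"role": "user", "content": "Summarize what a compute marketplace is in one sentence."}
    ],
    "max_tokens": 128,
    "temperature": 0.2,
}

response = requests.post(NIM_ENDPOINT, json=payload, timeout=60)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```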

For cloud providers, DGX Cloud Lepton includes management software that continuously monitors GPU health in real time and automates root-cause analysis, minimizing manual intervention and reducing downtime. This streamlines operations for providers and ensures more reliable access to high-performance computing for customers.
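
As a rough illustration of the kind of telemetry such management software depends on, the sketch below polls per-GPU health metrics using NVIDIA's Management Library bindings (nvidia-ml-py / pynvml). It is not the DGX Cloud Lepton management software itself, only an assumed example of real-time GPU health monitoring.

```python
# Illustrative sketch of real-time GPU health polling via NVML
# (pip install nvidia-ml-py). This is not the DGX Cloud Lepton management
# software, only an example of the telemetry such software would collect.
import time
import pynvml

pynvml.nvmlInit()
try:
    while True:
        for i in range(pynvml.nvmlDeviceGetCount()):
            handle = pynvml.nvmlDeviceGetHandleByIndex(i)
            name = pynvml.nvmlDeviceGetName(handle)
            name = name.decode() if isinstance(name, bytes) else name
            temp = pynvml.nvmlDeviceGetTemperature(handle, pynvml.NVML_TEMPERATURE_GPU)
            util = pynvml.nvmlDeviceGetUtilizationRates(handle)
            mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
            print(f"GPU {i} ({name}): {temp} C, "
                  f"{util.gpu}% utilization, {mem.used / mem.total:.0%} memory used")
        time.sleep(10)  # poll every 10 seconds
finally:
    pynvml.nvmlShutdown()
```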

NVIDIA DGX Cloud Lepton Speeds Training and Deployment
Early-access DGX Cloud Lepton customers using the platform to accelerate their strategic AI initiatives include:

  • Basecamp Research, which is speeding the discovery and design of new biological solutions for pharmaceuticals, food and industrial and environmental biotechnology by harnessing its 9.8 billion-protein database to pretrain and deploy large biological foundation models.
  • EY, which is standardizing multi-cloud access across the global organization to accelerate the development of AI agents for domain- and sector-specific solutions.
  • Outerbounds, which enables customers to build differentiated, production-grade AI products powered by the proven reliability of open-source Metaflow.
  • Prima Mente, which is advancing neurodegenerative disease research at scale by pretraining large brain foundation models to uncover new disease mechanisms and tools to stratify patient outcomes in clinical settings.
  • Reflection, which is building superintelligent autonomous coding systems that handle the most complex enterprise engineering tasks.

Hugging Face Developers Get Access to Scalable AI Training Across Clouds
Integrating DGX Cloud Lepton with Hugging Face’s Training Cluster as a Service offering gives AI builders streamlined access to the GPU marketplace, making it easy to reserve, access and use NVIDIA compute resources in specific regions, close to their training data. Connected to a global network of cloud providers, Hugging Face customers can quickly secure the necessary GPU capacity for training runs using DGX Cloud Lepton. Mirror Physics, Project Numina and the Telethon Institute of Genetics and Medicine will be among the first Hugging Face customers to access Training Cluster as a Service, with compute resources provided through DGX Cloud Lepton. They will use the platform to advance state-of-the-art AI models in chemistry, materials science, mathematics and disease research.

“Access to large-scale, high-performance compute is essential for building the next generation of AI models across every domain and language,” said Clément Delangue, cofounder and CEO of Hugging Face. “The integration of DGX Cloud Lepton with Training Cluster as a Service will remove barriers for researchers and companies, unlocking the ability to train the most advanced models and push the boundaries of what’s possible in AI.”

DGX Cloud Lepton Boosts AI Startup Ecosystem
NVIDIA is working with Accel, Elaia, Partech and Sofinnova Partners to offer up to $100,000 in GPU capacity credits and support from NVIDIA experts to eligible portfolio companies through DGX Cloud Lepton.

BioCorteX, Bioptimus and Latent Labs will be among the first to access DGX Cloud Lepton, where they can discover and purchase compute capacity and use NVIDIA software, services and AI expertise to build, customize and deploy applications across a global network of cloud providers.

Availability
Developers can sign up for early access to NVIDIA DGX Cloud Lepton.

Watch the NVIDIA GTC Paris keynote from Huang at VivaTech, and explore GTC Paris sessions.

About NVIDIA
NVIDIA (NASDAQ: NVDA) is the world leader in accelerated computing.

For further information, contact:
Natalie Hereth
NVIDIA Corporation
+1-360-581-1088
nhereth@nvidia.com

Certain statements in this press release including, but not limited to, statements as to: DGX Cloud Lepton connecting Europe’s developers to a global AI infrastructure; with partners across the region, NVIDIA building a network of AI factories that developers, researchers and enterprises can harness to scale local breakthroughs into global innovation; the benefits, impact, performance, and availability of NVIDIA’s products, services, and technologies; expectations with respect to NVIDIA’s third party arrangements, including with its collaborators and partners; expectations with respect to technology developments; and other statements that are not historical facts are forward-looking statements within the meaning of Section 27A of the Securities Act of 1933, as amended, and Section 21E of the Securities Exchange Act of 1934, as amended, which are subject to the “safe harbor” created by those sections based on management’s beliefs and assumptions and on information currently available to management and are subject to risks and uncertainties that could cause results to be materially different than expectations. Important factors that could cause actual results to differ materially include: global economic and political conditions; NVIDIA’s reliance on third parties to manufacture, assemble, package and test NVIDIA’s products; the impact of technological development and competition; development of new products and technologies or enhancements to NVIDIA’s existing product and technologies; market acceptance of NVIDIA’s products or NVIDIA’s partners’ products; design, manufacturing or software defects; changes in consumer preferences or demands; changes in industry standards and interfaces; unexpected loss of performance of NVIDIA’s products or technologies when integrated into systems; and changes in applicable laws and regulations, as well as other factors detailed from time to time in the most recent reports NVIDIA files with the Securities and Exchange Commission, or SEC, including, but not limited to, its annual report on Form 10-K and quarterly reports on Form 10-Q. Copies of reports filed with the SEC are posted on the company’s website and are available from NVIDIA without charge. These forward-looking statements are not guarantees of future performance and speak only as of the date hereof, and, except as required by law, NVIDIA disclaims any obligation to update these forward-looking statements to reflect future events or circumstances.

Many of the products and features described herein remain in various stages and will be offered on a when-and-if-available basis. The statements above are not intended to be, and should not be interpreted as a commitment, promise, or legal obligation, and the development, release, and timing of any features or functionalities described for our products is subject to change and remains at the sole discretion of NVIDIA. NVIDIA will have no liability for failure to deliver or delay in the delivery of any of the products, features or functions set forth herein.

© 2025 NVIDIA Corporation. All rights reserved. NVIDIA, the NVIDIA logo, DGX Cloud Lepton, NVIDIA NeMo and NVIDIA NIM are trademarks and/or registered trademarks of NVIDIA Corporation in the U.S. and other countries. Other company and product names may be trademarks of the respective companies with which they are associated. Features, pricing, availability and specifications are subject to change without notice.

A photo accompanying this announcement is available at https://www.globenewswire.com/NewsRoom/AttachmentNg/168c2a8e-0342-4717-bde7-a9bdbe436c08


FAQ

What is NVIDIA DGX Cloud Lepton and what does it offer?

NVIDIA DGX Cloud Lepton is an AI platform with a global compute marketplace that connects developers to GPU resources from various cloud providers, simplifying access to high-performance computing for AI applications.

Which major cloud providers are joining NVIDIA's DGX Cloud Lepton marketplace?

AWS and Microsoft Azure are the first major cloud providers joining the marketplace, along with Mistral AI, Nebius, Nscale, Firebird, Fluidstack, Hydra Host, Scaleway and Together AI.

How is Hugging Face integrating with NVIDIA DGX Cloud Lepton?

Hugging Face is introducing Training Cluster as a Service that integrates with DGX Cloud Lepton to connect AI researchers with compute resources for building foundation models.

What benefits does NVIDIA DGX Cloud Lepton offer to European startups?

Through partnerships with European VCs like Accel, Elaia, Partech and Sofinnova Partners, eligible startups can receive up to $100,000 in GPU capacity credits and expert support.

How does NVIDIA DGX Cloud Lepton help with data governance?

The platform enables developers to keep their data local within specific regions, supporting data governance and sovereign AI requirements while providing access to high-performance GPU resources.