
Micron HBM Designed into Leading AMD AI Platform

Micron Technology (MU) announced the integration of its HBM3E 36GB 12-high memory solution into AMD's upcoming Instinct MI350 Series GPUs. The collaboration focuses on enhancing AI data center capabilities through improved power efficiency and performance. The AMD Instinct MI350 Series GPU platforms, featuring AMD's CDNA 4 architecture, will incorporate 288GB of HBM3E memory capacity per GPU, delivering up to 8 TB/s bandwidth. This enables support for AI models with up to 520 billion parameters on a single GPU. In full platform configuration, the MI350 Series achieves peak theoretical performance of 161 PFLOPS at FP4 precision, with up to 2.3TB of HBM3E memory. The partnership aims to accelerate AI solutions' time to market, with Micron's HBM3E technology now qualified on multiple leading AI platforms.
Positive
  • Integration with AMD's high-end AI GPU platform demonstrates Micron's strong industry position in HBM technology
  • Partnership enables support for massive AI models with up to 520 billion parameters on a single GPU
  • Solution delivers exceptional performance with up to 8 TB/s bandwidth and 161 PFLOPS at FP4 precision
  • Product qualification achieved on multiple leading AI platforms, indicating strong market acceptance
Negative
  • None.

Insights

Micron's HBM3E memory design win in AMD's MI350 AI GPUs strengthens its position in the high-margin AI accelerator memory market.

Micron's announcement represents a significant design win in the rapidly expanding AI accelerator market. The company's HBM3E 36GB 12-high memory being integrated into AMD's next-generation Instinct MI350 Series GPUs positions Micron favorably in the high-bandwidth memory segment – one of the semiconductor industry's most lucrative and technically challenging markets.

The technical specifications are particularly noteworthy. AMD's platforms will leverage 288GB of HBM3E memory capacity per GPU with up to 8 TB/s bandwidth, enabling support for AI models with up to 520 billion parameters on a single GPU. In full platform configurations, this scales to 2.3TB of HBM3E memory with theoretical performance reaching 161 PFLOPS at FP4 precision. These specifications represent the cutting edge of AI acceleration capabilities.
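A quick back-of-envelope check (the arithmetic here is an illustration, not a figure from the announcement) shows why the stated 288GB per GPU is consistent with a 520-billion-parameter model at FP4 precision, and why 2.3TB per platform implies roughly eight GPUs:

```python
# Capacity check: FP4 stores each parameter in 4 bits (0.5 bytes).
params = 520e9           # 520 billion parameters
bytes_per_param = 0.5    # FP4 precision
weights_gb = params * bytes_per_param / 1e9
print(f"FP4 weights: {weights_gb:.0f} GB vs. 288 GB of HBM3E per GPU")

# Platform scaling: 2.3 TB of total HBM3E divided by 288 GB per GPU
# suggests an eight-GPU platform configuration.
gpus_per_platform = 2.3e12 / 288e9
print(f"Implied GPUs per platform: ~{gpus_per_platform:.1f}")
```

At 260GB of FP4 weights against 288GB of capacity, the model fits with headroom left for activations and the KV cache.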

From a competitive perspective, Micron's qualification on multiple AI platforms suggests the company is successfully competing against Samsung and SK hynix in the premium HBM market. This diversification beyond commodity DRAM toward higher-margin, specialized memory products strengthens Micron's business model and potentially improves its gross-margin profile.

The partnership also demonstrates Micron's manufacturing excellence, as HBM3E production requires advanced packaging technologies and precise manufacturing processes. The 12-high stack configuration mentioned is particularly challenging to produce at scale, indicating Micron has overcome significant technical hurdles.

For semiconductor investors, this partnership reinforces Micron's strategic positioning as AI workloads continue driving demand for high-performance memory solutions, potentially commanding premium pricing compared to standard memory products.

Micron high bandwidth memory (HBM3E 36GB 12-high) and the AMD Instinct™ MI350 Series GPUs and platforms support the pace of AI data center innovation and growth

Micron HBM3E Designed into the AMD Instinct MI350 Series GPU Platforms

A Media Snippet accompanying this announcement is available at this link.

BOISE, Idaho, June 12, 2025 (GLOBE NEWSWIRE) -- Micron Technology, Inc. (Nasdaq: MU) today announced the integration of its HBM3E 36GB 12-high offering into the upcoming AMD Instinct™ MI350 Series solutions. This collaboration highlights the critical role of power efficiency and performance in training large AI models, delivering high-throughput inference and handling complex HPC workloads such as data processing and computational modeling. Furthermore, it represents another significant milestone in HBM industry leadership for Micron, showcasing its robust execution and the value of its strong customer relationships.

The Micron HBM3E 36GB 12-high solution brings industry-leading memory technology to AMD Instinct™ MI350 Series GPU platforms, providing outstanding bandwidth and lower power consumption.1 The AMD Instinct MI350 Series GPU platforms, built on the advanced AMD CDNA 4 architecture, integrate 288GB of high-bandwidth HBM3E memory capacity per GPU, delivering up to 8 TB/s bandwidth for exceptional throughput. This immense memory capacity allows Instinct MI350 Series GPUs to efficiently support AI models with up to 520 billion parameters—on a single GPU. In a full platform configuration, Instinct MI350 Series GPUs offer up to 2.3TB of HBM3E memory and achieve peak theoretical performance of up to 161 PFLOPS at FP4 precision, with leadership energy efficiency and scalability for high-density AI workloads. This tightly integrated architecture, combined with Micron’s power-efficient HBM3E, enables exceptional throughput for large language model training, inference and scientific simulation tasks—empowering data centers to scale seamlessly while maximizing compute performance per watt. This joint effort between Micron and AMD has enabled faster time to market for AI solutions.
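To see why bandwidth matters as much as capacity for inference, consider a rough memory-bandwidth ceiling for single-batch token generation. The assumption below (each generated token reads the full FP4 weight set once from HBM) is a common simplification, not a figure from this announcement:

```python
# Bandwidth-bound decoding estimate (illustrative assumption: one full
# weight read per generated token, no caching or batching effects).
bandwidth_gb_s = 8.0 * 1e3          # 8 TB/s per-GPU HBM3E bandwidth, in GB/s
weights_gb = 520e9 * 0.5 / 1e9      # 520B parameters at FP4 (0.5 bytes each)
tokens_per_s = bandwidth_gb_s / weights_gb
print(f"Theoretical single-batch ceiling: ~{tokens_per_s:.0f} tokens/s")
```

Real-world throughput depends heavily on batching, caching and kernel efficiency, but the estimate shows how the 8 TB/s figure directly bounds how fast a model of this size can be served from a single GPU.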

“Our close working relationship and joint engineering efforts with AMD optimize compatibility of the Micron HBM3E 36GB 12-high product with the Instinct MI350 Series GPUs and platforms. Micron’s HBM3E industry leadership and technology innovations provide improved TCO benefits to end customers with high performance for demanding AI systems,” said Praveen Vaidyanathan, vice president and general manager of Cloud Memory Products at Micron.

“The Micron HBM3E 36GB 12-high product is instrumental in unlocking the performance and energy efficiency of AMD Instinct™ MI350 Series accelerators,” said Josh Friedrich, corporate vice president of AMD Instinct Product Engineering at AMD. “Our continued collaboration with Micron advances low-power, high-bandwidth memory that helps customers train larger AI models, speed inference and tackle complex HPC workloads.”

The Micron HBM3E 36GB 12-high product is now qualified on multiple leading AI platforms. For more information on Micron’s HBM product portfolio, visit: High-bandwidth memory | Micron Technology Inc.

About Micron Technology, Inc.

Micron Technology, Inc. is an industry leader in innovative memory and storage solutions, transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence (AI) and compute-intensive applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more about Micron Technology, Inc. (Nasdaq: MU), visit micron.com.

© 2025 Micron Technology, Inc. All rights reserved. Information, products, and/or specifications are subject to change without notice. Micron, the Micron logo, and all other Micron trademarks are the property of Micron Technology, Inc. All other trademarks are the property of their respective owners.

Micron Product and Technology Communications Contact:
Mengxi Liu Evensen
+1 (408) 444-2276
productandtechnology@micron.com

Micron Investor Relations Contact
Satya Kumar
+1 (408) 450-6199
satyakumar@micron.com

____________________________
1 Data rate testing estimates are based on a shmoo plot of pin speed performed in a manufacturing test environment. Power and performance estimates are based on simulation results of workload use cases.


FAQ

What is the memory capacity of Micron's HBM3E solution in the AMD Instinct MI350 Series GPUs?

The AMD Instinct MI350 Series GPU platforms integrate 288GB of HBM3E memory capacity per GPU, with up to 2.3TB in a full platform configuration.

What performance improvements does the Micron HBM3E bring to AMD's MI350 Series?

The solution delivers up to 8 TB/s bandwidth and achieves peak theoretical performance of 161 PFLOPS at FP4 precision, enabling support for AI models with up to 520 billion parameters on a single GPU.

How does this partnership benefit Micron (MU) in the AI market?

The partnership strengthens Micron's position in the AI market by showcasing its HBM industry leadership and demonstrating successful qualification of its HBM3E technology on leading AI platforms.

What are the main applications for the Micron-AMD HBM3E solution?

The solution is designed for large language model training, inference, scientific simulation tasks, and complex HPC workloads such as data processing and computational modeling.

When will the AMD Instinct MI350 Series with Micron's HBM3E be available?

While the press release announces the integration, it does not specify the exact availability date for the AMD Instinct MI350 Series with Micron's HBM3E memory.