STOCK TITAN

Micron Ships HBM4 to Key Customers to Power Next-Gen AI Platforms

Micron Technology (MU) has announced the shipment of HBM4 36GB 12-high samples to key customers, marking a significant advancement in AI memory technology. Built on its 1β DRAM process, the new HBM4 achieves speeds over 2.0 TB/s per memory stack, delivering more than 60% better performance compared to the previous generation. The product features a 2048-bit interface and demonstrates over 20% improved power efficiency versus Micron's previous HBM3E products. This advancement is particularly crucial for AI applications, enhancing inference performance in large language models and chain-of-thought reasoning systems. Micron plans to begin volume production of HBM4 in 2026, aligning with customers' next-generation AI platform deployments.
Positive
  • HBM4 delivers over 60% better performance compared to previous generation
  • Achieves over 20% better power efficiency versus previous HBM3E products
  • Features high-speed 2048-bit interface with speeds exceeding 2.0 TB/s per memory stack
  • Strategic alignment with customer platform readiness for 2026 production ramp
Negative
  • Production ramp not starting until calendar year 2026, indicating a significant wait for revenue impact

Insights

Micron's HBM4 shipment solidifies its leadership in AI memory, positioning it for growth in the high-margin data center segment.

Micron's announcement of shipping HBM4 36GB 12-high samples to key customers represents a significant technological milestone that strengthens its competitive position in the AI memory market. The new HBM4 delivers more than 2.0 TB/s per memory stack with a more than 60% performance improvement over the previous generation, alongside over 20% better power efficiency compared to Micron's HBM3E products.

This advancement is strategically crucial for three reasons. First, it reinforces Micron's technical leadership in high-performance memory, particularly for AI applications where memory bandwidth is a critical bottleneck. Second, by targeting the high-margin data center segment, Micron is positioning itself in the most profitable part of the memory market. Third, the timing of volume production ramp in 2026 aligns perfectly with next-generation AI platform deployments from major hyperscalers.

The HBM market represents one of the fastest-growing segments in memory, with significantly higher margins than conventional DRAM. Micron's ability to deliver leading power efficiency is particularly valuable for data centers where power consumption represents a major operational cost. By building on their established 1β DRAM process and 12-high packaging technology, Micron has minimized manufacturing risk while maximizing performance gains.

This product strengthens Micron's competitive positioning against Samsung and SK Hynix in the high-performance memory space, potentially enabling market share gains in the lucrative AI accelerator segment, where customers such as NVIDIA, AMD, and, increasingly, hyperscalers developing custom silicon demand cutting-edge memory solutions.

Micron HBM4 36GB 12-high products lead the industry in power efficiency for data center and cloud AI acceleration

Micron HBM4 Product Images

A Media Snippet accompanying this announcement is available in this link.

BOISE, Idaho, June 10, 2025 (GLOBE NEWSWIRE) --  The importance of high-performance memory has never been greater, fueled by its crucial role in supporting the growing demands of AI training and inference workloads in data centers. Micron Technology, Inc. (Nasdaq: MU), today announced the shipment of HBM4 36GB 12-high samples to multiple key customers. This milestone extends Micron's leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology and highly capable memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms.

A leap forward
As use of generative AI continues to grow, the ability to effectively manage inference becomes more important. Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better performance over the previous generation.1 This expanded interface facilitates rapid communication and a high-throughput design that accelerates the inference performance of large language models and chain-of-thought reasoning systems. Simply put, HBM4 will help AI accelerators respond faster and reason more effectively.
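The headline figures above can be sanity-checked with simple arithmetic. The sketch below (an illustrative calculation, not from the press release) derives the per-pin data rate implied by a 2048-bit interface at 2.0 TB/s, and the generational uplift implied by the footnoted 2.0 TB/s vs. 1.2 TB/s comparison:

```python
# Back-of-the-envelope check of the quoted HBM4 numbers:
# a 2048-bit interface delivering >2.0 TB/s per stack, versus
# 1.2 TB/s for the prior HBM3E generation (per footnote 1).

def per_pin_rate_gbps(stack_bandwidth_tbps: float, interface_width_bits: int) -> float:
    """Per-pin data rate in Gbit/s implied by a stack bandwidth in TB/s."""
    bits_per_second = stack_bandwidth_tbps * 1e12 * 8  # TB/s -> bit/s
    return bits_per_second / interface_width_bits / 1e9

hbm4_pin = per_pin_rate_gbps(2.0, 2048)      # ~7.8 Gbit/s per pin
uplift_pct = (2.0 - 1.2) / 1.2 * 100         # ~67%, consistent with ">60%"

print(f"Implied per-pin rate: {hbm4_pin:.2f} Gbit/s")
print(f"Bandwidth uplift over HBM3E: {uplift_pct:.0f}%")
```

The result squares with the article's claims: roughly 7.8 Gbit/s per pin across 2048 pins yields the quoted 2.0 TB/s, and a 2.0 vs. 1.2 TB/s comparison is a 67% gain, matching the "more than 60%" figure.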

Additionally, Micron HBM4 features over 20% better power efficiency compared to Micron's previous-generation HBM3E products, which set new industry benchmarks for HBM power efficiency.2 This improvement provides maximum throughput with the lowest power consumption to maximize data center efficiency.

Generative AI use cases continue to multiply, and this transformative technology is poised to deliver significant benefits to society. HBM4 is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance and transportation.

"Micron HBM4’s performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers’ next-generation AI platform readiness to ensure seamless integration and volume ramp."

Intelligence Accelerated: Micron’s role in the AI revolution
For nearly five decades, Micron has pushed the boundaries of memory and storage innovation. Today, Micron continues to accelerate AI by delivering a broad portfolio of solutions that turn data into intelligence, fueling breakthroughs from the data center to the edge. With HBM4, Micron reinforces its position as a critical catalyst for AI innovation and a reliable partner for our customers’ most demanding solutions.

Micron plans to ramp HBM4 in calendar year 2026, aligned to the ramp of customers’ next-generation AI platforms. For more information on Micron HBM4, visit https://www.micron.com/products/memory/hbm.

About Micron Technology, Inc.
Micron Technology, Inc. is an industry leader in innovative memory and storage solutions, transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence (AI) and compute-intensive applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more about Micron Technology, Inc. (Nasdaq: MU), visit micron.com.

© 2025 Micron Technology, Inc. All rights reserved. Information, products, and/or specifications are subject to change without notice. Micron, the Micron logo, and all other Micron trademarks are the property of Micron Technology, Inc. All other trademarks are the property of their respective owners.

Micron Product and Technology Communications Contact:
Mengxi Liu Evensen
+1 (408) 444-2276
productandtechnology@micron.com

Micron Investor Relations Contact
Satya Kumar
+1 (408) 450-6199
satyakumar@micron.com

1 Based on internal Micron HBM4 testing and published HBM3E specifications (2.0 TB/s vs. 1.2 TB/s).

2 Based on internal Micron simulation projections in comparison to Micron HBM3E 36GB 12-high and similar competitive products.


FAQ

What are the key performance improvements of Micron's HBM4 memory?

Micron's HBM4 delivers over 60% better performance than the previous generation and features over 20% better power efficiency compared to HBM3E, with speeds exceeding 2.0 TB/s per memory stack.

When will Micron (MU) begin production of HBM4?

Micron plans to begin ramping HBM4 production in calendar year 2026, aligned with customers' next-generation AI platform readiness.

What is the capacity of Micron's new HBM4 memory?

Micron's new HBM4 features 36GB capacity in a 12-high configuration.

How will Micron's HBM4 benefit AI applications?

HBM4 will accelerate inference performance of large language models and chain-of-thought reasoning systems, enabling faster responses and more effective reasoning in AI applications.

What technology is Micron's HBM4 built on?

Micron's HBM4 is built on its 1β (1-beta) DRAM process, featuring 12-high advanced packaging technology and memory built-in self-test (MBIST).