Micron Ships HBM4 to Key Customers to Power Next-Gen AI Platforms
- HBM4 delivers over 60% better performance compared to the previous generation
- Achieves over 20% better power efficiency versus previous HBM3E products
- Features high-speed 2048-bit interface with speeds exceeding 2.0 TB/s per memory stack
- Strategic alignment with customer platform readiness for 2026 production ramp
- Production ramp not starting until calendar year 2026, indicating a significant wait for revenue impact
Insights
Micron's HBM4 shipment solidifies its leadership in AI memory, positioning it for growth in the high-margin data center segment.
Micron's announcement of shipping HBM4 36GB 12-high samples to key customers represents a significant technological milestone that strengthens its competitive position in the AI memory market. The new HBM4 delivers more than 2.0 TB/s per memory stack with a more than 60% performance improvement over the previous generation, alongside 20% better power efficiency compared to its HBM3E products.
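As a rough back-of-the-envelope check, the per-stack bandwidth figures Micron cites in its footnotes (2.0 TB/s for HBM4 versus 1.2 TB/s for HBM3E) are consistent with the "more than 60%" claim. The short Python sketch below is purely illustrative and uses only those cited numbers.

```python
# Back-of-the-envelope check of the cited bandwidth uplift, using the
# per-stack figures from Micron's footnote (2.0 TB/s HBM4 vs. 1.2 TB/s HBM3E).
hbm4_bw_tbps = 2.0   # TB/s per stack (HBM4, per footnote 1)
hbm3e_bw_tbps = 1.2  # TB/s per stack (HBM3E, per footnote 1)

uplift = (hbm4_bw_tbps - hbm3e_bw_tbps) / hbm3e_bw_tbps
print(f"Bandwidth uplift: {uplift:.1%}")  # ~66.7%, consistent with ">60%"
```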
This advancement is strategically crucial for three reasons. First, it reinforces Micron's technical leadership in high-performance memory, particularly for AI applications where memory bandwidth is a critical bottleneck. Second, by targeting the high-margin data center segment, Micron is positioning itself in the most profitable part of the memory market. Third, the timing of volume production ramp in 2026 aligns perfectly with next-generation AI platform deployments from major hyperscalers.
The HBM market represents one of the fastest-growing segments in memory, with significantly higher margins than conventional DRAM. Micron's ability to deliver leading power efficiency is particularly valuable for data centers, where power consumption represents a major operational cost. By building on its established 1β DRAM process and 12-high packaging technology, Micron has minimized manufacturing risk while maximizing performance gains.
This product strengthens Micron's competitive positioning against Samsung and SK Hynix in the high-performance memory space, potentially enabling market share gains in the lucrative AI accelerator segment, where customers such as NVIDIA, AMD and, increasingly, hyperscalers developing custom silicon demand cutting-edge memory solutions.
Micron HBM4 36GB 12-high products lead the industry in power efficiency for data center and cloud AI acceleration
BOISE, Idaho, June 10, 2025 (GLOBE NEWSWIRE) -- The importance of high-performance memory has never been greater, fueled by its crucial role in supporting the growing demands of AI training and inference workloads in data centers. Micron Technology, Inc. (Nasdaq: MU), today announced the shipment of HBM4 36GB 12-high samples to multiple key customers. This milestone extends Micron’s leadership in memory performance and power efficiency for AI applications. Built on its well-established 1β (1-beta) DRAM process, proven 12-high advanced packaging technology and highly capable memory built-in self-test (MBIST) feature, Micron HBM4 provides seamless integration for customers and partners developing next-generation AI platforms.
A leap forward
As use of generative AI continues to grow, the ability to effectively manage inference becomes more important. Micron HBM4 features a 2048-bit interface, achieving speeds greater than 2.0 TB/s per memory stack and more than 60% better performance over the previous generation.
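For context, the sketch below estimates the per-pin data rate implied by the stated figures. It assumes decimal terabytes and that the full 2048-bit interface transfers data concurrently; both are simplifying assumptions for illustration, not Micron specifications.

```python
# Illustrative per-pin data rate implied by a 2048-bit interface running at
# 2.0 TB/s per stack. Assumes decimal units (1 TB = 10^12 bytes) and that all
# 2048 I/O pins transfer concurrently; simplifying assumptions, not a
# Micron specification.
stack_bandwidth_bytes = 2.0e12   # 2.0 TB/s per memory stack
interface_width_bits = 2048      # HBM4 interface width

bits_per_second = stack_bandwidth_bytes * 8
per_pin_gbps = bits_per_second / interface_width_bits / 1e9
print(f"Implied per-pin data rate: {per_pin_gbps:.2f} Gb/s")  # ~7.81 Gb/s per pin
```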
Additionally, Micron HBM4 features over 20% better power efficiency compared to Micron's previous-generation HBM3E products.
Generative AI use cases continue to multiply, and this transformative technology is poised to deliver significant benefits to society. HBM4 is a crucial enabler, driving quicker insights and discoveries that will foster innovation in diverse fields such as healthcare, finance and transportation.
"Micron HBM4’s performance, higher bandwidth and industry-leading power efficiency are a testament to our memory technology and product leadership," said Raj Narasimhan, senior vice president and general manager of Micron’s Cloud Memory Business Unit. "Building on the remarkable milestones achieved with our HBM3E deployment, we continue to drive innovation with HBM4 and our robust portfolio of AI memory and storage solutions. Our HBM4 production milestones are aligned with our customers’ next-generation AI platform readiness to ensure seamless integration and volume ramp."
Intelligence Accelerated: Micron’s role in the AI revolution
For nearly five decades, Micron has pushed the boundaries of memory and storage innovation. Today, Micron continues to accelerate AI by delivering a broad portfolio of solutions that turn data into intelligence, fueling breakthroughs from the data center to the edge. With HBM4, Micron reinforces its position as a critical catalyst for AI innovation and a reliable partner for our customers’ most demanding solutions.
Micron plans to ramp HBM4 in calendar year 2026, aligned to the ramp of customers’ next-generation AI platforms. For more information on Micron HBM4, visit https://www.micron.com/products/memory/hbm.
About Micron Technology, Inc.
Micron Technology, Inc. is an industry leader in innovative memory and storage solutions, transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence (AI) and compute-intensive applications that unleash opportunities — from the data center to the intelligent edge and across the client and mobile user experience. To learn more about Micron Technology, Inc. (Nasdaq: MU), visit micron.com.
© 2025 Micron Technology, Inc. All rights reserved. Information, products, and/or specifications are subject to change without notice. Micron, the Micron logo, and all other Micron trademarks are the property of Micron Technology, Inc. All other trademarks are the property of their respective owners.
Micron Product and Technology Communications Contact:
Mengxi Liu Evensen
+1 (408) 444-2276
productandtechnology@micron.com
Micron Investor Relations Contact:
Satya Kumar
+1 (408) 450-6199
satyakumar@micron.com
1 Based on internal Micron HBM4 testing and published HBM3E specifications (2.0 TB/s vs. 1.2 TB/s).
2 Based on internal Micron simulation projections in comparison to Micron HBM3E 36GB 12-high and similar competitive products.
