SK hynix Begins Volume Production of the World's First 12-Layer HBM3E
SK hynix has begun mass production of the world's first 12-layer HBM3E product with 36GB capacity, the largest in existing HBM. The company plans to supply these products to customers by the end of the year. This achievement comes six months after delivering the 8-layer HBM3E product in March.
Key features of the 12-layer HBM3E include:
- 50% increased capacity at the same thickness as the previous 8-layer product
- DRAM chips made 40% thinner
- Memory operation speed of 9.6 Gbps
- 10% higher heat dissipation performance
SK hynix aims to maintain its leadership in the AI memory market with this product, addressing the growing needs of AI companies.
- The company plans to supply the highest-performing, highest-capacity 12-layer HBM3E to customers by the end of the year
- DRAM chips made 40% thinner to increase capacity by 50% at the same thickness as the previous 8-layer product
- The company to continue HBM's success with outstanding product performance and competitiveness
[1] Previously, the maximum capacity of HBM3E was 24GB from eight vertically stacked 3GB DRAM chips.
[2] HBM (High Bandwidth Memory): This high-value, high-performance memory vertically interconnects multiple DRAM chips and dramatically increases data processing speed in comparison to traditional DRAM products. HBM3E is the extended version of HBM3, the fourth generation product that succeeds the previous generations of HBM, HBM2 and HBM2E.
The company plans to supply mass-produced products to customers within the year, once again proving its technological leadership just six months after becoming the first in the industry to deliver the 8-layer HBM3E product to customers in March this year.
SK hynix is the only company in the world that has developed and supplied the entire HBM lineup from the first generation (HBM1) to the fifth generation (HBM3E), since releasing the world's first HBM in 2013. The company plans to continue its leadership in the AI memory market, addressing the growing needs of AI companies by being the first in the industry to mass-produce the 12-layer HBM3E.
According to the company, the 12-layer HBM3E product meets the world's highest standards in all areas that are essential for AI memory including speed, capacity and stability. SK hynix has increased the speed of memory operations to 9.6 Gbps, the highest memory speed available today. If 'Llama 3 70B'[3], a Large Language Model (LLM), is driven by a single GPU equipped with four HBM3E products, it can read 70 billion total parameters 35 times within a second.
[3] Llama 3: Open-source LLM released by Meta in April 2024, with 3 sizes in total: 8B (Billion), 70B, and 400B.
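The "35 reads per second" figure can be sanity-checked with a quick calculation. The assumptions below, a 1024-bit data interface per HBM3E stack and 16-bit (2-byte) parameters, are typical for HBM and LLM inference but are not stated in the release:

```python
# Back-of-the-envelope check of the "reads 70 billion parameters
# 35 times within a second" claim.
# Assumptions (not from the press release): 1024-bit interface per
# HBM3E stack; parameters stored in 16-bit (2-byte) precision.

pin_speed_gbps = 9.6     # per-pin data rate stated in the release (Gbit/s)
pins_per_stack = 1024    # assumed HBM interface width (bits)
stacks = 4               # HBM3E stacks on the GPU, per the release

# Aggregate bandwidth in GB/s (divide bits by 8 to get bytes)
bandwidth_gb_s = pin_speed_gbps * pins_per_stack * stacks / 8
print(f"Total bandwidth: {bandwidth_gb_s:.1f} GB/s")   # 4915.2 GB/s

params = 70e9            # Llama 3 70B parameter count
bytes_per_param = 2      # assumed FP16/BF16 storage
model_size_gb = params * bytes_per_param / 1e9          # 140 GB

reads_per_second = bandwidth_gb_s / model_size_gb
print(f"Full-model reads per second: {reads_per_second:.1f}")  # ~35.1
```

Under these assumptions the arithmetic lands almost exactly on the quoted figure of 35 full passes over the model per second.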
SK hynix has increased the capacity by 50% at the same thickness as the previous eight-layer product by stacking 12 DRAM chips that were each made 40% thinner, connecting them vertically with TSV[4] technology.
The company also solved structural issues that arise from stacking thinner chips higher by applying its core technology, the Advanced MR-MUF[5] process. This allows the product to deliver 10% higher heat dissipation performance than the previous generation.
[4] TSV (Through Silicon Via): This advanced packaging technology links upper and lower chips with an electrode that vertically passes through thousands of fine holes on DRAM chips.
[5] MR-MUF (Mass Reflow Molded Underfill): The process of stacking semiconductor chips, injecting liquid protective materials between them to protect the circuit between chips, and hardening them. The process has proved to be more efficient and effective for heat dissipation, compared with the method of laying film-type materials for each chip stack. SK hynix's advanced MR-MUF technology is critical to securing a stable HBM mass production as it provides good warpage control and reduces the pressure on the chips being stacked.
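The capacity figures quoted above are internally consistent, as a quick check using the 3GB-per-die figure from footnote [1] shows:

```python
# Capacity check: twelve 3GB dies vs. the previous eight-die stack,
# using the 3GB-per-die figure from footnote [1].
die_capacity_gb = 3
prev_layers, new_layers = 8, 12

prev_capacity = prev_layers * die_capacity_gb   # 24 GB (8-layer HBM3E)
new_capacity = new_layers * die_capacity_gb     # 36 GB (12-layer HBM3E)
increase_pct = (new_capacity - prev_capacity) / prev_capacity * 100

print(prev_capacity, new_capacity, increase_pct)  # 24 36 50.0
```

Twelve dies at 3GB each give the 36GB total, exactly 50% more than the 24GB eight-layer product.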
"SK hynix has once again broken through technological limits demonstrating our industry leadership in AI memory," said Justin Kim, President (Head of AI Infra) at SK hynix. "We will continue our position as the No.1 global AI memory provider as we steadily prepare next-generation memory products to overcome the challenges of the AI era."
About SK hynix Inc.
SK hynix Inc., headquartered in Korea, is a global supplier of memory semiconductors, including DRAM and NAND flash products.
View original content to download multimedia: https://www.prnewswire.com/news-releases/sk-hynix-begins-volume-production-of-the-worlds-first-12-layer-hbm3e-302259251.html
SOURCE SK hynix Inc.