
SOLD OUT until 2025! - HBM duo's production capacity

2024-07-04


HBM's two major players have sold out their production capacity through 2025, and they are investing heavily in HBM4 because their deep-pocketed customers are willing to pay for it.

01 HBM chips

"HBM3 out of stock" meets "HBM4 technology competition"

HBM chips now account for 15% of the overall memory market, up from 8% last year, driven by demand from artificial intelligence (AI), high-performance computing (HPC) and data centers. HBM offers high bandwidth and low power consumption through 3D stacking, making it the first choice for many high-performance applications.

Recently, Micron Technology (Micron) announced that its HBM production capacity has been sold out until 2025, following SK Hynix's previous disclosure that its HBM capacity has been booked until next year.

The HBM market is expected to keep growing rapidly over the next few years, and the major manufacturers are continuously adding production capacity to meet demand. Micron and SK Hynix are the main players in the HBM market, while Samsung is currently lagging behind. Shelley Jang, an executive at Fitch Ratings, said: "Samsung still needs time to catch up. There is a technological gap in producing HBM. In the short term, SK Hynix may maintain its advantage in the market." Dylan Patel, principal analyst at SemiAnalysis, told EE Times earlier this year that SK Hynix leads the HBM field, with an HBM3 market share of more than 85% and an overall HBM market share of more than 70%.



The main players' products are as follows:

Micron's HBM3E memory provides a bandwidth of more than 1.2 TB/s. Its HBM3E chips will be used by Nvidia to power the H200 Tensor Core GPU. HBMnext is speculated to be Micron's HBM4.

SK Hynix's HBM3E provides up to 36GB of capacity and 1.18TB/s of bandwidth within the same package size. In March, SK Hynix announced that HBM3E had entered mass production and that it will work with TSMC to develop HBM4 products.

Samsung's HBM3E memory uses 12-layer stacking to provide 36GB of capacity and 1.28TB/s of bandwidth, making it the highest-capacity HBM3E product on the market. A rough cross-check of these bandwidth figures follows below.
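Peak per-stack HBM bandwidth is simply the interface width multiplied by the per-pin data rate; the short Python sketch below illustrates the relationship. The pin speeds are illustrative values back-calculated to match the figures above, not official vendor specifications.

```python
# Peak per-stack HBM bandwidth: interface width (bits) * pin speed (Gb/s) / 8 -> GB/s.
# The pin speeds below are illustrative assumptions chosen to match the figures
# quoted in the article, not official vendor specifications.

def stack_bandwidth_gbps(interface_bits: int, pin_speed_gbps: float) -> float:
    """Peak bandwidth of a single HBM stack, in GB/s."""
    return interface_bits * pin_speed_gbps / 8

examples = [
    ("HBM3 (spec rate)",       1024, 6.4),   # ~819 GB/s per stack
    ("HBM3E, SK Hynix figure", 1024, 9.2),   # ~1.18 TB/s per stack
    ("HBM3E, Samsung figure",  1024, 10.0),  # ~1.28 TB/s per stack
]

for name, width, speed in examples:
    print(f"{name}: {stack_bandwidth_gbps(width, speed):.0f} GB/s")
```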

Micron reported revenue of US$6.811 billion in its most recent fiscal quarter (March to May), up 81.5% year-on-year, with operating profit of US$719 million. Of this, the DRAM product line contributed US$4.7 billion, or 69% of revenue, a figure that includes HBM sales.

It is worth noting that Micron's HBM3E 8H model brought in more than US$100 million in revenue in a single quarter, and Micron claims that its HBM products consume 30% less power than competing products. Micron also plans to start production of HBM3E 12H in 2025; although this is slightly later than some competitors, the company expects the HBM business to bring in billions of dollars of revenue next year as it increases HBM production capacity worldwide.

In the long run, Micron expects its share of the HBM market to roughly match its share of the overall DRAM market by around 2025.

Intel has also publicly expressed interest in HBM integration and may adopt HBM in future product designs to improve system performance. AMD, meanwhile, has been an active adopter of HBM, having integrated it into several product generations, and is expected to deepen its use of the technology going forward.


02 HBM4

What changes will next-generation HBM4 bring?

Samsung, SK Hynix and Micron are all vying for market advantage through technological innovation and capacity expansion, especially ahead of the coming growth in AI and HPC demand.

With technology iterating faster, the competitive landscape of the HBM market will become more complex and changeable. For next-generation HBM4 memory, the manufacturers' roadmaps point to the following changes.


Interface width. 

The interface width increases from 1024 bits per stack to 2048 bits per stack, a breakthrough change for HBM4. With a 2048-bit memory interface, transfer speed can theoretically double again. For example, Nvidia's flagship Hopper H100 GPU uses six HBM3 stacks for a total bus width of 6144 bits. If the per-stack interface doubles to 2048 bits, Nvidia could theoretically halve the stack count to three and keep the same total bus width.
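The arithmetic behind that example is straightforward: a GPU's total memory bus width is the number of stacks times the per-stack interface width. A minimal sketch using the stack counts and widths quoted above (real performance also depends on per-pin speed, which is assumed unchanged here):

```python
# Total memory bus width = number of HBM stacks * interface width per stack.
# Doubling the per-stack width reaches the same total width with half the stacks.

def total_bus_width_bits(stacks: int, bits_per_stack: int) -> int:
    return stacks * bits_per_stack

hbm3_layout = total_bus_width_bits(stacks=6, bits_per_stack=1024)  # H100-style HBM3 layout
hbm4_layout = total_bus_width_bits(stacks=3, bits_per_stack=2048)  # hypothetical HBM4 layout

assert hbm3_layout == hbm4_layout == 6144
print(hbm3_layout, hbm4_layout)  # 6144 6144
```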

Number of layers. 

HBM4 also changes the number of stacking layers. In addition to the initial 12-layer vertical stacks, memory manufacturers are expected to introduce 16-layer vertical stacking in 2027; SK Hynix will continue to use its advanced MR-MUF technology to achieve 16-layer stacking. By contrast, HBM3 stacks are mainly 8 or 12 layers.
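To see why the layer count matters, note that stack capacity scales directly with the number of stacked DRAM dies. The illustrative calculation below assumes 24 Gb (3 GB) dies, consistent with the 36 GB 12-high HBM3E parts mentioned earlier; actual HBM4 die densities may differ.

```python
# Stack capacity = number of stacked DRAM dies * capacity per die.
# A 24 Gb (3 GB) die is assumed for illustration, consistent with 12-high 36 GB HBM3E;
# actual HBM4 die densities may differ.

DIE_CAPACITY_GB = 3  # assumed per-die capacity in GB

for layers in (8, 12, 16):
    print(f"{layers}-high stack: {layers * DIE_CAPACITY_GB} GB")
# 8-high: 24 GB, 12-high: 36 GB, 16-high: 48 GB
```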

Lower power consumption. 

HBM4E is expected to reduce system energy consumption, lower heat output, and extend device battery life. HBM4E memory is also expected to be used across artificial intelligence, high-performance computing, data centers, graphics processing and other fields, bringing higher performance and efficiency to each.

Process. 

Samsung also plans to build the corresponding logic dies for HBM4 on a FinFET node instead of planar MOSFETs, and the packaging method will change from bump-based CoW (Chip on Wafer) to a bumpless, pad-based connection.

HBM4 mass production time forecast


HBM (first generation) was developed by SK Hynix in 2014; HBM2 (second generation) was launched in 2018, HBM2E (third generation) in 2020, HBM3 (fourth generation) in 2022, and HBM3E (fifth generation) this year.

"Since the development of the first generation of HBM, a new generation has been developed every two years, but starting with HBM3E, it has been updated every year," SK Hynix said. HBM4 (the sixth generation) will be launched even faster.


SK Hynix plans to launch its first HBM4 products with 12-layer DRAM stacks in the second half of 2025, followed by 16-layer stacked HBM4 slightly later in 2026, with timing aligned to Nvidia's accelerated AI accelerator release cycle.

Samsung plans to produce HBM4 samples in 2025 and begin mass production in 2026.

Micron expects to launch 12- and 16-layer stacked HBM4 in 2026 with bandwidth exceeding 1.5TB/s; by 2027-2028, it will also release 12- and 16-layer stacked HBM4E with bandwidth exceeding 2TB/s.

Interestingly, according to Micron CEO Sanjay Mehrotra, industry supply of DRAM and HBM will fall short of demand this year, partly because HBM production is heavily cannibalizing conventional DRAM capacity.

At a given memory process node, producing a given capacity of HBM3E consumes three times as many wafers as ordinary DDR5. HBM4 is expected to be even worse in this respect: packaging complexity and the higher performance required of the memory will further reduce the yield of the final HBM package.
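A hedged back-of-the-envelope sketch of how that trade ratio and the extra packaging loss squeeze wafer supply: the 3x wafer ratio comes from the paragraph above, while the yield figures are placeholder assumptions, not reported numbers.

```python
# Back-of-the-envelope wafer demand: how many wafer starts are needed to ship
# a given amount of memory capacity, given a wafer trade ratio and packaging yield.
# The 3x trade ratio is from the article; the yields below are placeholder assumptions.

def wafer_starts(capacity_units: float, trade_ratio: float, package_yield: float) -> float:
    """Wafer starts needed: capacity * (wafers per unit of capacity) / packaging yield."""
    return capacity_units * trade_ratio / package_yield

ddr5  = wafer_starts(100, trade_ratio=1.0, package_yield=0.95)  # conventional DDR5 baseline
hbm3e = wafer_starts(100, trade_ratio=3.0, package_yield=0.80)  # assumed stacking/packaging loss

print(f"DDR5: {ddr5:.0f} wafer starts, HBM3E: {hbm3e:.0f} wafer starts ({hbm3e / ddr5:.1f}x)")
```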


So the trick is to make money on more complex products, and Nvidia, AMD, Intel and others may be willing to pay a little more for HBM3E, HBM4 and HBM4E memory in the coming years.


