Micron is first to mass-produce HBM3E memory, beating Samsung and SK


Samsung Electronics' fifth-generation HBM3E chips [SAMSUNG ELECTRONICS]

Micron Technology, the United States' leading producer of memory chips, has begun mass-producing high bandwidth memory 3E (HBM3E) for generative AI and high-performance computing, beating dominant Korean players Samsung Electronics and SK hynix to the milestone.
 
The firm unexpectedly announced Monday that its HBM3E chip will be integrated into Nvidia's top-of-the-line H200 GPU, which will begin shipping in the second quarter of 2024. 
 
The Boise, Idaho-based latecomer will become the first chipmaker to mass-produce the new HBM standard, an unanticipated feat given its modest market share in the memory chip segment.
 
Micron shares surged 4.02 percent on Tuesday while those of SK hynix plummeted by 4.94 percent to close at 153,800 won. 
 
The move coincides with Samsung Electronics’ announcement of its successful development of HBM3E chips with the industry’s largest capacity of 36 gigabytes.
 
The Suwon, Gyeonggi-based company has already begun sending product samples to its clients, and the chips are slated for mass production by the first half of this year.
 
HBM3E chips stack twelve 24-gigabit dynamic random access memory (DRAM) dies, reaching a peak memory bandwidth of 1.28 terabytes per second. Both capacity and bandwidth are 50 percent higher than those of the predecessor, the eight-stack HBM3.
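The stated figures can be sanity-checked with simple arithmetic. The sketch below (illustrative only, using no data beyond the article) converts the per-die capacity to bytes and compares the 12-high stack against the 8-high HBM3 stack:

```python
# Back-of-the-envelope check of the HBM3E capacity figures cited in the article.

GBIT_PER_DIE = 24   # each DRAM die holds 24 gigabits
DIES_HBM3E = 12     # 12-high HBM3E stack
DIES_HBM3 = 8       # predecessor: 8-high HBM3 stack

# Convert gigabits to gigabytes (8 bits per byte).
capacity_hbm3e_gb = DIES_HBM3E * GBIT_PER_DIE / 8
capacity_hbm3_gb = DIES_HBM3 * GBIT_PER_DIE / 8

print(capacity_hbm3e_gb)                          # 36.0 GB, matching the article
print(capacity_hbm3_gb)                           # 24.0 GB
print(capacity_hbm3e_gb / capacity_hbm3_gb - 1)   # 0.5, i.e. the 50 percent capacity gain
```

The same 50 percent ratio the article cites for bandwidth follows from comparing the 1.28 TB/s peak against HBM3's lower per-stack bandwidth, though the article gives only the HBM3E figure.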
 
Despite the added layers, the 12-stack HBM3E chips are the same height as eight-layer ones, meeting current packaging requirements, a feat made possible by advanced thermal compression non-conductive film (TC NCF) technology.
 
The chipmaker has also reduced the thickness of its NCF material, achieving the industry's smallest gap between chips at seven micrometers.
 
“The industry’s AI service providers are increasingly requiring HBM with higher capacity, and our new HBM3E 12H product has been designed to answer that need,” said Bae Yong-cheol, executive vice president of memory product planning at Samsung Electronics. “This new memory solution forms part of our drive toward developing core technologies for high-stack HBM and providing technological leadership for the high-capacity HBM market in the AI era.”
 
When deployed in AI services, the latest chips can increase average AI training speed by 34 percent compared to HBM3 products while supporting 11.5 times as many simultaneous users of inference services.
 
Meanwhile, SK hynix is on course to deliver HBM3E chips to Nvidia in the first half of the year as planned.

BY LEE JAE-LIM [lee.jaelim@joongang.co.kr]