AI acceleration: SK hynix fast-tracking HBM4 on Nvidia's request



SK Chairman Chey Tae-won speaks during the SK AI Summit 2024 held at Coex, southern Seoul, on Monday. [SK]

 
SK hynix is moving up the initial delivery schedule for its high bandwidth memory 4 (HBM4) chips by six months at Nvidia’s request, and will also start producing its 16-layer HBM3E chips with the largest capacity to date at 48 gigabytes early next year, asserting its dominance in the AI accelerator market.
 
With the announcement, SK hynix shares surged 6.48 percent to close at 194,000 won ($141.25) on Monday.
 
The push came from none other than Nvidia CEO Jensen Huang, according to SK Chairman Chey Tae-won during his keynote speech on Monday for the SK AI Summit 2024 held at southern Seoul’s Coex.
 
“So I agreed to try and shorten the timeline by six months,” Chey said.
 
“Now I’m a bit nervous to meet Huang again,” he said half-jokingly. “We’re worried he might ask us to speed it up even further.”


SK hynix is the main supplier of HBM to Nvidia, staying ahead of crosstown rival Samsung Electronics, which is still waiting for much-needed approval to supply its fifth-generation HBM3E chips to the AI chip goliath. SK hynix, on the other hand, began supplying these chips to Nvidia from March.
 
Nvidia's Jensen Huang, left, speaks about the importance of high bandwidth memory (HBM) chips in the AI ecosystem in an interview clip released by SK at the SK AI Summit 2024 held at Coex, southern Seoul, on Monday. [LEE HEE-KWON]


 
When pressed in a post-speech interview about SK overtaking Samsung in the chip market, Chey remained cautious, saying that a direct comparison would not be appropriate.
 
“Integrating AI into the chip industry requires diverse approaches and solutions,” Chey said. “Samsung has more resources and technologies than we do, and I am confident that it will also achieve great results in the AI wave.”
 
The expedited HBM4 delivery timeline will depend on next year’s progress, Chey said.
 
“The request [from Huang] was about whether we could deliver samples sooner, and we expressed that if the customer demands it, we would try. […] Advancing technology doesn’t happen because we decide to; it has to meet all the standards and requirements for mass production.”
 
“The market for 16-layer chips is expected to open up from HBM4 models,” said SK hynix CEO Kwak Noh-jung at the session. “Anticipating this trend, SK hynix is developing 48 gigabyte 16-layer HBM3E chips to secure technological stability, and will begin supplying samples to our clients from early next year.”
 
It is the first time SK hynix has officially confirmed the release of its 16-layer version.
 
HBM is a stack of dynamic random access memory (DRAM) chips designed for faster data processing; it has been in the spotlight amid the AI boom as a key component of AI accelerators.
 
SK hynix CEO Kwak Noh-jung speaks at a keynote speech at the SK AI Summit 2024 in southern Seoul on Monday. [SK]


 
The 16-layer HBM3E showed an 18 percent improvement in training performance and a 32 percent improvement in inference compared to the preceding 12-layer products.
 
The chipmaker reported record-high quarterly profit in its third quarter earnings last month, driven by its competitive edge in AI memory chips. It celebrated a turnaround by logging 7 trillion won in operating profit compared to a 1.8 trillion won loss from a year ago, as well as its highest-ever quarterly revenue of 17.6 trillion won.
 
The company aims to deliver sixth-generation 12-layer HBM4 chips next year, and a 16-layer version by 2026.
 
The Nvidia chief appeared in an interview clip prepared by SK and emphasized the need for further progress in HBM to push AI development forward.
 
“The road map of HBM memory is excellent but frankly, I wish we got more bandwidth with lower energy,” Huang said. “So the road map that SK hynix is on is super aggressive and is super necessary.”
 
SK is also investing heavily in building physical AI infrastructure, starting with the opening of an AI data center test bed in Pangyo, Gyeonggi, next month. The facility will operate on Nvidia’s latest chips and SK hynix’s HBM, as well as the latest liquid cooling solutions and technologies for energy optimization.
 
SK Chairman Chey Tae-won, left, and OpenAI CEO Greg Brockman at one of the booths set up for the SK AI Summit 2024 held in southern Seoul on Monday. [SK]


 
Its telecom affiliate, SK Telecom, will transform its existing Gasan data center into an AI data center with a power density of 44 kilowatts per rack for stable GPU operation, and will offer its “GPU as a service,” or GPUaaS, which enables enterprises to use GPUs in a cloud environment without directly purchasing the chips needed for AI service development.
 
The service is part of SKT’s partnership with U.S.-based GPU cloud service provider Lambda, and the Gasan center will operate on Nvidia’s H100 chips next month, with an aim to bring in the latest H200 chips in a first for Korea next March.
 
It will also invest 100 billion won to establish a large-scale neural processing unit (NPU) farm integrating Rebellions’ NPUs, SK hynix’s HBM and various AI data center solutions from SKT to ultimately build an independent AI ecosystem in collaboration with the government, major companies and cloud providers.

BY LEE JAE-LIM [[email protected]]