Rare stock bonus, Jensen Huang's hat tip signal Samsung’s HBM4 breakthrough



Samsung Electronics' next-generation high bandwidth memory (HBM4) is displayed at the 27th Semiconductor Exhibition held at Coex in Gangnam District, southern Seoul, on Oct. 22. [NEWS1]




[NEWS ANALYSIS]
 
Following a sluggish stretch in technological progress, Samsung Electronics looks to catch up with rivals Micron and SK hynix in next-generation high bandwidth memory (HBM4) chips. Signs point to renewed momentum: a hefty bonus for Samsung's HBM4 development team, Nvidia CEO Jensen Huang's encouraging remarks and Samsung's early ramp-up of HBM4 production all suggest a bright outlook for the Korean tech giant's memory business.
 
Samsung’s latest HBM4 chips are currently undergoing verification testing by major client Nvidia — a crucial milestone that could determine its comeback in the high-end memory market. Although the fate of the high-performance memory for AI training chips largely hinges on Nvidia's adoption, sentiment around HBM4 is markedly more optimistic than it was for HBM3.
 
Thirty members of Samsung’s memory division received a combined 515 million won ($354,000) in company shares on Oct. 31, an unusual incentive tied to their success in developing the company’s newest 10-nanometer-class 1c dynamic random-access memory (DRAM) process. It marked the first time Samsung has granted treasury shares to developers, and the first instance in which the company has officially disclosed such an incentive. Industry watchers say the move signals positive feedback from Nvidia regarding Samsung’s HBM4 performance.
 

 
Nvidia chief Huang, during a press briefing in Gyeongju, North Gyeongsang, on Oct. 31, praised Korean companies’ memory technologies and described Samsung as a “key supply partner for HBM3E and HBM4” in a same-day statement announcing a major Blackwell processor supply deal. The recognition marks a turning point for Samsung, which previously struggled to secure Nvidia’s validation for its HBM3 and HBM3E products due to yield and quality challenges in its 1b process — an opening that allowed SK hynix to take the lead in the memory market.
 
 
Both Samsung and SK hynix are expected to supply HBM4 chips for Nvidia’s next-generation Rubin processors, slated for mass production in the second half of 2026 — a timeline Huang confirmed during his remarks to the Korean press. KB Securities analyst Kim Dong-won projects Samsung’s HBM shipments could surge as much as fourfold by 2026, buoyed by orders for HBM3E and HBM4.
 
The company is in the midst of converting the DRAM production lines at its latest plants in Pyeongtaek and Hwaseong, Gyeonggi, to 1c DRAM as it moves to "actively carry out investments necessary" to expand production capacity, Kim Jae-june, Samsung Electronics' vice president in charge of memory chips, said during a conference call on Oct. 30.
 
 
Logic die yield points to mass production readiness
Adding to the optimism, reports indicate that Samsung’s foundry division has achieved a yield rate exceeding 90 percent for its 4-nanometer logic die — a critical component of the HBM4 architecture. The logic die, often described as the “brain” of the memory stack, connects the DRAM layers to the GPU and manages power distribution and data transmission in AI systems. A 90 percent yield suggests that 9 out of 10 chips are fully functional, implying that the process has matured enough for mass production.
 
SK hynix's fifth-generation high bandwidth memory (HBM3E) is displayed at the chipmaker's booth for SK AI Summit held at Coex in Gangnam District, southern Seoul, on Nov. 3. [YONHAP]


 
As Samsung’s Suwon-based facilities ramp up HBM4 sample production for customers such as Nvidia and AMD, the company’s foundry division is said to have devoted nearly half of its 4-nanometer line capacity to logic die output, underscoring its focus on next-generation memory components.
 
By combining 1c DRAM and the 4-nanometer logic base die, Samsung could boost its HBM4 processing speeds to 11 gigabits per second (Gbps), higher than the 9-10 Gbps expected from SK hynix and Micron's HBM4 offerings, making its memory a better fit for Nvidia's Rubin chips.
 
However, Samsung trails SK hynix by about two months in obtaining Nvidia’s approval, a delay that could reshape the HBM4 market outlook in both the short and long term. SK hynix is expected to take the early lead, benefiting from first-mover advantage in production volume and pricing power. Over the longer term, analysts project that Samsung will rapidly gain market share as its technology matures and production capacity scales up.
 

 
“As new HBM4 supply comes online, competition among vendors will intensify. Ultimately, the market could settle with SK hynix holding around a 50 percent share, Samsung a 30 percent share and Micron a 20 percent share,” Kiwoom Securities analyst Pak Yu-ak said.
 
 
Chip supercycle seen extending through 2027
Analysts expect the ongoing semiconductor upcycle to persist through 2027, despite mounting speculation of an AI-driven market correction. In a sign of confidence, Korean brokerages are shifting their valuation frameworks for chipmakers from price-to-book (PBR) to price-to-earnings (PER) ratios — a recognition that the memory business may now be entering a phase of more stable, structural growth.
 
SK Securities set target prices of 1 million won for SK hynix — a record-high figure — and 170,000 won for Samsung Electronics, both based on earnings multiples for the first time. Traditionally, the cyclical nature of memory markets made book value a more dependable metric than volatile earnings, but analysts say the paradigm is changing.
 

 
“It’s time to ask whether PBR is still the right yardstick for valuing memory chipmakers in the AI era,” said Han Dong-hee, analyst at SK Securities. “As profits stabilize and long-term structural growth continues, the logic for using PBR will weaken.”
 
Han added that tight production capacity among major players is expected to sustain higher prices in the coming years.
 
Market tracker CTT Research echoed the sentiment, forecasting that demand for DRAM and NAND chips will continue to outpace supply amid robust AI infrastructure investment and expanding data center buildouts. “Despite a few risk factors, the memory market is expected to stay in a favorable cycle at least through the first half of next year,” the report said Wednesday.
 
Correction, Nov. 7, 2025: An earlier version of this article misstated the incentive amount.

BY LEE JAE-LIM [[email protected]]