[AI IN ACTION] AI boom runs on chips, and Korean firms are jumping on it


Big Tech companies drive demand for high-end AI chips. [SHUTTERSTOCK]


The wild AI boom has shaken up the semiconductor industry, splitting companies into winners and losers based on their ability to power generative technology.
 
Nvidia, now a proud member of the $1 trillion market cap club, is considered the biggest winner. Its GPUs have become essential to training advanced AI systems.
 
Korean players are capitalizing on it too. Samsung Electronics and SK hynix produce high-end memory chips tailored to the likes of Nvidia's GPUs. 
 
Samsung Electronics will expand production of its high bandwidth memory (HBM) chips next year to 2.5 times this year's level.
 
SK hynix will increase its facility investment next year as well, focusing on HBM production. The company will add a new packaging line, essential to producing HBM chips, to its existing Cheongju plant.
 
SK hynix manufacturing site in Cheongju, North Chungcheong [JOONGANG PHOTO]


 
HBM is a type of high-end dynamic random access memory (DRAM) chip that uses stacking technology to process data at faster speeds, and with higher energy efficiency, than ordinary chips can.
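HBM's speed advantage comes largely from width: stacking dies lets the chip expose a far wider interface to the processor than a conventional module. A minimal sketch of that relationship, using illustrative figures rather than vendor specifications:

```python
# Toy model: peak memory bandwidth scales with interface width times
# transfer rate. The bus widths and transfer rate below are illustrative
# assumptions, not vendor specifications.

def peak_bandwidth_gbps(bus_width_bits: int, transfer_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * (transfers per second)."""
    return bus_width_bits / 8 * transfer_rate_gtps

# A conventional DDR-style module exposes a narrow (e.g. 64-bit) interface,
# while a stacked HBM package can expose a very wide (e.g. 1,024-bit) one.
ddr_like = peak_bandwidth_gbps(bus_width_bits=64, transfer_rate_gtps=6.4)
hbm_like = peak_bandwidth_gbps(bus_width_bits=1024, transfer_rate_gtps=6.4)

print(f"DDR-like: {ddr_like:.0f} GB/s, HBM-like: {hbm_like:.0f} GB/s")
```

At the same transfer rate, the 16-times-wider interface yields 16 times the peak bandwidth, which is why stacking matters for data-hungry AI workloads.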
 
As Big Tech companies zealously upgrade their generative AI tools, which need to process massive amounts of data with less power, HBM chips are selling like hot cakes.
 
Samsung Electronics and SK hynix control more than 90 percent of the premium DRAM chip market. Both companies said their HBM supply capacity for next year is already fully spoken for. Talks for 2025 have already commenced.
 
HBM, which is on average five to six times pricier than normal DRAM chips, used to account for a single-digit share of the DRAM market, but its portion will likely jump.
 
Citigroup chip analyst Peter Lee expects the share to rise from 11 percent to 30 percent by 2027.
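The price premium explains why HBM's revenue share can run well ahead of its volume share. A back-of-the-envelope calculation, with assumed illustrative figures (not the analyst's model):

```python
# Illustrative: because HBM sells at roughly 5-6x the price of commodity
# DRAM, even a modest share of shipped bits translates into a much larger
# share of revenue. All figures below are assumptions for illustration.

def revenue_share(bit_share: float, price_premium: float) -> float:
    """Revenue share of HBM given its bit share and price multiple vs. commodity DRAM."""
    hbm_revenue = bit_share * price_premium
    commodity_revenue = (1 - bit_share) * 1.0  # commodity DRAM priced at 1.0
    return hbm_revenue / (hbm_revenue + commodity_revenue)

# e.g. just 5 percent of bits at an assumed 5.5x price premium:
print(f"{revenue_share(0.05, 5.5):.0%} of revenue")
```

In this toy case, 5 percent of bits already captures over a fifth of revenue, which is why a growing HBM mix moves the market so much.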
 
No longer a commodity
Memory chips have long been known as a general commodity that can be deployed in various applications without heavy sophistication or customization. For that reason, producers have had to rely strictly on economies of scale for profitability and have been greatly affected by market fluctuations.
 
That should not be the case anymore.
 
As HBM manufacturing becomes more complicated, technical differences between manufacturers start to kick in, making the memory more customizable.
 
SK hynix's HBM3E chip [SK HYNIX]


 
SK hynix is currently the No. 1 player in the HBM market, with a 50 percent share, according to TrendForce data.
 
SK hynix's HBM chips differentiate themselves through what is called mass reflow molded underfill (MR-MUF) technology, which uses liquefied fillers to bind and protect the circuits connecting each stacked wafer.
 
MR-MUF technology is known to deliver better energy efficiency than the more common non-conductive film (NCF) packaging, which places a film between the wafers that must later be melted. With NCF, delivering heat evenly to each wafer was tricky, and yield rates were low.
 
SK hynix will utilize MR-MUF technology for its latest HBM3E chips, which are yet to be mass-produced. The chipmaker began delivering samples to clients, including Nvidia, in August.
 
“Until now, memory chips, considered a commodity, were all about who could make them smaller and taller,” said SK hynix CEO Kwak No-jeong recently. “But as the AI era arrives, clients' services have diversified, and so have the specs demanded of memory chips. Memory chips are therefore evolving into customized products that are especially strong in certain aspects.”
 
Samsung Electronics chip production site in Pyeongtaek, Gyeonggi [JOONGANG PHOTO]


 
Samsung Electronics is building on its processing-in-memory technology, which integrates a logic unit inside a memory chip, raising energy efficiency because less data needs to move between the processor and the memory chip. Samsung was the first to introduce the approach, in 2021.
 
At a Hot Chips forum this year, the company revealed research indicating that its technology would more than double the energy efficiency and performance of an ordinary HBM. The research was done in collaboration with AMD; specifically, Samsung integrated its HBM-PIM with AMD's MI-100 GPU accelerator.
 
“HBM-PIM addressed the bottleneck in memory bandwidth by embedding data computation functions within DRAM, delivering performance improvements of up to 12-fold,” said Hwang Sang-joon, executive vice president of Samsung Electronics in charge of its DRAM business.
 
“Specific functions, such as voice recognition, became four times more power-efficient.”
 
Fresh partnerships coming
Samsung Electronics' contract manufacturing business is attracting new clients who are riding the AI hype.
 
Its Taylor, Texas fabrication plant, which is currently under construction, secured its first client — Groq, a Silicon Valley-based AI startup — earlier this year.
 
AI chips using Samsung's 4-nanometer technology will be fabricated at the plant by the end of next year. 
 
The company recently forged an additional partnership with Canada-based AI chip designer Tenstorrent, an up-and-coming firm led by semiconductor expert Jim Keller.
 
Samsung Electronics HBM3E chip [SAMSUNG ELECTRONICS]


Tenstorrent is working with AI chipmakers Rebellions and DeepX in Korea.
 
Samsung is leveraging its semiconductor prowess to operate both memory chip and contract manufacturing businesses.
 
Since early this year, Samsung has been offering what's called a Package Turn Key service, wherein it acts as a one-stop station that both contract manufactures processors and packages them with its own memory, including DRAM, to save time and money. The service is an effort to attract Big Tech companies seeking high-performance semiconductors to power their AI tools.
 
At the moment, Samsung is the only company that is able to manufacture logic chips and memory chips and package them at the same time.
 
AI chips on the national agenda
Given that AI and the advanced chips powering it require huge sums of investment and infrastructure, government support is key.
 
The Yoon Suk Yeol administration will invest 826.2 billion won ($631 million) with the hope of shaping an AI ecosystem in Korea with indigenous high-end AI chips by 2030.
 
“AI will exert a consequential impact on semiconductors, data and platform services and even security,” President Yoon said in September during an intergovernmental meeting on AI.
 
“Government support should prime the pump to boost corporate investment and innovation,” he said.
 
The government's central initiative is called the “K-Cloud Project.” Participants include AI chip designers, cloud companies, scholars and the Ministry of Science and ICT.
 
During the first stage of the project, some 100 billion won will be spent to develop a neural processing unit (NPU) for AI data centers by 2025.
 
The K-Cloud Project aims to set up data center clusters delivering 39.9 petaflops of AI computing power based on the new NPU — 19.95 petaflops each for the public and private sectors. One petaflop executes one quadrillion, or a thousand trillion, calculations per second.
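The arithmetic behind the target, taking one petaflop as 10^15 operations per second as stated above:

```python
# Sanity-checking the K-Cloud computing target. One petaflop is 10**15
# floating-point operations per second, and the 39.9-petaflop goal is
# split evenly between the public and private sectors.

PETA = 10**15  # operations per second in one petaflop

public_flops = 19.95 * PETA   # public-sector allocation
private_flops = 19.95 * PETA  # private-sector allocation
total_flops = public_flops + private_flops

print(f"{total_flops / PETA:.1f} petaflops in total")
```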
 
At stage two, the consortium plans to create a DRAM-based low-power processing-in-memory (PIM) chip by 2028. PIM chips integrate memory and processing to reduce latency and address the von Neumann bottleneck.
 
The final stage will upgrade the PIM chip based on nonvolatile memory and super-low energy consumption by 2030.
 
The global AI chip market is forecast to be $86.1 billion in size by 2026, growing 16 percent each year, according to market tracker Gartner.
 
 
Data centers joining the bandwagon
Another sector expected to receive a boost from AI technology is data centers. Demand for data centers and cloud computing, which rose during the Covid-19 pandemic, will expand further with generative AI, as companies need computing resources to train their own large language models. Experts forecast mid- to long-term growth as demand outpaces supply for data centers.
 
The three mobile carriers — SK Telecom, KT and LG Uplus — have a total of 31 data centers nationwide, 18 of which are situated in Seoul. According to a Yuanta Securities report, the trio accounts for 93 percent of total data center capacity.
 
“From 2028, the internet data center market will shift to a supplier-oriented one,” said Yuanta Securities analyst Lee Seung-woong. “Demand for data centers will likely increase by 1,088 megawatts by 2030, while supply will likely rise by only 609 megawatts over the same period. As it is difficult for supply to exceed demand in the near term, data centers will likely boost telecom companies’ profitability in the mid-to-long term.”
 
After buffing up their size within the country, SK Telecom and KT aim to export their data center infrastructure overseas, starting with Southeast Asia.
 
At a press event to introduce its AI strategy last month, SK Telecom CEO Ryu Young-sang said that the company plans to more than double the domestic capacity of its data centers — from 100 megawatts to 207 megawatts — and work with foreign cloud service providers to expand its infrastructure overseas.
 
KT’s cloud computing subsidiary KT Cloud secured 600 billion won from local private equity firm IMM Credit & Solutions in May to expand its cloud and data center infrastructure. Its annual revenue is projected to reach 2 trillion won by 2026.

BY JIN EUN-SOO, PARK EUN-JEE, LEE JAE-LIM [park.eunjee@joongang.co.kr]