[ANALYSIS] ChatGPT needs a lot of the memory Korean companies make


A businessman uses a laptop computer to chat with an artificial intelligence, asking for the answers he wants. [SHUTTERSTOCK]

 
With the rise of ChatGPT, chipmakers have been quick to jump on the bandwagon, claiming that the AI-powered chatbot and similar technologies will boost demand for high-end memory chips that had hitherto been relatively slow sellers.
 
Analysts remain cautious, citing the limited range of chips involved and the few practical uses found so far for the smart-seeming chatbots, even as the likes of Samsung Electronics and SK hynix spin the hype and fuel the hope.
 
With shares of Nvidia, the U.S. chip designer that supplies the main processors used to train ChatGPT, on a tear over the past few weeks, executives at Nvidia, Samsung Electronics and SK hynix have touted the ChatGPT frenzy as the next big thing for the chip industry.
 
SK hynix Vice Chairman and CEO Park Jung-ho likened the uptake of the advanced AI chatbot to the arrival of the iPhone.
 
Nvidia's A100 processors were used to train ChatGPT in data centers, and they are fitted with high-end memory chips called High Bandwidth Memory (HBM), supplied by SK hynix. HBM stacks layers of dynamic random access memory (DRAM) vertically to better support high-performance computing and costs about five times more than conventional DRAM.
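The bandwidth advantage of that stacked design comes largely from interface width. As a rough illustration, and not a figure from the article, the sketch below compares peak per-device bandwidth using publicly quoted specifications for an HBM2e stack (1,024-bit interface at roughly 3.2 Gbps per pin) against a standard DDR4-3200 module (64-bit interface at the same per-pin rate):

    def peak_bandwidth_gb_s(bus_width_bits, gbps_per_pin):
        # pins * per-pin data rate, divided by 8 bits per byte
        return bus_width_bits * gbps_per_pin / 8

    hbm2e = peak_bandwidth_gb_s(1024, 3.2)  # ~410 GB/s per stack
    ddr4 = peak_bandwidth_gb_s(64, 3.2)     # ~25.6 GB/s per module
    print(f"HBM2e: ~{hbm2e:.0f} GB/s, DDR4-3200: ~{ddr4:.1f} GB/s, "
          f"ratio ~{hbm2e / ddr4:.0f}x")

The roughly 16-fold gap from bus width alone is why stacked memory commands its price premium in high-performance computing.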
 
“ChatGPT used 10,000 Nvidia processors, and 32 gigabytes of HBM memory is installed per graphics processing unit,” said Jeong In-seong, a semiconductor specialist who authored "The Future of the Semiconductor Empire."
 
“When we convert the size of memory into that used in smartphones, it translates into the memory capacity of 100,000 to 200,000 smartphones, which represents a mere fraction given that over a billion smartphones are shipped annually,” he said.
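The conversion is simple arithmetic. Here is a minimal sketch of the back-of-envelope math, using the article's figures of 10,000 GPUs and 32 gigabytes of HBM per GPU; the per-phone DRAM capacities are illustrative assumptions, not figures from Jeong:

    gpus = 10_000                      # figure quoted in the article
    hbm_per_gpu_gb = 32                # figure quoted in the article
    total_gb = gpus * hbm_per_gpu_gb   # 320,000 GB, about 320 TB

    for phone_dram_gb in (2, 3):       # assumed DRAM per phone (illustrative)
        phones = total_gb / phone_dram_gb
        print(f"{phone_dram_gb} GB/phone -> ~{phones:,.0f} phones")

    # Prints ~160,000 and ~106,667 phones, consistent with the quoted
    # 100,000-to-200,000 range and tiny next to billion-plus annual shipments.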
 
Future orders are coming from competing tech companies around the world, but they are primarily limited to research and development efforts or beta services far smaller in scale than ChatGPT.  
 
If generative AI, the technology behind ChatGPT, is to become a major revenue stream for pricier memory chips, more tech companies will need to be assured of the commercial viability of the technology beyond its ability to excite some users, analysts say.
 
“The key issue is whether tech companies are willing to bear the high costs for running and maintaining the expensive infrastructure consisting of high-speed software, hardware and network systems,” said Lim Sang-kook, an analyst at KB Securities.  
 
“To resolve the issue, the AI services' ability to generate a profit will emerge as an important factor,” he said.  
 
HBM products accounted for less than 1 percent of DRAM demand in terms of bit growth, according to data released by market tracker DRAMeXchange in 2021, the latest available.
 
Greg Roh, head of the research center at Hyundai Motor Securities, said that the popularity of advanced AI systems could drive demand for HBM, but that its impact on the overall chip market remains to be seen.
 
“There is no doubt that demand for HBM will grow thanks to ChatGPT. But H100 chips take up a very small portion of data center servers, and HBM itself has a limited presence in the DRAM market. Therefore, investors should approach it from a mid-to-long-term perspective,” Roh said.
 
The H100 is expected to succeed the A100 in powering an updated version of ChatGPT as early as this year, with SK hynix's HBM3 packed in.
 
Despite mixed prospects, the world’s two largest memory chipmakers — Samsung Electronics and SK hynix — have focused on the high-end line of chips to ensure technical supremacy over competitors and respond to future demand.  
 
Samsung Electronics released its own HBM product, the HBM-PIM, last year, supplying it to AMD, a Santa Clara, California-based chip designer. Samsung Electronics claims it is the industry's first memory chip embedded with an AI processor, built on an architecture called processing-in-memory, which is designed to pair powerful AI computing capabilities with high-performance memory.
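Processing-in-memory is easiest to see as a data-movement question. The sketch below is a conceptual illustration of the general PIM idea, not Samsung's actual HBM-PIM design: when a simple operation such as a sum is executed next to the data, only the small result has to cross the memory bus, instead of the entire array.

    import numpy as np

    data = np.random.rand(1_000_000)  # values resident "in memory"

    # Conventional path: the processor reads every element, so bus
    # traffic scales with data size (8 bytes per float64 element here).
    conventional_traffic = data.nbytes            # ~8 MB crosses the bus

    # PIM-style path: the reduction runs inside the memory device and
    # only the scalar result crosses the bus.
    result = data.sum()                           # computed "near memory"
    pim_traffic = np.float64(result).nbytes       # 8 bytes cross the bus

    print(f"conventional: {conventional_traffic:,} bytes moved")
    print(f"pim-style:    {pim_traffic} bytes moved")

Cutting that data movement is what lets PIM designs promise both lower latency and lower power for memory-bound AI workloads.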
 
The chipmaker is also working with Naver to provide advanced memory chips for running heavy-traffic language models and ChatGPT-like services developed by Naver.  
 
Naver plans to roll out SearchGPT, a rival to ChatGPT, in the first half of this year.  
 
The range of memory products being used includes Samsung Electronics' latest solid state drives (SSDs), HBM-PIM and DRAM chips based on Compute Express Link (CXL) technology.
 
“The two companies are expected to create semiconductor solutions that address the actual needs of the end-users starting from the initial stages of development while enabling optimization at the system level,” Naver said in a statement.  
 
The two parties aim to reduce latency in processing a myriad of data simultaneously.  
 
“Through our collaboration with Naver, we will develop cutting-edge semiconductor solutions to solve the memory bottleneck in large-scale AI systems,” said Han Jin-man, Executive Vice President of Memory Global Sales & Marketing at Samsung Electronics.  
 
SK hynix reiterated its commitment to maintaining its level of investment in the manufacturing equipment and processes required for high-end memory this year, even though its overall capital expenditure will be roughly half of last year's 19 trillion won.

BY PARK EUN-JEE [park.eunjee@joongang.co.kr]