LG's hyperscale AI made its own hat

A pumpkin-shaped hat created by LG's AI system Exaone [LG]

LG has unveiled Exaone, a large AI system that the company claims surpasses any existing AI in its ability to process and analyze data.
 
Being large -- or hyperscale in industry jargon -- means the AI system is trained on a huge amount of data of various types, including images, text, speech and numerical data, allowing it to deliver more accurate results at a faster pace.
 
Comparing Exaone with existing AIs, LG said it is capable of creating an image of a hat in the shape of a pumpkin in response to the voice request “Make a pumpkin-shaped hat.” Existing AIs would instead try to find the closest match in a pool of given images rather than creating a new one.
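The distinction, in rough terms: a retrieval-based system searches an existing pool of images for the best match, while a generative system synthesizes a new image from the prompt. The toy Python sketch below illustrates only that difference; the function names and string stand-ins are hypothetical and say nothing about how Exaone is actually built.

    # Conceptual contrast only; these toy functions and strings are
    # hypothetical stand-ins, not Exaone's or any real library's API.

    image_pool = ["straw hat", "pumpkin pie", "baseball cap"]  # stand-ins for stored images

    def retrieve(prompt, pool):
        # Older approach: return the pool item sharing the most words with the prompt.
        def overlap(item):
            return len(set(prompt.split()) & set(item.split()))
        return max(pool, key=overlap)

    def generate(prompt):
        # Generative approach (greatly simplified): produce something new
        # from the prompt instead of searching a fixed pool.
        return f"<newly synthesized image of a {prompt}>"

    print(retrieve("pumpkin-shaped hat", image_pool))  # -> 'straw hat', the closest existing match
    print(generate("pumpkin-shaped hat"))              # -> a brand-new output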
  
At a conference Tuesday celebrating the first anniversary of LG's AI research center, a researcher showed a picture of the pumpkin-shaped hat Exaone created. Exaone deploys a total of 300 billion parameters, the most of any AI system in Korea, according to LG AI Research.
 
Parameters are the configuration variables an AI model adjusts as it learns; in general, the higher the number of parameters, the more sophisticated the system. Still, Exaone's parameter count falls short of those of bigger tech giants like Google. The Mountain View, California-based company is working on a trillion-parameter AI system.
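As a rough illustration of what a parameter count measures, the Python sketch below tallies the weights and biases of a toy fully connected network. The layer sizes are arbitrary; Exaone's 300 billion parameters come from a vastly larger architecture that is not described here.

    # Toy example only: counts the learnable parameters (weights and biases)
    # of a small fully connected network. The sizes are arbitrary and
    # unrelated to Exaone's architecture.

    def count_parameters(layer_sizes):
        total = 0
        for n_in, n_out in zip(layer_sizes, layer_sizes[1:]):
            total += n_in * n_out  # weight matrix between the two layers
            total += n_out         # bias vector for the output layer
        return total

    print(count_parameters([784, 512, 10]))  # 407,050 parameters for this toy network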
 
Exaone has been trained on 600 billion pieces of corpus data -- text and speech data used to train AI and machine learning systems -- and 250 million high-resolution images.
 
What distinguishes Exaone from other local AI programs is the adoption of GPT-3, one of the largest English language models, to process both Korean and English.  
 
The head of LG AI Research said the organization is committed to expanding partnerships with top-tier institutions.
“We will strengthen our research & development ties with major universities including the University of Toronto, University of Michigan, Seoul National University and KAIST,” said Bae Kyung-hoon, the AI think tank’s head.
 
The project is part of LG's ambition to become a big player in AI research.
 
LG AI Research announced in May that it would invest more than $100 million over a three-year period to establish computing infrastructure supporting its AI research.
 
The investment will go toward purchasing or leasing servers in large volumes dedicated to AI research. By the end of the three-year period, LG hopes its computing infrastructure will reach the capacity to process 9.7 quintillion calculations per second.
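For scale, 9.7 quintillion is 9.7 x 10^18 in the short scale, so -- assuming "calculations" refers to floating-point operations -- the target works out to roughly 9.7 exaFLOPS. A quick back-of-the-envelope check in Python:

    # Back-of-the-envelope conversion of the stated compute target.
    # Assumes "calculations per second" means floating-point operations per
    # second; 1 quintillion = 10**18 (short scale), and 1 exaFLOPS = 10**18 FLOPS.

    target_ops_per_second = 9.7e18
    exaflops = target_ops_per_second / 1e18
    print(f"Target capacity: {exaflops:.1f} exaFLOPS")  # -> Target capacity: 9.7 exaFLOPS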
 

BY PARK EUN-JEE [park.eunjee@joongang.co.kr]