In perception-driven Korea, even breakup texts are crafted with ChatGPT



[GETTY IMAGES]
[EXPLAINER]
 
Rather than agonizing over how to decline a boring date or strike the perfect tone in an email to her professor, a 25-year-old Korean student turns to an unexpected adviser: ChatGPT. 
 
The AI, she says, knows how to strike the perfect tone — polite but not stiff, sincere but not cringy — better than anything she could manage on her own.
 
The student, surnamed Lee, is far from alone. For a growing number of young Koreans, generative artificial intelligence is more than a tool for studying or upping productivity. 
 
It’s becoming an emotional ghostwriter, the kind of role once reserved for a trusted friend helping to draft love letters or breakup texts too hard to write alone. 
 
Turning to ChatGPT for relationship advice isn't a uniquely Korean phenomenon. People all around the world use generative AI to craft emails or texts, as a source of advice on life, relationships or work, or even as a faux counselor.
 
But many Koreans approach the chatbot with the specific cultural backdrop of nunchi — a form of emotional intelligence that involves reading the room, picking up on unspoken cues and being acutely aware of how others perceive you.
 
“Here, even a text message gets judged — grammar, tone, sincerity, politeness, everything,” Lee said. “It’s just easier to run things through ChatGPT.”
 
She’s used the AI to help her politely decline two recent blind dates. “These things matter,” she added. “People use your words to judge you — and sometimes to mock you. So more and more of us are turning to AI to make sure we get it right.”
 
Lee isn't the only one. In a society where emotional restraint is equated with maturity, and where blending in is often safer than speaking up, turning to AI for emotional expression feels, for many, like a natural evolution. 
 
 
A list of suggested responses from ChatGPT for politely turning down a blind date in Korean. [SCREEN CAPTURE]



When AI becomes the therapist
People judge, but AI doesn't. In fact, it may even be more empathetic than many human counterparts, according to a 2023 study by Zohar Elyoseph and his team at Max Stern Yezreel Valley College. Titled “The Emotional Awareness of an Artificial Intelligence Agent: A Comparison With Human Emotional Awareness,” the study used the Levels of Emotional Awareness Scale to evaluate ChatGPT's emotional intelligence.
 
The results were striking: ChatGPT outperformed the general population in identifying and articulating emotions across various scenarios. A follow-up test a month later showed even greater improvement, suggesting not only that the AI can deliver emotionally appropriate responses, but also that its performance may sharpen over time.
 
ChatGPT’s ability to emotionally engage has moved well beyond the lab. Around the world, it’s being used as a virtual confidant, therapist, relationship partner and counselor — especially among young users navigating emotionally complex situations.
 
Koreans have quickly become part of this wave. As ChatGPT’s Korean fluency improved, so did its appeal. Earlier versions like GPT-3.5 struggled with awkward phrasing and overly formal tone, limiting its use to basic translation or idea generation. But GPT-4 introduced more natural, conversational Korean, while GPT-4o, released in 2024, further advanced its grasp of cultural nuance, spoken tone, and personality-matching.
 
The leap in fluency has made the AI more than just a productivity tool — even native speakers now turn to it for help with sensitive communication.
 
Na, a 27-year-old office worker, turned to ChatGPT when trying to convince her conservative parents to let her go on a trip with her boyfriend, a request that clashed with the family’s longstanding rule: No overnight stays with the boyfriend until marriage.
 
“I explained the situation to ChatGPT and asked how I could sway them,” said Na. “I also described my personality, since I tend to be blunt and not very affectionate, so I asked for advice in a tone that sounded like me.”
 
Did it work? “Halfway,” she said with a shrug. “They’re strict, but at least my curfew got extended.”
 
Na didn’t stumble into using AI for personal matters on her own. Friends told her how they’d used ChatGPT to sort through difficult conversations and emotional dilemmas. She related to their reasons and, not wanting to constantly vent to people close to her, soon found herself doing the same — and found comfort in the AI's judgment-free replies.
 
“The AI can be anything I need — a consultant, a therapist, or maybe just an emotional trash can,” she said.
 
 
Numbers speak for themselves
OpenAI's ChatGPT now dominates Korea's AI app usage. In April alone, Koreans spent more than 2.75 billion minutes using the app across Android and iOS devices, accounting for 95 percent of all AI usage time, according to market tracker WiseappㆍRetailㆍGoods.
 
The demand for emotionally tuned writing has grown enough that Naver, Korea’s largest portal site, added a feature to its Clova X AI service last month that allows users to generate emotionally nuanced messages for real-life situations — apologies to a spouse, wedding toasts, condolence notes — all customizable by tone, length and style.
 
An apologetic text about coming home late from drinking, for instance, can be extended into a full, heartfelt letter with just a prompt tweak.
 
Meanwhile, domestic players like Wrtn Technologies and Zeta are helping popularize personalized AI companions for entertainment and emotional support. Grand View Research projected that Korea’s AI companion market would grow from $829.1 million in 2024 to $3.8 billion by 2030, at an annual rate of 28.9 percent.
 
 
But is it real communication?
For younger Koreans like Na and Lee, turning to AI for understanding and as a stand-in messenger is about turning to a low-risk, anonymous medium. AI offers fast, kind answers and guaranteed feedback, a combination that can feel safer and more rewarding than navigating messy human interactions.
 
But that ease comes with potential downsides. The constant need for validation can become addictive and hinder the kind of emotional growth that comes from real, unfiltered conversation, warns Prof. Heyeon Park of Dongduk Women’s University.
 
“In therapy, one of the most important tasks is helping a person recognize and name what they’re feeling,” Park said. “That doesn’t happen immediately — people often discover their true feelings through reflection. If you never do that work yourself and rely on AI to fill in the blanks, you miss out on that awareness-building process. Over time, it can lead to a dependency where you can’t express your own emotions without a digital intermediary.”
 
The concern isn’t just emotional stagnation. It’s a cultural shift toward surface-level relationships. As more young people grow hesitant to make new friends or engage in spontaneous conversations, they turn to AI for help, reinforcing the very insecurities that led them there in the first place.
 
Then comes the ultimate question: If you found out that a heartfelt love letter was, in fact, written by an AI, would you be disappointed? Or would you accept it as the new way of communication?
 
For Lee, the answer was simple.
 
“I don’t think I’d take it the wrong way,” she said. “If anything, I’d find it kind of endearing — seeing that the person went as far as using AI just to give me a proper response.”
 

BY LEE JAE-LIM [[email protected]]