Inside a developer’s crusade for a clean internet

Lee Jun-haeng, a web developer, styles himself as a crusader for a clean internet. He is the founder of two websites that critique sensationalist news coverage and has extensive knowledge of programs that can manipulate comments on social media. The interview took place at the JoongAng Ilbo Building on April 26. [KANG JUNG-HYUN]

In Korea, a polarized political climate and a highly wired society have given rise to a combative internet culture dominated by clickbait, heated arguments and nasty personal attacks. A recent scandal involving the ruling party and an alleged campaign to manipulate online opinion has added to concerns about unreliable information spreading on social media.

Amid the furor over fake news and social media bots, one software developer is out to bring some civility back to the internet. Lee Jun-haeng is the founder of two websites that are well known in Korea for critiquing news coverage and skewering online extremists.

Shock Croquette rates online news outlets according to the shock value of their headlines, and Ilganworst is a parody website that pokes fun at an infamous right-wing internet community called Ilbe. The name Ilganworst is a play on Ilbe, which is short for Ilganbest, or “best posts of the day.”

Lee styles himself as a crusader for a clean internet. The 33-year-old developer studied mass communication and journalism at Sungkonghoe University and went on to work at NCsoft, a major video game developer, and Naver, the country’s most popular search engine. In his spare time, he worked on his websites and blogged about internet security. In 2015, he became famous for publicizing the role of an Italian hacking team in helping the National Intelligence Service (NIS) spy on Korean citizens.

His familiarity with surveillance and manipulation tools made him a natural target of political operatives, who wanted his help bending public opinion in their favor.

Last month, police arrested a blogger affiliated with the Democratic Party on suspicions that he used software to fiddle with the comments section on Naver. The blogger, known online as Druking, allegedly used a computer program to inflate the number of “likes” on comments favorable toward Moon Jae-in when Moon was a presidential candidate last year. After Moon was elected, Druking allegedly turned on the president when his administration refused to grant patronage posts to some of his acquaintances.

Because of Lee’s familiarity with such software and techniques, the JoongAng Ilbo sought his thoughts on the recent scandal involving Druking and the Democratic Party. He discussed a wide range of topics related to internet security, culture and online communities. Below are edited excerpts from the interview.

Q. You must have become familiar with the world of online communities while running Ilganworst. Tell us more about that.

A. Familiar is an understatement. Exhausted is more like it. After Ilganworst went online in December 2013, it immediately became a target of trolling by Ilbe users. An explosion of internet traffic from these attacks led to our servers crashing multiple times. On one occasion, Ilganworst received over 10 million visitors a day.

But all of that traffic is just data, not actual public opinion. In some online games, users run external programs to automatically generate in-game items, which they then sell for real money. In response, game companies like NCsoft have looked for ways to crack down on such manipulation. Online comments can be dealt with in a similar way. It would be impossible to respond to such comments if they reflected real public opinion rather than mere data.

Tell us about the involvement of Namuwiki (a Korean version of Wikipedia) in the recent Druking scandal.

In 2015, a Namuwiki article falsely described me as the founder of a pro-Megalia website (Megalia is an online community of radical feminists in Korea). I finally received an apology in 2017 from the author, who turned out to be 14 years old. So at the time, a 12-year-old schoolkid was capable of tarnishing someone's reputation online. Let's recall how a number of celebrities took their own lives in the past because of negative online comments; such posts can be made by a few people, even one individual. It's easy for a few people to flood the comments section on Naver or online forums and conceal true public opinion.

On social media, you’ve mentioned that numerous political operatives have sought your help in manipulating online opinion.

That was unrelated to the recent scandal and occurred between 2013 and 2014, during the previous administration. I received proposals from both the left and the right. I met the head of a right-wing website and an associate of the Democratic Party who claimed to be responsible for the party's online activities. The conservative later joined the Blue House under President Park Geun-hye, and the Democratic associate is now running for a district chief position and sometimes appears on television. People like them are often prone to self-aggrandizement, so no one knows how much of what they say is true. Another political operative, close to the Seoul mayoral candidate Ahn Cheol-soo, claimed he could easily mess with the list of top keywords on major search engines, but I don't know how true that is.

When they sought my help, the right-wing media executive brought up his extensive financial ties and the Democratic associate offered to support my business ventures in exchange for my help. The latter said he would gather an army of left-wing online supporters on par with Ilbe’s numbers and went around meeting organizers of progressive online communities like Oyu and Clien.

At that time, Moon Jae-in’s and Ahn Cheol-soo’s parties were about to merge. The Democratic associate asked me whose side I was on and to help his side win. I told him I was on nobody’s side and turned down the offer.

At that time, the Democratic Party was in the opposition and strongly condemned the National Intelligence Service’s involvement in the 2012 presidential election. Yet it appears both the left and right were engaging in similar political activities online.

It looks like it. I knew the portal sites were vulnerable long before the NIS scandal. Back in 2005 when I was a college student, I was charged with inciting a demonstration and put under watch by the NIS. People from the NIS occasionally contacted me afterward.

Once, an NIS official told me he was able to pull down a news article that was trending on Daum (a major portal site), and it happened while we were still on the phone. (Lee says he didn’t know whether the NIS put pressure on Daum or ran a computer program to do it. He says he just saw it happen.)

The police announced that Druking used a powerful piece of software called King Crab to manipulate data. It must have cost a lot of money. What are your thoughts?

It’s just another macroprogram. The cost is usually determined by the developer’s experience, income and the amount of time it took to create, but a program like this is not too difficult to make. I once developed a similar program in about three hours after it was commissioned. It’s even easy to acquire a program that can automatically detect changes in a site’s algorithms and respond accordingly. I don’t know why the police characterized the program in those terms.

So a simple tool like this can’t be fended off by Naver? Are they unable to stop it, or just unwilling?

It’s easy to figure out what has been manipulated by macroprograms and what articles are being tampered with. Naver is definitely monitoring such activities.

But stopping it is another matter. If they intervened directly, users would inevitably protest, so they have to respond through algorithms.
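In principle, macro-driven "likes" are easy to spot, as Lee says: automated likes tend to arrive in bursts far denser than organic activity. Below is a minimal sketch of such a detector, assuming timestamped like events per comment; the window and threshold values are illustrative assumptions, not Naver's actual parameters.

```python
from datetime import datetime, timedelta

def looks_automated(like_times, window=timedelta(seconds=60), threshold=30):
    """Flag a comment whose likes cluster too densely to be organic.

    like_times: sorted list of datetimes, one per 'like'.
    Returns True if any sliding window of `window` length
    contains at least `threshold` likes.
    """
    start = 0
    for end in range(len(like_times)):
        while like_times[end] - like_times[start] > window:
            start += 1
        if end - start + 1 >= threshold:
            return True
    return False

# Organic pattern: 30 likes spread over an hour.
organic = [datetime(2018, 4, 1, 12, 0) + timedelta(minutes=2 * i) for i in range(30)]
# Macro pattern: 40 likes in under a minute.
macro = [datetime(2018, 4, 1, 12, 0) + timedelta(seconds=i) for i in range(40)]

print(looks_automated(organic))  # False
print(looks_automated(macro))    # True
```

A real system would add per-account and per-IP signals, but even this rate check illustrates why Lee says the manipulation itself is easy to see.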

Ten years ago, online comments were displayed by recency, so all the manipulators had to do was flood the comments section with new posts. The online jargon "regen" (short for regeneration) refers to this process. Through regen, certain comments are displayed more prominently to users, making it seem like they reflect the majority opinion.

When portal websites changed their systems to display comments based on likes, manipulators found another way to tamper with them. The result has only deepened online disputes. As long as there are people willing to rig online opinion, preventing such manipulation is difficult. It’s the same in the United States.

It’s true that Naver has failed to put in even a minimal effort to deal with this problem. Not only does their profit model dissuade them from doing so, it also seems like they’ve given up, knowing they can’t really do anything about it.

In the layman’s eyes, it just seems like complacency. What do you think?

For Naver, and any other IT company for that matter, time is money. Any delay in the system is perceived as financial loss. Let’s say they were able to filter out articles tampered with by macroprograms, clean up keyword rankings and correct view counts. That takes time. The company regards delaying the system for even a minute, even for the sake of transparency, as a tremendous loss.

Would they do so willingly? I highly doubt it. For them, business comes first. Naver benefits as long as their pages get more clicks. If they had journalistic ethics and a sense of governance, they would pay more attention to the morality of the issue. But the reality is different, and at the end of the day, Naver is just another private company.

If Naver is monitoring this activity, did they request a police investigation based on their knowledge that a certain organization was manipulating their system?

Looks like it. But it seems like they thought that conservative groups were behind the manipulation. If they knew that the culprits were associated with the Democratic Party, I doubt they would have informed the police.

Many developers claim they finally understand how macroprograms were used by political groups after the Druking scandal came to light. In reality, macroprograms were used long before this scandal: by online tutors to undercut each other, by plastic surgery advertisers and by music label marketers. But those viral marketing cases involved financial transactions and are different from political opinion manipulation. Naver never would have imagined that the initially pro-Moon Druking would later attack his administration. They miscalculated that aspect.

At a National Assembly hearing, Naver founder Lee Hae-jin claimed he did not know much about the manipulation of “likes” and talked about using artificial intelligence to deal with comments.

Automation means nothing. Even if a machine determines the ranking of comments and how they are displayed, a developer can override it by changing the values in the database. Developers laughed at Lee Hae-jin’s answer because he clearly knows the truth and is pretending not to know.
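Lee's point about overriding automated ranking can be sketched with a toy database: whatever score a machine writes, anyone with write access can simply overwrite it afterward. The table and column names here are hypothetical, invented for illustration.

```python
import sqlite3

# A toy comments table where an automated ranker has written a score.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE comments (id INTEGER PRIMARY KEY, text TEXT, ai_rank REAL)")
conn.executemany(
    "INSERT INTO comments VALUES (?, ?, ?)",
    [(1, "Critical comment", 0.95), (2, "Friendly comment", 0.10)],
)

# A developer with database access simply overwrites the machine's verdict.
conn.execute("UPDATE comments SET ai_rank = 0.01 WHERE id = 1")
conn.execute("UPDATE comments SET ai_rank = 0.99 WHERE id = 2")

top = conn.execute(
    "SELECT text FROM comments ORDER BY ai_rank DESC LIMIT 1"
).fetchone()
print(top[0])  # Friendly comment
```

The AI never ran again; the final ordering reflects the manual `UPDATE`, which is why automation alone guarantees nothing.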

Is there anything we can do about the politicization of online comments?

Naver definitely should take responsibility for setting up a system in which all news is consumed within its website and people can post comments underneath. But when it comes down to it, there will always be people who attempt to manipulate public opinion. Everyone knows that you can even mess with the Blue House’s online petition system. It can’t just be blamed on Naver and the internet. Even in the days before the internet, people rigged music and book charts by buying albums and books in bulk, so we shouldn’t stress ourselves trying to clean up a system that can’t really be fixed in the name of discovering true public opinion.

We should revisit the actual posts that seem to bother us so much. Do we really need a comments section? Major American news organizations like National Public Radio also struggled with this problem and eventually got rid of comments altogether. Comments on portal sites do not reflect public opinion. They’re closer to drunken rambling at a pub. Why should we suffer because of them?

There were plenty of thoughtful comments written on sticky notes at the memorial for the Gangnam Station murder victim in 2016. Plenty of platforms exist that do not resemble the trash bins that online comments sections have become. If online comments are to be left as they are, then we should at least educate people that these posts reflect not actual public opinion but infighting between certain factions. Then maybe people will ignore negative comments rather than be hurt by them, reducing the incentive to manipulate.

From the perspective of a developer, is there a difference between online opinion manipulation and digital marketing, between what is legal and illegal?

There is no real difference. The programs you use may vary according to the platform, but the principle remains the same. There is always a degree of manipulation in any corner of the internet. We know how online movie ratings are tampered with, but because what happens online affects the real world, there will always be people who stake their lives on it.
