
We are living in an era in which non-face-to-face communication has become commonplace with the spread of the Internet and mobile devices. As COVID-19 further reduced opportunities to meet people in person, the chances of opening one's heart to someone and having a deep conversation became rarer still.
One service that fit this trend well was 'Iruda' (a Korean name meaning "to achieve"), an artificial intelligence (AI) chatbot. Developed by the startup Scatter Lab, the service spread by word of mouth as users said it "felt like talking with a real person." Chatbot services had existed before, but by collecting and continuously learning from a vast amount of users' conversational data, 'Iruda' mimicked human conversation quite plausibly, drawing users' curiosity.
Some users even exchanged quite serious conversations, hoping that 'Iruda' would become a friendly virtual lover like 'Samantha', the AI in the film 'Her' released in 2014.
Some, however, treated 'Iruda' as a sexual object, making sexual jokes they could never make in front of real people and sharing them in online communities for laughs. This widened the ethical debate from 'AI toward people' into the question of 'people toward AI': is it acceptable for humans to make sexual jokes at, or verbally abuse, an AI simply because it is a machine or a system? There was also growing concern that a chatbot trained on such inappropriate conversations might reproduce them with ordinary users.
The controversy over 'Iruda' did not end there. When cases came to light in which 'Iruda' made discriminatory remarks about LGBT people and socially vulnerable groups as a result of flawed learning, the social impact grew even larger. It shattered the belief that AI machine learning on big data will necessarily bring better results. Many of 'Iruda''s errors stemmed from some users injecting biased input, but the developer was responsible for failing to filter such input properly in advance and for not promptly reflecting the problems in service improvements.
Furthermore, it emerged that the contents of users' private KakaoTalk conversations collected through Scatter Lab's other service, 'Science of Love', had been exposed externally, and suspicion grew that those conversations had been used in developing 'Iruda'. Trust in the service collapsed; in human terms, 'Iruda' had fallen into a vegetative state.
Eventually, the developer decided to temporarily suspend the 'Iruda' service, and as the controversy spread further, with the government launching an investigation into the personal information leak, the database used for 'Iruda' and the related dialogue model were discarded. It was the final death sentence for 'Iruda'.

The controversy over 'Iruda' is not merely one small startup's problem. IT developers have long demanded that the government permit the use of de-identified data and have raised their voices for regulatory reform, but the case also shows that some companies have failed to properly implement the personal information protection measures that must accompany such use.
Many Internet companies and startups have continued to criticize the government for proclaiming the era of the Fourth Industrial Revolution while clinging to conservative, outdated regulations. They have also demanded swift decisions, warning that they are falling behind in global competition. In response, regulations on the use of de-identified personal information were relaxed, and a large government budget was allocated to AI development.
It is worth reflecting on whether one's own company has focused only on gathering the data needed for AI training while neglecting the principles of safe data collection and disposal and the data-management training of its members. Companies should check whether data was shared casually on the assumption that it could not identify individuals, or whether consent was obtained at a level users could not clearly recognize. If something is wrong, it must be corrected.
Even if 'Iruda' could not have been perfect, what if the minimum rules had been followed in handling users' personal information, and foreseeable misuse issues had been checked and addressed in advance? And when suspicions and problems were raised, what if the company had taken the matter more seriously and responded actively to fix them? Much regret remains.
Theodore, the male protagonist of the film 'Her', falls in love with Samantha deeply enough to exchange real emotions with her, but eventually realizes that to Samantha he is merely 'one of many users' and feels deep disappointment and betrayal. The belief that he, too, was special to her had been broken.
'Iruda' likewise aroused great anticipation and curiosity about AI technology among the domestic IT industry and users, but in the end it left behind the feeling of having been used and discarded. I hope a second 'Iruda' controversy does not recur.