
‘Iruda’, an artificial intelligence (AI) chatbot launched with the promise “I’ll be your friend,” is now being glared at as a “personal information thief.” Critics charge that Scatter Lab, Iruda’s developer, misused personal information collected through its existing services to build the chatbot.
Based on the facts revealed so far, Scatter Lab mishandled users’ personal information in two main areas.
First, a sample of user conversation data collected by ‘Science of Love’, Scatter Lab’s KakaoTalk conversation analysis service, was released as open source on GitHub, a source code repository, without personal information being sufficiently filtered out. As a result, specific individuals’ real names, place names, and disease information were exposed without being redacted.
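The filtering step that was missing here can be illustrated with a minimal redaction pass over conversation text before release. This is an illustrative sketch only, not Scatter Lab’s actual pipeline; the patterns and name list are assumptions for the example.

```python
import re

# Hypothetical pattern for Korean-style phone numbers (illustrative only).
PHONE_RE = re.compile(r"\b\d{2,3}-\d{3,4}-\d{4}\b")

def redact(text: str, known_names: list[str]) -> str:
    """Replace phone numbers and listed real names with placeholders
    before conversation samples are published."""
    text = PHONE_RE.sub("[PHONE]", text)
    for name in known_names:
        text = text.replace(name, "[NAME]")
    return text

# Example with a made-up message and name:
sample = "Call Kim Minsu at 010-1234-5678 about the appointment."
print(redact(sample, ["Kim Minsu"]))
# → Call [NAME] at [PHONE] about the appointment.
```

Real-world redaction would need far broader coverage (addresses, account numbers, named-entity recognition for names not on a list), which is exactly why releasing raw conversation samples without such a pass exposed identifiable details.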

It is also a problem that data collected from Science of Love users without proper consent was used to train Iruda. Scatter Lab countered that its terms of use and privacy policy state that “collected message information may be used for new service development, marketing, and advertising,” and that it obtained user consent on this point.
However, the Personal Information Protection Commission indicated that this may nevertheless be illegal. It explained that using the data for Iruda would require separate terms specifying the purpose of use, the types of information to be used, and the period of use, along with a corresponding privacy policy and the consent of Science of Love users.

Civic groups that have voiced critical opinions on the use of personal information have seized on this incident. Organizations such as People’s Solidarity for Participatory Democracy and the Korean Progressive Network Center (Jinbonet) have long warned of the adverse effects of the ‘Three Data Acts’, which allow personal information to be used industrially once it has been de-identified, arguing that such practices cannot be left unchecked. The Iruda incident appears to have given them evidence supporting those claims.
Until now, the government has emphasized the development of AI technology as a foundation for future industries, insisting that the active use of personal information, that is, data collected in various fields, is essential to creating advanced AI. For this reason, it has repeatedly promised to balance industrial innovation with privacy for citizens wary of how their personal information is used, and it was in the process of establishing a system for the safe use of personal information in consultation with industry, civic groups, and academia. The Iruda incident poured cold water on that process, which is why even fellow IT ventures have sharply criticized Scatter Lab.
If such accidents occur before the public can fully enjoy the benefits of advanced AI and data-based services, it will be difficult to build public trust in the data economy. And as it becomes harder to accumulate training data, it becomes harder to strengthen national competitiveness in the AI field. The incident cannot be dismissed as a small startup’s mistake; it is a matter of social trust.
To prevent similar incidents from recurring, and so that this episode can later be recalled as trial and error in the early stages of activating the data economy, the government’s follow-up is crucial. First, it must hold Scatter Lab firmly responsible for any misconduct. Sanctions are needed that make clear users’ personal information must not be taken lightly.
This case also reveals a structural problem: it is difficult for small businesses to fully understand and apply the Personal Information Protection Act. For venture companies to keep emerging actively in data-based innovation services, without problems in either privacy or data utilization, preventive measures are needed. These could include detailed regulations on providing user information to third parties or using it for other purposes, along with careful supervision and consulting to ensure that small businesses comply with the Personal Information Protection Act.