Did AI chatbot 'Lee Luda' illegally use personal information? The government investigates

Published 2021.01.11 18:35 | Revised 2021.01.11 19:08



AI chatbot 'Lee Luda'. / Scatter Lab

The government will investigate whether personal information was mishandled in connection with the artificial intelligence (AI) chatbot 'Lee Luda'. Lee Luda was built from data collected from users of 'Science of Love', an earlier app by the same developer, Scatter Lab, and the investigation will examine whether personal information was used illegally in that process.

According to the IT industry on the 11th, the Personal Information Protection Commission plans to check whether the AI startup Scatter Lab violated privacy laws such as the Personal Information Protection Act. The commission intends to ask Scatter Lab to submit data to establish the facts and to conduct an on-site investigation if necessary. The personal information incident investigation team of the Korea Internet & Security Agency (KISA) is also reported to be joining the investigation.

Scatter Lab developed Lee Luda from roughly 10 billion KakaoTalk messages and launched it in December of last year. The KakaoTalk conversations in question came from users of Science of Love, a dating-analysis app launched in 2016. Science of Love is a paid service: when users submit their KakaoTalk conversation logs, it analyzes the conversation patterns and returns an 'affection level' score.

Science of Love users say they did not know the data they handed over would be used for Lee Luda. They were initially told it would be used for 'new service development' and did not expect it to end up in a chatbot. There have also been cases in which the chatbot abruptly uttered a specific person's real name, home address, or bank account number. Affected users have opened a group chat room to gather evidence for a class-action lawsuit.
