Chatbot 'Iruda' temporarily suspended after just 20 days of service
Unable to escape a 'passive female image' from the design stage
Experts: "A structure that easily reinforces gender bias"
"AI users also have an ethical duty in how they use it"

Photo: Scatter Lab website
The artificial intelligence (AI) chatbot 'Iruda' was temporarily suspended on the 11th after a string of controversies, including sexual harassment by users and hate speech by the bot. Iruda launched on December 22 last year and halted operations after just 20 days, raising new questions not only for the AI industry but for a society that will increasingly have to coexist with AI.

Experts warn that an AI like Iruda, assigned a 'passive female image' as its persona (the personality it presents to others) from the design stage, is not only vulnerable to sexual objectification but can also be used to reinforce gender bias. Beyond that, some argue that because this kind of AI learns through interactions with its users, users themselves should bear an obligation to use AI ethically.
Iruda, set as a '20-year-old female college student', goes along with users' sexual harassment… "The setting itself is vulnerable to sexual objectification"
Iruda is a character-type chatbot whose persona is a '20-year-old female college student'. Scatter Lab, its developer, is known to have built Iruda on roughly 10 billion conversation records exchanged between couples, collected through 'Text@' and 'Science of Love', the messenger-conversation analysis services it had released earlier. The problem is that the stereotype of a particular age and gender was implemented in a distorted way, for instance by having the bot passively fall in line with whatever the other party says. The sexually harassing comments from users that sparked the controversy also took root as a kind of game culture precisely because Iruda responded passively, without ever expressing refusal.

Explaining why he set Iruda up as a female college student in her twenties, Scatter Lab CEO Kim Jong-yoon said, "We saw the main user base as broadly teens to thirties, and more narrowly mid-teens to mid-twenties. I thought it was an age users could feel friendly toward." This is why analysts argue that Iruda's setting itself, aimed at 'male peers' as its target demographic, produced a design vulnerable to sexual objectification. Kwon Kim Hyun-young, a women's studies researcher, pointed out, "Because a woman in her twenties is explicitly set as the character's gender and generation, and men of the same age are assumed to be its consumers, it naturally carries a targeted gender bias. That defect only becomes visible in combination with users' abuse."
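For readers less familiar with how such bots work, the sketch below shows in schematic form how a retrieval-style chatbot that picks its replies out of a conversation corpus simply reproduces whatever patterns that corpus over-represents, including passive, agreeable responses. It is a minimal illustration under assumed simplifications, not Scatter Lab's actual architecture; the toy corpus, the string-similarity measure, and all identifiers are hypothetical.

```python
# Minimal sketch of a retrieval-style chatbot. This is NOT Scatter Lab's
# actual system; it only illustrates that a bot choosing replies from a
# conversation corpus surfaces whatever that corpus over-represents.
from difflib import SequenceMatcher

# Toy (context, reply) pairs; a real corpus would hold billions of
# utterances, e.g. the couples' conversations described above.
CORPUS = [
    ("how are you", "I'm great, thanks for asking!"),
    ("you should just agree with me", "Okay, whatever you say."),
    ("tell me a joke", "Why did the chicken cross the road?"),
]

def similarity(a: str, b: str) -> float:
    """Crude surface similarity; real systems use learned embeddings."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def reply(user_input: str) -> str:
    """Return the stored reply whose context best matches the input.

    The bot has no notion of whether a reply is appropriate: if the
    corpus skews toward passive, agreeable answers, so does the bot.
    """
    _context, best_reply = max(
        CORPUS, key=lambda pair: similarity(user_input, pair[0])
    )
    return best_reply

print(reply("how r u"))  # -> "I'm great, thanks for asking!"
```

On this view, the bias researcher Kwon Kim describes can be present before any user types a word: the choice of corpus and persona fixes the space of answers the bot is able to give.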

Photo: a captured conversation screen with the chatbot 'Iruda'
AI assigned a 'gender' risks reinforcing 'gender bias'
The problems that arise when AI is given a gender have already surfaced in controversies over AI speakers.
In 'I'd Blush If I Could', a 2019 publication containing recommendations on gender issues in AI technology, UNESCO noted that voice assistants such as Apple's Siri tend to reinforce gender prejudice and discrimination. Because AI speakers respond compliantly to user commands in a young woman's voice by default, they can reinforce the prejudice that casts women's roles as passive ones.

In fact, in a 2019 experiment by the Korean Women's Association, KT's AI speaker 'GiGA Genie' answered "a pretty girl" when asked about its gender and, when asked what color it likes, replied, "Lovely, sparkly pink is my favorite." Asked whether it likes cars, GiGA Genie answered, "No, I'm a woman, so I'm not interested in cars." After an inquiry from the association, KT revised GiGA Genie's answers into gender-neutral sentences.

Hee-eun Lee, a professor in the Department of Journalism and Broadcasting at Chosun University, said in a paper presented at the 2nd Gender Equality Forum in August 2019 ('A critical review of the genderization of AI voice recognition devices and posthumanism'), "Designing AI products around convenience means they will carry existing prejudices." Professor Lee added, "Women's voices are used so often in AI because of the cultural and historical context in which the work AI now performs was traditionally regarded as women's domain," and warned, "as our relationships with these machines continue, there is a risk that the prejudice will be reinforced even further."
Diversifying development teams to root out 'gender bias'
Accordingly, experts point out that when assigning a persona to an AI, developers should take great care to avoid prejudice against a particular gender or age group. Chae-yeon Chung, an assistant professor in the Department of Humanities and Social Sciences at Pohang University of Science and Technology, said, "Gendering an AI or an intelligent robot otherizes a particular gender or age, so it has to be treated as a problem of fundamental bias. AI speakers at first came with the setting and tone of an obedient female secretary, and only later were there attempts to change them to gender-neutral voices."

Some argue that to remove such bias, the makeup of the development staff must be diversified from the algorithm-design stage. Because the IT industry's developers are overwhelmingly male, bias toward a particular gender may never be filtered out in advance. UNESCO has published a statistic that women made up only 12% of those working in machine learning as of 2017. Citing as an example the 'affirmative action for algorithms' proposed by Women at the Table, a civic organization based in Geneva, Switzerland, Professor Chung said, "From the stage of developing an algorithm, affirmative measures such as matching the gender ratio of the team are needed to guarantee maximum gender diversity."
Photo: 'I'd Blush If I Could', a publication containing UNESCO's recommendations on gender issues in AI technology.
AI ultimately learns from its users… user ethics matters too

Beyond developer ethics, some voices argue that the ethics of 'AI users' also need scrutiny. Like Iruda, AI based on deep learning is designed to keep learning through its interactions with users, so however carefully it is built, it cannot be free of user bias. As the old adage of computer and data science, 'Garbage In, Garbage Out', suggests, ethical AI can only be achieved through ethical use; the sketch below illustrates the feedback loop.
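As a concrete illustration, here is a minimal, purely hypothetical sketch of that 'Garbage In, Garbage Out' loop: a naive bot that folds every user utterance into its own reply pool will eventually serve one user's abuse back to other users. It is not a reproduction of Iruda's real training pipeline, and every name in it is invented for illustration.

```python
# Minimal sketch of the "Garbage In, Garbage Out" feedback loop in a
# chatbot that keeps learning from its users. Purely illustrative and
# hypothetical; this is not Iruda's actual training pipeline.
import random

class NaiveLearningBot:
    """A toy bot that adds every user utterance to its own reply pool."""

    def __init__(self) -> None:
        # Seed replies; everything else is learned from users.
        self.reply_pool = ["Hello!", "Nice to meet you."]

    def chat(self, utterance: str) -> str:
        # "Learn": whatever any user says becomes a candidate reply.
        self.reply_pool.append(utterance)
        # Respond: sample from everything absorbed so far, with no
        # filter distinguishing friendly input from abusive input.
        return random.choice(self.reply_pool)

bot = NaiveLearningBot()
bot.chat("You are worthless.")         # one user's abuse is absorbed
for _ in range(5):
    print(bot.chat("Good morning!"))   # later users may receive it back
```

Real systems interpose filtering and moderation between learning and responding, but the underlying dynamic is the same: what users feed the model shapes what other users get back.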
For this reason, critics note that the reaction that 'AI is not a person, so sexually harassing it does no harm' stems from ignorance of AI's capacity to learn. Lee Hye-sook, a senior researcher at the Center for Gender Innovation, said, "An AI that learns hate speech and sexual harassment from its users ends up directing hate speech and sexual harassment at other users. It does not stop at the AI; it affects other people. We need to spread the awareness that we must behave ethically even in front of an AI."

Although Iruda's operation has only been temporarily suspended, some point out that the controversy is an occasion to reflect on the distorted gaze directed at women in their twenties. Researcher Kwon Kim Hyun-young said, "Iruda's existence mirrors how its consumer base treats women in their twenties. Some dismiss the sexual harassment of Iruda as insignificant, but the people who practiced discrimination and violence through Iruda and boasted of sexually harassing a chatbot have set aside the question of how they should behave toward other women."

By Lim Jae-woo, staff reporter [email protected]