They used real chats without permission to train a chatbot, then pulled it for hateful messages and leaking personal data

Science of Love is an app developed by Scatter Lab that lets young South Koreans, for a small fee, evaluate (in theory, at least) the affection their partners show them, by analyzing transcripts of their conversations on KakaoTalk, a popular messaging app in their country.

Nothing unusual so far… except that the data of the 100,000 people who downloaded it was later used by Scatter Lab to train, through deep learning, a chatbot known as ‘Lee Luda’, who played the part of a twenty-year-old South Korean K-pop fan.


Lee Luda was, by and large, very successful at engaging users thanks to its realism: between the last week of 2020 and the first week of 2021, 750,000 people talked to ‘her’ on Facebook Messenger.

The problem is that, from time to time, users received troubling replies. We are not (only) talking about messages of a sexual nature, but also insults aimed at lesbians, transgender people, Black people, women with disabilities, and menstruation.

No, it’s not like the Microsoft chatbot case

You may think this is nothing new: after all, we all remember the famous case of Tay, the Microsoft chatbot that also had to be withdrawn from circulation for similar reasons (hateful messages, inappropriate language, etc.).

The problem is that Tay was ‘trained’ directly by users who knew they were interacting with a chatbot (and who, in many cases, were openly trying to derail Microsoft’s experiment). Lee Luda was trained on real conversations between real people, who had nothing to do with the application their data would end up feeding.

In the words of Namkoong Whon, CEO of Kakao Games:

“The one that needs to reflect after this incident is not the artificial intelligence. It’s not as if Lee Luda said anything that doesn’t already exist in society: she is simply a character trained on conversations between people in their teens and twenties.”

A major personal data leak

Of course, there is a second problem: the data of those affected was used without their permission.

To make matters worse, the conversations were not anonymized, and the chatbot in some cases responded to pointed questions from users with real personal data (bank accounts, postal addresses, telephone numbers).
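Anonymizing the transcripts before feeding them to the model would, at a minimum, have meant scrubbing exactly the kind of identifiers that ended up leaking. The snippet below is a minimal sketch of such a preprocessing step, assuming a simple regex-based approach; the patterns, the `redact` function and the placeholder tags are illustrative assumptions, not Scatter Lab's actual pipeline:

```python
import re

# Hypothetical redaction patterns; a real pipeline would need locale-aware,
# far more exhaustive rules (names, free-form addresses, account formats, etc.).
PII_PATTERNS = {
    "PHONE": re.compile(r"\b01[016789]-?\d{3,4}-?\d{4}\b"),   # South Korean mobile numbers
    "EMAIL": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
    "ACCOUNT": re.compile(r"\b\d{3}-?\d{2,6}-?\d{2,6}\b"),    # naive bank-account-like strings
}

def redact(message: str) -> str:
    """Replace anything matching a PII pattern with a placeholder tag."""
    for tag, pattern in PII_PATTERNS.items():
        message = pattern.sub(f"[{tag}]", message)
    return message

if __name__ == "__main__":
    sample = "My number is 010-1234-5678, send it to kim@example.com"
    print(redact(sample))
    # -> "My number is [PHONE], send it to [EMAIL]"
```

Even then, regex scrubbing is only a first pass: names, nicknames and free-form addresses generally require NER-style filtering, which is why merely stripping obvious identifiers is usually not considered sufficient anonymization for conversational training data.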

On top of that, it was discovered that the AI's training data had also been published in a repository on GitHub, further increasing the overall exposure of the data.

Scatter Lab is now, unsurprisingly, embroiled in legal proceedings for violating its users' personal data. Not to mention that the negative reviews of recent weeks have sunk the app's rating on the Google Play Store.

Via | The Register
