Twitter's cropping tool decided to show a picture of Mitch McConnell instead of Barack Obama. Pic: @bascule

A "horrific experiment" appears to show that Twitter's crop tool is racially biased

Twitter has launched an investigation after users claimed that its image-cropping feature favored white people's faces.

An automated tool in the social network's mobile app crops images so they fit on the screen – and selects which parts of an image to cut out.

But a test by a programmer appeared to show racial bias.

Image: Twitter has promised to investigate

To see which face Twitter's algorithm would choose, Tony Arcieri posted a tall image featuring a headshot of Senate Republican leader Mitch McConnell above a headshot of former US President Barack Obama – separated by white space.

In the second image, Mr Obama's headshot was placed above Mr McConnell's.

Both times, the former president was cropped out completely.

Following the "horrific experiment" – which came after another user noticed a black colleague's face being cropped out of a picture he posted – Mr Arcieri wrote: "Twitter is one example of how racism can manifest itself in machine learning algorithms."

At the time of writing, his test had been retweeted 78,000 times.

Twitter has vowed to look into the issue, but said in a statement: "Our team tested for bias before shipping the model and found no evidence of racial or gender bias in our testing.

"But it is clear from these examples that we have more analysis to do. We will share what we learn, what actions we take, and will open-source our analysis so that others can review and replicate it."

A Twitter representative also pointed to research by a Carnegie Mellon University scientist who analyzed 92 images; in that test, the algorithm favored black faces 52 times.


The tool, introduced in 2018, is based on a "neural network" that uses artificial intelligence to predict which part of a photo will be interesting to the user, and crops the image around that area.
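Twitter has only described this approach in broad terms, so the following is a minimal sketch of how saliency-based cropping can work in general, not Twitter's actual code. It assumes a model has already assigned every pixel an "interestingness" score; the crop window with the highest total score is kept. The saliency map, the choose_crop helper and the toy "faces" below are illustrative assumptions.

```python
# Sketch of saliency-based cropping: pick the crop window whose summed
# "interestingness" score is highest. The saliency map here is hand-made;
# in a real system it would come from a trained neural network.
import numpy as np

def choose_crop(saliency: np.ndarray, crop_h: int, crop_w: int) -> tuple:
    """Return (row, col) of the top-left corner of the highest-scoring window."""
    H, W = saliency.shape
    # Integral image lets us sum any rectangular window in O(1).
    integral = np.zeros((H + 1, W + 1))
    integral[1:, 1:] = saliency.cumsum(axis=0).cumsum(axis=1)

    best_score, best_pos = -np.inf, (0, 0)
    for r in range(H - crop_h + 1):
        for c in range(W - crop_w + 1):
            score = (integral[r + crop_h, c + crop_w]
                     - integral[r, c + crop_w]
                     - integral[r + crop_h, c]
                     + integral[r, c])
            if score > best_score:
                best_score, best_pos = score, (r, c)
    return best_pos

# Toy version of the two-headshot test: a tall image with two "faces"
# (bright blobs) separated by empty space; the crop lands on whichever
# blob the model scores higher.
sal = np.zeros((300, 100))
sal[20:60, 30:70] = 0.9    # top face scores higher
sal[240:280, 30:70] = 0.6  # bottom face scores lower
print(choose_crop(sal, 100, 100))  # -> (0, 0): the window around the top face
```

In this framing, the crop itself is mechanical; any bias comes from the scores the model assigns, which is why the two-headshot test is informative – whichever face the model rates as more "interesting" wins the window.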

Meredith Whittaker, co-founder of the AI Now Institute, told the Thomson Reuters Foundation: "This is another in a long and weary litany of examples showing automated systems encoding racism, misogyny and histories of discrimination."
