Black lives matter. One of the biggest agendas of 2020 has taken on another angle: it looks like Twitter has a racism problem, specifically an issue with how it previews images on its site.
There’s really no better way to explain this than actually showing you what’s going down on the Twitter streets. Over the weekend, Twitter users posted several examples of how, in an image featuring a photo of a Black person and a photo of a white person, Twitter’s preview of the photo in the timeline more frequently displayed the white person.
Take a look at this tweet. When you click on the image, you will see both the white man and Barack Obama in the same photo. In the first image, the white man appears at the top of the photo, and there are no complaints. In the second image, Obama appears at the top, and still the preview shows the white man.
Trying a horrible experiment…
Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? pic.twitter.com/bR1GRyCkia
— Tony “Abolish (Pol)ICE” Arcieri (@bascule) September 19, 2020
The tester then went ahead and carried out the test again. “Maybe the red tie is the issue,” he said. However, the results were still the same, as seen below.
"It's the red tie! Clearly the algorithm has a preference for red ties!"
Well let's see… pic.twitter.com/l7qySd5sRW
— Tony “Abolish (Pol)ICE” Arcieri (@bascule) September 19, 2020
The tests went on until he finally got Obama to show up in the preview. However, this was only because Obama was rendered lighter than the white man for once. To get there, he had to:
- Invert the image colour
- Literally make Obama lighter
Let's try inverting the colors… (h/t @KnabeWolf) pic.twitter.com/5hW4owmej2
— Tony “Abolish (Pol)ICE” Arcieri (@bascule) September 19, 2020
I had to test it myself, and I got the same results:
@bascule Twitter Test pic.twitter.com/swc4qugR8N
— Anfernee Onamu (@AnferneeOnamu) September 21, 2020
The public tests got Twitter’s attention – and now the company is apparently taking action.
“Our team did test for bias before shipping the model. We did not find evidence of racial or gender bias in our testing,” Liz Kelly, a member of the Twitter communications team, told Mashable. “But it’s clear from these examples that we’ve got more analysis to do. We’re looking into this and will continue to share what we learn and what actions we take.”
Twitter’s Chief Design Officer Dantley Davis and Chief Technology Officer Parag Agrawal also chimed in on Twitter, saying they’re “investigating” the neural network.
Just for laughs, users ran the test with cartoons too, and the yellow characters got a preview before the Black characters did.
I wonder if Twitter does this to fictional characters too.
Lenny Carl pic.twitter.com/fmJMWkkYEf
— Jordan Simonovski (@_jsimonovski) September 20, 2020
There’s no confirmed explanation for this Twitter racism problem yet, but there are a few speculations. It could be the cropping algorithm. However, we will wait for Twitter to respond.
Probably Twitters Crop algorithm is a pretty simple Saliency. We will see… pic.twitter.com/q4R0R8h3vh
— Bianca Kastl (@bkastl) September 20, 2020
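To make the saliency speculation concrete: a saliency-based cropper scores each pixel by how much it is expected to draw the eye, then crops around the highest-scoring point. Twitter's actual model is a trained neural network whose details are not public, so the sketch below is purely illustrative. It uses a crude stand-in for saliency (deviation from the image's mean brightness), and the function name and parameters are our own invention. The point it demonstrates is the failure mode being speculated about: if lighter regions score higher, the crop systematically centers on them.

```python
import numpy as np

def saliency_crop(image, crop_h, crop_w):
    """Hypothetical sketch of a saliency-based crop.

    Saliency is approximated here by local contrast (absolute deviation
    from mean brightness); the crop window is centered on the most
    salient pixel and clamped to the image bounds.
    """
    gray = image.mean(axis=2) if image.ndim == 3 else image
    saliency = np.abs(gray - gray.mean())  # crude contrast-based proxy
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    h, w = gray.shape
    top = min(max(y - crop_h // 2, 0), h - crop_h)
    left = min(max(x - crop_w // 2, 0), w - crop_w)
    return image[top:top + crop_h, left:left + crop_w]

# Toy example: a dark image with one bright patch on the right side.
img = np.zeros((100, 200), dtype=float)
img[40:60, 150:170] = 1.0          # bright region
crop = saliency_crop(img, 50, 50)  # crop homes in on the bright patch
print(crop.shape)                  # (50, 50)
```

With this kind of heuristic, the brightest region "wins" the crop regardless of what it depicts, which is consistent with the behaviour users observed once Obama's image was made lighter.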