Racial Bias Noticed in Twitter Picture-Cropping Algorithm


With today’s technology and social awareness, it’s no longer okay for a product or service to show any type of bias, whether intentional or not. This is why fingers are being pointed at Twitter after its picture-cropping algorithm appeared to show a racial bias.

Does Twitter Have Racial Bias?

Twitter is investigating its picture-cropping algorithm after users noticed that Black faces weren’t always shown in mobile image previews when a photo contained both a Black face and a White face.

Twitter reported that it found no evidence of racial or gender bias when it tested the algorithm, yet it also acknowledges there is more testing to do.

The social network’s chief technology officer, Parag Agrawal, said the picture-cropping model was analyzed before it shipped but that he appreciates the public testing it live. “Love this public, open, and rigorous test — and eager to learn from this,” he said.

Colin Madland, a university manager in Vancouver, brought the issue to light after noticing that when he was on Zoom calls with a Black colleague, the other man’s head would disappear. The software apparently treated the darker face as part of the background and removed it.


Madland uncovered a deeper problem when he tweeted about it. He posted to Twitter to ask if anyone knew why his colleague’s face kept disappearing. But once he posted a screenshot showing both his own face and his colleague’s missing face on Zoom, Twitter cropped the preview to show only Madland and not the other man.

On Zoom, he found he could get the Black man’s face to appear by placing a white globe behind his head, as if that was enough to separate him from the background. Yet Twitter’s preview cropped both versions the same way: the colleague was cut out whether his head was missing or visible in front of the globe.

Twitter’s chief design officer, Dantley Davis, believes the problem would be corrected if Madland’s facial hair and glasses were removed.

“I know you think it’s fun to dunk on me — but I’m as irritated about this as everyone else. However, I’m in a position to fix it, and I will,” said Davis. “It’s 100 percent our fault. No one should say otherwise.”

Twitter users carried out their own experiments to test the theory. They found that the algorithm favored U.S. Senate Majority Leader Mitch McConnell over former U.S. President Barack Obama. Even with a stock photo, a White man was shown rather than a Black man.
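The experiment users ran is easy to describe: put two faces at opposite ends of a tall image, swap their positions, and see whether the preview crop keeps following the same face. Here is a minimal sketch of that logic in Python, where `saliency_score` is a hypothetical stand-in for whatever scoring model actually drives the crop (it is not Twitter’s real API):

```python
# Hypothetical stand-in for a saliency model: real systems score image
# regions, not labels. Fixed scores here just illustrate the test logic.
def saliency_score(face):
    return {"face_a": 0.9, "face_b": 0.4}[face]

def crop_keeps(top_face, bottom_face):
    """Simulate the crop: keep whichever face scores higher."""
    if saliency_score(top_face) >= saliency_score(bottom_face):
        return top_face
    return bottom_face

# Run the test in both orderings.
first = crop_keeps("face_a", "face_b")   # face_a on top
second = crop_keeps("face_b", "face_a")  # face_a on the bottom

# If the same face wins regardless of position, the crop is tracking the
# face itself rather than its location -- the pattern users reported.
favors_one_face = first == second
print(first, second, favors_one_face)
```

The point of swapping positions is to rule out a simple placement preference (e.g., “always crop toward the top”): only when the same face wins in both orderings does the result suggest the model favors that face itself.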

Twitter Not Alone


Twitter’s is not an isolated case. The same issue has surfaced in facial recognition systems from Microsoft, IBM, and Amazon, which identified people of color less accurately than White people.

Microsoft said it took steps to correct the problem after realizing its system had been trained mostly on White faces; it simply hadn’t seen enough people of color to learn to identify them correctly. Microsoft later suggested that facial recognition should be regulated to prevent bias.

IBM said it would be launching a new version of its service. Amazon’s Rekognition system sometimes even identifies Black women as Black men, but it doesn’t have the same issue with White women.

The problem, as Madland pointed out on his Twitter account, is that this isn’t always as innocent as a Zoom meeting. Law enforcement agencies use Rekognition. People could be misidentified by such a system and accused of a crime. And that, in itself, is a crime.

Image Credit: Colin Madland’s Twitter and Public domain

Laura Tucker

Laura has spent nearly 20 years writing news, reviews, and op-eds, with more than 10 of those years as an editor as well. She has exclusively used Apple products for the past three decades. In addition to writing and editing at MTE, she also runs the site's sponsored review program.
