Twitter apologises over racial bias in image algorithm
Twitter has issued an apology after users highlighted how its image-cropping algorithm favoured showing white or lighter-skinned faces over Black ones.
When images are attached to a tweet, Twitter will crop the preview of the image so that it takes up less space on a user’s timeline.
Users who noticed the feature’s bias tested it by tweeting images that showed both white and Black faces, ranging from photos of real people to cartoon characters and dogs. Each time, Twitter’s algorithm chose the lighter face as the cropped preview image on the tweet.
Other tech giants have also come under pressure for racial bias in their algorithms. Over the summer Instagram admitted it had work to do to make sure white bodies were not prioritised over Black ones on its Explore page.
Twitter spokesperson Liz Kelley today tweeted a response to the criticism, saying: “Thanks to everyone who raised this. We tested for bias before shipping the model and didn’t find evidence of racial or gender bias in our testing, but it’s clear that we’ve got more analysis to do.”
She added that Twitter will open-source its work on the algorithm so that other developers can review and replicate its analysis.