As noted by The Guardian, this is a bug in an image cropping algorithm that tends to focus on white faces, cropping out dark-skinned people - and even favors white dogs, cropping out black ones.
Twitter algorithm problems
Twitter has long used automatic image cropping to keep pictures from taking up too much space in the main feed and to allow multiple images to be shown in a single tweet. The company uses several algorithmic tools to focus on the most important parts of an image, such as faces and text.
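To illustrate the general idea of saliency-based cropping - this is only a minimal sketch, not Twitter's actual implementation, and the function names and the assumption that a saliency map is already available are hypothetical:

```python
import numpy as np

def crop_around_saliency(image: np.ndarray, saliency: np.ndarray,
                         crop_h: int, crop_w: int) -> np.ndarray:
    """Crop a (crop_h, crop_w) window centered on the most salient pixel.

    Assumes the crop window fits inside the image and that `saliency`
    has the same height/width as `image`.
    """
    # Locate the peak of the saliency map (the "most important" pixel).
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    # Center the crop on that peak, clamped to the image borders.
    top = int(np.clip(y - crop_h // 2, 0, image.shape[0] - crop_h))
    left = int(np.clip(x - crop_w // 2, 0, image.shape[1] - crop_w))
    return image[top:top + crop_h, left:left + crop_w]
```

In this scheme, whatever region the saliency model scores highest ends up centered in the preview - which is why a model that systematically scores some faces higher than others produces biased crops.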
But in 2020, users noticed that Twitter's algorithms favored white people and even white dogs, almost always cropping out black ones.
The company's own research also confirmed the bias in favor of white and female faces, and Twitter launched a bounty program for finding problems with the algorithm, promising rewards to researchers who demonstrated errors in image cropping.
Kulinich proved the bias by first artificially generating faces with different characteristics and then running them through Twitter's cropping algorithm to see what the software focuses on.
Since the faces were themselves artificial, it was possible to create faces that were nearly identical but sat at different points on a spectrum of skin tone, facial slimness, gender presentation, or age - and thus demonstrate that the algorithm preferred younger, slimmer, and lighter faces.
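A rough sketch of this pairwise methodology might look like the following. The `predict_saliency` function here is a placeholder stand-in for the cropping model (the actual experiment ran generated faces through Twitter's own saliency model), and the random arrays only stand in for the synthetic face images:

```python
import numpy as np

def predict_saliency(image: np.ndarray) -> np.ndarray:
    """Placeholder saliency model: returns a per-pixel saliency map.

    Replace this stub with the real cropping model's output.
    """
    return np.ones(image.shape[:2], dtype=np.float32)

def compare_pair(face_a: np.ndarray, face_b: np.ndarray) -> str:
    """Place two equally sized faces side by side and report which half
    the cropping model finds more salient."""
    pair = np.concatenate([face_a, face_b], axis=1)   # side-by-side collage
    saliency = predict_saliency(pair)
    mid = saliency.shape[1] // 2
    score_a = saliency[:, :mid].max()                 # peak saliency, left face
    score_b = saliency[:, mid:].max()                 # peak saliency, right face
    return "A" if score_a >= score_b else "B"

# Example: two synthetic 256x256 RGB faces that differ only in one attribute
# (e.g. skin tone). Repeating this over many generated pairs yields the
# preference statistics described above.
face_light = np.random.rand(256, 256, 3).astype(np.float32)
face_dark = np.random.rand(256, 256, 3).astype(np.float32)
print(compare_pair(face_light, face_dark))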
“Algorithmic harms are not just 'bugs'. Crucially, many harmful technologies are harmful not because of accidents or unintended mistakes, but by design. This comes from maximizing engagement and, in general, profit by shifting costs onto others. For example, amplified gentrification, falling wages, and the spread of clickbait and misinformation are not necessarily the result of 'biased' algorithms,” said Bogdan Kulinich.
The company paid the researcher $3,500 for successfully detecting and demonstrating the algorithm's flawed behavior.