According to Twitter’s analysis, the algorithm deviated from demographic parity by 8% in favour of women and by 4% in favour of white people. Demographic parity is the benchmark of no bias: both groups being compared have an equal chance of being highlighted by the algorithm. The company also investigated whether its algorithm sexually objectified women by cropping images to focus on body parts other than faces, but ultimately found no evidence of this.
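To make the metric concrete, here is a minimal sketch of how a deviation from demographic parity can be measured. The function and the toy data are illustrative assumptions, not Twitter’s actual evaluation pipeline; the idea is simply to compare the rates at which two groups are selected.

```python
# Hypothetical sketch: measuring deviation from demographic parity.
# The data and function names are illustrative, not Twitter's methodology.

def demographic_parity_gap(outcomes_a, outcomes_b):
    """Difference in selection rates between two groups (1 = selected by the algorithm)."""
    rate_a = sum(outcomes_a) / len(outcomes_a)
    rate_b = sum(outcomes_b) / len(outcomes_b)
    return rate_a - rate_b

# Toy paired-image trials: 1 means that group's subject was kept in the crop.
women = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]   # selected 8 of 10 times
men   = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]   # selected 7 of 10 times

gap = demographic_parity_gap(women, men)
print(f"Deviation from demographic parity: {gap:+.0%}")  # → +10%
```

A gap of zero would mean perfect demographic parity; the 8% figure Twitter reported is this kind of rate difference, computed over a much larger paired-image dataset.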
Twitter explained that it began using a “saliency algorithm” in 2018 to crop pictures, aiming to standardise the size of images and allow people to see more tweets at a glance. Powered by machine learning, the algorithm estimated which part of an image a user would most want to see first. Twitter has since abandoned that approach: the platform now displays standard aspect-ratio photos in full, without algorithmic cropping, and shows users a true preview of the image before they post. In the end, the company concluded, “not everything on Twitter is a good candidate for an algorithm, and in this case, how to crop an image is a decision best made by people.” (Source: Reuters, Twitter.)
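For readers unfamiliar with saliency-based cropping, the following is a minimal sketch of the general technique: given a per-pixel saliency map, centre a fixed crop window on the most salient point. This is an assumption-laden illustration of the idea, not Twitter’s published model or crop logic.

```python
# Hypothetical sketch of saliency-based cropping. The saliency map here is
# hand-made; a real system would predict it with a machine-learning model.
import numpy as np

def saliency_crop(image, saliency, crop_h, crop_w):
    """Crop `image` to (crop_h, crop_w), centred on the saliency argmax,
    clamped so the window stays inside the image bounds."""
    y, x = np.unravel_index(np.argmax(saliency), saliency.shape)
    top  = min(max(y - crop_h // 2, 0), image.shape[0] - crop_h)
    left = min(max(x - crop_w // 2, 0), image.shape[1] - crop_w)
    return image[top:top + crop_h, left:left + crop_w]

# Toy example: a 6x6 "image" with a salient spot near the bottom-right corner.
img = np.arange(36).reshape(6, 6)
sal = np.zeros((6, 6))
sal[4, 5] = 1.0
print(saliency_crop(img, sal, 3, 3).shape)  # → (3, 3)
```

Dropping this step in favour of showing the full standard-aspect-ratio photo, as Twitter did, removes the model’s choice of focal point entirely and with it the opportunity for that choice to be biased.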