According to an investigation by the Guardian, in partnership with the Pulitzer Center's AI Accountability Network, artificial intelligence algorithms are gender-biased, which frequently results in photos of women being sexualised. Emmy Award-winning journalist Hilke Schellmann reveals in her investigation that images uploaded to social media platforms are subjected to AI analysis that rates photos of women as more sexually suggestive than similar images of men.
When hundreds of photos of women in everyday situations, such as exercising or undergoing medical tests, were run through these AI tools, the tools tagged them as sexually suggestive. When Schellmann ran a photo of a pregnant woman's belly through Microsoft's algorithm, it was 90% sure the image was "sexually suggestive in nature". Many other photos of women performing ordinary tasks received similarly inaccurate ratings.
These ratings lead social media platforms to restrict a post's visibility: photos of women are disproportionately 'shadowbanned' on platforms like Instagram and Facebook, reducing their viewership and reach. When a post is shadowbanned, the user is unaware of it because, unlike blocking, shadowbanning triggers no notification.
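To make the mechanism concrete, here is a minimal hypothetical sketch of how a moderation score could silently gate a post's distribution. This is not any platform's actual code; the score name, threshold value, and labels are all illustrative assumptions.

```python
# Hypothetical sketch only: illustrates how an AI 'raciness' score
# could silently restrict a post's reach, as the article describes.
# The threshold and labels are assumptions, not a real platform's values.

RACY_THRESHOLD = 0.6  # assumed cutoff; platforms do not publish theirs


def visibility(racy_score: float) -> str:
    """Return how widely a post is distributed given its moderation score."""
    if racy_score >= RACY_THRESHOLD:
        # The post stays up and the author is never notified -- this is
        # the 'shadowban': the post is simply excluded from feeds and search.
        return "restricted"
    return "normal"


# The pregnant-belly photo the Guardian tested scored 0.9 ('90% sure'),
# which would fall on the restricted side of such a cutoff.
print(visibility(0.9))  # restricted
print(visibility(0.2))  # normal
```

The key point the sketch captures is that the gating is one-sided and invisible: the author sees their post as published, while distribution is quietly curtailed.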
This Guardian study is one of the most recent examples of how gender prejudice persists in AI. These systems are built on data that was gathered and annotated by people, the majority of them men, who bring their own preconceptions, worldviews, and conservative prejudices to the labels the algorithms learn from.