By Wenjing Ruan and Yixuan Zhang
A lack of diversity in the media can perpetuate society’s stereotypes about women and reinforce personal biases. When women are consistently underrepresented, or portrayed in narrow, one-dimensional, and often demeaning roles, those portrayals shape social perceptions and attitudes, feeding discrimination and prejudice in every area of life.
Gender imbalance in media
Firstly, the underrepresentation of women in the media is a pervasive problem: globally, women are far less visible in the media than men. A Harvard Business Review article notes that women are underrepresented in news coverage, accounting for only a quarter of television, radio, and print news content. A 2015 report found that women made up only 19% of the experts quoted in global news reports and only 37% of the journalists covering the news. Research on the underrepresentation of women across professional fields suggests that this gender imbalance in media reinforces and perpetuates harmful gender stereotypes.
Stereotypes in media
- Female friendship is demonized
Media portrayals of friendships between women often cast them as extreme, unhealthy, or even malicious. This stereotype can leave viewers with negative perceptions or misunderstandings of female friendship.
Depicting friendship between women as intrigue or betrayal leads audiences to believe that women cannot truly trust one another; treating it as purely a matter of looks, jealousy, or competition for men ignores genuine friendship and support. Mean Girls, for example, demonizes female friendship. The film explores the relationships between high school girls, but it depicts those relationships in a melodramatic and stereotypical way: the female characters are intensely competitive and venomous, and their friendships are riddled with betrayal and intrigue.
- Women are objectified
In film and television, women are treated as objects or mere displays of appearance rather than as independent individuals with thoughts, feelings, and values. The media often uses women to capture viewers’ attention, emphasizing their looks and sex appeal to sell products or boost ratings. Take the film Catwoman: its heroine is a superhero, yet the character is famous above all for her sexy tights and distinctive makeup. Her appearance and costume become almost the entire focus of the character, rather than her personality, intelligence, or actions. This form of objectification highlights the physical appearance of female superheroes rather than their actual abilities and character traits.
Gender bias in AI image analysis
AI tools have been found to deem photos of women more sexually suggestive than comparable photos of men, especially images involving nipples, pregnant bellies, or exercise.
An article in The Guardian describes an experiment testing the image analysis capabilities of artificial intelligence. The reporters analysed hundreds of images of men and women in underwear, exercising, and undergoing medical tests involving partial nudity. Strikingly, the AI tools consistently flagged everyday images of women as sexually suggestive.
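To make the methodology concrete, the sketch below shows how such a comparison could be run against one commercial classifier, Google Cloud Vision's SafeSearch feature, via its Python client. The Guardian tested several vendors' tools, so this particular API choice, the image filenames, and the pairing of photos are illustrative assumptions, not the newspaper's actual setup.

```python
# Sketch: compare SafeSearch "racy" ratings for paired photos of women and
# men. The filenames are hypothetical placeholders for comparable images
# (same activity, similar clothing).
from google.cloud import vision

client = vision.ImageAnnotatorClient()  # needs GOOGLE_APPLICATION_CREDENTIALS set

def racy_rating(path: str) -> str:
    """Return SafeSearch's 'racy' likelihood for a local image file."""
    with open(path, "rb") as f:
        image = vision.Image(content=f.read())
    annotation = client.safe_search_detection(image=image).safe_search_annotation
    return annotation.racy.name  # e.g. UNLIKELY, POSSIBLE, VERY_LIKELY

# Hypothetical matched pairs: same activity, different subject gender.
pairs = [("woman_workout.jpg", "man_workout.jpg"),
         ("woman_underwear.jpg", "man_underwear.jpg")]

for woman_img, man_img in pairs:
    print(f"{woman_img}: {racy_rating(woman_img)}  |  "
          f"{man_img}: {racy_rating(man_img)}")
```

If the classifier behaves as the Guardian reported, the ratings for the women’s photos would skew toward LIKELY or VERY_LIKELY even when the paired men’s photos show the same activity.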
Moreover, AI algorithms sometimes rate images of women as “racier” or more sexually suggestive than images of men, and the implications of this bias extend far beyond the initial experiments. Social media companies and other digital platforms rely on similar algorithms to filter, flag, and even censor content.
As a result, these platforms may inadvertently misidentify images of women or treat them unfairly, deeming legitimate content inappropriate or in violation of guidelines. This bias can harm women-led businesses and businesses that champion gender equality: when AI misinterprets or inappropriately flags their content, it not only hinders their marketing efforts but also undermines their message. In short, gender bias in AI widens social gaps, perpetuates stereotypes, and impedes progress toward a more equitable world.
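To illustrate how a skewed score becomes unequal treatment downstream, here is a minimal, hypothetical moderation filter. The threshold, scores, and post data are invented for illustration; real platforms’ moderation pipelines are proprietary and more complex.

```python
# A simplified, hypothetical feed filter showing how a biased "raciness"
# score propagates into unequal treatment of equivalent posts.
from dataclasses import dataclass

RACY_THRESHOLD = 0.6  # hypothetical cutoff above which a post is suppressed

@dataclass
class Post:
    author: str
    description: str
    racy_score: float  # produced upstream by an image-analysis model

def is_suppressed(post: Post) -> bool:
    """Posts scoring at or above the threshold are demoted or hidden."""
    return post.racy_score >= RACY_THRESHOLD

# Two posts with comparable content; a biased model scores the woman's
# photo higher, so only her post crosses the cutoff.
posts = [
    Post("fitness_brand_w", "woman doing a plank", racy_score=0.72),
    Post("fitness_brand_m", "man doing a plank", racy_score=0.31),
]
for p in posts:
    print(p.description, "->", "suppressed" if is_suppressed(p) else "shown")
```

Because the upstream model assigns the woman’s photo a higher score for the same activity, only her post is suppressed, which is exactly the kind of silent, unequal filtering described above.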
The lack of diversity in digital platforms and media not only perpetuates stereotypes about women but also reinforces personal biases and contributes to discrimination in all aspects of life. Women remain underrepresented in every form of media, and stereotyped depictions further exacerbate the problem: female friendships are demonized as extreme, unhealthy, or malicious, shaping negative perceptions of relationships between women. Furthermore, gender bias in artificial intelligence, especially in image analysis, poses significant challenges. AI algorithms may inadvertently flag images of women as sexually suggestive, leading to unfair scrutiny or misunderstanding on digital platforms.
Solving these problems requires a collective effort. Developers must work to reduce bias in AI algorithms, regulators and tech companies must ensure content filtering is fair, and public awareness campaigns can promote gender equality in the digital world.