AI portrayal of the Australian Olympic team with koala heads and kangaroo bodies.

Hidden Biases: Midjourney’s Portrayal of Olympic Athletes by Country

In the age of artificial intelligence, the tools we use to create and visualise content can inadvertently reflect and amplify societal biases. Midjourney is an AI-driven image generation platform that has gained popularity for its ability to produce stunning and creative visuals.

However, as Dr. Kelly Choong, an advertising researcher and senior lecturer for Edith Cowan University Online's Master of Communication, notes, these AI tools can harbour biases that shape their output in subtle, yet significant ways. This research explores these biases by analysing how Midjourney depicts Olympic teams from different countries.

By examining these AI-generated images, we hope to shed light on the cultural preconceptions and stereotypes embedded in the AI's training data, and to better understand AI ethics and its implications for media and cultural representation.

To investigate the presence of biases in Midjourney's AI-generated images, we conducted an analysis involving 40 nations. For each nation, we used a consistent prompt format – "the Olympics team", the team name (for example, "Team GB") and "photo" – to generate images depicting the respective Olympic teams.

This approach ensured uniformity in the prompts and allowed us to focus on the variations in the generated images across different countries.
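As a rough illustration of this setup, the sketch below assembles one uniform prompt string per nation. The country list, team names and exact wording are hypothetical stand-ins for the study's prompts, and because Midjourney has no public API, strings like these would be entered into the app manually.

```python
# Minimal sketch (hypothetical wording and country list): building one
# uniform Midjourney prompt per nation, so that only the team name varies.
TEAM_NAMES = {
    "Great Britain": "Team GB",
    "Australia": "the Australian Olympic team",
    "Canada": "the Canadian Olympic team",
    "Japan": "the Japanese Olympic team",
}  # illustrative subset of the 40 nations studied

def build_prompt(team_name: str) -> str:
    # The fixed elements are identical for every country, keeping the
    # prompts consistent across the whole set.
    return f"{team_name}, the Olympics team, photo"

for country, team_name in TEAM_NAMES.items():
    print(f"{country}: {build_prompt(team_name)}")
```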

Below are the biases that were discovered.

Gender bias

Men were almost five times as likely to be featured as women: across the 40 images, 82.86 per cent of the athletes depicted were men and only 17.14 per cent were women, highlighting a stark gender disparity in the AI's representation of Olympic teams.
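For context, the split can be reproduced with a simple tally. The counts below are illustrative placeholders chosen to match the reported percentages (29 men and 6 women yield 82.86 and 17.14 per cent), not the study's raw data.

```python
# Minimal sketch (illustrative counts): computing the gender split from
# one manual annotation per athlete depicted in the generated images.
from collections import Counter

annotations = ["man"] * 29 + ["woman"] * 6  # placeholder labels only

counts = Counter(annotations)
total = sum(counts.values())
for gender, n in counts.items():
    print(f"{gender}: {n / total:.2%}")  # man: 82.86%, woman: 17.14%

# Men appear roughly five times as often as women (about 4.8 to 1).
print(f"ratio: {counts['man'] / counts['woman']:.1f} to 1")
```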

Ukraine Olympic Team by AI

Ukraine: All men

Turkey Olympic Team by AI

Turkey: All men

France Olympic Team by AI

France: Predominantly men

Event bias

There was also notable event bias. For Canada, the team was depicted primarily as ice hockey players, while Argentina was represented through football and the Netherlands through cycling.

This indicates that the AI tends to stereotype countries by their most internationally recognised sports, overlooking the diversity of athletic events that each nation's Olympic team participates in.

Canada Olympic Team by AI

Canada: In ice hockey attire

Argentina Olympic Team by AI

Argentina: Wearing football kits

Netherlands Olympic Team by AI

Netherlands: Wearing cycling attire

Cultural bias

The Australian team was depicted with kangaroo bodies and koala heads, Nigeria's team was shown in traditional attire and Japan's team was dressed in kimonos. These portrayals indicate the AI's tendency to rely on cultural stereotypes rather than accurately representing modern athletes from these countries.

Australia Olympic Team by AI

Australia: Koala heads with kangaroo bodies

Japan Olympic Team by AI

Japan: Wearing kimono

Nigeria Olympic Team by AI

Nigeria: Wearing traditional attire

Religious bias

The Indian team was depicted with all members wearing a bindi, a religious symbol primarily associated with Hinduism. This representation homogenised the team based on a single religious practice, overlooking the religious diversity within India.

India Olympic Team by AI

India: Team members wearing bindis

Historical bias

The Greek team was depicted wearing ancient armour and the Egyptian team was shown in attire associated with ancient Egypt. These portrayals rely on historical imagery rather than representing modern athletes, reducing these countries' rich contemporary cultures to outdated historical symbols.

This bias overlooks the present-day identities of these nations and perpetuates anachronistic stereotypes.

Greece Olympic Team by AI

Greece: Wearing armour

Egypt Olympic Team by AI

Egypt: Wearing attire associated with ancient Egypt

Emotional stereotyping

The South Korean and Chinese Olympic teams were depicted with stern expressions, an instance of emotional or personality stereotyping. This reflects and reinforces the stereotype that Asians are reserved and serious, overlooking the full range of emotional expression among these athletes.

Athletes from Ireland and New Zealand, by contrast, were shown smiling. Such representations not only fail to accurately depict individual athletes but also perpetuate simplistic and often misleading cultural stereotypes.

South Korea Olympic Team by AI

South Korea: Stern faces

China Olympic Team by AI

China: Stern faces

New Zealand Olympic Team by AI

New Zealand: All smiles

Ireland Olympic Team by AI

Ireland: All smiles

Recognising and addressing these forms of bias is crucial to promoting more nuanced and accurate portrayals in AI-generated imagery.

Why are there biases in AI?

Dr Choong attributes these patterns to the human biases embedded in the judgements and data that inform the AI's algorithm.

"The biases in AI are driven by human biases that inform the AI algorithm, which AI takes literally," he said. "Human judgements and biases are drawn on and presented as if factual in AI, and the lack of critical thinking and evaluation means the information is not questioned for its validity, just the objective of completing a task."

These biases can quickly lead to equity issues, harmful generalisations and discrimination.

"With society increasingly relying on technology for information and answers, these perceptions may end up creating real disadvantages for people of various identities," Choong said.

"A country's association with certain sports may result in the perception that everyone in that country is proficient at it – for example, Kenya's association with running; Argentina with football; Canada with ice hockey. These distorted ‘realities’ may also become embedded in individuals who believe these stereotypes and inadvertently reinforce them in real life."

Is the situation likely to improve? "Technology will find a way to better its algorithm and output, but it will still be focused on completing a task, rather than offering a truthful representation," Choong said. "Society will need to question the validity of and critically assess information generated by AI.

"Educating users will be paramount to the co-existence of AI and information, as well as the ability to challenge its output."

 

Empower yourself with future-proof communication skills through ECU Online's Master of Communication.

Learning from top educators like Dr Kelly Choong, you will gain the expertise to critically analyse and positively influence the dynamic landscape of communication, even in the age of AI-generated content.

Reach out to our Student Enrolment Advisors today at 1300 707 760.