
Tutorial on Creating Multimedia with AI Tools: Bias

Bias



Generative AI image and video generators often produce biased output. For example, if you ask for an image of a group of doctors or lawyers, the results will often portray them as white men.


Prompt: A group of lawyers having a conversation.

Prompt: People eating lunch outdoors.

Why does this happen?

These models are typically trained on large amounts of data from the Internet, and that data likely contains more examples from particular countries, languages, and cultures, which are not representative of the entire world. The model then learns what to output from that skewed data.

What are some possible solutions?

Developers of large AI models have implemented some guardrails to address this problem, but they may not have anticipated every type of biased data, since, like all of us, they see the world from their own viewpoint.

  • Some models (DALL·E 3) include behind-the-scenes instructions telling the model to depict different ethnic groups and genders with equal probability when generating images of people.
  • Other models (Adobe Firefly) use data estimating the skin-tone distribution of the user's country and apply it randomly to any generated image of a person.
  • Still other models (Runway) mitigate stereotypical biases by fine-tuning on synthetic data that varies across perceived skin tones, genders, professions, and age groups. (Synthetic data here means a set of images generated specifically to portray various skin tones, genders, ages, and professions in equal numbers.)

None of these approaches solves the problem in every situation.
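The first approach above amounts to prompt augmentation: a hidden instruction is silently added to the user's request before it reaches the model. The sketch below illustrates the idea only; the instruction text and function are hypothetical, not DALL·E 3's actual system prompt.

```python
# Illustrative sketch of a system-instruction guardrail.
# The wording below is a made-up stand-in, not any vendor's real prompt.
DIVERSITY_INSTRUCTION = (
    "When the prompt depicts people without specifying their appearance, "
    "portray a mix of ethnicities, genders, and ages with equal probability. "
)

def apply_guardrail(user_prompt: str) -> str:
    """Prepend the hidden diversity instruction to the user's prompt."""
    return DIVERSITY_INSTRUCTION + user_prompt

print(apply_guardrail("A group of lawyers having a conversation."))
```

The user never sees the prepended instruction, which is why results can shift between model versions even when the visible prompt stays the same.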

What can you do?

Keep an eye out for biased outputs and modify your prompts to correct for them.

Be very specific in your prompts. The examples below show an original prompt alongside a more specific revision.

Prompt: A group of lawyers, having a conversation.
Revised prompt: A group of lawyers, having a conversation, two African-American women and one South Asian man.

Prompt: People eating lunch outdoors.
Revised prompt: Asian men and women of different ages eating lunch outdoors.
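One of the revisions above simply appends explicit descriptors to an under-specified prompt. That pattern can be sketched as a small helper; the function name and descriptor list are hypothetical, not part of any image generator's API.

```python
def add_specificity(prompt: str, descriptors: list[str]) -> str:
    """Append explicit, comma-separated descriptors to an under-specified prompt."""
    base = prompt.rstrip(".")  # drop the trailing period so descriptors read naturally
    return base + ", " + ", ".join(descriptors) + "."

revised = add_specificity(
    "A group of lawyers, having a conversation.",
    ["two African-American women and one South Asian man"],
)
print(revised)
# → A group of lawyers, having a conversation, two African-American women and one South Asian man.
```

Keeping the descriptors in a separate list makes it easy to vary them across a batch of generations rather than reusing one fixed description.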

More Ideas

More ideas for reducing bias with detailed prompting

To help set new digital standards of representation, Dove has created the Real Beauty Prompt Playbook. (PDF download)

The playbook focuses on unrealistic beauty standards for women and offers tips for creating images that represent a more diverse range of people in the most popular generative AI models.

Real Beauty Prompt Playbook

See pages 63–70 for an inclusive prompting glossary.

The images on this page were generated with Adobe Firefly.

This tutorial is licensed under a Creative Commons Attribution 4.0 International License.

Vincennes University

812-888-VUVU | 800-742-9198

1002 North First Street; Vincennes, Indiana 47591

www.vinu.edu/
