Embracing Diversity Through Games and Bots
By Stacy Kish
Many workplaces require diversity training to support an inclusive environment and uphold anti-discrimination laws. Despite best efforts, workplace discrimination persists, commonly expressed through unconscious bias, such as stereotypes and ambivalence toward members of other social identity groups. According to the authors of a new study at Carnegie Mellon University, this form of discrimination is rarely addressed effectively in typical workplace training modules.
A team of researchers at Carnegie Mellon University has found that games provide an effective alternative for delivering diversity training, upping the ante by simulating social situations involving bias and including “personality” bots to home in on how unconscious bias can affect our social context. The results are available in the October issue of the journal.
Advancing Games to Improve Diversity Training
According to the study’s first author, traditional diversity training is delivered as a lecture, and simply educating employees about the meaning of unconscious bias may not be enough for them to recognize prejudicial thoughts or actions. In addition, the training can elicit a range of responses: participants may become defensive or feel the need to minimize their own bias. Participants can also walk away from training believing that they have been absolved of their own bias.
Past research has shown that gaming is an effective way to influence people’s behavior and counteract these psychological defenses.
“Games offered a new way to motivate people around the idea of diversity,” said Cleotilde Gonzalez, research professor in the Department of Social and Decision Sciences and senior author on the study. “The moment you make [training] fun, people are more willing to attend meetings, and the learning is more effective.”
The team developed a game called Moments@Work, an adaptation of the card game Awkward Moment, which Kaufman developed in collaboration with the Tiltfactor Lab at Dartmouth College. The card game examines how people navigate awkward and uncomfortable moments in the workplace, placing players in scenarios where they can confront truths about society and watch the effects of their actions in a safe and supportive environment.
“The game provides an entry point for a deeper dive into bias and discrimination,” said Kaufman, the Robert E. Kraut Associate Professor of Human-Computer Interaction in the School of Computer Science. “The hope is that once you have experience with unconscious bias you are more motivated to be aware of it and counteract it in your own behavior.”
The research team captured the responses of more than 1,300 participants as they played the Moments@Work card game to create a database of responsible and offensive ways of reacting to the proposed situations. Kevin Jarbo, assistant professor in the Department of Social and Decision Sciences, performed the analyses of the data collected during the study.
How Bots Can Help Identify and Address Unconscious Bias
The team used AI to generate “personality” bots that mined the human-derived database of responses to the different scenarios drawn from the card game. In the digital version of the game, humans knowingly played against the bots, which were programmed to respond in an offensive, responsible or random manner.
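The study itself does not detail how the bots were implemented. As a rough, hypothetical sketch of the idea described above, a personality bot could simply draw from pools of human-written reactions labeled by type; the scenario name, the example responses and the PersonalityBot class below are all invented for illustration.

```python
import random

# Hypothetical database of human-written reactions to a workplace scenario,
# labeled as "responsible" or "offensive" (stand-in content, not study data).
RESPONSE_DB = {
    "colleague_interrupted_in_meeting": {
        "responsible": ["Invite the colleague to finish their point."],
        "offensive": ["Joke that they should speak up if they want to be heard."],
    },
}

class PersonalityBot:
    """A bot that answers scenarios according to an assigned personality."""

    def __init__(self, personality):
        # personality is one of "responsible", "offensive", or "random"
        self.personality = personality

    def respond(self, scenario):
        pools = RESPONSE_DB[scenario]
        if self.personality == "random":
            # A random bot picks from all labeled responses without preference.
            choices = pools["responsible"] + pools["offensive"]
        else:
            choices = pools[self.personality]
        return random.choice(choices)

# Example: an "offensive" bot replying to one scenario.
bot = PersonalityBot("offensive")
print(bot.respond("colleague_interrupted_in_meeting"))
```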
The digital game gave the research team a way to evaluate how the players responded to the “personality” bots as the game progressed.
Surprisingly, the researchers found that participants began to conform to the “personality” of the bots in their game, whether responsible or offensive. This finding shows how bias and prejudice can be introduced into interactions over the rounds of the game.
“The results were both interesting and distressing,” said Kaufman. “People caught this contagion of bias and seemed equally amenable to this behavior even if a bot was exhibiting non-PC [politically correct] responses that either reinforced or minimized bias.”
In addition, the research team found participants who aligned with the more offensive “personality” bots reported higher rates of hostility, while players who followed the responsible bots reported higher levels of advocacy for diversity.
These findings illustrate how a person can be influenced by the biases of their community, but they also show how a game can be used to augment anti-bias interventions. According to Gonzalez, the design of the game in this study is not ideal for diversity training, but it does open the door to developing better tools for training that addresses unconscious bias.
“The power of experience is extremely interesting,” said Gonzalez. “We can learn to be good or bad if the behavior is encouraged and if others have similar behaviors.”