Psychology professor Mina Cikara.

Jon Ratner/Harvard Staff Photographer

Looking at how prejudice is learned, passed

Research suggests power, influence of watching behavior of others

Watching how others behave is one of the primary ways people learn social cues and appropriate behaviors. But might that also be a way biases are spread and perpetuated?

A recent psychology study in Science Advances aims to understand how prejudice might be passed along this way and how that contributes to societal-level inequality.

“Transmission of social bias through observational learning” was a collaboration between Harvard, the University of Amsterdam, and the Karolinska Institute. Researchers conducted a series of experiments in which participants observed people who were aware of the stereotypes (demonstrators) and members of stereotyped groups (targets).

“There’s something about watching the choice transpire that is then inculcating in observers an inference about the chosen person despite there being no rational evidence for that person’s generosity,” said Mina Cikara, professor of psychology in the Faculty of Arts and Sciences and co-author of the study, which examined how subjects select players in a money-sharing game.

In the first two experiments, participants were asked to observe interactions in a money-sharing exercise between a demonstrator and players from two social groups, which were given descriptions aligning with positive or negative stereotypes of white and Black Americans. The demonstrator picked one of the players to interact with, who then responded by sharing or not sharing a reward. On average, the participants showed a preference for those of the positively stereotyped group.

Participants who observed these interactions without knowledge of the stereotypes driving demonstrators’ choices were then asked to make their own choices. They tended to exhibit the group bias expressed by demonstrators.
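The sampling asymmetry at the heart of this design can be illustrated with a toy value-learning model. This is an illustrative sketch, not the authors' actual model, and the function and parameter names (`learned_value`, `alpha`, `p_share`) are my own: both groups share rewards at exactly the same rate, but because a biased demonstrator picks one group far more often, an observer updating value estimates from observed outcomes ends up with a lower estimate for the rarely chosen group.

```python
def learned_value(n_observations, alpha=0.1, p_share=0.5, q0=0.0):
    """Expected value estimate after observing n_observations outcomes,
    using a standard delta-rule (Rescorla-Wagner) update starting from q0.
    NOTE: a toy model for illustration, not the study's actual analysis.
    Every observed player shares at the identical average rate p_share."""
    q = q0
    for _ in range(n_observations):
        q += alpha * (p_share - q)  # nudge the estimate toward the observed rate
    return q

# Suppose a biased demonstrator picks Group A on 90 of 100 trials and
# Group B on only 10, even though both groups share at the same rate.
q_a = learned_value(90)
q_b = learned_value(10)
print(round(q_a, 2), round(q_b, 2))  # → 0.5 0.33
```

Under these assumptions the observer ends up valuing the frequently chosen group more, purely because the demonstrator's choices determined which outcomes were available to learn from, which parallels the study's finding that the act of choosing, rather than any difference in rewards, drives the transmitted bias.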

“We found that people can form prejudice by observing the interactions of others, specifically by observing the actions of a prejudiced actor toward members of separate social groups,” explained lead co-author David Schultner, a postdoc in the lab of co-author and principal researcher Björn Lindström at the Karolinska Institute. “After observing such biased intergroup interactions, observers then in turn expressed a similar bias themselves.”

The study’s senior co-author David Amodio and lead co-author David Schultner.

When researchers asked participants to explain their decisions, they pointed to perceived differences in reward feedback between the two groups, differences that were in fact inaccurate.

“It’s important to note, though, that observers were sensitive both to individuals’ history of giving rewards and their group membership,” cautioned Cikara. “Participants paid quite a bit of attention to what they had learned about individual actors when they were making their own selection. It wasn’t just that they were copying prejudice, completely ignoring the cost to their own outcomes or self-interests.”

A control experiment administered in the third study was particularly enlightening, the authors noted. For this experiment, participants observed interactions involving actors who displayed more overt bias toward players, in a bid to make it easier for observers to pick up on the prejudice and potentially adjust their own behavior. Results, however, still reflected the bias seen before.

Participants were given additional information about chosen and unchosen players during the fourth study. However, this failed to change the preconceived beliefs these participants had formed about the two social groups.

“It blew my mind that the transmission still persisted, because it just made so clear how important the choosing rather than the outcome was for this process,” Cikara said.

In the fifth study, the research team introduced a computer actor making random choices to discover whether a nonhuman demonstrator made a difference. The psychologists reasoned that this might prevent observers from assuming a human demonstrator knew something about the targets.

What they found was virtually no difference between those following human actions and those following a computer. “It’s not 100 percent clear why people acted the way they did,” Schultner said. “It could be that people anthropomorphize the computer [or] … that people assumed this was actually a purposeful robot.”

“The connection to AI was serendipitous because that’s something we care about in other studies: how prejudice and stereotypes work their way into artificial intelligence and are recapitulated back to users, which then lead users to act, often unwittingly, in ways that reinforce those prejudices,” added senior co-author David Amodio, professor of social psychology at the University of Amsterdam.

So, what does this mean for the real world? Cikara urged caution about what the research could mean outside of the confines of the highly controlled lab experiments the team conducted but noted it could lead to important inquiries into social media.

“Our broader goal is to understand the psychological mechanisms through which societal prejudices and stereotypes get inside the heads of individuals and affect their behavior,” Amodio said. “This new work looks at what happens next — once a bias forms in an individual’s head, how does it get back out into the community from which it can begin to influence society more broadly.”