
EASP – European Association of Social Psychology

EASP Seedcorn Grant Report by Konrad Bocian, Lazaros Gonidis, & Jim AC Everett

07.07.2020, by Tina Keil in grant report

Project: Human and nonhuman agents as a source of social pressure for judgements of moral character

Konrad Bocian, Lazaros Gonidis, Jim AC Everett

Konrad Bocian (1, 2), Lazaros Gonidis (1), Jim AC Everett (1)
(1) School of Psychology, University of Kent, UK
(2) Sopot Faculty of Psychology, SWPS University of Social Sciences and Humanities, Poland

Decades of social psychological research have demonstrated both the importance of social conformity and the importance of appearing moral in the eyes of others. This sets up a distinct and fundamental question: how do conformity pressures influence moral judgement and perceptions of moral character? Many experiments have used the classic Asch-type paradigm to test social influence on a variety of judgements, such as judgements of lines or colours, judgements about intelligence, or mere opinions (for a review, see Bond, 2005). Comparatively few, however, have investigated whether moral judgements are subject to social pressure.

What little research there is suggests that social influence leads to conformity not just in judgements about lines or colours, but also in more consequential judgements about morality. Some work suggests that people conform most when judging violations of social and decency norms, and least when judging violations of moral norms (Postma-Nilsenova et al., 2013). Other work suggests that participants do conform to confederates’ judgements about immoral and moral actions (Kundu & Cummins, 2012) and that peer opinions can influence people’s moral judgements in virtual environments (e.g., online surveys; Kelly et al., 2017).

That people’s moral judgements can be influenced by peer opinions in virtual environments is particularly interesting as society shifts into an increasingly digital age, in which social interaction not only happens in virtual spaces but can also happen with entirely virtual agents. This raises two key research questions. First, in a virtual reality setting, would people conform in the presence of human-controlled avatars? Second, would this conformity still occur if artificial agents controlled the avatars?

Little work has considered conformity to nonhuman agents, with three main exceptions. First, work has shown that in an immersive virtual environment, players’ betting behaviour in a virtual casino was influenced by norms imposed by other players represented as avatars (Blascovich, 2002). Second, other research suggests that immersive video gaming increases conformity to judgements made by computers, especially when the stimulus context is ambiguous (Weger et al., 2015). Third, further work suggests that nonhuman agents induce conformity in immersive virtual environments and that this effect occurs both for avatars controlled by other humans and for avatars controlled by a computer algorithm (Kyrlitsias & Michael-Grigoriu, 2018).

In this project, supported by the EASP Seedcorn Grant and the NCN Miniatura Grant, we sought to investigate how different sources of social pressure shape perceptions of moral character. We conducted a behavioural experiment in both real and virtual environments, employing the classic Asch paradigm, to answer two main questions. First, do people conform to human-controlled avatars and AI-controlled agents as strongly as to physically present humans, or less so? Second, does the perception of human agency drive the effect of conformity in the presence of nonhuman agents? These questions are particularly intriguing because people distance themselves from others who hold different moral convictions (Skitka et al., 2005).

We assumed that people’s judgements of moral character would differ between private and public contexts. We therefore hypothesised that, in the public context, participants’ judgements would align with confederates’ judgements, even if privately they would state otherwise. We also hypothesised that the moral conformity effect would be less pronounced in the virtual environment than in the lab setting in the presence of real humans. Finally, based on past research (Kyrlitsias & Michael-Grigoriu, 2018), we predicted that the effect of social pressure produced by human-controlled avatars would be as strong as that produced by AI-controlled agents. However, we did not rule out the possibility of a difference between these two conditions.

We collected data in Poland (N = 103) and the UK (N = 138) by conducting parallel studies in which we adapted the traditional Asch (1951) conformity paradigm. First, participants completed an online questionnaire in which they judged 20 examples of different behaviours (e.g., “You see a woman spanking her child with a spatula for getting bad grades in school”) and rated whether the actor was mainly a bad or a good person. Two weeks later, participants arrived at the laboratory to answer the same questions in the presence of either three physically present humans (the human condition), avatars controlled by humans (the human avatar condition), or avatars controlled by artificial intelligence (the AI avatar condition). The real and virtual peers always gave an answer different from the one the participant had given in the online questionnaire. For each participant, we obtained a conformity ratio by counting how many of the 20 judgements of moral character changed under the influence of peer pressure.
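For illustration, here is a minimal sketch of this calculation in Python; the variable names and data format are hypothetical, and this is not the analysis code used in the study:

    # Minimal sketch: per-participant conformity ratio
    # (hypothetical data format, not the study's actual analysis code).
    def conformity_ratio(private_judgements, public_judgements):
        """Proportion of items on which the public (lab) judgement
        differs from the earlier private (online) judgement."""
        assert len(private_judgements) == len(public_judgements) == 20
        switched = sum(
            priv != pub for priv, pub in zip(private_judgements, public_judgements)
        )
        return switched / len(private_judgements)

    # A participant who switched on 7 of the 20 items has a ratio of 0.35.
    private = ["bad"] * 20
    public = ["good"] * 7 + ["bad"] * 13
    print(conformity_ratio(private, public))  # 0.35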

The overall conformity ratio across all three conditions was 34.3% (in Asch’s 1956 experiment, the ratio was 36.8%). We found that participants conformed more in the human condition (M = 43%, SD = 20%) than in the human avatar condition (M = 30%, SD = 15%) and the AI avatar condition (M = 26%, SD = 14%, ps < .001). There was no difference between the two VR conditions (p = .305). We found no evidence that the perception of human agency drives the effect of conformity in the presence of nonhuman agents.
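The statistical tests behind these comparisons are not specified in this report; purely as an illustration of this kind of pairwise comparison of condition means, the Python sketch below uses simulated placeholder data and assumes independent-samples t-tests (neither the data nor necessarily the tests used in the study):

    # Illustrative only: pairwise comparisons of per-participant conformity ratios.
    # The data are simulated placeholders; the independent-samples t-tests are an
    # assumption, not necessarily the analysis reported above.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    human        = rng.normal(0.43, 0.20, 80).clip(0, 1)  # placeholder samples
    human_avatar = rng.normal(0.30, 0.15, 80).clip(0, 1)
    ai_avatar    = rng.normal(0.26, 0.14, 80).clip(0, 1)

    pairs = [("human vs human avatar", human, human_avatar),
             ("human vs AI avatar", human, ai_avatar),
             ("human avatar vs AI avatar", human_avatar, ai_avatar)]
    for label, a, b in pairs:
        t, p = stats.ttest_ind(a, b)
        print(f"{label}: t = {t:.2f}, p = {p:.3f}")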

In conclusion, in this project we sought to investigate moral conformity in both physical and virtual spaces, with both physical and virtual agents. We found evidence for moral conformity effects in the presence of both human and nonhuman agents, highlighting how people’s judgements of moral character are malleable and change under the influence of peer pressure. Moreover, we showed that people also conform to nonhuman agents, albeit less than to human agents. Future work will need to investigate this further to demonstrate both the robustness of the phenomenon and the potential mediating mechanisms. More generally, our work highlights the importance of considering how classic social psychological phenomena manifest in our increasingly digital social world.

Currently, we are conducting two experiments that will help us understand the extent to which people conform to peer pressure in virtual spaces (e.g., online questionnaires). These experiments will also address whether conformity in virtual spaces depends on culture (individualist vs. collectivist), political ideology (liberal vs. conservative), and moral identity (Aquino & Reed, 2002).