EASP – European Association of Social Psychology

Seedcorn grant report

01.11.2017, by Sibylle Classen in grant report

Susanne M. Schmittat (Johannes Kepler University Linz, Austria) & Pascal Burgmer (University of Cologne, Germany)
Lay Beliefs in Moral Expertise: Antecedents and Consequences of Believing in Moral Expertise

The EASP seedcorn grant gave us the opportunity to address some key questions revolving around moral expertise: Why do some people believe in moral expertise whereas others reject the notion? Is religiosity, or the endorsement of moral absolutism, a necessary precondition for this belief? And would those who believe in moral expertise accept a moral expert's advice? With the seedcorn grant, we investigated what determines lay people's belief in moral expertise, how they construe it, and what the consequences are for their advice-taking tendencies.

Since moral issues are not easy to solve, people confronted with a moral dilemma may seek out the help of someone whom they perceive to be an expert on moral matters. However, it remains unclear who qualifies as a moral expert (Rossano, 2008; Schwitzgebel & Cushman, 2012, 2015; Tobia, Buckwalter, & Stich, 2013). Usually, one requirement for the development of expertise is the completion of countless hours of training combined with continual performance feedback; only then is it possible to advance in the respective field (e.g., Ericsson, Krampe, & Tesch-Römer, 1993). But how can one prove expertise when neither training nor performance feedback is clearly defined within the moral domain? By these criteria, moral expertise falls short. As a result, moral expertise faces a unique set of problems that other expertise fields do not: it lacks clear identification standards, and it remains unclear whether expertise can exist at all in areas that do not offer objective answers (Archard, 2011; Gordon, 2014). How, then, does moral expertise develop?

As Jones and Schroeter (2012) have noted, there are two main approaches to defining moral expertise. The first focuses on the quality and superiority of how moral experts arrive at their conclusions (e.g., Singer & Wells, 1984). According to scholars adopting the second approach, expertise in the moral domain has little to do with moral knowledge from books and much to do with moral virtues acquired through experience (e.g., Jones & Schroeter, 2012). However, neither philosophical nor psychological research seems to provide a satisfying answer to what moral expertise is and who can be considered a moral expert (Rossano, 2008; Schwitzgebel & Cushman, 2012, 2015; Tobia et al., 2013). We therefore proposed a different perspective: adopting an experimental-philosophical framework (Knobe et al., 2012), we explored people's lay conceptions of moral expertise. Scientific debates about the existence and definition of moral expertise notwithstanding, we propose that the perspective of lay people is an empirically overlooked but fruitful approach to the study of moral expertise.

Preliminary studies had shown that lay people construe a unique profile for moral expertise: they assign different characteristics to moral experts than to other experts and, beyond that, do not equate moral expertise with expertise in philosophy. Building on this, we conducted five correlational studies and three experimental studies to explore antecedents (e.g., ambiguity tolerance, belief in science, or religiosity) and consequences (e.g., moral judgment) of the belief in moral expertise. We developed a twelve-item scale that measures Belief in Moral Expertise (BME), capturing facets such as “I believe that some people possess expertise with regard to moral issues”, “I would value guidance from a moral expert”, and “When dealing with a moral dilemma, moral experts can offer helpful perspectives on the issue”.

Study 1 (N = 287) confirmed a single-factor structure and very good reliability of the BME scale (Cronbach’s α = .92). Additionally, a multiple regression analysis identified moral identity (Aquino & Reed, 2002) and self-reported religiosity as meaningful antecedents of belief in moral expertise. Those for whom morality is central to the self (i.e., high moral identity) reported stronger belief in moral expertise. Specifically, the Symbolization facet (compared to the Internalization facet) emerged as a strong positive predictor.
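
For readers unfamiliar with the reliability index reported above, the following is a minimal numpy sketch of how Cronbach's alpha is computed for a multi-item scale such as the twelve-item BME. The data here are simulated from a single latent factor purely for illustration; they are not the study data, and the sample size and loadings are arbitrary choices.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)          # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)      # variance of the sum score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated responses: 12 items all driven by one latent factor plus noise,
# mimicking the single-factor structure reported for the BME scale.
rng = np.random.default_rng(0)
latent = rng.normal(size=(300, 1))
items = latent + 0.5 * rng.normal(size=(300, 12))

alpha = cronbach_alpha(items)
print(round(alpha, 2))  # high alpha, since all items share one factor
```

With strongly intercorrelated items like these, alpha approaches 1; values above .90, as in Studies 1 and 2, indicate excellent internal consistency.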

Study 2 (N = 280) replicated the single-factor structure and excellent reliability of the BME scale (α = .94). Furthermore, we focused on participants’ inclination to construe the domain of morality as relative (as opposed to absolute) as a central predictor of belief in moral expertise. In a multiple regression analysis, moral relativism (Lu, Quoidbach, Gino, Chakroff, Maddux, & Galinsky, 2017) negatively predicted BME, albeit only marginally so: participants who rejected the notion that answers to moral problems are subjective were more likely to endorse beliefs in moral expertise. Interestingly, belief in science (Farias, Newheiser, Kahane, & de Toledo, 2013) emerged as a negative predictor of beliefs in moral expertise. Using a short version of the Big-Five personality model (Gosling, Rentfrow, & Swann, 2003), we found that agreeableness positively predicted BME, complementing the Study 1 finding on moral identity and, in particular, Symbolization. Taken together, Studies 1 and 2 indicate that religiosity, moral identity, and agreeableness facilitate beliefs in moral expertise and moral experts, whereas belief in science and moral relativism reduce such beliefs.
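
The multiple regression approach used in Studies 1 and 2 can be sketched as follows: several predictors are entered simultaneously, so each coefficient reflects a predictor's unique contribution. The data below are simulated with the sign pattern reported for Study 2 (negative for relativism and belief in science, positive for agreeableness); the effect sizes are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 280  # matches the Study 2 sample size, but the data are synthetic

# Standardized predictors (z-scored by construction).
relativism = rng.normal(size=n)
belief_in_science = rng.normal(size=n)
agreeableness = rng.normal(size=n)

# Simulated BME outcome with the reported sign pattern (illustrative weights).
bme = (-0.2 * relativism
       - 0.3 * belief_in_science
       + 0.3 * agreeableness
       + rng.normal(size=n))

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), relativism, belief_in_science, agreeableness])
coefs, *_ = np.linalg.lstsq(X, bme, rcond=None)
print(coefs[1:].round(2))  # slopes should mirror the simulated signs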

In Study 3 (N = 294), we investigated whether beliefs in moral expertise are related to a deliberative or an intuitive information-processing style (Epstein, Pacini, Denes-Raj, & Heier, 1996). We also employed a validated battery of moral dilemmas (Conway & Gawronski, 2013) to explore how BME might relate to a deontological or a utilitarian moral-judgment inclination. Whereas BME was unrelated to both deontological and utilitarian inclinations, faith in intuition emerged as a significant positive predictor of beliefs in moral expertise.

Study 4 (N = 281) looked more closely at people’s moral-processing inclinations beyond dilemma judgments and how these might be related to belief in moral expertise. Specifically, a newly developed moral-orientations scale comprising Affective, Deliberative, Rule, and Sentiment dimensions showed meaningful relations to beliefs in moral expertise. Complementing previous results, Affective Orientation and Rule Orientation emerged as positive predictors of BME. We also included intellectual humility (Leary et al., 2017) as a potential predictor. Despite a significant zero-order correlation, however, intellectual humility did not emerge as a significant predictor in a multiple regression analysis.

As a potential consequence of beliefs in moral expertise, Study 5 (N = 295) investigated the relation between BME and moral hypocrisy, assessed with a recently developed scenario-based measure that compares leniency judgments for self-caused versus other-caused moral transgressions (Weiss, Burgmer, & Mussweiler, 2017). Beliefs in moral expertise, however, were unrelated to moral judgments of one’s own versus others’ transgressions.

Following up on the previous results regarding moral relativism and the source of moral expertise (religion vs. science), we designed Study 6 (N = 313). We experimentally crossed whether participants thought about morality in relativistic versus absolutistic terms, and whether they indicated beliefs in moral expertise with regard to an expert from religion (i.e., a priest) or from science (i.e., a philosopher). Results suggest that participants believed more strongly in scientifically grounded moral expertise, particularly when they thought about moral issues in relativistic terms. No such difference between sources of moral expertise emerged when participants thought about morality in absolutistic terms.

In Study 7 (N = 317), we explored whether lay people evaluate the advice of a moral expert as qualitatively better, are more inclined to follow it, and rate the expert’s expertise as higher when the expert is described as virtuous and of great moral character rather than as highly educated (virtuous expert vs. educated expert). Additionally, we investigated whether the expert’s decision-making process (intuition vs. deliberation) influences participants’ perceptions. In this vignette study, all participants read an everyday moral dilemma. Afterwards, they were introduced to a moral expert called Paul Edwards. In the virtuous condition, Edwards works as a counselor, helps out at a homeless shelter, and is described by others as a virtuous person. In the educated condition, Edwards is described as a highly educated moral philosopher who works at a renowned university. Subsequently, Edwards provides advice on the dilemma described earlier; he either had to think thoroughly about the situation (deliberation condition) or simply knew the correct advice (intuition condition). Dependent measures included advice evaluation, expertise ratings, and behavioral intentions. Results suggest that the educated Edwards received significantly higher expertise ratings than the virtuous Edwards, but advice evaluation and behavioral intentions were unaffected. The described decision-making process had no impact.

In a follow-up study (Study 8, N = 197), participants were asked to recall a difficult moral decision and then indicated how much uncertainty they had experienced in that situation. Next, Edwards was again introduced as either a virtuous or an educated moral expert. Participants evaluated Edwards’ expertise and indicated whether they would consider his advice if they were ever confronted with another difficult moral dilemma. Results indicate that uncertainty was correlated with the perceived expertise of Edwards: the more uncertainty participants had experienced, the higher their evaluation of Edwards’ expertise. The same pattern as in Study 7 (i.e., higher evaluations of the educated Edwards) emerged but did not reach significance.

Together, these studies have yielded interesting results that are currently being prepared for two publications. Belief in moral expertise varies among lay people, yet our studies show that this belief has shared antecedents: religiosity, moral identity, and agreeableness facilitate it, whereas belief in science and moral relativism reduce it. Additionally, our results suggest that lay people differentiate between moral experts depending on the expert’s education and background. In the future, we plan to further investigate lay people’s belief in moral expertise, focusing on the perception of various moral experts and their advice.

In conclusion, the EASP seedcorn grant allowed us to combine our research interests (expertise, moral judgment, and lay beliefs), thereby creating a new and fruitful collaboration.


  • Aquino, K., & Reed, A., II (2002). The self-importance of moral identity. Journal of Personality and Social Psychology, 83, 1423–1440.
  • Archard, D. (2011). Why moral philosophers are not and should not be moral experts. Bioethics, 25(3), 119–127.
  • Conway, P., & Gawronski, B. (2013). Deontological and utilitarian inclinations in moral decision making: A process dissociation approach. Journal of Personality and Social Psychology, 104, 216–235.
  • Epstein, S., Pacini, R., Denes-Raj, V., & Heier, H. (1996). Individual differences in intuitive–experiential and analytical–rational thinking styles. Journal of Personality and Social Psychology, 71, 390–405.
  • Ericsson, K. A., Krampe, R. T., & Tesch-Römer, C. (1993). The role of deliberate practice in the acquisition of expert performance. Psychological Review, 100(3), 363–406.
  • Farias, M., Newheiser, A.-K., Kahane, G., & de Toledo, Z. (2013). Scientific faith: Belief in science increases in the face of stress and existential anxiety. Journal of Experimental Social Psychology, 49, 1210–1213.
  • Gordon, J.-S. (2014). Moral philosophers are moral experts! A reply to David Archard. Bioethics, 28(4), 203–206.
  • Gosling, S. D., Rentfrow, P. J., & Swann, W. B. (2003). A very brief measure of the Big-Five personality domains. Journal of Research in Personality, 37, 504–528.
  • Jones, K., & Schroeter, F. (2012). Moral Expertise. Analyse & Kritik.
  • Knobe, J., Buckwalter, W., Nichols, S., Robbins, P., Sarkissian, H., & Sommers, T. (2012). Experimental philosophy. Annual Review of Psychology, 63(1), 81–99.
  • Leary, M. R., Diebels, K. J., Davisson, E. K., Jongman-Sereno, K. P., Isherwood, J. C., Raimi, K. T., … Hoyle, R. H. (2017). Cognitive and interpersonal features of intellectual humility. Personality and Social Psychology Bulletin, 43, 793–813.
  • Lu, J. G., Quoidbach, J., Gino, F., Chakroff, A., Maddux, W. W., & Galinsky, A. D. (2017). The dark side of going abroad: How broad foreign experiences increase immoral behavior. Journal of Personality and Social Psychology, 112, 1–16.
  • Rossano, M. J. (2008). The moral faculty: Does religion promote “moral expertise”? International Journal for the Psychology of Religion, 18(3), 169–194.
  • Schwitzgebel, E., & Cushman, F. (2012). Expertise in moral reasoning? Order effects on moral judgment in professional philosophers and non-philosophers. Mind & Language, 27(2), 135–153.
  • Schwitzgebel, E., & Cushman, F. (2015). Philosophers’ biased judgments persist despite training, expertise and reflection. Cognition, 141, 127–137.
  • Singer, P., & Wells, D. (1984). The reproduction revolution: New ways of making babies. Melbourne: Oxford University Press.
  • Tobia, K., Buckwalter, W., & Stich, S. (2013). Moral intuitions: Are philosophers experts? Philosophical Psychology, 26(5), 629–638.
  • Weiss, A., Burgmer, P., & Mussweiler, T. (2017). Two-faced morality: Distrust promotes divergent moral standards for the self versus others. Manuscript under review.