
Framing Changes Ethical Attitudes

Environmental rhetoric tends to emphasize harm and unfairness. Will introducing moral terms that appeal more to social conservatives than social liberals cause social conservatives to become more supportive of environmental action?

This recording is also available on stream (no ads; search enabled). Or you can view just the slides (no audio or video). You should not watch the recording this year: it’s all happening live (advice).

If the video isn’t working, you could also watch it on youtube.

If the slides are not working, or you prefer them full screen, please try this link.



The fifth and final claim in our argument that differences in moral psychology explain political conflict concerns moral reframing. If environmental arguments are reframed in terms of moral concerns which are likely to be more highly weighted by conservatives than liberals, will conservatives show more support for measures to mitigate climate change?

Feinberg & Willer (2013, Study 3) provide evidence that they will. They created two op-ed style pieces which differed only in that one framed environmental issues in terms of harm whereas the other framed them in terms of purity. Participants were divided into two groups. Each group read one of the op-ed style pieces, then answered a survey about proenvironmental attitudes, a survey about proenvironmental legislation, and a survey about knowledge of anthropogenic climate change. Conservatives scored significantly higher on all three measures after reading the op-ed style piece which framed things in terms of purity.

Two Extensions

Can moral reframing change how people act?

Kidwell, Farmer, & Hardesty (2013) found that it can. They studied how much people put into their recycling bins after they received a leaflet about recycling which was framed either in terms of harm or else in terms of in-group loyalty and respect for authority. They report:

‘we developed tailored persuasive messages that appealed to the individualizing foundations for liberals, based on fairness and avoiding harm to others, and the binding foundation for conservatives, based on duty and an obligation to adhere to authority. We found that these congruent appeals significantly affected consumers’ acquisition, usage, and recycling intentions and behaviors’ (Kidwell et al., 2013).

Further, Wolsko, Ariceaga, & Seiden (2016, Experiment 2) found evidence that moral reframing can influence how much people donate to an ‘Environmental Defense Fund’.

Can liberals’ attitudes on typically conservative issues also be changed using a similar ethical framing strategy?

Feinberg & Willer (2015) looked at a typically conservative issue in the US, making English the official language of the United States. They found that liberals’ support for this issue could be increased by moral reframing; in this case, by reframing it in terms of fairness.

For more on moral reframing, see the review by Feinberg, Kovacheff, Teper, & Inbar (2019). Scharmer & Snyder (2021, Study 4) explore whether moral reframing can influence environmentally driven meal choice behaviours.

Aside: Why isn’t moral reframing more widely used?

Feinberg & Willer (2015) asked conservatives to write arguments that would persuade liberals, and conversely. Participants were told they would be ‘entered into a draw for a $50 bonus’ if their arguments proved effective.

Fewer than 10% of the arguments provided actually fitted with the target morality. Most fitted with the authors’ morality.

Around a third of liberals even wrote arguments attacking conservative morality.

Why are people so bad at moral reframing?

‘Without recognizing that one's political rivals possess different morals, and without a clear understanding of what those different morals are, using moral reframing becomes impossible’ (Feinberg & Willer, 2019, p. 7).

Another (compatible) possibility is intolerance. People are less tolerant of differences in moral than in nonmoral attitudes (Skitka, Bauman, & Sargis, 2005). Perhaps this makes them unwilling to provide arguments that are effective across differences in moral psychology.

Never Trust a Psychologist

I am a fan of Feinberg and Willer but they are sometimes unreliable. Consider:

‘individuals experience their moral convictions as objective truths about the world (Skitka et al., 2005). As a result, it can be difficult to recognize that there are different “truths” that other people believe in (Ditto & Koleva, 2011; Kovacheff et al., 2018). Indeed, polling data indicates that people are apt to perceive someone who does not endorse their morality as simply immoral or evil, rather than morally different (Doherty & Kiley, 2016)’ (Feinberg & Willer, 2019, p. 7).

When I read this, I expected to find that the sources they cite support the claims they make. But which of the cited sources actually do?

Not one:

  • Skitka et al., 2005 mentions the claim about objectivity but does not provide evidence for it. Those authors cite Shweder (2002)[1] in support of it, which is a brief opinion piece in a magazine. Skitka et al., 2005 is indirectly relevant because it is about people being less tolerant of differences in moral than in nonmoral attitudes.

  • Ditto & Koleva, 2011[1:1] is a two-page unargued endorsement of Moral Foundations Theory.

  • Kovacheff et al., 2018[1:2] is an interesting review but I couldn’t find anything directly relevant to the claim it is cited in support of. (It’s very long so I may have missed something.)

  • Doherty & Kiley, 2016[1:3] does not support the point about ‘polling data’ at all. It is a reference to a blog post which is about political parties, not ‘endorsing their morality’. (To make this relevant, you would need a strong premise linking moral psychology and political identity.)

Not all of the sources they cite are even directly relevant to the points they are cited in support of.

My conclusion: Claims made by leading experts in peer-reviewed journals are sometimes unsupported even when citations give the impression that they are based on a rich body of evidence.[2]



moral conviction : ‘Moral conviction refers to a strong and absolute belief that something is right or wrong, moral or immoral’ (Skitka et al., 2005, p. 896).
Moral Foundations Theory : The theory that moral pluralism is true; moral foundations are innate but also subject to cultural learning, and the Social Intuitionist Model of Moral Judgement is correct (Graham et al., 2019). Proponents often claim, further, that cultural variation in how these innate foundations are woven into ethical abilities can be measured using the Moral Foundations Questionnaire (Graham, Haidt, & Nosek, 2009; Graham et al., 2011). Some empirical objections have been offered (Davis et al., 2016; Davis, Dooley, Hook, Choe, & McElroy, 2017; Doğruyol, Alper, & Yilmaz, 2019). See Moral Foundations Theory: An Approach to Cultural Variation.
moral reframing : ‘A technique in which a position an individual would not normally support is framed in a way that it is consistent with that individual's moral values. [...] In the political arena, moral reframing involves arguing in favor of a political position that members of a political group would not normally support in terms of moral concerns that the members strongly ascribe to’ (Feinberg & Willer, 2019, pp. 2–3).
Social Intuitionist Model of Moral Judgement : A model on which intuitive processes are directly responsible for moral judgements (Haidt & Bjorklund, 2008). One’s own reasoning does not typically affect one’s own moral judgements, but (outside philosophy, perhaps) is typically used only to provide post-hoc justification after moral judgements are made. Reasoning does affect others’ moral intuitions, and so provides a mechanism for cultural learning.


Davis, D., Dooley, M., Hook, J., Choe, E., & McElroy, S. (2017). The Purity/Sanctity Subscale of the Moral Foundations Questionnaire Does Not Work Similarly for Religious Versus Non-Religious Individuals. Psychology of Religion and Spirituality, 9(1), 124–130.
Davis, D., Rice, K., Tongeren, D. V., Hook, J., DeBlaere, C., Worthington, E., & Choe, E. (2016). The Moral Foundations Hypothesis Does Not Replicate Well in Black Samples. Journal of Personality and Social Psychology, 110(4).
Day, M. V., Fiske, S. T., Downing, E. L., & Trail, T. E. (2014). Shifting Liberal and Conservative Attitudes Using Moral Foundations Theory. Personality and Social Psychology Bulletin, 40(12), 1559–1573.
Doğruyol, B., Alper, S., & Yilmaz, O. (2019). The five-factor model of the moral foundations theory is stable across WEIRD and non-WEIRD cultures. Personality and Individual Differences, 151, 109547.
Feinberg, M., Kovacheff, C., Teper, R., & Inbar, Y. (2019). Understanding the process of moralization: How eating meat becomes a moral issue. Journal of Personality and Social Psychology, 117(1), 50–72.
Feinberg, M., & Willer, R. (2013). The Moral Roots of Environmental Attitudes. Psychological Science, 24(1), 56–62.
Feinberg, M., & Willer, R. (2015). From Gulf to Bridge: When Do Moral Arguments Facilitate Political Influence? Personality and Social Psychology Bulletin, 41(12), 1665–1681.
Feinberg, M., & Willer, R. (2019). Moral reframing: A technique for effective and persuasive communication across political divides. Social and Personality Psychology Compass, 13(12), e12501.
Graham, J., Haidt, J., Motyl, M., Meindl, P., Iskiwitch, C., & Mooijman, M. (2019). Moral Foundations Theory: On the advantages of moral pluralism over moral monism. In K. Gray & J. Graham (Eds.), Atlas of Moral Psychology. New York: Guilford Publications.
Graham, J., Haidt, J., & Nosek, B. A. (2009). Liberals and conservatives rely on different sets of moral foundations. Journal of Personality and Social Psychology, 96(5), 1029–1046.
Graham, J., Nosek, B. A., Haidt, J., Iyer, R., Koleva, S., & Ditto, P. H. (2011). Mapping the moral domain. Journal of Personality and Social Psychology, 101(2), 366–385.
Haidt, J., & Bjorklund, F. (2008). Social intuitionists answer six questions about moral psychology. In W. Sinnott-Armstrong (Ed.), Moral psychology, Vol 2: The cognitive science of morality: Intuition and diversity (pp. 181–217). Cambridge, Mass: MIT press.
Kidwell, B., Farmer, A., & Hardesty, D. M. (2013). Getting Liberals and Conservatives to Go Green: Political Ideology and Congruent Appeals. Journal of Consumer Research, 40(2), 350–367.
Kovacheff, C., Schwartz, S., Inbar, Y., & Feinberg, M. (2018). The Problem with Morality: Impeding Progress and Increasing Divides. Social Issues and Policy Review, 12(1), 218–257.
Kugler, M., Jost, J. T., & Noorbaloochi, S. (2014). Another Look at Moral Foundations Theory: Do Authoritarianism and Social Dominance Orientation Explain Liberal-Conservative Differences in “Moral” Intuitions? Social Justice Research, 27(4), 413–431.
Scharmer, A., & Snyder, M. (2021). Political message matching and green behaviors: Strengths and boundary conditions for promoting high-impact behavioral change. Journal of Environmental Psychology, 76, 101643.
Skitka, L. J., Bauman, C., & Sargis, E. (2005). Moral Conviction: Another Contributor to Attitude Strength or Something More? Journal of Personality and Social Psychology, 88(6), 895–917.
Wolsko, C., Ariceaga, H., & Seiden, J. (2016). Red, white, and blue enough to be green: Effects of moral framing on climate change attitudes and conservation behaviors. Journal of Experimental Social Psychology, 65, 7–19.


  1. I'm not including these works in the list of references to avoid giving the impression that they are relevant to this topic. ↩︎

  2. Imagine how much worse it is for claims made by your lecturer in these lecture notes. ↩︎