
Moral Foundations of Windows vs. Mac Users

Recently, the topic came up of whether values profiles (and moral foundations scores more specifically) predict behavior.  On the one hand, social and contextual factors often loom larger than individual factors in determining moral behavior.  On the other hand, it seemed rather unlikely that something as central as a person's values would not predict their behavior.  While the effects may be small and indirect in many cases, I would expect a person's value profile to predict almost everything they do in life.  As a test case, I decided to examine whether moral foundations scores, which measure how much a person's moral judgments are driven by concerns about harm, fairness, loyalty, authority, and purity, predict whether a visitor to YourMorals.org visited using a Mac vs. a PC.  Below is the graph.

The Values Profile of Mac vs. PC Users


While visitors to YourMorals.org are generally liberal, within this group Windows users look more conservative than Mac users: they appear to care less about the Harm foundation and more about Purity.  Note that while this isn't a representative sample, in some ways it is better for answering this question, since the users in this sample share so many characteristics that many variables are naturally controlled for.
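For readers curious about the mechanics, below is a minimal sketch of this kind of comparison, assuming a pandas DataFrame of YourMorals responses with a logged browser user-agent string. The column names (user_agent, harm, fairness, etc.) are hypothetical stand-ins, not the actual variable names in our dataset.

```python
# Sketch only: classify visitors by OS from the user-agent string and
# compare mean foundation scores. Column names are hypothetical.
import pandas as pd
from scipy import stats

def classify_os(user_agent: str) -> str:
    """Crudely bucket a browser user-agent string into Mac vs. Windows."""
    ua = user_agent.lower()
    if "macintosh" in ua or "mac os" in ua:
        return "Mac"
    if "windows" in ua:
        return "Windows"
    return "Other"

def compare_foundations(df: pd.DataFrame) -> pd.DataFrame:
    """Mean foundation score by OS, with a Welch t-test per foundation."""
    foundations = ["harm", "fairness", "ingroup", "authority", "purity"]
    df = df.assign(os=df["user_agent"].map(classify_os))
    df = df[df["os"].isin(["Mac", "Windows"])]
    rows = []
    for f in foundations:
        mac = df.loc[df["os"] == "Mac", f].dropna()
        win = df.loc[df["os"] == "Windows", f].dropna()
        t, p = stats.ttest_ind(mac, win, equal_var=False)
        rows.append({"foundation": f, "mac_mean": mac.mean(),
                     "windows_mean": win.mean(), "t": t, "p": p})
    return pd.DataFrame(rows)
```

The particular test matters less than the basic move: group visitors by operating system and compare their average foundation scores.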

The take home message for me is that while context certainly matters, so too do a person's values, even for relatively unrelated decisions, such as which computer to use in daily life.

- Ravi Iyer

The Moral Foundations of ThinkProgress, Alternet, Daily Kos, & the NY Times

Over the past couple of years, Jon Haidt's work has been covered by various liberal-leaning press organizations, including these articles from ThinkProgress, Alternet, Daily Kos, and the New York Times.

One of the great things about doing internet research is that web servers automatically collect information that makes it very easy to do cross-sample validation.  This information can also be used to compare the people who visited us from these articles. Which group is the most liberal and how do they compare on their moral foundations scores?

First, I thought I'd do a simple comparison of these groups.


There are fewer people from Daily Kos, so conclusions about that group are less certain (hence the larger error bars), but it looks like (unsurprisingly) all of these groups are liberal compared to people who find us via search engines, who tend to be only slightly liberal.  Their moral foundations scores show a correspondingly more liberal pattern, with higher Harm/Fairness scores and lower Ingroup/Authority/Purity scores.  Daily Kos readers are the most liberal, followed by ThinkProgress & Alternet, then NY Times readers, and finally people who found yourmorals.org via a search engine.
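As a rough illustration (not the exact code behind the graph), the sketch below computes the group means and standard errors that would drive those error bars, assuming a DataFrame with a referrer column plus one column per foundation; all column names are hypothetical.

```python
# Sketch only: mean and standard error of each foundation score by
# referring site. Small groups (e.g. Daily Kos) get larger standard errors.
import pandas as pd

FOUNDATIONS = ["harm", "fairness", "ingroup", "authority", "purity"]

def referrer_profile(df: pd.DataFrame) -> pd.DataFrame:
    grouped = df.groupby("referrer")[FOUNDATIONS]
    means = grouped.mean().add_suffix("_mean")
    sems = grouped.sem().add_suffix("_se")
    counts = grouped.size().rename("n")
    return means.join(sems).join(counts)
```

Running the same computation on a subset such as df[df["politics"] == "very liberal"] (again, a hypothetical column) gives the within-ideology comparisons shown further below.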

To me, the most interesting results are where groups appear to be equally liberal (ThinkProgress & Alternet) yet still differ.  ThinkProgress visitors appear especially low on Purity scores, while Alternet visitors appear significantly higher on Harm/Fairness scores.

An even stronger test of the kinds of people who use these websites is to control for how liberal (slightly, moderately, or extremely) individuals at these sites report themselves to be and to examine individuals within each group of liberals. Those results are below.

This is the graph for people who said they were “very liberal”.

These are the results for people who said they were “liberal”.

These are the results for people who said they were “slightly liberal”.  Interestingly, there weren’t enough slight liberals in the Daily Kos sample to include them in this graph.

The pattern seems fairly robust in that ThinkProgress visitors care less about Purity.  Perhaps they are less religious?  Alternet visitors seem to care more about Harm/Fairness.  Perhaps they are more empathically motivated and ThinkProgress visitors are more rationally oriented.  I don’t know enough about the liberal blogosphere to theorize well about why these differences exist, but I’m hopeful that by sharing these differences, others will be able to enlighten me.  At the very least, I hope readers of these sites will find it interesting.

Would you be interested in seeing how your group, or visitors to your website, compare to others on the moral foundations questionnaire?  You may have noticed a small “create a group” link on the explore page of yourmorals.org, which lets you create a custom URL.  Each visitor who arrives through that URL gets graphs comparing their individual scores not only to other liberals/conservatives, but also to members of their group, and comparing their group's scores to the average liberal/conservative.  Once you create those URLs, you can put them into blog posts, articles, or emails targeting your group.  We are still beta testing the feature, but would welcome anyone who wants to try it out and who perhaps has feedback on how we can improve it.

- Ravi Iyer

The Case for Honesty as a Moral Foundation

I was immediately attracted to Moral Foundations Theory (MFT) due to the utility of breaking down partisan and policy differences into questions of what one values. The idea that different people believe in different moral principles is one of those obvious ideas that is still underappreciated in everyday life, where we attribute differences to ignorance, stupidity, or evil rather than to underlying value differences.

However, I have never been convinced that there are exactly five foundations, or even that thinking of moral concerns as categorically ‘foundational’ is better than thinking of them in some less categorical way. Fortunately, those who originally conceived of Moral Foundations Theory do not require such homogeneous thinking and even welcome the idea that the five-foundation model is likely to undergo changes. I have previously outlined a few changes I would make, as well as the criteria one might use to posit a new moral category. Even if one does not believe in the categorical distinction that some moral concerns are ‘foundations’ while others are not, it seems clear that some moral concerns are more common, distinct, and important than others. I would now like to make the case for honesty.

Honesty is common.

One of the distinctive traits of MFT is its evolutionary focus. People moralize various things (e.g. eating pork or driving while using a cellphone) in various cultures, but the purpose is to identify those moral concerns that appear cross-culturally and have an innate quality. Innate, in this instance, means “organized ahead of experience”, such that people can make intuitive judgments beyond their socialization. Put more concretely, if concern about honesty is innate and universal, one might expect individuals to be able to intuitively signal and detect honesty in others, as shown by this study, in which participants are fairly successful in figuring out who will cooperate or cheat. The idea that concern about honesty is universal enough that one might posit an evolutionary story is almost self-evident, but this paper provides evolutionary models of how honesty might evolve. If one subscribes to the evolution of groups that out-compete other groups, one can witness the evolution of honesty in modern society, as nations with low levels of corruption tend to have better economies than countries with high levels of corruption, mirroring the theorized evolutionary processes.

Honesty is distinct.

The same paper I cited above has some evidence for this, but from the perspective of Moral Foundations Theory, it would be useful to show that honesty is distinct from other moral concerns. We asked users on YourMorals 4 questions about honesty (alpha=.69, .76 if we remove the relevance question) in addition to the standard Moral Foundations Questionnaire that measures the existing five foundational concerns. Factor analyses tell the same story, but examining the correlations tells the story more simply. Specifically, the highest correlation between endorsement of honesty and any other foundation is .31 (with Purity), while all other foundations have fairly high inter-correlations with other foundations (e.g. Purity/Authority/Ingroup inter-correlate >.5, Harm/Fairness inter-correlation = .57). Concern about honesty is empirically distinct from other moral concerns.
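For concreteness, here is a sketch of the two checks just described: Cronbach's alpha for the honesty items and the correlation of the honesty scale with each foundation. The column names (honesty_1 through honesty_4, plus one column per foundation) are hypothetical.

```python
# Sketch only: internal consistency of the honesty items and their
# correlation with each foundation score. Column names are hypothetical.
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of the total)."""
    items = items.dropna()
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

def honesty_correlations(df: pd.DataFrame) -> pd.Series:
    """Correlation of the (averaged) honesty scale with each foundation."""
    foundations = ["harm", "fairness", "ingroup", "authority", "purity"]
    honesty_items = [c for c in df.columns if c.startswith("honesty_")]
    honesty = df[honesty_items].mean(axis=1)
    return df[foundations].corrwith(honesty)
```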

Honesty is important.

The pragmatic utility of using the moral foundations to predict ideological differences is perhaps the primary contribution of MFT to date. Are questions about honesty also pragmatically useful?

On a 7-point scale, those who are more conservative endorse questions about honesty more than those who are liberal, but the amount of variance in political attitudes predicted by endorsement of honesty is smaller, though still significant, than for the other foundations (beta = .10 vs. other foundations, which range from .12 (ingroup) to .33 (purity)). However, if we look at economic conservatism, endorsing honesty does predict identification as economically conservative (beta = .13), as do authority, ingroup, and purity concerns (betas = .10, .09, and .11).

I looked at some political attitude variables and the predictive power of endorsing honesty was not impressive. However, endorsement of honesty is a strong negative predictor (in a regression equation, including the other five foundations) of psychopathy (beta = -.23) and utilitarianism (beta = -.26, e.g. willingness to sacrifice one life to save five others). Measurement of endorsement of honesty may have important pragmatic utility, but not for political outcomes.
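The betas above are standardized regression coefficients. A minimal sketch of that style of analysis, assuming statsmodels is available and using hypothetical column names, would look like this:

```python
# Sketch only: z-score predictors and outcome so the OLS coefficients are
# standardized betas. Column names are hypothetical.
import pandas as pd
import statsmodels.api as sm

def standardized_betas(df: pd.DataFrame, outcome: str) -> pd.Series:
    predictors = ["harm", "fairness", "ingroup", "authority", "purity", "honesty"]
    data = df[predictors + [outcome]].dropna()
    z = (data - data.mean()) / data.std(ddof=1)  # z-score every column
    model = sm.OLS(z[outcome], sm.add_constant(z[predictors])).fit()
    return model.params.drop("const")

# e.g. standardized_betas(df, "psychopathy") or
#      standardized_betas(df, "economic_conservatism")
```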

- Ravi Iyer

Does gratitude promote a sense of fairness and equality?

Gratitude has been theorized to be a moral emotion, yet it has largely been studied for its hedonic benefits rather than its effect on moral reasoning.  I had previously done some analyses on our data at yourmorals.org in which scores on the gratitude scale were positively related to nearly all measures of moral reasoning.  By itself, this isn't particularly interesting, as there are many possible interpretations.  People who have nice things happen to them may feel grateful and also be nice people.  Nicer, more moral people may do good things in life and may receive benefits for them, for which they are grateful.  The numerous interpretations make any conclusion difficult.

As such, I decided to test the effects of gratitude on moral reasoning by placing a simple manipulation before the moral foundations questionnaire: participants were asked to write about 5 things they were grateful for, 5 hassles from their life, or 5 neutral events.  Below are the results for ~1500 participants.  Generally, it seems gratitude makes people more morally liberal: when I examined the standard liberal/conservative moral split (Harm & Fairness minus Authority, Ingroup, & Purity), there was a marginally significant relationship (p=.06) between being in the gratitude condition and having a larger liberal split.  The effect sizes are obviously small, but those in the gratitude condition appear to endorse the fairness foundation more (p<.01) and the authority foundation less (p<.05).

Gratitude manipulation and Moral Foundations Questionnaire scores
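For those who want the computation spelled out, here is a sketch of the liberal/conservative moral split and the condition comparison described above. The column names (condition plus one column per foundation) are hypothetical, and operationalizing the split as a difference of subscale means is one reasonable choice among several.

```python
# Sketch only: compute the liberal moral split (mean of Harm & Fairness
# minus mean of Authority, Ingroup, & Purity) and compare the gratitude
# condition to the other conditions. Column names are hypothetical.
import pandas as pd
from scipy import stats

def condition_effect(df: pd.DataFrame) -> dict:
    split = (df[["harm", "fairness"]].mean(axis=1)
             - df[["authority", "ingroup", "purity"]].mean(axis=1))
    df = df.assign(liberal_split=split)
    grateful = df.loc[df["condition"] == "gratitude", "liberal_split"].dropna()
    other = df.loc[df["condition"] != "gratitude", "liberal_split"].dropna()
    t, p = stats.ttest_ind(grateful, other, equal_var=False)
    return {"t": t, "p": p}
```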

I’m not sure how to interpret this result.  It may just be random error.  To explore the result further, I looked at the individual fairness questions.

Gratitude and Fairness

The fact that the gratitude manipulation has a fairly homogeneous effect at the question level is promising.  Fairness can be thought of in many different ways: as a concern for equality, or as a concern about people not getting what they deserve.  The “RICH” and “TREATED” questions appear to show the biggest effect, and they are the most indicative of a concern for equality (see question text below).  I could imagine a theoretical argument for this link: being grateful and satisfied with one's situation allows one the luxury of being generous and worrying about equal treatment.  There is research indicating that being grateful motivates prosocial behavior (also see this article).

Here is a list of fairness questions:

TREATED – Whether or not some people were treated differently than others

UNFAIRLY – Whether or not someone acted unfairly

RIGHTS – Whether or not someone was denied his or her rights

FAIRLY – When the government makes laws, the number one principle should be ensuring that everyone is treated fairly.

JUSTICE – Justice is the most important requirement for a society.

RICH – I think it’s morally wrong that rich children inherit a lot of money while poor children inherit nothing.

Still, I’m not 100% convinced of these results given the small effect sizes, and I will likely have to do more studies to confirm whether this effect is replicable or just an artifact of noisy data.  Another way to look at the reliability of these effects is to examine whether they are consistent across groups.  The effect on fairness does appear to be consistent across groups.

Gratitude and Fairness for Liberals, Conservatives, and Libertarians

The effect is less consistent for the Authority foundation, though it is perhaps worth considering why grateful libertarians might endorse authority less.  Perhaps the only reason for libertarians to value authority is out of a sense of insecurity.  For example, the libertarian party does espouse the idea that the only role of government is to provide security for property rights.  If that security is provided, perhaps libertarians see no need for any authority?

Gratitude and Authority for liberals, conservatives, and libertarians

I’m not sure if I have enough evidence for a paper.  All research is somewhere between zero and one in terms of its conclusiveness, and these results may be too preliminary to reach the somewhat arbitrary standard of paper-hood.  I could clearly strengthen these results with a regression analysis of our large correlational dataset that confirms these patterns.  I’ll have to get feedback from more objective parties.
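That follow-up might look something like the sketch below: regress fairness endorsement on trait gratitude in the large correlational dataset while controlling for political identification. The column names, and the choice of controls, are hypothetical.

```python
# Sketch only: does trait gratitude predict fairness endorsement after
# controlling for ideology? 'politics' is assumed to be a numeric
# liberal-conservative self-placement; all names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

def gratitude_predicts_fairness(df: pd.DataFrame) -> pd.Series:
    data = df.dropna(subset=["fairness", "gratitude", "politics"])
    model = smf.ols("fairness ~ gratitude + politics", data=data).fit()
    return model.params
```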

What are the basic foundations of morality?

A few years ago, I was fortunate to catch a wonderful talk by Jon Haidt at the Gallup Positive Psychology Summit about moral foundations theory, which seeks to determine the fundamental systems of morality.  I sought to use his scale in my work, and using that scale eventually grew into our current collaboration (along with Jesse Graham, Pete Ditto, and Sena Koleva) on yourmorals.org, where the main instrument of moral foundations theory, the moral foundations questionnaire, is available.

The moral foundations questionnaire measures 5 foundations.  The below descriptions are taken from the moral foundations theory webpage.

 

1) Harm/care, related to our long evolution as mammals with attachment systems and an ability to feel (and dislike) the pain of others. This foundation underlies virtues of kindness, gentleness, and nurturance.

2) Fairness/reciprocity, related to the evolutionary process of reciprocal altruism. This foundation generates ideas of justice, rights, and autonomy.

3) Ingroup/loyalty, related to our long history as tribal creatures able to form shifting coalitions. This foundation underlies virtues of patriotism and self-sacrifice for the group. It is active anytime people feel that it’s “one for all, and all for one.”

4) Authority/respect, shaped by our long primate history of hierarchical social interactions. This foundation underlies virtues of leadership and followership, including deference to legitimate authority and respect for traditions.

5) Purity/sanctity, shaped by the psychology of disgust and contamination. This foundation underlies religious notions of striving to live in an elevated, less carnal, more noble way. It underlies the widespread idea that the body is a temple which can be desecrated by immoral activities and contaminants (an idea not unique to religious traditions).

According to Jon Haidt, “Moral systems are interlocking sets of values, virtues, norms, practices, identities, institutions, technologies, and evolved psychological mechanisms that work together to suppress or regulate selfishness and make social life possible.”

Perhaps one of the most compelling parts of the theory is that it invites people to try and posit a 6th foundation.  There was even a prize offered by Jon to those who succeeded and a number of possible candidates are listed here.

How can we determine what is or is not a foundation?  Some of the criteria are listed on the above webpage.  Borrowing from a recent lecture I attended on approaches to develop foundations of ‘personality’, I would list the below criteria as important.

With that in mind, I would offer these potential modifications of our initial foundations.

 

These are merely hypotheses and opinions, so take them for what they're worth.  It is also important to note that the possibility of refining a theory doesn’t reduce the importance or contribution of that theory.  Indeed, the fact that I (and many others) have posted about refining it shows that the theory has had a significant impact on public discourse and is worthy of refinement.