Benjamin Franklin tracked his prideful, sloppy, and gluttonous acts in a daily journal, marking each moral failing with a black ink dot. Now, scientists have devised a modern update to Franklin’s little book, using smartphones to track the sins and good deeds of more than 1200 people. The new data—among the first to be gathered on moral behavior outside of the lab—confirm what psychologists have long suspected: Religious and nonreligious people are equally prone to immoral acts.
Lab studies have backed that view by asking participants to interpret moral vignettes or play games that tempt players to cheat, says Jonathan Haidt, a social psychologist at New York University in New York City. In a 2008 review for Science, for example, researchers found that believers act more morally than nonreligious people only when interacting with other members of their own religious community. Such selectivity makes sense from an evolutionary perspective, Haidt says. If, as some scientists hypothesize, religion evolved to increase social cohesion, it shouldn’t just make you “blindly nice to everybody; it should make you more virtuous when you are interacting with others of the same faith.”
Lab studies have limitations, however. The artificial scenarios they rely on can’t tell researchers much about how religious and nonreligious people behave in daily life, or whether moral considerations are “even relevant” to how people actually behave, says Daniel Wisneski, a moral psychologist at Saint Peter’s University in Jersey City, New Jersey, and a co-author of the new study, which appears online today in Science.
Wisneski and colleagues used Craigslist, Facebook, Twitter, and other outlets to recruit 1252 adults ages 18 to 68 throughout the United States and Canada. Tempted by the possibility of winning an iPod Touch through a lottery, participants downloaded an app to their smartphones, which allowed researchers to buzz them via text five times a day between 9 a.m. and 9 p.m. Each text prompted participants to follow a link where they could confidentially report whether they’d witnessed, heard about, or performed any moral or immoral acts within the past hour, and jot down a description. They also entered details about how intensely they felt about the event, rating emotions such as disgust on a 0 to 5 scale.
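The core of this experience-sampling design—five prompts per day scattered across a 9 a.m. to 9 p.m. window—can be sketched in a few lines. This is only an illustration of the general technique: the article does not describe the app's actual scheduling rules, so drawing the prompt times uniformly at random is an assumption, and the function names here are invented for the sketch.

```python
import random

def daily_prompt_times(n_prompts=5, start_hour=9, end_hour=21, seed=None):
    """Draw prompt times (minutes since midnight) within the sampling
    window and return them in chronological order.

    The five-per-day count and the 12-hour window come from the study
    description; uniform random placement is an assumption.
    """
    rng = random.Random(seed)
    window = range(start_hour * 60, end_hour * 60)
    # sample() guarantees five distinct minutes within the window
    return sorted(rng.sample(window, n_prompts))

def format_time(minutes):
    """Render minutes-since-midnight as HH:MM for a day's schedule."""
    return f"{minutes // 60:02d}:{minutes % 60:02d}"
```

A day's schedule would then be something like `[format_time(t) for t in daily_prompt_times()]`, producing five timestamps between 09:00 and 20:59.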
Reading through the 13,240 messages that the team received over the course of the 3-day study “was an interesting process,” Wisneski says. Participants confessed offenses both tawdry and peculiar: “Arranging adulterous encounter” and “[h]ired someone to kill a muskrat that’s ultimately not causing any harm” were two examples. Although Wisneski says that the negative reports periodically got him down, tidings of good deeds soon lifted his spirits. One person said that they “gave a homeless man an extra sandwich,” for instance, and another reported hearing about an organization that “freed Beagles that had never seen daylight or felt grass.”
Overall, people who had identified themselves as religious or nonreligious when they registered for the study committed both moral and immoral deeds with “comparable frequency,” the team reports. Unsurprisingly, being the target of a positive moral act made people feel slightly better than actually performing one, the researchers found. Benefiting from a good deed made participants more likely to do something nice for someone else later on, a phenomenon known as moral contagion, Haidt says.
The study also confirmed that people with different political views emphasize different moral values. Many of the reported moral acts centered on avoiding harm to others or protecting people from oppression. But other values were at play, too. Wisneski and lead author Wilhelm Hofmann spent weeks classifying the reported acts according to six moral principles identified by Haidt and his colleagues: care for others, fairness, liberty, loyalty, authority, and sanctity. They found that conservatives were more likely than liberals to report acts involving sanctity and respect for authority, and liberals were more likely than conservatives to talk about fairness—a result that replicates earlier findings in the lab, Haidt says. In addition to Haidt’s six original values, the team found that participants’ judgments reflected two others, honesty and self-discipline, which they used to classify behaviors such as sneaking fast food “though I promised someone I wouldn’t have it.”
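The coding step described above—sorting free-text reports into moral categories and comparing how often each political group invokes each one—amounts to a simple cross-tabulation. The sketch below uses a tiny, entirely hypothetical set of coded reports (the real team hand-classified more than 13,000 messages); the category labels follow the foundations named in the article.

```python
from collections import Counter

FOUNDATIONS = {"care", "fairness", "liberty", "loyalty", "authority",
               "sanctity", "honesty", "self-discipline"}

def counts_by_group(coded_reports):
    """Tally how often each group's reports invoke each moral foundation.

    coded_reports: iterable of (group, foundation) pairs, where the
    foundation label has already been assigned by a human coder.
    """
    tallies = {}
    for group, foundation in coded_reports:
        if foundation not in FOUNDATIONS:
            raise ValueError(f"unknown foundation: {foundation}")
        tallies.setdefault(group, Counter())[foundation] += 1
    return tallies

# Hypothetical miniature sample, for illustration only.
sample = [
    ("conservative", "sanctity"),
    ("conservative", "authority"),
    ("conservative", "care"),
    ("liberal", "fairness"),
    ("liberal", "fairness"),
    ("liberal", "care"),
]
```

With real data, comparing `counts_by_group(...)["liberal"]` against the conservative tallies is what surfaces patterns like the fairness-versus-sanctity split the researchers report.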
An obvious weakness of the study is that people’s view of themselves may color how they report their own behavior, says Fiery Cushman, a moral psychologist at Harvard University. Still, it’s reassuring to see phenomena such as moral contagion, which have been observed in experiments, replicated in everyday life, he says. “It’s kind of a report card on what we’ve learned from the lab.”