The essence of all morality is this: to believe that every human being is of infinite importance, and therefore that no consideration of expediency can justify oppression of one by another. But to believe that is necessarily to believe in God.
Richard Henry Tawney[1]
Never let your sense of morals prevent you from doing what is right.
Isaac Asimov

Is there any hope for humans to agree on a universal moral code of conduct? After all, people everywhere seem to have an “innate” sense of right and wrong. The great medieval Christian theologian and philosopher Thomas Aquinas thought that God implanted moral knowledge in the reason of all humans. Aquinas held that we know immediately, by inclination, what counts as good and thus to be pursued (he mentions things like life, procreation, knowledge, society, and reasonable conduct). So, any rational person in any society can know the general kinds of actions that morality prohibits and the ones it allows. This is the essential idea behind Aquinas’ “Natural Law” theory of ethics, a system of moral law that is purportedly determined by nature.[2]
Aquinas was right that humans know immediately, by inclination, that some things are “good” and some are “bad.” So why is there so much immoral behavior in the world? Does our sense of morality follow from instinct and emotion rather than reason? To begin with, many of the behaviors that are fundamental to the development of moral systems are found to some degree in most primate species.
Many primates have similar methods to humans for resolving, managing, and preventing conflicts of interest within their groups. Macaques and chimpanzees have a sense of social order and rules of expected behavior, mostly to do with the hierarchical natures of their societies, in which each member knows its own place. Young rhesus monkeys learn quickly how to behave, and occasionally get a finger or toe bitten off as punishment. Primates also have a sense of reciprocity and fairness; they remember those who did them favors and those who did them wrong. Chimps are more likely to share food with those who have groomed them. Such methods are the building blocks of moral systems.[3]
Should we judge the action of a monkey or a chimpanzee as moral or immoral? To do so we require the action to be the result of willful behavior and self-awareness, and only humans have this capability (see Idea 15). Some animals are sensitive to the plight of others. Chimpanzees, for instance, have drowned in zoo moats trying to save others (they cannot swim). Given the chance to get food by pulling a chain that would also deliver an electric shock to a companion, rhesus monkeys will starve themselves for several days.[4]
The Dutch primatologist Frans de Waal and other experts argue that these social behaviors are the precursors of human morality. They further believe that if morality grew out of behavioral rules shaped by evolution, it is for biologists, not philosophers or theologians, to say what these rules are. Darwin himself had this insight. In The Descent of Man, he wrote: “[T]he social instincts – the prime principle of man’s moral constitution – with the aid of active intellectual powers and the effects of habit, naturally lead to the golden rule, ‘As ye would that men should do to you, do ye to them likewise’; and this lies at the foundation of morality.”[5]
While philosophers and theologians scoffed at Darwin’s idea, the evidence of recent years has tipped the balance in his favor. Indeed, there is evidence that humans evolved a moral instinct, a capacity that naturally grows within each one of us and generates rapid judgments about what is morally right or wrong. We know, for instance, that young infants possess an elementary sense of justice and an understanding of sharing and fairness.
In one recent study researchers found that 15-month-old infants could already identify unequal distributions of food and drink, and that this sense of fairness was connected to their own willingness to share. Other studies have shown that infants in the second year of life already possess context-sensitive expectations relevant to fairness. The accumulating evidence that the roots of a basic sense of fairness and altruism can be found so early in infancy supports Darwin’s arguments for an evolutionary basis of human morality.[6]
This does not mean that the moral education of children is a waste of time. For starters, the “innate” sense of right and wrong that children possess is rudimentary, diverging in significant ways from what adults may want it to be, which means socialization remains important. Young children can be very self-focused (they want their toy and may not want to share). They also favor their own kind: research has shown that 3-month-olds prefer faces of the race most familiar to them; 11-month-olds prefer individuals who share their own taste in food and expect these individuals to be nicer than those with different tastes; and 12-month-olds prefer to learn from someone who speaks their own language over someone who speaks a foreign language.[7]
As children mature, so does their moral development. Building on the work of the Swiss psychologist Jean Piaget, the American psychologist Lawrence Kohlberg identified six distinct moral developmental stages in humans, which he grouped into three levels: preconventional, conventional, and postconventional:

· Preconventional morality is the first level of moral development and lasts until approximately age 9. At the preconventional level, children do not yet have a personal code of morality; instead, their moral decisions are shaped by the standards of adults and the consequences of following or breaking adults’ rules.

· At the conventional level, morality is characterized by an acceptance of society’s conventions concerning right and wrong. Children at this level generally care a great deal about conformity, and they have great respect for authority (in theory if not in practice).

· At the postconventional level, there is a growing realization that individuals are separate entities from society, and that the individual’s own perspective may take precedence over society’s view; individuals may disobey rules inconsistent with their own principles. At this level, moral reasoning is based on abstract reasoning using universal ethical principles.[8]
Box 6. How Disgust Explains Everything1

Can disgust help us understand morality, political orientation, and culture wars? Some researchers claim so. Disgust is an emotion that is universally recognized. Most of us have strong dislikes for certain foods or smells. Disgust is also a powerful motivator; research suggests that disgust is the energy powering a whole host of seemingly unrelated phenomena, from our culture wars to the existence of kosher laws to political ideology.

According to the psychologist Paul Rozin, disgust began with food. Humans have immense dietary flexibility: unlike koalas, which eat almost nothing but eucalyptus leaves, humans must survey a vast range of eating options and figure out what to put in our mouths. (The phrase “omnivore’s dilemma” was originally coined by Rozin.) Disgust, he argued, evolved as one of the great determinants of what to eat. If a person had zero sense of disgust, they might eat something gross and die; if a person was too easily disgusted, they would probably fail to consume enough calories and would also die. It was best to be somewhere in the middle, approaching food with a blend of neophobia (fear of the new) and neophilia (love of the new).

Researchers, including Rozin, contend that disgust is a primary driver not only of our selection of food, but also of moral reasoning. Indeed, experiments by Jonathan Haidt, a former student of Rozin, and his associates show that disgust and other social emotions greatly influence our sense of morality. They also show that disgust sensitivity is really measuring our feelings about (moral) purity and (immoral) pollution. These, in turn, contribute to our construction of moral systems, and it is our moral systems that also guide our political orientations. A 2014 study suggests that disgust is an accurate predictor of political orientation; conservatives displayed a far higher disgust response than liberals.

In the study, participants were shown a range of images, some disgusting, some not, while having their brain responses monitored. By analyzing this fMRI data, researchers could predict a person’s political orientation with striking accuracy. What could explain this? As many parents know, repeated exposure to things that are “disgusting” usually decreases the feeling of disgust, as happens when one learns to change a baby’s diaper. This may also explain why liberals tend to have a lower disgust response than conservatives: liberals are much more exposed to people and ways of life different from their own. Could exposing a conservative to unfamiliar cultures make that person more liberal? This is not clear. It could well be that liberals are simply more tolerant of feelings of disgust and more motivated to explore different ways of life. Nonetheless, this kind of research is valuable because it sheds light on people’s emotions, which are crucial to our ability to cooperate and live together in society.

1. “How Disgust Explains Everything,” by Molly Young, The New York Times, Dec 27, 2021. See also Mlodinow, L. (2022). Emotional: How Feelings Shape Our Thinking. Pantheon Books, New York.
The human nature envisaged by Kohlberg’s development stages is one in which humans are born amoral and then progressively mature until they reach an “enlightened” stage of moral reasoning. The stages, according to Kohlberg, emerge from our own thinking about moral problems. As we get into discussions and debates with others, we find our views questioned and challenged and are therefore motivated to come up with new, more comprehensive positions: people use logic and experimentation to figure out the truth for themselves. For Kohlberg, justice is the essential characteristic of moral reasoning. In actual practice, says Kohlberg, we can reach just decisions by looking at a situation through one another’s eyes.[9]
Human morality: driven by passion more than by reason
Moral reasoning appears to be less relevant to moral action than Kohlberg's theory suggests. While adults may possess the additional critical capacity of being able to consciously reason about morality, they are not otherwise that different from children in that their moral feelings are often instinctive. In fact, as I have mentioned above, the evidence shows that individuals often make moral judgments without weighing concerns such as fairness, law, human rights, or abstract ethical values. Instead, moral reasoning is a post hoc rationalization of intuitive decisions. In this sense, moral reasoning is not different from other types of human reasoning in that it is carried out largely for the purpose of persuasion, rather than discovery (see Idea 10).
Consider the study that psychologist Jonathan Haidt describes in his book The Righteous Mind, in which he examines taboo violations.[10] Haidt and his colleagues presented the participants with stories and then asked questions aimed at eliciting some type of moral judgment.
One of the stories, for example, involves incest, one of the great taboos: Julie and Mark, who are sister and brother, are traveling together in France. They are both on summer vacation from college. One night they are staying alone in a cabin near the beach. They decide that it would be interesting and fun if they tried making love. At the very least it would be a new experience for each of them. Julie is already taking birth control pills, but Mark uses a condom too, just to be safe. They both enjoy it, but they decide not to do it again. They keep that night as a special secret between them, which makes them feel even closer to each other. So, what do you think about this? Was it wrong for them to have sex?
In another story, Jennifer, a woman who works in a hospital and is charged with the task of incinerating corpses, decides one night to cut off a piece of flesh from a fresh cadaver. She then goes home and proceeds to cook the piece of flesh and eat it.
Most people were disgusted by the actions in the stories and judged the characters immoral. According to Haidt, 20% of the subjects said it was OK for Julie and Mark to make love, and 13% said it was OK for Jennifer to eat part of the cadaver.
What is fascinating are the reasons that people gave to justify their judgments. For example, when asked why she thought Julie and Mark should not have sex, one participant explained that if Julie did get pregnant her offspring would most likely become deformed. But when the experimenter clarified that they used a condom and birth control pills, then the subject came up with another reason, and so on, until finally she accepted that she just “knew” that it was wrong. This was the general pattern, as most of the subjects did not change their judgments even after their reasons were disproven by the experimenter.[11]
Other experiments confirm that people make moral judgments rapidly and emotionally. They provide further support to the claim that moral reasoning is mostly just a post hoc search for reasons to justify the judgments people make.[12] This does not mean that independent, reasoned judgment is not possible, only that it is rare in practice. For example, people use reasoning to detect moral inconsistencies in others and in themselves, or when moral intuitions conflict, or are absent. People are also able to reappraise their emotions in a way that diminishes the intensity of the emotional experience, which can lead to more deliberative moral judgments.[13]
Most of the time, though, moral judgments are made intuitively, with little deliberation or conscious weighing of evidence and alternatives. The results do not support Aquinas or Kohlberg, but the Scottish philosopher David Hume, who concluded that desire rather than reason governed human behavior, famously saying: “Reason is, and ought only to be the slave of the passions.”
In summary: Use reason, not solely instinct, for complex moral choices
Overall, the findings reveal a complex picture in which morality emerges from the interaction of multiple psychological building blocks within each person, and from the interaction of the person with other members of society. There is growing evidence that these building blocks are innate and are the product of evolution, with natural selection playing a critical role. As the individual matures within a cultural context, these building blocks are assembled into coherent moralities. Not surprisingly given all these factors, morality varies greatly across individuals as well as cultures. It seems to depend on factors such as the nature of economic activity, form of government, frequency of warfare, and the strength of national institutions (see also Idea 16).
One important lesson is that we should not rely solely on our instincts and emotions to make moral decisions. We should not let our guard down even if we abide by a moral code. Our powerful ability to rationalize our decisions can justify even the most ludicrous actions. One example is how Christian slaveholders in America cherry-picked texts in the Old Testament to justify slavery on Biblical grounds. As we shall also see next, we should approach moral dilemmas from different angles, and avoid dogmatic approaches.
[1] Tawney writing in his commonplace book, quoted in Wright, A. (1990). R. H. Tawney, Manchester University Press, p.19.
[2] See, e.g., Stanford Encyclopedia of Philosophy: http://plato.stanford.edu/entries/natural-law-ethics/.
[3] Flack, J. C. & de Waal, F. B. M. (2000). Any animal whatever: Darwinian building blocks of morality in monkeys and apes. In Leonard D. Katz (Ed.), Evolutionary Origins of Morality, UK: Imprint Academic.
[4] de Waal, F. B. M. (2006). Primates and Philosophers: How Morality Evolved, Princeton Science Library.
[5] Darwin, C. R. (1874). The Descent of Man, and Selection in Relation to Sex, 2nd edition, New York: Hurst, pp.285-86.
[6] Schmidt, M. F. H. & Sommerville, J. A. (2011). Fairness expectations and altruistic sharing in 15-month-old human infants. PLoS One, 6(10): e23223. doi:10.1371/journal.pone.0023223.
[7] See, e.g., “The Moral Life of Babies,” by Paul Bloom, The New York Times, May 5, 2010.
[8] Kohlberg, L. (1981). Essays on Moral Development, Vol. I: The Philosophy of Moral Development. Harper & Row.
[9] Crain, W. C. (1985). Theories of Development, Prentice-Hall, pp.118-136.
[10] Haidt, J. (2012). The Righteous Mind: Why Good People Are Divided by Politics and Religion, Pantheon Books, p.38.
[11] Ibid., pp.39-40.
[12] See, e.g., Haidt, J. (2001). The emotional dog and its rational tail: A social intuitionist approach to moral judgment. Psychological Review, 108, 814-834.
[13] Feinberg, M. et al. (2012). Liberating reason from the passions: Overriding intuitionist moral judgments through emotion reappraisal. Psychological Science, published online May 25, 2012, 1-8.