Moral psychology

Moral psychology is a field of study in both philosophy and psychology. Some use the term "moral psychology" relatively narrowly to refer to the study of moral development.[1] However, others tend to use the term more broadly to include any topics at the intersection of ethics, psychology, and philosophy of mind.[2][3] Some of the main topics of the field are moral judgment, moral reasoning, moral sensitivity, moral responsibility, moral motivation, moral identity, moral action, moral development, moral diversity, moral character (especially as related to virtue ethics), altruism, psychological egoism, moral luck, moral forecasting, moral emotion, affective forecasting, and moral disagreement.[4][5]

A moral act is a type of behavior that has either a moral or immoral consequence. In many cultures, a moral act refers to an act that entails free will, purity, liberty, honesty, and meaning, whereas an immoral act refers to an act that entails corruption and fraudulence.

Some psychologists who have worked in the field are Jean Piaget, Lawrence Kohlberg, Carol Gilligan, Elliot Turiel, Jonathan Haidt, Linda Skitka, Leland Saunders, Marc Hauser, C. Daniel Batson, Jean Decety, Joshua Greene, A. Peter McGraw, Philip Tetlock, Darcia Narvaez, Tobias Krettenauer, Liane Young, Daniel Hart, Suzanne Fegley, and Fiery Cushman. Philosophers who have worked in the field include Stephen Stich, John Doris, Joshua Knobe, John Mikhail, Shaun Nichols, Thomas Nagel, Robert C. Roberts, Jesse Prinz, Michael Smith, and R. Jay Wallace.

Background

Moral psychology began with early philosophers such as Aristotle, Plato, and Socrates. They believed that "to know the good is to do the good". They analyzed the ways in which people make decisions with regard to moral identity. As the field of psychology began to separate from philosophy, moral psychology expanded to include risk perception and moralization, morality with regard to medical practices, concepts of self-worth, and the role of emotions in analyzing one's moral identity. In most introductory psychology courses, students learn about moral psychology by studying the psychologist Lawrence Kohlberg,[6] who introduced his theory of moral development in 1969. The theory was built on Piaget's observation that children develop intuitions about justice that they can only later articulate, with increasingly sophisticated articulation of reasoning serving as a sign of development. On this account, justice-centered moral cognition and the moral action it guides increase together with development, culminating in a postconventional thinker who can "do no other" than what is reasoned to be the most moral action. But researchers using the Kohlberg model found a gap between what people said was most moral and the actions they took. Today, some psychologists and students alike rely on Augusto Blasi's self-model, which links moral judgment and action through moral commitment. Those with moral goals central to the self-concept are more likely to take moral action, as they feel a greater obligation to do so; those who are so motivated attain a distinct moral identity.[7]

History

Historically, early philosophers such as Aristotle and Plato engaged in both empirical research and a priori conceptual analysis about the ways in which people make decisions about issues that raise moral concerns. With the development of psychology as a discipline separate from philosophy, it was natural for psychologists to continue pursuing work in moral psychology, and much of the empirical research of the 20th century in this area was completed by academics working in psychology departments.

Today, moral psychology is a thriving area of research in both philosophy and psychology, even at an interdisciplinary level.[8] For example, the psychologist Lawrence Kohlberg questioned boys and young men about their thought processes when faced with a moral dilemma,[9][10] producing one of many useful empirical studies in the area of moral psychology. As another example, the philosopher Joshua Knobe completed an empirical study on how the way in which an ethical problem is phrased dramatically affects an individual's intuitions about the proper moral response to it. More conceptually focused research has been completed by researchers such as John Doris. Doris discusses the way in which social psychological experiments (such as the Stanford prison experiment, involving the idea of situationism) call into question a key component of virtue ethics: the idea that individuals have a single, environment-independent moral character.[11] As a further example, Shaun Nichols (2004) examines how empirical data on psychopathology suggest that moral rationalism is false.[12]

Key theorists in moral psychology

Lawrence Kohlberg was one of the forerunners in developing an empirical measure of moral reasoning. He proposed six stages, grouped into three levels, of moral reasoning that he believed to be universal to all people in all cultures.[13] Throughout his career, he worked to narrow his focus ever deeper toward a bedrock of human moral reasoning. One of his best-known accomplishments was the development of a moral dilemma known as the Heinz dilemma, in which the reader follows a character, Heinz, as he faces the choice of whether to steal a drug that might save his wife from death by cancer. Participants were asked whether Heinz should steal the drug and to provide their reasoning for why or why not; from that reasoning, Kohlberg could reliably draw an empirical conclusion about the moral stage the participant was in.

Measures

Philosophers and psychologists have created structured interviews and surveys as a means to study moral psychology and its development.

Interview techniques

Since at least 1894, philosophers and psychologists have attempted to empirically evaluate the morality of an individual, especially attempting to distinguish adults from children in terms of their judgment, but early efforts failed because they "attempted to quantify how much morality an individual had—a notably contentious idea—rather than understand the individual's psychological representation of morality".[14] Lawrence Kohlberg addressed that difficulty in 1963 by modeling evaluative diversity as reflecting a series of developmental stages (à la Jean Piaget). Lawrence Kohlberg's stages of moral development are:[15]

  1. Obedience and punishment orientation
  2. Self-interest orientation
  3. Interpersonal accord and conformity
  4. Authority and social-order maintaining orientation
  5. Social contract orientation
  6. Universal ethical principles

Stages 1 and 2 are combined into a single stage labeled "pre-conventional", and stages 5 and 6 into a single stage labeled "post-conventional", because responses within each pair cannot be reliably distinguished; psychologists can consistently categorize subjects into the resulting four stages using the "Moral Judgement Interview", which asks subjects why they endorse the answers they give to a standard set of moral dilemmas.[10]

Rather than confirm the existence of a single highest stage, Larry Walker's cluster analysis of a wide variety of interview and survey variables for moral exemplars found three types: the "caring" or "communal" cluster was strongly relational and generative, the "deliberative" cluster had sophisticated epistemic and moral reasoning, and the "brave" or "ordinary" cluster was less distinguished by personality.[16]

Survey instruments

Between 1910 and 1930, in the United States and Europe, several morality tests were developed to classify subjects as fit or unfit to make moral judgments.[14][17] Test-takers would classify or rank standardized lists of personality traits, hypothetical actions, or pictures of hypothetical scenes. As early as 1926, catalogs of personality tests included sections specifically for morality tests, though critics persuasively argued that they merely measured awareness of social expectations.[18]

Meanwhile, Kohlberg inspired a new wave of morality tests. The Defining Issues Test (dubbed "Neo-Kohlbergian" by its developers) scores relative preference for post-conventional justifications,[19] and the Moral Judgment Test scores consistency of one's preferred justifications.[20] Both treat evaluative ability as similar to IQ (hence the single score), allowing categorization by high score vs. low score.

The Moral Foundations Questionnaire is based on moral intuitions consistent across cultures: care/harm, fairness/cheating, loyalty/betrayal, authority/subversion, and sanctity/degradation (liberty/oppression may be added). The questions ask respondents to rate what they consider morally relevant post-consciously (i.e. this is not a behavioral measure). The purpose of the questionnaire is to measure the degree to which people rely upon different sets of moral intuitions (which may coexist), rather than to categorize decision-makers, but the first two foundations cluster together with liberal political orientation and the latter three cluster with conservative political orientation.[21][22]

The Moral DNA survey by Roger Steare asks respondents to rank their virtues, then divides respondents by three virtue clusters: obedience, care, and reason. The survey was developed for use in business settings, especially to raise awareness of ways perceived workplace discrimination diminishes effective evaluative diversity.[23]

In 1999, some of Kohlberg's measures were tested when Anne Colby and William Damon published a study examining extraordinary moral development in the lives of moral exemplars. To show specifically how people may develop exceptional moral commitments, the researchers focused on the stories of two women seen as moral exemplars who came from different backgrounds yet showed similar moral development over the course of their lives. After finding participants who exhibited high levels of moral commitment in their everyday behavior, the researchers used the Moral Judgement Interview (MJI) to compare the 23 exemplars (including the two women) with a more ordinary group of people, and also put the 23 exemplars through two standard dilemmas to assess their level on Kohlberg's stages. The intention was to learn more about moral exemplars and to examine the strengths and weaknesses of the Kohlberg measure. The MJI scores were not clustered at the high end of Kohlberg's scale; they ranged from stage 3 to stage 5. Half landed at the conventional level (stages 3, 3/4, and 4) and the other half at the postconventional level (stages 4/5 and 5). Compared to the general population, the scores of the moral exemplars may be somewhat higher than those of groups not selected for outstanding moral behavior. The researchers noted that "moral judgement scores are clearly related to subjects' educational attainment in this study", and that among participants with a college education or above, there was no difference in moral judgement scores between genders.
The study noted that although the exemplars' scores may have been higher than those of nonexemplars, one clearly need not score at Kohlberg's highest stages to exhibit a high degree of moral commitment and exemplary behavior.[10] Apart from their scores, the 23 participating moral exemplars described three similar themes in their moral development: certainty, positivity, and the unity of self and moral goals. The unity between self and moral goals was highlighted as the most important theme, as it is what truly sets the exemplars apart from 'ordinary' people: the moral exemplars see their morality as part of their sense of identity and sense of self, not as a conscious choice or chore. The moral exemplars also showed a much broader range of moral concern than the ordinary people did, going beyond everyday acts of moral engagement; for example, a moral exemplar would not only feed her own children but also fight to end world hunger on a global scale. To encourage this strong sense of moral development in children and adolescents, the study recommends fostering a sense of empowerment and modeling a positive, optimistic approach to life.

Theories

Moral identity

Empirical studies show that reasoning and emotion only moderately predict moral action. Scholars such as Blasi began proposing identity as a source of moral motivation.[24] Blasi proposed the self model of moral functioning, which describes the effects on moral action of the judgment of responsibility to perform a moral action, one's sense of moral identity, and the desire for self-consistency. Blasi also elaborates on the structure of identity and its connection to morality. According to him, two aspects form identity: one comprises the specific contents that make up the self (objective identity content), which include moral ideals; the second refers to the ways in which identity is subjectively experienced (subjective identity experience). As the subjective side of identity matures, the objective side tends to lean toward internal contents such as values, beliefs, and goals rather than external contents such as physical aspects, behaviors, and relationships. A mature subjective identity yearns for a greater sense of self-consistency; identity thereby serves as a motivation for moral action. Studies of moral exemplars have shown that exemplary moral action often results from the intertwining of personal goals and desires with moral goals, and studies of moral behavior also show a correlation between moral identity and action. S. Hardy and G. Carlo also raise critical questions about Blasi's model, proposing that researchers should better operationalize and measure moral identity and apply findings to moral education and intervention programs.[25]

Anne Colby and William Damon suggest that one's moral identity is formed through the synchronization of personal and moral goals; this unity of self and morality is what distinguishes moral exemplars from non-exemplars and in turn makes them exceptional.[26] Colby and Damon studied moral identity through the narratives of the civil rights activist Virginia Foster Durr and of Suzie Valadez, who provided services for the poor, two women whose behavior, actions, and life's work were considered morally exemplary by their communities and those with whom they came in contact. Common characteristics of these moral exemplars are certainty, positivity (e.g. enjoyment of work and optimism), and unity of self and moral goals.[27] The research suggests that a "transformation of goals" takes place during the evolution of one's moral identity and development; it is therefore not an exercise in self-sacrifice but one done with great joy, as moral exemplars see their personal goals and moral goals as synonymous. This transformation is not always deliberate and is most often gradual, but it can also be set off rapidly by a triggering event.[28] Triggering events can be anything from a powerful moment in a movie to a traumatic life event or, as in the case of Suzie Valadez, the perception of a vision from God. In many of the moral exemplars interviewed, the triggering event and goal transformation did not take place until their 40s. Moral exemplars are said to have the same concerns and commitments as other moral people but to a greater degree: "extensions in scope, intensity and breadth".[29] Furthermore, exemplars possess an ability to be open to new ideas and experiences, an "active receptiveness"[30] to things exterior to themselves.

A 1995 study examined how teenagers who behaved in a caring manner throughout their communities saw themselves. The findings suggested that adolescent caring exemplars formulated their self-concept differently from comparable peers: the exemplars made more references to positive, moral, caring personality traits and to moral and caring goals, and were also more likely to emphasize academic goals and typical, morally neutral activities. There were no significant differences between the exemplars and the control group concerning moral knowledge. In a semantic space analysis, the moral exemplars tended to view their actual self as more integrated with their ideal and expected self.[31]

David Wong proposes that we think of a culture by analogy to a conversation: people with different beliefs, values, and norms can voice their opinions loudly or quietly, and over the course of time these factors can change. A moral culture can provide its members with a kind of "language" in which there is plenty of room for different "dialects", allowing moral identities to be established and voiced more fully.

According to Blasi's theory of moral character, a person's moral character is identified by their set of virtues and vices. He theorized that willpower, moral desires, and integrity enable a person to act morally according to a hierarchical order of virtues: the "highest" and most complex virtues are expressed through willpower, while the "lowest" and simplest virtues are expressed through integrity. He essentially stated that to have the lower virtues, one must have one or more of the higher virtues. The end goals of moral identity development are to establish and act upon core goals, and to use one's strengths to make a difference.[32]

Moral self

A "moral self" is fostered by mutually-responsive parenting in childhood. Children with responsive parents develop more empathy, prosociality, a moral self, and conscience.[33] Darcia Narvaez describes the neurobiological and social elements of early experience and their effects on moral capacities.[34]

The moral self results when people integrate moral values into their self-concept.[35] Research on the moral self has mostly focused on adolescence as a critical period for the integration of self and morality[36] (i.e. self and morality are traditionally seen as separate constructs that become integrated in adolescence).[37] However, the moral self may be established as early as age 2–3 years.[38][39] In fact, children as young as 5 years old are able to consistently identify themselves as having certain moral behavioral preferences.[40] Children's moral self also becomes increasingly predictive of moral emotions with age.[40]

Moral values

Kristiansen and Hotte reviewed many research articles regarding people's values and attitudes and whether they guide behavior. Drawing on the research they reviewed and their own extension of Ajzen and Fishbein's theory of reasoned action, they conclude that the value–attitude–behavior relation depends on the individual and their moral reasoning. They also distinguish good values from bad values: good values guide our attitudes and behaviors, allow us to express and define ourselves, and involve the ability to know when a value is an appropriate response to the situation or person at hand. Bad values, on the other hand, are relied on so heavily that they make a person unresponsive to the needs and perspectives of others.

Another issue that Kristiansen and Hotte discovered through their research was that individuals tended to "create" values to justify their reactions to certain situations, which they called the "value justification hypothesis". The authors use an example from feminist Susan Faludi's journal entry of how, during the period when women were fighting for their right to vote, a New Right group appealed to society's ideals of "traditional family values" as an argument against the new law, in order to mask their own "anger at women's rising independence." Their theory is comparable to Jonathan Haidt's social intuitionist theory, in which individuals justify their intuitive emotions and actions through post-hoc reasoning.

Kristiansen and Hotte also found that independent selves acted on their own thoughts and feelings, whereas interdependent selves based their actions, behaviors, and self-concepts on the thoughts and feelings of others. Westerners' emotions vary along two dimensions, activation and pleasantness; the Japanese have a third, the range of their interdependent relationships. Markus and Kitayama found that these two different types of values carry different motives: Westerners, in their explanations, show self-bettering biases, whereas Easterners tend to show "other-oriented" biases.[41]

Psychologist S. H. Schwartz defines individual values as "conceptions of the desirable that guide the way social actors (e.g. organisational leaders, policymakers, individual persons) select actions, evaluate people and events, and explain their actions and evaluations."[42] Cultural values form the basis for social norms, laws, customs, and practices. While individual values vary case by case (a result of unique life experience), the average of these values points to widely held cultural beliefs (a result of shared cultural values).

Moral virtues

Piaget and Kohlberg both proposed stage theories to understand the timing and meaning of moral decisions. In 2004, D. Lapsley and D. Narvaez outlined how social cognition explains aspects of moral functioning.[43] Their social cognitive approach to personality identifies six critical resources of moral personality: cognition, self-processes, affective elements of personality, changing social context, lawful situational variability, and the integration of other literature. Lapsley and Narvaez suggest that moral values and actions stem from more than our virtues, being controlled largely by a set of self-created schemas (cognitive structures that organize related concepts and integrate past events). They claim that schemas are "fundamental to our very ability to notice dilemmas as we appraise the moral landscape" and that, over time, people develop greater "moral expertise".[44]

Moral reasoning

Jean Piaget, in watching children play games, noted how their rationales for cooperation changed with experience and maturation. He identified two stages: heteronomous (morality centered outside the self) and autonomous (internalized morality). Lawrence Kohlberg sought to expand Piaget's work, and his cognitive developmental theory of moral reasoning dominated the field for decades. He focused on moral development as one's progression in the capacity to reason about justice. Kohlberg's interview method used hypothetical moral dilemmas or conflicts of interest; the most widely known scenario from his research is usually referred to as the Heinz dilemma. He interviewed children and described what he saw in six stages, claiming that "anyone who interviewed children about dilemmas and who followed them longitudinally in time would come to our six stages and no others".[45]

In the Heinz dilemma, Heinz's wife is dying of cancer and the town's druggist has something that can help her but is charging more than Heinz can afford; Heinz steals the drug to save his wife's life. Children aged 10, 13, and 16 were asked whether what Heinz did was morally justified. Kohlberg's scheme consisted of six stages across three levels. At the preconventional level, the first two stages were the punishment-and-obedience orientation and the instrumental-relativist orientation. The next level, the conventional level, included the interpersonal concordance or "good boy – nice girl" orientation, along with the "law and order" orientation. Lastly, the postconventional level consisted of the social-contract, legalistic orientation and the universal-ethical-principle orientation. Children progress from stage one, where they begin to recognize higher authorities and that there are set rules and punishments for breaking those rules, to stage six, where good principles make a good society and they start to define which principles are most agreeable and fair.[46] According to Kohlberg, an individual is considered more cognitively mature depending on their stage of moral reasoning, which grows as they advance in education and world experience. One example Kohlberg gives is "cognitive-moral conflict", wherein an individual currently in one stage of moral reasoning has their beliefs challenged by a surrounding peer group; through this challenge, the individual engages in "reflective reorganization", which allows movement to a new stage.

Previous moral development scales, particularly Kohlberg's, assert that moral reasoning is dominated by one main perspective: justice. However, Carol Gilligan and Jane Attanucci argue that there is an additional perspective on moral reasoning, known as the care perspective. The justice perspective draws attention to inequality and oppression, striving for reciprocal rights and equal respect for all; the care perspective draws attention to detachment and abandonment, striving for attention and response to people who need it. Gilligan and Attanucci analyzed male and female responses to moral situations thought up by the participants, and found that a majority of participants represented both care and justice in their moral orientations. In addition, they found that men tend to use the justice perspective more often than women, and women use the care perspective more frequently than men.[47] However, reviews by others have found that Gilligan's theory was not supported by empirical studies.[48][49] In fact, in neo-Kohlbergian studies with the Defining Issues Test, females tend to get slightly higher scores than males.[50]

Moral willpower

Metcalfe and Mischel offered a new theory of willpower focused on delayed gratification.[51] They proposed a "hot/cool" framework to describe how one controls the way a stimulus is interpreted and willpower is exerted. The hot system is referred to as the "go" system, and the cool system as the "know" system. The hot system is characterized as highly emotional, reflexive, and impulsive; it leads to the "go" response (instant gratification) and therefore undermines efforts at self-control. The cool system is characterized as cognitive, emotionally neutral/flexible, slow, integrated, contemplative, and strategic. The hot system develops early in life, whereas the cool system develops later, as it relies on particular brain structures, notably the prefrontal cortex and hippocampus, and on cognitive capacities that mature later. With age, dominance shifts from the hot system to the cool system; the balance between them is determined by stress, developmental level, and a person's self-regulating dynamics.[51]

Baumeister, Miller, and Delaney explored the notion of willpower by first defining the self as made up of three parts: reflexive consciousness, the person's awareness of their environment and of themselves as an individual; interpersonal being, which seeks to mold the self into one that will be accepted by others; and executive function.[52] They stated, "[T]he self can free its actions from being determined by particular influences, especially those of which it is aware".[53] The three prevalent theories of willpower describe it as a limited supply of energy, as a cognitive process, and as a skill that is developed over time. Research has largely supported the view that willpower works like a "moral muscle" with a limited supply of strength that may be depleted, conserved, or replenished, and that a single act requiring much self-control can significantly deplete the "supply" of willpower.[52] While exertion reduces the ability to engage in further acts of willpower in the short term, such exertions actually improve a person's ability to exert willpower for extended periods in the long run. Muraven, Baumeister, and Tice conducted a study on self-regulation and its relationship to power and stamina, demonstrating that the moral muscle, when exercised, is strengthened in stamina but not necessarily in power; that is, the subjects became less susceptible to the depletion of self-regulatory faculties.[54] Their study also showed that more complex tasks, such as regulating one's mood, present substantive difficulty and may not be as effective in increasing willpower as more straightforward activities such as posture correction or maintaining a food journal.[54] Over time, then, the "moral muscle" may be exercised by performing small tasks of self-control, such as attempting to correct slouched posture, resist desserts, or complete challenging self-regulatory tasks.
Lastly, Baumeister argues that self-management, or the ability to alter one's responses, is a kind of skill that develops as one grows up.[52] Many things can help a person replenish this source of willpower, such as meditation, rest, and positive emotion between tasks.[54] The researchers also showed a conservation effect: people tend to realize that they are using up their stored willpower and self-control, and then conserve what remains for when it is needed.[54]

Additional studies have been conducted that cast doubt on the idea of ego-depletion. One experiment that sought to test ego-depletion theory was that of Hagger and colleagues, who challenged the resource-depletion explanation offered by Muraven, Tice, and Baumeister by highlighting possible small-study bias as well as publication bias. Earlier literature casting doubt on ego-depletion included a 2014 meta-analysis by Carter et al., which provided significant evidence against ego-depletion theory and found evidence of small-study bias. Hagger carried out a replication of the ego-depletion experiment with a larger, more diverse population and a stricter experimental protocol. Twenty-four labs across the world replicated the study following strict protocols (results from 23 labs were used in the final report). Participants were randomly assigned to a control or an experimental group: the experimental group completed the letter "e" depletion version of the task, and the control group completed the letter "e" no-depletion version. Both groups first participated in practice sessions and then continued to the actual test, after which they self-reported items measuring effort, fatigue, difficulty, and frustration on the first task. Participants then completed the Multi-Source Interference Task (MSIT), and results from this task were examined. Reaction time variability (RTV) was the primary outcome variable and reaction time (RT) the secondary variable; researchers observed whether RTV was higher for participants assigned to the depletion condition than for those assigned to the no-depletion condition. Any low-accuracy results on either task (less than 80%) were excluded from the results.
Twenty of the twenty-three lab replications yielded 95% confidence intervals for a small ego-depletion effect size on RTV and RT in the MSIT, and the results supported a null effect of ego-depletion. While this one study is not enough to completely debunk ego-depletion theory, it casts considerable doubt on the effect sizes reported in previous studies.[55]

Moral behavior

James Rest reviewed the literature on moral functioning and identified at least four components necessary for a moral behavior to take place:[56][57]

  • Sensitivity – noticing and interpreting the situation
  • Reasoning – making a judgment regarding the best (most moral) option
  • Motivation – in the moment but also habitually, such as moral identity
  • Implementation – having the skills and perseverance to carry out the action

Reynolds and Ceranic researched the effects of social consensus on one's moral behavior. Depending on the level of social consensus (high vs. low), moral behaviors will require greater or lesser degrees of moral identity to motivate an individual to make a choice and endorse a behavior. Also, depending on social consensus, particular behaviors may require different levels of moral reasoning.[58]

In looking at the relations between moral values, attitudes, and behaviors, previous research suggests that there is no dependable correlation between the three, contrary to what one might assume. In fact, it seems more common for people to label their behaviors with a justifying value after the fact than to hold a value beforehand and act on it. Some people are more likely to act on their personal values: those low in self-monitoring and high in self-consciousness, because they are more aware of themselves and less aware of how others may perceive them. Self-consciousness here means being literally more conscious of oneself, not fearing judgment or feeling anxiety from others. Social situations and the different categories of norms can indicate when people may act in accordance with their values, though this evidence is not conclusive either. People typically act in accordance with social, contextual, and personal norms, and these norms may in turn align with one's moral values. Though certain assumptions and situations would suggest a major value-attitude-behavior relation, there is not enough research to confirm the phenomenon.

Moral intuitions

In 2001, Jonathan Haidt introduced his social intuitionist model, which claims that, with few exceptions, moral judgments are made based upon socially derived intuitions. Moral intuitions happen immediately, automatically, and unconsciously.[59]

This model suggests that moral reasoning is largely a post-hoc rationalization that functions to justify one's instinctual reactions. Haidt provides four arguments for doubting the causal importance of reasoning. First, he argues that since there is a dual-process system in the brain for making automatic evaluations or assessments, the same process must apply to moral judgment as well. Second, evidence from Chaiken indicates that social motives bias humans to cohere with and relate to others' attitudes in order to achieve higher societal goals, which in turn influences moral judgment. Third, Haidt found that people engage in post-hoc reasoning when faced with a moral situation; this a posteriori (after-the-fact) explanation gives the illusion of objective moral judgment but in reality reflects one's gut feeling. Lastly, research has shown that moral emotion has a stronger link to moral action than moral reasoning does; Haidt cites Damasio's research on psychopaths and Batson's empathy-altruism hypothesis.[59]

In 2008, Joshua Greene published a compilation which, in contrast to Haidt's model, suggested that genuine moral reasoning does take place. A "deontologist" is someone whose rule-based morality is mainly focused on duties and rights; in contrast, a "consequentialist" is someone who believes that only the best overall consequences ultimately matter.[60] Generally speaking, individuals who respond to moral dilemmas in a consequentialist manner take longer to respond and show frontal-lobe activity (associated with cognitive processing). Individuals who respond to moral dilemmas in a deontological manner, however, generally answer more quickly and show brain activity in the amygdala (associated with emotional processing).

In regard to moral intuitions, researchers Jonathan Haidt and Jesse Graham performed a study on the difference between the moral foundations of political liberals and political conservatives.[61] They challenged individuals to question the legitimacy of their moral world and introduced five psychological foundations of morality:

  • Harm/care, which starts with the sensitivity to signs of suffering in offspring and develops into a general dislike of seeing suffering in others and the potential to feel compassion in response.
  • Fairness/reciprocity, which is developed when someone observes or engages in reciprocal interactions. This foundation is concerned with virtues related to fairness and justice.
  • Ingroup/loyalty, which constitutes recognizing, trusting, and cooperating with members of one's ingroup as well as being wary of members of other groups.
  • Authority/respect, which is how someone navigates hierarchical ingroups and communities.
  • Purity/sanctity, which stems from the emotion of disgust that guards the body by responding to elicitors that are biologically or culturally linked to disease transmission.

The five foundations theory is both a nativist and a cultural-psychological theory. Modern moral psychology concedes that "morality is about protecting individuals" and focuses primarily on issues of justice (harm/care and fairness/reciprocity).[62] Their research found that "justice and related virtues…make up half of the moral world for liberals, while justice-related concerns make up only one fifth of the moral world for conservatives".[62] Liberals value harm/care and fairness/reciprocity significantly more than the other moralities, while conservatives value all five equally.

Haidt and Graham suggest a compromise can be found to allow liberals and conservatives to see eye to eye. They suggest that the five foundations can be used as a "doorway" allowing liberals to step to the conservative side of the "wall" put up between the two political affiliations on major political issues (e.g., legalizing gay marriage). If liberals tried to consider the latter three foundations in addition to the former two (thereby briefly adopting all five foundations, as conservatives do), they could understand where conservative viewpoints stem from, and long-standing political issues might finally be settled.

Augusto Blasi emphasizes the importance of moral responsibility and reflection as one analyzes an intuition.[63] His main argument is that some, if not most, intuitions tend to be self-centered and self-seeking.[64] Blasi critiques Haidt's description of the average person, questioning whether this model (having an intuition, acting on it, and then justifying it) always holds; he concludes that not everyone follows it. In more detail, Blasi summarizes Haidt's five default positions on intuition:

  • Normally moral judgments are caused by intuitions, whether the intuitions are themselves caused by heuristics or the heuristics are intuitions; whether they are intrinsically based on emotions, or depend on grammar-type rules and are only externally related to emotions.
  • Intuitions occur rapidly and appear as unquestionably evident; either the intuitions themselves or their sources are unconscious.
  • Intuitions are responses to minimal information, are not a result of analyses or reasoning; neither do they require reasoning to appear solid and true.
  • Reasoning may occur, but infrequently; its use is in justifying the judgment after the fact, either to other people or to oneself. Reasons, in sum, do not have a moral function.
  • Because such are the empirical facts, the "rationalistic" theories and methods of Piaget and Kohlberg are rejected.

Blasi argues that Haidt does not provide adequate evidence to support his position.[65]

Moral emotions

Moral reasoning has been the focus of most studies of morality dating all the way back to Plato and Aristotle. The emotive side of morality was long looked upon with disdain, as subservient to higher, rational moral reasoning, with scholars like Piaget and Kohlberg touting moral reasoning as the forefront of morality.[18] However, in the last 30–40 years, a new front of research has risen: moral emotions as the basis for moral behavior. This development began with a focus on empathy and guilt, but has since moved on to encompass scholarship on emotions such as anger, shame, disgust, awe, and elevation. With the new research, theorists have begun to question whether moral emotions might hold a larger role in determining morality, one that might even surpass that of moral reasoning.[66]

There have generally been two approaches taken by philosophers to define moral emotion. The first "is to specify the formal conditions that make a moral statement (e.g., that it is prescriptive, that it is universalizable)".[67] This first approach is tied to language and the definitions we give to moral emotions. The second approach "is to specify the material conditions of a moral issue, for example, that moral rules and judgments 'must bear on the interest or welfare either of society as a whole or at least of persons other than the judge or agent'".[68] This definition is more action-based, focusing on the outcome of a moral emotion. The second definition is preferred because it is not tied to language and can therefore be applied to prelinguistic children and animals. Moral emotions are "emotions that are linked to the interests or welfare either of society as a whole or at least of persons other than the judge or agent."[69]

There is a debate over whether there is a set of basic emotions or whether there are "scripts or set of components that can be mixed and matched, allowing for a very large number of possible emotions".[66] Even those arguing for a basic set acknowledge that there are variants of each emotion. Psychologist Paul Ekman calls these variants "families":[70]

The principal moral emotions can be divided into two large and two small joint families. The large families are the "other-condemning" family, in which the three brothers are contempt, anger, and disgust (and their many children, such as indignation and loathing), and the "self-conscious" family (shame, embarrassment, and guilt)…[T]he two smaller families are the "other-suffering" family (compassion) and the "other-praising" family (gratitude and elevation).[66]

Different cultures, Haidt suggests, can also formulate different moral emotions that reflect the values of that culture. For example, Eastern cultures may be more inclined to consider serenity/calmness as a moral emotion than Western cultures.

As Haidt would suggest, the higher the emotionality of a moral agent, the more likely they are to act morally. He also uses the term "disinterested elicitor" to describe someone who is less concerned with the self and more concerned about the well-being of things external to him or herself. Haidt suggests that each person's prosocial action tendency is determined by his or her degree of emotionality. Using Ekman's idea of "emotion families", Haidt builds a scale of emotionality from low to high. If a person operates at a low level of emotionality with self-interested emotions, such as sadness or happiness, they are unlikely to act. If the moral agent possesses high emotionality and operates as a disinterested elicitor with emotions such as elevation, they are much more likely to act with moral altruism.

These moral emotions all have elicitors and action tendencies. The "other-condemning" family, which includes anger, contempt, and disgust, is united by caring about what other people do and by negative feelings toward the actions and characters of those who violate the moral code and order. The moral emotion of anger is elicited by unjustified actions and insults. The action tendency of this emotion is to attack, humiliate, or get back at the person responsible for acting unfairly.[66]

Empathy also plays a large role in altruism. The empathy-altruism hypothesis states that feelings of empathy for another lead to an altruistic motivation to help that person.[71] In contrast, there may also be an egoistic motivation to help someone in need. This is the Hullian tension-reduction model, in which personal distress caused by another's need leads the person to help in order to alleviate their own discomfort.[72]

Batson, Klein, Highberger, and Shaw conducted experiments in which they used empathy-induced altruism to lead people to make decisions that required showing partiality to one individual over another. In the first experiment, a participant from each of three groups (non-communication, communication/low-empathy, and communication/high-empathy) chose someone to experience a positive or negative task. The communication/high-empathy group showed more partiality than the other groups, having been successfully manipulated emotionally. The individuals who were successfully manipulated reported that, despite feeling compelled in the moment to show partiality, they still felt they had made the more "immoral" decision, since they followed an empathy-based emotion rather than adhering to a justice perspective of morality.[71]

In their two experiments on empathy-induced altruism, Batson, Klein, Highberger, and Shaw proposed that it can lead to actions that violate the justice principle. The second experiment operated similarly to the first, using low-empathy and high-empathy groups. Participants faced the decision of whether to move an ostensibly ill child to an "immediate help" group or leave her on a waiting list, after listening to an emotionally driven interview describing her condition and the life it left her to lead. Those in the high-empathy group were more likely than those in the low-empathy group to move the child up the list to receive treatment earlier. When asked what the more moral choice was, these participants agreed that the more moral choice would have been not to move the child ahead of the list at the expense of the other children. In this case, it is evident that when empathy-induced altruism is at odds with what is seen as moral, empathy-induced altruism often has the ability to win out over morality.[71]

Recently, the neuroscientist Jean Decety, drawing on empirical research in evolutionary theory, developmental psychology, social neuroscience, and psychopathy, has argued that empathy and morality are neither systematically opposed to one another nor inevitably complementary.[73][74]

Emmons (2009) defines gratitude as a natural emotional reaction and a universal tendency to respond positively to another's benevolence. Gratitude is motivating and leads to what Emmons describes as "upstream reciprocity": the passing on of benefits to third parties instead of returning benefits to one's benefactors (Emmons, 2009).

Moral conviction

Linda Skitka and colleagues have introduced the concept of moral conviction, which refers to a "strong and absolute belief that something is right or wrong, moral or immoral."[75] According to Skitka's integrated theory of moral conviction (ITMC), attitudes held with moral conviction, known as moral mandates, differ from strong but non-moral attitudes in a number of important ways. Namely, moral mandates derive their motivational force from their perceived universality, perceived objectivity, and strong ties to emotion.[76] Perceived universality refers to the notion that individuals experience moral mandates as transcending persons and cultures; additionally, they are regarded as matters of fact. Regarding association with emotion, ITMC is consistent with Jonathan Haidt's social intuitionist model in stating that moral judgments are accompanied by discrete moral emotions (i.e., disgust, shame, guilt). Importantly, Skitka maintains that moral mandates are not the same thing as moral values. Whether an issue will be associated with moral conviction varies across persons.

One of the main lines of ITMC research addresses the behavioral implications of moral mandates. Individuals prefer greater social and physical distance from attitudinally dissimilar others when moral conviction is high. This effect of moral conviction could not be explained by traditional measures of attitude strength, extremity, or centrality. Skitka, Bauman, and Sargis placed participants in either attitudinally heterogeneous or homogeneous groups to discuss procedures regarding two morally mandated issues, abortion and capital punishment. Those in attitudinally heterogeneous groups demonstrated the least goodwill towards other group members, the least cooperation, and the most tension and defensiveness. Furthermore, individuals discussing a morally mandated issue were less likely to reach a consensus than those discussing non-moral issues.[77]

Evolution

In Unto Others: the Evolution and Psychology of Unselfish Behavior (1998), Elliott Sober and David Sloan Wilson demonstrated that diverse moralities could evolve through group selection. In particular, they dismantled the idea that natural selection will favor a homogeneous population in which all creatures care only about their own personal welfare and/or behave only in ways which advance their own personal reproduction.[78] Tim Dean has advanced the more general claim that moral diversity would evolve through frequency-dependent selection because each moral approach is vulnerable to a different set of situations which threatened our ancestors.[79]

Integrated theories

More recent attempts to develop an integrated model of moral motivation[80] have identified at least six different levels of moral functioning, each of which has been shown to predict some type of moral or prosocial behavior: moral intuitions, moral emotions, moral virtues/vices (behavioral capacities), moral values, moral reasoning, and moral willpower. This social intuitionist model of moral motivation[81] suggests that moral behaviors are typically the product of multiple levels of moral functioning, and are usually energized by the "hotter" levels of intuition, emotion, and behavioral virtue/vice. The "cooler" levels of values, reasoning, and willpower, while still important, are proposed to be secondary to the more affect-intensive processes.

Psychologist Jonathan Haidt's moral foundations theory examines the way morality varies between cultures and identifies five fundamental moral values shared by different societies and individuals:[61] care for others, fairness, loyalty, authority and purity.[82]

Sociological applications

Some research shows that people tend to self-segregate based on moral or moral-political values.[83][84]

Triune ethics theory

The triune ethics theory (TET) has been proposed by Darcia Narvaez as a metatheory that highlights the relative contributions to moral development of biological inheritance (including human evolutionary adaptations), environmental influences on neurobiology, and the role of culture.[85] TET proposes three ethics that are the foundation or motivation for all ethics: security (or safety), engagement, and imagination. They differ not only in the recency of their evolutionary development but also in their relative capacity to override one another.[86] The theory looks back to people in the Pleistocene era and the environment of evolutionary adaptedness (EEA), relating early-life supports to later moral functioning. The long-term breastfeeding, constant holding or touching, frequent care by caregivers other than the mother, multiage playgroups, and quick responsiveness to cries during that era are the types of caregiving that support our biological systems. Current practices such as hospital births, solo sleeping, and physical isolation are not the types of early-life caregiving to which humans are adapted. A "dearth of touch", or faulty serotonin receptors, also affects society: there are higher rates of depression and anxiety, both of which affect general and moral functioning.[86] The effects of childrearing on moral functioning fall into two categories, dispositional and situational. Two hypotheses relate to the dispositional effects. First, "a personality may cohere around being more or less oriented to each of the three ethics".[86] Second, "during critical periods of brain and personality development 'attachment' and 'trust' aspects of personality development are deeply influenced, affecting the structure and wiring of brain systems".[86] Lastly, there are situational effects.
These relate to the idea that "moral personality has a dispositional signature within particular situations: person and situation interact with dispositional regularity".[86]

Security

The security ethic is based in the oldest part of the brain, involving the R-complex or the extrapyramidal system.[87] The security ethic is triggered by stressors that activate primal instincts and fight-or-flight responses.[85] These are concerned with safety, survival, and thriving in an environment (or biological system). With these systems present at birth, the security ethic is conditioned during sensitive periods of development (such as infancy), by life experience, and by trauma.[86] Studies have shown that a dearth of touch in the early years results in an underdevelopment of serotonin receptors.[88] Children with faulty serotonin receptors are susceptible to somatosensory affectional deprivation, a condition related to depression, violent behavior, and stimulus seeking.[89][90] As an adult, if serotonin receptors are not functioning properly, an individual is more prone to depression and anxiety.[91] If the receptors are damaged and one becomes fixated at this ethic, one can be seen as cold, closed-minded, and aggressive. This ethic is held to be most responsible for racism and hatred towards outside groups.

Engagement

The ethic of engagement is centered in the upper limbic system, or the visceral-emotional nervous system.[87] The limbic system allows for external and internal emotional signaling and is critical to emotion, identity, memory for ongoing experience, and an individual's sense of reality and truth. The ethic of engagement refers to relational attunement in the moment, focusing on social bonding, which the stress response prevents. It relies significantly on caregiver influence for its development in early childhood.[86] The engagement ethic is strongly associated with the hormone oxytocin, which has a strong presence during breastfeeding between a mother and child and is essential for building trust between them.

Imagination

The imagination ethic allows a person to step away from the impetuous emotional responses of the older parts of the brain and consider alternative actions based on logic and reason.[85] It is centered in the neocortex and related thalamic structures, including the frontal lobes used for reasoning and judgement skills.[87] It is focused on the outside world and allows for the integration and coordination of the other parts of the brain to allow for imaginative thinking and strategic problem solving. The ethic of imagination involves integrating internal information with external information, allowing an adult to acknowledge and possibly reject more emotional responses from the security or engagement ethics. The imagination ethic can build on the self-protective states of the security ethic (vicious or detached imagination) or of the prosocial engagement ethic (communal imagination).[86]

See also

Notes

  1. See, for example, Lapsley, Daniel K. (1996). Moral Psychology. Developmental psychology series. Boulder, Colorado: Westview Press. ISBN 978-0-8133-3032-7.
  2. Doris, John; Stich, Stephen (2008), Zalta, Edward N., ed., "Moral Psychology: Empirical Approaches", The Stanford Encyclopedia of Philosophy
  3. Wallace, R. Jay (November 29, 2007). "Moral Psychology". In Jackson, Frank; Smith, Michael. The Oxford Handbook of Contemporary Philosophy. OUP Oxford. pp. 86–113. ISBN 978-0-19-923476-9. Moral psychology is the study of morality in its psychological dimensions
  4. Doris & Stich 2008, §1.
  5. Teper, R.; Inzlicht, M.; Page-Gould, E. (2011). "Are we more moral than we think?: Exploring the role of affect in moral behavior and moral forecasting". Psychological Science. 22 (4): 553–558. doi:10.1177/0956797611402513. PMID 21415242.
  6. Kohlberg, L. (1969). "Stage and sequence: The cognitive development approach to socialization". In Goslin, David. Handbook of Socialization Theory and Research. Chicago: Rand McNally. pp. 347–480.
  7. Hardy, S. A.; Carlo, G. (2011). "Moral identity: What is it, how does it develop, and is it linked to moral action?" (PDF). Child Development Perspectives. 5 (3): 212–218. doi:10.1111/j.1750-8606.2011.00189.x.
  8. Doris & Stich (2008), §1.
  9. Kohlberg, Lawrence (1958). The development of modes of moral thinking and choice in the years 10 to 16 (PhD thesis). Chicago. OCLC 1165315.
  10. Colby, Anne; Kohlberg, Lawrence (1987). The Measurement of Moral Judgment. Standard Issue Scoring Manual. 2. Cambridge: Cambridge University Press. ISBN 978-0-521-32565-3.
  11. Doris, John M. (2002). Lack of Character: Personality and Moral Behavior. Cambridge University Press. ISBN 978-1-316-02549-9.
  12. Nichols, Shaun (2004). Sentimental Rules: On the Natural Foundations of Moral Judgment. Oxford University Press. ISBN 978-0-19-988347-9.
  13. Kohlberg, Lawrence (1971-01-31), "1. Stages of moral development as a basis for moral education", Moral Education, University of Toronto Press, ISBN 9781442656758, retrieved 2018-09-23
  14. Wendorf, Craig A (2001). "History of American morality research, 1894–1932" (PDF). History of Psychology. 4 (3): 272–288. doi:10.1037/1093-4510.4.3.272.
  15. Kohlberg, Lawrence (1973). "The Claim to Moral Adequacy of a Highest Stage of Moral Judgment". Journal of Philosophy. 70 (18): 630–646. doi:10.2307/2025030. JSTOR 2025030.
  16. Walker, Lawrence J.; Frimer, Jeremy A.; Dunlop, William L. (2010). "Varieties of moral personality: beyond the banality of heroism". Journal of Personality. 78 (3): 907–942. doi:10.1111/j.1467-6494.2010.00637.x. PMID 20573130.
  17. Verplaetse, Jan (2008). "Measuring the moral sense: morality tests in continental Europe between 1910 and 1930". Paedagogica Historica. 44 (3): 265–286. doi:10.1080/00309230701722721.
  18. Kohlberg, Lawrence (1981). The Philosophy of Moral Development. Essays on Moral Development. 1. San Francisco: Harper & Row. ISBN 978-0-06-064760-5. OCLC 7307342.
  19. Rest, James R. (1979). Development in Judging Moral Issues. Minneapolis: University of Minnesota Press. ISBN 978-0-8166-0891-1.
  20. Lind, Georg (1978). "Wie misst man moralisches Urteil? Probleme und alternative Möglichkeiten der Messung eines komplexen Konstrukts" [How do you measure moral judgment? Problems and alternative ways of measuring a complex construct]. In Portele, G. Sozialisation und Moral [Socialization and Morality] (in German). Weinheim: Beltz. pp. 171–201. ISBN 9783407511348. OCLC 715635639.
  21. Graham, Jesse; Haidt, Jonathan; Nosek, Brian A. (2009). "Liberals and conservatives rely on different sets of moral foundations" (PDF). Journal of Personality and Social Psychology. 96 (5): 1029–1046. doi:10.1037/a0015141. PMID 19379034.
  22. Graham, J.; Haidt, J.; Koleva, S.; Motyl, M.; Iyer, R.; Wojcik, S.; Ditto, P.H. (2013). "Moral Foundations Theory: The pragmatic validity of moral pluralism" (PDF). Advances in Experimental Social Psychology. 47: 55–130. doi:10.1016/b978-0-12-407236-7.00002-4.
  23. Steare, Roger; Roger Steare Consulting Limited (2006). Ethicability: (n) How to Decide What's Right and Find the Courage to Do It. London: Roger Steare Consulting Limited. ISBN 978-0-9552369-0-7. OCLC 70173013.
  24. Blasi, Augusto (1980). "Bridging moral cognition and moral action: A critical review of the literature". Psychological Bulletin. 88 (1): 1–45. doi:10.1037/0033-2909.88.1.1. ISSN 0033-2909.
  25. Hardy, S. A.; Carlo, G. (2005). "Identity as a source of moral motivation". Human Development. 48: 232–256. doi:10.1159/000086859.
  26. Colby, Anne; Damon, William (1999). "The Development of Extraordinary Moral Commitment". In Killen, Melanie; Hart, Daniel. Morality in Everyday Life: Developmental Perspectives. Cambridge University Press. pp. 362. ISBN 978-0-521-66586-5.
  27. Colby & Damon 1999, pp. 361–362.
  28. Colby & Damon 1999, p. 354.
  29. Colby & Damon 1999, p. 364.
  30. Colby & Damon 1999, p. 350.
  31. Hart, D.; Fegley, S. (1995). "Prosocial behavior and caring in adolescence: Relations to self-understanding and social judgment" (PDF). Child Development. 66 (5): 1346–1359. doi:10.2307/1131651. PMID 7555220.
  32. Blasi, Augusto (2005). "Moral character: A psychological approach". In Lapsley, Daniel; Power, F. Character Psychology and Character Education. Notre Dame, Indiana: University of Notre Dame Press. pp. 67–100. ISBN 978-0-268-03371-2.
  33. Kochanska, Grazyna (2002). "Mutually Responsive Orientation Between Mothers and Their Young Children: A Context for the Early Development of Conscience". Current Directions in Psychological Science. 11 (6): 191–195. doi:10.1111/1467-8721.00198. ISSN 0963-7214.
  34. Narvaez, Darcia (2014). Neurobiology and the Development of Human Morality: Evolution, Culture, and Wisdom (Norton Series on Interpersonal Neurobiology). W. W. Norton & Company. ISBN 978-0-393-70967-4.
  35. Krettenauer, T (2011). "The dual moral self: Moral centrality and internal moral motivation". The Journal of Genetic Psychology. 172: 309–328. doi:10.1080/00221325.2010.538451.
  36. Krettenauer (2013). "Revisiting the moral self construct: Developmental perspectives on moral selfhood". In Sokol, Bryan; Grouzet, Frederick; Müller, Ulrich. Self-Regulation and Autonomy. Cambridge University Press. pp. 115–140. ISBN 978-1-107-02369-7.
  37. See, for example, Damon, William; Hart, Daniel (1988). Self-Understanding in Childhood and Adolescence. Cambridge University Press. ISBN 978-0-521-30791-8.
  38. Emde, R.; Biringen, Z.; Clyman, R.; Oppenheim, D. (1991). "The moral self of infancy: Affective core and procedural knowledge" (PDF). Developmental Review. 11: 251–270. doi:10.1016/0273-2297(91)90013-e.
  39. Kochanska, G (2002). "Committed compliance, moral self, and internalization: A mediational model". Developmental Psychology. 38: 339–351. doi:10.1037/0012-1649.38.3.339.
  40. Krettenauer, T.; Campbell, S.; Hertz, S. (2013). "Moral emotions and the development of the moral self in childhood" (PDF). European Journal of Developmental Psychology. 10: 159–173. doi:10.1080/17405629.2012.762750.
  41. Kristiansen, Connie M; Hotte, Alan M (1996). Morality and the self: Implications for the when and how of value-attitude-behavior relations. The Psychology of Values: The Ontario Symposium on Personality and Social Psychology. 8. Erlbaum Hillsdale, NJ. pp. 77–105.
  42. Schwartz, S. H. (1999). "A Theory of Cultural Values and Some Implications for Work" (PDF). Applied Psychology: An International Review. 48 (1): 23–47. doi:10.1080/026999499377655.
  43. Lapsley, Daniel K.; Narvaez, Darcia (2004). "A social-cognitive approach to the moral personality". Moral Development, Self, and Identity. Psychology Press. pp. 189–212. ISBN 978-1-135-63233-5.
  44. Lapsley & Narvaez 2004, p. 197.
  45. Kohlberg, Lawrence (1984). The Psychology of Moral Development: The Nature and Validity of Moral Stages. Essays on Moral Development. 2. Harper & Row. p. 195. ISBN 978-0-06-064761-2.
  46. Crain, W.C. "Kohlberg's Stages of Moral Development". Theories of Development. Prentice-Hall. Archived from the original on October 4, 2011. Retrieved October 3, 2011.
  47. Gilligan, Carol; Attanucci, Jane (1994). "Two Moral Orientations: Gender Differences and Similarities". In Puka, Bill. Moral Development: Caring Voices and Women's Moral Frames. 34. Taylor & Francis. pp. 123–237. ISBN 978-0-8153-1553-7.
  48. Walker, Lawrence J.; Smetana, Judith (2005). "Gender and Morality". In Killen, Melanie. Handbook of Moral Development. Psychology Press. pp. 93–115. ISBN 978-1-135-61917-6.
  49. Jaffee & Hyde 2001.
  50. Rest, James R.; Narvaez, Darcia; Thoma, Stephen J.; Bebeau, Muriel J. (1999). Postconventional Moral Thinking: A Neo-Kohlbergian Approach. Psychology Press. ISBN 978-1-135-70561-9.
  51. Metcalfe, J.; Mischel, W. (1999). "A hot/cool-system analysis of delay of gratification: Dynamics of willpower" (PDF). Psychological Review. 106 (1): 3–19. doi:10.1037/0033-295x.106.1.3. PMID 10197361.
  52. Baumeister, Roy F. (2005). "Self and volition". In Miller, William; Delaney, Harold. Judeo-Christian Perspectives on Psychology: Human Nature, Motivation, and Change. Washington, DC: American Psychological Association. pp. 57–72. ISBN 978-1-59147-161-5.
  53. Baumeister 2005, p. 68.
  54. Muraven, Mark; Baumeister, Roy F.; Tice, Dianne M. (August 1, 1999). "Longitudinal Improvement of Self-Regulation Through Practice: Building Self-Control Strength Through Repeated Exercise" (PDF). The Journal of Social Psychology. 139 (4): 446–457. doi:10.1080/00224549909598404. ISSN 0022-4545. PMID 10457761.
  55. Hagger, M. S.; et al. (2016). "A Multilab Preregistered Replication of the Ego-Depletion Effect" (PDF). Perspectives on Psychological Science. https://www.dropbox.com/s/c17hie7wh4yaohd/Hagger%20et%20al%20%282016%29%20A%20Multilab%20Preregistered%20Replication%20of%20the%20Ego-Depletion%20Effect.pdf?dl=0
  56. Rest, James R (1983). "Morality". Handbook of Child Psychology. 3: 556–629.
  57. Narváez, Darcia; Rest, James (1995). "The four components of acting morally" (PDF). Moral Behavior and Moral Development: An Introduction: 385–400.
  58. Reynolds, Scott J.; Ceranic, Tara L. (2007). "The effects of moral judgment and moral identity on moral behavior: An empirical examination of the moral individual" (PDF). Journal of Applied Psychology. 92 (6): 1610–1624. doi:10.1037/0021-9010.92.6.1610. ISSN 1939-1854.
  59. Haidt, Jonathan (October 2001). "The Emotional Dog and Its Rational Tail" (PDF). Psychological Review. 108 (4): 814–834. doi:10.1037/0033-295X.108.4.814.
  60. Greene, Joshua (2008). "The secret joke of Kant's Soul". In Sinnott-Armstrong, Walter. Moral Psychology. 3. Cambridge, Massachusetts: MIT Press. pp. 35–80. ISBN 978-0-262-69355-4. OCLC 750463100.
  61. Haidt, Jonathan; Graham, Jesse (2007). "When Morality Opposes Justice: Conservatives Have Moral Intuitions That Liberals May Not Recognize" (PDF). Social Justice Research. 20 (1): 98–116. doi:10.1007/s11211-007-0034-z. Archived from the original (PDF) on September 16, 2008. Retrieved December 14, 2008.
  62. Haidt & Graham 2007, p. 99.
  63. Narvaez, Darcia; Lapsley, Daniel K. (2009). Personality, Identity, and Character: Explorations in Moral Psychology. Cambridge University Press. pp. 423. ISBN 978-0-521-89507-1.
  64. Narvaez & Lapsley 2009, p. 397.
  65. Narvaez & Lapsley 2009, p. 412.
  66. Haidt, Jonathan (2003). "The Moral Emotions" (PDF). In Davidson, Richard; Scherer, Klaus; Goldsmith, H. Handbook of Affective Sciences. Oxford University Press. p. 855. ISBN 978-0-19-512601-3.
  67. Hare, R. M. (1981). Moral Thinking: Its Levels, Method, and Point. Oxford University Press, UK. ISBN 978-0-19-824659-6.
  68. Gewirth, A. (1984). "Ethics". Encyclopaedia Britannica. 6. Chicago. pp. 976–998.
  69. Haidt 2003, p. 853.
  70. Ekman, Paul (May 1, 1992). "An argument for basic emotions" (PDF). Cognition and Emotion. 6 (3–4): 169–200. doi:10.1080/02699939208411068. ISSN 0269-9931.
  71. Batson, C. D.; Klein, T. R.; Highberger, L.; Shaw, L. L. (1995). "Immorality from empathy-induced altruism: When compassion and justice conflict". Journal of Personality and Social Psychology. 68 (6): 1042–1054. doi:10.1037/0022-3514.68.6.1042.
  72. Batson, C. Daniel; Fultz, Jim; Schoenrade, Patricia A. (March 1, 1987). "Distress and Empathy: Two Qualitatively Distinct Vicarious Emotions with Different Motivational Consequences". Journal of Personality. 55 (1): 19–39. doi:10.1111/j.1467-6494.1987.tb00426.x. ISSN 1467-6494.
  73. Decety, Jean (November 1, 2014). "The Neuroevolution of Empathy and Caring for Others: Why It Matters for Morality" (PDF). Research and Perspectives in Neurosciences. 21: 127–151. doi:10.1007/978-3-319-02904-7_8.
  74. Decety, J.; Cowell, J. M. (2014). "The complex relation between morality and empathy" (PDF). Trends in Cognitive Sciences. 18 (7): 337–339. doi:10.1016/j.tics.2014.04.008.
  75. Skitka, Linda (2002). "Do the means always justify the ends or do the ends sometimes justify the means? A value protection model of justice" (PDF). Personality and Social Psychology Bulletin. 28: 452–461. doi:10.1177/0146167202288003.
  76. Morgan, G. S.; Skitka, L. J. (2011). "Moral conviction". In Christie, Daniel J. Encyclopedia of Peace Psychology. Wiley-Blackwell. ISBN 978-1-4051-9644-4.
  77. Skitka, L. J.; Bauman, C.; Sargis, E. (2005). "Moral conviction: Another contributor to attitude strength or something more?" (PDF). Journal of Personality and Social Psychology. 88: 895–917. doi:10.1037/0022-3514.88.6.895.
  78. Sober, Elliott; Wilson, David Sloan (1998). Unto Others: The Evolution and Psychology of Unselfish Behavior. Cambridge: Harvard University Press. ISBN 9780674930469.
  79. Dean, Tim (2012). "Evolution and moral diversity". Baltic International Yearbook of Cognition, Logic and Communication. 7. doi:10.4148/biyclc.v7i0.1775.
  80. Leffel, G. M. (2008). "Who cares? Generativity and the moral emotions, part 2: A "social intuitionist model" of moral motivation". Journal of Psychology and Theology. 36: 182–201.
  81. Leffel's (2008) model draws heavily on Haidt's (2001) social intuitionist model of moral judgment.
  82. The moral roots of liberals and conservatives, a TED talk by Jonathan Haidt
  83. Haidt, Jonathan; Rosenberg, Evan; Hom, Holly (2003). "Differentiating Diversities: Moral Diversity Is Not Like Other Kinds". Journal of Applied Social Psychology. 33 (1): 1–36. doi:10.1111/j.1559-1816.2003.tb02071.x.
  84. Motyl, Matt; Iyer, Ravi; Oishi, Shigehiro; Trawalterl, Sophie; Nosek, Brian A. (2014). "How ideological migration geographically segregates groups". Journal of Experimental Social Psychology. 51: 1–14. doi:10.1016/j.jesp.2013.10.010.
  85. Narvaez, Darcia (March 1, 2008). "Triune ethics: The neurobiological roots of our multiple moralities". New Ideas in Psychology. 26 (1): 95–119. doi:10.1016/j.newideapsych.2007.07.008. ISSN 0732-118X.
  86. Narvaez, Darcia; Lapsley, Daniel K. (2009). "Triune Ethics Theory and Moral Personality". Personality, Identity, and Character: Explorations in Moral Psychology. Cambridge University Press. pp. 136–158. ISBN 978-0-521-89507-1.
  87. MacLean, P. D. (1990). The Triune Brain in Evolution: Role in Paleocerebral Functions. Springer Science & Business Media. ISBN 978-0-306-43168-5.
  88. Kalin, N. H. (1999). "Primate models to understand human aggression". The Journal of Clinical Psychiatry. 60 Suppl 15: 29–32. ISSN 0160-6689. PMID 10418812.
  89. Prescott, James W. (April 1, 1996). "The Origins of Human Love and Violence". Pre- and Peri-Natal Psychology Journal. 10 (3): 143. ISSN 1097-8003.
  91. Caspi, Avshalom; Sugden, Karen; Moffitt, Terrie E.; Taylor, Alan; Craig, Ian W.; Harrington, HonaLee; McClay, Joseph; Mill, Jonathan; Martin, Judy; Braithwaite, Antony; Poulton, Richie (July 18, 2003). "Influence of life stress on depression: moderation by a polymorphism in the 5-HTT gene" (PDF). Science. 301 (5631): 386–389. doi:10.1126/science.1083968. ISSN 1095-9203. PMID 12869766.

References

  • Baron, J.; Spranca, M. (1997). "Protected values". Organizational Behavior and Human Decision Processes. 70: 1–16. doi:10.1006/obhd.1997.2690.
  • Batson, Charles Daniel (1991). The Altruism Question: Toward a Social Psychological Answer. Hillsdale, New Jersey: Lawrence Erlbaum Associates. ISBN 978-0-8058-0245-0.
  • Brandt, Allan M. (2004). "Difference and diffusion: Cross-cultural perspectives on the rise of anti-tobacco policies". In Feldman, Eric; Bayer, Ronald. Unfiltered: Conflicts Over Tobacco Policy and Public Health. Cambridge, Massachusetts: Harvard University Press. pp. 255–380. ISBN 978-0-674-01334-6.
  • Helweg-Larsen, M.; Tobias, M. R.; Cerban, B. M. (2010). "Risk perception and moralization among smokers in the USA and Denmark: A qualitative approach". British Journal of Health Psychology. 15: 871–886. doi:10.1348/135910710x490415. PMC 3465077.
  • Katz (1997). "Secular Morality". In Brandt, Allan; Rozin, Paul. Morality and Health. New York: Routledge. pp. 295–330. ISBN 978-0-415-91582-3. OCLC 301566214.
  • Jones, A.; Fitness, J. (2008). "Moral hypervigilance: The influence of disgust sensitivity in the moral domain". Emotion. 8 (5): 613–627. doi:10.1037/a0013435. PMID 18837611.
  • McGraw, A.P.; Tetlock, P.E.; Kristel, O.V. (2003). "The limits of fungibility: Relational schemata and the value of things" (PDF). Journal of Consumer Research. 30: 219–229. doi:10.1086/376805.
  • Mikhail, John M (2011). Elements of Moral Cognition: Rawls' Linguistic Analogy and the Cognitive Science of Moral and Legal Judgment. New York: Cambridge University Press. ISBN 978-0-521-85578-5. OCLC 741178494.
  • "Moral psychology". Britannica Concise Encyclopedia. Chicago: Encyclopaedia Britannica. 2007. ISBN 978-1-59339-293-2.
  • Narvaez, Darcia (2005). "The neo-Kohlbergian tradition and beyond: schemas, expertise, and character" (PDF). Nebraska Symposium on Motivation. Nebraska Symposium on Motivation. 51: 119–163. ISSN 0146-7875. PMID 16335739.
  • Narvaez, Darcia (2005). "Integrative Ethical Education". In Killen, Melanie. Handbook of Moral Development. Psychology Press. pp. 703–733. ISBN 978-1-135-61917-6.
  • Narvaez, D (2010). "Moral complexity: The fatal attraction of truthiness and the importance of mature moral functioning" (PDF). Perspectives on Psychological Science. 5 (2): 163–181. doi:10.1177/1745691610362351.
  • Narvaez, D (2012). "Moral neuroeducation from early life through the lifespan" (PDF). Neuroethics. 5 (2): 145–157. doi:10.1007/s12152-011-9117-5.
  • Rozin, P (1999). "The process of moralization" (PDF). Psychological Science. 10 (3): 218–221. doi:10.1111/1467-9280.00139.
  • Rozin, P.; Lowery, L.; Imada, S.; Haidt, J. (1999). "The CAD triad hypothesis: A mapping between three moral emotions (contempt, anger, disgust) and three moral codes (community, autonomy, divinity)" (PDF). Journal of Personality and Social Psychology. 76 (4): 574–586. doi:10.1037/0022-3514.76.4.574.
  • Rozin, P.; Markwith, M.; Stoess, C. (1997). "Moralization and becoming a vegetarian: The transformation of preferences into values and the recruitment of disgust" (PDF). Psychological Science. 8 (2): 67–73. doi:10.1111/j.1467-9280.1997.tb00685.x.
  • Smith, Michael A. (2004). The Moral Problem. Oxford: Blackwell. ISBN 978-0-631-18941-1. OCLC 451103422.
  • Tetlock, P.; Kristel, O.; Elson, B.; Green, M.; Lerner, J. (2000). "The Psychology of the Unthinkable: Taboo Trade-Offs, Forbidden Base Rates, and Heretical Counterfactuals" (PDF). Journal of Personality and Social Psychology. 78: 853–870. doi:10.1037/0022-3514.78.5.853.
  • Thagard, Paul (2007). "The Moral Psychology of Conflicts of Interest: Insights from Affective Neuroscience" (PDF). Journal of Applied Philosophy. 24 (4): 367–380. doi:10.1111/j.1468-5930.2007.00382.x.
  • Wallace, R. Jay (2006). Normativity and the Will: Selected Essays on Moral Psychology and Practical Reason. Clarendon Press. ISBN 978-0-19-153699-1.
This article is issued from Wikipedia. The text is licensed under Creative Commons - Attribution - Sharealike. Additional terms may apply for the media files.