But that's crazy! Cognitive bias in decision-making

What are cognitive biases?

Experiments in cognitive science and social psychology have revealed a wide variety of biases in areas such as statistical reasoning, social attribution, and memory. It’s argued that these biases are common to all human beings, and some have been demonstrated to hold across very different cultures.

Cognitive biases were first identified by Amos Tversky and Daniel Kahneman, who argued that they are artifacts of the problem-solving heuristics humans use. Recent work on cognition in other animal species reveals that some cognitive biases are not unique to humans, suggesting an evolutionary origin.

Whatever the mechanisms behind cognitive bias, we have good data to suggest that under some circumstances we all have a tendency to react in a way that seems surprising when viewed from a more detached perspective.

Example cognitive biases

  • Bandwagon effect - the tendency to do (or believe) things because many other people do (or believe) the same. Related to Groupthink.
  • Loss aversion - the tendency for people to strongly prefer avoiding losses over acquiring gains.
  • Selective perception - the tendency for expectations to affect perception.
  • Anchoring - the tendency to rely too heavily, or “anchor,” on one trait or piece of information when making decisions.

Why join this workshop?

  • Get some “closure” on frustrating or puzzling past experiences.
  • Gain a deeper understanding of human decision-making and its weaknesses.
  • Perhaps discover strategies for coping better with such situations in future.

Workshop format

The goal of the workshop is to formulate explanations for puzzling situations that are based on what we currently understand of cognitive biases. We hope that we’ll learn something about these situations, understand them better, and perhaps discover ways in which we can deal with similar situations in future.

For most of the time we will work in a small group to explore a particular question. For example: why would an organization resist changes that appear to make perfect sense? Each working group will produce a summary of their findings in poster form and present it back to the other groups.

We’ll be on hand throughout the workshop to help with explaining particular biases, but we hope that the list we’ve chosen will be fairly self-explanatory.

Process

  1. After the introduction and explanation of what cognitive biases are, participants will form small groups (ideally 4 people). We’ll explain what kind of scenarios we’re looking for – ones in which puzzling or seemingly irrational decisions were made – and then everyone will have a few minutes to think of the scenarios they’d most like to understand.

  2. Each person in the group will describe the scenario as briefly as they can. Hopefully it will be clear to the other members of their group why it was so puzzling, but there will be a little time to ask questions and explore the scenario. We’d like this to be brief.

  3. Having heard each scenario, we’d like the group to decide which one they want to explore together. This scenario will be used for the rest of the workshop (unless we have a longer workshop, in which case we may do two). We have a couple of “spare” scenarios for groups that can’t think of one, but they weren’t needed last time.

  4. Next we need to look at the “playing cards”. Each group will have cards divided into “suits” (we’re aiming for 4 suits) that roughly correspond to major groups of biases. The individual cards explain particular biases to which we know all people are prone. Each person in the group will pick one suit and spend a few minutes studying the cards. This divides the work of understanding the many possible biases among the group’s members. It’s ok to ask questions about the biases during this time – we’ll try to answer with examples or clarifications.

  5. Now we move into the main work of the workshop. The scenario is retold in more detail and with more exploratory questions. Each group member can “play” a cognitive bias card they hold whenever they think it’s possible a bias may have influenced what they are hearing about. Playing a card involves saying what the bias on the card is and why they think it might apply here. Playing a card might cause other people to play their cards. Each card can be played any number of times, and it’s important for the player of a card to make a note of what element of the scenario caused the card to be played. Any number of cards can be played at any time in the scenario.

  6. Each group will now have accumulated notes of possible biases affecting people in the scenario. The next stage is to try to make sense of the bigger picture. Each group will do that by creating a simple systems diagram showing possible effects at work and relationships between them. We will stop to explain systems diagrams very briefly before doing this and will be on hand to help groups as they try to build up a big picture. It isn’t necessary to include every bias card that was played, but it is important to try to create that big picture.

  7. The final stage of the workshop is for each group to create a poster of their conclusions (usually just the systems diagram) and talk the other groups through their scenario and what they made of it.

At the end we will have some time for reflection:

  • What did you learn about the scenario and is it useful?
  • What did you learn about cognitive biases and is it useful?
  • Did you learn anything else?

One possible extension is to brainstorm strategies for coping with or mitigating bias effects in future.

History

We’ve previously run this session at SPA 2007 and are incorporating feedback and new ideas arising from that session. The poster outputs and wiki page for that session are available.

You can find the current list of biases we’re using below, but we’ll be adopting the “playing card” idea and reworking the list a little.

List of biases

This is drawn from Wikipedia’s list of cognitive biases and reordered and grouped. I’ve removed a small number that are highly unlikely to be relevant to the scenarios we’ll be considering. I still need to add some notes explaining the biases in more detail, and I may add a short introduction to each section outlining the broad shape of the biases within (if it’s possible to generalize them).

Social and group effects

Social and group related biases are biases primarily involving relationships with other people. These biases may be helpful in understanding group interactions in organizations.

In the following descriptions, the terms ingroup and outgroup refer to people’s perceptions of the groups they identify with (ingroups) or don’t identify with (outgroups). This may or may not have a strong relationship with organizational roles.

  • Bandwagon effect - the tendency to do (or believe) things because many other people do (or believe) the same. Related to Groupthink.
  • Egocentric bias - occurs when people claim more responsibility for the results of a joint action than an outside observer would attribute to them.
  • False consensus effect - the tendency for people to overestimate the degree to which others agree with them.
  • Projection bias - the tendency to unconsciously assume that others share the same or similar thoughts, beliefs, values, or positions.
  • Ingroup bias - the preferential treatment people give to those they perceive to be members of their own groups.
  • Outgroup homogeneity bias - individuals see members of their own group as being relatively more varied than members of other groups.
  • Misinformation effect - states that misinformation affects people’s reports of their own memory.

Attitude to risk and probability

These biases affect how individual people make decisions in the presence of uncertainty and risk, or with probabilistic outcomes. They may have an influence on planning and decision-making activities.

  • Loss aversion - the tendency for people to strongly prefer avoiding losses over acquiring gains.
  • Pseudocertainty effect - the tendency to make risk-averse choices if the expected outcome is positive, but risk-seeking choices to avoid negative outcomes.
  • Hyperbolic discounting - the tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs, the closer to the present both payoffs are. For example, most people prefer $50 now over $100 in one year, but $100 in 6 years over $50 in 5 years (effectively the same decision viewed from 5 years away; see the sketch after this list).
  • Neglect of probability - the tendency to completely disregard probability when making a decision under uncertainty. Rottenstreich and Hsee (2001) found that the typical subject was willing to pay $7 to avoid a 1% chance of a painful electric shock, and only a little more ($10) to avoid a 99% chance of the same shock. (They suggest that probability is more likely to be neglected when the outcomes are emotionally arousing.)
  • Zero-risk bias - the preference for reducing a small risk to zero over a greater reduction in a larger risk.
  • Ambiguity effect - the avoidance of options for which missing information makes the probability seem “unknown”.
  • Neglect of prior base rates effect - the tendency to fail to incorporate prior known probabilities that are pertinent to the decision at hand.
  • Positive outcome bias (prediction) - the tendency in prediction to overestimate the probability of good things happening to oneself.
  • Conjunction fallacy - the tendency to assume that specific conditions are more probable than general ones. For example: Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations. Which is more likely? 1. Linda is a bank teller. 2. Linda is a bank teller and is active in the feminist movement. 85% of those asked chose option 2, yet mathematically the probability of two events occurring together (in “conjunction”) is always less than or equal to the probability of either one occurring alone (see the sketch after this list).
  • Gambler’s fallacy - the tendency to assume that individual random events are influenced by previous random events (“the coin has a memory”).
  • Recency effect - the tendency to weigh recent events more than earlier events (see also peak-end rule).
  • Primacy effect - the tendency to weigh initial events more than subsequent events.
  • Subadditivity effect - the tendency to judge the probability of the whole to be less than the probabilities of the parts. [possible interesting link to Planning Fallacy]
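Two of the items above lend themselves to a quick numerical check. Below is a minimal sketch in Python (our own illustration; the discount parameter k and the Linda-problem probabilities are assumptions, not figures from the studies cited). It shows the preference reversal that characterizes hyperbolic discounting, and why a conjunction can never be more probable than either of its parts.

    # Hyperbolic discounting: subjective value V = A / (1 + k*D) for a payoff
    # of amount A delayed by D years. k = 2.0 is an illustrative choice only.
    def hyperbolic_value(amount, delay_years, k=2.0):
        return amount / (1 + k * delay_years)

    # Choice 1: $50 now vs $100 in one year -> the immediate $50 wins.
    print(hyperbolic_value(50, 0), hyperbolic_value(100, 1))    # 50.0 vs ~33.3

    # Choice 2: the same payoffs pushed five years out -> now the $100 wins.
    print(hyperbolic_value(50, 5), hyperbolic_value(100, 6))    # ~4.5 vs ~7.7

    # Conjunction fallacy: P(A and B) = P(A) * P(B given A) <= P(A), because
    # P(B given A) <= 1. With made-up numbers for the Linda problem:
    p_teller = 0.05          # P(Linda is a bank teller) - illustrative only
    p_feminist_given = 0.9   # P(feminist activist, given bank teller) - ditto
    p_both = p_teller * p_feminist_given
    assert p_both <= p_teller   # 0.045 <= 0.05, however plausible option 2 feels

With this formula the $100-in-6-years option beats the $50-in-5-years option for every positive k, while for k > 1 the immediate $50 beats the $100-in-1-year option; that flip is the reversal described above, and it cannot happen under exponential discounting (V = A * d**D), where adding the same delay to both options rescales both values by the same factor.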

Seeking/recognizing/remembering information

The information we internalize can be strongly affected by our existing ideas. What stands out strongly to one person may not be noticed by another. There are several cognitive biases about attention: how we direct our noticing and evaluating activities.

  • Focusing effect - a prediction bias that occurs when people place too much importance on one aspect of an event, causing errors when predicting their satisfaction with a future outcome.
  • Selective perception - the tendency for expectations to affect perception.
  • Anchoring - the tendency to rely too heavily, or “anchor,” on one trait or piece of information when making decisions.
  • Attentional bias - the neglect of relevant data when making judgments of a correlation or association.
  • Availability error - the distortion of one’s perceptions of reality due to the tendency to remember one alternative outcome of a situation much more easily than another.

Evaluating information

How we evaluate the information we are aware of can also be strongly affected by our existing ideas and some seemingly built-in thinking “shortcuts” we apply.

  • Von Restorff effect - the tendency for an item that “stands out like a sore thumb” to be more likely to be remembered than other items.
  • Contrast effect - the enhancement or diminishment of a weight or other measurement when compared with a recently observed contrasting object.
  • Confirmation bias - the tendency to search for or interpret information in a way that confirms one’s preconceptions.
  • Disconfirmation bias - the tendency for people to extend critical scrutiny to information which contradicts their prior beliefs and accept uncritically information that is congruent with their prior beliefs.
  • Myside bias - the tendency for people to fail to look for or to ignore evidence against what they already favor. [related to confirmation/disconfirmation bias]
  • Polarization effect - increase in strength of belief on both sides of an issue after presentation of neutral or mixed evidence, resulting from biased assimilation of the evidence. [related to confirmation/disconfirmation bias]
  • Congruence bias - the tendency to test hypotheses exclusively through direct testing [“happy case”, not indirect or disconfirming tests]
  • Clustering illusion - the tendency to see patterns where actually none exist.
  • Illusory correlation - beliefs that inaccurately suppose a relationship between a certain type of action and an effect.

Taking action

Once information is available and has been evaluated sufficiently to allow action to be taken, other cognitive biases may have an effect on the actions we take, perhaps delaying or prolonging them.

  • Information bias - the tendency to seek information even when it cannot affect action.
  • Planning fallacy - the tendency to underestimate task-completion times.
  • Zeigarnik effect - the tendency for people to remember uncompleted or interrupted tasks better than completed ones.
  • Overconfidence effect - the tendency to overestimate one’s own abilities.

Memory, retrospection

Once action has been taken, the ways in which we evaluate the effectiveness of what we did may be biased, influencing our future decision-making.

  • Choice-supportive bias - the tendency to remember one’s choices as better than they actually were; chosen options are remembered as better than rejected options.
  • Rosy retrospection - the tendency to rate past events more positively than they were actually rated when they occurred.
  • Endowment effect - the tendency for people to value something more as soon as they own it.
  • Post-purchase rationalization - the tendency to persuade oneself through rational argument that a purchase was good value. It is a common phenomenon after people have invested a lot of time, money, or effort in something to convince themselves that it must have been worth it.
  • Mere exposure effect - the tendency for people to express undue liking for things merely because they are familiar with them.
  • Hindsight bias - sometimes called the “I-knew-it-all-along” effect, the inclination to see past events as being predictable.
  • Context effect - states that cognition and memory are dependent on context, such that out-of-context memories are more difficult to retrieve than in-context memories (e.g., recall time and accuracy for a work-related memory will be lower at home, and vice versa).
  • Self-generation effect - states that self-generated information is remembered best.
  • Self-relevance effect - states that memories considered self-relevant are better recalled than other, similar information
  • Misinformation effect - states that misinformation affects people’s reports of their own memory.
  • Mood congruent memory bias - states that information congruent with one’s current mood is remembered best.
  • Picture superiority effect - states that concepts are much more likely to be remembered experimentally if they are presented as pictures rather than as words.
  • Positivity effect - states that older adults favor positive over negative information in their memories.
  • Primacy and recency effects - first and last items on a list are more likely to be remembered. See also the serial position effect, which states that items at the beginning of a list are the easiest to recall, followed by items near the end; items in the middle are the least likely to be remembered.

Judgement and liking

How we judge others and expect them to judge us (in terms of liking, moral acceptability, etc.) may be influenced by a number of biases:

  • Omission bias - the tendency to judge harmful actions as worse, or less moral, than equally harmful omissions (inactions).
  • Outcome bias - the tendency to judge a decision by its eventual outcome rather than by the quality of the decision at the time it was made.
  • Fundamental attribution error - the tendency for people to over-emphasize personality-based explanations for behaviors observed in others while under-emphasizing the role and power of situational influences on the same behavior. (see also group attribution error, positivity effect, and negativity effect)
  • Halo effect - the tendency for a person’s positive or negative traits to “spill over” from one area of their personality to another in others’ perceptions of them. (see also physical attractiveness stereotype)
  • Illusion of asymmetric insight - people perceive their knowledge of their peers to surpass their peers’ knowledge of them.
  • Ingroup bias - the preferential treatment people give to those they perceive to be members of their own groups.
  • Just-world phenomenon - the tendency for people to believe the world is “just” and that people therefore “get what they deserve.”
  • Outgroup homogeneity bias - individuals see members of their own group as being relatively more varied than members of other groups.
  • Mere exposure effect - states that familiarity increases liking.

Miscellaneous

There are several biases that I found hard to group with others:

  • Status quo bias - the tendency for people to like things to stay relatively the same.
  • Bias blind spot - the tendency not to compensate for one’s own cognitive biases.
  • Illusion of control - the tendency for human beings to believe they can control or at least influence outcomes which they clearly cannot.
  • Impact bias - the tendency for people to overestimate the length or the intensity of the impact of future feeling states.