What is Cognitive Bias?

As Bill Nighy would say, “Cognitive bias is all around us”. Every time you make a decision that involves any element of human judgement, cognitive bias is introduced. While cognitive bias itself is not bad or evil, a failure to acknowledge it is a failure to acknowledge the limits of human intuition. Cognitive bias is what makes us human; it is what separates us from a computer making predetermined decisions based on programmed algorithms.

This three-part blog series will highlight what cognitive bias is, and how cognitive bias influences our clinical practice and research.

So what is cognitive bias? As with any complex human process, it is best illustrated with a famous experiment that Amos Tversky and Daniel Kahneman conducted in 1983 [1].

Read the following passage:

Linda is 31 years old, single, outspoken and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.

Which of the two alternatives is more probable?

  • Linda is a bank teller
  • Linda is a bank teller and is active in the feminist movement

When this question was put to 142 undergraduate students at The University of British Columbia, 85% of respondents indicated that the second option was more probable.

The correct answer is, of course, the first option. From a simple statistical standpoint, it is far more likely that Linda is just a bank teller. The description of Linda is such that (back in the early 80s) it lends itself to the stereotype of someone who may have been active in the feminist movement. Instead of adopting a purely statistical approach to the example, human instinct (bias) kicks in: we place emphasis on the human elements of Linda’s description and ignore the critical issue of probability.
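The statistical point here is the conjunction rule: the probability of two events occurring together can never exceed the probability of either event alone. A minimal sketch in Python, with made-up probabilities purely for illustration (these numbers are not from the study):

```python
# Conjunction rule: P(A and B) can never exceed P(A).
# Illustrative, invented probabilities -- not data from the experiment.
p_teller = 0.05                  # P(Linda is a bank teller)
p_feminist_given_teller = 0.95   # P(active feminist | bank teller)

# P(teller AND feminist) = P(feminist | teller) * P(teller)
p_both = p_feminist_given_teller * p_teller

print(p_both <= p_teller)  # True: the conjunction is never more probable
```

However plausible the second option feels, no choice of numbers can make the conjunction more probable than the single event it contains.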

Human judgement and instinct are so strong that we arrive at a conclusion before we have even considered all of the available information and potential options. This instinct is critical to survival: how else would we know which apple to eat off the tree? Yet the same instinct and judgement that have aided our survival for millions of years can also be our undoing when choices need to be made that require more objective decision making.

The book “Moneyball”, popularised by the movie of the same name in 2011, demonstrates the success the Oakland Athletics baseball team enjoyed when cognitive bias was reduced [2]. The book highlights the benefits of sabermetrics (the statistical analysis of baseball) over the traditional means of putting together a baseball team, which relied more heavily on the intuition, judgement and instinct of managers, coaches and scouts. Against all odds, the cash-strapped Oakland Athletics reached the playoffs in the 2002 and 2003 seasons. They controversially selected their squad based on objective statistical analysis of a player’s performance and ignored the human instinct and bias of whether or not he “looked” like a good baseball player. That is not to say that statistics can save us all the time; as will be explored in an upcoming post, even statistics are open to many forms of bias.

These two examples highlight different aspects of cognitive bias. It is incredibly difficult to pin down exactly what cognitive bias is. It is ingrained in all our decision making and comes in many different forms depending on the situation. The Texas sharpshooter is another example of a cognitive bias [3]. The term comes from the story of a self-proclaimed Texan sharpshooter: an individual fires a number of rounds at the side of a barn, a target is then drawn around the biggest clump of bullet holes, and you have yourself a bullseye. Whilst an extreme example, this is not as uncommon as we think. We see things that we want to see, and discover things we want to discover, often at the expense of other truths in our world.

The use of imaging in chronic pain is an example of the Texas sharpshooter effect. Many patients with chronic pain will demonstrate degenerative changes on imaging. We might draw the conclusion that the cause of the patient’s pain is these degenerative changes. However, we ignore the fact that many people without pain also have degenerative changes. We see the pattern that we want to see. The Texas sharpshooter is just one form of cognitive bias; there are many others.

Whilst we can never remove all forms of cognitive bias from our lives (nor would we want to), being aware of when and how cognitive bias influences our decision making plays an important role in making us better clinicians, researchers and objective decision makers.

This is the first in a three-part series of posts looking at what cognitive bias is, and how cognitive bias influences our clinical practice and research. In the next post, the effects of cognitive bias in clinical practice will be explored.

About Ian Skinner

Based at Neuroscience Research Australia, Ian is working on the Prevent low back pain trial. He originally completed a Bachelor of Management in Sport and Exercise Science at the University of Technology Sydney, which provided the perfect backdrop to pursue another of his interests: coaching rugby. The west was calling, though, and Ian headed to Notre Dame in Perth, Western Australia to train as a physiotherapist. Ian soon became infatuated with the biopsychosocial model, spawning an ongoing fascination with the complexity of pain. Frustrated with the treatment options for chronic back pain, Ian is investigating attentional bias and cortical changes in low back pain patients for his PhD. Ian is very excited for the day that chronic pain is easily treated, or even prevented before it becomes an issue, and he can enjoy his days drinking cocktails by a beach in the Caribbean.

References

  1. Tversky, A. and D. Kahneman, Extensional versus intuitive reasoning: The conjunction fallacy in probability judgment. Psychological Review, 1983. 90(4): p. 293–315.
  2. Lewis, M., Moneyball: The Art of Winning an Unfair Game. 1st ed. 2003, New York: W. W. Norton.
  3. Let’s think about cognitive bias. Nature, 2015. 526(7572): p. 163.


Comments

  1. David Eagleman’s book, The Brain: The Story of You, highlights that our basic sense of empathy changes depending on who we see as in our in-group or out-group. Researchers put participants in a scanner and obtained their baseline responses to a hand being touched by a cotton swab or stabbed by a syringe needle. Once baseline had been established, they assigned a one-word label to the hands reading Christian, Jewish, Atheist, Muslim, Hindu or Scientologist. When a hand was either stabbed by a needle or touched by a swab, on average, participants showed a larger empathetic response when they saw someone in their in-group being stabbed versus someone in their out-group (even with a one-word label). I look forward to your next post, ‘cognitive bias in clinical practice.’ Towards the middle ground between chaos and rigidity perhaps. Thanks.

  2. If we go out and confirm, the thousand things that we have learned
    Are we subject to collusion in the delusions that we see?
    Sit back and receive what your heart wants to believe
    Then your mission is to listen and perceive

    Put down the weight of isolation, ease into the conversation
    Join in and listen, join in and engage
    Put down the weight of isolation, ease into the conversation,
    Join in and listen, as life now turns the page

    When I look back at my life, as I faced my fears and strife
    I would listen to the music that I need
    Now faced with the end, looking back at where I’ve been
    Different tempo / different rhythm / different speed…

    At the mosque or at the steeple, it might seem we’re different people
    It’s one song they’re singing to me
    At a gathering at the temple, if only life could be so simple
    As one song they’re singing to me…
    So many voices, one song – Rumi
    When it comes to our understanding of pain, is the opposite of chaos, logic?
    Or is the opposite of chaos, rigidity?
    To love and be loved…embrace the chaos.
    There is a place between right and wrong, I’ll meet you there…R
    I apologize if I have offended or misquoted; I remain too curious to be wise…
    To the mystery – loved the post. Still have a lot to learn. Thanks.

  3. The issue of cognitive bias in Tversky and Kahneman (1983) is not that subjects/students think illogically but that they think differently than what experimenters anticipate. Therefore the cognitive bias should be in the experimenter AND not the subject, IMHO. In other words, the experimental assumption is that human beings are rational agents and that they interpret and understand the world in terms of marginal and conjoint probabilities. Based on this assumption, the experimenters would expect that subjects would interpret the two answers as (1) the probability that an individual is a bank teller (based on the population prevalence of bank tellers) and (2) the probability that an individual is a bank teller AND holds feminist viewpoints (given that she already adopts other progressive views).

    However, this assumption, by the experimenters, is incorrect IMHO. People don’t evaluate probabilities in terms of their conjoint probability unless explicitly asked to—which Tversky and Kahneman did not do (1). Instead, humans evaluate probabilities based on conditional statements. This is clearly what is set up in the passage. A conditional statement takes the form of GIVEN THAT X, THEN Y. In the context of the experiment: given that Linda holds progressive beliefs (supporting social justice, anti-discrimination, and anti-nuclear demonstrations), how likely is it that she holds additional progressive beliefs (i.e. endorsing feminism)? The issue of whether she is a bank teller is largely irrelevant, since no information is given to indicate that this probability could differ from the marginal probability that a woman is a bank teller in the population at large.**

    Informally, the difference between the conjoint probability (using AND) and the conditional probability (using GIVEN THAT) is that the conjoint probability is the probability of two events occurring together (the product of the two probabilities when the events are independent), while the conditional probability describes an event that occurs GIVEN that some other event has already occurred (this event might be acquiring knowledge about Linda). Another way to think about it is that the conditional probability focuses only on events that occur AFTER some other thing is TRUE (in other words, it updates the probability).

    Therefore, what Tversky and Kahneman actually demonstrated was that subjects were interpreting the question using GIVEN THAT instead of AND—focusing on the knowledge gained about progressive beliefs and disregarding the bank teller statement, since it was common to both answers.

    Tversky and Kahneman describe this as the conjunction fallacy. In other words, the probability of two events occurring together (in “conjunction”) is always less than or equal to the probability of either one occurring alone. For example, even choosing a very low probability of Linda being a bank teller, say Pr(Linda is a bank teller) = 0.05, and a high probability that she would be a feminist, say Pr(Linda is a feminist) = 0.95, then, assuming independence, Pr(Linda is a bank teller and Linda is a feminist) = 0.05 × 0.95 = 0.0475, lower than Pr(Linda is a bank teller) [source: https://en.wikipedia.org/wiki/Conjunction_fallacy]

    Yet a mathematical relation can be established between the conditional probability and the conjoint probability of two events. That relationship is expressed formally as p(Y GIVEN THAT X) × p(X) = p(X AND Y). However, what the study fails to note is that subjects are not evaluating the multiplication when interpreting the question of what is more probable; they are only evaluating the conditional statement, i.e. p(Y|X) rather than p(Y|X) × p(X).

    Why might we use conditional probabilities instead of conjoint ones? Quite simply, the updating of conditional probabilities is more attuned to real-world reasoning and conditioned learning (2). Consider the question of the probability that it is raining out. A reasonable way to form a prior belief would be to divide the number of historical rainy days (in your location of choice) by the total days in a year. Now you obtain more information, e.g. the grass is wet. Given that the grass is wet, you conditionally update the probability that it is raining.

    There is nothing inherently wrong with interpreting a vague statement in terms of a conditional probability rather than a conjoint probability, especially since most people don’t have access to the type of knowledge that a conjoint probability would entail, i.e. knowing the probability that a person is a bank teller AND that a person adopts progressive views. However, they might have access to the probability that, GIVEN THAT a woman endorses anti-nuclear, pro-social-justice and anti-discrimination views, THEN she endorses feminism (especially if they are in college). Therefore the latter, IMHO, is likely a better description of the decision making—especially in the context of the experiment. Cognitive bias, IMHO, is imprecise in its description of decision making. It also illegitimately labels any deviation from what experimenters expect as illogical or otherwise wrong.

    What might be a more important lesson here is that we can use the power of statistics, i.e. epidemiological data from populations of people that we have limited exposure to, in conjunction with other measures of classification of dysfunction (diagnosis, disability, etc.), to form more accurate prior beliefs about the people we work with, and then use those beliefs to form better tailored treatments (obviously based on research that fits those classifications). For example, seeing a patient with an accurate diagnosis of CRPS (a diagnosis of low prevalence) (3) and having accurate knowledge about what this entails might lead to better treatment than not having this knowledge and applying other types of models of intervention (i.e. “you just need to get stronger”).

    ** As an aside, there may be a population association between bank tellers and their willingness to adopt progressive views, and knowing this would further update one’s probability. However, this would likely entail, one, that a person have adequate exposure to bank tellers (which is doubtful for college students, even in 1983) and, two, that they also had exposure to those bank tellers’ beliefs about progressive views (also extremely doubtful). Therefore, it is unreasonable to expect that subjects could form a reasonable/accurate estimation of this conjoint probability.

    1. Gigerenzer, G. How to Make Cognitive Illusions Disappear: Beyond ‘Heuristics and Biases’. European Review of Social Psychology 2, 83–115 (1991).
    2. Mitchell, C. J., De Houwer, J. & Lovibond, P. F. The propositional nature of human associative learning. The Behavioral and brain sciences 32, 183–98; discussion 198–246 (2009).
    3. Sandroni, P., Benrud-Larson, L. M., McClelland, R. L. & Low, P. A. Complex regional pain syndrome type I: incidence and prevalence in Olmsted county, a population-based study. Pain 103, 199–207 (2003).
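[Editor's note: the rain / wet-grass updating described in the comment above can be sketched numerically. This is a minimal illustration of the identity p(X AND Y) = p(Y GIVEN THAT X) × p(X) and of conditional updating via Bayes' rule; all numbers are invented for illustration, not data from any study.]

```python
# Sketch of conjoint vs conditional probability with the
# rain / wet-grass example. All numbers are invented.

p_rain = 100 / 365            # prior: historical rainy days / days in a year
p_wet_given_rain = 0.90       # grass is almost always wet when it rains
p_wet_given_dry = 0.10        # sprinklers, dew, etc.

# Conjoint probability via the identity p(X AND Y) = p(Y | X) * p(X)
p_rain_and_wet = p_wet_given_rain * p_rain

# Total probability that the grass is wet (law of total probability)
p_wet = p_wet_given_rain * p_rain + p_wet_given_dry * (1 - p_rain)

# Bayes' rule: belief that it is raining GIVEN the grass is wet
p_rain_given_wet = p_rain_and_wet / p_wet

print(round(p_rain, 3))            # 0.274 (prior)
print(round(p_rain_given_wet, 3))  # 0.773 (posterior, higher than the prior)
```

Observing the wet grass raises the probability of rain from the prior, which is exactly the conditional updating the commenter describes.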

  4. Linda Tunks says

    Ian, when you put it like that it’s so logical!

  5. Thanks Ian.

    “We see things that we want to see”… or what we are told to see by others.

    The effect in the video below – whilst absolutely real – would not be easy for non-experts to reproduce. Derren would have selected particularly suggestible subjects and spent a lot of time in preparation.