I seriously debated with myself about including this post. On the one hand, this is good information for people trying to make informed decisions. On the other hand, it can be very disconcerting for those same people, because it calls into question our ability to make good decisions in the first place.
Quick disclaimer: I am fully aware these issues affect every human. This means that when we start weighing specific pros and cons, or looking at studies confirming or denying the effects of acupuncture, drugs or medical interventions, we are always dealing with the cognitive biases of the researchers, funding sources, peer reviewers and publishers of those studies. We are also dealing with our own biases and potential dissonance as consumers of this information.
What is a cognitive bias? Good question. It’s a systematic pattern of deviation from norm or rationality in judgement (1). We all filter and apply lenses to the information we receive. This affects our interpretation of reality, and biases sometimes lead us to perceptual distortion, inaccurate judgement, or illogical interpretation (1). Essentially, in the aggregate, we are all irrational. Ever had someone get angry about something you wrote or said, listened to them explain why they were upset, and thought to yourself, “That’s not at all what I wrote/said”? Welcome to cognitive bias.
Why does it happen? Well, that’s complicated. Some biases are adaptive in certain circumstances, some allow faster decisions, and some are a by-product of limitations in our ability to handle large amounts of information (1). What we typically think of as our ‘mind’ is actually composed of two parts: the conscious and the subconscious.
The conscious mind is volitional: we use it to set goals, judge results and think abstractly. It deals with past and future, is relatively slow at processing and, despite what you may think or may have been told, terrible at multi-tasking (2). The conscious mind is mostly engaged when we’re learning something new or performing a new activity, and it works mostly off short-term memory - roughly the last 20 seconds (2).
The subconscious mind handles habitual things as well as bodily processes we don’t want or need to pay explicit attention to (breathing, heart rate, digestion). The subconscious mind is not abstract; it’s very literal and is eternally in the present (2). Its processing rate blows the conscious mind out of the water: around 11 million bits of information per second, versus about 40 bits per second for the conscious mind (3).
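To put those two numbers in perspective, here is a quick back-of-the-envelope calculation - just a sketch using the figures cited above, nothing more:

```python
# Back-of-the-envelope comparison of the processing rates cited in (3).
conscious_bps = 40             # conscious mind, bits per second
subconscious_bps = 11_000_000  # subconscious mind, bits per second

print(f"Subconscious/conscious ratio: {subconscious_bps / conscious_bps:,.0f}x")
# -> Subconscious/conscious ratio: 275,000x
```

In other words, by these figures the subconscious is handling on the order of a quarter of a million times more information per second than the conscious mind.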
Because the subconscious mind is doing so much processing, it uses a lot of shortcuts to determine what information is important and keep things running smoothly. Biases are largely a result of these processing shortcuts.
Wikipedia’s entry for cognitive bias lists about 176 biases. There is overlap, and not every bias ties directly to our ability to know something scientifically or interpret scientific results. I was going to massage all this down and tie things back to some higher-level categories, but it turns out someone has already done this work for me. I highly recommend everyone take a look at https://betterhumans.coach.me/cognitive-bias-cheat-sheet-55a472476b18 if you’d like to try understanding this aspect of why our minds do what they do.
I am going to throw out a short list of biases that I think have a direct link to knowledge, science and our ability to interpret study results, particularly in a health care/medical setting. The list is alphabetical; the order implies nothing about relative importance. Everything in the following list comes from Wikipedia (1):
Ambiguity effect - tendency to avoid options for which missing information makes the probability seem unknown.
Availability cascade - a self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse. In other words, whether the information is true or not, the more you hear something the more likely you are to believe it. More simply: any lie repeated often enough becomes the truth. As an aside, this is a very common marketing technique.
Bandwagon effect - tendency to do/believe things because many other people do/believe the same.
Belief bias - evaluation of the logical strength of an argument is biased by the believability of the conclusion. Basically, if a person doesn’t believe a thing can be true, then no matter how strong the argument or evidence, they will not be swayed.
Confirmation bias - tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.
Conservatism - tendency to insufficiently revise one’s beliefs when presented with new evidence.
Continued influence effect - tendency to believe previously learned misinformation even after it has been corrected.
Experimenter’s bias/expectation bias/observer expectancy - tendency of experimenters to believe, certify and publish data that agree with their expectations for the outcome of an experiment and to disbelieve, discard, or downgrade data that appear to conflict with those expectations.
Focusing effect - tendency to place too much importance on one aspect of an event.
Framing effect - drawing a different conclusion from the same information depending on how that information is presented.
Hyperbolic discounting - tendency for people to have a stronger preference for more immediate payoffs relative to later payoffs. Hyperbolic discounting leads to choices that are inconsistent over time: people make choices today that their future selves would prefer not to have made, despite using the same reasoning (see the short sketch after this list).
Illusory truth effect - tendency to believe that a statement is true if it is easier to process or if it has been stated multiple times, regardless of actual veracity.
Irrational escalation - people justify investment in a decision, based on cumulative prior investment, despite new evidence the decision was probably wrong (sunk cost fallacy or, as I like to refer to it: doubling down on stupid).
Law of the instrument - over-reliance on familiar tools or methods, ignoring or undervaluing alternative approaches (if all you have is a hammer, everything is a nail).
Mere exposure effect - tendency to express undue liking for things merely because of familiarity.
Not invented here - tendency to avoid contact with or use of products, research, standards or knowledge developed outside a defined group.
Reactive devaluation - devaluing proposals only because they purportedly originated with an adversary.
Semmelweis reflex - tendency to reject new evidence that contradicts a paradigm.
Zero sum - a situation is incorrectly perceived to be a zero-sum game (one person gains at another’s expense).
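As promised in the hyperbolic discounting entry above, here is a minimal sketch of how that bias produces choices that flip over time. The formula V = A / (1 + kD) is the standard hyperbolic discount form; the payoffs, delays and the rate k = 1.0 are illustrative assumptions, not empirical values:

```python
# Minimal sketch of hyperbolic discounting: V = A / (1 + k*D),
# where A is the payoff, D is the delay in days, and k is an
# illustrative discount rate (k = 1.0 is an assumption, not data).

def discounted_value(amount: float, delay_days: float, k: float = 1.0) -> float:
    """Subjective present value of `amount` received after `delay_days`."""
    return amount / (1 + k * delay_days)

# Choice 1: $100 now vs $110 tomorrow -> the immediate $100 feels bigger.
print(discounted_value(100, 0), discounted_value(110, 1))    # 100.0 vs 55.0

# Choice 2: the same pair pushed ten days out -> now the $110 feels bigger.
print(discounted_value(100, 10), discounted_value(110, 11))  # ~9.09 vs ~9.17
```

The same $10 difference and the same one-day gap yield opposite preferences depending only on how far away the payoffs sit - exactly the inconsistency described in the list entry.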
No discussion about cognitive bias is complete without discussing cognitive dissonance. This is the mental discomfort experienced by a person who is holding two contradictory beliefs, ideas or values at the same time (4). In the context we’re currently discussing, cognitive dissonance would occur when a person is confronted with new information that contradicts their existing beliefs, ideas or values (4).
Cognitive dissonance results in a kind of psychological pain that makes it difficult for people to take in new ideas and evidence, especially when they contradict something the person already believes to be true. Instead, they turn to a variety of coping mechanisms which allow them to leave their current beliefs in place while discounting the new information.
It’s pretty easy to see how cognitive bias and cognitive dissonance might affect our ability to interpret study results or make a rational health care decision. Given we’re all susceptible to these errors in judgement and decision making, how can we avoid bias or dissonance?
As they say, “The first step is admitting you have a problem”. In this case that means we need to recognize that we’re all prone to these errors in thinking; no one is immune - in fact, believing you’re unaffected by cognitive biases is itself a cognitive bias (the bias blind spot). After that it takes a degree of introspection and self-examination to make sure we’re not falling into a trap. I’m not going to lie: it can be tedious to examine our decisions and ensure we’re reducing the effects of bias and dissonance as much as possible. Sometimes it can be painful to realize our previous beliefs were incorrect and we need to drop them to move in a new direction.
When decisions must be made quickly, we really don’t have the luxury of second-guessing whether or not we’re being biased. On the other hand, when we’re reading a scientific paper, doing research or going over treatment options, we have the time we need to think everything through carefully and question our own assumptions.
We can and should question everything. Look at all sides of an issue. Do not rely on echo chambers (avoiding availability cascade and bandwagon effect). Seek out opposing information and evidence (avoiding confirmation bias). Evaluate all information as rationally as possible: facts don’t care about our feelings. Just because we ‘believe’ something should or should not be true doesn’t make it so.
1. Cognitive bias. Retrieved from https://en.wikipedia.org/wiki/Cognitive_bias
2. Embogama. (5 August 2016). Difference Between Conscious and Subconscious Mind. Retrieved from https://pediaa.com/difference-between-conscious-and-subconscious-mind/
3. DiSalvo, D. (22 June 2013). Your Brain Sees Even When You Don't. Forbes. Retrieved from https://www.forbes.com/sites/daviddisalvo/2013/06/22/your-brain-sees-even-when-you-dont/?sh=77582d42116a
4. Cognitive dissonance. Retrieved from https://en.wikipedia.org/wiki/Cognitive_dissonance