3.0 Unconscious Bias & Visioning

3.1 Facts


Facts Checklist

Please complete all of the readings and activities before continuing to part 3.2.

☐ Read and Reflect on Facts 1-6
☐ Reflection Note: The Trusted Team Activity
☐ Complete Project Implicit’s Unconscious Bias Test
☐ Read and watch the YouTube video on Heuristics
☐ Reflect on how Heuristics can lead to Biases
☐ Read and Reflect on Combatting Unconscious Biases

Facts

What Are Heuristics and Why Do They Matter?

[Image: a person standing at a fork where three arrows point in different directions]

As we move through the world, we process large amounts of information and make many choices with limited time. Hence, when information is missing or an immediate decision is necessary, heuristics act as “rules of thumb” that guide our behavior down the most efficient pathway.

Heuristics are the name given to your brain’s mental reflexes and rapid insights. The human mind can only handle so much information at once, so the brain develops these shortcuts to help you compensate for limitations on time, mental energy, and information. In summary, the mind uses heuristics to simplify decision-making.

Heuristics are formed through prior experience, and people often give these mental reflexes names such as common sense, intuition, or prejudice. But these shortcuts aren’t always optimal; in fact, heuristics are often resistant to change.

Source: https://www.psychologytoday.com/ca/basics/heuristics#understanding-heuristics

How Heuristics Can Lead Us to Mistaken Conclusions

How Heuristics Sometimes Lead to Cognitive Biases

How Cognitive Biases Shape Our World

Now that you’re familiar with how the mind is susceptible to various kinds of unconscious biases, the role heuristics play, and how heuristics can give rise to different kinds of cognitive biases, review the infographic content below for more information. Have any of these biases tricked you recently?

20 Cognitive Biases That Screw Up Your Decisions

1. Anchoring bias. People are over-reliant on the first piece of information they hear. In a salary negotiation, whoever makes the first offer establishes a range of reasonable possibilities in each person’s mind.
2. Availability heuristic. People overestimate the importance of information that is available to them. A person might argue that smoking is not unhealthy because they know someone who lived to 100 and smoked three packs a day.
3. Bandwagon effect. The probability of one person adopting a belief increases based on the number of people who hold that belief. This is a powerful form of groupthink and is a reason why meetings are often unproductive.
4. Blind-spot bias. Failing to recognize your own cognitive biases is a bias in itself. People notice cognitive and motivational biases much more in others than in themselves.
5. Choice-supportive bias. When you choose something, you tend to feel positive about it, even if that choice has flaws. Like how you think your dog is awesome, even if it bites people every once in a while.
6. Clustering illusion. This is the tendency to see patterns in random events. It is key to various gambling fallacies, like the idea that red is more or less likely to turn up on a roulette table after a string of reds.
7. Confirmation bias. We tend to listen only to information that confirms our preconceptions, which is one of the many reasons it’s so hard to have an intelligent conversation about climate change.
8. Conservatism bias. Where people favor prior evidence over new evidence or information that has emerged. People were slow to accept that the Earth was round because they maintained their earlier understanding that the planet was flat.
9. Information bias. The tendency to seek information when it does not affect action. More information is not always better. With less information, people can often make more accurate predictions.
10. Ostrich effect. The decision to ignore dangerous or negative information by “burying” one’s head in the sand, like an ostrich. Research suggests that investors check the value of their holdings significantly less often during bad markets.
11. Outcome bias. Judging a decision based on the outcome rather than on how the decision was made in the moment. Just because you won a lot in Vegas doesn’t mean gambling your money was a smart decision.
12. Overconfidence. Some of us are too confident about our abilities, and this causes us to take greater risks in our daily lives. Experts are more prone to this bias than laypeople, since they are more convinced that they are right.
13. Placebo effect. When simply believing that something will have a certain effect on you causes it to have that effect. In medicine, people given fake pills often experience the same physiological effects as people given the real thing.
14. Pro-innovation bias. When a proponent of an innovation tends to overvalue its usefulness and undervalue its limitations. Sound familiar, Silicon Valley?
15. Recency. The tendency to weigh the latest information more heavily than older data. Investors often think the market will always look the way it looks today and make unwise decisions.
16. Salience. Our tendency to focus on the most easily recognizable features of a person or concept. When you think about dying, you might worry about being mauled by a lion, as opposed to what is statistically more likely, like dying in a car accident.
17. Selective perception. Allowing our expectations to influence how we perceive the world. An experiment involving a football game between students from two universities showed that each team saw the opposing team commit more infractions.
18. Stereotyping. Expecting a group or person to have certain qualities without having real information about the person. It allows us to quickly identify strangers as friends or enemies, but people tend to overuse and abuse it.
19. Survivorship bias. An error that comes from focusing only on surviving examples, causing us to misjudge a situation. For instance, we might think that being an entrepreneur is easy because we haven’t heard of all those who failed.
20. Zero-risk bias. Sociologists have found that we love certainty, even if it’s counterproductive. Eliminating risk entirely means there is no chance of harm being caused.

Sources: Brain Biases; Ethics Unwrapped; Explorable; Harvard Magazine; HowStuffWorks; LearnVest; Outcome bias in decision evaluation, Journal of Personality and Social Psychology; Psychology Today; The Bias Blind Spot: Perceptions of Bias in Self Versus Others, Personality and Social Psychology Bulletin; The Cognitive Effects of Mass Communication, Theory and Research in Mass Communications; The less-is-more effect: Predictions and tests, Judgment and Decision Making; The New York Times; The Wall Street Journal; Wikipedia; You Are Not So Smart; ZhurnalyWiki

20 Cognitive biases that screw up your decisions.pdf

Outsmarting Yourself

By now, you should be aware that you, too, have blind spots, although what they are is not as important as acknowledging that they exist. The good news is that whatever they are, you can outsmart them.

Let us now take a closer look at the techniques we can use as individuals to combat our unconscious biases.

[Image: word cloud of terms associated with unconscious bias, including race, decisions, beliefs, behaviour, research, judgement, stereotypes, ethnicity, implicit, preferences, reaction, cognition, subconscious, and respect]

What’s coming up?

You’ve completed 3.1 Facts. You can now move on to 3.2 Aligning & Crafting Your Vision using the menu at the left or the navigation at the bottom of this page.

License


Intercultural Awareness and Competence Copyright © 2021 by Trecia McLennon is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
