The Scout Mindset: Why Some People See Things Clearly and Others Don’t – Julia Galef
Thoughts: In The Scout Mindset, Julia Galef first outlines how humans often reason in ways that don’t lead to an accurate conception of how the world works, and why, although these patterns of reasoning may have been adaptive in our evolutionary past, having an accurate model of reality is becoming more and more valuable in modern life. She then suggests a range of practical strategies to move past these habits of motivated reasoning. The book is clear, succinct, engaging, and packed with valuable advice; I can hardly recommend it highly enough.
(The notes below are not a summary of the book, but rather raw notes - whatever I thought, at the time, might be worth remembering.)
Galef, Julia. 2021. The Scout Mindset: Why Some People See Things Clearly and Others Don’t. Portfolio.
Part I: The Case for Scout Mindset
Chapter 1: Two Types of Thinking
- 11: accuracy motivated reasoning vs. directionally motivated reasoning:
- directionally: asks “can I believe this?” or “must I believe this?”, depending on one’s preconceived views on the topic
- accuracy: asks “is it true?”
Chapter 2: What the Soldier is Protecting
- 16: (G. K.) Chesterton’s Fence: imagine you discover a fence built across a road. Whereas someone might say, “what a useless fence, let’s tear it down”, Chesterton argued that if you don’t have a good understanding of why the fence was built in the first place, you can’t be confident that removing the fence won’t cause more problems than it will solve.
- 22: Lyndon B. Johnson would practice “working up”, where if he needed to convince someone of something, he would argue with as many people as he could about the topic, until he himself became certain of it. George Reedy: “He had a fantastic capacity to persuade himself that the ‘truth’ which was convenient for the present was the truth and anything that conflicted with it was the prevarication of enemies.”
- cf. the backfire effect
- 23: impression management (as it is referred to by psychologists) and signalling (as it is referred to by evolutionary psychologists) are similar phenomena: when we consider a claim, we subconsciously ask ourselves, “What kind of person would believe a claim like this, and is that how I want others to see me?”
Chapter 3: Why Truth is More Valuable Than We Realize
- 31: The idea of rational irrationality, proposed by Bryan Caplan, highlights that there are two kinds of rationality, epistemic and instrumental rationality, and that the two are not always aligned
- epistemic rationality: “holding beliefs that are well justified”
- instrumental rationality: “acting effectively to achieve your goals”
- 40: Galef argues that while a tendency/instinct toward soldier mindset (i.e. de-prioritizing epistemic rationality in order to achieve other goals) served us well in our evolutionary past, today’s world rewards being able to see clearly, i.e. epistemic rationality is coming into closer alignment with instrumental rationality
Part II: Developing Self-Awareness
Chapter 4: Signs of a Scout
- 46: According to a study by Dan Kahan, lack of knowledge and lack of reasoning ability do not cause one to be more likely to engage in motivated reasoning. In this study, they asked people a series of politically polarized questions. People who had little scientific knowledge were not particularly polarized in their views on specific issues, but as you looked at groups of people with increasing levels of scientific knowledge, the opinions of liberals and conservatives diverged more and more
- 51: Galef suggests that a willingness to admit that one was wrong is a good indicator of a person who values truth above their ego
- 52: a friend of Galef has the employees at the business he runs fill out a survey on how he can improve. He optimizes the survey for candid feedback by making it anonymous, and phrasing the questions in multiple ways so as to effectively coax criticism from the employees.
- 55: Blind data analysis: e.g. Saul Perlmutter leading the Supernova Cosmology Project. They gathered the data and put it through a computer program that systematically shifted it by a random amount. They then did the analysis on the modified data, made sure they were happy with their process, and only then ran the analysis on the real data, thus avoiding the possibility of massaging/tweaking their analysis in order to find the result they expected/hoped for. (A rough code sketch of this blinding workflow follows this chapter’s notes.)
- 56-57: Galef suggests finding people who are good spokespersons for views you disagree with (i.e. people that aren’t just ill-informed/unreasonable/etc.)
- 57-58: generally, being in soldier mindset feels, from the inside, just like being in scout mindset: “motivated reasoning is our natural state”. Thus, Galef suggests, if you’re able to point to times you have been in soldier mindset, that can be a good indicator that you’re currently in scout mindset.
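The blinding workflow from page 55 is simple enough to sketch in code. The following is only an illustrative toy under assumed details, not the Supernova Cosmology Project’s actual pipeline: the fake data, the size of the hidden offset, and the placeholder `analysis` function are all made up.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def blind(data: np.ndarray) -> tuple[np.ndarray, float]:
    """Shift the data by a secret random offset; the offset stays 'sealed' until the end."""
    offset = rng.normal(loc=0.0, scale=5.0)
    return data + offset, offset

def analysis(data: np.ndarray) -> float:
    """Placeholder for the real analysis pipeline (here: just estimating the mean)."""
    return float(np.mean(data))

raw_data = rng.normal(loc=10.0, scale=2.0, size=1_000)  # made-up measurements
blinded_data, sealed_offset = blind(raw_data)

# 1. Develop, debug, and sanity-check the pipeline against the *blinded* data only.
blinded_result = analysis(blinded_data)

# 2. Once the pipeline is frozen, "open the envelope": remove the offset and run
#    the same, unchanged analysis on the real data.
final_result = analysis(blinded_data - sealed_offset)
print(f"blinded estimate: {blinded_result:.2f}, unblinded estimate: {final_result:.2f}")
```

The point is purely procedural: because the analysts commit to their method before they can see the real numbers, they can’t (even unconsciously) keep tweaking the analysis until it produces the answer they were hoping for.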
Chapter 5: Noticing Bias
- 59-60: forcing is an important tool in a magician’s toolkit: ‘we’re going to remove one of these two cards; please pick one’ - if you point to the one on the left, they say ‘alright, you keep this one’, whereas if you point to the one on the right, they say ‘alright, we’ll get rid of this one’, and both ways, you’re left holding the card on the left, believing you’re the one who chose it.
- the forcing effect is obvious if you’re able to view both situations at once, i.e. imagining the counterfactual situation.
- the forcing effect is most powerful in situations you’ve never encountered before; conversely, you’re more likely to pick up on the card trick if you’ve seen it done several times
- the brain does the same thing when evaluating other situations. If you don’t have a principle to rely on (cf. if you’re not in the habit of saying ‘let’s get rid of that card’ for the magic trick), you’re likely to default to an interpretation that supports your views, without noticing that you’ve engaged in motivated reasoning.
- Again, this can be overcome to some degree by doing counterfactual reasoning: imagine the situation was different (e.g. imagine the politician that had an affair was from another political party) - would your interpretation change?
- 62: Galef notes that you actually have to put in the effort to make counterfactual reasoning work: spend some time properly imagining a counterfactual world and then put yourself in it, rather than just verbally posing the question.
- 63: the double-standard test: “Am I judging other people’s behavior by a standard I wouldn’t apply to myself?”
- 64: the outsider test: “Imagine someone else stepped into your shoes—what do you expect they would do in your situation?” Especially useful when making a difficult decision, confronting a sunk-cost fallacy sort of situation, etc.
- 67: the conformity test: if you find yourself agreeing with someone, imagine that they said they had changed their mind, and now held the opposing view. Would I still hold the view I’m agreeing with, and would I be willing to defend that view against them?
- 68: the selective skeptic test (especially useful when you’re evaluating a study / other source) - “imagine this evidence supported the other side. How credible would you find it?”
- 69: the status quo bias - people tend to defend the situation that is currently the status quo (possibly related to the human tendency toward loss aversion)
Chapter 6: How Sure Are You?
- 82: Evolutionary psychologist Robert Kurzban suggests there are two modes of thought, analogous to the “board of directors” and the “press secretary” for a corporation. Whereas the board of directors makes decisions about how to allocate resources etc., the press secretary tries to explain the reasoning behind any actions the corporation takes by pointing toward the company’s stated values/mission/etc.
- 83: “The press secretary makes claims; the board makes bets.”
- 84: one way to move toward a scout mindset (trying to arrive at a clear picture of the truth rather than signalling one’s values to others), then, is to make bets about your beliefs. Galef suggests that in order to effectively bet on your beliefs, it is most effective to imagine a concrete situation, e.g. instead of saying “my company’s computer servers are highly secure”, ask yourself whether you would be willing to hire a hacker to hack the servers - would you be confident betting they wouldn’t be able to succeed?
- 87: the “core skill in this chapter…: being able to tell the difference between the feeling of making a claim and the feeling of actually trying to guess what’s true.”
Part III: Thriving Without Illusions
Chapter 7: Coping with Reality
- 96: Galef points out that some coping strategies, such as denial, false fatalism (“it’s hopeless”), etc. involve self-deception, while others (she offers “count your blessings”, “notice how far you’ve come” and “remember you can’t do more than your best”) don’t. If you’re struggling to cope with a situation, it’s easy to reach for a self-deceptive thought to justify or comfort yourself, but there’s almost always a comforting thought in the “doesn’t require self-deception” category if you look for a little while.
- 97-98: Galef notes that “it’s striking how much the urge to conclude ‘that’s not true’ diminishes once you feel like you have a concrete plan for what you would do if the thing were true.”
Chapter 8: Motivation without Self-Deception
- 119: when taking a worthwhile but risky bet: Nate Soares: “You want to get into a mental state where if the bad outcome comes to pass, you will only nod your head and say ‘I knew this card was in the deck, and I knew the odds, and I would make the same bets again, given the same opportunities’”
Chapter 9: Influence without Overconfidence
- 122-123: we use “confidence” to refer to two different things, but it’s worth distinguishing between epistemic confidence and social confidence.
- epistemic: how certain you are about what’s true
- social: how self-assured you are
- these two operate independently, even though we often think of them as somewhat equivalent
- 125-126: in a study where university students were asked to rate each other in terms of their competence, ratings depended heavily on how socially confident a student was, while the epistemic confidence a student expressed mattered much less
- 128: one way to express epistemic uncertainty while remaining socially confident: emphasize that your uncertainty is due to the current state of human knowledge, or how reality is messy and hard to predict, rather than being due to your own ignorance/inexperience
Part IV: Changing Your Mind
Chapter 10: How to Be Wrong
- 142: talking about Philip Tetlock’s studies with superforecasters: superforecasters tend to treat errors in prediction as opportunities to hone their technique, so they tend to be happy/comfortable thinking about what they got wrong.
- 146: “most of the time, being wrong doesn’t mean you did something wrong” - Galef suggests that learning to incrementally update your beliefs as new evidence comes in is a much better approach than getting worked up about how you got something wrong
Chapter 11: Lean in to Confusion
- 154: Charles Darwin’s “golden rule” to fight motivated reasoning: “…whenever a published fact, a new observation or thought came across me, which was opposed to my general results, to make a memorandum of it without fail and at once; for I had found by experience that such facts and thoughts were far more apt to escape from the memory than favourable ones.” Darwin’s practice of this golden rule is part of the reason his theory proved to be so robust (e.g. figuring out how sexual selection worked by attending to the example of the peacock’s tail)
- 157: Galef notes that when someone’s behavior appears stupid/irrational/etc., it could be that that person is stupid/irrational, but it’s much more likely that you’re missing something. “This is a point that top negotiators all emphasize: don’t write off the other side as crazy. When their behavior confuses you, lean in to that confusion. Treat it as a clue. You’ll often find that it leads to the information you need to resolve the negotiation.”
- book mentioned: Chris Voss’s Never Split the Difference, in which he talks about the importance of leaning into confusion during negotiation
- 160-161: when treating cholera cases in the 1850s, homeopathic hospitals had much lower death rates than conventional hospitals. Upon discovering that the London Homeopathic Hospital had half the death rate of mainstream hospitals, a council of scientists decided to exclude data from that hospital from a study they were conducting on methods of treating cholera. Had they instead leaned into the confusion, they would have discovered that the difference had nothing to do with homeopathic remedies: the homeopathic hospital emphasized hygiene, ensuring blankets had been sterilized before being given to a new patient, and had cholera patients drink whey, which replenished the patients’ fluids and electrolytes—“essentially an early version of what we now call oral rehydration therapy, something that didn’t become a standardized treatment for cholera until the 1960s.”
- 161: mentioned: Thomas Kuhn’s The Structure of Scientific Revolutions: about paradigm shifts, in which, through an accumulation of puzzling observations, a scientific field can come to adopt a new and more accurate model of the phenomena it studies
- 164: “A key determinant of whether someone manages to escape [a multilevel marketing company] after a few months or whether they end up entrenched for years[:] Do they notice the anomalies, the aspects of their experience that aren’t what they expected? Do they notice when their attempts to explain anomalies feel forced? Do they allow themselves to feel confused?”
- 167: “Scouts view anomalies as puzzle pieces to collect as you go through the world. You probably won’t know what to do with them at first. But if you hang on to them, you may find that they add up to a richer picture of the world than you had before.”
Chapter 12: Escape Your Echo Chamber
- 170: an experiment where liberals and conservatives were asked to read tweets from Twitter accounts associated with the opposing viewpoint. At the end of a month, conservatives who read liberal tweets had become much more conservative in their beliefs, while liberals who read conservative tweets became a little more liberal (though not by a statistically significant amount)
- 172: to check out: r/FeMRADebates, where feminists and men’s rights activists come together to productively debate the issues that divide them (!!)
- how did they create this culture? some of the rules/guidelines they have in place:
- don’t insult other members
- don’t use epithets
- don’t generalize
- don’t refer to people as a group (e.g. talking about what “feminists” believe), but rather, disagree with individual people or specific views
- 173- : anecdote about Jerry Taylor, the climate change skeptic employed by the Cato Institute, who ended up switching sides in the debate. He did so after debating with Bob Litterman, who ran an investment advisory firm. Litterman argued that catastrophic climate change is a nondiversifiable risk, meaning there’s nowhere you can invest to hedge against the risk of it occurring. Taylor and Litterman also shared a similar ideology: “[Litterman] is from Wall Street. He is kind of a soft Libertarian.”
- 174: Galef argues that, if you share intellectual/ideological common ground with someone, it makes you more receptive to their arguments.
- 178-179: “Are you sure that none of the absurd-sounding ideas you’ve dismissed in the past aren’t also misunderstandings of the real thing? Even correct ideas often sound wrong when you first hear them.”
- 179: “When we encounter a good argument that’s new to us, we often mistake it for a bad argument we’re already familiar with.”
- 182: in general, when considering different viewpoints, you should be searching for the version of the idea that is most likely to make you change your mind. One possible litmus test: “If reading someone does not make me feel more compassion toward their perspective, then I keep looking.”
Part V: Rethinking Identity
Chapter 13: How Beliefs Become Identities
- 187: Galef notes that beliefs are more likely to ossify into identities when you feel like you’re embattled, struggling to survive in a hostile world. And it’s worth noting that both sides of a debate can feel like they’re the embattled side.
- cf. Ezra Klein’s claim that both the left and the right feel like they’re losing the political battle in the US
- 193: Galef suggests that, if you preface a claim with “I believe”, it’s often an indication that you hold the belief as part of your identity.
- 193: “When you feel the urge to step in and defend a group or belief system against perceived criticism, chances are good that your identity is involved.”
- 195: Labels (“-ist”, “-ism”) can be useful as a practical description of your beliefs. But if you come to hold one of your “ism”s as part of your identity, it can cloud your judgment. One way to tell that this is going on is when it feels important to gatekeep / police the boundaries of an identity (i.e. you’re not a true {blank} unless you believe {blank}).
- 196: “If you use epithets… in talking about a particular issue, that’s a sign you’re viewing it as a fight between people, not ideas.”
- 197: it becomes especially tricky to change a belief once you’ve publicly argued in favour of it.
Chapter 14: Hold Your Identity Lightly
- 200: since your identity can deeply shape your thinking, Galef suggests holding your identity lightly: treat it as a convenient description, and not more than that. “Holding an identity lightly means thinking of it in a matter-of-fact way, rather than as a central source of pride and meaning in your life. It’s a description, not a flag to be waved proudly.”
- 203: to test whether you really understand an ideology that’s not your own, try the ideological Turing test - are you able to explain the ideology well enough that someone on the outside would not be able to tell whether or not you actually hold that belief?
- 206: when you feel morally/intellectually superior to someone else, it’s hard to change their mind.
- 209-210: Galef notes that, when working toward a goal, not all actions are equal. Along one dimension, the action can be very effective, slightly effective, ineffective, or even counterproductive. And along another dimension, an action can be more or less identity-affirming. When you hold your identity lightly, it becomes easier to focus on the effectiveness of your actions—the dimension that really matters—and less on how much the action affirms your identity. {f activism-matrix}
Chapter 15: A Scout Identity
- 217: Galef suggests adopting “scout” as one of the core parts (or even the core part) of your identity. “If you pride yourself on being a scout, it becomes easier to resist the temptation to mock someone who disagrees with you, because you can remind yourself, ‘I’m not the kind of person who takes cheap shots,’ and get a hit of pride. It makes it easier to acknowledge when you make a mistake, because you can tell yourself, ‘I don’t make excuses for myself,’ and feel a rush of satisfaction. And sometimes, those hits of pride or satisfaction are enough to make the scout path more tempting than the soldier path.”
- 219: potential for a series of blog posts: things I’ve been wrong about
- 219: Galef suggests that if you want to become good at practicing scout mindset, you should surround yourself with others who care about practicing scout mindset.
- 223: to check out: changeaview.com, where people post a view they’re willing to shift their beliefs on, and solicit arguments against this view. People get points (like reddit karma) for shifting a person’s view at least a little bit, and thus effective communication strategies are systemically encouraged.
- 226: potential for a series of blog posts: things I’m not sure about
- 230-231: list of potential activities/habits to implement, to make incremental steps toward being more scout-like
Posted: Sep 26, 2021. Last updated: Aug 31, 2023.