THREE NEW BOOKS: TWO — JULIA GALEF’S “THE SCOUT MINDSET”

By Kevin Currie-Knight

____

Early in her book, The Scout Mindset: Why Some People See Things Clearly and Others Don’t (Portfolio, 2021), Julia Galef comes right out and says: “Motivated reasoning is so fundamental to the way our minds work that it’s almost strange to have a special name for it; perhaps it should just be called reasoning.” Jesse Singal’s The Quick Fix chronicled how social scientists can fall prey to such motivated reasoning. If the ideal is objective scientists dispassionately following evidence wherever it leads, the reality is one of scientists with human commitments and biases sometimes coming to bad conclusions that they defend doggedly. So, what chance do the rest of us have of reasoning well?

The Scout Mindset is Julia Galef’s advice manual on how to reason better, and her approach differs a bit from comparable manuals that have come before it. Galef thinks the primary problem is that we reason as “soldiers” when we should be reasoning as “scouts.”

We are probably all familiar with those who reason as soldiers and have probably done our share of it ourselves. The aim – whether conscious or not – is to win. When one is challenged with data or arguments, the soldier’s job is to refute them rather than honestly consider them; to defend one’s existing position rather than dispassionately submit it to challenge. Taking a cue from Lakoff and Johnson’s Metaphors We Live By, Galef reminds us that this type of reasoning is so typical that it is even reflected in the way we often talk about reasoning. Arguments are either forms of “attack” or “defense.” If we are not careful, someone might “poke holes” in our logic or “shoot down” our ideas. We might encounter a “knock-down” argument against something we believe. Our positions might get “challenged,” “destroyed,” “undermined,” or “weakened.” So we look for evidence to support, bolster, or buttress our position. Over time, our views become reinforced, fortified, and cemented, and we become entrenched in our beliefs, like soldiers in a foxhole.

The scout, by contrast, has a job that demands detachment from battle. The scout, writes Galef, is interested in seeing as clearly as possible, even when that means giving up on a strategy to which one was previously committed. Rather than reasoning as a means to vindicating our current beliefs, Galef wants us to become more like the detached scout, who remains ever open to challenge.

Since motivated reasoning has a social dimension (friendships, reputation, pride, etc.), Galef advises that we reframe how we think about error, from “I’m wrong about this,” to “I’m recalibrating for future accuracy.” Since motivated reasoning can come from our biases, she also suggests finding ways to examine our current beliefs at a level of imagined detachment (imagining ourselves placing a bet on our current position) or real detachment (learning ways to better acquaint ourselves with and take seriously opposing voices).

Most problematic, in my view, is Galef’s advice about learning to be more detached from our beliefs; as she puts it, “learning to hold your identity lightly.” This may be my own postmodernist tendencies talking, but it strikes me as asking people to do something unrealistic. She spells it out here: “Intelligence and knowledge are just tools. You can use those tools to help you see the world clearly, if that’s what you’re motivated to do. Or you can use them to defend a particular viewpoint, if you’re motivated to do that instead.” [p. 49]

I find it implausible that there is any sort of divide between those who see the world clearly and those who argue as “soldiers,” let alone that such a divide stands much chance of being discovered through introspection. Anyone who has been involved in a heated argument knows how slim the chances are that “Hey, you should take a step back and see things more clearly” will change the tenor of the dispute. Soldiers rarely think that they are scouts, and being a soldier cannot be reduced to “not seeing the world clearly.” We form beliefs because we think they are correct, and once we do, our inclination is to hold them strongly, yielding only to beliefs that seem to us demonstrably better.

I think much of Galef’s advice is sensible, particularly the bits about finding ways to approach our beliefs as if they were not our beliefs. [1] What I find unconvincing is when she treats bad reasoning as a problem primarily of individual characteristics rather than social ones. For instance, Singal’s book is about how social scientists get things wrong by engaging in motivated reasoning, but a careful look suggests that most of these errors in reasoning are products of social incentives and structures. Scientists expressed overconfidence in their conclusions, but within a system of journal publishing that rewards this sort of thing. Scientists defended their claims against evidence that more neutral parties would have found convincing, but in very public forums where there is little good that can come from publicly admitting error. We can and should expect social scientists to reason well, but we should also acknowledge the social factors that make it harder for them to do so.

If I reflect on the many times I’ve changed my mind – or when I haven’t! – I can readily spot the environmental factors that affected the decision. I am more likely to change my mind after a heated argument (or many heated arguments), when I am able to calmly think to myself and not have to admit in the heat of the public moment that my position was wrong. I am more likely to consider and concede others’ points when I respect the other person, when we are talking privately rather than in front of an audience, and when I trust that they are not looking for victory. I notice that debates on social media bring out the soldier in me faster than I’d like, and I suspect this has less to do with me than with incentives that the social media companies have introduced, after a lot of thought and research into what drives engagement.

None of this means that Galef’s recommendations are bad. Sometimes, we can be better reasoners by pulling ourselves up by our cognitive bootstraps, however biased those bootstraps may be. I do wish, however, that some of her advice pertained to recognizing the importance of our surroundings and the incentives involved in how we reason. If you want to be less soldier-like on social media, Galef’s advice can probably move you a bit in that direction. But what will likely help even more is to realize the ways in which social media incentivizes you to remain in the soldier’s mindset.

In the end, I suspect that Aristotle made a mistake when he suggested that what distinguishes us from brutes is our rationality. It’s true to a degree, but I think that what really separates us has more to do with humans’ capacity to believe, and to believe strongly. Frankly, we are better at belief than at reasoning to (or out of) it. That’s why there are so many books on how we might reason better, while few find much need to produce books with advice on how to come to quicker and more strident beliefs. The last of my three book reviews will be on How to Keep an Open Mind, a recently translated set of writings by Sextus Empiricus. If the question is how best to form beliefs, given the possibility of error, Sextus’s advice is to do what you can to avoid forming them at all. We’ll see how that goes.

Notes

[1] Like Vygotsky and Fernyhough, I believe that most of what we do when reasoning critically is dialogic, by which I mean that we are essentially internalizing the type of dialogue we might have with differently situated others. “It is justified because x. Why do I think that? Because y. But couldn’t someone plausibly argue against y?” If this is right, better critical reasoning means being better able to incorporate the voices of intelligent critics into our internal dialogues, and being more willing to do so.

9 comments

  1. I haven’t read Galef, but it seems to me that the “scout mindset” is non-committal and relevant when we’re at the exploratory stage of thinking. Once you firmly commit to a particular theory, motivated reasoning kicks in and allows you to discover patterns that you might otherwise have overlooked. If you stay in the “scout mindset” too long, you deprive yourself of the building blocks for a good theory. Commitment is necessary for building theories; it’s not optional.

    1. Yes, I think that is right. It is also worth mentioning that how much of a scout or a soldier one is depends on how much stake you feel in a particular belief, how important it is to you. There are many things I believe that I don’t at all care to defend, because I don’t much care if I’m wrong about them. They either aren’t terribly important to me (“What do the lyrics to that song say?”) or are things that may be important but about which I readily admit that, while I have a belief, they are outside my area of expertise (“Are masks effective at fending off COVID?”).

      I suppose my concern with Galef is that she writes as if she is talking about the importance of being a scout regarding beliefs we consider very important. And while I can see that this is a noble effort, I also see it as an ultimately quixotic one. If you believe that x is a matter of life and death, or that you are engaged in defending x for important reasons against a tribe determined to deny x and to do what you think is damage to the world, I can’t see what chance you have of being much of a scout.

      While I am at it, I should also mention another thing I didn’t put in the essay. The funny thing about argument is that it fails to happen at either extreme. If all parties are warriors, discussion becomes fruitless and frustrating very quickly. But if everyone is a scout, then it is possible that no one will hold a belief strongly enough to argue a point with any force. To be fair, I’ve heard Galef (on the Coleman Hughes podcast) recognize that it isn’t that scouting is always good and warring is always bad. But the fact is that we need not only people who are willing to hold their beliefs gently, but also those willing to hold their beliefs strongly.

  2. I haven’t read Galef’s book, so I can only go on Professor Currie-Knight’s representation of her. I can see why her thesis might be initially compelling, especially in today’s social-media-driven environment of rancor. I am not sure I see, though, how what Galef argues for is much more than a platitude. For all the use of the term skepticism as a virtue, global skepticism is a self-refuting idea. That one should be open to dissenting evidence or argument seems trivially true to me, just as it seems trivially true that one simply could not function in the world without having a definite set of beliefs about what is correct. Perhaps I am taking the metaphor too far as well, but since the scout acts in service to the soldier, mustn’t the scout ultimately hold the same beliefs as the soldier?

    Are there ideas that are not open to challenge? I’d say a lot of them. Not because they are too controversial to handle, but because they have been so well established that their denial would invite more questions than it answers. Can adult-child sexual relations be normalized and still be consistent with a do-no-harm principle? No; there is massive anecdotal and empirical evidence of long-term trauma and dysfunction. Is the Roman Empire a fictional construct? No; you would have to explain away millions of pieces of physical and referential evidence that point to its existence. Was there a global flood a few thousand years ago that wiped out virtually all life on Earth? No; none of the obvious signs, such as extinct civilizations or human and animal migratory patterns, are there. I’m perfectly fine with these questions being raised every generation, and I loathe any attempt to limit free discourse, but questions such as the above are easily answered and no longer require lengthy debate.

  3. My first impression of this dyad is – another mnemonic gimmick. There are positions we regard as redoubts; others are risky and may be subject to strategic withdrawal. The situation is fluid, catastrophic but not serious; we await reports, but information is fitful, sketchy, and firmly provisional.

    Julia is a rationality expert, a very springy branch indeed. On the whole I’d rather stay where I am, muddling along in non-rationality. The company is good.

    1. I find all of these sorts of books essentially platitudinous. It’s a sad indication of the average person’s critical acumen that there even is a market for this sort of thing.

  4. Aren’t scouts a type of soldier? Has anyone ever ventured away from the safety of tribe/city/nation just to poke around for the heck of it? I’m being nit-picky. But maybe I’m not? Anyhow, it’s an interesting pitch she’s throwing. I’ve listened to a couple of podcast appearances of hers and she certainly comes off as affable but I’ve found her mostly unpersuasive. Yes, I’d certainly like to see the world more rationally, when I’m trying to influence circumstances to get what I want. But we already have Robert Greene to help with that, I guess?

    “I find it implausible that there is any sort of divide between those who see the world clearly and those who argue as ‘soldiers,’ let alone that such a divide stands much chance of being discovered through introspection.”

    Bingo. And this kinda gets to my quip about scouts being a type of soldier. I’m skeptical that there is such a thing as non-motivated reasoning. Even if the motivation is “reading about Corded Ware culture gives me intellectual pleasure” – it’s still a motivation. Conversely, I can’t count the number of books I’ve put down because the subject just stopped interesting me halfway through. I haven’t read Lost in Thought by Zena Hitz yet, but I imagine “just because I find it interesting” is the reason most people think about most things. And I don’t think people need more reason than that. Clearly rationalists are very passionate about being rational, but they might be a little more tolerable if they admitted that passion itself maybe isn’t all that rational. In the podcasts I’ve listened to, Galef at least comes across as a good deal more pleasant than other rationalists I’ve heard.

    “If I reflect on the many times I’ve changed my mind – or when I haven’t! – I can readily spot the environmental factors that affected the decision. I am more likely to change my mind after a heated argument (or many heated arguments), when I am able to calmly think to myself and not have to admit in the heat of the public moment that my position was wrong.”

    I think you get at something very important here. There’s an inspo-quote I often see floated around Facebook: people don’t remember what you say, only how you made them feel when you said it. It seems sorta trite and shallow, but I think it is something most people who think for a living would do well to remember. I dropped the only philosophy class I ever took because the professor was a boorish ass. I didn’t care how many publications he had; that hour was better spent day-drinking and trying to get laid, and twenty years later I have no regrets. I made my own way to philosophy eventually. And I still think he was an ass.

    The other day my girlfriend and I watched the documentary Behind the Curve, about flat-earthers. The ending was genuinely great. I highly recommend it. But it got at the reasons why people indulge bonkers beliefs and it’s all about a need to be social. I think, in a way, the same thing goes for the rationalist community.

    One last thought: I have a friend who, despite being fully vaccinated, still wears her mask in public. I didn’t give her a hard time about it because I’m (usually) not a jerk. But when I asked, she gave me all sorts of scientific-sounding explanations that took her a while to ramble off. None of it was very persuasive given the data we have on hand. So I just told her it’s okay to admit you’re scared. It’s been a scary year; there’s nothing to be ashamed of. I think when we stop demanding so many explanations of others (and ourselves) and admit there’s no escape from the fact that we’re emotion-driven creatures, life gets a lot easier (or at least we get a lot better at dealing with the fact that life isn’t easy, I don’t know).

    Anyhow, I’ve enjoyed these posts and I’m looking forward to the next one!

  5. “Julia is a rationality expert”

    This so-called rationalism is a sterile pretentiousness designed to give authority to prejudice.

  6. “Aren’t scouts a type of soldier?”

    Yes, exactly. They act to collect information in the service of a given point of view. I know this very well from my own military service. And we all know what Julia Galef’s point of view is. I would start to believe her if she discovered points of view that did not agree with her, ahem, prior embedded modes of thought.
