Have you ever wondered why some people are able to think about the world in a clearer way, forming more balanced and nuanced views about controversial topics, than others? Have you pondered what thinking patterns are most conducive to good reasoning and well-supported conclusions, and how one might avoid the pitfalls of confirmation bias and self-deception? In her book The Scout Mindset: Why Some People See Things Clearly and Others Don’t, Julia Galef (host of the podcast “Rationally Speaking” and co-founder of the Center for Applied Rationality) attempts to answer these questions.1 Here, I shall summarize Galef’s insights and discuss what lessons we as scholars engaged in the debate over ID and evolution can glean from the book.

Photo credit: BORNTHISWAYMEDIA, CC BY-SA 4.0, via Wikimedia Commons.

Galef distinguishes between what she dubs “the soldier mindset” and “the scout mindset.” According to Galef, the soldier mindset, also known as motivated reasoning, leads us to loyally defend the stronghold of our belief commitments against intellectual threats, come what may. This involves actively seeking out data that tends to confirm our beliefs, while rationalizing or ignoring contrary data that tends to disconfirm them. On the other hand, the scout mindset attempts to honestly determine how the world really is; as Galef defines it, the scout mindset is “the motivation to see things as they are, not as you wish they were” (p. ix). For someone in the soldier mindset, argues Galef, reasoning is like defensive combat: “it’s as if we’re soldiers, defending our beliefs against threatening evidence.” For the soldier, to change one’s mind (to admit that one was wrong) is seen as surrender and failure, a sign of weakness. One’s allegiance is to one’s cherished beliefs rather than to the truth, even if those beliefs conflict with the balance of evidence. For the soldier, determining what to believe is done by asking oneself “Can I believe this?” or “Must I believe this?”, depending on one’s motives. For the one in scout mindset, by contrast, reasoning may be likened to mapmaking, and discovering that you are wrong about one or more of your beliefs simply means revising your map. Thus, scouts are more likely to seek out and carefully consider data that tends to undermine their own beliefs (thereby making one’s map a more accurate reflection of reality), deeming it more fruitful to pay close attention to those who disagree with their own opinions than to those whose thinking aligns with theirs.

The prevalence of the soldier mindset in society today is aptly demonstrated by a sobering study, cited by Galef, in which participants were tested in regard to their “scientific intelligence” with a set of questions.2 Questions were divided into four categories: basic facts, methods, quantitative reasoning, and cognitive reflection. Remarkably, when conservative Republican and liberal Democratic participants were also asked whether they affirmed the statement that there is “solid evidence” of recent global warming due “mostly” to “human activity such as burning fossil fuels,” there was a positive correlation between “scientific intelligence” and divergent opinion. That is to say, the higher one’s scientific intelligence, the more likely a liberal Democrat was to affirm the statement and the more likely a conservative Republican was to disagree with it. This is not the only study to reveal the tendency of more educated people to diverge in opinion on controversial topics. Another study surveyed people’s views on ideologically charged topics, including stem cell research, the Big Bang, human evolution, and climate change.3 The finding was that “Individuals with greater education, science education, and science literacy display more polarized beliefs on these issues,” though the study found “little evidence of political or religious polarization regarding nanotechnology and genetically modified foods.” Galef summarizes the implications of those studies: “This is a crucially important result, because being smart and being knowledgeable on a particular topic are two more things that give us a false sense of security in our own reasoning. A high IQ and an advanced degree might give you an advantage in ideologically neutral domains like solving math problems or figuring out where to invest your money. But they won’t protect you from bias on ideologically charged questions.”