I’ve been reading a lot about beliefs and about how our minds work, and I’ve wanted to write something about it for a long time. I’m fascinated by the difference between disagreement that is healthy and constructive, and disagreement that is toxic and destructive. I’ve been putting it off, subconsciously thinking I had to wait until I knew everything before I wrote anything. Then I could write the perfect article, chock-full of footnoted links to research. It would answer every question and explore every nuance.
Could I do that without writing a thousand-page textbook? No, there is too much. So I’ll take the Inigo Montoya approach and sum up. There are some fascinating ideas from evolutionary history and psychology that explain a lot of the animosity surrounding disagreement in today’s conversations about healthcare, politics, and religion. The source material is a blend of Harari’s Sapiens, Haidt’s The Righteous Mind, Kahneman’s Thinking, Fast and Slow, and David McRaney’s entertaining and insightful podcast You Are Not So Smart.
To begin, Harari says there are three reasons why Homo sapiens have been so successful since first showing up roughly 200,000 years ago:
- Homo sapiens learned to harness fire to cook food, which increased our caloric intake and enabled significant brain growth.
- Larger brains led to the development of language, including the ability to tell stories and produce fiction.
- We began to use language, stories, and fiction to coordinate tasks and operate in groups, creating tribes, communities, and eventually cities.
Those three evolutionary ideas lead to some fascinating psychological results.
One, our beliefs aren’t built on pure rationality. Our survival historically depended on maintaining group status, not on accuracy. As a result, our views have more to do with group dynamics than with a dispassionate search for truth. When faced with a new idea, our group-centered minds ask, “How will this affect my group standing?” You’ve seen this in action if you’ve ever watched someone base a political opinion on whether or not it came from their party’s leader, as opposed to the merits of the policy itself. We instinctively align ourselves with our group.
Two, our evolutionary wiring causes us to trust people like us. Shared traits (skin color, political ideology, country of origin, socioeconomic standing) lead to increased trust. Dissimilarity decreases instinctive trust. These tendencies come from the deeper, preprogrammed parts of our minds. While they can be overridden with conscious reasoning, conscious rationality is not the default tool we use to evaluate new ideas. Trusting people who are different requires work, and our efficiency-focused brains don’t do that work automatically.
Three, we are unable to independently verify everything we know. Faced with a vast surplus of things we could pay attention to, we filter and accept messages from people we trust. Our brains prioritize efficiency over accuracy. For example, I haven’t personally made the observations that show the earth is round or that it moves around the sun. I trust the scientific community when it tells me these are settled issues. Nearly everything you and I “know” is borrowed from someone we trust.
Four, stories are more powerful than facts. The first narrative we hear from a trusted source (someone like us) is likely the one we will use to decide whether we accept or reject the other facts that come to our attention. In fact, exposure to factual information that goes against our narratives causes us to dig in to our viewpoints even more. Contradictory evidence strengthens opposing conclusions, which is both fascinating and disappointing if you’ve ever tried to change someone’s mind.
These four points go a long way toward explaining why good people disagree about important issues. Our beliefs are based more often on identity, on maintaining group status, and on whom we trust than on a dispassionate examination of the facts.
That doesn’t mean our beliefs are predetermined. These are all tendencies we can overcome, with effort.
It does mean that arriving at The Truth (or some approximation of it) involves acknowledging that we have these tendencies and navigating around them.
It should also give us pause about how we arrive at our own conclusions. Do we believe something because it feels right, in spite of any contradictory evidence? Which sources do we believe? And why do we believe them? Is it just because they’re like us? Are they from the same political party or religion? Do they write for a news organization we trust? Or have we done some conscious and rational examination of the evidence presented, putting in the extra work of looking up alternative viewpoints?
Finally, it means we should be “slow to speak, slow to wrath” (James 1:19) when we run into those who have come to conclusions different from our own. It’s natural to conclude that your crazy relative with weird or conspiratorial beliefs has been brainwashed or is evil. It’s completely natural, but it’s also wildly ineffective if you hope to engage in anything approximating intelligent disagreement.
Idiocy and malice are not the sole sources of disagreement. More often than not, we have all arrived at different conclusions as a result of evolutionary tendencies that lead us to adopt the differing narratives we use to make sense of the world. To an extent, we are all someone’s “crazy relative.” Recognizing the psychological processes we all follow to reach conclusions can steer the conversation away from heated animosity and toward examining the underlying and unspoken narratives that cause most differences of belief.