How can we protect ourselves from bad ideas and wrong information?
Mental Immunity: Infectious Ideas, Mind-Parasites, and the Search for a Better Way to Think
Andy Norman 2021
How can anybody be so sure about anything these days? In an age of mushrooming conspiracy theories, people are more confused than ever about what's fake and what's real. And unscrupulous politicians, media, and businesses are taking advantage of that confusion to sow doubts to push their own agendas. How do we deal with this?
Mental Immunity is an attempt by an actual philosopher to tackle the question of epistemology, or how we know things. Dr. Andy Norman is a philosophy professor at Carnegie Mellon University and he has put together a timely guide through the history of knowledge to try to help us navigate the confusing world of information and fake news. This is a deep read, but it's worth it to get some perspective on the struggles that have befallen humanity since the times of the Greeks.
Norman describes the problems of bad ideas and fake news, which he calls "mind parasites." The Covid-19 pandemic has produced a cascade of rumors and fake news that has killed thousands of people. Things like hydroxychloroquine, horse medicine, and bleach injections have been proposed to treat the disease, while established scientific advice such as vaccines, masks, and social distancing has been attacked as government overreach. Where do bad ideas come from? How do you tell what qualifies as a bad idea? And what can you do to protect yourself and those you love from destructive "mind parasites"?
Inspired by Covid, Norman uses the analogy of infection and vaccines throughout the book. Like a virus, bad ideas can spread from person to person and turn people into unwitting carriers of disinformation and conspiracy theories that they feel in their bones to be true. He provides a list of six immune-disruptive ideas and their antidotes:
- My beliefs are private and nobody's business. (Only true if no actions that hurt others come from those beliefs.)
- We have a right to believe what we want to believe. (Again, if beliefs turn into real-world results, like the Holocaust or slavery, we don't have that right.)
- Values are subjective. (Tastes are subjective; some values are universal.)
- We have no standing to criticize others' value judgements. (If they are directly harming others we sure do; see climate change, for instance.)
- Basic value commitments are not subject to rational examination. (We all have basic principles, but if yours involve killing or enslaving others, they deserve scrutiny.)
- Questioning a core commitment is fundamentally intolerant, mean, or unkind. (Cancel culture seems oppressive at times, but there are legitimate questions to be asked.)
Norman next takes a deep dive into a basic philosophical dilemma: how can you know something to be true? One camp relies on faith in basic beliefs and principles that can never be questioned. This is the path of religion and partisan politics, and it has gotten humans into a lot of trouble while giving them a comfortable base of certainty from which to work.
The other camp is the way of inquiry, championed by Socrates, where everything is up for examination and nothing is sacred or assumed. The problem that humans have faced for centuries is that the way of inquiry can lead to an endless series of questions that only produce more questions, a trap called infinite regress. At its worst, constant inquiry leaves people feeling lost and powerless as they grasp for any kind of truth. To avoid that trap we've relied on faith in unquestioned principles that are immune from inquiry. But where do you find the principles that can't be questioned? And what happens if things change and the old principles don't apply anymore?
The way of inquiry, aka scientific inquiry, is great in some instances but misses the mark in others. Faith leaders point out that having strong bedrock principles protects people from chaotic events, and having strong vision helps create a more desirable future that doesn't necessarily match current conditions. Norman calls this the downstream effect of faith. People of faith ignore the upstream causes of something so that they can focus solely on their desired results. You can see this dynamic at work today, with motivated reasoning ignoring medical advice while latching onto more congruent and desirable miracle cures.
Norman proposes something called reason's fulcrum, where we are supposed to yield our beliefs when the evidence stops supporting them. No belief in that way is sacred, and if better reasons are discovered, norms and beliefs should follow. This goal has been derailed by a bevy of cognitive biases that keep us from seeing things we'd rather not see and learning things we'd rather not learn. Because our beliefs become a part of our very identities, we cling to them even when they stop serving us. Cognitive biases divide people into camps, and when someone plays the faith card in a debate and says "just because," the exchange dies, especially when both sides have opposing faiths that can never be reconciled.
So after many pages of history and philosophy, the author comes down to what he calls a mind vaccine that will protect the bearer from bad beliefs. That vaccine comes down to one sentence: "a belief is reasonable if it can withstand the challenges that genuinely arise." The key word here is "genuinely." Many challenges to established knowledge are without merit, made in bad faith, or baseless. Rather than put the responsibility on society to defend against every challenge, the onus has to be on the challenger, who must offer real proof that their challenge is legitimate. Listening to conspiracists on Facebook and elsewhere, this is where their arguments fall apart: their claims come either from someone who has no experience in the field in question, or from someone who has something to gain by propagating the idea.
In other words, when someone comes at you with a fanciful but suspicious belief, the best protection is to ask them questions. Why do you believe that? Where did you get that information? Why is the rest of society wrong on this, and how can you be so sure of your source?
Many people today have bought into a grand conspiracy theory that all the scientists, teachers, media professionals, and world leaders are hiding the truth for nefarious reasons. This kind of grand conspiracy theory blocks any possibility of intelligent debate. Any conspiracy theory becomes unassailable, because all the discounting evidence comes from them, the bad guys. It reminds me of a joke:
Alex Jones dies and goes to heaven after spending most of his life spreading the conspiracy theory that the world is flat. He gets to heaven and expresses his frustration to God that people didn't believe him, to which God replies that earth was indeed spherical the entire time. Alex stares at God and shakes his head. "Wow, this conspiracy goes much higher than I ever thought!" he says to himself.
The grand conspiracy is a fiction, but it enables so many more conspiracy theories because it says that you can trust no one. And it is true that when it comes to knowledge and information, everybody has a bias and an agenda, and you should always keep those in mind when deciding whom to trust. But in the end the world is never fully knowable, and the best we can hope for is a partial picture guided by inquiry and experimentation.
Dr. Norman offers a 12-step program that encompasses much of what was said in the book, and it's good general advice on keeping your information intake clean of nasty viral content. Here it is:
1- Play with ideas and test them, don't fall for them too quickly.
2- Minds are not passive knowledge receptacles. We are always re-learning.
3- You are not entitled to your opinions.
4- Don't use bad faith arguments. Treat people with respect even if you disagree.
5- Be prepared to unlearn things you thought you knew.
6- Figure out how to order your thoughts.
7- Don't cop out by thinking "who's to say?" We all have a responsibility to correct people who are wrong.
8- Value judgements can be objective.
9- Treat challenges as opportunities instead of threats.
10- Join communities of inquiry, not communities of belief.
11- Upgrade your understanding of reasonable beliefs.
12- Don't underestimate the value of ideas that have survived scrutiny. They are generally better than the alternatives.
Mental Immunity proposes a middle path between absolute faith and absolute skepticism that is admittedly squishy. There is a base of knowledge, mostly established by scientific inquiry, that generally can't be disputed without some sort of proof. We build from that base with empirical studies and experiments that add data, from which we can hopefully reach general agreement on what is true. Science can't deal with death, the supernatural, personal tastes, or human biases, so there remains a huge gap in which beliefs and opinions still matter, but at least it's something. And hopefully at some point we can agree on the big things like economic fairness, racism, and climate science, though I'm not holding my breath.
Is this book a best-seller? No, of course not. The current top 10 NY Times non-fiction list is populated by political books that tell people what they want to hear, written by ideologues who profit from telling lies and half-truths. Books like this one, thoughtful, philosophical, and dedicated to finding elusive truths, are for a small audience, but hopefully an influential one. If enough of us take in the simple truth that our beliefs and opinions impact the lives of others in ways we don't realize, then perhaps we are making progress.
We are besieged by more information than humanity has ever had to deal with, and some of it seems urgent and existential. Sifting through it all to come up with a workable model of how to live your life is the challenge of the 21st century. Getting fooled by mind parasites and bad information is the great danger of our information age, and we need to be vigilant but humble when forming our beliefs and ideas before sharing them with others.