A friend recently wrote to me with some questions about conspiracy theories, in addition to offering push-back to a statement I made to the effect that I was uncomfortable with conspiracy theories. Below is my response to this friend.
Thank you for your questions. When I said I was cautious about “conspiracy theories,” you asked how I defined “conspiracy theories,” and why I hesitated to get on board with some of the theories you mentioned, including conspiracy theories about COVID. Those are all great questions that I want to carefully address.
My understanding is that a conspiracy theory is an explanation for an event that challenges the mainstream explanation for that event, while maintaining that there has been deception to cover up the real cause of the event. Thus, for something to be a conspiracy theory, three elements have to be in place:
(1) an event (i.e., a building blowing up, an alleged moon landing, a pandemic, etc.);
(2) an explanation of said event that is accepted by the mainstream, where “mainstream” means people within a culture’s dominant institutions, such as the media and universities;
(3) a theory that the mainstream explanation is false and based on deception.
Now for your next question about why I hesitate to get on board with conspiracy theories in general, and COVID-specific theories in particular. In order to answer these questions, I need to lay some groundwork, and address some big picture issues adjacent to your question.
Labeling something as a “conspiracy theory” is often used to incite prejudice and to serve as a discussion-stopper, to dismiss an idea as weird, and thus to sidestep the need for proper investigation. Conspiracy theories have often been proved to be true, and the past couple of years have furnished examples of this. (I discussed one such example in my Salvo column last year here: https://salvomag.com/post/the-struggle-for-information-literacy-and-how-i-was-banned-from-youtube). But even so, I believe it is unhealthy to delve into conspiracy theories for the following reasons.
First, in the majority of cases, the actionable items that would follow from a conspiracy theory being true are the same actions we should be performing even if the conspiracy theory is false, namely prayer, virtuous living, and preparing ourselves to suffer. While we could imagine various hypothetical scenarios that might offer counter-examples, in most cases the practical cash-value of knowing that a certain conspiracy theory is true is usually minimal.
Second, in time most credible conspiracy theories eventually come to light for those willing to be patient. To try to figure out which ones are correct, and thus stay ahead of the curve, takes an enormous amount of work that might be more profitably devoted to expunging our thinking of worldly assumptions and the structural motifs that form part of the lens by which modern people view the world. And here there is a certain irony: as I pointed out here,
“while the conspiracy theorist mindset aims to avoid being taken in or duped by conventional wisdom or mainstream information sources, this rarely prevents the person becoming gullible to the basic tropes and structural motifs that form part of the unquestioned background for how we, as modern people, understand and navigate the world.”
Third, I have noticed that discussions about conspiracy theories tend to hinge on a type of Gnosticism or secret knowledge that is not publicly accessible to all. Here’s what I mean. When I’ve participated in discussions of conspiracy theories and I ask something like, “How do you know that’s true?” or “What are the three main pieces of evidence to support what you’ve said?” I’m rarely, if ever, given actual pieces of evidence, or actual articles on which I can perform due diligence. Instead, I am habitually presented with a whole system or worldview that one has to first buy into as a whole before any specific claims make sense. For example, one conspiracy theory is used as evidence for another conspiracy theory (i.e., “because the moon landing was faked, therefore COVID is probably also a hoax”), which ends up giving you claims that cannot be verified or falsified, since it is a circular, self-supporting system. This is the leaky bucket fallacy: one leaky bucket inside another leaky bucket still gives you a bucket that doesn’t hold any water. So this ends up with an insular type of secret knowledge that only makes sense to people already “in the know.”
Fourth, conspiracy theories can be unhealthy regardless of whether they are true or false, just as spending too much time thinking about the occult can be unhealthy even if the things you are thinking about actually exist. But why do I think conspiracy theories are unhealthy? Because they often lead to a deficit of intellectual virtue.
To understand intellectual virtue, we need to learn from our friend Aristotle. Aristotle taught that a persuasive speaker/writer must appeal to his audience at three levels: (1) logos, (2) ethos, (3) pathos. Logos is the appeal to reason and is related to our word “logic.” Ethos appeals to the writer’s character: his virtue and personal grounds for reasonableness/trustworthiness. Pathos, in turn, appeals to the emotional sympathies of the audience or reader – something which the rational Aristotle tended to downplay.
But what does Aristotle’s three-fold system have to do with conspiracy theories? Simply this: in the world of conspiracy theories, ethos is habitually left out. Most conspiracy theories arise because we’ve read things on the internet or watched videos on YouTube from people we don’t know, whose personal histories are unknown to us, and with whom we haven’t been able to talk to find out if they are reasonable, wise, trustworthy, and intellectually virtuous people. I know that it would seem quixotic to stop and say, “I need to find out if this person is virtuous before I can fully assess his claims.” But in actual fact, it is we moderns who are strange, in that we have entirely divorced traits like personal reasonableness from content, and then routinely disseminate information without knowing if the person who produced this or that video, or this or that article, can pass tests of reasonableness.
One of the ways we can assess a person’s reasonableness (again, at the level of Aristotle’s Ethos, as opposed to Logos) is a person’s level of intellectual virtue – what is also sometimes called epistemic virtue. We don’t hear very much about intellectual virtue, and that is because ever since the Enlightenment, we have tended to think of virtue solely in terms of behavior and actions. But for most of human history, the virtues have been understood as dispositions or character traits that are constitutive of human flourishing. Accordingly, the intellectual virtues would be dispositions that are constitutive of right thinking, and therefore form part of someone’s credibility to teach and pass on knowledge. Intellectual virtues would include traits such as the following:
- non-dismissive consideration of arguments;
- charitable interpretation of opposing arguments;
- being able to weigh up different points of view, including listening to opposing points of view with cognitive elasticity;
- awareness of one’s own presuppositions and potential for being mistaken;
- cognitive empathy;
- confidence in reason;
- intellectual humility;
- desire for consistency;
- love of truth.
Many of the above intellectual virtues are outlined in the Wisdom Literature, particularly in passages from Proverbs that contrast the fool with the wise man. Throughout the book of Proverbs, these intellectual virtues are not only preconditions for wisdom but also constitute criteria for determining a person’s reasonableness, and thus his credibility as a disseminator of information.
Now it is a legitimate concern that those who advocate conspiracy theories are, as a general rule, lacking such virtues; moreover, the very mechanisms by which conspiracy theories take hold of people’s imaginations are usually linked to a deficit in intellectual virtue. It will probably be helpful if I offer some examples. Let’s consider love of truth, which is one of the most important intellectual virtues.
In 2020, as everyone was coming to terms with the pandemic, friends approached me with various conspiracy theories, and so I spent dozens of hours and hundreds of dollars researching whether these theories were true. Despite what I said above about normally waiting until truth comes to light, the pandemic seemed close enough to home to justify some level of investigation on my part. So for a while, I investigated about half of the COVID-based conspiracy theories people sent me. In some cases, I published the results of my investigations, while in other cases I would reply directly via text or email to the people who had sent me the theories. Over time, a surprising trend started to emerge. When I would share the results of my investigations with the friends who had initially sent me the theories, or even when I would simply disclose to them that I had initiated a process of due diligence, they were uninterested in what my investigation might uncover unless it supported the initial conspiracy theory. In many cases, conservative Christians were hostile to the very idea of my performing due diligence, while some friends (even church friends) lapsed into functional relativism and various types of post-truth epistemologies to dismiss either what I had discovered or the very process of performing due diligence on conspiracy theories. This was a very eye-opening experience for me, and it showed me that love of truth is often absent from conspiracy theorizing. I wrote up the results of my findings here. But be forewarned: if anything will make you cynical of conspiracy theories, this will! I can’t help but think of Yuri Bezmenov’s warnings about complacency when he said,
“Exposure to true information does not matter anymore. A person who was demoralized is unable to assess true information. The facts tell nothing to him. Even if I shower him with information, with authentic proof, with documents, with pictures…he will refuse to believe it.”
So back to my research in 2020–21. I was researching both COVID-related conspiracy theories and the conspiracy theorists themselves, particularly how the latter responded when they found that I was performing due diligence on the theories they were sharing with me. In addition to a deficit in the intellectual virtue of truth-loving, it also became evident that conspiracy theorists habitually used various heuristics as a substitute for genuine investigation. A heuristic is a rule-of-thumb method for problem-solving that is thought to yield approximate outcomes in a way that bypasses more labor-intensive research. Some popular heuristics I encountered included the following:
- “Using common sense will give you the answer quicker.”
- “You can use your judgement. If it makes sense it is real.”
- “It all comes down to who you choose to believe.”
- “See if it feels right.”
- “When evaluating information, you sometimes have to have faith.”
- “I trust my inner BS sensor because it has a lot of experience behind it.”
When challenged about these heuristics, people would justify them with various post-truth epistemologies that are becoming increasingly common, including:
- the difference between real news and fake news is just a point of view;
- everyone has their bias, their particular spin, so by finding online sources that simply back up my own opinion, I’m just doing what everyone else does;
- everyone cherry-picks resources that fit their point of view – there is no such thing as objectivity;
- by pressing the necessity of these research skills, you are taking away people’s free will, because each of us has the right to choose what he believes;
- you can’t even fact-check anymore, because people on the left have their facts and people on the right have their facts – you just have to choose based on your worldview.
These positions enabled conspiracy theorists to claim the moral high ground against people like me who wanted to conduct a deep investigation into the claims being made. In the end, I was the one who had a problem because I wanted to conduct a rigorous investigation, almost as if I was in denial, clinging to processes of due diligence that were no longer applicable to our world.
But just as truth-loving and confidence in reason are intellectual virtues, so is metacognition. Metacognition is the ability, developed through practice, to watch one’s own thinking and observe one’s own mind, in order better to weed out thinking errors. Some of the most common thinking errors we can weed out with metacognition include the following:
- The thinking-too-quickly effect
When we think too quickly, we are prone to numerous errors, including oversimplifying, missing crucial information, and failing to notice our own propensity for error.
- The scattered attention effect
Researchers have found that when our attention is scattered by too many stimuli, the result is similar to what happens when we think too quickly. In particular, we are prone to miss important connections and are more likely to commit errors.
- The echo chamber effect
The echo chamber effect is what happens when we eliminate opposing viewpoints and differing voices from the information and ideas we consume. This has become very common since the advent of social media, since social media runs on algorithms that supply us with the information we want to hear. But the same thing also happens when we use YouTube or search engines.
- The bandwagon effect
Often certain beliefs, convictions, and ideas become trendy, and pick up a momentum because of a type of “groupthink.”
- The Dunning–Kruger effect
This effect occurs when a person’s lack of competence causes him to overestimate his own competence.
- The confirmation bias effect
This is the tendency to interpret information in such a way as to strengthen your preexisting beliefs, and to filter out contrary information, such as new evidence that might weaken your opinions. Confirmation bias occurs when we seek out (perhaps unconsciously) information that merely confirms pre-existing beliefs. Search engine technology has made it very easy to perform “research” that merely makes us more confident in what we already believe.
- The epistemological bubble effect
This effect occurs when the structures that mediate information to us leave out important voices, including voices that might give the other side of the picture. Our structures for getting and interpreting information might be technology, an ideological community, a church, a political party, or something else. These structures become epistemological bubbles when they become so insular that they begin (whether intentionally or unintentionally) screening out information that might potentially disrupt what the community believes.
In the book of Proverbs, we are warned against these last two (confirmation bias and the epistemological bubble effect) through the repeated injunctions not to imitate the fool. Proverbs identifies the wise man as one who listens to both sides of an issue and to a multitude of voices, while the fool only listens to the flatterer – the source that tells him what he wants to hear. One of the ways we are susceptible to flatterers is cherry-picking experts who merely confirm our opinions. Here is what my friend Dr. Alastair Roberts says about this:
“The fool will not carefully consider opposing positions to discover what element of wisdom might lie within them, but will leap at whatever excuse he can find—the tone, the political alignment, or the personality of the speaker, etc., etc.—to dismiss and ignore them. Ultimately, whether he realizes it or not, he hates wisdom, as the task of wisdom is discomforting for him and he will avoid it at all costs. By contrast, the wise will endure considerable discomfort to seek wisdom wherever it is to be found. He will willingly expose himself to scathing rebuke, to embarrassing correction, to social alienation, or to the loss of pride entailed in learning from his sharpest critics or opponents or climbing down from former stances, if only he can grow in wisdom.
The wise recognize that the danger of the flatterer is encountered not merely in the form of such things as obsequiousness directed towards us personally. Flattery also expresses itself in the study or the expert that confirms us in the complacency or pride of our own way, bolstering our sense of intellectual and moral superiority, while undermining our opponents. The fool is chronically susceptible to the flatterer, because the flatterer tickles the fool’s characteristic pride and resistance to correction and growth.
The fool will pounce upon studies or experts that confirm him in his preferred beliefs and practices, while resisting attentive and receptive engagement with views that challenge him (or even closely examining those he presumes support him, as such examination might unsettle his convictions). The fool’s lack of humility and desire for flattery make him highly resistant or even impervious to rebuke, correction, or challenge. You have to flatter a fool to gain any sort of a hearing with him.
Ideology is the friend of the fool. Ideology can assure people that, if only they buy into the belief system, they have all of the answers in advance and will not have to accept correction from any of their opponents, significantly to revise their beliefs in light of experience and reality, or acknowledge the limitations of their knowledge.
By contrast, the wise know that the wounds of a friend are faithful and seek correction. They surround themselves with wise and correctable people who are prepared to correct them. They are wary of ideology….
The fool seeks company and will try to find or create a confirming social buffer against unwelcome viewpoints when challenged. The scoffing and the scorn I have already mentioned are often sought in such company. The fool surrounds himself with people who confirm him in his beliefs and will routinely try to squeeze out people who disagree with him from his social groups. The fool’s beliefs, values, and viewpoints seldom diverge much from those of his group, which is typically an ideological tribe designed to protect him from genuine thoughtful exposure to intelligent difference of opinion or from the sort of solitude in which he might form his own mind. He has never gone to the sustained effort of developing a pronounced interiority in solitary reflection and meditation, of attendance to and internalization of the voices of the wise, or of self-examination, so generally lacks the resources to respond rather than merely reacting. When the herd stampedes, the fool will stampede with them, finding it difficult to stand apart from the contagious passions of those who surround him.”
Roberts’ entire article is here and is worth reading in full. One take-home point from the article, besides his insightful discussion of wisdom vs. folly, is something that is highly uncomfortable to those advocating conspiracy theories, namely that none of us should pass on links to information if we have not first performed due diligence on the source we are sharing.
At first, this may seem excessively restrictive. After all, for most of us it has become second nature to share articles on Facebook and other social media platforms, or to text our friends links to news videos or articles. Rarely do we think to perform due diligence on these sources before sharing. We engage in these types of activities as a form of phatic speech, or to reinforce a sense of victimhood, or to indulge an appetite for the scandalous, or to reinforce political agendas, or because we genuinely believe that we can trust this information even if we haven’t conducted an investigation into its veracity. If we are challenged (i.e., “Did you perform due diligence on this source before sending it to me?”), we typically excuse ourselves by saying something like, “I was just passing on information” or “It was just a theory.”
In Biblical times, “just passing on information,” was considered a species of gossip and the sin of over-speaking. Interestingly, over-speaking is one of the most discussed sins in all of Scripture. Yet we have largely neglected serious thought about how the Biblical teaching about this sin might transfer into the online environment.
My rule of thumb is this: if something would count as gossip in a small village—for example, passing on information that I have not investigated, especially information that is negative towards a certain person or group, or which could incite fear or anger—then that behavior is also gossip in the interconnected digital village we call the internet. Bottom line: don’t pass on links on which you have not performed due diligence. “But,” someone might say, “that would eliminate 95% of conspiracy theories.” Well, exactly. Again, from Roberts’ article:
“Fools will readily believe a case without closely seeking out and attending to the criticisms of it (Proverbs 18:17). They routinely judge before hearing. They also attend to and spread rumours, inaccurate reports, and unreliable tales, while failing diligently to pursue the truth of a matter. The wise, by contrast, examine things carefully before moving to judgment or passing on a report.
In following responses to the coronavirus, I have been struck by how often people spread information that they clearly have not read or understood, simply because—at a superficial glance—it seems to validate their beliefs. They do not follow up closely on viewpoints that they have advanced, seeking criticism and cross-examination to ascertain their truth or falsity. And when anything is proven wrong, they do not return to correct it.”
When I talk about the importance of exercising intellectual virtue and performing due diligence, people often react by saying, “But Robin, it would use up so much time—how can I possibly perform due diligence on everything I read?!” Well, if someone feels like this, perhaps that is a sign that they need to consume less information. It’s better to read one article a week critically (and that means spending some time researching its veracity) than to consume ten articles a week without performing due diligence. This is analogous to what I tell parents about rules and discipline. Many parents tell their children to do things but then don’t follow through, thus giving their children the idea that it’s okay to be disobedient. So I say that most parents need to decrease the number of commands they give their children by 90% but enforce the remaining 10% one hundred percent of the time. Similarly, I encourage people to decrease the number of online articles they read by 90% but perform due diligence on the remaining 10% one hundred percent of the time.
By slowing down and decreasing our information-consumption, we can be attentive to whether our various beliefs are consistent. Concern for consistency is a sub-category of the intellectual virtue of having confidence in reason. I am convinced that many of the most popular conspiracy theories would evaporate if we asked ourselves:
- Am I holding mutually-exclusive beliefs?
- Do I generally seek consistency between different viewpoints?
- Do I use cognitive dissonance to justify holding beliefs that contradict each other?
Confidence in reason, and the criterion of consistency, seemed to be in short supply in many of the COVID-era conspiracy theories. Consider the following case, which is becoming all too typical. One friend, whom I will call Bob, sent me a message on Monday claiming that the pandemic originated as a Chinese bioweapon. By Monday afternoon he had sent me a YouTube video claiming that the pandemic was engineered by Bill Gates in order to profit from the sale of a vaccine. On Wednesday morning, I had another message in my inbox from Bob, in which he shared a source claiming that the coronavirus was a hoax because COVID-19 hospitals were actually empty.
“I’m curious,” I wrote to Bob, “how all these sources you’re sending me can be true at the same time. How can the coronavirus be a deadly weapon from China as well as something Bill Gates invented as well as a hoax that doesn’t actually exist?”
Bob is typical of those who present conspiracy theories in his lack of concern for consistency. Many people like him have ideological reasons for doubting the mainstream media, and this frequently leads them to latch onto competing explanations with no concern for consistency. Truth doesn’t matter to them, which is why they are quite happy to simultaneously share inconsistent theories as long as doing so furthers their agenda of disrupting people’s confidence in the mainstream narrative.
Here again my friend Dr. Roberts has some helpful observations based on the Wisdom Literature of the Bible:
“The wise are concerned to demonstrate consistency in their viewpoints, as agreement between witnesses and viewpoints are evidence of the truth of a matter or case. However, the beliefs of a fool are generally marked by great inconsistency. They lack the hallmarks of truth because they are adopted for their usefulness in confirming the fool in his ways, rather than for their truth. The fool will jump between inconsistent positions as a matter of convenience. The consistency of the positions and beliefs of fools are found, not in the agreement of their substance, but in the fact that they all, in some way or another, further entrench the fools in their prior ways and beliefs. Also, the intellectual laziness of fools means that they will not diligently seek to grow in a true consistency (although some might develop a consistency in falsehoods designed merely to inure them against challenge, rather than as a pursuit of truth itself).”