Much of my recent research on Mormonism has been focused on the intersection of conspiracism with Mormon theology, history, and culture. Therefore, I have been doing a lot of reading on the social psychology of conspiracy belief. I am particularly interested in exploring the ways that the development of a conspiracy mindset may be a common feature of enculturation in Mormon communities, which I explored briefly in a previous essay. As I have continued to read deeply on this topic, I would like to expand upon my previous speculations with some of the information I have learned from the academic literature. Accordingly, this essay will be the first of a series of discussions reviewing discoveries in the emerging literature on conspiracy belief, while exploring ways they might apply to Mormondom.
The Functions of Conspiracy Belief
To organize this essay series, I will be relying heavily on a framework popular within the conspiracism literature regarding the functions that conspiracy theories serve for those who believe them. These can be broadly reduced into three motivational categories—epistemic, existential, and social (Douglas et al., 2017). Briefly, epistemic motives refer to the sense-making function that conspiracy theories provide in our attempt to understand the world around us. Existential motives deal with the way that conspiracy theories are a product of our need to feel safe, secure, and in control of our environment as autonomous agents. Finally, social motives represent our desires for belonging and to maintain a positive image of ourselves and the groups to which we belong.
As a series, I will devote a separate essay to exploring each of these three motivational categories for conspiracy belief. Within each essay, we will briefly review the literature before discussing potential parallels with Mormon theology and culture. To round out this particular series, I may conclude with a fourth essay regarding the link between religiosity and conspiracy belief, using Mormonism as an investigative lens. This first essay will be fairly lengthy, so consider reading it in sections, using the hyperlinked Table of Contents below as a guide.
Epistemic Motives
The first motivational category to which conspiracy theories appeal is the desire for coherence and predictability regarding the causality of events in one’s social environment. To that end, conspiracy theories provide sense-making narratives about important events or social conditions. They represent internally consistent, alternative explanations that provide a sense of coherence and predictability for believers. Conspiracy narratives reduce ambiguity and bewilderment when information is unavailable or conflicting, provide meaning when events appear to be random, and facilitate defending beliefs from disconfirmation (Douglas et al., 2017).
Several psychological phenomena have been the subject of academic research into the mechanics of conspiracy belief. Relevant to epistemic motives, we will explore the need for cognitive closure, illusory pattern perception, spurious inferences of causality, and overreliance on the heuristics of intuitive reasoning.
Table of Contents
- Need for Cognitive Closure
- Dual Processes & Cognitive Styles
- The Availability Heuristic
- The Representativeness Heuristic
- The Affect Heuristic
- Apophenia and Causal Inference
Need for Cognitive Closure
The desire for cognitive closure is one of the major psychological motives that drives conspiracy belief. Researchers have reliably demonstrated that the strength of this impulse is correlated with conspiracy belief and conspiracy mindset (van Prooijen & Acker, 2015; Gligorić et al., 2021). That is, the stronger an individual’s tendency to crave certainty, the more probable it is that they subscribe to conspiracy theories.
As rational creatures, we crave clarity and understanding about the world we live in. Ambiguity and confusion are generally aversive. We seek to reduce ambiguity and strive for a degree of certainty about the workings of the world around us. We crave order and predictability. Therefore, when information about events or conditions of our environment is sparse, unreliable, conflicting, or otherwise ambiguous, we strive to reduce that ambiguity by constructing narratives that confer order and predictability to the world.
There are many methods whereby we construct meaning amidst the ambiguity of our environments. Religion and science are both epistemic enterprises that scratch this itch—as are conspiracy theories. As a method of causal explanation, conspiracy theories are distinct in that they posit intentionally hidden information, the coordination of multiple actors, and are resistant to falsification in that these actors employ secrecy and subterfuge to prevent the discovery of how the world truly operates. Additionally, conspiracy theories are efficacious in protecting cherished beliefs by casting overwhelming disconfirming evidence as the product of a conspiracy (Douglas et al., 2017).
Beyond the Shadow of a DOUBT
A strong aversion to ambiguity runs like a coursing river that irrigates the broad landscape of Mormonism. Simplicity, clarity, and certainty are prized virtues of Mormon faith and culture. Like many expressions of post-Enlightenment Christianity, Mormon theology devotes considerable attention to answering questions arising from ambiguity or contradictions in the Bible and from conflicts between traditional religious understanding and modern scholarship in science and the humanities. A crucial premise of The Book of Mormon is that it restores the “many plain and precious things taken away from the [Bible]”—the removal of which has resulted in an “awful state of blindness” wherein “an exceedingly great many do stumble, yea, insomuch that Satan hath great power over them” (1 Nephi 13:23–32).
Right from the start, Joseph Smith’s development of Mormon theology seemed especially tailored to resolving controversy and removing ambiguity from Christian theology. Alexander Campbell, the prominent Restorationist minister, remarked in his review of The Book of Mormon:
“This prophet Smith, through his stone spectacles, wrote on the plates of Nephi, in his Book of Mormon, every error and almost every truth discussed in New York for the last ten years. He decides all the great controversies—infant baptism, ordination, the trinity, regeneration, repentance, justification, the fall of man, the atonement, transubstantiation, fasting, penance, church government, religious experience, the call to the ministry, the general resurrection, eternal punishment, who may baptize, and even the question of free masonry, republican government, and the rights of man.”
What Smith began with The Book of Mormon he continued in his revision of the Bible, and in his sermons and revelations—right up until he was killed. The development of Mormon theology over the course of Smith’s lifetime and beyond shows an impulse to provide clear and definitive answers to issues of theological ambiguity. Indeed, this is frequently touted as the selling point of the “continuing revelation” received by Mormon prophets and apostles. This compulsion for certainty manifests itself ubiquitously in Mormonism, but is especially pronounced in the writings of Mormon apologists—groups like FAIR, the Interpreter Foundation, Book of Mormon Central, the Joseph Smith Foundation, and the FIRM Foundation. A light perusal of their websites will quickly uncover a preponderance of material devoted to providing concrete, definitive answers resolving matters of ambiguity, especially where that ambiguity presents a potential obstacle to Mormon devotion.
The Book of Mormon also contains an epistemological promise that readers can obtain a certainty of its truthfulness through a mystical encounter with the Holy Ghost. In discussing the distinction between faith and knowledge, the prophet Alma states that faith is not to “know of a surety” because “if a man knoweth a thing he hath no cause to believe, for he knoweth it.” He elaborates that “faith is not to have a perfect knowledge of things” before explaining a process whereby individuals can rely upon their feelings to discern whether something is true. Regarding this discernment, he states that their “knowledge is perfect in that thing” (Alma 32:17–34). Moroni then builds upon these ideas at the end of the book, wherein the reader is promised that a manifestation of the Holy Ghost will accompany their prayerful petitions for confirmation that what they have just read is true, by which they “may know the truth of all things” (Moroni 10:3–5).
Anyone who has sat through a Mormon testimony meeting is familiar with how ubiquitously Mormons use statements of certainty to express their religious devotion. This is not a coincidental phenomenon; church leaders frequently encourage members to strive for, and profess certainty in, their religious convictions. In only minutes of searching, one can easily find numerous talks (1, 2, 3, 4), articles (1, 2), or lessons (1, 2, 3) encouraging members to obtain a sure knowledge of the truthfulness of the church, and to articulate their convictions with “I know” statements expressing absolute assurance. Even in some of their more nuanced conversations about belief, Mormon leaders encourage members to “doubt your doubts” or to otherwise eschew uncertainty. At other times, doubt is characterized as Satanic and actively shamed.
Dual Processes & Cognitive Styles
Broadly speaking, we all engage in two general styles of evaluative cognition: intuitive thinking and analytic reasoning. Intuitive thinking is characterized by reliance on tacit, experiential knowledge and implicit rules of thumb. Conversely, analytic reasoning is reliant upon explicit, domain-general knowledge and is methodically deliberate. While everyone engages in both modes of thinking—and it is possible to induce someone to engage more heavily in either system at a given time—some evidence suggests that people tend to dispositionally gravitate toward one style over the other. Of immediate relevance, conspiracy belief is associated with a tendency to favor intuition over analytic reasoning (Swami et al., 2014; Gligorić et al., 2018).
Intuitive thinking displays a greater reliance on the feelings evoked by information and on cognitive shortcuts in making quick judgments to inform decision-making. We rely upon these shortcuts (i.e., heuristics) particularly under conditions of uncertainty or when information is limited or ambiguous. Generally speaking, these heuristics serve us well in making sense of our world, which is why they exist in the first place. However, they are also systematically prone toward making certain types of errors, reflected by a wide range of cognitive biases. Many of these biases have been explored by social and cognitive psychologists with regards to understanding conspiracy belief. For our present discussion, we will consider in turn the heuristics of availability, representativeness, and affect.
Before we get much further, I want to address the common misconception that intuitive thinking is inherently biased whereas analytic reasoning is inherently corrective. This is simply inaccurate. Analytic thinking is also prone to biases and errors of particular sorts. Neither mode of cognition—intuitive or analytic—is perfect and free from error, nor do they function as separable and independent systems. They are only separate conceptually, for ease in communication. However, it is a very reliable observation that preference for intuitive modes of reasoning is predictive of conspiracy beliefs and generalized conspiracism. Consequently, for our present purposes, we will focus primarily on intuitive thinking and its relationship with conspiracism, while acknowledging that analytic thinking isn’t an error-free panacea to all the world’s ills.
With Every Fiber of My Being
As mentioned above, each of us engages in both modes of cognition and decision-making to varying degrees and in diverse settings. However, communities and their members differ regarding the degree to which they favor one mode over the other and in what settings either mode maintains priority. A hallmark of intuitive thinking is its expression through hunches, gut feelings, and immediate unsolicited thoughts. Because they are the product of unconscious knowledge, intuitions can feel as though they are expressions of a deeper part of our being than our conscious selves. Conversely, they may feel as though they were generated apart from and outside of ourselves. In either case, intuitive thoughts have the feeling of truth—especially truth that is apart from, and perhaps even at odds with, our conscious awareness.
In situations where our intuitive and analytic thoughts are at odds, which are given priority? This is a complicated and multifaceted question that goes well beyond the purposes of this essay. However, I want to suggest that one important factor is the culture of the communities in which one is raised or to which one presently belongs. Communities differ regarding the priority they place upon intuition and how intuitions are interpreted. For instance, scientific communities place a high value on skepticism and generally prioritize analytic thinking over intuition. Communities devoted to conspiracy theories generally place a higher priority on intuitive thinking and directing skepticism and analytic reasoning toward defending those intuitions.
Intuitive thinking is also prioritized in Mormon communities. As mentioned above, The Book of Mormon teaches that “the truth of all things” can be known through the power of the Holy Ghost, which manifests as emotional sensations, clarity of thought, a sense of instinctive coherence, sudden and unexpected thoughts, or even a voice audible only to oneself. Most of these are indistinguishable from ways intuitions are described.
In a pair of revelations in The Doctrine and Covenants, Oliver Cowdery is given the following instructions regarding revelation:
2 Yea, behold, I will tell you in your mind and in your heart, by the Holy Ghost, which shall come upon you and which shall dwell in your heart. 7 Behold, you have not understood; you have supposed that I would give it unto you, when you took no thought save it was to ask me. 8 But, behold, I say unto you, that you must study it out in your mind; then you must ask me if it be right, and if it is right I will cause that your bosom shall burn within you; therefore, you shall feel that it is right. 9 But if it be not right you shall have no such feelings, but you shall have a stupor of thought that shall cause you to forget the thing which is wrong; Doctrine and Covenants § 8:2, and § 9:7–9, emphasis my own.
These passages are frequently invoked when instructing members regarding how they too may receive revelation. Several features of these passages are informative for our present conversation. First, manifestations of the Holy Ghost are described in ways that are highly characteristic of intuitive thinking—namely, clarity of thought accompanied by emotional affirmation. Conversely, the absence of affirming emotions coupled with confusion or clouded thinking are indications of divine disconfirmation.
Second, deliberative thinking is invoked as a vehicle to prepare oneself to receive subsequent, confirmatory revelation. Analytic thinking is not deemphasized in Mormon epistemology, but it serves a specific purpose. In this case, it is to prepare the individual to ask questions and then attend to their intuitions for confirmation. Thereafter, analytic thinking is employed to interpret those intuitions as manifestations of the Holy Ghost, one’s own thoughts, or perhaps the deceptions of Satan.
The Availability Heuristic
The first of three general heuristics we will discuss is the availability heuristic. In essence, this heuristic reflects the limitations on our cognition imposed by the information readily available for consideration, and our preference for information that is easily accessible and with which we are familiar. This is particularly relevant to our discussion of enculturation because exposure is the biggest factor in determining availability for cognition. Naturally, the social environments in which we find ourselves exert tremendous selective pressure on which ideas and information we encounter and how frequently. However, exposure is not the only factor that affects availability. Priming, recency, and general worldview can each make certain information more or less accessible via recall or otherwise available for cognition.
Illusory Truth Effect
A powerful example of the availability heuristic is the illusory truth effect—the more we are exposed to the same information, the more we tend to intuitively perceive it as true. Critically, this effect can be produced even when we are initially aware that the information is misleading, inaccurate, or false. Repetition and frequent exposure enhance cognitive availability, increasing the accessibility of information for intuitive decision-making.
An expression of this is the misinformation effect, wherein individuals’ memory of episodic events can be influenced through later, repeated exposure to inaccurate information. There are numerous explanations for why this may occur, including the mixed integration of new and original information, the notion that memories become labile and temporarily vulnerable to alteration before they are reconsolidated, or simply that newer information is more cognitively available than older memories due to recency (i.e., recency bias). A related phenomenon is hindsight bias, wherein we tend to reinterpret past events to conform with new knowledge or experience so as to give the impression that events were more predictable than they actually were.
In a later essay, we will revisit the illusory truth and misinformation effects with regards to how the popularity of recovered-memory therapy fueled the Satanic Panic of the ’80s and ’90s, which played out in Utah in uniquely Mormon ways. This psychiatric practice remains an issue that divides therapists and informs popular conspiracy theories into the present.
Confirmation Bias
Confirmation bias is another phenomenon particularly relevant to conspiracy belief. It represents the tendency to preferentially search for, interpret, and recall information that confirms one’s established values or beliefs, affecting both intuitive and analytic modes of thought. These three expressions of confirmation bias—search, interpretation, and recall—provide good examples of the three different cognitive heuristics mentioned above, so we will explore each of these in the corresponding sections below.
Briefly, confirmation-biased search refers to the tendency to seek information and frame inquiry in ways that confirm—rather than disconfirm—preexisting hypotheses or established beliefs. It exemplifies well the interaction of the availability and representativeness heuristics. Confirmation-biased interpretation is expressed when we preferentially interpret ambiguous information to support preexisting beliefs, or devote disproportionate scrutiny to information that challenges those beliefs. The role of the affect heuristic is striking in this expression. Finally, confirmation-biased recall refers to the selective memory for information or events that affirm a person’s current values or beliefs over those that controvert their present beliefs. Because this expression is the most clearly related to availability, we will examine it further in this section, though both the representativeness and affect heuristics can affect selective memory. Things are rarely simple or fit neatly in a box.
The effects of confirmation-biased recall (a.k.a. selective or access-biased memory) are fundamentally a matter of availability—if prior information cannot be recalled, then it is not readily available for decision-making. Without question, both representativeness and affect can have profound impacts on the salience of different kinds of information or events. At the end of the day, however, the effect of those changes in salience is expressed via differential ease of recall (a feature of availability). Therefore, preferential memory influences cognition by favoring information reinforcing one’s perception of oneself or one’s in-group (i.e., myside bias), that corroborates currently held stereotypes of others, or that affirms a currently espoused interpretation of events or circumstances. That said, the unexpectedness of events can have a pronounced effect on salience and thereby improve recall. Among the three expressions of confirmation bias, preferential recall is the area with the least scholarly consensus.
Familiarity Breeds Faith
It is not difficult to identify ways in which the availability heuristic is expressed in Mormon communities, and critics often make a point of it. In particular, the illusory truth effect is frequently invoked as a tool of “Mormon brainwashing.” While these conversations raise some valid criticisms, invoking the pseudoscientific concept of “brainwashing” oversteps reality and ventures into conspiracism in its own right (Montell, 2021). We’ll discuss the tendency to attribute hostile intent where none exists in a subsequent essay; for the present, let us consider how Mormon cultural practices may encourage the illusory truth effect to reinforce faith among members.
Recall that the illusory truth effect refers to our tendency to intuit as true those ideas to which we are most frequently exposed—especially when accompanied by social reinforcement and the absence of counter-messaging. At a basic level, this is an intrinsic feature of enculturation into any community, so it’s not especially remarkable that Mormons are more likely to intuitively accept Mormon ideas because they encounter them more frequently. However, the frequency with which conspiracy narratives are encountered within Mormon communities is a question I find especially interesting—one that lies at the heart of this research project. In short, it is very often and with great variety, but we’ll save an in-depth discussion demonstrating that point for another essay series.
A commonly cited example of the illusory truth effect in Mormon culture is the practice of frequently bearing personal testimony to the veracity of Mormon ideas, and the encouragement of doing so as a method of reinforcing personal conviction. Mormon leaders and church-produced media frequently encourage this practice explicitly. This is the example that frequently draws attention from believers in “Mormon brainwashing.” Setting aside the “brainwashing” claims, the critique of the practice of testimony-bearing to buttress one’s beliefs has some merit.
Encouraging members to publicly declare their certainty in Mormon ideas—as a method of self-conversion—is an ethically questionable practice. However, the transparency with which this is promoted undermines claims of a sinister “brainwashing” scheme. Rather, these encouragements appear to be descriptions of the illusory truth effect in practice, though sincerely interpreted as divine encounters with the Holy Ghost rather than a well-documented cognitive phenomenon. Consider some of the following examples:
It is not unusual to have a missionary say, “How can I bear testimony until I get one? How can I testify that God lives, that Jesus is the Christ, and that the gospel is true? If I do not have such a testimony, would that not be dishonest?” Oh, if I could teach you this one principle. A testimony is to be found in the bearing of it! Somewhere in your quest for spiritual knowledge, there is that “leap of faith,” as the philosophers call it. It is the moment when you have gone to the edge of the light and stepped into the darkness to discover that the way is lighted ahead for just a footstep or two. “The spirit of man,” as the scripture says, indeed “is the candle of the Lord.” It is one thing to receive a witness from what you have read or what another has said; and that is a necessary beginning. It is quite another to have the Spirit confirm to you in your bosom that what you have testified is true. Can you not see that it will be supplied as you share it? As you give that which you have, there is a replacement, with increase! Elder Boyd K. Packer, "The Candle of the Lord" Ensign, January 1983, emphasis my own.
Witnesses have a special knowledge and are to bear testimony of “that which they have seen and heard and most assuredly believe.” We make simple, clear, direct statements that we know with certainty and surety that the gospel is true because it has been “made known unto [us] by the Holy Spirit of God.” In bearing such a testimony, speaking by the power of the Holy Ghost, we are promised that “the Holy Ghost shall be shed forth in bearing record unto all things whatsoever [we] shall say.” We are blessed personally when we so testify. [...] Making a determined and confident public statement of your belief is such a step into the unknown. It has a powerful effect in strengthening your own convictions. Bearing testimony drives your faith deeper into your soul, and you believe more fervently than before. Elder Joseph B. Wirthlin, "Pure Testimony" General Conference, October 2000, emphasis my own.
To the youth listening today or reading these words in the days ahead, I give a specific challenge: Gain a personal witness of the Prophet Joseph Smith. Let your voice help fulfill Moroni’s prophetic words to speak good of the Prophet. Here are two ideas: First, find scriptures in the Book of Mormon that you feel and know are absolutely true. Then share them with family and friends in family home evening, seminary, and your Young Men and Young Women classes, acknowledging that Joseph was an instrument in God’s hands. Next, read the testimony of the Prophet Joseph Smith in the Pearl of Great Price or in this pamphlet, now in 158 languages. You can find it online at LDS.org or with the missionaries. This is Joseph’s own testimony of what actually occurred. Read it often. Consider recording the testimony of Joseph Smith in your own voice, listening to it regularly, and sharing it with friends. Listening to the Prophet’s testimony in your own voice will help bring the witness you seek. Elder Neil L. Andersen, "Joseph Smith" General Conference, October 2014, emphasis my own.
Another way to seek a testimony seems astonishing when compared with the methods of obtaining other knowledge. We gain or strengthen a testimony by bearing it. Someone even suggested that some testimonies are better gained on the feet bearing them than on the knees praying for them. Elder Dallin H. Oaks, "Testimony" General Conference, April 2008, emphasis my own.
The Representativeness Heuristic
The representativeness heuristic is another major phenomenon of intuitive thinking. Briefly, the representativeness heuristic is a cognitive shortcut that relies upon subconscious evaluations of conceptual similarity, such as the resemblance between causes and their effects. Subconsciously, people create an understanding of their world through networks of associative links between stimuli that represent concepts or ideas. These links are learned through experience and are therefore generally a reflection of the environmental inputs to which we have been exposed. The strength of these associations is primarily determined by the frequency and salience of that exposure, which determine the cognitive ease with which these associations come to mind. Therefore, the representativeness heuristic is not entirely distinct from the availability heuristic.
Like other heuristics of intuition, we rely upon considerations of representativeness automatically and with minimal cognitive effort. We do this because it is an adaptive and generally effective strategy—though it is far from immune to error. Relevant to our present discussion, researchers have found that propensities to rely upon the representativeness heuristic are associated with endorsement of conspiracy theories. For instance, individuals with a greater tendency to correlate the magnitude of causal events with the magnitude of their effects are also more likely to believe in conspiracy theories (Brotherton, 2015; van Prooijen & van Dijk, 2014). Likewise, people who are more prone to committing the conjunction fallacy—attributing greater probabilistic likelihood to the conjunction of two or more independently probabilistic occurrences than any of those component occurrences alone—also demonstrate higher rates of conspiracy belief (Brotherton & French, 2014).
Confirmation Bias, Again
As discussed above, confirmation bias can be expressed in terms of preferential search, interpretation, or recall of information that confirms preexisting values or beliefs. Confirmation-biased search is evident in our tendency to make inquiries using a positive test strategy, wherein we seek what we expect to find. For example, imagine you are tasked with discovering the rule underlying the construction of three-number sequences exemplified by “2-4-6.” You are allowed to present any sequence of three numbers, to which a “yes” or “no” answer reveals whether that sequence conforms to the underlying rule. What sort of sequences would you guess?
Most people will form a hypothesis regarding the rule and make guesses motivated by finding confirmations of that hypothesis. You might guess “8-10-12,” then “22-24-26,” and finally “56-58-60,” receiving a “yes” to each that confirms your hypothesis that the rule is “consecutive even numbers.” However, your conclusion would be wrong—the rule in this exercise is “any ascending sequence of numbers.” Your erroneous conclusion stems from neglecting to incorporate a negative test strategy by guessing sequences that do not conform to your hypothesis. For instance, guessing “1-3-5,” “6-4-2,” or “21-56-613” would each challenge your hypothesis and provide critical diagnostic information.
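The asymmetry between the two strategies can be sketched in a few lines of code. This is a minimal illustration mirroring the example above; the `conforms` function and the probe sequences are my own illustrative choices, not part of the original task description:

```python
def conforms(seq):
    """The hidden rule from the exercise: any strictly ascending sequence."""
    return all(a < b for a, b in zip(seq, seq[1:]))

# Positive test strategy: probes chosen to FIT the hypothesis
# "consecutive even numbers" -- every answer comes back "yes",
# so the (wrong) hypothesis is never challenged.
positive_probes = [(8, 10, 12), (22, 24, 26), (56, 58, 60)]
print([conforms(p) for p in positive_probes])  # [True, True, True]

# Negative test strategy: probes chosen to VIOLATE the hypothesis.
# "1-3-5" breaks the hypothesis yet still fits the real rule -- the
# diagnostic information a purely positive strategy never surfaces.
negative_probes = [(1, 3, 5), (6, 4, 2), (21, 56, 613)]
print([conforms(p) for p in negative_probes])  # [True, False, True]
```

The point is that only the probes expected to yield a “no” can distinguish “consecutive even numbers” from the true, broader rule.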
Confirmation-biased search is frequently demonstrated to correlate with conspiracy belief (Leman & Cinnirella, 2013; Pytlik et al., 2020). Indeed, it is believed that one of the reasons conspiracy beliefs are so persistent is that conspiracy believers are generally protected from contradictory information through their own confirmation-biased search. This leads to increased exposure to belief-confirming information and decreased exposure to controverting information. Considering our discussion above regarding the illusory truth effect, you can probably see already how this may become a self-reinforcing system.
Proportionality Bias
The proportionality bias is the tendency to favor explanations that maintain a correspondence between the magnitude of causes and their effects. That is, when a given event is perceived as particularly momentous or profound, we are inclined to favor explanations invoking causes that correspond in significance or profundity. Although this is often a reasonable heuristic, it can also fail us—sometimes major events are precipitated by seemingly insignificant causes. Our bias toward proportionality can mislead us into favoring complicated explanations about orchestrated conspiracies over more parsimonious accounts regarding the cumulative effects of relatively modest and quotidian causes.
Consider the diverse narratives regarding the genesis of the COVID-19 pandemic. Competing narratives include 1) the unintentional zoonotic transfer of a coronavirus at a Wuhan food market, 2) the accidental leak of SARS-CoV-2 from a virology research lab, 3) the intentional development of SARS-CoV-2 as a Chinese bioweapon, and 4) the claim that COVID-19 is not a viral infection at all but rather the deleterious health effects of broadcasting 5G cellular networks. Without question, the COVID-19 pandemic has been a large and socially significant event—one that seems proportionally discordant with the official narratives invoking relatively mundane origins. A feature of the competing conspiracy narratives is that they increase the significance of the proposed causes to have greater proportional correspondence with the resultant effect. We observe similar trends with many other popular conspiracy theories about significant events such as the JFK assassination, 9/11, or the disappearance of Malaysia Airlines Flight 370.
Researchers have found that we are more susceptible to the proportionality bias when we, or those for whom we care, are personally affected by a particular event. That is, the more that we (or those we care about) are personally impacted by an event, the greater our tendency to prefer causal narratives that are proportional to that event. Conversely, the more we can distance ourselves from an event, the easier it is to maintain objectivity and to explain that event as the product of chance or seemingly insignificant causes (Brotherton, 2015; van Prooijen & van Dijk, 2014).
The Conjunction Fallacy
Additional research demonstrates that biased perceptions of random coincidence are associated with greater amenability to anomalous beliefs. Namely, individuals more prone to committing the conjunction fallacy also express greater belief in conspiracy theories and paranormal phenomena (Brotherton & French, 2014). The conjunction fallacy is the overestimation of the likelihood of co-occurring events, such that the coincidence of two or more probabilistic occurrences is judged as more likely than each occurring separately. A classic (though imperfect) example of the conjunction fallacy is “the Linda problem.”
"Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations."
Based on the description above, participants are asked to evaluate the likelihood of the propositions that 1) Linda is a bank teller, 2) Linda is an active feminist, and 3) Linda is a bank teller and an active feminist. Regardless of one’s estimates of the first two propositions, estimating the third proposition as more likely than either of the first two is an error of probabilistic reasoning (though perhaps not a misjudgment of plausibility). The conjunction of two or more propositions cannot be more probable than any one of its component parts because a conjunction is necessarily a more restrictive set of possibilities.
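The probabilistic rule behind the Linda problem can be checked with a quick simulation. The probabilities below are hypothetical, chosen purely for illustration; the point is only that the conjunction can never outnumber either of its conjuncts, whatever values we pick:

```python
import random

random.seed(42)

# Hypothetical probabilities, chosen only for illustration.
P_TELLER = 0.05      # P(Linda is a bank teller)
P_FEMINIST = 0.90    # P(Linda is an active feminist)

N = 100_000
teller = feminist = both = 0
for _ in range(N):
    is_teller = random.random() < P_TELLER
    is_feminist = random.random() < P_FEMINIST
    teller += is_teller
    feminist += is_feminist
    both += is_teller and is_feminist

# The conjunction is a subset of each conjunct, so it can never
# be more frequent than either one -- regardless of the probabilities.
assert both <= teller and both <= feminist
print(teller / N, feminist / N, both / N)
```

Whatever values replace `P_TELLER` and `P_FEMINIST`, the final assertion holds by construction: every trial counted in `both` is also counted in `teller` and in `feminist`.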
In their research into the conjunction fallacy and anomalous beliefs, Brotherton & French (2014) discovered that participants who scored high on measures of conspiracy or paranormal beliefs committed 50% more conjunctive errors than nonbelieving participants. The propensity to commit the conjunction fallacy correlated more strongly with conspiracy belief than with paranormal belief, despite the strong correlation between the two kinds of anomalous beliefs (a correlation frequently observed in the literature). These researchers suggest that “conspiracy believers have a biased conception of randomness, according to which coincidences are rarely mere chance coincidences” and “individual differences in susceptibility to the representativeness heuristic […] may influence the formation or maintenance of anomalous beliefs.”
Hindsight Bias
The conjunction fallacy is a failure of statistical reasoning that demonstrates how the representativeness heuristic can bias our judgments toward what feels plausible rather than what is probable. A related phenomenon is the Need for Control, which I will argue is related to our Need for Coherence. Briefly, we crave safety, security, and a sense of control over our circumstances. Perceiving the world as orderly and predictable is critical to maintaining this fragile sense of security. Events that diverge from our expectations are disruptive, especially if they defy our current understanding of how the world operates. In such circumstances, we are motivated to construct narratives that restore coherence to the world around us.
One expression of this is the hindsight bias—in which we interpret past events as more predictable than how we experienced them before and during their occurrence. This is different from confirmation-biased recall, which refers to the selective ease with which information that confirms established beliefs is remembered and made available for cognition. Rather, the hindsight bias is a matter of reinterpreting past events to correspond with our understanding of current circumstances. The hindsight bias is sometimes expressed in the sentiments that “everything happens for a reason” or that “there are no coincidences.” In historiography, the advice about avoiding presentism can be seen as a warning against the hindsight bias.
Representativeness and Mormonism
The epistemological methods prescribed by Mormon scripture provide a strong example of the positive test strategy of confirmation-biased search. In Moroni 10:3–5, readers of The Book of Mormon are encouraged to ponder over God’s mercies to his people throughout history, and thereafter ask God in prayer “if these things are not true,” with the unspoken assumption of their veracity. The Doctrine and Covenants instructs the inquirer to study things out in their mind, form a conclusion, and then “ask me if it be right,” whereupon they will either receive confirmation or require repeating the process (D&C § 9:7–9). Similarly, the experiment of faith proposed in Alma 32:26–43 encourages seekers to plant the word of God in their hearts and to nourish it there in order to see if it will grow, which will be manifest as a positive emotional response. Finally, the popular Mormon interpretation of John 7:17 is that a person can know the truth of a teaching by following that teaching as though it were true.
Arguably, some prominent Mormon teachings should work against the proportionality bias. Namely, the teaching that “by small and simple things are great things brought to pass” runs directly counter to the proportionality bias (Alma 37:6–7). One might reasonably argue that such teachings might make Mormons less prone to the proportionality bias and perhaps confer some protection against conspiracy belief. On the other hand, perhaps this explicit reversal of the proportionality principle may prime individuals to search for the collective impact of small and subtle causes found in seemingly unlikely places. Such a mindset could encourage seeking patterns and inferring causality between major events and seemingly inconsequential precursors in a manner that is consistent with the subtle machinations of nefarious and crafty conspirators.
Hindsight bias and conjunctive errors also turn up frequently in Mormon culture. The beliefs that God is in total control and everything happens for a reason are a frequent feature of many faith-promoting stories shared in testimony meetings or Sunday School discussions. Hindsight bias is common in the telling of Mormon history, such as when tragedies, persecutions, or even triumphs experienced by early converts are described as predictable and inevitable. In contrast, Mormon apologists will frequently chide critics about engaging in presentism when the latter comment on character flaws of early Mormon leaders. Critical exMormons will often describe their experiences in Mormonism as more alarming or disconcerting than they may have experienced them in the moment. On the other hand, devout Latter-day Saints may describe their interactions with now exMormon acquaintances as more predictive of their exodus from the church than they perceived at the time. Sometimes experiences perceived as predictive in hindsight are interpreted as premonitions given through the Holy Ghost.
The Affect Heuristic
The affect heuristic refers to the tendency for our intuitive judgements to be influenced by the emotional valence of associations evoked by the situation at hand. That is, we tend to favor things that we like, and we tend to like things that make us feel happy, safe, proud, empowered, relaxed, etc. Conversely, we tend to reject things that we dislike, and tend to dislike things that make us feel sad, angry, fearful, disgusted, ashamed, confused, agitated, etc. Importantly, these feelings are evoked involuntarily via activation of associative networks relevant to particular stimuli or events—associations that are the product of particular experiences or repeated exposure to various environmental inputs.
You may feel happy when you think of the Grand Canyon, perhaps because your family frequently vacationed there and those experiences were generally pleasurable. Conversely, you might be sad when thinking about the Grand Canyon because you only visited it once on a traumatic childhood vacation, during which the family dog was lost and never found, which precipitated a particularly bad argument between your parents who later divorced. In either case, your later decisions regarding whether to recommend the Grand Canyon as a vacation spot to a friend will be affected by the emotions evoked by your own experiences with the Grand Canyon—which may have no logical bearing on whether or not the Grand Canyon is a good place to vacation in general.
The influences of affect can manifest in surprising ways, especially since we tend to believe that our decision-making is driven primarily by logic and rationality, rather than by our present emotional state. However, countless studies demonstrate that our systems of rational thought are affected by our emotions. Indeed, simply encouraging someone to think of a time they were especially happy vs. especially upset, before asking them to make evaluations that require deliberative effort to perform accurately, will produce surprisingly divergent results. Those who are made to feel happy will be more prone to rely on intuition and heuristics, making more errors; those made to feel angry or sad will be more deliberative and analytic, resulting in improved accuracy (Bless et al., 1996; Thompson & Morsanyi, 2012).
This finding is often surprising. Why would unpleasant feelings increase accuracy on tasks requiring deliberative effort? The explanation is that intuitive thinking is effortless and automatic; it supplies evaluations by default—whether we desire them or not. Analytic reasoning is deliberate and requires cognitive effort, which we tend to avoid unless the situation demands it. Unpleasant emotional states are, well, unpleasant. We generally avoid them. They serve as a signal that something is wrong, which prompts increased vigilance and deliberative activity. This is adaptive—when we perceive things as going well, it doesn’t make sense to be on active lookout for potential problems or threats. When current circumstances evoke sadness, anger, disgust, or fear, it makes sense to devote more cognitive resources to identifying what might be amiss.
However, analytic thinking is not immune from the inputs of affect generated by our intuitions. While it is true that analytic thinking can be employed to override our intuitive dispositions, the emotions generated by our intuitive machinery often recruit our analytic thinking to justify intuitive evaluations. It is important to realize that intuition and deliberative analysis are not truly distinct systems of cognition that operate as separable, independent entities within a person. They are both products of the same body, involve many of the same brain systems, and are equally integral components of what comprises a person. You are as much the product of your intuitions and emotions as of your deliberative thoughts and analytic rationalizations. The boundaries between these “systems” are not clear cut and are principally theoretical.
Confirmation Bias, Again and Forever
To reiterate, confirmation bias can manifest as preferential search, interpretation, or recall of information that affirms established beliefs. Confirmation-biased interpretation occurs when ambiguous or equivocal information is preferentially interpreted in a way that corroborates preexisting beliefs. It is also expressed when we devote greater skepticism or scrutiny to new ideas or information that challenge our existing beliefs than to those that affirm them.
The experience of psychological distress when confronted with information that contradicts our established values or beliefs is termed cognitive dissonance. The experience of cognitive dissonance generally prompts individuals to engage in analytic thinking directed at resolving the contradiction. This may result in replacing or revising prior values or beliefs in accordance with the new information. Alternatively, the contradiction may be resolved by rejecting and refuting the new information, or reinterpreting said information so as to affirm our established beliefs (i.e., confirmation bias). Importantly, cognitive dissonance represents a manifestation of the affect heuristic inasmuch as we generally accept our intuitions until we experience psychological discomfort. Thereupon, we will engage in analytic thinking to resolve that discomfort—either by overriding our intuitions and reevaluating our existing beliefs, or by prioritizing our intuitive judgments and skeptically reinterpreting the information that challenges those intuitions.
In extreme cases, confirmation bias can manifest itself in the backfire effect, wherein individuals become more convinced of their established beliefs after being presented with new information controverting those beliefs—though this effect appears to be rare (Nyhan & Reifler, 2010; Nyhan, 2021). Conspiracy theories may be particularly powerful narratives for encouraging the backfire effect. As Rob Brotherton (2015) explains: “…if you’re looking to rationalize away an inconvenient fact, nothing beats a conspiracy theory. When you assert that nobody can be trusted, seemingly incontestable facts can simply be written off as part of the coverup.”
Affect and the Holy Ghost
Affect plays a major role in Mormon epistemology. Both The Book of Mormon and the Doctrine and Covenants prescribe a model wherein one’s intuitions and feelings are given a prioritized role in discerning truth. Positive and affirming emotions are interpreted as the Holy Ghost bearing witness to the truth; confusion or distressing feelings are signs of error, perhaps even the influence of Satan. Several passages have already been referenced in the sections above. (1, 2). In addition to these, The Book of Mormon teaches:
29 For verily, verily I say unto you, he that hath the spirit of contention is not of me, but is of the devil, who is the father of contention, and he stirreth up the hearts of men to contend with anger, one with another. 30 Behold, this is not my doctrine, to stir up the hearts of men with anger, one against another; but this is my doctrine, that such things should be done away. 3 Nephi 11:29–30, emphasis my own.
A common interpretation of these verses is that people who present information that criticizes, contradicts, or otherwise challenges cherished Mormon values or beliefs have the spirit of contention, and are therefore operating—knowingly or not—as agents of Satan. Of course, this is not the only way to interpret this passage, but anyone who has tried to discuss sensitive and controversial topics in devout Mormon spaces—especially criticisms of Mormon history, doctrines, or policies—has experienced the notorious Mormon aversion to conflict. Likewise, anyone who openly criticizes Mormon practices or ideas may have also experienced being branded an “anti-Mormon” instigator who should be avoided or even silenced. Similarly, Mormon leaders and church media frequently emphasize the danger of straying from “proper” and “divinely appointed sources” of information, often with fear-laden rhetoric about being deceived by Satan into losing one’s testimony and salvation.
Returning to the main point, emotions—whether affirming or dissonance-provoking—have an emphasized role as the medium for discerning truth from error in Mormon culture. This is not happenstance; it is taught in foundational Mormon texts, by Mormon leaders, and through approved church materials. The idea that positive emotions evince truth (and disruptive emotions signify dangerous error) feeds directly into the human tendency to resolve cognitive dissonance through confirmation-biased interpretation.
Inoculating for Devotion
We simply cannot discuss cognitive dissonance and confirmation bias without also mentioning the effort to “inoculate” members against doubts via the LDS Gospel Topics Essays. In a recorded discussion, Elder Steven E. Snow of the First Quorum of the Seventy explained that the essays were deliberately released on the LDS church’s website via “a soft rollout.” The idea was to make them difficult to encounter accidentally so that only those already experiencing a Mormon “faith crisis” would find them, whether through word of mouth or ecclesiastical recommendation. He further elaborated that they were also composed to “inoculate the rising generation” against troubling facts about church history and teachings, “without them being totally shocked when they hear them for the first time.”
A subsequent article in the Deseret News reported the following observation by believing Mormon historian, Matthew Bowman:
"Adjusting people's worldview is a tough task," said Bowman, a history professor at Henderson State University. "A lot of people may feel resistant to this because they already have a story. They may feel, 'I don't want information that will make me accept a different story.' The church is gently trying to move people into a different story." Leaders "want first and foremost to preserve the faith," he added. "They wrote the essays in a way that will inoculate the faith."
Elder Snow’s and Dr. Bowman’s shared use of the term “inoculate” is noteworthy regarding the strategy of using the Gospel Topics Essays to safeguard members’ devotion. Since the 1960s—and especially in the 1990s and 2000s—researchers in communication studies and social psychology have investigated techniques for protecting beliefs against persuasion, under what is termed inoculation theory.
Invoking a vaccination metaphor, inoculation theory proposes that people can be made resistant to persuasion by exposing them to weak counter-arguments or bits of controverting information in a controlled environment, while also affording said persons the resources wherewith they may construct their own arguments in defense of their established beliefs. According to this theory, a person’s beliefs must feel sufficiently threatened (i.e., cognitive dissonance) that they become motivated to fight to maintain and strengthen their opinions (i.e., confirmation-biased interpretation), but not perceive the effort of doing so as insurmountable.
Boiled down to its essence, inoculation theory is about providing individuals with practice in experiencing moderate cognitive dissonance and employing confirmation-biased interpretation of piecemeal information in order to encourage the backfire effect. It is astonishing and disconcerting to see Mormon leadership applying these techniques in their efforts to reinforce members’ faith. Though their efforts are likely borne out of a sincere concern over the spiritual welfare of members, the approach is ethically questionable. Regarding Mormon conspiracism, I am curious whether similar inoculation phenomena also commonly occur in communities of conspiracy believers—whether deliberately among members or naturally through the discourse of the community in online forums and social media. This is a question I have not yet seen addressed in the academic literature.
Apophenia and Causal Inference
In addition to the heuristics and biases described above, our propensity to identify patterns and infer causality between correlated events also warrants discussion regarding conspiracy belief. How we make evaluations of causality is essential to understanding how conspiracy theories function as sense-making narratives. Indeed, all explanations—be they philosophical, mythological, scientific, or conspiracist—are fundamentally about causality. Why events happen, and how present circumstances came to be, are fundamentally questions about cause and effect.
As mentioned above, our cognitive machinery is built around associative networks between ideas reflecting a variety of stimuli or events. These associations are formed primarily through observation or direct experience, with some associations being biologically prepared to form more readily. Therefore, it may be argued that sense-making is intrinsically an associative endeavor. Human intuitions are primarily expressions of these associative networks, which also provide the matrix within which our analytic reasoning operates. It should be unsurprising then that how we perceive and attribute causality is informed by our models of association between events.
Two relational factors—contiguity and contingency—are the primary criteria whereby we infer causality. Contiguity refers to the proximity of co-occurrence between stimuli or events in time and space. As a general principle of learning, associations between two things will form more readily when they are temporally and/or spatially contiguous (though temporality is arguably more important). Events that are substantially separated in time or space are less likely to become associated. The relationship between the associative strength between two things and their contiguity is a well-documented psychophysical phenomenon, the parameters of which vary by degrees regarding stimulus modality and biological relevance. Most importantly, events that co-occur in close proximity form stronger associations.
Contingency represents the probabilistic relationship of co-occurrence between events. That is, how likely is it that Event A occurs in the exclusive context of Event B? The predictive reliability of one event for the occurrence of the other is the essence of contingency, and events that have reliably predictive relationships form strong associations. However, while contiguity and contingency are commonly in concordance, they need not always be so. Event A might reliably follow the occurrence of Event B, but it might also occur with a high frequency in settings where Event B never occurs. In such cases, an association will still form between Events A and B, but the nature of that association will become nuanced through additional associative structures.
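One common way the learning literature formalizes contingency is the delta-P rule: the probability of Event A given Event B, minus the probability of A given B's absence. A minimal sketch, with made-up counts chosen only for illustration:

```python
# Toy co-occurrence counts (hypothetical numbers for illustration).
# Trials where Event B occurred:
a_given_b, not_a_given_b = 30, 10
# Trials where Event B did not occur:
a_given_not_b, not_a_given_not_b = 15, 45

p_a_given_b = a_given_b / (a_given_b + not_a_given_b)                  # 0.75
p_a_given_not_b = a_given_not_b / (a_given_not_b + not_a_given_not_b)  # 0.25

# Delta-P: positive values suggest B predicts A's occurrence,
# zero suggests no contingency, negative values suggest B predicts A's absence.
delta_p = p_a_given_b - p_a_given_not_b
print(delta_p)  # 0.5
```

With these counts, A is three times more likely when B is present than when it is absent, so the two events would form a strong predictive association.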
A thorough explanation of how contiguity and contingency determine how we infer causality is a topic better suited to a seminar course for doctoral psychology students—far beyond the purposes of our present discussion. For now, it is only important to recognize two things: 1) much of our cognitive activity involves the formation of associations between events, and 2) we infer causal relationships between events that predictably co-occur with high proximity. Armed with that understanding, let’s look at ways in which that can go awry.
Illusory Pattern Perception
The tendency to find patterns in random events, termed apophenia or Illusory Pattern Perception (IPP), has been demonstrated to correlate with the endorsement of conspiracy theories (van Prooijen et al., 2018). The brain is not a passive observer; it actively constructs our perceptual reality. It does so via a repertoire of cognitive shortcuts that impose order upon the often chaotic, ambiguous, or incomplete information that it receives (Brotherton, 2015). Optical illusions are a classic example, such as when we perceive motion from particular arrangements of wavy lines, or perceive the contours of illusory shapes from amidst surrounding visual material. We cannot help but perceive these things, even when we consciously know they are illusions.
Much like optical illusions—in which our brains create perceptual distortions by filling in missing visual information—we are also cognitively primed to find patterns out of randomness. We see faces or objects in clouds, rocks, tree bark, the stars, or burn patterns on toast. We may perceive someone calling our name or hear a familiar tune from random background noise. The tendency to identify patterns among randomness can be exacerbated by the increased salience that rare or unexpected events have in capturing attention. This can result in a propensity to disproportionately attend to—and infer causal relationships between—random, unexpected events that appear to follow a pattern but have no actual relationship. Stronger proclivities to connect-the-dots in this manner are correlated with conspiracy belief (Gligorić et al., 2018; 2021).
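Part of why randomness invites pattern perception is that genuine randomness is streakier than we intuitively expect. A small simulation makes the point (the streak threshold of six is an arbitrary choice for illustration):

```python
import random

random.seed(0)

def longest_run(flips):
    """Length of the longest streak of identical outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# Simulate many sessions of 100 fair coin flips and ask how often
# a streak of six or more identical outcomes appears. Such streaks
# feel like a pattern, but they are an ordinary feature of chance.
trials = [longest_run([random.random() < 0.5 for _ in range(100)])
          for _ in range(10_000)]
share_with_long_streak = sum(r >= 6 for r in trials) / len(trials)
print(share_with_long_streak)
```

Most simulated sessions contain at least one such streak; a person watching the flips in real time would likely experience those runs as meaningful rather than as the expected texture of randomness.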
Spurious Causal Inference
Most people, especially students in the social sciences, have heard some version of the statement “correlation does not mean causation.” The principle behind this statement is simple: two things being highly correlated does not necessarily mean that one causes the other. Rather, the correlation might be the product of random coincidence, or the two things might both be the products of one or more other causes. For instance, ice cream consumption, incidence of drowning, crime rates, and sales of suntan lotion are all correlated. However, most of us are discerning enough to recognize that committing crimes does not increase the sale of suntan lotion, nor does eating ice cream cause drowning. Rather, all of these things are also correlated with sunny weather and warmer outdoor temperatures, which holds more promise as a causally important variable for each phenomenon.
Relationships like the association between ice cream consumption and crime rates are spurious correlations. That is, they have a degree of co-occurrence but do not reflect a direct causal relationship. However, while the above relationships are clearly spurious, many others are less obviously so. For example, an observed correlation between menopause and increased risk of cardiovascular disease led medical experts to recommend menopausal hormone therapy to reduce women’s risk of disease. However, this proved to be poor advice, as later research discovered that hormone therapy actually increased the risks of cardiovascular and other diseases. More plausibly, aging itself results in both menopause and increased risk of cardiovascular disease.
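The confounder structure behind such spurious correlations can be sketched as a toy simulation (all numbers here are invented for illustration): ice cream sales and drownings never influence each other, yet they end up strongly correlated because both are driven by temperature.

```python
import random
import statistics

random.seed(1)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Hypothetical daily data: temperature is the lurking common cause.
days = 365
temps = [random.gauss(15, 10) for _ in range(days)]
ice_cream = [0.8 * t + random.gauss(0, 4) for t in temps]  # sales rise with heat
drownings = [0.5 * t + random.gauss(0, 4) for t in temps]  # swimming rises with heat

# There is no causal path between these two variables, yet their
# correlation is strong because both track temperature.
r = pearson(ice_cream, drownings)
print(round(r, 2))
```

Conditioning on the confounder (e.g., comparing only days with similar temperatures) would largely dissolve the correlation, which is exactly what the intuitive causal impression misses.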
The point is that many things are correlated that do not reflect a causal relationship, though we often cannot help but intuitively infer causality. Even in the examples above, our first inclination is to draw a causal inference before discounting that impression through subsequent analysis. In a continuous quest for sense-making, our cognitive machinery continually draws from its matrix of associative networks and intuits causality by default. When the impression of causality is sufficiently surprising, we are primed to reevaluate that intuitive judgement through deliberative analysis. However, when the impression of causality feels appropriate, we are more prone to accept it at face value, or our analytic thinking will quickly converge on an explanatory narrative that confirms our intuitions.
Studies have shown that individual differences in the propensity to draw conclusions of causality from spurious correlations are predictive of conspiracy belief (van der Wal et al., 2018). Should we be cautious about concluding that conspiracy belief is caused by being prone to drawing spurious inferences of causality? Undoubtedly. However, these findings combine with a much broader literature demonstrating that individuals who more frequently rely upon intuitive judgments are more likely to believe in conspiracy theories. Furthermore, manipulations that momentarily increase analytic thinking decrease both inferences of spurious causation and self-reports of conspiracy belief (Swami et al., 2014).
Magical Thinking
Both apophenia and the propensity toward spurious causal inference are components of what is termed magical thinking. However, what makes magical thinking distinct is the inclusion of supernatural characteristics in the narratives constructed to explain these perceived causal relationships. This can take on a wide variety of forms, which is perhaps unsurprising given that accordance with generally-accepted natural laws no longer serves as a constraint. For instance, believing that one’s thoughts—on their own—can bring about effects in the physical world is a form of magical thinking. So is believing that invisible, immaterial agents cause events in the physical world in accordance with one’s thoughts, words, or deeds.
Magical thinking is strongly associated with beliefs in the paranormal and in conspiracy theories (Dyrendal et al., 2021). Paranormal belief reliably correlates with conspiracy belief, though the latter is far more common (Brotherton & French, 2014). Although both paranormal and conspiracy beliefs often co-occur, it bears emphasizing that conspiracy theories often do not invoke the supernatural or paranormal at all. Indeed, many conspiracy theories can be seen as alternative sense-making narratives that explicitly avoid invoking the paranormal by sticking to conventionally accepted understandings of the natural world. Conspiracy theories principally feature naturalistic human social behavior and only sometimes invoke supernatural phenomena. That said, many conspiracy theories are explicitly paranormal, so these should be considered as distinct categories of anomalous beliefs with considerable potential for overlap.
Regardless, a propensity for magical thinking appears to be an important variable that is strongly and reliably associated with conspiracy belief—and perhaps more importantly—general conspiracy mindset.
The Miraculous World View
As a Mormon youth, I was primed to search events of my everyday life for signs of the miraculous. I was taught that God is ever present and directly intervenes in the world—especially in the lives of the faithful. Week after week, I listened to countless members bear their testimonies in sacrament meetings or Sunday school lessons about the ways that God turned up in their lives, often in the most unexpected places. I was promised that if I was sufficiently attentive, I too would recognize God’s hand in my life and develop an appreciation for everyday miracles. Recognizing these miracles was an important spiritual practice that would instill within me gratitude and humility, in addition to serving as evidence I could point to when bearing my own testimony of the faith.
Repeatedly hearing about others’ experiences with miracles instilled within me a desire to have the same. I was thoroughly motivated to interpret various events in my life as causally linked and evidential of either God’s favor and intervention on my behalf or his disappointment and withholding of blessings.
“Wow, I sure am being productive today! Everything is going so smoothly. Reading the scriptures this morning really invited the Spirit into my life.”
“Why am I doing so poorly in this class? Is it because I sometimes struggle with impure thoughts and am unworthy of the Holy Ghost’s companionship?”
“My funds were particularly tight last month, but it somehow—just barely—worked out in the end. I’m sure glad I paid my tithing.”
As I reflect back on experiences like these, I am struck by how prevalent such messaging is in Mormon settings, and how illogical it can seem. As a Mormon, however, these relationships make complete sense; they are to be expected. After all, The Book of Mormon teaches that the faithfully obedient “are blessed in all things, both temporal and spiritual” and are promised to “prosper in the land,” whereas the wickedly disobedient “shall be cut off from the presence of the Lord” and His blessings (Mosiah 2:22, 41; Alma 9:13). Going further, the Doctrine and Covenants states that “when we obtain any blessing from God, it is by obedience to that law upon which it is predicated” (D&C § 130:21). Baked throughout Mormon theology is the notion that blessings are the product of righteousness and curses the consequence of wickedness or unfaithfulness.
Another example of pattern-seeking encouraged in Mormon communities is identifying the “signs of the Second Coming.” The Mormon faith is inherently apocalyptic, ever keeping an eye to the future day when the Latter-day Saints will welcome the return of Jesus Christ and the beginning of His millennial reign on the Earth. The premillennialist theology of Mormonism claims that the world will endure a period of tribulation preceding Christ’s miraculous return. Members are encouraged to prepare, in part, to endure these hardships by keeping a weather eye on the horizon. This encourages an attitude of declinism, such that Mormons are often prone to viewing society as generally getting worse over time, which is taken as evidence that the Parousia is close at hand. Notably, apocalypticism and declinism are each associated with conspiracy belief (Robertson & Dyrendal, 2018).
Members are taught that through their faith they can effectively will miracles into being. This is particularly true regarding the holy priesthood, which empowers those ordained to ritually rebuke diseases and evil spirits, consecrate homes and grave sites, and perform the various sacramental ordinances taught to be requisite for eventual exaltation. The Aaronic Priesthood, “which priesthood holdeth the key of the ministering of angels” (D&C § 84:26), is conferred upon boys as young as 11 years old. Church leaders have taught that this enables ordained men to have literal angels attend to and strengthen them on condition of their spiritual worthiness. Speaking to the young men, Elder Dallin H. Oaks taught:
When I was young, I thought such personal appearances were the only meaning of the ministering of angels. As a young holder of the Aaronic Priesthood, I did not think I would see an angel, and I wondered what such appearances had to do with the Aaronic Priesthood. But the ministering of angels can also be unseen. Angelic messages can be delivered by a voice or merely by thoughts or feelings communicated to the mind. President John Taylor described “the action of the angels, or messengers of God, upon our minds, so that the heart can conceive … revelations from the eternal world.” Nephi described three manifestations of the ministering of angels when he reminded his rebellious brothers that (1) they had “seen an angel,” (2) they had “heard his voice from time to time,” and (3) also that an angel had “spoken unto [them] in a still small voice” though they were “past feeling” and “could not feel his words” (1 Ne. 17:45). [...] Most angelic communications are felt or heard rather than seen. How does the Aaronic Priesthood hold the key to the ministering of angels? The answer is the same as for the Spirit of the Lord. In general, the blessings of spiritual companionship and communication are only available to those who are clean. As explained earlier, through the Aaronic Priesthood ordinances of baptism and the sacrament, we are cleansed of our sins and promised that if we keep our covenants we will always have His Spirit to be with us. I believe that promise not only refers to the Holy Ghost but also to the ministering of angels, for “angels speak by the power of the Holy Ghost; wherefore, they speak the words of Christ” (2 Ne. 32:3).
(Elder Dallin H. Oaks, “The Aaronic Priesthood and the Sacrament,” General Conference, October 1998; emphasis my own.)
The examples above illustrate how Mormon culture prepares those socialized into it to seek patterns where none may exist, to attribute causation to events perceived as correlated, and to engage in magical thinking about the way one’s thoughts or actions may influence the world through supernatural means. Many of these examples are not unique to Mormon culture, but they are especially prominent among Mormons and are expressed in uniquely Mormon ways. The concept of Mormon men wielding the delegated power and authority of God, for instance, provides a powerful narrative that encourages magical thinking in ways that are distinctly Mormon.
As a whole, Mormon culture appears to provide an especially ripe environment for the development of conspiracy belief along a wide range of epistemic vectors. In subsequent essays, I will explore how the same is true of existential and social motivations for conspiracy beliefs.
Literature Cited
Bless H, Clore GL, Schwarz N, Golisano V, Rabe C, & Wölk M. (1996). Mood and the use of scripts: Does a happy mood really lead to mindlessness? Journal of Personality and Social Psychology, 71, 665–679.
Brotherton R. (2015). Suspicious minds: Why we believe conspiracy theories. Bloomsbury Sigma.
Brotherton R, & French CC. (2014). Belief in conspiracy theories and susceptibility to the conjunction fallacy. Applied Cognitive Psychology, 28, 238–248.
Douglas KM, Sutton RM, & Cichocka A. (2017). The psychology of conspiracy theories. Current Directions in Psychological Science, 26, 538–542.
Dyrendal A, Kennair LEO, & Bendixen M. (2021). Predictors of belief in conspiracy theory: The role of individual differences in schizotypal traits, paranormal beliefs, social dominance orientation, right wing authoritarianism and conspiracy mentality. Personality and Individual Differences, 173, 110645.
Gligorić V, da Silva MM, Eker S, van Hoek N, Nieuwenhuijzen E, Popova U, & Zeighami G. (2021). The usual suspects: How psychological motives and thinking styles predict the endorsement of well-known and COVID-19 conspiracy beliefs. Applied Cognitive Psychology, 35, 1171–1181.
Gligorić V, Većkalov B, & Žeželj I. (2018). Intuitive and analytical cognitive styles as determinants of belief in conspiracy theories. In K. Damnjanović, I. Stepanović Ilić, & S. Marković (Eds.), Proceedings of the XXIV conference empirical studies in psychology (pp. 93–95).
Leman PJ, & Cinnirella M. (2013). Beliefs in conspiracy theories and the need for cognitive closure. Frontiers in Psychology, 4, 1–10.
Montell A. (2021). Cultish: The language of fanaticism. Harper Collins.
Nyhan B. (2021). Why the backfire effect does not explain the durability of political misperceptions. Proceedings of the National Academy of Sciences, 118, e1912440117.
Nyhan B, & Reifler J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32, 303–330.
Pytlik N, Soll D, & Mehl S. (2020). Thinking preference and conspiracy belief: Intuitive thinking and the jumping to conclusions-bias as a basis for the belief in conspiracy theories. Frontiers in Psychiatry, 11, 568942.
Robertson DG, & Dyrendal A. (2018). Conspiracy theories and religion: Superstition, seekership, and salvation. In J. E. Uscinski (Ed.), Conspiracy Theories & the People Who Believe Them (pp. 411–421).
Swami V, Voracek M, Stieger S, Tran US, & Furnham A. (2014). Analytic thinking reduces belief in conspiracy theories. Cognition, 133, 572–585.
Thompson V, & Morsanyi K. (2012). Analytic thinking: Do you feel like it? Mind & Society, 11, 93–105.
van der Wal RC, Sutton RM, Lange J, & Braga J. (2018). Suspicious binds: Conspiracy thinking and tenuous perceptions of causal connections between co-occurring and spuriously correlated events. European Journal of Social Psychology, 48, 970–989.
van Prooijen J-W, & Acker M. (2015). The influence of control on belief in conspiracy theories: Conceptual and applied extensions. Applied Cognitive Psychology, 29, 753–761.
van Prooijen J-W, Douglas KM, & De Inocencio C. (2018). Connecting the dots: Illusory pattern perception predicts belief in conspiracies and the supernatural. European Journal of Social Psychology, 48, 320–335.
van Prooijen J-W, & van Dijk E. (2014). When consequence size predicts belief in conspiracy theories: The moderating role of perspective taking. Journal of Experimental Social Psychology, 55, 63–73.