Sure, we joked about it. We made memes about it. But in a shocking new development, scientists have confirmed… it’s true.
People really do think that randomly generated words strung together by the Wisdom of Chopra word generator are profound.
A new study, “On the reception and detection of pseudo-profound bullshit,” appears in the November issue of the journal Judgment and Decision Making. It comes from a team largely based in the University of Waterloo’s Department of Psychology, led by Gordon Pennycook. The paper, which the authors cheekily acknowledge “may not be truly profound, [but] is indeed meaningful,” is a wry but serious analysis of four well-executed studies funded by the Natural Sciences and Engineering Research Council of Canada.
Here, they explain why they’re wading into this deep subject (I promise that’s the only punning I’ll do here):
Although bullshit is common in everyday life and has attracted attention from philosophers, its reception (critical or ingenuous) has not, to our knowledge, been subject to empirical investigation. Here we focus on pseudo-profound bullshit, which consists of seemingly impressive assertions that are presented as true and meaningful but are actually vacuous. We presented participants with bullshit statements consisting of buzzwords randomly organized into statements with syntactic structure but no discernible meaning (e.g., “Wholeness quiets infinite phenomena”).
In On Bullshit, the philosopher Frankfurt (2005) defines bullshit as something that is designed to impress but that was constructed absent direct concern for the truth. This distinguishes bullshit from lying, which entails a deliberate manipulation and subversion of truth (as understood by the liar).
Past, um, bullshit research focused on the bullshitter, they explain, while they focus here on “the factors that predispose one to become or to resist becoming a bullshittee.” For examples of pseudo-profound bullshit, they turned to none other than The Master himself: Deepak Chopra. Study participants were asked to rate the profundity of sentences like this one created by the glorious random word generator WisdomOfChopra.com:
“Hidden meaning transforms unparalleled abstract beauty.”
Compare that to this example:
“Attention and intention are the mechanics of manifestation.”
That second one is an actual tweet from Chopra, or, as the authors describe it, a “real-world example of pseudo-profound bullshit” that bears “a striking resemblance” to the first sentence, yet is “(presumably) not a random collection of words.” It’s a style whose vagueness hides the fact that it offers the appearance of knowledge rather than knowledge itself, one which
attempts to impress rather than to inform; to be engaging rather than instructive…
The vagueness… indicates that it may have been constructed to impress upon the reader some sense of profundity at the expense of a clear exposition of meaning or truth. Despite the lack of direct concern for truth noted by Frankfurt (2005), pseudo-profound bullshit betrays a concern for verisimilitude or truthiness. We argue that an important adjutant of pseudo-profound bullshit is vagueness which, combined with a generally charitable attitude toward ambiguity, may be exacerbated by the nature of recent media.
Why are gurus like Chopra so popular? Why do so many find meaning where there is none? The authors wanted to know, and started with a couple of possibilities:
In our view, there are two candidate mechanisms that might explain a general “receptivity” to bullshit… we are biased toward accepting bullshit as true and it therefore requires additional processing to overcome this bias… The second mechanism relates to a potential inability to detect bullshit, which may cause one to confuse vagueness for profundity. In the words of Sperber (2010): “All too often, what readers do is judge profound what they have failed to grasp…”
Who’s more receptive to bullshit, and how can we protect ourselves from it? The researchers looked at three traits, starting with analytic thinking, since
… to be a good reasoner, one must have both the capacity to do whatever computation is necessary (i.e., cognitive ability, intelligence) and the willingness to engage deliberative reasoning processes (i.e., analytic cognitive style; thinking disposition)… we posit that… more analytic individuals should be more likely to detect the need for additional scrutiny when exposed to pseudo-profound bullshit. More intuitive individuals, in contrast, should respond based on a sort of first impression, which will be inflated due to the vagueness of the pseudo-profound bullshit.
Next is ontological confusion, where people fail to recognize “property differences between animate and inanimate or mental and physical”:
Consider the belief that prayers have the capacity to heal (i.e., spiritual healing). Such beliefs are taken to result from conflation of mental phenomenon, which are subjective and immaterial, and physical phenomenon, which are objective and material (Lindeman, Svedholm-Hakkinen & Lipsanen, 2015)… As such, the propensity to endorse ontological confusions should be linked to higher levels of bullshit receptivity.
Third is epistemically suspect belief, which “conflict(s) with common naturalistic conceptions of the world”:
For example, the belief in angels (and the corresponding belief that they can move through walls) conflicts with the common folk-mechanical belief that things cannot pass through solid objects… Epistemically suspect beliefs, once formed, are often accompanied by an unwillingness to critically reflect on such beliefs. Indeed, reflective thinkers are less likely to be religious and paranormal believers … and are less likely to engage in conspiratorial ideation… or believe in the efficacy of alternative medicine… Ontological confusions are also more common among believers in the supernatural… Although epistemically suspect claims may or may not themselves qualify as bullshit, the lack of skepticism that underlies the acceptance of epistemically suspect claims should also promote positive bullshit receptivity.
Researchers conducted four related studies that asked participants to rate the profundity of meaningless statements on a 1-5 scale and also assessed the above traits. One study involved 280 college undergraduate volunteers, while the others used 198, 125 and 242 participants from Amazon’s Mechanical Turk (an online marketplace for tasks that lend themselves to remote outsourcing).
Ten novel meaningless statements were derived from two websites and used to create a Bullshit Receptivity (BSR) scale. The first, http://wisdomofchopra.com, constructs meaningless statements with appropriate syntactic structure by randomly mashing together a list of words used in Deepak Chopra’s tweets (e.g., “Imagination is inside exponential space time events”). The second, “The New Age Bullshit Generator” (http://sebpearce.com/bullshit/), works on the same principle but uses a list of profound-sounding words compiled by its author, Seb Pearce (e.g., “We are in the midst of a self-aware blossoming of being that will align us with the nexus itself”).
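The mechanism both generators use is easy to see in miniature: slot random buzzwords into a fixed grammatical template. Here is a minimal sketch in Python; the word lists and template are made up for illustration, not taken from either site:

```python
import random

# Hypothetical buzzword lists -- the real generators draw on words from
# Deepak Chopra's tweets or Seb Pearce's hand-compiled vocabulary.
ADJECTIVES = ["infinite", "unparalleled", "hidden", "exponential", "abstract"]
NOUNS = ["wholeness", "imagination", "awareness", "intention", "beauty"]
VERBS = ["quiets", "transforms", "manifests", "transcends", "unfolds"]

def pseudo_profound_statement():
    """Fill a fixed syntactic template with random buzzwords, producing a
    grammatical but meaningless sentence (the paper's definition of
    pseudo-profound bullshit)."""
    noun1, noun2 = random.sample(NOUNS, 2)  # two distinct nouns
    return "{} {} {} {} {}.".format(
        random.choice(ADJECTIVES).capitalize(),
        noun1,
        random.choice(VERBS),
        random.choice(ADJECTIVES),
        noun2,
    )

print(pseudo_profound_statement())
```

Because the template guarantees syntactic structure, every output parses as an English sentence, which is exactly what lets vagueness pass for profundity.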
Each person’s mean rating became their Bullshit Receptivity (BSR) score. The findings are damn sad, but won’t surprise readers here:
…participants largely failed to detect that the statements are bullshit.
The mean profoundness rating was 2.6, which is in-between “somewhat profound” and “fairly profound” on the 5-point scale. Indeed, the mean profoundness rating for each item was significantly greater than 2 (“somewhat profound”), all t’s > 5.7, all p’s < .001, indicating that our items successfully elicited a sense of profoundness on the aggregate. Moreover, only 18.3% (N = 51) of the sample had a mean rating less than 2. A slight majority of the sample’s mean ratings fell on or in-between 2 and 3 (54.5%, N = 152) and over a quarter of the sample (27.2%, N = 76) gave mean ratings higher than 3 (“fairly profound”).
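The scoring behind those numbers is simple arithmetic: a participant’s BSR score is the mean of their 1–5 profundity ratings across the ten items. A minimal sketch, with made-up ratings for three hypothetical participants:

```python
def bsr_score(ratings):
    """Bullshit Receptivity score: the mean of a participant's
    1-5 profundity ratings across the ten bullshit items."""
    return sum(ratings) / len(ratings)

# Illustrative (invented) ratings for three participants on ten items
participants = {
    "skeptical": [1, 2, 1, 2, 1, 1, 2, 1, 2, 1],
    "middling":  [3, 2, 3, 2, 3, 3, 2, 3, 2, 3],
    "credulous": [4, 5, 4, 3, 4, 5, 4, 4, 3, 4],
}
scores = {name: bsr_score(r) for name, r in participants.items()}
print(scores)  # {'skeptical': 1.4, 'middling': 2.6, 'credulous': 4.0}
```

On the paper’s reading, only the first of these hypothetical raters would count as having detected the bullshit (mean below 2, “somewhat profound”).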
To assess thinking styles, participants were given five cognitive tests, a verbal intelligence test, religious and paranormal beliefs questionnaires, and an “ontological confusion” test. This last was an interesting Finnish test that asks people to rate how literal a statement is on a scale from 1 (“fully metaphorical”) to 5 (“fully literal”). The paper gives examples of clearly literal (“Wayne Gretzky was a hockey player”), clearly metaphorical (“Friends are the salt of life”) and one perhaps less clear (“A rock lives for a long time”). Those who rated metaphors as literal were considered “ontologically confused.”
I ask you, how often have you as a skeptic been dismissed by a religious person or New Ager for being “closed-minded”? Well, I’ve got your response right here. The authors distinguish between uncritical or “reflexive” open-mindedness, and thoughtful or “reflective” open-mindedness. Reflexive open-mindedness “is very accepting of information without very much processing.” Reflective open-mindedness “searches for information as a means to facilitate critical analysis and reflection.” The authors hypothesized that the former makes you more bullshit receptive, while the latter helps you guard against bullshit. And sure enough:
Those more receptive to bullshit are less reflective, lower in cognitive ability (i.e., verbal and fluid intelligence, numeracy), are more prone to ontological confusions and conspiratorial ideation, are more likely to hold religious and paranormal beliefs, and are more likely to endorse complementary and alternative medicine.
… our initial evidence indicates that reflectiveness may be a key individual difference variable. At a very basic level, the willingness to stop and think analytically about the actual meanings of the presented words and their associations would seem an a priori defense against accepting bullshit at face value (i.e., to avoid an excessively open-minded response bias).
For skeptics confronting pseudo-profound bullshit:
… detecting is not merely a matter of indiscriminate skepticism but rather a discernment of deceptive vagueness in otherwise impressive sounding claims.
How does this compare with real life? Well, participants weren’t warned that they might be confronting bullshit. That, as the authors point out, is often our situation in real life. Bullshit is especially hard to detect in creative works like poetry. And being human, our interpretation will be biased by the possible bullshitter’s prestige or our trust in them.
The authors didn’t shy from stating that one of their goals is to “raise the possibility that Chopra’s tendency to bullshit (as claimed by others…) may have played an important role in his popularity.” Not just popular, they note, but very rich — “one of the wealthiest holistic-health ‘gurus.’”
Bullshit is a consequential aspect of the human condition. Indeed, with the rise of communication technology, people are likely encountering more bullshit in their everyday lives than ever before… At the time of this writing, Chopra has over 2.5 million followers on “Twitter” and has written more than twenty New York Times bestsellers. Bullshit is not only common; it is popular. Chopra is, of course, just one example among many. Using vagueness or ambiguity to mask a lack of meaningfulness is surely common in political rhetoric, marketing, and even academia… Indeed, as intimated by Frankfurt…, bullshitting is something that we likely all engage in to some degree… The development of interventions and strategies that help individuals guard against bullshit is an important additional goal that requires considerable attention from cognitive and social psychologists. That people vary in their receptivity toward bullshit is perhaps less surprising than the fact that psychological scientists have heretofore neglected this issue.
Also, I’m totally using “ontologically confused” as my fave new passive-aggressive insult.