Lesson from the Rolling Stone: We Are Easily Duped

April 6, 2015

I vividly recall reading the Rolling Stone story, “A Rape on Campus,” last November. The article made me queasy, as it did so many others who read it. I recalled what I had already heard about rape culture on university campuses and the stereotypes of fraternity party culture, and of course my heart went out to the victim of such a horrible crime. The article led to a national conversation about rape on college campuses and about the entitled, elitist party cultures that create environments ripe for sexual abuse. But big questions soon emerged about the very event that prompted the conversation.

An investigative report carried out by the Columbia Graduate School of Journalism (and commissioned by Rolling Stone) concluded that the article lacked sufficient journalistic merit; fundamental journalistic and editorial mistakes were made. The report prompted Rolling Stone to retract the article. It seems that none of these mistakes were intentional; that is, no one–from Sabrina Erdely, the author of the piece, to the managing editors responsible for factual oversight–intended to glide past crucial rules for good investigative journalism. Instead, this seems to be an illustration of a universal human tendency, one rooted in what psychologists call dissonance theory and confirmation bias.

I’ve written about this before, but it’s weighty enough to mention again–particularly in light of this report. As Carol Tavris and Elliot Aronson explain in their book Mistakes Were Made (but not by me), people do not live easily with cognitive dissonance. Our minds work intuitively and aggressively to eradicate the tensions created by competing, alternative sets of data or lines of evidence. Our inability to deal well with cognitive dissonance leads to a “confirmation bias”: we are biased toward confirming the beliefs we have already come to accept. And so we become masters of “self-justification,” convincing ourselves that we are on the right track, that the evidence is convincing, our argument compelling, our beliefs sound, and so on.

So it’s easy to see how a reader (myself, for example) would be biased toward confirming the truth of a published essay by a professional journalist. But it’s also easy to see how the journalist herself would be biased toward confirming the truth of a powerful and troubling story she had been convincingly told by someone in the position of an apparently vulnerable victim (“Jackie”), coming forward bravely to tell her story. Those of us who are casual observers of this incident cannot possibly know the truth of or behind “Jackie’s” story. What we do know, as a result of the investigative report, is that the details of her story were not sufficiently confirmed through appropriate journalistic procedures and did not meet the standards of epistemic vigilance that journalism requires. My interest, though, is in what seems to have happened in this case: something like a domino effect of confirmation bias at work–in “Jackie’s” original narration of her memory, in the methods of Erdely’s investigation, from the first draft to the last, and through all the editorial “checks” in between. Confirmation and dissonance resolution.

It happens to all of us. Only the magnitude of the consequences differs.

Some might be surprised that no one has lost a job over this. After reading through Mistakes Were Made, I am not surprised; in fact, it makes sense. Toward the end of their book, Tavris and Aronson argue for the importance of establishing checks and balances on confirmation bias; institutions and organizations, as well as individuals, all need to seek out correction and help in identifying blind spots. But sometimes it’s in making mistakes that we learn and grow the most. They retell a story, originally told by organizational consultants Warren Bennis and Burt Nanus, about Tom Watson Sr., who was “IBM’s founder and its guiding inspiration for over forty years”:

“A promising junior executive of IBM was involved in a risky venture for the company and managed to lose over $10 million in the gamble,” they wrote. “It was a disaster. When Watson called the nervous executive into his office, the young man blurted out, ‘I guess you want my resignation?’ Watson said, ‘You can’t be serious. We’ve just spent $10 million educating you!’”

Granted, the implications of the Rolling Stone story touch on much more than money. But the “lessons learned” about the power of confirmation bias and the importance of vigorous fact-checking, particularly for those in a vocation like journalism, may be similar. I suspect that everyone involved will have grown from the process.

And it’s a good reminder for all of us about the power of our own tendencies toward confirmation and our own difficulty dealing well with cognitive dissonance. Are we up to the task of facing it well?

Note: I first published this post this morning, but after receiving a few early comments questioning my conclusion that “Jackie’s” story was based on false memories (i.e., suggesting that she had not been raped), I pulled the post in order to revise it. The critiques and concerns raised were legitimate and on target. I hope that I have at least demonstrated my own willingness to admit that “mistakes were made,” and yes–even by me.
