Information Avoidance

Ars Technica considers an analysis of how our beliefs determine the information we allow ourselves to be exposed to:

Analysis of the studies shows that people are almost two times more likely to select information that is congenial to their current beliefs and behaviors than they are to pick information that opposes them. That is to say, when offered material containing views that were contrary to their beliefs (either in article or broadcast form), people had only a one-in-three chance of taking a closer look at that information.

Not terribly surprising, but it does help explain why so many persist in their beliefs despite evidence to the contrary. This bit is particularly interesting:

The study also found that people who are unsure of their beliefs are actually more likely to avoid conflicting views.

My gut reaction was to think this can’t possibly be true; if there’s one thing people abhor more than anything, it’s uncertainty. But perhaps there is something worse: fear. Fear that you might be wrong.

  • Dan Fincke

    First of all, it’s delightful to see our first post from another blogger on the site besides me! And it’s a really nice find. Thanks for posting, Dave!

    There are several questions I have about what this study shows. The easy way to read it is that we are somehow prejudicial in our choice of reading material. But since young people are more receptive to alternative viewpoints and sources of information, the issue may not be prejudice but familiarity.

    If, when we’re young, we explore the basic terrain of the two sides of an issue and come to see one side as compelling, then it is not necessarily irrational to work within that theory and look for new information about it, since that’s how theories work: as things we hold to be true and with which we try to evaluate new information consistently.

    In other words, while some people may be prejudicial start to finish, it is just as likely that some of the people in this study have already surveyed the basic terrain, found the alternate paradigm bankrupt, and so look for new information within a paradigm they find workable.

    In my own case, I look at it like this: if I have a viewpoint that has already developed my opposition to people on the other side, then I don’t need to read the rabble-rousing pieces on the other side, I don’t need to read the introductory materials for the other side, etc. In other words, the other side’s sermons to its choir, or its attempts to spin its faulty paradigm to a new circumstance, will mean very little to me.

    BUT, it’s also important that those stronger thinkers on the other side who do not just traverse familiar ground but who develop new positions be listened to. Those people need to be read as carefully as, or more carefully than, the people you normally read when looking for help filling out your understanding of a paradigm you have already, in the past, found good reasons to support.

    What’s problematic to me is if there is never, in youth, a stage of genuinely open-minded investigation, and if, as an adult, one shuns or closes one’s mind to the genuinely novel ideas on the other side that have some hope of shifting one’s paradigm. But relying, as a matter of course, on people you think are basically right for reasons already established might not be such a prejudicial thing. It could just be a matter of looking where you think you will find knowledge and working within the best paradigm you’ve found.

    Now, maybe I’m just imagining ideal thinkers and maybe most people are simply acting on prejudice. Also, maybe I’m underestimating how much objective information is still valuable from even a heavily theory-conditioned opposing source, and how much it could make a dent in my views too, even if I find the spin uncompelling for previously established reasons. But again, information alone does very little work without frameworks of interpretation by which to sort it, and those can be rationally (rather than irrationally) developed in advance.

    And the insecurity of those in the middle is indeed a fascinating response. I would love more information about why they were conflicted: are there cultural pressures that give them an incentive not to be challenged, or do they, as you infer, just fear being wrong the most?

    In my teaching I try to tell my students that their ideas are not them, that having an idea shot down is not the same thing as them being shot down. It’s very hard for us to take feeling wrong, for various likely reasons: we don’t like feeling that we just failed at thinking and, even worse, we hate feeling like we were fooled. Another reason might be that we don’t know the consequences for the rest of our beliefs and so are hesitant to countenance the possibility of a domino effect if we’re shown to be wrong on this given point.

    All in all, a lot of interesting things to think about. Your Thoughts?

  • Dan Fincke

    Yeah, reading the article again, I’m really confused because it waffles between talk of viewpoints and information. We shouldn’t (but we do) avoid information we don’t like. But that’s very different from avoiding viewpoints that rehearse interpretations we have already developed reasons to reject.

    So, I really don’t know what to make of this. I used to like watching Brit Hume for information uncongenial to some of my views that were more left-wing. But watching Sean Hannity is a pure and unadulterated waste of time because of the prejudice in his viewpoint. There’s a difference between information that supports right-wing opinions and right-wing spin on the facts.

    On the other hand, while I find Keith Olbermann over the top and obnoxious, since I think he is more often right than Hannity, I can use him as A source of information that is uncongenial to the right (as I used Brit Hume for information uncongenial to the left), and I can also generally agree with some of his broader analysis, not because I’m biased but because I share the same background reasons and paradigms, which had their own separate formation process.

    But, I would be closed-minded in the extreme if I did not seek out other sources of information that are hostile to Olbermann, as well as the more sophisticated viewpoints on the other side.

    So, maybe we are hostile to unpleasant information as a rule, but I could imagine an open mind being compatible with a “2/3 my side” rule for the consumption of views and information. What matters is that we give adequate weight to the 1/3 of opposing sources that can make a dent in us by being (1) objective information and (2) presented by people with fresh angles.

  • Dave Smith

    Thanks Dan, great to hear your thoughts on this! You’ve made an excellent point about biases that are in fact justified, based on prior consideration of both sides of an issue. I’m sure that my own reading habits would fall right in line with the results of this study. So the real question isn’t so much the ideological distribution of what you read, but *why* you choose to read what you do. Are your beliefs well founded?

    In my own personal experience, one of the most frustrating parts of dealing with believing family and friends on issues of faith is the fact that many of them have read very little in the way of opposing arguments. Many of them were indoctrinated as children, and they’ve been so immersed in teachings that demonize the other side that honest inquiry is no longer possible. In fact, the act of logically considering the pros and cons of all sides of an issue is seen as dangerous. Questioning = Temptation.

    Regarding those who avoid information out of insecurity: I suppose I would like to expand my idea of it being out of a “fear of being wrong” to just fear in general, which would incorporate social pressures and anxieties. I went through a year or two myself where I was seriously questioning things internally but wasn’t willing to “go there” in terms of investigating alternatives to faith, largely because I think I knew deep down where it would lead and I reaaaallly didn’t want to have to deal with all that it would entail. But I don’t know whether the study would have been able to pick apart internal motivations like this, especially when the individual might not be particularly aware or conscious of them.

    I’m right there with you on watching cable news shows… I’m somewhat partial to MSNBC now, but still watch Fox pretty regularly. I can’t stand CNN… there’s nothing there of interest on either side. Which is an interesting topic in itself: news has become entertainment to a large degree, and not just on Comedy Central; I would say that’s true of most cable news.

