If My Opponent Believes That, Then I Surely Can’t

Scott Alexander has a wonderful post highlighting ten fallacious ways that people hastily jump from seeing that their opponents believe something to dismissing it. All ten are very good observations, with examples provided. Let me highlight three I found particularly amusing and send you to Slate Star Codex to read and bookmark the whole list so you remember never to engage in these evasive thinking strategies:

2. Argument From My Opponent Believes Something, Which Means They Believe It Is The Answer To One Question, Which Is Kinda Like Believing It Is The Answer To All Questions, But It Isn’t: “Statists believe government can solve all our problems. They need to understand the world doesn’t work that way.”

6. Argument From My Opponent Believes Something, Which Is Kinda Like Hating The People Who Don’t Believe In It, And Hatred Is Wrong: “People need to get over their frothing hatred for euthanasia.”

9. Argument From My Opponent Believes Something, Which Might Suggest A Course Of Action, Which Could In Theory Be Implemented Through Violence, And Violence Is Wrong: “Transhumanists think AI may be dangerous, but this could encourage people to kill AI researchers, so holding this belief is irresponsible.” Or, “Environmentalist condemnations of the oil industry encourage eco-terrorist attacks on oil workers.”

Read and avoid all ten “arguments from my opponent believes something”.

About Daniel Fincke

Dr. Daniel Fincke has his PhD in philosophy from Fordham University and spent 11 years teaching in college classrooms. He wrote his dissertation on ethics and the philosophy of Friedrich Nietzsche. On Camels With Hammers, the careful philosophy blog he writes for a popular audience, Dan argues for atheism and develops a humanistic ethical theory he calls “Empowerment Ethics”. Dan also teaches affordable, non-matriculated, video-conferencing philosophy classes on ethics, Nietzsche, historical philosophy, and philosophy for atheists that anyone around the world can sign up for. (You can learn more about Dan’s online classes here.) Dan is an APPA (American Philosophical Practitioners Association) certified philosophical counselor who offers philosophical advice services to help people work through the philosophical aspects of their practical problems or to work out their views on philosophical issues. (You can read examples of Dan’s advice here.) Through his blogging, his online teaching, and his philosophical advice services, Dan specializes in helping people who have recently left a religious tradition work out their constructive answers to questions of ethics, metaphysics, the meaning of life, etc. as part of their process of radical worldview change.

  • toddistark

    I found the source article difficult to read because of something about the tone of the writing, but I like the use of examples.

    Recently I’ve been thinking that most “cognitive biases” seem linked back in an important way to something I’ll call the “transparency bias”, which I think is, in a sense, a kind of master schema that drives them.
    By transparency bias I simply mean the automatic and unavoidable tendency to feel that the world and other people are exactly as we perceive them to be. It’s probably derived from the fundamental need for animals to act on signals as they perceive and process them.

    Hominids, of course, have this particularly remarkable capacity to ignore their own immediate processing priorities or to delay responding to them, which we have culturally built into philosophies of reflection and philosophies of doubt. I think the function of those is, for the most part, straightforward: compensating for the transparency bias.
    I’m not saying anything new, of course; I’m sure this is part of what ancient philosophers were getting at, and many since them. That’s the point: I think it is fundamental and might be put in psychological as well as philosophical terms, in order to learn more about it and better link epistemology back to science.

    If people generally reason through specific perspectives or lenses composed of cognitive schemas (and I believe they do), and these lenses can vary, then the transparency bias presents an asymmetry. We each think we are reasoning in a very straightforward way about our own ideas and those of the people we disagree with. The asymmetry is that we tend to reason more carefully and with more precision about our own ideas, and to generalize and stereotype opposing views more. That relates back to all the literature on cognitive conservatism, egocentric bias, and “myside bias”, as well as attitude polarization.

    To get to the point at last: I think this list is a very good set of examples in which our natural tendency toward perspectival asymmetry is demonstrated most dramatically, namely the relentless drift toward seeing opposing viewpoints in a more general form and our own in a more nuanced and flexible way.

    Transparency bias → perspectival asymmetry → seeing our own view, or perceived supportive views, in a more nuanced and flexible way than opposing or perceived threatening views.

    Does that make any sense?

  • GCBill

    Slate Star Codex is my favorite multi-subject blog to read. It makes me happy to see Scott get a mention on one of my favorite atheist blogs.

