Philosophical Advice About Not Appearing Closed Minded (Part 1)

Dear Dr. Fincke,

I’m a big fan of your blog and I absolutely adore your articles. I have a bachelor’s degree in Philosophy myself, and I am always up for a rigorous debate. I pride myself on being able to keep an open mind on a lot of issues and to entertain assertions and their logical consequences without necessarily accepting them. In fact, I sometimes worry that I’m too open-minded, as I waver between many different positions, especially in matters of ethics. The arguments on all sides often seem to have merit.

So you can imagine my dismay when, in discussions with friends and acquaintances, both in real life and on the Internet, I seem to have a chronic problem of being called “closed-minded”.

Now, it’s possible that I’m a little sensitive to this issue. When I was a teenager and initially expressed my doubt in Christianity to some members of my family, my aunt sharply rebuked my line of thought (mostly about evidence and justification for belief) as closed-minded. This caught me off guard and led to some introspection about whether I really had considered all the relevant aspects of the question, and whether there was just some sort of character flaw keeping me from seeing the Christian perspective on the issue.

I’m now pretty certain that’s not the case, but the accusation of closed-mindedness still stings. It seems almost the worst thing a prospective philosopher could be!

The latest was in a discussion about ethics I had on Facebook. An argument was already in progress about a picture a Catholic friend of mine had shared in response to the DOMA ruling – a quote about how what is morally right is morally right independently of majority opinion on the matter. A sentiment I agree with, though one I find vaguely sinister coming from a Catholic page. Another person opined that morality is an illusion, nothing more than societal consensus. Of course I took that view to task, as my training has taught me, and asked for an actual argument to that effect while outlining the consequences (no possibility of moral progress, no moral heroes, common-sense moral wrongs not actually wrong).

The response to what I thought was a decent write-up in which I asked for clarification of the points I didn’t understand was, “So someone who doesn’t agree with your version of morality is just wrong? You’ve proven me right and proven you’re closed-minded…”

And I have no idea how I gave that person that impression. But it has happened often enough that I wonder if this is actually a flaw.

As a philosopher, I’ve accepted that I generally care about issues that the vast majority of people could not care less about. Could it be that my passion comes off as extremism?

Does my willingness to write and talk at lengths about why I disagree come off as too forceful? Does my relentless appeal to standards of evidence and reasoning come off as dogmatic somehow? Am I simply expecting too much of people who may not have had the same training in Philosophy I have?

Or worse, am I actually closed-minded and too entrenched in my own echo chamber to realize it?

Any assistance in solving this problem would be appreciated, and thank you for both your time and addictive blog content.

Vance

Well, I should probably give a disclaimer that any advice I give here is going to be advice I might have trouble following myself as I am a philosophical guy too!

Without being party to your discussions regularly, it is of course impossible for me to judge whether you are in fact closed-minded. Insofar as your self-descriptions are accurate, you do not sound very closed-minded. Inevitably we deal with people who are guilty of projection: ascribing to others our own negative traits instead of recognizing them in ourselves. It is precisely when people are resisting opening their minds that they lash out and accuse you of having a closed mind. If you are the one in the habit of regularly reinspecting all your opinions and holding yourself and others to scrupulous evidential and logical standards, whereas your interlocutors are everyday non-philosophers who are not trained in such intellectual habits (or in the whole ethos that prioritizes this as conscientious behavior), then they’re probably much guiltier than you of closed-mindedness and you’re dealing with their defense mechanism.

But it’s not particularly helpful to just pat you on the back and say, “continue what you’re doing” if what you’re doing is still eliciting negative responses. So, let’s think more carefully about how to approach people, understanding that they usually don’t share our philosophical habits when it comes to critical thinking. Since decidedly unphilosophical people are decidedly unphilosophical, they’re typically bad at explaining the logic of their thinking and feeling in ways that make philosophical sense. So, in order to be charitable to them, let me try to translate what is implicitly going on into terms that make sense philosophically. Philosophically construing the logic of unphilosophical people has the advantage of highlighting some true and useful things that those prone toward philosophical thinking are likelier to miss. It will also highlight some traps of closed-mindedness that we philosophical people are more susceptible to falling into. I will get to these tasks in the next two posts. For the rest of this post, I want to give an anatomy of why we are all deeply prone to close our minds to challenging new ideas in the first place.

Even unreflective people think with some kind of implicit logic. No one is really “illogical”. All our brains process according to associations that make sense. Our beliefs, feelings, and values interconnect in often deep ways. But a great number of these connections are processed subconsciously and are relatively rarely clear to us at the surface. We often may not be able to explicitly articulate what we think. Even those of us who are methodically critical only explicitly analyze a relatively small portion of the associations that our brains implicitly have to make decisions about all the time. Our brains are processing the world in ways that make sense of it for them, and we are, for the most part, barely aware of what is happening. On the conscious level, we are usually just reading off the results and acting accordingly. It is even dubious that our consciousness itself often (or maybe ever) makes the decisions, rather than just being the part of us that sees what the inaccessible parts of the brain are figuring out and deciding. Consciousness may be a major screen for taking in information the processing parts of the brain can work on and for becoming aware of what those processors generate. But it is not nearly the cause we think it is. And of all the problems our brains work out, only relatively few, for one reason or another, rise to the level of conscious awareness.

So when we are pressed on things we haven’t thought much about, answers will often pop straight to our minds. And our first inclination is usually to just run with them, especially if we are not well trained in more scrupulous critical thinking habits. The answers that pop to mind will sound pretty good to us even when we are poor at articulating or defending them cogently, coherently, or systematically. And when more skilled analyzers than we are press us to defend these intuitions, we may struggle and feel uncomfortable while we cope with a ton of cognitive dissonance. Deep down we are going to resist changing our minds in radical ways, because under the surface of any given momentary discussion there is a ton of interconnected beliefs, values, feelings, relationships, etc., that constitute both ourselves and our whole, intuitively workable sense of the world. The deeper parts of our brain are probably sending default messages in response to all disruptive, challenging ideas, saying that they just don’t fit and should be rejected, while running up as many ad hoc suggestions as they can to explain what might be wrong.

So when we have a given idea threatened, we will often be more resistant to changing it the more it intuitively feels integral to the whole structure with which we make sense of the world. There must be something wrong with the new idea or, even easier, with you yourself, because if this idea falls, too much of our whole sense of the world would be wrong. And, to our default non-conscious processing brain, what’s more likely to be wrong? Our whole sense of the world, by which we successfully live every day, or this other person and this new idea that doesn’t seem to fit well at all with anything else we think?

The fact that our whole worldview seems to bolster our surface ideas is what gives even our most quickly refutable ideas more credibility with us than we are willing to grant others and their counter-arguments. Being intellectually lazy (and probably so by natural “design”, as our brains have an inborn prejudice toward simpler explanations), we conclude that others are probably the problem, not our own minds. Even if this is irrational, it’s something we need to work with.

One final thought before I end this post at a reasonable length and address Vance’s question in further depth in three follow-up posts, which will appear tomorrow, Sunday, and Monday. I think the rough sketch of mental processing I have just made explains the “Gestalt shift” or “paradigm shift” character of conversions, deconversions, and other major “worldview” changes. One day someone is a committed adherent of a particular belief or value system and then, seemingly out of the blue, their views on everything are drastically realigned.

What I think, from my own several major experiences with this kind of shift, is that for a long time people take in challenging ideas by assimilating them as best they can to the dominant beliefs, values, and commitments they presently have. They also learn enough about alternative viewpoints that they can tidily store them in their minds as “false views whose logic is understandable”. For a while, I think, it is possible to compartmentalize ideas and not be persuaded by them by telling yourself, when you see how they make sense, “Oh, I get it, that’s how things would look were your fundamental commitments true. But they’re not. So I’ll store that interesting idea in the ‘false but understandable perspective’ file for people who think like you.”

I think that over time, if a position you merely understand without accepting gains more and more internal plausibility, you can become more and more inclined to think it over and to practice looking at information within its categories. Eventually you might start seeing how some of the ideas you accepted that were harder to assimilate to your own dominant views would actually make more sense were the other perspective true. And suddenly, when new information comes in, your brain starts struggling as it sees how the information might fit either your committed perspective or that other one, which has minority sway but some sensibleness to it. Then, only when that alternative perspective that has been growing within you gains sufficient explanatory power to seem not only overwhelmingly more likely to be true but also capable of functionally and sensibly guiding you through the world, does the crisis moment happen where you feel compelled to accept this perspective instead.

And sometimes this means a dramatic, fairly sudden, and drastic takeover by this alternative perspective. The cognitive dissonance of holding two fundamental perspectives is just not going to fly for long, so a major changeover happens. To outsiders it’s jarring that someone can seemingly switch over overnight. But it’s never really overnight. It’s just that most people’s brains need to do some major sifting of categories, so the biggest change happens last; and when the choice is finally made to switch over all the major categories as one’s official stance, the consequences throughout one’s whole belief and value system can be big and noticeable. For people who believe and value less overtly, this happens more subtly and with less trauma. For people who have invested major parts of their identity in their beliefs, values, and commitments, it can be personally destabilizing for a while and shocking to other people.

So, with these preliminary sketches of how brains resist challenging ideas, with the effect of “closed-mindedness”, and yet can in time effect major switches, in my next post I explore why many people hold specific implicit common-sense assumptions and worldviews that incline them toward abstract statements of relativism and toward seeing objectivity-oriented philosophers like Vance as closed-minded. I think historical and psychological factors combine to give people a particular view of what open-mindedness and closed-mindedness are ethically. These views are in one way wrongheaded and need to be corrected. But they also evince a fairly sophisticated logic of implicit associations that philosophers would do very well to learn from.

In a third post I talk about how, even though philosophers are in many ways more open-minded than just about anyone, we have a unique susceptibility to coming off as closed-minded in specific sorts of cases.

This series ends with a post of practical advice about how not to come off as closed-minded to people who will be inclined to see you that way even when you’re not.

I am an American Philosophical Practitioners Association certified philosophical practitioner and I have a PhD in Philosophy from Fordham University.

As a philosophical practitioner I help people reason through their beliefs, values, priorities, identities, emotions, ethical dilemmas, life decisions, existential quandaries, religious or post-religious struggles, love relationships, interpersonal conflicts, search for meaning and purpose, or struggles in any other areas of life where some conceptual clarification, logical consistency, theoretical sensitivity, and emotional intelligence can be helpful.

I do not treat mental illness. I simply help people reason more clearly, consistently, ethically, and proactively about their lives. Send your questions to camelswithhammers at gmail dot com with the subject heading “Philosophical Advice”. The identities of all inquiring for advice are kept confidential and published e-mails will always use pseudonyms instead of real names.

If you are interested in counseling sessions write me with the subject heading “Philosophical Practice”. All sessions are confidential. And it does not matter where you are in the world; philosophical practitioners are not bound by state certification requirements and restrictions, so you and I can meet online.

To keep up with all installments in the “Philosophical Advice” Series keep tabs on this page.

About Daniel Fincke

Dr. Daniel Fincke has his PhD in philosophy from Fordham University and spent 11 years teaching in college classrooms. He wrote his dissertation on ethics and the philosophy of Friedrich Nietzsche. On Camels With Hammers, the careful philosophy blog he writes for a popular audience, Dan argues for atheism and develops a humanistic ethical theory he calls “Empowerment Ethics”. Dan also teaches affordable, non-matriculated, video-conferencing philosophy classes on ethics, Nietzsche, historical philosophy, and philosophy for atheists that anyone around the world can sign up for. (You can learn more about Dan’s online classes here.) Dan is an APPA (American Philosophical Practitioners Association) certified philosophical counselor who offers philosophical advice services to help people work through the philosophical aspects of their practical problems or to work out their views on philosophical issues. (You can read examples of Dan’s advice here.) Through his blogging, his online teaching, and his philosophical advice services, Dan specializes in helping people who have recently left a religious tradition work out their constructive answers to questions of ethics, metaphysics, the meaning of life, etc., as part of their process of radical worldview change.

  • http://wateringgoodseeds.tumblr.com/ Shira Coffee

    I can’t wait for the subsequent posts!

  • Charles Eggebrecht

    Yours is the best description of how my own “paradigm shifts” have taken place. You said you’ve had several of these shifts of your own. Which were your “biggest” or most dramatic? I’ve read about your deconversion so if that was the most major one, then I’m curious what’s second. Thanks

    • http://camelswithhammers.com/ Dan Fincke Camels With Hammers

Yes, the deconversion was the biggest one. Possibly the second biggest was away from radical skepticism/anti-realism to (what I consider to be) a form of moral realism and increasingly strong sympathies with ontological realism. I would also count my shift from political conservatism to liberalism as similar, even though it was a quieter process that ostensibly trailed the deconversion. There are numerous other mini-paradigm shifts. My deconversion narrative spells out stages I went through along the way in college, for example. Then during graduate school, abandoning confidence in what is known as “Continental philosophy” and turning back favorably to the analytics was a big deal for me.

