Over at Slate Star Codex, Scott has an excellent post on the epistemological dangers of punching down. I’m using a long quote, so that you can get the crux of the argument, but I strongly recommend reading the whole thing.
What annoys me about the people who harp on moon-hoaxing and homeopathy – without any interest in the rest of medicine or space history – is that it seems like an attempt to Other irrationality.
(yes, I did just use “other” as a verb. Maybe I’ve been hanging around Continental types too much lately.)
It’s saying “Look, over here! It’s irrational people, believing things that we can instantly dismiss as dumb. Things we feel no temptation, not one bit, to believe. It must be that they are defective and we are rational.”
But to me, the rationality movement is about Self-ing irrationality.
(yes, I did just use “self” as a verb. I don’t even have the excuse of it being part of a philosophical tradition)
It is about realizing that you, yes you, might be wrong about the things that you’re most certain of, and nothing can save you except maybe extreme epistemic paranoia.
Talking about moon-hoaxers and homeopaths too much, at least the way we do it, is counterproductive to this goal. Throw examples of obviously stupid false beliefs at someone, and they start thinking all false beliefs are obvious. Give too many examples of false beliefs that aren’t tempting to them, and they start believing they’re immune to temptation.
And I worry that we are vaccinating people against reading the research for themselves instead of trusting smarmy bloggers who talk about how stupid the other side is.
That we are vaccinating people against thinking there might be important truths on both sides of an issue.
That we are vaccinating people against understanding how “scientific evidence” is a really complicated concept, and that many things that are in peer-reviewed journals will later turn out to be wrong.
That we are vaccinating people against the idea that many theories they find absurd or repugnant at first will later turn out to be true, because nature doesn’t respect our feelings.
That we are vaccinating people against doubt.
And maybe this is partly good. It’s probably a good idea to trust your doctor and also a good idea to trust your climatologist, and rare is the field where I would feel comfortable challenging expert consensus completely.
But there’s also this problem of hundreds of different religions and political ideologies, and most people are born into ones that are at least somewhat wrong. That makes this capacity for real doubt – doubting something even though all your family and friends are telling you it’s obviously true and you must be an idiot to question it at all – a tremendously important skill. It’s especially important for the couple of rare individuals who will be in a position to cause a paradigm shift in a science by doubting one of its fundamental assumptions.
I try to keep these kinds of considerations in mind as a writer, both here and over at The American Conservative, and even when just posting links to Facebook. There are a lot of wrong things on the internet that aren’t worth covering. After all, people being wrong isn’t news. I mostly avoid covering things I find stupid or cruel unless:
- There’s a timebound political action item linked to the story (vote, donate, etc.)
- There’s some reason to believe my coverage would have a much larger marginal effect than the coverage that already exists (I’ve almost never felt this condition was satisfied)
- Tracing out the cause of someone’s error when it was comfortably far away from me caused me to notice a related failure mode in myself
These criteria capture the “Is it necessary” component of the True, Necessary, Kind trifecta, and I try to keep the other two in mind as well. Thinking of fights that fail these tests as outside my mission parameters means I have more time and energy to spend, if I choose, on errors that lurk closer to my heart or my community — ones that I actually have a chance of rooting out, rather than just railing against. I also find it means I experience less tsuris, since bad arguments don’t get to demand my time just by virtue of being bad or stridently expressed.
There’s another danger of only dealing with Otherized irrationality. When you pick fights on the internet, you tend to wind up essentializing the mistakes your enemies are making, since those mistakes are often the only facet of their personality you encounter. (I don’t know any Young Earth Creationists in person.) Or, to put it a little more classically, I can wind up turning the error into a Homeric epithet.
As a result, I wind up blinded to the personhood of the antagonist I am grappling with, and can wind up inoculated against noticing the errors of the full persons I encounter in my day-to-day life (myself included). It’s another reason I’m really grateful for my college debate experience, where twice a week I got practice fighting people with bad ideas, and then continuing the conversation over lunch the next day, or inviting them to a musical that I thought would explicate my critique. Arguing in good faith got integrated into the nature of a friendship, so it didn’t become odd or rude to express disagreement, and philosophical scrapping couldn’t become a habit confined to the internet, leaving me blinder to it in everyday life.
I’m looking forward to the upcoming debate & cookies night I’m having with DC friends over the firing of Brendan Eich, since it’s a chance for us to address a fault line in our values. Face to face, breaking bread together, we are forced to remember that the people holding these odd or unpleasant views aren’t stupid or malicious, and so it’s possible that a non-stupid, non-malicious person like me could still make mistakes that do enormous harm to others.
P.S. Christian readers, if you read through this post and thought of your least-favorite atheist blog, and then thought, “Ha, I knew they were terrible,” I think you should reread Scott’s post.