# Conceding a Point is Not a Slippery Slope

I’m having a delightful time at Rationality Camp, and I have two LessWrong posts to recommend today: “Absence of Evidence is Evidence of Absence” and “Conservation of Expected Evidence.”  (Read these first; I’m gesturing at them below, not summarizing them.)  They start with this quote from Robyn Dawes’s Rational Choice in an Uncertain World:

Post-hoc fitting of evidence to hypothesis was involved in a most grievous chapter in United States history: the internment of Japanese-Americans at the beginning of the Second World War. When California governor Earl Warren testified before a congressional hearing in San Francisco on February 21, 1942, a questioner pointed out that there had been no sabotage or any other type of espionage by the Japanese-Americans up to that time. Warren responded, “I take the view that this lack [of subversive activity] is the most ominous sign in our whole situation. It convinces me more than perhaps any other factor that the sabotage we are to get, the Fifth Column activities are to get, are timed just like Pearl Harbor was timed… I believe we are just being lulled into a false sense of security.”

Acts of sabotage would definitely have made Warren think it was more likely that there was a Fifth Column.  If he also claims that their absence causes him to revise his estimate upward, he should already have a higher estimate.  Imagine you thought there was a 50% chance a Fifth Column existed.  If there was an act of sabotage in the next month, you’d revise your guess to 90%.  If there was no sabotage, you’d change your guess to 60%.

In that case, the craziest thing you’re doing is claiming you only thought there was a 50% probability.  No matter what happens during the month, you’ll end up thinking it’s at least 60% likely, so you have no excuse for not thinking it’s at least 60% likely now.  In fact, your guess should be even higher: if you started at 60%, you’d know you could only revise upward during the month, and you should keep guessing higher until you reached a point where you weren’t sure which direction would make you more accurate.
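The arithmetic here can be checked mechanically.  Below is a minimal sketch of the argument (the 0.5/0.9/0.6 figures are the hypothetical numbers from the example above; the loop over q just samples possible probabilities of seeing sabotage this month):

```python
# Conservation of expected evidence: a coherent prior must equal the
# probability-weighted average of the posteriors you plan to adopt.
# The Warren-style reasoner claims a 0.5 prior, but plans to update
# to 0.9 on sabotage and 0.6 on a quiet month -- both above 0.5.

prior_claimed = 0.5
post_if_sabotage = 0.9   # posterior after an act of sabotage
post_if_quiet = 0.6      # posterior after a month with no sabotage

# Whatever probability q you assign to seeing sabotage this month,
# your expected posterior is a mixture of the two outcomes:
for q in [0.0, 0.25, 0.5, 0.75, 1.0]:
    expected_posterior = q * post_if_sabotage + (1 - q) * post_if_quiet
    # The mixture always lands between 0.6 and 0.9 ...
    assert post_if_quiet <= expected_posterior <= post_if_sabotage
    # ... so it can never equal the claimed 0.5 prior.
    assert expected_posterior > prior_claimed
```

Since the expected posterior can never fall below 0.6, anyone who honestly plans those two updates has already conceded a prior of at least 60%.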

This is a pretty easy example.  It’s a good way to point to how absurd this kind of reasoning is, but the example doesn’t feel very threatening to most of us.  It’s hard to identify with Warren, not only because he’s saying something dumb, but because he sounds like he’s probably racist.  That makes him a bad match for our own idea of ourselves.  In a later exercise, there was a better example for catching rationality camp-types out in the same kind of error.  We got a set of prompts to discuss and evaluate:

At a party, you meet a stranger who believes in astrology.  After talking to you for long enough to build up a sense of your personality, they offer to guess your birth month and year.  You agree to the test, and the astrologer guesses correctly.  You reply that you still don’t believe in astrology because you’ve read a lot of studies showing that people can’t pick out their own newspaper horoscopes; plus, maybe he secretly looked you up on the internet beforehand.

What I said during the discussion, based on the LessWrong posts above, is that none of those caveats change the fact that you now think astrology is marginally more likely to be true than before he guessed right.  If failing to guess right would be evidence against his theory, guessing correctly has to be conceded as evidence supporting him.

Admitting that the other side scored a point isn’t the same thing as saying that I’ve changed my mind or even that the marginal change in my beliefs was enough to get me to think more about astrology or spend time looking for new evidence.  I haven’t lost anything by conceding, and it doesn’t mean that I’m more likely to find the next piece of paranormal evidence compelling.

Sometimes, being honest means updating in the ‘wrong’ direction.  Although, in the long term, you should expect your beliefs to drift toward the correct answer, there’s no reason to expect that you approach that answer monotonically (always moving in the same direction).

We can have better arguments if both players understand this fact, so no one treats it as a humiliating defeat if you concede that a piece of evidence meshes better with your opponent’s position than your own.  That’s to be expected sometimes, no matter who is right.  Concede the point and then explain why it’s not enough evidence to make you drift across a critical value where your beliefs and actions change.  There’s nothing wrong with saying, “That’s not the kind of proof I’d need.  It favors your side, but my prior expectation that you’re wrong (for these reasons…) is strong enough that it doesn’t change my expectations very much.”

Talking this way lowers the stakes of each new piece of evidence, so we can consider them reasonably instead of treating any revision of our estimates as a coup de grace.

• Very important ability in arguing. I also find that if I’m arguing with someone who’s not used to the Less Wrong style, willingly and happily conceding where I’m wrong or where the evidence favors them totally disarms them and makes them less likely to see what we’re doing as a war where all my arguments are enemy soldiers and all that. It makes it more collaborative, more searching for truth together, more fun.

The only interesting point I think you missed here is that if our probability estimate of something is, say, 70%, we should *expect* there to be evidence against us. If we didn’t expect any evidence against us, our probability estimate should be almost 1. So arguments against you may make *you* more right as regards your calibration, even if not in regard to your current position, but I think that’s more important anyway.

• Have you read much Polanyi?

• Joe

Glad to hear you’re having fun!! Just out of curiosity, are you the only religious person there, or is the camp largely atheist?

• grok87

+1

• Oregon Catholic

I like this way of evaluating a position. Thanks for posting on it.
Of course it goes without saying, but I’ll say it anyway, that it only works if a person is truly looking for truth and not just validation of an existing opinion. That’s actually pretty rare among atheists and Christians alike on these blogs.

• Interesting, though atheism isn’t out to actively encourage magical thinking.

• deiseach

I’d like to dedicate this instalment of the estimable Dr. Boli’s magnum opus, his “Complete History of the World”, to you, Zack: “Chapter 10 – Christianity Ruins Everything”.

🙂

(I would also like to recommend earlier chapters of this online masterpiece to discerning readers).

• Oregon Catholic

Au contraire. Most of you think the universe just happened by accident. No cause, no purpose. That’s pretty magical in my book, not to mention laughable and illogical too.

• grok87

agree

• anodognosic

Magic, as I understand it here, means seeing purpose where there is none. Purpose can only be obviously and clearly discerned in human affairs–in man-made objects, institutions, etc. In nature, we see the appearance of purpose in biology, but that appearance of purpose is quite sufficiently explained by evolution (which is why evolution is rightly a topic of much interest among atheists). In the universe at large, we see order, which is not the same thing as purpose. There is no discernible purpose to the vastness of the universe outside the human sphere. As for the order, it is an emergent property of simpler rules, which is fairly clearly demonstrated by modern physics.

Now, when we get to metaphysical questions, I agree that things get a lot sketchier. There are definite questions about why the universe is the way it is. We have precious little information to guide us on that count, and I, along with most atheists, admit justified ignorance on that point. But the conspicuous absence of clear, discernible purpose beyond our little planet seems to make purpose a rounding error in the universe, which, I feel, hardly justifies ascribing purposefulness to the entire thing.

In the spirit of concession, I admit I may be wrong! The universe may very well have a purpose, and I think it’s incumbent on skeptics to be open to explanations. But then, God is only one possibility among many that would account for a purposeful universe. (see: privileging the hypothesis)

• Ted Seeber

Where did you get your axiomatic definition of magic?

It doesn’t fit mine at all. To me, magic is just an explanation that is incorrect.

• Most of you think the universe just happened by accident. No cause, no purpose. That’s pretty magical in my book, not to mention laughable and illogical too.

“Magic” or “God” are not the only two available options (in fact, those are only two faces of the same coin; both say “It came out of nothing and nowhere, all at once!”). Evolution — a gradual process governed by discernible, natural processes — is far from accidental, but that doesn’t mean that there was/is a supernatural intelligence guiding it, either.

• Oregon Catholic

Evolution doesn’t explain the origin of the universe. Evolution is compatible with Catholicism. It requires magical thinking to say the universe began by accident with no uncaused cause.

• Compatible up to the point where evolution, and the genetics that goes with it, shows there was no Garden of Eden, from which you get the Doctrine of Original Sin.

• Ted Seeber

Since when? Atheism is out to encourage the most magical thinking of all: That all events are observable.

• No, we get evidence on stuff. And “Uncaused cause”, it’s your way of rationalising your way out of the “Who Created God?” argument. And it has to be your God, not Quetzalcoatal. On the basis of just because you say so…
Atheism isn’t saying all events are observable. Maybe you have misinterpreted how science is applied; principles are considered consistent over great distances, and that’s different from what you said. And what you said, funnily enough, suggests you find it laughable to suggest that “all events can be observed”. Well, where does that leave your all-seeing, all-knowing, omnipresent God?

• In the astrology example you listed, the problem appears to be here: “You agree to the test”. Why (putting aside social reasons to humor someone) would one do this, assuming they (correctly) identified it as a faulty test? Certainly if you made the mistake of thinking that it was a useful test, you would be bound to admit that passing it increased the probability of astrology being correct — but it’s not a useful test, and you have no such obligation! (I can expand on why the test is faulty if requested.)

• leahlibresco

When you say it’s a faulty test, do you mean he is equally likely to pass in the world where astrology is true as the world where it is false?

• Doragoon

It’s a good example of what’s wrong with science lately. Experiments designed so that they either provide confirmation of a theory or a null result are not good science.

• Ted Seeber

Yep, perfectly spherical apples don’t exist.

• jose

“If failing to guess right would be evidence against his theory, guessing correctly has to be conceded as evidence supporting him.”

imho… no it hasn’t.
A = I can guess your birthday. B = When I make the guess, I’ll be right. Clearly A ⇒ B.
That doesn’t mean B ⇒ A. We all learn this in high school. Guessing correctly doesn’t have to be conceded as anything. What it does mean, however, is ¬B ⇒ ¬A. So I’m justified in saying that if he gets it wrong, that counts as evidence against him; but if he gets it right, it doesn’t mean anything.

But jose, we’re talking about scientific evidence, not formal logic. All truths in science are inductive and provisional, subject to revision against new evidence, so they’re pretty much like our foreteller. If confirmed claims count for science, why shouldn’t they count for foretelling? The reason is that in this case the foreteller is being both judge and party. He is the one proposing that he can see the future, but only he is able to judge whether he really can see it or whether he’s doing a trick.

In order for the foreteller’s successes to count as scientific evidence, he should explain how he does it — in other words, the analogue of a “Methods and materials” section like we see in all experimental papers. That way the claim is independent from the proponent, so we can rule out the possibility that he’s fooling everybody, because now everybody can judge the claim independently.

What I mean generally in my comment regarding the main topic of the post is that a point should only be conceded when it’s valid. All points that you concede should count, and they all should change expectations. To me, doing otherwise looks too much like “ok, you’re right, but I don’t care”. Not caring about contrary evidence and continuing to believe despite points you acknowledge as good has a name.

• leahlibresco

Jose, did you check out the linked posts? If you’d count an incorrect guess as evidence against his claim, you need to count a correct guess as evidence for it. Otherwise, you’ve identified a piece of data that can only move your beliefs in one direction (here, negatively). That means you’re saying that if he’s wrong, maybe you decrease your estimate that he’s right by 1%, and if he’s right, you don’t change your guess. That means your expected value of new evidence is negative, and you should revise your beliefs downward whether or not you perform the test.

• jose

Hi,
I responded to the first point: No, we need not, for the reason I commented.

Indeed, in this case every piece of evidence can only move us in one direction, because as long as he keeps being judge and party, the foreteller can’t produce valid evidence. It doesn’t matter how many times he guesses correctly, because if he’s doing a magic trick, he can do it every time all the same. Additional successes don’t add any new information, so they don’t give weight to his claim. I also commented on what he can do to change the situation – the old adage “the exception proves the rule” comes into play: change the parameters of the experiment so the claim is independent from its proponent.

Likewise, if you were only able to look at Phoebe, and Monica kept turning the TV on and off, what conclusion would you draw? Is each new blink additional evidence in favor of Phoebe? Would you concede that she’s got a point? She clearly doesn’t. This is another case where every possible piece of evidence can only move you in one direction unless you change the setup.

Also I don’t know where you’re getting those percentages from, I’d rather not use them if you don’t mind.

• I think the root disagreement is earlier than you both recognize.

Leah is arguing in a Bayesian model, where the reasoner’s beliefs all have probabilities and reasoning basically consists in applying The Rev. T. Bayes’s theorem. Even though they know we mostly don’t work that way, the Less Wrong folks assume that is the ideal.

Jose probably (ha ha) agrees with me that assigning cardinal probabilities to everyday beliefs doesn’t make sense.

I’ll give a model that is still wrong, but closer to how I think rationality actually works: outside of small islands of established frequencies, beliefs don’t have probabilities, but they are ranked in an ordinal hierarchy of sureness. The way to overturn a belief is to prove its converse from beliefs higher up on the trust chain. The position on the trust chain only determines which belief gets thrown out in the event of a contradiction, but the actual derivations go according to propositional logic, not Bayes.

Looking at it like this, the single birthday guess is not evidence, but together with many other guesses and proof there is no cheating it might be. So basically on the Bayesian view every one of the many facts I would need to be convinced is a weak piece of evidence and they are collectively strong evidence. While under the normal and correct rule (what the Less Wrong folks call “classical rationalism”) the facts are evidence only collectively but worthless individually. If you want to turn it into a slogan, there is no such thing as weak evidence.

Now I can agree to keep the successful guess around as a part of a potential demonstration yet to be completed. But right now, that doesn’t have any effect on my belief’s position in the trust chain, which means it hasn’t changed my confidence at all.

Again, this is a cartoon model, while the Bayesian one is meant to be serious, but I think it makes the difference clear.

And here the point is, you are both totally right in your respective models of rationality (assuming I correctly guessed Jose’s) but the disagreement actually goes deeper.

• If I believe him guessing my birth month is evidence for his ability to guess a birth month rather than for the truth value of astrology, then I don’t run into a problem. There could be any number of environmental variations – season of the year, corresponding biological changes in ourselves contingent on this season – that could account for month-to-month personality types without astrology necessarily being the answer. Simply because someone has an accurate intuition doesn’t make him right about another thing unrelated to the exercise at hand – namely, what is my birth month.

• Josh

I wonder whether part of the hesitation here is the belief that the set of all true propositions should in essence be an interconnected set (or network or web, whatever image works), and that no fact (properly understood and contextualized) should ever count as evidence against one’s own position because that would create a somewhat incoherent set. The problem, though, seems to be that our knowledge is finite, and so we’re forced to make probabilistic judgments based on a very limited set of information. It can be very frustrating, but it doesn’t seem like there’s any way around it.

• Josh

Sorry, that was a tad garbled. We don’t like to concede that a fact is evidence for a position we disagree with, because our beliefs about truth make it uncomfortable for us to think that there could be genuine evidence for positions incompatible with our own.

• A Philosopher

Perhaps violations of van Fraassen’s Reflection Principle are just more common than we might have thought…

• suburbanbanshee

The real point here is historical. A fair number of German-American Bund members, or just plain Germans resident or infiltrated into the US, were conducting sabotage on the East Coast and elsewhere. They did a lot of damage. The one Japanese pilot who got onto the little island and persuaded two Japanese-Americans to spree-kill everybody else on the island — they did a lot of damage.

So Earl Warren reasonably expected to have more Japanese sabotage teams out there just like the Germans had; but for whatever reasons, either they weren’t there or they didn’t do their thing.

Of course, it’s also possible that time travelers smuggled him a lot of DVDs of ninjas. (Or that he just believed the Mr. Moto novels about how savvy Japanese secret agents were.)

• suburbanbanshee

(Insert Russo-Japanese War and Manchuria results here, on top of Pearl Harbor.)