How fortuitous that the same day I asked people to trust me and do the reading, Yvain (whom you may remember from the Natural Law post that launched a thousand ships) has written a post examining the usefulness of the Courtier’s Reply. You should read the whole thing, but I’m going to cut and paste a blockquote:
In other words, any version of the Courtier’s Reply strong enough to shut down people who want you to spend the rest of your life reading about reptilian British monarchs is also strong enough to shut down people who are correct and merely want you to have some idea what you’re talking about before bloviating against them.
…The naive answer, that I should read only books on subjects where I assign a high chance I might be wrong, doesn’t really cut it. The whole problem is that the absurdity heuristic doesn’t work that well, and the unreliability of saying “There’s no way I could be wrong on this” just based on my personal judgment is exactly the problem at issue. But total failure to make any judgment at all leads to me reading books about lizard people (which actually sounds kind of fun, now that I think about it).
Right now the best solution I’ve got is to read books in areas where my opinion differs from the opinion of a bunch of other people whom I consider smart and rational. This suggests the preacher should read more books about Darwinism, since all those Nobel Prize winners and biology Ph.Ds believe it. The atheist should read more books about religion, since many people whom one would otherwise judge as smart and rational believe that too. One seems on relatively safe ground rejecting ID, although maybe one should read a book or two just to be sure. And there doesn’t seem to be any point in reading about the Lizard People, except as previously mentioned that it would be hilarious.
I really like Yvain’s post (read the whole thing!) and I wanted to add a couple more proposals for classes of people whose Courtier’s Replies you should take seriously. (I think his proposal is also relevant to finding the right contrarian cluster: if someone is right about something most humans, and even most smart humans, are wrong about, treat their outré opinions a little more seriously.) Here are some other triggers I can come up with.
Even though their model is different, their predictions look a lot like yours.
So you disagree on whatever question one of you wants to Courtier’s Reply on, but both your systems of belief actually output the same results. Sure, it’s possible that your interlocutor has an invisible dragon in their epistemological garage, but it’s also possible that you do. Whether or not you take their reading recommendations, it’s probably worth your time to see whether your expectations about the world flow pretty directly from your model, or whether you’ve tweaked and kludged it past usefulness in order to overfit the data you’re sure of. Reading through your opponent’s material might help during this process, since you want to subject your beliefs to a test that is actually threatening to false beliefs. So don’t feel relieved if the other side passes the test too, and don’t feel triumphant about finding a black mark against the other side until you’ve checked whether your own proposition escapes untainted.
It’s hard to figure out how to test which of two similar models is superior. If you’re doing it based on elegance, you’re liable to get tripped up by changing aesthetic standards. It’s best to find places where they do differ in their predictions, and you’ll have an easier time doing that if you understand what your opponent’s position looks like from the inside. Think about how you would convince someone to abandon Ptolemaic astronomy for Newtonian physics (and how someone would get you to give that up in favor of relativity).
You’re in love with them.

I don’t mean this in a fuzzy warm feelings way (though replying to your partner’s book recommendations with “I’m not going to fall for the Courtier’s Reply!” is probably a terrible idea in any relationship). Remember that my model for romantic love is finding someone who makes it easiest for you to be the person you ought to be and vice versa, and my model for marriage is putting constraints on your future self: thinking that if you’ve changed enough that you no longer mesh well with your beloved, you’ve gone off track.
So, somehow, this person brings out the best in you, but is wrong about something important. That’s not a contradiction; something can induce a desired change without being true (a singer knows the mouth is not actually a cathedral, but thinking that helps you make your mouth do whatever hard-to-put-into-words thing it’s doing). But this does seem like a specific instance of the case above: something in their model seems to work really well, so it’s interesting to see how the specific false belief manages to be so right or to coexist with so many true ones. Plus, as in the case above, it’s a little more likely that you’re wrong than in the generic Courtier’s Reply case, so spend a little effort double-checking.
You both respond strongly to certain uncommon aesthetics.

This is less a strong signal that you might be mixed up about which of you has made an obvious error, and more just a hint that you might find it interesting to check how your interlocutor’s beliefs work. I usually have more interesting arguments after reading a novel than after reading non-fiction, because people pull out weirder and more revealing analogies.