Why does “knowing disbelief” need to be explained?

Towards the end of his last post, Dan Kahan worries he’s missing something important about “knowing disbelievers”:

The “knowing disbeliever” I have in mind isn’t particularly agitated by any apparent conflict or contradiction in his or her states of belief about the science on climate change, and feels no particular compulsion to get in a fight with anyone about it. This individual just wants to be who he or she is and make use of what is collectively known to live and live well as a free and reasoning person. Not having a satisfying understanding of how this person thinks makes me anxious that I’m missing something very important.

I’ve been thinking a lot about this line. Rather than not having a good explanation, I wonder if the problem is that Kahan thinks such people need to be explained in the first place. But why should people be consistent? Why even have that expectation? As Kahan himself notes, even scientists sometimes exhibit cognitive dissonance.

Perhaps we should start from the premise that everyone is intellectually inconsistent at times. Knowing disbelievers should no more need a “satisfying understanding” than amazing basketball players who can’t shoot free throws. In sports we accept that athletic ability is complicated and can manifest itself in all sorts of unpredictable ways. No one feels the need to explain it because that’s just the way it is. Why don’t we do the same for intellectual ability?

If we did, we might then conduct research to account for the handful of people who are consistent all the time. Because that’s the behavior that needs explaining.

3 Comments

  1. I agree with your observation that everyone is intellectually inconsistent at times, and it would be those outlier freaks who are consistent all the time that would really merit studying. I don’t know about your sports analogy; I’ll have to think about it more. But I have two other thoughts about this.

    First, I think part of the reason people comfortably maintain these kinds of dualisms is the inherent difficulty (which most people are instinctively aware of) in “proving” any statement in any field beyond all possible doubt. If you wake up in the morning, look out your window, and see snow on the ground, you will infer that the most likely cause is that it snowed last night while you were sleeping. Then again, it might merely look like snow. Maybe fake snow from a movie set was blown over your neighborhood. Maybe a snow machine from a ski resort blew snow over your neighborhood. Maybe it was a miracle. Maybe extraterrestrials visited your neighborhood last night. If you really don’t want to accept that it snowed in an ordinary way last night, there’s always some (implausible!) alternative hypothesis that can be advanced, one that might be logically concluded if only you begin with a different set of assumptions. (Maybe there’s a ski resort nearby and the owner owes a favor to a family in the neighborhood.) But it takes a lot of work to exhaustively examine every possible alternative hypothesis and rule them all out absolutely. And it’s inefficient. Most people instinctively accept what seems most likely, until the most likely explanation is one they don’t like. Then they can always reject the most likely case by pointing to a less likely (but barely possible!) explanation, and they can justify it by observing that sometimes in the past explanations that once seemed less likely did turn out to be the better explanations in light of additional evidence.

    Second, I think peer conformity and acceptance of authority play a much, much greater role in the beliefs of almost anyone (even scientists) on almost anything (even scientific theories) than is often acknowledged. People will say that they think things through on their own and draw their own conclusions. But suppose you take a simple statement that really can be proven true or false in an absolute, logical sense because it is purely mathematical, such as Fermat’s Last Theorem. If you have a high-school-graduate grasp of math, are you going to carefully read Andrew Wiles’ paper on it and see whether you come to the same conclusion? Or are you simply going to trust the peer review process? Probably the latter. But what if there are smart, credentialed people on both sides of some very complex explanation for a set of observations, and both sides seem compelling? Particularly outside a field like math, where all the relevant assumptions are explicitly stated up front, the conclusions you draw depend on sets of assumptions that are not obvious or universally agreed upon, and you will probably “believe” one explanation or another based on a set of assumptions, or simply on a set of authorities. No one possesses the time or the skill set necessary to really believe everything they believe based purely on the best arguments and evidence. And we don’t seem to have a built-in gut feeling that such an approach is even necessary to arrive at a “truth” that we’re comfortable with.

    I just think it shouldn’t be that surprising to Professor Kahan that people can understand a position, understand the evidence and arguments in favor of it, and understand that there is a large group of credentialed authorities who favor that position, but still not personally be convinced. It’s very human.

  2. Hi Tim. Thanks for the thoughtful response. I really appreciate it. I’m sorry it took me so long to respond. Had a very busy work week.

    I really like your thoughts on peer conformity and acceptance of authority. I agree that everyone relies on it, and PhD scientists just as much. I’ve seen this a lot when it comes to global warming. Many of my physics PhD friends will acknowledge they don’t really understand the issue, but believe it because they see lots of other scientists agreeing with it.

    Now I don’t think there’s anything wrong with that approach. In fact, I think it’s mostly a good thing! There is just too much out there to learn and understand, and we have to outsource our thinking on complex issues.

    Anyway…thanks again for the response. I look forward to hearing your thoughts in the future.

  3. I would add that people are not just intellectually inconsistent. We’re inconsistent in general (especially with the alignment of “belief” and action).
