A practical guide for avoiding conspiratorial thinking

Marko Kovic

We are all prone to conspiratorial thinking. Luckily, we can do something about it.

Note: This article was first published on Medium.com.

When we talk about conspiracy theories, we are often accusing others of being irrational — spotting irrationality (poorly justified beliefs) in others is easy. But the simple truth is that all of us can fall prey to conspiratorial thinking, whether we like to admit it or not. And that’s ok: Believing in conspiracy theories is not a pathology of the mind, but rather a consequence of the cognitive limitations of our mind.

In order to do something about the proliferation of conspiracy theories, then, we should start with ourselves: We have to take a step back and engage in what is sometimes referred to as metacognition. Metacognition simply means thinking about how we think. When we think about how we think, we change our thinking mode from a fast, automated mode to a slow, more deliberate one. Such slow thinking is necessary in order to avoid conspiratorial thinking.

That might sound like an abstract idea, but it is actually rather easy. All you need to do is follow a few simple steps.

1. Ignore your gut feeling

We’re often told that we need to listen to our heart, our gut feeling, our intuition, something like that. This is often good advice. In matters of the heart, for example, listening to your heart is probably a good idea.

When it comes to thinking about the objective world, however, our gut feeling can be very bad counsel. The way we think about the world is riddled with errors, so-called cognitive biases as well as logical fallacies. If we listen to our gut, we actually exacerbate the errors in our reasoning: Our subjective feelings and intuitions are the very consequence of those errors.

Sometimes, we don’t outright declare that we are trusting our gut feeling, but instead, we refer to something like “common sense”. Forget about common sense, too. When we talk about common sense, we are usually simply talking about a gut feeling that other people also have. Multiplying gut feelings does nothing to make them more reliable.

2. Make your beliefs explicit

We humans don’t have perfect access to reality and to objective facts. We basically operate under the assumption that there is a reality, and that it is possible to have some idea of what reality looks like. These ideas in our heads of what reality looks like are sometimes called beliefs.

When we are trying to figure out, for example, why some event in the world happened, we are implicitly working with a number of beliefs about that event. This happens automatically, and we are not very used to really thinking about the beliefs that we are holding in a given situation.

For example, let’s say you are about to roll a die. You are hoping to roll a six, and you do actually end up with a six. What would you believe happened here? You might, for example, believe that you were lucky. Or you might believe that the die you just rolled is manipulated. Or you might believe that you have telekinetic powers and made the die roll a six.

3. Quantify your beliefs with probabilities

This part is really hard.

In everyday life, we are used to thinking in deterministic terms: Yes or no, black or white, true or false. Something either is or isn’t. In many situations, that makes sense. But in many, and arguably many more, situations, thinking in black-or-white patterns means that we are expressing a very strong belief; much stronger, perhaps, than we would really like to express.

Let’s go back to the die rolling example. Imagine that, in the past, your friends have pranked you with manipulated trick dice. In such a scenario, you might think that the die you have just rolled is also manipulated. However, you are not completely sure about that: your belief is not absolutely certain; rather, it contains some amount of uncertainty.

Uncertainty is a basic property of our ideas about reality. At best, we can be almost certain about something, but there are few situations in which we are really, absolutely certain. Since uncertainty is so omnipresent, we should express it. We can do so using probabilities.

A probability, a number between 0 and 1 (or 0% and 100%), can be used to quantify the strength of our beliefs. If we return to our die rolling example, you might say that the probability that the die is manipulated is 50%. In the given context, this is a more rational belief than the black-or-white belief that the die simply is or is not manipulated.
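
To make this concrete, here is a minimal sketch (in Python, with purely illustrative numbers) of what explicitly stated, quantified beliefs about the die example might look like:

```python
# Purely illustrative: three explicit beliefs about the six we rolled,
# each with a probability expressing how strongly we hold it. The
# numbers are made up for the example; the one real constraint is that
# the probabilities of mutually exclusive, exhaustive beliefs sum to 1.
beliefs = {
    "the die is fair and I was lucky": 0.49,
    "the die is manipulated": 0.50,
    "I have telekinetic powers": 0.01,
}

assert abs(sum(beliefs.values()) - 1.0) < 1e-9

for hypothesis, probability in beliefs.items():
    print(f"{probability:.0%}  {hypothesis}")
```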

4. Justify your beliefs

By now, you have made your beliefs explicit, and you have quantified them. Now, you are naturally asking yourself a simple, but crucial question: Do I have good reasons to believe what I believe?

In philosophical jargon, this question and, of course, the answer to it is sometimes referred to as the justification of beliefs. Justifying beliefs simply means providing reasons for holding the beliefs that we hold, and for holding them as strongly (or weakly) as we do. In the die rolling example, we have expressed the belief that the die is manipulated with a probability of 50%. The justification for that belief is our prior experience with manipulated dice (our friends have pranked us before). In this example, the justification at hand is fairly good. It might not be very precise, but, given our prior information, it is probably somewhat accurate.

The concept of belief justification is in and of itself a very complex one. Generally, in order to justify a belief, we should try to stick to logically coherent arguments and, if available, empirical evidence.
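
To sketch how empirical evidence can ground such a number (the figures here are invented purely for illustration): the 50% belief could, for instance, come from a simple frequency estimate over past experience.

```python
# Invented numbers, for illustration only: suppose friends have handed
# us dice 8 times before, and 4 of those turned out to be trick dice.
# A simple frequency estimate then grounds the 50% belief.
past_dice_from_friends = 8
trick_dice_among_them = 4

prior_manipulated = trick_dice_among_them / past_dice_from_friends
print(f"Justified belief that this die is manipulated: {prior_manipulated:.0%}")
```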

5. Evaluate the justification for your beliefs

This step is simply an add-on to the preceding one. Now that you have justified your beliefs, it’s time to take another step back and ask yourself: How good is my justification?

Coming up with a justification for our beliefs is not that difficult. Even though we know, vaguely, that we should care about logic and evidence in our justifications, we are likely to suffer from the so-called confirmation bias in any given ad hoc justification. The confirmation bias is a universal human tendency to seek out information that, on the face of it, supports our beliefs. The confirmation bias is really strong, and that is why we need to stop and think about how well we have justified our beliefs.

6. Be charitable

When we are confronted with arguments and evidence that challenge our beliefs, we tend to react dismissively: What a bunch of idiots! They must be borderline crazy to say things that are incompatible with my beliefs!

Such a reaction is very human, but it is also very irrational. Not because what others are saying is always a sound argument. Rather, it’s about probability, again: 99 out of 100 people who oppose your view might justify their beliefs badly. But perhaps 1 person out of 100 will oppose your belief, and do so in a justified manner. Because you are trying to be reasonable yourself, you would not want to miss that one potentially relevant piece of information.

This is where the so-called principle of charity should kick in. The principle of charity is a philosophical argument about language: Even if what some person A is saying sounds ridiculous and ludicrous to you, you cannot rationally discount the possibility that it sounds ridiculous to you not because of the actual quality of A’s argument, but because of a problem in conveying what A intends to say to your mind.

In a less abstract way, you can understand the principle of charity in the following manner: When person A says something, try to interpret what A is saying as if A were saying it as rationally as possible.

The principle of charity is a big deal, because often, what we might think is ridiculous and nonsensical is not necessarily ridiculous and nonsensical — because of our cognitive errors, such as the confirmation bias, we are simply prone to interpreting arguments that conflict with our beliefs in an uncharitable manner.

7. Be ready to change your mind

The last point is really just the icing on the cake: If you have arrived at beliefs following the previous steps, you will almost naturally be willing to update your beliefs in the face of new arguments and evidence.

In reality, of course, things are a little bit more difficult: We like being right, and we don’t really like admitting to ourselves that our beliefs need to be updated. This is especially true when we have invested a lot in our beliefs: If you spend a lot of time reinforcing your belief, and if, perhaps, other parts of your life are contingent on that belief, then it’s really tough updating that belief.
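
For readers who want to see what “updating” looks like formally: belief strengths expressed as probabilities can be updated with Bayes’ rule. The sketch below updates the 50% “manipulated die” belief after observing the hoped-for six, under the made-up assumption that a manipulated die always shows a six while a fair die shows one with probability 1/6.

```python
# Illustrative Bayesian update of the "manipulated die" belief.
# Assumptions (invented for the example): a manipulated die always
# rolls a six; a fair die rolls a six with probability 1/6.
prior_manipulated = 0.50      # belief before seeing the roll
p_six_if_manipulated = 1.0    # likelihood of a six, if manipulated
p_six_if_fair = 1 / 6         # likelihood of a six, if fair

# Total probability of seeing a six under either hypothesis.
p_six = (prior_manipulated * p_six_if_manipulated
         + (1 - prior_manipulated) * p_six_if_fair)

# Bayes' rule: posterior = prior * likelihood / evidence.
posterior = prior_manipulated * p_six_if_manipulated / p_six
print(f"Belief after one hoped-for six: {posterior:.0%}")  # roughly 86%
```

The arithmetic itself is not the point; the direction is: evidence that is more probable under one belief than under its rivals should strengthen that belief, nothing more and nothing less.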

That’s it!

If you roughly follow the seven steps outlined above, the probability that you will engage in conspiratorial thinking is likely to decline. And there is no “debunking” and stuff like that going on — it’s really not about telling you what to believe and what not to believe. The whole metacognitive exercise really is only about thinking about our thinking.

P.S. Tricked you! This isn’t *only* about conspiracy theories

In the title, I describe this text as a practical guide for avoiding conspiratorial thinking. That’s sort of true; metacognition as described here really will help you achieve that goal.

But, actually, that is only part of what this guide is useful for. If you try to think about your thinking in a way that is similar to what I describe here, you are likely to be more rational in general, regardless of the specific context.
