Michael Smithson
Michael Smithson is a Professor in the Psychology Department at The Australian National University. He has written 6 books, co-edited 3, and published more than 120 refereed articles and book chapters. His research interests focus on how people think about and respond to unknowns.


Posted in Science / Social Sciences / Psychology

Delusions: What and Why

Apr. 5, 2011 7:26 pm

Any blog whose theme is ignorance and uncertainty should get around to discussing delusions sooner or later. I am to give a lecture on the topic to third-year neuropsych students this week, so a post about it naturally follows. Delusions are said to be a concomitant, and indeed a product, of other cognitive or psychological pathologies, and traditionally research on delusions was conducted in clinical psychology and psychiatry. Recently, though, others have got in on the act: neuroscientists and philosophers.

The connection with neuroscience probably is obvious. Some kinds of delusion, as we’ll see, beg for a neurological explanation. But why have philosophers taken an interest? To get ourselves in the appropriately philosophical mood let’s begin by asking, what is a delusion?

Here’s the Diagnostic and Statistical Manual definition (2000):

“A false belief based on incorrect inference about external reality that is firmly sustained despite what almost everyone else believes and despite what constitutes incontrovertible and obvious proof or evidence to the contrary.”

But how does that differ from:

  1. A mere error in reasoning?
  2. Confirmation bias?
  3. Self-enhancement bias?

There’s a plethora of empirical research verifying that most of us, most of the time, are poor logicians and even worse statisticians. Likewise, there’s a substantial body of research documenting our tendency to pay more attention to and seek out information that confirms what we already believe, and to ignore or avoid contrary information. And then there’s the Lake Wobegon effect: the one where a large majority of us believe we’re better drivers than average, less racially biased than average, more intelligent than average, and so on. But somehow none of these cognitive peccadilloes seem to be “delusions” on the same level as believing that you’re Napoleon or that Barack Obama is secretly in love with you.

Delusions are more than violations of reasoning (in fact, they may involve no pathology in reasoning at all). Nor are they merely cases of biased perception or wishful thinking. There seems to be more to a psychotic delusion than any of these characteristics; otherwise all of us are deluded most of the time and the concept loses its clinical cutting-edge.

One approach to defining them is to say that they entail a failure to comply with “procedural norms” for belief formation, particularly those involving the weighing and assessment of evidence. Procedural norms aren’t the same as epistemic norms (for instance, most of us are not Humean skeptics, nor do we update our beliefs using Bayes’ Theorem or think in terms of subjective expected utility calculations, but that doesn’t mean we’re deluded). So the appeal to procedural norms excuses “normal” reasoning errors, as well as confirmation and self-enhancement biases. Procedural norms are more like widely held social norms. The DSM definition has a decidedly social constructionist aspect to it: a belief is delusional if everyone else disbelieves it and everyone else believes the evidence against it is incontrovertible.
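For readers unfamiliar with the epistemic norm mentioned above, here is a minimal sketch of what belief updating via Bayes’ Theorem looks like. The function name and all the numbers are made up purely for illustration; the point is only that the norm prescribes a specific arithmetic for revising a belief in light of evidence, which almost nobody actually performs.

```python
# Illustrative sketch of Bayesian belief updating (hypothetical numbers).
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Return P(H | E) from P(H), P(E | H), and P(E | not-H)."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start fairly sceptical of some hypothesis H...
belief = 0.2
# ...then observe evidence three times as likely if H is true than if it isn't.
belief = bayes_update(belief, p_e_given_h=0.9, p_e_given_not_h=0.3)
print(round(belief, 3))  # 0.429
```

Note that even this idealized updater never drives a belief to certainty in one step, which is one reason the procedural-norms approach treats ordinary non-Bayesian reasoning as merely imperfect rather than delusional.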

So, definitional difficulties remain (especially regarding religious beliefs or superstitions). In fact, there’s a website attempting to “crowd-source” definitions of delusions. The nub of the problem is that it is hard to define a concept such as delusion without sliding from descriptions of what “normal” people believe and how they form beliefs into prescriptions for what people should believe or how they should form beliefs. Once we start down the prescriptive track, we encounter the awkward fact that we don’t have an uncontestable account of what people ought to believe or how they should arrive at their beliefs.

One element common to many definitions of delusion is the lack of insight on the part of the deluded. They’re meta-ignorant: They don’t know that they’re mistaken. But this notion poses some difficult problems for the potential victim of a delusion. In what senses can a person rationally believe they are (or have been) deluded? Straightaway we can knock out the following: “My current belief in X is false.” If I know believing X is wrong, then clearly I don’t believe X. Similarly, I can’t validly claim that all my current beliefs are false, or that the way I form beliefs always produces false beliefs.

Here are some defensible examples of self-insight regarding one’s own possible delusions:

  1. I believe I have held false beliefs in the past.
  2. I believe I may hold false beliefs in the future.
  3. I believe that some of my current beliefs may be false (but I don’t know which ones).
  4. I believe that the way I form any belief is unreliable (but I don’t know when it fails).

As you can see, self-insight regarding delusions is like self-insight into your own meta-ignorance (the stuff you don’t know you don’t know). You can spot it in your past and hypothesize it for your future, but you won’t be able to self-diagnose it in the here-and-now.

On the other hand, meta-ignorance and delusional thinking are easy to spot in others. For observers, it may seem obvious that someone is generally deluded, in the sense that the way they form beliefs is unreliable. Usually such generalized delusional thinking is a component of some type of psychosis or severe brain trauma.

But what’s really difficult to explain are monothematic delusions. These are what they sound like, namely specific delusional beliefs that have a single theme. The explanatory problem arises because the monothematically deluded person may otherwise seem cognitively competent: they can function in the everyday world, they can reason, their memories are accurate, and they form beliefs we can agree with, except on one topic.

Could some monothematic delusions have a different basis from others?

Some theorists have distinguished telic (goal-directed) from thetic (truth-directed) delusions. Telic delusions (functional in the sense that they satisfy a goal) might be explained by a motivational basis. A combination of motivation and affective consequences (e.g., believing Q is distressing, therefore better to believe not-Q) could be a basis for delusional belief. An example is de Clerambault syndrome, the belief that someone of high social status is secretly in love with oneself.

Thetic delusions are somewhat more puzzling, but also quite interesting. Maher (1974, etc.) said long ago that delusions arise from normal responses to anomalous experiences. Take Capgras syndrome, the belief that one’s nearest & dearest have been replaced by lookalike impostors. A recent theory about Capgras begins with the idea that if face recognition depends on a specific cognitive module, then it is possible for that to be damaged without affecting other cognitive abilities. A two-route model of face recognition holds that there are two sub-modules:

  • A ventral visuo-semantic pathway for visual encoding and overt recognition, and
  • A dorsal visuo-affective pathway for covert autonomic recognition and affective response to familiar faces.

For prosopagnosia sufferers the ventral system has been damaged, whereas for Capgras sufferers the dorsal system has been damaged. So here seems to be the basis for the “anomalous” experience that gives rise to Capgras syndrome. But not everyone whose dorsal system is damaged ends up with Capgras syndrome. What else could be going on?
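The dissociation just described can be captured in a toy sketch. This is not a clinical model; the class and attribute names are my own illustrative assumptions, and each "lesion" is just a boolean flag. The point is that knocking out one sub-module leaves the other's output intact, which is exactly the double dissociation between prosopagnosia and the Capgras-prone profile.

```python
# Toy sketch of the two-route face-recognition model (illustrative names only).
from dataclasses import dataclass

@dataclass
class FaceRecognition:
    ventral_intact: bool = True  # visuo-semantic route: overt recognition
    dorsal_intact: bool = True   # visuo-affective route: covert autonomic response

    def see_familiar_face(self):
        overt = "recognized" if self.ventral_intact else "unrecognized"
        affect = "familiar feeling" if self.dorsal_intact else "no familiar feeling"
        return overt, affect

healthy = FaceRecognition()
prosopagnosia = FaceRecognition(ventral_intact=False)  # ventral damage
capgras_basis = FaceRecognition(dorsal_intact=False)   # dorsal damage

print(prosopagnosia.see_familiar_face())  # ('unrecognized', 'familiar feeling')
print(capgras_basis.see_familiar_face())  # ('recognized', 'no familiar feeling')
```

The second output is the "anomalous experience" of the one-factor account: the face is overtly recognized but arrives with no feeling of familiarity.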

Maher’s claim amounts to a one-factor theory about thetic delusions. The unusual experience (e.g., no longer feeling emotions when you see your nearest and dearest) becomes explained by the delusion (e.g., they’ve been replaced by impostors). A two-factor theory claims that reasoning also has to be defective (e.g., a tendency to leap to conclusions) or some motivational bias has to operate. Capgras or Cotard syndrome (the latter is a belief that one is dead) sounds like a reasoning pathology is involved, whereas de Clerambault syndrome or reverse Othello syndrome (deluded belief in the fidelity of one’s spouse) sounds like it’s propelled by a motivational bias.

What is the nature of the “second factor” in the Capgras delusion?

  1. Capgras patients are aware that their belief seems bizarre to others, but they are not persuaded by counter-arguments or evidence to the contrary.
  2. Davies et al. (2001) propose that, specifically, Capgras patients have lost the ability to refrain from believing that things are the way they appear to be. However, Capgras patients are not susceptible to visual illusions.
  3. McLaughlin (2009) posits that Capgras patients are susceptible to affective illusions, in the sense that a feeling of unfamiliarity leads straight to a belief in that unfamiliarity. But even if true, this account still doesn’t explain the persistence of that belief in the face of massive counter-evidence.

What about the patients who have a disconnection between their face recognition modules and their autonomic nervous systems but do not have Capgras? It turns out that the site of their damage differs from that of Capgras sufferers. But little is known about the differences between them in terms of phenomenology (e.g., whether loved ones also feel unfamiliar to the non-Capgras patients).

Where does all this leave us? To begin with, we are reminded that a label (“delusion”) doesn’t bring with it a unitary phenomenon. There may be distinct types of delusions with quite distinct etiologies. The human sciences are especially vulnerable to this pitfall, because humans have fairly effective commonsensical theories about human beings—folk psychology and folk sociology—from which the human sciences borrow heavily. We’re far less likely to be (mis)guided by common sense when theorizing about things like mitochondria or mesons.

Second, there is a clear need for continued cross-disciplinary collaboration in studying delusions, particularly between cognitive and personality psychologists, neuroscientists, and philosophers of mind. “Delusion” and “self-deception” pose definitional and conceptual difficulties that rival anything in the psychological lexicon. The identification of specific neural structures implicated in particular delusions is crucial to understanding and treating them. The interaction between particular kinds of neurological trauma and other psychological traits or dispositions appears to be a key but is at present only poorly understood.

Last, but not least, this gives research on belief formation and reasoning a cutting edge, placing it at the clinical neuroscientific frontier. There may be something to the old commonsense notion that logic and madness are closely related. By the way, for an accessible and entertaining treatment of this theme in the history of mathematics, take a look at LogiComix.

Mike Sutton
April 6, 2011 at 8:42 am
Black Swans?

Another most interesting blog post.

With regard to the Diagnostic and Statistical Manual definition (2000) that you provide, I would just like to comment that it does not seem to accommodate Popperian philosophy regarding what we can't actually know. And so this leads me to ask this question:

Is it so that anyone holding onto the idea that the world and all the life forms we have in it today were created in seven days - in spite of what the fossil record shows and evolutionary evidence explains - is delusional once they have been shown the evidence from the fossil record and the Darwinian explanations for it? Yet 200 years ago anyone believing the world was built in seven days would not have been delusional, despite the fact that there was zero evidence for such a notion even then?

Thinker's Post
Michael Smithson
April 7, 2011 at 7:02 pm

Your Popperian observation makes a good point. Anyone who thinks "outside the box" would stand accused of delusions according to the DSM definition, and such an accusation would be nigh impossible to falsify.

In fact, a strict Popperian would say anyone who believes anything is deluded: all theories are to be held provisionally. The Popperian would say the notion that the world was created in 7 days has been disconfirmed, whereas Darwinian evolutionary theory has not yet been disconfirmed. Going back 200 years, the Popperian would say the theory that the world was created in 7 days had not yet been disconfirmed, but it also hadn't been put to a test.
