Delusional beliefs are typically pathological. Being pathological, however, is not the same as being false or being irrational. A woman might falsely believe that Istanbul is the capital of Turkey, but this might just be a simple mistake. A man might believe, without good evidence, that he is smarter than his colleagues, but this might just be a healthy self-deceptive belief. On the other hand, when a patient with brain damage caused by a car accident believes that his father was replaced by an impostor, or when another patient with schizophrenia believes that ‘The Organization’ painted the doors of the houses on a street as a message to him, these beliefs are not merely false or irrational. They are pathological.
What makes delusional beliefs pathological? One might think, for example, that delusions are pathological because of their extreme irrationality. The problem with this view, however, is that it is not obvious that delusional beliefs are extremely irrational. Maher (1974), for example, argues that delusions are reasonable explanations of abnormal experience.
“[T]he explanations (i.e. the delusions) of the patient are derived by cognitive activity that is essentially indistinguishable from that employed by non-patients, by scientists, and by people generally. The structural coherence and internal consistency of the explanation will be a reflection of the intelligence of the individual patient.” (Maher 1974, 103)
Similarly, Coltheart and colleagues (2010) argue that it is rational, from the Bayesian point of view, for a person with the Capgras delusion to adopt the delusional hypothesis given his neuropsychological deficits. Bayes’s theorem prescribes a mathematical procedure for updating the probability of a hypothesis on the basis of prior beliefs and new observations. Coltheart and colleagues claim that the delusional hypotheses receive higher probabilities than competing non-delusional hypotheses, given the relevant prior beliefs and the observation of the neuropsychological deficits.
“The delusional hypothesis provides a much more convincing explanation of the highly unusual data than the nondelusional hypothesis; and this fact swamps the general implausibility of the delusional hypothesis. So if the subject with Capgras delusion unconsciously reasons in this way, he has up to this point committed no mistake of rationality on the Bayesian model.” (Coltheart, Menzies, & Sutton 2010, 278)
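The structure of this Bayesian argument can be sketched numerically. All of the probabilities below are invented for illustration (they do not come from Coltheart and colleagues’ paper); the point is only that a sufficiently extreme likelihood ratio can outweigh a very unfavourable prior.

```python
# Toy Bayesian comparison of two hypotheses about the Capgras case.
# All numbers are illustrative assumptions, not values from the paper.

def posterior_odds(prior_h, prior_alt, likelihood_h, likelihood_alt):
    """Posterior odds of H over the alternative, via Bayes' theorem:
    posterior odds = prior odds * likelihood ratio."""
    return (prior_h / prior_alt) * (likelihood_h / likelihood_alt)

# H: "this person is an impostor" (very low prior plausibility)
# A: "this person is my father"   (very high prior plausibility)
prior_impostor, prior_father = 0.001, 0.999

# The highly unusual datum: a familiar face producing no affective
# response. Assumed to be far more probable under the impostor hypothesis.
p_data_given_impostor = 0.9
p_data_given_father = 0.0001

odds = posterior_odds(prior_impostor, prior_father,
                      p_data_given_impostor, p_data_given_father)
print(odds > 1)  # the likelihood ratio swamps the implausible prior
```

With these (assumed) numbers the posterior odds favour the delusional hypothesis by roughly nine to one, despite a prior of one in a thousand, which is the sense in which the unusual data “swamp” the prior implausibility.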
The claim by Coltheart and colleagues is controversial, however. In response, McKay (2012) argues that the adoption of delusional hypotheses is due to an irrational bias of discounting the ratio of prior probabilities. Yet even if McKay is correct, it is not clear that delusional beliefs are extremely irrational, since similar biases can be found among normal people as well.
For instance, in the famous experiment by Kahneman and Tversky (1973), normal subjects first received base-rate information about a hypothetical group of people (e.g., “30 engineers and 70 lawyers”). Then the personality description of a particular person in the group was provided, and the subjects were asked to predict the occupation (e.g., engineer or lawyer) of that person. The crucial finding was that manipulating the base-rate information, which fixes the prior probability of the hypotheses at issue (e.g., the hypothesis that this person is a lawyer), had almost no effect on the subjects’ predictions (“base-rate neglect”). This finding suggests that the bias of discounting prior probabilities can be found among normal people as well. As Bortolotti (2009) points out, the irrationality that we find in people with delusions might not be very different from the irrationality we find in normal people.
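It is easy to check what Bayes’s theorem says the base rate should do in this setup. The base rates below are the ones from the experiment; the likelihoods (how well the description fits each occupation) are invented for illustration:

```python
# How the base rate should shift a Bayesian subject's prediction in the
# Kahneman-Tversky task. The likelihoods are illustrative assumptions:
# the description is taken to fit an engineer moderately better.

def posterior_engineer(base_rate_engineer, p_desc_eng=0.6, p_desc_law=0.3):
    """P(engineer | description), via Bayes' theorem."""
    p_desc = (p_desc_eng * base_rate_engineer +
              p_desc_law * (1 - base_rate_engineer))
    return p_desc_eng * base_rate_engineer / p_desc

# The same description under the two base-rate conditions:
low = posterior_engineer(0.30)   # "30 engineers and 70 lawyers"
high = posterior_engineer(0.70)  # "70 engineers and 30 lawyers"
print(round(low, 2), round(high, 2))  # → 0.46 0.82
```

A Bayesian reasoner should move from about 0.46 to about 0.82 when the base rate is flipped; the subjects’ predictions barely moved at all, which is what makes base-rate neglect a failure to use prior probabilities.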
It is even conceivable that people with delusions are, in some respects, more rational than normal people. In the well-known experiment by Huq and colleagues (1988), subjects were asked to determine whether a given jar was jar A, which contains 85 pink beads and 15 green beads, or jar B, which contains 15 pink beads and 85 green beads, on the basis of observing beads drawn from it. It was found that the subjects with delusions needed less evidence (i.e., fewer beads drawn from the jar) before coming to a conclusion than the subjects in the control groups (the “jumping-to-conclusions bias”). Interestingly, Huq and colleagues do not take this to show that the subjects with delusions are irrational. Rather, they note: “it may be argued that the deluded sample reached a decision at an objectively “rational” point. It may further be argued that the two control groups were somewhat overcautious” (Huq et al. 1988, 809) (but see Van der Leer et al. 2015).
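A short calculation shows why deciding early in the beads task can be described as objectively “rational”. Using the 85/15 jar compositions from the experiment (equal priors for the two jars are an assumption on my part):

```python
# Sequential Bayesian updating in the beads task: the probability of
# "jar A" after each draw. Jar compositions are from Huq et al. (1988);
# the equal prior is an assumption.

def update(p_jar_a, bead):
    """One Bayesian update of P(jar A) on an observed bead colour."""
    like_a = 0.85 if bead == "pink" else 0.15  # jar A: 85 pink / 15 green
    like_b = 0.15 if bead == "pink" else 0.85  # jar B: 15 pink / 85 green
    num = like_a * p_jar_a
    return num / (num + like_b * (1 - p_jar_a))

p = 0.5  # equal priors for the two jars
for bead in ["pink", "pink"]:  # two pink beads drawn in a row
    p = update(p, bead)
print(round(p, 3))  # → 0.97
```

After only two pink beads the posterior for jar A already exceeds 0.9, so a subject who stops there is not obviously jumping to a premature conclusion; by the same token, demanding many more draws looks overcautious.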
In my paper, Delusions as Harmful Malfunctioning Beliefs (http://www.sciencedirect.com/science/article/pii/S1053810014002001), I also examine the views according to which delusional beliefs are pathological because of (1) their strange content, (2) their resistance to folk-psychological explanation, and (3) their impairment of responsibility-grounding capacities. I present some counterexamples to, as well as difficulties for, these proposals.
I argue, following Wakefield’s (1992a, 1992b) harmful dysfunction analysis of disorder, that delusional beliefs are pathological because they involve some kinds of harmful malfunctions. In other words, they have a significant negative impact on wellbeing (harmful) and, in addition, some psychological mechanisms, directly or indirectly related to them, fail to perform the functions for which they were selected (malfunctioning).
There are two main types of objection to the proposal. The first is that delusional beliefs might not involve any harmful malfunctions; for example, delusional beliefs might be playing psychological defence functions. The second is that involving harmful malfunctions is not sufficient for a mental state to be pathological; for example, false beliefs might involve some malfunctions according to teleosemantics (Dretske 1991; Millikan 1989), but there could be harmful false beliefs that are not pathological. The paper defends the proposal from both objections.
Bortolotti, L. 2009. Delusions and other irrational beliefs. Oxford: Oxford University Press.
Coltheart, M., Menzies, P. and Sutton, J. 2010. Abductive inference and delusional belief. Cognitive Neuropsychiatry 15(1–3), pp. 261–287.
Dretske, F. I. 1991. Explaining behavior: Reasons in a world of causes. Cambridge, MA: The MIT Press.
Huq, S., Garety, P. and Hemsley, D. 1988. Probabilistic judgements in deluded and non-deluded subjects. The Quarterly Journal of Experimental Psychology 40(4), pp. 801–812.
Kahneman, D. and Tversky, A. 1973. On the psychology of prediction. Psychological Review 80(4), pp. 237–251.
Maher, B. A. 1974. Delusional thinking and perceptual disorder. Journal of Individual Psychology 30, pp. 98–113.
McKay, R. 2012. Delusional inference. Mind & Language 27(3), pp. 330–355.
Millikan, R. G. 1989. Biosemantics. The Journal of Philosophy 86, pp. 281–297.
Van der Leer, L., Hartig, B., Goldmanis, M. and McKay, R. 2015. Delusion proneness and ‘jumping to conclusions’: Relative and absolute effects. Psychological Medicine 45(6), pp. 1253–1262.
Wakefield, J. C. 1992a. The concept of mental disorder: On the boundary between biological facts and social values. American Psychologist 47(3), pp. 373–388.
Wakefield, J. C. 1992b. Disorder as harmful dysfunction: A conceptual critique of DSM-III-R’s definition of mental disorder. Psychological Review 99(2), pp. 232–247.