Consciousness

Conscious mental states include sensations, such as the pleasure of relaxing in a hot bath or the discomfort of a hangover; perceptual experiences, such as the visual experience of a computer screen about half a meter in front of me; and occurrent thoughts, such as the sudden thought about how a problem can be solved. Consciousness is thus a pervasive feature of our mental lives, but it is also a perplexing one. This perplexity -- the sense that there is something mysterious about consciousness despite our familiarity with sensation, perception, and thought -- arises principally from the question of how consciousness can be the product of physical processes in our brains.

Ullin Place (1956) proposed that conscious states such as sensations are identical with brain processes, a precursor of central state materialism. But the idea that types of conscious experience are to be identified with types of brain processes raises an important question, which can be made vivid by using Thomas Nagel's (1974) idea of WHAT-IT'S-LIKE to be in a certain state -- and, more generally, the idea of there being something that it is like to be a certain creature or system. The question is, why should there be something that it is like for certain processes to be occurring in our brains? Nagel's famous example of what it is like to be a bat illustrates that our grasp of facts about the subjective character of experiences depends very much on our particular perceptual systems. Our grasp of physical or neurophysiological theories, in contrast, is not so dependent. Thus it may appear that subjective facts are not to be identified with the facts that are spelled out in those scientific theories. This Nagelian argument about the elusiveness of QUALIA is importantly similar to Frank Jackson's (1982, 1986) "knowledge argument," and similar responses have been offered to both (Churchland 1985, 1988; and for a reply, Braddon-Mitchell and Jackson 1996).

Ned Block's (1978) "absent qualia argument" is different from the arguments of Nagel and Jackson because it is specifically directed against FUNCTIONALISM: the idea that mental states are individuated by the causal roles they play in the total mental economy, rather than by the particular neurophysiological ways these roles are realized. The problem for functionalism is that we can imagine a system (e.g., Block's homunculi-headed system) in which there is nothing that it is like to be that system, even though there are, within the system, devices that play the various functional roles associated with sensations, perceptions, and thoughts. This argument is not intended for use against a physicalist who (in the style of Place and subsequent central state materialists) simply identifies conscious mental states with brain processes (pain with C-fibers firing, for example). The examples used in the absent qualia argument may, however, be used to support the claim that it is logically possible for there to be a physical duplicate of a normal human being that nevertheless lacks qualia (a "zombie"; Chalmers 1996).

It is a disputed question whether arguments like Nagel's can establish an ontological conclusion that consciousness involves something nonphysical (see MIND-BODY PROBLEM). But even if they cannot, there still appears to be a problem about consciousness; namely, it is a mystery why there should be something that it is like to undergo certain physical processes. This is what Joseph Levine (1983) has called the EXPLANATORY GAP. Jackson and Block both join Nagel in seeing a puzzle at this point, and Colin McGinn (1989) has argued that understanding how physical processes give rise to consciousness is cognitively beyond us (for a critical appraisal of McGinn's argument, see Flanagan 1992).

One possible strategy for demystifying the notion of consciousness is to claim that consciousness is a matter of thought about mental states. This is the "higher-order thought theory of consciousness" favored by David Rosenthal (1986). In this theory, consciousness, considered as a property of mental states, is analyzed in terms of consciousness of mental states, while consciousness of something is analyzed in terms of having a thought about that thing. Thus for a mental state to be a conscious mental state is for the subject of that state to have a thought about it. If the higher-order thought theory were to be correct, then the occurrence of consciousness in the physical world would not be any more mysterious than the occurrence of mental states, which are not in themselves conscious states, or the occurrence of thoughts about mental states.

However, there are some quite serious problems for the higher-order thought theory. One is that the theory seems to face a kind of dilemma. If the notion of thought employed is a demanding one, then there could be something that it is like for a creature to be in certain states even though the creature did not have (perhaps, even, could not have) any thoughts about those states. In that case, higher-order thought is not necessary for consciousness. But if the notion of thought that is employed is a thin and undemanding one, then higher-order thought is not sufficient for consciousness. Suppose, for example, that thought is said to require no more than having discriminative capacities. Then it seems clear that a creature, or other system, could be in a certain type of mental state, and could have a capacity to detect whether it was in a state of that type, even though there was nothing that it was like to be that creature or system.

More generally, work toward the demystification of consciousness has a negative and a positive aspect. The negative aspect consists in seeking to reveal unclarities and paradoxes in the notion of the subjective character of experience (e.g., Dennett 1988, 1991). The positive aspect consists in offering putative explanations of one or another property of conscious experience in neural terms. Paul Churchland (1988, 148) clearly illustrates how to explain certain structural features of our experiences of color (for example, that an experience of orange is more like an experience of red than it is like an experience of blue). The explanation appeals to the system of neural coding for colors that involves triples of activation values corresponding to the illumination reaching three families of cones, and to structural properties of the three-dimensional space in which they are plotted (see COLOR VISION). But while this is a satisfying explanation of those structural features of color experiences, it seems to leave us without any account of why it is like anything at all to see red. Why there are any experiential correlates of the neural codes is left as a brute unexplained fact. The demystifier of consciousness may then reply that this appearance of residual mystery is illusory, and that it is a product either of fallacies and confusions that surround the notion of the subjective character of experience or else of an illegitimately high standard imposed on explanation.
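The structural point of Churchland's explanation can be sketched in a toy model: treat each color experience as a point in a three-dimensional activation space and compare distances between points. The activation triples below are invented purely for illustration, not measured cone responses; only the resulting similarity ordering matters.

```python
import math

# Hypothetical activation triples for the long-, medium-, and
# short-wavelength cone channels. These numbers are invented for
# illustration; the point is the geometry, not the data.
codes = {
    "red":    (0.9, 0.3, 0.1),
    "orange": (0.8, 0.5, 0.1),
    "blue":   (0.1, 0.2, 0.9),
}

def distance(a, b):
    """Euclidean distance between two color codes in activation space."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(codes[a], codes[b])))

# The structural feature to be explained: orange is coded closer to
# red than to blue, mirroring the similarity ordering of the experiences.
assert distance("orange", "red") < distance("orange", "blue")
```

On this picture, similarity among experiences is modeled as proximity in the coding space; what the sketch leaves untouched, as the text notes, is why any point in that space is accompanied by experience at all.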

The notion of consciousness associated with the idea of the subjective character of experience, and which generates the "hard problem" of consciousness (Chalmers 1996), is sometimes called "phenomenal consciousness." There are several other notions for which the term consciousness is sometimes used (Allport 1988), including being awake, voluntary action, ATTENTION, monitoring of internal states, reportability, INTROSPECTION, and SELF-KNOWLEDGE. The distinctions among these notions are important, especially for the assessment of cognitive psychological and neuroscientific theories of consciousness (see CONSCIOUSNESS, NEUROBIOLOGY OF).

One particularly useful contrast is between phenomenal consciousness and "access consciousness" (Block 1995, 231): "A state is access-conscious if, in virtue of one's having the state, a representation of its content is (1) inferentially promiscuous, that is, poised to be used as a premise in reasoning, (2) poised for rational control of action, and (3) poised for rational control of speech. . . . [Access consciousness is] a cluster concept, in which (3) -- roughly, reportability -- is the element of the cluster with the smallest weight, though (3) is often the best practical guide to [access consciousness]." The two notions appear to be independent in the sense that it is possible to have phenomenal (P) consciousness without access (A) consciousness, and vice versa. An example of P-consciousness without A-consciousness would be a situation in which there is an audible noise to which we pay no attention because we are engrossed in conversation. As an example of A-consciousness without P-consciousness, Block (1995, 233) suggests an imaginary phenomenon of "superblindsight." In ordinary cases of BLINDSIGHT, patients are able to guess correctly whether there is, for example, an O or an X in the blind region of their visual field, even though they are unable to see either an O or an X there. The state that represents an O or an X is neither a P-conscious nor an A-conscious state. In superblindsight, there is still no P-consciousness, but now the patient is imagined to be able to make free use in reasoning of the information that there is an O, or that there is an X.

While the notion of phenomenal consciousness applies most naturally to sensations and perceptual experiences, the notion of access consciousness applies very clearly to thoughts. It is not obvious whether we should extend the notion of phenomenal consciousness to include thoughts as well as sensory experiences. But the idea of an important connection between consciousness and thought is an engaging one. Sometimes, for example, it seems hard to accept that there could be a fully satisfying reconstruction of thinking in the terms favored by the physical sciences. This intuition is similar to, and perhaps derives from, the intuition that consciousness somehow defies scientific explanation.

The question whether there is an important connection between consciousness and thought divides into two: Does consciousness require thought? Does thought require consciousness? The intuitive answer to the first question is that access consciousness evidently does require thought, but that phenomenal consciousness does not. (The appeal of this intuitive answer is the source of some objections to the higher-order thought theory of consciousness.) The answer to the second question as it concerns access consciousness is that there is scarcely any distance at all between the notion of thought and the notion of access consciousness. But when we focus on phenomenal consciousness, the answer to the second question is less clear.

John Searle (1990, 586) argues for the connection principle: "The ascription of an unconscious intentional phenomenon to a system implies that the phenomenon is in principle accessible to consciousness." This is to say that, while we can allow for unconscious intentional states, such as unconscious thoughts, these have to be seen as secondary, and as standing in a close relation to conscious intentional states. Searle's argument is naturally interpreted as being directed toward the conclusion that central cases of thinking are at least akin to phenomenally conscious states.

Even if one does not accept Searle's argument for the connection principle, there is a plausible argument for a weaker version of his conclusion. The INTENTIONALITY of human thought involves modes of presentation of objects and properties (see SENSE AND REFERENCE); demonstrative modes of presentation afforded by perceptual experience of objects and their properties constitute particularly clear examples. For example, we think of an object as "that [perceptually presented] cat" or of a property as "that color." Suppose now that it could be argued that some theoretical primacy attaches to these "perceptual demonstrative" modes of presentation (Perry 1979). It might be argued, for example, that in order to be able to think about objects at all, a subject needs to be able to think about objects under perceptual demonstrative modes of presentation. Such an argument would establish a deep connection between intentionality and consciousness.

Finally, there is another way phenomenal consciousness might enter the theory of thought. It might be because a thinker's thoughts are phenomenally conscious states, that they also have the more dispositional properties (such as reportability) mentioned in the definition of access consciousness. This phenomenal consciousness property might also figure in the explanation of a thinker's being able to engage in critical reasoning -- evaluating and assessing reasons and reasoning as such (Burge 1996). It is far from clear, however, whether this idea can be worked out in a satisfactory way. Would the idea require a sensational phenomenology for thinking? If it does require that, then it might be natural to suggest that phenomenally conscious thoughts are clothed in the phonological or orthographic forms of natural language sentences (Carruthers 1996).

-- Martin Davies

References

Allport, A. (1988). What concept of consciousness? In A. J. Marcel and E. Bisiach, Eds., Consciousness in Contemporary Science. Oxford: Oxford University Press, pp. 159-182.

Block, N. (1978). Troubles with functionalism. In C. Wade Savage, Ed., Minnesota Studies in the Philosophy of Science, vol. 9. Minneapolis: University of Minnesota Press, pp. 261-325.

Block, N. (1995). On a confusion about a function of consciousness. Behavioral and Brain Sciences 18:227-287.

Braddon-Mitchell, D., and F. Jackson (1996). Philosophy of Mind and Cognition. Oxford: Blackwell.

Burge, T. (1996). Our entitlement to self-knowledge. Proceedings of the Aristotelian Society 96:91-116.

Carruthers, P. (1996). Language, Thought and Consciousness. Cambridge: Cambridge University Press.

Chalmers, D. (1996). The Conscious Mind: In Search of a Fundamental Theory. New York: Oxford University Press.

Churchland, P. M. (1985). Reduction, qualia and the direct introspection of brain states. Journal of Philosophy 82:8-28.

Churchland, P. M. (1988). Matter and Consciousness. Rev. ed. Cambridge, MA: MIT Press.

Dennett, D. C. (1988). Quining qualia. In A. J. Marcel and E. Bisiach, Eds., Consciousness in Contemporary Science. Oxford: Oxford University Press, pp. 42-77.

Dennett, D. C. (1991). Consciousness Explained. Boston: Little, Brown.

Flanagan, O. (1992). Consciousness Reconsidered. Cambridge, MA: MIT Press.

Jackson, F. (1982). Epiphenomenal qualia. Philosophical Quarterly 32:127-136.

Jackson, F. (1986). What Mary didn't know. Journal of Philosophy 83:291-295.

Levine, J. (1983). Materialism and qualia: The explanatory gap. Pacific Philosophical Quarterly 64:354-361.

McGinn, C. (1989). Can we solve the mind-body problem? Mind 98:349-366.

Nagel, T. (1974). What is it like to be a bat? Philosophical Review 83:435-450.

Perry, J. (1979). The problem of the essential indexical. Noûs 13:3-21.

Place, U. T. (1956). Is consciousness a brain process? British Journal of Psychology 47:44-50.

Rosenthal, D. M. (1986). Two concepts of consciousness. Philosophical Studies 49:329-359.

Searle, J. R. (1990). Consciousness, explanatory inversion, and cognitive science. Behavioral and Brain Sciences 13:585-596.

Further Readings

Block, N. (1998). How to find the neural correlate of consciousness. In A. O'Hear, Ed., Contemporary Issues in the Philosophy of Mind. Royal Institute of Philosophy Supplement 43. Cambridge: Cambridge University Press, pp. 23-34.

Block, N., O. Flanagan, and G. Güzeldere, Eds. (1997). The Nature of Consciousness: Philosophical Debates. Cambridge, MA: MIT Press.

Crick, F., and C. Koch. (1995). Are we aware of neural activity in primary visual cortex? Nature 375:121-123.

Davies, M., and G. W. Humphreys, Eds. (1993). Consciousness: Psychological and Philosophical Essays. Oxford: Blackwell.

Dennett, D. C., and M. Kinsbourne. (1992). Time and the observer: The where and when of consciousness in the brain. Behavioral and Brain Sciences 15:183-247.

Metzinger, T., Ed. (1995). Conscious Experience. Paderborn: Schöningh.

Nelkin, N. (1996). Consciousness and the Origins of Thought. Cambridge: Cambridge University Press.

Peacocke, C. (1998). Conscious attitudes, attention and self-knowledge. In C. Wright, B. C. Smith, and C. Macdonald, Eds., Knowing Our Own Minds. Oxford: Oxford University Press, pp. 63-98.

Rolls, E. T. (1997). Consciousness in neural networks? Neural Networks 10:1227-1240.

Schacter, D. L. (1989). On the relation between memory and consciousness: Dissociable interactions and conscious experience. In H. Roediger and F. Craik, Eds., Varieties of Memory and Consciousness: Essays in Honor of Endel Tulving. Hillsdale, NJ: Erlbaum.

Shear, J., Ed. (1997). Explaining Consciousness: The Hard Problem. Cambridge, MA: MIT Press.

Tye, M. (1995). Ten Problems of Consciousness: A Representational Theory of the Phenomenal Mind. Cambridge, MA: MIT Press.