Informational Semantics

Informational semantics is an attempt to ground meaning -- as this is understood in the study of both language and mind -- in an objective, mind- (and language-) independent notion of information. This project is often part of a larger effort to naturalize INTENTIONALITY and thereby exhibit semantic -- and, more generally, mental -- phenomena as an aspect of our more familiar (or at least better understood) material world.

Informational semantics locates the primary source of meaning in symbol-world relations (the symbols in question can occur either in the language of thought or in a public language). The symbol-world relations are sometimes described in information-theoretic terms (source, receiver, signal, etc.) and sometimes in more general causal terms. In either case, the resulting semantics is to be contrasted with conceptual role (also called procedural) semantics, which locates meaning in the relations symbols have to one another (or, more broadly, the way they are related to one another, sensory input, and motor output). Because on some interpretations of information, the information a signal carries is what it indicates about a source, informational semantics is sometimes referred to as indicator semantics. The concept of information involved is inspired by, but is only distantly related to, the statistical construct in INFORMATION THEORY (Dretske 1981).

The word "meaning" is multiply ambiguous. Two of its possible meanings (Grice 1989) are: (1) nonnatural meaning -- the sense in which the word "fire" stands for or means fire; and (2) natural meaning -- the way in which smoke means (is a sign of, indicates) fire. Nonnatural meaning has no necessary connection with truth: "Jim has the measles" means that Jim has the measles whether or not he has the measles. Natural meaning, on the other hand, requires the existence of the condition meant: if Jim doesn't have the measles, the red spots on his face do not mean (indicate) that he has the measles. Perhaps all they mean is that he has been eating too much candy. Natural meaning, what one event indicates about another, is taken to be a relation between sign and signified that does not depend on anyone recognizing or identifying what is meant. Tracks in the snow can mean there are deer in the woods even if no one identifies them that way -- even if they do not mean that to anyone.

Information, as this is used in informational semantics, is akin to natural meaning. It is an objective (mind-independent) relation between a sign or signal -- tracks in the snow, for instance -- and what that sign or signal indicates -- deer in the woods. The information a signal carries about a source is what that signal indicates (means in a natural way) about that source. Informational semantics, then, is an effort to understand nonnatural meaning -- the kind of meaning characteristic of thought and language -- as arising out of and having its source in natural meaning. The word "meaning" will hereafter be used to refer to nonnatural meaning; "information" and "indication" will be reserved for natural meaning.

Informational semantics takes the primary home of meaning to be in the mind -- as the meaning or content of a thought or intention. Sounds and marks of natural language derive their meaning from the communicative intentions of the agents who use them. As a result, the information of primary importance to informational semantics is that occurring in the brains of conscious agents. For informational semantics, then, the very existence of thought, and thus the possibility of language, depends on the capacity of systems to transform information (normally supplied by perception) into meaning.

Not all information-processing systems have this capacity. Artifacts (measuring instruments and computers) do not. To achieve this conversion, two things are required. First, because meaning is fine grained (even though 3 = ∛27, thinking or saying that x = 3 is not the same as thinking or saying that x = ∛27) and information is coarse grained (a signal that carries the information that x = 3 necessarily carries the information that x = ∛27), a theory of meaning must specify how coarse-grained information is converted into fine-grained meaning. Which of the many pieces of information an event (normally) carries is to be identified as its meaning? Second, in order to account for the fact that something (e.g., a thought) can mean (have the content) that x = 3 when x ≠ 3, a way must be found to "detach" information from the events that normally carry it so that something can mean that x = 3 when it does not carry this information (because x ≠ 3).
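
The grain contrast can be made concrete with a small sketch. The following toy model is only an illustration (it is not part of Dretske's or any other published proposal, and the representation of contents as truth-condition sets is an assumption made here for simplicity): informational content is individuated by truth conditions, which collapse necessarily equivalent claims, while meaning is individuated by the representation itself, which keeps them distinct.

```python
# Toy illustration of the grain problem (a simplified model assumed here, not Dretske's):
# information is individuated by truth conditions, meaning by the representation used.

# The circumstances (possible values of x) under which each claim is true.
# "x = 3" and "x = the cube root of 27" are true in exactly the same circumstances,
# so as *information* they are one and the same content.
TRUTH_CONDITIONS = {
    "x = 3": frozenset({3}),
    "x = the cube root of 27": frozenset({3}),
}

def same_information(claim_a: str, claim_b: str) -> bool:
    """Information is coarse grained: claims with the same truth conditions collapse."""
    return TRUTH_CONDITIONS[claim_a] == TRUTH_CONDITIONS[claim_b]

def same_meaning(claim_a: str, claim_b: str) -> bool:
    """Meaning is fine grained: distinct representations are distinct thoughts/claims."""
    return claim_a == claim_b

print(same_information("x = 3", "x = the cube root of 27"))  # True  -- one piece of information
print(same_meaning("x = 3", "x = the cube root of 27"))      # False -- two different meanings
```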

One of the strategies used by some (e.g., Dretske 1981, 1986; Stampe 1977, 1986) to achieve these results is to identify meaning with the environmental condition with which a state is, or is supposed to be, correlated. For instance, the meaning of a state might be the condition about which it is supposed to carry information, where the "supposed to" is understood in terms of the state's teleofunction. Others -- for example, Fodor (1990) -- reject teleology altogether and identify meaning with the sort of causal antecedents of an event on which other causes of that event depend. Still others -- for example, Millikan (1984) -- embrace the teleology but reject the idea that the relevant functions are informational. For Millikan, a state can mean M without having the function of carrying this information.

By combining teleology with information, informational semantics holds out the promise of satisfying the desiderata described in the last paragraph. Just as the pointer reading on a measuring instrument -- a speedometer, for example -- can misrepresent the speed of the car because there is something (viz., the speed of the car) it is supposed to indicate that it can fail to indicate, so various events in the brain can misrepresent the state of the world by failing to carry information it is their function to carry. In the case of the nervous system, of course, the information-carrying functions are not (as with artifacts) assigned by designers or users. They come, in the first instance, from a specific evolutionary (selectional) history -- the same place the heart and kidneys get their functions -- and, in the second, from certain forms of learning. Not only do information-carrying functions give perceptual organs and the central nervous system the capacity to misrepresent the world (thus solving the second of the above two problems), they also help solve the grain problem. Of the many things the heart does, only one, pumping blood, is its (biological) function. So too, perhaps, an internal state has the function of indicating only one of the many things it carries information about. Only one piece of information is it supposed to carry. According to informational semantics, this would be its meaning.
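
To fix ideas, here is a minimal sketch of the speedometer case. The names and structure (a Speedometer class with a reading and an actual speed) are hypothetical conveniences of this illustration, not anyone's formal apparatus; the point is only that the state's content is fixed by what it has the function of indicating, and that misrepresentation occurs when the state fails to carry the information it is its function to carry.

```python
# Minimal sketch (hypothetical names, deliberately simplified): an indicator whose
# meaning is fixed by its information-carrying function, which makes misrepresentation possible.
from dataclasses import dataclass

@dataclass
class Speedometer:
    reading_mph: float       # the state the pointer is actually in
    actual_speed_mph: float  # the condition the reading has the function of indicating

    def content(self) -> str:
        # Meaning comes from the function: the reading is *about* the car's speed --
        # one condition singled out from everything the reading may be correlated with.
        return f"the car is going {self.reading_mph} mph"

    def misrepresents(self) -> bool:
        # Misrepresentation: the state fails to carry the information (about actual
        # speed) that it is its function to carry, yet it still means what it is
        # supposed to indicate.
        return self.reading_mph != self.actual_speed_mph

stuck = Speedometer(reading_mph=60.0, actual_speed_mph=40.0)
print(stuck.content())        # "the car is going 60.0 mph"
print(stuck.misrepresents())  # True -- it was supposed to indicate 40 mph
```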

Informational semantics -- like any other theory of meaning -- faces the problem of saying what relevance the meaning of internal states has to the behavior of the systems in which those states occur. Of what relevance is meaning to a science of intentional systems? Is not the behavior of such systems completely explained by the nonsemantic (e.g., neurobiological or, in the case of computers, electrical and mechanical) properties of internal events? This question is sometimes put by asking whether, in addition to syntactic engines, there are (or could be) semantic engines. The difficulty of finding an explanatory role for meaning in the behavior of intentional (i.e., semantic) systems has led some to abandon meaning (and with it the mind) as a legitimate scientific construct (ELIMINATIVE MATERIALISM), others to regard meaning as legitimate in only an instrumental sense (Dennett 1987), and still others (e.g., Burge 1989; Davidson 1980; Dretske 1988; Fodor 1987; Kim 1996) to propose indirect -- but nonetheless quite real -- ways in which meaning figures in the explanation of behavior.

-- Fred Dretske

References

Burge, T. (1989). Individuation and causation in psychology. Pacific Philosophical Quarterly 70:303-322.

Davidson, D. (1980). Essays on Actions and Events. Oxford: Oxford University Press.

Dennett, D. (1987). The Intentional Stance. Cambridge, MA: MIT Press.

Dretske, F. (1981). Knowledge and the Flow of Information. Cambridge, MA: MIT Press.

Dretske, F. (1986). Misrepresentation. In R. Bogdan, Ed., Belief. Oxford: Oxford University Press, pp. 17-36.

Dretske, F. (1988). Explaining Behavior. Cambridge, MA: MIT Press.

Fodor, J. (1987). Psychosemantics: The Problem of Meaning in the Philosophy of Mind. Cambridge, MA: MIT Press.

Fodor, J. (1990). A Theory of Content and Other Essays. Cambridge, MA: MIT Press.

Grice, P. (1989). Studies in the Way of Words. Cambridge, MA: Harvard University Press.

Kim, J. (1996). Philosophy of Mind. Boulder, CO: Westview Press.

Millikan, R. (1984). Language, Thought, and Other Biological Categories. Cambridge, MA: MIT Press.

Stampe, D. (1977). Towards a causal theory of linguistic representation. In P. French, T. Uehling, and H. Wettstein, Eds., Midwest Studies in Philosophy 2. Minneapolis: University of Minnesota Press, pp. 42-63.

Stampe, D. (1986). Verificationism and a causal account of meaning. Synthèse 69:107-137.

Further Readings

Barwise, J., and J. Perry. (1983). Situations and Attitudes. Cambridge, MA: MIT Press.

Block, N. (1986). Advertisement for a semantics for psychology. In Midwest Studies in Philosophy, vol. 10. Minneapolis: University of Minnesota Press, pp. 615-678.

Fodor, J. (1984). Semantics, Wisconsin style. Synthèse 59:231-250.

Israel, D., and J. Perry. (1990). What is information? In P. Hanson, Ed., Information, Language, and Cognition. Vancouver: University of British Columbia Press, pp. 1-19.

Lepore, E., and B. Loewer. (1987). Dual aspect semantics. In E. Lepore, Ed., New Directions in Semantics. London: Academic Press.

Papineau, D. (1987). Reality and Representation. Oxford: Blackwell.