Computational theory of mind
In philosophy of mind, the computational theory of mind (CTM), also known as computationalism, is a family of views holding that the human mind is an information processing system and that cognition and consciousness together are a form of computation.
Warren McCulloch and Walter Pitts (1943) were the first to suggest that neural activity is computational. They argued that neural computations explain cognition.
The theory was proposed in its modern form by Hilary Putnam in 1967 and developed by his PhD student, the philosopher and cognitive scientist Jerry Fodor, in the 1960s, 1970s and 1980s.
Despite being vigorously disputed in analytic philosophy in the 1990s, owing to work by Putnam himself, John Searle, and others, the view is common in modern cognitive psychology and is presumed by many theorists of evolutionary psychology. In the 2000s and 2010s the view resurfaced in analytic philosophy (Scheutz 2003, Edelman 2008).
The computational theory of mind holds that the mind is a computational system that is realized (i.e. physically implemented) by neural activity in the brain. The theory can be elaborated in many ways and varies widely depending on how the term computation is understood. Computation is commonly understood in terms of Turing machines, which manipulate symbols according to a rule in combination with the internal state of the machine. The critical aspect of such a computational model is that we can abstract away from the particular physical details of the machine implementing the computation. For example, the appropriate computation could be implemented either by silicon chips or by biological neural networks, so long as there is a series of outputs based on manipulations of inputs and internal states, performed according to a rule. CTM therefore holds that the mind is not simply analogous to a computer program, but that it is literally a computational system.
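To make the idea of rule-governed symbol manipulation concrete, the following is a minimal sketch of a Turing-style machine in Python. The alphabet, states, and rule table are hypothetical choices made only for illustration, not drawn from any of the works cited here; the relevant point is that nothing in the description says what the machine is physically made of.

```python
# Minimal Turing-machine sketch: a transition table maps
# (current_state, symbol_under_head) -> (new_state, symbol_to_write, head_move).
# This particular machine flips every bit on the tape and halts; the states,
# alphabet, and rules are hypothetical, chosen only for illustration.

RULES = {
    ("scan", "0"): ("scan", "1", +1),   # read 0 -> write 1, move right
    ("scan", "1"): ("scan", "0", +1),   # read 1 -> write 0, move right
    ("scan", "_"): ("halt", "_", 0),    # blank symbol marks the end of the input
}

def run(tape, state="scan", head=0):
    """Apply the rules until the machine enters the halting state."""
    tape = list(tape) + ["_"]           # "_" is the blank symbol
    while state != "halt":
        state, symbol, move = RULES[(state, tape[head])]
        tape[head] = symbol
        head += move
    return "".join(tape).rstrip("_")

print(run("01101"))  # -> "10010"
```

Any physical system whose states and transitions can be mapped onto this rule table, whether silicon circuitry or networks of neurons, would count as implementing the same computation; that substrate-independence is what CTM exploits.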
Computational theories of mind are often said to require mental representation because the 'input' into a computation comes in the form of symbols or representations of other objects. A computer cannot compute an actual object, but must interpret and represent the object in some form and then compute the representation. The computational theory of mind is related to the representational theory of mind in that they both require that mental states are representations. However, the representational theory of mind shifts the focus to the symbols being manipulated. This approach better accounts for systematicity and productivity.
In Fodor's original views, the computational theory of mind is also related to the language of thought. The language of thought theory allows the mind to process more complex representations with the help of semantics (see Semantics of mental states below).
Recent work has suggested that we make a distinction between the mind and cognition. Building from the tradition of McCulloch and Pitts, the computational theory of cognition (CTC) states that neural computations explain cognition.
The computational theory of mind asserts that not only cognition, but also phenomenal consciousness or qualia, are computational. That is to say, CTM entails CTC. While phenomenal consciousness could fulfill some other functional role, the computational theory of cognition leaves open the possibility that some aspects of the mind could be non-computational. CTC therefore provides an important explanatory framework for understanding neural networks, while avoiding counter-arguments that center around phenomenal consciousness.
Computational theory of mind is not the same as the computer metaphor, comparing the mind to a modern-day digital computer.
Computational theory just uses some of the same principles as those found in digital computing.
While the computer metaphor draws an analogy between the mind as software and the brain as hardware, CTM is the claim that the mind is a computational system. More specifically, it states that a computational simulation of a mind is sufficient for the actual presence of a mind, and that a mind truly can be simulated computationally.
'Computational system' is not meant to mean a modern-day electronic computer. Rather, a computational system is a symbol manipulator that follows step-by-step functions to compute input and form output. Alan Turing describes this type of computer in his concept of a Turing machine.
One of the earliest proponents of the computational theory of mind was Thomas Hobbes, who said, "by reasoning, I understand computation. And to compute is to collect the sum of many things added together at the same time, or to know the remainder when one thing has been taken from another. To reason therefore is the same as to add or to subtract."
Since Hobbes lived before the contemporary identification of computing with instantiating effective procedures, he cannot be interpreted as explicitly endorsing the computational theory of mind, in the contemporary sense.
Causal picture of thoughts
At the heart of the computational theory of mind is the idea that thoughts are a form of computation, and a computation is by definition a systematic set of laws for the relations among representations. This means that a mental state represents something if and only if there is some causal correlation between the mental state and that particular thing. An example would be seeing dark clouds and thinking "clouds mean rain", where there is a correlation between the thought of the clouds and rain, since clouds cause rain. This is sometimes known as natural meaning. Conversely, there is another side to the causality of thoughts: the non-natural representation of thoughts. An example would be seeing a red traffic light and thinking "red means stop"; there is nothing about the color red that indicates it represents stopping, so this is simply a convention that has been invented, similar to languages and their ability to form representations.
Semantics of mental states
The computational theory of mind states that the mind functions as a symbolic operator and that mental representations are symbolic representations; just as the semantics of language are the features of words and sentences that relate to their meaning, the semantics of mental states are the meanings of representations, the definitions of the 'words' of the language of thought. If these basic mental states can have a particular meaning just as words in a language do, then more complex mental states (thoughts) can be created, even if they have never been encountered before, just as new sentences can be understood on first reading provided the basic components are understood and the sentence is syntactically correct. For example: "I have eaten plum pudding every day of this fortnight." While few readers have likely seen this particular configuration of words, most can nonetheless glean an understanding of the sentence because it is syntactically correct and its constituent parts are understood.
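A hedged toy sketch, in Python, of how fixed meanings for basic symbols plus a composition rule yield meanings for novel combinations; the lexicon and the single adjective-noun rule below are invented simplifications, not a claim about how the language of thought actually works.

```python
# Toy compositional semantics: meanings of basic symbols plus a syntactic
# composition rule fix the meanings of combinations never seen before.
# The lexicon and the composition rule are hypothetical illustrations.

LEXICON = {
    # nouns denote sets of individuals
    "pudding": {"plum_pudding", "rice_pudding"},
    "cloud":   {"dark_cloud", "white_cloud"},
    # adjectives denote filters on such sets
    "plum": lambda xs: {x for x in xs if x.startswith("plum")},
    "dark": lambda xs: {x for x in xs if x.startswith("dark")},
}

def interpret(adjective, noun):
    """Meaning of [adjective noun] = adjective meaning applied to noun meaning."""
    return LEXICON[adjective](LEXICON[noun])

# Phrases never listed anywhere above are still interpretable,
# because their parts and the composition rule are understood.
print(interpret("plum", "pudding"))  # -> {'plum_pudding'}
print(interpret("dark", "cloud"))    # -> {'dark_cloud'}
```

The phrase "plum pudding" is never stored as a whole; its meaning falls out of the parts and the rule, which is the sense in which compositional systems are said to be productive and systematic.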
A range of arguments have been proposed against physicalist conceptions used in computational theories of mind.
An early, though indirect, criticism of the computational theory of mind comes from philosopher John Searle. In his thought experiment known as the Chinese room, Searle attempts to refute the claims that artificially intelligent agents can be said to have intentionality and that these systems, because they can be said to be minds themselves, are sufficient for the study of the human mind.
Searle asks us to imagine that there is a man in a room with no way of communicating with anyone or anything outside of the room except for a piece of paper with symbols written on it that is passed under the door. With the paper, the man is to use a series of provided rule books to return paper containing different symbols. Unknown to the man in the room, these symbols are in Chinese, and this process generates a conversation that a Chinese speaker outside of the room can actually understand. Searle contends that the man in the room does not understand the Chinese conversation. This is essentially what the computational theory of mind presents us—a model in which the mind simply decodes symbols and outputs more symbols. Searle argues that this is not real understanding or intentionality. The argument was originally written as a repudiation of the idea that computers work like minds.
Searle has further raised questions about what exactly constitutes a computation:
the wall behind my back is right now implementing the WordStar program, because there is some pattern of molecule movements that is isomorphic with the formal structure of WordStar. But if the wall is implementing WordStar, if it is a big enough wall it is implementing any program, including any program implemented in the brain.
Objections like Searle's might be called insufficiency objections. They claim that computational theories of mind fail because computation is insufficient to account for some capacity of the mind. Arguments from qualia, such as Frank Jackson's knowledge argument, can be understood as objections to computational theories of mind in this way—though they take aim at physicalist conceptions of the mind in general, and not computational theories specifically.
There are also objections which are directly tailored for computational theories of mind.
Putnam himself (see in particular Representation and Reality and the first part of Renewing Philosophy) became a prominent critic of computationalism for a variety of reasons, including ones related to Searle's Chinese room arguments, questions of world-word reference relations, and thoughts about the mind-body relationship. Regarding functionalism in particular, Putnam has claimed, along lines similar to but more general than Searle's arguments, that the question of whether the human mind can implement computational states is not relevant to the question of the nature of mind, because "every ordinary open system realizes every abstract finite automaton."
Computationalists have responded by aiming to develop criteria describing what exactly counts as an implementation.
Roger Penrose has proposed the idea that the human mind does not use a knowably sound calculation procedure to understand and discover mathematical intricacies. This would mean that a normal Turing-complete computer would not be able to ascertain certain mathematical truths that human minds can.
Supporters of CTM are faced with a simple yet important question whose answer has proved elusive and controversial: what does it take for a physical system (such as a mind, or an artificial computer) to perform computations? In other words, under what conditions does a physical system implement a computation? A very straightforward account is based on a simple mapping between abstract mathematical computations and physical systems: a system performs computation C if and only if there is a mapping between a sequence of states individuated by C and a sequence of states individuated by a physical description of the system. Putnam (1988) and Searle (1992) argue that this simple mapping account (SMA) trivializes the empirical import of computational descriptions.
As Putnam put it, “everything is a Probabilistic Automaton under some Description”.
On this view, even rocks, walls, and buckets of water—contrary to appearances—are computing systems. Gualtiero Piccinini identifies different versions of pancomputationalism, depending on how many computations—all, some, or just one—they attribute to each system.
Among these various versions, unlimited pancomputationalism—the view that every physical system performs every computation—is the most worrisome: if it is true, then the claim that a system S performs a certain computation becomes trivially true, or nearly so, and fails to distinguish S from anything else.
In response to the trivialization criticism, and to restrict SMA, philosophers of mind have offered different accounts of computational systems. These typically include the causal account, the semantic account, the syntactic account, and the mechanistic account.
Causal account: a physical system S performs computation C just in case (i) there is a mapping from the states ascribed to S by a physical description to the states defined by computational description C, such that (ii) the state transitions between the physical states mirror the state transitions between the computational states.
Semantic account: In addition to the causal restriction imposed by the causal account, the semantic account imposes a semantic restriction. Only physical states that qualify as representations may be mapped onto computational descriptions, thereby qualifying as computational states. If a state is not representational, it is not computational either.
Syntactic account: Instead of a semantic restriction, the syntactic account imposes a syntactic restriction: only physical states that qualify as syntactic may be mapped onto computational descriptions, thereby qualifying as computational states. If a state lacks syntactic structure, it is not computational.
Mechanistic account: First introduced by Gualtiero Piccinini, the mechanistic account of computational systems explains concrete computation in terms of the mechanistic properties of a system. According to the mechanistic account, concrete computing systems are functional mechanisms of a special kind—mechanisms that perform concrete computations.
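The contrast between the simple mapping account and a causal-style restriction can be sketched in a few lines of Python. The state labels, the toy 'wall' trajectory, and the checking functions below are hypothetical illustrations, not formalizations owed to Putnam, Searle, or Piccinini.

```python
# Sketch contrasting the simple mapping account (SMA) with a causal-style account.
# Computational description C: a transition function over abstract states.
# Physical system: idealized here as a transition function over physical states.
# All labels and dynamics are hypothetical illustrations.

comp_next = {"A": "B", "B": "A"}                  # abstract computation C: a two-state toggle
phys_next = {"p1": "p2", "p2": "p1", "p3": "p3"}  # idealized physical dynamics

def sma_implements(observed_trajectory, mapping):
    """SMA: a single observed sequence of physical states need only map
    onto some legal run of the computation."""
    mapped = [mapping[s] for s in observed_trajectory]
    return all(comp_next[a] == b for a, b in zip(mapped, mapped[1:]))

def causal_implements(mapping):
    """Causal account (roughly): the physical dynamics themselves must mirror
    the computational transitions for every mapped state, not just along
    one observed trajectory."""
    return all(mapping[phys_next[p]] == comp_next[mapping[p]] for p in mapping)

mapping = {"p1": "A", "p2": "B"}
print(sma_implements(["p1", "p2", "p1"], mapping))  # True: one matching run suffices
print(causal_implements(mapping))                   # True: the dynamics mirror the toggle

# Putnam-style worry: any sequence of distinct physical states admits some
# gerrymandered mapping, so SMA makes implementation claims nearly trivial.
wall_trajectory = ["w1", "w2", "w3", "w4"]
gerrymandered = {"w1": "A", "w2": "B", "w3": "A", "w4": "B"}
print(sma_implements(wall_trajectory, gerrymandered))  # True, for any such sequence
```

The last call shows the trivialization worry in miniature: the 'wall' satisfies the mapping check simply by relabeling its successive states, whereas the causal-style check constrains the system's dynamics as a whole.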
- Daniel Dennett proposed the multiple drafts model, in which consciousness seems linear but is actually blurry and gappy, distributed over space and time in the brain. Consciousness is the computation, there is no extra step or "Cartesian theater" in which you become conscious of the computation.
- Jerry Fodor argues that mental states, such as beliefs and desires, are relations between individuals and mental representations. He maintains that these representations can only be correctly explained in terms of a language of thought (LOT) in the mind. Further, this language of thought itself is codified in the brain, not just a useful explanatory tool. Fodor adheres to a species of functionalism, maintaining that thinking and other mental processes consist primarily of computations operating on the syntax of the representations that make up the language of thought. In later work (Concepts and The Elm and the Expert), Fodor has refined and even questioned some of his original computationalist views, and adopted a highly modified version of LOT (see LOT2).
- David Marr proposed that cognitive processes have three levels of description: the computational level (which describes the computational problem, i.e., the input/output mapping, computed by the cognitive process); the algorithmic level (which presents the algorithm used for computing the problem postulated at the computational level); and the implementational level (which describes the physical implementation of the algorithm postulated at the algorithmic level in biological matter, e.g. the brain) (Marr 1981). A toy sketch of these three levels appears after this list.
- Ulric Neisser coined the term 'cognitive psychology' in his book published in 1967 (Cognitive Psychology), wherein Neisser characterizes people as dynamic information-processing systems whose mental operations might be described in computational terms.
- Steven Pinker described a "language instinct," an evolved, built-in capacity to learn language (if not writing).
- Hilary Putnam proposed functionalism to describe consciousness, asserting that it is the computation that equates to consciousness, regardless of whether the computation is operating in a brain, in a computer, or in a "brain in a vat."
- Georges Rey, professor at the University of Maryland, builds on Jerry Fodor's representational theory of mind to produce his own version of a Computational/Representational Theory of Thought.
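As flagged in Marr's entry above, here is a hedged sketch of his three levels applied to a deliberately artificial task (sorting); the task and the code are illustrative assumptions, not material drawn from Marr (1981).

```python
# Sketch of Marr's three levels using a hypothetical toy task (sorting).

# Computational level: WHAT function is computed (the input/output mapping).
def computational_spec(xs, ys):
    """ys solves the problem iff it is an ordered rearrangement of xs."""
    return sorted(xs) == list(ys)

# Algorithmic level: ONE particular procedure that computes that mapping.
def insertion_sort(xs):
    out = []
    for x in xs:
        i = 0
        while i < len(out) and out[i] <= x:
            i += 1
        out.insert(i, x)
    return out

# Implementational level: the physical machinery that realizes the algorithm,
# here the Python interpreter running on silicon; in Marr's case, neural tissue.
result = insertion_sort([3, 1, 2])
print(result, computational_spec([3, 1, 2], result))  # [1, 2, 3] True
```

The same computational-level specification could be paired with a different algorithm, and the same algorithm could be realized in very different hardware, which is why the three levels remain distinct.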
- Piccinini, Gualtiero & Bahar, Sonya (2012). "Neural Computation and the Computational Theory of Cognition". Cognitive Science. https://onlinelibrary.wiley.com/doi/epdf/10.1111/cogs.12012
- Putnam, Hilary (1961). "Brains and Behavior", originally read as part of the program of the American Association for the Advancement of Science, Section L (History and Philosophy of Science), December 27, 1961; reprinted in Block (1983), and also, along with other papers on the topic, in Putnam, Mathematics, Matter and Method (1979).
- Horst, Steven (2005). "The Computational Theory of Mind". The Stanford Encyclopedia of Philosophy.
- Pinker, Steven (2002). The Blank Slate. New York: Penguin.
- Hobbes, Thomas. De Corpore.
- Searle, J. R. (1980). "Minds, brains, and programs" (PDF). The Behavioral and Brain Sciences. 3 (3): 417–457. doi:10.1017/S0140525X00005756.
- Searle, J. R. (1992). The Rediscovery of the Mind.
- Putnam, H. (1988). Representation and Reality.
- Chalmers, D. J. (1996). "Does a rock implement every finite-state automaton?". Synthese. 108 (3): 309–333. CiteSeerX 10.1.1.33.5266. doi:10.1007/BF00413692. S2CID 17751467. Archived from the original on 2004-08-20; retrieved 2009-05-27.
- Edelman, Shimon (2008). "On the Nature of Minds, or: Truth and Consequences" (PDF). Journal of Experimental and Theoretical AI. 20 (3): 181–196. CiteSeerX 10.1.1.140.2280. doi:10.1080/09528130802319086. S2CID 754826. Retrieved 2009-06-12.
- Blackmon, James (2012). "Searle's Wall". Erkenntnis. 78: 109–117. doi:10.1007/s10670-012-9405-4. S2CID 121512443.
- Penrose, Roger (1994). "Mathematical Intelligence". In Jean Khalfa (ed.), What is Intelligence?, chapter 5, pp. 107–136. Cambridge: Cambridge University Press.
- Ullian, Joseph S. (March 1971). "Hilary Putnam. Minds and machines. Minds and Machines, edited by Alan Ross Anderson, Prentice-Hall, Englewood Cliffs, New Jersey, 1964, pp. 72–97. (Reprinted from Dimensions of Mind: A Symposium, edited by Sidney Hook, New York University Press, New York, 1960, pp. 148–179.)". Journal of Symbolic Logic. 36 (1): 177. doi:10.2307/2271581. ISSN 0022-4812.
- Smythies, J. R. (November 1993). "The Rediscovery of the Mind. By J. R. Searle. (Pp. 286; $22.50.) MIT Press: Cambridge, Mass. 1992". Psychological Medicine. 23 (4): 1043–1046. doi:10.1017/s0033291700026507. ISSN 0033-2917.
- "Art, Mind, and Religion". Philosophical Books. 8 (3): 32. October 1967. doi:10.1111/j.1468-0149.1967.tb02995.x. ISSN 0031-8051.
- Piccinini, Gualtiero (2015). "The Mechanistic Account". Physical Computation. Oxford University Press. pp. 118–151. ISBN 978-0-19-965885-5. Retrieved 2020-12-12.
- Piccinini, Gualtiero (2017). "Computation in Physical Systems". In Zalta, Edward N. (ed.), The Stanford Encyclopedia of Philosophy (Summer 2017 ed.). Metaphysics Research Lab, Stanford University. Retrieved 2020-12-12.
- Piccinini, Gualtiero (October 2007). "Computing Mechanisms". Philosophy of Science. 74 (4): 501–526. doi:10.1086/522851. ISSN 0031-8248.
- Ned Block, ed. (1983). Readings in Philosophy of Psychology, Volume 1. Cambridge, Massachusetts: Harvard University Press.
- Tim Crane (2003). The Mechanical Mind: A Philosophical Introduction to Minds, Machines, and Mental Representation. New York, NY: Routledge.
- Shimon Edelman (2008) Computing the Mind: How the Mind Really Works.
- Jerry Fodor (1975) The Language of Thought. Cambridge, Massachusetts: The MIT Press.
- Jerry Fodor (1995) The Elm and the Expert: Mentalese and Its Semantics. Cambridge, Massachusetts: The MIT Press.
- Jerry Fodor (1998) Concepts: Where Cognitive Science Went Wrong. Oxford and New York: Oxford University Press.
- Jerry Fodor (2010) LOT2: The Language of Thought Revisited. Oxford and New York: Oxford University Press.
- C. Randy Gallistel Learning and Representation. In R. Menzel (Ed) Learning Theory and Behavior. Vol 1 of Learning and Memory - A Comprehensive Reference. 4 vols (J. Byrne, Ed). Oxford: Elsevier. pp. 227–242.
- Harnad, Stevan (1994). "Computation Is Just Interpretable Symbol Manipulation: Cognition Isn't". Minds and Machines. 4 (4): 379–390. doi:10.1007/bf00974165. S2CID 230344.
- David Marr (1981) Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. Cambridge, Massachusetts: The MIT Press.
- Steven Pinker (1997) How the Mind Works.
- Hilary Putnam (1979) Mathematics, Matter, and Method: Philosophical Papers, Vol. 1. Cambridge, Massachusetts: The MIT Press.
- Hilary Putnam (1991) Representation and Reality. Cambridge, Massachusetts: The MIT Press.
- Hilary Putnam (1995) Renewing Philosophy. Cambridge, Massachusetts: Harvard University Press.
- Zenon Pylyshyn (1984) Computation and Cognition. Cambridge, Massachusetts: The MIT Press.
- Matthias Scheutz, ed. (2003) Computationalism: New Directions. Cambridge, Massachusetts: The MIT Press.
- John Searle (1992) The Rediscovery of the Mind. Cambridge, Massachusetts: The MIT Press.
- Gualtiero Piccinini (2015). Physical Computation: A Mechanistic Account. NY, Oxford University Press.
- Gualtiero Piccinini (2017) "Computation in Physical Systems", The Stanford Encyclopedia of Philosophy (Summer 2017 Edition), Edward N. Zalta (ed.), URL = <https://plato.stanford.edu/archives/sum2017/entries/computation-physicalsystems/>.