Philosophy of mind
Multiple realizability

An illustration of multiple realizability. M stands for a mental state and P for a physical state. More than one P can instantiate a single M, but not vice versa. Causal relations between states are represented by the arrows (M1 causes M2, and so on).
Putnam's best-known work concerns philosophy of mind. His most noted original contributions to that field came in several key papers published in the late 1960s that set out the hypothesis of multiple realizability.[22] In these papers, Putnam argues that, contrary to the famous claim of the type-identity theory, it is not necessarily true that "Pain is identical to C-fibre firing." Pain, according to Putnam's papers, may correspond to utterly different physical states of the nervous system in different organisms, yet all of those organisms can be in the same mental state of "being in pain".
Putnam cited examples from the animal kingdom to illustrate his thesis. He asked whether it was likely that the brain structures of diverse types of animals realize pain, or other mental states, in the same way. On the type-identity view, animals that do not share the same brain structures cannot share the same mental states and properties. Since diverse animals plainly do experience pain, the answer had to be that mental states are realized by different physical states in different species. Putnam then took his argument a step further, asking about such things as the nervous systems of alien beings, artificially intelligent robots and other silicon-based life forms. These hypothetical entities, he contended, should not be considered incapable of experiencing pain just because they lack the same neurochemistry as humans. Putnam concluded that type-identity theorists had been making an "ambitious" and "highly implausible" conjecture which could be disproven by a single example of multiple realizability.[23] This argument is sometimes referred to as the "likelihood argument".[22]
Putnam formulated a complementary argument based on what he called "functional isomorphism". He defined the concept in these terms: two systems are functionally isomorphic if "there is a correspondence between the states of one and the states of the other that preserves functional relations". In the case of computers, two machines are functionally isomorphic if and only if the sequential relations among states in the first are exactly mirrored by the sequential relations among states in the other. Therefore, a computer made out of silicon chips and a computer made out of cogs and wheels can be functionally isomorphic but constitutionally diverse. Functional isomorphism implies multiple realizability.[23] This argument is sometimes referred to as an "a priori argument".[22]
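The definition can be made concrete in code. The following is a minimal Python sketch, not an example from Putnam's texts: two trivially simple "machines" built out of different materials (here, different data types) whose states are matched one-to-one by a mapping phi that preserves their transition relations.

    # Two systems that toggle between two states on every input tick.
    # System A represents its state as a boolean; system B as a string.
    def step_a(state: bool) -> bool:
        return not state                              # True <-> False

    def step_b(state: str) -> str:
        return {'on': 'off', 'off': 'on'}[state]      # 'on' <-> 'off'

    # phi is a correspondence between the states of the two systems.
    phi = {True: 'on', False: 'off'}

    # Functional isomorphism: stepping then translating gives the same
    # result as translating then stepping, for every state.
    for s in (True, False):
        assert phi[step_a(s)] == step_b(phi[s])

The two systems share nothing at the level of their "physical" makeup, yet phi shows that their functional organization is identical.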
Jerry Fodor, Putnam, and others noted that, along with being an effective argument against type-identity theories, multiple realizability implies that any low-level explanation of higher-level mental phenomena is insufficiently abstract and general.[23][24][25] Functionalism, which identifies mental kinds with functional kinds that are characterized exclusively in terms of causes and effects, abstracts from the level of microphysics, and therefore seemed to be a better explanation of the relation between mind and body. In fact, there are many functional kinds, such as mousetraps, software and bookshelves, which are multiply realized at the physical level.[23]
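Software interfaces illustrate the point. The sketch below is illustrative Python rather than an example from the cited papers: a functional kind, here a hypothetical Stack, is characterized exclusively by what its operations cause, and it is multiply realized by two structurally different implementations.

    from typing import Any, Protocol

    class Stack(Protocol):
        # The functional kind: defined only by what push and pop do,
        # not by how any particular realizer is built.
        def push(self, item: Any) -> None: ...
        def pop(self) -> Any: ...

    class ListStack:
        # One realization: backed by a contiguous Python list.
        def __init__(self) -> None:
            self._items = []
        def push(self, item: Any) -> None:
            self._items.append(item)
        def pop(self) -> Any:
            return self._items.pop()

    class LinkedStack:
        # A structurally different realization: a chain of nested pairs.
        def __init__(self) -> None:
            self._head = None
        def push(self, item: Any) -> None:
            self._head = (item, self._head)
        def pop(self) -> Any:
            item, self._head = self._head
            return item

    # Both realizers satisfy the same functional description.
    for stack in (ListStack(), LinkedStack()):
        stack.push(1); stack.push(2)
        assert stack.pop() == 2 and stack.pop() == 1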
Machine state functionalism
A Turing machine can be visualized as an infinite tape of squares that are written or erased one at a time, with the choice of action determined by a "state". According to Putnam's machine-state functionalism, the notion of a state in an abstract computer and the notion of a mental state are essentially the same.
The first formulation of such a functionalist theory was put forth by Putnam himself. This formulation, which is now called "machine-state functionalism", was inspired by analogies noted by Putnam and others between the mind and theoretical "Turing machines" capable of computing any given algorithm.[26]
In non-technical terms, a Turing machine can be visualized as an infinitely long tape divided into squares (the memory) with a box-shaped scanning device that sits over and scans one square of the memory at a time. Each square is either blank (B) or has a 1 written on it. These are the inputs to the machine. The possible outputs are:
* Halt: Do nothing.
* R: move one square to the right.
* L: move one square to the left.
* B: erase whatever is on the square.
* 1: erase whatever is on the square and print a 1.
A simple example of a Turing machine which writes out the sequence '111' across three blank squares and then stops is specified by the following machine table:
        State 1                    State 2                    State 3
B       write 1; stay in state 1   write 1; stay in state 2   write 1; stay in state 3
1       go right; go to state 2    go right; go to state 3    [halt]
This table states that if the machine is in state one and scans a blank square (B), it will print a 1 and remain in state one. If it is in state one and reads a 1, it will move one square to the right and also go into state two. If it is in state two and reads a B, it will print a 1 and stay in state two. If it is in state two and reads a 1, it will move one square to the right and go into state three. If it is in state three and reads a B, it prints a 1 and remains in state three. Finally, if it is in state three and reads a 1, it halts.[27]
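The machine table can be transcribed directly into a short program. The following is a minimal Python sketch; the dictionary encoding and the run helper are illustrative choices, not part of the formalism itself.

    # The machine table above: (state, scanned symbol) -> (output, next state).
    TABLE = {
        (1, 'B'): ('write 1', 1),
        (1, '1'): ('right', 2),
        (2, 'B'): ('write 1', 2),
        (2, '1'): ('right', 3),
        (3, 'B'): ('write 1', 3),
        (3, '1'): ('halt', 3),
    }

    def run(table, max_steps=100):
        """Run the machine on an initially blank tape, head at square 0."""
        tape, head, state = {}, 0, 1
        for _ in range(max_steps):
            symbol = tape.get(head, 'B')
            output, state = table[(state, symbol)]
            if output == 'write 1':
                tape[head] = '1'
            elif output == 'right':
                head += 1
            elif output == 'halt':
                break
        # Render the portion of the tape that was written to.
        return ''.join(tape.get(i, 'B') for i in range(min(tape), max(tape) + 1))

    print(run(TABLE))  # prints '111'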
The point, for functionalism, is the nature of the "states" of the Turing machine. Each state can be defined in terms of its relations to the other states and to the inputs and outputs. State one, for example, is simply the state in which the machine, if it reads a B, writes a 1 and stays in that state, and in which, if it reads a 1, it moves one square to the right and goes into a different state. This is the functional definition of state one; it is its causal role in the overall system. The details of how it accomplishes what it accomplishes and of its material constitution are completely irrelevant.
According to machine-state functionalism, the nature of a mental state is just like the nature of the automaton states described above. Just as "state one" simply is the state in which, given an input B, such-and-such happens, so being in pain is the state which disposes one to cry "ouch", become distracted, wonder what the cause is, and so forth.[28]
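As a toy illustration, and not Putnam's own formalism, one can write a "mental" machine table in the same style, in which being in pain just is occupying the state with these transitions; nothing in the table constrains what hardware, neural or otherwise, realizes it. The states, stimuli and responses here are invented for the example.

    # (current state, stimulus) -> (response, next state)
    MENTAL_TABLE = {
        ('calm', 'tissue damage'): ('cry "ouch"', 'pain'),
        ('pain', 'tissue damage'): ('wince', 'pain'),
        ('pain', 'aspirin'):       ('sigh with relief', 'calm'),
        ('calm', 'aspirin'):       ('do nothing', 'calm'),
    }

    def respond(state, stimulus):
        """Return (response, next state), exactly as the table dictates."""
        return MENTAL_TABLE[(state, stimulus)]

    print(respond('calm', 'tissue damage'))   # ('cry "ouch"', 'pain')

Anything at all that implements this table, whatever it is made of, counts as capable of occupying the "pain" state so defined.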