Engineering Intelligence: Computers and the Cognitive Revolution in Psychology

A Review of Steven Pinker’s How the Mind Works

In his landmark study The Structure of Scientific Revolutions, Thomas Kuhn argued that science cannot be viewed as a steady accumulation of knowledge. Reviewing the history of any science, one finds, instead, a series of orthodox world views, or paradigms, each of which reigns for a time and then is replaced by a usurper in what can only be called a revolution. Consider, for example, the science of mechanics, the branch of physics that deals with moving bodies. According to the naive view first advanced by Aristotle and still held by people who have not studied elementary physics, when force is imparted to a body, it moves until that force is used up, at which point it comes to rest. This intuitive view prevailed for more than a thousand years. Eventually, however, people began to see a problem with the theory. Imagine a ball rolling along a smooth surface. After an initial push, it will continue for a while and then come to rest some distance from its starting point. According to Aristotle, the ball has used up the force imparted to it. If, however, you grease the ball or make the surface smoother, the ball will go farther. The better the grease or the smoother the surface, the farther the ball goes as a consequence of the same initial push. Aristotelian mechanics is at a loss to explain such a phenomenon. The fact that the ball goes farther when greased is, from the point of view of Aristotelian mechanics, an anomaly, something the theory cannot explain. After all, the grease has not added force to the ball. 

Imagine, now, a ball covered with a perfect lubricant or rolling on a perfectly smooth surface. Under such conditions, once an initial force is imparted to the ball, it will continue in a straight line forever. This observation led to a revolution in physics, instigated by Galileo Galilei and later codified by Isaac Newton. Newton’s Law of Inertia states that a body set in motion will persevere in uniform motion in a straight line until it is acted upon by some other force, such as the friction between the ball and the surface on which it rolls. Newton’s theory explains the commonplace observation that a ball, when pushed, eventually comes to a stop, but it also explains why the greased ball goes farther. In other words, it accounts for all the facts explained by the older theory and also for the anomaly of the greased ball. That is how science works. A theory is advanced. It proves to have enormous explanatory power and so becomes the accepted paradigm. For some period thereafter, called by Kuhn a period of “normal science,” people make steady progress, reading the book of nature by the light of the orthodox theory. Then an anomaly is discovered that the theory cannot explain. Scientists try to shore up the theory, to alter it to fit the observed facts, but eventually someone proposes a new theory that accounts both for the known facts and for the anomaly. In time, this theory becomes the new paradigm, and another period of normal, paradigmatic science ensues. 
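
The anomaly can even be put into numbers. A standard result of elementary physics (the formula is not in the review itself, and the figures below are illustrative assumptions) is that a ball pushed to speed v0 and sliding under kinetic friction decelerates at a constant rate mu * g, so it stops after a distance of v0^2 / (2 * mu * g). A minimal Python sketch of that formula shows the stopping distance growing without bound as the friction coefficient mu shrinks toward zero, which is just Newton’s inertia in the limit: 

    # Stopping distance under kinetic friction: deceleration = mu * g,
    # so the ball stops after d = v0**2 / (2 * mu * g).
    # All numbers are illustrative assumptions, not data from the text.

    G = 9.8  # gravitational acceleration, m/s^2

    def stopping_distance(v0, mu):
        """Distance traveled before a sliding ball comes to rest."""
        return v0 ** 2 / (2 * mu * G)

    v0 = 2.0  # speed produced by the initial push, m/s
    for mu in (0.4, 0.04, 0.004):  # rougher surface -> better greased
        print(f"mu={mu}: stops after {stopping_distance(v0, mu):.1f} m")
    # As mu -> 0 the distance diverges: with no friction, the ball never stops.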

In the past three decades, the relatively new science of psychology has undergone a paradigm shift of the kind that Kuhn described. The shift is so recent that the psychology textbooks being used in high schools in the United States today do not as yet reflect it. Nonetheless, psychology today is a brave new world, one that would be barely recognizable to the practitioners of that science who died in the 1940s. The dominant paradigm for most of the century, Behaviorism, has given way to a new, powerful synthesis known as cognitive science, which owes its impetus to the creation of the digital computer. Steven Pinker’s How the Mind Works, published in 1997, is as good an introduction to the new paradigm as one is likely to find. Before having a look at the contents of Pinker’s book, however, it will prove valuable to take a brief glance backward to the paradigm that has been replaced. 

The Undoing of the Behaviorist Paradigm

In the nineteenth century, when the new field of psychology was emerging from natural philosophy, the dominant paradigm was Introspectionism. People reasoned that if they wished to explore the workings of the mind, the best way to approach this task was to reflect on the contents of what William James, founder of the first school of psychology in the United States, called the “stream of consciousness.” Nineteenth-century psychologists thus spent a lot of time reflecting on their own perceptions, sensations, intuitions, memories, habits, motives, dreams, desires, emotions, goals, and so on. In 1913, the American psychologist John B. Watson challenged Introspectionism on the grounds that it was unscientific. If a given Introspectionist claimed to have a particular subjective experience, there was no way for a colleague to test that claim through observation, for, by definition, a subjective experience is accessible only to the person having it. Watson argued that the proper subject of psychology was not subjective experience but behavior, and the school of psychology that he founded became known as Behaviorism. 

For the greater part of the twentieth century, most orthodox (i.e., Behaviorist) psychologists in the United States and Great Britain held that any discussion of mental categories or contents was unverifiable nonsense. The foremost Behaviorist in America, the brilliant and irascible B. F. Skinner, went so far as to claim that conscious experience did not exist! Rats, pigeons, and people were biological machines, acted upon by external forces (stimuli) that caused them to react (to respond or behave) in particular ways, much as a rock responds to the stimulus of being pushed off a cliff by falling. Learning, from the Behaviorist point of view, consisted of building up a repertoire of responses to new stimuli, a process shaped by positive and negative reinforcement. 

Behaviorism had enormous influence on the social sciences and education. Sociologists (taking their cue from Behaviorism and from the historical materialism of Karl Marx) argued that the ills of society could be cured by changing people’s conditioning. Educators banished from their discussions any talk about students’ understanding, knowledge, feelings, attitudes, and so on, and replaced such talk with behavioral objectives and lists of positive and negative reinforcements. It became unfashionable, for example, to speak of a student’s understanding the difference between a noun and a verb. Instead, one was supposed to say that the student, on being exposed to the stimuli of the lesson, would then behave in a certain way (for example, he or she would be able to sort a list of words under headings Noun and Verb). The desired outcome (e.g., sorting the nouns and verbs correctly) would be reinforced by positive or negative stimuli (e.g., M&Ms or detentions) until the behavior was learned. For a while, it looked as though Behaviorism would remain the dominant paradigm until the trump of doom. 

In the late 1950s and early 1960s, however, psychologists were suddenly confronted by anomalies that could not be accounted for by the Behaviorists’ stimulus-response model. In his devastating review of Skinner’s book Verbal Behavior, the linguist Noam Chomsky pointed out the inability of Behaviorism to account for certain aspects of language, such as the creation of utterances that are completely original and yet completely grammatical. People do not simply parrot back what they have heard. Instead, they make up entirely new sentences, ones that have never before occurred in the history of language use, based on internalized structures and rules. For example, it is highly likely that no one has ever before uttered this sentence: 

    Aardvarks are rarely, if ever, invited to ceremonies in the Rose Garden.
Nonetheless, anyone who has internalized the rules of English knows what the sentence means. Furthermore, every normal child learns a spoken language rapidly and thoroughly, despite the fragmentary, willy-nilly nature of the stimuli to which he or she is exposed. Again, it makes sense to think in terms of an internal language organ, of  preexisting, innate linguistic structures in the brain that are triggered into action by exposure to speech. 

In another blow to Behaviorism, the respected psychologist Karl Lashley delivered a now-famous paper, “The Problem of Serial Order in Behavior,” pointing out that certain kinds of serial behaviors, such as improvisation by a jazz pianist, occur too quickly to be described as a chain of stimuli and responses. There simply isn’t time for the nerve signals to travel back and forth. Instead, the pianist must be “playing out,” in one direction, an internal schema, or model. 

Such observations were the death knell for Behaviorism, but it is likely that psychologists would still be shunning discussion of internal states if they had not had ready to hand an example of another kind of entity with internal states: the digital computer. It is possible, of course, to describe a computer in terms of its inputs (“stimuli”) and its outputs (“behaviors”), but doing so leaves out most of what is interesting about computers, namely the processing and storage that occur in between (a point made concrete in the toy sketch following the long quotation below). Computers gave psychologists an entirely new model for their science: the brain is the machine, the wetware, that takes care of processing and storage, and the mind is the program that runs on that wetware. In Steven Pinker’s words, 

    The mind is what the brain does; specifically, the brain processes information, and thinking is a kind of computation . . . . This insight, first expressed by the mathematician Alan Turing, the computer scientists Allen Newell, Herbert Simon, and Marvin Minsky, and the philosophers Hilary Putnam and Jerry Fodor, is now called the computational theory of mind. It is one of the great ideas in intellectual history, for it solves one of the puzzles that make up the “mind-body problem”: how to connect the ethereal world of meaning and intention, the stuff of our mental lives, with a physical hunk of matter like the brain. Why did Bill get on the bus? Because he wanted to visit his grandmother and knew the bus would take him there. No other answer will do. . . . [F]or millennia this has been a paradox. Entities like “wanting to visit one’s grandmother” and “knowing the bus goes to Grandma’s house” are colorless, odorless, and tasteless. But at the same time they are causes of physical events, as potent as any billiard ball clacking into another. 

    The computational theory of mind resolves the paradox. It says that beliefs and desires are information, incarnated as configurations of symbols. The symbols are physical states of bits of matter, like chips in a computer or neurons in the brain. They symbolize things in the world  because they are triggered by those things via our sense organs, and because of what they do once they are triggered. If the bits of matter that constitute a symbol are arranged to bump into the bits of matter constituting another symbol in just the right way, the symbols corresponding to one belief can give rise to new symbols corresponding to another belief logically related to it, which can give rise to symbols corresponding to other beliefs, and so on. Eventually the bits of matter constituting a symbol bump into bits of matter connected to the muscles, and behavior happens. The computational theory of mind thus allows us to keep beliefs and desires in our explanations of behavior while planting them squarely in the physical universe. It allows meaning to cause and be caused. (24-25)
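
The paragraph preceding these quotations claimed that an input-output description “leaves out most of what is interesting” about a computer. Here is a minimal sketch of that point, using a made-up toy machine rather than anything from Pinker: the very same stimulus produces different behaviors depending on an internal state that no stimulus-response table ever mentions. 

    # A toy machine with internal state. The same input ("press") yields
    # different outputs depending on what is stored inside, so no fixed
    # stimulus-response table can describe it. Illustrative sketch only.

    class Turnstile:
        def __init__(self):
            self.locked = True  # internal state, invisible from outside

        def press(self):
            """Identical stimulus each time; the response depends on state."""
            if self.locked:
                self.locked = False
                return "unlocks"
            self.locked = True
            return "rotates, then locks again"

    gate = Turnstile()
    print(gate.press())  # -> unlocks
    print(gate.press())  # -> rotates, then locks again (same input, new output)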

Another name for the computational theory of mind is cognitive science, a discipline now almost synonymous with psychology in America and much of the rest of the world but encompassing, as well, the sciences of neurology and artificial intelligence (with borrowings from mechanical engineering and evolutionary biology). In How the Mind Works, Pinker does an admirable job of explaining the cognitive science paradigm and of demonstrating its power by applying it to “dozens of mysteries of the mind, from mental images to romantic love, [that] have recently been upgraded to problems” (ix). Before looking at a few of the problems that Pinker addresses in his witty, capacious book, let’s examine in more detail just how the new paradigm differs from the old. 
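
Pinker’s image, in the long quotation above, of symbols “bumping into” one another is, in computational terms, rule-based inference. The following sketch is mine, not Pinker’s, and the beliefs and rules in it are invented for illustration; it shows how one configuration of symbols can mechanically give rise to another until behavior results. 

    # Forward-chaining over symbols: beliefs trigger rules, rules add new
    # belief symbols, and one symbol finally connects to action. The
    # beliefs and rules are invented for illustration.

    beliefs = {"wants to visit grandma", "bus goes to grandma's house"}

    rules = [
        ({"wants to visit grandma", "bus goes to grandma's house"},
         "should take the bus"),
        ({"should take the bus"}, "ACTION: walk to the bus stop"),
    ]

    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= beliefs and conclusion not in beliefs:
                beliefs.add(conclusion)  # old symbols give rise to new ones
                changed = True

    print([b for b in beliefs if b.startswith("ACTION")])
    # -> ['ACTION: walk to the bus stop']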

A New View of Brains and Minds

Cognitive science views the brain as a processor. The machine is no monolith. It is made up of many modules designed for specific purposes. As Pinker puts it, the brain 
    is a system of organs of computation, designed by natural selection to solve the kinds of problems our ancestors faced in their foraging way of life, in particular, understanding and outmaneuvering objects, animals, plants, and other people. . . . On this view, psychology is engineering in reverse. In forward-engineering [what evolution did when it created the brain], one designs a machine to do something; in reverse-engineering, one figures out what a machine was designed to do. (21)
Three key points here deserve attention. First, the brain does not simply take in information through the senses and store it. Instead, it processes the information in ways that depend on the design of the processor. For example, light rays that enter the lenses of the eyes are projected onto the retinas upside down. However, we experience the world as being right side up. The brain flips the image as it processes the information. If you wear a pair of glasses specially designed to invert the images, you will at first experience the disorientation of trying to move about in an upside-down world. After a few days, however, your brain will rewire itself and, though the inputs remain reversed, will deliver to your conscious experience a world turned right side up again. Take the glasses off, and you will again experience disorientation until your brain reprograms itself. The point is that nothing is experienced as it is in itself but rather as it is after a certain kind of processing that the brain was designed by natural selection to perform. 
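
As a loose illustration (mine, not the book’s), the “flip” is the sort of fixed transformation a processor can apply to every input before the rest of the system sees it: 

    # The image on the retina arrives inverted; a fixed processing stage
    # re-inverts every input before it reaches the rest of the system.
    # A toy 3x3 "image," invented for illustration.

    retinal_image = [
        ["grass", "grass", "grass"],  # upside down: ground at the top
        ["tree",  "tree",  "tree"],
        ["sky",   "sky",   "sky"],
    ]

    def correct(image):
        """Flip the rows top to bottom: one fixed operation, any input."""
        return image[::-1]

    for row in correct(retinal_image):
        print(row)  # sky first, grass last: the world as experienced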

Second, the brain is made up of interconnected modules, each with its own functions. Much as a personal computer can contain a separate math co-processor dedicated to arithmetic operations, so the brain contains separate, dedicated processors for such activities as recognizing faces (a fact demonstrated, Pinker points out, by brain lesions that cause prosopagnosia, an inability to distinguish one face from another that leaves other kinds of pattern recognition intact). 

Third, as one would expect of a machine, the modules of the brain were designed, though not, according to standard cognitive science theory, by an intelligent designer; they were designed by the blind forces of natural selection. One would expect, on this theory, that human brains would be very good at the kinds of processing that were valuable to their foraging ancestors and very bad at the kinds that were not. This prediction is borne out by thousands of observations. For example, as Pinker points out, people without specific training are very bad at solving abstract problems in logic. However, if an abstract logical problem is couched in terms of contractual rules, in which making the statement false is equivalent to cheating, then the same people who could not solve the problem before are able to solve it with ease. That is because the ability to solve abstract logical problems carried no selective value in the past, but the ability to recognize a cheater (someone who took more than his or her share of food, say, or an unfaithful spouse) did. 
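
The experiments alluded to here are versions of the Wason selection task. A small sketch (the card faces below are standard textbook examples, paraphrased, not taken from the review) shows that the abstract rule and the cheater-detection rule share the identical logical form, “if P then Q”: in both, only a card showing P or a card showing not-Q could falsify the rule. 

    # Wason selection task: to test "if P then Q," turn over only the
    # cards that could falsify the rule (those showing P or not-Q).
    # Card faces below are standard textbook examples, not from the text.

    # Abstract version: "If a card has a vowel on one side, it has an
    # even number on the other side."
    def could_falsify_abstract(face):
        if face.isalpha():
            return face in "AEIOU"   # P is showing; the back might violate Q
        return int(face) % 2 == 1    # not-Q is showing; the back might be P

    print([c for c in ["E", "K", "4", "7"] if could_falsify_abstract(c)])
    # -> ['E', '7']

    # Cheater-detection version: "If a person is drinking beer, he or she
    # must be over 21." Logically identical, but far easier for people.
    def could_falsify_bar(face):
        if face.isalpha():
            return face == "beer"    # P is showing
        return int(face) < 21        # not-Q is showing

    print([c for c in ["beer", "cola", "25", "16"] if could_falsify_bar(c)])
    # -> ['beer', '16']

The machinery is identical in the two versions; only the content differs, which is exactly the essay’s point about what the brain’s processors were selected to do. 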

To summarize, the brain processes information, different parts of the brain are specialized for particular processing, and the processors that exist in the brain were designed to meet the needs of our evolutionary ancestors. 

Applications of the Cognitive View

One of the virtues of Pinker’s book is its wide-ranging coverage of many, many issues in psychology. Pinker tackles hundreds of questions (Why do people like parks and waterfront property? Are emotions universal across cultures? Why has instruction in phonics been more successful than whole-language instruction? How do 3-D stereograms work? What causes the psychological disorder known as autism, in which the sufferer seems blind to other minds and so treats people almost as if they were objects? What are the relative roles of nature and nurture in the development of human abilities such as intelligence? Does a faculty called “general intelligence” exist? Why do people fall in love?) and shows how these can be answered within the cognitive science paradigm. Let’s consider two of Pinker’s examples. 

For years, people have debated the relative role of nurture and nature in human development. People on the left, from liberals to Marxists, tend to favor explanations related to nurture. Given the right environment, anyone can be made into a monster or a saint, a dullard or a genius. People on the right, from conservatives to Fascists, tend to favor explanations related to nature. There are natural inequalities among people, life is a struggle to survive, and those with money and power are simply the ones who are best adapted and therefore deserve what they have. To Pinker and other cognitive scientists, the statement that people’s abilities depend upon a complex interaction between nature and nurture is downright silly. Pinker makes the point by having us consider an analogous statement: 

    The behavior of a computer comes from a complex interaction between the processor and input. (33)
To Pinker, the statement is “true but useless—so blankly uncomprehending, so defiantly incurious” (33). One cannot have one without the other. Pinker would argue that the nature people and the nurture people are both right, but that the observation is trivial. Learning takes place because there are modules in the brain designed to do the learning and because there are experiences that trigger those modules. The modules themselves are universal, with slight variations among individuals, but someone who has an exceptional module for one ability (for example, discriminating between shades of a particular color) is statistically likely to have mediocre modules for other abilities (such as intuiting probabilities from observed frequencies). Pinker’s observation has implications, as well, for concepts of intelligence, which he defines as “the ability to attain goals in the face of obstacles by means of decisions based on rational (truth-obeying) rules” (62). If the brain consists of modules for computation, and if these modules vary in precision within individuals, then it makes no sense to talk of general intelligence but only of hundreds (perhaps thousands) of tiny, individual intelligences. 

Cognitive science not only helps us make sense of ancient, grandiose conundrums such as the nature of intelligence and the relative importance of nature and nurture, but also has many, many practical applications. Consider, for example, the teaching of reading. In the 1960s and 1970s, linguists working within a cognitive science framework demonstrated conclusively that brains contain circuitry specifically designed for learning spoken language. A child does not learn the rules of a spoken language by imitation or by dedicated instruction. No one teaches the average English-speaking child the rules governing the order of precedence of adjectives, yet a six-year-old can tell you that “the little, green VW microbus” sounds right but that “the VW green little microbus” sounds wrong. The brain comes equipped with circuitry specifying the abstract features of possible languages, and the child learns a particular language when what he or she hears fills in the slots in the abstract design. For example, in English an object typically comes after a verb: 

    The boy kissed the girl.
In Latin, the object can come either before or after the subject, because it is the case ending, not the position in the sentence, that marks it as the object: 
    Puellam puer osculatur. [The girl (object) the boy (subject) kisses.] 
    Puer puellam osculatur. [The boy (subject) the girl (object) kisses.]
The positioning of the subject and object is a learned feature, but the fact that sentences have subjects and objects is hardwired into the brain and does not have to be learned, as the toy sketch below illustrates. 
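
One way to picture “filling in the slots in the abstract design” is as setting a parameter in a fixed template. The following sketch is in no way a serious model of grammar; the template (every sentence has a subject, a verb, and an object) is assumed given in advance, and hearing a single parsed sentence fixes the ordering parameter. 

    # Toy "principles and parameters": the template is innate; only the
    # ordering of its slots is learned from heard speech. A drastic
    # simplification, invented for illustration.

    TEMPLATES = {
        "SVO": ["subject", "verb", "object"],  # the English setting
        "SOV": ["subject", "object", "verb"],  # a typical Latin setting
    }

    def infer_order(heard, roles):
        """Read the ordering parameter off one parsed sentence."""
        slots = sorted(roles, key=lambda r: heard.index(roles[r]))
        return "".join(slot[0].upper() for slot in slots)

    heard = ["puer", "puellam", "osculatur"]
    roles = {"subject": "puer", "object": "puellam", "verb": "osculatur"}

    setting = infer_order(heard, roles)
    print(setting, TEMPLATES[setting])  # -> SOV ['subject', 'object', 'verb']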

In the 1980s, educators in the United States, excited about the linguists’ discovery that language learning is innate, proposed a new approach to reading instruction. In the past, teachers had laboriously drilled students in sound-symbol correspondences, using a technique known as phonics instruction. Students were systematically taught to recognize the written symbols corresponding to sounds in English: cat, bat, mat, sat, rat, fat, hat, and so on. Half understanding what the linguists were saying, the educators decided that they could dispense with phonics and instead simply expose children to stories. Students’ innate language-learning mechanisms would take care of teaching them to read, as long as the stories they were exposed to were interesting enough to engage their attention. This so-called “whole language” approach proved a dismal failure, and cognitive science has a ready explanation. What the educators didn’t realize is Pinker’s point that the processing modules that exist in the brain are those that evolved to meet past purposes. The brain evolved specific processors for intuiting the grammatical, phonological, and semantic features of spoken language, but writing is a recent cultural invention. There simply hasn’t been enough time, evolutionarily speaking, for dedicated processors to develop for mapping written symbols onto spoken ones. 

The moral is clear: education should step in to meet needs not already provided for by our neural machinery. People have the neural machinery to learn the grammatical difference between who (subject) and whom (object) simply by being exposed to enough examples of so-called “proper” usage, without specific instruction. However, they do not have inborn, hardwired machinery for learning that one particular set of squiggles on a piece of paper stands for the sound who (hoo) and another stands for the sound how (hou). The interpretation of the squiggles has to be taught. In Pinker’s words, 

    [T]he insight that language is a naturally developing human instinct has been garbled into the evolutionarily improbable claim that reading is a naturally developing human instinct. Old-fashioned practice at connecting letters to sounds is replaced by immersion in a text-rich social environment, and the children don’t learn to read. Without an understanding of what the mind was designed to do in the environment in which we evolved, the unnatural activity called formal education is unlikely to succeed. (342)
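
What phonics drills supply, in effect, is an explicit decoding table that the brain has no evolved machinery to induce on its own. Here is a toy sketch of such a table, drastically simplified from real English spelling and invented purely for illustration: 

    # Phonics in miniature: an explicitly taught table of letter-to-sound
    # correspondences, applied left to right. Real English spelling is far
    # messier; this mapping is a toy, for illustration only.

    LETTER_TO_SOUND = {
        "c": "k", "b": "b", "m": "m", "s": "s",
        "r": "r", "f": "f", "h": "h", "a": "a", "t": "t",
    }

    def sound_out(word):
        """Decode a word one letter at a time, as a phonics drill teaches."""
        return "-".join(LETTER_TO_SOUND[ch] for ch in word)

    for word in ["cat", "bat", "mat", "sat"]:
        print(word, "->", sound_out(word))  # e.g., cat -> k-a-t
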
There is, perhaps, no other single work on cognitive science as all-encompassing, as clear, as engrossing, and as accessible as Pinker’s book. Pinker, who directs the Center for Cognitive Neuroscience at the Massachusetts Institute of Technology, does not claim that cognitive science has resolved all the mysteries of the mind (particularly intractable is the central mystery of consciousness), but he does show how the new paradigm can be used to search for answers. L. Ron Hubbard’s book Dianetics, the pseudo-scientific central text of the religious cult that calls itself Scientology, is often advertised as “An Owner’s Manual for the Human Mind.” If one could actually make such a claim about any book, it would be Pinker’s. Everyone who has a mind will benefit from reading Pinker’s book and from contemplating the light that the analogy between computers and brains throws on the problems and glories of being human. 
 

References

Chomsky, Noam. “Review of B. F. Skinner’s Verbal Behavior.” Language 35 (1959): 26-58. 

Kuhn, Thomas S. The Structure of Scientific Revolutions. Chicago: University of Chicago Press, 1962. 

Pinker, Steven. How the Mind Works. New York: Norton, 1997. 

 
 
Questions for Discussion and Review 

The following questions are based on the preceding text. 

1. According to Aristotle, what causes a ball, once set in motion, to come, eventually, to a stop? 

2. What new paradigm for explaining motion was advanced by Isaac Newton? 

3. What causes the development of new paradigms in science? 

4. What is the name of the new paradigm in psychology that replaced the Behaviorist model? 

5. According to John Watson, the founder of Behaviorism, what is the proper subject of study for psychologists? 

6. What did Behaviorists think of discussions of consciousness and other mental phenomena? 

7. What observations about language and about rapid serial behaviors such as improvisation led to a reevaluation of the Behaviorist dogma? 

8. What other device with internal states provided cognitive scientists with a model for what happens inside the human brain? 

9. What definition does Steven Pinker give for the word "mind," and how is this definition related to computing? 

10. What two names are commonly given to the scientific theory that replaced Behaviorism as the dominant paradigm in psychology? 

11. How does the flipping of the images that fall on the retina show that the brain actively processes incoming information? 

12. What is prosopagnosia, and what evidence does this condition provide for the contention that the brain contains separate processors dedicated to particular tasks? 

13. People's logical reasoning abilities dramatically improve when logic problems are couched in terms of contractual rules. In what way does this fact provide evidence for the idea that the processors in our brain were designed by an evolutionary process? 

14. What consequences does the modular, computational theory of the brain/mind have for the concept of general intelligence? 

15. What explanation can cognitive science offer for the failure of whole-language approaches to the teaching of reading? 
 

 
