First appeared in the Boston Globe
By Harvey Blume
"IN PHILOSOPHY," JOHN SEARLE told me, "the name of the game is disagreement." Searle, who has taught philosophy at the University of California, Berkeley, since 1959, shows no inclination to duck dispute. In The New York Review of Books, for example, where he functions as a sort of philosopher in residence, you can regularly find him at fierce loggerheads with a variety of contemporary thinkers -- including Steven Pinker, Daniel Dennett, Ray Kurzweil, and Noam Chomsky -- over questions of mind, consciousness, and language.
Some of Searle's most serious intellectual brawls are over the question of whether the mind can be construed as a computer. He has argued strenuously that we need to resist the temptation to think of mental processes in terms of computation. "Defined as it is," he said, "by the manipulation of zeroes and ones," the computer model can tell us nothing about how our brains produce mind, consciousness, and a sense of self.
Searle's forte in these battles and in his 16 books, including his new "Freedom and Neurobiology: Reflections on Free Will, Language, and Political Power" (Columbia), is his determination to see how science reformulates traditional questions of philosophy. In some cases -- but "unfortunately," he said, "not many" -- science succeeds in putting a vexed question to rest: The discovery of the double helix, for example, and other advances in molecular biology, have made it unnecessary, he believes, to call on an "élan vital," or some other mystical force, to explain how life works.
But Searle recognizes that science can also deepen mystery, as it has with regard to free will. The more we know about a universe consisting, he writes, "entirely of mindless, meaningless...fields of force," the harder it is to justify the conception we have of "ourselves as conscious, mindful, free" -- unshakable as that self-conception might be. Searle does not, however, conclude that free will is an illusion. He maintains, instead, that at least for now, we are stuck with a paradox.
I reached John Searle by phone at his office in Berkeley.
IDEAS: You think that questions about the mind are at the core of philosophy today, don't you?
SEARLE: Right. And that's a big change. If you go back to the 17th century, and Descartes, skepticism -- the question of how it is possible to have knowledge -- was a live issue for philosophy. That put epistemology -- the theory of knowledge -- at the heart of philosophy. How can we know? Shouldn't we seek a foundation for knowledge that overcomes skeptical doubts about it? As recently as a hundred years ago, the central question was still about knowledge. But now, the center of philosophical debate is philosophy of mind.
IDEAS: Why the change?
SEARLE: We know too much. The sheer volume of knowledge has become overwhelming. We take basic findings from physics and chemistry about the universe for granted. Knowing much more about the real world than our ancestors did, we can't take skepticism seriously in the old way. It also means that philosophy has to proceed on the basis of all that we know.
The universe consists of matter, and systems defined by causal relations. We know that. So we go on to ask: To what extent can we render our self-conception consistent with this knowledge? How can there be consciousness, free will, rationality, language, political organization, ethics, aesthetics, personal identity, moral responsibility? These are questions for the philosophy of mind.
IDEAS: You call yourself a biological naturalist, and argue that there is a physical underpinning to consciousness.
SEARLE: The question of how it is possible for consciousness to exist in a world made entirely of physical particles is being transformed into a scientific question, much like any other. It's like the question that bothered our great-grandparents, namely how could inert matter be alive, how could life exist, in what is, after all, a bunch of chemicals. Now we have a much richer conception of biochemistry. We don't know all the details, but nobody can feel passionately today that you can't give a biochemical account of life.
How does the brain produce weird states -- consciousness, subjectivity, qualitativeness? That will receive a neurobiological solution. There's a lot of work on it now.
IDEAS: You've argued that no matter what science says, we're inclined to think of ourselves as free.
SEARLE: It isn't just that we're inclined. It's worse than that. You cannot escape the presupposition of free will. When you and I talk, or we order in a restaurant, or vote, we can only do these things on the supposition that we have a choice. We can't think away our own freedom.
IDEAS: Why do so many people find it appealing to think of the mind as a kind of computer?
SEARLE: People have always tried to find a mechanical analogy for the brain. I've come across a passage saying the brain is a telegraph system. Before that people said the brain was a Jacquard loom. In my childhood, people used to say the brain was a telephone system. It was inevitable they'd say the brain is a digital computer.
IDEAS: But you yourself maintain that the brain is a machine.
SEARLE: The brain is a machine, which by means of energy transfers causes and sustains consciousness.
Consciousness consists of private, subjective, qualitative states of feeling and awareness, starting when you wake from a dreamless sleep and continuing until you go back to sleep.
We don't yet know how the brain causes that. Maybe there's no reason why you couldn't produce consciousness in nonbiological phenomena.
IDEAS: Do you really think we can build a machine that has a full grasp of natural language, and has authentic consciousness?
SEARLE: That's a factual question, not a question you can answer by philosophical analysis. My point is that the ability to manipulate symbols, which is what today's computers do, is not the same as the ability to have consciousness.
IDEAS: Wait. You're saying machines can have consciousness. And our brains are machines that have consciousness. Well, a computer is a machine. Why can't it have consciousness?
SEARLE: Sure, the computer you buy in a store is a machine. But computation is not a machine process. Computation is not defined by energy transfer.
People think I'm saying the computer is too much of a machine to be capable of consciousness. That's exactly wrong. I'm saying it's not enough of a machine.
IDEAS: That's tricky.
SEARLE: Actually, it's ludicrously simple. Minds are defined by the possession of mental phenomena -- consciousness, intentionality. Computer operations are defined syntactically, in terms of formal symbol manipulation. And that's neither sufficient by itself for, nor constitutive of, consciousness.
The funny thing is that in all these years nobody's got that point.
IDEAS: Give me an example of the kind of question science doesn't help philosophy answer.
SEARLE: I'll give half a dozen examples. There are the questions that bothered the Greeks: What is the nature of a just society? What is the most satisfying form of life? What are the forms of love? There are questions about ethics and aesthetics. These may have a scientific base, but they're not scientific questions.
IDEAS: Given that many contemporary philosophers agree about the importance of science, why is disagreement among them often so vehement?
SEARLE: I don't worry too much about the fact that philosophers disagree.
Harvey Blume is a writer based in Cambridge. His interviews appear regularly in Ideas. E-mail hblume at world.std.com.