08-21-2005, 11:33 PM
If it looks like I'm on a roll, I am. These essays were first written by me on my personal website. I never did get much feedback on them; I'm hoping this community will be different.
This is about whether or not computers can think. The philosopher John Searle invented this argument. Imagine a guy in a room with two slits for pushing paper through and one huge book filled with Chinese characters.
http://kybele.psych.cornell.edu/~edelman/Psych-214-Fall-2002/Chinese-room.jpg
A Chinese speaker writes a question in Chinese on paper and pushes it through one slit. The guy in the room takes the character (or characters), finds the character(s) in the book under Input, reads what the arrow leads to in the Output column, writes the symbol(s) on a separate sheet, and pushes it out the second slit.
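The procedure the guy follows is pure symbol matching, nothing more. A minimal sketch in Python, where the rulebook is just a lookup table (the question/answer pairs here are invented for illustration, not from any real rulebook):

```python
# A toy "rulebook": maps input symbols to output symbols.
# The pairs are made up; the point is that lookup attaches no meaning.
RULEBOOK = {
    "你好吗？": "我很好。",
    "你叫什么名字？": "我没有名字。",
}

def room(question: str) -> str:
    """Return the rulebook's output for the given input symbols.
    The function matches shapes of symbols; it understands nothing."""
    return RULEBOOK.get(question, "？")  # unknown input -> placeholder symbol

print(room("你好吗？"))
```

From the outside the room "answers" in Chinese, but inside there's only `dict` lookup, which is exactly the intuition the thought experiment trades on.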
From the outside, what looks to have happened is that a man has asked the room a question and the room has responded, in Chinese. Would you say that the guy in the room really understands Chinese? No, that's obvious.
Now, what Searle argues is that this is exactly what happens in a computer. The computer processor never understands the computer language it manipulates, in the same way that the guy in the room never understands Chinese. A computer's essential nature is syntactic manipulation. There is no understanding of the symbols the program pushes around; there's just the manipulation of symbols (or syntax, the ordering of symbols in a sentence). The essence of the human brain, on the other hand, is semantic interpretation. We see the symbols and then decide what they represent. There's something above the mere manipulation of symbols in the human brain.
So, Searle's basic argument is this: since computer programs are syntactic manipulation, and more of the same will not produce semantic interpretation, it follows that computers will never have intelligence. Now, there's a rebuttal to this. Daniel Dennett, who wrote Consciousness Explained, denies the second part of that claim: that more of the same will just produce more of the same. Think of individual neurons. They can easily be mapped out as an off/on schematic, much like a computer program's manipulation of symbols. Now, the firing of neurons is obviously the important thing with regard to our mind (no neuron firing, no mental life). Since mentality comes from this neuronal network, and since the individual neuronal map can be fully represented in a computer program, it follows that computers will eventually have the capacity to think like humans.
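That "off/on schematic" idea has a classic formalization: the McCulloch-Pitts threshold neuron, where firing is nothing but arithmetic on 0/1 inputs. A rough sketch (the weights and threshold here are just illustrative):

```python
# A McCulloch-Pitts style threshold neuron: it "fires" (outputs 1)
# when the weighted sum of its 0/1 inputs reaches the threshold.
def neuron(inputs, weights, threshold):
    """Return 1 if the weighted sum of inputs meets the threshold, else 0."""
    total = sum(i * w for i, w in zip(inputs, weights))
    return 1 if total >= threshold else 0

# An AND-like neuron: fires only when both inputs are on.
print(neuron([1, 1], [1, 1], 2))  # fires -> 1
print(neuron([1, 0], [1, 1], 2))  # doesn't fire -> 0
```

Dennett's bet, roughly, is that enough of these trivially-simulable units, wired up the way a brain is, gets you a mind; Searle's bet is that it never does.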
This is the battle line between the two and, for the life of me, I really don't see a compromise. Searle and Dennett combat each other by essentially denying the other's premise and taking that as proof of their argument. Here's some food for thought for both sides:
Awkward for Searle: Just what is it about the brain, then, that causes mentality? Dennett snidely remarked that Searle must think the brain somehow "secretes" mentality and consciousness. Searle was greatly offended by that, but after reading some of his writing I've found that Searle doesn't get much beyond that stereotype. Mentality is "biologically based". What does that mean? Your guess is as good as mine.
Awkward for Dennett: Let's go back to the Chinese Room. According to one line of thought advocated by Dennett, there is an entity there that understands Chinese: the room itself. Yeah, the room. It's a system, with inputs and outputs just like a computer program (the material nature of the program shouldn't count much [maybe I'll get into that in another post]). So, it looks like the room is actually thinking Chinese. o_O