The Turing Test:

Simulating Intelligence

Alan Turing (1912-1954)
 The Turing machine and the mathematization
of the notion of a computable function
 The halting problem
 Colossus & breaking the Enigma code
 ACE (Automatic Computing Engine): England’s first
large-scale general-purpose computing machine
 Developed an influential mathematical model
of embryological development

Hodges, Andrew (1983). Alan Turing: The Enigma. Simon and Schuster, New York.

Computing Machinery & Intelligence (1950)

“I PROPOSE to consider the question, 'Can
machines think?' This should begin with
definitions of the meaning of the terms
'machine' and 'think'.”

“Think”: The Imitation Game
 “It is played with three people, a man (A), a woman (B),
and an interrogator (C)…. The interrogator stays in a room
apart from the other two.”
 “The object of the game for the interrogator is to
determine which of the other two is the man and which is
the woman.”
 “The interrogator is allowed to put questions to A and
B….”
 “It is A's object in the game to try and cause C to make the
wrong identification….”
 “ In order that tones of voice may not help the interrogator
the answers should be … typewritten.”

The Imitation Game II
 “We now ask the question, 'What will happen
when a machine takes the part of A in this
game?' Will the interrogator decide wrongly
as often when the game is played like this as
he does when the game is played between a
man and a woman? These questions replace
our original, 'Can machines think?'”

“Machine”: An Electronic Computer
 “The question which we put in § 1 will not be
quite definite until we have specified what we
mean by the word 'machine'.”
 “… the present interest in 'thinking machines'
has been aroused by a particular kind of
machine, usually called an 'electronic
computer' or 'digital computer'. Following this
suggestion we only permit digital computers
to take part in our game.”

Summary of Turing’s Test
Can an appropriately programmed digital computer
consistently deceive a critical observer, given that:

1. The observer is free to ask the computer any question;
2. The machine is free to lie;
3. The observer can distinguish the machine from a human only at chance.
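One trial of this protocol can be sketched as follows (a minimal illustration of my own, not from Turing; machine_answers, human_answers, and interrogate are hypothetical callables a caller would supply):

import random

def run_session(machine_answers, human_answers, interrogate, questions):
    # machine_answers / human_answers: question -> typewritten reply
    # (the machine is free to lie); interrogate: transcript -> label
    # ("X" or "Y") of the respondent suspected to be the machine.
    labels = ["X", "Y"]
    random.shuffle(labels)                              # hide which label is the machine
    respondents = dict(zip(labels, [machine_answers, human_answers]))
    transcript = [(q, {lab: respondents[lab](q) for lab in labels})
                  for q in questions]                   # the observer may ask anything
    guess = interrogate(transcript)
    return respondents[guess] is machine_answers        # True = machine caught

# The machine "passes" if, over many sessions, the fraction of True results
# stays near 0.5, i.e. the observer identifies it no better than chance.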

The Turing Test: A Sufficient Test
“May not machines carry out something
which ought to be described as thinking but
which is very different from what a man does?
This objection is a very strong one, but at
least we can say that if, nevertheless, a
machine can be constructed to play the
imitation game satisfactorily, we need not be
troubled by this objection.”

A Key Implicit Claim

A perfect simulation of intelligence would be
indistinguishable from the real thing, so we would
have no reason to say that the simulation is not
intelligent.

What Is a Computing Machine?
Motivation: A human “computer” doing arithmetic, e.g.,
adding two large numbers.

The human computer:
 Follows fixed rules, “stored in a book, altered when he
is put on to a new job.”
 Has an unlimited supply of paper.
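For concreteness, the rule-following addition Turing has in mind can be sketched like this (my own illustration in Python, not from the paper; the final check uses the specimen question “Add 34957 to 70764” that Turing's interrogator asks):

def add_by_rulebook(x, y):
    # Work from the least significant digit, applying one fixed carry rule,
    # writing intermediate results on the "paper" (a Python list).
    a, b = str(x)[::-1], str(y)[::-1]
    paper, carry = [], 0
    for i in range(max(len(a), len(b))):
        d1 = int(a[i]) if i < len(a) else 0
        d2 = int(b[i]) if i < len(b) else 0
        carry, digit = divmod(d1 + d2 + carry, 10)   # the rule from the "book"
        paper.append(str(digit))
    if carry:
        paper.append(str(carry))
    return int("".join(reversed(paper)))

print(add_by_rulebook(34957, 70764))   # 105721, the sum Turing's interrogator asks for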

A Computing Machine Consists of
 An executive unit, which carries out a fixed
set of simple rules.
 A store, which is used as a “notepad.”
• To write down the results of its calculations.
• To remember which rules to use in which
order.
 A control, which makes sure that the
instructions are carried out correctly and in
the right order.
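A toy version of this three-part design (my own illustration; the instruction names are invented) looks like this in Python:

def run(program, store):
    # store: the "notepad" holding named cells; program: a list of instructions,
    # each naming one of the fixed simple rules the executive unit can apply.
    pc = 0                                        # the control: which rule comes next
    while pc < len(program):
        op, *args = program[pc]
        if op == "SET":                           # executive unit's fixed rules
            store[args[0]] = args[1]
        elif op == "ADD":
            store[args[0]] += store[args[1]]
        elif op == "JUMP_IF_ZERO":                # the store can decide what happens next
            if store[args[0]] == 0:
                pc = args[1]
                continue
        else:
            raise ValueError(f"unknown operation: {op}")
        pc += 1
    return store

# Example: write 2 and 3 into the store, then add them into cell "r0".
print(run([("SET", "r0", 2), ("SET", "r1", 3), ("ADD", "r0", "r1")], {}))   # {'r0': 5, 'r1': 3}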

Turing’s Question:
Given that this model can simulate ANY digital
computer (!!),
 Is human intelligence inside the set of
computable functions, that is, the set of
functions that can be computed by an
algorithm?

Will Computers Pass the Turing Test?
Turing’s belief: in about 50 years (last year!!) it
will be possible to program computers
 With a storage capacity of 10⁹ bits (about 125
megabytes)
 Such that the average interrogator misidentifies the
machine at least 30% of the time
 After 5 minutes of questioning

Potential Objections Raised by Turing

The Theological Objection
“Thinking is a function of man’s immortal soul. God has
given an immortal soul to every man and woman, but
not to any other animal or to machines. Hence no
animal or machine can think.”

 “Should we not believe that He has the freedom to
confer a soul on an elephant if He sees fit?”
 “We are in either case [constructing machines or
procreation] instruments of His will providing
mansions for the souls that He creates.”
 “Such [theological] arguments have been found
unsatisfactory in the past.”

The “Heads in the Sand” Objection
“The consequences of machines thinking would
be too dreadful. Let us hope and believe that
they cannot do so.”

 Alas….

The Mathematical Objection
Gödel’s theorem “states that there are certain things
that [a digital] machine cannot do. If it is rigged up to
give answers to questions as in the imitation game,
there will be some questions to which it will either
give a wrong answer, or fail to give an answer at
all…”

 “It has only been stated, without any sort of proof,
that no such limitations apply to the human intellect.”

The Argument from Consciousness
“Not until a machine can write a sonnet or compose a
concerto because of thoughts and emotions felt, and
not by the chance fall of symbols, could we agree
that machine equals brain… No mechanism could
feel … pleasure at its successes, grief when its
valves fuse, be warmed by flattery, be made
miserable by its mistakes, be charmed by sex, be
angry or depressed when it cannot get what it wants”
 By this argument, “the only way by which one
could be sure that a machine thinks is to be the
machine and to feel oneself thinking.”

Arguments from Various Disabilities
“… you will never be able to make one to do X.” where
X can be: make mistakes, enjoy strawberries and
cream, do something novel, fall in love, make
someone fall in love with it, tell right from wrong…

For each X, we are faced with an analytical problem: is it
really true that X lies beyond our power to simulate with an
algorithm? As it stands, this objection simply asserts that
something is impossible without offering any proof.

Lady Lovelace’s Objection
“The Analytical Engine has no pretensions to
originate anything. It can do whatever we know
how to order it to perform.” (1842)
 One reading: machines will not surprise us. But
“Machines take me by surprise with great
frequency.”
 This is due to “the [false] assumption that as
soon as a fact is presented to the mind all
consequences of that fact spring into the mind
simultaneously with it.”
 Otherwise, Good Point!!! Turing presents a
framework for machine learning that is still with
us today.

Argument from Continuity in the Nervous System
“The nervous system is not a discrete-state machine
[which the computing machine surely is].” Because it
has continuous states, a discrete-state machine
cannot simulate a nervous system.
 This presupposes that the nervous system is capable
of making infinitely fine distinctions, which is false.
We can digitize all sorts of continuous phenomena
(witness CDs and DVDs); why can’t we similarly
digitize the signals of the nervous system?
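The reply can be illustrated with a digitization sketch (my own, assuming NumPy is available): sample the continuous signal finely and quantize it to a fixed number of levels, and the discrete copy tracks the original more closely than an observer of limited precision could detect.

import numpy as np

t = np.linspace(0.0, 1.0, 1000, endpoint=False)        # 1000 samples over one second
signal = np.sin(2 * np.pi * 5.0 * t)                    # a "continuous" 5 Hz signal
levels = 2 ** 8                                         # 8 bits per sample (audio CDs use 16)
codes = np.round((signal + 1.0) / 2.0 * (levels - 1))   # map [-1, 1] onto {0, ..., 255}
reconstructed = codes / (levels - 1) * 2.0 - 1.0        # decode back to [-1, 1]
print(np.max(np.abs(signal - reconstructed)))           # worst-case error stays below 0.004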

Argument from Informality of Behavior
“It is not possible to produce a set of rules purporting to
describe what a man should do in every conceivable
set of circumstances…To attempt to provide rules of
conduct to cover every eventuality…appears to be
impossible.”
 The argument seems to be “If each person had a
definite set of rules of conduct by which his or her life
is regulated, then people would be no better than
machines. There are no such rules, so people are
not machines.”
No! This commits the fallacy of denying the antecedent. Compare:
If it is snowing, then I will be cold. It isn’t snowing, so I’m not cold.
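Schematically (a standard propositional-logic rendering, not from the slides), write P for “a definite set of rules regulates each person's conduct” and Q for “people are no better than machines”. The objection then has the invalid form

\[ (P \to Q),\ \neg P \ \not\vdash\ \neg Q \]

The premises are jointly satisfiable with P false and Q true, so they cannot establish \(\neg Q\); this is exactly the mistake in the snow example.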

Argument from ESP (!!)
“Unfortunately the statistical evidence, at least for
telepathy, is overwhelming.” A telepath might, for
example, guess what card someone is holding more
often than a machine could.
 “To put the competitors into a ‘telepathy-proof room’
would satisfy all requirements.”

ELIZA
 Written by Weizenbaum in 1966
 Simulated a non-directive psychotherapist
 Sample Dialogue:

Men are all alike.
IN WHAT WAY?
They're always bugging us about something or other.
CAN YOU THINK OF A SPECIFIC EXAMPLE?
Well, my boyfriend made me come here.
YOUR BOYFRIEND MADE YOU COME HERE
He says I'm depressed much of the time.
I AM SORRY TO HEAR YOU ARE DEPRESSED
It's true. I am unhappy

ELIZA …
DO YOU THINK COMING HERE WILL HELP YOU NOT TO BE
UNHAPPY
I need some help, that much seems certain.
WHAT WOULD IT MEAN TO YOU IF YOU GOT SOME HELP
Perhaps I could learn to get along with my mother.
TELL ME MORE ABOUT YOUR FAMILY
My mother takes care of me.
WHO ELSE IN YOUR FAMILY TAKES CARE OF YOU
My father.
YOUR FATHER
You are like my father in some ways.
WHAT RESEMBLANCE DO YOU SEE
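The exchange above runs on keyword spotting, pronoun reflection, and canned templates rather than on understanding. A minimal sketch in that spirit (my own simplification in Python; these rules are invented and are not Weizenbaum's actual script):

import re

REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are", "you": "I"}

RULES = [  # (regular expression, response template), tried in order
    (r".*\bmother\b.*", "TELL ME MORE ABOUT YOUR FAMILY"),
    (r"i am (.*)", "I AM SORRY TO HEAR YOU ARE {0}"),
    (r"you are (.*)", "WHAT MAKES YOU THINK I AM {0}"),
    (r".*", "CAN YOU THINK OF A SPECIFIC EXAMPLE?"),       # last-resort prompt
]

def reflect(text):
    # Swap first- and second-person words so the reply points back at the user.
    return " ".join(REFLECTIONS.get(word, word) for word in text.split())

def eliza_reply(utterance):
    cleaned = utterance.lower().strip(" .!?")
    for pattern, template in RULES:
        match = re.match(pattern, cleaned)
        if match:
            return template.format(*(reflect(g).upper() for g in match.groups()))
    return "PLEASE GO ON"

print(eliza_reply("I am unhappy"))                                        # I AM SORRY TO HEAR YOU ARE UNHAPPY
print(eliza_reply("Perhaps I could learn to get along with my mother."))  # TELL ME MORE ABOUT YOUR FAMILY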

Recent Objections
 Compiled by David J. Chalmers, Department of Philosophy,
University of Arizona, Tucson AZ 85721. E-mail:
chalmers@arizona.edu.
 Block, N. 1981. A look-up table could pass the Turing test, and
surely isn't intelligent. The TT errs in testing behavior and not
mechanisms.
 Moor, J. H. 1976. The basis of the Turing test is not an
operational definition of thinking, but rather an inference to the
best explanation
 Searle, J. R. 1980. Implementing a program is not sufficient for
mentality, as someone could e.g. implement a "Chinese-
speaking" program without understanding Chinese. So strong AI
is false, and no program is sufficient for consciousness.
 Maudlin, T. 1989. Computational state is not sufficient for
consciousness, as it can be instantiated by a mostly inert object.
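Block's look-up table objection above can be made concrete (a hypothetical sketch, not Block's own formulation): a machine whose every reply is fetched from a table keyed by the whole conversation so far can exhibit the right behavior while its mechanism is nothing but retrieval.

# For typewritten conversations of bounded length there are only finitely many
# possible histories, so in principle such a table exists (it would be
# astronomically large). The entries below are placeholders.
LOOKUP = {
    (): "Hello.",
    ("Hello.", "Can machines think?"): "That depends on what you mean by 'think'.",
    # ... one entry for every possible conversation history ...
}

def lookup_machine(history):
    # Reply by pure retrieval: no reasoning, no model of the conversation.
    return LOOKUP.get(tuple(history), "I see.")

print(lookup_machine([]))                                  # Hello.
print(lookup_machine(["Hello.", "Can machines think?"]))   # That depends on ...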
