
Algorithms and Computability (Algoritmi si calculabilitate)

Lecture #9
Rodica Potolea - CS Dept., T.U. Cluj-Napoca, Romania

Cluj-Napoca, 03.05.2018
Agenda

• Turing
• Turing Test
• Philosophy



Alan Turing
[Photo: Alan M. Turing (1912–1954)]
• In 1936, Turing introduced his abstract model for computation in his article "On Computable Numbers, with an Application to the Entscheidungsproblem" (Entscheidungsproblem – decision problem)
• At about the same time, Alonzo Church published similar ideas and results
• Turing's approach is considerably more accessible and intuitive than Church's
• Turing's model has become the standard model in theoretical computer science (a minimal simulator sketch is given below)
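A minimal sketch of the model itself, for reference (the simulator and the example machine below are a generic illustration, not taken from the slides): a transition table maps (state, symbol) to (new state, written symbol, head move), and the machine halts when no rule applies.

```python
# Minimal single-tape Turing machine simulator (illustrative sketch).
# transitions: (state, symbol) -> (new_state, written_symbol, move),
# where move is -1 (left) or +1 (right); a missing entry means "halt".
def run_tm(transitions, tape, state="q0", blank="_", max_steps=10_000):
    cells = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    for _ in range(max_steps):
        symbol = cells.get(head, blank)
        if (state, symbol) not in transitions:
            break                      # no applicable rule: the machine halts
        state, written, move = transitions[(state, symbol)]
        cells[head] = written
        head += move
    output = "".join(cells[i] for i in sorted(cells) if cells[i] != blank)
    return state, output

# Example machine: flip every bit of the input, halting at the first blank cell.
flip = {("q0", "0"): ("q0", "1", +1),
        ("q0", "1"): ("q0", "0", +1)}
print(run_tm(flip, "10110"))           # -> ('q0', '01001')
```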
Alan Turing contd.
• Education: King's College, Cambridge (undergraduate, 1931–34)
• Formalized the concepts of algorithm and computation with the aid of the TM
• Considered the father of CS and AI
• In WW II worked on codebreaking – German naval ciphers (team lead of Hut 8)
• 1942–43: worked in Washington and at Bell Labs
• Involved in developing methods to decipher the Enigma machine
• Designer of one of the first stored-program computer designs (National Physical Laboratory); then moved to the University of Manchester
Alan Turing contd.
• 1945–48: London, design of the ACE (Automatic Computing Engine) at the National Physical Laboratory
• 1948–49: reader in the Mathematics Department and deputy director of the computing laboratory at the University of Manchester
• Work on AI and the Turing test – an attempt to define a standard for a machine to be called "intelligent"
• 1948: began writing the first chess program
• 1952: no computer was powerful enough to execute the program, so Turing played a game in which he simulated the computer by hand, at roughly half an hour per move (the "program" lost)
• 1952–54: mathematical biology (morphogenesis)
Alan Turing contd.
• Since 1966: the Turing Award (considered the Nobel Prize of CS), given by the ACM to "an individual selected for contributions of a technical nature made to the computing community"
(http://en.wikipedia.org/wiki/Turing_Award)
• Prize recipients:
• Dijkstra (72)
• Knuth (74)
• Dana S. Scott (76)
• Backus (77)
• Floyd (78)
• Hoare (80)
• Cook (82)
• Dennis M. Ritchie (83)
• Wirth (84)
• Karp (85)
• John Hopcroft & Robert Tarjan (86)
• Milner (91)
• Stearns (93)
• Rivest (2001) …
Turing Test

• Turing Test
• "Can Machines Think?”
• Different definitions
• Different adaptations (relaxations and ?)



"Can Machines Think?“
Dictionary of Cognitive Science
Alberta University

• An operational test of intelligence as a replacement for the philosophical question, "Can machines think?"
• Turing 1950 (Computing Machinery and Intelligence)
• He argued that conversation was the key to judging
intelligence
• The conversations can be about anything, and proceed for a set period of time (e.g., 5 minutes)
• If, at the end of this time, the judge cannot distinguish the machine from the human on the basis of the conversation, then we would have to say that the machine was intelligent (the protocol is sketched below)
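A minimal sketch of this judging protocol, assuming hypothetical machine, human, ask, and decide functions supplied by the reader (none of these names come from the lecture, and a round count stands in for the time limit):

```python
import random

def imitation_game(machine, human, ask, decide, rounds=10):
    """Sketch of the imitation game.
    machine, human : question -> text reply
    ask            : (label, transcript so far) -> next question from the judge
    decide         : transcripts -> label ('A' or 'B') the judge believes is the machine
    Returns True if the judge identified the machine (i.e. the machine failed)."""
    # Hide which label is the machine, so the judge only ever sees 'A' and 'B'.
    players = {"A": machine, "B": human}
    if random.random() < 0.5:
        players = {"A": human, "B": machine}
    machine_label = next(lbl for lbl, p in players.items() if p is machine)

    transcripts = {"A": [], "B": []}
    for _ in range(rounds):                  # stands in for the 5-minute limit
        for label, player in players.items():
            question = ask(label, transcripts[label])
            transcripts[label].append((question, player(question)))

    return decide(transcripts) == machine_label
```

Repeating the game with many independent judges gives the identification rate that Turing's prediction, quoted later in the lecture, bounds at 70 percent.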



"Can Machines Think?“
Stanford Encyclopedia of Philosophy
• According to Turing, the question is itself “too meaningless” to deserve
discussion;
• if we consider the more precise question whether a digital computer
can do well in a certain kind of game (which Turing describes as "the
Imitation Game"), then—at least in Turing's eyes—we do have a
question that admits of precise discussion
• Turing himself thought that it would not be too long before we did
have digital computers that could “do well” in the Imitation Game.
• behavioral tests for the presence of
• mind,
• or thought,
• or intelligence
• The Turing Test is prefigured in Descartes' Discourse on the Method
(1637)



Descartes’s Discourse
1637
• If there were machines which bore a resemblance
to our bodies and imitated our actions as closely
as possible for all practical purposes, we should
still have two very certain means of recognizing
that they were not real men.
• The first is that they could never use words, or put
together signs, as we do in order to declare our
thoughts to others. For we can certainly conceive of a
machine so constructed that it utters words, and even
utters words that correspond to bodily actions causing a
change in its organs. … But it is not conceivable that
such a machine should produce different arrangements
of words so as to give an appropriately meaningful
answer to whatever is said in its presence, as the dullest
of men can do.
Descartes’s Discourse
1637
• If there were … two very certain means of
recognizing that they were not real men.
• Secondly, even though some machines might do some
things as well as we do them, or perhaps even
better, they would inevitably fail in others, which
would reveal that they are acting not from
understanding, but only from the disposition of their
organs. For whereas reason is a universal
instrument, which can be used in all kinds of
situations, these organs need some particular action;
hence it is for all practical purposes impossible for a
machine to have enough different organs to make it act
in all the contingencies of life in the way in which our
reason makes us act.
• Descartes gives a negative answer to the question whether machines can think
Denis Diderot (1713-1784)
• Philosopher during the Enlightenment
• Materialistic view of the universe
• the mind can be explained physically, which leaves
open the possibility of minds that are produced
artificially
• In Pensées Philosophiques, a ~Turing-test criterion:
“If they find a parrot who could answer to everything, I
would claim it to be an intelligent being without
hesitation.”
• a common argument of materialists at the time



Turing Test (TT)
• Turing proposes to change the question from "Do
machines think?" to "Can machines do what we
can do?"
• To show this: the Imitation Game (IG)
• Nine objections considered by Turing, and his counter-arguments



Objections and responses
(opinions opposed to Turing’s)
Arguments from Section 6 of the paper, "Contrary Views on the Main Question"
• Theological
• Thinking as a function of the non-material ("function of man's immortal soul")
• A body alone does not suffice for the presence of thought
• Computers are not different from other "bodies"
• Adding a soul to a body is a matter of creation (God's image)
• Turing uses Bible quotes to show how past interpretations went wrong
• 'Heads in the Sand' (not an argument against, but a fear) – "I do not think that this argument is sufficiently substantial to require refutation"
• The consequences that would follow from "thinking" machines are too dreadful:
• Human superiority to anything else in the universe would no longer apply ("we like to believe … man … is superior")
• Machines could think even better than we can
• We could become dominated by machines (a genuine worry)



Objections and responses
(opinions opposed to Turing’s)

• Mathematical (in relation to mathematical logic)


• “There are certain things that [any digital computer]
cannot do. If it is rigged up to give answers to
questions as in the imitation game, there will be some
questions to which it will either give a wrong answer, or
fail to give an answer at all however much time is
allowed for a reply.”
• “There may, of course, be many such questions, and
questions which cannot be answered by one machine
may be satisfactorily answered by another.”
• “it has only been stated, without any sort of proof, that
no such limitations apply to the human intellect.”



Objections and responses – contd.
• Consciousness (Lack of feelings)
• Turing quotes from the 1949 Lister Oration of Professor Jefferson (the Lister Medal, a surgical science award, was awarded to him in 1948):
"Not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt, … No mechanism could feel … pleasure at its successes, grief …, be warmed …, be made miserable by its mistakes …"
• Turing's argument: "according to this view the only way to know that a man thinks is to be that particular man."



Objections and responses – contd.
• Various Disabilities (things a machine will supposedly never be able to do)
• be kind; be resourceful;
• be beautiful; be friendly;
• have initiative; have a sense of humor;
• tell right from wrong; make mistakes;
• fall in love; enjoy strawberries and cream;
• make someone fall in love with one; do something really new;
• learn from experience; use words properly;
• be the subject of one's own thoughts; have as much diversity of behavior
as a man;
Turing's argument: "we may (as most English children do) decide that everybody speaks English, and that it is silly to learn French."
+ To "machines cannot make mistakes": the machine could deliberately introduce errors into its answers in the imitation game
+ To "machines cannot have much diversity of behaviour": this "is just a way of saying that it cannot have much storage capacity"



Objections and responses – contd.
• Lady Lovelace's Objection
• Ada Byron, later Countess of Lovelace (Lord Byron's daughter), mathematician
• The Ada language is named after her
• Wrote/published the first algorithm intended for a machine (computing Bernoulli numbers) -> considered the first computer programmer
• Quote from her (1842) “The Analytical Engine has no pretensions
to originate anything. …machines can only do what we know how
to order them to do…”, “machines can never do anything really
new”
• Turing’s argument “Machines take me by surprise with great
frequency. This is largely because I do not do sufficient
calculation to decide what to expect them to do, or rather
because, although I do a calculation, I do it in a hurried, slipshod
fashion, taking risks. … Naturally I am often wrong, and the result
is a surprise for me."
Objections and responses – contd.
• Continuity of the Nervous System
• Objection:
• The brain is likely a continuous-state machine (Turing grants this)
• Computers are discrete-state machines
• Discrete-state machines are different from continuous-state machines, so one cannot be expected to mimic the other
• Turing's argument:
• In the imitation game the interrogator cannot take advantage of this difference (e.g., asked for the value of π, the machine can pick at random from a set of nearby values with varying numbers of decimals, so its answers look the same as a human's or a continuous machine's); see the sketch below
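A toy illustration of this reply, assuming the interrogator asks for the value of pi (the answer set 3.12–3.16 is from Turing's paper; the probabilities and function names are made up for the sketch):

```python
import random

# A discrete-state machine can imitate the small, noisy errors of a continuous
# machine (e.g. a differential analyser) by choosing its answer at random from
# a set of nearby values with suitable probabilities.
PI_ANSWERS = [3.12, 3.13, 3.14, 3.15, 3.16]
WEIGHTS    = [0.05, 0.15, 0.55, 0.19, 0.06]   # illustrative probabilities

def discrete_machine_pi():
    return random.choices(PI_ANSWERS, weights=WEIGHTS, k=1)[0]

def noisy_continuous_machine_pi():
    # A continuous device with a small random error around pi.
    return round(3.14159 + random.gauss(0.0, 0.01), 2)

# The interrogator sees only answers such as 3.13 or 3.14 from either player,
# so this question gives no reliable way to tell the two machines apart.
print(discrete_machine_pi(), noisy_continuous_machine_pi())
```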



Objections and responses – contd.
• The Argument from Informality of Behavior
• Objection:
• There is no set of rules which states what a person ought to do in every possible situation – Turing agrees (e.g., traffic-light situations)
• There is a set of rules which describes what a machine will do in every possible situation
• Turing's argument (a hypothetical sketch follows the quote):
• “I have set up on the Manchester computer a small programme
using only 1,000 units of storage, whereby the machine
supplied with one sixteen-figure number replies with another
within two seconds. I would defy anyone to learn from these
replies sufficient about the programme to be able to predict any
replies to untried values.”
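A hypothetical sketch in the spirit of this quote (the paper does not describe the actual Manchester program, so the rule below is invented for illustration): a small, fixed, deterministic rule maps a sixteen-figure number to another, yet the rule is practically impossible to recover just by inspecting a handful of observed replies.

```python
import hashlib

MOD = 10**16   # keep replies at (up to) sixteen figures

def reply(n: int) -> int:
    """Deterministic reply to a sixteen-figure number n (hypothetical rule)."""
    digest = hashlib.sha256(str(n).encode()).hexdigest()
    return int(digest, 16) % MOD

print(reply(1234567890123456))   # the same input always yields the same reply,
print(reply(1234567890123457))   # but nearby inputs yield unrelated replies
```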



Objections and responses – contd.
• Extra-Sensory Perception
• Turing considered the empirical evidence for telepathy
• If the human participant in the IG was telepathic, the
interrogator could exploit this, to determine which is
the machine
• Turing’s strong argument: if humans are telepathic,
why machines can’t?



Turing’s Predictions
“I believe that in about fifty years' time it will be
possible to programme computers, with a storage
capacity of about 10^9, to make them play the
imitation game so well that an average
interrogator will not have more than 70 percent
chance of making the right identification after five
minutes of questioning. … I believe that at the end
of the century the use of words and general
educated opinion will have altered so much that
one will be able to speak of machines thinking
without expecting to be contradicted.”



Turing’s Predictions – contd.
• The 70% IG prediction turned out wrong for the year 2000
• But Turing said "about", not exactly, fifty years



Opinion about Turing’s IG
• The test is too hard (French 1990 –AI researcher)
• Strings of letters presented; specialists claim humans
need less time than machines do to identify
• Words “created” in a language both (human and
computer know) are recognizable by humans, and not
by machines
• Many other (cognitive features that are difficult to
simulate)
• The test is too narrow
• success in IG is not a necessity of possessing
intelligence (Gunderson 1964)
Opinion about Turing’s IG – contd.
• The test is too easy
• Propose a more demanding goal (areas for AI)
• Alternative tests
• Is it harmful?
• Test is circular: fails to detect intelligence and humanity,
as many humans would fail the test.
• “… since one of the players must be judged to be a
machine, half the human population would fail the
species test”.
• Does not consider different (weaker or even stronger) forms of intelligence



Strengths and weaknesses
• Strengths
• Tractability and simplicity
• The power derives from its simplicity
• (modern) science has so far failed to provide definitions of "intelligence" and "thinking" that are sufficiently precise and general to be applied to machines
• TT, even if imperfect, provides something that can actually be
measured
• Breadth of subject
• The interrogator is allowed to give the machine a wide variety of
intellectual tasks
• To pass, a machine should possess the ability to use natural language, to reason, to have knowledge, and to learn … (which machines do have now)



Strengths and weaknesses – contd.
• Weaknesses
• Limitation: human beings can judge a machine's intelligence only by comparing its behavior with human behavior
• Human vs. general intelligence
• The TT does not test whether the machine behaves intelligently
• Rather, it tests whether the machine behaves like a human
• Thus, it fails to measure two faces of intelligence:
• Unintelligent human behavior
• Intelligent behavior humans don't possess (if the machine solved a computational problem impossible for any human, the interrogator would know the program is not human, and the machine would fail the test)
• Real vs. simulated intelligence
• The TT takes a behaviorist or functionalist approach to the study of intelligence
• A machine passing the test may be able to simulate human conversational behavior by following a simple (but large) list of mechanical rules, without thinking or having a mind at all
• Searle (1981) has argued that external behavior cannot be used to determine whether a machine is "actually" thinking or merely "simulating thinking"



Alternative Tests
• Total Turing Test
• The machine must give answers to all our inputs, not only linguistic ones
• Adds perceptual abilities (Computer Vision) and the ability to manipulate objects (Robotics)
• Chinese Room (John Searle, 1981, Berkeley philosophy professor)
• Disagrees with Turing's claim that a programmed computer could think
• Argued that software could pass the Turing Test simply by manipulating symbols of which it has no understanding
• Since such symbol manipulation without understanding is different from "thinking" in the sense people do, the Turing Test cannot prove that a machine can think



Turing Centennial Celebration
• Princeton 10-12 May, 2012
• Dana Scott, CMU (emeritus). Lambda Calculus, Then and
Now
• Dick Karp, Berkeley. Theory of Computation as an Enabling
Tool for the Sciences
• Tom Mitchell, CMU. Never Ending Language Learning
• Ron Rivest, MIT. The Growth of Cryptography
• Bob Tarjan, Princeton. Search Tree Mysteries
• Christos Papadimitriou, Berkeley. The Origin of Computable
Numbers
All sessions were recorded and can be reviewed afterwards on iTunes.
