
Development of Educational System with the Android Robot SAYA and Evaluation

Regular Paper

Takuya Hashimoto*, Naoki Kato and Hiroshi Kobayashi

Tokyo University of Science, Japan
*Corresponding author E-mail: tak@kobalab.com

Received 18 March 2011; Accepted 20 March 2011


Abstract This chapter describes an educational system with a teleoperated android robot, named SAYA, that can express humanlike facial expressions and perform communicative functions with its head and eye movements; the robot is utilized in the role of a teacher. Two kinds of field experiments were conducted to investigate the effectiveness of this educational system in actual educational fields. One experiment was conducted at both an elementary school and a university to estimate age-dependent differences in its effectiveness. The other experiment was carried out to verify whether children's interest in, motivation for, and concentration on the class, as well as their interest in science and technology, were enhanced.

Keywords human-robot interaction, educational robotics, android robot

1. Introduction

For the last decade, a wide variety of robots that can behave effectively and offer many kinds of services in our daily lives through interaction with humans have been developed and studied. In the near future, such robots are expected to offer not only physical assistance but also informative and emotional support. Toward this purpose, it is very interesting to investigate what kinds of functions, mechanisms, and intelligence are required for such robots, and to investigate interaction manners between humans and robots in daily life. Therefore, many kinds of robots, called communication robots, have been developed and applied not only in laboratories but also in our daily lives (Bauer et al., 2009; Burgard et al., 1998; Fujita, 2001; Hayashi et al., 2007; Kanda et al., 2004; Shiomi et al., 2007; Siegwart et al., 2003; Wada et al., 2002). For example, the pet-type robot AIBO (Fujita, 2001) was commercialized around ten years ago and is one successful example of a robot that can behave in our living space. Also, a seal-type robot was developed as a mental therapy robot, and its effectiveness for elderly people was verified through field experiments in nursing care facilities (Wada et al., 2002). Such animal-type robots interact with humans emotionally by performing endearing behaviors. On the other hand, humanoid-type robots, which have a humanlike body with a head and arms, have been developed to express more humanlike behaviors. In research on humanoid-type robots, body movements such as gestures, nodding, eye direction, and facial expressions are effectively utilized as nonverbal behaviors to interact with humans naturally (Breazeal & Scassellati, 1999; Bremner et al., 2009; Imai et al., 2001; Kamasima et al., 2004; Watanabe et al., 1999). For example, Robovie (Hayashi et al., 2007;

51 Int J Adv Robotic Sy, 2011, Vol. 8, No. 3, Special Issue Assistive Robotics, 51-61 www.intechweb.org
Shiomi et al., 2007) was actually used in real environments such as a train station and a museum, where it interacted with humans and offered information about facilities by utilizing its humanlike behaviors. Furthermore, communication robots are also used in educational fields (Han et al., 2005, 2009; Kanda et al., 2004; Tanaka & Kimura, 2009), where they can teach students and learn with students through interactions. A merit of educational applications of communication robots might be to encourage children to become interested in science and technology.

In addition to robots with mechanical looks, android-type communication robots with highly humanlike looks have been developed (Ishiguro, 2005; Oh et al., 2006). A merit of android robots is that they give people a feeling of humanlike presence, as if people were interacting with a real human. Therefore, if android robots were used as an interface of communication systems and interacted with humans using humanlike behaviours, people could interact with robots using the same manners as in interaction with real humans. Indeed, the effectiveness of a teleoperated android robot for conveying the presence of a human in a different place was verified in comparison with existing media such as a speaker and a videoconference system (Sakamoto et al., 2007).

In this chapter, a remote class system is introduced as an application of android robots, where the android robot SAYA (Hashimoto, 2005) (Fig. 1) is used as an interface. Here, in particular, the investigation of its effectiveness for elementary school children is interesting because, as shown in previous research (Kanda et al., 2004), children tended to be interested in learning with a robot and were motivated to learn a foreign language. Hence, the proposed educational system with the android robot SAYA is also expected to contribute to children's motivation to learn. In this study, two kinds of field experiments were conducted to investigate the effectiveness of the proposed educational system in actual elementary schools. One of them was conducted with both children (elementary school students) and adults (university students) to estimate age-dependent differences in its effectiveness. The other was carried out to verify whether there were significant changes in children's interest, motivation, and concentration with respect to science classes and technologies between before and after a class conducted with the proposed educational system.

The structure of this chapter is as follows. In Section 2, the android robot SAYA and its communicative functions are introduced. Section 3 describes the system structure of the remote class system with the android robot SAYA. Section 4 describes the detailed experimental conditions and procedures of the two kinds of field trials conducted in actual educational fields, and their results and the contributions of this research are presented and discussed. This chapter is concluded in Section 5.

2. The android robot SAYA

Fig. 1 (a) shows the android robot SAYA. It has an anthropomorphic appearance, and one of its characteristics is the ability to express humanlike facial expressions (Fig. 2). The main part of SAYA is the face part, which is called the Face Robot and is implemented on a mannequin body. The following is a detailed description of the Face Robot.

2.1 The structure of the Face Robot

Toward the achievement of a humanoid robot with anthropomorphic properties making the robot so real that it cannot be distinguished from a living human, the Face Robot has been developed (Kobayashi & Hara, 1993; Hashimoto et al., 2006), and Fig. 1 (b)(c) show the latest Face Robot and its internal structure. The Face Robot has a simple structure and basically consists of a mechanical frame and facial skin. The facial skin is made from soft urethane resin to recreate the texture of humanlike facial skin. As shown in the figure, there are 19 Control Points (CPs) which are moved linearly on the face; as a result, the Face Robot has 19 Degrees Of Freedom (DOFs) for generating facial expressions.

Figure 1. Photos of the android robot SAYA: (a) Android Robot SAYA, (b) Face Robot, (c) internal structure and Control Points (CPs)

Figure 2. Examples of SAYA's facial expressions (Neutral, Surprise, Fear, Disgust, Anger, Happiness, Sadness)

For the eye movements, the Face Robot has 2 DOFs, yaw rotation and pitch rotation. Both eyeballs move together because they are linked to each other, and they are driven by two DC motors. A small CCD camera is embedded in an eyeball for image processing; for example, the Face Robot is able to recognize a human face by extracting skin color, and it can track a human.

The mechanism for the head movements consists of the head part and the neck part, in which a coil spring is utilized, and the head movements are achieved by combining the head rotations and the neck flexion. Here, the neck part is able to bend flexibly, benefiting from the coil spring, to mimic the flexible neck movements of a human. As a result, the Face Robot has 4 DOFs for the head movements as shown in Fig. 3(a): 2 DOFs for the neck part and 2 DOFs for the head part. The lateral flexion of the head is achieved by the neck flexion alone (Roll), and the forward and backward flexion is achieved by combining the head pitch rotation (Pitch1) and the neck flexion (Pitch2). The horizontal head shaking is achieved by the head yaw rotation alone (Yaw).

The McKibben pneumatic actuator is adopted to control the facial expressions and the head movements. One of its characteristics is the ability to generate a large force relative to its small size and light weight, and it can be distributed over a curved surface like the skull of the Face Robot because of its flexibility. Fig. 3(b) shows the actuator layout. In the face part, one or two actuators are used for each CP, and the neck part is driven by 4 pairs of antagonistic actuators.

Figure 3. Internal structure of the Face Robot: (a) internal structure (head: Pitch1 -30 to 25 deg, Yaw 70 deg; neck: Pitch2 -30 to 25 deg, Roll 50 deg; universal joint, coil spring, McKibben artificial muscles), (b) actuator layout for head movements

2.2 Methodology for generating facial expressions with the Face Robot

Facial expressions can express individual emotions significantly and play an important role as a nonverbal medium in face-to-face communication between humans (Mehrabian, 1968), and facial expressions seem to contribute to achieving natural communication between humans and robots. Therefore, generating natural facial expressions similar to those of a human is required for robots to interact with humans naturally and emotionally. Almost all related studies on generating facial expressions adopt the Action Unit (AU) approach; AUs were defined in the Facial Action Coding System (FACS) proposed by Ekman et al. (Ekman & Friesen, 1978). AUs express the motions of the mimic muscles as 44 kinds of basic motions, and the 14 AUs shown in Table 1 are required to generate 6 typical facial expressions: Anger, Disgust, Fear, Happiness, Sadness, and Surprise.

Referring to this approach, 19 control points were modelled, and their movable directions are shown in Fig. 1(c). The combinations of CPs are also defined to achieve each AU, and various facial expressions can be realized with the Face Robot by combining the movements of several CPs. Fig. 2 shows examples of the 6 typical facial expressions of the Face Robot, and a high correct recognition rate for every facial expression was achieved in a previous study (Hashimoto et al., 2006). In addition to this typical face robot, a specialized face robot has been developed which can remarkably mimic an actual living human and has a highly realistic presence (Hashimoto et al., 2008).

  AU  Appearance changes     Control Points (Right / Left)
  1   Inner Brow Raiser      2 / 3
  2   Outer Brow Raiser      1 / 4
  4   Brow Lowerer           5, 6 / 7, 8
  5   Upper Lid Raiser       9 / 10
  6   Cheek Raiser           14 / 15
  7   Lid Tightener          9 / 10
  9   Nose Wrinkler          13
  10  Upper Lip Raiser       11, 13 / 12, 13
  12  Lip Corner Puller      14 / 15
  15  Lip Corner Depressor   16 / 17
  17  Chin Raiser            18
  20  Lip Stretcher          14, 16 / 15, 17
  25  Lips Part              11, 16 / 12, 17
  26  Jaw Drop               19

Table 1. Required AUs (Action Units) for generating 6 typical facial expressions
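As an illustration of how the AU-to-CP mapping in Table 1 can be composed into a whole facial expression, the following Python sketch merges AU activations into per-CP displacement commands. The CP indices follow Table 1, but the normalized activation values, the "happiness" recipe, and all function names are hypothetical assumptions for illustration; the paper does not describe the control software itself.

```python
# Sketch of composing a facial expression from Action Units (AUs).
# AU -> control point (CP) indices are taken from Table 1; the idea of
# a normalized 0..1 activation per CP is an assumption for illustration.

# Right- and left-side CPs driven by each AU (a subset of Table 1)
AU_TO_CPS = {
    1:  [2, 3],            # Inner Brow Raiser
    2:  [1, 4],            # Outer Brow Raiser
    4:  [5, 6, 7, 8],      # Brow Lowerer
    5:  [9, 10],           # Upper Lid Raiser
    12: [14, 15],          # Lip Corner Puller
    25: [11, 16, 12, 17],  # Lips Part
    26: [19],              # Jaw Drop
}

# Hypothetical AU recipe for a smile-like expression
HAPPINESS = {12: 1.0, 25: 0.4}

def cp_targets(expression: dict) -> dict:
    """Merge AU activations into per-CP displacement commands (0..1)."""
    targets = {}
    for au, activation in expression.items():
        for cp in AU_TO_CPS[au]:
            # If two AUs drive the same CP, keep the larger command
            targets[cp] = max(targets.get(cp, 0.0), activation)
    return targets

print(cp_targets(HAPPINESS))
```

Each resulting CP command would then be converted into a pressure for the corresponding McKibben actuator(s).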

Figure 4. System configuration of the remote class system with the android robot SAYA (operation side: Operation PC with touch panel, camera, microphone, and speaker, connected through routers over a LAN (TCP/IP); classroom side: Control PC, microcomputer (CPU: H8/3048) with motor drivers for the eye pitch and yaw, and a D/A converter with electro-pneumatic regulators and an air compressor supplying compressed air to the McKibben artificial muscles, plus a CCD camera and a speaker)
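The configuration in Fig. 4 centres on commands sent from the Operation PC to the Control PC over the LAN (TCP/IP). The paper does not specify the command protocol, so the following sketch of a Control PC-side dispatcher is purely illustrative; every command name and handler function below is hypothetical.

```python
# Sketch of how the Control PC could dispatch operator commands received
# over the LAN to SAYA's behaviors. The paper only states that commands
# are sent from the Operation PC via TCP/IP; all command names and
# handler functions below are hypothetical.

def set_expression(name):
    # Stand-in: would drive the CPs via the D/A converter and regulators
    return f"expression:{name}"

def move_head(yaw, pitch):
    # Stand-in: would drive the McKibben actuators of the neck and head
    return f"head:{yaw},{pitch}"

def utter(sentence_id):
    # Stand-in: would play a pre-recorded utterance through the speaker
    return f"utter:{sentence_id}"

HANDLERS = {
    "EXPRESSION": lambda args: set_expression(args[0]),
    "HEAD":       lambda args: move_head(float(args[0]), float(args[1])),
    "UTTER":      lambda args: utter(args[0]),
}

def dispatch(line: str) -> str:
    """Parse one text command such as 'EXPRESSION anger' and execute it."""
    op, *args = line.split()
    return HANDLERS[op](args)

print(dispatch("EXPRESSION anger"))  # -> expression:anger
```

A dispatch table like this keeps the network protocol (plain text lines) decoupled from the actuation code, which is one plausible way to realize the Operation PC / Control PC split shown in the figure.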

3. Educational system with the android robot SAYA

An educational system, particularly a remote class system, has been developed as a practical application of the android robot SAYA. From the practical point of view of communication robots, achieving smooth and natural communication between humans and robots is one of the most important problems. However, even though a variety of autonomous robots have been developed and studied so far, robots still generally lack the intelligence to interact with humans and act autonomously in daily life. Autonomous communication robots are currently only able to interact with humans in well-designed interaction scenarios and in well-defined environments. Meanwhile, a teleoperated robot manoeuvred by a hidden operator has an advantage in terms of practicality, because from the viewpoint of the human who interacts with it, the robot seems to conduct its behaviors and interactions autonomously even though it is controlled by teleoperation. In particular, if an android robot is used as the interface of a teleoperated communication system, it will give people a strong feeling of presence and make them feel as if they were interacting with a real human, as described in Section 1 (Sakamoto et al., 2007). In addition, it is expected that elementary school students are very interested in interaction with an android robot and will actively participate in a class conducted by it. The detailed configuration of the proposed educational system is described as follows.

3.1 System configuration

Fig. 4 shows the system configuration of the proposed educational system, in which the android robot SAYA is utilized in the role of a teacher.

In the classroom, there are SAYA and some control equipment. The control system of SAYA requires a compressor and electro-pneumatic regulators to control the contractions of the McKibben artificial muscles. The electro-pneumatic regulators are controlled by the control computer (Control PC), which controls SAYA's facial expressions and head movements as well as its eye direction and utterances. A microphone and a video camera are used to obtain visual and sound information from the classroom.

In the operation room, there are two monitors: one is used for control and the other for observation. The operator is able to monitor the students' behaviors through the observation monitor, and he can manoeuvre SAYA's utterances and actions by sending commands from the operation PC (Operation PC) to the Control PC through the LAN. The Control PC executes the robot's utterances and actions based on the received commands. Specifically, captured images from SAYA's CCD camera and the video camera are transmitted to the Operation PC, so the operator can observe the classroom from two viewpoints. As a result, the operator can move SAYA's viewpoint by controlling its eye and head directions according to this visual information, and SAYA is able to look around the classroom and look at a student. The operator is also able to hear the students through the speakers and respond to them.

3.2 Interactive behaviors

In the developed system, there are two operation modes: lecture mode and interaction mode.

In lecture mode, SAYA gives explanations about the contents of a class to the students while looking around the classroom by teleoperation, and slides projected on the screen in front of the classroom simultaneously help the students to understand. SAYA's utterances are prepared in advance along the scenario of a class.

In interaction mode, SAYA performs interactive behaviors such as looking around the room, paying attention to a student, and talking to a student. In order to talk to students, SAYA is able to respond with registered short sentences such as "Do your best!", "Be quiet!",
"Don't look away", and so on. If a student's question is beyond SAYA's default database, SAYA replies with a word of kindred meaning which is selected by the operator. In addition, SAYA is able to express facial expressions according to its utterances. For example, when SAYA says "Be quiet!", it executes the facial expression "anger". Also, SAYA is able to call students by name individually, because the names of the students who participate in the class are recorded in advance. Here, a female voice that was recorded beforehand is used as SAYA's voice.

The operator can execute these interactive behaviors easily by using a simple operation interface. In addition, SAYA appears to conduct classes and interactions with students autonomously, because SAYA is operated by teleoperation.

3.3 Operation interface

Fig. 5(a) shows the operation interface for the operator. As shown in the figure, there are many kinds of icons that correspond to the robot's behaviors and utterances. The interface mainly consists of the following four parts: (a) the part for conducting a class, (b) the part for brief interaction, (c) the part for controlling the facial expressions, and (d) the part for controlling the head and eye movements. In part (a), there are icons to progress a class and to execute SAYA's utterances for explanations. In part (b), there are short sentences and students' names that are preregistered for brief interaction with students (short utterances, replies, and calling students' names). In parts (c) and (d), there are icons that correspond to the robot's behaviors such as the facial expressions and the eye and head movements. By clicking these icons, the operator is able to execute such behaviors.

As shown in Fig. 5(b), the operator can observe the classroom and hear the students' utterances through the display and speakers, and he is able to click the icons easily by using a touch panel or a mouse.

Figure 5. Operation interface and operation environment: (a) operation interface, (b) operation environment (observation monitor, operation monitor, operator, speaker)

4. Field trials at educational fields

In order to evaluate the proposed educational system, two kinds of experiments were conducted in actual educational fields, particularly in elementary schools. The detailed procedures and the results of each experiment are described in 4.1 and 4.2.

4.1 Field trial for estimating age-dependent differences of effectiveness

As a first step, field trials were carried out with both elementary school students and university students to estimate age-dependent differences in the effectiveness, and we investigated whether students' interest, motivation, and concentration with respect to the proposed educational system differed according to their age. Such differences were estimated by comparing the questionnaire results of the elementary school students and the university students.

4.1.1 Experimental setup

a) Content of the class
In the first experiment, a "robot class" was conducted as a science class in which SAYA introduces itself and other interesting robots.

Table 2 shows the flow of the "robot class". First of all, in Scene 1, SAYA gives a self-introduction to the students and begins the class. In Scene 2, SAYA interacts and talks with the students briefly, and then SAYA asks the students, "What kinds of robots do you know?" or "What kinds of robots do you want?". Scene 3 is an introduction of robots which can demonstrate high performance in hazardous environments such as disaster sites. Scene 4 is an introduction of robots which can assist humans physically in medical facilities and living environments. Scene 5 is an introduction of robots which are expected to demonstrate high performance in the near future and offer services while interacting with humans in our daily lives.

  Scene  Content
  1      SAYA introduces itself and begins the class
  2      SAYA asks the students some questions such as "What kinds of robots do you know?" or "What kinds of robots do you want?", and SAYA engages in a simple conversation with the students
  3      Lecture on robots which demonstrate high performance in disaster sites, space, and other such hazardous environments
  4      Lecture on robots which assist humans in medical facilities, living environments, and so on
  5      Lecture on robots which are expected to present high performance in the near future and offer services while interacting with humans in daily lives
  6      Conclusion and closing of the class

Table 2. The flow of the science class about robots ("robot class")

In Scene 6, SAYA summarizes its talk and closes the class. In addition, the operator sometimes executes SAYA's interactive behaviors, such as looking at a student or talking to a student, during each scene, and the class takes around 30 minutes.

b) Participants
The experiments were conducted 4 times in two elementary schools (A and B) as shown in Table 3. In total, 38 elementary school students from the 1st to the 6th grade participated.

  Times  Grade  Num. of students
  1      5      13
  2      1, 2   8
  3      3, 4   9
  4      5, 6   8
  Total         38 students

Table 3. Participants' attribution

In addition, the same experiment was conducted with 30 university students who were in their twenties to evaluate the age-dependent differences in effectiveness by comparison with the elementary school students. In this experiment, they were divided into three groups, and each of the three groups participated in the class separately.

c) Experimental environment
Fig. 6 shows the experimental environment; a standard classroom was used. As described in Section 3, the android robot SAYA was put in front of the classroom, and a screen was placed next to SAYA to show the slides that were used to explain and help the students to understand the content of the class. An observation camera and a microphone were placed at the back of the classroom. The sitting positions of the students were set within the visual angles of both SAYA and the observation camera, and 4 or 5 students sat at each desk.

Figure 6. Experimental environment

d) Evaluation method
The following 10 questions were prepared to evaluate the students' interest, motivation, and concentration with respect to the class conducted with the proposed educational system.

Q.1 Were you able to concentrate on the class?
Q.2 Did you have a good time in the class?
Q.3 Did you feel something different from a usual class?
Q.4 Did you get more nervous than in usual classes?
Q.5 Were you interested in the content of the lecture?
Q.6 Do you want to participate again?
Q.7 Did you feel familiarity with SAYA?
Q.8 Did you feel that you were being watched?
Q.9 Did you feel eeriness in SAYA?
Q.10 Did you feel the existence of the teacher?

Each question was evaluated on a scale of -3 to 3. Here, values over zero mean a positive evaluation, while values less than zero mean a negative one. Because some of the questions were not easy for the early elementary grades to understand, additional explanations about the questions were given in simpler words. The students
Figure 7. Questionnaire result of the first field trial ("robot class"): averages and standard deviations for Q1-Q10, elementary school vs. university; *: significant difference (p < 0.05), **: significant difference (p < 0.01)

were also asked to comment on the class and the proposed system.

e) Experimental procedure
Before the experiment, an experimenter had the students sit at assigned positions, because the operator needed to identify the students so as to look at a student or call students by name individually. Then, the experimenter gave explanations about the experiment to the students. After that, the experimenter exited the classroom, and the operator controlled SAYA to give a greeting and begin the class. During the class, the operator sometimes talked to the students through SAYA, and he also asked some questions or gave cautions and advice to the students. After the class, the students were asked to answer the questionnaire.

4.1.2 Experimental results and discussions

Fig. 7 shows the questionnaire results; the average and the standard deviation of each question are also shown in the figure. The evaluation values of the elementary school students are higher than those of the university students for every question except Q.3. The results of Mann-Whitney's U-test are also shown in Fig. 7. Significant differences at p < .01 and p < .05 between the evaluation values of the elementary school students and the university students were found in Q.2, Q.6, Q.7, Q.8, and Q.9. It is found that the elementary school students accepted the proposed educational system with the android robot SAYA more readily than the university students, because the elementary school students gave significantly higher ratings for the following questions: Q.7 "Did you feel familiarity with SAYA?", Q.8 "Did you feel that you were being watched?", and Q.9 "Did you feel eeriness in SAYA?". Also, the elementary school students participated in the class more actively than the university students, as there are significant differences in Q.2 "Did you have a good time in the class?" and Q.6 "Do you want to participate again?".

Elementary school students:
e1. SAYA gave me a little eeriness and a sense of tension.
e2. SAYA was a little scary.
e3. The angry expression of SAYA was very scary.
e4. I was surprised that SAYA can change facial expressions.
e5. It was preferable for SAYA to behave more.
e6. I was surprised at the presence of SAYA.
e7. I would like to know how the robot conducts conversations with us.
e8. I was surprised that SAYA called my name.
e9. I was surprised that SAYA can conduct conversations.
e10. The class was fun.
e11. I would like to participate again.

University students:
u1. The facial expressions of SAYA were natural.
u2. It was preferable for SAYA to move its arms and hands.
u3. It was preferable for SAYA to behave more smoothly and naturally.
u4. I was surprised that SAYA can have a conversation with us.
u5. SAYA sometimes gave a few irrelevant replies.
u6. SAYA was sometimes slow to react.
u7. I was surprised that SAYA knew my name and called me.
u8. The accuracy of voice recognition was low (or high).
u9. I worried about how much my words were recognized and understood.
u10. I would like to talk to SAYA freely.

Table 4. Students' comments on the class that was conducted with the proposed system

The comments from the elementary school students and the university students are shown separately in Table 4. Both groups pointed out the lack of movements of SAYA (e5, u2), and the university students in particular pointed out the unnaturalness of SAYA's movements (u3).
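The group comparison in 4.1.2 relies on Mann-Whitney's U-test. As a sketch of the statistic itself, the following computes U from its direct pairwise definition; the score lists are invented placeholders on the questionnaire's -3 to 3 scale, not the experimental data, and in practice a library routine such as scipy.stats.mannwhitneyu would also supply the p-value.

```python
# Sketch of the Mann-Whitney U statistic used for the group comparison.
# The answer lists below are invented placeholders, not the data from
# the field trial.

def mann_whitney_u(xs, ys):
    """U statistic from its direct pairwise definition (ties count 0.5)."""
    u = 0.0
    for x in xs:
        for y in ys:
            if x > y:
                u += 1.0
            elif x == y:
                u += 0.5
    return u

# Hypothetical per-question scores for the two groups (-3..3 scale)
children   = [3, 3, 2, 3, 1, 2, 3]
university = [1, 0, 2, -1, 1, 0, 2]

u_children = mann_whitney_u(children, university)
u_university = mann_whitney_u(university, children)

# The two U values always sum to len(xs) * len(ys)
print(u_children, u_university)
assert u_children + u_university == len(children) * len(university)
```

The smaller of the two U values is then compared against the critical value (or converted to a p-value) to decide significance, which is what the markers in Fig. 7 report.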

Thus, improvements in its movements are required. SAYA also needs not only the ability to conduct a conversation along a scenario but also the ability to talk freely, as indicated by comment u10, "I would like to talk to SAYA freely". On the other hand, it is confirmed that calling names is an effective interactive behaviour, because the students were surprised that SAYA could call their names individually (e8, u7), and the advantage of the remote control is also confirmed, because they were surprised at the conversational ability of SAYA (e9, u4). The elementary school students expressed their interest in the class itself, as in e10 "The class was fun" and e11 "I would like to participate again", while the university students mainly commented on SAYA's abilities and functions. Therefore, it is shown that the elementary school students were more interested in the class with SAYA than the university students were.

It is thought that the elementary school students are attracted by high novelties such as robots, particularly android robots, because they have less opportunity to see or touch robots. As a result, the effectiveness of the proposed system is confirmed in educational fields, particularly for younger age brackets such as elementary school students.

4.2 Field trial for verifying effectiveness on children's motivation

4.2.1 Experimental setup

As a second step, another field trial was conducted at an elementary school to verify the effectiveness of the proposed educational system in an actual science class. The students' interest in and motivation toward the class were again estimated by using a questionnaire.

a) Content of the class
In this experiment, the "principle of leverage" was adopted as the topic of a usual science class of elementary school. This topic is generally difficult for children because it contains both mathematical elements and experimental validations. In order to prepare the teaching materials for the class, science textbooks that are commonly used in elementary schools were referred to.

The principle of leverage represents the mechanical properties of a lever. The lever is a rigid object which is used either to amplify a small force to move a larger force (load), or to change a small distance and speed at one end of the lever into a larger distance and speed at the opposite end. That is, it is a good example of the principle of the moment. The lever is one of the simple machines and is also a usual topic of science in elementary school.

Table 5 shows the flow of the science class about the "principle of leverage". First of all, in Scene 1, SAYA gives a self-introduction to the students as in the "robot class", and the class begins. The main topics of the class consist of three scenes (Scenes 2-4). In Scene 2, SAYA gives explanations about the theory and mechanical advantages of leverage with some slides. In this scene, the three important points of leverage (i.e., the pivot point, the point of effort, and the point of load) are explained. SAYA then shows some familiar examples of leverage that are seen in our daily lives in Scene 3, for example, scissors, bottle openers, tweezers, and so on. After that, in Scene 4, SAYA lets the students experiment to confirm the principle of leverage with an experimental kit, and they are able to experience the balancing theory of a lever. In Scene 5, SAYA summarizes its talk and closes the class. The class takes around 30 minutes, like the "robot class" experiment.

  Scene  Contents
  1      Self-introduction of SAYA and opening of the class
  2      Lecture on the basic theory and mechanical advantages of leverage
  3      Lecture on familiar examples of leverage in daily lives
  4      Experiments for discovering the principle of leverage
  5      Conclusion and closing of the class

Table 5. The flow of the science class about the "principle of leverage"

Figure 8. Photos of the experimental environment: (a) experimental lever kit, (b) photo of the classroom

Figure 9. Comparisons between paired questions (Q.1 and Q.5, Q.2 and Q.6, Q.3 and Q.7, Q.4 and Q.8), before vs. after the experiment; *: significant difference (p < 0.05), **: significant difference (p < 0.01)
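The balancing condition that the students verify with the lever kit in Scene 4 is the equality of moments about the pivot. A small numeric sketch follows; the forces and distances are illustrative, not taken from the class materials.

```python
# The lever balances when the moments about the pivot are equal:
#     effort_force * effort_arm == load_force * load_arm
# Forces (newtons) and arm lengths (centimetres) are illustrative.

def required_effort(load_force, load_arm, effort_arm):
    """Effort needed at distance effort_arm to balance the load."""
    return load_force * load_arm / effort_arm

def balances(effort_force, effort_arm, load_force, load_arm, tol=1e-9):
    """True if the two moments about the pivot cancel."""
    return abs(effort_force * effort_arm - load_force * load_arm) < tol

# A 6 N load 10 cm from the pivot is balanced by 2 N applied 30 cm away
print(required_effort(6, 10, 30))   # -> 2.0
print(balances(2, 30, 6, 10))       # -> True
```

Tripling the effort arm cuts the required effort to a third, which is exactly the mechanical advantage the class demonstrates with the kit.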

b) Participants
Twenty-two elementary school students, fifth graders aged 10-11 years, participated in the experiment.

c) Experimental environment
A standard classroom was used, as in the experiment of the robot class, and the android robot SAYA was placed at the front of the classroom in the role of a teacher. In this experiment, a plasma display was used to show slideshows, and an experimental lever kit was placed on each desk, as shown in Fig. 8(a) and (b). Four students per desk sat down at the assigned positions.

d) Evaluation method
A questionnaire was prepared to investigate how the students' interest and motivation were affected by the experiment, and it was administered both before and after the class. Table 6 shows the questionnaire, which consists of 12 questions (Q.1-Q.12). The first 4 questions (Q.1-Q.4) were used as a brief questionnaire before the class, and the remaining questions (Q.5-Q.12) were used after the class. Except for Q.12, which was answered with "yes" or "no", each question was rated on a scale from -2 to 2, where 2 is the most positive evaluation. Q.1 and Q.5 (interest in science and technology), Q.2 and Q.6 (interest in science classes), Q.3 and Q.7 (interest in robots), and Q.4 and Q.8 (interest in a class conducted by robots) are paired to contrast the students' interest and motivation before and after the experiment.

e) Experimental procedure
The procedure of the experiment was almost the same as that of the robot class experiment described in 4.1. At first, an experimenter let the students sit down at the assigned positions. He then briefly explained the flow of the experiment and asked the students to answer the brief questionnaire consisting of 4 questions (Q.1-Q.4). After that, he exited the room and an operator began the science class by controlling SAYA. The science class was conducted along the scenario described in Table 5. During the class, the operator sometimes interacted with the students, just as in the robot class experiment. After the class, the experimenter asked the students to answer the questionnaire consisting of 8 questions (Q.5-Q.12).

4.2.2 Experimental results and discussions

             Questionnaire                                                  Result (ave. (s.d.))
Q.1 / Q.5    Are you interested in science and technology?                  1.00 (0.87) / 1.36 (1.00)
Q.2 / Q.6    Do you prefer science classes?                                 0.73 (1.08) / 1.45 (0.74)
Q.3 / Q.7    Are you interested in robots?                                  1.09 (1.15) / 1.36 (1.09)
Q.4 / Q.8    Are you interested in a class conducted by robots?             1.27 (0.94) / 1.50 (0.96)
Q.9          Were you able to concentrate on the class more than usual?     0.55 (1.30)
Q.10         Was the class easy to understand?                              1.45 (0.86)
Q.11         Do you want to participate in the class again if you have
             the opportunity?                                               1.09 (1.23)
Q.12         Did SAYA answer you correctly? (yes/no)                        yes: 17, no: 5
Table 6. Questionnaire and results from Q.1 to Q.12

Table 6 also shows the averages (ave.) from Q.1 to Q.11; the numbers in parentheses are the standard deviations (s.d.). The numbers of students who answered "yes" or "no" to Q.12 are also shown. In addition, Fig. 9 shows the comparisons between Q.1 and Q.5, Q.2 and Q.6, Q.3 and Q.7, and Q.4 and Q.8. Table 6 shows that the students' interest and motivation were affected by this experiment: the evaluation values of Q.5, Q.6, Q.7, and Q.8 are higher than those of Q.1, Q.2, Q.3, and Q.4, respectively. A Wilcoxon signed-rank test was also applied to each pair, and the results are shown in Fig. 9. They reveal a significant difference for the pair Q.2 and Q.6 (p < .01). Therefore, the class conducted by SAYA enhanced the students' interest in and motivation toward science classes. In addition, the students' concentration appears to have been comparatively low, because the evaluation value of Q.9 (concentration on the class) is lower than that of the other questions. The results of Q.10 and Q.11 indicate that the students could easily understand SAYA's explanations and teaching materials, and that they want to participate in the class again. In Q.12, 17 out of 22 students answered "yes"; that is, it is confirmed that the operator replied to them correctly through SAYA.

Fig. 10 shows some photos of the experiment at the elementary school. Fig. 10(a) shows the scene in which SAYA gave explanations about leverage along the scenario, with the students concentrating on SAYA's talk and paying attention to the screen. SAYA sometimes looked at a student and asked questions, as shown in Fig. 10(b). Fig. 10(c) shows the scene in which the students raised their hands and attempted to answer SAYA's question. In the class, the students also experimented to discover and confirm the principle of leverage with the experimental lever kit, as shown in Fig. 10(d).
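The before/after comparison reported above can be reproduced with any standard statistics package. The sketch below uses SciPy's Wilcoxon signed-rank test on paired ratings on the -2 to 2 scale; the 22 ratings are made-up placeholders for illustration only (the chapter reports only the means and standard deviations in Table 6, not the raw answers):

```python
# Illustrative sketch of the analysis: a Wilcoxon signed-rank test on paired
# before/after questionnaire ratings (the -2..2 scale used in Table 6).
# These 22 ratings are made-up placeholders, NOT the study's raw data.
from statistics import mean, stdev
from scipy.stats import wilcoxon

before = [1, 0, -1, 2, 1, 0, 1, -2, 0, 1, 2, 0, 1, -1, 0, 2, 1, 0, 1, 0, -1, 1]  # e.g. Q.2
after  = [2, 1,  0, 2, 2, 1, 2,  0, 1, 2, 2, 1, 2,  0, 1, 2, 2, 1, 2, 1,  0, 2]  # e.g. Q.6

print(f"before: ave. {mean(before):.2f} (s.d. {stdev(before):.2f})")
print(f"after:  ave. {mean(after):.2f} (s.d. {stdev(after):.2f})")

# Zero differences (students whose rating did not change) are dropped by the
# default zero_method="wilcox"; a small two-sided p-value indicates a
# significant before/after shift for this question pair.
stat, p = wilcoxon(before, after)
print(f"Wilcoxon signed-rank: W = {stat}, p = {p:.4g}")
```

The signed-rank test is the natural choice here because the ratings are ordinal, paired per student, and not assumed to be normally distributed.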

Takuya Hashimoto, Naoki Kato and Hiroshi Kobayashi: 59


Development of Educational System with the Android Robot SAYA and Evaluation
Figure 10. Scenes of the field trial in an elementary school: (a)-(d)
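The lever-kit activity rests on the moment-balance law taught in the class: a lever balances when effort x effort arm equals load x load arm. A minimal numeric check, with made-up classroom-style masses and distances:

```python
# Minimal check of the principle of leverage taught in the class:
# a lever balances when effort * effort_arm == load * load_arm.
# The masses and distances below are made-up illustrative numbers.

def balancing_effort(load: float, load_arm: float, effort_arm: float) -> float:
    """Force needed at effort_arm to balance `load` at load_arm (same units)."""
    return load * load_arm / effort_arm

# A 6 kg weight placed 10 cm from the fulcrum is balanced by 2 kg at 30 cm:
effort = balancing_effort(load=6.0, load_arm=10.0, effort_arm=30.0)
print(effort)  # 2.0
```

This is the kind of relationship the students could discover empirically by moving weights along the experimental lever kit.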

5. Conclusion

In this chapter, a remote class system in which the android robot SAYA acts as a teacher was proposed. SAYA has a highly anthropomorphic appearance, and a remote control system for SAYA was developed for the proposed remote class system. The developed system allows an operator to easily control SAYA's behaviors, such as facial expressions, head movements, eye direction, and utterances, and also to observe the students' behaviors remotely.

Two kinds of field trials were conducted in actual educational fields to investigate the effectiveness of the proposed educational system. One was carried out with both elementary school students and university students to estimate the age-dependent difference of its effectiveness, with a robot class conducted as the topic of a science class. The other field trial was conducted to verify its effectiveness in an actual science class; the principle of leverage was adopted as the topic of a usual science class, and the students' interest in and motivation toward the class were estimated.

From the experimental results, the following positive effects and possibilities of the proposed educational system in actual educational fields, especially in elementary schools, were confirmed:
- The elementary school students accepted the proposed educational system more easily and participated in the class more actively than the university students.
- The proposed educational system enhanced the elementary school students' motivation toward science classes.

Our future work is to conduct long-term experiments at elementary schools and evaluate the system's educational effects on children. The proposed educational system should also be compared with other existing remote communication systems, such as tele-conference systems, to evaluate its advantages, and the contrasts between the proposed educational system and human teachers should be investigated.

6. Acknowledgement

This research was partially supported by the Japan Society for the Promotion of Science (JSPS), Grant-in-Aid for Young Scientists (Start-up), 21800058, 2009.

7. References

[1] Bauer, A.; Klasing, K.; Lidoris, G.; Mühlbauer, Q.; Rohrmüller, F.; Sosnowski, S.; Xu, T.; Kühnlenz, K.; Wollherr, D. & Buss, M. (2009). The Autonomous City Explorer: Towards Natural Human-Robot Interaction in Urban Environments, International Journal of Social Robotics, Vol. 1, No. 2, pp. 127-140.

[2] Breazeal, C. & Scassellati, B. (1999). How to build robots that make friends and influence people, Proceedings of the 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'99), pp. 858-863.
[3] Bremner, P.; Pipe, A.; Melhuish, C.; Fraser, M. & Subramanian, S. (2009). Conversational gestures in human-robot interaction, Proceedings of the 2009 IEEE International Conference on Systems, Man and Cybernetics (SMC'09), pp. 1645-1649.
[4] Burgard, W.; Cremers, A. B.; Fox, D.; Hähnel, D.; Lakemeyer, G.; Schulz, D.; Steiner, W. & Thrun, S. (1998). The interactive museum tour-guide robot, Proceedings of the 15th National Conference on Artificial Intelligence (AAAI-98), pp. 11-18.
[5] Ekman, P. & Friesen, W. V. (1978). The Facial Action Coding System, Consulting Psychologists Press.
[6] Fujita, M. (2001). AIBO: towards the era of digital creatures, The International Journal of Robotics Research, Vol. 20, No. 10, pp. 781-794.
[7] Han, J.; Jo, M.; Park, S. & Kim, S. (2005). The Educational Use of Home Robots for Children, Proceedings of the 14th IEEE International Workshop on Robot and Human Interactive Communication (RO-MAN'05), pp. 378-383.
[8] Han, J.; Kim, D. & Kim, J. (2009). Physical Learning Activities with a Teaching Assistant Robot in Elementary School Music Class, Proceedings of the 2009 Fifth International Joint Conference on INC, IMS and IDC, pp. 1406-1410.
[9] Hashimoto, T. & Kobayashi, H. (2005). Development of the receptionist system with an anthropomorphism face, Proceedings of the 5th Asian Symposium on Applied Electromagnetics and Mechanics, pp. 190-196.
[10] Hashimoto, T.; Hiramatsu, S.; Tsuji, T. & Kobayashi, H. (2006). Development of the Face Robot SAYA for Rich Facial Expressions, Proceedings of SICE-ICASE International Joint Conference 2006, pp. 5423-5428.
[11] Hashimoto, T.; Hiramatsu, S. & Kobayashi, H. (2008). Dynamic Display of Facial Expressions on the Face Robot Made by Using a Life Mask, Proceedings of the 8th IEEE-RAS International Conference on Humanoid Robots (Humanoids'08), pp. 521-526.
[12] Hayashi, K.; Sakamoto, D.; Kanda, T.; Shiomi, M.; Koizumi, S.; Ishiguro, H.; Ogasawara, T. & Hagita, N. (2007). Humanoid robots as a passive-social medium: a field experiment at a train station, Proceedings of the ACM/IEEE 2nd Annual Conference on Human-Robot Interaction (HRI'07), pp. 137-144.
[13] Imai, M.; Ono, T. & Ishiguro, H. (2001). Physical relation and expression: joint attention for human-robot interaction, IEEE Transactions on Industrial Electronics, Vol. 50, No. 4, pp. 636-643.
[14] Ishiguro, H. (2005). Android Science: Toward a new cross-interdisciplinary framework, Proceedings of the International Symposium of Robotics Research, pp. 1-6.
[15] Kamasima, M.; Kanda, T.; Imai, M.; Ono, T.; Sakamoto, D.; Ishiguro, H. & Anzai, Y. (2004). Embodied Cooperative Behaviors by an Autonomous Humanoid Robot, Proceedings of the 2004 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'04), pp. 2506-2513.
[16] Kanda, T.; Hirano, T.; Eaton, D. & Ishiguro, H. (2004). Interactive Robots as Social Partners and Peer Tutors for Children: A Field Trial, Human-Computer Interaction, Vol. 19, No. 1-2, pp. 61-84.
[17] Kobayashi, H. & Hara, F. (1993). Study on face robot for active human interface: mechanisms of face robot and expression of 6 basic facial expressions, Proceedings of the 2nd IEEE International Workshop on Robot and Human Communication (RO-MAN'93), pp. 276-281.
[18] Mehrabian, A. (1968). Communication without Words, Psychology Today, Vol. 2, No. 4, pp. 53-55.
[19] Mutlu, B.; Forlizzi, J. & Hodgins, J. (2006). A Storytelling Robot: Modeling and Evaluation of Human-like Gaze Behavior, Proceedings of the 6th IEEE-RAS International Conference on Humanoid Robots (Humanoids'06), pp. 518-523.
[20] Oh, J.; Hanson, D.; Kim, W.; Han, Y.; Kim, J. & Park, I. (2006). Design of Android Type Humanoid Robot Albert HUBO, Proceedings of the 2006 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS'06), pp. 1428-1433.
[21] Sakamoto, D.; Kanda, T.; Ono, T.; Ishiguro, H. & Hagita, N. (2007). Android as a telecommunication medium with a human-like presence, Proceedings of the ACM/IEEE International Conference on Human-Robot Interaction, pp. 193-200.
[22] Shiomi, M.; Kanda, T.; Ishiguro, H. & Hagita, N. (2007). Interactive Humanoid Robots for a Science Museum, IEEE Intelligent Systems, Vol. 22, No. 2, pp. 25-32.
[23] Siegwart, R.; Arras, K. O.; Bouabdallah, S.; Burnier, D.; Froidevaux, G.; Greppin, X.; Jensen, B.; Lorotte, A.; Mayor, L.; Meisser, M.; Philippsen, R.; Piguet, R.; Ramel, G.; Terrien, G. & Tomatis, N. (2003). Robox at Expo.02: A large-scale installation of personal robots, Robotics and Autonomous Systems, Vol. 42, No. 3-4, pp. 203-222.
[24] Tanaka, F. & Kimura, T. (2009). The use of robots in early education: A scenario based on ethical consideration, Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication, pp. 558-560.
[25] Wada, K.; Shibata, T.; Saito, T. & Tanie, K. (2002). Analysis of factors that bring mental effects to elderly people in robot assisted activity, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Vol. 2, pp. 1152-1157.
[26] Watanabe, T.; Okuno, M. & Ogawa, H. (1999). An Embodied Interaction Robots System Based on Speech, Proceedings of the 8th IEEE International Workshop on Robot and Human Communication (RO-MAN'99), pp. 225-230.

