Eye Tracking System with Blink Detection


Sidra Naveed, Bushra Sikander, and Malik Sikandar Hayat Khiyal

Sidra Naveed is an undergraduate student of the Department of Software Engineering, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan.
Bushra Sikander is a Lecturer at the Department of Computer Sciences, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan.
Dr. Malik Sikandar Hayat Khiyal is Professor and Head of Academic (ES), APCOMS, Khadim Hussain Road, Lalkurti, Rawalpindi, Pakistan.
Abstract: This paper presents an efficient eye tracking system with eye blink detection for controlling an interface that provides an alternate way of communication for people suffering from severe physical disabilities. The proposed system uses the pupil region to track the movement of the eyes. The idea of this technique is to track eye movements by detecting and locating the center of the pupil and then using this information to move the mouse cursor accordingly to drive the interface. The system has shown the expected results on recorded camera videos and accurately detects eye blinks, whether voluntary or involuntary. The system is totally non-intrusive, as it only uses a video camera to record a video, so it is more user-friendly and easier to configure. The experimental results indicate that the proposed system is better in performance and efficiency than current systems in practice.

Index Terms: Eye tracking, pupil detection, blink detection, mouse movement.

1. INTRODUCTION

In recent years, the rapid advancement of technology has created a great demand for human-computer interfaces (HCIs). Many systems have been developed for people who are able to perform actions voluntarily, but systems are also needed for people who can perform very few voluntary actions. Often the only action that severely disabled people can perform voluntarily is blinking their eyes. Because of this demand, there is increasing development of HCI systems based on eye biometrics. The needs of physically disabled people have motivated many researchers to develop eye tracking systems that provide ease of use for such users. For these users, an eye tracking system acts as a substitute for their impaired physical abilities. Such systems allow interaction between humans and computers and can be used even by people who cannot speak or write.

Eye tracking systems use image processing techniques based on eye biometrics. In image processing, the input data is converted into digital form and various mathematical operations are applied to it to create an enhanced image for tasks such as recognition or authentication; these tasks are carried out with the help of digital computers. The process is also called picture processing. In image processing, the picture is analyzed to identify shades, colors, contrasts and shapes that cannot be perceived by the human eye [1].

There are many systems and applications based on human eye tracking. Various kinds of human-computer interfaces [2] make use of eye movements and eye blinking. Some interfaces use eye movement to control the mouse cursor, some track the eyes to check the drowsiness of a driver, some applications use the eyes to type a web address, and the eyes are used in many other vision-based interfaces. Eye tracking techniques are also used in medicine [3][4] and optometry to diagnose certain diseases.

Image processing cannot be considered apart from the human visual system: it is with our visual system that we observe and evaluate the images we process. The eye is the organ of vision and light perception. It is like a camera with an iris diaphragm and variable focusing, and it is a specialized light-sensitive structure used for forming images of sight. The human eye is a spheroid structure located at the front of the skull, resting in a bony cavity. It is roughly spherical, with an average diameter of 20 mm, and it is surrounded by three membranes: the cornea and sclera, which form the outer layer; the choroid; and the retina, which encloses the eye. The original eye image captured from a video is shown in Figure 1.

Figure 1. Original eye image

The colored part of the eye is called the iris. It is circular in shape and varies in size. The iris regulates the amount of light entering the eye by adjusting the size of the pupil. It can be of various colors, such as green, brown, blue, hazel or grey. Like fingerprints, the shape, color and texture of each individual's iris are unique. The pupil is the opening through which the light and the images that we view enter the eye. The iris forms the pupil, and the size of the pupil increases or decreases correspondingly with the size of the iris.

There are many systems and applications based on human eye tracking and blink detection. The ITU gaze tracker is a commercial gaze tracker whose main aim is to provide a low-cost, easily accessible gaze tracker. It was developed at the IT University of Copenhagen [5] and has been supported by COGAIN (Communication by Gaze Interaction Association). Another eye tracking system is the ETD (Eye Tracking Device), a head-mounted device used to measure three-dimensional eye and head movements. It was first designed by Prof. Dr. Andrew H. Clarke in cooperation with the companies Chronos Vision and Mtronix and was originally developed for the German space agency (DLR) [6]. The device was built for use on the International Space Station (ISS) and was sent to the station in early 2004. Eye tracking systems are also used in vehicles to check the drowsiness and attention of the driver during driving; when such a system detects that the driver is dozing off, it generates a warning alarm. A camera associated with the eye tracking system is integrated into the automobile to assess the behavior of the driver. These systems are used to check the speed of the vehicle and its distance from other vehicles, and to check whether the vehicle stays in its driving lane. A camera mounted in the car and angled toward the driver's face records the driver's blink patterns, and from these blink patterns the percentage of time spent engaged in driving can be measured. In 2006, Lexus equipped its LS 460 with its first driver monitoring system [7]; the system generates a warning if the driver takes his or her eyes off the road. The VisionTrak eye tracking system, developed by ISCAN, is an advanced system that accurately tracks what a person is looking at; both desktop and head-mounted versions are available, and the desktop version of VisionTrak [8] is the binocular Desktop 300, which has many applications. The iMotions Attention Tool [9] is attention tracking software based on human eye tracking that combines eye tracking metrics with reading and emotion metrics; combining these metrics gives a unique picture of consumer behavior. Another low-cost eye tracking system is the openEyes toolkit [10], which includes algorithms for measuring eye movements from digital videos. Eye tracking systems are also used in web usability: User Plus [11] is a software program for free usability testing; it is a beta system that shares usability knowledge with designers, developers and usability specialists across the network. Eye tracking systems are also used in laser refractive surgery; the STAR S4 IR Excimer Laser System [12] is an advanced laser vision system that reduces the effect of the laser on the cornea and thus increases the safety of the patient. One application of eye tracking is in language reading, where eye tracking is used to investigate human language development, language skills and reading behavior; Tobii remote eye trackers are used for this purpose, and the Tobii TX300 eye tracker [13] is a remote eye tracker with very high accuracy that can cope with large head movements.

2. LITERATURE REVIEW

The human eye is one of the most important and prominent features of the face and conveys much useful information besides facial expressions. By detecting the position of the eye, many useful interfaces can be developed. Considerable research has gone into developing intrusive and non-intrusive eye tracking systems. Intrusive eye tracking systems are those in which there is direct contact with the eyes, and non-intrusive eye tracking systems are those in which there is no physical contact with the user.

Studies of different eye tracking techniques were conducted during the analysis phase of this research. The study revealed both advantages and disadvantages of the different techniques. The commonly used techniques are limbus tracking, iris tracking and the electrooculography technique.

In iris tracking, the motion and direction of the iris are detected in order to design and implement an eye tracking system for a human-computer interface (Shazia et al.) [14]. In this technique, batch mode is used for iris detection. The system allows users to interact with the computer by using their eye movements. It accurately locates and detects the eyes in images with different iris positions and uses this information to move the mouse cursor accordingly. As this iris tracking method was applied to static images, it provided a high degree of accuracy. However, the developed system is restricted to working only when the direction of the iris is left, right or center; it does not work when the iris is directed up or down. The system was not extended to work in real time and is not able to handle blinks and closed eyes.

Another technique that was analyzed for tracking and detecting eye blinks is the statistical Active Appearance Model (AAM) (Ioana Bacivarov et al.) [15]. The model offers a 2-D model that quickly matches the texture and shape of the face. Using the Active Appearance Model, a proof-of-concept model of the eye region is created to determine the parameters that measure the degree of eye blinks. An initial model that employs two independent eye regions is then extended using component-based techniques. After the eye model is developed, a blink detector is proposed. The main advantage of the AAM technique is that a detailed description of the eye is obtained, not just its rough location; the system is able to synthesize all the intermediate states, facilitating the extraction of the blink parameters. The main drawback of the AAM technique is that it is designed to work for a single individual and the blink parameters have to be identified in advance. For large variations of pose, in-plane rotation, etc., the conventional statistical model performs poorly.

The eyes can also be tracked using two interactive particle filters, one for the open eye and one for the closed eye (Junwen Wu et al.) [16]. The initial eye position is located using an eye detection algorithm, and the filters are then used for eye tracking and blink detection. Auto-regression models are used to describe the state transition, a classification-based model is used for the observation measurement, and a regression model in a tensor subspace gives the posterior estimation. Performance is measured in two respects: the blink detection rate and the tracking accuracy. Videos from varying scenarios are used to evaluate the blink detection rate, whereas tracking accuracy is measured using benchmark data collected with a Vicon motion capture system. Particle filters have the advantage that, with sufficient samples, the solution approaches the Bayesian estimate. The proposed algorithm is able to accurately track eye locations and to detect both voluntary long blinks and involuntary short blinks. Normalizing the size of the images is crucial for a subspace-based observation model, and a bad scale transition model can severely affect the performance.

The user's eyes can also be located in a video sequence by detecting the eye blink motion (Kristen Grauman et al.) [17]. An initial eye blink is used to locate the eyes. The algorithm detects eye blinks and measures their duration, and this information is used to control a non-intrusive interface; the prototype is called BlinkLink. The eyes are located by considering the motion information between two consecutive frames and determining whether this motion is caused by a blink; the eye is then tracked and monitored constantly. The system works in real time and consistently runs at 27-29 frames per second. It is completely non-intrusive and does not require manual initialization or special lighting. Voluntary and involuntary blinks can be classified reliably by the system. The disadvantage of this system is that it can only handle long blinks; short blinks are simply ignored.

A vision-based system (Aleksandra Krolak et al.) [18] has also been used to detect long voluntary eye blinks and to interpret these blink patterns for communication between man and machine. In the presented model, a statistical approach based on Haar-like masks is used: templates of different sizes and orientations are convolved with the image to compute Haar-like features. Face detection is done by sliding a search window of the same size as the face images used for training across the test image. In the next step the eyes are localized, and the extracted image of the eye region is further processed for eye blink detection. The detected eyes are tracked using the normalized cross-correlation method. The main advantage of this system is that it requires neither prior knowledge of face location or skin color nor special lighting.

The study of the above systems shows that each has both advantages and disadvantages. Most of these systems were developed to work on static images and were not extended to work in real time. Some of the systems can handle only voluntary long blinks and cannot cater for short blinks. Some systems are designed to work for a single individual, and the blink parameters have to be identified in advance. For large pose variations these systems perform poorly.

The goal of this research is to develop a system that can be extended to work in real time and that achieves accurate tracking and precise blink detection based on pupil tracking. The proposed system can detect both spontaneous and voluntary eye blinks and requires neither prior knowledge of face location or skin color nor special lighting.

3. SYSTEM DESIGN

This section presents the mathematical model and the design of the developed system, showing the interrelationships of the phases and the relationships among the different variables of the system.

3.1. Statement of the modeling problem

With the growth of technology, many systems have been developed for people with normal physical abilities, but these systems cannot be used by handicapped users. The need has therefore grown for systems that give people with severe physical disabilities an alternate way of communicating with computers. To meet this demand, the normal eye tracking system has been extended to work in real time. This research proposes a system that achieves accurate tracking and precise blink detection and that is also able to handle eye blinks. The information obtained from the movement of the human eye is used to drive the interface.

Following the processing steps described in the next subsection, the resultant solution is expressed as:

Resultant_Image = MM(BD(CC(PS(F(D(E(S(B(U(G(R(I))))))))))))

The variables used in the above formula are described in Table 1.

Table 1. List of factors

Symbol   Description
I        Iris portion (input)
R        Read image
U        Uint8 conversion
G        Grayscale conversion
B        Binary image
S        Smoothing filter
E        Edge detection
D        Dilation process
F        Fill holes
PS       Pupil segmentation
CC       Center point comparison
BD       Blink detection
MM       Mouse movement
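To make the composition above concrete, the sketch below strings the first stages of the pipeline together as MATLAB Image Processing Toolbox calls. It is a minimal illustration only: the file name, the binarization threshold, the averaging mask size and the structuring-element size are assumptions not stated in the paper, and the remaining stages (PS, CC, BD, MM) are described in Section 3.2.

% Minimal sketch of the front half of the chain
% MM(BD(CC(PS(F(D(E(S(B(U(G(R(I)))))))))))) for a single frame.
% 'frame001.jpg' and the numeric constants are assumptions for illustration.
I = imread('frame001.jpg');        % R: read the stored frame
I = uint8(I);                      % U: ensure uint8 representation
G = rgb2gray(I);                   % G: grayscale conversion
B = im2bw(G, graythresh(G));       % B: binary conversion (Otsu threshold)
S = imfilter(double(B), fspecial('average', [5 5]));  % S: average (smoothing) filter
E = edge(S, 'canny');              % E: Canny edge detection
D = imdilate(E, strel('disk', 2)); % D: dilation to thicken the edges
F = imfill(D, 'holes');            % F: fill the pupil hole with white
% The remaining stages (PS, CC, BD, MM) operate on F and on the center
% points extracted from it; they are described in Section 3.2.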

3.2. System design
In the first step of the system design, frames are acquired from the recorded videos and stored in a database, from which they are retrieved in batch mode for further processing. The acquired images are sharpened by applying image enhancement techniques, to enhance the image details and remove unnecessary detail. After enhancement, the edges of the area of interest (in this case the iris and pupil) are detected in order to extract the pupil portion from the enhanced image. After the pupil portion has been segmented, eye blinks are detected. The center point of the pupil is calculated in each frame, and the mouse is moved according to these points. Each step is explained below.

A. Image acquisition

The first step in the proposed eye tracking system is acquiring the images. The frames are captured from a recorded video and stored on a permanent storage device, from which they are retrieved one by one for further processing and converted into grayscale format. Since preprocessing is easier on binary images and the pupil is more prominent in binary than in grayscale, the images are then converted into binary. Converting the images into binary format gives images in which the pupil appears black and the rest of the image is white, highlighting the area of interest (the pupil) and making the task of eye tracking easier. The acquired original RGB image is shown in Figure 2.
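A minimal MATLAB sketch of this acquisition and conversion step is given below. The folder name and file pattern are assumptions for illustration; the paper itself reads frames that were extracted beforehand from a recorded video.

% Read stored frames one by one and convert them to grayscale and binary.
% 'frames' and '*.jpg' are assumed names for the stored frame images.
files = dir(fullfile('frames', '*.jpg'));
for k = 1:numel(files)
    rgbFrame  = imread(fullfile('frames', files(k).name));
    grayFrame = rgb2gray(rgbFrame);                 % grayscale conversion
    % Otsu's method picks the threshold; the dark pupil comes out as 0 (black)
    % and most of the remaining image as 1 (white), as described above.
    bwFrame   = im2bw(grayFrame, graythresh(grayFrame));
    % bwFrame is now ready for the smoothing and edge detection steps.
end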

Figure 2. Original RGB image

B. Smoothing process

In image processing, smoothing is the process of bringing out the important patterns in an image while suppressing noise and fine detail. So, after acquiring the images, in the second step the images are smoothed to obtain the desired patterns. An average filter is applied to the images for smoothing the data; it smooths the data by eliminating noise, performing spatial filtering on each individual pixel using the surrounding gray-level values. The requirement of having less detail in the image (a blurred image) is achieved by eliminating the extra detail and noise with the average filter. Increasing the size of the filter mask makes the image more blurred, which helps in detecting the prominent edges of the area of interest, as more noise is removed. Several other filters were also applied to the images, but the average filter showed the most efficient results on the binary images for the proposed system; the result is shown in Figure 3.

Figure 3. Smoothed image

C. Edge detection

After smoothing the images, the next step is to detect the edges of the area of interest (in this case the pupil portion) in order to extract the pupil from the enhanced images. A Canny filter is applied to the images to make the edges prominent. This filter first smooths the edges and then highlights the sharp changes in image brightness, extracting the important information from the image. Several other filters were applied to the images to sharpen the edges, but for the proposed system the Canny filter produced the desired results, presented in Figure 4.
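The smoothing and edge detection steps above might be written in MATLAB as in the following sketch; the 5x5 mask size is an assumption, since the paper only states that a larger mask gives a more blurred image.

% Smoothing followed by Canny edge detection on one binary frame.
% bwFrame is the binary frame from the acquisition step; the 5x5 mask
% size is an assumed value.
h        = fspecial('average', [5 5]);        % averaging (mean) filter mask
smoothed = imfilter(double(bwFrame), h);      % spatial smoothing of the binary frame
edges    = edge(smoothed, 'canny');           % Canny edge map of the pupil/iris region
imshow(edges);                                % corresponds to Figure 4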

Figure 4. Detected edges

Figure 5. Dilated image

D. Pupil segmentation
After a clear edge of the desired region (the pupil) has been obtained, the movement of the human eye has to be tracked; the pupil portion is used for this. A dilation operation is performed on the images to make the pupil portion more prominent. After applying this morphological operation, as illustrated in Figure 5, the only hole highlighted in the image is the hole of the pupil. This pupil hole is filled with white to make the pupil prominent, as shown in Figure 6. After the pupil portion has been identified, the next step is to segment this portion from the rest of the image. This is achieved by detecting the center point of the pupil in each image. After the holes in the images are filled, the resulting images have the pupil portion in white and the rest of the image in black. The only region containing the maximum number of white pixels is the pupil, so finding the column with the maximum number of white pixels and calculating its center gives the center point of the pupil, as shown in Figure 7. The radius of the pupil is then calculated, and using the starting and ending values of the column with the maximum number of white pixels together with the pupil radius, the coordinates are computed and the pupil portion is segmented, as shown in Figure 8.
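A sketch of the dilation and hole-filling part of this step is shown below; the disk-shaped structuring element and its radius are assumptions, as the paper does not specify them.

% Thicken the Canny edges and fill the enclosed pupil hole with white.
% 'edges' is the Canny edge map from the previous step; strel('disk', 2)
% is an assumed structuring element.
dilated = imdilate(edges, strel('disk', 2));  % corresponds to Figure 5
filled  = imfill(dilated, 'holes');           % pupil hole filled white (Figure 6)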

Figure 6. Filled pupil

Figure 7. Pupil center point

Figure 8. Segmented pupil
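The center point detection and segmentation described above might be coded as in the following sketch; the variable names are illustrative and follow the column/radius construction described in the text.

% Locate the pupil center from the column with the most white pixels,
% estimate the radius, and crop the pupil region (Figures 7 and 8).
% 'filled' is the hole-filled image from the previous sketch.
whitePerCol = sum(filled, 1);                    % white-pixel count of every column
[maxWhite, col] = max(whitePerCol);              % column with the maximum white pixels
fstart = find(filled(:, col), 1, 'first');       % first white row in that column
fend   = find(filled(:, col), 1, 'last');        % last white row in that column
centerRow = round((fstart + fend) / 2);          % pupil center point (row coordinate)
rad    = centerRow - fstart;                     % pupil radius: center minus starting point
pupil  = filled(fstart:fend, col-rad:col+rad);   % segmented pupil region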

E. Center point comparison for shifting
The center point of the pupil lies in the column with the maximum number of white pixels. After the center point of each image has been calculated, these points are stored in a 2-D array with their x and y coordinates. The next task is to compare these points in order to determine the shifting of the pupil from one image to the next. This shifting value is used to control the mouse movement accordingly. If the difference between two center points is negative, the mouse is moved to the right to the corresponding location; if the difference is positive, the mouse is moved to the left to the corresponding location. If no center point is detected, the cursor remains at the previously specified location.
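The comparison rule above can be sketched as follows. The scale factor mapping the pupil shift to a cursor shift, and the use of the root PointerLocation property to move the cursor, are assumptions; the paper does not state how the shift is translated into cursor coordinates.

% Compare consecutive pupil center points and decide the cursor shift.
% centers is an N-by-2 array of [x y] pupil centers, one row per frame;
% rows of NaN mark frames where no center point was detected.
% 'scale' is an assumed gain from pixel shift to cursor shift.
scale = 10;
for k = 2:size(centers, 1)
    if any(isnan(centers(k, :))) || any(isnan(centers(k-1, :)))
        continue;                              % no center point: cursor stays put
    end
    dx  = centers(k, 1) - centers(k-1, 1);     % shift of the pupil center in x
    pos = get(0, 'PointerLocation');           % current cursor position [x y]
    if dx < 0
        set(0, 'PointerLocation', pos + [scale 0]);   % negative difference: move right
    elseif dx > 0
        set(0, 'PointerLocation', pos - [scale 0]);   % positive difference: move left
    end
end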

F. Blink detection

To detect an eye blink, the first step is to look for the center point of the pupil. If the eye is open, the center point of the pupil is easily calculated, since there is a maximum number of white pupil pixels somewhere in the image. If the eye is closed, there are no white pupil pixels in the image and thus no center point is calculated. A count variable is used: if the eye is open and a center point is calculated, the count variable is set to 1, and if the eye is closed, the count variable is 0. If the count value is 1 in one image and 0 in the next image, or vice versa, an eye blink has occurred. So, by comparing the values of the count variable, eye blinks are detected.

G. Mouse movement

The stored center points of the pupil are used to trigger mouse movements. These points are passed to the mouse function one by one, resulting in the movement of the mouse between these points. If the eye is closed, no center point is calculated, so no value is passed to the mouse function and the mouse cursor remains at its previous position.
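A combined sketch of the count-based blink test and the cursor movement is given below. The use of the root PointerLocation property as the "mouse function" is an assumption about how this could be realized in MATLAB; the paper does not name the exact call it uses.

% Count-based blink detection and cursor movement over the stored centers.
% centers is an N-by-2 array of pupil centers with NaN rows for closed eyes.
prevCount = ~any(isnan(centers(1, :)));        % 1 if the eye is open in the first frame
for k = 2:size(centers, 1)
    count = ~any(isnan(centers(k, :)));        % 1 = eye open, 0 = eye closed
    if count ~= prevCount
        disp('Eye blink detected');            % count changed between consecutive frames
    end
    if count                                   % open eye: move the cursor to this center
        set(0, 'PointerLocation', centers(k, :));
    end                                        % closed eye: cursor stays where it was
    prevCount = count;
end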

4. EXPERIMENTAL RESULTS

The system has been tested for accuracy and efficiency on many recorded videos. The results show a system accuracy of approximately 90%. It was noted that, for accurate eye tracking, the center point of the pupil must be detected correctly. MATLAB R2009a (version 7.8.0) was used for the design and implementation of the eye tracking system. MATLAB was chosen because it has powerful graphics and is easy to use: programs can be interpreted easily and debugging is straightforward. MATLAB also offers a large number of additional applications for analysis, data acquisition, image and signal processing, finance, control design and simulation, and it is widely used for teaching basic mathematical and engineering concepts and for simple mathematical manipulations with matrices. The system was tested on sample images from different videos with different eye directions.

The original RGB image captured from a recorded video is shown in Figure 9.

Figure 9. Original RGB image

A binary operation is applied to the original sample image to make the computation easier. The resulting binary image is given in Figure 10.

Figure 10. Binary image

A Canny filter is applied to the binary image in Figure 10 to obtain clear and prominent edges, as it highlights the important information in the image. The resulting image with prominent edges is shown in Figure 11.

Figure 11. Edge detected image

After obtaining prominent edges, the next step is to make the edges thicker and more visible. Image dilation, a morphological operation, is performed to make the edges thicker and more prominent, as shown in Figure 12.

Figure 12. Image after morphological operation

In the dilated image, the hole of the pupil is filled with white to make it clearer. After filling the hole of the pupil, the next task is to calculate the center point of the pupil for eye tracking. The center point is calculated by finding the middle value of the column having the maximum number of consecutive white pixels. The image with the marked center point of the pupil is shown in Figure 13.

Figure 13. Center point of pupil

After detecting the center point, the radius of the pupil is calculated by finding the distance from the start of the column having the maximum number of white pixels to the center point of the pupil. This radius is then used to draw a circle around the pupil on the original grayscale image. The radius of the pupil is given by:

Radius = center point - starting point;

Using this radius, a circle is drawn on the image. First the value re2 is computed from the radius:

re2 = round(2*pi*rad);

where rad is the radius of the pupil. Then, in a loop of the form

for loop = 0 : step2 : 2*pi

the coordinates xx2 and yy2 are calculated for each angle and, by using these values to index the image, a circle of radius rad is drawn on the grayscale image. The midpoint is mapped onto the image using the following steps:

Image(Centerpoint-1, Column) = 255;
Image(Centerpoint, Column-1) = 255;
Image(Centerpoint, Column) = 255;
Image(Centerpoint, Column+1) = 255;
Image(Centerpoint+1, Column) = 255;

where Image is the original grayscale image on which the center point has to be mapped and Column is the column having the maximum number of consecutive white pixels. The grayscale image with the pupil circle and the pupil midpoint mapped is shown in Figure 14.
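A sketch of how the xx2 and yy2 values might be computed and used to draw the circle and the midpoint marker is given below. The angular step step2 is assumed to be 2*pi/re2, which the paper does not state explicitly.

% Draw the pupil circle and mark the midpoint on the grayscale image.
% Centerpoint, Column, rad and the grayscale image Image come from the
% preceding steps; step2 = 2*pi/re2 is an assumed definition of the step.
re2   = round(2*pi*rad);        % number of boundary points to plot
step2 = 2*pi / re2;             % assumed angular step between points
for loop = 0:step2:2*pi
    xx2 = round(Centerpoint + rad*sin(loop));   % row of a point on the circle
    yy2 = round(Column     + rad*cos(loop));    % column of a point on the circle
    Image(xx2, yy2) = 255;                      % mark the circle point in white
end
% Mark the midpoint as a small cross, as listed in the text above.
Image(Centerpoint-1, Column)   = 255;
Image(Centerpoint,   Column-1) = 255;
Image(Centerpoint,   Column)   = 255;
Image(Centerpoint,   Column+1) = 255;
Image(Centerpoint+1, Column)   = 255;
imshow(Image);                  % corresponds to Figure 14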

Figure 14. Center detected image

After obtaining the pupil portion with its center point, the next step is to segment this portion from the rest of the image, as shown in Figure 15. Segmentation is done by calculating the four coordinates of the pupil portion to be segmented, as follows:

r1 = fstart - 1;
r2 = fend - 1;
r3 = Column - rad;
r4 = Column + rad;
pupil = imgg(r1:r2, r3:r4);

where fstart is the starting point and fend is the ending point of the column having the maximum number of white pixels.

Figure 15. Image of segmented pupil

The same steps are applied to other sample images from different recorded videos to obtain the pupil portion for eye tracking and for controlling an interface. The original RGB image captured from another recorded video is shown in Figure 16.

Figure 16. Original RGB image

After applying the binary operation to Figure 16, the resulting image is shown in Figure 17.

Figure 17. Binary image

On the binary image shown in Figure 17, the edges are detected, as shown in Figure 18.

Figure 18. Edge detected image

The filled pupil with its marked center point is shown in Figure 19.

Figure 19. Center point of pupil

The center point of the pupil marked on the grayscale image is shown in Figure 20.

Figure 20. Center detected image

The segmented pupil portion is shown in Figure 21.

Figure 21. Segmented pupil

5. CONCLUSION AND FUTURE WORK


This research provides a system that is able to trigger mouse movements for controlling an interface for people who suffer from severe physical disabilities and who cannot operate a computer with their hands. The system tracks eye movements efficiently and accurately by using the pupil portion and accurately detects eye blinks, whether voluntary or involuntary. The system tracks the eye region with 90% detection accuracy. The system has been extended to work in real time using recorded videos. The proposed system is purely non-intrusive, as no hardware device is attached to the human body, so the system is user-friendly and easy to configure. Some aspects of the system are still under experimental conditions and development, but the project proved to be an overall success and achieved the goals and requirements set out in the system specifications.

Many aspects of the system can be part of future work on a more efficient and robust eye tracking system. With some modifications, the system can be shifted from recorded videos to a live webcam video, making it a live system. The system can be extended so that it also detects human eye gaze and acts accordingly. A mouse action could be triggered when a blink is detected. The efficiency of the system can be improved further to make it a more efficient, dynamic system.

REFERENCES

[1] http://en.wikipedia.org/wiki/Image_processing, accessed 19th June 2011.
[2] Alex Poole, Linden J. Ball, "Eye tracking in human-computer interaction and usability research: current status and future prospects," Encyclopedia of Human Computer Interaction, 2006, pp. 211-219.
[3] http://eyetrackingupdate.com/2011/06/27/eye-tracking-astigmatism-correction/, accessed 28th June 2011.
[4] Filippo Galgani, Yiwen Sun, Pier Luca Lanzi, Jason Leigh, "Automatic analysis of eye tracking data for medical diagnosis," in Proceedings of CIDM 2009, pp. 195-202.
[5] http://www.gazegroup.org/downloads/23-gazetracker, accessed 8th May 2011.
[6] http://en.wikipedia.org/wiki/Eye_Tracking_Device, accessed 8th May 2011.
[7] http://en.wikipedia.org/wiki/Driver_Monitoring_System, accessed 8th May 2011.
[8] http://www.polhemus.com/?page=Eye_VisionTrak, accessed 8th May 2011.
[9] http://www.objectivetechnology.com/market-research/software/imotions-attention-tool, accessed 8th May 2011.
[10] http://thirtysixthspan.com/openEyes/, accessed 8th May 2011.
[11] http://eyetrackingupdate.com/2010/06/30/eye-tracking-free-web-usability-tools/, accessed 8th May 2011.
[12] http://www.amo-inc.com/products/refractive/ilasik/star-s4-ir-excimer-laser
[13] http://www.tobii.com/analysis-and-research/global/products/hardware/tobii-tx300-eye-tracker/
[14] Shazia Azam, Aihab Khan, M.S.H. Khiyal, "Design and implementation of a human computer interface tracking system based on multiple eye features," JATIT Journal of Theoretical and Applied Information Technology, Vol. 9, No. 2, Nov. 2009.
[15] Ioana Bacivarov, Mircea Ionita, Peter Corcoran, "Statistical models of appearance for eye tracking and eye blink detection and measurement," IEEE Transactions on Consumer Electronics, Vol. 54, No. 3, pp. 1312-1320, August 2008.
[16] Junwen Wu, Mohan M. Trivedi, "Simultaneous eye tracking and blink detection with interactive particle filters," EURASIP Journal on Advances in Signal Processing, Volume 28, 17 pages, October 2007.
[17] Kristen Grauman, Margrit Betke, James Gips, Gary R. Bradski, "Communication via eye blinks: detection and duration analysis in real time," in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Lihue, HI, Vol. 1, p. 1010, 2001.
[18] Aleksandra Krolak, Pawel Strumillo, "Vision-based eye blink monitoring system for human-computer interfacing," in Advances in Human-System Interactions Conference, pp. 994-998, May 25-27, 2008.


AUTHOR BIOGRAPHIES

Sidra Naveed is an undergraduate student of the Department of Software Engineering, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan.

Bushra Sikander is a Lecturer in the Department of Computer Science, Fatima Jinnah Women University, The Mall, Rawalpindi, Pakistan. She holds an MS in Computer Science.

Dr. Malik Sikandar Hayat Khiyal is Head of Academic (ES), APCOMS, Khadim Hussain Road, Lalkurti, Pakistan. He served in the Pakistan Atomic Energy Commission (PAEC) for 25 years and was involved in different research and development programs of the PAEC. He developed software for underground flow and advanced fluid dynamics techniques, and he also taught at the Computer Training Centre, PAEC, and at International Islamic University. His areas of interest are numerical analysis of algorithms, theory of automata and theory of computation. He has more than a hundred research publications in national and international journals and conference proceedings. He has supervised three PhD students and more than one hundred and thirty research projects at graduate and postgraduate level. He is a member of SIAM, ACM, the Informing Science Institute and IACSIT. He is an associate editor of IJCTE and co-editor of the journals JATIT and International Journal of Reviews in Computing. He is a reviewer for the journals IJCSIT, JIISIT, IJCEE and CEE (Elsevier).
