EYE TRACKER BASED HUMAN COMPUTER INTERACTION
Saswati Pal, 14204011, 3rd Semester, M.Tech (ECE)
Dr. B. R. AMBEDKAR NATIONAL INSTITUTE OF TECHNOLOGY, JALANDHAR
Under the Guidance of: Dr. Arun Khosla, Associate Professor, Department of ECE
CONTENTS
• Introduction
• What is Human Computer Interaction (HCI)?
• HCI Technologies
• What is Eye Tracker?
• Why go for an Eye Tracker?
• Literature Survey
• Problem Formulation
• References
Introduction
For many people with physical disabilities, even simple tasks may require assistance.
There is a category of people with severe speech and motor impairment or with neuro-locomotor
disabilities who cannot speak and cannot use sign language. [11]
Assistive technology (AT) promotes greater independence for people with disabilities by enabling
them to perform tasks that they were formerly unable to accomplish. [1]
If these people have a good level of understanding and perception, they can still use their eyes for
Human-Computer Interaction (HCI).
An eye tracker makes it possible for severely physically disabled users to interact with computers and
machines using their eyes.
What is HCI?
Human Computer Interaction (HCI) is the study of how computers
and people work together. [12]
It is the interface that makes software and computers easier
for people to use.
HCI research aims to improve the usability of computer interfaces.
Nowadays, eye tracking is used as a method in HCI research.
HCI is what the user sees, and it includes:
• What the system looks like
• How the system accepts input from the user
• How the system responds to user input
• How the system outputs the results of processing
HCI Technologies
The existing HCI technologies take various approaches:
Electroencephalography (EEG) based HCI
Lee, J. C., et al., in their work “Using a Low-Cost
Electroencephalograph for Task Classification in HCI Research”,
developed a system where users explicitly manipulate their
brain activity, instead of using motor movements, to produce
signals that can be used to control computers or communication
devices [13].
EEG-based evaluation of HCI is tedious. EEG is susceptible to interference from a variety of sources,
such as physical movement of the person’s body and other electronic equipment. Moreover,
classifying the signal acquired from EEG is difficult.
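As a rough illustration of what such task classification involves, the sketch below extracts band-power features from short EEG windows and applies a simple nearest-mean classifier. It is a minimal, hypothetical sketch on synthetic signals, not Lee et al.'s actual pipeline; the sampling rate, bands and data are illustrative.

```python
# Minimal, hypothetical sketch of EEG task classification:
# band-power features from short windows + a nearest-class-mean classifier.
# Synthetic data stands in for a real recording; all parameters are illustrative.
import numpy as np

FS = 256  # assumed sampling rate in Hz

def band_power(window, fs, lo, hi):
    """Average spectral power of one EEG window in the [lo, hi] Hz band."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / fs)
    mask = (freqs >= lo) & (freqs <= hi)
    return spectrum[mask].mean()

def features(window, fs=FS):
    """Alpha (8-13 Hz) and beta (13-30 Hz) power as a 2-D feature vector."""
    return np.array([band_power(window, fs, 8, 13),
                     band_power(window, fs, 13, 30)])

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    t = np.arange(FS) / FS
    # Two fake "mental tasks": one with strong alpha, one with strong beta.
    rest = [np.sin(2*np.pi*10*t) + 0.5*rng.standard_normal(FS) for _ in range(20)]
    task = [np.sin(2*np.pi*20*t) + 0.5*rng.standard_normal(FS) for _ in range(20)]
    X = np.array([features(w) for w in rest + task])
    y = np.array([0]*20 + [1]*20)

    # Nearest-class-mean classifier: enough to illustrate the idea.
    means = np.array([X[y == c].mean(axis=0) for c in (0, 1)])
    pred = np.argmin(((X[:, None, :] - means[None]) ** 2).sum(axis=2), axis=1)
    print("training accuracy:", (pred == y).mean())
```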
EEG based Speller
Van Kokswijk, J., and Van Hulle, M., in their work “Self adaptive BCI as Service-oriented Information
System for Patients with Communication Disabilities”, developed a wireless BCI that enables subjects
to spell text on a computer screen by detecting P300 signals in their EEG. It consists of an 8-channel
wireless EEG amplifier connected to a wireless transmitter and a USB receiver, which is plugged into
the USB port of a computer or smartphone. The data is wirelessly transmitted in real time to a
receiver located up to ten meters from the system [9].
The limitations of this device are the real-time decoding of the EEG activity, the loss of EEG samples at the
receiver side, and the slow response time of the EEG amplifiers (actually of the filters therein), which causes
long waiting times for them to settle.
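A P300 speller works by flashing stimuli and averaging the EEG epochs time-locked to each flash; the stimulus whose averaged response is largest around 300 ms after onset is taken as the attended one. Below is a toy sketch of that selection step on synthetic epochs; it is not the cited system's decoder, and the sampling rate and window sizes are assumptions.

```python
# Toy sketch of P300-style target selection: average the epochs that follow
# each stimulus and pick the stimulus with the largest response ~300 ms
# after onset. Synthetic epochs are used; parameters are illustrative.
import numpy as np

FS = 250                      # assumed sampling rate (Hz)
EPOCH = int(0.6 * FS)         # 600 ms analysis window per flash
P300_SAMPLE = int(0.3 * FS)   # sample index of the expected P300 peak

def p300_template(amplitude):
    """A crude P300-like bump centred at 300 ms."""
    t = np.arange(EPOCH)
    return amplitude * np.exp(-0.5 * ((t - P300_SAMPLE) / (0.05 * FS)) ** 2)

def pick_target(epochs_per_stimulus):
    """epochs_per_stimulus: dict stimulus -> array (n_epochs, EPOCH).
    Returns the stimulus whose averaged epoch is largest near 300 ms."""
    scores = {}
    for stim, epochs in epochs_per_stimulus.items():
        avg = epochs.mean(axis=0)
        # mean amplitude in a 100 ms window around the expected peak
        lo, hi = P300_SAMPLE - int(0.05 * FS), P300_SAMPLE + int(0.05 * FS)
        scores[stim] = avg[lo:hi].mean()
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    noise = lambda: rng.standard_normal((10, EPOCH))
    data = {"A": noise(), "B": noise() + p300_template(2.0), "C": noise()}
    target, scores = pick_target(data)
    print("decoded target:", target, scores)
```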
Electromyography (EMG) based HCI
Pinheiro, P., et al., in their work “Anticipative Shared
Control for Robotic Wheelchairs Used by People with
Disabilities”, developed a robotic wheelchair driven by
the user through a facial EMG signal interface using
the anticipative shared control technique [14]. The shared
control algorithm anticipates the user’s intentions by
considering data sources such as agendas, the travelled path, the pose of the wheelchair and the time of
day, and takes the favorable action.
For each user, a plan must be created individually, and if there is any drastic change in the daily routine
the plan must be updated. Any muscle movement of the head or neck can contaminate the
corresponding EMG signal with large noise.
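One way to picture the anticipative part is as scoring candidate destinations against contextual cues (agenda, time of day, distance from the current pose) and pre-committing to the most likely one. The sketch below is purely illustrative and is not Pinheiro et al.'s algorithm; all destinations, weights and cues are made up.

```python
# Toy illustration of "anticipative" destination scoring for a wheelchair:
# combine contextual cues (agenda, time of day, distance from current pose)
# into a score and pick the most likely destination. Entirely hypothetical.
from dataclasses import dataclass
import math

@dataclass
class Destination:
    name: str
    x: float
    y: float
    usual_hour: int      # hour of day the user usually goes there

def score(dest, pose, hour, on_agenda):
    """Higher is more likely; weights are arbitrary, for illustration only."""
    dist = math.hypot(dest.x - pose[0], dest.y - pose[1])
    time_match = max(0.0, 1.0 - abs(dest.usual_hour - hour) / 12.0)
    return 2.0 * time_match + (1.5 if on_agenda else 0.0) - 0.1 * dist

if __name__ == "__main__":
    destinations = [Destination("kitchen", 5, 2, 12),
                    Destination("bedroom", 12, 8, 22),
                    Destination("living room", 2, 6, 18)]
    pose, hour, agenda = (4.0, 3.0), 13, {"kitchen"}
    best = max(destinations, key=lambda d: score(d, pose, hour, d.name in agenda))
    print("anticipated destination:", best.name)
```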
Electrooculography (EOG) based HCI
Desai, Y. S., in his work “Natural Eye Movement & its application for paralyzed patients”, proposed a
system to communicate the needs of the patient in writing in a relatively short time by using a virtual-keyboard-style
graphical user interface [16].
The EOG recording technique requires electrodes to be placed on both sides of the eyes, and hence
needs a trained helper. EOG-based systems using DC-coupled amplifiers face the problem of baseline
drift, while AC-coupled amplifiers record only changes in eye position. The measured EOG voltage may
vary due to varying eye movement and baseline drift.
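Baseline drift in a DC-coupled EOG channel is typically suppressed by high-pass (or detrending) filtering before the eye position is estimated. The sketch below shows this with a zero-phase Butterworth filter from SciPy; the cut-off frequency, sampling rate and synthetic signal are assumptions for illustration.

```python
# Minimal sketch of removing slow baseline drift from a DC-coupled EOG channel
# with a zero-phase high-pass filter. Cut-off and test signal are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 200.0  # assumed sampling rate (Hz)

def remove_drift(eog, fs=FS, cutoff_hz=0.05, order=2):
    """Zero-phase Butterworth high-pass to suppress baseline drift."""
    b, a = butter(order, cutoff_hz / (fs / 2.0), btype="highpass")
    return filtfilt(b, a, eog)

if __name__ == "__main__":
    t = np.arange(0, 60, 1 / FS)
    saccades = 0.3 * np.sign(np.sin(2 * np.pi * 0.2 * t))   # fake eye movements
    drift = 0.01 * t                                          # slow electrode drift
    cleaned = remove_drift(saccades + drift)
    print("signal span before: %.2f, after drift removal: %.2f"
          % (np.ptp(saccades + drift), np.ptp(cleaned)))
```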
Mind Reading Speller
Using functional Magnetic Resonance Imaging (fMRI), researchers have developed a device that can
help people who are mute or have difficulty speaking to converse with others. MRI uses radio waves
and magnetic fields to create detailed images of the test subject. In fMRI, imaging is used to study
the blood flow in the brain to detect areas of brain
activity. A computer records the changes in the
blood flow to show how the brain is working in real
time [17].
What is Eye Tracker?
Eye trackers are instruments that measure visual activity.
An eye tracker records how the eyes move while a subject is completing a task, for example on a
website.
It captures the reflections off the retina and the cornea of the eye, commonly referred to as “red
eye” and the glint, respectively. [6]
It provides computers with the user's gaze position in real time by tracking the position of the pupils
(a minimal sketch of this pupil-detection step is shown after the figures below).
The trackers can either be worn directly on the user's face, like glasses, or placed in front of them,
such as beneath a computer monitor.
So, both head-mounted and desktop-mounted eye trackers are available on the market.
Head-mounted eye tracker [15] Desktop-mounted eye tracker [18]
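As a rough illustration of the pupil-tracking step, the sketch below thresholds a dark-pupil eye image and takes the centroid of the largest dark blob as the pupil centre. It assumes an OpenCV-readable eye-region image ("eye.png" is a placeholder path); the threshold values are illustrative and not those of any cited system.

```python
# Rough sketch of dark-pupil detection: threshold the eye image, keep the
# largest dark blob, and use its centroid as the pupil centre.
# 'eye.png' is a placeholder; threshold values are illustrative.
import cv2
import numpy as np

def pupil_center(gray, thresh=40):
    """Return the (x, y) centre of the largest dark blob, or None."""
    blurred = cv2.GaussianBlur(gray, (7, 7), 0)
    _, binary = cv2.threshold(blurred, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

if __name__ == "__main__":
    gray = cv2.imread("eye.png", cv2.IMREAD_GRAYSCALE)  # placeholder image path
    if gray is not None:
        print("pupil centre:", pupil_center(gray))
```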
Why go for an Eye Tracker?
An eye tracker gives an opportunity to gain deep insight into the user experience.
Used for evaluating interactive media such as websites, software products, e-mail campaigns.
Used to control the mouse cursor on wall-sized displays.
Used to record user attention patterns, their interest in various elements.
It makes it possible for physically disabled users to interact with computers using their eyes.
Literature Survey
Lupu, R. G., et al. developed an Eye Tracking Mouse (ETM) system using a webcam, video glasses and an
eye-tracking algorithm. The software application was written in C++ and C# using Visual Studio 2010
and the OpenCV library for image processing. They performed an experimental procedure in order to
evaluate the efficiency of the ETM system, providing solutions for disabled persons' communication and
independent activities. [1]
Hung, A., et al. proposed the EyePhone framework, a mobile HCI that allows users to control mobile
phones through eye or facial movements. They developed a wearable EEG headset connected to a PC
using a wireless dongle, which in turn is connected to an Android smartphone using Bluetooth. A
gyroscope built into the headset drives a mouse emulator, allowing users to move a cursor around the
smartphone screen by moving their head and eyes, and an emergency dialer allows users to make an
emergency phone call through a certain pattern of eye/facial movements. [2]
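The gyroscope-to-cursor mapping can be pictured as integrating the angular rate of the head over each sample interval and scaling it to screen pixels. The sketch below is a toy illustration of that idea, not the EyePhone implementation; the gains, screen size and simulated gyroscope samples are made up.

```python
# Toy sketch of mapping gyroscope angular rates to cursor displacement:
# integrate yaw/pitch rate over the sample interval and scale to pixels.
# Gains, screen size and simulated samples are illustrative assumptions.
SCREEN_W, SCREEN_H = 480, 800        # assumed phone screen size (px)
PIX_PER_DEG = 20.0                   # assumed sensitivity

def update_cursor(cursor, yaw_rate, pitch_rate, dt):
    """cursor: (x, y); rates in deg/s; dt in seconds. Returns the new (x, y)."""
    x = cursor[0] + yaw_rate * dt * PIX_PER_DEG
    y = cursor[1] + pitch_rate * dt * PIX_PER_DEG
    # clamp to the screen
    return (min(max(x, 0), SCREEN_W - 1), min(max(y, 0), SCREEN_H - 1))

if __name__ == "__main__":
    cursor = (SCREEN_W / 2, SCREEN_H / 2)
    samples = [(30.0, 0.0)] * 10 + [(0.0, -15.0)] * 10   # fake gyro readings
    for yaw, pitch in samples:
        cursor = update_cursor(cursor, yaw, pitch, dt=0.02)
    print("cursor after head movement:", cursor)
```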
Murugesan, G., et al. proposed an eye-tracking system using a low-resolution webcam attached to a wearable
glass frame. Images captured from the video input are converted into binary using a
dynamic threshold to detect the iris. The center point of the iris is used to estimate the eye gaze. The cursor
is moved in the desired direction on the screen based on the estimated eye gaze, and blinking of the
eyes executes mouse clicks. [3]
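Blink-based clicking can be approximated by watching the fraction of dark (iris/pupil) pixels in the eye region: during a blink that fraction drops for a few consecutive frames. The sketch below is a simplified stand-in for the cited method; the grey-level threshold, dark fraction and frame count are assumptions.

```python
# Simplified blink-as-click detector: when the fraction of dark pixels in the
# eye region stays low for several consecutive frames, report a "click".
# Thresholds and frame counts are assumptions, not the cited system's values.
import numpy as np

DARK_LEVEL = 60        # grey level below which a pixel counts as iris/pupil
DARK_FRACTION = 0.02   # below this fraction of dark pixels the eye is "closed"
MIN_FRAMES = 3         # consecutive closed frames required for one click

class BlinkClicker:
    def __init__(self):
        self.closed_frames = 0

    def update(self, eye_gray):
        """Feed one grayscale eye-region frame; return True when a blink ends."""
        dark = np.count_nonzero(eye_gray < DARK_LEVEL) / eye_gray.size
        if dark < DARK_FRACTION:
            self.closed_frames += 1
            return False
        finished = self.closed_frames >= MIN_FRAMES
        self.closed_frames = 0
        return finished

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    open_eye = rng.integers(0, 255, (40, 80), dtype=np.uint8)
    open_eye[15:25, 35:45] = 20                      # fake dark iris patch
    closed_eye = np.full((40, 80), 150, dtype=np.uint8)
    clicker = BlinkClicker()
    frames = [open_eye] * 5 + [closed_eye] * 4 + [open_eye] * 3
    clicks = [clicker.update(f) for f in frames]
    print("click detected at frame:", clicks.index(True))
```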
Kocejko, T., et al. proposed a real-time, daylight-based eye-gaze tracking system consisting of two
cameras, infrared markers and a PC. One camera is used for pupil tracking, while the second one monitors
the position of the head relative to the screen. The authors attempted to create an interface that
handicapped users could use instead of a mouse or keyboard. [4]
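A common way of turning pupil (or marker) coordinates into screen coordinates is a short calibration: the user looks at a few known screen targets and a least-squares polynomial mapping is fitted. The sketch below shows this generic approach on synthetic data; it is not necessarily the exact method of any cited paper.

```python
# Minimal sketch of gaze calibration: fit a least-squares mapping from pupil
# image coordinates to screen coordinates using a few calibration targets.
# Generic approach with synthetic data; not any cited paper's exact method.
import numpy as np

def design(pupil_xy):
    """Quadratic design matrix [1, x, y, xy, x^2, y^2] per sample."""
    x, y = pupil_xy[:, 0], pupil_xy[:, 1]
    return np.column_stack([np.ones_like(x), x, y, x * y, x ** 2, y ** 2])

def fit_calibration(pupil_xy, screen_xy):
    """Least-squares coefficients mapping pupil coords to screen coords."""
    coeffs, *_ = np.linalg.lstsq(design(pupil_xy), screen_xy, rcond=None)
    return coeffs

def gaze_to_screen(coeffs, pupil_xy):
    return design(np.atleast_2d(pupil_xy)) @ coeffs

if __name__ == "__main__":
    # Synthetic calibration: 9 targets on a 1920x1080 screen and fake pupil data.
    gx, gy = np.meshgrid([160, 960, 1760], [90, 540, 990])
    screen = np.column_stack([gx.ravel(), gy.ravel()]).astype(float)
    pupil = screen / 40.0 + np.array([10.0, 5.0])   # pretend camera geometry
    coeffs = fit_calibration(pupil, screen)
    print("predicted screen point:", gaze_to_screen(coeffs, [34.0, 18.5]))
```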
Barreto, A., Scargle, S., et al. developed an EMG-based HCI for users with motor disabilities: an
input device that utilizes EMG biosignals from cranial muscles and EEG biosignals from the
cerebrum's occipital lobe, which are transformed into controls for two-dimensional (2-D) cursor
movement, the left-click (Enter) command, and an ON/OFF switch for the cursor-control functions.
This HCI system classifies biosignals into "mouse" functions by applying amplitude thresholds. [5]
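The amplitude-thresholding idea can be sketched as follows: compute a short-window RMS per EMG channel and map the pattern of channels exceeding their thresholds to a mouse function. The channel names, thresholds and command mapping below are illustrative assumptions, not Barreto et al.'s actual values.

```python
# Sketch of amplitude-threshold classification of EMG windows into "mouse"
# functions: per-channel RMS compared against a threshold, and the pattern of
# active channels mapped to a command. Channels/thresholds are illustrative.
import numpy as np

CHANNELS = ["left_temporalis", "right_temporalis", "frontalis", "procerus"]
THRESHOLDS = {c: 0.2 for c in CHANNELS}          # assumed RMS thresholds (mV)
COMMANDS = {                                      # assumed channel -> command map
    ("left_temporalis",): "cursor left",
    ("right_temporalis",): "cursor right",
    ("frontalis",): "cursor up",
    ("procerus",): "cursor down",
    ("left_temporalis", "right_temporalis"): "left click",
}

def classify(window):
    """window: dict channel -> 1-D sample array for one analysis window."""
    active = tuple(sorted(c for c in CHANNELS
                          if np.sqrt(np.mean(window[c] ** 2)) > THRESHOLDS[c]))
    return COMMANDS.get(active, "no action")

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    quiet = lambda: 0.05 * rng.standard_normal(200)
    burst = lambda: 0.60 * rng.standard_normal(200)
    window = {c: quiet() for c in CHANNELS}
    window["right_temporalis"] = burst()
    print(classify(window))   # expected: "cursor right"
```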
However, the systems surveyed above present limitations in terms of the execution of accurate displacements,
cursor stability and the dwell time required for a left-click operation. These limitations
may cause delays in clicking on the active sections of many typical web pages, as well as in typing
letter by letter through a virtual keyboard.
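Dwell-time clicking means the gaze must stay within a small radius of a point for a fixed duration before a click is issued, which is exactly what slows down letter-by-letter typing. A minimal, generic sketch of such a dwell detector follows; the radius and dwell duration are assumptions.

```python
# Minimal sketch of dwell-time click detection: a click fires when the gaze
# stays within a small radius of its starting point for a fixed duration.
# Radius and dwell time are assumptions for illustration.
import math

class DwellClicker:
    def __init__(self, radius_px=30.0, dwell_s=1.0):
        self.radius = radius_px
        self.dwell = dwell_s
        self.anchor = None      # (x, y, start_time) of the current fixation

    def update(self, x, y, t):
        """Feed one gaze sample at time t (seconds); return True on a click."""
        if self.anchor is None:
            self.anchor = (x, y, t)
            return False
        ax, ay, t0 = self.anchor
        if math.hypot(x - ax, y - ay) > self.radius:
            self.anchor = (x, y, t)       # gaze moved away: restart the dwell
            return False
        if t - t0 >= self.dwell:
            self.anchor = None            # click fired; wait for a new fixation
            return True
        return False

if __name__ == "__main__":
    clicker = DwellClicker()
    # Gaze samples at 50 Hz hovering near (400, 300) for ~1.2 s.
    for i in range(60):
        t = i / 50.0
        if clicker.update(400 + (i % 3), 300, t):
            print("dwell click at t = %.2f s" % t)
```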
Problem Formulation
So there is a requirement for an eye-tracking solution that provides faster
displacement of the cursor across long distances on the screen.
There is a need for an eye-tracking solution that allows persons with motor
disabilities to interact with computers or communication devices without being exhausted by
wearing devices all day long.
To develop an application using an eye tracker for Windows.
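On Windows, once a gaze point is available, the system cursor can be moved through the Win32 SetCursorPos call via ctypes. The sketch below is only a starting point, not the final application; gaze_point() is a hypothetical placeholder standing in for a real eye-tracker feed.

```python
# Starting-point sketch for a Windows eye-tracker application: forward gaze
# estimates to the system cursor via the Win32 SetCursorPos API (ctypes).
# gaze_point() is a hypothetical placeholder for a real eye-tracker feed.
import ctypes
import random
import time

user32 = ctypes.windll.user32          # Windows-only

def gaze_point():
    """Placeholder gaze source: replace with real eye-tracker output."""
    w, h = user32.GetSystemMetrics(0), user32.GetSystemMetrics(1)
    return random.randint(0, w - 1), random.randint(0, h - 1)

def run(duration_s=5.0, rate_hz=30.0):
    """Move the cursor to each new gaze estimate for a few seconds."""
    end = time.time() + duration_s
    while time.time() < end:
        x, y = gaze_point()
        user32.SetCursorPos(int(x), int(y))
        time.sleep(1.0 / rate_hz)

if __name__ == "__main__":
    run()
```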
References
[1] Lupu, R. G., Ungureanu, F., Siriteanu, V.(2013). Eye Tracking Mouse for Human Computer Interaction,
4th IEEE International Conference on E-Health and Bioengineering - EHB 2013
[2] Hung, A., English, E., et al. EyePhone: A Mobile EOG-based Human-Computer Interface for Assistive
Healthcare, 6th Annual International IEEE EMBS Conference on Neural Engineering San Diego, California, 6 -
8 November, 2013
[3] Murugesan, G., Deepika, S. S. (2015). A Novel Approach for Human Computer Interface Based on Eye
Movements for Disabled People, IEEE International Conference on Electrical, Computer and Communication
Technologies (ICECCT), 2015
[4] Kocejko, T., Bujnowski, A. et al (2008). Eye Mouse for Disabled, Conference on Human System
Interactions, Poland, 25-27 May, 2008, Pages 199-202
[5] Barreto, A., Scargle, S., et al. A practical EMG-based human-computer interface for users with motor
disabilities. Journal of Rehabilitation Research and Development, Vol. 37, No. 1, January/February 2000,
Pages 53-64
[6] Eye-control empowers people with disabilities, http://www.abilities.com/community/assistive_eye-control.html
[7] http://www.uxmatters.com/mt/archives/2009/10/eyetracking-is-it-worth-it.php#sthash.XRZk3zBL.dpuf
[8] http://phys.org/news/2015-02-eye-tracking-frontier-human-computer-interaction.html
[9] Van Kokswijk, J. , Van Hulle, M. Self adaptive BCI as Service-oriented Information System for
Patients with Communication Disabilities. 4th International Conference on New Trends in Information
Science and Service Science (NISS), May 2010, Pages 264-269
[10] Jacob, R. J., Karn, K. S., (2003). Eye tracking in human-computer interaction and usability research:
Ready to deliver the promises.
[11] http://www.sigchi.org/chi95/proceedings/tutors/edm1bdy.htm
[12] Human Computer Interaction, http://www.smartgirl.org/brain-food/career-hubs/human-computer-interaction.html
[13] Lee, J. C., Tan, D. S. Using a Low-Cost Electroencephalograph for Task Classification in HCI
Research
[14] Pinheiro, P., Cardozo, E., et al. Anticipative Shared Control for Robotic Wheelchairs Used by People
with Disabilities. IEEE International Conference on Autonomous Robot Systems and Competitions 2005
[15] Eye Tracking, http://www.wikid.eu/index.php/Eyetracking
[16] Desai, Y. S. Natural Eye Movement & its application for paralyzed patients. International Journal of
Engineering Trends and Technology (IJETT), Volume 4, Issue 4, April 2013
[17] http://www.quantumday.com/2012/06/mind-reading-device-allow-mute-and.html
[18] Tobii Technology, http://www.tobii.com/
Thank YOU