Eye Tracking: Approaches and Applications

Ali F. Rashid


Asst. Prof. Dr. Shimaa Hamed

 

Abstract

Eye-tracking technology has long been a hot topic in Human-Computer Interaction (HCI) (Eye-Tracking Technology in Human-Computer Interaction). The key idea of the approach is the combined use of head motions for visual navigation and eye pupil positions for context switching within the graphical human-computer interface (Human-Computer Interaction Based on Eye Movement Tracking). Eye tracking is a sensor technology that enables a device to know exactly where the user's eyes are focused; it can determine presence, attention, focus, drowsiness, consciousness, and other mental states. Nowadays, eye tracking has been developed into many applications as an input device, and it is also regarded as a research tool for analyzing human behavior (Eye-Tracking Technology in Human-Computer Interaction). Eye tracking involves measuring either where the eye is focused or how the eye moves as an individual views a web page. When the gaze point is used on an interface, eye tracking simplifies interaction with computers and other devices for users who cannot, or do not want to, use their hands as input. By combining eye tracking with other input devices, for example a keyboard, touchpad, or voice, interfaces become more intuitive, natural, engaging, and efficient than conventional user interfaces. In this paper we discuss eye tracking approaches and applications.

 

Introduction

The eye-movement tracking approach was introduced to technical academic research several years ago and has re-emerged as the cost of eye-tracking tools has decreased while their quality has increased. Eye-movement tracking is a method that involves recording participants' eye movements to determine the specific locations a participant's eyes rest on at a given time (Poole & Ball, 2006). The application of eye-tracking research methods has been a hot topic in recent years, especially in HCI research. Eye tracking can help tremendously in evaluating how consumers use a website, app, product, or any other type of interface. Many scientists from disciplines such as medicine, psychology, communication science, and computer science are adopting eye-tracking research methods (Duchowski, 2007).

Eye Tracking Metrics

 

Fixation: An eye-tracking metric describing a moment when the eye is comparatively stable, "encoding" or taking in information. Fixations average 218 milliseconds, with a range of 66 to 416 milliseconds.

Gaze: An eye-movement tracking metric, commonly the total of all fixation durations within a specific area. Also called "fixation cluster", "dwell", or "fixation cycle".

Area of Interest (AOI) or point-of-regard: the point in space where a person is looking. Usually used in eye-tracking research to detect where visual attention is directed.

Scan path: the spatial arrangement of a sequence of fixations.
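To make these metrics concrete, the following is a minimal sketch of how fixations and per-AOI gaze (dwell) times might be computed from raw gaze samples using a simple dispersion-threshold grouping. The sample format, thresholds, and function names are illustrative assumptions and are not taken from the works cited in this paper.

```python
# Minimal sketch; assumes gaze samples are (timestamp_ms, x, y) tuples in screen pixels.
def detect_fixations(samples, max_dispersion_px=25, min_duration_ms=66):
    """Group consecutive gaze samples into fixations (dispersion-threshold idea)."""
    fixations, window = [], []

    def close(win):
        # Emit a fixation only if the compact window lasted long enough.
        if win and win[-1][0] - win[0][0] >= min_duration_ms:
            cx = sum(s[1] for s in win) / len(win)
            cy = sum(s[2] for s in win) / len(win)
            fixations.append({"start": win[0][0], "end": win[-1][0], "x": cx, "y": cy})

    for sample in samples:
        window.append(sample)
        xs, ys = [s[1] for s in window], [s[2] for s in window]
        if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion_px:
            close(window[:-1])          # the window stopped being compact: emit the fixation
            window = [sample]           # start a new candidate window
    close(window)
    return fixations

def gaze_per_aoi(fixations, aois):
    """Sum fixation durations (the 'gaze'/dwell metric) inside each rectangular AOI."""
    totals = {name: 0 for name in aois}
    for f in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= f["x"] <= x1 and y0 <= f["y"] <= y1:
                totals[name] += f["end"] - f["start"]
    return totals
```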

 

Eye Tracking Approaches

 

In general, eye tracker devices detect and measure the eye pupil position in several ways that can be classified into three categories: contact lens based, electrooculogram based, and video based. The first category includes invasive eye trackers that use contact lenses with mirrors (Yarbus, 1967) or a magnetic search coil (Kenyon, 1985). Eye trackers that use contact lenses with mirrors involve attaching the lens to the eye, so an experiment can last only a short period of time (measured in minutes). Eye trackers with a magnetic search coil require two soft contact lenses with a coil of wire of 13 mm diameter between them. The twisted pair of wires from the search coil is connected to a magnetic coil system (Kenyon, 1985) that measures the variation of the magnetic field intensity. These eye trackers were used mainly by scientists researching the physiology and dynamics of eye movements. Despite the vast improvements and the accuracy obtained, these systems did not become widespread because of the invasive process of attaching the lens and because the head had to be kept still so as not to affect the measurements.

The eye trackers from the second category measure the eye's biopotentials using electrodes placed near the eye. Because of the very high nerve density of the retina, the eyeball is polarized, and movement of the eye causes the surrounding electric field to move as well. These voltages can be measured by placing electrodes near the eye; the amplitudes of the acquired signals depend on the position of the eye, so it is possible to determine the eye position and use it for human-computer interaction. The disadvantages are the cost of the signal amplifiers and the presence of electrodes on the subject's face.
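As a small illustration of that last point, the following is a minimal sketch of how a calibrated EOG voltage could be mapped linearly to a horizontal gaze angle. The calibration angles and the function name are hypothetical; real EOG processing also needs amplification, filtering, and drift compensation, which are omitted here.

```python
# Minimal sketch; assumes a two-point calibration where voltages were recorded while
# the user looked 30 degrees left and 30 degrees right (made-up values).
def eog_to_angle(voltage, v_left, v_right, angle_left=-30.0, angle_right=30.0):
    """Linearly map an EOG voltage sample to a horizontal gaze angle in degrees."""
    scale = (angle_right - angle_left) / (v_right - v_left)
    return angle_left + (voltage - v_left) * scale

# Usage example with made-up calibration voltages (millivolts):
print(eog_to_angle(0.0, v_left=-0.4, v_right=0.4))   # ~0 degrees (looking straight ahead)
```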

The trackers from the third category use a video camera to track the position of the eye. This can be done remotely, with the camera placed somewhere in front of the subject, or head mounted, with the camera placed below the visual axis of the eye, usually on an eyeglasses frame.

Two types of images are used in video eye tracking: images in the visible spectrum and images in the infrared spectrum (Hansen, 2005). Processing images in the visible spectrum is a passive approach that relies on ambient light reflected by the eyes; the tracked feature is the iris contour. The results of this method depend on the ambient light, and in poor lighting conditions it is very hard to detect the eye feature to track. Using an infrared light source eliminates this problem: the eye is illuminated consistently and uniformly, imperceptibly to the user (Parkhurst, 2005). Another advantage of infrared light is that it enhances a feature of the eye that is easy to detect and track: the pupil. If the light source is collinear with the eye's visual axis, the pupil looks white because of the light reflected by the retina (the so-called cat eye); otherwise it looks black. In both situations the corneal reflection can be observed as the brightest spot in the image. Both types of eye trackers, remote or head mounted, have a major drawback if they are to be used in HCI systems: the continuous change of head position. For remote trackers this can be resolved using two stereo cameras, or one wide-angle camera to find the person in front of it and another one to point at the person's face and zoom (Model, 2012; Hennessey, 2012). Features such as the 3D orientation of the subject's face and its distance are needed in order to compensate for head movement. Generally, in the case of remote eye tracker systems, the light source and camera are permanently affixed to a monitor, and the user's presence in front of the monitor and a calibration procedure for each new dialog session are required.
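As a small illustration of the corneal reflection mentioned above, the following is a minimal sketch that locates the glint as the brightest spot in a grayscale eye image. The use of OpenCV's minMaxLoc and the blur kernel size are assumptions for this sketch, not part of the cited systems.

```python
import cv2

def find_glint(eye_roi_gray):
    """Locate the corneal reflection as the brightest spot in a grayscale eye image."""
    blurred = cv2.GaussianBlur(eye_roi_gray, (5, 5), 0)   # suppress isolated hot pixels
    _, max_val, _, max_loc = cv2.minMaxLoc(blurred)
    return max_loc, max_val                               # (x, y) of the brightest pixel and its intensity
```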

In this paper we focus on the visible spectrum, using the webcam of a laptop computer as the eye tracker, following the steps below:

1- Face detection

2- Eye detection

3- Pupil detection

These three steps make it possible to track the eye pupils with a webcam, which at the same time makes for low-cost eye tracking. A minimal sketch follows.
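The sketch below illustrates the three steps with OpenCV, assuming its bundled Haar cascade files and a fixed dark-pixel threshold for the pupil; it is an illustrative outline of the webcam approach, not the exact implementation used in this paper.

```python
import cv2

# Haar cascade files bundled with OpenCV (assumed available via cv2.data).
face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

def locate_pupils(frame_bgr):
    """Return rough pupil centres found in a single webcam frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    pupils = []
    # Step 1: face detection.
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face_roi = gray[fy:fy + fh, fx:fx + fw]
        # Step 2: eye detection inside the face region.
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi):
            eye_roi = face_roi[ey:ey + eh, ex:ex + ew]
            # Step 3: pupil detection - segment the darkest pixels (assumed threshold)
            # and take their centre of mass.
            _, binary = cv2.threshold(eye_roi, 40, 255, cv2.THRESH_BINARY_INV)
            m = cv2.moments(binary)
            if m["m00"] > 0:
                pupils.append((fx + ex + m["m10"] / m["m00"],
                               fy + ey + m["m01"] / m["m00"]))
    return pupils

if __name__ == "__main__":
    cap = cv2.VideoCapture(0)          # default laptop webcam
    ok, frame = cap.read()
    if ok:
        print(locate_pupils(frame))
    cap.release()
```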
Detection Algorithms

In recent years several algorithms for eye pupil/iris detection have been developed. From the light source point of view there are two approaches: based on ambient light or on infrared light. All of them search for characteristics of the eye. Some algorithms search for features such as the darkest pixels in the image, the pixels that correspond to the pupil or iris; these are known as feature-based algorithms. Other algorithms try to fit a model (an ellipse) to the pupil/iris contour and are known as model-based algorithms.

Feature-based algorithms need to isolate the searched feature in the whole image or in a region of interest through optimal image segmentation and the centre of mass of the resulting image. The detection is affected by the corneal reflection and/or by eyelashes and eyelids, but it has an important advantage: low computing resources. Model-based algorithms search for the best candidate pixels for the pupil/iris contour in the whole image or region of interest and then apply an algorithm that best fits some of the pixels found; the centre of the model is considered the centre of the pupil/iris. The detection of candidate pixels is affected by noise in the image and requires high computational resources, but it has an important advantage: it can approximate the pupil even if the corneal reflection, eyelid, or eyelashes partially cover the pupil.

The Starburst algorithm (Parkhurst, 2005) relies on black or white pupil detection but can also be used for iris detection if the eye receives enough ambient light. It is a hybrid algorithm that searches for eye features but in the end fits an ellipse to the iris/pupil contour. The images are taken from a video camera placed right underneath the eye, at a distance of six centimetres and an angle of 30°. The algorithm starts by removing the corneal reflection, continues by finding points on the pupil contour, applies the RANSAC algorithm (Fischler, 1981) to the found points, and fits an ellipse that contains those points. Because of noisy images, different ellipses with different centres are fitted to the pupil contour for every processed frame, which causes oscillations of the determined gaze direction in HCI systems. Improvements can be made by preprocessing the acquired images and by filtering the pupil detection output (the coordinates of the pupil centre over time). The preprocessing of the acquired images consists of applying filters such as the Scale Invariant Feature Transform (SIFT) or Speeded-Up Robust Features (SURF) (Luo, 2009), which have a high degree of stability when the Gaussian blur radius is smaller than 1.5 (Carata & Manta, 2010). Yet this preprocessing does not eliminate all the noise from the image. Filtering the output coordinates improves the stability of the gaze direction by denoising the eye movement signals (Spakov, 2012).
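To illustrate the model-based idea, the following is a minimal sketch that collects candidate pupil-contour pixels by thresholding and fits an ellipse to the largest dark region. OpenCV's least-squares fitEllipse is used here as a simple stand-in for the RANSAC fitting that Starburst actually performs, and the threshold value is an assumption.

```python
import cv2

def fit_pupil_ellipse(eye_roi_gray, dark_threshold=40):
    """Fit an ellipse to the largest dark region, taken as the pupil contour."""
    # Candidate pixels: everything darker than the (assumed) threshold.
    _, binary = cv2.threshold(eye_roi_gray, dark_threshold, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    contour = max(contours, key=cv2.contourArea)       # largest dark blob
    if len(contour) < 5:                               # cv2.fitEllipse needs at least 5 points
        return None
    (cx, cy), axes, angle = cv2.fitEllipse(contour)    # least-squares ellipse fit
    return (cx, cy), axes, angle                       # the ellipse centre approximates the pupil centre
```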
The ETAR algorithm takes a feature-based approach (Lupu, 2013). It starts by searching for the region where the eye is located, using a Haar cascade filter. The region is set as the region of interest (ROI), and a mask image is constructed in order to eliminate unwanted noise from the four corners of the ROI. The algorithm continues with the determination of an optimal binary segmentation threshold. The pupil centre is determined by applying the centre of mass to the group of pixels that correspond to the pupil in the segmented ROI image. Analysis of the determined gaze direction shows that the algorithm is not sensitive to noise in the image.
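The following is a minimal sketch of the feature-based idea described above, not the authors' ETAR implementation: an elliptical mask suppresses the ROI corners, Otsu's method stands in for the "optimal" segmentation threshold, and the centre of mass of the segmented pixels gives the pupil centre.

```python
import cv2
import numpy as np

def pupil_centre(eye_roi_gray):
    """Estimate the pupil centre in an eye ROI: mask corners, segment, take the centre of mass."""
    h, w = eye_roi_gray.shape
    # Elliptical mask suppresses the four corners of the ROI, where eyelids and shadows add noise.
    mask = np.zeros((h, w), dtype=np.uint8)
    cv2.ellipse(mask, (w // 2, h // 2), (w // 2, h // 2), 0, 0, 360, 255, -1)
    # Otsu's method stands in for the "optimal binary segmentation threshold".
    thresh_val, _ = cv2.threshold(eye_roi_gray, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    _, binary = cv2.threshold(eye_roi_gray, thresh_val, 255, cv2.THRESH_BINARY_INV)
    binary = cv2.bitwise_and(binary, mask)             # keep only pixels inside the ellipse
    m = cv2.moments(binary)
    if m["m00"] == 0:
        return None                                    # no dark blob found in this frame
    return m["m10"] / m["m00"], m["m01"] / m["m00"]
```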
