FACULTY OF ENGINEERING AND SUSTAINABLE DEVELOPMENT
Department of Electronics, Mathematics and Natural Sciences

A Study of Eye-tracking Properties Utilizing Tobii Eye Tracker 5

Muhammad Mezanur Rahaman

January 2022

Student thesis, Advanced level (Master's Degree, Two Years), 30 HE credits
Master's Program in Electronic/Automation Engineering

Supervisor: Dr. Sajid Rafique
Assistant Supervisor: Shaikh Masud Rana
Examiner: Daniel Rönnow

Department of Electronics, Mathematics and Natural Sciences, University of Gävle

Acknowledgement

This Master's thesis was written at the Department of Electronics, Mathematics and Natural Sciences at the University of Gävle, Sweden. I would like to thank everyone in the department who was involved in this project and supported me with helpful discussions, knowledge, and encouragement. I give special thanks to my supervisor, Dr. Sajid Rafique, for giving me the opportunity to work on this thesis project under his supervision. I would also like to thank my examiner, Daniel Rönnow, for examining my thesis.

Finally, I would like to thank my assistant supervisor, Shaikh Masud Rana, who supported me throughout the course program and contributed ideas on the problem. I would also like to acknowledge my family members for their love and support throughout my master's program.

Abstract

Eye tracking is the process of determining the point at which a viewer is looking; an eye tracker is therefore a device that measures eye positions and eye movements. Over recent years, eye trackers have been used in research in medical technology, the visual system, rehabilitation, and human-computer interaction. This study explores the application of eye tracking to viewing and rendering images on a computer screen. In this research, the Eye Tracker 5 developed by Tobii is used, a popular instrument among eye trackers. The Tobii Eye Tracker 5 includes a software development kit (SDK) enabling the creation of new research projects based on eye tracking and head movement. This work measured streaming eye-tracking data on digital images and post-processed the gaze data to observe the gaze pattern of the human eye. The thesis investigated the impact of blinking using the subtle gaze direction (SGD) approach, in which flickering on the computer screen in peripheral vision, rather than foveal vision, attracts human attention; as soon as the viewer's foveal vision is drawn to the blinking point, the flickering is stopped, and flickering is subsequently applied to the next point of interest while the viewer is watching the previous one. The work successfully modeled flickering at the desired locations of an image. Furthermore, rendering of different images entirely through eye movement is also demonstrated in this work. It is envisaged that eye-gaze-based control technology would have tremendous applications in almost all areas of future technology, particularly in assistive technology.

Keywords: eye tracking, eye movements, controllable application with eye tracker, SGD as blinking

Table of Contents

Acknowledgement
Abstract
Table of Contents
List of Figures
1. Introduction
1.1. Purpose of the Project
1.2. Background
1.3. Goals and Deliverables
1.4. Aim of Study
1.5. Study tools
1.6. Outline
2. Theory
2.1. Tobii Eye Tracker 5
2.1.1. Tobii Service
2.2. Viewing Digital Images
2.3. Blinking for Subtle Gaze Direction (SGD)
3. Methods and Processes
3.1. Hardware Setup
3.1.1. Computer
3.1.2. Eye tracker
3.2. Main window program implementation
3.2.1. Gaze Data Stream
3.2.1.1. Closing the image window to the main window
3.2.1.2. JSON files for image storage
3.2.1.3. Subtle Gaze Direction, Rendering Blinking
3.2.2. Saccades and fixations of eye gaze data
3.2.3. Preparing data for the JSON file
3.2.4. Method of starting the application
3.2.5. Preparing data for the heatmap
4. Results
4.1. Participant experience evaluation
4.2. Blinking operation
5. Discussion
6. Conclusion
6.1. Future work
7. References
8. Appendix


List of Figures

Figure 1: Eye tracker
Figure 2: Tobii Eye Tracker 5, exterior view
Figure 3: Tobii Eye Tracker 5, internal view
Figure 4: Proposed hardware setup
Figure 5: Visualization of the gaze point and the position of the Tobii tracker
Figure 6: Flowchart
Figure 7: Flow chart
Figure 8: Tobii ON
Figure 9: Screen matched with the display
Figure 10: Tobii instruction flow
Figure 11: Look at the dot until it explodes
Figure 12: Look at the dot until it explodes
Figure 13: Calibration successful
Figure 14: Main window after calibration
Figure 15: Digital image 1, exoskeleton lab
Figure 16: Digital image 2, drilling with Exovest
Figure 17: Digital image 3, drilling with LegX
Figure 18: Heat map of digital image 1
Figure 19: Heat map of digital image 2
Figure 20: Heat map of digital image 3
Figure 21: Participants' interest ratio


1. Introduction

This thesis provides a quick overview of eye tracking, specifically employing the Tobii Eye Tracker 5. In addition, this chapter covers the project's objectives and deliverables. In particular, the thesis structure, purpose of the research, study techniques, and Sustainable Development Goals included in this thesis are briefly presented. With eye trackers becoming commercially accessible at reduced cost and with increased accuracy, eye-gaze interaction is becoming a prominent topic of study.

1.1. Purpose of the Project

The purpose of this research is to make a digital platform, such as a computer screen or display area, more interactive to the human eye. Eye-tracking technology will be used in the planned system, which would have a fully gaze-driven interface. In addition to being a fresh and interesting form of engagement, eye gazing provides several inherent benefits to a system intended for use in a public area or any institution such as the University of Gävle.

This project was created by the University of Gävle to address the COVID-19 situation, in which people use laptops and computers for their regular tasks. By employing an eye tracker, users may be able to overcome poor cleanliness and a lack of peripherals.

Subtle gaze direction (SGD) would therefore be an excellent approach for guiding a spectator around a picture in a non-intrusive manner that does not detract from the visual experience, as regular pre-rendered user interface components would. The hope is to develop a unique method for people to experience a computer display through gaze involvement.

1.2. Background

In recent years, eye-tracking technology has emerged as a critical resource for improving the quality of service and increasing the flexibility of people with disabilities or members of the public who require additional intelligent support [1]. Eye tracking has been a valuable resource for the study of conscious experience for approximately 50 years [2]. It has been used to examine a variety of topics, including the structure of fixations and strabismus when studying a subject, stress and tiredness in various types of aircraft, and the performance of visual advertisements, to mention a few [3].

Moreover, since the development of high-quality video-based eye trackers, eye tracking has become highly relevant. Previously, eye-tracking investigations required intrusive and unpleasant procedures such as contact-lens search probes [4] or electro-oculography [5]. Earlier eye-tracking approaches, including scleral coils [6] and the dual-Purkinje eye tracker [7], give better data performance in terms of signal-to-noise ratio (SNR) [8], but they are significantly more difficult to use with patients, newborns, disabled people, and unskilled observers and operators [6], [7], [8], [9].

Unfortunately, eye-tracking research has not been as widely employed as it could be. It has remained a fascinating research instrument, but it has never received the kind of recognition it deserves [10]. Several studies present a thorough examination of the issues that may hinder greater adoption of eye-tracking approaches, including the constraints and problems connected with eye-tracking hardware and software [11], as well as the quantity, extraction, and comprehension of the data analysis [3], [10], [12]. An example of an eye tracker is shown in Figure 1.

Figure 1: Eye tracker [13]

An eye tracker is a sensitive instrument that uses prediction schemes and optical sensors to collect highly accurate data on eye position, eye movements [14], [15], [16], [17], and gaze direction [18], [19]. In addition to experimental purposes, eye movements are directly usable in practical human-computer applications for assessment [20], [21], such as gaming [22] or human activity monitoring [23]. Regarding 3D virtual reality applications, tracking eye movements and generating image content in a gaze-contingent manner might be essential [24], [25].

The current study investigates the capabilities of the Tobii Eye Tracker 5, a low-cost, image-based eye-tracking device designed by Tobii for consumer application domains. Tobii states that this technology is designed for eye-gaze interaction with a realistic consumer experience, allowing individuals to sit, stand, and move about with some freedom. The Tobii Eye Tracker 5 may be used with both desktops and laptops, providing instant access and convenience. Furthermore, the eye tracker is promoted as not needing frequent re-calibration and as being capable of dealing with a wide range of physiological variations, including eye color, ethnicity, and age, while being largely unaffected by head motion or fluctuating illumination over time.

In this study, we present a survey of the device's features and technical characteristics in terms of accuracy, precision, latency, and sampling regularity. To improve the device's usefulness for research, we developed and make accessible an open-source toolkit built around the Tobii eye-tracker core software, Electron modules, Node.js, Visual Studio 2019, Git Bash, Tableau Desktop, and MATLAB 2021b that may be used to interact with the Eye Tracker 5. The potential influence of innovative low-cost commercial solutions on application areas, including the extensive use of sensor-based eye tracking, motivated us to incorporate into the created toolkit all processes used to evaluate the Tobii Eye Tracker 5, so that they could easily be transferred to other eye-tracking devices. An intriguing capability of the Tobii Eye Tracker 5 is its ability to offer an estimate of eye gaze independently for the left and right eyes, which is not afforded by the Eye Tribe device. We therefore created a calibration technique that can be performed both binocularly and monocularly with each eye, allowing users to fully utilize the Tobii Eye Tracker 5's capabilities, and we present an analysis of the discrepancies between the monocular and binocular calibration procedures.

1.3. Goals and Deliverables

The major purpose of the thesis is to produce a complete set of resources and information so that the testing approach can be recreated and included in design methods for future projects at institutions and on digital displays. The thesis includes detailed information on eye tracking using the Tobii Eye Tracker 5 as well as the test procedure for precise blinking-position accuracy of the eyes. In the results section, users are shown different images with blinking based on their interest in their chosen digital image; blinking is thus used to capture the audience's attention.

Moreover, COVID-19, sanitation concerns, and restrictions on contact all affect the implementation of this initiative, which is important for any organization or institution.


1.4. Aim of Study

The aim is to investigate the impact of external elements, such as added intelligence, on a person's feelings and emotions when evaluating the application. The study will determine whether supplementary audio information offered to a visitor while observing a digital display enhances its overall quality, analyzing visceral responses such as comprehension and pleasure.

1.5. Study tools

The Tobii Eye Tracker 5 is a sensitive piece of equipment that is studied with three digital images: the Exoskeletons Lab, Drilling with Exovest at the Exoskeletons Lab, and Drilling with LegX at the Exoskeletons Lab. Participants used the Tobii Eye Tracker 5 and expressed their interest after seeing all the images with it. Using the toolbox below, a digital application can be created for any image on the digital displays:

• Eye tracker software

• Node.js

• Visual Studio 2019

• Visual Studio Code

• Git Bash

• Tableau Desktop

• MATLAB 2021b

• Excel 2020

• Languages: HTML, JavaScript, MATLAB programming

• Unity tools

The following tools, which constitute the software development kit, are used to record the eye-tracking data:

• Tobii Interaction

• .NET Framework 4.8

• Node with NPM

• Electron module

1.6. Outline

Chapter 1 provides a quick overview of the eye tracker; the objectives and deliverables of the project are covered in this chapter, and the thesis structure, objective of the research, study methodology, and Sustainable Development Goals contained in this thesis are presented.

Chapter 2 provides the theory connected to this thesis, explaining the different approaches, methods, and ideas of related work, subtle gaze direction, saccades and fixations, and the Tobii services. Chapter 3 describes the design, development, and application of the methods from a theoretical standpoint in this project, and also includes flow diagrams as well as the suggested hardware setup and calibration. Chapter 4 presents the results of the program and program window evaluations, as well as heat maps. Chapter 5 discusses the procedure, results, and participant opinions, as well as other project ideas. Chapter 6 goes through the project's primary contributions and then explores potential future work, including prospective topics of research as well as possible extensions to the system we developed.

2. Theory

In this chapter, we look at relevant research that might help us develop our program. We proceed by investigating the psychology and other natural phenomena that affect image enjoyment. We then look at a particular technological application in the digital images of the Exoskeleton lab, where researchers were examining industrial work using the Exovest and LegX. We examine the present status of eye-gaze engagement, and finally we examine adaptations and implications of research on the topic of subtle gaze direction. We also describe the Tobii eye tracker and how it works.

2.1. Tobii Eye Tracker 5

The Tobii Eye Tracker 5 is intended for use in gaming, software development, and other applications. It is a piece of hardware that connects to the computer through USB and attaches to the bottom of the screen. It monitors the user's head and eye movements to determine where on the screen the user is looking. Its requirements and specifications are as follows [26]:

• Operating system: Windows 10 (64-bit only), RS3 or newer

• Connection: USB 2.0 connector with 0.8 m / 31.4" integrated cable + 1 m / 39.3" extension cable

• Operating distance: 45-95 cm / 18-37"

• Supported screen size: 15" to 27" [16:9] or 30" [21:9]

• Dimensions: 285 x 15 x 8.2 mm (11.2" x 0.59" x 0.32")

• System recommendation: 6th generation Intel Core (i3/i5/i7-6xxx) or later, or an equivalent AMD 64-bit processor; a minimum of 2 GHz, 8 GB RAM, and a USB port are required

• Field of view: 40 degrees by 40 degrees

The tracker also works if the user wears glasses or lenses. The Tobii eye sensor is designed with optical eye tracking, biometric data, and a high-definition IR sensor. Figure 2 shows an exterior view, and Figure 3 shows an internal view with the operating principle [26].

Figure 2: Tobii Eye Tracker 5, exterior view [27]

Figure 3: Tobii Eye Tracker 5, internal view [27]

2.1.1. Tobii Service

The SDK allows you to connect to a Tobii eye tracker, launch an Electron application, and gather and transfer x, y, and timestamp coordinates. HTML and JavaScript are used to create the Electron application's main window. Electron applications are made up of two processes: a core main process and a renderer process for rendering pages. Inter-Process Communication (IPC) allows the processes to communicate with one another. The Tobii Service is responsible for starting the calibration procedure and providing the eye-movement data to the JavaScript application [28], [29].
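As an illustration of this main/renderer split, the following minimal sketch shows how an Electron main process could forward gaze samples to the renderer over IPC. The channel name 'gaze-data' and the sendGazeSample helper are assumptions made for illustration, not the names used in the actual project.

// main.js -- Electron main process (sketch; channel name "gaze-data" is hypothetical)
const { app, BrowserWindow } = require('electron');

let win;

app.whenReady().then(() => {
  win = new BrowserWindow({
    fullscreen: true,
    webPreferences: { nodeIntegration: true, contextIsolation: false }
  });
  win.loadFile('index.html');
});

// Called whenever the Tobii service delivers a new gaze sample (transport not shown here).
function sendGazeSample(x, y, timestamp) {
  if (win) win.webContents.send('gaze-data', { x, y, timestamp });
}

// renderer.js -- the page receives the samples
const { ipcRenderer } = require('electron');
ipcRenderer.on('gaze-data', (event, sample) => {
  // sample.x and sample.y are screen coordinates; sample.timestamp is in milliseconds
  console.log(sample.x, sample.y, sample.timestamp);
});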

2.2. Viewing Digital Images

Previous work presented a five-stage theoretical information-processing model for defining aesthetic experiences: perception, explicit categorization, implicit classification, cognitive mastery, and assessment. The model illustrates the cognitive processes that may occur when a person is exposed to a digital image until they form a judgment.

Furthermore, this framework describes elements that may influence a person's appraisal of a sensory experience, such as pre-classification of the item, past experiences, field-specific knowledge, and personal preferences. The processing model produces two separate outputs. One is the aesthetic evaluation, which is the consequence of higher-level cognitive appraisal; the second is aesthetic emotion.

According to previous research, achieving high cognitive fluency while examining an image resulted in a pleasant sensory experience. Moreover, adding or removing important contextual information correlates with gains or reductions in cognitive fluency. The impact of context, in the form of prior or given knowledge, on aesthetic judgments has been investigated further, as has the influence of specialized knowledge on the conceptual readability of a designer's statement. The outcomes were graded based on two knowledge-related feelings: excitement and uncertainty. The aesthetic fluency scale was used to assess proficiency; it is a measure based on a fundamental precept of aesthetic creations.

Visitors may browse images and interact with digital displays while at home to expand their knowledge. While the notion of gaze-based information processing is contemporary, investigation into gaze characteristics dates back more than a generation. Many subsequent investigations were built on the notions of saccades and fixations. Outside of text reading, research has been performed on patterns of fixations and saccades in a variety of fields. The perception of images and the psychology underlying it is an appropriate topic for eye-tracking investigations. With the introduction of computerized eye trackers, there has been ongoing investigation of human gazing patterns during the viewing of digital images [30].

With the emergence of consumer eye trackers such as the Tobii Eye Tracker 5 and the Eye Tribe, the accuracy of eye-tracking technology has risen over the years. Both are frequently designed to be used in conjunction with standard input techniques. Completely gaze-controlled user interfaces are still a fairly new object of research. In this work, the application needs to be controlled entirely by gaze interaction rather than mouse or touch-screen input, so previous research is reviewed to determine the viability of eye gaze as a full substitute for standard input techniques [31].

A between-subjects study contrasted eye-gaze interaction with mouse input [32]. According to the findings, gaze engagement created higher subjective immersion and was more appealing to participants. The mouse, on the other hand, was the preferred means of engagement, allowing participants to solve problems more quickly. Furthermore, gaze selection was less accurate than mouse selection. There appears to be a threshold, proportional to the complexity of the required interaction, beyond which gaze interaction ceases to be as simple to use as mouse input; however, advances in the accuracy and availability of eye trackers have prompted research into more sophisticated interfaces as well as design standards associated with gaze interaction [32], [33], [34], [35].


2.3. Blinking for Subtle Gaze Direction (SGD)

It was discovered that individuals in the SGD group performed statistically significantly better when it came to viewing the panels in the intended order. Furthermore, as seen in the heat maps, viewers tended to be more concentrated on the panel sections [30]. This study is relevant to our project since it demonstrated the use of SGD as a visual navigation tool on digital images. It also shows that SGD remains efficient in directing the viewer's focus in the setting of a digital image with numerous visual distractions [35], [36], [37].

Other research examines the discrepancies between directing novices along the survey route recorded by an expert and leading users over the critical locations identified by the expert; no noticeable benefits between these were discovered [38], [39], [40].

The SGD method [41] builds on an understanding of blinking between foveal and peripheral vision, so that the viewer perceives the blinking point with peripheral vision [41].

3. Methods and Processes

In this chapter, we describe the process of formulating and executing the digital system. We go through the hardware platform that was used for the system during design and analysis. We also explore the system's planned final hardware configuration in the Laboratory of the University of Gävle, using suitable sketches and accurate details relevant to the connectivity aspects; the proposed setup is shown in Figure 4. Furthermore, we demonstrate how the Tobii Eye Tracker 5 is used to operate this software.


3.1. Hardware Setup

Figure 4: Proposed hardware setup

3.1.1. Computer

The system was designed to use a standard hardware configuration and is intended to run on a Windows PC of low to moderate specifications, such that it could run on an embedded PC in a gallery space. The system was tested on a laptop with the following specifications: Intel Core i7, 2.60 GHz, 8 GB DDR4 RAM, Intel HD Graphics, Windows 10 64-bit, 15.6" display.


3.1.2. Eye tracker

The Tobii Eye Tracker 5 was chosen as the eye tracker because it provides precise, dependable gaze tracking using consumer-available technology. The Eye Tracker 5 was connected to the PC through a USB connection and attached with adhesive or a magnet to the front of the display. Figure 5 illustrates the eye-gaze system and the perceived area.

Figure 5: Visualization of the gaze point and the position of the Tobii tracker [42]

3.2. Main window program implementation

In this section, the specific implementation details are described, covering the features outlined in the design section. Table 1 shows the flow between the Tobii service and the main window. The theory chapter describes how the Tobii tracker works; the main contribution here is the design of the JavaScript program with which the user is studied.


Table 1: Main operation with Tobii

3.2.1. Gaze Data Stream

As mentioned in the theory chapter, it was essential to launch the Tobii Eye Tracker 5 calibration agent and transfer the eye-tracker data to the main application. A solution was devised in which the Tobii service functions as the server and the Electron application as the client; the Electron application thus receives the gaze information from the Tobii service.
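The exact transport between the Tobii service and the Electron client is not detailed in the text. As a sketch, assuming the service publishes gaze samples as JSON lines {x, y, timestamp} over a local TCP socket on a hypothetical port 8887, the Electron main process could consume the stream as follows.

// Sketch only: the port number and the line-delimited JSON message format are assumptions.
const net = require('net');

function connectGazeStream(onSample) {
  const socket = net.connect({ host: '127.0.0.1', port: 8887 });
  let buffer = '';
  socket.on('data', (chunk) => {
    buffer += chunk.toString('utf8');
    let newline;
    while ((newline = buffer.indexOf('\n')) >= 0) {
      const line = buffer.slice(0, newline).trim();
      buffer = buffer.slice(newline + 1);
      if (line) onSample(JSON.parse(line));   // expected shape: {x, y, timestamp}
    }
  });
  socket.on('error', (err) => console.error('Gaze stream error:', err.message));
  return socket;
}

// Usage: forward each sample to the renderer window (see the IPC sketch in Section 2.1.1).
// connectGazeStream((sample) => sendGazeSample(sample.x, sample.y, sample.timestamp));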


Figure 6: Flowchart


Figure 7: Flow chart


3.2.1.1. Closing the image window to the main window

This approach establishes a specific region in which we want to act; if this region matches the current gaze point for several consecutive samples, the action is triggered, which in this case means the image window is closed. See also the Menu Button() function.
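A minimal sketch of this dwell-based trigger is shown below, assuming gaze samples arrive continuously from the tracker; the element id 'close-button', the 2-second dwell threshold, and the closeImageWindow() helper are hypothetical names chosen for illustration, not the project's actual identifiers.

// Sketch: close the image window when the gaze dwells on a region long enough.
const DWELL_MS = 2000;
let dwellStart = null;

function insideRect(rect, x, y) {
  return x >= rect.left && x <= rect.right && y >= rect.top && y <= rect.bottom;
}

function onGazeSample(x, y, timestamp) {
  const closeButton = document.getElementById('close-button');
  if (!closeButton) return;
  const region = closeButton.getBoundingClientRect();
  if (insideRect(region, x, y)) {
    if (dwellStart === null) dwellStart = timestamp;   // gaze entered the region
    if (timestamp - dwellStart >= DWELL_MS) {
      closeImageWindow();                              // hypothetical helper: return to the main window
      dwellStart = null;
    }
  } else {
    dwellStart = null;                                 // gaze left the region: reset the timer
  }
}

function closeImageWindow() {
  document.getElementById('image-window').style.display = 'none'; // assumed layout
}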

3.2.1.2. JSON files for image storage

This concerns the way the digital images used by the system are stored. As part of the project requirements, digital images had to be easy to add or alter, so the JSON format was chosen: every digital image is described by a JSON file. Gaze points are simply saved as coordinates and a radius relative to the pixel size of the source picture. Both image and audio files are accessed using a direct path and saved in the layout of the sibling container.
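For illustration, one image-description file might look like the sketch below; the exact field names used in the project are not given in the text, so these are assumptions. Coordinates and the radius are in pixels of the source image.

// Hypothetical contents of one image-description JSON file (field names are assumptions).
const imageDescription = {
  "title": "Exoskeleton lab",
  "image": "images/exoskeleton_lab.jpg",   // direct path to the picture
  "audio": "audio/exoskeleton_lab.mp3",    // optional audio file
  "gazePoints": [
    { "x": 640, "y": 360, "radius": 60, "info": "Details about the first object" },
    { "x": 1120, "y": 540, "radius": 60, "info": "Details about the second object" }
  ]
};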

3.2.1.3. Subtle Gaze Direction, Rendering Blinking

After loading the image data, the suitable area around each gaze point is clipped. A saturate filter is applied when separating the region from the primary digital image; this is what produces the blinking technique. The clipper function demonstrates how the modulated sections are produced, as well as the use of the saturate filter and the start of the blinking at the points fixed beforehand in the JSON file.
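A minimal sketch of such a blink is given below, assuming the clipped region is an absolutely positioned overlay element and the modulation is done with the CSS saturate() filter; the element, the 100 ms half-period, and the stop mechanism are illustrative choices, not the project's exact values.

// Sketch: blink a clipped region by modulating a CSS saturate() filter.
function startBlink(overlay) {
  let bright = false;
  const timer = setInterval(() => {
    bright = !bright;
    overlay.style.filter = bright ? 'saturate(3)' : 'saturate(1)';
  }, 100);                               // assumed 100 ms half-period
  return () => {                         // returns a stop function
    clearInterval(timer);
    overlay.style.filter = 'saturate(1)';
  };
}

// Usage (SGD idea): stop the blink as soon as the viewer's gaze approaches the point.
// const stopBlink = startBlink(document.getElementById('sgd-point-1'));
// ...inside the gaze handler: if (gazeIsNearPoint) stopBlink();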

3.2.2. Saccades and fixations of eye gaze data

Saccades are the type of eye movement used to move the fovea rapidly from one point of interest to another, while a fixation is the period of time during which the eye is kept aligned with the target, allowing the image details to be processed [26], [28]. Below are general conventions for both of these important terms, as explained by Tobii Pro [44].

Saccade facts [44]:

▪ both eyes move in the same direction

▪ the time to “plan” a saccade (latency) is task dependent and varies between 100-1000 ms

▪ the average duration of a saccade is 20-40 ms

▪ the duration of a saccade and its amplitude are linearly correlated, i.e. larger jumps produce longer durations

▪ the end point of a saccade cannot be changed when the eye is moving

▪ Saccades do not always have simple linear trajectories


Fixation facts [44]:

▪ a fixation is composed of slower and minute movements (microsaccades, tremor and drift) that help the eye align with the target and avoid perceptual fading (fixational eye movements)

▪ the duration varies between 50-600 ms (however longer fixations have been reported)

▪ the minimum duration required for information intake depends on the task and stimulus

The implementation of this algorithm is the checkFixation() function, which uses the gaze velocity:

• Vx = currentGaze.x − lastGaze.x

• Vy = currentGaze.y − lastGaze.y

• V = √(Vx² + Vy²)

Figure: red represents fixations and yellow represents saccades
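A minimal sketch of such a velocity-based check (an I-VT-style classifier) is given below; the threshold value and the {x, y} sample structure are assumptions, since the exact values used by the project's checkFixation() are not stated in the text.

// Sketch of a velocity-threshold fixation check (I-VT style).
const VELOCITY_THRESHOLD = 30;   // pixels per sample; assumed value, tune for the sampling rate
let lastGaze = null;

function checkFixation(currentGaze) {
  if (lastGaze === null) {
    lastGaze = currentGaze;
    return true;                 // treat the very first sample as a fixation
  }
  const vx = currentGaze.x - lastGaze.x;
  const vy = currentGaze.y - lastGaze.y;
  const v = Math.sqrt(vx * vx + vy * vy);
  lastGaze = currentGaze;
  return v < VELOCITY_THRESHOLD; // low velocity: fixation; high velocity: saccade
}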

3.2.3. Preparing data for the JSON file

We obtained the exoskeleton-lab images from the supervisor, designated a flashing point in each image, and adjusted each image. In addition, details about each image were added as text.

3.2.4. Method of starting the application

Firstly, the Tobii Eye Tracker 5 is connected to a laptop or desktop computer using the USB connector, and the eye tracker is mounted to the front of the screen using a magnet or adhesive. Secondly, the Tobii core software is installed (Figure 8); after installation, a program icon appears in the taskbar. Thirdly, the display screen must be adjusted (Figure 9) and calibration started (Figures 10, 11, 12, and 13); some screenshots of preparing the Tobii tracker are included. Display adjustment means that the scrolling icon of Tobii must be set at the bottom of the computer screen, and Tobii calibration is an automatic procedure in which we follow the Tobii instructions and focus our eyes on each dot that Tobii shows until it explodes.


Figure 8: Tobii ON


Figure 9: Screen matched with the display

Figure 10: Tobii instruction flow


Figure 11: Look at the dot until it explodes

Figure 12: Look at the dot until it explodes


Figure 13: Calibration successful

We are now ready to use the Tobii Eye Tracker 5. After that, the Tobii service is started and Tobii is available for calibration. The Electron application opens after calibration; this is our primary interface. There are three digital picture divs and one confirmation div. To open a digital image, we simply look at its div for 2 seconds, then look at the confirmation div for 2 seconds, and it opens as a large interface. The large image provides participants with additional information. A cross icon appears in the right corner after 1 second; this is the menu button. Looking at it closes the image, and the data is saved in the logs folder as a text file.

3.2.5. Preparing data for the heatmap

We take the data from the logs folder and load it into MATLAB, where a function arranges the x, y, and timestamp values for each image. All of the data is then prepared for Tableau, which is used to produce a heat map.

We use Tableau to construct the heat map with the x, y, and timestamp values as dimensions and select heat density, which follows the kernel density technique. Kernel density estimation is a technique for estimating probability density functions, which helps the user better understand the investigated probability distribution [43].
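As an illustration of this pre-processing step (the project performs it in MATLAB; the Node.js version below is only a sketch, and the assumed log format of one 'x y timestamp' triple per line is an assumption), the log can be converted to a CSV file that Tableau can read directly:

// Sketch: convert a gaze log into a CSV for Tableau (x, y, timestamp columns).
const fs = require('fs');

function logToCsv(logPath, csvPath) {
  const lines = fs.readFileSync(logPath, 'utf8').split(/\r?\n/).filter(Boolean);
  const rows = lines.map((line) => {
    const [x, y, timestamp] = line.trim().split(/\s+/).map(Number);
    return `${x},${y},${timestamp}`;
  });
  fs.writeFileSync(csvPath, ['x,y,timestamp', ...rows].join('\n'));
}

// Usage (hypothetical paths):
// logToCsv('logs/image1.txt', 'logs/image1.csv');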


4. Results

In this section, we demonstrate our application, in which participants view the computer screen, examine digital images with their eyes, and report their own delight, feelings, and interest. One outcome is therefore a review of the application's user experience, and another is how accurate the eye-tracking data and its functioning are.

Figure 14: Main window after calibration

Figure 14 shows our main window, which contains four divs: the first is the confirmation div and the second is the robotics-lab div with its information. To see a large image with information, we simply look at that div and then at the confirmation div, after which the image opens. The third div is roof drilling and the fourth is wall drilling; both open in the same way. After opening, we see the images in Figures 15, 16, and 17. We investigate the large image, and the blinking helps us find the objects.


Figure 15: Digital image 1, exoskeleton lab

Figure 16: Digital image 2, drilling with Exovest


Figure 17: Digital image 3, drilling with LegX

4.1. Participant experience evaluation

After testing the system, each participant provided their own feedback, which is given in the Appendix. Each question is scored out of 5 points, with 1 being strongly disagree and 5 being strongly agree, and the values are averaged over the participants. The questions and the average marks from the Appendix are shown in Table 2.

Table 2: Average evaluation form

No. | Question | Average mark
1 | Do you appreciate this application? | 4.8
2 | Does the program help you understand more information? | 4.2
3 | The system reduces the spread of infection by using an eye tracker instead of a touch screen or mouse | 4
4 | Is the blinking annoying when you see it? | 3.8
5 | Does it help in guiding you through the image? | 3.8
6 | Can you use this program without any external help? | 1.2
7 | In the future, this eye-controllable system will help to develop better programs | 5
  | Final score (out of 35) | 26.8


This gives an overall rating of 76.57 % (26.8 / 35 × 100). This is comparable to a usability rating from the System Usability Scale (SUS), on which the questionnaire is based. In this respect, the system was a success, since it enabled the development of a completely gaze-interactive system.

4.2. Blinking operation

In each image, we manually specify the coordinates of the blinking point in a JSON file. Whenever a participant notices a flashing spot, the blinking ceases, additional information about that object in the image is shown, and the participant's increased interest in that location can be observed. We then create a heat map from the data in which the participant views all the points on the image; the red regions represent areas where the participant looked at more points (Figure 18), showing that the participant is influenced by the blinking and spends more time there (Figure 20) compared to other locations (Figure 19), indicating that the participant is attracted to our points.

Figure 18: Heat map of digital image 1


Figure 19: Heat map of digital image 2

Figure 20: Heat map of digital image 3


5. Discussion

This section provides a brief overview of the whole method and its outcome. Since we are dealing with an Electron application, Node.js with NPM must be installed as an Electron requirement; Electron includes the Node modules as well as basic HTML and JavaScript files. We modified these files according to our project requirements using Visual Studio Code and created eye-tracker-controllable software. The JavaScript functions ipcRenderer, document.body.getElementsByTagName, and getBoundingClientRect() help us send, receive, and capture screen-based information, and the SGD method helps us direct attention to a specific object by exploiting human eye behavior.

This design includes three digital images: the exoskeleton lab, the Exovest, and the LegX. Each image has two SGD points, and each point conveys information on the functionality of an object. Figure 18 shows all the points where a participant looked on the screen, while Figures 19 and 20 show how much time was spent there. Our system can open and close div windows, so object information is presented to the spectator without hesitation. We employ a predetermined time interval to regulate the flashing of the attractor in a specified zone, and eye activity is used for specific purposes such as opening and closing parts of the program and providing information through blinking. Participants explored the application and provided their own feedback. As a result, I am also pleased with this application. Eye tracking is still a relatively recent phenomenon worldwide, and this research adds a little to the innovation of this research field.


Figure 21: Participants' interest ratio

Figure 21 considers the participants, based on the appendix table, for a better analysis of how successful our investigation with the Eye Tracker 5 was. Every participant made different choices about this study, as verified by the appendix table. Most respondents strongly agree with questions 1 and 7, implying that they enjoyed using the Eye Tracker 5 to open everything without using a mouse and that it is very useful and intriguing research for future education. A few responses strongly disagree with questions 5 and 6, signifying that they can manage by themselves without assistance and have some concerns about it. The investigation also revealed that three respondents had the same overall rating for our research. From Figure 21, the participants' interest efficiencies in observing our research are 88.57 %, 82.85 %, 77.14 %, 74.28 %, and 62.85 %, respectively; Masud, Mezan, and Sakib strongly agree with our study.

Most people are unfamiliar with eye-tracking applications, so we had to provide guidance before conducting the investigation. There is no need to calibrate the Tobii tracker every time, but every now and again we ran into a snag: because no two participants' skin is the same and their sitting positions vary, we sometimes need to calibrate again, although this is not always the case.

The participants assessed the form and provided feedback. According to one participant, this technique will be useful for eye research. Another participant stated that there is no need for a confirmation div in the main program, since he wants an image to open whenever he looks at it. In my opinion that is conceivable; however, the system would then become uncontrollable, since we constantly look at the screen and any of the images could open unintentionally. That is why I included an extra div, which serves as a confirmation div.

6. Conclusion

In this research, we constructed a system that develops an approach for examining digital images. The technology combines gaze interaction and subtle gaze direction to provide users with a completely new method to examine digital images, perceive blinking, and look at specific objects. Our research contribution is the design of a program in HTML and JavaScript that incorporates the configurable Eye Tracker 5; the blinking notion from subtle gaze direction was also incorporated. Furthermore, participants evaluated the system and provided vital comments on its correctness. The work could therefore be useful to other developers, allowing them to create desktop applications that use gaze interaction.

6.1. Future work

In the future, eye trackers could also be built into laptops, information desks, and other devices. In many public services we take a queue ticket with a press of the hand; perhaps in the future we could select it with our eyes. Information machines in railway stations are likewise operated by hand; if we could control them by eye, the spread of infection would be reduced.


7. References

1. Bilal, S., Mat, M. H., & Nassr, R. (2018). Design a Real-Time Eye Tracker. Proceedings of the 2018 2nd International Conference on Video and Image Processing. doi:10.1145/3301506.3301509.
2. Tecce, J. (2007). J. L. Andreassi, Psychophysiology: Human Behavior and Physiological Response, Lawrence Erlbaum Associates, Mahwah, NJ (2007). International Journal of Psychophysiology, 65(2), 174-175. doi:10.1016/j.ijpsycho.2007.02.009.
3. Funke, G., Greenlee, E., Carter, M., Dukes, A., Brown, R., & Menke, L. (2016). Which Eye Tracker Is Right for Your Research? Performance Evaluation of Several Cost Variant Eye Trackers. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60(1), 1240-1244. doi:10.1177/1541931213601289.
4. A Method of Measuring Eye Movement Using a Scleral Search Coil in a Magnetic Field. (1963). IEEE Transactions on Bio-medical Electronics, 10(4), 137-145. doi:10.1109/tbmel.1963.4322822.
5. Kaufman, A., Bandopadhay, A., & Shaviv, B. (1993). An eye tracking computer user interface. Proceedings of 1993 IEEE Research Properties in Virtual Reality Symposium. doi:10.1109/vrais.1993.378254.
6. Stevenson, S. B., & Roorda, A. (2010). Miniature eye movements measured simultaneously with ophthalmic imaging and a dual-Purkinje image eye tracker. Journal of Vision, 5(8), 590-590. doi:10.1167/5.8.590.
7. Eibenberger, K., Eibenberger, B., & Rucci, M. (2016). Design, simulation, and evaluation of uniform magnetic field systems for head-free eye movement recordings with scleral search coils. 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC). doi:10.1109/embc.2016.7590686.
8. Velisar, A., & Shanidze, N. (2021). Noise in the Machine: Sources of Physical and Computation Error in Eye Tracking with Pupil Core Wearable Eye Tracker. ACM Symposium on Eye Tracking Research and Applications. doi:10.1145/3450341.3458495.
9. Hooge, I., Holmqvist, K., & Nyström, M. (2016). The pupil is faster than the corneal reflection (CR): Are video based pupil-CR eye trackers suitable for studying detailed dynamics of eye movements? Vision Research, 128, 6-18. doi:10.1016/j.visres.2016.09.002.
10. Jacob, R. J., & Karn, K. S. (2003). Eye Tracking in Human-Computer Interaction and Usability Research. The Mind's Eye, 573-605. doi:10.1016/b978-044451020-4/50031-1.
11. Kovari, A., Katona, J., & Costescu, C. (2020). Evaluation of Eye-Movement Metrics in a Software Debugging Task using GP3 Eye Tracker. Acta Polytechnica Hungarica, 17(2), 57-76. doi:10.12700/aph.17.2.2020.2.4.
12. Maiorana, F., Leonardi, R., & Giordano, D. (2012). Eye-tracker data analysis in cephalometric landmarking. 2012 International Conference on Computer & Information Science (ICCIS). doi:10.1109/iccisci.2012.6297176.
13. https://abilitynet.org.uk/news-blogs/next-gen-eye-tracking-game-changer-disabled-people
14. Kierkels, J., Riani, J., Bergmans, J., & Boxtel, G. V. (2007). Using an Eye Tracker for Accurate Eye Movement Artifact Correction. IEEE Transactions on Biomedical Engineering, 54(7), 1256-1267. doi:10.1109/tbme.2006.889179.
15. Venugopal, D., Amudha, J., & Jyotsna, C. (2016). Developing an application using eye tracker. 2016 IEEE International Conference on Recent Trends in Electronics, Information & Communication Technology (RTEICT). doi:10.1109/rteict.2016.7808086.
16. Noureddin, B., Lawrence, P. D., & Birch, G. E. (2012). Online Removal of Eye Movement and Blink EEG Artifacts Using a High-Speed Eye Tracker. IEEE Transactions on Biomedical Engineering, 59(8), 2103-2110. doi:10.1109/tbme.2011.2108295.
17. Ware, C., & Mikaelian, H. H. (1987). An evaluation of an eye tracker as a device for computer input. ACM SIGCHI Bulletin, 18(4), 183-188. doi:10.1145/1165387.275627.
18. Elmadjian, C., Shukla, P., Tula, A. D., & Morimoto, C. H. (2018). 3D gaze estimation in the scene volume with a head-mounted eye tracker. Proceedings of the Workshop on Communication by Gaze Interaction. doi:10.1145/3206343.3206351.
19. Pichitwong, W., & Chamnongthai, K. (2019). An Eye-Tracker-Based 3D Point-of-Gaze Estimation Method Using Head Movement. IEEE Access, 7, 99086-99098. doi:10.1109/access.2019.2929195.
20. Duchowski, A. T. (2002). A breadth-first survey of eye-tracking applications. Behavior Research Methods, Instruments, & Computers, 34(4), 455-470. doi:10.3758/bf03195475.
21. Jacob, R. J., & Karn, K. S. (2003). Eye Tracking in Human-Computer Interaction and Usability Research. The Mind's Eye, 573-605. doi:10.1016/b978-044451020-4/50031-1.
22. Corcoran, P., Nanu, F., Petrescu, S., & Bigioi, P. (2012). Real-time eye gaze tracking for gaming design and consumer electronics systems. IEEE Transactions on Consumer Electronics, 58(2), 347-355. doi:10.1109/tce.2012.6227433.
23. Reimer, B., & Sodhi, M. (2006). Detecting eye movements in dynamic environments. Behavior Research Methods, 38(4), 667-682. doi:10.3758/bf03193900.
24. Maiello, G., Chessa, M., Solari, F., & Bex, P. J. (2014). Simulated disparity and peripheral blur interact during binocular fusion. Journal of Vision, 14(8), 13-13. doi:10.1167/14.8.13.
25. Ee, R. V. (2003). Correlation between Stereoanomaly and Perceived Depth When Disparity and Motion Interact in Binocular
26. Tobii eye tracker 5 | The next generation of head tracking and eye tracking | Tobii Gaming. (2021, December 15). https://gaming.tobii.com/product/eye-tracker-5/
27. Electron | Build cross-platform desktop apps with JavaScript, HTML, and CSS. (n.d.). https://www.electronjs.org/
28. Get started. (2021, September 30). Tobii Tech. https://tech.tobii.com/get-started/
29. Funke, G., Greenlee, E., Carter, M., Dukes, A., Brown, R., & Menke, L. (2016). Which Eye Tracker Is Right for Your Research? Performance Evaluation of Several Cost Variant Eye Trackers. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 60(1), 1240-1244. doi:10.1177/1541931213601289.
30. Elmadjian, C., Shukla, P., Tula, A. D., & Morimoto, C. H. (2018). 3D gaze estimation in the scene volume with a head-mounted eye tracker. Proceedings of the Workshop on Communication by Gaze Interaction. doi:10.1145/3206343.3206351.
31. Heikkilä, H., & Ovaska, S. (n.d.). Usability Evaluation of Gaze Interaction. Gaze Interaction and Applications of Eye Tracking, 255-278. doi:10.4018/978-1-61350-098-9.ch017.
32. Miyoshi, T., & Murata, A. (n.d.). Input device using eye tracker in human-computer interaction. Proceedings 10th IEEE International Workshop on Robot and Human Interactive Communication, ROMAN 2001 (Cat. No.01TH8591). doi:10.1109/roman.2001.981967.
33. Duchowski, A. T., Pelfrey, B., House, D. H., & Wang, R. (2011). Measuring gaze depth with an eye tracker during stereoscopic display. Proceedings of the ACM SIGGRAPH Symposium on Applied Perception in Graphics and Visualization - APGV 11. doi:10.1145/2077451.2077454.
34. Chugh, S., Brousseau, B., Rose, J., & Eizenman, M. (2021). Detection and Correspondence Matching of Corneal Reflections for Eye Tracking Using Deep Learning. 2020 25th International Conference on Pattern Recognition (ICPR). doi:10.1109/icpr48806.2021.9412066.
35. Awais, M., Badruddin, N., & Drieberg, M. (2013). Automated eye blink detection and tracking using template matching. 2013 IEEE Student Conference on Research and Development. doi:10.1109/scored.2013.7002546.
36. Nagamatsu, T., Yamamoto, M., Sugano, R., & Kamahara, J. (2012). Mathematical model for wide range gaze tracking system based on corneal reflections and pupil using stereo cameras. Proceedings of the Symposium on Eye Tracking Research and Applications - ETRA 12. doi:10.1145/2168556.2168610.
37. Niehorster, D. C., & Nyström, M. (2018). Microsaccade detection using pupil and corneal reflection signals. Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. doi:10.1145/3204493.3204573.
38. Bailey, R., McNamara, A., Sudarsanam, N., & Grimm, C. (2009). Subtle gaze direction. ACM Transactions on Graphics, 28(4), 100:1-100:14. http://doi.acm.org/10.1145/1559755.1559757
39. Kernel density estimation and its application. (n.d.). ITM Web of Conferences. https://doi.org/10.1051/itmconf/20182300037
40. https://tech.tobii.com/technology/what-is-eye-tracking
41. Murphy, S. (2013). Data Visualization and Rapid Analytics: Applying Tableau Desktop to Support Library Decision-Making. Journal of Web Librarianship, 7, 465-476. doi:10.1080/19322909.2013.825148.
42. https://www.tableau.com/sv-se
43. Ben-Joseph, E., & Greenstein, E. (2015). A guided user experience using subtle gaze direction. Master's thesis, Stanford University.
44. Tobii Pro, Types of eye movements. https://www.tobiipro.com/learn-and-support/learn/eye-tracking-essentials/types-of-eye-movements/. Accessed 31 January 2022.


8. Appendix

Participant opinion form

Please answer these questions about your overall experience of using the system.

Strongly disagree = 1, Disagree = 2, Normal = 3, Agree = 4, Strongly agree = 5

Please follow the questions and put your ranking in the mark column.

No. | Question | Mark
1 | Do you appreciate this application? |
2 | Does the program help you understand more information? |
3 | The system reduces the spread of infection by using an eye tracker instead of a touch screen or mouse |
4 | Is the blinking annoying when you see it? |
5 | Does it help in guiding you through the image? |
6 | Can you use this program without any external help? |
7 | In the future, this eye-controllable system will help to develop better programs |
  | Final score (out of 35) |

Participant name:

Profession:

Contact media:


All participant markings of the evaluation:

No. | Question | Sami | Masud | Mezan | Sakib | Umer
01 | Do you appreciate this application? | 4 | 5 | 5 | 5 | 5
02 | Does the program help you understand more information? | 4 | 5 | 5 | 4 | 3
03 | The system reduces the spread of infection by using an eye tracker instead of a touch screen or mouse | 3 | 5 | 4 | 5 | 4
04 | Is the blinking annoying when you see it? | 4 | 4 | 4 | 4 | 3
05 | Does it help in guiding you through the image? | 3 | 5 | 5 | 4 | 2
06 | Can you use this program without any external help?* | 2 | 3 | 4 | 5 | 5
07 | In the future, this eye-controllable system will help to develop better programs | 5 | 5 | 5 | 5 | 5
08 | Total (out of 35) | 25 | 32 | 32 | 32 | 27
09 | Result (positive feedback) | 74.28% | 88.57% | 82.85% | 77.14% | 62.85%