
Blobby: Mobile Guide for Blind People

Hugo Nicolau, Tiago Guerreiro, Joaquim Jorge
Department of Computer Science and Engineering

IST/Technical University of Lisbon
[email protected], [email protected], [email protected]

Abstract

For the majority of blind people, walking in unknown places is very difficult or even impossible without help. The white cane, adopted throughout the blind community, is the main aid to their mobility. However, the major difficulties arise in the orientation task, chiefly caused by the lack of reference points and the inability to access visual cues. In this paper we present an approach that allows users to walk through unknown places while receiving familiar and easily understandable feedback. Our prototype also allows blind users to interact with traditional mobile devices, particularly those with touch screens. Finally, we have conducted user studies that validate our approach.

Keywords: Blind, Orientation, Accessibility, Touch Screens, Mobile, Evaluation.

1 Introduction

There are approximately 163,000 people with some visual impairment in Portugal, more than 1.6% of the total population, of whom 17,000 are legally blind. In Europe, the number of blind people reaches 7%, and there are more than 161 million worldwide.

For the majority of blind people, walking in unfamiliar places is very difficult or even impossible without help. Orientation and mobility are the most important skills for a blind person; however, the two are often confused. Mobility depends on skillfully coordinating actions to avoid obstacles in the immediate path, whereas spatial orientation depends on coordinating one's actions relative to the farther-ranging surroundings and the desired destination [12]. Orientation refers to the ability to establish and maintain an awareness of one's position in space relative to landmarks in the surrounding environment and relative to a particular destination [7].

Well-established orientation and mobility techniques using a cane or guide dog are effective for following paths and avoiding obstacles, but are less helpful for finding specific locations or objects. Tactile maps and Braille signs (Figure 1) are possible aids, but both are insufficient and sometimes inadequate for users' needs.

(a) Braille sign. (b) Tactile map.

Figure 1: Tactile map and Braille sign.

Meanwhile, there has been an effort to use technology as a means of helping visually impaired people with their spatial orientation. Sendero GPS [11] and Trekker [2] (Figure 2) are two systems available on the market for outdoor environments. Because they use devices specially designed for blind people, their cost and size are the two main causes of their low market penetration.

(a) Sendero GPS (b) Trekker

Figure 2: Orientation devices for blind people.

Besides the location technology, an orientation system must have a proper interface to guide its users. Our approach offers blind users the ability to interact with their (common) mobile device while being guided through an unknown place. By focusing the design on the users, we also offer familiar feedback, so that the user can easily understand all the given instructions.

In Section 2, we present the most relevant work on orientation systems for blind people and briefly discuss their advantages and limitations. After this analysis, in Section 3, we describe our approach and user studies. In Section 4, we detail our prototype and its architecture, and in Section 5, we present and discuss the results obtained in the user evaluation. Finally, in Section 6, we present some conclusions and future work.

2 Related Work

This area of research (orientation systems for blind people) has been the target of many projects over the past two decades. The various approaches and systems differ especially in their location technology (e.g. Wireless Local Area Network, Bluetooth, Radio Frequency Identification or Infrared). While GPS (Global Positioning System) is accepted as the main location system for outdoor environments, none of these technologies is accepted as a standard for indoor environments.

Many GPS-based systems also use a digital compass to infer the user's orientation. The MOTA project [18] was a first attempt to help blind users with their orientation task using context (e.g. bus timetables, maps and building schedules). The main problem of this system was the complicated task of programming a route, which became the main contribution of Drishti [16]: that system could automatically reprogram the user's route when necessary, without intervention.

Loomis et al. [14] tried to discover the most appropriate interface for route guidance. They studied five different methods involving 3D speech/sound, a digital compass and a joystick. The use of 3D speech with a digital compass placed on the user's head outperformed all other approaches. However, the users did not like using headphones, because they blocked environmental sounds.

Talking Signs [20] was an attempt to allow users to get contextual information at long distances. The user carries a receiver and constantly scans the place for new infrared signs. The need for a line of sight is the main disadvantage of this system.

The use of RFID (Radio Frequency Identification) is generally very simple. Multiple tags carrying geographic information are spread over an indoor environment, and the user carries an RFID reader. This way, the system can locate him and give the appropriate feedback [9]. The idea behind The Chatty Environment [5] is the world talking to the users: RFID tags are placed on objects so that they can "sense" the user, allowing him to interact with the world.

Figure 3: Camera phone for blind users.

Both Bluetooth and WLAN devices constantly send a radio signal, which can be analyzed to obtain their RSSI (Received Signal Strength Indication) and, consequently, the user's current position. LaureaPOP [13] uses WLAN as an indoor positioning technology, with a server responsible for calculating routes and for guiding and communicating with the user through his cell phone.

A different approach, used by Hub et al. [1], consists of analyzing a 2D image captured by a camera placed on the user's white cane and correlating it with a 3D model. J. Coughlan and R. Manduchi [8] also use image processing to identify specific signs placed on the walls and doors of a building (Figure 3) and guide the user to them.

Generally, systems designed for outdoor environments use GPS to locate the user, due to its advantages. Because GPS is unavailable indoors, we saw different approaches that can be used instead, each with its own advantages. This indicates that an orientation system has to be modular and flexible enough to allow the use of different location technologies, depending on the scenario.

Almost all the works we have seen tried to discover an approach that would locate the user in an indoor environment with higher precision. The interaction with users, on the other hand, is often forgotten. The user interface is also an important area of research, and it is the main focus of this paper.

3 Orientation System for Blind People

In an orientation system, the way the user is guided and the feedback he receives are crucial to his performance. However, this aspect is often forgotten. It is necessary to give blind users an appropriate interface.

Our approach tries to give users familiar feedback by studying their capabilities and needs. Following a user-centered design approach, we were able to identify common capabilities and behaviors among the target population when exploring an unknown place. We also studied the way these users described routes to, and guided, other blind users.

Moreover, our approach relies on traditional mobile devices because of their availability and low cost. By traditional mobile devices, we mean that we did not use additional hardware, such as Braille keypads or other accessories.

3.1 User Centered Design

We believe this is one of the main contributions of our work. User-centered design is a philosophy that puts the person at the center of all stages of the design process, trying to gather as much information about the user and his surroundings as possible, in order to guarantee quality.

In a first stage, we performed eight interviews to establish a first contact with the users. These interviews were semi-structured and composed mainly of open questions, leading to an informal and friendly dialogue. After analyzing the interviews, we conducted eighteen questionnaires to obtain the users' profiles, current limitations and needs, degree of independence, and technological knowledge.

The target population is over forty-five years old and has a low educational background (below the 12th grade). All of the users own a cell phone and use it daily to make and take phone calls. This result shows that they have some, but very limited, experience with mobile devices. According to the questionnaire results, although the users walk alone most of the time, 35% always need help inside a public building.

In order to guide the user efficiently, an orientation system has to give him easily understandable feedback. Therefore, we decided to perform a user observation with the following objectives: analyze the users' behaviors, difficulties, capabilities, needs and techniques when exploring an unfamiliar place; analyze the evolution of the users' mental maps; identify the most important information for the user; study the verbalization of a route; and study the way the users communicate a route to a blind colleague.

Three participants volunteered for this experiment, in which the main goal was to explore and gain complete knowledge of an unfamiliar place in order to guide someone else. Figure 4 shows the map and route that were chosen. The main conclusions were: users scan the whole place using a trial-and-error approach; obstacles impede the correct perception of reality; users memorize routes with ease; users don't like to explore new routes, leading to an incomplete mental map of the place; reference points are crucial and have to be easily identifiable; and when the user is disoriented, he turns back and tries to locate some reference point.

(a) Top floor. (b) Bottom floor.

Figure 4: Map and route chosen for user’s observation.

After this experiment, we conducted a group meeting with all the participants and a former instructor of orientation and mobility (O&M) techniques, to consolidate and discuss all the obtained results. In this meeting, we were able to construct one (the "best") story of the predefined route (Figure 4). We were also able to identify the main elements that must be present to contextualize the user in an unknown building: structure - the place's characteristics (e.g. dimensions, shape or number of floors); and interest points - all the possible destinations (e.g. shops, offices, building services or toilets). Finally, during the observation and group meeting phases, all the users stated that they prefer short, simple feedback over lengthy descriptions.

3.2 Mobile Device Interaction

Mobile phones play important roles in modern society. Their applications extend beyond basic communications, ranging from productivity to leisure. However, most tasks beyond making a call require significant visual skills. The constant evolution of mobile devices has been widening the gap between users and technology; the adoption of graphical interfaces as the predominant paradigm and the reduced size of these devices are the main causes.

However, despite the limited number of accessible tasks, mobile devices are part of the user's everyday life. Overall, their low cost and the available technologies make them appropriate as the basis of an orientation system. The actual challenge resides in building an interface that allows blind users to interact with a complex application using traditional mobile devices (i.e. with a keyboard or touch screen). The users should be able to enter text or navigate menus with the available interfaces (Figure 5).

(a) Keyboard. (b) Touchscreen.

Figure 5: Mobile devices.

3.3 Familiar Feedback

Through a study of how blind users verbalize a route, we were able to identify the main elements and rules for the automatic building of instructions.

Action: this element corresponds to the verb of the instruction (e.g. turn, go, enter or leave).

Direction: some of the previous actions need a direction to make sense. For instance, the action turn needs to be complemented with the respective direction, left or right.

Side: sometimes we need to explicitly identify the side on which the user should walk, in order to identify a reference point or avoid dangers.

Time/Distance: usually, a reference point is associated with a decision point (i.e. a direction shift). The reference to these points is made through time or distance elements. For example, "turn right after you pass a door" or "follow three steps forward ...".

Object: this element can be any artifact existing in the place (e.g. doors, stairs, tables or walls).

Depending on the place, the reference points and the user's position, the instructions may vary. Table 1 shows some of the most frequent structures, each with an example (in Portuguese).

Table 1: Some of the most frequent instruction structures (in Portuguese).

In conclusion, all the instructions can be defined by the following regular expression [10]:

Action Direction? (Time | Distance | Side)* Object?
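As an illustration, the grammar above can be checked mechanically. The sketch below is our own (the tag names and function are not from the paper); it encodes the regular expression over element tags:

```python
import re

# Hypothetical encoding of the instruction grammar from Section 3.3:
#   Action Direction? (Time | Distance | Side)* Object?
# Each candidate instruction is represented as a sequence of element tags.
GRAMMAR = re.compile(
    r"^ACTION"
    r"( DIRECTION)?"
    r"( (?:TIME|DISTANCE|SIDE))*"
    r"( OBJECT)?$"
)

def is_valid(tags):
    """Check whether a tag sequence forms a well-formed instruction."""
    return GRAMMAR.fullmatch(" ".join(tags)) is not None

# "Turn right after you pass a door" -> ACTION DIRECTION TIME OBJECT
print(is_valid(["ACTION", "DIRECTION", "TIME", "OBJECT"]))  # True
print(is_valid(["DIRECTION", "ACTION"]))                    # False
```

Such a checker would only validate element ordering; the actual wording of each element is chosen by the orientation module described in Section 4.2.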

3.4 Preliminary Evaluation

Most of the systems seen in the related work (Section 2) do not take into account how the user is guided. Because our approach differs so much from those works, it was important to evaluate and validate it.

In this evaluation, the users had to complete a route while receiving a set of instructions at predefined points (see Figure 4). We chose to employ a Wizard-of-Oz technique [6], in which the entire orientation system was simulated by a human. We performed the evaluation with nine participants, none of whom knew the place. The instructions, gathered beforehand with the users, are listed in Table 2.

Table 2: Instructions (in Portuguese).

Although all the vocabulary was given by blind users, two instructions (instructions 1 and 5) generated some confusion because of their ambiguity. As a result, three of the participants needed external help to complete the route. The remaining participants also showed some difficulty with these two instructions, but the system's help was enough to guide them: in these cases, the wizard asked the user to walk back to the last reference point and then continued guiding him. With the exception of two cases, all the errors resulted from those two ambiguous instructions (Figure 6).

Figure 6: Preliminary evaluation results.


This evaluation allowed us to perform a first attempt at guiding blind users and also to validate our approach. The results showed that the reference points have to be easily identifiable and that a single ambiguous word can lead to the user's disorientation. Nevertheless, users were able to associate most of the instructions with reality and follow them.

4 The Mobile Guide

Through user-centered design, we defined an approach that addresses the limitations of current orientation systems. Our approach allows users to interact with traditional mobile devices while guiding them with familiar and easily understandable feedback. In order to evaluate this approach, we implemented the system presented in this section.

4.1 System Architecture

To validate our approach, we implemented a prototype composed of different modules, which allows a separation between the user interface, the localization system, the instruction building and the application domain. The architecture of our prototype is depicted in Figure 7.

Figure 7: System Architecture.

The Localization Module is responsible for obtaining the user's position and is directly tied to the localization technology. In order to obtain greater modularity and accuracy, it is possible to add new technology modules and combine their results. Our prototype uses two different technologies (modules): WLAN and Bluetooth.

The Application Domain is the core of the application and manages all the domain entities (e.g. maps, routes, objects or graphs). This module also defines the application's behavior and state, managing its inputs and outputs.

The Orientation Module is directly related to the user's position and route. From these two variables, this module is responsible for producing a set of instructions that allows the user to complete his route. Once again, to obtain greater modularity, new instruction modules can be added in order to provide different feedback.

The User Interface is divided into two components: input and output. The input module virtualizes the inputs (i.e. keyboard or touch screen) and is responsible for communicating with the application domain. The output module is composed of tactile, sound (i.e. earcons [3]) and speech feedback.

4.2 Building the Instructions

The orientation module is responsible for building the instructions given to the user, taking into account his position, route and map. A map is a representation of some place and its elements (e.g. doors, walls, elevators, stairs or points of interest).

Our algorithm is simply an implementation of the rules already defined in Section 3.3 for building the instructions. These rules, like the algorithm, were defined for the Portuguese language. Next, we present it in more detail.

When the user's position is detected, each set of instructions is built taking into account the next localizable point on the route. This way, when the user reaches the next point, he will receive a new set of instructions.

Moreover, if there are any reference points or direction shifts between two points, the instruction is subdivided. If there are reference points, we can obtain instructions like "Siga em frente até à porta do elevador. Vire à esquerda e siga em frente até à casa de banho" ("Go straight ahead to the elevator door. Turn left and go straight ahead to the toilet"). In this case, the elevator door (porta do elevador) is a reference point. When there is a direction shift, the user has no reference point; therefore, we need to build a reference from the nearest objects (e.g. "Siga em frente e vire à direita depois de passar uma porta" - "Go straight ahead and turn right after passing a door"). If we cannot build a reference, then we use distances to guide the users (e.g. "Siga em frente 3 passos" - "Go straight ahead 3 steps").
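The fallback order described above — reference point first, then a nearby object, then a plain distance — can be sketched as follows. This is our own illustration (the function name and English sentence templates are invented; the actual system generates Portuguese sentences):

```python
def build_turn_instruction(direction, reference=None, nearby_object=None, steps=None):
    """Choose the best available anchor for a direction shift, in the
    priority order described in the paper: a reference point first,
    then a nearby object, then a plain distance in steps.
    Illustrative only; templates are not from the paper."""
    if reference is not None:
        return f"Go straight ahead to {reference}, then turn {direction}."
    if nearby_object is not None:
        return f"Go straight ahead and turn {direction} after passing {nearby_object}."
    if steps is not None:
        return f"Go straight ahead {steps} steps, then turn {direction}."
    return f"Turn {direction}."

print(build_turn_instruction("left", reference="the elevator door"))
print(build_turn_instruction("right", nearby_object="a door"))
print(build_turn_instruction("right", steps=3))
```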

If the user is lost, strays from the route or needs help, the system asks him to go back to the last reference point. According to the studies presented in Section 3.1, this was the users' natural behavior when disoriented. If he cannot go back, the system guides him from the nearest localizable position.


4.3 Localization

The localization system was not the main focus of this work. However, in order not to depend on a specific technology, the localization module was built so that we could add and combine different systems and technologies. Our prototype uses WLAN and/or Bluetooth, depending on the technologies available on the mobile device. The WLAN system was developed by Pedro Sousa and Vera Saraiva [17] in their graduation theses and uses the radio RSSI (Received Signal Strength Indication) value to locate the user. Our Bluetooth system is simply a mapping between an element on the map and a Bluetooth device.

To obtain the user's position, the localization manager combines the WLAN and Bluetooth results (when both are available). If the difference between the two results is smaller than the Bluetooth range, then the WLAN result is probably more accurate; otherwise, the Bluetooth result is returned. Although the WLAN and Bluetooth modules are independent, the latter acts as a correction to the WLAN result.
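A minimal sketch of this fusion rule, assuming 2D positions in metres and a nominal Bluetooth range (the paper gives no figure, so the constant below is our assumption):

```python
BLUETOOTH_RANGE_M = 10.0  # assumed nominal range; not specified in the paper

def fuse_position(wlan_pos, bt_pos, bt_range=BLUETOOTH_RANGE_M):
    """Combine WLAN and Bluetooth position estimates as described in
    Section 4.3: if the two results disagree by less than the Bluetooth
    range, trust the finer-grained WLAN estimate; otherwise fall back
    to the Bluetooth one. Positions are (x, y) tuples in metres."""
    if wlan_pos is None:
        return bt_pos
    if bt_pos is None:
        return wlan_pos
    dx = wlan_pos[0] - bt_pos[0]
    dy = wlan_pos[1] - bt_pos[1]
    if (dx * dx + dy * dy) ** 0.5 < bt_range:
        return wlan_pos
    return bt_pos

print(fuse_position((2.0, 3.0), (4.0, 3.0)))   # close agreement -> WLAN result
print(fuse_position((2.0, 3.0), (40.0, 3.0)))  # large disagreement -> Bluetooth result
```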

4.4 User Interface

We usually associate user interfaces with graphical elements, and in the particular case of mobile devices, small graphical elements. While screen readers make text more accessible, most interaction, such as menu navigation and text entry, requires hand-eye coordination, which makes it difficult for blind users to interact with mobile devices and execute those tasks. Recently, mobile phones with touch screens, such as the iPhone, have become popular. The ability to directly touch and manipulate data on the screen, without any intermediary devices, has a very strong appeal. Once again, the possibilities for blind users are limited or nonexistent.

Therefore, our prototype allows users to use a keyboard or touch screen to perform tasks such as menu navigation or text entry (i.e. keyword search). In menu navigation, the keys '2' and '8' take the user to the previous and next option, while the keys '4' and '6' go back to the previous menu and select the focused option, respectively.
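The keypad mapping can be sketched as a small state machine. This is our own illustration: the class name, menu contents and wrap-around behavior are assumptions, not details from the paper.

```python
class KeypadMenu:
    """Sketch of the keypad menu navigation described in Section 4.4:
    '2'/'8' move to the previous/next option, '4' goes back to the
    previous menu, '6' selects the focused option."""

    def __init__(self, options):
        self.options = options
        self.index = 0          # currently focused option
        self.selected = None
        self.went_back = False

    def press(self, key):
        if key == "2":
            self.index = (self.index - 1) % len(self.options)
        elif key == "8":
            self.index = (self.index + 1) % len(self.options)
        elif key == "4":
            self.went_back = True
        elif key == "6":
            self.selected = self.options[self.index]
        return self.options[self.index]  # text to be spoken by the TTS

menu = KeypadMenu(["Choose map", "Choose destination", "Start guidance"])
print(menu.press("8"))  # Choose destination
menu.press("6")
print(menu.selected)    # Choose destination
```

Returning the focused option's text after every key press matches the paper's speech-feedback design, where each navigation step is read aloud.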

Because there is no tactile feedback to locate buttons on a touch screen, we used a slightly different approach there: users navigate menus by performing directional gestures on the screen (Figure 8).
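A hypothetical classifier for such gestures, reducing a stroke to one of the four directions from its start and end points (the threshold and names are ours, not the paper's):

```python
def classify_gesture(x0, y0, x1, y1, min_dist=30):
    """Classify a touch stroke into one of the four directional gestures
    (Figure 8) from its start and end points, in screen coordinates
    (y grows downwards). Returns None for strokes too short to count."""
    dx, dy = x1 - x0, y1 - y0
    if abs(dx) < min_dist and abs(dy) < min_dist:
        return None  # too short to be a deliberate gesture
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

print(classify_gesture(10, 100, 200, 110))  # right
print(classify_gesture(100, 300, 95, 80))   # up
```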

(a) Right (b) Left (c) Up (d) Down

Figure 8: Directional gestures.

We also built a text entry method for touch screens using the navigational approach proposed in NavTap [19]. NavTap enables blind users to easily input text on a keypad-based mobile device. It rearranges the alphabet so that users can use four keys (2, 4, 6 and 8) on the keypad to navigate through the letters, using the vowels as anchors and thereby eliminating the need to remember which letters are associated with each key. Taking advantage of the relief marker on key '5', keys '4' and '6' let users navigate horizontally through the alphabet, while keys '2' and '8' let them jump between vowels. This navigation method requires no memorization beyond knowing the sequence of letters in the alphabet. Users with a richer mental mapping can use the shortest path to the desired letters (shown in green in Figure 9). NavTouch is a similar approach that we developed for touch screens: people navigate the alphabet by performing directional gestures on the screen (Figure 8).

Figure 9: Navigation scenarios for the letter 't'.
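The vowel-anchored navigation can be sketched as follows (our own illustration; the wrap-around behavior is an assumption). Reaching 't' from 'a' takes four vowel jumps and one step back, matching the kind of shortest path illustrated in Figure 9:

```python
import string

ALPHABET = string.ascii_lowercase
VOWELS = "aeiou"

def navigate(start, keys):
    """Sketch of NavTap-style navigation (Section 4.4): keys '4'/'6'
    move one letter left/right in the alphabet, and '2'/'8' jump to the
    previous/next vowel anchor. Wrap-around is our assumption."""
    i = ALPHABET.index(start)
    for key in keys:
        if key == "6":
            i = (i + 1) % 26
        elif key == "4":
            i = (i - 1) % 26
        elif key == "8":  # jump to the next vowel, wrapping if needed
            i = next(j for j in list(range(i + 1, 26)) + list(range(26))
                     if ALPHABET[j] in VOWELS)
        elif key == "2":  # jump to the previous vowel, wrapping if needed
            i = next(j for j in list(range(i - 1, -1, -1)) + list(range(25, -1, -1))
                     if ALPHABET[j] in VOWELS)
    return ALPHABET[i]

# Reaching 't' from 'a': jump vowel-to-vowel to 'u', then one step back.
print(navigate("a", ["8", "8", "8", "8", "4"]))  # t
```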

Finally, the output module is responsible for giving the users all the feedback. It is composed of three modalities: speech (i.e. Text-To-Speech), sound (i.e. earcons) and vibration.

5 Evaluation

To validate our approach and contributions, a functional prototype was built (see Section 4), which requires a full evaluation. The system must be evaluated as a whole, in real situations, by blind users. To that end, we conducted task-oriented experiments. Although our prototype allows users to interact with either a keyboard or a touch screen, all evaluation was performed on an HTC TyTN mobile device with Windows Mobile 5.0 and a TTS system.


5.1 Text-Entry

In order to compare our text-entry method with NavTap [19] (the keyboard approach) and ensure consistent results, we followed the same procedures as its evaluation. Each text-entry method was tested over three sessions, in which the users (five for each method) performed a set of tasks consisting of writing specific sentences (different between sessions and tasks). The first session had a training period of twenty minutes; however, the users were able to understand the navigational approach after only a few minutes (approximately five) of experience.

Both approaches showed a decrease in errors across sessions: 17%, 6% and 4% for NavTap, and 27%, 17% and 14% for NavTouch. However, NavTap showed a 10% overall MSD error rate (i.e. the difference between the presented and transcribed sentences), against 4% for NavTouch. It is important to notice that although the users erase more letters with NavTouch, the final result is better than with NavTap.
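The MSD error rate is based on the minimum string distance (Levenshtein distance) between the presented and transcribed sentences. A standard sketch, using one common normalisation (by the longer string's length):

```python
def msd(a, b):
    """Minimum string distance (Levenshtein) between two strings,
    computed with the standard dynamic-programming recurrence."""
    m, n = len(a), len(b)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i
    for j in range(n + 1):
        d[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[m][n]

def msd_error_rate(presented, transcribed):
    """MSD error rate as used in text-entry studies: edit distance
    normalised by the length of the longer string."""
    return msd(presented, transcribed) / max(len(presented), len(transcribed))

print(round(msd_error_rate("hello world", "helo world"), 3))  # 0.091
```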

Concerning the required Keystrokes (or Gestures) Per Character, NavTouch showed better results than NavTap (NavTap: 5.47 KSPC; NavTouch: 4.68). This can be explained by the load associated with finding each directional key in NavTap. Indeed, users are able to quickly navigate in all four directions with NavTouch, as the gestures can be performed at any time, with no extra associated load.

Moreover, NavTouch outperforms NavTap in words per minute (Figure 10), as the gestures are less restrictive and easier to perform than finding and pressing the desired keys.

Figure 10: Words per minute.

5.2 Menu Navigation

This evaluation was performed by six users and corresponds to the first part of the orientation task, in which the users choose the desired map and destination. Before this task, the users had a five-minute training period.

The first result worth mentioning is that all users were able to complete this task without any errors, which indicates that our approach is adequate and easy to perform and understand. However, the number of selections was limited, as can be seen in Table 3.

Table 3: Menu navigation completion times (in seconds).

5.3 Instructions

The main goal of our approach is to guide blind users through unknown places. To fully evaluate it, we had to choose an unfamiliar place and trace a route. In this evaluation, we used the Bluetooth localization system, represented in Figure 11.

Figure 11: Final evaluation route. The blue spotsrepresent the reference points.

During this evaluation, one of the users needed external help to complete the task. In this particular case, the user had some difficulty distinguishing left from right, hence the need for external help. All the remaining users were able to complete the task successfully.

The time each user needed to complete the route is shown in Figure 12. The waiting time represents the Bluetooth discovery process. Despite the high waiting time (approximately 20% of the total), the users were able to understand all the instructions and identify the reference points, so that they could wait for the next instruction.

Figure 12: Task completion times (in seconds).

Comparing these results with those obtained in the preliminary evaluation (see Section 3.4), there were fewer errors, for two reasons: the ambiguous instructions had been eliminated, and the users knew about the location system's limitations (i.e. the required waiting time).

Figure 13 shows the average time that users spent at each point of the route. Analyzing the results, we can conclude that all users followed a similar route and spent most of their time near reference points, waiting for the next instruction.

Figure 13: Heat map.

To assess the users' opinions, we administered a questionnaire at the end of the evaluation, featuring six questions rated on a five-point scale [15]. The results are shown in Figure 14 and demonstrate the system's likability.

Figure 14: Post-Questionnaire results.

5.4 Discussion

With the results described in the previous sections, we are now in a position to discuss the approach's usability. In this section, we analyze the results with respect to four factors [4, 15]: usefulness, effectiveness, learnability and likability.

Usefulness

Our approach proved useful: the users were able to understand and follow the instructions in order to reach the final destination, without any previous training. The interaction with the mobile devices, particularly with touch screens, was simple and easy to perform.

Effectiveness

All users followed a very similar route, without deviations, which indicates high effectiveness. Our text-entry method (with a touch screen) outperforms the keyboard method in keystrokes (or gestures) per character and in words per minute.

Learnability

In the text-entry evaluation, the users showed an increase in performance: errors were reduced by approximately half; gestures per character were reduced by more than 20%; words per minute increased from 1.60 to 2.08.
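These figures follow the usual text-entry measures: gestures per character (the touch-screen analogue of keystrokes per character) and words per minute, with one "word" conventionally counted as five characters. The sketch below shows how such per-trial measures could be computed from session logs; the `Session` structure and the sample numbers are illustrative assumptions, not our actual logs:

```python
from dataclasses import dataclass

@dataclass
class Session:
    """One transcription trial, as it might be logged."""
    gestures: int      # total touch gestures performed
    transcribed: str   # final text entered by the user
    seconds: float     # trial duration

def gestures_per_character(s: Session) -> float:
    """Analogue of keystrokes-per-character (KSPC) for gesture input."""
    return s.gestures / len(s.transcribed)

def words_per_minute(s: Session) -> float:
    """Standard WPM: one 'word' is five characters, including spaces."""
    return (len(s.transcribed) / 5.0) / (s.seconds / 60.0)

# Example: 52 gestures to enter a 20-character phrase in 120 seconds.
s = Session(gestures=52, transcribed="ola como esta voce x", seconds=120.0)
print(round(gestures_per_character(s), 2))  # 2.6
print(round(words_per_minute(s), 2))        # 2.0
```

Averaging these per-trial values across participants yields session-level numbers comparable to the ones reported above.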

In the menu navigation task, the results suggest that our approach is easy and fast to learn. In the orientation task, none of the users knew the place or the route along which they would be guided. Even so, only one of them could not complete the task, for very particular reasons. To assess the system's overall learnability, a more extensive evaluation is required.

Likability

Through a post-questionnaire, we assessed the users' opinion. The final results were very positive and showed that the users were satisfied with the system and all of its feedback (Figure 14).

6 Conclusions

Current orientation systems for blind people lack an appropriate interface. Almost all existing work has the main goal of locating the user with higher precision. Although this is a crucial component, it does not guarantee the system's success. New approaches that guide the user and adapt to their capabilities and needs are required.

Therefore, our approach consisted in studying blind users so that we could build a more appropriate interface and feedback. The achieved results indicate that the users easily understand all feedback (without previous training) and can follow the given instructions.

We also developed a new interaction method for touch screens, so that the users could use new devices and, in particular, our system. This interaction method features two tasks: menu navigation and text-entry. The obtained results showed that our touch-screen method outperforms similar keyboard-based methods.

6.1 Future Work

Future work will focus on extending our system to new scenarios. All our work and research was carried out in a very limited number of indoor buildings. In order to develop and evaluate a more robust system, new scenarios and environments (e.g., outdoor) are required.

References

[1] A. Hub, J. Diepstraten, and T. Ertl. Design and Development of an Indoor Navigation and Object Identification System for the Blind. In Design for Accessibility, pages 147–152, New York, 2004. ACM Press.

[2] Axistive. Trekker. Available at http://www.axistive.com/trekker.html, last visited 24/07/2008.

[3] M.M. Blattner, D.A. Sumikawa, and R.M. Greenberg. Earcons and Icons: Their Structure and Common Design Principles. Human-Computer Interaction, 4(1):11–44, 1989.

[4] P.A. Booth. An Introduction to Human-Computer Interaction. Psychology Press (UK), 1989.

[5] V. Coroama. The Chatty Environment - A World Explorer for the Visually Impaired. In Adjunct Proceedings of Ubicomp, 2003.

[6] A.J. Dix, J. Finlay, and G.D. Abowd. Human-Computer Interaction. Prentice Hall, 2004.

[7] D. Grantham and B. Moore. Handbook of Perception and Cognition: Hearing, pages 297–345. Academic Press, New York, second edition, 1995.

[8] J. Coughlan and R. Manduchi. Functional Assessment of a Camera Phone-Based Wayfinding System Operated by Blind Users. In International IEEE Symposium on Research on Assistive Technology, Washington DC, 2007. IEEE.

[9] S. Willis and S. Helal. RFID Information Grid for Blind Navigation and Wayfinding. In Proceedings of the Ninth International Symposium on Wearable Computers, pages 34–37, Washington DC, 2005. IEEE Computer Society.

[10] J. Friedl. Mastering Regular Expressions. O'Reilly Media, Inc., 2006.

[11] Sendero Group. Sendero GPS. Available at http://www.senderogroup.com/gpspromo.htm, last visited 24/07/2008.

[12] D.A. Guth and J.J. Rieser. Perception and the control of locomotion by blind and visually impaired pedestrians. Foundations of Orientation and Mobility, 2:9–38, 1997.

[13] J. Rajamaki, P. Viinikainen, J. Tuomisto, T. Sederholm, and M. Saamanen. LaureaPOP Indoor Navigation Service for the Visually Impaired in a WLAN Environment. In Proceedings of the 6th WSEAS Int. Conf. on Electronics, Hardware, Finland, 2007. Laurea University of Applied Sciences.

[14] J.M. Loomis, J.R. Marston, R.G. Golledge, and R.L. Klatzky. Personal Guidance System for People with Visual Impairment: A Comparison of Spatial Displays for Route Guidance. Journal of Visual Impairment and Blindness, 98(3):135–147, 2004.

[15] J. Rubin and T. Hudson. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. John Wiley & Sons, New York, NY, USA, 1994.

[16] S. Helal, S. Moore, and B. Ramachandran. Drishti: An Integrated Navigation System for Visually Impaired and Disabled. In Proceedings of the Fifth International Symposium on Wearable Computers, pages 149–156, Washington DC, 2001. IEEE Computer Society.

[17] Vera Sousa, Pedro, and Saraiva. People Finder, 2004. Graduation thesis.


[18] T. Strothotte, S. Fritz, R. Michel, A. Raab, H. Petrie, and V. Johnson. Development of Dialogue Systems for a Mobility Aid for Blind People: Initial Design and Usability Testing. In Proceedings of the Second Annual ACM Conference on Assistive Technologies, pages 139–144, New York, 1996. ACM.

[19] Tiago Guerreiro, Paulo Lagoa, Pedro Santana, Daniel Goncalves, and Joaquim Jorge. NavTap and BrailleTap: Non-visual Input Interfaces. In Proceedings of the Rehabilitation Engineering and Assistive Technology Society of North America Conference, Washington DC, 2008.

[20] W. Crandall, B. Bentzen, and L. Myers. Transit Accessibility Improvement Through Talking Signs Remote Infrared Signage: A Demonstration and Evaluation. Final report to US Department of Transportation, Federal Transit Administration, and Project ACTION of the National Easter Seal Society, 1995.
