Usability Studies - JISC Services and Information Environment

March 2004 (Version 2.0)

Centre for HCI Design


Contents

Executive Summary ........................................................ 6
1. Background ............................................................ 8
1.1 The Joint Information Systems Committee .............................. 8
1.2 Information Environment .............................................. 9
1.3 Usability and accessibility ......................................... 10
1.4 Usability requirements .............................................. 11
1.5 Current usability practices within JISC ............................. 12
2. Usability and Accessibility .......................................... 15
2.1 Definition of usability ............................................. 15
2.2 Web usability ....................................................... 15
2.3 Web accessibility ................................................... 16
2.4 Disability Discrimination Act ....................................... 16
2.5 Benefits of and Return On Investment (ROI) for usability and accessibility ... 18
3. Usability Methodologies and Techniques ............................... 21
3.1 Usability Evaluation ................................................ 21
3.2 Stakeholder Analysis ................................................ 23
3.3 Personas ............................................................ 25
3.4 Scenarios ........................................................... 29
3.5 Participatory Design ................................................ 33
3.6 Interviews and Focus Groups ......................................... 35
3.7 Questionnaires ...................................................... 38
3.8 Guidelines .......................................................... 43
3.9 Contextual Inquiry .................................................. 47
3.10 Task Analysis ...................................................... 49
3.11 Accessibility Testing .............................................. 52
3.12 User testing ....................................................... 56
3.13 Cognitive Walkthrough .............................................. 59
3.14 Heuristic Evaluation ............................................... 61
3.15 Summary of Costs and Benefits of the Methods by Olson & Moran (1996) ... 65
4. Study of Current Practices ........................................... 66
4.1 Digital Libraries ................................................... 66
4.2 JISC Digital Library Services ....................................... 71
4.2.1 Resource Discovery Services ....................................... 71
4.2.2 Bibliographic services ............................................ 74
4.2.3 Virtual map libraries ............................................. 76
4.2.4 Digital image libraries ........................................... 80
5. Usability and Accessibility Framework for DLs ........................ 84
5.1 Nature of digital libraries ......................................... 84
5.2 Usability and accessibility iterative framework for DLs ............. 86
6. Instrument Development Tools ......................................... 89
6.1 Questionnaire ....................................................... 89
6.2 Focus Group ......................................................... 90
6.3 User Testing ........................................................ 91
6.4 Heuristics Evaluation ............................................... 93
6.5 Personas and Scenarios .............................................. 94
6.6 Cognitive Walkthrough ............................................... 95
7. Discussions and Conclusion ........................................... 96
7.1 Suggestions to practitioners ........................................ 98
7.2 Suggestions to researchers .......................................... 98
7.3 Usability and Accessibility Guidelines ............................. 100
7.3.1 Key principles of Human-Centred Design (HCD) ..................... 100
7.3.2 General Presentation ............................................. 101
7.3.3 Specific Usability Aspects ....................................... 102
7.3.4 Specific Accessibility Aspects ................................... 104
References ............................................................. 106
Appendix A: Sample questionnaire ....................................... 116
Appendix B: Focus group question guidelines ............................ 125
Appendix C: User testing coding form ................................... 126
Appendix D: Checklist for heuristics evaluation ........................ 135
Appendix E: Personas for cognitive walkthrough evaluation .............. 138
Appendix F: Template for cognitive walkthrough ......................... 146


Figures

Figure 1: Number of design alternatives and cost of changes at different phases of the design process (from Bias & Mayhew, 1994) ... 18
Figure 2: Example of a Persona ... 27
Figure 3: Usability Engineering: Scenario-Based Development of Human-Computer Interaction ... 30
Figure 4: Example of scenario ... 32
Figure 5: DLs' position in relation to traditional libraries and the Internet ... 84
Figure 6: Axis of users' behaviour versus information organisation ... 85
Figure 7: DLs' usability/accessibility framework ... 86
Figure 8: KartOO ... 97


Executive Summary

JISC offers wide-ranging services and resources, from bibliographic databases to digital maps, to the further and higher education sectors and the research community in the UK. City University's Centre for Human-Computer Interaction Design was commissioned to conduct a set of usability studies of JISC services and the Information Environment and to propose a framework to guide further usability work within JISC.

This set of studies comprised two main parts. Part I investigated how usability could best be applied to the services and resources provided by JISC, with particular reference to the further development of the Information Environment. Part II was complementary to Part I and focused on investigating the usability and accessibility of four existing JISC services, using the evaluation framework developed in Part I.

The major deliverable of this research is a usability and accessibility evaluation framework for Digital Libraries. This framework identifies specific aspects of web-based Digital Library design and provides a set of suggested applications of the most appropriate usability and accessibility methods, techniques and guidelines. The framework aims to assist JISC to further develop usable and accessible services for the higher and further education communities.

In Part I, usability requirements for JISC services were elicited by employing a set of query techniques. A literature review of current best practice in usability techniques and methodologies was conducted, followed by an investigation of the best practices of similar JISC services in terms of resources and service provision in the UK and abroad. The evaluation framework specifies the stages at which designers should employ such methodologies and techniques in an iterative design process: the requirements elicitation, design generation and evaluation stages. Thus, this workable framework enables designers and developers of JISC services and resources to employ the most suitable methodologies and techniques for each aspect, at each particular stage, in the development of the service or resource.

The studies in Part II are complementary to Part I and focus on investigating the usability and accessibility of existing JISC services. These studies demonstrate how JISC could apply the evaluation framework proposed in Part I in the development of current and future JISC services and resources. Four selected JISC services were evaluated using the evaluation framework.

In the evaluations, a set of query techniques was employed for requirements gathering, followed by usability evaluations conducted to identify the current usability and accessibility issues of the four JISC domains. User testing was conducted to identify the major usability and accessibility issues from current and prospective users; expert evaluations (heuristic evaluations and cognitive walkthroughs) were then employed to identify further usability and accessibility issues as a supplement to the findings from the user testing. The results of this testing are confidential to JISC and the services involved, but they have also been used to test out and develop the findings from Part I, which are presented here.

The data obtained from the evaluations were analysed to provide a set of recommendations for improvements to each of the existing services, together with a general set of usability and accessibility guidelines for the development of future JISC services and resources.

The main body of the report is presented in the following sections:

Background: An introduction to JISC and current usability practices.

Usability and accessibility: The issues and the benefits.

Usability and accessibility methodologies and techniques: An outline of the various techniques, their associated methodology and an example of how JISC could effectively implement them.

Study of current practices: Usability, accessibility and DLs.

Usability and accessibility framework for digital libraries: Review of how the framework was designed and implemented.

Instrument development tools: Design of techniques and methodologies for the Digital Library framework.

Discussion: Summary of framework and future directions.


1. Background

Information and Communication Technologies (ICTs) are enabling a global audience to share knowledge and ideas with one another at the click of a button. One of the key tools of this revolution is the Internet, which is replacing CD-ROMs with online databases, traditional hard-copy books and journals with digital libraries, and atlases with interactive geo-spatial data. The success of these services is maximised if end users are well supported to accomplish their desired tasks easily.

1.1 The Joint Information Systems Committee

The Joint Information Systems Committee (JISC) was established as an advisory committee, working on behalf of the funding councils to provide and support the implementation of ICTs in further and higher education. It has already achieved many of its goals by providing expertise, independent advice, guidance and key resources to help institutions throughout the country deliver a high level of service.

The disciplines that JISC services cover range from science and technology to art and the humanities, and are represented via a broad spectrum of digital media:

• Journals
• Textbooks
• Theses
• Abstracts
• Manuscripts
• Music scores
• Still images
• Moving picture and sound files
• Geo-spatial images and maps

(Grout and Ingram, 2001)


Over the past few years JISC's work has also aided the wider implementation of the Joint Academic Network (JANET), which links schools, colleges, universities, research establishments and small and medium-sized enterprises (SMEs). Following on from this has been the development and rollout of SuperJANET4, an international high-speed backbone enabling the fast transmission of information across the globe (JISC, 2001). The establishment of this infrastructure has enabled JISC to move forward and develop an Information Environment (IE) which allows users to find, access, use and disseminate quality information resources.

1.2 Information Environment

JISC announced its 5-year Information Environment Development Strategy (IEDS) in 2001. The aim of the IEDS is to '…build an on-line information environment providing secure and convenient access to a comprehensive collection of scholarly and educational material' (JISC, 2002a). With this aim it is essential that all resources are successfully managed and presented in the most coherent way. The IEDS is a vital process in the global networked environment, and therefore a successful and sustainable implementation must involve all stakeholders:

• Publishers and suppliers of digital content to the DNER.
• International standards development bodies.
• Library, museum and archival professionals and allied strategic and funding agencies.
• Policy makers, information mediators, and creators and users of digital content in the further and higher education communities.

(JISC, 2002b)

A key concern for JISC is the wide variety of users' needs and the usability of services for users:

[U]sers do not all want to access information in the same way but will require a diverse range of views of resources in order to satisfy their needs (JISC, 2002a).

JISC is fully aware that the material it provides is intended to serve a wide variety of end-users. For example, post-16 learning requirements are not only different from those in higher education and research environments, but differ between individuals within different institutions and environments. This complex problem is being tackled by discovering what users require, how best it should be presented to them and what support mechanisms need to be in place to assist their learning experience. All of these groups need to be able to access information resources in the manner that most suits their needs, and thus utilise them to their full potential. JISC is investing in many different avenues of research, the findings of which can be used to further develop the IE strategy. The team from City University is one body involved in such research and analysis, leading to the formulation of a sustainable usability and accessibility framework.

The IE strategy Presentation Programme clearly establishes JISC's aims regarding issues of usability and accessibility:

1. To significantly improve the usability of JISC services and resources offered through the Information Environment.

2. To establish the most effective means of embedding the presentation of resources within institutional, departmental, local and personal environments.

3. To establish and disseminate best practice wherever possible in the design of interfaces to support the requirements of access to diverse types of digital resources.

1.3 Usability and accessibility

The International Organization for Standardization (ISO 9241-11, 1994) identifies three key factors in assessing the usability of an interface:

Usability is measured by the extent to which the intended goals of use of the overall system are achieved (effectiveness); the resources that have to be expended to achieve the intended goals (efficiency); and the extent to which the user finds the overall system acceptable (satisfaction) (John and Marks, 1997).

The usability of a system is also related to issues surrounding its accessibility. There is a broad range of users to whom web-based services are directed, and the services provided ought to be accessible to them:

• People who are visually impaired
• People who are hearing impaired
• People who are physically impaired
• People who are cognitively impaired
• People with different experience of and attitudes towards technology

Other factors that may also affect the way an individual accesses web-based services include:

• Stress
• Fatigue
• Temporary disability – e.g. having a broken arm, forgetting one's glasses
• Environmental setting – e.g. noisy work/study place, poor lighting

Interface design should therefore be governed by the requirements of all stakeholders of the system. Thus a variety of issues have to be taken into account throughout development by using a highly user-centred design process.

1.4 Usability requirements

The vast array of services provided by JISC means that not all usability requirement gathering and evaluation techniques are applicable to all of the services in the same manner. In addition, the goals and actions that users wish to achieve vary according to the nature of the service. It is therefore imperative that the specific usability issues that apply to each service are clearly identified, along with the corresponding stakeholder requirements for each resource.

For example, the usability issues surrounding virtual map libraries differ in some respects from those of other digital libraries (DLs) owing to the specific type of information they preserve and display. The technical nature of the information means that the interface must provide visualisation tools that all users can utilise. Hence the system's usability, especially in terms of interface design, must be strongly correlated with the end-users' productivity. Methodologies that gather user requirements and evaluate usability must also be adapted to suit individual services in some instances. Query techniques such as questionnaires need to be designed specifically to elicit users' requirements in relation to that service.

The requirements of different types of users also differ between, and within, services. Some users require a service to offer an abundance of advanced tools, thus providing greater versatility when interacting with the application. Other users, however, want clearly structured and formulated steps to help them accomplish their tasks, perhaps at the expense of flexibility.

1.5 Current usability practices within JISC

Our research indicates that JISC's previous application of usability and accessibility techniques has been directed towards specific projects; thus no framework has been established that can be applied to a variety of services.

A usability study of the JISC 'general' web-site was conducted by the Internet Development Group at the Institute for Learning and Research Technology in 1999 (Belcher et al., 1999). The aim was to assess the usability of the web-site at that time, and to form part of the JISC web-site 'Consultation Exercise' that was carried out before the re-design of the site in 2000 (JISC, 2000). The standard techniques of questionnaires and interviews were used to establish usability requirements. It should be noted that both of these techniques were used to gather information relating to the site's content and layout as much as to uncover usability issues. Another part of the 'Consultation Exercise' was an accessibility audit carried out by the Digital Media Access Group from the Department of Applied Computing at the University of Dundee (Gregor et al., 1999). Both of these studies identified problems and delivered recommendations to ensure that the new design was more usable. However, neither provided a detailed methodology or framework that could be applied to other services, so JISC service staff have been unable to use these evaluations as templates for their own studies.

In terms of current JISC usability practices, we distributed an informal questionnaire to the delegates who attended the JISC conferences in October 2002. We asked their initial views on how usability was currently being practised in their JISC services. This was then followed by a formal questionnaire sent to these delegates via email with these questions:

1) Have you previously applied usability testing to your services?

2) What techniques were selected?

3) Did you find these techniques to be successful in achieving your objectives?

4) Were modifications made as a result of the evaluations?

From their feedback, most of the services were aware of usability but little of it was put into practice when designing their services. Most of the services have not conducted formal usability evaluations. Questionnaires were the most common form of usability assessment used. Data on basic usability issues, such as the ease of use of their sites or the services provided, were collected from those questionnaires. Detailed usability issues were not, however, the main focus of these questionnaires.

Through our research we have identified that usability and accessibility techniques have been implemented by in-house staff within individual services. These include:

• Questionnaires
• Focus groups
• Interviews
• Check lists – World Wide Web Consortium (W3C) (2002)
• Prototyping
• User testing
• Heuristic evaluations
• Automated testing – Bobby
• Web log statistical analysis

While informal evaluations can produce some valuable data, they are regarded as 'quick and dirty' since they are devised from a combination of techniques, thus failing to adhere to any formal methodology. An appropriate formal HCI technique would have been a user testing evaluation in which a broad cross-section of end-users, rather than staff, are the participants. In user testing a participant attempts to complete a number of given tasks, perhaps whilst "thinking aloud" as they do so. The session is generally videotaped for later analysis. After the participant has attempted the tasks a short interview is held. This allows the evaluator to obtain more detailed information about features of the design that the user found particularly positive or negative, and to get suggestions about how to improve the service. The validity and the accuracy of the results are significantly improved when following established usability techniques.

Nonetheless, the findings of the evaluations previously conducted by JISC services were generally perceived as being successful in achieving the team's goals. Many of the services were also keen to build upon the work they had done, having recognised the significant benefits of usability techniques.

One of the precursors to designing a successful, usable system is internal recognition of the advantages that increased usability and accessibility can provide. JISC appears to appreciate this and is now in a position to build successfully upon the work it has already done in this arena.

The work we have conducted in the Centre for HCI Design at City University will enable JISC to evaluate its services with a unique DL framework that incorporates a variety of established and specifically modified techniques.


2. Usability and Accessibility

2.1 Definition of usability

Usability is a concept that relates to the quality of a service or resource. According to the definition of ISO 9241 (1994), usability is the effectiveness, efficiency, and satisfaction with which specified users achieve specified goals in particular environments. It is the measure of the quality of a user's experience when interacting with a service or resource, which could be a web-site, a software application, mobile technology, or any user-operated device (Usability.gov, 2002).
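To make these three measures concrete, the short sketch below computes them from a handful of user-testing records. This is an illustrative aid rather than part of the report's method; the sample data and the simple formulas chosen (completion rate for effectiveness, completions per minute on task for efficiency, a mean rating on a 1-7 scale for satisfaction) are assumptions made for the example.

```python
# Minimal sketch: deriving ISO 9241-11 style measures from user-testing data.
# The sample records and the exact formulas are illustrative assumptions.

sessions = [
    # (participant, task completed?, time on task in seconds, satisfaction rating 1-7)
    ("P1", True, 95, 6),
    ("P2", True, 140, 5),
    ("P3", False, 300, 2),
    ("P4", True, 110, 6),
]

completed = [s for s in sessions if s[1]]

# Effectiveness: proportion of tasks completed successfully.
effectiveness = len(completed) / len(sessions)

# Efficiency: effectiveness relative to the resources expended (here, mean time on task).
mean_minutes = sum(s[2] for s in completed) / len(completed) / 60
efficiency = effectiveness / mean_minutes

# Satisfaction: mean subjective rating on a 1-7 scale.
satisfaction = sum(s[3] for s in sessions) / len(sessions)

print(f"Effectiveness: {effectiveness:.0%}")
print(f"Efficiency:    {efficiency:.2f} (effectiveness per minute on task)")
print(f"Satisfaction:  {satisfaction:.1f} / 7")
```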

2.2 Web usability

In terms of web services and resources, usability is important because, according to recent research (User Interface Engineering, 2001), people cannot find the information they seek on web-sites about 60% of the time. Similarly, research by Manning et al. (1998) revealed that the consequence of bad site design is that the site will lose repeat visits from 40% of its users. This can lead to wasted time, reduced productivity, increased frustration, loss of repeat visits and revenue, and increased training and support costs.

Nielsen (1993) points out that usability is not a one-dimensional concept, but includes a number of components:

Learnability: ease of learning to use the system so that the user can get started rapidly.

Efficiency: once the system has been learned, a high level of productivity should be possible.

Memorability: casual users should be able to return to the system after some period of not having used it without having to relearn everything.

Errors: it should be easy to recover from errors; catastrophic errors should never occur.

Satisfaction: the system should be satisfying to use.


2.3 Web accessibility

The power of the Web is in its universality. Access by everyone regardless of disability is an essential aspect. (Berners-Lee, W3C Director and inventor of the World Wide Web)

"Universal Access" is the concept that promotes designing products and services so that they are usable by the widest range of people operating in the widest range of situations as is practical. It involves understanding how users attempt to accomplish tasks using a variety of technologies in different organisational and social contexts (Shneiderman, 2000). Universal Access applies as much to web-based services and resources as it does to any other new technology, and is explicitly part of the philosophy of the World Wide Web, as the quote from Berners-Lee indicates.

Universal Access needs to be considered in the development of services and resources in an integral manner, not in an ad hoc manner. This is both the cost-effective approach and one that treats users with disabilities on an equal basis with non-disabled users. According to Travis (2002), when carrying out usability tests with disabled people, the one comment often heard is that disabled people do not want to be treated as "special"; they want to be treated with the same respect as anyone else. Therefore, we should aim to achieve this goal by making sure that a web-site is accessible to disabled users and usable by everyone. Throughout this report we will emphasise the close relationship between usability and accessibility, both theoretically and practically.
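Accessibility checking can be partly automated, as the mention of Bobby in section 1.5 suggests. The sketch below is a minimal, illustrative example only: it checks a single WAI checkpoint (images lacking a text equivalent) against an invented sample page, and is not a substitute for the full guidelines or for testing with disabled users.

```python
# Minimal sketch of an automated accessibility check: flag <img> elements that lack
# an alt attribute (one WAI checkpoint only; tools such as Bobby check many more).
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.problems = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attr_names = {name for name, _ in attrs}
            if "alt" not in attr_names:
                self.problems.append(f"<img> without alt text at line {self.getpos()[0]}")

# Invented sample page for illustration.
sample_html = """
<html><body>
<img src="logo.gif" alt="JISC logo">
<img src="map.gif">
</body></html>
"""

checker = AltTextChecker()
checker.feed(sample_html)
for problem in checker.problems:
    print(problem)
```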

2.4 Disability Discrimination Act

The Disability Discrimination Act (DDA) began to come into effect in December 1996 and brought in measures to prevent discrimination against people on the basis of disability. Part III of the Act aims to ensure that disabled people have equal access to products and services. Under Part III of the Act, businesses that provide goods, facilities and services to the general public (whether paid for or free) need to make reasonable adjustments for disabled people to ensure they do not discriminate by:

• Refusing to provide a service;
• Providing a service of a lower standard or in a worse manner;
• Providing a service on less favourable terms than they would to users without the disability.

It is a legal obligation on service providers to ensure that disabled people have equal access to web-based products and services. Section 19(1)(c) of the Act makes it unlawful for a service provider to discriminate against a disabled person "in the standard of service which it provides to the disabled person or the manner in which it provides it".

While no web-sites in the UK have so far been pursued under the Act, it does appear that courts will use the W3C WAI guidelines as the accepted standard required for compliance with the DDA (Sloan, 2001). Interestingly, in April 2003 the Disability Rights Commission (DRC) launched a formal investigation (to be carried out by the Centre for HCI Design at City University) into the accessibility of public and private web-sites (for more information refer to http://www.drc-gb.org/newsroom/newsdetails.asp?id=393&section=1).

Additionally, under the eEurope Initiative launched in December 1999, the European Commission has committed the Member States to "make all public web-sites and their content accessible to people with disabilities" through the adoption of the WAI Guidelines. Although this is a non-legal requirement and only applies to public sector web-sites, there is also a commitment to review legislation and standards, which could see the initiative extended outside the public sector.

A web-site, however, could comply with all of the W3C WAI guidelines yet still not be usable. Conversely, as recent research on e-government usability shows (Ma & Zaphiris, 2003), a usable web-site is not necessarily an accessible one.

An important proviso here is that education is not covered by the DDA, but by separate legislation, the Special Educational Needs and Disability Act 2001 (SENDA, 2001). This Act introduces the right for disabled students not to be discriminated against in education, training and any services provided wholly or mainly for students, and for those enrolled on courses provided by 'responsible bodies', including further and higher education institutions and sixth form colleges. Student services covered by the Act can include a wide range of educational and non-educational services, such as field trips, examinations and assessments, short courses, arrangements for work placements, and libraries and learning resources. In wording similar to the DDA, SENDA requires responsible bodies to make reasonable adjustments so that people with disabilities are not placed at a substantial disadvantage.

So if JISC services and resources are used by people with disabilities as part of their work or personal development, they will be subject to the DDA (as providers of goods and services to employees of educational or research institutions or members of the public); if they are used by students or prospective students, they will be subject to SENDA.

2.5 Benefits of and Return On Investment (ROI) for usability and accessibility

Usability evaluations are effective at all stages of the service or resource development cycle. By applying usability methods in the initial design stage of services such as Digital Libraries (DLs), one can greatly reduce the need for extensive redesign, maintenance, and customer support. Thus, although there is a definite cost to incorporating usability design and usability evaluation within the development cycle, there is a clear return on investment (ROI) to be recouped.

Figure 1: Number of design alternatives and cost of changes at different phases of the design process (from Bias & Mayhew, 1994)

Figure 1, above, shows the fundamental relationship between the stage at which changes might be implemented and the cost of those changes. The earlier in the design process the need for change is identified, the easier and cheaper it is to implement those changes. Thus the use of evaluation methods to identify the need for changes early in the design process will yield the greatest ROI, although identifying the need for change at any stage of the design process will reduce the long-term cost of user support.

Users of JISC services can benefit directly from usability and accessibility improvements through increases in effectiveness, efficiency, ease of use, ease of learning, and overall user satisfaction and experience. Providers of JISC services can benefit from reductions in the need for, and costs of, user training, support and maintenance. Taking proactive measures on usability and quality during the initial development stages can thus produce a rippling cost-saving effect.

As usability increases user satisfaction and productivity, this leads to greater trust and loyalty from users, which in turn results in tangible cost savings. The first 10% of the design process, when key system design decisions are made, can determine 90% of a product's cost and performance (Smith & Reinertsen, 1991).

Usability also plays an important role in users' overall perception of an organisation, in addition to their specific perception of its services or resources (Marcus, 2002).

The following are some of the key benefits that JISC could receive by investing in usability and accessibility work on services and resources:

Savings on redevelopment costs:

Once a system is in development, correcting a problem costs 10 times as much as fixing the same problem in design. If the system has been released, it costs 100 times as much relative to fixing [it] in design (Gilb, 1998).

Increased user satisfaction:

In a Gartner Group study, usability methods raised user satisfaction ratings for a system by 40%; when systems match user needs, satisfaction often improves dramatically (Bias & Mayhew, 1994).

Increased ease of use of services and resources:

Incorporating ease of use into your products actually saves money. Reports have shown that it is far more economical to consider user needs in the early stages of design than it is to solve them later (IBM, 2001).
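To illustrate the scale of these savings, the sketch below applies the 1:10:100 cost ratios quoted above from Gilb (1998) to a hypothetical batch of usability problems. The problem counts and the unit cost are invented for the example; only the ratios come from the source.

```python
# Minimal sketch: relative cost of fixing usability problems at different stages,
# using the 1:10:100 ratios quoted from Gilb (1998). Counts and unit cost are invented.

COST_MULTIPLIER = {"design": 1, "development": 10, "post-release": 100}
UNIT_COST_AT_DESIGN = 200  # hypothetical cost (in pounds) of fixing one problem in design

problems_found = {"design": 30, "development": 12, "post-release": 5}

total = 0
for stage, count in problems_found.items():
    cost = count * UNIT_COST_AT_DESIGN * COST_MULTIPLIER[stage]
    total += cost
    print(f"{stage:>12}: {count:2d} problems -> £{cost:,}")

print(f"{'total':>12}: £{total:,}")

# In this invented example, catching the 5 post-release problems at the design stage
# instead would save 5 * 200 * (100 - 1) = £99,000.
```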


3. Usability Methodologies and Techniques

There are a variety of established methodologies and techniques that can be used to identify usability and accessibility issues with DLs. This section details a number of these and explains their goal, their methodology, when they can be most effectively implemented, and an example of how JISC services could employ them.

3.1 Usability Evaluation

Usability evaluations (UE) consist of methodologies for measuring the usability aspects of a system's user interface (UI) and identifying specific problems. They are an important part of the overall user interface design process, which consists of iterative cycles of designing, prototyping, and evaluating (Dix et al., 1998; Nielsen, 1993). According to Preece (1994), evaluation is concerned with gathering data about the usability of a design or product by a specified group of users for a particular activity within a specified environment or work context.

Ivory and Hearst (2001) suggested that the main activities involved in an evaluation include:

• Capture: collecting usability data, such as task completion time, errors, guideline violations and subjective ratings;
• Analysis: interpreting usability data to identify usability problems in the interface;
• Critique: suggesting solutions or improvements to mitigate problems.

There are various evaluation techniques commonly in use by usability professionals, applied at different stages of the design of products and services. The findings and results of a usability evaluation can vary widely when different evaluators study the same user interface, even if they use the same evaluation technique (Jeffries et al., 1991; Molich et al., 1998, 1999; Nielsen, 1993). A usability evaluation usually covers only a subset of the possible actions users might take; as a result, it is often recommended to use several different evaluation techniques in parallel (Dix et al., 1998; Nielsen, 1993).

When conducting evaluations, pre-defined test tasks should be selected prior to the evaluations for techniques such as cognitive walkthrough and user testing. Selected test tasks should represent characteristic user actions and goals.

Selecting Test Tasks

The test tasks selected should be based on the intended context of use and key scenarios of use. Tasks should aim to describe specific scenarios of how and why users would access the site, and what they want to achieve. Furthermore, according to Nielsen (1995), the following guidelines should be taken into account when designing the test tasks (a sketch of how such tasks might be recorded follows the list):

• Test tasks should be as representative as possible of the uses to which the system will eventually be put in the field.
• Test tasks should provide reasonable coverage of the most important parts of the interface.
• Test tasks should be small enough to be completed within the time limits of the user test, but should not be so small that they become trivial.
• Test tasks should specify precisely what result the user is being asked to produce.
• Test tasks should not be frivolous, humorous, or offensive.
• Test tasks should be academic-oriented and as realistic as possible.
• Test tasks should relate to an overall scenario in order to increase both the users' understanding of the tasks and their sense that they represent realistic usage of the application.
• The very first task should be designed to be simple in order to guarantee the user an early success experience and boost morale.
• The last task should be designed to make users feel that they have accomplished something.
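As noted above the list, one simple way to operationalise these guidelines is to record each test task with the precise result expected, the part of the interface it covers and the scenario it belongs to, so that coverage and ordering (a simple first task, a rewarding last one) can be reviewed before a session. The structure and the example tasks below are illustrative assumptions, not tasks used in this study.

```python
# Minimal sketch of a test-task record reflecting Nielsen's (1995) guidance:
# each task names the interface area it covers, the precise result expected,
# and the overall scenario it relates to. The example content is invented.
from dataclasses import dataclass

@dataclass
class TestTask:
    order: int              # position in the session (first task kept deliberately simple)
    description: str        # what the participant is asked to do
    expected_result: str    # precisely what result the participant should produce
    interface_area: str     # part of the interface this task covers
    scenario: str           # overall scenario the task relates to

tasks = [
    TestTask(1, "Find the service's help page", "Help page displayed",
             "navigation", "Preparing coursework with an unfamiliar service"),
    TestTask(2, "Locate a map of Devon and print it", "Printed or print-previewed map",
             "search and output", "Preparing coursework with an unfamiliar service"),
]

for task in sorted(tasks, key=lambda t: t.order):
    print(f"Task {task.order}: {task.description} -> {task.expected_result}")
```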

Selecting the Class of Users

It is important that the group of users who participate in the test matches the target user group of the evaluated services/applications. The assumptions made about the class of users should draw on the attributes collected at the requirements stage in terms of their demographic distribution, such as gender, age, occupation and experience, depending on the type of service or application being evaluated.


3.2 Stakeholder Analysis

Stakeholders are anyone who is affected by the success or failure of a system. This includes end users and others who have a stake, such as managers, maintainers and marketing people. By defining the stakeholders, a stakeholder analysis provides a baseline for effective requirements engineering and subsequent system design, as well as establishing requirements for all key stakeholders.

USTM

The USTM (User Skills and Task Match) approach identifies the groups of stakeholders who are responsible for system design and development, those with a financial interest, those responsible for system introduction and maintenance, and those who have an interest in the use of the system (MacCaulay et al., 1990).

CUSTOM

According to Kirby (1991), the CUSTOM model distinguishes four categories of stakeholders:

• Primary: those who use the system.
• Secondary: those who do not use the system directly, but receive output from it or provide input to it.
• Tertiary: those not included in the first two categories but who are affected by the success or failure of the system.
• Facilitating: those who are involved in the design, development or maintenance of the system.

In determining the user characteristics, the following criteria should be the focus (a sketch of how these might be recorded appears after the list):

• physical/mental abilities
• background, preference, motivation
• anticipated system use – discretionary or mandatory
• patterns of use – continuous, intermittent
• knowledge of domain, operating system/GUI, computer familiarity
• probable learning curve – frequency of use, general abilities
• expertise – novices, skilled users, expert users
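As indicated above, the CUSTOM categories and these user characteristics can be brought together in a simple stakeholder record so that each group identified in the analysis carries its salient attributes into later design stages. The sketch below is illustrative only; the field names and the example group are assumptions.

```python
# Minimal sketch of a stakeholder record combining the CUSTOM categories (Kirby, 1991)
# with the user characteristics listed above. The example values are invented.
from dataclasses import dataclass, field
from enum import Enum

class Category(Enum):
    PRIMARY = "uses the system"
    SECONDARY = "receives output from or provides input to the system"
    TERTIARY = "otherwise affected by the system's success or failure"
    FACILITATING = "involved in design, development or maintenance"

@dataclass
class StakeholderGroup:
    name: str
    category: Category
    pattern_of_use: str          # continuous or intermittent
    mandatory_use: bool          # discretionary or mandatory use
    expertise: str               # novice, skilled or expert
    notes: list[str] = field(default_factory=list)

students = StakeholderGroup(
    name="Undergraduate students",
    category=Category.PRIMARY,
    pattern_of_use="intermittent",
    mandatory_use=False,
    expertise="novice",
    notes=["varied computer familiarity", "steep learning curve expected"],
)

print(f"{students.name}: {students.category.name.lower()} stakeholder ({students.category.value})")
```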

Implementation

Stakeholder analysis is a requirements gathering technique used at an early stage in the evaluation process, for it is important to identify the anticipated user populations and their salient characteristics. Hence questions such as who the stakeholders of the service are, and what they wish the service to provide, need to be addressed early in the design cycle.

JISC Implementation

In the evaluation process, JISC should conduct a stakeholder analysis at an early stage to gather the requirements of all stakeholders. This could be done by adopting the USTM and CUSTOM models to identify who the stakeholders are and to determine their characteristics.

Additional Resources

Mark Kirby's homepage on stakeholder requirements for computer systems:
http://scom.hud.ac.uk/scomdb2/ppp/mark/markkirby.htm

D. Bell, A. Gupta, H. Rozendaal & E. Spencer (1995), Building Bridges between Human Factors and Software Engineering:
http://www.ash-consulting.com/HCI-95-paper.doc


3.3 Personas

Personas are a usability technique designed to direct the focus of the development process towards the goals of the people who actually use the product. The process has traditionally been used by marketing companies, whereby a set of personas (detailed profiles of typical users) that best represent the intended audience, based on statistical averages and demographics, is created to maximise the efforts of a marketing campaign (Glaze, 2002).

In the arena of software design the process is adapted and based around the needs of 'real' users, not on demographics and statistical averages. Alan Cooper (1999) notes that the simplest solution to rectifying usability problems would appear to be to ask users what they believe needs to be improved, but this does not work: just because a user can identify a problem, it does not mean they will know how best to rectify it (Cooper, 1999). Therefore hypothetical archetypes of each user group are created, based upon facts discovered from the investigation process.

Designers often try to design a product that will suit every user's intended goal - the 'elastic user'. However, the broader the target audience that a piece of software is designed for, the less likely it is to meet any individual's particular needs.

Common issues with an interface that are found once the design team implements personas are:

• The design team chooses advanced technology over accessibility.
• Assumptions were made that users would be more impressed by a robust interface that they couldn't actually use than by a less elegant application that serves their needs.
• Design teams perceive themselves to be the primary persona.

(Hourihan, 2002)

Designing for essentially one primary persona rectifies this problem, for the design team's personal preferences are removed and an increased awareness of the intended audience's varying skill levels and goals is gained. Thus, focusing on one archetype creates a design for the broader group. In the case of JISC, not every user will have the same knowledge or requirements, so focusing the design around highly detailed personas will enable design teams to assess whether they are meeting the actual needs of the users.

Methodology

Personas are almost always based upon the findings retrieved from the requirements gathering stage via questionnaires and interviews. They are highly detailed and not just a job description – thus each is given an almost real-world identity, with a first and last name, age, background story, goals, job title and even a photograph. The more specific the persona is, the more effective a design tool it becomes, for its elasticity is removed. For example:

We don't just say that Emilee [derived persona] uses business software. We say that Emilee uses WordPerfect Version 5.1 to write letters to Grandma…We don't just let Emilee go to work. We give her a job as a New-Accounts Clerk in a beige cubicle at Global Airways in Memphis, Tennessee (Cooper, 1999).

These specific characteristics enable the persona to become a real archetypal user in the eyes of the design team.

The persona set should also remain small so that it is manageable, with each individual persona being allocated a designated status (a sketch of such a persona record follows the list):

• Primary: where the persona's need is sufficiently unique to require a distinct interface.
• Secondary: where a primary interface will serve the needs of the persona with a minor modification/addition.
• Supplemental: where the persona's needs are fully satisfied by a primary interface.
• Served: where the persona is not an actual user of the product, but is indirectly affected by it and its use.
• Negative: where the persona is created as an explicit, rhetorical example of whom not to design for. (Calde et al., 2002)
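As mentioned before the list, each persona and its designated status can be kept as a small structured record alongside the design documentation, which helps keep the persona set small and reviewable. The sketch below is an illustrative representation, not part of the study's instruments; the example values anticipate the persona shown in Figure 2 and the goals listed in Figure 4.

```python
# Minimal sketch of a persona record carrying the designated status described above
# (primary, secondary, supplemental, served or negative). The representation is an
# assumption; the example values anticipate the persona shown in Figure 2.
from dataclasses import dataclass
from enum import Enum, auto

class Status(Enum):
    PRIMARY = auto()
    SECONDARY = auto()
    SUPPLEMENTAL = auto()
    SERVED = auto()
    NEGATIVE = auto()

@dataclass
class Persona:
    name: str
    age: int
    occupation: str
    background: str
    goals: list[str]
    status: Status
    photo_file: str = ""  # personas are usually given a photograph too

matthew = Persona(
    name="Matthew Rogers",
    age=24,
    occupation="BSc Geography student, Sussex University",
    background="A-levels in history, geography and English literature; high computer competence",
    goals=["Manipulate maps of Devon to display campsites in the region"],
    status=Status.SECONDARY,
)

print(f"{matthew.name} ({matthew.age}), {matthew.occupation} - {matthew.status.name.lower()} persona")
```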


Implementation

Personas are designed to be used at the planning stage or in the early stages of requirements capture. They can also be used to help guide other evaluation techniques.

JISC Implementation Example

JISC services are aimed at several broad user groups who all have different needs and abilities: information professionals; lecturers and teachers; students; publishers; and researchers. Numerous personas would therefore be required if this technique were to be implemented.

The following example is a brief profile for a student that we created for this study.

Matthew Rogers. BSc Geography student at Sussex University.

Matthew is a 24-year-old student in the 2nd year of his degree course. He studied A-levels in history, geography and English literature at Harrow College in London before taking a gap year out in Australia. Matthew sits on the university's debating team and also works in the holidays with the National Trust. He loves gadgets and loves to explore the different tools that software applications make available. Matthew has a high level of computer competence and has taken additional courses in Java and Visual Basic. Matthew is familiar with mapping services but has never used Digimap before.

Figure 2: Example of a Persona

Matthew's requirements may mean that he is not regarded as a primary persona but is allocated the status of secondary persona, for his requirements can be met by adding modifications to a primary interface. For a service such as this, the technique may identify that more than one primary persona is required. There may therefore be more than one interface needed to '…create a truly comprehensive and integrated information system that enables each primary persona to achieve his or her goals' (Calde et al., 2002).

Additional Resources

Ennis provide a comprehensive list of the benefits that can be gained by including personas in the design cycle of a product: http://www.eias.ie/aaa/usability_service_persona_devel.htm

Alan Cooper's web-site http://www.cooper.com and his book The Inmates are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity both provide leading information regarding the implementation of personas in HCI design.


3.4 Scenarios

Scenario-based methods centre on descriptions of people using technology, which are essential in discussing and analysing how the technology is (or could be) used to reshape their activities. A scenario describes a sequence of events when interacting with a system from the users' perspective, and scenario descriptions can be created before a system is built and its impacts felt.

'Scenarios' are similar to 'Use Cases', which describe interactions at a technical level, but scenarios can be easily understood by anyone regardless of their level of technical knowledge. Scenarios are especially useful when you need to remove the focus from the technology in order to consider other design possibilities. Scenarios are framed in terms of tasks rather than the technology used to support them. For example, "User enters his PIN" is incorrect because it mentions the technology used, whereas "User identifies himself" is acceptable because it keeps other alternatives open.

Methodology

Typically, three to four scenarios are a good starting point to cover the standard users of a web-site, and more scenarios might be required when the site has a diverse audience with different needs. For example, the JISC web-site has to cater for the needs of users with disabilities. Each scenario should be validated by asking the represented users to review it. A standard scenario often links to a specific persona; it therefore consists of a profile of the user and an interaction episode or story. Figure 3 shows how scenarios should be applied throughout the development process.

Problem Scenario

A problem scenario tells a story of current practices. These stories are carefully developed to reveal aspects of the stakeholders and their activities that have implications for design, while other members of the project team should be able to read the problem scenarios and appreciate the work-related issues that the field study has uncovered. They are called "problem scenarios" not because they emphasise problematic aspects of current practices, but because they describe activities in the problem domain. In scenario-based design, new activities are always grounded in current activities.

Activity Scenario

In the activity scenario, the design team first introduces concrete ideas about new functionality; new ways of thinking about users' needs, and how to meet them, are the focus of activity scenarios. As in the other steps of the process, a claims analysis is generated to help identify the trade-offs as you move forward with prototypes.

Information and Interaction Design Scenarios

After completing the claims analysis, the goal of information design is to specify representations of a task's objects and actions that will help users perceive, interpret, and make sense of what is happening (Rosson & Carroll, 2002).

Figure 3: Usability Engineering: Scenario-Based Development of Human-Computer Interaction


The goal of interaction design is to specify the mechanisms for accessing and manipulating the task information and activities.

Implementation

Scenarios are a relatively inexpensive technique. They are most useful for collecting actual data about real users based on surveys, interviews, focus groups, or observations of work environments, in situations where the user's work environment would have a big impact on the use of the web-site. A scenario helps to make sure that the site is practically usable and actually serves the needs of target users in real life. Scenarios are usually created after requirements gathering techniques have been performed. Scenario-based methods help to define user requirements during the design process and are used to ensure that necessary features and needs are supported when the system is being designed. In doing so, a scenario brings out additional functional requirements and ideas for the user interface that are driven by users' profiles and context. A scenario also provides a rationale for design decisions that can be useful in presenting designs to the development team or to decision makers.

Scenarios have limitations when the user population is extremely diverse and time is restricted, as this does not allow the generation of good scenario coverage. They are also less useful for simple marketing sites and wherever context is not a major factor, e.g. search engines.

JISC Implementation Example

JISC's services would require numerous scenarios to cover a wide variety of needs. Adopting scenarios in the design helps to describe how specific individuals in specific circumstances would use the web-site. This ensures that the small details necessary for actually accomplishing real tasks with the web-site are considered, and that the design has been considered in context.

Figure 4 identifies how personas and scenarios complement each other, by displaying a typical scenario we created for this study (see section 3.3).


Matthew's goals are to:

• Manipulate maps of Devon to display campsites in the region.
• Be able to choose from a variety of design tools.
• Incorporate the maps with other software such as ArcView.

Figure 4: Example of scenario
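Since a standard scenario links a persona profile to an interaction episode, the scenario in Figure 4 could be recorded together with its persona as in the sketch below. This is an illustration only; the structure and the wording of the steps are assumptions, with the steps kept task-oriented rather than technology-oriented, as recommended above.

```python
# Minimal sketch: a scenario record linking the Figure 4 goals to the Figure 2 persona.
# The structure and the wording of the steps are illustrative assumptions; note that the
# steps describe tasks ("locates a map"), not the technology used to support them.
from dataclasses import dataclass

@dataclass
class Scenario:
    persona: str        # name of the persona this scenario belongs to (see section 3.3)
    goal: str           # what the persona is trying to achieve
    steps: list[str]    # task-oriented interaction episode, free of implementation detail

camping_trip = Scenario(
    persona="Matthew Rogers",
    goal="Plan a field trip using campsite locations in Devon",
    steps=[
        "Matthew locates a map of Devon",
        "He displays the campsites in the region on the map",
        "He adjusts the map using the available design tools",
        "He exports the result for use with other GIS software",
    ],
)

print(f"Scenario for {camping_trip.persona}: {camping_trip.goal}")
for number, step in enumerate(camping_trip.steps, start=1):
    print(f"  {number}. {step}")
```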

Additional Information

Clarke, L. (1991). The use of scenarios by user interface designers. In D. Diaper & N. Hammond (Eds.), People and Computers VI. Cambridge: Cambridge University Press, pages 103-115.

Design for Learning, Stanford University (2002). Provides an overview of the scenario-based framework with examples of how to apply the theory to the design. http://ldt.stanford.edu/~gimiller/Scenario-Based/scenarioIndex2.htm

3.5 Participatory Design

Participatory design (PD), also called the ‘Scandinavian Challenge’, refers to a design approach that focuses on the intended user of the service or product and advocates the active involvement of users throughout the design process. According to Blomberg and Henderson (1990), the PD approach advocates three principles: the goal of improving the quality of working life, a collaborative orientation and an iterative design process. In participatory design, a team of people representing the major stakeholders in a web design project work together to create designs that reflect the way users will actually use the product in their own work. Users play a central role in participatory design sessions: they talk about their work environments and the tasks they are trying to accomplish, including what works for them and what does not when they use their current tools. This proactive user input can both result in better designs and help shorten product development and testing cycles.

Methodology

Depending on the complexity of the tool or feature to be designed, a complete participatory design exercise normally lasts from one to five days. The exercise is performed by a team of individuals working with a facilitator around a table to create a paper prototype of an interface design.

A complete participatory design can be implemented as a four-step process (Ellis, Jankowski & Jasper, 1998):

1) Build bridges with the intended users

2) Map user needs and suggestions to the system

3) Develop a prototype

4) Integrate feedback and continue the iteration

Participatory design requires a group of people representing major stakeholders of the product/services

including users, web-site designers and developers, a usability engineer, and others as needed (for

example, people from documentation, training, or testing). The team should consist of at least four

individuals but no more than eight representatives. This will make the best use of all the participants'

time and ensure that useful information will be collected from the PD session.

Implementation

The participatory design method gives users an opportunity to interact with their suggestions for the web-site before moving forward to the actual design. In most cases, these interactions lead to practical improvements to user suggestions, which could result in a web-site that better fits the users' needs.

JISC Implementation Example

Participatory design involves a higher cost and a longer development process than other usability engineering approaches. In the JISC environment, for example, a true participatory design exercise would demand considerable human resources and extensive collaboration among parties from different organisations, which makes it difficult to carry out in practice. It is therefore recommended that participatory design be used for smaller-scale developments within JISC rather than for large-scale ones.

Additional Information

The Computer Science Department at Stanford University gives a comprehensive introduction to participatory design, covering the history of PD, its current applications and its future directions, with useful references for understanding PD in greater depth.

http://www-cse.stanford.edu/classes/cs201/projects-00-01/participatory-design/history.html

M. Silva & A. Breuleux (1994) The Use of Participatory Design in the Implementation of Internet-based

Collaborative Learning Activities in K-12 Classrooms.

http://www.helsinki.fi/science/optek/1994/n3/silva.txt

3.6 Interviews and Focus Groups

Interviews and focus groups elicit information about users' experiences and preferences. An interview is a formal, structured technique in which the moderator interacts with users, asking them about their personal experience and preferences regarding the targeted web services. Focus groups are an informal technique that can help to assess user needs and feelings both before interface design and long after implementation. Focus groups are a very efficient method for evaluating a web system: they help to obtain user feedback and gauge initial reactions to a design, and they are also good at discovering how the actual performance of the web system differs from users' expectations. They should be treated as a way of finding out how people react to ideas, and can be an effective technique for learning more about particular services at all stages of a development cycle.

The main difference between the two methods is that interviews involve speaking to one individual at a

time while focus groups discuss issues with a group of people.

Methodology

To begin an interview or focus group, the first step is to formulate questions about the particular web services, based on the type of information the moderator wants to obtain. It is then good practice to spend some time getting to know the interviewees, as this will encourage them to speak more freely once they are used to the atmosphere. The difference between interviews and focus groups on the one hand and surveys and questionnaires on the other (see more about surveys and questionnaires in section 3.7) is that the moderator is present to interact and to facilitate discussion of the issues raised by the moderator's questions.

In a focus group, the moderator brings together six to nine users to discuss issues and concerns about the

features of a user interface. The group typically lasts about two hours and is run by a moderator who

maintains the group's focus. At the beginning, the moderator should ask everyone to introduce themselves, which helps participants get used to the discussion environment and encourages them to take part in the discussion.

The following are some keys to maximise the data collected from focus groups:

• Schedule a moderator with previous experience co-ordinating focus groups

• Obtain a facility with several computers and a projection screen

• Recruit representative users, perhaps from a user group or email discussion group

• Ask participants to provide anonymous feedback via a computer station, web-site or email

• Include a survey at the beginning or at the end of the focus group

Implementation

Interviews and focus groups are useful for getting subjective reactions to the designs and for finding out

how people handle a particular task. They are appropriate at any stage of the design. Conducting them at

an earlier stage will enable design teams to obtain important information and be able to contribute user

views to the final design. If conducted at a later stage, this would enable designers to receive more

specific and concrete comments and information from the interviewees on the actual product. It should be

assessed whether the focus group methodology is the most appropriate way to research the topic, and

appropriate steps should be taken to guarantee that the study is conducted in an effective manner.

The main advantage of an individual interview is that other people in the group would not influence the

individual as might happen in a focus group. On the other hand, in a focus group, if a person raises an

idea, another person can develop and expand it, and the moderator can explore particular issues in greater detail. When holding a focus group it is, however, important to watch out for ‘group-think’, where people tend to conform to one another's views and are reluctant to disagree with the consensus view.

The benefits of focus groups are that they are less expensive than conducting interviews with the same number of people, and that the group interaction triggers memories, which can uncover additional issues that might not come up during interviews. Common problems that most users are likely to experience can also be identified.

JISC Implementation Example

When designing a JISC web-site, conduct an interview or focus group to elicit user needs and functionality ideas, and use it to explore preferences, opinions and subjective reactions. If prototypes are available, ask people what they think of them. Furthermore, if there are screen shots or a storyboard to review, an interview or focus group is a good way to ask users to walk through the site or to perform informal user testing. However, these techniques are not practical with hard-to-reach user populations such as busy, highly paid professionals (e.g. doctors or lawyers). Focus groups are also difficult to conduct with users who are geographically isolated or dispersed, and in highly specialised fields where the target user population is small. As an alternative, it may be feasible to approach such users at conferences they attend, or to conduct online interviews instead of face-to-face interviews.

Additional Information

Nielsen, Jakob, "The Use and Misuse of Focus Groups" http://www.useit.com/papers/focusgroups.html

Focus group interview to evaluate library services http://www.berea.edu/library/bieval/focus-Group.html

Greenbaum, Thomas L., The Handbook for Focus Group Research, 1997, Sage Publications; ISBN: 0761912533

Templeton, J. F., The Focus Group: A Strategic Guide to Organizing, Conducting and Analyzing the Focus Group Interview (second edition), Probus Publishing

3.7 Questionnaires

Questionnaires are an indirect usability assessment method: they do not measure the user interface itself but the end-users' subjective opinions of it. They are a common technique for eliciting, recording and collecting information about a design, and an inexpensive way to reach a wide audience of anywhere between 50 and 1,000 users (Nielsen, 1993). However, because questionnaires involve no direct contact, the response rate is often very low, and anything above 50% is regarded as a success (Colton, 1999). Kirakowski (2002) further notes that if ‘…the questionnaire is reliable… then this feedback is a trustworthy sample of what you [will] get from your whole user population.’ The lack of direct contact can also be a bonus, as the complete anonymity of a questionnaire enables the researcher to retrieve more in-depth personal information than would necessarily be gained through other techniques.

On the negative side, users’ perceptions as expressed through questionnaires early on in the design

process have a tendency to change at a later design stage. For example, a study regarding new features on

an interface revealed that:

The correlation between users’ predictions of whether they would like the new features, and their

ratings of the feature having tried them was very low, indicating that one should not always

interpret the [questionnaire] results literally (Nielsen, 1993).

Methodology

A questionnaire should be directed at a representative audience, with care being taken in its design to

ensure a high response rate. Dix et al. (1998) note that ‘[t]he first thing the evaluator must establish is the

purpose of the questionnaire: what information is sought? It is also useful to decide at this stage how the

questionnaire responses are to be analysed’. Appropriate questions may arise out of other activities like

brainstorming sessions with the design team, analysing the results from other evaluations or liaising with

other stakeholders. Additionally, many pre-designed questionnaires developed by usability experts are already available for web-based interfaces.

There are a number of types of questions that can be used to help the evaluator obtain the information they require: factual, opinion or satisfaction questions all provide different sorts of data. Open or closed questions, Likert scales or semantic differential scales also enable the evaluator to analyse and present the results in different ways (Kirakowski, 2002). The design of a questionnaire must, however, follow some established guidelines if it is to be successful in its aims:

• Wording and terminology must be clear and simple.

• Questions must be as neutral as possible and not leading or biased.

• Closed questions must have a complete set of response alternatives that do not overlap.

• Each question must focus on only one issue.

• Scales should be appropriate and consistent.

• It must be clear where to place the answer corresponding to each question.

• The questionnaire must be piloted before wide-scale distribution.

(Nicholas, 2001)

A questionnaire of this nature usually starts by explaining its purpose, followed by general factual questions asking the user for background information; this is useful for establishing the range of experience within the sample group. It then asks opinion questions relating to specific features, contributing further to the study's final evaluation goals.

Piloting of a questionnaire is essential to ensure the quality of data retrieved is maximised. Typical

questions that should be answered as a result of piloting your questionnaire are:

• Do the questions adhere to the scope and focus previously established?

• Is the range of possible answers sufficient?

• Are the questions clear, concise and easy to understand?

• Does the questionnaire provide the respondent with enough opportunity to express their personal

opinions?

• Is the questionnaire structure clear?

• Does the questionnaire manage to maintain the respondent’s attention till the end?

• How long does it take to complete?

The key therefore to retrieving good usability data, as stated by Shneiderman (1992), is to ensure that

‘…precise, as opposed to general, questions are used in surveys’ thus providing the designer with

‘…useful guidance for taking action’ to redesign the interface.

Implementation

By selecting appropriate questions, questionnaires can be applied at any stage in the design process. They

are especially effective:

• At the beginning of a project to establish the users’ requirements.

• Once the web-based service has gone live to gather feedback from end-users.

They are less effective when it is not established who the core users are, or when the motivation to return

the questionnaire is low (Brinck et al., 2002).

JISC Implementation Example

Questionnaires can be a very powerful tool by which JISC could reveal how end-users regard both general and more specific aspects of its services, identifying key usability issues by collating and comparing the results with those from the other evaluation techniques.

For many services general questions in a variety of formats could be asked, such as:

• Closed question:

How often do you access this service?

Once a day
Once a week
Once a month
Less often than the above

• Semantic differential scale:

The relationship between headings and content is:

Confusing   1   2   3   4   5   Clear

• Open-ended question:

Please describe any ways the service could be improved to make it more user friendly:

_______________________________________________________________

_______________________________________________________________

_______________________________________________________________

To obtain information relating directly to one type of service, more specific questions could be presented:

• Closed question:

How do you regard the quantity of information provided in the ‘core records’?

Far too much
Too much
Satisfactory
Too little
Far too little

• Likert scale:

The size of the default thumbnails is adequate:

Strongly agree   1   2   3   4   5   6   7   Strongly disagree

The data collected by questions such as these would provide JISC with both quantitative and qualitative information, which the design team can use to identify areas for possible improvement with regard to usability and accessibility (a short illustration of summarising such responses follows).
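Once responses to closed or scaled questions like those above have been collected, they can be summarised with very little effort. The following is a minimal sketch in Python, using invented response values for the hypothetical thumbnail question above (it is an illustration only, not data from any JISC survey):

    from collections import Counter
    from statistics import mean

    # Invented answers to "The size of the default thumbnails is adequate",
    # rated on a 1 (strongly agree) to 7 (strongly disagree) scale.
    responses = [2, 1, 3, 2, 5, 2, 4, 1, 2, 6]

    print("Respondents:", len(responses))
    print("Mean rating:", round(mean(responses), 2))

    # Distribution across the scale, useful for spotting polarised opinions.
    counts = Counter(responses)
    for point in range(1, 8):
        print(f"{point}: {'*' * counts.get(point, 0)}")

A summary of this kind is most useful when read alongside the open-ended answers, which explain why respondents rated a feature as they did.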

Additional Resources

Shneiderman, B. (1992). Designing the User Interface: Strategies for Effective Human-Computer

Interaction. Wokingham: Addison-Wesley Publishing Company. This book provides clear guidelines on

how to construct effective questionnaires.

Kirakowski provides a list of common questions related to executing questionnaires in the field of

usability engineering: http://www.ucc.ie/hfrg/resources/qfaq1.html

QUIS provides questionnaires aimed at assessing users' subjective satisfaction with an interface; each questionnaire is designed to be configured according to the needs of each interface analysis: http://www.lap.umd.edu/quis/about.html

3.8 Guidelines

End users’ priorities may at times conflict with those of designers, developers or owners. In such cases

guidelines will advocate the users’ best interests. Recommended design guidelines are also used to ensure

usability design principles are adhered to. Guidelines are vital for web-based technologies, since users

display different habits when using this medium over more traditional information sources.

• 79% of users scan the page instead of reading word-for-word.

• Reading from computer screens is 25% slower than from paper.

• Web content should have 50% of the word count of its paper equivalent.

There are currently many design guidelines and standards relating to usability. Among the international ergonomic standards, ISO 9241 (1994), Ergonomic requirements for office work with visual display terminals (VDTs), provides guidance in the form of general principles and techniques, describing the basis of the user-performance approach rather than requiring the use of specific methods. According to ISO 9241, usability is the extent to which a computer system enables users, in a given context of use, to achieve specified goals effectively and efficiently while promoting feelings of satisfaction.

After applying guidelines Sun Microsystems noticed a significant improvement in all of the key usability

metrics, leading to an improvement of 159% in overall usability of their site (Nielsen, 2002).

Usability guidelines build upon previously derived usability principles, but their adoption is not as

stringent as specific conventions that must be followed. Design teams may only find 90% of a set of

guidelines to be applicable to an interface. Hence the team has to decide which guidelines are most

appropriate (Thornton, 2002).

There are many sources providing web usability guidelines that address content accessibility and design

(see Additional Resources). These sources fall into five categories:

1. Design rules – a set of functional or operational specifications applying to a particular user

interface.

2. Ergonomic algorithms – a comprehensive and systematic procedure that usually appears as a

software component.

3. Style guides – a set of guidelines that apply to a specific collection of user interfaces.

4. Compilations of guidelines – devised for a wide range of user interfaces.

5. Standards – formal requirements used to regulate user interface design.

However, Ratner, Grose & Forsythe (cited in Scapin, 1999) noted in a study that there is little consistency

between different guidelines. In a study of 21 established style guides, only 25% of the total

recommendations appeared in more than one guide. In addition, Spool (2002) notes that ‘web usability

guidelines are very sensitive to the nature of the tasks and subtle differences in the content’. JISC has a

number of different services like multimedia, electronic libraries and geo-spatial information. All of these

would require some specific guidelines to ensure the usability and accessibility of each service is

maximised.

Methodology

Guidelines are derived from a number of sources, such as the information gathered at the requirements stage, an understanding of best practice developed by examining your own site and those of comparable sites, and the adaptation of existing guideline frameworks. Web-sites such as Amazon and eBay have developed guidelines by creating and testing their own design mutations:

They'll isolate one of their many servers and change the design of a few pages, just for users of

that server. They'll compare the results with users of the other servers running the 'existing site'.

Because of their traffic volume, they can learn a lot in just a few hours (Spool, 2002).

Once a set of guidelines has been established, the interface should be evaluated to ensure that none of them has been violated (a small sketch of how such a check might be recorded is given below). However, guidelines should not be regarded on their own as a remedy that will ensure a site is usable; they are designed to complement other HCI design techniques.
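As a small illustration of how conformance might be tracked once a guideline set has been agreed, the Python sketch below records hypothetical checklist results for two pages. The guideline texts echo the generic examples given in the JISC implementation example later in this section; the pages and verdicts are invented:

    # Hypothetical record of guideline conformance for individual pages.
    # Guideline texts echo the generic examples below; pages and verdicts are invented.

    guidelines = [
        "Consistent style of presentation across pages",
        "Navigation mechanisms used in a consistent manner",
        "Information about the general layout of the site is provided",
    ]

    results = {
        "/home": [True, True, True],
        "/search": [True, False, True],
    }

    for page, verdicts in results.items():
        violated = [g for g, ok in zip(guidelines, verdicts) if not ok]
        if violated:
            print(f"{page}: violated - " + "; ".join(violated))
        else:
            print(f"{page}: all guidelines met")

Keeping a simple record of this kind makes it easier to see where a guideline is repeatedly violated across the site.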

Implementation

Suitable guidelines should be applied throughout a user-centred design process: local guidelines should be identified and applied at a specific phase in the development life cycle, pervasive guidelines should be implemented across a number of consecutive stages, and global guidelines are relevant to all stages of the design (Scapin, 1999).

JISC Implementation Example

Many standard guidelines can be applied to a range of JISC services:

1. Create a style of presentation that is consistent across pages.

2. Use navigation mechanisms in a consistent manner.

3. Provide information about the general layout of a site (e.g., a site map or table of contents).

4. If search functions are provided, enable different types of searches for different skill levels and

preferences.

More specific guidelines can also be devised to ensure that usability and accessibility issues are addressed

(e.g. for an image service):

1. Give attention to visual organisation of the display

2. Ensure use of colours embedded in graphical images is consistent

(Gabbard & Hix, 1997)

Guidelines such as these can help ensure that JISC services meet the requirements of their varied users.

Additional Resources

The W3C provides 14 guidelines on web content accessibility. Each is accompanied by additional checkpoints that have been assigned a priority rating.

http://www.w3.org/TR/WAI-WEBCONTENT-TECHS

Jakob Nielsen provides both guidelines for homepage design and specifically for multimedia on the web.

www.useit.com

Ken Moffitt provides a compilation of usability guidelines for web-sites:

http://www.unt.edu/benchmarks/archives/2002/august02/access.htm

Organisations also provide their own guidelines such as:

- The National Cancer Institute: http://usability.gov/guidelines

- IBM: http://www-3.ibm.com/ibm/easy/eou_ext.nsf/Publish/572

3.9 Contextual Inquiry

Contextual inquiry, according to Holtzblatt and Beyer (1993) is a technique whereby usability issues of

concern are identified by users, or by users and designers collaboratively, while the users are working in

their natural environment on their own work. It is a structured field interviewing method, based on a few

core principles that differentiate this method from plain, journalistic interviewing. It is a discovery

process rather than an evaluative process; more like learning than testing.

Contextual inquiry is based on three core principles:

1. that understanding the context in which a product is used (the work being performed) is essential for elegant design;

2. that the user is a partner in the design process;

3. that usability design processes, including assessment methods such as contextual inquiry and usability testing, must have a focus.

Methodology

Contextual inquiry follows many of the same process steps as field observations or interviews, with

different considerations in some portions of the process. In contextual inquiry, the usability specialist

actually becomes involved in the user’ tasks, experiencing them as if they are the users themselves.

For example, interviewing during a contextual inquiry study usually does not include pre-set, broadly

worded questions. Instead, the interviewer and interviewee create a dialogue conversation, where the

interviewer can not only determine the user's opinions and experiences, but also his or her motivations

and context.

Implementation

The contextual inquiry technique is best used in the early stages of development, where it can feed into user needs analyses and task analyses. Much of the information the moderator receives will be subjective, such as how people feel about their jobs or how work and information flow through the organisation. The technique is used to understand the context in which a task is being performed, and it is one of the best methods for understanding the users' work context, because the environment in which people work strongly influences how they use a product or service.

The technique also helps in finding out about work practices in domains that evaluators know nothing about - for example, a lawyer looking up court cases in a DL.

Studies using this technique can be time-consuming and expensive, but they are effective when the cost can be justified. A successful way of conducting a contextual inquiry is to establish a master/apprentice relationship with a domain expert, so that the expert can teach the usability specialist how to do a specific task or job.

JISC Implementation Example

JISC's BUFVC – the British Universities Film and Video Council – provides a range of services to promote the production, study and use of film and related media for education and research. When developing information services for the BUFVC, a contextual inquiry could be conducted by establishing a master/apprentice relationship with a domain expert. For example, the domain expert (e.g. a film studies expert) could demonstrate the production process of a film to the usability specialists in the studio. Based on information gathered from observation and interviews, the usability specialists would then gain a deeper insight into the context of the film production process by experiencing the task as if they were the users themselves.

Additional information

Information and Design. This link provides an introduction to contextual enquiry.

http://www.infodesign.com.au/usability/contextualenquiry.html

D. Wixon, K. Holtzblatt, S. Knox (1990) Contextual Design: An Emergent View of System Design.

http://www.cs.indiana.edu/~connelly/Usability/Local/contextual-inquiry-paper.pdf

3.10 Task Analysis

A task analysis is, according to Shepherd (1998), a process of sorting out what people actually do when they perform tasks, i.e. what actions they carry out, how they respond to different cues in their working environment and how they plan their activities. It is often applied to the design and evaluation of training, jobs and work, equipment and systems, as well as in interactive system design. A task is a set of activities in which a user engages to achieve a particular goal: a ‘goal’ can be defined as the desired state of the system, while a ‘task’ is the structured set of activities performed in sequence to achieve that goal.

Use Case Analysis

Use cases analyse the development of a system from the perspective of how a user would typically

interact with the system. User cases combine a simple way of capturing user scenarios in a text document

and diagramming how different user groups interact while using the system. The interaction between

different actors in the web-site could be captured using the use case diagrams, and these diagrams provide

a standard means for viewing an entire transaction in a single view.

Hierarchical task analysis

Hierarchical task analysis (HTA) describes the task in terms of a hierarchy of operations and plans based on a structured chart notation. The hierarchical task analysis prompts the analyst to establish the conditions under which various sub-tasks should be carried out in order to meet a system's goal. This method produces a hierarchy with three levels:

• Goals (external tasks): the system state that the human wishes to achieve.

• Tasks: a structured set of activities carried out in some sequence to achieve a goal.

• Operations or actions: the individual things that a person must do within a system; simple tasks with no control structure.

Methodology

Use cases describe the activities of the users or actors of a system. Use cases include the typical or

primary scenario that the user will go through to accomplish a particular goal or task and they also include

alternative scenarios that the user might go through in other circumstances.

When conducting an HTA, the following have to be taken into consideration:

• Look at the big picture: identify the user groups that will be using the web-site and how users interact with other users of the site.

• Consider the pages that a single user will navigate in order to accomplish a particular task.

• Address the procedures that a user will follow within each individual web page.

Brinck, Gergle & Wood (2002) recommend a hybrid method of task analysis combining use cases and HTA. This brings together the high-level interactions of users and other actors with the depth and psychological grounding of hierarchical procedure decomposition, and is done in the following stages (a small illustrative sketch follows the list):

1. start with use cases

2. decompose tasks hierarchically

3. determine appropriate technologies.
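To make the hybrid approach concrete, the short Python sketch below records a use case as structured text and then decomposes its primary task hierarchically. The actor, goal and sub-tasks are invented for illustration (loosely based on the ‘Matthew’ persona scenario in Figure 4); they are not taken from an actual JISC analysis:

    # Hypothetical sketch of hybrid task analysis: a use case plus a hierarchical
    # decomposition (HTA) of its primary task. All names are illustrative only.

    use_case = {
        "actor": "Matthew, a geography researcher",
        "goal": "Produce a map of Devon showing campsites",
        "primary_scenario": [
            "Open the map service home page",
            "Search for the region of interest",
            "Apply the 'campsites' overlay",
            "Export the map for use in other software",
        ],
        "alternative_scenarios": [
            "Search returns no results: broaden the search terms",
        ],
    }

    # Hierarchical decomposition: (label, sub-tasks); plain strings are operations.
    hta = ("0. Produce a campsite map of Devon", [
        ("1. Locate the region", [
            "1.1 Enter 'Devon' in the search box",
            "1.2 Select the correct result from the list",
        ]),
        ("2. Add the campsite overlay", [
            "2.1 Open the layers panel",
            "2.2 Tick the 'campsites' layer",
        ]),
        ("3. Export the map", []),
    ])

    def print_hta(node, depth=0):
        """Print goals, tasks and operations with indentation per level."""
        if isinstance(node, str):
            print("  " * depth + node)
            return
        label, children = node
        print("  " * depth + label)
        for child in children:
            print_hta(child, depth + 1)

    print("Use case goal:", use_case["goal"])
    print_hta(hta)

Even a lightweight structure such as this keeps the use case and its hierarchical decomposition side by side, so gaps between the high-level scenario and the detailed operations become easy to spot.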

Implementation

Task analysis should be used in the early stages of design to capture users' requirements.

Use cases document the interactions between different user groups and are used as a first pass at high-level design. Their limitation is that they may not reveal whether a procedure (scenario) is inefficient, whether the procedures are within the limits of human performance, or what training might be required.

While one advantage of HTA is that it is easy to learn and to use, a limitation is that it applies only to procedural activities and not to heavily parallel activities. HTA is also poor at capturing contextual information, and it requires considerable time, skill and effort to use. Using use case analysis to supplement HTA generates more of the specific information that is essential, as well as suggesting additional improvements that could be made to the overall workflow.

Task analysis assumes that there is a correct and complete symbolic description of user tasks, and it seeks to capture that description. It treats tasks as hierarchical structures of operations, articulated through systematic decomposition, often to fairly detailed levels. For task analysis, representing work activity is the primary objective. However, according to Diaper (2002), informing and guiding design to optimise individual and collective performance is a hoped-for benefit, but is not always easy to achieve.

JISC Implementation Example

When applying the task analysis technique, tasks should only be decomposed to the point at which the information obtained affects decisions about interface design choices.

In the JISC information environment, task analysis could be adopted to capture user requirements at the early design stage. Services could adopt the hybrid task analysis approach, combining use case analysis and HTA, in the early design stages.

Additional Information

Human Factors Design: http://www.cdc.gov/niosh/mining/hfg/taskanalysis.html

Usable Web – Topic on Task Analysis: http://usableweb.com/topics/000876-0-0.html

Kirwan, B. & Ainsworth, L.K. (Eds.) (1992). A Guide to Task Analysis. London: Taylor and Francis.

3.11 Accessibility Testing

Currently, the most authoritative standards for accessibility in web design are the W3C WAI guidelines (http://www.w3.org/TR/WAI-WEBCONTENT). Several automatic tools are available for evaluating the accessibility of web-sites. Bobby is an automatic web evaluation tool that provides detailed suggestions for improving web-sites (http://www.cast.org/bobby), while LIFT is both a usability and an accessibility automatic evaluation tool (UsableNet, 2002) (http://www.usablenet.com/products_services/lfdnng/lfdnng.html). Both Bobby and LIFT are based on either the Web Content Accessibility Guidelines 1.0 (WCAG) established by the W3C or the American Section 508 accessibility standard. RetroAccess (http://www.retroaccess.com) also addresses accessibility errors on a web-site: it evaluates and corrects a single web page against the Section 508 standards to demonstrate how the evaluation and correction process works.

Methodology

Automatic evaluation automates some aspects of usability and accessibility evaluation, such as capture, analysis or critique activities. The advantage of automatic usability evaluation is that by automating the evaluation tasks we can (a minimal sketch of one such automated check is given after this list):

• Reduce the cost of usability evaluation, as automation minimises the time spent on it.

• Increase the consistency of the errors uncovered.

• Predict time and error costs across an entire design.

• Reduce the need for evaluation expertise among individual evaluators.

• Increase the coverage of evaluated features.

• Enable comparisons between alternative designs.

• Incorporate evaluation within the design phase of UI development.
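Neither Bobby nor LIFT can be reproduced here, so the following Python sketch is only a hypothetical illustration of the kind of check such tools automate: scanning HTML for images without the alternative text required by WCAG 1.0 checkpoint 1.1 (a Priority 1 item). The sample HTML fragment is invented:

    from html.parser import HTMLParser

    class AltTextChecker(HTMLParser):
        """Collect <img> tags with no alt attribute (WCAG 1.0 checkpoint 1.1)."""

        def __init__(self):
            super().__init__()
            self.missing_alt = []

        def handle_starttag(self, tag, attrs):
            attributes = dict(attrs)
            if tag == "img" and "alt" not in attributes:
                self.missing_alt.append(attributes.get("src", "<unknown source>"))

    # Invented page fragment used only to demonstrate the check.
    sample_html = """
    <h1>Map service</h1>
    <img src="devon-campsites.png" alt="Map of campsites in Devon">
    <img src="logo.png">
    """

    checker = AltTextChecker()
    checker.feed(sample_html)
    for src in checker.missing_alt:
        print(f"Priority 1 problem: image '{src}' has no alt text")

Real evaluation tools apply many such rules across whole sites; the point of the sketch is simply that rule-based checks of this kind are what automatic evaluation can and cannot cover.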

Bobby Automatic Evaluation Tool

Bobby is the automatic evaluator for web accessibility and browser compatibility. It is one of the most

widely used automatic accessibility evaluation tools. It recommends effective web page authoring for

special web browsers. Bobby looks at the underlying HTML code that controls the presentation of a web

page and analyses it against the Web Content Accessibility Guidelines 1.0 (WCAG) which is the W3C

specification providing guidance on accessibility of web-sites for people with disabilities. If Bobby

detects a violation it will highlight it and then provide some guidelines on how to repair the HTML code.

Bobby divides the accessibility errors into four sections to be tested:

• Priority 1: Accessibility problems that seriously affect a page's usability by people with disabilities.

A Bobby Approved rating can only be granted to a site when none of the pages contain accessibility

errors. Bobby Approved status is equivalent to Conformance Level A for the WCAG.

• Priority 2: Accessibility problems are second-tier problems that are considered important for access

although not as fatal as Priority 1 access errors. Designers should try to fix the items in this section.

If all items in this section in addition to the Priority 1 section, including relevant User Checks, are

passed, the page meets Conformance Level AA for the WCAG. This is the preferred minimum

conformance level for an accessible site.

• Priority 3: Accessibility problems are third-tier access problems. If all items in this section in

addition to the Priority 1 and 2 sections are passed, including relevant User Checks, the page meets

Conformance Level AAA for the WCAG.

• Browser Compatibility: Issues are HTML elements and element attributes that are used on the page

which are not valid for particular browsers. These elements do not necessarily cause accessibility

problems, but users might experience difficulties as the page may not be displayed as expected which

may affect the usability and accessibility as a result.

(Zaphiris & Ellis, 2001)

Once a web-site achieves a Bobby Approved rating, the organisation is entitled (though not required) to display the Bobby Approved icon on the site; the icon identifies the organisation as one committed to inclusion.

LIFT Automatic Evaluation Tool

LIFT combines usability and accessibility evaluation in a single assessment. It provides a report of the number of catastrophic errors (errors that prevent users from completing tasks), major errors (errors that cause users to face major impediments), minor errors (errors that are a real nuisance for users) and cosmetic errors (low-priority issues).

Automatic evaluation is a convenient and cost-effective way to perform usability and accessibility analysis. However, because some usability and accessibility issues cannot be fully determined and judged by automatic evaluation alone, it is not advisable to rely solely on these results to decide whether a web-site is truly usable and accessible. Manual evaluations should be carried out after automatic evaluations to supplement the results.

Analytic Evaluation - Manual/Non-Automatic Evaluation

Manual evaluations require an evaluator who conducts the evaluation and is familiar with the evaluation techniques - something that requires training, is time-consuming and demands considerable resources (in particular human resources) when there are many pages or many sites to evaluate within a limited time period.

Accessibility Guidelines

The same principles apply here as were introduced in the Guidelines section earlier in section 3.8.

Implementation

Accessibility guidelines should be applied when designing web-sites. The guidelines focus on the concerns of people with visual impairments and of those with motor difficulties that affect their physical ability, for example, to type or to position a mouse pointer precisely. In order to ensure accessibility, it is necessary to follow the recommendations of the guidelines as closely as possible and then, most importantly, to test the site with users with disabilities. Automatic tests are performed later in the design cycle.

JISC Implementation Example

In order to comply with the DDA, it is essential to take accessibility into account in design decisions and to apply the guidelines to the design of the web-site where applicable. Additionally, JISC services may also conduct automatic accessibility evaluations.

3.12 User testing

User testing is a process of analysing how users really use an interface. It can often uncover very specific

areas needing improvement, where focus groups and task analysis often find more general problems.

According to Nielsen and Landauer (1993), testing with 15 users will discover all the usability problems, and the user experience is improved far more by running three tests with 5 users each. According to Nielsen (1993), eight users are enough to test with, and the first eight users are expected to detect 85% of the site's usability problems.
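Figures of this kind are usually derived from the problem-discovery model attributed to Nielsen and Landauer (1993), in which the proportion of problems found with n test users is 1 - (1 - L)^n, where L is the proportion of problems a single user uncovers (around 31% in their data). The Python sketch below simply evaluates that formula; the value of L varies between studies, which is why different sources quote different user numbers for similar coverage:

    # Problem-discovery model attributed to Nielsen and Landauer (1993):
    # proportion of usability problems found = 1 - (1 - L)**n, where L is the
    # proportion a single test user uncovers (about 0.31 in their data).

    L = 0.31

    for n in (1, 3, 5, 8, 15):
        found = 1 - (1 - L) ** n
        print(f"{n:2d} users: roughly {found:.0%} of problems found")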

Methodology

There are two types of user testing addressed below, formal and retrospective.

Formal User testing

User testing involves a testing session with users. The participants that test the system are potential users

of the system. A set of test scripts should be prepared in advance aiming to collect both qualitative and

quantitative data from the evaluation, ideally in a usability laboratory. Participants should be carefully

selected from the potential/target user population.

A facilitator is responsible for conducting the test and recording the results. Participants should be asked to perform a set of pre-selected tasks, and their interactions with the interface, as well as their facial expressions, should be recorded during the test. They are asked to ‘think aloud’ whilst performing each task; the think-aloud technique is intended to generate a concurrent verbal protocol that captures what participants are thinking while interacting with the system. It is a simple technique that originated in psychological research methods (Ericsson & Simon, 1984) and has become increasingly popular for practical evaluation in usability research (Denning et al., 1990).

It is good practice to ask participants to fill out a survey relating to the tested site between individual test tasks, because human memory has very limited capacity, especially for the small details that are crucial for interface design. Informal interviews should also be carried out with participants to investigate user behaviour and to evaluate how users interact with the interface on a task-by-task basis.

Retrospective User Testing

Retrospective user testing involves a testing session in which a group of participants performs the test at the same time in the same environment. A set of observation scripts is prepared in advance, aiming to collect both qualitative and quantitative data from the evaluation. Retrospective verbal protocols are adopted, requiring the subjects to report what they did after the task has been completed; this relies on the subjects remembering what they did, but can be supported by a video recording of the task performance. Users are asked to fill out a survey relating to the site after each individual test task, because of the limited capacity of human memory. After the user testing, a focus group is conducted with the participants to investigate the information-seeking behaviour of the users and to evaluate how they interacted with the interface on a task-by-task basis.

In user-centred design, the focus is on matching the design to the needs and capabilities of the people who are going to use it. If some target users are disabled, their needs should be considered in the design process: in an evaluation, the facilitator should review the site from the perspective of a disabled user, and when the web-site undergoes a usability test some of the participants should be drawn from the disabled community.

Implementation

Usability evaluation can be conducted at any stage of the development process to ensure that the design stays on track.

JISC Implementation Example

Evaluation is a critical component of every stage of the design. In the JISC information environment, it is important to raise the quality of the design and keep costs under control by preventing the design from drifting off track.

For example, when developing a JISC service, user testing should be carried out at different stages of the development cycle. This provides an informative way of involving users in the design and helps to ensure that JISC services conform to usability and accessibility standards.

Additional Information

Cost of usability inspection methods: http://www.pages.drexel.edu/~zwz22/Cost.html

Usability evaluation techniques: http://www.dcs.napier.ac.uk/marble/Usability/Evaluation.html

G. Brajnik (2000) Automatic web usability evaluations: What needs to be done

http://www.tri.sbc.com/hfweb/brajnik/hfweb-brajnik.html

3.13 Cognitive Walkthrough

Cognitive walkthrough is an expert evaluation where usability experts, who may be sourced from outside

or within an institution, possibly through a local psychology department, step through a scenario/task to

question the design, focusing on the users’ knowledge and goals.

Methodology

A cognitive walkthrough involves a detailed sequence of actions, i.e. the steps that an interface would

require a user to perform in order to accomplish some tasks. Expert evaluators step through the sequence

to check it for potential usability and accessibility problems. The main focus is to establish how easy a

system is to learn.

Implementation

Before conducting a cognitive walkthrough, a scenario/task should be created. A usability expert then walks through the scenario/task, questioning the design by simulating the users' knowledge and goals.

The scenarios/tasks should represent the goals that users would be trying to achieve when using the web-site. The sequence of actions that the expert follows represents the pathway that real end-users would follow to accomplish tasks, with potential problems identified and documented. The evaluation should start with high-frequency tasks and then explore more specific or critical tasks, such as error recovery.

A set of templates based on CE+ theory (Wharton et al., 1994), an information-processing model of human cognition, should be adopted to assist the walkthrough. The templates are designed so that the usability experts can take on the role of each profile created in the personas and scenarios. The walkthrough should then aim to simulate the actual users, making sure that the site actually serves the needs of specific people in real life (the sketch below illustrates the kind of record such a template might hold).
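As an illustration of what such a template might record, the Python sketch below lists the four questions usually quoted for the cognitive walkthrough (Wharton et al., 1994) and applies them to one invented action for the ‘Matthew’ persona; the action, answers and notes are hypothetical:

    # Hypothetical record of one cognitive walkthrough step. The four questions
    # are those usually quoted for the method (Wharton et al., 1994); the action,
    # persona, answers and notes are invented for illustration.

    WALKTHROUGH_QUESTIONS = [
        "Will the user try to achieve the right effect?",
        "Will the user notice that the correct action is available?",
        "Will the user associate the correct action with the effect they want?",
        "If the correct action is performed, will the user see that progress is being made?",
    ]

    step = {
        "persona": "Matthew (geography researcher)",
        "action": "Select the 'campsites' overlay from the map layers menu",
        "answers": ["yes", "no", "yes", "yes"],
        "notes": "The layers menu is hidden behind an unlabelled icon.",
    }

    # Any 'no' answer indicates a potential usability problem to document.
    for question, answer in zip(WALKTHROUGH_QUESTIONS, step["answers"]):
        flag = "PROBLEM" if answer == "no" else "ok"
        print(f"[{flag}] {question} -> {answer}")
    print("Notes:", step["notes"])

Recording the answers action by action makes it straightforward to collate the 'no' answers into a list of potential problems at the end of the walkthrough.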

JISC Implementation

A cognitive walkthrough could be conducted by JISC to evaluate the usability and accessibility of current applications, or throughout the development of a new application or a re-design. This could start with creating scenarios and personas for the specific JISC web-based application. The scenarios created would be adopted as the primary tasks that most users would be performing when using the application.

Usability experts could then examine the web-based application from the user's point of view, based on the personas created prior to the walkthrough, and simulate each persona's behaviour by carrying out the predetermined scenarios. The experts would examine each of the correct actions needed to accomplish a task and evaluate whether the four cognitive steps from the template were satisfactorily addressed.

Using the template designed for the cognitive walkthrough, the usability experts would repeat the four steps several times (following the CE+ theory of Wharton et al., 1994), working through the series of sub-goals that make up the complete task for each scenario a user would attempt with the web-based application.

Additional Information

J. Rieman, M. Franzke, and D. Redmiles (1995) Usability Evaluation with the Cognitive Walkthrough

http://www.acm.org/sigchi/chi95/proceedings/tutors/jr_bdy.htm

Gregory Abowd (1995) Performing a Cognitive Walkthrough

http://www.cc.gatech.edu/computing/classes/cs3302/documents/cog.walk.html

3.14 Heuristic Evaluation

A heuristic evaluation is an expert evaluation method that uses a set of principles to assess whether an interface is user-friendly. One of its key advantages is that it does not depend on end-user involvement; it is therefore a lot quicker and often cheaper to conduct - factors that are especially appealing when resources are limited (Kantner & Rosenbaum, 1997).

Nielsen and Molich developed the technique and provided the industry with an original set of nine heuristics, which were subsequently improved and extended (Dix et al., 1998). Since their introduction

many new sets of heuristics have been devised. For example Nielsen himself has developed HOMERUN,

a set he regards as more suitable for evaluations of commercial web-sites (Preece et al., 2002).

The effectiveness of an evaluation is largely dependent on the experience and number of experts involved,

with three to five experts being regarded as sufficient to identify the majority of the key problems. Danino

(2001) states that ‘…5 evaluators who are experts in software ergonomics and in the field in which the

software is applied … will typically find 81%-90% of usability problems’.

However, it should be noted that heuristic evaluations are commonly regarded as an inferior form of

evaluation since they are subject to evaluator bias and often miss or wrongly identify problems. Thus,

Preece et al. (2002) believe that heuristic evaluations should not be considered a substitute for user

testing.

Methodology

To carry out a heuristic evaluation the following steps must be followed:

1. Select your expert evaluators, preferably with extensive usability knowledge and domain

awareness.

2. Define suitable tasks for the evaluators to attempt, or ask them simply to walk through the site.

3. Each evaluator should independently attempt the tasks looking at each element of the interface,

assessing it against the set of heuristics.

4. If a heuristic is contravened then the evaluator records a description of the problem, the location

on the site, the heuristic that was violated and often a rating identifying how severe the problem

is.

5. The problems identified by all the evaluators are then appropriately grouped, and possibly

categorised (i.e. navigational problem, consistency problem).

6. The findings are presented in a written report.

As mentioned in step 4, a severity rating is also often applied to the usability problems uncovered, generally on a scale of 0 to 4 (4 being the most critical). Ranking usability problems by severity helps to determine the key problems that should be addressed, given that not all problems can be fixed within the constraints of time and cost. The severity rating of a problem is a combination of three key factors (a small sketch of recording and combining such ratings follows this list):

• Frequency – how often does the problem occur?

• Impact – how easy or difficult is it to overcome?

• Persistence – once users are aware of the problem, can it be overcome, or will it repeatedly impede their performance?

(Nielsen, 1995)
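The Python sketch below is a hypothetical illustration of steps 4 and 5 of the methodology: individual findings are recorded with a heuristic, location and severity, duplicate reports are grouped, and average severity is used to rank the problems. The findings themselves are invented:

    from collections import defaultdict
    from statistics import mean

    # Hypothetical heuristic evaluation findings; heuristics, locations and
    # severity ratings (0-4 scale, 4 most critical) are invented for illustration.
    findings = [
        {"evaluator": "A", "heuristic": "Consistency and standards",
         "location": "/search", "severity": 3,
         "problem": "Search button labelled differently on the results page"},
        {"evaluator": "B", "heuristic": "Consistency and standards",
         "location": "/search", "severity": 4,
         "problem": "Search button labelled differently on the results page"},
        {"evaluator": "B", "heuristic": "Visibility of system status",
         "location": "/download", "severity": 2,
         "problem": "No progress indicator while a map is exported"},
    ]

    # Group duplicate reports of the same problem and average their severity,
    # so the team can decide which problems to address first.
    grouped = defaultdict(list)
    for f in findings:
        grouped[(f["heuristic"], f["location"], f["problem"])].append(f["severity"])

    ranked = sorted(grouped.items(), key=lambda item: mean(item[1]), reverse=True)
    for (heuristic, location, problem), severities in ranked:
        print(f"{mean(severities):.1f}  {heuristic} at {location}: {problem}")

Averaging across evaluators also makes the inconsistency in severity ratings noted below easier to see and to discuss.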

The findings from an evaluation of this nature provide the design team with a large amount of qualitative

data, which can be transformed into clear re-design proposals.

A note of caution, however: as previously mentioned, heuristic evaluations are perceived to contain judgements that are biased towards the evaluators' views to some degree. This is most prominently displayed by inconsistency between experts with regard to severity ratings (Capra, 2001). Additionally, heuristic evaluations often identify usability issues that real users do not perceive as being a problem.

Implementation

Heuristic evaluations are suitable at almost any time during a user-centred design cycle. Thus the

technique can be applied to prototypes or fully implemented interfaces to retrieve valuable information

regarding issues of usability.

JISC Implementation Example

Heuristics are a powerful technique and there are currently a variety of sets focused upon web-based

services. However, the diversity of JISC’s services would imply that some modification of these sets

would further ensure their effectiveness was maximised.

A study of heuristic evaluations has been carried out at Indiana University on their Digital Music

Libraries interface. Two separate heuristic evaluations were conducted, one by users and the other by

experts. Contrary to popular belief, they found that the users uncovered more problems with higher

degrees of severity than the experts did. Thus, Minibayeva (2002) notes that there is potential for

successful heuristic evaluations to be carried out by users:

[E]mploying user-based versions of certain expert-based methodologies could potentially ensure

higher validity of usability evaluation, reduce the time and number of usability experts, and

involve users more actively at earlier stages of systems design.

Traditional research has, however, shown that a high level of usability experience is imperative for achieving quality results. Although there is no question over domain knowledge, section 1 of this report

has already identified that the level of usability experience varies greatly between JISC services.

Therefore, heuristic evaluations do not present themselves as the most appropriate method to be widely

implemented by internal JISC staff.

Additional Information

Jakob Nielsen’s web-site Useit.com provides information on how to conduct a heuristic evaluation, and

also presents different sets of heuristics: http://www.useit.com/papers/heuristic/heuristic_evaluation.html

Danino provides a step by step guide to conducting heuristic evaluations:

http://www.webmasterbase.com/article/520

Instone provides an explanation and list of usability heuristics for web evaluations: http://user-experience.org/uefiles/writings/heuristics.html

A background introduction to Minibayeva's research, A User-Based Approach to Cognitive Walkthrough and Heuristic Evaluation, is available at: http://www.slis.indiana.edu/news/story.php?story_id=406

3.15 Summary of Costs and Benefits of the Methods by Olson & Moran (1996)

Each entry lists: Method – Type – Benefits – Cost.

DEFINE THE PROBLEM
• Naturalistic observation (diaries, videotape, etc.) – Empirical – Tasks – 2 days
• Interviews (including focus groups, decision tree analysis, semantic nets) – Empirical – Tasks – 1 day
• Scenarios or use cases (including envisioning) – Analytic – Tasks – 1 day
• Task analysis (including operator function model) – Analytic – Tasks – 2 days

GENERATE A DESIGN
• Building on previous designs (steal and improve, design guidelines) – Constructive – Tasks, perform, learn, accept – 1 day
• Represent conceptual model – Constructive – Learn – 1 day
• Represent interaction (GTN, dataflow diagram) – Constructive – Perform, learn – 2 days
• Represent visual display – Constructive – Perform, learn – 2 days
• Design space analysis (QOC, decomposition analysis) – Analytic – Tasks, perform, learn – 3 days

REFLECT ON THE DESIGN
• Checklists – Analytic – Perform, learn – 1 day
• Walkthroughs – Analytic – Perform, learn – 2 days
• Mapping analysis (task action, metaphor, consistency) – Analytic – Perform, learn – 2 days
• Methods analysis (GOMS, KLM, CPM, CCT) – Analytic – Perform, learn – 3 days
• Display analyses – Analytic – Perform, learn – 3 days

BUILD A PROTOTYPE
• Prototyping tools – Constructive – Testable system – 1 month
• Participatory prototyping – Empirical – Tasks, accept – 1 week

TEST THE PROTOTYPE
• Open testing (storefront or hallway, alpha, damage testing) – Empirical – Perform, learn, accept – 1 month
• Usability testing – Empirical – Perform, learn, accept – 1 year

IMPLEMENT THE DESIGN
• Toolkits (Motif, NeXTstep, Apple, etc.) – Constructive – Fully testable system – 6 months

DEPLOY THE SYSTEM
• Internal testing – Empirical – Perform, learn, accept – 1 month
• Beta testing (logging, metering, surveys) – Empirical – Tasks, perform, learn, accept – 1 month


4. Study of Current Practices

4.1 Digital Libraries

In recent years, the information superhighway, the Internet, has become a global gateway for information

dissemination. With the ability to share worldwide collections of information, digital libraries (DL’s) have become a common medium for storing and disseminating information, created by individuals or groups that select, organise and catalogue large numbers of documents.

DL’s, generally defined as ‘collections of information that are both digitised and organised’ (Lesk, 1997), give us opportunities we never had with traditional libraries or even with the web. DL’s are still emerging, and the digital computer is the technology that has finally enabled Bush’s (1945) ‘memex’ to be realised.

According to JISC, a DL may be defined as an organised collection of digital resources accessible by

means of an electronic catalogue or other form of finding aid. It includes conventional library catalogues

such as the Online Public Access Catalogue (OPAC) as well as newer resources such as e-journals,

records from special and archival collections and multimedia resources. A DL system is the software

environment that underpins the library catalogue and other resources. The system may be wholly or partly

accessible via an Intranet or the Internet. It may often integrate access to virtual versions of library

services, such as reservations, registration and reference enquiries offered, particularly to distance

learners. The term DL resource refers to a resource that is associated with or part of a DL. Linking to local DL systems refers to systems that are available within the local institution.

JISC currently funds a number of projects and programmes to develop a national infrastructure for access

to DL resources in the UK. JISC is supporting the management and preservation of institutional and

community records and digital materials. Preservation of digital resources will be of increasing

importance for a wide range of activities and materials within UK further and higher education. The

sector invests substantial sums in subscriptions to e-journals and in addition is investing heavily in


digitisation and in arts and scientific data in digital form. The organisation aims to provide a strategy and

a range of information and advice sources that assist in the wider process of digital preservation.

DL’s consist of four main components:

1. Information, referring to the content of a DL.

2. Structure, referring to the metadata of objects described in the DL collection.

3. Interaction elements, referring to the dynamics of searching and browsing, screen design, and the dialogue between end-users and the DL.

4. Propriety, referring to security, ethical and copyright issues, etc.

Studies have shown that users have great difficulty using even relatively basic OPACs. These difficulties stem mainly from learning to use a new piece of software, getting to know where information is located within a library's structure, and using Boolean search operators.

Current DL designs contain complex facilities, including text search, hypertext functionality, multimedia, Internet delivery and highly interactive interfaces.

As a DL is more complex than a simple web page, considerably more usability work is needed to make a DL usable than is needed for conventional web pages. DL’s are more than just web-sites or places for information storage. In order to design usable and accessible DL’s, it is essential to know who the users are, what they will use the DL’s for, their context of work and the environment in which the DL’s will be used, as well as the technical aspects of the DL’s and their logistical feasibility. When designing usable DL’s, special usability issues, such as knowing the users' tasks and populations and their cultural diversity, should be taken into account, as these are all important aspects of producing truly usable DL’s.


Dix et al (1998) suggested that even if the best methodologies and models are adopted in the design of a

usable interactive system, it is still necessary to assess the design and test the system to ensure that it

behaves as expected and meets the user’s requirements. Therefore, there is a need for a usability and

accessibility framework that supports the development of effective solutions for DL’s in order to produce

truly usable and accessible DL’s.

When designing and building DL’s, it is important to take account of existing usability research in the

area of traditional libraries as well as research in the area of information management. To ensure a good,

usable DL is produced, with high performance and user satisfaction, it is important not only to embrace the advantages of digital information but also to retain the advantages of print, drawing upon expertise and knowledge from both the DL community and the information and library science communities.

Research by Theng et al. (1999) evaluated DL’s which provide services similar to some of those funded by JISC. These libraries are:

• The Networked Computer Science Technical Reference Library (NCSTRL)

• The New Zealand Digital Library (NZDL)

• The ACM Digital Library (ACMDL)

These services are all available to the general public and are good examples of DL’s found on the web in

terms of their information and coverage according to the evaluators.

NCSTRL is an international collection of computer science research reports and papers made available

for non-commercial use from more than 100 participating institutions and archives.

(http://www.ncstrl.org). NZDL comprises several demonstration collections such as computer science

technical reports, literary works, Internet FAQs, and the Computists Communique magazine

(http://www.nzdl.org). ACMDL consists of a vast resource of bibliographic information, citations and full-text articles (http://www.acm.org).

Questionnaire results in the research show that users' overall impressions of DL’s are determined by how

effective the DL’s are in helping them to complete the tasks successfully.


In the ‘Search’ tasks, it was found that categorisation is important for a successful search result, while in the ‘Browse’ tasks it was found that a confusing site layout prevents users from browsing effectively.

The DL study conducted by Theng et al. (1999) revealed areas of design flaws that need improving. The main issue was the need for better navigation support mechanisms to address the ‘lost in hyperspace’ problem. Navigation here refers to end-users’ confidence in navigating within the DL. Users experienced some degree of ‘lostness’, which relates to the ‘lost in hyperspace’ problem and refers to the following phenomena:

• Not knowing where they are in the DL

• Not knowing how to get to some other place they know (or think) exists in the DL

• Not knowing how to return to a topic left previously

• Forgetting the key points covered

Soergel (2002) developed a framework for DL research and proposed some guiding principles for the development of DL’s. The principles related to usability issues were:

• DL’s need linked data structures for powerful navigation and search.

• The interface for the DL’s should guide users through complex tasks.

• Innovative DL design should be informed by studies of user requirements and user behaviour.

First, as much of the knowledge base and intellectual assets of institutions and staff are now in digital

form, unless significant effort is put urgently into digital preservation and securing long-term access to

these digital resources, uncertainties over archiving will continue to impede the growth and take-up of

digital services, e-science, and new working practices. Second, without such preservation effort, current investment in digitisation and digital content will only secure short-term rather than long-term benefits.

In building DL’s for JISC, it is important to consider key principles so that these libraries will be easily

usable, and have long-term archival value:


1. Declarative representations of documents should be used.

2. Document components should be represented using natural forms, namely objects that can be

manipulated by users familiar with those objects.

3. Links should be recorded, preserved, organised and generalised.

4. There should be a separation between the DL and user interfaces to it.

5. Searching should make use of advanced retrieval methods.

6. Open systems that include the user, and where (some of) the functions of librarians are carried out by

the computer, must be developed.

7. Task-oriented access to electronic archives must be supported.

8. A user-centred development approach should be adopted.

9. Users should work with objects at the right level of generality.


4.2 JISC Digital Library Services

In this usability evaluation study for the JISC Information Environment, four types of service were evaluated:

4.2.1 Resource Discovery Services

In order to improve resource discovery over the Internet, there is a need for better interaction between a

user carrying out the search and the search system. Resource Discovery Services (RDS) improve on current search engines and offer better search accuracy for users by using their profile information.

The European-funded project GESTALT (Getting Education Systems Talk Across Leading Edge Technologies) identified a list of user requirements for RDS. According to the list:

• the user environment should be multi-platform

• the user environment should be easy to use (technical and non-technical users) and should provide

fast response times

• different actions (search, filter, retrieve) should be integrated in one consistent user interface, using a standardised interface for all searches

• the user environment should support Internet access and a number of different access networks (e.g.

ISDN, ATM)

• the user environment should keep a trace of the different actions performed by a specific customer.

The system could use information generated from previous searches

And in terms of the search facility:

• the search utility should be fast and easy to use

• the search result should provide the user with all the information he/she needs about a product or

services

• the search engine should confirm the authenticity and quality of the product and supplier

• the search facility should support multi-type and multi-level searches


The resource discovery service requires the development of a user profile service where users are allowed

to enter and manage information about themselves together with their service requirements, and this

information will then be used by the RDS to improve the search facilities.
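As an illustration only, the sketch below (in Python, with hypothetical field and function names that are not part of GESTALT or any JISC service) shows how stored profile information of this kind might be used to re-rank search results:

```python
from dataclasses import dataclass, field


@dataclass
class UserProfile:
    """Hypothetical profile that a resource discovery service might store per user."""
    user_id: str
    subjects: set[str] = field(default_factory=set)                  # e.g. {"sociology", "gis"}
    preferred_resource_types: set[str] = field(default_factory=set)  # e.g. {"report", "dataset"}


def rerank(results: list[dict], profile: UserProfile) -> list[dict]:
    """Boost results whose subject or resource type matches the user's profile."""
    def score(result: dict) -> int:
        boost = 0
        if result.get("subject") in profile.subjects:
            boost += 2
        if result.get("type") in profile.preferred_resource_types:
            boost += 1
        return boost

    return sorted(results, key=score, reverse=True)


# Example: a GIS-oriented user sees GIS material promoted to the top of the list.
results = [{"title": "Census overview", "subject": "demography", "type": "report"},
           {"title": "GIS primer", "subject": "gis", "type": "report"}]
profile = UserProfile(user_id="u1", subjects={"gis"})
print([r["title"] for r in rerank(results, profile)])  # ['GIS primer', 'Census overview']
```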

Comparative Services

• Academic Info – The Social Science Gateway http://www.academicinfo.net/subsoc.html - This social

science information gateway aims to improve access to online educational resources by developing

an easy to use subject directory covering each academic discipline. In terms of usability, this site has

a simple and basic interface design. However, the visual hierarchy is not clear enough and it has a

rather limited collection of resources available. The information is not categorised by resource type but only alphabetically, and the search function is not obvious enough to users, which makes finding specific information difficult.

• The Voice of the Shuttle http://www.mirror.ac.uk/sites/vos.ucsb.edu/ - The Voice of the Shuttle is a

web directory for academic research which provides information resources to a broad range of

categories, not just social science resources. However, the visual hierarchy is not clear to users and the navigation structure is poor. It is not obvious whether the links provided are clickable, and the site contains a number of broken links, which severely undermines its usability.

• Social Science Virtual Library http://www.clas.ufl.edu/users/gthursby/socsci/index.htm -

The Virtual Library is the oldest catalogue of the web, started by Tim Berners-Lee, the creator of

html and the web itself. It is a non-commercial service run by a group of volunteers, who compile pages of key links for particular areas in which they are experts. This site mainly categorises its resources alphabetically. The navigation is simple and easy to use, but the site does not have breadcrumbs (on a web-site, a breadcrumb trail is a navigation tool that allows a user to see where the current page is in relation to the site's hierarchy) to tell users where they are within the site. Users might experience degrees of ‘lostness’ while browsing the site looking for

information.


• Infoglobus – Social Sciences Gateway http://social.narod.ru - Infoglobus is a Russia-based online resource for social scientists. It is not obvious whether the links are clickable, and the visual hierarchy is not clear either. In particular, the site does not clearly tell users how to change the default language from Russian to English, which makes it unusable for users who have no knowledge of the Russian language.

• The Pinakes http://www.hw.ac.uk/libWWW/irn/pinakes/pinakes.html - Pinakes is a web page that

provides Internet resources by linking to other major subject gateways. The image icon associated

with each link enables users to easily spot the subject gateway that relates to the subject area or

specific information that they are looking for. The categorisation of the site is a bit ambiguous

though, as some of the links were categorised as ‘multi-subject gateways’ while the rest of the links

have no category at all.


4.2.2 Bibliographic services

Bibliographic services contain databases in the form of ‘an organised collection of information’. The

database contains descriptive information (citation and subject headings) for publications, such as books,

periodical articles, videotapes or government documents. According to Aalberg (2002), when users have

vaguely defined information needs and prefer to explore which publication is available by browsing the

catalogue, it is important to have a meaningful navigation path to assist their search.

The structure of the database generally consists of the following information:

Index: includes citation and subject headings, also known as descriptors, for each publication.

Abstracted Index: includes the citation, subject headings (descriptors) and a summary of the content of the publication.

Comparative Services

Most publishers offer table of contents alerting for their own journals. For example:

• Elsevier http://www.elsevier.nl/homepage/alert/?mode=direct (Alerting service) and Springer

http://link.springer.de/cs/service.htm (Alerting service) - Elsevier’s service includes books in its table of contents; the site has a clear visual hierarchy and clickable items are obvious to users. However, some of the links in the navigation bar split over two lines, which makes it difficult for users to identify each individual link. The visual hierarchy of Springer’s site is also clear to users, and the site is divided into sections with primary and secondary navigation in its hierarchy, which makes it easy to navigate. However, clickable links and buttons in parts of the Springer site (e.g. the Alert services) are not obvious enough and need improvement.

• There are also specific products such as ISI's Current Contents Connect

(http://www.isinet.com/isi/products/alerting/) that provide alerting services to users.


• MedFetch http://www.medfetch.com/ uses Medline to provide subject based alerting services. Users

may find it difficult to navigate the site, as it does not have a clear visual hierarchy. Some of the terminology is also unclear, and the search function is not obvious to users.


4.2.3 Virtual map libraries

Geo-spatial data resources like virtual map libraries are increasing in availability, size and complexity.

However, the expected growth in numbers of users related to research in higher education has not

materialised. One possible reason for this is the steep learning curve associated with effective use of

spatial data (KINDS, 2003).

Research over the past decade has revealed that inappropriately designed interfaces have led to problems

regarding the usability and accessibility of the services.

When paper maps were the sole tool for visualising geospatial information there were many efforts by the

geospatial community to provide tactile or tactual maps for the visually impaired… so as to not exclude

certain members of the community. Similarly, research has produced guidelines for use of colour on maps

(and other displays) that minimises interpretation problems for those with colour vision impairment…

The same needs to be done for contemporary visualisation products. Access to geospatial information,

and the interfaces that provide the “gateways” to this information, need to be designed in sympathy to all

users, so as to ensure equality of access and use (Cartwright et al., 2001).

Davies and Medyckyj-Scott's research in the mid-1990s (1994; 1996) identified key usability problems that affect Geographical Information Systems (GIS) user interfaces. These issues also need to be addressed with regard to web-based virtual map libraries:

• Non-technical end-users are often unable to adapt the interface to their preferences and comfort.

• User interfaces should comply more thoroughly with national, international and proprietary

interface standards, to enable users to transfer existing computing knowledge and skills to the

GIS and thus increase learnability.

• Extra functionality should not be brought in at the expense of usability.

• Problems of ease of use can only be solved through better design. Longer training courses have

not compensated for poor usability.

• System usability, especially of the interface display, is strongly correlated with users' productivity.


(Davies & Medyckyj-Scott 1994 and 1996)

Fabrikant (2001) therefore notes that to help overcome these issues, the geo-visualisation community

needs to focus upon two specific goals:

1. The need to develop task-centric visualisation tools.

2. The need for sound usability evaluation procedures.

Current usability practice

Building on this previous research, usability evaluations of web-based geo-spatial information providers have begun to be carried out in recent years.

• Researchers at Oregon State University reviewed several large clearinghouses including the

National Geospatial Data Clearinghouse (Walsh et al., 2002). The established usability

assessment methods that were applied to their study included a user expectation survey and user

testing evaluations.

• Associate Professor Moller-Jensen from the University of Copenhagen has recently conducted query-based evaluations and low-fidelity prototyping to ‘…monitor the interaction between a

group of relatively inexperienced GIS-users and a standard internet map server (IMS)

application’ (Moller-Jensen, 1999).

• Researchers at Manchester University have also noted that ‘accessibility and usability of spatial

data sets are major bottlenecks to increasing the number of users and applications’ (Li et al,

1999). Thus they have applied HCI principles in the development of the Knowledge-based

Interface to National Data Sets (KINDS). KINDS provides a virtual map library that comprises a full UK national coverage directory and sub-directories named after the corresponding tile of

the National Grid. Its key aim is to ‘…increase awareness, accessibility and usability of spatial

data sets’ (KINDS, 2003). A user survey including semi-structured interviews and a technical

questionnaire were conducted to assess the users' requirements.


The key findings from these studies indicate that ‘new users face a significant learning curve when

adopting spatial data… The combination of high level technical skills required often result in spatial data

handling being the preserve of the highly technically competent’. To overcome these issues, the

evaluators note that user-friendly, easy-to-access and flexible interfaces need to be developed, thus enabling all users to browse and handle spatial data effectively (Li et al, 1999).

Comparative services

JISC offers a variety of geo-spatial services including:

• Landmap http://www.landmap.ac.uk - Orthorectified satellite image mosaics of Landsat, SPOT and

ERS radar data and a high resolution Digital Elevation Model for the whole of the British Isles. These

data are in a form that can easily be merged with other data, such as road networks, in order that any

user can quickly produce a precise map of their area of interest (JISC, 2003a).

• Digimap http://edina.ac.uk/digimap - Comprehensive selection of Ordnance Survey® (OS) digital

map data and high-quality cartographic products, including: Land-Line.Plus, Meridian, 1:50,000

Scale Colour Raster, Strategi, Land-Form PANORAMA Contours, Land-Form PANORAMA DTM,

Gazetteer and Codepoint with Polygons (JISC, 2003b).

Some of the services these resources provide are unique. However, the interaction and design of services

like the Basic Mapping interface on Digimap can be compared to that of other services, especially since

the user interface is one of the most important aspects of a geo-spatial information system. The sites

below all offer web-based mapping services that follow established web design guidelines, such as

providing icons to navigate their way around the site and manipulate the data. Li et al note that ‘potential

spatial data users can gain information about the data set far more easily by browsing its contents than by

reading a textual description’ (Li et al, 1999). Factors such as this can enable a comparative assessment of

the usability and accessibility of these services, and those provided by JISC to be carried out.


• The British Geological Survey http://www.bgs.ac.uk/geoindex/index.htm - Geoscience Data

Index mapping facility that allows users to generate OS maps that identify key areas of

geological interest i.e. boreholes in a region.

• GeoWeb North Van GIS http://www.geoweb.dnv.org/maps/index.html - Grid maps that divide

the District down into small sections, or grids, that show small areas in great detail. The site

additionally offers maps displaying one theme across the entire District.

• MultiMap http://www.multimap.com - A commercial mapping service that allows users to view

the retrieved data in twelve different scales. Panning options assist the user in their tasks.


4.2.4 Digital image libraries

The number of images available on the Web was estimated in 1997 to be between 10 and 30 million

(Eakins & Graham, 1999), a figure that we can assume has vastly increased in the past five years. Digital image libraries are now a key resource for retrieving such images over the Internet, but users often find it hard to access the information they want because:

• Interfaces are non-intuitive.

• Users do not know what information is available.

One reason for this, as proposed by Bird (1999), is that ‘…little attention has been paid to the user

requirements and expectations associated with content-based image retrieval’ (a technique for retrieving

images on the basis of automatically-derived features such as colour, texture and shape).
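The studies cited here do not describe CBIR implementations in detail, but as a minimal sketch of the underlying idea of matching on automatically derived features, the following Python fragment (our own illustration, using NumPy) compares two images by their colour histograms:

```python
import numpy as np


def colour_histogram(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Build a normalised joint RGB histogram for an image array of shape (H, W, 3)."""
    hist, _ = np.histogramdd(
        image.reshape(-1, 3),
        bins=(bins, bins, bins),
        range=((0, 256), (0, 256), (0, 256)),
    )
    hist = hist.flatten()
    return hist / hist.sum()


def histogram_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Histogram intersection: 1.0 means identical colour distributions."""
    return float(np.minimum(a, b).sum())


# Example: compare two random 64x64 RGB images.
img1 = np.random.randint(0, 256, (64, 64, 3))
img2 = np.random.randint(0, 256, (64, 64, 3))
print(histogram_similarity(colour_histogram(img1), colour_histogram(img2)))
```

Real CBIR engines combine several such features (for example texture and shape as well as colour) and index them for fast lookup, but the histogram comparison conveys the basic principle.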

It is noted that more research is also required about ‘…how users can usefully be segmented into different

types, the needs of these types, and the implications for retrieval systems design’ (Eakins & Graham,

1999). With its involvement in the NSF International DL’s Program, JISC is helping to uncover these issues and thus increase the usability and accessibility of digital image libraries. The Visual Arts Data

Service (VADS) has also spearheaded this movement by establishing standards and good practice for

visual arts digital resources (Grout et al, 2003). The document further addresses usability issues such as

consistency and navigation of the user interface.

Research into the usability of digital image interfaces, with special reference to the use of colour selection

in content-based image retrieval (CBIR) systems, has been conducted by the Nijmegen Institute for

Cognition and Information (Van Den Broek et al., 2002). They have identified that many CBIR interfaces

are difficult to use and non-intuitive. Therefore, available information concerning the users' cognitive

abilities should be considered regarding all three components of CBIR-engines:

• Definition of the query by the user (i.e. input of content)


• The image retrieval engine, conducting intelligent image analyses (i.e. based on and adapted to

the users' characteristics).

• The presentation of retrieval results (i.e. the output to the user).

Image libraries also need to be accessible to all users. One of the key requirements for enabling visually impaired users who use a screen reader is for webmasters to include alt-text whenever there is an image, describing the image (e.g. "London Bridge").
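A simple automated check for this requirement can be scripted; the sketch below is our own illustration (using the third-party BeautifulSoup library) and flags any img element that lacks a non-empty alt attribute:

```python
from bs4 import BeautifulSoup  # third-party package: beautifulsoup4


def images_missing_alt(html: str) -> list[str]:
    """Return the src of every <img> that has no non-empty alt attribute."""
    soup = BeautifulSoup(html, "html.parser")
    missing = []
    for img in soup.find_all("img"):
        alt = (img.get("alt") or "").strip()
        if not alt:
            missing.append(img.get("src", "<no src>"))
    return missing


sample = '<img src="bridge.jpg" alt="London Bridge"><img src="logo.gif">'
print(images_missing_alt(sample))  # ['logo.gif']
```

Such a check only detects missing alt-text; whether the description is actually meaningful to a screen-reader user still requires human judgement.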

Current usability practice

Many international parties, as outlined by a few of the key studies below, have conducted usability and

accessibility research into digital image libraries over the past years:

• A review of content-based image retrieval services was conducted by Venters and Cooper

(http://www.jtap.ac.uk/reports/htm/jtap-054.html#_Toc482422975) from the University of

Manchester in the 1990’s (a study funded by JISC). The authors conducted heuristic evaluations

on two web-based image services, ImageFinder and IMatch. The evaluations involved an

examination of the user interface against Nielsen’s ten heuristics to identify any usability

problems.

• The Applied Science & Technology Group, IBM UK Laboratories have conducted a user study

of CBIR as part of the European Union-funded Electronic Library Image Service for Europe

(ELISE) project. The ELISE project is concerned with the issues that surround building a

complete digital image service. The usability study was based around IBM's QBIC (Query By

Image Content) technology, and consisted of images of cultural artefacts from ELISE (Day,

1999).

• The Digital Knowledge Center (the DL research and development department of the Sheridan

Libraries) has worked with other library departments and groups outside the library to evaluate

the usability of a variety of web-sites. They have worked with Special Collections to conduct an


online survey of the prototype for the Roman de la Rose site, and are currently working with

them on the digital sheet music harvester usability project (DKC, 2003).

The studies above uncovered a selection of usability problems. Venters and Cooper

(http://www.jtap.ac.uk/reports/htm/jtap-054.html#_Toc482422975) noted that uncommon interface elements contributed to usability problems on IMatch’s web-site. For example, the icons used for copy and delete functions did not correspond with standard Windows icons. Additionally, it was found that services such as IMatch could be improved by introducing more consistency via a Multiple Document Interface (MDI) environment, rather than presenting the user with two related but separate interfaces. The Digital Knowledge Center, in conjunction with the Roman de la Rose digital image library, has also used the feedback from their usability surveys to make amendments to the original design, thus producing a more user-friendly service.

Comparative Services

JISC has a variety of digital image libraries:

• Bristol Biomedical Archive http://www.brisbio.ac.uk - 10,300 online biomedical images, from ILRT

at the University of Bristol, covering the fields of medicine, dentistry and veterinary science and

intended for use in teaching (JISC, 2003c).

• St Andrews University Library Image Collection http://www.helix.dmu.ac.uk/ - 15,000 online images

delivered via the HELIX Project at De Montfort University (JISC, 2003d).

• Visual Arts Data Service http://vads.ahds.ac.uk - Contains ten collections from JIDI (the JISC Image

Digitisation Initiative) providing online teaching and research materials which focus on image

collections as well as web-sites across a range of visual arts subject areas, including student degree

show material (JISC, 2003e).


The services below are similar to those provided by JISC, and have similarly attempted to

address issues regarding the ease and effectiveness of retrieving images, thus initiating best practices for

these resources.

• Picture Australia http://www.pictureaustralia.org/index.html - Internet based service that allows

you to search many online pictorial collections at the same time. Images can be searched for via

standard search box, collection or theme.

• The Metropolitan Museum of Art http://www.metmuseum.org/home.asp - The museum's

photographic services department maintains a digital operation that supports both archival and

collections-management functions throughout the institution.


5. Usability and Accessibility Framework for DL

5.1 Nature of digital libraries

Traditional bricks and mortar libraries can be defined as managed collections of information that enable

users to increase their knowledge. Modern digital libraries (DL’s) endeavour to provide the same

services, but deliver information over the Internet or Intranet; therefore they operate in the intersection

between traditional libraries and the information superhighway.

Figure 5: DL’s position in relation to traditional libraries and the Internet (web-based digital libraries sit at the intersection of the Internet/Intranet and libraries).

A crucial factor for libraries is that the information they preserve and deliver is effectively organised.

With regards to DL’s, Arms (2002) notes that a ‘[d]igital stream of data sent to earth from a satellite is

not a library. [However] The same data, when organised systematically, becomes a digital library

collection’. This is one of the key dimensions of a DL. Highly effective cataloguing, organisation and structure of information separate DL’s from other ad-hoc web services where the information

architecture and navigational mechanisms have no particular justification.


Another key dimension is user behaviour. Web-sites are often designed to support browsing activities,

whereas DL’s need to support task-oriented navigation. Helander and Vora (1997) define the difference

between these two information-seeking behaviours:

The main distinction between navigation and browsing is based upon user goals. In browsing,

users explore the available hypertext to get a general idea about one or several topics. Whereas,

in navigation, users have a specified goal in mind (Helander & Vora, 1997).

Figure 6 highlights the position of DL’s with regard to information-seeking behaviour and the organisation of information, with reference to the Internet.

Figure 6: Axis of user behaviour versus information organisation


5.2 Usability and accessibility iterative framework for DL’s

Libraries have always tried to remove obstacles to information access. A poorly designed DL is certainly

a barrier to the library user; therefore the need exists for a specific usability and accessibility framework

for DL’s, which if adopted can ensure quality and enhanced usability of a service.

We regard the most important aspect in evaluating a system to be the identification of real user problems;

therefore our framework pays particular attention to evaluation techniques that involve current and

prospective users. Expert evaluation methodologies are also conducted to supplement user evaluations

and address areas that are not covered by previous evaluation techniques. After each stage the findings

must be evaluated, enabling appropriate design and modification of the techniques in the next stage of the

framework, thus ensuring maximum effectiveness.

Figure 7: DL’s usability/accessibility framework


The framework can be broken down into seven key steps:

1. Conduct Query - Requirement Gathering

Identify satisfaction levels of current users of the system and establish key positive and negative

aspects of the interface, what features they would like to see etc.

2. Analysis

Evaluate current findings and identify issues not yet addressed

3. Perform Empirical (user) Evaluations

We regard user testing as the strongest evaluation technique, allowing us to identify real user

problems by observing users interacting with the system. Retrospective focus groups or interviews

conducted after the evaluations also provide a volume of qualitative data.

4. Analysis

Establish key problems and assess if any areas of the service have not been covered by user

evaluations

5. Expert Evaluations

Appropriate modification of expert evaluation techniques may be required so that they supplement

previous evaluation findings, and address any areas or issues that have not as yet been covered

6. Analysis

Analyse all data identifying key issues that need to be addressed in the redesign of the service.

Establish new usability and accessibility goals for the design

7. Iterative Process

Re-conduct all stages in the iterative framework to evaluate redesign


The techniques in each stage of the process are:

Query Techniques: Questionnaires, Interviews, Focus groups

User Testing: Retrospective, Concurrent

Expert Evaluations: Heuristic evaluation, Cognitive Walkthrough

The evaluation techniques applied to the DL framework also need to address the highly organised and task-based nature of DL’s. In our evaluations of four JISC services, for example, the tasks used in the user testing evaluations and cognitive walkthroughs were designed with these two dimensions in mind.


6. Instrument Development Tools

A variety of methodologies and techniques are applied to the DL framework. Each has been specifically

designed for DL usability and accessibility evaluations.

6.1 Questionnaire (See Appendix A)

Design decisions regarding the general scope and focus of the questionnaires had been made at an early

stage in the project by carrying out:

• Research into the aims and design of the application.

• Identifying key usability questions we wanted answered.

A range of information gathering tools were further utilised to evaluate what data we wished to retrieve

via the questionnaires, and how the questions could be best structured:

• Brainstorming sessions with the JISC team at City University.

• Analysis of the initial findings from walking through the application.

• Telephone and email correspondence with personnel from the four JISC services to uncover any

particular areas they were interested in gaining feedback from.

The guidelines that were referred to in section 3.7 of this report were also applied to the structure of the

questions, to ensure they followed standard design principles.

The questionnaire included the following types of questions to help ascertain a variety of information:

• Factual questions – to collect background information on end-users.

• Opinion and satisfaction statements – ask respondents to rate their views and attitudes as to whether they like something or not by way of a Likert scale.

• Closed questions – enable greater statistical analysis of data.


• Open questions – to elicit general and unanticipated information

The structure of all four questionnaires generally follows the standard design of the Questionnaire for

User Interaction Satisfaction as designed by The University of Maryland (Shneiderman, 1998). They start

with an explanation of the purpose of the questionnaire, followed by general factual questions asking the user for background information. This is useful for finding out the range of experience within the response group.

They then ask opinion questions relating to specific usability criteria and features of the application.

Eighteen opinion questions were included in all four surveys based upon an established usability

questionnaire, Computer System Usability Questionnaire by Lewis (1995). These general usability

questions were selected because our research has shown that they clearly uncover key usability issues

when applied to web-based services. The rest of the questions for each of the four services were devised by us, and correspond to the specific design and nature of each service.
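As an illustration of how Likert-scale responses to such opinion questions can be summarised (a sketch of our own; the question identifiers and values are invented), per-question means and medians can be computed as follows:

```python
from statistics import mean, median

# Hypothetical 1-7 Likert responses keyed by question identifier.
responses = {
    "Q1_overall_satisfaction": [6, 5, 7, 4, 6],
    "Q2_ease_of_finding_information": [3, 4, 2, 5, 3],
}

for question, scores in responses.items():
    print(f"{question}: mean={mean(scores):.1f}, median={median(scores)}, n={len(scores)}")
```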

Each questionnaire contained no more than 40 separate questions; to include many more would have

increased the possibility of:

1. Respondents being deterred from completing it due to the length and time involved.

2. Respondents switching off and replying at random after a while.

6.2 Focus Group (See Appendix B)

Two different focus groups were held - concurrent and retrospective. The facilitators of the group

ensured that all parties had a chance to express their opinions and that all the key points had been

addressed by the end of the session. The 12 questions for the concurrent focus group directly address key

areas of usability. Special emphasis was also given to the organisation of the content and whether the

service assisted with task based information-seeking behaviour.

The retrospective focus groups were conducted after the user testing evaluations. The focus group

questions were used to guide these sessions and were based upon the 10 impression questions that the

users were asked to rate after completing their user testing. The semi-informal structure of this


technique enabled the participants to develop their ideas as the session progressed, via the interaction

with the rest of the group. Hence the participants expressed their opinions on a diverse range of usability issues.

6.3 User Testing (See Appendix C)

Two different types of usability testing were conducted, formal and retrospective. It was important for

both techniques that the group of users who participated in the tests matched the target user group of JISC

services i.e.

• undergraduate/postgraduate students, librarians, researchers etc.

• experience of using web-sites to accomplish specific tasks

The test tasks were selected based upon common goals that users may wish to achieve throughout the

areas of the site under evaluation, and focused upon the organisational structure of the information

therein. When designing the test tasks, Nielsen’s (1995) guidelines for choosing test tasks were also taken into account.

A standard consent form acknowledging the participant’s co-operation to take part in the test and to be

videotaped for evaluation purposes was devised. Additionally, a script to be read to all participants before

the evaluation, outlining the nature of the test, was produced.

Formal and retrospective user testing evaluations involve the recording of end-users' direct interaction

with a system to collect both qualitative and quantitative data. However, the method by which this is

obtained differs; hence coding sheets were specifically designed for each technique.

Formal user testing

Participants who had indicated that they were willing to take part were asked to complete a short

questionnaire before being selected so we could ensure that we had a representative end user base. The

questions referred to their occupation, Internet experience and their familiarity with the application.


The coding sheet was designed for the evaluator to write all comments upon whilst observing the user.

For the purpose of analysis, it also enables the evaluator to specifically record whether the participant

successfully completed the task; the time taken to accomplish the task (or before the test was stopped);

the pathway that was followed; key errors that occurred; and the user’s responses to subjective satisfaction questions (confidence, satisfaction and frustration). The coding sheet was divided into the following four

phases:

1. Demographics – background information i.e. name, area of interest or study

2. Free exploration – 5 minutes for the participant to explore the application.

3. Tasks – five core tasks for the user to attempt

4. Debriefing and impressions – semi-structured interview questions
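As an illustration of the kind of quantitative record the coding sheet captures for each task, the following sketch uses hypothetical field names of our own (the sheet's exact wording may differ):

```python
from dataclasses import dataclass, field


@dataclass
class TaskRecord:
    """One row of an evaluator's coding sheet for a single test task (illustrative)."""
    task_id: int
    completed: bool
    time_seconds: float            # time to completion, or until the task was stopped
    pathway: list[str] = field(default_factory=list)  # pages/links the participant visited
    errors: list[str] = field(default_factory=list)   # key errors observed
    confidence: int = 0            # subjective ratings, e.g. on a 1-7 scale
    satisfaction: int = 0
    frustration: int = 0


record = TaskRecord(task_id=1, completed=True, time_seconds=145.0,
                    pathway=["home", "search", "results"], errors=[],
                    confidence=6, satisfaction=5, frustration=2)
print(record)
```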

The mandatory interview questions in phase 4 were designed to further elicit the participant’s views

regarding the application's usability. A funnel approach was adopted whereby general questions at the

beginning led to more specific ones later in the interview. The coding scripts allowed for additional

questions to be recorded as a result of the previous answer, or in response to an action or problem that the

participant encountered during the observational part of the user testing evaluation.

The formal user testing coding sheet was also modified for a selection of formal accessibility evaluations

that were conducted.

1. Demographics – background information i.e. name, area of interest or study

2. Internet skills and practice – background information regarding the user's experience with the

Internet

3. Free exploration – 5 minutes for the participant to explore the application.

4. Tasks – two core tasks for the user to attempt

5. Debriefing and impressions – semi-structured interview questions


The inclusion of phase 2 (Internet skills and practice) enables an evaluator to assess how experienced the participant is with the Internet, what sites they generally use and what problems they currently experience. The

interview questions were the same as for the formal user testing evaluations, with additional unstructured

questions added to address accessibility issues when required.

Retrospective user testing

As with the formal user testing evaluations, the coding sheet for the retrospective evaluations was divided

into four sections:

1. Demographics – background information including experience with the Internet and application

2. Free exploration – 5 minutes for the participant to explore the application.

3. Tasks – five core tasks for the user to attempt

4. Impressions – 10 questions requiring a subjective satisfaction rating of between one and seven in

relation to usability issues

The coding sheet was designed so that the participants themselves could write down any issues they uncovered and answer the subjective satisfaction questions at the end of each task and in phase 4.

The impression questions for phase 4 were derived from the key areas of design and relevant DL usability issues. As previously mentioned, task-oriented information-seeking behaviour and highly organised content are key dimensions of a DL; these questions were therefore designed with these factors in mind.

6.4 Heuristics Evaluation (See Appendix D)

In the heuristic evaluation, a set of web usability guidelines derived from the principles of Nielsen’s 10 heuristics was adopted to evaluate each individual web-based application.


The set of guidelines was divided into nine categories, each with specific prompts that relate directly to web-based DL evaluations, thus assisting the evaluator in identifying potential usability and accessibility problems during the heuristic evaluation.

The nine categories were:

• Navigation and Information Architecture

• Consistency and Standards

• User Control

• Readability and Ease of Learning

• Aesthetic, graphic design and branding

• Language

• Help and Documentation

• Error Prevention and Presentation

• Technology
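To illustrate how findings from such a category-based evaluation can be recorded and aggregated, the sketch below uses the nine categories listed above; the severity scale and field names are our own assumptions rather than part of the evaluation instrument:

```python
from collections import Counter
from dataclasses import dataclass

CATEGORIES = [
    "Navigation and Information Architecture",
    "Consistency and Standards",
    "User Control",
    "Readability and Ease of Learning",
    "Aesthetic, graphic design and branding",
    "Language",
    "Help and Documentation",
    "Error Prevention and Presentation",
    "Technology",
]


@dataclass
class HeuristicFinding:
    category: str
    description: str
    severity: int  # e.g. 0 = cosmetic ... 4 = usability catastrophe (Nielsen-style scale)


findings = [
    HeuristicFinding("Navigation and Information Architecture",
                     "No breadcrumb trail on deep pages", severity=2),
    HeuristicFinding("Consistency and Standards",
                     "Logo position changes between sections", severity=1),
]

# Count problems per category to see where redesign effort is most needed.
print(Counter(f.category for f in findings))
```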

6.5 Personas and Scenarios (See Appendix E)

Personas and scenarios were created simulating the actual scenarios of current and prospective users of

the JISC web-based applications. This was done by describing how specific individuals in specific circumstances would use the applications. The personas and scenarios made

assumptions about who the existing, potential or target users were and what kind of experience and

knowledge they had. Each profile consists of three parts: a persona (a profile of the user), a photo of the

user and a scenario that matches the specific needs of the user.

Personas

For each persona we gave the person a name and described their demographics: age, occupation, gender,

education, hobbies, disabilities and level of computer skills.
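A persona can be recorded as a simple structured profile using exactly these attributes; the sketch below is illustrative only and the example values are invented:

```python
from dataclasses import dataclass


@dataclass
class Persona:
    """Profile fields used for each persona (values below are illustrative only)."""
    name: str
    age: int
    occupation: str
    gender: str
    education: str
    hobbies: list[str]
    disabilities: list[str]
    computer_skills: str  # e.g. "novice", "intermediate", "expert"


example = Persona(name="Anna", age=24, occupation="Postgraduate student",
                  gender="female", education="BSc Sociology",
                  hobbies=["photography"], disabilities=[],
                  computer_skills="intermediate")
print(example)
```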


Scenarios (specific to the service being investigated)

Each scenario was created based on the user’s daily tasks: when he/she uses a computer and the purpose of using the computer and the Internet. Most importantly, each scenario identifies what he/she does with the JISC web-based application, why this is important, and how the application provided by JISC could help him/her accomplish tasks more effectively. This part of the scenario addressed the user’s motivations and the real-life problems that determined their priorities. The common scenarios created were focused upon task-oriented behaviour and required a high level of information organisation.

6.6 Cognitive Walkthrough (See Appendix F)

Before conducting the cognitive walkthrough, we took the scenarios and personas created into account.

The scenarios were adopted as the primary tasks, representing what most users would be doing when using the JISC web-based applications.

A set of templates was created based on the CE+ theory (Wharton et al, 1994), an information-processing model of human cognition that describes human-computer interaction in terms of four steps:

1) The user sets a goal to be accomplished with the system (for example, "check spelling of this

document").

2) The user searches the interface for currently available actions (menu items, buttons, command-line

inputs, etc.).

3) The user selects the action that seems likely to make progress toward the goal.

4) The user performs the selected action and evaluates the system's feedback for evidence that progress is

being made toward the current goal.
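As an illustration of the kind of template this yields, the sketch below records, for one user action, a judgement against each of the four steps above (the structure and field names are our own, not Wharton et al's template):

```python
from dataclasses import dataclass


@dataclass
class WalkthroughStep:
    """One action in a cognitive walkthrough, checked against the four CE+ steps."""
    action: str
    goal_clear: bool               # 1. will the user have the right goal at this point?
    action_visible: bool           # 2. is the required action available/visible in the interface?
    action_obviously_right: bool   # 3. will the user connect this action with their goal?
    feedback_adequate: bool        # 4. does the feedback show progress toward the goal?

    def problems(self) -> list[str]:
        checks = {
            "unclear goal": self.goal_clear,
            "action not visible": self.action_visible,
            "action not obviously correct": self.action_obviously_right,
            "inadequate feedback": self.feedback_adequate,
        }
        return [issue for issue, ok in checks.items() if not ok]


step = WalkthroughStep(action="Select 'Advanced search' from the home page",
                       goal_clear=True, action_visible=False,
                       action_obviously_right=True, feedback_adequate=True)
print(step.problems())  # ['action not visible']
```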

The templates were designed so that the usability experts could simulate the roles of each profile created in the personas and scenarios. The aim of the simulation was to stand in for actual users, making sure that the site actually serves the needs of specific people in real life.


7. Discussions and Conclusion

Our usability and accessibility framework for digital libraries concentrates on the main characteristics of digital libraries. This focus is based on the findings we gathered from our extensive research into current usability practices adopted by JISC, the requirements of the stakeholders and additional investigation in the area of digital libraries.

According to our findings, the main characteristics of a truly usable and accessible digital library are:

• Support task-based information seeking behaviour

• Highly organised information content

The framework was constructed in a way that specifically addressed these factors, and met the usability

and accessibility needs of digital libraries. This ensures that, by adopting the framework, truly usable and accessible digital libraries can be produced.

The evaluations of a selection of JISC services were conducted using the usability and accessibility

framework established for digital libraries. The methodologies adopted in this study were both analytic

and empirical. Query techniques were used to establish the requirements and key problems that users

currently experienced. User testing was conducted to further identify major usability and accessibility

issues with each of the JISC services. This was followed by analytic evaluations such as heuristic evaluations and cognitive walkthroughs, with usability and accessibility experts evaluating the JISC

services based on their expert knowledge. Each stage in the process was supported and supplemented

where needed by the one that followed in order to clarify the findings, identify further usability and

accessibility issues and produce an iterative process.

Digital libraries have made traditional services and resources more usable and accessible:

• Users are able to search and retrieve the information they require remotely in their own time frame.

• A more diverse variety of resources can be easily accessed

• Information can be continually updated


Digital libraries also have the opportunity to help develop and support users’ idea creation and information-seeking processes. These generally involve initial ideas that expand as the task develops.

DL’s can support this process by incorporating visualisation techniques to help users develop and widen

their search. The KartOO portal displays an example of how this technique has been deployed. The

system gathers the results, compiles them and represents them in a series of interactive maps through a

proprietary algorithm (KartOO, 2003). Figure 8 demonstrates how KartOO visually represents the

relationship between related resources.

Figure 8: KartOO, showing a search term and its associated links

Developments such as those used by KartOO can enable JISC’s digital libraries to further extend and support end-user information seeking behaviour, and our DL framework can help ensure that these developments are truly usable and accessible, along with the following guidelines.


7.1 Suggestions to practitioners

Practitioners could apply the usability and accessibility evaluation framework to the evaluation of digital libraries. For JISC practitioners, the framework could be adopted for the evaluation of the JISC Information Environment in order to assess usability and accessibility issues for JISC.

Furthermore, the section reviewing the literature on HCI design methods, techniques and guidelines provides some implementation suggestions for JISC, with additional resources for further information.

JISC practitioners could reference these methods, techniques and guidelines as a usability toolkit for

JISC.

In the meantime, the usability and accessibility guidelines established from the findings of the four JISC services evaluated could be adopted in the redesign of existing JISC services and resources as well as in the design of future JISC services. This would ensure that appropriate usability and accessibility practices are adopted across the JISC Information Environment, enabling usable and accessible services and resources.

7.2 Suggestions to researchers

Future research directions for JISC could include investigating how appropriate HCI design principles could best be applied within JISC services and resources, and how to address current developments in HCI design and synthesise these for use within the context of the JISC Information Environment. In particular, such research should cover the role of HCI design in the delivery of learning, teaching and research and, most importantly, further investigate how JISC could adopt these HCI design principles within its practices more formally in the future.

At the same time, an investigation of visualisation techniques for use by JISC services and resources

could be conducted, establishing sets of visualisation techniques and guidelines for JISC to enhance the

delivery of information for learning, teaching and research within the JISC Information Environment.


Furthermore, as JISC is aware, the Disability Discrimination Act now requires that services must be

accessible to all, irrespective of disability. Therefore, there is a need for in-depth investigation into the

accessibility of each of the JISC individual services and resources. An in-depth accessibility evaluation

should be carried out on each individual JISC service and resource aiming to assess and evaluate their

accessibility. Meanwhile, an investigation could be conducted on how accessibility guidelines could best

fit into the JISC services and resources. This would enable JISC to apply the best usability and

accessibility practices into their current and future services and resources, ensuring that the JISC

Information Environment complies with the legislation and is able to practice the ‘Design for All’

approach as promoted by the European Design for All e-accessibility network.

In terms of digital libraries, future research could focus on specific library services, such as geo-spatial information services and bibliographic database services. An in-depth investigation into these services would help to refine the framework and guidelines presented in this project and further cater for the needs of different digital library services.


7.3 Usability and Accessibility Guidelines

7.3.1 Key principles of Human-Centred Design (HCD)

The HCD approach is a complement to software development methods, not a replacement for them. The

key principles of HCD are as follows:

• The active involvement of users and clear understanding of user and task requirements

One of the key strengths of human-centred design is the active involvement of end-users who have

knowledge of the usage context in which the system will be used. Involving end-users can also enhance

the acceptance of and commitment to the new software, as people come to feel that the system is being

designed in consultation with them rather than being imposed on them.

• An appropriate allocation of function between user and system

It is important to determine which aspects of a job or task should be handled by people and which can be

handled by software and hardware. This division of labour should be based on an appreciation of human

capabilities, their limitations, and a thorough grasp of the particular demands of the task.

• Iteration of design solutions

Iterative software design entails receiving feedback from end-users following their use of early design

solutions. These may range from simple paper mock-ups of screen layouts to software prototypes with

greater fidelity. The users attempt to accomplish 'real world' tasks using the prototype. The feedback from

this exercise is used to develop the design further.

• Multi-disciplinary design teams

Human-centred system development is a collaborative process that benefits from the active involvement

of various parties, each of whom has insights and expertise to share. It is therefore important that the

development team is made up of experts with technical skills as well as stakeholders in the website. The

team might thus include:

• managers

• usability specialists


• end-users

• software engineers

• graphic designers

• writers

• editors

• interaction designers

• training and support staff

• task experts

7.3.2 General Presentation

1. Layout and navigation

• The organisation’s logo should be positioned in the same place on every page

• The design and page layout should be applied consistently

• The navigational elements should be displayed consistently

2. Registration

• Users should be able to get assistance when they have forgotten their password

• The reasons for and benefits of registering should be made evident to users

• Only necessary information should be required

3. Information Architecture

• The content of site areas should meet user expectations, e.g. navigation should lead to the expected

content

• Rarely needed information should not be continuously presented to the users, but should be easily

accessible via a link

• The website should not force users to follow a rigid structure

• It should be obvious where you are and where you can go next

4. Labelling and headings

• Link labels should reveal the content that they lead to

• The labels for site areas should be easily understandable

• The headings should be informative and concise and should accurately reflect the content


• Labels should be written consistently and acronyms should be avoided where possible

5. Images and Animation

• The images should add value to the information presented on the page

• Pictures should be well implemented and not distorted

7.3.3 Specific Usability Aspects

1. Search

• There should be a site-specific search engine

• Any search facility should be found easily and should not be hidden

• There should be a search functionality that allows the user to narrow down the search or the search

results according to categories

• The search should not force the users to search by predefined menus or categories, i.e. keyword

search in all categories should be made available

• The search should be error tolerant: slightly modified keywords should lead to the same or similar results. Modifications that should be accepted include words with or without a hyphen or space, and common abbreviations (see the sketch after this list)

• Search hints (e.g. of appropriate formats) should be provided within the search page or next to the search box, where they will be obvious to users
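The error-tolerance point above can be implemented in several ways. The following is a minimal sketch, not taken from any JISC service, of normalising hyphen, space and case variants (plus a small, hypothetical abbreviation table) before keywords are compared.

```python
import re

# Hypothetical abbreviation table; a real service would maintain its own list.
ABBREVIATIONS = {"geog": "geography", "bib": "bibliography"}

def normalise(term: str) -> str:
    """Collapse hyphens, spaces and case so that slightly modified keywords
    (e.g. 'on-line', 'on line', 'Online') map to a single canonical form."""
    collapsed = re.sub(r"[-\s]+", "", term.strip().lower())
    return ABBREVIATIONS.get(collapsed, collapsed)

def keywords_match(query: str, indexed_keyword: str) -> bool:
    """True if the query term should be treated as matching the indexed keyword."""
    return normalise(query) == normalise(indexed_keyword)

if __name__ == "__main__":
    print(keywords_match("on-line", "online"))   # True
    print(keywords_match("On line", "online"))   # True
    print(keywords_match("geog", "Geography"))   # True
```

In practice this sort of normalisation would sit inside a service's search indexing and query handling rather than being applied pair-wise as shown here.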

2. Navigation

• The navigation should not change, i.e. the top-level navigation items should still be available on a

lower level

• The meaning of the navigation elements should be clear

• Clickable elements should be distinguished clearly from non-clickable content

• Already-visited links should be visually distinguishable in lists (e.g. search results) or in content (e.g.

news articles)

• Each page should include a link to the home page

3. Forms Layout

• Forms should be structured in a reasonable way, i.e. information that belongs together should be

grouped together


• In forms it should be clear which information is required and which is optional

• There should be instructions or examples (e.g. of formats) to help users to fill out forms

• Instructional text for forms should be clear and helpful

4. Contrast and Scannability

• The main pages (Home and 1st level) should be uncluttered, easily scanned and not too dense with

content

• The font type, font size, line length and line spacing should allow for easy reading

• Important information should be highlighted appropriately, for example with visual cues such as different font sizes where necessary

• There should be a clear contrast between the background and the content (text, pictures, navigation)

of the page

5. Optimisation (size and print)

• The most important links should appear high enough on the page to be visible without scrolling

• The content should fit on a screen with a resolution of 800x600 without horizontal scrolling
• The content should fit a standard paper size (i.e. A4) when printed, so that none of it is cut off
• Print options, such as a printer-friendly version, should be available

6. Help

• Breadcrumb navigation (e.g. a path) should support orientation and answer the question ‘Where am I on this site?’
• If the site has functionality that goes beyond pure content presentation, help features or FAQs should be available

• The help function should be easily found from every page

• The help section should offer answers to each important question about using the site

• The help section should be sorted alphabetically or thematically.

• The help function in a content page should be a direct link to the answers/help features for that

specific section of the content rather than taking users to the general help menu

• Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution (see the sketch after this list)
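As an illustration of the last point, the snippet below shows one way an internal error code could be mapped to a plain-language problem statement plus a suggested remedy. The codes and wording are hypothetical examples, not drawn from any JISC service.

```python
# Hypothetical internal error codes mapped to (problem, suggestion) pairs.
FRIENDLY_MESSAGES = {
    "ERR_TIMEOUT": ("The search took too long to respond.",
                    "Please try again, or use fewer keywords."),
    "ERR_NO_RESULTS": ("No records matched your search.",
                       "Check the spelling, or try broader search terms."),
}

def render_error(code: str) -> str:
    """Return a user-facing message: state the problem, then suggest a solution."""
    problem, suggestion = FRIENDLY_MESSAGES.get(
        code, ("Something went wrong.", "Please try again later."))
    return f"{problem} {suggestion}"

print(render_error("ERR_NO_RESULTS"))
# No records matched your search. Check the spelling, or try broader search terms.
```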


7. Usage of Windows

• New browser windows or pop-up windows should not be used to show content of the site

8. Speed and Errors

• Overall the site should be stable, i.e. there should be no system errors

• Download times should be minimised, i.e. it should not take longer than 15 seconds on a 56k modem to download any page (a rough page-weight budget implied by this figure is sketched after this list)
• The first page of the search results should be displayed quickly (there should be no interruptions, such as server problems)
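The 15-second/56k figure implies a total page-weight budget of roughly 100 KB. The sketch below works that budget out under the simplifying assumption of the modem's nominal 56 kbit/s throughput, with no allowance for protocol overhead or latency, so the practical budget is somewhat smaller.

```python
MODEM_BITS_PER_SECOND = 56_000   # nominal 56k modem speed (assumption: no overhead)
TARGET_SECONDS = 15              # guideline: worst-case download time

# Budget in bytes: bits per second divided by 8, times the allowed seconds.
budget_bytes = MODEM_BITS_PER_SECOND / 8 * TARGET_SECONDS
print(f"Approximate page-weight budget: {budget_bytes / 1024:.0f} KB")  # ~103 KB

def within_budget(asset_sizes_bytes):
    """Check the combined size of a page's HTML, images and scripts against the budget."""
    return sum(asset_sizes_bytes) <= budget_bytes

print(within_budget([40_000, 30_000, 20_000]))  # True: ~88 KB in total
```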

7.3.4 Specific Accessibility Aspects

1. A “skip navigation” option should be included in the interface design. An appropriate method should be used so that users of assistive technology can skip repetitive navigation links and move directly to the page content

2. Provide alternative text for all images. An "alt" (alternative text) attribute and/or a "longdesc" (long description) attribute should be provided as a text equivalent for every non-text element, and equivalent alternatives should be provided for any multimedia presentation (a sketch of an automated check for this and related points follows this list)

3. As JISC users are mainly task-oriented and focused on information seeking, the information provided should be well structured and clearly categorised. Provide navigation schemes that show users where they are within the site’s hierarchy

4. The interface layout should minimise the number of links available to users. This would assist users

to navigate and browse the site more efficiently and effectively with their assistive technologies.

5. Ensure users have control over the web page at all times. A ‘Home’ link should be provided on every page of the site and should be easy for users to identify.

6. Provide each frame with a title. Titling frames with text facilitates frame identification and navigation

7. Data tables should identify row and column headers. Ensure the title of each row and column header is clear, i.e. directly related to the content of that specific row or column

8. Documents should be readable without requiring an associated style sheet.

9. Use descriptive, clear text links and avoid vague references such as "click", "here" and "more"

10. Avoid scrolling/moving text and the use of ‘…’, ‘+’ symbols in the content.


11. Information required for navigation or meaning should not depend on the ability to identify specific colours. Background colours should be avoided where the colour scheme would create problems with legibility

12. Pages should be usable when scripts, applets, or other programmatic objects are turned off or not

supported, or should provide equivalent information on an alternative accessible page

13. Multiple-browser testing should be conducted on the current versions of popular browsers such as Netscape Navigator, Internet Explorer, Opera and Lynx
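Several of the points above (notably 2, 6 and 7) can be partially checked by machine. The following is a minimal sketch of such a check, assuming the BeautifulSoup library is available; it supplements, rather than replaces, manual testing with assistive technology such as a screen reader.

```python
from bs4 import BeautifulSoup

def audit(html):
    """Flag images without alt text, untitled frames and data tables without header cells."""
    soup = BeautifulSoup(html, "html.parser")
    problems = []
    for img in soup.find_all("img"):
        if not img.get("alt"):                    # point 2: missing or empty alt text
            problems.append(f"Image without alt text: {img.get('src')}")
    for frame in soup.find_all(["frame", "iframe"]):
        if not frame.get("title"):                # point 6: every frame should have a title
            problems.append(f"Frame without a title: {frame.get('src')}")
    for table in soup.find_all("table"):
        if not table.find("th"):                  # point 7: row/column headers
            problems.append("Data table without <th> header cells")
    return problems

if __name__ == "__main__":
    print(audit('<img src="map.gif"><table><tr><td>1861</td></tr></table>'))
```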



Appendix A: Sample Questionnaire

Below is the set of questions that was used to compile the questionnaires for testing the usability of the four selected services. Items 5 to 40 are rated on a five-point scale (1 = Strongly disagree, 5 = Strongly agree), each with an N/A option. (A sketch of how responses to these rated items might be summarised follows the questionnaire.)

1. What is your current occupation?
   (Undergraduate student / Postgraduate student / Researcher / Faculty member / Other)
2. How often do you use the Internet?
   (Several times a day / Once a day / Several times a week / Once a week / Less than above)
3. What services do you commonly use?
   <Service-dependent list>
4. How often do you use this application?
   (Several times a week / Once a week / Several times a month / Once a month / Less than above)
5. Overall, I am satisfied with this application


6. It was simple to use this application
7. I find the information retrieved by this application is very useful
8. I can effectively complete my work using this application
9. I am able to complete my work quickly using this application
10. I feel comfortable using this application


11. It was easy to learn to use this application
12. It was easy to remember how to use this application
13. I believe I became productive quickly using this application
14. Whenever I make a mistake using the application, I recover easily and quickly


15. The support information (such as online help, on-screen messages, and other documentation) provided with this application is clear
16. It is easy to find the information I require
17. The terminology used is clear
18. The instructions are easy to understand
19. The instructions are effective in helping me complete tasks


20. The organisation of text/menu options on the screens is clear
21. The layout of retrieved data is clear
22. The application clearly notifies me of what stage in the process I am at
23. The interface of this application is pleasant
24. I like using the interface of this application


25. The application's navigational options are consistent
26. The application provides the versatility I require
27. This application has all the functions and capabilities I expect it to have
28. The main menu is effective in guiding me to specific areas


29. I found the search function helpful in finding the information I was looking for
30. There were clear directions to guide me through the content to get the information I was looking for
31. Search results display too much information
32. The search function is easy to use
33. I find it easy to return to the previous page within the site


34. I find it easy to return to the application’s homepage
35. I have to keep a mental note of where I am within the site
36. The main menu clearly identifies the services available
37. I feel lost when using the site
38. The information displayed matched my expectations


39. Adequate help is provided to assist me in performing a search
40. I will visit the site again
41. List the most NEGATIVE aspects of the application:
42. List the most POSITIVE aspects of the application:
43. Are there any difficulties that you encountered when using the application?
44. How could the application be improved?
45. If you wish to be entered into the FREE DRAW, please provide your email address
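When analysing responses to the rated items above, it is common to reverse-score negatively worded statements so that a higher score always means a better result. The following is a minimal sketch of that summarisation; the choice of items 31, 35 and 37 as negatively worded is our assumption based on their wording, not something stated in the questionnaire itself.

```python
REVERSE_SCORED = {31, 35, 37}   # assumed negatively worded items (see note above)
SCALE_MAX = 5

def item_means(responses):
    """responses: list of dicts mapping item number -> rating 1-5, or None for N/A."""
    totals, counts = {}, {}
    for response in responses:
        for item, rating in response.items():
            if rating is None:                 # skip N/A answers
                continue
            if item in REVERSE_SCORED:         # flip so that higher always = better
                rating = SCALE_MAX + 1 - rating
            totals[item] = totals.get(item, 0) + rating
            counts[item] = counts.get(item, 0) + 1
    return {item: totals[item] / counts[item] for item in totals}

# Two hypothetical respondents answering items 5, 31 and 35.
print(item_means([{5: 4, 31: 2, 35: None}, {5: 5, 31: 1, 35: 3}]))
# {5: 4.5, 31: 4.5, 35: 3.0}
```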


Appendix B: Focus Group Question Guidelines

A – Concurrent focus group

1. Who is the service aimed at?
2. How does Service X compare to other similar services regarding:
   - usability
   - look and feel
   - navigation
   - versatility
3. Is the speed at which the service retrieves and delivers information adequate?
4. Does the service inform users as to where they are on the site?
5. Does the service inform users as to what is happening?
6. Is the information well organised?
7. Is the terminology used acceptable?
8. Is there consistency in operation and design?
9. Does the system enable tasks to be accomplished effectively and efficiently?
10. Key positive issues
11. Key negative issues
12. Is the information architecture logical?

B – Retrospective focus group

1. All titles and headings are clear
2. The appearance of the site is aesthetically pleasing
3. Text is easy to read
4. The site has clear and consistent navigational options
5. The Help facility is very useful
6. The objective of the site is clear
7. The site delivers a high quality service
8. Organisation of content is logical
9. Remembering how to complete a task is easy
10. It was easy to learn how to use this site


Appendix C: User Testing Coding Form

PARTICIPANT NUMBER:

NAME:

SERVICE:

OBSERVATION SHEET

VERSION: 0.1


Phase 1: Demographics (2 minutes)

Name:

Gender: Male Female

What is your current position?

____Undergraduate student

____Postgraduate student

____Researcher

____Faculty member

____Other

What is your area of interest for studies / research?

________________________________________________________

How often do you use the Internet?

____Several times a day

____Once a day

____Several times a week

____Once a week

____Less than above


How familiar are you with this service?

____1 Not at all familiar

____2

____3

____4

____5 Very Familiar


Phase 2: Free Exploration (5 minutes)

Please spend a few minutes exploring the application. It would be good if you could note below, and on the following sheet, any positive aspects of the system as well as any negative issues.

On completion of the task please complete the corresponding questions on the next page

Comments.


Questions

Please rate each of the following questions on a scale of 1 to 7; you can choose one, seven, or any number in between.

Exploration Task

1. How confident are you that you have understood what this service provides?
   (1 = Not confident ... 7 = Very confident)
2. Did you feel disorientated (knowing where you are) whilst performing this task?
   (1 = Very disorientated ... 7 = Not disorientated)
3. Was doing this task satisfying?
   (1 = Very unsatisfying ... 7 = Very satisfying)
4. Rate your ease of navigation whilst performing this task.
   (1 = Very easy ... 7 = Not easy)

[end of this task]


Phase 3: Tasks

Task 1: <A number of tasks are developed specific to the application being tested. These

ask the user to carry out a specific activity and then report on this>

As before, when performing the tasks could you note below, and on the following sheet, any positive aspects of the system as well as any negative issues.

On completion of each task please complete the corresponding questions (see next page)

Comments.


Questions

Please rate each of the following questions on a scale of 1 to 7; you can choose one, seven, or any number in between.

Task 1

1. How confident are you that you managed to save your record via email?
   (1 = Not confident ... 7 = Very confident)
2. Did you feel disorientated (knowing where you are) whilst performing this task?
   (1 = Very disorientated ... 7 = Not disorientated)
3. Was doing this task satisfying?
   (1 = Very unsatisfying ... 7 = Very satisfying)
4. Rate your ease of navigation whilst performing this task.
   (1 = Very easy ... 7 = Not easy)

[end of this task]


Phase 4: Impressions

As before, could you rate the following statements on a scale of 1 to 7 (1 = Strongly disagree ... 7 = Strongly agree); you can choose one, seven, or any number in between. The questions in this section relate to the site as a whole.

1. All titles and headings are clear
2. The appearance of the site is aesthetically pleasing
3. Text is easy to read
4. The site has clear and consistent navigational options
5. The Help facility is very useful
6. The objective of the site is clear
7. The site delivers a high quality service
8. Organisation of content is logical


9. Remembering how to complete a task is easy
10. It was easy to learn how to use this site


Appendix D: Checklist for Heuristic Evaluations

Overall – General usability evaluation for each site

Navigation and information architecture
The site should have navigation that makes it easy to understand how to reach the different areas of the site. The organisation of the information should be intuitive, and information should be easy to find.
• Is it clear where you are in the structure of the site?
• Are the different sections of the site clear, and can you understand what content or features to expect?
• Does each page have a clear heading to locate where you are?
• How deep is the link structure?
• Do all pages provide relevant content?
• Does the browser’s Back button allow the user to return to the last screen?

Consistency and standards
Users should not have to wonder whether different words, situations or actions mean the same thing. The site should follow usability standards.
• Are the different information categories consistently organised?
• Is the organisation of the content consistent?
• Is the navigation easy to use and consistent?

User control
The site should always keep users informed about where they are in the information architecture, through appropriate feedback within a reasonable time. Users often choose site functions by mistake and need a clearly marked “emergency exit” to leave the unwanted state.
• Can the user exit in the middle of a workflow?
• Can the user change their mind in the middle of a workflow?

Readability and ease of learning
The user should not have to remember information from one part of the dialogue to another. Instructions for use of the site should be visible or easily retrievable whenever appropriate.
• Are the pages uncluttered and easily scanned?
• Are there areas or sections that are more difficult to learn than others?
• How demanding is it to learn how to use the site?


• Is there relevant supportive content where needed?

Aesthetic, graphic design and branding
Dialogues should not contain information that is irrelevant or rarely needed. Every extra unit of information in a dialogue competes with the relevant units of information and diminishes their relative visibility.
• Is the graphical design functional?
• Do all the design elements support a task?
• Does the graphic design support the brand of the organisation?
• Is the online branding coherent with offline perception and branding?
• Is the use of icons and other visuals easy to understand, and does it support the task?
• Is the visual design appealing?
• Is the design consistent in the way actions are treated?
• Is consistent branding applied across the whole site?

Language
The system should speak the user’s language, with words, phrases and concepts familiar to the user, rather than system-oriented terms. The site should follow real-world conventions, making information appear in a natural and logical order.
• Is the use of labelling consistent?
• Does the text seem to use language that is familiar to the target audience?
• Is the content timely and credible?

Help and documentation
Even though it is better if the site can be used without documentation, it may be necessary to provide help and documentation. Any such information should be easy to search, focused on the user’s task, list concrete steps to be carried out, and not be too lengthy.
• Are help texts and instructions used efficiently on the site?
• How does the search feature work?
• Is the search result page presented clearly?
• Are the search results consistent with the search parameters?
• Is there a relevant rating system for the results?
• Is there a site map, and does it make it easy for the user to understand where he or she is?

Error prevention and presentation
Even better than good error messages is a careful design that prevents a problem from occurring in the first place. Error messages should be expressed in plain language (no codes), precisely indicate the problem, and constructively suggest a solution.


• How is the user informed about what error they have made and how to correct it?

Technology
Technology is not something the user should have to deal with or understand. The site needs to perform according to expectations and be stable.
• Is the download time fast overall on the site?
• Are there dead links, broken scripts or functionless forms? (A sketch of a simple dead-link check follows this checklist.)
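The technology checks are the easiest part of this checklist to automate. The following is a minimal sketch of a dead-link check, assuming the requests and BeautifulSoup libraries are available; broken scripts and functionless forms still need manual inspection, and the example URL is hypothetical.

```python
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

def find_dead_links(page_url, timeout=10.0):
    """Fetch a page, follow each hyperlink with a HEAD request and report failures."""
    page = requests.get(page_url, timeout=timeout)
    soup = BeautifulSoup(page.text, "html.parser")
    dead = []
    for anchor in soup.find_all("a", href=True):
        link = urljoin(page_url, anchor["href"])      # resolve relative links
        if not link.startswith("http"):
            continue                                  # skip mailto:, javascript:, anchors
        try:
            status = requests.head(link, allow_redirects=True, timeout=timeout).status_code
        except requests.RequestException:
            status = None
        if status is None or status >= 400:
            dead.append(link)
    return dead

# Example (hypothetical URL):
# print(find_dead_links("http://www.example.ac.uk/"))
```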


Appendix E: Personas for cognitive walkthrough evaluation


Persona 1

Helen Bennett. BA student in History at Cambridge University

Helen is 22 and a third year undergraduate student. She still lives with her parents and has followed the standard educational path.

Helen uses the Internet around three times a week for email and browsing. She is a relatively inexperienced computer user who only

uses Microsoft Office programmes, and has no experience of using GIS software. Her only interaction with digital maps has come via

using MSN Maps to find her way around Cambridge, which she finds quite simple to use.


For her dissertation, entitled “The changing nature of the hopping industry in Kent”, Helen wishes to retrieve a map of Staplehurst, Kent, and the surrounding farmland. She wishes to include this map later in her Word document to support her research. Helen therefore requires quite a large-scale map of the area that displays the farmland around the small town.


Persona 2

Suzanne Porter. BA student at the Royal College of Art

Twenty-one-year-old Suzanne is in the second year of her course and is interested in cubism. She is a frequent user of the Internet and of graphics software packages, and counts herself an expert in Photoshop. However, she has very little patience when using new applications and refuses to read long textual instructions. She expects a programme to deliver what she wants immediately, or she will go elsewhere.

Suzanne has been asked to find out how the university can deposit her class’s project on the site.


Persona 3

Monica Honkins. Lecturer in Business Computing, Surrey University

Monica recently completed her thesis and obtained a PhD in computer science. Her research interests are in business process re-engineering and the systems engineering process. She is 28 years old and now works at Surrey University as a lecturer. She is teaching a first-year undergraduate module introducing business computing and would like to find out more about how to teach students effectively.

She is an expert in computing, is familiar with most programming languages and software applications, and uses the Internet regularly in her research.

Monica’s goals:

- Research the literature in the area of business computing
- Research relevant course materials that would assist in her teaching
- Find out how to teach students effectively


Persona 4

Karina Goodhead

Recent Graduate, University of Surrey, UK

Karina is a 26-year-old recent graduate. She is visually impaired and has just completed her postgraduate degree in IT at the University of Surrey. She likes judo and often takes part in national competitions for disabled athletes. Her main interest is in computing, and she uses the Internet a great deal to browse for information and to shop. She uses both a Braille display and a screen reader, but prefers the screen reader when browsing the Internet, and she is an expert user of the JAWS for Windows screen reader.


Karina’s goals:

- Research journals and relevant articles in the area of computing
- Keep up to date with the latest developments in computing and in sport (judo)
- Shop on the Internet


Appendix F: Template for Cognitive Walkthrough

For each step in the action sequence, record an answer to each of the following questions (a sketch of a simple record structure for these answers follows):

• Will the user be trying to achieve the right effect?
• Will the user know that the correct action is available?
• Will the user perceive that the correct action will achieve the desired effect?
• Is appropriate feedback provided?
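As a convenience for recording walkthrough results, the sketch below shows one possible data structure whose fields simply mirror the template’s columns; the example action steps are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class WalkthroughStep:
    action: str             # one step in the action sequence
    right_effect: bool      # will the user be trying to achieve the right effect?
    action_available: bool  # will the user know that the correct action is available?
    effect_perceived: bool  # will the user perceive that the action achieves the effect?
    feedback_ok: bool       # is appropriate feedback provided?
    notes: str = ""

walkthrough = [
    WalkthroughStep("Select 'Search' from the main menu", True, True, True, True),
    WalkthroughStep("Enter keywords and submit", True, True, False, True,
                    notes="Keyword box not clearly labelled"),
]

# Steps where any of the four questions was answered 'no' are candidate usability problems.
failures = [step.action for step in walkthrough
            if not (step.right_effect and step.action_available
                    and step.effect_perceived and step.feedback_ok)]
print(failures)   # ['Enter keywords and submit']
```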