HONOURS PROJECT LITERATURE SYNTHESIS
Improving student engagement with educational
material
Deon Takpuie
Supervised by: Professor Sonia Berman
DEPARTMENT OF COMPUTER SCIENCE
UNIVERSITY OF CAPE TOWN
2012
NRF FUNDED RESEARCH
The financial assistance of the National Research Foundation (NRF) towards this research is hereby
acknowledged. Opinions expressed and conclusions arrived at, are those of the author and are not
necessarily to be attributed to the NRF.
Category Min Max Chosen
1 Requirement Analysis and Design 0 20 20
2 Theoretical Analysis 0 25 0
3 Experiment Design and Execution 0 20 15
4 System Development and Implementation 0 15 5
5 Results, Findings and Conclusion 10 20 10
6 Aim Formulation and Background Work 10 15 10
7 Quality of Report Writing and Presentation 10 10
8 Adherence to Project Proposal and Quality of Deliverables 10 10
9 Overall General Project Evaluation 0 10 0
Total marks 80
Abstract
The Vula wiki tool is under-utilized in the Computer Science department at UCT, and in some other departments it has been replaced by alternative wiki tools that are easier to use. Since the wiki can be a valuable educational tool, it was decided that gamification should be used to increase the usability of the Vula wiki on mobile phones. This led to the development of a system for Computer Science undergraduate students using an iterative user-centered design approach, consisting of a design, implementation and evaluation of a prototype in each stage. Two low-fidelity and then two high-fidelity prototypes were developed, each incorporating user feedback from the previous iteration. At the same time, the gamification rules, which are influenced largely by the GameFlow criteria for player enjoyment in games, were refined continually.
Thereafter, the final system was completed and assessed with a summative evaluation using the experimental process (on just 10 users). Overall, there was no significant improvement in the usability of the Vula wiki through gamification, although the gamified wiki was statistically significantly faster for one of the five tasks and produced statistically significantly fewer errors for one of the five tasks.
However, this is the first time gamification has been introduced to the Vula wiki, and it was met with positive user feedback (although honesty is not assured).
We believe this project to be a first step in tackling an important problem, because this research suggests that few students currently use the Vula wiki. Future extensions include improvement of the game rules and of the general wiki functionality of the system.
Acknowledgements
Firstly, I would like to thank my main supervisor, Professor Sonia Berman, who helped every step of the
way and provided valuable feedback on my writing. Also, I am grateful for the idea of making the wiki
more usable for students.
I would also like to thank the co-supervisor, Mr. Stephen Marquard, for always being willing to answer
questions about the Vula system, of which he is the coordinator.
I would also like to thank my project partner, Reitumetse Chaka, who suggested the idea of gamification,
which is essential to our project solution. Furthermore, I hereby acknowledge his wonderful work ethic
and timeliness in meeting deliverables.
I would also like to thank Samsung for sponsoring me with a mobile phone for the duration of the project.
I hereby acknowledge all the students who participated in user testing for this project; I am most grateful
for the time (and often extra time) that you gave me.
I would also like to thank my mother for her endless sacrifices and support, as well as my late father for
instilling a passion for technology in me. Likewise, I also acknowledge my high school Computer
Science teacher, Mr. Dale Mckenzie, for introducing me to programming and for his constant encouragement.
I would also like to thank my family and friends for their support.
Last but not least, I thank Jesus Christ, my Lord and Saviour, as I could not have done this without Him.
He has given me strength, passion and much needed perseverance to finish this project.
Table of Contents
Abstract ................................................................................................................................................... 1
List of Figures ......................................................................................................................................... 8
List of Tables .......................................................................................................................................... 8
11.7 Image Icons Used ................................................................................................................. 108
11.8 Statistical Data ..................................................................................................................... 110
List of Figures
Figure 1: System Overview ................................................................................................................... 13
Figure 2: The Gamification Loop (Liu et al. 2011) .................................................................................. 16
Figure 3: EcoLand Picture (Shiraishi et al. 2009) .................................................................................... 17
Table 7: Achievements List ................................................................................................................... 62
Table 8: Avatar list ................................................................................................................................ 62
Table 9: Badges list ............................................................................................................................... 63
Table 10: Dependent and independent variables ..................................................................................... 75
Table 11: Final Evaluation Grid ........................................................................................................... 107
Table 12: Image Icons Used ................................................................................................................ 108
Table 13: Average Clicks Stats ............................................................................................................ 110
Table 14: Average Task Completion Time ........................................................................................... 111
Table 15: Average Errors per Task ...................................................................................................... 112
1 Introduction
Wikipedia (Wikipedia 2012b), a free encyclopedia, is increasingly being used to engage students academically, as can be seen from the 36% of students in America who used it in 2007 (Rainie & Tancer 2007). This project attempts to use gamification to make the Vula wiki more usable on mobile phones. Vula is the University of Cape Town (UCT) derivative of Sakai (S. Foundation 2012), an open-source learning management system. Gamification is the use of game techniques in non-game contexts (like a wiki, in this case). Currently, the Vula wiki tool is under-utilized in the Computer Science department at UCT, and in some other departments it has been replaced by alternative wiki tools that are easier to use (Berman 2012).
Therefore, the objective is to determine whether gamification improves the usability of the Vula wiki on mobile phones, mainly in terms of the key usability components of speed and accuracy (Nigel Bevan 1995). The primary contribution of the research is to make the Vula wiki easier for students to use (on mobile), although the addition of gamification to the wiki has also never been done before. Before moving on to the research problem, it is necessary to explain the key concepts used: gamification and mobile usability.
Gamification
Gamification originates in human-computer interaction techniques dating back to the 1980s (Malone 1982), with experiments on how the addition of elements inherited from game design can enhance the interest of an audience. The effects of gamification on university students have been investigated in a study (Fitz-walter et al. 2011), which showed an improvement in interactions with the system's information. There is also a study done at UCT that attempted to determine which gamification techniques students find most enjoyable (Donovan 2012).
An example of gamification in the education environment is the rewarding of students with incentives (like bonus marks) to encourage participation (Bustard et al. 2007). In the context of the Vula system, students can be rewarded for their participation in the wiki tools. The wiki has been chosen because all user reads and writes are logged. The wiki tools will not immediately replace the current ones but will be part of a new Vula course site that will test the implementation on a group of students. Furthermore, since the wiki is considered on the mobile platform only, it is important to understand usability on mobile devices.
Mobile Usability
Usability, in the broader sense, is the ease of use of an application (N Bevan & Azuma 1997). Software designed for mobile phones is quite different from software designed for desktops, in that mobile phones have limited memory (Jong et al. 2008), limited screen size, limited page types and more (Jones & Marsden 2005, p.251). Therefore, careful consideration needs to be given to mobile usability whilst solving the pressing issue of this project, discussed next.
1.1 Problem Statement
The Vula wiki tool is under-utilized in the Computer Science department at UCT, and in some other departments it has been replaced by alternative wiki tools that are easier to use (Berman 2012). Since it is potentially a valuable educational tool, and its short-snippet engagement style is particularly conducive to mobile interaction, this project investigates the problem of incentivizing mobile usage of the Vula wiki by means of gamification. This is particularly important given the poor throughput rates in tertiary education (McMillan 2007) and the encouraging successes of gamified learning in other contexts, such as the improved pass rates (in first- and second-year Computer Science courses) at the University of Ulster via a gamified education tool (Bustard et al. 2011). However, since the front-end of the project involves software engineering, it is also important to state the stakeholders and their needs before moving to the research questions:
Clients
The clients are Professor Sonia Berman, Computer Science HOD (head of department) at UCT, and Mr. Stephen Marquard, coordinator of Vula at UCT. Their interest is to see the wiki made easier for students to use on a mobile phone.
Users
The users are University of Cape Town (UCT) Computer Science (undergraduate) students, and their objective is to enjoy using the wiki tool.
1.2 Research Questions
In order to explore the above problem, this research investigates whether a gamified Vula wiki on mobile is more usable than the current Vula wiki on mobile. Given that usability can be thought of as the ease of use of an application (N Bevan & Azuma 1997), the research question ties in with the project aim of making the Vula wiki easier to use on mobile phones. Usability in this context is measured by effectiveness/"errors" (average errors per task) and efficiency/"speed" (average clicks/time to complete a task), and to a lesser extent by user satisfaction (Nigel Bevan 1995). Therefore, the research question can be broken down into three separate hypotheses (where mobile abbreviates mobile phone):
Hypotheses
1. A gamified Vula wiki on mobile is more usable than the current Vula wiki on mobile.
1.1 For all tasks, a gamified Vula wiki on mobile requires fewer key presses to navigate than the current Vula wiki on mobile.
1.2 For all tasks, a gamified Vula wiki on mobile requires less time to navigate than the current Vula wiki on mobile.
1.3 For all tasks, a gamified Vula wiki on mobile is less error prone than the current Vula wiki on mobile.
Of these, the independent variable is system choice (either the Vula or the gamified mobile wiki) and the dependent variables are: number of clicks (measured relatively as screen transitions), task completion time (measured absolutely in seconds) and number of errors (in completing tasks). The key success factors for the experiment are thus to improve the Vula wiki in terms of number of clicks, task completion time and errors, as well as to critically analyze why such an improvement was or was not possible.
Therefore, an experiment (which is described next) is designed to evaluate this.
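As a concrete sketch of how hypotheses such as 1.2 could be tested, the paired t statistic for task-completion times can be computed as follows (in JavaScript, the project's front-end language). The participant times below are invented for illustration only; a real analysis would compare the computed t against a t distribution with n−1 degrees of freedom.

```javascript
// Hypothetical task-completion times in seconds for the same 10 participants
// on both systems; these figures are invented for illustration only.
const vulaTimes     = [42, 55, 38, 61, 47, 50, 44, 58, 49, 53];
const gamifiedTimes = [35, 48, 40, 52, 41, 46, 39, 50, 42, 47];

// Paired t statistic: t = mean(d) / (sd(d) / sqrt(n)), where d_i is the
// per-participant difference between the two systems on the same task.
function pairedT(a, b) {
  const n = a.length;
  const d = a.map((x, i) => x - b[i]);
  const mean = d.reduce((s, x) => s + x, 0) / n;
  const variance = d.reduce((s, x) => s + (x - mean) ** 2, 0) / (n - 1);
  return { t: mean / Math.sqrt(variance / n), df: n - 1, meanDiff: mean };
}

const result = pairedT(vulaTimes, gamifiedTimes);
console.log(result); // a positive t here would favour the gamified wiki
```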
1.3 Research Design
The research questions are evaluated in a summative evaluation (Nielsen 1993) using experimental evaluation (Hezel Associates 2010), but first the participants need to be defined. The participants will be Computer Science undergraduate students, selected randomly. In the final evaluation, half the participants will be novices (users who have used neither the Vula nor the gamified wiki before) and the other half experts (those who have used both wikis before). The instrumentation (or data collection tools) will be an evaluation grid/coding sheet (Jones & Marsden 2005, p.202), which will record the number of clicks, the task completion time and the number of errors for every task. Furthermore, a post-experiment interview will be used to extract users' subjective opinions of the systems (Jones & Marsden 2005, p.240). In terms of evaluation procedure, the participants will test both systems for approximately 30 minutes (whilst data is noted on the evaluation grid) and thereafter complete a brief 10-minute post-experiment interview.
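As a sketch of the coding-sheet data just described, each task's measurements could be recorded and averaged as follows. The field names and figures are assumptions for illustration, not the project's actual grid layout.

```javascript
// One row per task: clicks, completion time and errors, as on the coding sheet.
// The values are invented for illustration.
const codingSheet = [
  { task: 1, clicks: 5, timeSec: 42, errors: 1 },
  { task: 2, clicks: 3, timeSec: 30, errors: 0 },
  { task: 3, clicks: 7, timeSec: 55, errors: 2 }
];

// Average of one measured field across all tasks, as reported in the results.
function average(rows, field) {
  return rows.reduce((s, r) => s + r[field], 0) / rows.length;
}

console.log(average(codingSheet, 'clicks'));  // 5
console.log(average(codingSheet, 'errors'));  // 1
```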
1.4 Theoretical Framework
A user-centered design framework (Marsden et al. 2008) is used to develop prototypes iteratively, applying appropriate usability evaluations at each stage, including conceptual model extractions, heuristic evaluations, formative evaluations and a summative evaluation (for the final system). Also, the GameFlow criteria for player enjoyment in games (Sweetser & Wyeth 2005) have been applied to develop the gamification rules.
1.5 System Overview
In order to evaluate the research questions, a number of elements had to be designed; these are illustrated, with their allocation to the relevant group members, in figure 1. Essentially, the system was developed using a two-tier architecture consisting of a front-end (the focus of this research) and a back-end. This architecture was used for simplicity and allowed modular development. To elaborate, the front-end interacts with the back-end at a software/application level, whereas the game rules alter it at a user interface level (such as the inclusion of badges on a page). Likewise, the back-end interacts with Sakai (for authentication and the like) whilst providing the implementation of the game rules. In terms of languages, the front-end was developed using the jQuery Mobile framework (the jQuery Foundation 2012c) for cross-platform development, using HTML, Ajax and JavaScript (with jQuery) to interact with the back-end. In addition, only one device, the Samsung Galaxy Ace (sponsored by the company), was used for testing, but more smartphones can be used in the future, as jQuery Mobile is cross-platform (Godwin-Jones 2011), although from experience minor user interface tweaking is required for some platforms (like BlackBerry). Given these system components, the research questions can be tested via experimental evaluation.
Figure 1: System Overview
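As a minimal illustration of the front-end/back-end interaction in figure 1, the snippet below builds a hypothetical game-event payload and shows how it might be posted with jQuery's Ajax API. The endpoint path and payload fields are assumptions for illustration, not the project's actual interface.

```javascript
// Build the payload for a game event the front-end would report to the
// back-end. The field names here are assumptions, not the real interface.
function buildGameEvent(userId, action, pageId) {
  return {
    user: userId,    // Vula user performing the action
    action: action,  // e.g. "like_page" or "edit_section"
    page: pageId,    // wiki page the action applies to
    at: Date.now()   // client timestamp in milliseconds
  };
}

// With jQuery loaded, the event could be posted via Ajax like this
// (the "/gamewiki/event" endpoint is hypothetical):
function sendGameEvent(event) {
  $.post('/gamewiki/event', event, function (reply) {
    // the reply might carry newly earned rewards for the inbox icon
    console.log('points now:', reply.points);
  });
}

console.log(buildGameEvent('dtakpuie', 'like_page', 'wiki-home'));
```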
1.6 Ethical Issues
Firstly, the appropriate ethical clearance from the UCT Science Faculty ethics committee, and access to UCT students (from the UCT student affairs department), was obtained, as this was necessary to conduct experiments. Furthermore, participants signed a consent form, were treated with respect, were paid R30 and were thanked for their participation.
In terms of professional and intellectual property issues regarding the software development, a number of factors had to be taken into account. Firstly, commercial development environments such as Dreamweaver (Adobe 2012) could not be used due to cost, so open-source alternatives such as Notepad++ (Ho 2012) were used. Also, the jQuery Mobile platform (the jQuery Foundation 2012c) used to create the user interface posed no legal constraints, as it is created by the jQuery Foundation (the jQuery Foundation 2012b), a non-profit trade association. Care had to be taken to use only free images online and to download images from their original source, not just from Google Images (Google 2012), for instance. Furthermore, content in pages can be liked using an open-source thumbs-up icon (Clker.com 2012), as the familiar Facebook (Facebook 2012) like button is copyrighted.
1.7 Outline
Chapter 2 examines the foundation literature needed for the project, from the areas of gamification and mobile usability. Chapter 3 provides an overview of the prototyping iterations of the project as well as their evaluations. Chapters 4 and 5 expound on these various low- and high-fidelity prototypes as well as their evaluations. Chapter 6 discusses the game rules of the system and how they were derived. Chapter 7 discusses the final implementation, in terms of its improvements over the final prototype. Chapter 8 evaluates the final system with a summative evaluation (through the experimental process) and analyzes the findings. Lastly, the final chapter concludes on these findings based on the hypotheses and also draws out future extensions to the project.
2 Previous Work
2.1 Introduction
There is much literature on making students engage more with educational material (Fischer & Troendle 2003). In particular, this paper focuses on recent work using online learning tools to increase student engagement with academic material, with the intention of improving pass rates. The research conducted pointed to a pioneering field, "gamification", as being helpful in increasing student engagement with educational material (Fitz-walter, Tjondronegoro & Wyeth 2011a). Hence, gamification is first evaluated broadly and then its particular implementations are compared. In fact, gamification is the only method evaluated, since it is often successful and simple to use (Burke & Hiltbrand 2011). Along the way, all relevant aspects are defined appropriately, such as the chosen tool for gamification (the wiki tool). Lastly, the issues of usability and user testing of a mobile application are expounded on.
2.2 Gamification
2.2.1 Introduction
Origin
Following the success of the location-based service Foursquare (Crowley & Selvadurai 2009), the idea of introducing gaming elements into non-game situations is now being discussed in health, education, media and many more fields. Gamification has predecessors in, and similarities with, the HCI and game studies fields. The term gamification was first documented in 2008 but only received widespread usage in the second half of 2010 (Deterding, Dixon, Khaled & Lennart 2011a).
Definition
Gamification can be defined as the concept of introducing gaming elements into non-gaming activities. Authors sometimes use the term "gamify", which is the application of gamification (Muntean 2002). Games comprise rules and competition towards outcomes or goals by human participants. Games typically give rise to play, but play is more general than games. Serious games use complete games for non-entertainment purposes, whereas gamified applications use game elements that do not lead to complete games but add enjoyment to non-game activities (Deterding, Khaled, Nacke & Dixon 2011d). What follows is a good overview of gamification for non-experts, showing the key interface-level components:
Figure 2: The Gamification Loop (Liu et al. 2011)
The diagram (figure 2) can be interpreted as a challenge leading to a win condition, and so forth. Badges represent the achievement level of a user, and the point system is impacted by every other component.
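The gamification loop of figure 2 can be sketched in code: actions feed a point system, and badges mark achievement levels once thresholds are crossed. The thresholds and badge names below are invented for illustration.

```javascript
// Badge thresholds are hypothetical, chosen only to illustrate the loop.
const badgeRules = [
  { name: 'Contributor', minPoints: 10 },
  { name: 'Editor',      minPoints: 50 }
];

function createPlayer() {
  return { points: 0, badges: [] };
}

// Completing a challenge (a win condition) awards points; every component
// feeds the point system, and badges record achievement levels.
function award(player, points) {
  player.points += points;
  for (const rule of badgeRules) {
    if (player.points >= rule.minPoints && !player.badges.includes(rule.name)) {
      player.badges.push(rule.name);
    }
  }
  return player;
}

const p = createPlayer();
award(p, 10); // first challenge completed
console.log(p.badges); // ['Contributor']
```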
Furthermore, gamification is also described as the process of adding game elements to an application in order to enhance user experience (Fitz-walter, Tjondronegoro & Wyeth 2011a). However, there is still some dispute in the video game and digital media industry over the various interpretations of gamification; thus some use different terms to describe the phenomenon, such as "gameful design" (Deterding, Dixon, Khaled & Lennart 2011a).
Examples
$25 billion was spent by Americans on video games in 2010, showing how natural games are to society. Consequently, games and game elements could be used to increase the engagement of users with e-learning applications (Muntean 2002). It also appears that through gamification users can be persuaded to become more eco-friendly, as shown in the system called EcoIsland (Liu et al. 2011). In the next picture (figure 3), the sensors (such as a laptop) report pollution levels to the application, and the trading of emission rights occurs between neighbours (users).
Figure 3: EcoLand Picture (Shiraishi et al. 2009)
2.2.2 Essential Game Elements
Player enjoyment is central to computer games, although there is no accepted model of player enjoyment in games. A recent model called GameFlow, however, attempts to fill this void and consists of eight elements: concentration, challenge, skills, control, clear goals, feedback, immersion and social interaction. Concentration is about providing stimuli from different sources and assigning the user enough workload. Games must also be sufficiently challenging and match the player's skill level. Immersion is about players having a deep but effortless involvement in the game (Sweetser & Wyeth 2005).
This model has been used by many, including the Queensland University of Technology's gamification of a mobile orientation application (Fitz-walter, Tjondronegoro & Wyeth 2011a).
2.2.3 Benefits
In 2010, the user population of FarmVille was 60 million users, about 1 percent of the world population (Burke & Hiltbrand 2011). This game is about users maintaining a virtual environment.
Gamification's popularity did not occur accidentally, as it follows standard principles. For instance, the community collaboration principle means setting challenges that must be solved by a community. In addition, other principles, called dynamics, are also used for successful gamification. There is the epic meaning dynamic, which deals with exciting users by making them feel they are working for something significant and awe-inspiring. Then there is the free lunch dynamic, which involves giving a user a gift, which is especially enjoyable when given by a friend. Lastly, users will repeatedly use an application they find interesting; thus gamification attempts to make applications interesting, which should be any system's goal as well (Burke & Hiltbrand 2011).
Gamification improves student pass rates, as seen when it was applied to a first-year programming class at the University of Ulster (Ulster 2012). This will be expounded on later in the paper.
2.2.4 Limitations
By continually rewarding users in a game, you may remove the moral value of the actions done by the user. To counter this, one needs to ask whether the user would still be interested even if they were not rewarded (Burke & Hiltbrand 2011).
There are potential issues that occur when adding game elements; thus proper consideration must be taken in the process (Fitz-walter, Tjondronegoro & Wyeth 2011a).
Also, financial cost is a key issue to consider when adding game elements to an application, and it can be reduced using a generic tool for game definition and management (D. Bustard, M. Black, et al. 2011). For this Honours project, the cost relates more to work hours; thus it is important to implement only the necessary game elements.
Lastly, as a result of the limited memory on mobile phones, not all gamification features can be implemented (completely) on them (Jong et al. 2008).
2.2.5 Psychology
There are many psychological theories on what makes games engaging. Flow (in games), a concept from positive psychology, is the feeling of complete and energized focus in an activity, with a high level of enjoyment and fulfillment (Chen 2007). There are psychological reasons why gamification works. For instance, the use of badges in gamification has a social psychology (as well as human-computer interaction) derivation. The most motivating goals are said to be those just outside of comfortable reach. Research has shown that people will even consume physical goods in order to achieve the goals of a game. The actual fun and adventure of goal seeking is a major reward for any user (as there is no monetary reward). Moreover, because a goal can be embodied in a game badge, the user has something to show to friends as a mark of achievement (in the game) (Antin & Churchill 2011).
2.2.6 Conclusion
It is clear that gamification is a helpful tool in aiding learning; however, its scope needs to be carefully defined in the implementation, and the other limitations above also need to be considered.
2.3 Gamification methods at university
2.3.1 Introduction
At the highest level, the methodology of using game design elements in non-gaming contexts is still growing and is of keen interest in the HCI (Human-Computer Interaction) field (Deterding, Sicart, Nacke, O'Hara, et al. 2011c).
This section outlines a number of gamification experiments in education, a field which naturally exhibits game-like attributes, such as giving points/"marks" for assignments and passing a student to another year/"level" when ready (Fitz-walter, Tjondronegoro & Wyeth 2011a).
2.3.2 Types
Achievement Systems
The use of a personal orientation passport for smartphones was tested by 26 university students at the Queensland University of Technology. The system uses game achievement mechanisms, such as rewards for application use (adding friends), answering service-related questions and so forth. Students used the application during orientation and provided feedback each evening about the highlights/events of orientation. It was found that 96% of users felt that the achievement concept added value, and comments given by students included "such a fantastic twist", "was genuinely fun" and "great for killing time productively" (Fitz-walter, Tjondronegoro & Wyeth 2011a). Here is a picture (figure 4) of the mobile orientation application, which simply shows a list of the student's achievements:
This page is deliberately empty, as it looks to be the most challenging of the lot (figure 15). The current Vula wiki has more than 9 icons per page, the maximum number of elements short-term memory can hold (Ingber 2012). However, after a heuristic evaluation, discussed below, it is recommended that students use a mark-down language for editing pages. A mark-down language is a highly human-readable lightweight markup language (Maeda et al. 2008).
Figure 15: Edit Page Screen – Paper
Prototype 1
Badges Page
The badges are displayed on the screen in a list format, with a badge image and a short description in each row (figure 16). This format is inspired by the achievement screen of a previous gamification effort, a university orientation application (Fitz-Walter & Tjondronegoro 2011). However, it was later discovered that a 2-D grid for displaying badges offers better affordance for students, as Foursquare (Foursquare 2012), a popular gamified location-based service, uses this display (Lindqvist et al. 2011).
Figure 16: Badges Screen - Paper
Prototype 1
4.3.3 Evaluation
This prototype is evaluated using two heuristic evaluations, as this is a natural first evaluation of a new interface. The first heuristic evaluation is conducted by a Masters student, in which the evaluator provides advice on each screen based on personal opinion and Jakob Nielsen's heuristic guidelines (Nielsen 2005). Thereafter, a second heuristic evaluation is conducted by the head supervisor, in which the prototype is evaluated based on expert knowledge. A combination of the feedback from both evaluations is incorporated into the next iteration.
4.3.4 Findings
Home Page
- Make it more gamified, for instance by showing a leaderboard so the student has a definite idea of the game's existence.
- The search bar must be displayed top right, if the designer chooses to have one at all.
Edit Page
- It is suggested that a mark-down language be used to allow students to bold, italicize and mark up text in general. This is basically a simplified mark-up language that allows students to perform basic markups, such as emboldening text by surrounding the parts of a word to be bold with underscores. Furthermore, this mark-down will use the same syntax as the current Vula wiki mark-up for bold, italics, etc.
Created/Read Page
- Add a little badge to each section according to the section's quality.
Add Group Page
- Like the University of Ulster gamification example (D. Bustard, M. Black, et al. 2011), it will be helpful to add a group aspect to the gamification. This allows group cooperation; furthermore, social interaction is a component of successful game creation.
General
- A student must be able to like not just a page but also the sections of a page.
- There has been little focus so far (on the prototype) on how the wiki pages will be displayed. Emphasis must also be placed on the usability of the wiki improving even though gamification is added.
- In order to display gamification notifications, instead of using constant pop-ups (which may irritate users), rather use an inbox icon that highlights each time a message (or gamification reward) is received. Furthermore, Shneiderman's eight golden rules state that users must be initiators of actions (opening a mailbox if they want) and not responders to actions (in this case pop-ups interrupting their flow) (Shneiderman & Plaisant 2005, p.74).
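The mark-down suggestion for the edit page can be sketched with a single rule: underscores embolden the enclosed text. This is only an illustration; the implementation should follow the actual Vula wiki mark-up syntax.

```javascript
// Replace _word_ with <strong>word</strong>. The character class [^_]+ keeps
// the match non-greedy, so several bold spans on one line are handled
// independently. This one rule is illustrative, not the full Vula syntax.
function renderBold(text) {
  return text.replace(/_([^_]+)_/g, '<strong>$1</strong>');
}

console.log(renderBold('a _bold_ word'));
// -> 'a <strong>bold</strong> word'
```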
4.4 Conclusions
This chapter discussed the low-fidelity prototyping, beginning with the requirements gathering from which the functionality of the system was derived. A subset of this functionality was tested in the first software/vertical prototype. Happily, the success of this prototype showed the project to be technologically feasible. This led to a paper prototype, which was evaluated by heuristic evaluation, and the resulting changes were integrated in later iterations.
5 High Fidelity Prototype Iterations
5.1 Introduction
The high-fidelity prototypes described here logically follow from paper prototype 1 and eventually mirror the final system.
Starting with the second paper prototype, its design is first discussed, incorporating the changes suggested by the expert evaluators of the previous version. However, university students are now also involved in the evaluations, alongside the usual heuristic evaluators. Thereafter, the development of a software prototype incorporating these alterations is discussed; it undergoes formative evaluation and conceptual model extraction (both with students) as well as a heuristic evaluation (by the head supervisor). Lastly, an improved horizontal prototype is developed based on past feedback, evaluated the same way as its predecessor; it also marks the end of the prototyping cycles.
5.2 Paper Prototype 2
5.2.1 Requirements Changes
All the findings described in the previous section are implemented in this prototype.
5.2.2 Design
Once again, the following is a sample of the designs for paper prototype 2; the full collection is in
Appendix 11.2. In particular, the following screens discussed are all improvements from the past cycle.
Figure 17: Home Screen - Paper Prototype 2
Home Page
The home page now includes a leaderboard of the top 10 students (figure 17) to clearly introduce the
game concept from the beginning, as suggested in the previous evaluation feedback. There is also an
option to join a group, as suggested in the past iteration. Finally, the search bar is now a small search
icon at the top right of the screen, but it may be removed depending on user feedback. The treemap of
wiki pages is now viewable only after clicking the new "wiki pages" icon, one step further to reach
than before; this is Fitts's law (Seow 2005) in action, as the treemap is less important than the game
elements and is thus made harder to reach.
Create Page
The create page screen (figure 18) now has an extra input field to label the introductory section,
giving the student more control. Secondly, there are now mark-down edit tips under the "edit tips"
icon. When clicking this icon, the student will see the possibly familiar traditional Vula mark-up
syntax for underlining, italicizing text, etc.
Figure 18: Create Page Screen - Paper Prototype 2
Edit Page
Likewise, the edit page (figure 19) also includes an extra field to edit the current section name, as all
content should be editable in a wiki (Leuf & Cunningham 2001). Secondly, the "edit tips" icon
included here exhibits the same functionality as on the create page and wherever else it appears in the
system.
Figure 19: Edit Page Screen - Paper Prototype 2
Badges Page
The badges are displayed in a 2D grid format (figure 20), just like in Foursquare (Foursquare 2012),
a popular gamified location-based service (Lindqvist et al. 2011), as previously mentioned. Secondly,
in the software version there will also be a drill-down option on each badge, as this supports the
detail-on-demand concept, whereby users can get more information on a collection if wanted
(Shneiderman & Plaisant 2005, p.594).
Figure 20: Badges Page Screen - Paper Prototype 2
5.2.3 Evaluation
As usual, the head supervisor provided instant feedback on the prototype before it was exposed to the
end-users (students) for the first time. Five students were chosen from all the computer science
undergraduate years, of both genders and various ethnic groups (a representative sample). Students
were assured anonymity at the beginning of the experiment and briefed about it. Thereafter, a
conceptual model extraction was carried out on the users/students, as this was the first time they had
seen a gamified mobile wiki (Jones & Marsden 2005, p.197). Each student was asked to explore the
paper prototype by transitioning between the screens, thinking aloud and asking the researcher for
advice where confused. Then, the student was asked what they thought of each element on each
screen, and all feedback was noted. Lastly, the student was compensated R30 for their time.
5.2.4 Findings
The findings of these evaluations are fundamental to the system development, as they gave the
users/students their first view of the system and tested its intuitiveness. The results provided the
researcher with rich ideas from users that led to a more usable and obvious design. However, there
were also rare cases of conflicting and, at times, infeasible user ideas (given the project time frame).
The changes to each screen are discussed in turn below.
CORE SCREENS
Home Screen
The home screen must be less cluttered. This can be achieved by simply spacing the icons out
more.
It is not obvious what the application is about from looking at the home page, hence a
preliminary/introductory tutorial screen is recommended which explains the wiki in the context of
gamification.
Have a drill-down on the avatar to see how far a person is from the next badge.
The level of the character should be inserted into the actual gauge.
Clicking on the points gauge (speedometer), should send the user into the game profile screen.
If there is to be a help icon, it must be represented by a question mark.
Create Page Screen
A tutorial letting the user know that editing sections is possible (figure 21) should be included
before the create page loads.
Figure 21: Create Page Tutorial Screen
It will also be helpful to have a preview button when creating a page, to see if the user is happy
with changes before saving.
Edit Page Screen
Rather use a scroll bar of icons to select markup such as: bold, italics, image (and others), as
opposed to memorizing the mark-down language. Furthermore, this suggestion supports
recognition rather than recall (Nielsen 1993), hence minimizing user memory load.
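The icon-toolbar suggestion amounts to wrapping the current text selection in the corresponding mark-up. A sketch follows, with assumed *bold*/_italic_ delimiters (not the actual Vula syntax):

```javascript
// Sketch of the icon toolbar: tapping an icon wraps the selected text in
// mark-up, so the student recognises options instead of recalling syntax.
// The delimiters here are assumed for illustration.
const wrappers = { bold: '*', italic: '_' };

function applyMarkup(text, selStart, selEnd, style) {
  const d = wrappers[style];
  return text.slice(0, selStart) + d +
         text.slice(selStart, selEnd) + d +
         text.slice(selEnd);
}

// Tapping the bold icon with "page" selected:
console.log(applyMarkup('Edit this page', 10, 14, 'bold')); // Edit this *page*
```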
Choose Group Screen
If groups are predefined, a drop-down combo box can be used to select a group.
All Pages-View Screen
Each page needs to have a rating near it, as this facilitates quality filtering and possibly makes it
faster for students to find the information they want (Li & Wu 2010).
Commenting feature
It is not clear why there are "view" and "talk" icons on each menu header. In particular, it is not
clear why there would be a view icon, since the page is already being viewed. Therefore, only
show a talk icon when in view mode and a view icon when in talk mode.
Game Screens
Exclusion of achievements screen
Not a single user could differentiate the purposes of the badges and achievements screens;
therefore the achievements screen will be removed, since most users naturally understood
badges.
Leaderboard Screen
Change the leaderboard title from “CS TOP 10” to “GAME TOP 10”, as this makes it clear that
the table refers to a game and not academic marks.
One user also asked why there is a leaderboard in the first place; this will be clarified by
having a preliminary tutorial screen in the future.
Game Profile Screen
A tutorial screen should appear before the user first visits the game menu in order to explain the
game. One user remarked, "If this is a game, where do I play?" Indeed, this tutorial screen will
appear even before the user enters the home screen, so the game theme is established right from
the start of the system (Sweetser & Wyeth 2005). Furthermore, this suggestion saves the
researcher time in explaining the system to users during experiments.
See All
Browse pages by categories (like subjects) and then choose page.
Popular Pages
There is an ambiguity in the term popular pages: does it mean pages most visited by this user,
or pages most visited by the class? A sub-title is to be added to the popular pages screen
clarifying that the latter interpretation is correct.
Treemaps were called a wonderful visual by most users and must be kept, although one user did
find them complex.
Other feedback
General Changes
An alternative means of showing notifications could be to have a page that highlights when a
reward has been unlocked.
Disable menu items when not needed in order to prevent errors. This resulted from a user's
statement, "I will keep clicking until something happens. I want to test the whole system". In
particular, the user asked what would happen if the "See All pages" button were clicked whilst in
create page; the answer is that all the data would be lost.
Restyle the points gauge (the jQuery Foundation 2012a) such that the points label is inside the
gauge, and also indicate the badge level near the points gauge.
One user noted that the message box (for game notifications) resembles Vula email
notifications and would prefer a page that highlights. Although all other users also observed this
same resemblance, they found it attention-grabbing and useful.
It is also desirable to customize settings in the system such as font size, language and background.
5.3 Horizontal Software Prototype
5.3.1 Design
This prototype improves on the design of paper prototype 2 by incorporating the student changes. These
include the addition of a tutorial screen, the replacement of the mark-down language on the edit page
with user-friendly icons, and more. However, the design is basically the same as the previous paper
prototype and hence will not be repeated here. Instead, a use case diagram follows that details all the
functionality (excepting the commenting feature) in this prototype.
5.4 Use Cases
The use cases of the gamified wiki are shown in the diagram (figure 22) below. The core use cases of the
system are general wiki functions such as create, read, edit and now even liking articles. In addition, the
other functionality supported comprises the gamification tasks, such as checking game leaderboards,
badges and the game profile. Also, it is worth noting that the "<<extends>>" keyword in the diagram
means that one activity occurring may (but need not) lead to another activity occurring. For instance,
reading an article may lead to commenting on it. The prototype implementation will be discussed next.
Figure 22: Gamified wiki Use Case Diagram
5.4.1 Implementation
Changes Implemented:
All but a few of the changes were implemented; here is a sample:
Having only the badges screen and eliminating the achievements screen.
Appropriate menu disabling, as this prevents students from making mistakes, a form of error
prevention (Shneiderman & Plaisant 2005, p.74).
Tutorial screens have also been included, as these facilitate quick learning of the system (Shneiderman
& Plaisant 2005, p.74) and were suggested by users.
A refined edit page screen is implemented without using the mark-down language, but rather a scroll
bar of icons for the markup, as suggested by users; this supports recognition rather than recall (Nielsen
2005).
Changes Not Implemented:
No help icon is included; tutorial screens are used instead. In the games literature, user help can take
the simple form of a tutorial (Sweetser & Wyeth 2005).
The commenting system was not done at all, due to time constraints.
The message box icon as a page that highlights was not implemented, as most students found the
original message box icon useful in alerting them of (gamification) notifications.
The drill-down on the avatar is not implemented, as this was an additional request by only one user
and not within current time constraints.
5.4.2 Screenshots
The following screenshots (figure 23) are images of the horizontal software prototype. Initially, the user
will first see the game help page, which explains the basic gamification aspects of the wiki (like
getting points for creating pages and so forth). Then they will get to the home page (after pressing
continue), and in order to create a page for the first time, they will read the create help screen. Thereafter,
they will create a page and can view it. Furthermore, it is worth noting that the menu buttons are
disabled on certain screens, a type of error prevention motivated by a previous user's comment.
Figure 23: Horizontal Prototype Screens
Having presented the prototype, the evaluations and findings will now be discussed.
5.5 Evaluation and Findings
5.5.1 Test Subjects
Once again, a representative sample of five students was chosen across all the computer science
undergraduate years, of both genders and various ethnic groups. To reiterate, the gamified wiki is
only targeted at undergraduate computer science students. The subjects were recruited by direct
approach and, beyond being guaranteed anonymity in this report, were also compensated R30 per
experiment.
5.5.2 Permission and consent
An application had to be made to Student Affairs at the University of Cape Town (UCT) in order to
conduct research on UCT students. In addition, an ethical clearance application was approved by the
Ethics Committee of the Faculty of Science, given that anonymity is assured for all participants.
Finally, consent was obtained from the students themselves, as they signed a consent form before each
evaluation.
5.5.3 Methodology
The evaluation methodology for this iteration is a combination of formative evaluation (Gediga et al.
1999) and conceptual model extraction (Jones & Marsden 2005, p.197). In addition, the head supervisor
also provided expert opinion on the interface (i.e. heuristic evaluation). Overall, the entire evaluation is
split into navigation and user satisfaction/engagement issues, whilst attempting to extract both
quantitative and qualitative data.
As mentioned in the design chapter, a formative evaluation is used as it can evaluate whether a
prototype at a given iteration (i.e. formative stage) is meeting system objectives (Gediga et al. 1999),
instead of waiting until the end of the project when mistakes cannot be rectified (Landau 2001).
Furthermore, it is used to compare usability and engagement (Di Bitonto et al. 2009), as in a comparative
study of two systems (Koenemann-Belliveau et al. 1994), namely the current Vula mobile wiki and the
gamified Vula mobile wiki. The learning effect is mitigated by alternating which system is evaluated
first for each user. Also, a conceptual model extraction is used since this is the users' first interaction
with such a mobile interface (Jones & Marsden 2005, p.198), and it exposes misconceptions the user has
about the system. The formative evaluation tests usability/"ease of use" and the perceived value of the
system, and finds unintended system consequences (Landau 2001). On the other hand, the conceptual
model extraction provides insight into the students' understanding of system terminology.
A description of the evaluation proceedings follows (table 4).
5.5.4 Structure of evaluation
The evaluation consisted of a navigation and a user satisfaction/engagement test (Landau 2001). First, the
student is briefed about the experiment and assured of anonymity and that it is the system, not them,
being evaluated. This is followed by a navigation test (using formative evaluation), in which the student is
asked to carry out a set of tasks (on both the current and gamified wiki on mobile) while the number of
clicks and error rates are measured (Hartson et al. 1996) using a Microsoft Word document table. These
tasks are to create, edit and view a page on a wiki. Thereafter, the student is asked to explore the system
screens whilst explaining the system icons and raising misunderstandings/suggestions as they go along
(Jones & Marsden 2005, p.198). Finally, a short post-experiment interview takes place in which the
student is asked a set of questions inspired by Landau (2001), which check how well the system
objectives are achieved. These are:
Determining the value of the gamified wiki to students
Discovering any confusing terminology/places where it is not obvious what to do
A follow-up on anything interesting/"unintended usage" of the system observed from the user
Comments about the visual appeal and readability of the current mobile Vula wiki and the gamified
one
Any other comments on the graphics used and whether there is an information overload (i.e. is the
system simple enough)
Comments about the engagement of the current mobile Vula wiki and the gamified one
Identifying any other likes or dislikes the user has of the system
Any final recommendations on the system
Table 4: Evaluation Plan
Evaluation Objective (quantitative):
Navigation: is there a difference in navigation speed between the current mobile Vula wiki and the
gamified Vula wiki? This is measured by the number of clicks to complete sample tasks and the
number of errors experienced in doing so.
Readability: is there a difference in readability between the current mobile Vula wiki and the
gamified Vula wiki? This could be tested using Tullis's display-complexity metrics (Tullis 1984)
but was not done, as the appropriate software could not be found.
Evaluation Objective (qualitative):
Engagement: is there a difference in engagement between the mobile Vula wiki and the gamified
Vula wiki? This is measured by post-interview questioning.
Readability: do the students find the readability of the two mobile Vula wikis different? This is
measured by post-interview questioning.
Evaluation Structure: as described above.
Time Per User: 20 minutes (although often longer).
Venue: all tests conducted in the Honours Lab, and one in a residence room.
User Group: computer science students from each undergraduate year and 4 race groups (of both
genders) represented.
Evaluation Method: formative evaluation (navigation test between wikis) and conceptual model
extraction.
Equipment: a mobile phone emulator (Butts & Cockburn 2002) for the gamified wiki (as the
software was not fully mobile-ready), but a mobile phone for the Vula wiki, as there is no emulator
for the current Vula wiki.
Recording of data: data is recorded on a nearby computer as the experiment takes place, using a
Word document (for each user) containing a predefined table (recording number of clicks, etc.) and
the set of interview questions.
Analysis of data: data is aggregated from each document manually by the researcher into this
report.
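The manual aggregation of clicks and errors described above could equally be tallied with a short script along these lines (a sketch; the observation data shown is illustrative, not the study's recorded results):

```javascript
// Sketch of aggregating the per-user navigation measurements: clicks
// (screen transitions) and errors per task, averaged across users.
function summarise(observations) {
  const byTask = {};
  for (const o of observations) {
    const t = byTask[o.task] || (byTask[o.task] = { clicks: 0, errors: 0, users: 0 });
    t.clicks += o.clicks;
    t.errors += o.errors;
    t.users += 1;
  }
  for (const t of Object.values(byTask)) {
    t.meanClicks = t.clicks / t.users;
  }
  return byTask;
}

// Illustrative data only, not the study's actual measurements:
const result = summarise([
  { task: 'edit', clicks: 2, errors: 0 },
  { task: 'edit', clicks: 2, errors: 0 },
]);
console.log(result.edit.meanClicks); // 2
```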
5.5.5 Findings
Perceived Value of the System
One student mentioned that there is value in the gamified mobile Vula wiki so long as the points rewarded
can actually result in real marks or even deadline extensions. Another student said that, since they had
never really considered using a wiki, they did not initially see a need for it. However, the same student
mentioned that it is much easier to create and edit pages on the gamified Vula wiki, but again that the
gamification aspect did not add more value. Yet another student found the new wiki useful, saying that
"at least one can interact with the system even after sharing notes with peers". Also, a student mentioned
that it is ideal to use a Vula wiki since they do not have to bother entering their internet details, as it is
local to UCT. This student also said the game elements add to the enjoyment of the system, but that the
interface is not obvious, especially as this person is not used to wikis and does not use Vula frequently.
Ease of Use- Navigation
This begins by evaluating the difference between the current and gamified mobile Vula wikis (tables 5
and 6). The users are asked to carry out a set of tasks (create, edit, and view a wiki page) whilst their
number of clicks (i.e. number of screen transitions) and errors are measured. The results show the
following:
Create Page
Not a single user was able to create a page on the current Vula wiki, as they did not know the markup
required. For the gamified Vula wiki, the users could create a page naturally, although they did get
confused by the terminology on the "Place Page" screen. The number of clicks taken to create a page
on the current mobile Vula wiki is therefore undefined, since no one could do it, but would be 2.
For the gamified mobile Vula wiki, the count was 3 (for everyone) but can be reduced to 2 in the next
prototype, due to a user recommendation discussed later.
Edit Page
Editing a page took 2 clicks on each mobile Vula wiki, as users were able to do this successfully on
both without errors.
View Page
Viewing a page (stored in the root/home page) took 1 click on the current mobile Vula wiki but 2
clicks on the gamified Vula wiki, due to the extra gamification elements. Likewise, all users were able to view
Aumüller, D., 2005. SHAWN: Structure helps a wiki navigate. In Proceedings of the BTW-Workshop
WebDB Meets IR. Karlsruhe, Germany. Available at: http://dbs.uni-
leipzig.de/file/aumueller05shawn.pdf [Accessed July 30, 2012].
Berman, P.S., 2012. Vula wiki underuse.
Bevan, N & Azuma, M., 1997. Quality in Use: Incorporating Human Factors into the Software
Engineering Lifecycle. In Proceedings of the 3rd International Software Engineering Standards
Symposium (ISESS ’97). Washington, DC, USA: IEEE Computer Society, p. 169--. Available at:
http://dl.acm.org/citation.cfm?id=850975.854938.
Bevan, Nigel, 1995. Measuring usability as quality of use. Software Quality Journal, 150.
Bevan, Nigel, 2001. International standards for HCI and usability. International Journal of Human-Computer Studies, 55(4), pp.533-552. Available at:
http://linkinghub.elsevier.com/retrieve/pii/S1071581901904835 [Accessed July 30, 2012].
Beydeda, S. & Gruhn, V., 2001. Integrating White- and Black-Box Techniques for Class-Level
Regression Testing. In Proceedings of the 25th International Computer Software and Applications Conference on Invigorating Software Development. Washington, DC, USA: IEEE Computer
Society, pp. 357-362. Available at: http://dl.acm.org/citation.cfm?id=645983.675252.
Di Bitonto, P., Roselli, T. & Rossano, V., 2009. Formative evaluation of a didactic software for acquiring
problem solving abilities using Prolog. In Proceedings of the 8th International Conference on
Interaction Design and Children. New York, NY, USA: ACM, pp. 154-157. Available at:
http://doi.acm.org/10.1145/1551788.1551815.
Buchanan, G. et al., 2001. Improving mobile internet usability. In Proceedings of the 10th international
conference on World Wide Web. New York, NY, USA: ACM, pp. 673-680. Available at:
http://doi.acm.org/10.1145/371920.372181.
Burke, M. & Hiltbrand, T., 2011. How Gamification Will Change Business Intelligence. Business
Intelligence Journal, 16(2), pp.8-16. Available at:
Chin, D.N., 2001. Empirical Evaluation of User Models and User-Adapted Systems. User Modeling and
User-Adapted Interaction, 11(1-2), pp.181-194. Available at:
http://dx.doi.org/10.1023/A:1011127315884.
Clker.com, 2012. Thums Up clip art. Available at: http://www.clker.com/clipart-thums-up.html [Accessed
October 26, 2012].
Croninger, R.G. & Douglas, K.M., 2005. Missing data and institutional research. New directions for
institutional research, 2005(127), pp.33-49.
Crowley, D. & Selvadurai, N., 2009. Foursquare. Available at: https://foursquare.com/ [Accessed May
13, 2012].
Cunningham, W., 2012. Portland pattern repository's wiki. Available at: http://www.c2.com/cgi/wiki
[Accessed July 30, 2012].
Deterding, S., Dixon, D., Khaled, R. & Lennart, N., 2011a. From game design elements to gamefulness:
defining gamification. Proceedings of the 15th International Academic MindTrek Conference:
Envisioning Future Media Environments. Tampere, Finland, pp.9-15. Available at:
http://dl.acm.org/citation.cfm?id=2181040 [Accessed April 27, 2012].
Deterding, S., Dixon, D., Khaled, R. & Lennart, N., 2011b. From game design elements to gamefulness:
defining gamification. Proceedings of the 15th International Academic MindTrek Conference, pp.9-15. Available at:
http://dl.acm.org/citation.cfm?id=2181040 [Accessed April 27, 2012].
Deterding, S., Sicart, M., Nacke, L., O'Hara, K., et al., 2011c. Gamification: using game-design elements
in non-gaming contexts. Proceedings of the 2011 annual conference extended abstracts on Human factors in computing systems - CHI EA '11, Vancouver, Canada, p.2425. Available at:
Deterding, S., Khaled, R., Nacke, L. & Dixon, D., 2011e. Gamification: Toward a Definition. D. Tan & B. Begole, eds. Design, pp.12-15. Available at: http://gamification-research.org/wp-
Godwin-Jones, R., 2011. Emerging technologies: mobile apps for language learning. Language
Learning & Technology, 15(2), pp.2-11.
Available at: http://www.llt.msu.edu/issues/june2011/emerging.pdf.
Gong, J. & Tarasewich, P., 2004. Guidelines for handheld mobile device interface design. In
Proceedings of the 2004 Annual DSI Meeting. Boston, Massachusetts, USA, pp. 3751-3756. Available at: http://www.itu.dk/~jeppeh/Materiale til fag/Konceptudvikling/Guidelines for mobile
interface design.pdf.
Google, 2012. Google Images. Available at: http://www.images.google.com [Accessed October 26,
2012].
Guzdial, M. et al., 1994. Analyzing and Visualizing Log Files: A Computational Science of Usability,
Available at: http://smartech.gatech.edu/jspui/bitstream/1853/3586/1/94-08.pdf.
Hamilton, S. & Chervany, N.L., 1981. Evaluating Information System Effectiveness - Part I: Comparing
Evaluation Approaches. MIS Quarterly, 5(3), pp. 55-69. Available at:
http://www.jstor.org/stable/249291.
Hartson, H.R. et al., 1996. Remote evaluation: the network as an extension of the usability laboratory. In
Proceedings of the SIGCHI conference on Human factors in computing systems: common ground.
New York, NY, USA: ACM, pp. 228-235. Available at: http://doi.acm.org/10.1145/238386.238511.
Hezel Associates, L., 2010. Testing the Efficacy and Impact of a Selected PBS TeacherLine Course,
Park, H.M., 2009. Comparing Group Means: T-tests and One-way ANOVA Using Stata, SAS,
R, and SPSS. Available at: http://www.indiana.edu/~statmath/stat/all/ttest/ttest.pdf.
Immunopedia.org, 2010. Immunopedia website. Available at:
http://www.immunopaedia.org.za/index.php?id=4 [Accessed September 26, 2012].
Ingber, L., 2012. Columnar EEG magnetic influences on molecular development of short-term memory S.
F. P. G. Kalivas, ed., Hauppauge, NY: Nova. Available at:
http://www.ingber.com/smni11_stm_scales.pdf.
Iron Realms Entertainment, 2012. A list of experience level names. Available at:
/game/helpview/lusternia/a-list-of-experience-level-names [Accessed October 26, 2012].
Jenson, S., 2002. The Simplicity Shift, Cambridge, England: Cambridge University Press.
Jones, M. & Marsden, G., 2005. Mobile Interaction Design, 1st edition, Wiley.
Jong, T.D., Specht, M. & Koper, R., 2008. A reference model for mobile social software for learning.
International Journal of Continuing Engineering Education and Life-Long Learning, 18(1), p.118.
Available at: http://www.inderscience.com/link.php?id=16079.
Koenemann-Belliveau, J. et al., 1994. Comparative usability evaluation: critical incidents and critical threads. In Proceedings of the SIGCHI conference on Human factors in computing systems:
celebrating interdependence. New York, NY, USA: ACM, pp. 245-251. Available at:
http://doi.acm.org/10.1145/191666.191755.
Krasner, G. & Pope, S., 1988. A Description of the Model-View-Controller User Interface Paradigm in the Smalltalk-80 System. Journal of Object Oriented Programming, 1(3), pp.26-49. Available at:
Landau, V., 2001. "Formative Evaluation Planning", in Developing an Effective Online Class.
Available at: http://www.roundworldmedia.com/cvc/module10/notes10.html [Accessed
September 29, 2012].
Leuf, B. & Cunningham, W., 2001. The Wiki Way - Quick Collaboration on the Web 1st Editio., Addison-
Wesley Professional.
Li, N. & Wu, D.D., 2010. Using text mining and sentiment analysis for online forums hotspot detection
and forecast. Decision Support Systems, 48(2), pp.354-368. Available at:
http://linkinghub.elsevier.com/retrieve/pii/S0167923609002097 [Accessed March 2, 2012].
Lindqvist, J. et al., 2011. I'm the mayor of my house: examining why people use foursquare - a social-
driven location sharing application. In Proceedings of the 2011 annual conference on Human factors in computing systems. New York, NY, USA: ACM, pp. 2409-2418. Available at:
http://doi.acm.org/10.1145/1978942.1979295.
Liu, Y., Alexandrova, T. & Nakajima, T., 2011. Gamifying intelligent environments. Proceedings of the
2011 international ACM workshop on Ubiquitous meta user interfaces - Ubi-MUI ’11, p.7.
Available at: http://dl.acm.org/citation.cfm?doid=2072652.2072655.
Maeda, K., Ma, X. & Strassel, S., 2008. Creating Sentence-Aligned Parallel Text Corpora from a Large
Archive of Potential Parallel Text using BITS and Champollion. In LREC. European Language
Resources Association. Available at: http://dblp.uni-
trier.de/db/conf/lrec/lrec2008.html#MaedaMS08.
Marsden, G., Maunder, A. & Parker, M., 2008. People are people, but technology is not technology. Philosophical Transactions of the Royal Society. Available at:
Nielsen, J., 1994. Enhancing the explanatory power of usability heuristics. Conference companion on Human factors in computing systems - CHI ’94, p.210. Available at:
Nielsen, J., 2000. Novice vs. Expert Users. Alertbox. Available at:
http://www.useit.com/alertbox/20000206.html [Accessed October 21, 2012].
Nielsen, J., 1993. Usability Engineering, Morgan Kaufmann.
Nielsen, J., 2005. Usability Heuristics. Available at:
http://www.useit.com/papers/heuristic/heuristic_list.html [Accessed September 26, 2012].
Palser, S., 2004. Mobile Bookkeeping Application for Micro Entrepreneurs in the Developing World.
Paul, S. & Hong, L., 2012. Who is Authoritative? Understanding Reputation Mechanisms in Quora. Proceedings of the Collective Intelligence 2012 conference in Cambridge, (2010). Available at:
http://arxiv.org/abs/1204.3724 [Accessed May 14, 2012].
Puah, C. & Abu Bakar, A.Z., 2011. Strategies for community based crowdsourcing. 2011 International
Conference on Research and Innovation in Information Systems, pp.1-4. Available at:
Smith, S. L., & Mosier, J.N., 1986. Guidelines for designing user interface software., Bedford,
Massachusetts, USA.
StackOverflow, 2012. Stack Overflow - collaboratively edited Q&A for programmers. Available at:
www.stackoverflow.com [Accessed July 30, 2012].
Sulaiman, J. et al., 2009. Implementing Usability Attributes In E-learning System Using Hybrid Heuristics. 2009 International Conference on Information and Multimedia Technology, pp.189-193.
Available at: http://ieeexplore.ieee.org/lpdocs/epic03/wrapper.htm?arnumber=5381220 [Accessed
July 28, 2012].
Suleman, H., 2008. Automatic marking with Sakai. Proceedings of the 2008 annual research conference
of the South African Institute of Computer Scientists and Information Technologists on IT research in developing countries riding the wave of technology - SAICSIT ’08, pp.229-236. Available at:
Sweetser, P. & Wyeth, P., 2005. GameFlow: a model for evaluating player enjoyment in games. Computers in Entertainment, 3(3),
pp.1-24.
Tazzoli, R., 2004. Towards a semantic wiki wiki web. In Demo Session at ESWC. Heraklion, Greece. Available at: http://www.tecweb.inf.puc-rio.br/semweb/space/Platypus+Wiki/platypuswiki.pdf
[Accessed July 30, 2012].
Telono, 2012. Telono User-Centered Design Process. Available at:
http://www.telono.com/en/services/usability/ucd-process [Accessed September 4, 2012].
Tullis, T.S., 1984. A Computer-Based Tool for Evaluating Alphanumeric Displays. In INTERACT 84 -
1st IFIP International Conference on Human-Computer Interaction. London, UK.
Turo, D., 1992. Improving the visualization of hierarchies with treemaps: design issues and experimentation. In Visualization ’92, IEEE Conference. Boston,Massachusetts, USA. Available at: