
Purdue iPhone App Usability Study

David Rosenthal, Danielle Clifford, Dienesa Le

English 203: Introduction to Research Methods, Purdue University, May 2011


Background

The Purdue University iPhone application was developed by Purdue’s Office of Marketing & Media and released in May 2010. The first version featured a map of campus, news feed, video feed, and a listing of campus events. In response to the enthusiastic reception by students, an update was released in August 2010 with new features, including a series of interactive tours of Purdue’s campus and access to daily menus from Purdue’s five dining courts.

By April 2011, the app had been downloaded by more than 22,000 unique users and had experienced more than 366,000 sessions, or uses. As the one-year anniversary of the app’s release approaches, Purdue is looking to the future with plans to release a new version of the app in time for the arrival of first-year students in June 2011.

Despite the wide interest in the application and the university’s plans for further development of the app, little is known concretely about how well it is meeting the needs and expectations of its primary users: Purdue students. Conversations with members of the Online Experience group within Purdue’s Marketing & Media Office reveal that while feedback from students has been solicited on an informal basis, particularly from the 25-member Student Advisory Council, no systematic study has been done to examine how students use the application.

For these reasons, we sought to conduct a usability study of the Purdue app with a population of current students. Upon further investigation, we narrowed our study specifically to the Events feature of the app, which, along with News, remains one of the least-used features. This paper presents our findings with regard to the Events feature and provides a model for future studies of the Purdue app.

Our study is based on the research question of how usable the Purdue app is for Purdue students. We define “usability” as meeting the needs of the user in a satisfactory and efficient manner from the perspective of the user.

IN THIS PAPER

Background
Data Collection
  Participant Population
  Usability Test
  Post-Test Questionnaire
Data Analysis
  Video Recording Data
  Questionnaire Data
Results
Implications
Conclusions
Appendix A: Usability Test Facilitator Script
Spreadsheet 1: Quantitative Data Results from Usability Test
Spreadsheet 2: Time Analysis
Spreadsheet 3: Questionnaire Responses
Spreadsheet 4: Categorization of Questionnaire Responses


Data Collection

We collected both qualitative and quantitative data in scenarios designed to mirror real-world uses of the application. The desired population for this study was current Purdue University students.

Participant Population

We first attempted to recruit participants from among the members of the Marketing & Media Student Advisory Council (SAC), administering a brief screening survey. The survey asked questions about basic demographics (gender, age, etc.), use of technology and social media, and familiarity with Apple devices, and concluded by asking whether the respondent was interested in participating in our study. The purpose of collecting this data was to establish a baseline allowing us to determine how prior experience affected use of the application.

After failing to recruit any participants from the SAC, we administered the same survey to students in the English 203: Introduction to Research Methods course, whose instructor offered extra credit for participation. Six students expressed interest and were sent an email asking them to select a time to take the usability test. Three students replied, and these became our participants.

The sample was composed of three females in the second, third, and fourth years of their undergraduate degrees. Two of the participants (P1 and P3) reported owning an iPhone or iPod Touch. All three participants reported using e-mail at least once a week, two reported using Facebook at least once a week, and two reported using YouTube at least once a week. One participant used Google Apps on a weekly basis.

Usability Test

The usability tests were administered in a study room in the Humanities, Social Science and Education (HSSE) Library on Purdue’s campus. The room contained one table and four chairs, two placed on each side. A video camera was set up on a tripod angled down toward the table in order to record participants’ interactions with the iPhone. The facilitator sat directly across the table from the participant. Note-takers sat in the other two chairs. Figures 1 and 2 show the setup of the room. The first test was conducted in a different room than the second and third, but the setup was the same.

Figure 1: Diagram of room setup. Figure 2: Photo of camera setup. (P = Participant, F = Facilitator, N = Note-takers)


The usability test began with the facilitator following a script that introduced the study and explained how the usability test would be conducted. The participant then signed a consent form, at which point the recording began. The facilitator then directed the participant through a series of six tasks involving the use of the application, providing instructions on each task after the previous one was completed. Each of the participants completed the test using the same device, an iPhone 3G belonging to one of the researchers.

The first task was passive, asking the participant to report their expectations before beginning to use the app. The remaining tasks tested the usability of the app in two common usage scenarios, or contexts. In the first scenario, the user does not have a particular event in mind but is looking for events within a given time period or events of a certain type. In the second scenario, the user is looking for information on a particular event they are already aware of. In both cases, the user requires certain key pieces of information, including the date, time, and location of the event. The tasks modeling both scenarios asked the user to find and report these pieces of information. The tasks included the following:

1. Observe the Events icon and describe your expectations for this feature.
2. Find 3 events occurring within the next week (specific dates were provided).
3. Find one event occurring after this week (specific date was provided).
4. Find a musical or performing arts event.
5. Find an academic event or lecture.
6. Find __________ event (a specific event name or event sponsor was provided).

The full text of the facilitator script is presented in Appendix A.

As the participants performed each task, they were asked to narrate their thoughts and actions aloud. A simplified outline of the tasks was placed in front of the participants to remind them of the instructions for each task. The camera recorded the user’s interactions with the screen of the iPhone, as well as the dialogue during the test. Observers took notes recording things that could not be seen in the video, including participants’ physical orientation to the device and emotional reactions during each task based on facial expressions.

Post-Test Questionnaire

At the conclusion of the usability test, participants were asked to complete a post-test questionnaire administered online using a laptop. Participants were asked to rate how easy it was to use the app on a scale of 1 to 5. This was followed by a series of open-ended questions asking participants to share their impressions and assessments of the app. The questions are presented in Spreadsheet 3.



Data Analysis

Video Recording Data

The video recordings provided a wealth of data, including the amount of time each participant took to perform each task, the amount of time each participant spent viewing each event, and the amount of time required for events to load. Each of these measures was recorded independently by three researchers, and an average data set, standard error, and percent error were calculated.
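As an illustration, the per-measurement combination across the three coders can be sketched as follows (a minimal sketch; the example timing values are taken from the Task 3 row for Participant 1 in Spreadsheet 1):

```python
from math import sqrt
from statistics import mean, stdev

def combine(readings):
    """Average one timing measurement recorded independently by several coders,
    reporting the standard error of the mean and the percent error."""
    avg = mean(readings)
    se = stdev(readings) / sqrt(len(readings))  # standard error of the mean
    pct_error = 100 * se / avg                  # percent error relative to the mean
    return avg, se, pct_error

# Task 3, Participant 1: the three coders recorded 40, 41, and 39 seconds.
avg, se, pct = combine([40, 41, 39])
# avg = 40 s, se ≈ 0.577 s, pct ≈ 1.44 %, matching Spreadsheet 1
```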

Task Time

We recorded how long it took participants to complete tasks two through six using the timestamp displayed by the video player. The task time was calculated starting from the moment the participant initiated contact with the Events button (see figure 3.1) and ending when the facilitator verbally indicated that the task had been completed.

Loading Time of Events Feature

The amount of time participants spent waiting for the feature to load was also significant. Each of the three participants opened the feature 5 times over the course of the test, giving a total of 15 loads. The feature load time was calculated starting from the moment the participant pressed the Events button (see figure 3.1) and ending when the list of events appeared (see figure 3.3).

Event Viewing Time

We then looked more closely to determine how long it took participants to view each event. Since some tasks required participants to view multiple events, there were more events than tasks. Event times differ from task times because they do not include the time spent browsing or scrolling through the list of events. The event viewing time was calculated starting when the participant initiated contact with an event from the list (see figure 3.4) and ending when the participant returned to the list. This measure includes the time spent waiting for the event to load.


Figure 3.1: Pressing the Events button. Figure 3.2: Waiting for events list to load. Figure 3.3: Events list appears. Figure 3.4: Selecting an event.


Loading Time of Individual Events

Even after the Events feature loaded, additional time was required to load each selected event. The event load time was calculated starting at the moment an event was selected (see figure 4.1) and ending when the event article text appeared on the screen (see figure 4.4).

Post-Test Questionnaire

The data generated by the post-test questionnaire contributed significantly to our understanding of the usability of the Events feature. Our analysis focuses on the four open-ended questions, to which participants provided long, detailed responses full of insights. We broke the responses down into phrases, using this as our unit of analysis, and assigned the phrases into four categories:

1. Participants’ observations about the Events feature

2. Actions taken by participants while using the Events feature

3. Participants’ expectations or assumptions about the feature

4. Participants’ needs and expectations that are unmet by the app

The choice of categories follows a logical pattern of thought observed in participant responses: first, the user discovers the physical and virtual characteristics of the app, and then he or she takes actions in order to get what they need from the app within its parameters. The user then notes where the app falls short of his or her expectations and offers suggestions for improvement.

Categorization of responses was completed independently by each of the researchers. We then merged the spreadsheets, noting commonalities, and we identified the sentiments that were repeated by more than one participant. The questionnaire responses are presented in Spreadsheet3 and categorization of responses is presented in Spreadsheet4.
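The commonality-identification step can be illustrated with a short sketch. This is a simplification of our spreadsheet comparison, and the phrase labels below are hypothetical paraphrases of sentiments found in the responses in Spreadsheet 3:

```python
from collections import Counter

# Hypothetical, simplified phrase sets per participant
# (paraphrased from the questionnaire responses in Spreadsheet 3).
participant_phrases = {
    "P1": {"expected a calendar format", "wanted a search bar"},
    "P2": {"expected a calendar format", "wanted a search bar", "liked scrolling"},
    "P3": {"expected a calendar format", "fliers are more informative"},
}

# A sentiment counts as shared when more than one participant expressed it.
counts = Counter(p for phrases in participant_phrases.values() for p in phrases)
shared = {sentiment for sentiment, n in counts.items() if n > 1}
```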

Figure 4.1: Selecting an event. Figure 4.2: Waiting for article text to load. Figure 4.3: Waiting for article text to load. Figure 4.4: Article text appears.


Results

Task Time

The time required to complete each task varied significantly, both for each participant and between participants. All of the tasks were completed successfully by each participant except Task 6, which Participant 3 eventually decided to conclude without completing. The average time for each participant to complete a task ranged from approximately 1 to 2 minutes. Table 1 shows the average values, standard error, and percent error. Full results are presented in Spreadsheet 1.

Event Viewing Time

The event viewing time also varied, but to a lesser extent. We looked at a total of 23 event views by 3 participants. Table 2 and Figure 5 show the length of time each event was viewed by each participant. The time spent viewing an event ranged from 19 to 60 seconds, with the average time for each participant ranging from 28 to 44 seconds. The overall average across all event views was 37.2 seconds, with a standard error of 1.36 seconds, or 3.7% of the average value. Full event data is presented in Spreadsheet 1.

Table 2: Time spent viewing each event (in seconds)

Table 1: Average time to complete each task (in seconds)


Waiting Time

The loading time for the Events feature ranged from 8 to 10 seconds, with an average of 8.6 seconds and a standard error of 0.178 seconds or 2.1%. The time to load a particular event ranged from 5 to 39 seconds, with an average of 9.9 seconds. Average values are shown in Figure 6 and full results are presented in Spreadsheet 1 and Spreadsheet 2.

Figure 5: Average time spent viewing each event (in seconds)

Figure 6: Waiting time (in seconds) for the Events feature and for individual events


Implications

1. Information architecture matters

The most common deficiency of the app noted by our participants is a consistent lack of organization, which makes the Events feature difficult and time-consuming to use. The feature lacks organization in a number of ways, highlighted by the following observations (each of which was reported independently by multiple participants):

• Key information is not readily available. This was the most common observation about the Events feature. Participants expressed frustration that they often could not find the date, time, or location of an event in its title or short description, requiring them to open extraneous events that did not turn out to be what they were looking for. Even when the desired event was opened, participants often had to read through large portions of text to find the desired information buried in an article. Participants took a number of “corrective actions” in order to make up for the deficiencies in the app, including zooming in and opening extraneous events.

In total, 29.3% of participants’ time using the app was spent browsing through events and 45.6% was spent reading specific event pages. The average time spent viewing an event was 37.2 seconds and the maximum time was 59.3 seconds.

• Events are not presented chronologically or filterable by date. All three participants expected events to be displayed in a calendar format, noting that this would be easier to use. Short of a calendar, participants expected the events to be listed in chronological order, which is not the case.

• Events are not filterable by event type or category (i.e. academic, social, etc.).

• Events are not searchable by keyword. Participants noted that a search function would make the feature easier to use, especially when looking for a specific event.

In future versions of the application, developers can address these inefficiencies by replacing articles from the university events feed with dedicated content that displays key information prominently, incorporating a category-based filtering system, and adding additional navigation options.

[Chart: Percent of time spent browsing, viewing, and waiting. Viewing particular event: 45.6%; Browsing through events list: 29.3%; Waiting for app to load: 25.0%]


2. Time is a major concern

Participants expressed dissatisfaction with the amount of time it took them to find desired information using the Events feature of the Purdue app. Two out of three participants stated that they would not use the Purdue app to find events in the future. When asked why, they said that other online sources, especially the Purdue website, were more efficient and easier to use.

As usage time increases, satisfaction with the app decreases. The participant who gave the app the most favorable rating for its ease of use (a 2 out of 5, with 5 being the most difficult to use) also completed the usability test in the least amount of time. The participants who rated the app with a 4 on the same scale took an average of 3 minutes and 10 seconds longer, or 52.8% more time, to complete the test. On average, these participants also spent more time viewing each event (41.6 seconds, compared to 28.5 seconds). While it is difficult to make a definitive claim due to the limited sample size, the trend indicates that time is a source of dissatisfaction with the app.
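As a quick check of this arithmetic, using the total test times recorded in Spreadsheet 1:

```python
# Total test times in seconds, from Spreadsheet 1.
fastest = 360.0                     # participant who rated ease of use 2 out of 5
slower_avg = (453.67 + 646.33) / 2  # mean for the two who rated it 4 out of 5
extra = slower_avg - fastest        # 190 s, i.e. 3 minutes 10 seconds
pct_more = 100 * extra / fastest    # ≈ 52.8 % more time
```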

The amount of time spent waiting for the app to load was also significant. Participants spent between 21% and 28% (an average of 25.0%) of their time with the app just waiting for pages to load. However, there were no noticeable differences in waiting time between the participant who gave a usability rating of 2 and those who gave a rating of 4. A larger sample size would be needed to confidently determine the effect of waiting time.
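The 25.0% waiting-time share can be reproduced from the per-participant totals and waiting times in Spreadsheet 2:

```python
# Per-participant total and waiting times (seconds), from Spreadsheet 2.
totals = [360.0, 453.7, 646.3]
waiting = [96.0, 124.3, 136.3]

# Waiting as a share of each participant's total time, then averaged.
shares = [100 * w / t for w, t in zip(waiting, totals)]
avg_share = sum(shares) / len(shares)  # ≈ 25.0 %
```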

3. Prior experience may not be a significant factor

As stated previously, two of the participants had prior experience using an iPhone or iPod Touch while the third had no prior experience. However, we were not able to identify any correlation between this fact and either the amount of time taken to complete the tasks or the level of satisfaction with the app. This doesn’t necessarily mean that no correlation exists – the sample size is too small to determine this with confidence.


Conclusions

Given the nature of this study as a pilot study, we faced both limited time and limited resources. These challenges resulted in several limitations to our research.

The ideal population would have been a random sample of all Purdue students. However, we were only able to recruit participants from among our fellow English 203 students, which introduced some bias into the study. The small number of participants limited our ability to produce generalizable conclusions, but it also allowed us to analyze their responses in great detail.

During the data analysis process, we identified several methodological changes that would have produced richer results and could be incorporated into future studies of the Purdue iPhone app or other iPhone apps. One such addition would be to ask participants to report their satisfaction with the app after each task performed. This would allow researchers to tie user satisfaction levels more directly to specific interactions with the app. We also realized that more structured planning is needed in the note-taking process in order to produce more meaningful observations, especially when more than one note-taker is used.

In order to mitigate the risks of computational and human error arising from the subjective nature of some of our measurements, we had three researchers independently code the data. We then merged the data sets and calculated average values as well as the standard error and percent error. The percent error for nearly all values is quite low, ranging from 0.5% to 4%, indicating a lack of major errors.

We believe that one of the more meaningful outcomes of this work is not the results presented but the methodology we developed in order to investigate a relatively new technology. Many questions remain about the Purdue app, both with regard to the Events feature and to its other features. With regard to the Events feature, further inquiry can be made into its efficiency and ease of use, but questions should also be asked about the usefulness of its content. For example, are the events listed (mostly university functions and programs sponsored by academic departments) relevant or interesting to students, who make up the primary users? The methods described here can also be applied to the study of other iPhone apps and applications for other mobile devices.


Appendix A: Usability Test Facillitator ScriptThank you for agreeing to participate in our research study. Our goal is to determine how easy or difficult it is to use the Purdue iPhone application. Today we are asking you to serve as an evaluator of the app by completing 6 tasks. As you perform each task, people narrate your actions and thought processes. Please describe what you are doing aloud using as much detail as possible. We will record your interactions with the app only as they relate to this purpose and will destroy all recordings upon completion of this study.

We are testing the app, not you. So don’t worry about making mistakes. There are no right or wrong answers; we just want to see how you use the app. If you ever feel that you are lost, frustrated or cannot complete the task, just let me know and we can move on to the next task. I will not be able to offer any suggestions or hints. However, there may be times when I’ll ask you to explain why you said or did something. Now, please review and sign the Participant Release Form* and return it to me when you are finished.

Before we begin, please make sure the iPhone is positioned in front of the camera. For each task, please do not start until all the instructions are given.

Task 1. Now we will begin with the first task. For this task, open the Purdue app and spend a moment looking at the home screen. Look at the icon labeled “Events.” Without opening this feature, tell me what you might expect to find in this area. Please begin. (Follow-up question: How would you expect the information to be organized or presented?)

Task 2. Now, please return to the application’s home screen. In this task, use the Events feature to find three events taking place in the next week - between today and April 22. As you locate each event, tell me the event’s name, date, and location. As you perform this task, please describe what you are doing aloud using as much detail as possible. You may begin.

Task 3. Return to the home screen. For the next task, wait until I tell you to begin. Use the Events feature to find an event taking place on April 22 (date one week from current date) or later. When you locate the event, tell me the event’s name, date, and location.

Task 4. Once again, return to the home screen and wait until I tell you to begin. In the next couple tasks, you will be asked to find a certain type of event. In this task, use the Events feature to find a musical or performing arts event taking place in the future. When you locate the event, tell me the event’s name, date, and location.

Task 5. Now return to the home screen and wait until I tell you to begin. In this task, use the Events feature to find an academic event or lecture taking place in the future. When you locate the event, tell me the event’s name, date, and location.

Task 6. Now return to the home screen and wait until I tell you to begin. For the final task, use the Events feature to find an event hosted by the Manufacturing Extension Partnership Center. When you locate the event, tell me its date and location.

Thank you for participating in this research study. We have a quick questionnaire for you to fill out. Then you are free to go.

Spreadsheet 1: Quantitative Data Results from Usability Test

Task Time (all times in seconds). Column groups, here and in the tables below: Coder 1, Coder 2, Coder 3, Average, Std Error, % Error; each group lists P1 P2 P3.

Task 2: 139 209 156 | 141 198 156 | 139 198 150 | 139.67 201.67 154 | 0.6667 3.6667 2 | 0.4773 1.8182 1.2987
Task 3: 40 56 74 | 41 56 75 | 39 55 71 | 40 55.667 73.333 | 0.5774 0.3333 1.2019 | 1.4434 0.5988 1.6389
Task 4: 69 63 71 | 69 63 74 | 68 60 70 | 68.667 62 71.667 | 0.3333 1 1.2019 | 0.4854 1.6129 1.677
Task 5: 67 75 106 | 65 75 109 | 65 72 104 | 65.667 74 106.33 | 0.6667 1 1.453 | 1.0152 1.3514 1.3664
Task 6: 47 60 243 | 46 62 240 | 45 59 240 | 46 60.333 241 | 0.5774 0.8819 1 | 1.2551 1.4617 0.4149
TOTAL: 362 463 650 | 362 454 654 | 356 444 635 | 360 453.67 646.33 | 2 5.4874 5.7831 | 0.5556 1.2096 0.8948
AVERAGE: 72.4 92.6 130 | 72.4 90.8 130.8 | 71.2 88.8 127 | 72 90.733 129.27 | 0.4 1.0975 1.1566 | 0.5556 1.2096 0.8948

Event View Time

Event 1: 37 55 45 | 38 57 44 | 37 56 45 | 37.333 56 44.667 | 0.3333 0.5774 0.3333 | 0.8929 1.031 0.7463
Event 2: 39 60 26 | 39 59 25 | 37 59 27 | 38.333 59.333 26 | 0.6667 0.3333 0.5774 | 1.7391 0.5618 2.2206
Event 3: 23 39 40 | 22 40 40 | 20 40 41 | 21.667 39.667 40.333 | 0.8819 0.3333 0.3333 | 4.0704 0.8403 0.8264
Event 4: 18 39 55 | 18 41 57 | 19 42 56 | 18.333 40.667 56 | 0.3333 0.8819 0.5774 | 1.8182 2.1686 1.031
Event 5: 26 33 41 | 26 43 42 | 27 41 42 | 26.333 39 41.667 | 0.3333 3.0551 0.3333 | 1.2658 7.8335 0.8
Event 6: 33 32 26 | 32 43 26 | 33 42 27 | 32.667 39 26.333 | 0.3333 3.5119 0.3333 | 1.0204 9.0048 1.2658
Event 7: 25 33 38 | 25 34 41 | 25 33 40 | 25 33.333 39.667 | 0 0.3333 0.8819 | 0 1 2.2233
Event 8 (P3 only): 27 | 25 | 25 | 25.667 | 0.6667 | 2.5974
Event 9 (P3 only): 53 | 54 | 52 | 53 | 0.5774 | 1.0893
TOTAL: 201 291 351 | 200 317 354 | 198 313 355 | 199.67 307 353.33 | 0.8678 6.5659 1.2019 | 0.4346 2.1387 0.3401
AVERAGE: 28.71 41.57 39 | 28.57 45.29 39.33 | 28.29 44.71 39.44 | 28.524 43.857 39.259 | 0.124 0.938 0.1335 | 0.4346 2.1387 0.3401

Overall average across event views: 37.2134 s; std error: 1.360322 s; % error: 3.655463

Events Feature Load Time

Open 1: 8 9 9 | 8 8 10 | 8 10 10 | 8 9 9.6667 | 0 0.5774 0.3333 | 0 6.415 3.4483
Open 2: 8 10 9 | 8 11 10 | 9 10 10 | 8.3333 10.333 9.6667 | 0.3333 0.3333 0.3333 | 4 3.2258 3.4483
Open 3: 8 7 7 | 8 7 7 | 8 8 8 | 8 7.3333 7.3333 | 0 0.3333 0.3333 | 0 4.5455 4.5455
Open 4: 8 8 8 | 7 7 9 | 8 8 9 | 7.6667 7.6667 8.6667 | 0.3333 0.3333 0.3333 | 4.3478 4.3478 3.8462
Open 5: 8 8 10 | 8 8 11 | 8 8 12 | 8 8 11 | 0 0 0.5774 | 0 0 5.2486
TOTAL: 40 42 43 | 39 41 47 | 41 44 49 | 40 42.333 46.333 | 0.5774 0.8819 1.7638 | 1.4434 2.0833 3.8068
AVERAGE: 8 8.4 8.6 | 7.8 8.2 9.4 | 8.2 8.8 9.8 | 8 8.4667 9.2667 | 0.1155 0.1764 0.3528 | 1.4434 2.0833 3.8068

All times reported in seconds. Overall average load time: 8.577778 s; std error: 0.178345 s; % error: 2.079152

Signal Strength: 3 5 5 | 3 5 5 | 3 5 5 | 3 5 5 | 0 0 0 | 0 0 0


Spreadsheet 2: Time Analysis

Time between event open/close (average view time, copied from Spreadsheet 1) and event load time (all times in seconds; each column group lists P1 P2 P3):

Event 1: 37.33 56 44.67 | 17 39 16
Event 2: 38.33 59.33 26 | 7 7 7
Event 3: 21.67 39.67 40.33 | 6 6 7
Event 4: 18.33 40.67 56 | 10 5 6
Event 5: 26.33 39 41.67 | 6 7 9
Event 6: 32.67 39 26.33 | 6 7 10
Event 7: 25 33.33 39.67 | 4 11 16
Event 8 (P3 only): 25.67 | 10
Event 9 (P3 only): 53 | 9
TOTAL: 199.7 307 353.3 | 56 82 90
Adjusted (total - load time): 143.7 225 263.3
AVERAGE view time: 28.52 43.86 39.26
Average event load time: 9.91304 s

Total task time, tasks 2-5 (P1 P2 P3; values copied from Spreadsheet 1):
Task 2: 139.667 201.67 154
Task 3: 40 55.667 73.333
Task 4: 68.6667 62 71.667
Task 5: 65.6667 74 106.33

Total waiting time, tasks 2-5 (P1 P2 P3):
Feature load: 32 34.333 35.333
Event load: 56 82 71
Total: 88 116.33 106.33

Time use distribution, all tasks (seconds for P1 P2 P3 | % of total for P1 P2 P3 | Avg % | Std Error | % Error):

Total: 360 453.7 646.3 | 100 100 100
Viewing: 143.7 255 263.3 | 39.92 56.2 40.74 | 45.6203 | 5.2975 | 11.6121
Scrolling: 120.3 74.4 246.7 | 33.42 16.4 38.17 | 29.3272 | 6.6082 | 22.5326
Waiting: 96 124.3 136.3 | 26.67 27.4 21.09 | 25.0525 | 1.9905 | 7.9453

Time use distribution, excluding task 6 (same layout):

Total: 314 393.3 405.3 | 100 100 100
Viewing: 143.7 225 203.7 | 45.75 57.2 50.25 | 51.0679 | 3.3306 | 6.5220
Scrolling: 82.33 52 95.33 | 26.22 13.22 23.52 | 20.987 | 3.9608 | 18.8728
Waiting: 88 116.3 106.3 | 28.03 29.58 26.23 | 27.9451 | 0.9658 | 3.4561

[Chart: Percent of time spent browsing, viewing, and waiting (tasks 2-5). Viewing particular event: 51.1%; Browsing through events list: 21.0%; Waiting for app to load: 27.9%]

[Chart: Percent of time spent browsing, viewing, and waiting (tasks 2-6). Viewing particular event: 45.6%; Browsing through events list: 29.3%; Waiting for app to load: 25.0%]

Spreadsheet 3: Questionnaire Responses

Questions:
1. Please describe your general impressions of the Events feature. What did you like or not like about it, and why?
2. Describe some of the difficulties you encountered, if any. Why did you find these to be challenging?
3. Based on your experience using the application, is it something you would continue to use on your own to learn about university events?
4. If not, where would you go instead? How does this source compare with the Purdue iPhone app?

P1

1. I did not find the events feature very difficult to use; however, I believe there are easier ways to make it more user friendly. I would like to see it set up potentially in a calendar format or even by event categories (ie. performances, speakers etc). It would also be easier if there was a search bar so I could navigate more quickly through the list of events.

2. Finding specific events was more difficult. Even though I ended up finding a Purdue Republicans event quickly it would normally take me longer as it is a very specific event and the list is not organized in any way that I could tell. I found them challenging because there was no sense of organization. It was like looking through a database with no search terms. For example, I want to find something but I don't know where to look for it. Therefore, it becomes time-consuming and could be frustrating.

3. As it reroutes me to the Purdue events website anyway, I would find it easier to just use my computer to look up these events as they are organized on the website. I would probably only use this application if I forgot the time and place of an event and did not have the time to sit down at a computer (ie. on the go).

4. As mentioned previously, I would probably just go to the main Purdue event webpage. This source is more organized and clearer with categories provided for the different events. It is also organized in terms of date in each section. Therefore, it is more user friendly.

P2

I liked the ability to simply scroll through events. There was no need to think about how to navigate a calendar. On the other hand, I assumed that the events would be ordered from now to future and that was not the case. That made it a little

more difficult to find something within a specific date range. However, if I have

basic parameters with which to search for events, a way to narrow down the list would be nice. Basically, I would have liked a search bar or a filter system.

One of the difficulties was that when looking for an event hosted by a specific group, there was no way to tell that from the list unless it was in the title or brief description you could see. If the info is not there, it would be missed, and I was worried about that.

Another was that the dates could not be seen from the main list unless they happened to be in the brief description. So if I wanted an event on a specific day or within a specific time frame and none of them had the date in the description, I had to open each one and read the full description to find the date. In that

respect, the calendar would have been easier.

Honestly, I rarely go to events, but if I did and I had access to the app, I think I might use it. Part of it is the fun of using the app with the cool features, but part would be the convenience of every event listed in one place instead of on a bunch of different websites which I would have to search through.

If I knew generally what I was looking for, like PMU events, I would google them or go straight to their web page if I knew it.

If I just wanted to browse Purdue events I am not sure where I would go. I don't know of another source that has all the events.

The app is definitely preferable to not knowing where to go to browse events. However, I think going to the source when I know what I am looking for would be preferable to using the app and having to look through everything to find a small group of events that fit my …

P3

I was looking for more specific headers in the list to indicate what the events were. I expect that they are organized by date, but they aren't, and the dates, locations, and sponsors aren't listed in the main list, which makes it difficult to find things; I don't want to click on multiple articles to find the information. The article about the event should be a secondary concern to the time, place, and other relevant information. I would love a day-by-day calendar so I can plan into the future.

I had to scan the preview to find what the events were because the titles weren't specific enough. The dates should be part of the title, or else I will just go online to find the information I am looking for. The articles are too long to read quickly; also, the font is small and you have to zoom in every time you want information.

Not at all. I get more information from fliers around campus than the app.

Again, fliers. They are everywhere, eye-catching, and specific. I don't have to stand there and read the entire thing to know if I am interested. I know almost immediately what the event is. I also search online a lot for PMU events or go to the school's website if I am looking for a specific area, like engineering events.


Spreadsheet 4

Columns: Things participants observe about Events feature | Actions participant took | Participant expectations or assumptions | Deficiencies or suggestions for improvement

9-3 titles weren't specific enough
9-3 zoom in every time
8-2 assumed that the events would be ordered from now to future and that was not the case
8-1 It would also be easier if there was a search bar so I could navigate more quickly
9-3 the font is small and you have to zoom in every time you want information
9-3 scan the preview
8-3 expect that they are organized by date, but they aren't
8-2 would have liked a search bar or a filter system
9-3 articles are too long to read quickly
9-2 had to open each one and read the full description
8-3 was looking for more specific headers in the list to indicate what the events were
8-1 would like to see it set up in a calendar format
9-2 the dates could not be seen from the main list unless they happened to be in the brief description
8-3 don't want to click on multiple articles to find the information
11-1 more user friendly
9-2 calendar would have been easier
8-3 dates, locations and sponsors aren't listed in the main list
11-2 look through everything
9-2 wanted an event on a specific day or within a specific time frame
9-3 the dates should be part of the title
9-1 not organized in any way that I could tell
11-1 more organized and clearer with categories
8-1 I would like to see it set up... by event categories
10-1 reroutes me to the Purdue events website
8-3 so I can plan into the future
8-3 I would love a day-by-day calendar
9-1 It was like looking through a database with no search terms
11-1 organized in terms of date
8-3 The article about the event should be a secondary concern to the time, place, and other relevant information
8-2 a little more difficult to find something within a specific date range
10-1 look up these events as they are organized on the website
8-2 a way to narrow down the list would be nice
8-1 I did not find the events feature very difficult to use
8-2 would have liked a search bar or a filter system
11-1 It [the Purdue website] is organized in terms of date in each section. Therefore, it is more user friendly.
8-1 easier if there was a search bar
8-2 way to narrow down the list would be nice
11-1 categories provided for the different events
11-2 I don't know of another source that has all the events
8-1 navigate more quickly
9-3 titles weren't specific enough
8-2 the ability to simply scroll through events
8-3 dates, locations and sponsors aren't listed in the main list
9-1 I want to find something but I don't know where to look for it
9-1 no sense of organization
10-2 the convenience of every event listed in one place instead of on a bunch of different websites
8-3 I don't want to click on multiple articles to find the information

Categorization of participant responses to post-test questionnaire

Bold indicates items that were categorized the same way by all 3 coders. Italics indicates items that were categorized the same way by 2 coders. Red / Blue / Green indicates similar responses given by more than one participant.

Numbers indicate question number and participant number. For example, 9-2 represents a response by Participant 2 to question 9.