User Research Hacks
UX Lisbon | May 16, 2012
Gene Smith | @gsmith
#uxhacks
The Problem
• Client wants major insights into their customers and design problems
• Client has a small budget, limited time, or no executive buy-in to conduct research
hack — kludgy but effective way of solving a problem
user research hack — kludgy but effective way of getting some data that will give you insight into users’ needs so you can design something halfway decent
Kludgy but effective
• Simple tools
• Creative experimental design, or just some creative thinking about how to improve our results
• Spreadsheets, pivot tables & other post-study analysis tools
This Presentation
• Three user research hacks:
  1. Determining users’ content priorities
  2. A/B testing mock-ups
  3. Getting better interview responses
• Goal: give you some ideas for your next low-budget, high-impact user research project
#1. Content Priorities
The client: regional financial services company with 2,500 employees and 100+ branches
The project: a ground-up Intranet redesign
The problem: no mandate to conduct research with front-line staff
Our Solution
• Content prioritization card sort
• Include questions about region and role to segment the data
• Distribute the surveys directly to branch managers and use the tell-two-friends method
Why Content Prioritization?
• We needed to understand actual behaviour
• This survey design let us extract a lot of insights from one data set:
  • What content do people need daily?
  • How does that differ by job function?
  • Are there significant differences between different locations or job functions?
Ranking Content Use
• Hunch: front-line staff relied on Intranet heavily for business-critical information. How could we show that with the data we had?
• We segmented respondents into two groups: all front-line staff and corporate staff
• We looked at the % of content each group used daily and ranked their responses
Front-line staff: > 95% using two Intranet resources daily
Corporate staff: 55% using two Intranet resources daily
Front-line staff: > 80% using 12 Intranet resources daily
Corporate staff: < 30% using 12 Intranet resources daily
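The segmentation-and-ranking step above can be sketched in a few lines. This is a minimal illustration, not the study's actual data or tooling (the original analysis used spreadsheets and pivot tables); the group names, resource names, and responses below are made up.

```python
from collections import defaultdict

# Hypothetical survey responses: (respondent group, resource, uses it daily?)
responses = [
    ("front-line", "rate sheets", True),
    ("front-line", "rate sheets", True),
    ("front-line", "hr forms", False),
    ("corporate", "rate sheets", True),
    ("corporate", "rate sheets", False),
    ("corporate", "hr forms", False),
]

def daily_use_pct(responses):
    """Percentage of each group's answers marking a resource as used daily."""
    counts = defaultdict(lambda: [0, 0])  # (group, resource) -> [daily, total]
    for group, resource, daily in responses:
        counts[(group, resource)][1] += 1
        if daily:
            counts[(group, resource)][0] += 1
    return {key: 100.0 * d / t for key, (d, t) in counts.items()}

pct = daily_use_pct(responses)

# Rank one segment's resources by daily-use percentage, highest first
front_line_ranked = sorted(
    ((res, p) for (grp, res), p in pct.items() if grp == "front-line"),
    key=lambda pair: -pair[1],
)
```

Comparing the ranked lists for each segment is what surfaces the gap the hunch predicted: the resources front-line staff use daily at very high rates versus the much lower corporate rates.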
Results
• Much better understanding of information needs for major job functions
• Designed meaningful role-based customization features
• Confirmed that front-line staff relied on the Intranet more heavily
• Secured additional participation from front-line staff
#2. A/B Testing Mock-ups
The client: large utility company
The project: assist them with usability evaluation for their website
The problem: how do we help them choose between mock-ups?
Experimental Design: A Digression
• In general, good experiments will meet these two criteria:
  • Random sampling of a population
  • Random assignment to one or more experimental groups
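Random assignment is the easier of the two criteria to get right, and in the study it was handled by SurveyMonkey's random assignment feature. As a sketch only, the same idea in code is just an unbiased choice per participant (the variant names here are illustrative, not the actual survey URLs):

```python
import random

# Hypothetical survey variants; in the study these were separate
# ChalkMark surveys with identical questions but different designs.
VARIANTS = ["design_1", "design_2"]

def assign(rng):
    """Assign one participant to an experimental group with equal probability."""
    return rng.choice(VARIANTS)

rng = random.Random(42)  # seeded so the sketch is reproducible
assignments = {pid: assign(rng) for pid in range(100)}
```

The key property is that assignment is independent of anything we know about the participant, so differences between the groups' responses can be attributed to the designs rather than to who happened to see them.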
Our Solution
• We created two ChalkMark surveys with identical questions but different designs
• We used SurveyMonkey’s random assignment feature to randomly direct participants to one of the surveys
• We compared responses to each question to see what was different
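The comparison step can be as simple as a per-question gap in success rates between the two variants. This is an assumed, simplified version of the analysis; the question names and counts below are invented for illustration:

```python
# Hypothetical per-question results: (successes, respondents) per variant,
# e.g. correct first clicks in a ChalkMark task.
results = {
    "find_billing": {"design_1": (38, 50), "design_2": (45, 50)},
    "find_outages": {"design_1": (30, 50), "design_2": (29, 50)},
}

def success_gap(question):
    """Difference in success rate; positive means design 2 did better."""
    s1, n1 = results[question]["design_1"]
    s2, n2 = results[question]["design_2"]
    return s2 / n2 - s1 / n1

gaps = {q: success_gap(q) for q in results}
```

With real data you would also want a significance test before declaring a winner, which is part of why the lesson below about testing significantly different designs matters: small design differences produce gaps too small to trust.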
[Diagram] SurveyMonkey random assignment → ChalkMark Design #1, ChalkMark Design #2
[Diagram] SurveyMonkey random assignment → ChalkMark Design #1, ChalkMark Design #2, ChalkMark Design #3
[Diagram] SurveyMonkey random assignment → UserTesting.com current website, UserTesting.com prototype
Lessons Learned
• Test significantly different designs
• Limits to chaining tools together:
  • Integration with panel management/recruiting software
  • Tracking participants for incentives
• Have a clear hypothesis you’re trying to prove/disprove
#3. Boosting Interview Responses
The client: regional government
The project: understand how citizens access and experience government services
The problem: how do we get people to talk about something abstract like services?
Our Solution
• Emotional response cards
• We used a set of 50 cards with emotional adjectives on them to help elicit in-depth responses from participants
• Used physical cards in 20 in-home interviews and a PDF file in 20 telephone interviews
How They Worked
• We started with Microsoft’s Product Reaction Cards, which include a list of 118 product characteristics
• We reduced the number of cards to 50 and tried to include opposing characteristics (similar to BERT)
• At the end of the interview we handed participants the cards and asked them to pick the ones that described the experiences they had just talked about
http://www.uxmag.com/articles/organized-approach-to-emotional-response-testing
http://www.uxforthemasses.com/bert/
The Results
• People remember emotions
• Few experiences are all positive or all negative
• Props help people express themselves
• Emotions keep people honest
• Emotions lead to better stories
Conclusion
• These are some ways we’re pushing our user research practice
• We’re able to get a lot of value from simple tools and creative thinking
• Please share your own ideas at the break, on Twitter (#uxhacks) or on your blog