EHR Usability Test Report of AllMeds Specialty EHR Version 11
Report based on ISO/IEC 25062:2006 Common Industry Format for Usability Test Reports
AllMeds Specialty EHR Version 11
Date of Usability Test: 10/15/2017
Date of Report: December 6th, 2017
Report Prepared By: Stephen Nichols
Table of Contents
1 EXECUTIVE SUMMARY
2 INTRODUCTION
3 METHOD
3.1 PARTICIPANTS
3.2 STUDY DESIGN
3.3 TASKS
3.4 PROCEDURE
3.5 TEST LOCATION
3.6 TEST ENVIRONMENT
3.7 TEST FORMS AND TOOLS
3.8 PARTICIPANT INSTRUCTIONS
3.9 USABILITY METRICS
4 RESULTS
4.1 DATA ANALYSIS AND REPORTING
4.2 DISCUSSION OF THE FINDINGS
5 APPENDICES
5.1 APPENDIX 1: PARTICIPANT DEMOGRAPHICS AND INFORMED CONSENT FORM
5.2 APPENDIX 2: EXAMPLE MODERATOR'S GUIDE
5.3 APPENDIX 3: SYSTEM USABILITY SCALE QUESTIONNAIRE
1. EXECUTIVE SUMMARY
A usability test of AllMeds Specialty EHR version 11 was conducted on 10/15/2017 in Oak Ridge, TN by the
AllMeds QA and Informatics team. The purpose of this test was to test and validate the usability of the current user
interface, and provide evidence of usability in the EHR Under Test (EHRUT). During the usability test, 10
healthcare professionals matching the target demographic criteria served as participants and used the EHRUT
in simulated, but representative tasks.
This study collected performance data on 7 measures typically conducted on an EHR:
• 170.315 (a)(2) Computerized provider order entry — laboratory
• 170.315 (a)(3) Computerized provider order entry — diagnostic
• 170.315 (a)(5) Demographics
• 170.315 (a)(6) Problem list
• 170.315 (a)(9) Clinical decision support
• 170.315 (a)(14) Implantable device list
• 170.315 (b)(2) Clinical information reconciliation and incorporation
The EHRUT uses the DrFirst “Rcopia” application for all medication and allergy measures. The
current Rcopia usability study for these measures can be found on the Certified Health IT Product List website,
https://chpl.healthit.gov. The following types of data were collected for each participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant's verbalizations (comments)
• Participant's satisfaction ratings of the system
All participant data was de-identified — no correspondence could be made from the identity of the
participant to the data collected. Following the conclusion of the testing, participants were asked to
complete a post-test questionnaire and were compensated with $0 for their time. Various
recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes
Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability
of the EHRUT.
The results of the study indicated that the EHRUT was satisfactory with regards to effectiveness and efficiency and that the participants were very satisfied with the system. In addition to the performance data, the following qualitative observations were made:
• Major Findings
o The look and feel of the user interface generally provides an intuitive experience for most users.
o Least experienced users have the most difficulty in navigating the interface and workflows.
o Most of the errors or inability to complete tasks were due to a non-software technical issue in which the current version of the application and databases had not been pushed to all servers.
• Areas for Improvement
o Provide more context-sensitive help and mouse-over information on major interface objects.
o Continue to review the application for interface consistency issues.
2. INTRODUCTION
The EHRUT tested for this study was AllMeds Specialty EHR version 11, designed to present medical
information to healthcare providers in ambulatory specialty practices. The usability testing attempted
to represent realistic exercises and conditions.
The purpose of this study was to test and validate the usability of the current user interface, and
provide evidence of usability in the EHRUT. To this end, measures of effectiveness, efficiency and user
satisfaction, such as ease of use and time taken, were captured during the usability testing.
3. METHOD
3.1 PARTICIPANTS
A total of 10 participants were tested on the EHRUT. Participants in the test were EHR Trainers.
Participants were not compensated for their time. In addition, participants had no direct connection
to the development of the EHRUT. Participants were given the opportunity to have the same
orientation and level of training as the actual end users would have received. For the test purposes,
end-user characteristics were identified and used to select test participants.
Recruited participants had a mix of backgrounds and demographic characteristics conforming to the
recruitment screener. The following is a table of participants by characteristics, including
demographics, professional experience, computing experience and user needs for assistive
technology. Participant names were replaced with Participant IDs so that an individual's data cannot
be tied back to individual identities.
All experience values are in months.
Part ID | Gender | Age   | Education              | Occupation or Role  | Computer Experience | Professional Experience | Product Experience | Assistive Technology Needs
User01  | Female | 50-59 | Associate Degree       | Clinical Assistant  | 345 | 396 | 181 | None
User02  | Male   | 40-49 | Master's Degree        | MD                  | 60  | 396 | 24  | None
User03  | Female | 30-39 | Bachelor's Degree      | Physician Assistant | 300 | 320 | 95  | None
User04  | Female | 40-49 | Associate Degree       | RN                  | 180 | 240 | 72  | None
User05  | Female | 50-59 | Bachelor's Degree      | RN                  | 280 | 340 | 180 | None
User06  | Female | 30-39 | Bachelor's Degree      | Physician Assistant | 168 | 240 | 36  | None
User07  | Female | 50-59 | High School Graduate   | Clinical Assistant  | 240 | 480 | 66  | None
User08  | Male   | 60-69 | Master's Degree        | MD                  | 480 | 600 | 100 | None
User09  | Male   | 20-29 | High School Graduate   | Clinical Assistant  | 72  | 240 | 5   | None
User10  | Female | 40-49 | No High School Diploma | Clinical Assistant  | 420 | 96  | 5   | None
10 participants (matching the demographics in the section on Participants) were recruited and 10
participated in the usability test. None of the participants failed to show for the study. Participants
were scheduled for 60-minute sessions, with 15 minutes in between each session for debrief by the
administrator and to reset systems to proper test conditions. A spreadsheet was used to keep track of
the participant schedule, and included each participant's demographic characteristics.
3.2 STUDY DESIGN
Overall, the objective of this test was to uncover areas where the application performed well (that is,
effectively, efficiently, and with satisfaction) and areas where the application failed to meet the needs
of the participants. The data from this test may serve as a baseline for future tests with an updated
version of the same EHR and/or comparison with other EHRs provided the same tasks are used. In
short, this testing serves both as a means to record or benchmark current usability and as a way to
identify areas where improvements must be made.
During the usability test, participants interacted with the EHR. Each participant used the system in the
same location, and was provided with the same instructions. The system was evaluated for
effectiveness, efficiency and satisfaction as defined by measures collected and analyzed for each
participant:
• Number of tasks successfully completed within the allotted time without assistance
• Time to complete the tasks
• Number and types of errors
• Path deviations
• Participant's verbalizations (comments)
• Participant's satisfaction ratings of the system
Additional information about the various measures can be found in Section 3.9 on Usability Metrics.
3.3 TASKS
A number of tasks were constructed that would be realistic and representative of the kinds of activities
a user might do with this EHR, including:
• 170.315(a)(2) CPOE Labs
• 170.315(a)(3) CPOE Imaging
• 170.315(a)(5) Demographics
• 170.315(a)(6) Problem List
• 170.315(a)(9) Clinical Decision Support
• 170.315(a)(14) Implantable Devices
• 170.315(b)(2) Clinical Information Reconciliation
Tasks were selected based on their frequency of use, criticality of function, and those that were
required for 2015 Edition certification.
3.4 PROCEDURES
Upon arrival, participants were greeted; their identity was verified and matched with a name on
the participant schedule. Participants were then assigned a participant ID.
To ensure that the test ran smoothly, two staff members participated in this test: the usability
administrator and the data logger. The usability testing staff conducting the test were
experienced usability practitioners.
The Administrator moderated the session including administering instructions and tasks. The
administrator also monitored task times, obtained post-task rating data, and took notes on the
participant comments. A second person served as the data logger and took notes on task
success, path deviations, number and type of errors, and comments.
Participants were instructed to perform the tasks (see specific instructions below):
• As quickly as possible making as few errors and deviations as possible.
• Without assistance; administrators were allowed to give immaterial guidance and
clarification on tasks, but not instructions on use.
• Without using a think aloud technique.
For each task, the participants were given a written copy of the task. Task timing began once
the administrator finished reading the question. The task time was stopped once the
participant indicated they had successfully completed the task. Scoring is discussed below in
Section 3.9.
Following the session, the administrator gave the participant the post-test questionnaire (e.g.,
the System Usability Scale, see Appendix 3) and thanked each individual for their participation.
Participants' demographic information, task success rate, time on task, errors, deviations, verbal
responses, and post-test questionnaire were recorded into a spreadsheet.
Participants were thanked for their time.
3.5 TEST LOCATION
The test facility included a quiet testing room with a table and computers for participants. Only the
participants, data logger, and administrator were in the room. To ensure the environment was
comfortable for the users, noise levels were kept to a minimum with the ambient temperature within
a normal range. All of the safety instructions and evacuation procedures were valid and visible to the
participants.
3.6 TEST ENVIRONMENT
The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing
was conducted in an office location. For testing, the computers used were notebooks running Windows
7 or Windows 10. The participants used a mouse and keyboard when interacting with the EHRUT.
The EHRUT used a monitor screen with a minimum resolution of 1024 x 768, a minimum
screen display size of 15 inches, and 32-bit color settings. The application was set up by the test
laboratory according to the vendor's documentation describing the system set-up and preparation.
The application itself was running on a remote desktop virtual machine using a test database over a LAN
connection. Technically, the system performance of 3-6 seconds was representative of what actual
users would experience in a field implementation. Additionally, participants were instructed not to
change any of the default system settings (such as control of font size).
3.7 TEST FORMS AND TOOLS
During the usability test, various documents and instruments were used, including:
1. Informed Consent
2. Moderator's Guide
3. Post-test Questionnaire
The participant's interaction with the EHRUT was observed and manually noted.
3.8 PARTICIPANT INSTRUCTIONS
The administrator read the following instructions aloud to each participant:
Thank you for participating in this study. Your input is very important. Our session today will
last about 60 minutes. During that time you will use an instance of an electronic health
record. I will ask you to complete a few tasks using this system and answer some questions.
You should complete the tasks as quickly as possible, making as few errors as possible. Please
try to complete the tasks on your own, following the instructions very closely. Please note
that we are not testing you; we are testing the system. Therefore, if you have difficulty, all this
means is that something needs to be improved in the system. I will be here in case you need
specific help, but I am not able to instruct you or provide help in how to use the application.
Overall, we are interested in how easy (or how difficult) this system is to use, what in it would
be useful to you, and how we could improve it. I did not have any involvement in its creation,
so please be honest with your opinions. All of the information that you provide will be kept
confidential and your name will not be associated with your comments at any time. Should
you feel it necessary you are able to withdraw at any time during the testing.
Following the procedural instructions, participants were shown the EHR and, as their first task,
were given time to explore the system and make comments. Once this task was
complete, the administrator gave the following instructions:
For each task, I will read the description to you and say "Begin." At that point, please perform
the task and say "Done" once you believe you have successfully completed the task. I would
like to request that you not talk aloud or verbalize while you are doing the tasks. I will ask you
your impressions about the task once you are done.
Participants were then given 7 tasks to complete.
3.9 USABILITY METRICS
According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic
Health Records, EHRs should support a process that provides a high level of usability for all users.
The goal is for users to interact with the system effectively, efficiently, and with an acceptable
level of satisfaction. To this end, metrics for effectiveness, efficiency and user satisfaction were
captured during the usability testing.
The goals of the test were to assess:
1. Effectiveness of AllMeds Specialty EHR version 11 by measuring participant success rates
2. Efficiency of AllMeds Specialty EHR version 11 by measuring the average task time and
path deviations
3. Satisfaction with AllMeds Specialty EHR version 11 by measuring ease of use ratings
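The three goals above can be expressed as simple computations. As an illustrative sketch only (the study's actual analysis spreadsheet is not part of this report, and the field names below are hypothetical), success rate, average task time, and mean ease-of-use rating could be derived from per-participant observations like this:

```python
from statistics import mean, pstdev

# Hypothetical per-participant observations for a single task. The field
# names are illustrative; the study's actual spreadsheet layout is not
# described in this report.
observations = [
    {"success": True,  "time_s": 180, "rating": 4},
    {"success": True,  "time_s": 204, "rating": 4},
    {"success": False, "time_s": 300, "rating": 2},
]

def task_metrics(obs):
    """Compute effectiveness, efficiency, and satisfaction for one task."""
    # Effectiveness: share of participants completing the task.
    success_rate = sum(o["success"] for o in obs) / len(obs)
    # Efficiency: only successfully completed tasks contribute task times,
    # mirroring the task-time rule described in Section 3.9.
    times = [o["time_s"] for o in obs if o["success"]]
    # Satisfaction: mean of the 1-5 post-task ease-of-use ratings.
    return {
        "success_rate": success_rate,
        "mean_time_s": mean(times),
        "time_sd": pstdev(times),
        "mean_rating": mean(o["rating"] for o in obs),
    }

print(task_metrics(observations))
```

This mirrors the reporting shape of Table 4.1 (success percentage, mean task time, mean rating), though the report does not specify whether the standard deviation was computed as a population or sample statistic.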
4. RESULTS
4.1 DATA ANALYSIS AND REPORTING
The results of the usability test were calculated according to the methods specified in the
Usability Metrics section above. Participants who failed to follow session and task instructions
had their data excluded from the analyses. The usability testing results
for the EHRUT are detailed below (see Table 4.1). The results should be seen in light of the
objectives and goals outlined in Section 3.2 Study Design. The data should yield actionable
results that, if corrected, yield material, positive impact on user performance.
Summary of Performance and Rating Data collected on EHRUT – Table 4.1
Task/Measure | Task Success | Path Deviations (Observed/Optimal) | Task Time Mean (seconds) | Deviations | Errors Mean | Task Rating (5=Easy)
Add, Change, Access a Lab Order (a.2) | 100% | 29/29 | 192 | 0 | 0 | 4.07
Record, Change, Access a Diagnostic Order (a.3) | 100% | 24/24 | 236 | 0 | 0 | 3.87
Add, Change, Access Demographic Data (a.5) | 100% | 10/10 | 322 | 0 | .2 | 3.90
Create, Change, and Delete Problem (a.6) | 70% | 7/7 | 170 | 0 | .9 | 2.50
Create, Trigger Alert, Delete Existing CDS (a.9) | 100% | 29/29 | 382 | 0 | 0 | 3.40
Lookup, Add, Edit/Deactivate, Run Report on Implantable Devices (a.14) | 90% | 22/22 | 346 | 0 | 2.5 | 3.20
Assign, Send, Reconcile CCD/Clinical Information (b.2) | 100% | | 800 | 0 | 0 | 4.1
The results from the SUS (System Usability Scale) scored the subjective satisfaction with the
system based on performance with these tasks to be: 3.57. Broadly interpreted, scores under 3
represent systems with poor usability; scores over 4 would be considered above average.
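For reference, the standard SUS instrument yields a 0-100 score rather than the 1-5 figure reported above, which appears to be an averaged-rating variant. A minimal sketch of conventional SUS scoring, using an invented response set rather than study data:

```python
def sus_score(responses):
    """Conventional SUS scoring: ten items rated 1-5; odd-numbered items
    contribute (r - 1), even-numbered items contribute (5 - r); the sum
    is multiplied by 2.5 to yield a 0-100 score."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical single-participant response set (not study data).
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

On the 0-100 scale, a score of 68 is commonly treated as average, which is why a report using a 1-5 mean must state its own interpretation thresholds, as this one does.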
4.2 DISCUSSION OF THE FINDINGS
EFFECTIVENESS
o The users were able to perform the tasks in an effective manner.
o Most of the errors or inability to complete tasks were due to a non-software technical issue in which the current version of the application and databases had not been pushed to all servers.
EFFICIENCY
o Users were generally able to perform tasks in an efficient manner.
SATISFACTION
o Most users were satisfied with ease of use in performing the tasks.
MAJOR FINDINGS
o The look and feel of the user interface generally provides an intuitive experience for most users.
o Least experienced users have the most difficulty in navigating the interface and workflows.
AREAS FOR IMPROVEMENT
o Provide more context-sensitive help and mouse-over information on major interface objects.
o Continue to review the application for interface consistency issues.
Name of Senior Company Representative: Lucy Stephenson, MSN, RN
Title of Senior Company Representative: VP Customer Relations/Informatics
Signature of Senior Company Representative:
Date Signed: 12/11/2017
5. APPENDICES
The following appendices include supplemental data for this usability test report. Following is a list of
the appendices provided:
1. Participant demographics and Informed Consent Form
2. Example Moderator’s Guide
3. System Usability Scale Questionnaire
5.1 APPENDIX 1: PARTICIPANT DEMOGRAPHICS AND INFORMED CONSENT FORM
Following is a high-level overview of the participants in this study.
Gender
Men 3
Women 7
Total (participants) 10
Occupation/Role
RN/BSN 2
Physician 2
Physician Assistant 2
Clinical Assistant 4
Years of Experience
Mean years of experience 6.4
Total (participants) 10
Sample Form AllMeds EHR Usability Study
User Consent and Demographics Form
AllMeds would like to thank you for your participation in this study. The purpose of the study is to evaluate the usability of the AllMeds Specialty EHR. Your participation in this study will include performing specific tasks within the EHR; and completing a short survey following the study. The study should take approximately 60 minutes. The information collected by AllMeds during the study is for research purposes only.
NOTE: Participant Names will NOT be used in any reports. All information will be referenced to the Tester ID ONLY.
By signing below, I agree to participate in the study.
Effectiveness: Task Failures
If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Fail." No task times were taken for failed attempts. The total number of errors was calculated for each task and divided by the total number of times that task was attempted. Results are presented as the average error rate. Note: Not all deviations are counted as errors.

Effectiveness: Prompted Successes
Because some tasks are dependent upon the successful completion of previous tasks, participants may receive a limited number of "prompts" to help prepare the system data for the pre-requisites for subsequent tasks. When a participant was able to complete the data entry on a task with 3 or fewer prompts, the task was counted as an "Assisted" completion. No task times were recorded for Assisted completions.

Efficiency: Task Deviations
The participant's path (i.e., steps) through the application was recorded. Deviations occur if, for example, the participant navigated to an incorrect screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control.

Efficiency: Task Time
Each task was timed from the administrator's prompt "Begin" until the participant said, "Done." If the participant failed to say, "Done," timing stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task.

Satisfaction: Ease of Use Ratings / System Satisfaction
Participants' subjective impression of the ease of use of the application was measured by administering both a single post-task question as well as two post-session questionnaires. After each task, the participant determined on a scale of 1 to 5 their subjective satisfaction with performance on the task. These data are averaged across participants. To measure participants' confidence in and likeability of the EHR overall, the testing team administered electronic versions of the System Usability Scale (SUS) and the Computer System Usability Questionnaire (CSUQ). See the SUS questionnaire as Appendix D, and the CSUQ as Appendix E.
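The error-rate and path-deviation rules above can be made concrete in code. The helper names and step representation below are hypothetical illustrations of those rules, not the study's actual tooling; in particular, the positional path comparison is one possible interpretation of "deviation," not the documented method:

```python
def average_error_rate(errors_per_attempt):
    """Total errors across all attempts of one task divided by the number
    of attempts, per the task-failure rule above."""
    return sum(errors_per_attempt) / len(errors_per_attempt)

def path_deviations(observed, optimal):
    """Count observed steps that depart from the optimal path using a
    simple positional comparison; extra or missing steps also count.
    This is an illustrative approximation, not the study's method."""
    mismatches = sum(1 for o, e in zip(observed, optimal) if o != e)
    return mismatches + abs(len(observed) - len(optimal))

# Hypothetical values and step names for illustration only.
print(average_error_rate([0, 0, 1, 0, 2]))                  # → 0.6
print(path_deviations(["chart", "orders", "labs", "sign"],
                      ["chart", "orders", "labs", "sign"]))  # → 0
```

An observed path identical to the optimal path yields zero deviations, consistent with the Observed/Optimal pairs in Table 4.1 where the two counts match.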
Appendix B: Informed Consent Form
The Usability People would like to thank you for participating in this study. The purpose of this study is to evaluate an electronic health records system. If you decide to participate, you will be asked to perform several tasks using the prototype and give your feedback. The study will last about 45 minutes.
Agreement
I understand and agree that, as a voluntary participant in the present study conducted by The Usability People, I am free to withdraw consent or discontinue participation at any time. I understand and agree to participate in the study conducted and recorded by The Usability People.
I understand and consent to the use and release of the video recording by The Usability People. I understand that the information and video is for research purposes only and that my name and image will not be used for any purpose other than research. I relinquish any rights to the video and understand the video recording may be copied and used by The Usability People without further permission.
I understand and agree that the purpose of this study is to make software applications more useful and usable in the future.
I understand and agree that the data collected from this study may be shared outside of The Usability People. I understand and agree that data confidentiality is assured, because only de-identified data – i.e., identification numbers, not names – will be used in analysis and reporting of the results.
I agree to immediately raise any concerns or areas of discomfort with the study administrator. I understand that I can leave at any time.
Please check one of the following:
____ YES, I have read the above statement and agree to be a participant.
____ NO, I choose not to participate in this study.
Signature: _____________________________________ Date _____________________
Appendix E: Computer System Usability Questionnaire
Please provide your impression of the usability of the system by answering each of the questions below. Each item is rated on the scale: Strongly Disagree 1 2 3 4 5 6 7 NA Strongly Agree.
1. Overall, I am satisfied with how easy it is to use this system
2. It was simple to use this system
3. I can effectively complete my work using this system
4. I am able to complete my work quickly using this system
5. I am able to efficiently complete my work using this system
6. I feel comfortable using this system
7. It was easy to learn to use this system
8. I believe I became productive quickly using this system
9. The system gives error messages that clearly tell me how to fix problems
10. Whenever I make a mistake using the system, I recover easily and quickly
11. The information (such as online help, on-screen messages, and other documentation) provided with this system is clear
12. It is easy to find the information I needed
13. The information provided for the system is easy to understand
14. The information is effective in helping me complete the tasks and scenarios
15. The organization of information on the system screens is clear
16. The interface of this system is pleasant
17. I like using the interface of this system
18. This system has all the functions and capabilities I expect it to have
19. Overall, I am satisfied with this system