
ONC HIT Certification Program
Test Results Summary for 2014 Edition EHR Certification
17-3200-R-0021-PRA V1.0, January 17, 2018

Part 1: Product and Developer Information

1.1 Certified Product Information
Product Name: CGM Enterprise EHR™
Product Version: 10.1.2
Domain: Ambulatory
Test Type: Modular EHR

1.2 Developer/Vendor Information
Developer/Vendor Name: CompuGroup Medical
Developer/Vendor Contact: Chris Lohl
Address: 3300 N. Central Ave, Suite 2100, Phoenix, AZ 85012
Website: www.cgmus.com/us
Email: [email protected]
Phone: (888) 627-7633

Part 2: ONC-Authorized Certification Body Information

2.1 ONC-Authorized Certification Body Information
ONC-ACB Name: InfoGard Laboratories, Inc.
ONC-ACB Contact: Adam Hardcastle
Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: [email protected]
Phone: (805) 783-0810

This test results summary is approved for public release by the following ONC-Authorized Certification Body Representative:

Adam Hardcastle, EHR Certification Body
ONC-ACB Authorized Representative Function/Title
1/17/2018
Signature and Date

©2018 InfoGard. May be reproduced only in its original entirety, without revision 1


2.2 Gap Certification
The following identifies criterion or criteria certified via gap certification (§170.314):

        No gap certification

(a)(1)   (a)(19)  (d)(6)   (h)(1)
(a)(6)   (a)(20)  (d)(8)   (h)(2)
(a)(7)   (b)(5)*  (d)(9)   (h)(3)
(a)(17)  (d)(1)   (f)(1)
(a)(18)  (d)(5)   (f)(7)**

*Gap certification allowed for Inpatient setting only
**Gap certification allowed for Ambulatory setting only

2.3 Inherited Certification
The following identifies criterion or criteria certified via inherited certification (§170.314):

        No inherited certification

      (a)(1)  (a)(2)  (a)(3)  (a)(4)  (a)(5)  (a)(6)  (a)(7)  (a)(8)  (a)(9)  (a)(10)
      (a)(11)  (a)(12)  (a)(13)  (a)(14)  (a)(15)  (a)(16) Inpt. only  (a)(17) Inpt. only  (a)(18)  (a)(19)  (a)(20)
      (b)(1)  (b)(2)  (b)(3)  (b)(4)  (b)(5)  (b)(6) Inpt. only  (b)(7)  (b)(8)  (b)(9)
      (c)(1)  (c)(2)  (c)(3)
      (d)(1)  (d)(2)  (d)(3)  (d)(4)  (d)(5)  (d)(6)  (d)(7)  (d)(8)  (d)(9) Optional
      (e)(1)  (e)(2) Amb. only  (e)(3) Amb. only
      (f)(1)  (f)(2)  (f)(3)  (f)(4) Inpt. only  (f)(5) Amb. only  (f)(6) Amb. only  (f)(7) Amb. only
      (g)(1)  (g)(2)  (g)(3)  (g)(4)
      (h)(1)  (h)(2)  (h)(3)


Part 3: NVLAP-Accredited Testing Laboratory Information

3.1 NVLAP-Accredited Testing Laboratory Information
ATL Name: InfoGard Laboratories, Inc.
ATL Contact: Milton Padilla
Accreditation Number: NVLAP Lab Code 100432-0
Address: 709 Fiero Lane Suite 25, San Luis Obispo, CA 93401
Website: www.infogard.com
Email: [email protected]
Phone: (805) 783-0810
Report Number: 17-3200-R-0021 V1.1
Test Date(s): Sept. 22 - Oct. 4, 2017

For more information on scope of accreditation, please reference http://ts.nist.gov/Standards/scopes/1004320.htm

Part 3 of this test results summary is approved for public release by the following Accredited Testing Laboratory Representative:

Mark Shin, EHR Approved Signatory
ATL Authorized Representative Function/Title
1/19/2018
Signature and Date

3.2 Test Information

3.2.1 Additional Software Relied Upon for Certification

        No additional software required

Additional Software | Applicable Criteria | Functionality Provided by Additional Software
ChilKat | 314(c)(1-3) | eCQM and Automated Measure Calculation
SureScripts with connectivity via CGM eRX | 314(b)(3), 314(a)(10), 314(a)(2) | eRx, CPO Meds, Drug to Drug Check
Persivia (Alere Analytics) | 314(a)(8), 314(c)(1-3) | eCQM and CDS


3.2.2 Test Tools

        No test tools required

Test Tool | Version
Cypress | 1.0.6
ePrescribing Validation Tool
HL7 CDA Cancer Registry Reporting Validation Tool
HL7 v2 Electronic Laboratory Reporting (ELR) Validation Tool
HL7 v2 Immunization Information System (IIS) Reporting Validation Tool
HL7 v2 Laboratory Results Interface (LRI) Validation Tool
HL7 v2 Syndromic Surveillance Reporting Validation Tool
Transport Testing Tool
Direct Certificate Discovery Tool
Edge Testing Tool

3.2.3 Test Data

        No alteration (customization) to the test data was necessary
        Alteration (customization) to the test data was necessary and is described in Appendix [insert appendix letter ]

3.2.4 Standards

3.2.4.1 Multiple Standards Permitted
The following identifies the standard(s) that has been successfully tested where more than one standard is permitted

Criterion # | Standard Successfully Tested

(a)(8)(ii)(A)(2)
      §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
      §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(13)
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
      §170.207(j) HL7 Version 3 Standard: Clinical Genomics; Pedigree


 

        None of the criteria and corresponding standards listed above are applicable

3.2.4.2 Newer Versions of Standards

        No newer version of a minimum standard was tested

The following identifies the newer version of a minimum standard(s) that has been successfully tested

Applicable Criteria | Newer Version

(a)(15)(i)
      §170.204(b)(1) HL7 Version 3 Implementation Guide: URL-Based Implementations of the Context-Aware Information Retrieval (Infobutton) Domain
      §170.204(b)(2) HL7 Version 3 Implementation Guide: Context-Aware Knowledge Retrieval (Infobutton) Service-Oriented Architecture Implementation Guide

(a)(16)(ii)
      §170.210(g) Network Time Protocol Version 3 (RFC 1305)
      §170.210(g) Network Time Protocol Version 4 (RFC 5905)

(b)(2)(i)(A)
      §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(7)(i)
      §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(b)(8)(i)
      §170.207(i) The code set specified at 45 CFR 162.1002(c)(2) (ICD-10-CM) for the indicated conditions
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release

(e)(1)(i)
      Annex A of the FIPS Publication 140-2

(e)(1)(ii)(A)(2)
      §170.210(g) Network Time Protocol Version 3 (RFC 1305)
      §170.210(g) Network Time Protocol Version 4 (RFC 5905)

(e)(3)(ii)
      Annex A of the FIPS Publication 140-2

Common MU Data Set (15)
      §170.207(a)(3) IHTSDO SNOMED CT® International Release July 2012 and US Extension to SNOMED CT® March 2012 Release
      §170.207(b)(2) The code set specified at 45 CFR 162.1002(a)(5) (HCPCS and CPT-4)


3.2.5 Optional Functionality

      No optional functionality tested

Criterion # | Optional Functionality Successfully Tested
(a)(4)(iii) | Plot and display growth charts
(b)(1)(i)(B) | Receive summary care record using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(1)(i)(C) | Receive summary care record using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(b)(2)(ii)(B) | Transmit health information to a Third Party using the standards specified at §170.202(a) and (b) (Direct and XDM Validation)
(b)(2)(ii)(C) | Transmit health information to a Third Party using the standards specified at §170.202(b) and (c) (SOAP Protocols)
(e)(1) | View, download and transmit data to a third party using the standard specified at §170.202(d) (Edge Protocol IG version 1.1)
(f)(3) | Ambulatory setting only – Create syndrome-based public health surveillance information for transmission using the standard specified at §170.205(d)(3) (urgent care visit scenario)
(f)(7) | Ambulatory setting only – Transmission to public health agencies – syndromic surveillance – Create Data Elements
Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(3) (45 CFR 162.1002(a)(4): Code on Dental Procedures and Nomenclature)
Common MU Data Set (15) | Express Procedures according to the standard specified at §170.207(b)(4) (45 CFR 162.1002(c)(3): ICD-10-PCS)


3.2.6 2014 Edition Certification Criteria* Successfully Tested

Criteria # Version (TP** TD***)
1.2
1.2 1.4
1.1 1.5
1.4 1.2

      (a)(1)  (a)(2)  (a)(3)  (a)(4)  (a)(5)  (a)(6)  (a)(7)  (a)(8)  (a)(9)  (a)(10)
      (a)(11)  (a)(12)  (a)(13)  (a)(14)  (a)(15)  (a)(16) Inpt. only  (a)(17) Inpt. only  (a)(18)  (a)(19)  (a)(20)
      (b)(1)  (b)(2)  (b)(3)  (b)(4)  (b)(5)  (b)(6) Inpt. only  (b)(7)  (b)(8)  (b)(9)
      (c)(1)  (c)(2)  (c)(3)
      (d)(1)  (d)(2)  (d)(3)  (d)(4)  (d)(5)  (d)(6)  (d)(7)  (d)(8)  (d)(9) Optional
      (e)(1)  (e)(2) Amb. only  (e)(3) Amb. only
      (f)(1)  (f)(2)  (f)(3)  (f)(4) Inpt. only  (f)(5) Optional & Amb. only  (f)(6) Optional & Amb. only  (f)(7) Amb. only
      (g)(1)  (g)(2)  (g)(3)  (g)(4)
      (h)(1)  (h)(2)  (h)(3)

*For a list of the 2014 Edition Certification Criteria, please reference http://www.healthit.gov/certification (navigation: 2014 Edition Test Method)
**Indicates the version number for the Test Procedure (TP)
***Indicates the version number for the Test Data (TD)


 

3.2.7 2014 Clinical Quality Measures*

Type of Clinical Quality Measures Successfully Tested:
        Ambulatory
        Inpatient
        No CQMs tested

Ambulatory CQMs (CMS IDs): 2, 22, 50, 52, 56, 61, 62, 64, 65, 66, 68, 69, 74, 75, 77, 82, 90, 117, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 153, 154, 155, 156, 157, 158, 159, 160, 161, 163, 164, 165, 166, 167, 169, 177, 179, 182

Inpatient CQMs (CMS IDs): 9, 26, 30, 31, 32, 53, 55, 60, 71, 72, 73, 91, 100, 102, 104, 105, 107, 108, 109, 110, 111, 113, 114, 171, 172, 178, 185, 188, 190

*For a list of the 2014 Clinical Quality Measures, please reference http://www.cms.gov (navigation: 2014 Clinical Quality Measures)


3.2.8 Automated Numerator Recording and Measure Calculation

3.2.8.1 Automated Numerator Recording

        Automated Numerator Recording was not tested

Automated Numerator Recording Successfully Tested:
(a)(1)  (a)(3)  (a)(4)  (a)(5)  (a)(6)  (a)(7)  (a)(9)  (a)(11)  (a)(12)  (a)(13)  (a)(14)  (a)(15)  (a)(16)  (a)(17)  (a)(18)  (a)(19)  (a)(20)
(b)(2)  (b)(3)  (b)(4)  (b)(5)  (b)(6)  (b)(8)  (b)(9)
(e)(1)  (e)(2)  (e)(3)

3.2.8.2 Automated Measure Calculation

        Automated Measure Calculation was not tested

Automated Measure Calculation Successfully Tested:
(a)(1)  (a)(3)  (a)(4)  (a)(5)  (a)(6)  (a)(7)  (a)(9)  (a)(11)  (a)(12)  (a)(13)  (a)(14)  (a)(15)  (a)(16)  (a)(17)  (a)(18)  (a)(19)  (a)(20)
(b)(2)  (b)(3)  (b)(4)  (b)(5)  (b)(6)  (b)(8)  (b)(9)
(e)(1)  (e)(2)  (e)(3)

3.2.9 Attestation

Attestation Forms (as applicable) | Appendix
        Safety-Enhanced Design* | A
        Quality Management System** | B
        Privacy and Security | C

*Required if any of the following were tested: (a)(1), (a)(2), (a)(6), (a)(7), (a)(8), (a)(16), (a)(18), (a)(19), (a)(20), (b)(3), (b)(4), (b)(9)
**Required for every EHR product


Appendix A: Safety Enhanced Design


Safety-enhanced design §170.314(g)(3)


2 | CompuGroup Medical

Safety-Enhanced Design

© Copyright 2014 CompuGroup Medical, Inc. All rights reserved. | May not be reproduced without prior written permission. | www.CGMus.com

EXECUTIVE SUMMARY

A usability test of CGM ENTERPRISE EHR™ version 10.0.4 was conducted remotely by CompuGroup Medical in four waves of usability testing, during the weeks of November 4, 2013, January 27, 2014, February 24, 2014, and March 10, 2014. The usability test followed the NISTIR 7741 User Centered Design approach.¹ The purpose of this test was to validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). Nine healthcare providers matching the target demographic criteria took part: eight participated in wave 1, six in wave 2, four in wave 3, and three in wave 4. During testing, the participating providers used the EHRUT in simulated but representative tasks. This study collected performance data on 7 of the 2014 ONC HIT certification criteria using 36 tasks typically conducted in an EHR:

List of Tasks

Certification Criteria | Task | Wave of Testing
§170.314(a)(7) Medication allergy list | Enter a medication allergy | 1
§170.314(a)(7) Medication allergy list | Change a medication allergy | 1
§170.314(a)(7) Medication allergy list | Inactivate a medication allergy and view a list of active and inactive medication allergies | 1
§170.314(a)(7) Medication allergy list | Indicate no known drug allergies | 1
§170.314(a)(6) Medication list | Enter a medication | 1
§170.314(a)(6) Medication list | Change medication dosage | 1
§170.314(a)(6) Medication list | Discontinue a medication | 1
§170.314(a)(6) Medication list | View patient’s medication history | 1
§170.314(a)(6) Medication list | Mark that a patient is not taking any medications | 1
§170.314(b)(3) eRx | Prescribe a medication | 1
§170.314(b)(3) eRx | Renew two medications | 1
§170.314(b)(3) eRx | Prescribe a medication with a different pharmacy | 1
§170.314(b)(3) eRx | Change the dose of a medication and eRx | 1
§170.314(a)(2) Drug-drug, drug-allergy interaction checks | Trigger a major drug-drug interaction and cancel the order | 1
§170.314(a)(2) Drug-drug, drug-allergy interaction checks | Trigger a moderate drug-drug interaction and override it | 1
§170.314(a)(2) Drug-drug, drug-allergy interaction checks | Prescribe a medication that will not trigger an interaction and eRx | 1
§170.314(a)(2) Drug-drug, drug-allergy interaction checks | Trigger a drug-allergy interaction and override it | 1
§170.314(a)(2) Drug-drug, drug-allergy interaction checks | Change the drug-drug interaction settings | 1

§170.314(a)(1) Computerized provider order entry | Place and print a prescription (medication) | 2

¹ Robert M. Schumacher (User Centric, Inc.) and Svetlana Z. Lowry (Information Access Division, Information Technology Laboratory, National Institute of Standards and Technology), NISTIR 7741: NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records (November 2010), p. 2-62.

§170.314(a)(1) Computerized provider order entry | Enter a medication as a called-in Non-CPOE prescription | 2
§170.314(a)(1) Computerized provider order entry | Place a lab order using the search function and a radiology order using the pre-defined list | 2
§170.314(a)(1) Computerized provider order entry | Place and change a lab order using the pre-defined list and print the order | 2
§170.314(a)(1) Computerized provider order entry | Place and change a radiology order using the pre-defined list | 2
§170.314(a)(8) Clinical decision support | Access a patient encounter and review CDS alert | 3
§170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered problem list data | 3
§170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered medication list data | 3
§170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered demographics | 3
§170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered lab results | 3
§170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered vital signs | 3
§170.314(a)(8) Clinical decision support | Review and accept a CDS alert from entered problem list data | 3
§170.314(a)(8) Clinical decision support | Enter information that will not trigger a CDS alert | 3
§170.314(b)(4) Clinical information reconciliation | Reconcile medications for a new patient without a summary of care document | 2
§170.314(b)(4) Clinical information reconciliation | Reconcile medications | 4
§170.314(b)(4) Clinical information reconciliation | Reconcile allergies | 4
§170.314(b)(4) Clinical information reconciliation | Reconcile problems | 4
§170.314(b)(4) Clinical information reconciliation | Validate and reconcile SOC document | 4

During the one-on-one usability tests, each participant was greeted by the moderator, who was logged into WebEx from their computer; participants had previously signed a usability test user agreement. Participants had prior experience with the EHR. Prior to each set of tasks, training was provided in the same manner a real end user would receive it when purchasing or upgrading CGM ENTERPRISE EHR™. The moderator introduced the test and instructed participants to complete a series of tasks (given one at a time) using the EHRUT. During the testing, the moderator timed the test and, along with the data logger, recorded user performance data on paper and electronically. A trainer also observed the test. The moderator did not assist participants in completing the tasks. Participant screens, head shots, and audio were recorded for subsequent review and analysis.


The following types of data were collected for each participant:

Number of tasks successfully completed within the allotted time without assistance

Time to complete the tasks

Number and types of errors

Path deviations

Participant’s verbalizations

Participant’s satisfaction ratings of the system

Steps taken to complete each task, along with deviations from the optimal path

All participant data was de-identified: no correspondence could be made from the identity of the participant to the data collected. Following the conclusion of the testing, participants were asked to complete a post-test questionnaire and were compensated for their time at a rate of $200/hour. Various recommended metrics, in accordance with the examples set forth in the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, were used to evaluate the usability of the EHRUT. Following is a summary of the performance and rating data collected on the EHRUT.
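As a rough illustration of how per-task figures of this kind can be derived from logged observations, the sketch below computes success and error percentages, mean and standard deviation of task time, and observed-vs-baseline efficiency ratios. The data, field names, and the optimal-time/optimal-steps baselines are hypothetical assumptions for illustration only, not values taken from this report:

```python
from statistics import mean, stdev

# Illustrative per-participant observations for ONE task (hypothetical data).
observations = [
    {"success": True,  "errors": 0, "seconds": 28.0, "steps": 12},
    {"success": True,  "errors": 0, "seconds": 31.0, "steps": 13},
    {"success": False, "errors": 1, "seconds": 35.0, "steps": 15},
    {"success": True,  "errors": 0, "seconds": 30.0, "steps": 12},
]

OPTIMAL_SECONDS = 25.0  # assumed expert-benchmark completion time
OPTIMAL_STEPS = 12      # assumed length of the optimal path

# Success % = share of participants completing the task unassisted in time;
# Error % = share of participants committing at least one error.
success_pct = 100.0 * sum(o["success"] for o in observations) / len(observations)
error_pct = 100.0 * sum(o["errors"] > 0 for o in observations) / len(observations)

times = [o["seconds"] for o in observations]

# Efficiency expressed as observed/baseline ratios (1.00 = matches baseline).
time_efficiency = [o["seconds"] / OPTIMAL_SECONDS for o in observations]
step_deviation = [o["steps"] / OPTIMAL_STEPS for o in observations]

print(f"Success {success_pct:.2f}%  Error {error_pct:.2f}%")
print(f"Task time  AVG {mean(times):.2f}  STD {stdev(times):.2f}")
print(f"Efficiency (time) AVG {mean(time_efficiency):.2f}  "
      f"(deviations) AVG {mean(step_deviation):.2f}")
```

With the sample data above this prints a 75.00% success rate, a 25.00% error rate, and a mean task time of 31.00 seconds; the report's own columns follow the same AVG/STD layout.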

Task | Success % | Error % | Task Time (s) AVG, STD | Efficiency (Task Time) AVG, STD | Efficiency (Deviations) AVG, STD | Task Rating
Enter a medication allergy | 100.00% | 0.00% | 29.57, 5.56 | 1.08, 0.20 | 1.01, 0.04 | 1.00
Change a medication allergy | 87.50% | 0.00% | 27.57, 7.28 | 1.28, 0.34 | 1.06, 0.17 | 1.25
Inactivate a medication allergy and view a list of active and inactive medication allergies | 87.50% | 0.00% | 9.29, 2.06 | 1.43, 0.32 | 1.00, 0.00 | 1.00
Indicate no known drug allergies | 100.00% | 0.00% | 10.00, 1.77 | 1.33, 0.24 | 1.00, 0.00 | 1.00
Enter a medication | 62.50% | 25.00% | 42.40, 9.21 | 1.46, 0.32 | 1.17, 0.26 | 1.25
Change medication dosage | 25.00% | 50.00% | 40.00, 2.83 | 1.70, 0.12 | 1.00, 0.00 | 1.50
Discontinue a medication | 87.50% | 0.00% | 23.00, 5.39 | 1.31, 0.31 | 1.00, 0.00 | 1.13
View patient’s medication history | 100.00% | 0.00% | 8.88, 2.90 | 1.78, 0.58 | 1.00, 0.00 | 1.13
Mark that a patient is not taking any medications | 75.00% | 0.00% | 18.67, 2.07 | 1.07, 0.12 | 1.04, 0.10 | 1.88
Prescribe a medication | 62.50% | 37.50% | 47.00, 9.03 | 1.38, 0.27 | 1.00, 0.00 | 1.75
Renew two medications | 87.50% | 0.00% | 16.14, 1.77 | 1.54, 0.17 | 1.14, 0.38 | 1.13
Prescribe a medication with a different pharmacy | 100.00% | 0.00% | 50.00, 11.65 | 1.59, 0.37 | 1.06, 0.13 | 1.63
Change the dose of a medication and eRx | 75.00% | 12.50% | 65.17, 15.22 | 1.59, 0.37 | 1.20, 0.08 | 1.50
Trigger a major drug-drug interaction and cancel the order | 100.00% | 0.00% | 22.75, 3.85 | 1.30, 0.22 | 1.13, 0.27 | 1.25
Trigger a moderate drug-drug interaction and override it | 100.00% | 0.00% | 23.88, 4.64 | 1.54, 0.30 | 1.06, 0.18 | 1.00

Prescribe a medication that will not trigger an interaction and eRx | 75.00% | 0.00% | 12.33, 2.42 | 1.23, 0.24 | 1.00, 0.00 | 1.00

Trigger a drug-allergy interaction and override it | 87.50% | 12.50% | 24.14, 5.76 | 1.34, 0.32 | 1.00, 0.00 | 1.25
Change the drug-drug interaction settings | 100.00% | 0.00% | 33.88, 7.57 | 1.54, 0.34 | 1.03, 0.07 | 1.50
Place and print a prescription (medication) | 100.00% | 0.00% | 53.67, 5.65 | 1.39, 0.15 | 1.00, 0.00 | 2.00
Enter a medication as a called-in Non-CPOE prescription | 83.33% | 16.67% | 131.00, 38.10 | 1.29, 0.38 | 1.02, 0.05 | 2.92
Place a lab order using the search function and a radiology order using the pre-defined list | 66.67% | 33.33% | 40.00, 1.83 | 1.10, 0.05 | 1.08, 0.15 | 1.67
Place and change a lab order using the pre-defined list and print the order | 83.33% | 16.67% | 31.20, 6.34 | 1.02, 0.21 | 1.03, 0.06 | 1.83
Place and change a radiology order using the pre-defined list | 100.00% | 0.00% | 22.50, 2.88 | 1.00, 0.13 | 1.00, 0.00 | 1.50
Access a patient encounter and review CDS alert | 100.00% | 0.00% | 20.25, 3.10 | 1.19, 0.18 | 1.00, 0.00 | 1.50
Trigger a CDS intervention from entered problem list data | 0.00% | 100.00% | n/a, n/a | n/a, n/a | n/a, n/a | 1.50
Trigger a CDS intervention from entered medication list data | 100.00% | 0.00% | 87.75, 7.93 | 1.02, 0.09 | 1.00, 0.00 | 2.50
Trigger a CDS intervention from entered demographics | 75.00% | 25.00% | 35.00, 14.42 | 1.35, 0.55 | 0.90, 0.00 | 1.75
Trigger a CDS intervention from entered lab results | 75.00% | 0.00% | 28.00, 1.73 | 1.75, 0.11 | 1.00, 0.00 | 1.25
Trigger a CDS intervention from entered vital signs | 100.00% | 0.00% | 29.75, 2.22 | 1.75, 0.13 | 1.00, 0.00 | 1.75
Review and accept a CDS alert from entered problem list data | 50.00% | 50.00% | 52.50, 16.26 | 1.59, 0.49 | 1.05, 0.07 | 1.50
Enter information that will not trigger a CDS alert | 50.00% | 50.00% | 70.00, 1.41 | 1.63, 0.03 | 1.00, 0.00 | 1.75
Reconcile medications for a new patient without a summary of care document | 100.00% | 0.00% | 93.00, 24.22 | 1.62, 0.42 | 1.44, 0.12 | 1.25
Reconcile medications | 66.67% | 33.33% | 163.50, 2.12 | 1.35, 0.02 | 1.08, 0.04 | 2.67
Reconcile allergies | 0.00% | 100.00% | n/a, n/a | n/a, n/a | n/a, n/a | 2.67
Reconcile problems | 0.00% | 100.00% | n/a, n/a | n/a, n/a | n/a, n/a | 2.67
Validate and reconcile SOC document | 66.67% | 33.33% | 7.50, 0.71 | 1.88, 0.18 | 1.00, 0.00 | 2.67


The results from the System Usability Scale scored the subjective satisfaction with the system, based on performance with these tasks, at 68.²
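The SUS figure comes from the standard ten-item questionnaire and its conventional scoring rule; a minimal sketch of that rule follows (the example response set is hypothetical, not the participants' actual answers):

```python
def sus_score(responses):
    """Standard System Usability Scale scoring: ten items rated 1-5;
    odd-numbered items contribute (rating - 1), even-numbered items
    contribute (5 - rating), and the sum is scaled by 2.5 to give 0-100."""
    assert len(responses) == 10, "SUS uses exactly ten questionnaire items"
    total = 0
    for i, rating in enumerate(responses, start=1):
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5

# Hypothetical single-participant response set (not from the report)
example = [4, 2, 4, 2, 4, 2, 4, 3, 4, 2]
print(sus_score(example))  # -> 72.5
```

A study-level SUS is then typically reported as the mean of the per-participant scores, which is how a single figure such as 68 summarizes the group.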

In addition to the performance data, the following qualitative observations were made:

Major findings: Overall, participants liked the functionality presented during the usability testing and generally would recommend the software to their peers. Most areas of the software yielded good quantitative results (high success and efficiency, low error rate). One area commented upon especially positively was the orders tab, particularly the CPOE for radiology and labs. However, participants commented that they did not like having to leave the orders area to create a medication order. This workflow is due to the lack of integration between CGM ENTERPRISE EHR™ and the third-party e-prescribing software, and it was a common complaint for all areas relying on the third-party e-prescribing workflow. Participants mentioned they did not like having to leave the patient encounter and would like to be able to prescribe or enter a medication while viewing the progress note at the same time. Participants also commented that they disliked how much scrolling was required, as well as the numerous drop-downs. Areas related to medication allergy entry and drug-interaction checking were generally positive, the exception being any limitations related to the actual entry of the medication order.

The brand-new areas relating to CDS and clinical information reconciliation were not as favorably received and did not reach the highest level of success. However, many participants noted that with some practice they would be able to navigate the workflow more easily. For clinical decision support, the tasks requiring the user to refresh the screen were less positively received than those that did not, and many participants mentioned they would like the software to update immediately, both for clinical decision support and in general.

The CDS alert itself was clear to most participants, although one user commented that it was not immediately obvious that the CDS button turned red when an alert was available. This is also reflected in the quantitative data, as there were times when a user selected the CDS alert even when it was not lit up. For clinical information reconciliation, participants noted that the screens were a bit busy and that they had some trouble figuring out what had been added to or removed from the record. What needed to be done was not immediately apparent, and the quantitative data shows this clearly in the lower success and higher error rates.

Areas for improvement:

Based on the quantitative findings, the verbal reports of the participants, and observations from the data logger, the following changes would most likely improve the overall usability of CGM ENTERPRISE EHR™.

2 See Tullis, T. & Albert, W. (2008). Measuring the User Experience. Burlington, MA: Morgan Kaufmann (p. 149).

Broadly interpreted, scores under 60 represent systems with poor usability; scores over 80 would be considered above average.
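The SUS score reported above follows the standard scoring procedure: each of the ten questionnaire items is rated 1 to 5, odd-numbered (positively worded) items contribute (rating − 1), even-numbered items contribute (5 − rating), and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch of that arithmetic (illustrative code only, not part of the study materials):

```python
# Standard System Usability Scale (SUS) scoring.
# Odd items (1st, 3rd, ...) are positively worded: contribute rating - 1.
# Even items (2nd, 4th, ...) are negatively worded: contribute 5 - rating.
# The sum of contributions (0-40) is scaled by 2.5 to a 0-100 score.

def sus_score(ratings):
    if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("SUS expects ten ratings from 1 to 5")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even = odd-numbered item
        for i, r in enumerate(ratings)
    ]
    return sum(contributions) * 2.5

# A neutral response to every item (all 3s) yields exactly 50.
print(sus_score([3] * 10))  # -> 50.0
```

A score of 68 — the result reported here — is conventionally treated as the approximate mean of published SUS studies, which is consistent with the "under 60 poor / over 80 above average" interpretation given above.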

Page 17: ONC HIT Program Test Results Summary for 2014 Edition EHR ... · Address: 3300 N. Central Ave, Suite 2100 ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification

7 | CompuGroup Medical

Safety-Enhanced Design

© Copyright 2014 CompuGroup Medical, Inc. All rights reserved. | May not be reproduced without prior written permission. | www.CGMus.com

Integrate Third-Party eRx/Medication List Software – Many of the issues with the medication list, eRx, drug-drug interaction, and medication-related CPOE functionality could be alleviated through integration of the third-party software. Currently, because it is not integrated, there is no way to alter the interface. Once integrated, the following changes should be implemented:

Clean up the interface so less scrolling is required.

Clearly distinguish the workflow for when a prescription is entered versus a medication that was previously prescribed.

Allow providers to prescribe or add medications while viewing the progress note/patient encounter.

Institute a “modify” button that can be used to change a medication (either a prescription or not) that keeps the old instance within the medication history.

Clinical Decision Support – Eliminate the need to refresh screens in order for problem list and vital sign data to be saved. This would improve clinical decision support and would also assist providers in their general day-to-day practice. Additionally, the clinical decision support pop-up should be modified to present less information, which would make it more practical for the user and less overwhelming.

Clinical Information Reconciliation – Make missing information for clinical information reconciliation more obvious. Clean up the screens so that they are less busy, and modify the workflow so that a user can see what has been added to or removed from the reconciled list.

INTRODUCTION

The EHRUT tested for this study was CGM ENTERPRISE EHR™ version 10.0.4. Designed to present medical information to healthcare providers in an ambulatory setting, the EHRUT consists of combinable modules and customizable templates that help streamline workflow and help customers achieve maximum productivity. The usability testing attempted to represent realistic exercises and conditions. The purpose of this study was to test and validate the usability of the current user interface and to provide evidence of usability in the EHR Under Test (EHRUT). To this end, measures of effectiveness, efficiency, and user satisfaction, such as task time and number of clicks per task, were captured during the usability testing.

METHOD

PARTICIPANTS

A total of 9 participants were tested on the EHRUT(s). Participants in the test were MDs, PAs, and NPs. Participants were recruited by CompuGroup Medical and were compensated for their time at a rate of $200/hour. In addition, participants had no direct connection to the development of or organization producing the EHRUT(s). Participants were not from the testing or supplier organization. Participants were given the opportunity to have the same orientation and level of training as the actual end users would have received. For the test purposes, end-user characteristics were identified and translated into a recruitment screener used to solicit potential participants; an example of a screener is provided in Appendix 1. Recruited participants had a mix of backgrounds and demographic characteristics conforming to the recruitment screener. The following is a table of participants by characteristics, including demographics, professional experience, and computing experience. Participant names were replaced with Participant IDs so that an individual's data cannot be tied back to his or her identity.

| Part. # | Sex | Age Range | Level of Education | Position and Title | Specialty | Time Spent in Position | Method Used to Document Patient Records | General EHR Use | EHR Knowledge |
| 1 | F | 23-39 | Post-Graduate | PA | Family Medicine | 4 years | all electronic | Daily | Good |
| 2 | M | 40-59 | MS | CFNP | Family Practice | 5 months | all electronic | Daily | Good |
| 4 | F | 40-59 | Master's Degree | NP | Women's Health | 19 years | all electronic | Daily | Good |
| 5 | F | 23-39 | Post-Graduate (Masters in Nursing) | NP | Pediatrics | 8.5 years | some paper, some electronic | Daily | Good |
| 6 | F | 40-59 | MD | MD | Neurology | 27 years | all electronic | Daily | Excellent |
| 8 | F | 23-39 | MPAs | PA-C | Neurology | 9 years | all electronic | Daily | Excellent |
| 9 | M | 40-59 | MD (post-graduate) | MD | Family Practice | 11 years | all electronic | Daily | Excellent |
| 10 | M | 23-39 | MD (post-graduate) | MD | Family Practice | 5 years | all electronic | Daily | Good |
| 11 | F | 23-39 | MD | MD | Cardiology | 2 years | some paper, some electronic | Daily | Good |

Participants per wave (by Participant #):

Wave I: 1, 2, 4, 6, 8, 9, 10, 11
Wave II: 2, 4, 5, 6, 8, 9
Wave III: 2, 4, 6, 8
Wave IV: 4, 6, 8

For the first two waves of usability testing, participants were scheduled for 2-hour sessions; for the third and fourth waves, participants were scheduled for 30-minute sessions. At least thirty minutes were allowed between sessions for debrief by the moderator, trainer, and data logger, and to reset systems to proper test conditions. A spreadsheet was used to keep track of the participant schedule, and included each participant's demographic characteristics.


STUDY DESIGN

Overall, the objective of this test was to uncover areas where the application performed well – that is, effectively, efficiently, and with satisfaction – and areas where the application failed to meet the needs of the participants. The data from this test may serve as a baseline for future tests with an updated version of the same EHR and/or for comparison with other EHRs, provided the same tasks are used. In short, this testing serves both as a means to record or benchmark current usability and as a way to identify areas where improvements must be made.

During the usability test, participants interacted remotely with one EHR via Cisco WebEx Web Conferencing. Each participant took control of a system that was in one location and was provided with the same instructions. The system was evaluated for effectiveness, efficiency, and satisfaction as defined by measures collected and analyzed for each participant:

Number of tasks successfully completed within the allotted time without assistance

Time to complete the tasks

Number and types of errors

Path deviations

Participant’s verbalizations (comments)

Participant's satisfaction ratings of the system

TASKS

A number of tasks were constructed that would be realistic and representative of the kinds of activities a user might do with this EHR, including:

List of Tasks

| Certification Criteria | Task | Wave of Testing |
| §170.314(a)(7) Medication allergy list | Enter a medication allergy | 1 |
| §170.314(a)(7) Medication allergy list | Change a medication allergy | 1 |
| §170.314(a)(7) Medication allergy list | Inactivate a medication allergy and view a list of active and inactive medication allergies | 1 |
| §170.314(a)(7) Medication allergy list | Indicate no known drug allergies | 1 |
| §170.314(a)(6) Medication list | Enter a medication | 1 |
| §170.314(a)(6) Medication list | Change medication dosage | 1 |
| §170.314(a)(6) Medication list | Discontinue a medication | 1 |
| §170.314(a)(6) Medication list | View patient's medication history | 1 |
| §170.314(a)(6) Medication list | Mark that a patient is not taking any medications | 1 |
| §170.314(b)(3) eRx | Prescribe a medication | 1 |
| §170.314(b)(3) eRx | Renew two medications | 1 |
| §170.314(b)(3) eRx | Prescribe a medication with a different pharmacy | 1 |
| §170.314(b)(3) eRx | Change the dose of a medication and eRx | 1 |
| §170.314(a)(2) Drug-drug, drug-allergy interaction checks | Trigger a major drug-drug interaction and cancel the order | 1 |
| §170.314(a)(2) Drug-drug, drug-allergy interaction checks | Trigger a moderate drug-drug interaction and override it | 1 |
| §170.314(a)(2) Drug-drug, drug-allergy interaction checks | Prescribe a medication that will not trigger an interaction and eRx | 1 |
| §170.314(a)(2) Drug-drug, drug-allergy interaction checks | Trigger a drug-allergy interaction and override it | 1 |
| §170.314(a)(2) Drug-drug, drug-allergy interaction checks | Change the drug-drug interaction settings | 1 |
| §170.314(a)(1) Computerized provider order entry | Place and print a prescription (medication) | 2 |
| §170.314(a)(1) Computerized provider order entry | Enter a medication as a called-in Non-CPOE prescription | 2 |
| §170.314(a)(1) Computerized provider order entry | Place a lab order using the search function and a radiology order using the pre-defined list | 2 |
| §170.314(a)(1) Computerized provider order entry | Place and change a lab order using the pre-defined list and print the order | 2 |
| §170.314(a)(1) Computerized provider order entry | Place and change a radiology order using the pre-defined list | 2 |
| §170.314(a)(8) Clinical decision support | Access a patient encounter and review CDS alert | 3 |
| §170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered problem list data | 3 |
| §170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered medication list data | 3 |
| §170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered demographics | 3 |
| §170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered lab results | 3 |
| §170.314(a)(8) Clinical decision support | Trigger a CDS intervention from entered vital signs | 3 |
| §170.314(a)(8) Clinical decision support | Review and accept a CDS alert from entered problem list data | 3 |
| §170.314(a)(8) Clinical decision support | Enter information that will not trigger a CDS alert | 3 |
| §170.314(b)(4) Clinical information reconciliation | Reconcile medications for a new patient without a summary of care document | 2 |
| §170.314(b)(4) Clinical information reconciliation | Reconcile medications | 4 |
| §170.314(b)(4) Clinical information reconciliation | Reconcile allergies | 4 |
| §170.314(b)(4) Clinical information reconciliation | Reconcile problems | 4 |
| §170.314(b)(4) Clinical information reconciliation | Validate and reconcile SOC document | 4 |

Tasks were selected based on their frequency of use, their criticality of function, and their likelihood of being troublesome for users. Tasks should always be constructed in light of the study objectives.


PROCEDURES

After logging into the WebEx session, participants were greeted by the moderator and the trainer. Participants had been assigned a participant ID previously, during recruitment. Each participant was instructed to print out the participant guides for the tasks, which had been emailed previously. To ensure that the test ran smoothly, three staff members participated in this test: the usability moderator, who served as the main administrator; the data logger; and a trainer. The moderator ran the session, including administering instructions and tasks, and obtained post-task ranking data. The moderator also served as the host of the WebEx session and was responsible for setting up tasks and passing control to the participant prior to each set of tasks. The data logger took notes on task times, task success, path deviations, number and type of errors, and comments. Participants were instructed to perform the tasks (see specific instructions below):

As quickly as possible making as few errors and deviations as possible.

Without assistance; administrators were allowed to give immaterial guidance and clarification on tasks, but not instructions on use.

Without using a think-aloud technique.

Task timing began once the administrator finished reading the question. The task time was stopped once the participant indicated they had successfully completed the task. Scoring is discussed below. Following the session, the administrator gave the participant the post-test questionnaire (the System Usability Scale; see Appendix 7) and thanked each individual for their participation. Once the System Usability Scale form was returned, participants were compensated based on an hourly rate.

TEST LOCATION

Testing was performed remotely using Cisco WebEx Conferencing. The participant, moderator, and data logger each logged in separately to a previously set up WebEx meeting session. The trainer was located in the same room as the moderator. The moderator served as the host for the WebEx session and shared his computer, with CGM ENTERPRISE EHR™ installed, in order to facilitate training. Control of the session was passed to the participant for each of the tasks.

TEST ENVIRONMENT

The EHRUT would typically be used in a healthcare office or facility. In this instance, the testing was conducted remotely. The participants used a mouse and keyboard when interacting with the EHRUT. The application was set up by CompuGroup Medical. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as font size).


TEST FORMS AND TOOLS

During the usability test, various documents and instruments were used, including:

1. Usability Test User Agreement
2. Addendum to Usability Test User Agreement
3. Moderator's Guide
4. Usability Testing Closing Comments and Final Questions
5. System Usability Scale Questionnaire

Examples of these documents can be found in Appendices 2-7, respectively. The Moderator's Guide was devised to capture the required data. The participant's interaction with the EHRUT was captured and recorded digitally with WebEx. Facial expressions and verbal comments were additionally captured and recorded with WebEx.

PARTICIPANT INSTRUCTIONS

The administrator read the following instructions aloud to each participant (also see the full moderator's guide in Appendices 4-6):

Thank you for participating in this study. Our session today will last approximately 90 minutes. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it. You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not say anything more than asked. If you get lost or have difficulty, I cannot answer or help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. Please be honest with your opinions. The product version you will be using today is still in the development stages, so some of the data may not make sense. During the tasks you will be performing in each session, we will ask you to make decisions that may contradict how you would normally practice. Please follow the tasks as we have outlined them, because we have designed them to test the usability of specific workflows. We are recording the audio and video of our session today. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time.

We will be using WebEx today for the session, which will allow you to watch and hear a brief training of the EHR areas you will be performing tasks in. Once the training is completed, I will ask you to review the task you will be performing before we begin, and I will turn the controls for the session over to you so that you can complete each task. The WebEx session will also be recording you and your actions throughout each task. If there are any technical difficulties, we will restart the task.


I will demonstrate briefly to familiarize you with the process and will then turn the controls over to you to give you a few minutes to get comfortable with using WebEx.

Following the procedural instructions, participants were shown the EHR and were given time to explore the system and make comments. Once this was complete, the moderator administered the tasks, giving the following instructions:

During these tasks we will ask you to make decisions that may differ from how you would practice. Please follow the tasks as we have outlined them because we have designed them to test the usability of specific workflows. Please take a few moments to review Task _ and let me know when you are ready and we will begin.

Once the participant was ready, the moderator read the task and instructed the participant to say "Begin," and to say "Stop" when he/she was done.

USABILITY METRICS

According to the NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records, EHRs should support a process that provides a high level of usability for all users. The goal is for users to interact with the system effectively, efficiently, and with an acceptable level of satisfaction. To this end, metrics for effectiveness, efficiency, and user satisfaction were captured during the usability testing. The goals of the test were to assess:

1. Effectiveness of CGM ENTERPRISE EHR™, by measuring participant success rates and errors
2. Efficiency of CGM ENTERPRISE EHR™, by measuring average task time and path deviations
3. Satisfaction with CGM ENTERPRISE EHR™, by measuring ease-of-use ratings

DATA SCORING

The following table details how tasks were scored, how errors were evaluated, and how the time data were analyzed.

Measures – Rationale and Scoring

Effectiveness: Task Success – A task was counted as a "Success" if the participant was able to achieve the correct outcome, without assistance, within the time allotted on a per-task basis. The total number of successes was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage. Task times and number of clicks were recorded for successes.

Effectiveness: Task Failures – If the participant abandoned the task, did not reach the correct answer or performed it incorrectly, or reached the end of the allotted time before successful completion, the task was counted as a "Failure." No task times were recorded for failures. If the participant completed the task incorrectly, this was counted as an error. The total number of errors was calculated for each task and then divided by the total number of times that task was attempted. The results are provided as a percentage.

Efficiency: Task Deviations – The participant's path (i.e., steps) through the application was recorded. Deviations occur if the participant, for example, went to a wrong screen, clicked on an incorrect menu item, followed an incorrect link, or interacted incorrectly with an on-screen control. This path was compared to the optimal path. The number of steps in the observed path is divided by the number of optimal steps to provide a ratio of path deviation. Efficiency was only counted for tasks that were successfully completed.

Efficiency: Task Time – Each task was timed from when the moderator said "Start" until the participant said "Stop." If he or she failed to say "Stop," the time was stopped when the participant stopped performing the task. Only task times for tasks that were successfully completed were included in the average task time analysis. Average time per task was calculated for each task, and standard deviation was also calculated. Task time efficiency was calculated by dividing the observed task time by the optimal time.

Satisfaction: Task Rating – The participant's subjective impression of the ease of use of the application was measured by administering both a simple post-task question and a post-session questionnaire. After each task, the participant was asked to rate "Overall, this task was:" on a scale of 1 (Simple) to 5 (Difficult). These data are averaged across participants. To measure participants' confidence in and likeability of CGM ENTERPRISE EHR™ overall, the testing team administered the System Usability Scale (SUS) post-test questionnaire. Questions included "I think I would like to use this system frequently," "I thought the system was easy to use," and "I would imagine that most people would learn to use this system very quickly." See the full System Usability Scale questionnaire in Appendix 7.
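The scoring rules above reduce to simple arithmetic per task. The sketch below illustrates them; the field names ("success", "error", "time", "steps") are assumptions for illustration, not the study's actual logging format:

```python
# Illustrative computation of the report's per-task measures:
# success % and error % over all attempts; average time, time efficiency
# (observed / optimal), and path deviation (observed steps / optimal steps)
# over successful attempts only, as the scoring table specifies.

def task_measures(attempts, optimal_steps, optimal_time):
    """attempts: list of dicts with keys 'success' (bool), 'error' (bool),
    'time' (seconds), 'steps' (observed path length)."""
    n = len(attempts)
    successes = [a for a in attempts if a["success"]]
    success_pct = 100.0 * len(successes) / n
    error_pct = 100.0 * sum(1 for a in attempts if a["error"]) / n
    if successes:
        avg_time = sum(a["time"] for a in successes) / len(successes)
        time_efficiency = avg_time / optimal_time
        avg_steps = sum(a["steps"] for a in successes) / len(successes)
        path_deviation = avg_steps / optimal_steps
    else:
        # Matches the "n/a" cells in the results table for 0% success tasks.
        avg_time = time_efficiency = path_deviation = None
    return {
        "success_pct": success_pct,
        "error_pct": error_pct,
        "avg_time": avg_time,
        "time_efficiency": time_efficiency,
        "path_deviation": path_deviation,
    }
```

Note that scores near 1.0 for the two efficiency ratios indicate performance close to the optimal path and time, which is how the results table below should be read.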

RESULTS

DATA ANALYSIS AND REPORTING

The results of the usability test were calculated according to the methods specified in the Usability Metrics section above. Participants who failed to follow session and task instructions had their data excluded from the analyses. The usability testing results for the EHRUT are detailed below:

| Task | Success % | Error % | Task Time AVG (sec) | Task Time STD | Time Efficiency AVG | Time Efficiency STD | Deviation Efficiency AVG | Deviation Efficiency STD | Task Rating |
| Enter a medication allergy | 100.00% | 0.00% | 29.57 | 5.56 | 1.08 | 0.20 | 1.01 | 0.04 | 1.00 |
| Change a medication allergy | 87.50% | 0.00% | 27.57 | 7.28 | 1.28 | 0.34 | 1.06 | 0.17 | 1.25 |
| Inactivate a medication allergy and view a list of active and inactive medication allergies | 87.50% | 0.00% | 9.29 | 2.06 | 1.43 | 0.32 | 1.00 | 0.00 | 1.00 |
| Indicate no known drug allergies | 100.00% | 0.00% | 10.00 | 1.77 | 1.33 | 0.24 | 1.00 | 0.00 | 1.00 |
| Enter a medication | 62.50% | 25.00% | 42.40 | 9.21 | 1.46 | 0.32 | 1.17 | 0.26 | 1.25 |
| Change medication dosage | 25.00% | 50.00% | 40.00 | 2.83 | 1.70 | 0.12 | 1.00 | 0.00 | 1.50 |
| Discontinue a medication | 87.50% | 0.00% | 23.00 | 5.39 | 1.31 | 0.31 | 1.00 | 0.00 | 1.13 |
| View patient's medication history | 100.00% | 0.00% | 8.88 | 2.90 | 1.78 | 0.58 | 1.00 | 0.00 | 1.13 |
| Mark that a patient is not taking any medications | 75.00% | 0.00% | 18.67 | 2.07 | 1.07 | 0.12 | 1.04 | 0.10 | 1.88 |
| Prescribe a medication | 62.50% | 37.50% | 47.00 | 9.03 | 1.38 | 0.27 | 1.00 | 0.00 | 1.75 |
| Renew two medications | 87.50% | 0.00% | 16.14 | 1.77 | 1.54 | 0.17 | 1.14 | 0.38 | 1.13 |
| Prescribe a medication with a different pharmacy | 100.00% | 0.00% | 50.00 | 11.65 | 1.59 | 0.37 | 1.06 | 0.13 | 1.63 |
| Change the dose of a medication and eRx | 75.00% | 12.50% | 65.17 | 15.22 | 1.59 | 0.37 | 1.20 | 0.08 | 1.50 |
| Trigger a major drug-drug interaction and cancel the order | 100.00% | 0.00% | 22.75 | 3.85 | 1.30 | 0.22 | 1.13 | 0.27 | 1.25 |
| Trigger a moderate drug-drug interaction and override it | 100.00% | 0.00% | 23.88 | 4.64 | 1.54 | 0.30 | 1.06 | 0.18 | 1.00 |
| Prescribe a medication that will not trigger an interaction and eRx | 75.00% | 0.00% | 12.33 | 2.42 | 1.23 | 0.24 | 1.00 | 0.00 | 1.00 |
| Trigger a drug-allergy interaction and override it | 87.50% | 12.50% | 24.14 | 5.76 | 1.34 | 0.32 | 1.00 | 0.00 | 1.25 |
| Change the drug-drug interaction settings | 100.00% | 0.00% | 33.88 | 7.57 | 1.54 | 0.34 | 1.03 | 0.07 | 1.50 |
| Place and print a prescription (medication) | 100.00% | 0.00% | 53.67 | 5.65 | 1.39 | 0.15 | 1.00 | 0.00 | 2.00 |
| Enter a medication as a called-in Non-CPOE prescription | 83.33% | 16.67% | 131.00 | 38.10 | 1.29 | 0.38 | 1.02 | 0.05 | 2.92 |
| Place a lab order using the search function and a radiology order using the pre-defined list | 66.67% | 33.33% | 40.00 | 1.83 | 1.10 | 0.05 | 1.08 | 0.15 | 1.67 |
| Place and change a lab order using the pre-defined list and print the order | 83.33% | 16.67% | 31.20 | 6.34 | 1.02 | 0.21 | 1.03 | 0.06 | 1.83 |
| Place and change a radiology order using the pre-defined list | 100.00% | 0.00% | 22.50 | 2.88 | 1.00 | 0.13 | 1.00 | 0.00 | 1.50 |
| Access a patient encounter and review CDS alert | 100.00% | 0.00% | 20.25 | 3.10 | 1.19 | 0.18 | 1.00 | 0.00 | 1.50 |
| Trigger a CDS intervention from entered problem list data | 0.00% | 100.00% | n/a | n/a | n/a | n/a | n/a | n/a | 1.50 |
| Trigger a CDS intervention from entered medication list data | 100.00% | 0.00% | 87.75 | 7.93 | 1.02 | 0.09 | 1.00 | 0.00 | 2.50 |
| Trigger a CDS intervention from entered demographics | 75.00% | 25.00% | 35.00 | 14.42 | 1.35 | 0.55 | 0.90 | 0.00 | 1.75 |
| Trigger a CDS intervention from entered lab results | 75.00% | 0.00% | 28.00 | 1.73 | 1.75 | 0.11 | 1.00 | 0.00 | 1.25 |
| Trigger a CDS intervention from entered vital signs | 100.00% | 0.00% | 29.75 | 2.22 | 1.75 | 0.13 | 1.00 | 0.00 | 1.75 |
| Review and accept a CDS alert from entered problem list data | 50.00% | 50.00% | 52.50 | 16.26 | 1.59 | 0.49 | 1.05 | 0.07 | 1.50 |
| Enter information that will not trigger a CDS alert | 50.00% | 50.00% | 70.00 | 1.41 | 1.63 | 0.03 | 1.00 | 0.00 | 1.75 |
| Reconcile medications for a new patient without a summary of care document | 100.00% | 0.00% | 93.00 | 24.22 | 1.62 | 0.42 | 1.44 | 0.12 | 1.25 |
| Reconcile medications | 66.67% | 33.33% | 163.50 | 2.12 | 1.35 | 0.02 | 1.08 | 0.04 | 2.67 |
| Reconcile allergies | 0.00% | 100.00% | n/a | n/a | n/a | n/a | n/a | n/a | 2.67 |
| Reconcile problems | 0.00% | 100.00% | n/a | n/a | n/a | n/a | n/a | n/a | 2.67 |
| Validate and reconcile SOC document | 66.67% | 33.33% | 7.50 | 0.71 | 1.88 | 0.18 | 1.00 | 0.00 | 2.67 |

EFFECTIVENESS

Generally speaking, participants who were able to successfully complete tasks did so with very few deviations. However, the success percentage was not 100% for any single group of tasks (e.g., the medication allergy list). In some cases, a task success rate of less than 100% was solely the result of participants going over the allotted time to complete the task; these tasks are reflected in the quantitative data chart by a 0.00% error percentage.

The only task failures for the medication allergy section were caused by participants going over the allotted task time. Additionally, participants deviated very little from the optimal path. The only notable deviations involved participants clicking on an area that was already selected (e.g., the reaction section) or having to re-type in the search field due to a misspelling.

The medication list, eRx, drug-drug interaction, and medication-related CPOE functionality similarly yielded low error percentages. These four areas of the software all utilize the same third-party software, and therefore the tasks all occur within the same area and are intertwined. The medication list and eRx/CPOE areas are accessed the same way and default to eRx. Because of this, one main issue was that participants would forget to switch to the "manage meds" portion, which would allow them to enter a medication into the patient's record without actually prescribing it. For the eRx tasks, the opposite was also true in one case.

Users additionally struggled with the medication list task to change a medication dosage. On more than one occasion, the participant would use the "modify" button to change the dose of the medication, but this does not keep the old instance within the medication history as the task dictates. For the eRx tasks, the tasks to electronically prescribe a formulary medication and to change the dose of a medication and eRx yielded errors. As mentioned above, one participant incorrectly used the "manage meds" area to try to e-prescribe. Additionally, more than one participant did not select the medication that was in the patient's formulary. The error rate on the task to change the dose of a medication was low, due to one participant not stopping the previous medication.

The only issue with the drug-interaction tasks was the task to override a drug-allergy interaction: one participant selected the justification reason but did not actually override the interaction by clicking "prescribe anyway." The task to override a drug-drug interaction, which required the same workflow, yielded a 100% success rate, although in that case the participant was instructed not to enter a justification reason.


For the CPOE tasks, as mentioned previously, those involving CPOE for medication orders were generally very successful. Only one participant did not complete the task to enter a medication as a called-in non-CPOE order; however, this is a brand-new workflow required for Meaningful Use, so a small error percentage is to be expected given the learning curve. The CPOE tasks for lab and radiology orders had a few minor errors. Because the workflows for the two order types are the same, both the positive and the negative findings apply to both. Changing an order before it was placed was not an issue, but one participant struggled to delete and change an order that had already been placed. Additionally, one participant was unable to figure out how to print the lab order, and one selected the wrong radiology order.

The tasks for clinical decision support (CDS) and clinical information reconciliation yielded the lowest success percentages, which is somewhat understandable given that the functionality in these areas is brand new to this edition of CGM ENTERPRISE EHR™. The CDS tasks most challenging to the providers were those that required the user to refresh the screen before a CDS alert became visible: the tasks to activate CDS alerts based on problem and vital sign data, as well as the task to review and accept a CDS alert, since that alert was triggered from entered problem list data. Additionally, although most participants could tell whether a CDS alert was active, on one occasion a participant clicked the CDS button even though it was not red, meaning the CDS alert was not activated. During the clinical information reconciliation tasks, when adding medications, problems, and allergies from the summary of care document to the patient record, providers were expected to add any additional information that did not come through in the C-CDA. This was done for the medication reconciliation in most cases, but not all of the information was added for allergies and problems.
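The report quotes per-task success percentages but does not spell out the formula. The usual definition is tasks completed without error divided by total attempts, expressed as a percentage; the sketch below assumes that formulation, and the function name and counts are illustrative, not taken from this study.

```python
# Hedged sketch of a task success rate: the report does not publish its
# formula; this assumes success rate = successes / attempts * 100.

def success_rate(successes, attempts):
    """Percentage of attempts completed successfully."""
    if attempts <= 0:
        raise ValueError("attempts must be positive")
    if not 0 <= successes <= attempts:
        raise ValueError("successes must be between 0 and attempts")
    return 100.0 * successes / attempts

# Illustrative counts: nine of ten participants completing a task.
print(success_rate(9, 10))  # 90.0
```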

EFFICIENCY

When participants were able to successfully complete tasks, they generally did so with very few deviations. Often, a lower task deviation efficiency score (a score farther from 1) was the result of one or two extra clicks rather than a true departure from the optimal path. For example, a participant might attempt to click a button that was greyed out, then figure out the correct steps immediately afterward. The areas with notable path deviations are listed and explained below:

Enter a medication – the issues for this task were due to data entry deviations on the sig screen. Two participants removed the duration information, therefore removing the quantity, which then needed to be re-added.

Change the dose of a medication and eRx – a few participants prescribed a new medication and then stopped the old one as opposed to clicking on the currently prescribed med and selecting the option to change the dosage.

In general, during a few of the tasks, participants would misspell a medication or allergy they were searching for, causing them to re-type and search again. A related issue was that one participant attempted to search using abbreviations but was unable to, due to the way the system's search function works.
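The report does not state the exact formula behind its task deviation efficiency scores, but a score that equals 1 on the optimal path and drops as extra steps accumulate is commonly computed as the ratio of optimal-path steps to observed steps. The sketch below assumes that formulation; the function name and step counts are illustrative.

```python
# Assumed formulation: deviation efficiency = optimal steps / observed steps.
# 1.0 means the participant followed the optimal path exactly; extra clicks
# (e.g., clicking a greyed-out button) push the score below 1.

def path_deviation_efficiency(optimal_steps, observed_steps):
    if optimal_steps <= 0 or observed_steps <= 0:
        raise ValueError("step counts must be positive")
    return optimal_steps / observed_steps

# Illustrative: an 8-click optimal path completed in 10 clicks.
print(path_deviation_efficiency(8, 10))  # 0.8
```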

For many of the tasks, task time efficiency yielded a high score (a task time efficiency score close to 1). However, this was not the case overall, especially with respect to the CPOE for lab and radiology tasks.

18 | CompuGroup Medical

Safety-Enhanced Design

As detailed above, there were a number of tasks that participants were unable to complete within the allotted time. The tasks and overall areas reflecting this, as well as those with lower task time efficiency (a task time efficiency score farther from 1), include:

Tasks where the participant had to enter a medication – participants moved slowly through the sig screen and were often delayed when selecting the medication from a list (when using the search feature).

Electronically renew two medications – the increased time and lowered efficiency seemed to mostly be caused by participants not quite knowing where to begin and which steps to take.

Tasks for clinical decision support were also weaker overall in this respect. Although some of this can be attributed to the functionality and workflows being new, some patterns did emerge. Time was added for many participants when they had to accept a clinical decision support alert, as users would scroll past the required areas on the CDS alert interface.
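As with path deviation, the report does not publish the formula for its task time efficiency score. A common formulation consistent with "a score close to 1 is efficient" divides an expert benchmark time by the participant's observed time; the cap at 1.0 and the example times below are assumptions for illustration only.

```python
# Assumed formulation: time efficiency = optimal time / observed time,
# capped at 1.0 so that finishing faster than the benchmark does not
# inflate the score. The report's actual formula is not stated.

def task_time_efficiency(optimal_seconds, observed_seconds):
    if optimal_seconds <= 0 or observed_seconds <= 0:
        raise ValueError("times must be positive")
    return min(optimal_seconds / observed_seconds, 1.0)

# Illustrative: a 45-second benchmark task completed in 90 seconds.
print(task_time_efficiency(45, 90))  # 0.5
```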

SATISFACTION

The results from the System Usability Scale (SUS) put subjective satisfaction with the system, based on performance with these tasks, at 68. Broadly interpreted, scores under 60 represent systems with poor usability, while scores over 80 would be considered above average. On an individual task level, users ranked most tasks, on average, 1–2 (between “very easy” and “easy”). The tasks slightly outside this ranking were:

Enter a medication as a called in NON-CPOE order

Tasks related to clinical information reconciliation

The task ranking scores support the previously discussed quantitative data in these cases. Additionally, the task to review a CDS alert based on entered prescription information was ranked as more difficult, which the quantitative data does not reflect. However, one can infer that this was because of the additional steps necessary to sync and refresh the data in order for the alert to appear.
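The report does not show how the SUS score of 68 was computed, but the standard SUS scoring method is well documented: odd-numbered items contribute (response minus 1), even-numbered items contribute (5 minus response), and the sum is multiplied by 2.5 to give a 0–100 score. The Python sketch below implements that standard method; the function name and example responses are illustrative, not taken from this study.

```python
# Standard SUS scoring (ten items, each answered 1-5).
# Odd items are positively worded, even items negatively worded.

def sus_score(responses):
    """responses: list of 10 integers (1-5) in questionnaire order."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = 0
    for i, r in enumerate(responses):
        if i % 2 == 0:          # items 1, 3, 5, 7, 9
            total += r - 1
        else:                   # items 2, 4, 6, 8, 10
            total += 5 - r
    return total * 2.5

# Illustrative participant: 4 on every odd item, 2 on every even item.
print(sus_score([4, 2] * 5))  # 75.0
```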

MAJOR FINDINGS

Overall, participants liked the functionality presented during the usability testing and generally would recommend the software to their peers. Most areas of the software yielded good quantitative results (high success and efficiency, low error rates). One area commented upon especially positively was the orders tab, particularly the CPOE for radiology and labs. However, participants did comment that they did not like having to leave the orders area to create a medication order. This workflow is due to the lack of integration between CGM ENTERPRISE EHR™ and the third-party e-prescribing software, and it was a common complaint for all areas relying on the third-party e-prescribing workflow. Participants mentioned they did not like having to leave the patient encounter and would like to be able to prescribe or enter a medication while looking at the progress note at the same time. Participants also commented that they disliked how much scrolling was required, as well as the numerous drop-downs.

Areas related to medication allergy entry and drug-interaction checking were generally positive, the exception being any limitations related to the actual entry of the medication order. The brand-new areas relating to CDS and clinical information reconciliation were not as favorably received and did not reach the highest level of success. However, many participants noted that with some practice they would be able to navigate the workflow more easily. For clinical decision support, the tasks requiring the user to refresh the screen were less positively received than those that did not, and many participants mentioned they would like to see the software update immediately, both for clinical decision support and in general. The CDS alert itself was clear to most participants, although one user commented that it was not immediately obvious that the CDS button turned red when an alert was available. This is also reflected in the quantitative data, as there were times when a user selected the CDS alert even when it was not lit up. For clinical information reconciliation, participants noted that the screens were a bit busy and that they had some trouble figuring out what had been added to or removed from the record. What needed to be done was not immediately apparent, and the quantitative data shows this clearly in the lower success and higher error rates.

AREAS FOR IMPROVEMENT

Based on the quantitative findings, the verbal reports of the participants, and observations from the data logger, the following changes would most likely improve the overall usability of CGM ENTERPRISE EHR™.

Integrate Third-Party eRx/Medication List Software – Many of the issues with the medication list, eRx, drug-drug interaction, and medication-related CPOE functionality could be alleviated through tight integration of the third-party software. Currently, because it is not integrated, there is no way to alter its interface. Once integrated, the following changes should be implemented:

Clean up the interface so less scrolling is required.

Clearly distinguish the workflow for when a prescription is entered versus a medication that was previously prescribed.

Allow providers to prescribe or add medications while viewing the progress note/patient encounter.

Institute a “modify” button for changing a medication (whether a prescription or not) that keeps the old instance within the medication history.

Clinical Decision Support – Eliminate the need to refresh screens in order for problem list and vital sign data to be saved. This would affect clinical decision support as well as assist providers in their general day-to-day practice. Additionally, the clinical decision support pop-up should be modified to contain less information, which would make it more practical for the user and less overwhelming.

Clinical Information Reconciliation – Make missing information for clinical information reconciliation more obvious. Clean up the screens so that they are less busy, and modify the workflow so that a user can see what has been added to or removed from the reconciled list.

APPENDICES

The following appendices provide supplemental data for this usability test report:

1. Sample Recruiting Screener
2. Usability Test User Agreement
3. Addendum to Usability Test User Agreement
4. Usability Testing Moderator Introduction
5. Example Moderator’s Guide (Medication List)
6. Usability Testing Closing Comments and Final Questions
7. System Usability Scale Questionnaire

APPENDIX 1 – SAMPLE RECRUITING SCREENER

Participant Name (First & Last):

Practice Name:

Phone Number:

Email Address:

General Screening:

1. Male or Female?

2. Have you participated in a focus group or usability test in the past 6 months?

3. Do you, or does anyone in your home, work in marketing research, usability research, or web design?

4. Do you, or does anyone in your home, have a commercial or research interest in an electronic health record software or consulting company?

5. Which of the following best describes participant’s age? [23 to 39; 40 to 59; 60 to 74; 75 and older]

6. Which of the following best describes your race or ethnic group? [e.g., Caucasian, Asian, Black/African-American, Latino/or Hispanic, etc.]

7. Do you require any assistive technologies to use a computer? [if so, please describe, for example corrective lenses – lens strength]

8. Do you have a private work area to perform usability testing free of distractions [such as a private office]?

9. What internet connectivity options do you have available? LAN or Wi-Fi? Is this connectivity available in a place that is quiet and free of distractions? [describe]

Professional Demographics:

10. What is your name, position, and title? (Must be a healthcare provider who can prescribe)

MD: Specialty: DO: Specialty: NP: Specialty: PA: Specialty: Other: Name & Specialty:

11. How long have you held this position?

12. Describe your work affiliation and environment? [e.g., private practice, health system, government clinic, etc.]

13. Which of the following describes your highest level of education? [e.g., high school graduate/GED, some college, college graduate (RN, BSN), postgraduate (MD/PhD), other (explain)]

Computer Expertise:

14. Besides reading email or accessing the EHR, what other activities do you do on the computer? [e.g., research; reading news; shopping/banking; digital pictures; programming/word processing, etc.]

15. About how many hours per week do you spend on the computer? [Recruit according to the demographics of the intended users, e.g., 1 to 10, 11 to 20, 21 to 30, 31 to 40, 40 plus]

16. What computer platform do you usually use? [e.g., Mac, Windows, etc.]

17. In the last month, how often have you used an electronic health record? [Daily, _ Week(s); _ Mos]

18. How many years have you used an electronic health record? Has that been with EEHR, or have you used multiple EHRs?

19. How many EHRs do you use or are you familiar with?

20. How would you rate your ability to learn software applications? a. Excellent b. Good c. Average d. Poor

EEHR Experience:

21. How would you rate your overall knowledge of the EHR?

a. Excellent b. Good c. Average d. Poor

22. How does your practice typically document patient records?

a. On paper b. Some paper, some electronic c. All electronic

23. Do you currently use Dr. First for ePrescribing Medications?

a. Yes b. No c. Sometimes

24. Do you currently use the “orders” functionality in the EHR for lab and radiology orders?

a. Yes b. No c. Sometimes

25. How would you rate your internet bandwidth/connectivity speed in your office?

a. Excellent b. Good c. Fair d. Poor

Availability

26. Would you make yourself available to participate in two 1-2 hr. testing sessions that will take place the weeks of Nov 4th and Dec 2nd?

27. If yes, what time of day are you available (morning, afternoon, evening)? What days are best for you (M, T, W, Th, F, S, Sun)?

APPENDIX 2 – USABILITY TEST USER AGREEMENT

This Usability Test User Agreement (“Agreement”) is made and entered into on October ___, 2013 (“Effective Date”) by and between CompuGroup Medical, Inc., a Delaware corporation, having its principal place of business at 125 High Street, 8th Floor, Boston, MA 02110 (“CGM”) and the person or entity executing this Agreement (“User”). From time to time, CGM and User shall collectively be referred to herein as “parties,” and individually as “party.” This Agreement shall become effective upon execution of this Agreement by the User (the “Effective Date”), whose authorized signature below shall also serve to acknowledge the User’s acceptance of the terms and conditions herein, on behalf of User and User’s medical practice.

RECITALS

WHEREAS, CGM licenses electronic medical records software; WHEREAS, User has certain knowledge and skill with regards to these electronic medical records software products; and WHEREAS, CGM desires to sponsor a usability test group to evaluate an electronic health record system for the purpose of promoting collaboration, knowledge acquisition and overall product improvement (the “Purpose”). NOW, THEREFORE, in consideration of the foregoing and the promises and mutual covenants contained herein and for other good and valuable consideration, the receipt and adequacy of which are hereby acknowledged by the Parties, and intending to be legally bound hereby, the Parties hereto agree as follows:

AGREEMENT

The recitals set forth above in this Agreement are, by this reference, incorporated into and deemed a part of this Agreement.

1. Term. The term of this Agreement shall commence on the Effective Date stated above and shall end on the earlier of February 28, 2014 or when terminated by either party upon giving thirty (30) days’ written notice to the other party. Notwithstanding any termination, the obligations of the parties concerning confidentiality will, with respect to Confidential Information (as defined below) that constitutes a “trade secret” (as that term is defined under applicable law), be perpetual, and will, with respect to other Confidential Information, remain in full force and effect during the term and for five (5) years following the receipt of the Confidential Information or the termination of this Agreement, whichever is later.

2. Obligations.

2.1. Obligation of CGM. CGM shall use commercially reasonable efforts to host two (2) usability test sessions of approximately two (2) to three (3) hours in length during the term of this Agreement (“Usability Test Sessions”). CGM shall compensate User at an hourly rate of two hundred U.S. dollars (US$200.00) per hour of WebEx (remote) participation in every Usability Test Session; however, CGM shall not be responsible for any other incidental expenses related to User’s attendance of Usability Test Sessions. User participation in a Usability Test Session shall be measured in ten (10) minute intervals, and payment shall be prorated accordingly.

2.2. Obligation of User. User agrees to use best efforts to attend each WebEx (remote) Usability Test Session hosted by CGM during the term of this Agreement. User will be trained on the system, then asked to perform several tasks using a prototype and give feedback. User will adhere to all requirements regarding protection and use of Confidential Information.

3. Confidential Information.

3.1. Definition. “Confidential Information” means: (i) the terms and conditions of this Agreement; (ii) all information marked as “Confidential,” “Proprietary” or a similar legend if disclosed in writing or other tangible form; (iii) all information identified as “confidential,” “proprietary” or the like at the time of disclosure if information is disclosed orally; or (iv) all information User knows or reasonably should know is confidential, proprietary or trade secret information of CGM. For the avoidance of doubt, CGM product roadmaps, product development plans, pre-release products or product information, sales and marketing plans, and research and development activities, constitute Confidential Information whether or not designated as “Confidential” or “Proprietary.”

3.2. Exceptions to Confidential Information. User shall have no obligation with respect to information which (i) was rightfully in possession of or known to User without any obligation of confidentiality prior to receiving from CGM; (ii) is, or subsequently becomes, legally and publicly available without breach of this Agreement; (iii) is rightfully obtained by User from a source other than CGM without any obligation of confidentiality; or (iv) is developed by or for the User without use of the Confidential Information and such independent development can be shown by documentary evidence. Further, User may disclose Confidential Information pursuant to a valid order issued by a court or government agency, provided that User provides to CGM: (a) prior written notice of such obligation; and (b) a reasonable opportunity to oppose such disclosure or obtain a protective order.

3.3. User’s Obligation Regarding Confidential Information Received from CGM. User may only use Confidential Information in furtherance of the Purpose and shall not disclose the Confidential Information to any third party; provided, however, User may disclose Confidential Information to other members and employees of User’s medical practice pursuant to the terms of this Agreement, where applicable and in furtherance of the Purpose. User shall safeguard CGM’s Confidential Information with the same degree of care, but not less than reasonable care, as it uses to protect its own confidential or proprietary information.

3.4. CGM Ownership of Confidential Information. CGM retains all right, title and interest to the Confidential Information. No license to any existing or future intellectual property right is either granted or implied by the disclosure of Confidential Information. User may not reverse-engineer, decompile, or disassemble, modify or copy (except for making a single back-up copy) any software disclosed under this Agreement or in connection with the usability testing. User shall not remove, overprint, deface or change any notice of confidentiality, copyright, trademark, logo, legend or other notice on or related to Confidential Information, whether originals or copies.

3.5. Return or Destruction of Confidential Information. Upon written demand, User shall: (i) cease using the Confidential Information, (ii) return the Confidential Information and all copies, notes or extracts thereof to CGM within seven (7) calendar days of receipt of demand and/or destroy same, at the election of CGM; and (iii) certify in writing that User has complied with the obligations set forth in this paragraph.

3.6. Disclaimer. ALL CONFIDENTIAL INFORMATION PROVIDED BY CGM TO USER IS PROVIDED “AS IS.” CGM shall not be liable for the accuracy or completeness of the Confidential Information, and there are no express or implied representations or warranties by CGM with respect to the infringement of any intellectual property rights, or any right of privacy, or any rights of third persons.

4. User Information. User acknowledges and agrees that: (i) this Agreement does not protect disclosures made by User to CGM; and (ii) CGM does not wish to receive confidential, proprietary or trade secret information from User in connection with Usability Test Sessions. In the event that User wishes to disclose confidential, proprietary or trade secret information to CGM outside the scope of Usability Test Sessions or otherwise, the parties will execute a separate non-disclosure agreement detailing with specificity the information to be disclosed and the purpose(s) therefor.

5. Feedback. By providing any comments, suggestions, improvements or any other information or materials in connection with the Usability Test Sessions (collectively “Feedback”), User grants to CGM (including its sublicensees and assigns) a non-exclusive, irrevocable, worldwide, perpetual, royalty-free license, under all of User’s intellectual property rights, to use, display, copy, edit, create derivative works, market, sell, import, and distribute (including through resellers or multiple tiers of distribution) such Feedback. CGM may disclose and sublicense Feedback to third parties for any purpose. Any use of Feedback by CGM is in its sole discretion. User warrants and represents that it has all necessary rights to disclose Feedback or any other information or materials in connection with Usability Test Sessions.

6. Press Release/Disclosure. User shall not issue any press release or public disclosure regarding this Agreement or Usability Test Sessions without the prior written consent of CGM. User authorizes CGM to disclose User's name and practice information to other members of Usability Test Sessions and other third parties at CGM’s sole discretion.

7. Agreement to Participate in Usability Testing. User agrees to participate in Usability Testing conducted and recorded by CGM. User understands that participation in this Usability Testing is voluntary and agrees to immediately raise any concerns or areas of discomfort during the session with the testing administrator. User understands that she or he can leave a Usability Testing Session at any time; however, if User chooses to do so, User will be compensated only for the time he or she actually participated in the Usability Testing Session.

8. Recording Release. User agrees to participate in audio, video, and/or digital recording during the Usability Testing Sessions. User understands and consents to the use and release of any such recordings by CGM. User understands that the information and recording is for research and certification purposes only, and CGM agrees that User’s name and image will not be used for any other purpose without written authorization from User. User relinquishes any rights to the recording and understands the recording may be copied and used by CGM without further permission.

9. General.

9.1. Each party acknowledges that monetary remedies may be inadequate to protect Confidential Information and that CGM may seek injunctive relief in the event of any threatened or actual breach of any of the obligations hereunder.

9.2. This Agreement does not create a joint venture, employment relationship, agency, or partnership between the parties, which are independent contractors.

9.3. User may not assign this Agreement.

9.4. If any term of this Agreement shall be held to be illegal or unenforceable, such provision shall be modified to the minimum extent necessary so as to make it valid and enforceable, or severed if such modification is impossible, and as so modified the entire Agreement shall remain in full force and effect.

9.5. This Agreement shall be construed in accordance with the laws of the Commonwealth of Massachusetts, excluding its conflict of laws rules. Any legal or equitable action by or against Company arising out of or related in any way to this Agreement shall be brought solely in the federal or state courts located in Suffolk County, Massachusetts, the parties each expressly consenting to (and waiving any such challenge or objection to) such sole and exclusive personal jurisdiction and venue.

9.6. This Agreement is the entire agreement of the parties pertaining to the subject matter of this Agreement and may be modified only by a writing signed by both parties. This Agreement supersedes any and all prior oral discussions and/or written correspondence or agreements between the parties with respect thereto, all of which are excluded. The failure of a party to enforce its rights in the case of any breach of this Agreement shall not be construed to constitute a waiver of its rights with respect to any subsequent breach.

9.7. Except as set forth below, any notice required or permitted to be given by either party under this Agreement shall be in writing and will be effective and deemed given: (a) when delivered personally; (b) when sent by confirmed facsimile or e-mail (followed by the actual document by first class mail/overnight delivery service); (c) three days after having been sent by registered or certified mail, return receipt requested, postage prepaid; or (d) one day after deposit with a commercial overnight delivery service specifying next day delivery (or two days for international courier packages specifying two-day delivery), with written verification of receipt. To be effective, any notice to CGM hereunder must be addressed as follows: xxx

CompuGroup Medical, Inc.

By: ______________________________________

Print Name: _______________________________

Title: ____________________________________

Date: ____________________________________

Company:_______________________________

By: _____________________________________

Print Name: _____________________________

Title: __________________________________

Date: __________________________________

APPENDIX 3 – ADDENDUM TO USABILITY TEST USER AGREEMENT

A. The reference in Section 1 of the Agreement to “February 28, 2014” as the end of the term of the Agreement is hereby replaced with “March 31, 2014” as the new end of the term of the Agreement.

B. The reference in Section 2.1 of the Agreement to “two (2) usability test sessions” is hereby replaced with “three (3) usability test sessions.” For clarity, any usability test sessions already conducted since the Effective Date of the Agreement count toward the total number of usability test sessions contemplated in Section 2.1 of the Agreement.

EXECUTED BY THE AUTHORIZED SIGNATURES BELOW OF:

Company (“User”): ________________________

CompuGroup Medical, Inc.

Signature: _______________________________ Signature: __________________________________

Printed Name: ____________________________ Printed Name: _______________________________

Title/Position: ____________________________

Date: ___________________________________

Title/Position: _______________________________

Date: _______________________________________

APPENDIX 4 – USABILITY TESTING MODERATOR INTRODUCTION

Thank you for participating in this study. Our session today will last approximately 90 minutes. I will ask you to complete a few tasks using this system and answer some questions. We are interested in how easy (or how difficult) this system is to use, what in it would be useful to you, and how we could improve it.

You will be asked to complete these tasks on your own, trying to do them as quickly as possible with the fewest possible errors or deviations. Do not say anything more than asked. If you get lost or have difficulty, I cannot answer or help you with anything to do with the system itself. Please save your detailed comments until the end of a task or the end of the session as a whole, when we can discuss freely. Please be honest with your opinions.

The product version you will be using today is still in the development stages, so some of the data may not make sense. During the tasks you will be performing in each session, we will ask you to make decisions that may contradict how you would normally practice. Please follow the tasks as we have outlined them, because we have designed them to test the usability of specific workflows.

We are recording the audio and video of our session today. All of the information that you provide will be kept confidential, and your name will not be associated with your comments at any time. We will be utilizing WebEx for the session, which will allow you to watch and hear a brief training on the EHR areas you will be performing tasks in. Once the training is completed, I will ask you to review the task you will be performing before we begin, and I will turn the controls for the session over to you so that you can complete each task. The WebEx session will also be recording you and your actions throughout each task. If there are any technical difficulties, we will restart the task.

I will demonstrate briefly to familiarize you with the process and will then turn the controls over to you to give you a few minutes to get comfortable with using WebEx. Do you have any questions or concerns? Are you ready to begin?


31 | CompuGroup Medical

Safety-Enhanced Design


APPENDIX 5 – EXAMPLE MODERATOR’S GUIDE (MEDICATION LIST)

Summative Testing Process for §170.314(a)(6) Medication List

Moderator Guide: During these tasks we will ask you to make decisions that may differ from how you would practice. Please follow the tasks as we have outlined them because we have designed them to test the usability of specific workflows. Do not use the favorites list unless the task specifies it.

Pre-Entered Data:

Using Patient Record: James Whitcomb
Demographic Data: DOB: 05/29/1953
Previously Entered Medication: Simvastatin 20 mg tablet by mouth once daily.

Please take a few moments to review Task A and let me know when you are ready and we will begin.

PAUSE and wait for the participant to say they are ready.

Patient James Whitcomb is in the clinic today and during your visit today with him he gives you information about medications that have been prescribed previously or elsewhere that need to be added to his medication list.

TASK A: ENTER A MEDICATION

Enter the following active medication information:

Metoprolol Succinate 25 mg Extended Release tablet,

o 1 tablet by mouth daily

o Quantity 30 Refills: None

o Start Date: Unknown.

START

PARTICIPANT SAYS STOP

Thank you for completing Task A. On a scale of 1 to 5, 1 being simple and 5 being difficult, how would you rate this task?


Optimal Path:

START (Rx tab of the EHR Encounter)

Click “Prescribe using ePrescribing” hyperlink to launch Dr. First

Click Manage Meds

Click on search field

Enter Metoprolol

Click “Find”

Click Metoprolol Succinate 25 mg Extended Release tablet (verify the sig, should match what the provider is supposed to enter)

Click Continue

STOP

TASK B: CHANGE MEDICATION STRENGTH/DOSAGE

Please take a few moments to review Task B and let me know when you are ready and we will begin.

PAUSE and wait for the participant to say they are ready.

James Whitcomb has told you that his Metoprolol Succinate dosage has changed.

Change Metoprolol succinate 25 mg extended release tablet to Metoprolol succinate 50 mg extended release tablet; only the dosage has changed, and the instructions remain the same.

Make the change in the medication list, making sure that Metoprolol succinate 25 mg extended release tablet remains within the medication history.

START

PARTICIPANT SAYS STOP

Thank you for completing Task B. On a scale of 1 to 5, 1 being simple and 5 being difficult, how would you rate this task?

Optimal Path:

START (Manage Meds in eRX)

Metoprolol succinate 25 mg stop link

Select Justification Drop Down

Select “Dosage Change”

Click Stop Medication

Click on search field

Enter metoprolol succinate

Click “find”


Select metoprolol succinate 50 mg extended release tablet (verify sig)

Continue

STOP

TASK C: DISCONTINUE A MEDICATION

Please take a few moments to review Task C and let me know when you are ready and we will begin.

PAUSE and wait for the participant to say they are ready.

James Whitcomb tells you he has completed taking the Simvastatin 20 mg on October 1, 2013.

Stop Simvastatin 20 mg.

The patient completed this medication on October 1, 2013.

START

PARTICIPANT SAYS STOP

Thank you for completing Task C. On a scale of 1 to 5, 1 being simple and 5 being difficult, how would you rate this task?

Optimal Path:

START (Manage Meds in eRX)

Click Stop link for Simvastatin 20 mg

Select Justification Drop Down

Select “Completion of Therapy”

Change date to October 1, 2013

Click Stop Medication button

STOP

TASK D: VIEW THE PATIENT’S MEDICATION HISTORY

Please take a few moments to review Task D and let me know when you are ready and we will begin.

PAUSE and wait for the participant to say they are ready.

View James Whitcomb’s Medication History

START

PARTICIPANT SAYS STOP


Thank you for completing Task D. On a scale of 1 to 5, 1 being simple and 5 being difficult, how would you rate this task?

Optimal Path:

START (Manage Meds in eRX)

Click Show Medication History

STOP

TASK E: MARK THAT PATIENT IS CURRENTLY NOT TAKING ANY MEDICATIONS

Please take a few moments to review Task E and let me know when you are ready and we will begin.

PAUSE and wait for the participant to say they are ready.

James Whitcomb has informed you that he is no longer taking any medications. Mark that change in his record.

START

PARTICIPANT SAYS STOP

Thank you for completing Task E. On a scale of 1 to 5, 1 being simple and 5 being difficult, how would you rate this task?

Optimal Path:

START (Manage Meds in eRX)

Stop Metoprolol 50 mg

Click Prescribe

Mark Patient Takes No Medications radio button

STOP


APPENDIX 6 – USABILITY TESTING CLOSING COMMENTS AND FINAL QUESTIONS

Closing Comments

This concludes usability testing. At this time I would like to ask you a few questions:

1) What was your overall impression of this system?

2) What aspects of the system did you like the least?

3) Were there any features that you were surprised to see?

4) What features did you expect to encounter, but did not see? That is, is there anything that is missing in this application?

5) Compare this system to other systems that you have used.

6) Would you recommend this system to your colleagues based on the functionality that you saw today?

We have a formal survey which is a critical part of our analysis of usability. Please take a moment and complete it now. Thank you for participating.


APPENDIX 7 – SYSTEM USABILITY SCALE QUESTIONNAIRE (SUS)

(1 = Strongly Disagree, 5 = Strongly Agree)

1. I think that I would like to use this system frequently.

1 2 3 4 5

2. I found the system unnecessarily complex.

1 2 3 4 5

3. I thought the system was easy to use.

1 2 3 4 5

4. I think that I would need the support of a technical person to be able to use this system.

1 2 3 4 5

5. I found the various functions in this system were well integrated.

1 2 3 4 5

6. I thought there was too much inconsistency in this system.

1 2 3 4 5

7. I would imagine that most people would learn to use this system very quickly.

1 2 3 4 5

8. I found the system very cumbersome to use.

1 2 3 4 5

9. I felt very confident using the system.

1 2 3 4 5

10. I needed to learn a lot of things before I could get going with this system.

1 2 3 4 5

11. Please provide any additional feedback on the back of the questionnaire.
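Though not spelled out in the questionnaire above, the conventional SUS scoring rule for the ten items is: odd-numbered (positively worded) items contribute (response − 1) points, even-numbered (negatively worded) items contribute (5 − response) points, and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch of that arithmetic, not part of the original study materials and with an illustrative function name:

```python
def sus_score(responses):
    """Compute the System Usability Scale score (0-100) from ten
    1-5 Likert responses, ordered as items 1 through 10."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd items are positively worded: higher agreement scores higher.
        # Even items are negatively worded: lower agreement scores higher.
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# Example: the most favorable answer on every item yields 100.
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # 100.0
```

A study's reported SUS result is typically the mean of this per-participant score across all participants.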


CompuGroup Medical US • 3300 N. Central Ave, Suite 2100 • Phoenix, AZ 85012

Synchronizing Healthcare • www.CGMus.com

Amendment: User-Centered Design Process §170.314(g)(3) Safety-enhanced design

The below user-centered design process was used in the development of CGM ENTERPRISE EHR™ 10.0.4:

NISTIR 7741: NIST Guide to the Processes Approach for Improving the Usability of Electronic Health Records https://www.nist.gov/manuscript-publication-seach.cfm?pub_id=907313

The UCD method was applied to the following certification criteria:

§170.314(a)(1) Computerized provider order entry

§170.314(a)(2) Drug-drug, drug-allergy interaction checks

§170.314(a)(6) Medication list

§170.314(a)(7) Medication allergy list

§170.314(a)(8) Clinical decision support

§170.314(b)(3) Electronic prescribing

§170.314(b)(4) Clinical information reconciliation


Amendment: Test Environment §170.314(g)(3) Safety-enhanced design

Test Environment

The EHRUT would typically be used in a healthcare office or facility. In this instance, testing was conducted remotely: each participant logged in from a computer located in his or her own practice. For testing, the computer used was an HP EliteBook running Windows 7. The participants used a mouse and keyboard when interacting with the EHRUT. The EHRUT used a display with the following set-up: approximately 15 inches, resolution 1280x768, Windows Basic colors. The application was set up by CompuGroup Medical according to the vendor’s documentation describing the system set-up and preparation. The application itself was running on Windows, using a test database over a LAN connection. Technically, the system performance (i.e., response time) was representative of what actual users would experience in a field implementation. Additionally, participants were instructed not to change any of the default system settings (such as control of font size).

Page 49: ONC HIT Program Test Results Summary for 2014 Edition EHR ... · Address: 3300 N. Central Ave, Suite 2100 ONC HIT Certification Program Test Results Summary for 2014 Edition EHR Certification

CompuGroup Medical US • 3300 N. Central Ave, Suite 2100 • Phoenix, AZ 85012

Synchronizing Healthcare • www.CGMus.com

Amendment: Demographics §170.314(g)(3) Safety-enhanced design

Participants

Part ID

Gender Age Range

Education Occupation/role

Professional Experience

Computer Experience Product Experience

Assistive Technology Needs

1 Female 23-39

PA PA – Family Medicine

4 Years in Position

Uses computer 31-40 hours a week for research, reading news, shopping/banking, digital pictures, word processing; typically uses Windows computers

Uses product daily – has used for 3 months

None

2 Male

40-59

CFNP Nurse – Family Practice

5 months in Position

Uses computer 40+ hours a week for reading news, banking, digital pictures, word processing; typically uses Windows and Mac computers

Uses product daily – has used for 4 months

None

4 Female 40-59

NP

Nurse – Women’s Health

19 Years in Position

Uses computer 40+ hours a week for shopping, banking, word processing; typically uses Windows computers

Uses product daily – has used for 9 months

Yes – not specified

5 Female 23-39

NP

Nurse – Pediatrics

8.5 Years in Position

Uses computer 11-20 hours a week for reading news, shopping, banking, digital pictures; typically uses Windows computers

Uses product daily – has used for 2-3 years

None

6 Female 40-59

MD

Medical Doctor – Neurology

27 Years in Position

Uses computer 40+ hours a week for research, reading news, shopping, banking, digital pictures, word processing; typically uses Windows computers

Uses product daily – has used for 3 years and 11 months

Yes – not specified

8 Female 23-39

PA-C

PA – Neurology

9 Years in Position

Uses computer 40+ hours a week for shopping, banking, word processing; typically uses Windows computers

Uses product daily – has used for 6 years

None

9 Male 40-59

MD

Medical Doctor – Family Practice

11 Years in Position

Uses computer 40+ hours a week for research, reading news, digital pictures, word processing; typically uses Windows computers

Uses product daily – has used for 2 ½ years

None

10 Male 23-39

MD

Medical Doctor – Family Practice

5 Years in Position

Uses computer 40+ hours a week for research, reading news, shopping, banking, digital pictures, word processing; typically uses Windows and Mac computers

Uses product daily – has used for 2 ½ years

None

11 Female 23-39

MD

Medical Doctor – Cardiology

2 Years in Position

Uses computer 31-40 hours a week for research, reading news, shopping, banking, digital pictures, word processing; typically uses Windows computers

Uses product daily – has used for 12 years

None


Test Results Summary for 2014 Edition EHR Certification

17‐3200‐R‐0021‐PRA V1.0, January 17, 2018

Appendix B: Quality Management System

©2018 InfoGard. May be reproduced only in its original entirety, without revision


Quality Management System Attestation Form-EHR-37-V02

InfoGard Laboratories, Inc. Page 1

For reporting information related to testing of 170.314(g)(4).

Vendor and Product Information

Vendor Name CompuGroup Medical, Inc.

Product Name CGM ENTERPRISE EHR™

Product Version 10.1

Quality Management System

Type of Quality Management System (QMS) used in the development, testing, implementation, and maintenance of EHR product.

Based on Industry Standard (for example ISO9001, IEC 62304, ISO 13485, etc.). Standard:

A modified or “home-grown” QMS.

No QMS was used.

Was one QMS used for all certification criteria or were multiple QMS applied?

One QMS used.

Multiple QMS used.

Description or documentation of QMS applied to each criteria: 14-03-04 EEHR QMS

Not Applicable.

Statement of Compliance

I, the undersigned, attest that the statements in this document are complete and accurate.

Vendor Signature by an Authorized Representative

Date July 15, 2015



Appendix C: Privacy and Security




Test Results Summary Document History

Version  Date       Description of Change
V1.0     1/17/2018  Initial release

END OF DOCUMENT
