How Did WE Do? Evaluating the Student Experience

Posted on 22-Jan-2018 · Category: Healthcare


Transcript

Welcome

The National Cooperative Agreement on

Clinical Workforce Development

Presented by the Community Health Center, Inc.

& the National Nurse-Led Care Consortium

WEBINAR 4: How did WE do? Evaluating the Student Experience

June 6, 2017

Let’s Review…

Spring 2016 #1: Why Start or Expand Student Experience Programs at Your Health Center?

Spring 2016 #2: Creating a Process that Works for YOU: Infrastructure for a Successful Student Training Program

Spring 2016 #3: How to Make it Work for the Students

Spring 2016 #4: How to Create Life Changing Experiences for Students

Spring 2017 #1: Developing a Student Training Process: On-Boarding and Orienting to Your Health Center Community

Spring 2017 #2: Team Based Care 101 for Health Professions Students

Spring 2017 #3: Enhancing the Student Experience Through Effective Precepting

Spring 2017 #4: How did WE do? Evaluating the Student Experience

Speakers

From Community Health Center, Inc.:

Anna Rogers, Director of the National Cooperative Agreement

Reema Mistry, MPH, Project Coordinator, National Cooperative Agreement

Amanda Schiessl, Interprofessional Student Coordinator

From National Nurse-Led Care Consortium:

Kristine Gonnella, MPH, Director for Training and Technical Assistance

Cheryl Fattibene, Chief Nurse Practitioner Officer

From UPenn School of Nursing:

Carrie Doherty, Director of Implementation-CMS/GNE Demonstration

From Community College of Philadelphia:

Laureen Tavolaro-Ryley, MSN, CNS

From the National Nurse Practitioner Residency and Fellowship Training Consortium

Candice Rettie, PhD, Executive Director

Learning Objectives:

1. Participants will describe two key components for a student to successfully interface with HR in an FQHC environment.

2. Participants will identify two reasons why providing evaluative feedback will enhance the student training program.

Get the Most Out of Your Zoom Experience

• Send your questions using the Q&A function in Zoom

• Look for our polling questions

• Recording and slides will be available on our website (www.chc1.com/nca) within one week after the presentation

Program Evaluation: Improving Performance

Presented by Candice S. Rettie, PhD. NCA Webinar, June 2017

Overview of the Session

Definitions and Process of Good Program Evaluation

Designing Meaningful Evaluation for Your FQHC Educational Program
– Crosswalk evaluation with curriculum
– Integrated throughout the learning experience
– Creates explicit expectations for the learner
– Documents programmatic success
– Fosters improvement, positive growth, creativity, and innovation

Characteristics of Useful Evidence

Learning Objectives

Knowledge:
– Understand the purpose of evaluation
– Know the characteristics of good evaluation
– Understand the process of evaluation
– Understand the connection with curriculum

Attitude:
– Embrace the challenge
– Value the outcomes

Skills:
– To be gained by partnering with local experts and collaborating with a focus on the local program

Definitions:
Evaluation: systematic investigation of the merit, worth, or significance of an effort.
Program evaluation: evaluation of specific projects and activities that target audiences may take part in.
Stakeholders: those who care about the program or effort.

Approach: practical, ongoing evaluation involving program participants, community members, and other stakeholders.

Importance:
1. Helps clarify program plans;
2. Improves communication among participants and partners;
3. Gathers the feedback needed to improve and be accountable for program outcomes/effectiveness;
4. Gains insight about best practices and innovation;
5. Determines the impact of the program;
6. Empowers program participants and contributes to organizational growth.

4 Basic Steps to Program Evaluation

1. Develop a Written Plan Linked to Curriculum
2. Collect Data
3. Analyze Data
4. Communicate and Improve

Fitting the Pieces Together: Program Evaluation

The pieces: Program Curriculum, Preceptor/Faculty/Staff, Learner, Institution, Overall Program

Crosswalk Curriculum and Evaluation

Program Evaluation Feedback Loops

Learner performance

Instructor and staff performance

Program curriculum performance

Programmatic and Institutional performance

Evaluation Process: How Do You Do It?

Steps in Evaluation:

Engage stakeholders

Describe the program

Focus the evaluation design

Gather credible evidence

Justify conclusions: analyze, synthesize, and interpret findings; provide alternate explanations

Feedback, follow up, and disseminate: ensure use and share lessons learned

Kirkpatrick Model of Evaluation

Level 1: Reaction (satisfaction surveys). Was it worth the time? Was it successful? What were the biggest strengths/weaknesses? Did they like the physical plant?

Level 2: Learning (formative: performance observation/checklist; monitored skill demonstration). Observable/measurable behavior change? Can they teach others/patients?

Level 3: Behavior (observations/interviews, 3-12 weeks later). New or changed behavior after the program? Can they teach others? Are trainees aware of the change?

Level 4: Results (program goals/institutional goals, 3-12 months later). Improved employee retention? Increased productivity for new employees? Higher morale?

Relevance of Evaluation

Questions Guiding the Evaluation Process

• What will be evaluated?

• What criteria will be used to judge program performance?

• What standards of performance on the criteria must be reached for the program to be considered successful?

• What evidence will indicate performance on the criteria relative to the standards?

• What conclusions about program performance are justified based on the available evidence?

Credible evidence: the raw material of a good evaluation. Believable, trustworthy, and relevant answers to the evaluation questions.

Indicators (evidence): translate general concepts about the program and its expected effects into specific, measurable parts (e.g., increase in patient panel / billable hours over 1 year); a minimal calculation sketch follows below.

Sources: people, documents, or observations (e.g., trainees, faculty, patients, billable hours, reflective journals). Use multiple sources to enhance the evaluation's credibility. Integrate qualitative and quantitative information for a more complete picture that serves the needs and expectations of a wider range of stakeholders.

Quantity: determine how much evidence will be gathered in the evaluation. All evidence collected should have a clear, anticipated use.

Logistics: a written plan covering methods, timing (formative and summative), and the physical infrastructure to gather and handle evidence. It must be consistent with the cultural norms of the community and must ensure confidentiality is protected.
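The billable-hours example above lends itself to a simple calculation. The following is a minimal, hypothetical sketch (not part of the webinar materials) of how such an indicator might be computed; the function name and the numbers are illustrative assumptions.

```python
# Hypothetical sketch: turning the example indicator
# "increase in billable hours over 1 year" into a number.
# All figures below are made-up placeholders.

def percent_change(baseline: float, current: float) -> float:
    """Percent change from baseline to current, e.g. year over year."""
    if baseline == 0:
        raise ValueError("baseline must be non-zero")
    return (current - baseline) / baseline * 100

# Example: billable hours at the start of the training year vs. one year later.
baseline_billable_hours = 1200   # assumed value
current_billable_hours = 1380    # assumed value

change = percent_change(baseline_billable_hours, current_billable_hours)
print(f"Billable hours changed by {change:.1f}% over 1 year")  # -> 15.0%
```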

Goals of Good Evaluation

• Purposeful, systematic

• Formative (on-going) and summative (final) data collection

• Conduct quality assurance

• Determine training effectiveness

• Communicate results

• Build capacity

Take Away Points:

Partner with your local university education experts

Crosswalk the evaluation with the curriculum
• Anchored in the goals and objectives of the curriculum

Accuracy, Utility, Feasibility, Propriety
• Measurable and observable criteria of acceptable performance
• Multiple, expert ratings/raters: multiple observations give confidence in findings and provide an estimate of reliability (reproducibility or consistency in ratings)
• Validity achieved with a synthesis of measurements that are commonly accepted, meaningful, and accurate

Formative and summative

Conclusions need to be relevant and meaningful

Resources:

The Community Tool Box (Work Group for Community Health at the University of Kansas): an incredibly complete and understandable resource that provides theoretical overviews, practical suggestions, a tool box, checklists, and an extensive bibliography.

Pell Institute: a user-friendly toolbox that steps through every point in the evaluation process: designing a plan, data collection and analysis, dissemination and communication, and program improvement.

CDC: an evaluation workbook for obesity programs; its concepts and detailed work products can be readily adapted to NP postgraduate programs.

Designing Your Program Evaluation Plans: another wonderful resource that provides a self-study approach to evaluation for nonprofit organizations and is easily adapted to training programs. It includes checklists and suggested activities, as well as recommended readings.

NNPRFTC website blog: http://www.nppostgradtraining.com/Education-Knowledge/Blog/ArtMID/593/ArticleID/2026/Accreditation-Standard-3-Evaluation

POLLING QUESTION

Do you use an evaluation tool to assess the student training experience at your health center?

How did WE do? Evaluating the Student Experience

Laureen Tavolaro-Ryley, Community College of Philadelphia

Cheryl Fattibene and Kristine Gonnella, National Nurse-Led Care Consortium

Caroline Dougherty, UPenn School of Nursing

National Nurse-Led Care Consortium (NNCC)

About us:

Membership organization that supports nurse-led care and nurses at the front lines of care.

Supported by the Health Resources and Services Administration (HRSA) under grant number U30CS09736, a National Training and Technical Assistance Cooperative Agreement (NCA).

Focus of discussion:

1. How do we identify and assess successes and challenges in our student training program?

2. How do we adjust processes to improve the student training program and experience?

CMS GNE Demonstration Project

Goals:

• Test the feasibility, effectiveness, and cost of increasing APRN production with a Medicare payment model

• Increase the number of APRNs in primary care settings, in underserved areas, and providing care to Medicare beneficiaries

• Increase clinical training opportunities

Evaluating the Student Experience

1. Identify current process
• Initial interface
• Administrative onboarding
• Clinical onboarding
• Training
• Off-boarding

2. Evaluate steps of process: What is and isn’t working?
• Informal discussions
• Focus groups
• Surveys
• Observations

3. Communicate outcomes of evaluation
• Provider
• Administrative
• Clinic staff
• Student
• Academic partnership

4. Adjust process based on outcomes

Survey:

Student:

1. This experience was different from my other clinical rotations. (yes/no) If yes, how was it different?

2. I would recommend this experience to my classmates.

3. Please rate this training experience (1-5)

4. I felt more likely to ask questions during this rotation than others.

5. I received more feedback on my performance during this rotation than others.

6. Suggestions for future semesters

Survey:

Preceptor:

1. This experience was different from my precepting experiences in the past. (yes/no) If yes, how was it different?

2. Please rate this training experience (1-5)

3. This experience allowed me to spend more time with my patients compared to when I have had a student with me in the past.

4. This experience makes me more likely to commit to precepting a student in future semesters.

5. Suggestions for future semesters
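The two surveys above collect yes/no answers and 1-5 ratings. The following is a minimal, hypothetical sketch (not part of the webinar materials) of how a program coordinator might tally such responses; the field names and sample data are illustrative assumptions.

```python
from statistics import mean

# Hypothetical student survey responses (illustrative placeholder data, not real results).
responses = [
    {"different_rotation": "yes", "would_recommend": "yes", "rating": 5},
    {"different_rotation": "yes", "would_recommend": "yes", "rating": 4},
    {"different_rotation": "no",  "would_recommend": "yes", "rating": 3},
]

def percent_yes(rows, field):
    """Share of respondents answering 'yes' to a yes/no question."""
    return 100 * sum(r[field] == "yes" for r in rows) / len(rows)

print(f"Different from other rotations: {percent_yes(responses, 'different_rotation'):.0f}% yes")
print(f"Would recommend to classmates:  {percent_yes(responses, 'would_recommend'):.0f}% yes")
print(f"Average overall rating (1-5):   {mean(r['rating'] for r in responses):.1f}")
```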

Evaluating the Student Experience

[Cycle diagram] Student Experience: Identify Process → Evaluate Process → Communicate Outcomes → Adjust Process

Community College of Philadelphia: 19130 Zip Code Project

Quality and Safety Education for Nurses (QSEN)

Community College of Philadelphia: 19130 Zip Code Project

• Clear Objectives with Community Partners

• Clear Objectives with Students

• Measurement of the Objectives

• Share Documents

• End of Semester Wrap up

Community College of Philadelphia: 19130 Zip Code Project

• Continuous Evaluation of Service Learning

• Assessment of the Community

• Curriculum Review

• Shift in Community-Based Care

Reminders

Complete our survey!

Sign up at www.chc1.com/NCA
