Page 1: NRS Data Monitoring for Program Improvement


NRS Data Monitoring for Program Improvement

Unlocking Your Data

Page 2: NRS Data Monitoring for Program Improvement


Objectives—Day 1

1. Describe the importance of getting involved with and using data;

2. Identify four models for setting performance standards as well as the policy strategies, advantages, and disadvantages of each model;

3. Determine when and how to adjust standards for local conditions;

4. Set policy for rewards and sanctions for local programs;

5. Identify programmatic and instructional elements underlying the measures of educational gain, NRS follow-up, enrollment, and retention.

Page 3: NRS Data Monitoring for Program Improvement


Agenda—Day 1

Welcome, Introduction, Objectives, Agenda Review

The Power of Data

– Why Get Engaged with Data? Exercise

– The Data-driven Program Improvement Model

– Setting Performance Standards

– Adjusting Standards for Local Conditions

– Establishing a Policy for Rewards and Sanctions

Getting Under the Data

– Data Pyramids

– Data Carousel

Evaluation and Wrap-up for Day 1

Page 4: NRS Data Monitoring for Program Improvement


Objectives—Day 2

1. Distinguish between the uses of desk reviews and on-site monitoring of local programs;

2. Identify steps for monitoring local programs;

3. Identify and apply key elements of a change model; and

4. Work with local programs to plan for and implement changes that will enhance program performance and quality.

Page 5: NRS Data Monitoring for Program Improvement


Agenda—Day 2

Agenda Review

Planning for and Implementing Program Monitoring

– Desk Reviews Versus On-site Reviews

– Data Sources (small group work)

– Steps and Guidelines for Monitoring Local Programs

Planning for and Implementing Program Improvement

– A Model of the Program Improvement Process

– State Action Planning

Closing and Evaluation

Page 6: NRS Data Monitoring for Program Improvement


STOP! Why Get Engaged with Data?

Page 7: NRS Data Monitoring for Program Improvement


Question for Consideration

Why is it important to be able to produce evidence of what your state (or local) adult education program achieves for its students?

Page 8: NRS Data Monitoring for Program Improvement


The Motivation Continuum

Intrinsic ←→ Extrinsic

Which is the more powerful force for change?

Page 9: NRS Data Monitoring for Program Improvement


NRS Data-driven Program Improvement (Cyclical Model)

STEPS

– Set performance standards

– Examine program elements underlying the data

– Monitor program data, policy, and procedures

– Plan and implement program improvement

– Evaluate progress and revise, as necessary, and recycle

Page 10: NRS Data Monitoring for Program Improvement


What’s Under Your Data? The Powerful Ps

Performance (Data)

Program Policies

Procedures

Processes

Products

Page 11: NRS Data Monitoring for Program Improvement


NRS Data-driven Program Improvement Model

[Cyclical diagram: NRS Data feeds a cycle of Set Performance Standards → Examine Program Elements Underlying the Data → Monitor Program Data, Policy, Procedures → Plan and Implement Program Improvement; Evaluate Improvement → back to NRS Data.]

Page 12: NRS Data Monitoring for Program Improvement


Educational Gains for ESL Levels and Performance Standards

[Bar chart (Exhibit 1-2): educational gains by ESL level, comparing program performance against performance standards for levels Beg. Lit, Beg., Low Int., High Int., Low Adv., and High Adv.; y-axis 0% to 100%.]
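The check implied by Exhibit 1-2 is simple to automate in a desk review. A minimal sketch in Python, using invented per-level figures (the exhibit's exact pairings are not reproduced here):

```python
# Minimal sketch: flag ESL levels where educational-gain performance falls
# below the performance standard, as in Exhibit 1-2.
# NOTE: all percentages below are invented for illustration.
standards = {"Beg. Lit": 0.35, "Beg.": 0.40, "Low Int.": 0.35,
             "High Int.": 0.30, "Low Adv.": 0.28, "High Adv.": 0.25}
performance = {"Beg. Lit": 0.80, "Beg.": 0.91, "Low Int.": 0.26,
               "High Int.": 0.34, "Low Adv.": 0.31, "High Adv.": 0.22}

for level, std in standards.items():
    actual = performance[level]
    flag = "meets standard" if actual >= std else "BELOW standard"
    print(f"{level:10} standard {std:>4.0%}  actual {actual:>4.0%}  {flag}")
```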

Page 13: NRS Data Monitoring for Program Improvement


Questions Raised by Exhibit 1-2

How were performance standards set? Based on past performance?

Are standards too low at the higher levels?

Is the performance pattern similar to that of previous years? If not, why not?

What are the program’s assessment and placement procedures? Same assessments for high and low ESL?

How do curriculum and instruction differ by level?

What are student retention patterns by level?

Page 14: NRS Data Monitoring for Program Improvement


The Power of Data: Setting Performance Standards

Page 15: NRS Data Monitoring for Program Improvement


Essential Elements of Accountability Systems

• Goals

• Measures

• Performance Standards

• Sanctions and Rewards

Page 16: NRS Data Monitoring for Program Improvement


National Adult Education Goals

Reflected in NRS Outcome Measures of educational gain, GED credential attainment, entry into postsecondary education, and employment.

Page 17: NRS Data Monitoring for Program Improvement


Performance Standards

Similar to a “sales quota”: how well are you going to perform this year?

– Should be realistic and attainable, but

– Should stretch you toward improvement

Set by each state in collaboration with ED

Each state’s performance is a reflection of the aggregate performance of all the programs it funds

Page 18: NRS Data Monitoring for Program Improvement


Standards-setting Models

Continuous Improvement

Relative Ranking

External Criteria

Return on Investment (ROI)

Page 19: NRS Data Monitoring for Program Improvement


Continuous Improvement

Standard based on past performance

Designed to make all programs improve compared to themselves

Works well when there is stability and a history of performance on which to base the standard

Ceiling reached over time, resulting in little additional improvement

Page 20: NRS Data Monitoring for Program Improvement


Relative Ranking

Standard is mean or median performance of all programs

Programs ranked relative to each other

Works for stable systems where median performance is acceptable

Improvement focus mainly on low-performing programs

Little incentive for high-performing programs to improve

Page 21: NRS Data Monitoring for Program Improvement


External Criteria

Set by formula or external policy

Promotes a policy goal to achieve a higher standard

Used when large-scale improvements are called for, over the long term

No consideration of past performance: may be unrealistic or unattainable

Page 22: NRS Data Monitoring for Program Improvement


Return on Investment

Value of program relative to cost of program

A business model; answers the question: Are the services or program worth the investment?

Can be a powerful tool for garnering funding (high ROI) or for losing funding (low ROI)

May ignore other benefits of program
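Seen side by side, the four models reduce to different arithmetic for producing next year's standard. A minimal sketch under simplified assumptions; the stretch increment, policy target, and dollar figures are invented for illustration:

```python
from statistics import median

# Minimal sketch (simplified assumptions): the arithmetic each
# standards-setting model implies.

def continuous_improvement(past_rate, stretch=0.05):
    # Past performance plus a stretch toward improvement.
    return min(past_rate + stretch, 1.0)

def relative_ranking(all_program_rates):
    # Standard is the median (or mean) performance of all programs.
    return median(all_program_rates)

def external_criteria(policy_target):
    # Set by formula or external policy, regardless of past performance.
    return policy_target

def return_on_investment(program_value, program_cost):
    # Value of program relative to its cost (a ratio, not a rate).
    return program_value / program_cost

rates = [0.22, 0.26, 0.31, 0.34, 0.50]        # illustrative gain rates
print(continuous_improvement(0.31))            # ≈ 0.36
print(relative_ranking(rates))                 # 0.31
print(external_criteria(0.40))                 # 0.4
print(return_on_investment(120_000, 80_000))   # 1.5
```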

Page 23: NRS Data Monitoring for Program Improvement


Decision Time for State Teams

1. Which model(s) do you favor for setting standards for/with locals?

2. Is it appropriate to use one statewide model or different models for different programs?

3. How will you involve the locals in setting the standards they will be held to?

Page 24: NRS Data Monitoring for Program Improvement


Question for Consideration

How do the standard-setting model(s) that states select represent a policy statement on the relationship between performance and quality that states want to instill in local programs?

Page 25: NRS Data Monitoring for Program Improvement


Adjusting Standards for Local Conditions

Research suggests that standards often need to be adjusted for local conditions before locals can work to improve program quality.

WHY IS THIS SO?

Page 26: NRS Data Monitoring for Program Improvement


Factors that May Require Adjustment of Standards

Student Characteristics

– An especially challenging group

– Students at lower end of level

– Influx of different types of students

Local Program Elements

External Conditions

Page 27: NRS Data Monitoring for Program Improvement


Shared Accountability

State and locals share responsibility to meet accountability requirements

– State provides tools and environment for improved performance

– Locals agree to work toward improving performance

Page 28: NRS Data Monitoring for Program Improvement


Locals should know…

The purpose of the performance standards;

The policy and programmatic goals the standards are meant to accomplish;

The standard-setting model that the state adopts; and

That state guidance and support are available to locals in effecting change.

Page 29: NRS Data Monitoring for Program Improvement


Shared Accountability

Which state-initiated efforts have been easy to implement at the local level? Which have not?

What factors contributed to locals’ successfully and willingly embracing the effort?

What factors contributed to a failed effort?

Page 30: NRS Data Monitoring for Program Improvement


Shared Accountability

[Quadrant graphic: Local Program Involvement (low to high) plotted against State Administrative Control (low to high), with quadrant captions “Anything Happening Out There??”, “Locals Out of Control??”, “Get OFF our backs!!”, and “Hot Dog!! We’re really moving!”]

Page 31: NRS Data Monitoring for Program Improvement


What About Setting Rewards and Sanctions?

Which is the more powerful motivator: rewards or sanctions?

List all the different possible reward structures you can think of for local programs.

How might sanctioning be counter-productive?

List sanctioning methods that will not destroy locals’ motivation to improve or adversely affect relationships with the state office.

Page 32: NRS Data Monitoring for Program Improvement


Variations on a Theme

Exercise (Refer to H-10). Brainstorm as many possible rewards or incentives as you can for recognizing local programs that meet their performance standards.

Then brainstorm sanctions that the state might impose on local programs that do not meet their performance standards.

Select a recorder for your group to write one reward per Post-It Note and one sanction per Post-It Note.

When you have finished, wait for further instructions from the facilitator.

Page 33: NRS Data Monitoring for Program Improvement


Summary of Local Performance Standard-setting Process

Procedure → Goal

Select standard-setting model → Reflect state policies; promote program improvement

Set rewards and sanctions policy → Create incentives; avoid unintended effects

Make local adjustments → Ensure standards are fair and realistic for all programs

Provide T/A → Create atmosphere of shared accountability

Monitor often → Identify and avoid potential problems

Page 34: NRS Data Monitoring for Program Improvement


Getting Under the Data

NRS data, as measured and reported by states, represent the product of underlying programmatic and instructional decisions and procedures.

Page 35: NRS Data Monitoring for Program Improvement


Four Sets of Measures

1. Educational gain

2. NRS Follow-up Measures

– Obtained a secondary credential

– Entered and retained employment

– Entered postsecondary education

3. Retention

4. Enrollment

Page 36: NRS Data Monitoring for Program Improvement


Educational Gain

[Diagram: program elements underlying Educational Gain: instruction; goal setting and placement procedures; retention; professional development; class organization; assessment procedures; and assessment policies and approach.]

Page 37: NRS Data Monitoring for Program Improvement


Follow-up Measures

[Diagram: program elements underlying the Follow-up Measures (GED, employment, postsecondary): goal-setting; support services; tracking procedures; professional development; retention; and instruction.]

Page 38: NRS Data Monitoring for Program Improvement


Retention

[Diagram: program elements underlying Retention: instruction; students; class schedules and locations; placement procedures; support services; professional development; and retention support and policies.]

Page 39: NRS Data Monitoring for Program Improvement


Enrollment

[Diagram: program elements underlying Enrollment: recruitment; community characteristics; class schedules and locations; professional development; and instruction.]
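Read together, the four diagrams amount to a lookup from each measure to the program elements beneath it. A minimal sketch of that mapping as a checklist-style data structure (element lists transcribed from the diagrams above):

```python
# The four NRS measure sets and the program elements the diagrams above
# place beneath each one; usable as a checklist when reviewing a program.
UNDERLYING_ELEMENTS = {
    "Educational gain": ["instruction", "goal setting and placement procedures",
                         "retention", "professional development",
                         "class organization", "assessment procedures",
                         "assessment policies and approach"],
    "Follow-up measures": ["goal-setting", "support services",
                           "tracking procedures", "professional development",
                           "retention", "instruction"],
    "Retention": ["instruction", "students", "class schedules and locations",
                  "placement procedures", "support services",
                  "professional development", "retention support and policies"],
    "Enrollment": ["recruitment", "community characteristics",
                   "class schedules and locations", "professional development",
                   "instruction"],
}

for measure, elements in UNDERLYING_ELEMENTS.items():
    print(f"{measure}: {', '.join(elements)}")
```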

Page 40: NRS Data Monitoring for Program Improvement


Data Carousel

Page 41: NRS Data Monitoring for Program Improvement


Question for Consideration

How might it benefit local programs if the State office were to initiate and maintain a regular monitoring schedule to compare local program performance against performance standards?

Page 42: NRS Data Monitoring for Program Improvement


Regular Monitoring of Performance Compared with Standards

Keeps locals focused on outcomes and processes;

Highlights issues of importance;

Increases staff involvement in the process;

Helps refine data collection processes and products;

Identifies areas for program improvement;

Identifies promising practices;

Yields information for decision-making;

Enhances program accountability.
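As a rough illustration of how such a schedule might run as part of a desk review, here is a minimal sketch with an invented standards table and two hypothetical local programs:

```python
# Minimal sketch (hypothetical data): a recurring desk-review pass that
# compares each local program's rates against the state's standards and
# lists the measures needing follow-up. Names and numbers are invented.
standards = {"educational gain": 0.40, "GED": 0.55, "entered employment": 0.50}

programs = {
    "Program A": {"educational gain": 0.46, "GED": 0.58, "entered employment": 0.49},
    "Program B": {"educational gain": 0.33, "GED": 0.61, "entered employment": 0.52},
}

for name, rates in programs.items():
    shortfalls = [m for m, std in standards.items() if rates[m] < std]
    if shortfalls:
        print(f"{name}: follow up on {', '.join(shortfalls)}")
    else:
        print(f"{name}: meets all standards")
```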

Page 43: NRS Data Monitoring for Program Improvement


BUT…

How can states possibly monitor performance of all local programs?

Don’t we have enough to do already??

Where will we find staff to conduct the reviews?

You’re kidding, right??

Page 44: NRS Data Monitoring for Program Improvement


Not!

Page 45: NRS Data Monitoring for Program Improvement


So…Let’s Find Some Answers

How can you monitor performance of locals without overburdening state staff?

What successful models are already out there??

How does your state office currently ensure local compliance with state requirements?

Can you build on existing structures?

Page 46: NRS Data Monitoring for Program Improvement


Approaches to Monitoring

Desk Reviews

– Ongoing process

– Useful for quantitative data

• Proposals

• Performance measures

• Program improvement plans

• Staffing patterns

• Budgets

On-site Reviews

– Single event, lasting 1–3 days

– Useful for qualitative data

– Review of processes & program quality

– Input from diverse stakeholders

Page 47: NRS Data Monitoring for Program Improvement


Advantages and Disadvantages of Desk Reviews

Advantage: Data, reports, proposals, etc., already in state office
Disadvantage: Assumes accurate data that reflect reality

Advantage: Review can be built into staff’s regular workload
Disadvantage: Local staff and stakeholders not heard

Advantage: Data are quantitative; can be compared to previous years
Disadvantage: Static view of data; no interaction in context

Advantage: No travel time or costs required
Disadvantage: No team perspective

Page 48: NRS Data Monitoring for Program Improvement


Advantages and Disadvantages of On-site Reviews

Advantage: Data are qualitative; review of processes & program quality
Disadvantage: Stressful for local program and team

Advantage: Input from perspectives of diverse stakeholders
Disadvantage: Arranging site visits and team is time-intensive for both locals and state

Advantage: State works with locals to explore options for improvement; provides T/A
Disadvantage: Requires time out-of-office

Advantage: Opportunity to recognize strengths, offer praise, and identify best practices
Disadvantage: Incurs travel costs

Page 49: NRS Data Monitoring for Program Improvement


Data Collection Strategies for Monitoring

1. Program Self-Reviews (PSRs)

2. Document Reviews

3. Observations

4. Interviews

Page 50: NRS Data Monitoring for Program Improvement


Program Self-Reviews

Conducted by local program staff

Review indicators of program quality

Completed in advance of monitoring visit and can help focus the on-site review

Results can guide the program improvement process

Page 51: NRS Data Monitoring for Program Improvement


Document Reviews

Can review from a distance:

– Proposals

– Qualitative and quantitative reports

– Improvement plans

Can review on-site:

– Student files

– Attendance records

– Entry and update records

– Course evaluations

Page 52: NRS Data Monitoring for Program Improvement


Qualitative and Quantitative Data

Page 53: NRS Data Monitoring for Program Improvement


Observations

Interactions

– During meetings

– At intake and orientation

– In hallways and on grounds

– In the classroom

Link what is observed to

– Indicators of quality

– Activities in the program plan

– Professional development workshops

Page 54: NRS Data Monitoring for Program Improvement


Interviews

Help clarify or explore ambiguous findings

Provide information re: stakeholders’ opinions, knowledge, and needs

– Administrative, instructional, and support staff

– Community partners

– Community agencies (e.g., employment, social services)

– Learners

Page 55: NRS Data Monitoring for Program Improvement


Fill in the Boxes: Monitoring with Indicators of Program Quality

In teams of 4-5 and using H-12, fill in the data sources you would expect to use, the questions you would ask locals, and the strategies you would use in conducting a desk review versus an on-site review.

Page 56: NRS Data Monitoring for Program Improvement


Steps for Monitoring Local Programs

1. Identify state policy for monitoring; gather support from stakeholders.

2. Consider past practices when specifying scope of work for monitoring.

3. Identify persons to lead and participate in monitoring.

4. Identify resources available for monitoring locals.

5. Determine process for collecting data with clearly defined criteria for rating; conduct monitoring.

6. Report findings and recommendations.

7. Follow up on results.

Page 57: NRS Data Monitoring for Program Improvement


Data Help…

Measure student progress

Measure program effectiveness

Assess instructional effectiveness

Guide curriculum development

Allocate resources wisely

Promote accountability

Report to funders and to the community

Meet state and federal reporting requirements

Show trends

Page 58: NRS Data Monitoring for Program Improvement


BUT…

Data do not help:

If the data are not valid and reliable;

If the appropriate questions are not asked after reviewing the data; or

If data analysis is not used for making wise decisions.

Page 59: NRS Data Monitoring for Program Improvement


A Word about the Change Process

Factors that allow us to accept change:

1. There is a compelling reason to do so;

2. We have a sense of ownership of the change;

3. Our leaders model that they are serious about supporting the change;

4. We have a clear picture of what the change will look like; and

5. We have organizational support for lasting systemic change.

Page 60: NRS Data Monitoring for Program Improvement


Stages of Change

1. Maintenance of the old system

2. Awareness of new possibilities

3. Exploration of those new possibilities

4. Transition to some of those possibilities or changes

5. Emergence of a new infrastructure

6. Predominance of the new system

Page 61: NRS Data Monitoring for Program Improvement


A Word of Caution

Start small; don’t overwhelm locals with a “data dump.”

Begin with the core issues, such as educational gain.

Listen to what the data tell about the big picture; don’t get lost in too many details.

Work to create trust and build support by laying data on the table without fear of recrimination.

Provide training opportunities for staff on how to use data.

Be patient, working with what is possible in the local program.

Source: Spokane, WA School Superintendent Brian Benzel

Page 62: NRS Data Monitoring for Program Improvement


Planning and Implementing Program Improvement

Stages of the Program Improvement Process

1. Planning;

2. Implementing;

3. Evaluating; and

4. Documenting Lessons Learned and Making Adjustments, as needed

Page 63: NRS Data Monitoring for Program Improvement


Planning Questions

Who should be included on your program improvement team?

How will you prioritize areas needing improvement?

How will you identify and select strategies for effecting improvement?

Page 64: NRS Data Monitoring for Program Improvement


Guiding Questions for Strategies

Is the strategy:

Clear and understandable to all users?

One specific action or activity, or dependent on other activities? (If so, describe the sequence of actions.)

An activity that will lead to accomplishing the goal?

Observable and measurable?

Assignable to specific persons?

Based on best practices?

One that all team members endorse?

Doable—one that can be implemented?

Page 65: NRS Data Monitoring for Program Improvement


Implementation Questions

Who will be responsible for taking the lead on ensuring that the change is implemented?

Who will be members of the “change” team and what will be their roles?

How will expectations for the change be promoted and nurtured?

How will the change be monitored?

Page 66: NRS Data Monitoring for Program Improvement


Evaluation Questions

How will the changes that are implemented be evaluated?

How will the team ensure that both short- and long-term effects are measured?

Who will interpret the results?

Who will be on the look-out for unintended consequences?

Page 67: NRS Data Monitoring for Program Improvement


Possible Evaluation Results

Significant improvement with no significant unintended consequences: Stay the course.

Little or no improvement: Stay the course OR scrap the changes?

A deterioration in outcomes: Scrap the changes.
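These three outcomes are effectively a small decision rule. A minimal sketch, assuming an invented definition of "significant" (a five-point change in the outcome measure):

```python
# Minimal sketch (invented threshold): the slide's evaluation logic as a
# decision rule, approximating "significant" as a 5-point change.
def next_step(change_pts, unintended_consequences=False):
    if change_pts >= 5 and not unintended_consequences:
        return "Stay the course"
    if change_pts <= -5:
        return "Scrap the changes"
    return "Judgment call: stay the course OR scrap the changes"

print(next_step(8))    # Stay the course
print(next_step(1))    # Judgment call: stay the course OR scrap the changes
print(next_step(-7))   # Scrap the changes
```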

Page 68: NRS Data Monitoring for Program Improvement


Documenting the Process

Document what worked and what didn’t; lessons learned; and logical next steps or changes to the plan.

Use as a guide for future action.

Page 69: NRS Data Monitoring for Program Improvement


State Planning Time

In your state teams, consider the questions on H-14 and begin planning.

Consider the stakeholders you want to include in your planning for data monitoring and program improvement.

Consider the problems you anticipate facing and propose solutions to those problems.

Complete H-14 to the best of your ability and be prepared to report on your plan in one hour.

Page 70: NRS Data Monitoring for Program Improvement


Thank you

Great Audience! Great Participation! Great Ideas! Live Long and Prosper! Good Luck!!