Page 1

OERC RESEARCH RELATED TO STUDENT GROWTH MEASURES

Jill L. Lindsey, PhD

Wright State University

Marsha Lewis, MA

Ohio University

OERC Institutional Partners

Making Research Work for Education

Page 2

THE OHIO EDUCATION RESEARCH CENTER

• A collaborative of Ohio-based research universities & institutions
• Focused on a statewide research agenda
• Addressing critical issues of education practice and policy
• Custodians of the Ohio Longitudinal Data Archive

Page 3

DUAL SGM RESEARCH AGENDA

• Measuring Student Growth
• Using Student Growth Measures in relation to other variables to guide policy & practice

Page 4

FUNDED PROJECTS RELATED TO SGM

• Educator Evaluation Studies
• Extended Testing for Value-Added Reporting
• Student Growth Measures (SGM) Policy & Practice

Page 5

OTES/OPES FINDINGS RELATED TO SGM

• LEAs piloting or implementing in 2012-13 did not use SGM (n=37).
• Educators were generally positive about the new evaluation systems.
• Educators expressed a lack of trust & misunderstandings about value-added data and SGM.
• Educators expressed concerns about unfairness of using different student growth measures for evaluation.

Page 6

EXTENDED TESTING FOR VALUE-ADDED MEASURE (VAM)

• Eager for reliable student growth measures for "untested" grades and subjects
• Question validity of vendor tests & lack assessment literacy
• Worry about too much testing and stress on primary students (K-2)
• Question how # students and % instructional time impact VAM
• After HB required LEAs to use Extended Testing VAM in evaluation, a few LEAs dropped out

Page 7

SGM POLICY & PRACTICE STUDY

Page 8

EXAMPLE OF LINKAGE SCREEN (TEACHER VIEW)

Page 9

LINK/ROSTER VERIFICATION SURVEYS

Surveyed a sample of Ohio teachers who linked for the first time in 2011 and all Ohio teachers who linked in 2013.

Asked about experiences with linkage training, linkage process, perceptions of accuracy, suggestions for improvement…

Page 10

FINDINGS

Teachers: Did you have any students this year for whom you shared the proportion of instructional time with another teacher?

       2011   2013
Yes     80%    81%
No      20%    19%

Page 11

FINDINGS

Teachers: Do you think the linkage process accurately captured what was happening in your classroom (i.e., students you taught last year, their length of enrollment, and your percentage of instructional time with them)?

              2011   2013
Yes            46%    58%
No             23%    26%
Don't know     31%    16%

Page 12

FINDINGS

Teachers: Given your experience with the linkage process, how confident are you that the linkage process improves the accuracy of the teacher-level value-added data?

                         2011   2013
Not at all confident      39%    31%
Somewhat confident        55%    61%
Very confident             6%     9%

Page 13

TEACHER-STUDENT DATA LINK SENSITIVITY ANALYSIS

Another question in the SGM Research Project: How consequential to teacher-level value-added measures is the precision of teachers' reported percentages of shared instructional responsibility?

Page 14

LINKAGE SENSITIVITY ANALYSIS

• Provided SAS a list of the districts that responded to the 2011 linkage survey
• Developed nine scenarios of instructional responsibility (varied # students, % students shared)
• SAS identified 882 teachers in the 62 districts and recalculated their value-added scores for the 9 instructional-time scenarios (10% responsibility through 90% responsibility); a simplified sketch of the reweighting idea follows below
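The actual recalculation was done by SAS for 882 teachers, as the slide describes. Purely to illustrate what the nine scenarios vary, the short Python sketch below assumes a simplified model in which a teacher's value-added estimate is a responsibility-weighted average of student score gains; it is not SAS's EVAAS model, and the function name, gains, and sharing pattern are all hypothetical.

# Illustrative sketch only (not the SAS/EVAAS calculation): approximate a
# teacher's value-added estimate as a responsibility-weighted average of
# student score gains, then vary the responsibility claimed for shared students.

def weighted_value_added(gains, shares):
    # gains: per-student score gains; shares: teacher's fraction of
    # instructional responsibility for each student (0.0 to 1.0)
    return sum(g * s for g, s in zip(gains, shares)) / sum(shares)

gains  = [3.0, -1.5, 2.0, 0.5, 4.0]        # hypothetical student gains
shared = [True, True, False, False, True]  # which students are shared with another teacher

for pct in range(10, 100, 10):             # the nine scenarios: 10% through 90%
    shares = [pct / 100 if s else 1.0 for s in shared]
    print(f"{pct}% responsibility for shared students: "
          f"estimate = {weighted_value_added(gains, shares):.2f}")

Varying the shared-student responsibility from 10% to 90% changes how much those students' gains count toward the estimate, which is the sensitivity the study probed at scale.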

Page 15

LINKAGE SENSITIVITY ANALYSIS


Page 16

LINKAGE SENSITIVITY ANALYSIS

Teacher value-added effectiveness levels (5 categories) remained largely stable with changes in % instructional responsibility.

5 categories from Teacher Value-Added Reports:
• Least effective (OTES SGM rating “below”)
• Approaching average effectiveness
• Average effectiveness (OTES rating “expected”)
• Above average effectiveness
• Most effective (OTES SGM rating “above”)

Page 17

LINKAGE SENSITIVITY ANALYSIS: PRELIMINARY FINDINGS

For OAA Math 2012 overall:
• Moving teachers from 50% to 60% responsibility resulted in <4% of teachers being classified into an adjacent effectiveness level.
• Moving teachers from 50% to 90% responsibility resulted in 14% of teachers being classified into another effectiveness level; all but 3 of these teachers were classified into an adjacent effectiveness level. *Reclassification is likely further reduced by the OTES SGM classification into 3 levels.
• Teachers who share >75% of students were most affected by changes in instructional responsibility.

Page 18

STUDENT GROWTH MEASURES IN OHIO’S NEW TEACHER EVALUATION SYSTEM

The Ohio Teacher Evaluation System (OTES) was implemented for the first time in 2012-13 by 26 Ohio LEAs.

The OTES final summative rating of teacher performance comprises 50% teacher performance on standards and 50% student academic growth measures (one hypothetical way such a combination could be computed is sketched below).

“Category A” teachers in OTES are teachers with teacher-level value-added data available for some or all of the classes they teach.
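The slides state the 50/50 weighting but not how each half is converted into the final summative label. For concreteness only, the sketch below shows one hypothetical way a 50/50 weighted combination could be computed; the point scales, labels, and function name are assumptions made for this illustration and are not ODE's actual conversion rules.

# Hypothetical illustration of a 50/50 combination; the numeric scales below
# are invented for this sketch and are not ODE's actual rules.

PERFORMANCE_POINTS = {"Ineffective": 1, "Developing": 2, "Skilled": 3, "Accomplished": 4}
GROWTH_POINTS = {"Below": 1, "Expected": 2.5, "Above": 4}   # arbitrary midpoint for "Expected"

def final_summative_score(performance_rating, sgm_category,
                          performance_weight=0.5, growth_weight=0.5):
    # Weighted combination of the two OTES halves (the 50/50 weights come from the slide).
    return (performance_weight * PERFORMANCE_POINTS[performance_rating]
            + growth_weight * GROWTH_POINTS[sgm_category])

print(final_summative_score("Skilled", "Above"))   # 3.5 under these assumed scales

Whatever the actual conversion, the 50/50 weights mean a Category A teacher's value-added result carries as much weight in the final rating as the observed performance on standards.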

Page 19

CATEGORY A TEACHERS IN EARLY IMPLEMENTING LEAS

24 unique “implementing” LEAs with available data.

Total number of teachers with available OTES rating data in implementing districts: 2,001

Total number of Category A teachers: 398* (20% of all teachers with OTES rating data had value-added data as all or part of their student academic growth measure portion)

*46 Category A teachers were exempt from SGM so not included in the following figures.

Page 20

Distribution of SGM for Category A Teachers in OTES-Implementing LEAs, 2012-13:

Below      16%
Expected   69%
Above      15%

Page 21

CONGRUENCE OF THE TWO SIDES OF OTES FOR TEACHERS WITH VALUE-ADDED DATA (CATEGORY A TEACHERS)

                              SGM Category
Teacher Performance Rating:   Below   Expected   Above
Accomplished                      4         46      18
Skilled                          46        194      34
Developing                        5          5       0
Ineffective                       0          0       0
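As a quick check on the numbers, the Python snippet below recomputes the SGM column totals from this table; it reproduces, up to rounding, the 16% / 69% / 15% split shown on the earlier distribution slide (the table covers the 352 non-exempt Category A teachers).

# Counts from the congruence table: rows are teacher performance ratings,
# columns are SGM categories (Below, Expected, Above).
counts = {
    "Accomplished": (4, 46, 18),
    "Skilled":      (46, 194, 34),
    "Developing":   (5, 5, 0),
    "Ineffective":  (0, 0, 0),
}

total = sum(sum(row) for row in counts.values())        # 352 Category A teachers
for label, n in zip(("Below", "Expected", "Above"),
                    (sum(col) for col in zip(*counts.values()))):
    print(f"{label}: {n} teachers ({n / total:.1%})")
# Below: 55 (15.6%), Expected: 245 (69.6%), Above: 52 (14.8%)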

Page 22

DISTRIBUTION OF CATEGORY A TEACHER FINAL SUMMATIVE RATING BY STUDENT GROWTH MEASURE CATEGORY

                                  SGM Category
Teacher Final Summative Rating:   Below   Expected   Above
Accomplished                          0          0      52
Skilled                               0        240       0
Developing                           50          5       0
Ineffective                           5          0       0

Page 23

NEXT STEPS

• Research OTES/OPES implementation in 2013-14
• Examine distribution of OPES/OTES ratings by district typology and other factors once 2013-14 data are available
• Examine impact of different weighting of Value-Added Data in SGM and congruence/differential effects of VAA and other student growth measures (e.g., Student Learning Objectives)

Page 24

QUESTIONS?

[email protected] | oerc.osu.edu