
PENNSYLVANIA STATE EVALUATION FOR PRINCIPAL EFFECTIVENESS:

PERCEPTIONS OF PRINCIPALS TOWARDS STATEWIDE APPRAISAL

A Dissertation Submitted to the Faculty

of Immaculata University

By

Thomas Evert III

Immaculata, Pennsylvania

April 2014

© 2014 Thomas Evert III

All Rights Reserved

Abstract

The purpose of this study was to examine the perceptions of principals regarding the

implementation of a new statewide Pennsylvania Principal Effectiveness Evaluation

(PPEE) and its impact on principal leadership practices, teacher instructional practices,

and student achievement scores. Accountability for the improvement of student

achievement scores and teacher instructional practices has filtered into a statewide

principal evaluation system that has raised concerns about the consistency, fairness,

effectiveness, and value attributed to the new process of principal evaluation. Eleven

school districts located within southeastern Pennsylvania served as the study sites for this

research. Data were collected using a Google© Drive online survey that consisted of

multiple Likert-scale questions, forced-choice questions, open-ended response questions,

and principal interview responses. The study sample consisted of 25 principals, five of

whom also participated in an interview process. This study found that the new

statewide PPEE would impact principal leadership practices if individuals take a serious

approach to the evaluation rubric, which provides principals and evaluators with clear

definitions for accountability and measurements relative to differentiation within job

performance levels. Secondly, this study found that the new statewide PPEE would

impact teacher instructional practices because principals would hold teachers more

accountable for classroom learning and student performance, which has filtered down to

affect their own evaluations. Lastly, this study found that the new statewide PPEE would

impact student achievement scores due to a trickle-down effect of principals holding

teachers accountable for student growth, and teachers then holding students more

accountable for improved achievement scores.

To my Father and Mother,

for their unconditional love and support.

To my Grandparents and Godparents,

for their additional love and support.

Acknowledgements

To my Committee Chairperson, Dr. Jeffrey Rollison, who guided and trusted my

decisions throughout the completion of my dissertation to finish this doctoral program in

educational leadership.

To my Committee Member, Dr. Monica McHale-Small, who challenged me to

improve my proficiency with research because of her own expertise in education as a

researcher and scholar.

To my Committee Member, Dr. Bruce Rachild, who helped to focus my research

and provided valuable feedback while expecting no less than perfection throughout the

dissertation process.

Thank you to Dr. Thomas O’Brien, Associate Dean for the College of Graduate

Studies, and Dr. Sharon McGrath, the fourth reader, for reviewing my final dissertation

draft.

Thank you to Dr. Michael Masko, the Assistant Executive Director of the Bucks

County Intermediate Unit #22, for providing contact information to Dr. David Volkman,

an Executive Assistant within the Pennsylvania Department of Education, who granted

permission to use their new statewide Pennsylvania Principal Effectiveness Evaluation

within my study. Thank you as well to the school district superintendents who granted

permission and the principals who volunteered their time to participate in my study, and

to my school district superintendent, Dr. David Baugh, for his inspiration and support.

A special thank you to Dr. William Frabizio who influenced my life in Music

Education and Professor Daniel Huschke who encouraged my Educational Leadership.

Thank you to Ms. Stacey Del Buono for her dedication and friendship throughout the years.

Table of Contents

Page

Signatures ............................................................................................................................ ii

Abstract .............................................................................................................................. iv

Acknowledgements ............................................................................................................ vi

List of Figures ................................................................................................................. xiii

List of Tables ................................................................................................................... xiv

Chapter One: Introduction ...................................................................................................1

Overview ..................................................................................................................1

Pennsylvania Educator Effectiveness Project ..........................................................1

Rationale for Pennsylvania Principal Effectiveness ................................................2

Danielson Framework in Pennsylvania Teacher and Principal Effectiveness .........3

Pennsylvania Implementation of Principal Effectiveness ........................................4

Need for the Study ...................................................................................................7

Statement of the Problem .........................................................................................9

Definition of Terms ..................................................................................................9

Limitations .............................................................................................................11

Research Questions ................................................................................................12

Summary ................................................................................................................13

Chapter Two: Literature Review .......................................................................................14

Overview ................................................................................................................14

Introduction ............................................................................................................14

School Leadership and Principal Effectiveness .....................................................15

Leadership Success and Principal Effectiveness .......................................16

Leadership Approach and Principal Effectiveness ....................................17

Instructional Leadership and Principal Effectiveness ............................................17

Shared leadership .......................................................................................19

Transformational leadership ......................................................................20

School Environment and Principal Effectiveness ..................................................20

Student Learning and Principal Effectiveness ...........................................21

Student Achievement and Principal Effectiveness ....................................22

Accountability Data and Principal Effectiveness ......................................22

Definition of Principal Effectiveness .....................................................................23

Traits of Effectiveness and the Principal ...................................................24

Research on Principal Effectiveness ..........................................................24

Principal Effectiveness Influence on Teacher Effectiveness .....................25

Research on Principal Effectiveness Evaluation ....................................................27

Design of Principal Evaluation ..................................................................28

Goal of Principal Evaluation ......................................................................29

National Performance Standards of Principal Evaluation .........................30

Psychometric Properties of Principal Evaluation ......................................32

Measurements within Principal Evaluation ...........................................................33

Performance Levels of Principal Evaluation .............................................35

Student Growth Factors in Principal Evaluation .......................................35

Weighted School Level Factors in Principal Evaluation ...........................36

Data Integrity in Principal Evaluation .......................................................37

Training Evaluators in Principal Evaluation ..............................................37

Feedback in Principal Evaluation ..............................................................37

State and District Principal Evaluation Systems ....................................................38

State-Level Evaluation Systems ................................................................40

Tennessee Evaluation Model .....................................................................41

Elective State-Level Evaluation Systems ..................................................41

Colorado Evaluation Model .......................................................................42

District Evaluation Systems with Required Parameters ............................42

Illinois Evaluation Model ..........................................................................43

Pennsylvania Principal Effectiveness Evaluation Model ..........................43

Summary ................................................................................................................45

Chapter Three: Methods and Procedures ...........................................................................48

Introduction ............................................................................................................48

Setting .....................................................................................................................48

Participants .............................................................................................................49

Instruments .............................................................................................................52

Reliability and Validity ..........................................................................................54

Design ....................................................................................................................55

Procedures ..............................................................................................................56

Data Analysis .........................................................................................................58

Summary ................................................................................................................60

Chapter Four: Results ........................................................................................................61

Introduction ............................................................................................................61

Online Survey Participant Demographic Data .......................................................61

Implementation Plan of PDE Information Survey Data ........................................64

Principal Evaluation Feedback Information Survey Data .....................................66

Relationships within Educational Practice Information Survey Data ....................70

Principal Leadership Performance Effort Information Survey Data .....................73

Principal Leadership Performance and Reflective Change Information

Survey Data ................................................................................................78

Domain Achievability Information Survey Data ...................................................84

Open-ended Response Questions ...........................................................................85

First Open-ended Question ........................................................................85

Second Open-ended Question ....................................................................87

Third Open-ended Question .......................................................................88

Fourth Open-ended Question .....................................................................89

Fifth Open-ended Question ........................................................................90

Principal Interview Responses ...............................................................................91

Question One .............................................................................................92

Question Two .............................................................................................93

Question Three ...........................................................................................94

Question Four .............................................................................................96

Question Five .............................................................................................97

Question Six ...............................................................................................99

Summary ..............................................................................................................101

Chapter Five: Discussion .................................................................................................102

Summary of the Study .........................................................................................102

Summary of the Results .......................................................................................109

Research Question 1 ................................................................................109

Research Question 2 ................................................................................111

Research Question 3 ................................................................................112

Limitations of the Study.......................................................................................113

Relationship to Other Research ...........................................................................114

Recommendations for Further Research ..............................................................116

Conclusion ...........................................................................................................117

References .........................................................................................................................118

Appendix

A: Research Ethics Review Board ...................................................................................132

B: Permission to Use PDE Principal Evaluation Domains ..............................................133

C: Letter to School District Superintendents ...................................................................134

D: Invitation to Participate and Google© Drive Online Survey ......................................135

E: Interview Consent Form ..............................................................................................149

F: Six Scripted Interview Questions ................................................................................150

G: Demographic Information for All Surveyed Participants ...........................................151

H: Implementation Plan of PDE Data for All Surveyed Participants ..............................152

I: Principal Evaluation Feedback Data for All Surveyed Participants .............................153

J: Relationships within Educational Practice Data for All Surveyed Participants ...........156

K: Principal Leadership Performance Effort Data for All Surveyed Participants ...........158

L: Principal Leadership Performance and Reflective Change Data for All Surveyed

Participants ..................................................................................................................162

M: Domain Achievability Data for All Surveyed Participants ........................................166

N: Open-ended Response Data for All Surveyed Participants ........................................167

O: Interview Response Data for All Voluntary Participants ...........................................183

List of Figures

Figure                                                                                                                             Page

1.1 Connectedness in Frameworks for Principal and Teacher Effectiveness ................4

2.1 Principal Effectiveness System for Pennsylvania ..................................................44

3.1 Framework and Design of Study on Perceptions of Principals .............................55

List of Tables

Table                                                                                                                              Page

3.1 Participant Districts by AYP Status for Last Three Years .....................................49

3.2 Principal Participants by School Level and Gender ..............................................50

3.3 Certified-Teacher Population for Principal Participants ........................................51

3.4 Enrollment of Students for Principal Participants .................................................51

4.1 Participants by School Level .................................................................................62

4.2 Participants by Gender ...........................................................................................62

4.3 Participants by Total Years of Experience as a Principal ......................................63

4.4 Participants by Certified-Teacher Population ........................................................63

4.5 Participants by Enrollment of Students ..................................................................64

4.6 Level of Agreement for the New PPEE Evaluation System ..................................65

4.7 Perceptions of New PPEE Rubric as an Evaluation Tool ......................................65

4.8 Current Frequency of Leadership Evaluative Feedback ........................................66

4.9 Current Format of Leadership Evaluative Feedback .............................................67

4.10 Current Satisfaction with Process of Evaluative Feedback ...................................68

4.11 Perceptions of Evaluation Feedback as Accurate Measurement of Personal Efforts to Initiate Positive Change .........................................................................69

4.12 Status of Evaluative Feedback ...............................................................................70

4.13 Perceptions on the New Statewide Pennsylvania Principal Effectiveness

Evaluation to Impact Positive Change ...................................................................71

4.14 Perceptions on Student Performance Data to Account Towards Principal Job Performance within New PPEE ......................................................................73

4.15 Participant Involvement Effort on Criteria in Domain 1a through 1e ...................74

4.16 Participant Involvement Effort on Criteria in Domain 2a through 2f ....................76

4.17 Participant Involvement Effort on Criteria in Domain 3a through 3e ...................77

4.18 Participant Involvement Effort on Criteria in Domain 4a through 4c ...................78

4.19 Level of Change Effect to Participant on Criteria in Domain 1a through 1e ........80

4.20 Level of Change Effect to Participant on Criteria in Domain 2a through 2f .........81

4.21 Level of Change Effect to Participant on Criteria in Domain 3a through 3e ........82

4.22 Level of Change Effect to Participant on Criteria in Domain 4a through 4c ........84

4.23 Perceptions of Achievability in PPEE Domains ....................................................85

4.24 General Themes from Question #10 ......................................................................86

4.25 General Themes from Question #11 ......................................................................87

4.26 General Themes from Question #19 ......................................................................88

4.27 General Themes from Question #20 ......................................................................90

4.28 General Themes from Question #21 ......................................................................91

Chapter One – Introduction

Overview

To gain a greater understanding of how evaluation systems are used to measure

and appraise the performance of school principals, it is necessary to investigate the

relationships between principal leadership practices, classroom instructional practices,

and student achievement scores. A growing number of state-level and district-level

principal evaluation systems have emphasized instructional leadership practices, teacher

accountability, and student achievement as measurable factors of performance in the

assessment process of principals (Clifford, Hansen, & Wraight, 2012). In addition, the

perceptions regarding the purposes, processes, and outcomes of evaluations often vary

between principals and their evaluators (Brown-Sims, 2010; Condon & Clifford, 2010;

Portin, Feldman, & Knapp, 2006).

Pennsylvania Educator Effectiveness Project

Since 2010, the Pennsylvania Department of Education (PDE) has been working

towards the development of an educator effectiveness model for certified-teachers, non-

teaching professionals, and school principals that provides the necessary training tools

for professional growth and ensures a fair and effective evaluation process. PDE adopted

Chapter 19 of 22 Pa. Code, related to the educator effectiveness rating tool, under section

1123 of the Public School Code of 1949 (24 P. S. § 11-1123), as amended June 30, 2012

(P. L. 684, No. 82, known as Act 82), which required PDE to develop a rating tool to

measure the effectiveness of classroom teachers and to publish that tool in the

Pennsylvania Bulletin by June 30, 2013 (Pennsylvania Bulletin, 2013). On July 1, 2013,

PDE began field-testing a new Pennsylvania teacher effectiveness evaluation system for classroom

teachers. A second implementation, focused on school principals and non-teaching

professionals, is planned for July 1, 2014. PDE has conducted the educator effectiveness

project to support classroom teachers and

school leadership, with the overall goal of improving student achievement for all children

within the Pennsylvania public school system (Pennsylvania Department of Education,

2014).

Rationale for Pennsylvania Principal Effectiveness

Efforts to provide Pennsylvania school superintendents with an accurate and

objective assessment tool to rate the performance levels of principals on their essential

duties as building leaders can be traced back to 2004-05 (Pennsylvania Department of

Education, 2012). Rationale for Principal Effectiveness (2012) indicated a group of

superintendents, principals, and individuals from higher education reviewed research on

how school leaders could impact student achievement. Consequently, the result was a

set of three core and six corollary leadership standards, incorporated into Act 45 of 2007,

which are currently known as the Pennsylvania Inspired Leadership Program (PIL). A team

assembled by PDE studied existing research to create a principal rubric concentrated on:

• Providing sample evidence that could be measured within each of the Core and

Corollary Standards.

• Establishing competency levels for each of the Core and Corollary Standards,

requiring an explanation of the evidence used to substantiate the numerical ratings

for each of the Domains and the overall competency level.

• Determining frequency of assessments.

• Utilizing assessments that are valid and help inform principal professional

development needs.

• Incorporating multiple forms of assessment and varying the types of data

collected to obtain a holistic view of principal performance.

(Pennsylvania Department of Education, 2012, p. 1)

This work laid the groundwork for four Domains, contained within the proposed

Principal Effectiveness Rubric and paralleling the influence of the Danielson Framework for

Teaching (2007):

• Domain 1: Strategic/Cultural Leadership

• Domain 2: Systems Leadership

• Domain 3: Leadership for Learning

• Domain 4: Professional and Community Leadership

(Pennsylvania Department of Education, 2012, p. 2)

Danielson Framework in Pennsylvania Teacher and Principal Effectiveness

The Pennsylvania Department of Education’s Statement on Danielson Framework

(2013a) affirmed that the teacher observation/evidence practice model utilized within the PDE

Teacher Effectiveness Evaluation as of July 1, 2013, would be the Danielson Framework

for Teaching (2007). Officially, the Pennsylvania Department of Education (PDE) has

not mandated any specific edition of the Charlotte Danielson Framework (Pennsylvania

Department of Education, 2013b). However, professional development and training are

based on the 2007 edition of the Danielson Framework. In addition, any local education

agency (LEA) within Pennsylvania that anticipates using an assessment structure other

than the Danielson Framework for Teaching must submit a request for approval to PDE

in relation to the alternative evaluation system (Pennsylvania Department of Education,

2013b).

The framework for principal effectiveness and the Danielson Framework for

teacher effectiveness are connected by systemic goals of effective practices aligning

student-centered efforts and resources to examine how student learners best achieve

academically (Pennsylvania Department of Education, 2013c). Figure 1.1 shows the

connectedness within the frameworks for Principal and Teacher Effectiveness.

Figure 1.1

Connectedness in Frameworks for Principal and Teacher Effectiveness

Principal and Teacher Effectiveness Frameworks: How Are They Connected (2013c)

identified eight essential factors necessary to stimulate highly strategic discussion in

creating an environment that fosters student achievement as: vision, common standards,

high expectations for all, instruction, assessment, collaboration, safety and security, and

lastly professionalism.

Pennsylvania Implementation of Principal Effectiveness

The Pennsylvania Department of Education (PDE) is implementing a new

statewide Pennsylvania Principal Effectiveness Evaluation (PPEE) to measure the

performance of principals in 2014. PPEE was developed using the Danielson (1996)

rubric method for teacher evaluation. Similarly, the PPEE is composed of four domains

and contains 19 principal performance tasks in a rubric format (Pennsylvania Department

of Education, 2012):

1a: Creating an Organizational Vision, Mission, and Strategic Goals

1b: Using Data for Informed Decision Making

1c: Building a Collaborative and Empowering Work Environment

1d: Leading Change Efforts for Continuous Improvements

1e: Celebrating Accomplishments and Acknowledging Failures

2a: Leveraging Human and Financial Resources

2b: Ensuring School Safety

2c: Complying with Federal, State, and LEA Mandates

2d: Establishing and Implementing Expectations for Students and Staff

2e: Communicating Effectively and Strategically

2f: Managing Conflict Constructively

3a: Leading School Improvement Initiatives

3b: Aligning Curricula, Instruction, and Assessments

3c: Implementing High Quality Instruction

3d: Setting High Expectations for All Students

3e: Maximizing Instructional Time

4a: Maximizing Parent and Community Involvement and Outreach

4b: Showing Professionalism

4c: Supporting Professional Growth

Alignment of the PPEE to legislative and corollary standards of the Pennsylvania

Inspired Leadership Program (PIL) suggests that PDE is taking a stricter stance on

accountability and expectations for effective school leadership practices within all school

districts of Pennsylvania. Effective school leadership has emerged as the essential factor

for attaining state legislative goals to increase academic outcomes and

narrow student achievement gaps while improving the educational opportunities for all

students in Pennsylvania. Consequently, PDE and state legislators approached the new

principal evaluation system as a catalyst towards improvement for school effectiveness

within the areas of principal instructional leadership, teacher classroom instructional

practices, and student achievement scores.

Pennsylvania is not the only state to begin

improving its principal evaluation system, prompted by No Child Left

Behind 2001 (NCLB) and Race to the Top 2010 (RTTT), federal initiatives passed into

law that set these expectations for school principals.

effective school leadership and student achievement, NCLB (2001) required principal

evaluation to include summative measurement of student performance. More recently,

RTTT (2010) required teacher and principal evaluations at the state and district level to

rate the levels of school effectiveness based on criteria of measurements that indicated

whether they were improving schools, increasing student achievement, and narrowing student

achievement gaps.

Pennsylvania complied with the Race to the Top (U.S. Department of Education,

2010) initiative and implemented the first stage of individual pilot studies for teacher and

principal standards-based evaluation systems (Pennsylvania Department of Education,

2012) envisioned as the groundwork of PDE’s statewide appraisal models. Race to the

Top (RTTT) required states to address improvement for educator effectiveness in the

following areas: measuring individual student growth; designing transparent evaluation

systems for teachers and principals that differentiate effectiveness while accounting for

student growth; annual evaluations of teachers and principals; and using evaluation

results to inform decisions (Lane & Horner, 2010).

Need for the Study

A majority of educator evaluation research is focused on teachers rather than principals,

and even less addresses the topic of principal effectiveness (Catano & Stronge, 2007; Davis,

Kearney, Sanders, Thomas, & Leon, 2011). Information about professional practices of

school principals can serve as a foundation for understanding the essential characteristics

of principal effectiveness evaluation and appraisal design (Clifford, Hansen, & Wraight,

2012). In addition, available research has raised questions about consistency, fairness,

effectiveness, and value attributed to current principal evaluation practices (Condon &

Clifford, 2010; Goldring et al., 2009; Heck & Marcoulides, 1996; Portin, Feldman, &

Knapp, 2006; Thomas, Holdaway, & Ward, 2000). Specifically, the existing research on

principal evaluation systems has proposed:

• Principals view performance evaluation as perfunctory and limited in the

value of feedback, professional development, or accountability to school

improvement initiatives (Portin, Feldman, & Knapp, 2006).

• Principal evaluations are inconsistently conducted and performance is

inconsistently measured (Thomas, Holdaway, & Ward, 2000).

• Performance evaluations may not align with existing national or state

professional standards for best practices (Heck & Marcoulides, 1996) or

standards for personnel evaluation (Goldring et al., 2009).

• Few widely available principal evaluation instruments display psychometric

rigor to examine validity and reliability (Condon & Clifford, 2010; Goldring

et al., 2009; Heck & Marcoulides, 1996).

Studies need to define the purpose and process of principal evaluation, due to a lack of

professional agreement on “what should be evaluated and how” (Sanders & Kearney,

2011, p. 2). In contrast, the time has passed when the school principal can establish

yearly goals aligned to district initiatives, meeting only annually with the district

superintendent, who would determine whether or not the principal’s work was

satisfactory (The Wallace Foundation, 2009).

The Elementary and Secondary Education Act 2011 (ESEA) and prior No Child

Left Behind Act of 2001 (NCLB) were catalysts that encouraged federal policies and state

initiatives to require the redesign of principal evaluation systems across the country. The

American Recovery and Reinvestment Act 2009 (ARRA) and the Race to the Top 2010

(RTTT) competition likewise prompted states and local school districts to develop more

rigorous evaluations aligned to assess principals based on their effectiveness. In addition,

the same federal policies and state initiatives provided similar incentives that required the

redesign and improvement of existing teacher evaluation systems.

Vitcov and Bloom (2010) indicated principal leadership practices are second only

to the quality of teacher classroom instruction in influencing student achievement. An

effective principal can “set the organizational direction and culture that influences how

their teachers perform” (Council of Chief State School Officers, 2008, p. 9). Therefore,

the principal indirectly influences teacher classroom instructional quality to affect student

achievement within their schools by creating conditions for academic success. Although

the principal’s influence has been deemed indirect, “principal effectiveness is defined by

these outcomes” (Clifford, Hansen, & Wraight, 2012, p. 4).

This study is intended to report data regarding principals’ perceptions on the

impact of principal leadership practices, teacher instructional practices, and student

achievement scores in relationship to the proposed 2014 implementation of a statewide

Pennsylvania Principal Effectiveness Evaluation (PPEE).

Statement of the Problem

The Pennsylvania Department of Education (PDE) will implement a statewide

Pennsylvania Principal Effectiveness Evaluation (PPEE) in 2014 throughout the entire

Commonwealth of Pennsylvania. The PPEE will consist of four domains to include:

strategic and cultural leadership; systems leadership; leadership for learning; and lastly

professional and community leadership. Each of the four domains will be comprised of

subsections for a total of 19 principal performance tasks within a rubric format. Do

principals perceive the new statewide PPEE as fair in relationship to how it will affect

principal leadership practices, teacher instructional practices, and student achievement

scores (Clifford, Hansen, & Wraight, 2012)?

Definition of Terms

For the purpose of this study the following definitions relating to this research

will be used:

Inter-Rater Reliability – A measurement of the degree to which different evaluators

rate the same observable behaviors the same way.
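
For illustration only (this formula is not part of the dissertation or the PDE materials), agreement between two evaluators is commonly quantified with percent agreement or Cohen's kappa:

$$\kappa = \frac{p_o - p_e}{1 - p_e}$$

where $p_o$ is the observed proportion of ratings on which the two evaluators agree and $p_e$ is the proportion of agreement expected by chance; $\kappa = 1$ indicates perfect agreement and $\kappa = 0$ indicates agreement no better than chance.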

Pennsylvania Principal Effectiveness Evaluation – A statewide Pennsylvania

framework for principal evaluation scheduled for implementation in the 2014 school

year. Specifically, it is similar to the Danielson (2007) rubric framework and is divided into the four

areas of: (1) strategic and cultural leadership; (2) systems leadership; (3) leadership for

learning; and (4) professional and community leadership with 19 principal performance

tasks.

Performance Artifacts – A collection of materials that exhibit evidence of

professionalism focused on teaching, learning, and student progress, and analyze

principal behaviors, actions, and practices.

Principal Leadership Practices – Items listed on the new statewide Pennsylvania

Principal Effectiveness Evaluation rubric.

Rubric Evaluation – A measurable level of performance along a continuum to

assess expectations, support self-reflection of professional practice, and facilitate

communication between evaluator and the principal.

Serious Approach to Evaluation – A requirement that each principal take

ownership and be held accountable to improve their individual performance ratings as

delineated within the evaluation rubric descriptions for each domain of the new statewide

Pennsylvania Principal Effectiveness Evaluation, whose performance levels vary among failing, needs

improvement, proficient, and distinguished.

Teacher Instructional Practices – Principal accountability for task-performances

that guarantee quality teacher classroom instruction and student learning within every

classroom of the school.

Student Achievement Scores – Students achieving at advanced or proficient

academic status on their Pennsylvania System of School Assessment (PSSA) score and

connected with school status on adequate yearly progress (AYP), which is unofficially a

measurement of satisfactory teacher classroom instruction and linked to effective task-

performance of the principal.

Value-Added Models (VAMs) – Complex statistical models that attempt to

determine the extent to which specific teachers and schools affect individual student

achievement growth over time.
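
As a hypothetical sketch only (not the specific model used in Pennsylvania), a basic value-added specification regresses a student's current test score on prior achievement and treats the remaining systematic school-level component as the school's contribution to growth:

$$y_{ist} = \beta\, y_{i,s,t-1} + X_{ist}\gamma + \theta_s + \varepsilon_{ist}$$

where $y_{ist}$ is the test score of student $i$ in school $s$ in year $t$, $X_{ist}$ represents student characteristics, and the estimated school effect $\hat{\theta}_s$ is interpreted as the school's contribution to student achievement growth.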

Limitations

A primary limitation of this study would be a lack of participation by principals in

Pennsylvania in taking the Google© Drive online survey, forced-choice, open-ended, and

interview questions, which would prevent any generalized application of the findings to

a larger population of principals. Another limitation would be an imbalance of principal

respondents, within the selected categories of public school principals from Pennsylvania,

resulting in non-generalizable data across the three levels of elementary, middle, and

high school principals. In addition, the researcher was reliant on volunteers to participate

in this study. The participation rate should be considered when applying any limited

results from the collected data to a general population. The accuracy of the Google© Drive

online survey, forced-choice, open-ended, and interview questions depends upon the

individual comfort levels of the participants to supply candid and honest input. A

possibility exists that the researcher may code the data inaccurately during final analysis

and unknowingly reflect accidental bias. A final limiting factor of this study rests upon the

perceptions of the principal participants, which are inherently subjective.

Assistant principals did not participate in the study because not every elementary

or secondary school has an assistant principal, nor are the responsibilities of that position

equivalent to those of the school principal. District-level administrators did not

participate in the study because the new statewide Pennsylvania Principal Effectiveness

Evaluation proposed by the Pennsylvania Department of Education was not directed

towards them nor did the researcher intend to examine the perceptions of evaluators.

Research Questions

This study examined the proposed implementation effects of the statewide

Pennsylvania Principal Effectiveness Evaluation in 2014 throughout the Commonwealth

of Pennsylvania. The following questions guided the study and provided data on the

perceptions of principals:

1. What are the perceptions of principals regarding the implementation of a

statewide Pennsylvania Principal Effectiveness Evaluation and its impact

on principal leadership practices?

2. What are the perceptions of principals regarding the implementation of a

statewide Pennsylvania Principal Effectiveness Evaluation and its impact

on teacher instructional practices?

3. What are the perceptions of principals regarding the implementation of a

statewide Pennsylvania Principal Effectiveness Evaluation and its impact

on student achievement scores?

Summary

The Pennsylvania Department of Education has proposed the implementation of a

new statewide Pennsylvania Principal Effectiveness Evaluation (PPEE) in 2014 that is

intended to standardize all leadership practices and measurement of performance

expectations for principals throughout the Commonwealth of Pennsylvania. This study

will examine data on the perceptions of principals regarding the implementation of the

new PDE statewide PPEE and its impact on principal leadership practices, teacher

instructional practices, and student achievement scores. Principals are expected to

benefit from the new statewide PPEE within the four domains of strategic and cultural

leadership; systems leadership; leadership for learning; and lastly professional and

community leadership (Pennsylvania Department of Education, 2012a). For this reason,

the new statewide PPEE is expected to have a positive effect on the principal’s leadership

regarding improvement in educator effectiveness, professional practices, standards or

competency levels, student growth or academic achievement, annual evaluation, and

meaningful support through professional development.

Chapter Two – Literature Review

Overview

This chapter is focused on literature related to principal evaluation systems as

reviewed from a historical and current viewpoint. The review provides an understanding

of what criteria are used to define the performance of principal leadership practices and

how these measurements link to teacher instructional practices and student achievement

scores. Principal evaluation systems from other states are reviewed to identify similar

trends throughout the United States. Additionally, chapter two examines a number of

studies for comparison with standards in Pennsylvania that guide principal evaluations.

Introduction

A school principal’s evaluation is a fundamental component of standards-based

accountability and is linked to overall school improvement of teacher instructional practices

and student achievement scores (Clifford, Hansen, & Wraight, 2012). The principal

evaluation process, when designed correctly, is vital to the improvement of leadership

qualities and organizational performance; it utilizes formative and summative feedback

as identifiers of weaknesses for professional development and builds a collective responsibility

for school-wide improvement in setting organizational goals and objectives for the

school’s leader (Goldring et al., 2009). The Pennsylvania Department of Education

(PDE) contended the measurement of principal effectiveness is an important element

in promoting and sustaining acceptable levels of teacher performance because it

impacts student learning (Pennsylvania Department of Education, 2012).

School Leadership and Principal Effectiveness

Soehner and Ryan (2011) stated leadership and how it is understood, defined, and

applied by the principal make it a very personal undertaking. Sergiovanni (2005) alleged

that school leadership is composed of three dimensions: the heart, the head, and the hand;

the heart of leadership pertains to what the person believes, values, dreams about and is

committed to; it is the person’s personal vision. Sergiovanni (2005) further added, “the

hand of leadership has to do with the actions we take, the decisions we make, the

leadership and management behaviors we use as our strategies become institutionalized

in the form of school programs, policies, and procedures” (p. 2). Fullan (2001) defined

the role of leadership by the principal as:

A school administrator is an educational leader who promotes the success of all

students by facilitating the development, articulation, implementation, and

stewardship of a vision of learning that is shared and supported by the school

community…[along with promoting] the success of all students by ensuring

management of the organization, operations, and resources for a safe, efficient

and effective learning environment…[while also] promoting the success of all

students by advocating, nurturing, and sustaining a school culture and

instructional program conducive to student learning and staff professional growth.

(p. 50)

Specifically, school leadership requires action and strategies that originate from our

personal vision, experience, and reflective abilities (Soehner & Ryan, 2011). Gastil

(1994) proposed, “leadership is an interaction between two or more members of a group

that often involves a structuring or restructuring of the situation and the perceptions and

expectations of the members” (p. 954). Furthermore Gastil (1994) indicated, “Leadership

occurs when one group member modifies the motivation or competencies of others in the

group [and] any member of the group can exhibit some amount of leadership” (p. 954).

Stewart (2006) pointed out, “leaders induced followers to act for certain goals that

represent the values and the motivations—the wants and needs, the aspirations and

expectations—of both leaders and followers” (p. 3). In reality, “leadership is the

process of persuasion or example by which an individual or leadership team induces a

group to pursue objectives held by the leader or shared by the leader and his or her

followers” (Gardner, 2000, p. 3).

Leadership success and principal effectiveness. Seashore Louis, Leithwood,

Wahlstrom, and Anderson (2010) communicated, “leadership success depends greatly on

the skill with which the leaders adapt their practices to the circumstances in which they

find themselves, their understanding of the underlying causes of the problems they

encounter, and how they respond to those problems” (p. 94). Seashore Louis et al. (2010)

further noted:

Leadership is all about organizational improvement; more specifically, it is about

establishing agreed-upon and worthwhile directions for the organization in

question, and doing whatever it takes to prod and support people to move in those

directions. Our general definition of leadership highlights these points: it is about

direction and influence. Stability is the goal of what is often called management.

Improvement is the goal of leadership. But both are very important. (p. 10)

This emphasis on principal leadership has increased attention to mentoring

and preparing school leaders (Davis, Darling-Hammond, LaPointe, & Meyerson, 2005;

Hale & Moorman, 2003).

Leadership approach and principal effectiveness. Studies over the last three

decades have linked the quality of principal leadership with positive school instructional

results and student achievement (Andrews & Soder, 1987; Brewer, 1993; Cheng, 1991;

Goldring & Pasternak, 1994; Hallinger & Heck, 1998; Leithwood, 1994; Leithwood,

Jantzi, Silins, & Dart, 1993; Waters, Marzano, & McNulty, 2003). Clifford, Hansen, and

Wraight (2012) detailed the changing conceptions and approaches to principal leadership:

• Traditional Manager – leaders uphold traditions in school and community and

work to create a more efficient system to attain goals.

• Supervisor of Standards – leaders shape staff and student behaviors to meet

organizational or societal standards and ensure that people adhere to established

norms.

• Adaptive Leader – leaders work closely with each teacher and adjust leadership

approaches to move individuals toward achievement of organizational goals.

• Instructional Leader – leaders encourage teachers to problem solve and revise

practice by facilitating self-reflection and collaborative learning.

• Leader Among Leaders – leaders recognize their limitations and the limitations of

their position and the capacity of others to lead.

Instructional Leadership and Principal Effectiveness

Glasman (1984) reported that principals of effective schools exhibit leadership

behaviors such as “setting corresponding instructional strategies, providing orderly

atmospheres, frequently evaluating student progress, coordinating instructional programs,

and supporting teachers” (p. 288). Instructional leadership theory is concentrated on the

principal’s role to frame the school’s mission, coordinate and monitor the instructional

program, and develop a positive learning culture (Hallinger & Heck, 1998). Current

research has focused on seven principal behaviors that depict instructional leadership as,

“making suggestions, giving feedback, modeling effective instruction, soliciting opinions,

supporting collaboration, providing professional development opportunities, and giving

praise for effective teaching” (Nettles & Herrington, 2007, p. 725). The role of principal

as an instructional leader has evolved into actively supporting the quality of teachers

within each classroom (Soehner & Ryan, 2011). Ballard and Bates (2008) affirmed,

“…the quality of a teacher in the classroom is the single most important factor in

determining how well a child learns” (p. 560). Gaziel (2007) recognized, “principals do

not affect the academic achievement of individual students in the same manner that

teachers do; that is, through direct classroom instruction” (p. 18). Instructional leadership

for principals requires multi-tasking through communication with teachers “about new

educational strategies, technologies, and other tools that promote effective instruction”

(Vanderhaar, Munoz, & Rodosky, 2006, p. 18). However, Seashore Louis, Leithwood,

Wahlstrom, and Anderson (2010) pointed out the distinction between “principals who

provided support to teachers by ‘popping in’ and ‘being visible’ as compared with

principals who were very intentional about each classroom visit and conversation, with

the explicit purpose of engaging with teachers about well-defined instructional ideas and

issues” (p. 90).

Shared leadership. Primarily, the principal is the foundation for instructional

leadership within each school (Soehner & Ryan, 2011). Reasonably, not all principals

have expert knowledge in all curriculum content areas or levels, but are still able to

utilize their individual talents to support student learning, which indirectly affects student

achievement (Seashore Louis, Leithwood, Wahlstrom, & Anderson, 2010). The same

researchers found principals frequently delegate instructional leadership to department

chairs and teacher leaders, grounded in the finding that “shared leadership and instructional leadership are

important variables, [which] are indirectly related to student achievement” (Seashore

Louis et al., 2010, p. 51). As a result, a key component to instructional leadership occurs

when there is a conscious shift of responsibilities from the principal to the teacher, with

the teacher taking a greater interest in student learning and being more aware of the needs

of their students (Soehner & Ryan, 2011). Seashore Louis, Leithwood, Wahlstrom, and

Anderson (2010) held the contention that instructional leadership of the principal is

secondary only to the classroom instruction of the teacher:

Based on a preliminary review of research, that leadership is second only to

classroom instruction as an influence on student learning, after six additional

years of research, we are even more confident about this claim. To date we have

not found a single case of a school improving its student achievement record in

the absence of talented leadership. (p. 9)

O’Donnell and White (2005) acknowledged, “Principals who strive to be instructional

leaders are committed to meeting the needs of their schools by serving stakeholders and

pursuing shared purposes…findings suggested that what principals do over time might

influence higher student test scores” (p. 57).

Transformational leadership. Burns (1978) and Bass (1998) emphasized the

theory of transformational leadership in terms of collaboration with other stakeholders,

particularly the role of the principal in inspiring and motivating the staff, developing a

commitment to a common vision, building the staff’s capacity to work collaboratively,

and shaping the organizational culture. Robinson, Lloyd, and Rowe (2008) conducted a

meta-analysis using the results of 22 studies that compared the effects of instructional and

transformational leadership on student outcomes. These researchers estimated that the effect

of instructional leadership on student outcomes was three to four times greater than that of

transformational leadership. They concluded, “the

more leaders focus their relationships, their work, and their learning on the core business

of teaching and learning, the greater their influence on student outcomes” (Robinson,

Lloyd, & Rowe, 2008, p. 636).

School Environment and Principal Effectiveness

Soehner and Ryan (2011) regarded an effective principal as an active participant

able to measure the school climate through being visible in the hallways and instructional

classrooms, while simultaneously focusing on the academics and ethical behavior of students

and staff. Nettles and Herrington (2007) implied an effective principal, “understands

how to balance school culture, the student population, and the community to promote

increased student achievement” (p. 731). Vanderhaar, Munoz, and Rodosky (2006)

classified three leadership practices relative to effective principals and school culture:

• Situational Awareness – the principal is aware of details and undercurrents in

running the school and uses information to address current and potential

problems.

• Intellectual Stimulation – the principal ensures that faculty and staff are made

aware of the most current theories and practices and incorporates discussion of

these as aspects of school culture.

• Staff Input – teachers are involved in the design and implementation of important

decisions. (p. 18)

The essential commitment of an effective school principal is to keep chaos at bay and

safeguard a school climate in which all students can learn (Klinker, 2006).

Student learning and principal effectiveness. Gaziel (2007) stated, “schools

that make a difference in students’ learning are led by principals who make a significant

contribution to the effectiveness of staff and in the learning of pupils in their charge” (p.

18). Gaziel (2007) continued to highlight, “principals influence student learning

indirectly by developing a school mission that provides an instructional focus for teachers

throughout the school, and this creates a school environment that facilitates student

learning” (p. 19). Researchers agreed many factors contribute to student learning that

have not yet been fully explained, but school leadership is recognized as the second most

influential school level factor towards fulfillment of student achievement and is second

only to the level of quality instruction delivered through teacher classroom practices

(Hallinger & Heck, 1998; Leithwood, Louis, Anderson, & Wahlstrom, 2004; Murphy &

Datnow, 2003; Supovitz & Poglinco, 2001; Waters, Marzano, & McNulty, 2003).

Studies of student growth reported that the effects of principal leadership on student achievement, while indirect and not easily measurable, are a strong influence upon student learning, and that principal leadership actions explain between .25 and .34

of the variation in overall student performance (Leithwood et al., 2004). Student


achievement needs to be a shared concern between administrators and teachers in the

school environment, but principals must be active participants and engaged in the

creation of a climate for student learning if a common goal for student achievement is to

be the end product (Soehner & Ryan, 2011).

Student achievement and principal effectiveness. Nettles and Herrington

(2007) explained, “the viewpoint that principals have a direct effect on student learning

has largely been abandoned and replaced by a focus on the indirect relationships that

principals create through their interactions with teachers and their educational school

environment” (p. 729). Other research from Kaplan, Owings, and Nunnery (2005)

concluded from a national investigation that examined 15 years of research on school

leadership, “an outstanding principal exercises a measurable though indirect effect on

school effectiveness and student achievement” (p. 29). Leithwood and Jantzi (2008)

found, in studying 96 principals, that the school principal’s sense of collective self-efficacy

does positively predict a school’s student achievement level. Conversely, student

achievement can be impacted by many internal and external factors such as student

health, work ethic, and socioeconomic status that are not controllable within the school

system (McGuigan & Hoy, 2006). A noteworthy variable in any examination of student

achievement remains the influence of parents and students to affect their own academic

success (Seashore Louis, Leithwood, Wahlstrom, & Anderson, 2010).

Accountability data and principal effectiveness. Seminal research from Glasman (1984), published 30 years ago, maintained that “the current call is for the principal to be

specifically accountable for the performance of students” (p. 283). However, student

achievement is not easy to define or measure based on a wide variety of data points from


classroom assessments, district benchmarks, and standardized testing (Soehner & Ryan,

2011). Superintendents are holding principals accountable for student achievement even

though studies indicate no direct link (Kaplan, Owings, & Nunnery, 2005; Ross & Gray,

2006). Accountability centered on data from student assessments can drive principals to

create a school culture focused only on student achievement and not other individual

needs for their students (Goe, Bell & Little, 2008; Wechsler & Shields, 2008). Gallagher

(2012) recommended that principals not develop school cultures too narrowly focused on

data alone, which excludes teacher professional judgment and creativity to meet

individual student needs. Furthermore, principals should attempt to create cultures of

collaboration that integrate data systems and teams of teachers to measure instructional

practices and accountability for student achievement (Gallagher, 2012).

Definition of Principal Effectiveness

Empirical studies on principal effectiveness have been challenged by the lack of

data to study school principals, their multifaceted work, and their influence on school

outcomes (Rice, 2010). Branch, Hanushek, and Rivkin (2009) agreed, “little systematic

evidence exists about the quantitative importance of principals” (p. 2). However, the

U.S. Department of Education (2011) outlined the following definition of an effective

principal:

A Principal whose students, overall and for each subgroup, achieve acceptable

rates (e.g., at least one grade level in an academic year) of student growth. States,

local education agencies, or schools must include multiple measures, provided

that principal effectiveness is evaluated, in significant part, by student growth….

Supplemental measures may include, for example, high school graduation rates


and college enrollment rates, as well as evidence of providing supportive teaching

and learning conditions, strong instructional leadership, and positive family and

community engagement. (Clifford, Hansen, & Wraight, 2012, p.64)

Traits of effectiveness and the principal. Nettles and Herrington (2007) listed

eight traits shared by effective principals to include: (1) recognizing school should be

centered on teaching and learning; (2) communicating the school’s mission on a

consistent basis with all stakeholders; (3) enforcing that standards are challenging and

attainable for students; (4) establishing clear academic goals and checking the progress of

students to meet them; (5) conducting principal walkthroughs within classrooms to

inspect instructional quality of teachers; (6) creating an atmosphere of trust and sharing;

(7) prioritizing professional development for staff; and finally (8) not accepting

ineffective teachers (p. 282).

Research on principal effectiveness. Principal effectiveness research continues

to develop and examine evidence related to leadership practices that make a difference in

schools (Davis, Kearney, Sanders, Thomas, & Leon, 2011). Studies on the practices of

effective principals, while less empirical in nature, described how principals perform

leadership tasks and focused on how these leadership tasks affected schools (Spillane,

Halverson, & Diamond, 2004). Horng, Kalogrides, and Loeb (2009) studied effective

schools research and concluded that effective principals influence a variety of school outcomes, including: student achievement, through safeguarding the quality of teachers

within classrooms; providing support and an organizational structure that ensures quality

classroom instruction and student learning; and effectively allocating resources aligned

with their school vision and goals. Research from the National Center for Analysis of Longitudinal Data in Education Research (CALDER) examined longitudinal state data to

estimate the effects of school leadership, specifically principal effectiveness, based on the

qualities of an effective principal as follows: greater teacher satisfaction related to

working in the school; better parent perceptions related to the school; improved student

attendance related to the school; and an increased student academic performance relative

to the school (Rice, 2010). The same research implied the effectiveness of principals

depends on: the principal’s individual level of experience; the principal’s individual

sense of efficacy towards particular tasks; and lastly, the principal’s individual allocation

of time on daily job responsibilities. Rice (2010) added that principals with experience

and skills related to effective practices are less likely to be found working in high-poverty, low-achieving schools. Research on low-performing schools that function comparably to high-performing schools supports the following observable leadership

practices for effective principals: implementation of a coherent standards-based

curriculum and instructional programs in the school; use of student assessment data to

improve classroom instruction; sticking with district reform or school initiatives over

time; tailoring learning strategies to address the individual needs of students; and

ensuring instructional resources (Sebring, Allensworth, Bryk, Easton, & Luppescu, 2006;

Watts et al., 2006; Williams, Kirst, Haertel, et al., 2005). School effectiveness

research concluded principals are critical within the process of school improvement

(Leithwood & Riehl, 2005).

Principal effectiveness influence on teacher effectiveness. Branch, Hanushek,

and Rivkin (2009) stated principals do not have a direct impact on student achievement,

but do have an impact on teacher effectiveness. Effective principals must work to


improve the conditions of classroom instruction and school culture that permits their

teachers to impact student learning (Horng, Kalogrides, & Loeb, 2009). O’Donnell and

White (2005) identified that principals are the key component to fostering trust and

exhibiting an attitude of caring for staff, students, and parents. These researchers held,

“effective principals expect and help teachers to design and facilitate learning

experiences that inspire, interest, and actively involve students” (p. 5). Wahlstrom and

Louis (2008) reinforced, “supportive principal behavior and faculty trust were

significantly correlated in their sample of secondary schools and that schools with higher

levels of engaged teachers, including commitment to students, had higher levels of trust

in colleagues” (p. 462). Therefore, effective principals are critical to retaining effective

teachers who contribute positively to, rather than disrupt through turnover, the efforts of improving school culture, teacher effectiveness, and student achievement (Beteille,

Kalogrides, & Loeb, 2009). The role of the school principal has changed dramatically

over the last 20 years (Levine, 2005). The Wallace Foundation (2006) stated the role of

an effective school principal has moved away from “superhero or virtuoso soloist” (p. 2)

towards an “orchestra conductor” (p. 2) who shares leadership and distributes it across

their school. This conceptualization of school leadership views an effective principal as

the individual responsible for creating a community through sharing authority and

distributing leadership roles to teachers whose skills, capacities, and competencies are

similar to an effective principal (Spillane, Halverson, & Diamond, 2004; Steiner, Hassel,

& Hassel, 2008). Davis, Kearney, Sanders, Thomas, and Leon (2011) called attention to

modern standards-based and performance-based principal evaluations that now put

emphasis on instructional and collaborative leadership practices.


Research on Principal Effectiveness Evaluation

There is little to no agreement on what and how to evaluate principals because

few reliable and valid methods for principal effectiveness evaluation exist (Goldring et

al., 2009). Additionally, there is little agreement on what measurements and information

should be used as authentic and dependable evidence for principal effectiveness (Porter,

Goldring, Murphy, Elliot, & Cravens, 2006). Goldring et al. (2007) studied the

evaluation methods of 66 districts and concluded that both the evaluation instruments and the emphasis placed on the importance of instructional leadership varied from district to district.

Catano and Stronge (2007) analyzed the contents of principal evaluation instruments and

found that most districts focused on the principal’s instructional leadership,

organizational management, and community relations. Knapp, Copland, Plecki, and

Portin (2006) analyzed other district methods for educational leadership assessment and

were able to identify three major uses for principal evaluations: evaluating performance;

providing formative feedback for continued professional development; and investigating

how to improve the schools. The same researchers examined the nature of leadership

assessment for school principals and how it evolved within many districts to link with

school improvement. Other shifts in the applications of principal assessments were

identified as: moving away from traits and dispositions towards behaviors and actions

focused on outcomes; the adoption and use of national and state leadership standards; an

increased focus on instructional leadership to improve student learning; greater use of

performance to pinpoint needs for professional development; and an increased

understanding of the influences that affect school leadership (Knapp, Copland, Plecki, &

Portin, 2006).


Friedman (2002) ascertained that performance evaluation is key to providing trusted feedback on the work of principals, given their collective feelings of isolation from peers and the unique demands of the position. Available studies

examined and questioned the consistency, fairness, effectiveness, and value of current

principal evaluation practices (Condon & Clifford, 2010; Goldring et al., 2009; Heck &

Marcoulides, 1996; Portin, Feldman, & Knapp, 2006; Thomas, Holdaway, & Ward,

2000). Conversely, limited research is available on the design or effects from

performance evaluation on principals, schools, and students (Clifford & Ross, 2011).

Clifford, Hansen, and Wraight (2012) maintained that principal evaluation is key to principal

effectiveness, job performance accountability, classroom instruction, student learning,

and self-reflection.

Design of principal evaluation. Goldring et al. (2009) studied evaluation

systems used in 35 urban districts within 9 states that were engaged in leadership

initiatives to impact school improvement efforts. These researchers measured the actions

of principals and examined the evaluation instrument. Reeves (2009) reviewed data from

over 300 principal evaluation instruments and 500 more from 21 states to investigate

experience levels related to work in comparison to the content within the evaluation

instruments. Lastly, Condon and Clifford (2010) examined principal performance

assessment instruments based on validity and reliability.

Researchers have criticized several principal evaluation systems for not focusing on key elements such as clear performance standards, thoroughness in design, and attentiveness towards implementation (Goldring et al., 2009; Reeves, 2009). The same

researchers explained that a common weakness of modern principal evaluation systems


remains a focus on personal knowledge attainment and individual traits. Fenton et al.

(2010) noted the difficulty with measuring outcomes that connected principal traits to

student achievement. Principal evaluation systems need to be more concentrated on the

behaviors and actions of principals that affect particular teacher effectiveness and student

achievement outcomes (Fenton et al., 2010). Experts recommend the following seven

categories as a guide to principal effectiveness evaluation: (1) what is the purpose of the

evaluation; (2) what is assessed or measured; (3) what are the sources of evidence; (4)

who is assessed; (5) who provides feedback; (6) when does assessment occur and how is

assessment conducted; and (7) what are the psychometric qualities of the assessment

(Brown-Sims, 2010; Condon & Clifford, 2010; Portin, Feldman, & Knapp, 2006).

Goal of principal evaluation. Clifford, Hansen, and Wraight (2012) explained

the goals of principal evaluation in terms of formative and summative assessments to

indicate: formative assessment measures competencies, and results can be used to inform

future performance decisions or actions; and summative assessment shows overall

competence without any opportunity for improvement or remediation. Orr (2011)

researched the goals of principal evaluation systems and determined that school districts and states emphasize one or more of the following criteria:

• Improvement of principal practice (formative) – principal evaluation system

provides evidence and feedback on performance, which can be used by principals

to improve their practices. The evaluation measures principal effectiveness and is

intended to inform upon professional development improvement and growth.


• Decision about principal competency (summative) – principal evaluation system

provides school district staff with evidence of the principal’s performance, which

can be used for decisions about job retention, advancement, or compensation.

• Articulation of state or district goals – principal evaluation system connects state

and district educational improvement priorities through the selection and weighting

of competencies.

• Support for teacher growth and evaluation – principal plays a fundamental role in

the evaluation of teachers and creation of conditions that sustain teacher practices,

accountability, and professional development opportunities within compliance for

teacher evaluation. (Clifford, Hansen, & Wraight, 2012, p. 17)

Reeves (2009) recognized that school districts do not commonly link a principal’s evaluation directly to student achievement outcomes or teacher effectiveness ratings. Goldring et al. (2009) found that principal evaluation instruments measure too many

categories focused on the actions of the principal, such as: general management,

implementation of school vision, parent and community relationships, decision-making

based on data, and communication skills. In contrast, the same researchers identified that a

majority of principal evaluation systems do not emphasize the performance of critical

behaviors by school principals that directly impact student achievement, particularly

leadership practices within the domains of classroom instruction, school culture, and

management of human resources.

National performance standards of principal evaluation. Reeves (2009) and

Goldring et al. (2009) reported many states and school districts have agreed on specific

leadership standards for school principals. However, these researchers contended the


evaluation instruments usually do not align to adequately or specifically measure those

standards. Reeves (2009) recommended that effective evaluation systems contain clear

definitions along with a detailed rubric related to performance levels to measure aspects

of principal leadership standards.

At present, 40 states acknowledge the Interstate School Leaders’ Licensure

Consortium (ISLLC) standards as a comprehensive set of research-based leadership

behaviors and actions specific to principal performance competencies and expectations

for evaluation systems. In 1996, the Council of Chief State School Officers (CCSSO)

published the ISLLC standards, which more than 80% of states have used for over ten years as the basis for designing their own leadership standards (Fenton et al., 2010).

Sanders and Kearney (2008) reported the CCSSO (2008) revised their standards into six

operational performance expectations for school leaders:

• Vision, Mission, and Goals – education leaders ensure the achievement of all

students by guiding the development and implementation of a shared vision of

learning, strong organizational mission, and high expectations for every student.

• Teaching and Learning – education leaders ensure achievement and success of all

students by monitoring and continuously improving teaching and learning.

• Managing Organizational Systems and Safety – education leaders ensure the

success of all students by managing organizational systems and resources for a

safe, high-performing learning environment.

• Collaborating with Families and Stakeholders – education leaders ensure the

success of all students by collaborating with families and stakeholders who


represent diverse community interests and needs and mobilizing community

resources that improve teaching and learning.

• Ethics and Integrity – education leaders ensure the success of all students by

being ethical and acting with integrity.

• The Education System – education leaders ensure the success of all students by

influencing interrelated systems of political, social, economic, legal, and cultural

contexts affecting education to advocate for their teachers’ and students’ needs.

(Fenton et al., 2010, p. 19)

ISLLC standards and performance expectations are not a comprehensive guide for states

or local education agencies to use for the assessment of principal effectiveness (Fenton et

al., 2010).

Psychometric properties of principal evaluation. Condon and Clifford (2010)

established that most evaluation instruments used to assess principal performance contain two

problems: (1) most were constructed using principal leadership research over a decade

old; and (2) most are not tested for psychometric properties. These researchers examined

eight principal evaluation instruments and concluded that only one, the Vanderbilt Assessment

of Leadership in Education or VAL–ED, met high standards in content related to validity

and reliability. Goldring et al. (2009) affirmed the same concern about the psychometric

properties within principal evaluation instruments. Goldring et al. (2009) offered the

following overall critique of evaluation systems:

There is little discussion of psychometric properties, evaluation procedures, or

evaluator training among the sampled assessment instruments and procedures…


There is little consistency in how assessments are developed, which leadership

standards are used, and if the measures are reliable and valid. (p. 35)

Principal evaluation systems encounter the same problems as teacher evaluation systems

in that supervisors and participants are often not invested in the process as a tool for

systematic improvement and learning (Fenton et al., 2010). Kimball and Milanowski

(2009) observed that partial attention from upper school district administrators and

inadequate effort to train evaluators properly on how to use the principal evaluation

instruments resulted in contradictory experiences among participants.

Measurements within Principal Evaluation

Clifford, Hansen, and Wraight (2012) proposed that principal evaluation systems

should be clear on purpose, type, and standards used to measure principal practices and

outcomes. These researchers delineated principal practices as “the quality of principals’

performance on certain tasks or functions” (p. 28) and outcomes as “anticipated impact

on schools, teaching, and students” (p. 28). The Elementary and Secondary Education

Act 2011 (ESEA), Race to the Top 2010 (RTTT), American Recovery and Reinvestment

Act 2009 (ARRA), and No Child Left Behind Act of 2001 (NCLB) all supported and

refined a federal definition for principal effectiveness based upon the use of valid and

reliable measurements of practice and outcomes (U.S. Department of Education, 2011;

U.S. Department of Education, 2010). Race to the Top 2010 (RTTT) required states to

implement principal evaluation systems capable to “differentiate effectiveness using

multiple rating categories that take into account data on student growth…as a significant

factor” (U.S. Department of Education, 2010, p. 34). Race to the Top 2010 (RTTT)

stressed using multiple measurements of principal performance to provide a holistic


picture beyond the solitary principal observation. The Elementary and Secondary

Education Act 2011 (ESEA) required states to establish a consistent method for principal

evaluation measurement, based on validity and reliability, throughout school districts.

Clifford et al. (2012) simplified the key terms related to measurements and methods for

principal evaluation:

• Validity – a measure that focuses on an assessment’s ability to measure what it is

intended to measure for prescribed purposes.

• Reliability – a measure of consistency and stability of a given instrument or rater.

Measures are said to be reliable when responses are consistent and stable for each

individual who is assessed.

• Feasibility – a sense that measures can be implemented as prescribed, given

financial, human, or other constraints.

• Utility – evidence that measures provide actionable feedback which principals can

use to make changes in practice.

• Fairness – evaluation measures and methods should be consistently administered

to principals in a given population by trained staff and held to similar standards.

In addition, the structure of measurements and methods for state and district principal

evaluation must outline the following: the frequency, order, and timing of the evaluation

procedures for all principals; any procedural steps given discretion to district level

administrators that evaluate principals; clarification on evidence collection and guidelines

for evaluation; and lastly the scoring or rating method correlated to the determination of

different evidence between performance levels, weighting of domains, or specific areas

of priorities for individual principals (Clifford, Hansen, & Wraight, 2012, p. 41).


Performance levels of principal evaluation. Clifford, Hansen, and Wraight

(2012) detailed that clear interpretations for each level of principal performance need to

be established with rubrics, examples, and documentation to reduce misunderstandings

over the rating method and measurements. The same researchers specified two further

requirements: (1) states need to designate performance levels for principals that account

for different levels of experience from novice, developing, proficient, and exemplary; and

(2) evaluation systems that contain four or more performance levels are typically more

precise in providing feedback than systems limited to only two levels. Currently, three

types are being used to distinguish levels of principal performance:

• Scorecards – a single form displaying a “score” that may be quantitative or

qualitative (e.g., proficient, distinguished) for each practice, standard, or outcome.

• Rubrics – a set of tables with cells to include descriptors of practices or outcomes

for each level. Principals’ scores are highlighted on the rubric.

• Checklist – a single form that shows whether or not principals met established

performance expectations. (Clifford, Hansen, & Wraight, 2012, p. 43)

Student growth factors in principal evaluation. Student growth used as a measurement within performance evaluations is a major concern for teachers and principals. Presently, federal guidelines stipulate that student growth be measured rigorously between two points in time (Secretary’s Priorities for Discretionary

Grant Programs, 2010). Additionally, student growth measurements need to be fair,

valid, and reliable based on an intentional purpose with the results able to be attributed to

individual teachers and principals (Herman, Heritage, & Goldschmidt, 2011). Holdheide,

Goe, Croft, and Reschly (2010) highlighted that certain student growth measurements are


not appropriate within the contexts of teacher or principal evaluation in relationship to

students with learning disabilities, English language learners, gifted, or at-risk students.

The same researchers emphasized that student growth measurements become more

complicated when multiple teachers are responsible for classroom instruction, and the principal’s observation efforts make it more difficult to attribute credit for co-taught practices.

Weighted school level factors in principal evaluation. Clifford, Hansen, and

Wraight (2012) conveyed that school districts can weight school level factors such as student

growth, value-added teacher instructional quality, and school performance measurements.

Often, states and districts use a percentage of the teacher value-added or growth scores

attributed to a school as a factor in the performance evaluation of the principal. The

following areas contain commonly weighted factors used within principal evaluation:

• Student growth measures – value-added models; student achievement trends;

percentage of student learning objectives (SLO) achieved in a school; and locally

or regionally used subject specific test results.

• Instructional quality measures – placement indicators for subject areas requiring

certified teachers; teacher retention rates; and specific measures of instructional

quality.

• School performance measures – student attendance, attrition, and behavioral

incidents; school climate; community participation, interaction, and satisfaction;

progress on school improvement plans; and progress on school fiscal management

plans. (Clifford, Hansen, & Wraight, 2012, p. 29)


The weight assigned to school level factors should reflect the goals and values of the state,

district, or principals to support teacher classroom instruction and student learning

(Clifford et al., 2012). Accordingly, some districts may weight school level student

growth as 25% of a principal’s total evaluation, while another district might weight

school climate at 50% of their principal’s performance evaluation.
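To make the weighting idea concrete, the brief Python sketch below shows how the same school-level measures can yield different composite results under two different weighting choices. The measures, the 0-3 scale, the weights, and the district labels are illustrative assumptions, not values drawn from any actual evaluation system described in this study.

# Illustrative only: hypothetical school-level measures on a 0-3 scale.
measures = {"student_growth": 1.6, "instructional_quality": 2.4, "school_climate": 2.8}

# Two hypothetical weighting schemes reflecting different local priorities.
district_weights = {
    "District X": {"student_growth": 0.25, "instructional_quality": 0.25, "school_climate": 0.50},
    "District Y": {"student_growth": 0.50, "instructional_quality": 0.30, "school_climate": 0.20},
}

for district, weights in district_weights.items():
    # Composite is a simple weighted sum of the same underlying measures.
    composite = sum(weights[m] * measures[m] for m in measures)
    print(district, round(composite, 2))  # District X: 2.4, District Y: 2.08

The identical school produces a higher composite under the climate-heavy weights and a lower composite under the growth-heavy weights, which is why the choice of weights should be deliberate and transparent.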

Data integrity in principal evaluation. Clifford, Hansen, and Wraight (2012)

communicated that procedural safeguards need to be in place to ensure data integrity

relative to an infrastructure responsible for collecting, validating, interpreting, tracking,

and communicating principal performance data. In addition, these researchers advocated

for the security of teacher and student performance data that impact principal

evaluations. Principal evaluation data is ultimately used to inform state and district

decisions that guide professional development and assess the quality of their evaluation

system (Clifford, Hansen, & Wraight, 2012).

Training evaluators in principal evaluation. Evaluation systems are dependent

upon the quality of support invested in training the evaluators. Clifford, Hansen,

and Wraight (2012) suggested that if states and districts plan to put a new principal evaluation system into effect, attention should be focused on the fidelity of implementation, inter-rater

reliability, and evaluator feedback. These same researchers added the evaluators must be

responsible for monitoring and follow-through of the evaluation process, collecting data

with integrity, properly interpreting information, and providing feedback.
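As one illustration of how inter-rater reliability might be checked during evaluator training, the short Python sketch below computes Cohen's kappa for two hypothetical evaluators who rated the same principals on a four-level scale. The rating labels and data are assumptions for illustration only and are not part of any actual evaluation instrument discussed here.

from collections import Counter

LEVELS = ["failing", "needs improvement", "proficient", "distinguished"]

def cohen_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Expected agreement if the two raters assigned levels independently.
    expected = sum(freq_a[level] * freq_b[level] for level in LEVELS) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical ratings of the same ten principals on one domain.
rater_a = ["proficient", "proficient", "distinguished", "needs improvement", "proficient",
           "proficient", "failing", "proficient", "distinguished", "proficient"]
rater_b = ["proficient", "distinguished", "distinguished", "needs improvement", "proficient",
           "proficient", "needs improvement", "proficient", "proficient", "proficient"]

print(round(cohen_kappa(rater_a, rater_b), 2))  # 1.0 = perfect agreement; 0 = chance level

Low kappa values would signal that evaluators are interpreting the rubric differently and that additional calibration training is needed before ratings are used for high-stakes decisions.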

Feedback in principal evaluation. Principal evaluation enables feedback on leadership practices that improves principal effectiveness and increases accountability

towards job performance (Orr, 2011). Individuals lose trust if the evaluation process is


not structured around meaningful feedback (Clifford, Hansen, & Wraight, 2012). The

Joint Committee on Standards for Educational Evaluation (2010) recognized effective

forms of feedback within principal evaluations to include:

• A report assessment in each evaluation area, standard, or domain.

• Personal growth and comparative information between other principals

within similar schools.

• A written narrative summarizing the evaluation process, findings,

feedback, and a plan for improvement.

DeNisi and Kluger (2000) found employees valued a written narrative and conversation

with a trusted, experienced evaluator or supervisor focused on actionable feedback. Data

collected within principal evaluation feedback holds the potential for providing principals

with support and professional development opportunities. Clifford, Hansen, and Wraight

(2012) concluded all states and districts must clearly communicate how evaluation data

will or will not be used within the assessment process for principals.

State and District Principal Evaluation Systems

Augustine et al. (2009) reported that several policies establishing new leadership standards and creating new principal evaluation systems followed the adoption of

NCLB legislation. Accountability in NCLB (2001), Race to the Top (U.S. Department of

Education, 2010), and the Elementary and Secondary Education Act (U.S. Department of

Education, 2011) has emphasized the importance of principal assessment and evaluation.

Consequently, many states are in the process of adopting statewide principal evaluation

models to increase achievement and outcomes on standardized testing for students.


Federal and state legislation are fundamental in the development of all principal

evaluation systems. Legislative interpretations at the state-level can cause variations in

implementation of a consistent policy and guidelines throughout an entire state (Berman

& McLaughlin, 1976). Principal evaluation has varied among schools, districts, and

states dependent upon local frameworks for design and implementation (Clifford,

Hansen, & Wraight, 2012). Over 40 states passed legislation adopting one or more sets

of national professional practice standards for performance evaluation and preparation

purposes (Anthes, 2005; Hale & Moorman, 2003):

• Interstate School Leadership Licensure Consortium (ISLLC) Standards and

Indicators – devised using principal and school effectiveness literature from

Council of Chief State School Officers (2008). Standards can be found at

www.ccsso.org.

• National Board for Professional Teaching Standards (NBPTS): Standards for

Principals – designed to guide principal development as instructional leaders and

reinforce the NBPTS master principal assessment system. Standards can be found

at www.nbpts.org.

• National Association of Elementary School Principals’ Leading Learning

Communities: Standards for What Principals Should Know and Be Able to Do –

focused on principals as instructional leaders and participants in learning

communities within schools for continuous improvement of student learning.

Standards can be found at www.naesp.org.

The Joint Committee on Standards for Educational Evaluation’s Personnel Evaluation

Standards (2010) suggested that policymakers and evaluation designers review principal evaluation systems to ensure that each system is designed with the direct involvement of principals to build trust; is connected to district- and state-level principal support systems; is aligned with teacher performance assessments; uses an evaluation instrument that is rigorous, fair, equitable, reliable, and accurate; includes multiple rating categories to differentiate evaluation performance; gathers evidence of principal performance using multiple measures of practice; communicates all results to principals with transparency; and provides training, support, and evaluation for principal evaluators. The following three implementation models of

educator evaluation design have been utilized by most states along with the creation of

hybrids because each approach has strengths and weaknesses (Clifford et al., 2012, p. 9):

• Simultaneous design – principal and teacher evaluation systems are designed at

the same time but separately. A single committee can be convened to design both

systems, or two separate committees might work in parallel. Subcommittees can

share ideas.

• Principal first design – a principal evaluation system committee is convened for

the purpose of principal evaluation design prior to launching a teacher evaluation

system design.

• Teacher first design – a teacher evaluation system committee is convened for the

purpose of teacher evaluation design prior to launching a principal evaluation

system design.

State-level evaluation systems. A state-level evaluation system will strictly

interpret legislation to stipulate the components, measurements, and administration of the

assessment model for principals (Clifford, Hansen, & Wraight, 2012). The state directs

school districts to use only state-approved evaluation models. During the 2012 school year,


the state of Tennessee implemented a statewide principal evaluation model across all of

its school districts. The redesign of principal evaluations in Tennessee was prompted

by Race to the Top (U.S. Department of Education, 2010). The design process included

state administrators, district superintendents, school principals, and teachers to adopt the

single model that incorporated value-added measures of student performance as a

significant portion of their principals’ evaluations (Tennessee Department of Education,

2011).

Tennessee evaluation model. Tennessee principals must be assessed using the

state’s model based on the Tennessee Instructional Leadership Standards (TILS). As of

April 2011, the State Board of Education adopted regulations for five levels of principal

performance and multiple performance measures with weights as follows:

• School level value-added measure from the TVAAS (35%).

• Student achievement data (15%).

• Qualitative scores on TILS rubric to include school climate surveys (35%).

• Quality of teacher evaluations (15%).

Tennessee requires two annual, on-site observations, announced and unannounced, and

provides a list of approved measures for student achievement, school climate, and working conditions surveys (Tennessee Department of Education, 2011).

Elective state-level evaluation systems. An elective state-level evaluation

system will strictly interpret state and federal legislation to stipulate that districts adopt

certain aspects of an evaluation system, but allow local discretion on other aspects of the

system (Clifford, Hansen, & Wraight, 2012). In the elective state-level evaluation system

model, the state establishes the core principal evaluation model and ensures that districts


comply with core elements of the model while allowing districts to add standards, which

reflect local initiatives and values. The state of Colorado passed legislation to implement

a new principal evaluation system that required districts to adopt seven principal quality

standards, but does not prohibit districts from adding standards (Colorado Department of

Education, 2011).

Colorado evaluation model. In 2010, the Colorado legislature passed SB10-191,

requiring all districts to adopt new teacher and principal evaluation systems by 2014–15.

The same legislation established a common definition of principal effectiveness, seven

principal quality standards, and the following requirements for Colorado’s Elective State-

Level Evaluation System:

• School-wide student growth scores must account for 50% of the final score.

• Evaluation must occur annually.

• Results must be used in human resource decisions.

• Principals ranked “unsatisfactory” must be provided professional development

and support to improve.

This Colorado model for principal evaluation allows the districts to adapt rubrics, forms,

and guidance on selecting performance measures for principals (Colorado Department of

Education, 2011).

District evaluation systems with required parameters. In a district evaluation

system, a statewide principal evaluation model may be impractical or inappropriate; the state’s role instead is to help ensure that school districts comply with all applicable legislation, using state-level audits of district criteria or other measurable information to verify alignment with state-level standards (Clifford, Hansen, & Wraight, 2012). In a district


evaluation system model, the individual districts influence development and professional

support within their local principal evaluations. The state of Illinois recommended that

districts use the state-level model, but districts retain the option to develop a local principal evaluation model for review and approval by a state committee (Illinois

Department of Education, 2012).

Illinois evaluation model. In 2010, the Illinois legislature passed a law requiring

all districts to evaluate principals by 2012–13. The state provided a principal evaluation

system model for districts with an option to develop their own models and submit them

for state approval. The Illinois State Board of Education has proposed the following

requirements for all approved models:

• Student growth must be a significant factor in every evaluation.

• Evaluation of principal practice must account for 50% of a principal’s score.

• Student growth must be measured using data from two assessment types.

• Annual evaluation must include two formal observations as site visits.

• There are four levels of performance.

Unlike the Illinois teacher evaluation model, the state does not require districts to use the

state’s default model for student growth for principal evaluation (Illinois Department of

Education, 2012).

Pennsylvania Principal Effectiveness Evaluation Model. Reports from the

2009-2010 school year evaluations indicated that 99.4% of teachers and 99.2% of

principals across the state of Pennsylvania were rated as satisfactory, yet the results on

the 2011 Pennsylvania System of School Assessment (PSSA) showed 26% of students

performed at basic or below basic levels in reading, and 23% performed at basic or below


basic levels in math (Pennsylvania Office of the Governor, 2011). The Pennsylvania

Department of Education has indicated that its new statewide Pennsylvania Principal Effectiveness Evaluation will attribute an overall score based on: 50% from observation evidence of leadership practice; 15% from building-level data; 15% from teacher PVAAS scores and the Danielson rating rubric; and lastly 20% from elective or student

learning objectives (SLO) measurements (Pennsylvania Department of Education,

2012b). Figure 2.1 illustrates the percentages attributed to the multiple components

within the Principal Effectiveness System for Pennsylvania (Pennsylvania Department of

Education, 2012b).

Figure 2.1

Principal Effectiveness System for Pennsylvania


Student growth would be measured using value-added model scores, graduation rates,

promotion rates, attendance rates, participation in Advanced Placement courses, and scores

for the PSAT and SAT (Pennsylvania Department of Education, 2012b). PDE would

maintain a secure data infrastructure that links individual student PSSA data to their

teachers, and individual teachers with their principals to ensure accurate reporting of

school adequate yearly progress (AYP). Confidentiality of student data would be preserved

through unique identifiers that allow for the Pennsylvania Value Added Assessment

System (PVAAS) to track academic growth on each student.
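As an arithmetic illustration of how the four weighted components described above could combine into a single overall score, the Python sketch below assumes hypothetical component scores on a 0-3 scale and illustrative performance-level cut points; the actual PDE rubrics, scoring scales, and cut scores are not specified here.

# Percentages as described in the text; component scores and cut points are hypothetical.
WEIGHTS = {
    "observation_evidence": 0.50,  # observation evidence of leadership practice
    "building_level_data": 0.15,
    "teacher_measures": 0.15,      # teacher PVAAS scores and Danielson rating rubric
    "elective_slo": 0.20,          # elective / student learning objective measures
}

def overall_score(component_scores):
    """Weighted sum of component scores, each assumed to be on a 0-3 scale."""
    return sum(WEIGHTS[name] * score for name, score in component_scores.items())

def performance_level(score):
    """Map a 0-3 composite to an illustrative four-level rating (cut points assumed)."""
    if score >= 2.5:
        return "distinguished"
    if score >= 1.5:
        return "proficient"
    if score >= 0.5:
        return "needs improvement"
    return "failing"

scores = {
    "observation_evidence": 2.4,
    "building_level_data": 2.0,
    "teacher_measures": 1.8,
    "elective_slo": 2.2,
}
composite = overall_score(scores)  # 0.5*2.4 + 0.15*2.0 + 0.15*1.8 + 0.2*2.2 = 2.21
print(round(composite, 2), performance_level(composite))  # 2.21 proficient

The sketch simply shows the mechanics of the weighting; under the actual system the observation component carries the largest influence on the final rating because of its 50% weight.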

Summary

Principal leadership is recognized as the second most influential school level

factor in fulfillment of student achievement and is second only to the level of quality

instruction delivered through teacher classroom practices (Seashore Louis, Leithwood,

Wahlstrom, & Anderson, 2010). Over the past 30 years, studies have measured the

quality of principal leadership with school instructional results and student achievement

outcomes (Hallinger & Heck, 1998; Waters, Marzano, & McNulty, 2003).

Knapp, Copland, Plecki, and Portin (2006) identified three major uses for

principal evaluations: performance feedback; professional development; and

investigating how to improve the schools. Available studies question the consistency,

fairness, effectiveness, and value of current principal evaluation practices (Condon &

Clifford, 2010; Goldring et al., 2009; Portin, Feldman, & Knapp, 2006). Experts

recommended the following seven categories as a guide to principal effectiveness

evaluation: (1) what is the purpose of the evaluation; (2) what is assessed or measured;

(3) what are the sources of evidence; (4) who is assessed; (5) who provides feedback; (6)


when does assessment occur and how is assessment conducted; and (7) what are the

psychometric qualities of the assessment (Brown-Sims, 2010; Condon & Clifford, 2010;

Portin et al., 2006). Goldring et al. (2009) affirmed concerns about psychometric

properties within principal evaluation instruments and about the lack of consistency in how assessments are developed, which leadership standards are used, and whether the measures are

reliable and valid. Clifford, Hansen, and Wraight (2012) simplified the key terms

applicable to measurements and methods for principal evaluation into validity, reliability,

feasibility, utility, and fairness. Clifford et al. (2012) distinguished formative principal evaluation, which measures competencies and yields results that can inform future performance decisions or actions, from summative evaluation, which shows overall competence with no opportunity for further improvement.

Herman, Heritage, and Goldschmidt (2011) stated student growth measurements

need to be fair, valid, and reliable based on an intentional purpose with the results to be

attributed to individual teachers and principals. Holdheide, Goe, Croft, and Reschly (2010)

ascertained student growth measurements are not appropriate within the contexts of

teacher or principal evaluation for students with learning disabilities, English language

learners, gifted, or at-risk students.

Clifford, Hansen, and Wraight (2012) suggested that if states and districts plan to put a new principal evaluation system into effect, then attention needs to be focused on the

fidelity of implementation, inter-rater reliability, and evaluator feedback.

The Joint Committee on Standards for Educational Evaluation’s Personnel

Evaluation Standards (2010) suggested that legislators and evaluation designers review principal evaluation systems to ensure that each system is designed with the direct involvement of principals to build trust; is connected to district- and state-level principal support systems; is aligned with teacher performance assessments; uses an evaluation instrument that is rigorous, fair, equitable, reliable, and accurate; includes multiple rating categories to differentiate evaluation performance; gathers evidence of principal performance using multiple measures of practice; communicates all results to principals with transparency; and provides training, support, and evaluation for principal evaluators. Chapter Three will discuss the methods

and procedures used in the study.


Chapter Three – Methods and Procedures

Introduction

The Pennsylvania Department of Education (PDE) developed a new statewide

Pennsylvania Principal Effectiveness Evaluation (PPEE) scheduled for implementation

during the 2014 school year to rate the performance of principals. Vitcov and Bloom

(2010) reported principal leadership is second only to the quality of teacher instruction

in influencing student achievement. For that reason, the statewide Pennsylvania

Principal Effectiveness Evaluation would have an effect on the practice of principal

leadership in schools and teacher classroom instruction that will influence student

achievement scores. Clifford, Hansen, and Wraight (2012) indicated that information on

the professional practice of the school principal is the foundation for understanding the

characteristics of principal effectiveness evaluation and appraisal design. Existing

research on principal evaluation questions the consistency, fairness, effectiveness, and

value attributed to current principal evaluation practices (Condon & Clifford, 2010;

Goldring et al., 2009). Lastly, current research has described the perspectives

surrounding the purpose and process of principal evaluation, owing to a lack of professional agreement on “what should be evaluated and how” (Sanders & Kearney, 2011, p. 2).

Setting

The districts chosen for the study were located in southeastern Pennsylvania.

These districts comprised 38 elementary schools, 16 middle schools, and 12 high

schools. Adequate Yearly Progress (AYP) status levels for the eleven selected districts

indicated their ability to maintain AYP for two school years between 2009-2010 and

2010-2011. Specifically, eight districts continually sustained AYP over the past three


school years while three districts declined into warning. Districts A, D, and I represented the 3 (27%) districts in warning status, while Districts B, C, E, F, G, H, J, and K accounted for the 8 (73%) districts that again made AYP for the 2011-

2012 school year. Table 3.1 illustrates the AYP status level of the eleven participant

districts over the past three school years, 2009-2010 through 2011-2012, as either in

Warning or Made AYP.

Table 3.1

Participant Districts by AYP Status for Last Three Years

Participant Districts 2009-2010 2010-2011 2011-2012

District A Made AYP Made AYP Warning

District B Made AYP Made AYP Made AYP

District C Made AYP Made AYP Made AYP

District D Made AYP Made AYP Warning

District E Made AYP Made AYP Made AYP

District F Made AYP Made AYP Made AYP

District G Made AYP Made AYP Made AYP

District H Made AYP Made AYP Made AYP

District I Made AYP Made AYP Warning

District J Made AYP Made AYP Made AYP

District K Made AYP Made AYP Made AYP

Note. Participant Districts (N=11).

Participants

This study was conducted in eleven school districts located within southeastern

Pennsylvania. The districts consisted of 38 elementary school principals (58%), 16

middle school principals (24%), and 12 high school principals (18%). Participants in this


study included only the principals of the individual elementary, middle, and high schools within the participating school districts. Specifically, the total population consisted of 18 male (27%) and 20 female (30%) elementary school principals, 12 male (18%) and 4 female (6%) middle school principals, and lastly 7 male (11%) and 5 female (8%) high school

principals. Table 3.2 illustrates the gender and school level status of the principal

participants.

Table 3.2

Principal Participants by School Level and Gender

School Level Male Principals Female Principals

Elementary School 18 20

Middle School 12 4

High School 7 5

Note. Principal Participants (N=66).

Assistant principals did not participate in the study because not every elementary

or secondary school has an assistant principal, nor are assistant principals’ responsibilities equivalent to those of the school principal. District-level administrators did not

participate in the study because the new statewide Pennsylvania Principal Effectiveness

Evaluation proposed by the Pennsylvania Department of Education was not directed

towards them, nor did the researcher intend to examine the perceptions of evaluators.

Participants were asked to state how many total years of experience they had as a

principal. Participants were given the choice of a range from first year, two to five years,

six to nine years, and 10 or more years.

Participants were asked to indicate the certified-teacher population within their

schools. Participants were given the choice of a range from 25 teachers or less (3%), 26


to 50 teachers (47%), 51 to 100 teachers (41%), and 101 or more teachers (9%). Table

3.3 reflects the distribution of certified-teacher population for principal participants.

Table 3.3

Certified-Teacher Population for Principal Participants

Certified-Teacher Population Principal Participants

25 or Less 2

26-50 31

51-100 27

101+ 6

Note. Principal Participants (N=66).

Participants were asked to specify the enrollment of students within their schools.

Participants were given the choice of a range from 300 students or less (8%), 301 to 500

students (32%), 501 to 1000 students (48%), 1001 or more students (12%). Table 3.4

reflects the enrollment of students for principal participants.

Table 3.4

Enrollment of Students for Principal Participants

Enrollment of Students Principal Participants

300 or Less 5

301-500 21

501-1000 32

1001+ 8

Note. Principal Participants (N=66).


Instruments

This study used an online survey (Appendix D) that consisted of Likert-scale,

forced-choice, open-ended response, and voluntary phone interview questions (Appendix

F) to collect data from participants. The researcher designed the online survey and

interview questions for data collection within this study. A panel of principals and

assistant principals that did not participate in the study reviewed each collection

instrument to ensure comprehensibility, clearness, and eliminate vagueness.

The online survey was used to obtain demographic information from principals

that included participant school level, gender, years of experience as a principal, current

certified-teacher population, and current enrollment of students. The online survey was

designed for qualitative inquiry into the perceptions of principals regarding the

implementation of the new statewide Pennsylvania Principal Effectiveness Evaluation.

Specifically, the research questions within this study focused on the impact to principal

leadership practices, teacher instructional practices, and student achievement scores.

The researcher obtained written permission from the Pennsylvania Department of

Education to adapt and use, in modified format, the specific nomenclature within their

statewide Pennsylvania Principal Effectiveness Evaluation (Appendix B). Dr. David W.

Volkman, an Executive Assistant in Department of Education and Office of Elementary

and Secondary Education, granted permission to the researcher, solely for the purpose of

research related to this dissertation. The researcher affirmed that the utilization of any

modified language within the study, as contained within the statewide Pennsylvania

Principal Effectiveness Evaluation, would be solely used for the purpose of research

within this dissertation and would not be produced for commercial and/or other use.


The online survey was created using Google© Drive, a secure website that

invited participant involvement through an email invitation. It recorded all of the

submitted responses on a spreadsheet that was accessible to only the researcher. The

online survey consisted of five demographic questions (#1-5), six PDE implementation

questions (#6-11), six principal evaluation feedback questions (#12-17), five relationships

within educational practice questions (#18-22), nineteen principal leadership performance

effort questions (#23-41), four principal leadership performance and reflective change

questions (#42-45), and one domain achievability question (#46). A total of 41 Likert-

scale and forced-choice type questions along with 5 open-ended response questions (#10,

11, 19, 20, and 21) comprised the online survey. Google© Drive online survey data were

collected using multiple Likert-scale, forced-choice, and open-ended response type

questions from each participant. No neutral answers were allowed, in order to prompt participants to choose a meaningful response. Likert-scale

responses within this study always included four participant choices such as strongly

agree, agree, disagree, and strongly disagree. Similarly, forced-choice responses within

this study always included two choices such as yes or no. The online survey also

included five questions that required open-ended response, and two questions which

allowed for a textbox response if the participant selected the option of ‘other’. At the end

of the survey the researcher asked for volunteers to participate in either a face-to-face,

videoconference, or phone interview. Participants that wished to volunteer for an

interview were asked to contact the researcher using a separate email address at the

conclusion of the online survey.
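
For readers cross-checking the composition of the instrument, the section counts described above can be tallied with a brief illustrative sketch (not part of the original survey materials); the section labels and counts below are taken directly from the preceding paragraph, and the computation is only a bookkeeping aid.

# Illustrative tally of the survey blueprint described above; the section
# labels and counts are drawn from the text, not from the survey file itself.
survey_sections = {
    "Demographic (1-5)": 5,
    "PDE implementation (6-11)": 6,
    "Principal evaluation feedback (12-17)": 6,
    "Relationships within educational practice (18-22)": 5,
    "Principal leadership performance effort (23-41)": 19,
    "Performance and reflective change (42-45)": 4,
    "Domain achievability (46)": 1,
}
open_ended_items = {10, 11, 19, 20, 21}  # open-ended response questions

total_questions = sum(survey_sections.values())             # 46 questions overall
closed_questions = total_questions - len(open_ended_items)  # 41 Likert-scale and forced-choice items
print(total_questions, closed_questions)                    # prints: 46 41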


The researcher collected data using the online survey to determine the perceptions

of principals towards the implementation of the new statewide Pennsylvania Principal

Effectiveness Evaluation from the Pennsylvania Department of Education and its impact

on principal leadership practices, teacher instructional practices, and student achievement

scores. Face-to-face, videoconference, and phone interviews were only scheduled with

participants that indicated their willingness to participate in the process. The face-to-

face, videoconference, and phone interview process consisted of six questions designed

to triangulate information and allow participants to further express individual perceptions

to the researcher in their own words. Face-to-face, videoconference, and phone interview

questions allowed participants an opportunity to communicate their perceptions in a way

that may not have been available if using the open-ended response sections alone within

the survey.

Reliability and Validity

Reliability and validity in this study occurred through a process of triangulation.

Triangulation was defined as using multiple sources of data to strengthen each single

point of research (Marshall & Rossman, 2011). The researcher utilized data collection

methods that consisted of an online survey using Google© Drive to include Likert-scale

questions, forced-choice questions, open-ended response questions, and a voluntary face-

to-face, videoconference, or phone interview from participants. Triangulation of these

multiple data sources assisted the researcher in acquiring a greater understanding and interpretation of complex phenomena within the study (Denzin & Lincoln, 2011).


Design

The researcher conducted this study using qualitative research methods. Creswell

(2009) defined qualitative research as a method of inquiry to review all data, make sense

of it, and organize it into categories or themes that cut across all sources. Researchers

build patterns from themes based upon their interpretation of what is seen and heard and upon prior understanding. This study involved the researcher being immersed in a setting of situational influences to allow for the study of both participant and researcher viewpoints (Marshall & Rossman, 2011). This qualitative study provided triangulation of

data collection using an online survey consisting of multiple Likert-scale questions,

forced-choice questions, open-ended response questions, and either a face-to-face,

videoconference, or phone interview. Figure 3.1 illustrates the design and framework of

this study to examine the perceptions of principals.

Figure 3.1

Framework and Design of Study on Perceptions of Principals

Procedures

Permission was granted by the Research Ethics Review Board of Immaculata

University (Appendix A) to conduct this study. Before conducting this study, the

researcher obtained official approval from the superintendents of the school districts in which the study would take place. Using the US mail service, the researcher sent letters to the superintendents (Appendix C) of potential districts explaining the data collection procedures and expressing that all collected data would remain confidential and/or anonymous. Only upon receiving written permission back from each superintendent did

the researcher proceed to coordinate efforts towards contacting each participant through

email notification of the study.

Prior to distribution of the online survey to potential participants involved in the

study, a panel of principals and assistant principals not involved in the study reviewed the

online survey. The purpose of this review was to gather feedback about the clarity of the questions and, if necessary, to develop clearer questions. The panel of reviewers was provided the online

survey that contained individual boxes for comments after each question. The boxes for

comments afforded the panel of reviewers an opportunity to share their individual

concerns or recommendations relative to the clarity of each question. Only after the

panel of reviewers completed the task of examining each question was the online survey

ready for distribution to the participants in the study.

After securing the approval of each superintendent to use their district email

accounts, email invitations (Appendix D) were sent to all 66 potential participants that

included a link to the online survey using Google© Drive. Google© Drive is a secure


website with no means to identify individual participants. The email invitations allowed

the researcher to introduce the study and inform the participants of their superintendent’s

permission to participate in this doctoral study. The researcher sent out the online survey and gave participants two weeks to respond. One week from the day the online survey

was emailed using Google© Drive, the researcher sent a second email reminding the

potential participants that only one week remained available to complete the survey.

All potential participants who wanted to participate were required to click on the

link and access the online survey. The online survey was immediately accessible to them

using Google© Drive through their Internet connection. All potential participants that

did not want to participate in the online survey could delete the email or choose not to

respond to the researcher. All potential participants that accidentally clicked into the

online survey were able to exit at any time with their identity remaining anonymous.

Prior to accessing the online survey questions, each potential participant was required to verify consent to participate in the study. Before clicking

the link of the online survey, potential participants were presented with an introduction

from the researcher about the online survey using Google© Drive and a consent

statement explaining the purpose of the study. The consent statement communicated to

all potential participants that participation in this doctoral study guaranteed their

responses would remain confidential and anonymous. Consent verification from each

potential participant was secured when they individually clicked yes to the link following

the online consent statement. After clicking yes to the link, each participant was directed

to the group of questions in the first section of the online survey. If a potential participant

clicked no, they were exited from the online survey.


All online survey responses were collected and recorded using Google© Drive.

The researcher could not identify who participated in the online survey or how individual

participants responded to the questions. Participants who agreed to participate in a face-to-face, videoconference, or phone interview were instructed to contact the researcher using a separate email address to indicate that they wished to volunteer. The face-to-face, videoconference,

or phone interview consisted of six scripted questions. The interview questions were

designed by the researcher and permitted the volunteer participants to expand on their

responses to the online survey questions within the study.

Before the interview each participant was provided with a second consent form

(Appendix E) by email, and informed of the intention by the researcher to audio record

and transcribe all dialogue during the interview for the study. Participants were required

to sign the consent form prior to the start of their interview and either email it as a PDF file

or fax it back to the researcher. The face-to-face, videoconference, and phone interviews

were conducted off school grounds to protect the identity of the participants. All of the

interviews were audio recorded and transcribed by the researcher. No names were used

during the interviews to keep the identity of the interviewees confidential. A copy of the

transcribed interview was given back to the interviewee to ensure accuracy of responses

during the dialogue.

Data Analysis

Demographic questions were analyzed for professional information relative to the

participant’s school level, gender, years of experience as a principal, current certified-

teacher and student enrollment populations within the study. Pennsylvania Department

of Education implementation questions were analyzed to examine an understanding of


the participant’s awareness, fairness, and value attributed towards the new statewide

Pennsylvania Principal Effectiveness Evaluation. Principal evaluation feedback

questions were analyzed to investigate the current methods of evaluative feedback within

districts relative to job performance of the principal. Relationships within educational

practice questions were analyzed to inquire about accountability linking the areas of

principal leadership practice, teacher instructional practice, and student achievement

scores. Principal leadership performance effort questions were analyzed to ascertain the

amount of effort and time necessary to successfully accomplish the performance criteria

items in each of the four domains listed within the new statewide Pennsylvania Principal

Effectiveness Evaluation. Principal leadership performance and reflective change

questions were analyzed to determine if participants would change or adjust their existing

professional practices or behaviors due to perceived implications of the new statewide

Pennsylvania Principal Effectiveness Evaluation. The domain achievability question was

analyzed to identify whether the four domains are realistic in terms of achievability to the

participants within this study. In each of the seven sections of the online survey, the data

generated from participant responses were sorted, coded, and correlated to each question.

The researcher used the data to examine if any trends emerged from the perceptions of

the participants.

Answers to the open-ended responses and textbox responses, if the participants

selected the option of ‘other’, were analyzed for trends. Many participants stated similar

ideas or themes but in different words. The researcher correlated the trends and assigned

a percentage based on the number of participants that responded with a similar idea or

theme compared to the number of participants who answered the question. The responses


to participant interview questions were analyzed to see whether the thoughts and

viewpoints expressed by those voluntary interviewees coincided with the data collected

in the Likert-scale, forced-choice, and open-ended responses from all the participants

within the study.
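
As an illustration of this percentage assignment, and not the researcher's actual procedure, the calculation can be sketched in a few lines; the example theme and counts are taken from the Question 10 results reported later in Table 4.24.

# Sketch of the theme-percentage calculation described above: each theme's
# share is its response count divided by the total responses to the question,
# rounded to the nearest whole percent.
theme_counts = {"Language within the Rubric": 11}  # example theme from Question 10
total_responses = 77                               # total responses to Question 10

for theme, count in theme_counts.items():
    share = round(100 * count / total_responses)   # 11 / 77 rounds to 14%
    print(f"{theme}: {count} responses ({share}%)")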

Summary

This chapter provided an overview of the setting, participants, instruments,

reliability and validity methods, design of the study, procedures to conduct the study

along with collection of data, and data analysis for the study. The researcher developed

an online survey using Google© Drive that consisted of Likert-scale, forced-choice, and

open-ended response questions to collect data from the participants within this qualitative

study. Specifically, the online survey consisted of five demographic questions, six PDE

implementation questions, six principal evaluation feedback questions, five relationships

within educational practice questions, nineteen principal leadership performance effort

questions, four principal leadership performance and reflective change questions, and one

domain achievability question. A face-to-face, videoconference, or phone interview only

included voluntary participants that agreed to participate in the interview process. The

interviews consisted of six scripted questions along with impromptu questions to

triangulate viewpoints against data collected within the online survey. In chapter four,

results from the online survey and interviews are compiled using the data analysis described above for the multiple Likert-scale questions, forced-choice questions, open-ended response questions, and voluntary face-to-face, videoconference, or phone interviews.


Chapter Four – Results

Introduction

The purpose of this study was to examine the perceptions of principals as related

to the proposed 2014 implementation of a statewide Pennsylvania Principal Effectiveness

Evaluation (PPEE) and its effects on principal leadership practices, teacher instructional

practices, and student achievement scores. The data within this chapter were collected using a Google© Drive online survey that included multiple Likert-scale questions, forced-choice questions, open-ended response questions, and voluntary phone interviews. Participants

were required to answer questions one through five which covered demographic

information. The remaining questions within the Google© Drive online survey were left

to the discretion of the participants, which is why the total number of responses varies

from question to question.

Online Survey Participant Demographic Data

Participants in the study were asked to share demographic information on their

building assignments in the position of principal related to school level, gender, total

years of experience being a principal, current certified-teacher population, and current

enrollment of students (see Appendix G for complete data). Of the 66 possible principal

participants, from eleven school districts located within southeastern Pennsylvania, only

25 completed the Google© Drive online survey.

Question one asked participants to characterize their current school level from a set of fixed choices. The responses resulted in participant school levels categorized

into elementary school level primary grade configurations of K-4, K-5, or K-6 (44%);

middle/junior high school level secondary configurations of 6-8 or 7-9 (32%); and high


school level secondary configurations of 6-12, 9-12, or 10-12 (24%). Table 4.1 illustrates

the number of participants by school level.

Table 4.1

Participants by School Level

School Level Principal Participants

Elementary School 11

Middle School 8

High School 6

Note. Principal Participants (N=25).

Question two requested participants to identify their gender as either male (64%) or female (36%). Table 4.2 shows the number of principal participants by gender.

Table 4.2

Participants by Gender

Gender Principal Participants

Male 16

Female 9

Note. Principal Participants (N=25).

Question three solicited participants to distinguish their total years of experience

as a principal. The ranges specified first year (16%); 2 to 5 years (28%); 6 to 9 years

(36%); and 10 or more years (20%). Table 4.3 indicates the number of participants by

total years of experience as a principal.


Table 4.3

Participants by Total Years of Experience as a Principal

Total Years of Experience as a Principal Principal Participants

First Year 4

2-5 7

6-9 9

10+ 5

Note. Principal Participants (N=25).

Question four required participants to indicate the current certified-teacher

population within their school. The ranges allowed for 25 or less teachers (0%); 26 to 50

teachers (52%); 51 to 100 teachers (40%); and 101 or more teachers (8%). Table 4.4

specifies the number of participants by current certified-teacher population.

Table 4.4

Participants by Certified-Teacher Population

Certified-Teacher Population Principal Participants

25 or Less 0

26-50 13

51-100 10

101+ 2

Note. Principal Participants (N=25).

Question five prompted participants to signify the current enrollment of students

within their school. The ranges included 300 or less students (0%); 301 to 500

students (24%); 501 to 1000 students (64%); and 1001 or more students (12%). Table

4.5 represents the number of participants by current enrollment of students.


Table 4.5

Participants by Enrollment of Students

Enrollment of Students Principal Participants

300 or Less 0

301-500 6

501-1000 16

1001+ 3

Note. Principal Participants (N=25).

Implementation Plan of PDE Information Survey Data

Questions six, seven, eight, and nine were divided into forced-choice and Likert-

scale inquiries that focused on participant awareness of the implementation plan by the

Pennsylvania Department of Education for its new statewide Pennsylvania Principal

Effectiveness Evaluation (see Appendix H for complete data). Questions ten and eleven

were open-ended questions (see Appendix N for complete data) and are discussed later.

Question six asked participants if they were aware that the Pennsylvania Department of

Education planned to implement a new statewide Pennsylvania Principal Effectiveness

Evaluation by the year 2014. Of the 25 participants to respond to the question, all 25

(100%) replied yes. Question seven inquired if the participants were involved in any

Pennsylvania Department of Education pilot program related to the new statewide

implementation of the principal effectiveness evaluation for Pennsylvania. Of the 25

participants that responded to the question, 11 (44%) reported yes and 14 (56%) stated

no. When asked in question eight to indicate a level of agreement with the Pennsylvania

Department of Education developing the new statewide evaluation system to be inclusive

for all elementary, middle, and high school principals: 1 (4%) strongly agreed; 20 (80%)


agreed; 3 (12%) disagreed; and 1 (4%) strongly disagreed. Table 4.6 notes the number of

participants by level of agreement with PDE developing a new statewide evaluation

system.

Table 4.6

Implementation Plan of PDE Information Question #8: Indicate your level of agreement with Pennsylvania Department of Education developing a new statewide evaluation system inclusive for all elementary, middle, and high school principals

Level of Agreement for the New PPEE Evaluation System Principal Participants

Strongly Agree 1

Agree 20

Disagree 3

Strongly Disagree 1

Note. Principal Participants (N=25).

Question nine asked participants to share their perception of the new statewide

Pennsylvania Principal Effectiveness Evaluation being a fair rubric tool for job

performance as the principal. Of the 25 participants that responded to the question, 20

(80%) communicated “fair” and 5 (20%) expressed “unfair”. Table 4.7 signifies the

number of participants that perceive the PDE principal effectiveness evaluation as either

a fair or unfair rubric tool for principal job performance.

Table 4.7

Implementation Plan of PDE Information Question #9: Do you perceive the new statewide Pennsylvania Principal Effectiveness Evaluation is a fair rubric tool for your job performance as a principal

Perceptions of New PPEE Rubric as an Evaluation Tool Principal Participants

Very Fair 0

Fair 20

Unfair 5

Very Unfair 0

Note. Principal Participants (N=25).

Principal Evaluation Feedback Information Survey Data

Questions 12, 13, 14, 15, 16, and 17 in the survey explored the format, quality,

frequency, and measurement of evaluation feedback on job performance as the principal

(see Appendix I for complete data). Question 12 asked participants if they received

evaluative feedback on job performance in the position of principal. Of the 25 principal

participants that replied to the question, 24 (96%) listed yes and only 1 (4%) responded

no. When asked in Question 13 to characterize the frequency of their evaluative

feedback on job performance in the position of principal the results were: 15 (60%)

annual; 5 (20%) mid-year; 2 (8%) quarterly; and 3 (12%) other. Table 4.8 denotes the

number of participants and frequency in evaluative feedback as the principal on job

performance.

Table 4.8

Principal Evaluation Feedback Information Question #13: Characterize how often you receive evaluative feedback on job performance as the principal

Current Frequency of Leadership Evaluative Feedback Principal Participants

Annual 15

Mid-Year 5

Quarterly 2

Other 3

None 0

Note. Principal Participants (N=25).

Question 14 asked participants to identify the primary format of their evaluative

feedback that was currently used to rate their job performance as the principal. The 25

participants selected from multiple fixed categories that recorded 12 (30%) self-reflection

or self-rating; 4 (10%) checklist or inventory items; 11 (28%) rubric design or rating

scale; 11 (28%) written narrative or summary; and 2 (5%) other. Table 4.9 catalogs the

number of participants by primary format of their current evaluative feedback as the

principal that is received on job performance.

Table 4.9

Principal Evaluation Feedback Information Question #14: Identify the primary format of evaluative feedback you currently receive on job performance as the principal

Current Format of Leadership Evaluative Feedback Principal Participants

Self-reflection or Self-rating 12

Checklist or Inventory Items 4

Rubric Design or Rating Scale 11

Written Narrative or Summary 11

Other 2

None 0

Note. Principal Participants (N=25). Participants were able to respond within each of the six categories so as not to limit the reporting to only a single format of evaluative feedback, in consideration of various combinations. The total responses from participants were 40.


Question 15 prompted participants to express a level of satisfaction towards the

process of evaluative feedback related to job performance as the principal within their

current school district. The 25 participants revealed the results of 4 (16%) very satisfied;

15 (60%) satisfied; 5 (20%) unsatisfied; and 1 (4%) very unsatisfied. Table 4.10 reflects

the number of participants by level of satisfaction with their current school district

process of evaluative feedback as the principal that is received on job performance.

Table 4.10

Principal Evaluation Feedback Information Question #15: How satisfied are you with the current process of evaluative feedback in your school district related to job performance as the principal

Current Satisfaction with Process of Evaluative Feedback Principal Participants

Very Satisfied 4

Satisfied 15

Unsatisfied 5

Very Unsatisfied 1

Not Applicable 0

Note. Principal Participants (N=25).

Question 16 examined the perception of participants related to evaluation

feedback on job performance as the principal, being an accurate measurement of their

personal efforts to initiate positive change within the three areas of principal leadership

practices, teacher instructional practices, and student achievement scores. When asked

first if evaluation feedback was an accurate measurement of personal efforts to initiate

positive change within the area of principal leadership the participant responses were: 7

(28%) strongly agreed; 15 (60%) agreed; and 3 (12%) disagreed. When asked second if

evaluation feedback was an accurate measurement of personal efforts to initiate positive


change within the area of teacher instructional practices the participant responses were: 7

(28%) strongly agreed; 16 (64%) agreed; and 2 (8%) disagreed. When asked third if

evaluation feedback was an accurate measurement of personal efforts to initiate positive

change within the area of student achievement scores the participant responses were: 6

(24%) strongly agreed; 12 (48%) agreed; and 7 (28%) disagreed. Table 4.11 categorizes

the number of participants by their perceptions that evaluation feedback was an accurate

measurement of personal efforts to initiate positive change in the areas of principal

leadership practices, teacher instructional practices, and student achievement scores.

Table 4.11

Principal Evaluation Feedback Information Question #16: Do you perceive evaluation feedback on job performance, as the principal, to be an accurate measurement of your efforts to initiate positive change within the following areas

Perceptions of Evaluation Feedback as Accurate Measurement of Personal Efforts to Initiate Positive Change: Principal Leadership Practices / Teacher Instructional Practices / Student Achievement Scores

Strongly Agree 7 7 6

Agree 15 16 12

Disagree 3 2 7

Strongly Disagree 0 0 0

Note. Principal Participants (N=25). Participants could select only one response under principal leadership practices, teacher instructional practices, and student achievement scores to represent their level of agreement or disagreement. The total responses from participants were 75.

Question 17 solicited the evaluative feedback status of participants relative to

satisfactory or unsatisfactory on their job performance as the principal during the past

three school years from 2010 through 2013. When asked about their status of evaluative

feedback for the 2010-2011 school year the participants reported 18 (72%) satisfactory


and 7 (28%) not applicable. When asked secondly about their status of evaluative

feedback for the 2011-2012 school year the participants reported 20 (80%) satisfactory

and 5 (20%) not applicable. When asked lastly about their status of evaluative feedback

for the 2012-2013 school year the participants reported 22 (88%) satisfactory and 3

(12%) not applicable. Table 4.12 organizes the number of participants by their evaluative

feedback status of satisfactory or unsatisfactory for school years of 2010 to 2013 based

on job performance as the principal.

Table 4.12

Principal Evaluation Feedback Information Question #17: What was the status of evaluative feedback on your job performance as the principal during the past three school years

Status of Evaluative Feedback 2010-2011 2011-2012 2012-2013

Satisfactory 18 20 22

Unsatisfactory 0 0 0

Not Applicable 7 5 3

Note. Principal Participants (N=25). Participants could select only one response under school year 2010-2011, 2011-2012, and 2012-2013 to represent a status of satisfactory, unsatisfactory, or not applicable. The total responses from participants were 75.

Relationships within Educational Practice Information Survey Data

Questions 18 and 22 were individual Likert-scale and forced-choice inquiries that

focused on participant perceptions relative to the implementation of the new statewide

Pennsylvania Principal Effectiveness Evaluation by the Pennsylvania Department of

Education to positively impact and improve upon educational practices within the three

areas of principal leadership practices, teacher instructional practices, and student

achievement scores (see Appendix J for complete data). Questions 19, 20, and 21 were

open-ended questions (see Appendix N for complete data) and are discussed later.


Question 18 concentrated on participant beliefs that implementation of

the new statewide Pennsylvania Principal Effectiveness Evaluation by the Pennsylvania

Department of Education would impact positive change upon the three specific areas of

principal leadership practices, teacher instructional practices, and student achievement

scores. When asked if the implementation of the new statewide Pennsylvania Principal

Effectiveness Evaluation would impact positive change on the area of principal

leadership the participant responses were: 4 (16%) strongly agreed; 16 (64%) agreed;

and 5 (20%) disagreed. When asked if the implementation of the new statewide

Pennsylvania Principal Effectiveness Evaluation would impact positive change on the

area of teacher instructional practices the participant responses were: 1 (4%) strongly

agreed; 19 (76%) agreed; and 5 (20%) disagreed. When asked last if the implementation

of the new statewide Pennsylvania Principal Effectiveness Evaluation would impact

positive change on the area of student achievement scores the participant responses were:

4 (16%) strongly agreed; 16 (64%) agreed; and 5 (20%) disagreed. Table 4.13 identifies

the number of participants by perception that the new statewide Pennsylvania Principal

Effectiveness Evaluation would impact positive change in the areas of principal

leadership practices, teacher instructional practices, and student achievement scores.

Table 4.13

Relationships within Educational Practice Information Question #18: I believe the Pennsylvania Department of Education is implementing the new statewide Pennsylvania Principal Effectiveness Evaluation to positively impact the following areas

Perceptions on the New Statewide Principal Effectiveness Evaluation to Impact Positive Change: Principal Leadership Practices / Teacher Instructional Practices / Student Achievement Scores

Strongly Agree 4 1 4

Agree 16 19 16

Disagree 5 5 5

Strongly Disagree 0 0 0

Note. Principal Participants (N=25). Participants could select only one response under principal leadership practices, teacher instructional practices, and student achievement scores to represent their level of agreement or disagreement. The total responses from participants were 75.

Question 22 centered on the perception of participants regarding a percentage

value of student performance data that should be used towards accountability in job

performance outcome on the new statewide Pennsylvania Principal Effectiveness

Evaluation system for principals within the three areas of principal leadership practices,

teacher instructional practices, and student achievement scores. When asked for a

specific percentage of student performance data to be used in accountability towards job

performance outcome in the area of principal leadership practices the participant replies

were: 12 (48%) from 0 to 25%; 7 (28%) from 26% to 50%; and 6 (24%) from 51% to

75%. When asked a particular percentage of student performance data to be used in

accountability towards job performance outcome in the area of teacher instructional

practices the participant replies were: 13 (52%) from 0 to 25%; 11 (44%) from 26% to

50%; and 1 (4%) from 51% to 75%. When asked lastly for a percentage of student

performance data to be used in accountability towards job performance outcome in the

area of student achievement scores the participant replies were: 12 (48%) from 0 to 25%;

10 (40%) from 26% to 50%; and 3 (12%) from 51% to 75%. Table 4.14 organizes the

perception of participants by a percentage range of student performance data that should


account towards job performance outcome on the new statewide Pennsylvania Principal

Effectiveness Evaluation system for principals.

Table 4.14

Relationships within Educational Practice Information Question #22: What percentage of student performance data do you perceive should be used to account for your job performance on the new statewide Pennsylvania Principal Effectiveness Evaluation system for principals within the following areas

Perceptions on Student Performance Data to Account Towards Principal Job Performance within New PPEE: Principal Leadership Practices / Teacher Instructional Practices / Student Achievement Scores

0 to 25% of total principal evaluation 12 13 12

26% to 50% of total principal evaluation 7 11 10

51% to 75% of total principal evaluation 6 1 3

76% to 100% of total principal evaluation 0 0 0

Note. Principal Participants (N=25). Participants could select only one response under principal leadership practices, teacher instructional practices, and student achievement scores within a fixed percentage range. The total responses from participants were 75.

Principal Leadership Performance Effort Information Survey Data

Questions 23 through 41 in the survey focused on professional practices of

participants related to involvement in the performance criteria tasks defined within

domains 1a through 1e, domains 2a through 2f, domains 3a through 3e, and domains 4a

through 4c in terms of frequency that ranged from daily, weekly, monthly, yearly, and/or

situational as needed (see Appendix K for complete data). When asked in question 23,

“How often do you involve yourself with the performance criteria from 1a: Creating an

Organizational Vision, Mission, and Strategic Goals of Domain 1”, participants stated: 3

(11%) daily; 10 (36%) weekly; 7 (25%) monthly; 3 (11%) yearly; and 5 (18%) situational

as needed. When asked in question 24, “How often do you involve yourself with the

performance criteria from 1b: Using Data for Informed Decision Making of Domain 1”,


participants responded: 7 (25%) daily; 10 (36%) weekly; 8 (29%) monthly; and 3 (11%)

situational as needed. When asked in question 25, “How often do you involve yourself

with the performance criteria from 1c: Building a Collaborative and Empowering Work

Environment of Domain 1”, participants indicated: 19 (76%) daily; 4 (16%) weekly; and

2 (8%) monthly. When asked in question 26, “How often do you involve yourself with

the performance criteria from 1d: Leading Change Efforts for Continuous Improvements

of Domain 1”, participants expressed: 6 (23%) daily; 12 (46%) weekly; 4 (15%)

monthly; 1 (4%) yearly; and 3 (12%) situational as needed. When asked in question 27,

“How often do you involve yourself with the performance criteria from 1e: Celebrating

Accomplishments and Acknowledging Failures of Domain 1”, participants revealed: 7

(26%) daily; 9 (33%) weekly; 7 (26%) monthly; and 4 (15%) situational as needed.

Table 4.15 illuminates the perceptions of participant involvement effort on criteria in

domain 1a through 1e.

Table 4.15

Principal Leadership Performance Effort Information Questions #23 to #27: How often do you involve yourself with the performance criteria from Domain 1a through 1e

Survey Question Daily Weekly Monthly Yearly Situational as Needed

#23 Criteria 1a 3 10 7 3 5

#24 Criteria 1b 7 10 8 0 3

#25 Criteria 1c 19 4 2 0 0

#26 Criteria 1d 6 12 4 1 3

#27 Criteria 1e 7 9 7 0 4

Note. Principal Participants (N=25). Participants could select responses within each of the five categories of daily, weekly, monthly, yearly, and situational as needed to account for various combinations of time and frequency spent on each domain throughout the school year. The total responses from participants were 134. See Appendix K for the survey questions corresponding to Column 1.

When asked in question 28, “How often do you involve yourself with the

performance criteria from 2a: Leveraging Human and Financial Resources of Domain 2”,

participants specified: 10 (38%) daily; 7 (27%) weekly; 4 (15%) monthly; and 5 (19%)

situational as needed. When asked in question 29, “How often do you involve yourself

with the performance criteria from 2b: Ensuring School Safety of Domain 2”, participants

acknowledged: 20 (74%) daily; 2 (7%) weekly; 1 (4%) monthly; and 4 (15%) situational

as needed. When asked in question 30, “How often do you involve yourself with the

performance criteria from 2c: Complying with Federal, State, and LEA Mandates of

Domain 2”, participants replied: 14 (50%) daily; 5 (18%) weekly; 5 (18%) monthly; and

4 (14%) situational as needed. When asked in question 31, “How often do you involve

yourself with the performance criteria from 2d: Establishing and Implementing

Expectations for Students and Staff of Domain 2”, participants shared: 14 (48%) daily; 8

(28%) weekly; 2 (7%) monthly; 2 (7%) yearly; and 3 (10%) situational as needed. When

asked in question 32, “How often do you involve yourself with the performance criteria

from 2e: Communicating Effectively and Strategically of Domain 2”, participants

affirmed: 16 (50%) daily; 8 (25%) weekly; 4 (13%) monthly; 1 (3%) yearly; and 3 (9%)


situational as needed. When asked in question 33, “How often do you involve yourself

with the performance criteria from 2f: Managing Conflict Constructively of Domain 2”,

participants conveyed: 15 (58%) daily; 3 (12%) weekly; 1 (4%) monthly; and 7 (27%)

situational as needed. Table 4.16 displays the perceptions of participant involvement

effort on criteria in domain 2a through 2f.

Table 4.16

Principal Leadership Performance Effort Information Question #28 to #33: How often do you involve yourself with the performance criteria from Domain 2a through 2f

Survey Question Daily Weekly Monthly Yearly Situational as Needed

#28 Criteria 2a 10 7 4 0 5

#29 Criteria 2b 20 2 1 0 4

#30 Criteria 2c 14 5 5 0 4

#31 Criteria 2d 14 8 2 2 3

#32 Criteria 2e 16 8 4 1 3

#33 Criteria 2f 15 3 1 0 7

Note. Principal Participants (N=25). Participants could select responses within each of the five categories of daily, weekly, monthly, yearly, and situational as needed to account for various combinations of time and frequency spent on each domain throughout the school year. The total responses from participants were 168. See Appendix K for the survey questions corresponding to Column 1.

When asked in question 34, “How often do you involve yourself with the

performance criteria from 3a: Leading School Improvement Initiatives of Domain 3”,

participants indicated: 9 (32%) daily; 10 (36%) weekly; 5 (18%) monthly; 2 (7%) yearly;

and 2 (7%) situational as needed. When asked in question 35, “How often do you

involve yourself with the performance criteria from 3b: Aligning Curricula, Instruction,

and Assessments of Domain 3”, participants revealed: 4 (15%) daily; 10 (37%) weekly;


8 (30%) monthly; 2 (7%) yearly; and 3 (11%) situational as needed. When asked in

question 36, “How often do you involve yourself with the performance criteria from 3c:

Implementing High Quality Instruction of Domain 3”, participants expressed: 14 (48%)

daily; 12 (41%) weekly; 2 (7%) monthly; and 1 (3%) yearly. When asked in question 37,

“How often do you involve yourself with the performance criteria from 3d: Setting High

Expectations for All Students of Domain 3”, participants stated: 16 (57%) daily; 7 (25%)

weekly; 2 (7%) monthly; 1 (4%) yearly; and 2 (7%) situational as needed. When asked

in question 38, “How often do you involve yourself with the performance criteria from

3e: Maximizing Instructional Time of Domain 3”, participants responded: 9 (33%) daily;

11 (41%) weekly; 4 (15%) monthly; 1 (4%) yearly; and 2 (7%) situational as needed.

Table 4.17 represents the perceptions of participant involvement effort on criteria in

domain 3a through 3e.

Table 4.17

Principal Leadership Performance Effort Information Question #34 to #38: How often do you involve yourself with the performance criteria from Domain 3a through 3e

Survey Question Daily Weekly Monthly Yearly Situational as Needed

#34 Criteria 3a 9 10 5 2 2

#35 Criteria 3b 4 10 8 2 3

#36 Criteria 3c 14 12 2 1 0

#37 Criteria 3d 16 7 2 1 2

#38 Criteria 3e 9 11 4 1 2

Note. Principal Participants (N=25). Participants could select responses within each of the five categories of daily, weekly, monthly, yearly, and situational as needed to account for various combinations of time and frequency spent on each domain throughout the school year. The total responses from participants were 139. See Appendix K for the survey questions corresponding to Column 1.


When asked in question 39, “How often do you involve yourself with the

performance criteria from 4a: Maximizing Parent and Community Involvement and

Outreach of Domain 4”, participants shared: 2 (8%) daily; 11 (42%) weekly; 11 (42%)

monthly; 1 (4%) yearly; and 1 (4%) situational as needed. When asked in question 40,

“How often do you involve yourself with the performance criteria from 4b: Showing

Professionalism of Domain 4”, participants affirmed: 22 (85%) daily; 1 (4%) weekly; 2

(8%) monthly; and 1 (4%) situational as needed. When asked in question 41, “How often

do you involve yourself with the performance criteria from 4c: Supporting Professional

Growth of Domain 4”, participants specified: 10 (33%) daily; 7 (23%) weekly; 7 (23%)

monthly; 2 (7%) yearly; and 4 (13%) situational as needed. Table 4.18 illustrates the

perceptions of participant involvement effort on criteria in domain 4a through 4c.

Table 4.18

Principal Leadership Performance Effort Information Question #39 to #41: How often do you involve yourself with the performance criteria from Domain 4a through 4c

Survey Question Daily Weekly Monthly Yearly Situational as Needed

#39 Criteria 4a 2 11 11 1 1

#40 Criteria 4b 22 1 2 0 1

#41 Criteria 4c 10 7 7 2 4

Note. Principal Participants (N=25). Participants could select responses within each of the five categories of daily, weekly, monthly, yearly, and situational as needed to account for various combinations of time and frequency spent on each domain throughout the school year. The total responses from participants were 82. See Appendix K for the survey questions corresponding to Column 1.

Principal Leadership Performance and Reflective Change Information Survey Data

Questions 42 through 45 in the survey examined the perception of participants

towards each performance criteria to effect a level of change within their professional


practices (see Appendix L for complete data) based on domains 1a through 1e, domains

2a through 2f, domains 3a through 3e, and domains 4a through 4c in the new statewide

Pennsylvania Principal Effectiveness Evaluation by the Pennsylvania Department of

Education. In question 42, the principals were asked to rate each item contained in

Domain 1: Strategic Leadership and Cultural Leadership focused on how they perceive

each performance criteria would affect a level of change in their professional practices

based on the new statewide Pennsylvania Principal Effectiveness Evaluation system. For

item 1a: Creating an Organizational Vision, Mission, and Strategic Goals, the participants

indicated: 10 (40%) yes they would alter their principal practice; and 15 (60%) no they

would not alter their principal practice. In item 1b: Using Data for Informed Decision

Making, the participants shared: 10 (40%) yes they would alter their principal practice;

and 15 (60%) no they would not alter their principal practice. Regarding item 1c:

Building a Collaborative and Empowering Work Environment, the participants

expressed: 6 (24%) yes they would alter their principal practice; and 19 (76%) no they

would not alter their principal practice. Relating to item 1d: Leading Change Efforts for

Continuous Improvements, the participants responded: 15 (60%) yes they would alter

their principal practice; and 10 (40%) no they would not alter their principal practice.

Pertaining to item 1e: Celebrating Accomplishments and Acknowledging Failures, the

participants stated: 8 (32%) yes they would alter their principal practice; and 17 (68%)

no they would not alter their principal practice. Table 4.19 characterizes the perceptions

of participants by level of change effect on items in Domain 1: Strategic Leadership and

Cultural Leadership.


Table 4.19

Principal Leadership Performance and Reflective Change Information Question #42: Rate each item contained in Domain 1: Strategic Leadership and Cultural Leadership focused on how you perceive each performance criteria will affect a level of change in your professional practices based on the new statewide Pennsylvania Principal Effectiveness Evaluation system

Survey Question #42 by Criteria: Yes, I would alter my principal practice / No, I would not alter my principal practice

Criteria 1a 10 15

Criteria 1b 10 15

Criteria 1c 6 19

Criteria 1d 15 10

Criteria 1e 8 17

Note. Principal Participants (N=25). Participants could select only one response within each of the five domain criteria to represent either yes would alter or no would not alter. The total responses from participants were 125. See Appendix L for the survey questions corresponding to Column 1.

In question 43, the principals were asked to rate each item contained in Domain 2:

Systems Leadership focused on how they perceive each performance criteria would affect

a level of change in their professional practices based on the new statewide Pennsylvania

Principal Effectiveness Evaluation system. On item 2a: Leveraging Human and Financial

Resources, the participants revealed: 7 (28%) yes they would alter their principal

practice; and 18 (72%) no they would not alter their principal practice. In connection

with item 2b: Ensuring School Safety, the participants acknowledged: 3 (12%) yes they

would alter their principal practice; and 22 (88%) no they would not alter their principal

practice. In respect to item 2c: Complying with Federal, State, and LEA Mandates, the

participants conveyed: 6 (24%) yes they would alter their principal practice; and 19

(76%) no they would not alter their principal practice. In reference to item 2d:


Establishing and Implementing Expectations for Students and Staff, the participants

affirmed: 8 (32%) yes they would alter their principal practice; and 17 (68%) no they

would not alter their principal practice. In connection to item 2e: Communicating

Effectively and Strategically, the participants specified: 10 (40%) yes they would alter

their principal practice; and 15 (60%) no they would not alter their principal practice.

With regard to item 2f: Managing Conflict Constructively, the participants replied: 4

(16%) yes they would alter their principal practice; and 21 (84%) no they would not alter

their principal practice. Table 4.20 lists the perceptions of participants by level of change

effect on items in Domain 2: Systems Leadership.

Table 4.20

Principal Leadership Performance and Reflective Change Information Question #43: Rate each item contained in Domain 2: Systems Leadership focused on how you perceive each performance criteria will affect a level of change in your professional practices based on the new statewide Pennsylvania Principal Effectiveness Evaluation system

Survey Question #43 by Criteria: Yes, I would alter my principal practice / No, I would not alter my principal practice

Criteria 2a 7 18

Criteria 2b 3 22

Criteria 2c 6 19

Criteria 2d 8 17

Criteria 2e 10 15

Criteria 2f 4 21

Note. Principal Participants (N=25). Participants could select only one response within each of the six domain criteria to represent either yes would alter or no would not alter. The total responses from participants were 150. See Appendix L for the survey questions corresponding to Column 1.

In question 44, the principals were asked to rate each item contained in Domain 3:

Leadership for Learning focused on how they perceive each performance criteria


would affect a level of change in their professional practices based on the new statewide

Pennsylvania Principal Effectiveness Evaluation system. In item 3a: Leading School

Improvement Initiatives, the participants expressed: 11 (44%) yes they would alter their

principal practice; and 14 (56%) no they would not alter their principal practice. Relating

to item 3b: Aligning Curricula, Instruction, and Assessments, the participants stated: 13

(52%) yes they would alter their principal practice; and 12 (48%) no they would not alter

their principal practice. Regarding item 3c: Implementing High Quality Instruction, the

participants responded: 11 (44%) yes they would alter their principal practice; and 14

(56%) no they would not alter their principal practice. Pertaining to item 3d: Setting

High Expectations for All Students, the participants indicated: 8 (32%) yes they would

alter their principal practice; and 17 (68%) no they would not alter their principal

practice. For item 3e: Maximizing Instructional Time, the participants shared: 8 (32%)

yes they would alter their principal practice; and 17 (68%) no they would not alter their

principal practice. Table 4.21 records the perceptions of participants by level of change

effect on items in Domain 3: Leadership for Learning.

Table 4.21

Principal Leadership Performance and Reflective Change Information Question #44: Rate each item contained in Domain 3: Leadership for Learning focused on how you perceive each performance criteria will affect a level of change in your professional practices based on the new statewide Pennsylvania Principal Effectiveness Evaluation system

Survey Question #44 by Criteria: Yes, I would alter my principal practice / No, I would not alter my principal practice

Criteria 3a 11 14

Criteria 3b 13 12

Criteria 3c 11 14

Criteria 3d 8 17

Criteria 3e 8 17

Note. Principal Participants (N=25). Participants could select only one response within each of the five domain criteria to represent either yes would alter or no would not alter. The total responses from participants were 125. See Appendix L for the survey questions corresponding to Column 1.

In question 45, the principals were asked to rate each item contained in Domain 4:

Professional and Community Leadership focused on how they perceive each

performance criteria would affect a level of change in their professional practices based

on the new statewide Pennsylvania Principal Effectiveness Evaluation system. In

connection to item 4a: Maximizing Parent and Community Involvement and Outreach,

the participants expressed: 15 (60%) yes they would alter their principal practice; and 10

(40%) no they would not alter their principal practice. In reference to item 4b: Showing

Professionalism, the participants stated: 3 (12%) yes they would alter their principal

practice; and 22 (88%) no they would not alter their principal practice. In respect to item

4c: Supporting Professional Growth, the participants responded: 3 (12%) yes they would

alter their principal practice; and 22 (88%) no they would not alter their principal

practice. Table 4.22 describes the perceptions of participants by level of change effect on

items in Domain 4: Professional and Community Leadership.


Table 4.22

Principal Leadership Performance and Reflective Change Information Question #45: Rate each item contained in Domain 4: Professional and Community Leadership focused on how you perceive each performance criteria will affect a level of change in your professional practices based on the new statewide Pennsylvania Principal Effectiveness Evaluation system

Survey Question #45 by Criteria: Yes, I would alter my principal practice / No, I would not alter my principal practice

Criteria 4a 15 10

Criteria 4b 3 22

Criteria 4c 3 22

Note. Principal Participants (N=25). Participants could select only one response within each of the three domain criteria to represent either yes would alter or no would not alter. The total responses from participants were 75. See Appendix L for the survey questions corresponding to Column 1.

Domain Achievability Information Survey Data

Question 46 in the survey concentrated on the perception of participants towards

each of the four domains on the new statewide Pennsylvania Principal Effectiveness

Evaluation by the Pennsylvania Department of Education as being achievable or unachievable for the principal (see Appendix M for complete data). On items within

Domain 1: Strategic/Cultural Leadership the participants acknowledged: 24 (96%)

achievable and 1 (4%) unachievable. With regard to items within Domain 2: Systems

Leadership the participants replied: 23 (92%) achievable and 2 (8%) unachievable. In

connection to items within Domain 3: Leadership for Learning the participants stated: 23

(92%) achievable and 2 (8%) unachievable. Pertaining to items within Domain 4:

Professional and Community Leadership the participants conveyed a perfect 25 (100%)

achievable. Table 4.23 distinguishes the perceptions of participants by achievability of


domains within the new statewide Pennsylvania Principal Effectiveness Evaluation

system for principals from the Pennsylvania Department of Education.

Table 4.23

Domain Achievability Information Question #46: Indicate which Domains on the new statewide Pennsylvania Principal Effectiveness Evaluation system you perceive as achievable or unachievable as the principal

Perceptions of Achievability in PPEE Domains Achievable Unachievable

Domain 1: Strategic/Cultural Leadership 24 1

Domain 2: Systems Leadership 23 2

Domain 3: Leadership for Learning 23 2

Domain 4: Professional and Community Leadership 25 0

Note. Principal Participants (N=25). Participants could select only one response within each of the four domains to represent either achievable or unachievable. The total responses from participants were 100.

Open-ended Response Questions

Distributed among the multiple Likert-scale and forced-choice questions, the

Google© Drive online survey included five open-ended questions for the principals.

These questions were designed to allow each principal to provide information, in their

own words, as related to their perceptions about the new statewide Pennsylvania

Principal Effectiveness Evaluation within the three areas of principal leadership practices,

teacher instructional practices, and student achievement scores. The following data

represents their responses to each of the five questions (see Appendix N for complete

data).

First Open-ended Question

Question 10, the first open-ended inquiry within the online survey, asked “What

do you perceive will be the most difficult aspect in the implementation process of the

new statewide Pennsylvania Principal Effectiveness Evaluation for your district?” There

were 77 responses to this question, all pertaining to principal perceptions regarding the implementation of PDE information. When principals used different words to express

similar ideas, these responses were accounted for under each general theme. Table 4.24

illustrates the general themes that emerged from the responses.

Table 4.24

Open-ended Question #10: What do you perceive will be the most difficult aspect in the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation for your district

General Themes from Question #10    Number of Responses    Percentage of Responses

Language within the Rubric    11    14%

Evidence & Documentation for Domain Items    9    12%

Consistency in All PA Districts    8    10%

Linking All Data Sources    8    10%

Necessary Resources & Other Unforeseen Factors    8    10%

Amount of Variables in Principal Duties & Responsibilities    7    9%

Communication of Information, Expectations & Training    6    8%

Not Doing the Same Old Thing Under a New Name    6    8%

Tracking Participant Information    6    8%

Roll-Out Logistics vs. Other PDE Initiatives in Progress    4    5%

Timely Results of PSSA Data    3    4%

Public Input & Involvement    1    1%

Note. Principal Participants (N=25). Percentages reflect 77 total responses from the participants. Participants could state more than one theme.
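To make the arithmetic behind these theme tables explicit: each reported percentage is simply the theme's response count divided by the total number of coded responses for that question (for Question 10, 11 of 77 responses is approximately 14%). The short Python sketch below reproduces that calculation for a subset of the Table 4.24 themes; it is offered only as an illustration of the rounding reflected in the tables, and the tallying routine is an assumption for illustration, not the researcher's actual analysis procedure.

    # Illustrative sketch only: theme counts and the 77-response total come from
    # Table 4.24; this is not the researcher's actual analysis procedure.
    theme_counts = {
        "Language within the Rubric": 11,
        "Evidence & Documentation for Domain Items": 9,
        "Consistency in All PA Districts": 8,
        "Timely Results of PSSA Data": 3,
        "Public Input & Involvement": 1,
    }  # subset of the twelve themes, shown for brevity

    total_responses = 77  # total coded responses to open-ended Question 10

    for theme, count in theme_counts.items():
        percentage = round(100 * count / total_responses)  # e.g., 11/77 rounds to 14%
        print(f"{theme}: {count} ({percentage}%)")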

Second Open-ended Question

Question 11, the second open-ended inquiry within the online survey, asked

“What do you perceive will be the most seamless aspect in the implementation process of

the new statewide Pennsylvania Principal Effectiveness Evaluation for your district?”

There were 44 responses to this question, all pertaining to principal perceptions regarding the implementation of PDE information. The general themes to emerge from these responses

are outlined in Table 4.25. Similar ideas expressed by principals using different

terminology were accounted for under each general theme.

Table 4.25

Open-ended Question #11: What do you perceive will be the most seamless aspect in the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation for your district

General Themes from Question #11    Number of Responses    Percentage of Responses

Not Sure    6    14%

PDE Communication & Information    6    14%

What Principals Already Are Doing    5    11%

Positive Outlook for Leading Change Efforts    5    11%

Positive Outlook for New PPEE & Positive Improvement    5    11%

Danielson Rubric Already in Use    4    9%

Ownership & Responsibility for New PPEE    4    9%

Dedication & Time to New PPEE    3    7%

PDE Mandated for All PA Districts    2    5%

Support from Administration at the District Level    2    5%

Currently Participating in PDE Pilot    1    2%

Size of the School District    1    2%

Note. Principal Participants (N=25). Percentages reflect 44 total responses from the participants. Participants could state more than one theme.

Third Open-ended Question

Question 19, the third open-ended inquiry within the online survey, asked “Do

you perceive Principal Leadership Practices will improve due to the implementation

process of the new statewide Pennsylvania Principal Effectiveness Evaluation within

your district? In what way?” There were 49 responses to this question that pertained to

principal perceptions of relationships within educational practice. The general themes to

emerge from these responses are outlined in Table 4.26. Similar ideas expressed by

principals using different terminology were accounted for under each general theme.

Table 4.26

Open-ended Question #19: Do you perceive Principal Leadership Practices will improve due to the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation within your district

General Themes from Question #19    Number of Responses    Percentage of Responses

Yes, Clearer Expectations within a Common Framework    11    22%

Yes, More Accountability    6    12%

No, the PPEE Process will not Improve Leadership Practices    6    12%

Yes, If Implemented with Fidelity & Consistency    5    10%

Yes, More Attention on Tasks & Responsibilities    5    10%

Yes, More Focus on Strengths & Weaknesses    5    10%

Yes, More Reflective Process    4    8%

Yes, More Formal Communication Process    3    6%

Yes, Collaboration on Best Practice    2    4%

Yes, Feedback will Increase Positive Performance    1    2%

Not Sure    1    2%

Note. Principal Participants (N=25). Percentages reflect 49 total responses from the participants. Participants could state more than one theme.

Fourth Open-ended Question

Question 20, the fourth open-ended inquiry within the online survey, asked “Do

you perceive Teacher Instructional Practices will improve due to the implementation

process of the new statewide Pennsylvania Principal Effectiveness Evaluation within

your district? In what way?” There were 46 responses to this question that pertained to

principal perceptions of relationships within educational practice. The general themes to

emerge from these responses are outlined in Table 4.27. Similar ideas expressed by

principals using different terminology were accounted for under each general theme.

Table 4.27

Open-ended Question #20: Do you perceive Teacher Instructional Practices will improve due to the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation within your district

General Themes from Question #20    Number of Responses    Percentage of Responses

Yes, More Attention to Support Teacher & Instruction    8    17%

No, the PPEE Process will not Improve Teacher Practices    8    17%

Yes, More Accountability on Supervision of Teachers & Instruction    7    15%

Not Sure    6    13%

Yes, Clearer Expectations within Rubric for Leadership Linked with Teacher & Instruction    5    11%

Yes, More Focus on Constructive Communication with Teacher    4    9%

Yes, More Attention on PSSA Data & Test Scores    4    9%

Yes, If Implemented with Fidelity & Consistency    4    9%

Note. Principal Participants (N=25). Percentages reflect 46 total responses from the participants. Participants could state more than one theme.

Fifth Open-ended Question

Question 21, the fifth open-ended inquiry within the online survey, asked “Do you

perceive Student Achievement Scores will improve due to the implementation process of

the new statewide Pennsylvania Principal Effectiveness Evaluation within your district?

In what way?” There were 44 responses to this question that pertained to principal

perceptions of relationships within educational practice. The general themes to emerge

from these responses are outlined in Table 4.28. Similar ideas expressed by principals

using different terminology were accounted for under each general theme.

Table 4.28

Open-ended Question #21: Do you perceive Student Achievement Scores will improve due to the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation within your district

General Themes from Question #21    Number of Responses    Percentage of Responses

Yes, If Implemented with Fidelity & Consistency    11    25%

No, the PPEE Process will not Improve Student Achievement    11    25%

Yes, Clear Expectations in Rubric to Link Principal Accountability with Teacher Instruction & Student Growth    8    18%

Yes, More Attention on PSSA Data & Test Scores    8    18%

Not Sure    6    14%

Note. Principal Participants (N=25). Percentages reflect 44 total responses from the participants. Participants could state more than one theme.

Principal Interview Responses

Principal volunteers consisted of 1 high school, 1 middle school, and 3 elementary school principals, who were offered the option of a face-to-face, videoconference, or

phone interview. All chose to participate in a phone interview regarding their perceptions

of the proposed 2014 implementation of a statewide Pennsylvania Principal Effectiveness

Evaluation (PPEE) and its effect upon leadership practices, instructional practices, and

student achievement scores. The following represents the responses from each principal

to the questions posed to them during the phone interview (see Appendix O for complete

data).

Question One: “Why WILL the implementation of a statewide Pennsylvania

Principal Effectiveness Evaluation impact principal leadership?”

• Principal P-1 stated, “It is holding us accountable for accurately reporting

evaluations of our own staff members. For our teachers and the phenomenon of

grade inflation, and not accurately reporting student progress, this will hold

principals more accountable to accurately evaluate teachers.”

• Principal P-2 responded, “I really think it will show principals a better option and

will give them more structure in how they can better fine tune what they are doing

within their buildings to instruct the leadership of their students and teachers that

are in their buildings. It kind of gives us a better guide or perimeter to work off.”

• Principal P-3 said, “In having reviewed the rubric and looking over it and now

having become part of a pilot to implement it this school year, the rubric will hold

us accountable for the domains that the state is saying, “This is what it means to

be a principal and to be effective you need to do.” My biggest concern is the

documentation of evidence for smart goals along with all the other things that we

often do as administrators. Just trying to keep up with that, maintain it, make sure

we are documenting good evidence, and we are providing what it may be the state

is looking for, as some of us have never done the pilot before, aren’t sure exactly

what they want to see.”

• Principal P-4 expressed, “I think it will definitely have an impact on principal

leadership because now leadership will be measured in some form, whereas

before it was very subjective how principals would demonstrate their leadership

district to district. I think the evaluation rubric gives clear definition to what

leadership should look like in every building.”

• Principal P-5 shared, “It is going to impact the roles of the principal in terms of

what we are looking for, what we are measuring for teachers, and it gives us the

rubric really to know what we should be looking for and working off, what is

expected from principals as well as what is expected for teachers. So it will give

common language and expectations across the board.”

Question Two: “Why will the implementation of a statewide Pennsylvania

Principal Effectiveness Evaluation NOT impact principal leadership practices?”

• Principal P-1 replied, “If principal leadership practices are already in check and in

place then it wouldn’t be impacted. Those who would be impacted are the ones

that aren’t necessarily following good best practices to begin with!”

• Principal P-2 said, “I think we still have the exact same people doing some of the

exact same things and until we are better trained on how to implement it, I think

we are still going to get some of the same results because we are not forced to

make some of those changes and the same people are doing the same jobs. People

don’t want to have to do the extra things that they don’t want to do.”

• Principal P-3 responded, “I think the only way a statewide rubric doesn’t impact

you is if you just ignore that it exists because the rubric has very delineated

guidelines to be proficient, here is what you must do, and it is pretty specifically

lined out. And then for the distinguished, the word and you must do these things

to be distinguished, to either live there, or even breathe there for the moment

because nobody technically resides there forever. So, to not do it and to not be an

effective principal, I would say you are just violating that entire rubric. I am not

sure how you can’t do it unless you ignore that it exists, or if they suddenly say

we are not using it anymore.”

• Principal P-4 shared, “I think it all depends on how people really take it to heart,

the principals, and then the administration supervising principals. Will people

take it seriously? Will they follow through on the components of it? How will

people document? I think it will all come down to how everybody approaches the

rubric.”

• Principal P-5 stated, “Well it will give us the common language in terms of

expectations, but it will also then be a matter of principals gathering evidence to

support those categories so the challenge I believe is going to be from those

supervising principals to make sure there is consistency within a district. And I

believe that will be difficult to do.”

Question Three: “Why WILL the implementation of a statewide Pennsylvania

Principal Effectiveness Evaluation impact teacher instructional practices?”

• Principal P-1 said, “It will impact teacher instructional practices because with

good leadership, me being held more accountable, I am then holding teachers

more accountable for what is happening in their classrooms, which would be in

effect, better instructional practices.”

• Principal P-2 replied, “I think because principals are now graded on the same

principles. So those principals have to look at what areas are they not making

their growth in because their outcome, or their evaluation are based on their

teachers scores, so those principals are going to look into those teachers and be

less adaptive to letting things slide. They are going to say, “look you need to do

better so we all do better” and it impacts the principal in that way.”

• Principal P-3 shared, “When you look at the domains that we as a principal need

to be responding to and working towards, and especially if you are writing smart

goals, so many of them are focused on instruction, and if you look at the domains

and what it says to be proficient you have to show evidence of, or you have to be

able to document what you are doing, between meetings and instructional

facilitation. I look just solely at data and data collection, and data meetings, and

student growth, and what I am going to have to say to the state here is what I did,

and here is why I am proficient. I think that is going to dramatically change how

some principals operate because some are great managers but not instructional

leaders. You might be a great instructional leader but don’t know how to manage.

Now, you have to be the whole package, or the state is going to say you are not

proficient. I think that is going to dramatically change how teachers are impacted,

because now I am more accountable, so I am going to be on them more about

instruction, and instructional goals. If my smart goals tied into my instruction, I

am going to expect them to know it, and for them to help me drive that goal

home.”

• Principal P-4 stated, “I think everything filters down from the principal through

the building. The principal obviously is the leader of the building and the one that

should really shape the educational climate and culture of the building. I think the

impact on teacher instruction and everything that is happening in the classrooms

comes from what the principal does. So by putting the rubric into play for the

principals, it filters down through to the teachers, and into the classrooms.”

• Principal P-5 responded, “Well with the new principal evaluation now focused on

principal accountability for student data, which in turn will come back to teachers

practice, it will be a way that I believe principals will be looking to hold teachers

accountable for the instructional practices they are delivering within the

classrooms.”

Question Four: “Why will the implementation of a statewide Pennsylvania

Principal Effectiveness Evaluation NOT impact teacher instructional practices?”

• Principal P-1 replied, “Those who are already doing a good job at their

instructional leadership, and pushing, and questioning teachers, and have teachers

question themselves, and on the path of continued professional development of

doing better, and always looking to move forward, those people are not

necessarily going to be impacted because they are already following those

guidelines. Obviously, you have those that are in the trenches of some pretty dire

districts that have bad and poor scores, but at the same time being able to improve

with their improvement plans and move forward, and continuing to better

themselves, are not going to be affected.”

• Principal P-2 shared, “I think because, for principals, they inherently believe that

teachers are going to do well. And the majority of those teachers will still come

back as being proficient. We will find the good in anyone. So even though they

may be lacking in certain areas, we focus on the good because that’s the nature of

our positions, and that’s the nature, usually of our demeanors. We try and find the

good and don’t always look and try to solve the problem areas.”

• Principal P-3 stated, “I would say it wouldn’t impact it, if you as an administrator

are looking at your goals, and think there is nothing there your staff can do to help

you. But, if your whole building is a team and 15 % of their evaluation is

building data, and 15% of my evaluation is how my building performs on the

PSSA to impact my instructional goals then I just don’t see how it does not.”

• Principal P-4 responded, “It goes back to how seriously will people take it. How

much will they follow the various components, how much will the evaluations be

implemented with fidelity across the district. If the districts are not necessarily

following through with how it is supposed to be documented, how you are

supposed to be showing your evidence, and it’s kind of all done in a rush at the

end of the year, it is not going to have as much of an impact in the classroom as it

could.”

• Principal P-5 said, “It is going to be dependent upon the principals’ work that they

do in leading schools and in leading teachers. How that plays out is going to be

dependent on what a principal is looking for and the consistency a principal has in

evaluating teachers.”

Question Five: “Why WILL the implementation of a statewide Pennsylvania

Principal Effectiveness Evaluation impact student achievement scores?”

• Principal P-1 replied, “Like the trickle affect, holding everybody more

accountable, ultimately will improve achievement scores. Having an effective

principal, who is holding teachers accountable for their own effectiveness,

ultimately will have a positive effect on student performance.”

• Principal P-2 shared, “Because I believe that principals will start going after those

problematic areas. And in specific cases will say, “Look, we are lacking in this

area. Therefore we are going to increase the instructional practices of those kids,

or for those teachers that will impact those kids, so those kids get a better

education.” Because the principals are being graded on that same program, and

you are not making headway in certain areas, therefore the trickle down effect

will impact the teachers and will impact the kids.”

• Principal P-3 stated, “It goes right back to that instructional goals and data. Data

collection! How are you helping students to grow, are you making your expected

growth under the new school improvement systems, and I know we don’t have

AYP, but those new systems are saying you must show growth in all of your

subgroups if you have them. So it is going to have a significant impact just

because we are now holding teachers accountable. They are going to have

individual data. You know they are being accountable to each other with 15% of

that pie, coming eventually, three years down the road. I think there is a lot more

accountability with special education, with the whole group, that we are all in this

together. Specialists, you can’t ignore it either because some of the pie is yours

too. I think student achievement is going to be significantly impacted because

now everybody feels a little more accountable to each other, and accountable to

what the students are going to do, because it effects their evaluation.”

• Principal P-4 responded, “It will impact on student achievement scores because it

defines what a principal should do in terms of data. It talks about how the

principal is the leader of change. How the principal can build an environment

where all the constituents are looking at the data and seeing how they can improve

student achievement. Everything that principals do connects ultimately back to

student achievement. It gives it some definition, gives it some clarity, and gives it

some purpose with it all relating back to student achievement.”

• Principal P-5 said, “The rubric and what we are being held accountable to, has

that piece, of student data that is involved in it. It should work from the principal,

to the teacher, to the instructional practices that ultimately impacts student

outcomes, and the additional piece of the SLOs that teachers will be writing and

therefore monitoring, to also in turn impact student achievement.”

Question Six: “Why will the implementation of a statewide Pennsylvania

Principal Effectiveness Evaluation NOT impact student achievement scores?”

• Principal P-1 shared, “I don’t think that that’s a possibility. I think that by

holding the accountability measure up for everybody it is going to ultimately be

better for kids. Holding principals more accountable, to be able to hold teachers

more accountable, for teachers to hold students more accountable, is going to

improve their performances.”

• Principal P-2 stated, “I believe that when principals are looking at these areas they

are going to focus on certain areas that they feel they can make headway. And

certain areas where they don’t feel like they can get the ground made up, they are

going to push aside and say they are going to focus on a certain area where they

can make the most headway, and therefore not concentrate in other areas.

Therefore, kids will be missing instruction because of priorities not being put into

all areas. But, instead they will be looking on where can I get my score up higher

and focused in on that only.”

• Principal P-3 responded, “I think some of the categories don’t directly relate to

student achievement. Some of the categories are more managerial in nature. I

don’t expect my supervisor to observe me doing things with student achievement.

But, with the heightened focus on teacher observation I know a supervisor is

coming in and following me for a day or two, and watching what I do. They now

have to be more aware of how I run my building and what I am doing, and

looking at my observations, and holding me accountable too. The only way it

doesn’t is if you look at the areas where it doesn’t directly attribute back to

student achievement.”

• Principal P-4 said, “It depends on how differently districts implement it. It all

goes back to if people are implementing it with fidelity. If you are using it,

looking at the results, looking at areas and saying, “Wow! This is an area I need to

focus on and improve upon.” If I am not doing as well in that area or if I’m not

doing enough in that area, because everything in the rubric does ultimately, I

believe, connect back to student achievement. You must look at each component

and see if there are areas to improve. If you are not doing that, you are not using

it as a point of self-reflection, then you are not going to have the impact that you

could as a result of the rubric implementation.”

• Principal P-5 replied, “It is really dependent on what happens with the principal.

There are some pieces where the students change from year to year in terms of

their ability level. Of course, the one piece that is also being looked at in this

whole evaluation system is students’ achievement on standardized tests. My hope

is that the PVAAS is another piece of data to look at, but it depends on really

what measures we are looking at, to see if there is going to be any impact on

student achievement.”

Summary

The purpose of the study was to ascertain principal perceptions related to the

proposed 2014 implementation of a new statewide Pennsylvania Principal Effectiveness

Evaluation (PPEE) and its effects upon leadership practices, instructional practices, and

student achievement scores. This chapter included an analysis and review of the data

collected using a Google© Drive online survey for each multiple Likert-scale questions,

forced-choice questions, open-ended response questions, and the voluntary interview

responses. A summary of the study and its results, as well as, how these results related to

other research is discussed in chapter five.

Chapter Five: Discussion

Summary of the Study

The purpose of this study was to investigate the impact and implementation of a

new statewide Pennsylvania Principal Effectiveness Evaluation as perceived solely by

principals. This study was conducted online with 11 school districts located throughout

Southeastern Pennsylvania. The researcher designed a Google© Drive survey that

collected data from 25 participants in this qualitative study using Likert-scale, forced-

choice, and open-ended questions. Only five participants volunteered to participate further after the survey in a phone interview that consisted of six scripted questions.

Demographic information from the principals that participated within this study was closely balanced between the elementary 11 (44%) and secondary 14 (56%) school levels (Table 4.1); between principals with fewer than five years of experience 11 (44%) and more than five years 14 (56%) (Table 4.3); and, lastly, between schools with fewer than fifty certified teachers 13 (52%) and more than fifty 12 (48%) (Table 4.4). Only two demographic responses were uneven: female 9 (36%) and male 16 (64%) principals by gender (Table 4.2), and schools with fewer than five hundred enrolled students 6 (24%) and more than five hundred 19 (76%) (Table 4.5).

All 25 principals were aware of the Pennsylvania Department of Education’s plan

to implement a statewide principal effectiveness evaluation within 2014. Less than half

of the principals 11 (44%) in this study participated in a pilot program by PDE focused

on the new statewide PPEE. A total of 21 (84%) principals indicated agreement with

PDE to develop a statewide evaluation system (Table 4.6) and 20 (80%) of the principals

perceived the PPEE as a fair rubric tool to rate their job performance (Table 4.7). The

principals perceived that the most difficult aspects of implementation for the new

statewide PPEE (Table 4.24) would be the language within the rubric 11 (14%); evidence

and documentation for domain items 9 (12%); consistency within all Pennsylvania

districts 8 (10%); linking all data sources 8 (10%); necessary resources and other

unforeseen factors 8 (10%); amount of variables in principal duties and responsibilities 7

(9%); communication of information, expectations, and training 6 (8%); not doing the

same old thing under a new name 6 (8%); tracking participant information 6 (8%); roll-

out logistics vs. other PDE initiatives in progress 4 (5%); timely results of PSSA data 3

(4%); and lastly public input and involvement 1 (1%). On the contrary, the principals

perceived the most seamless aspects of implementation for the new statewide PPEE

(Table 4.25) would be the PDE’s communication of information 6 (14%); what principals

are already doing 5 (11%); positive outlook for leading change efforts 5 (11%); positive

outlook for new PPEE and positive improvement 5 (11%); Danielson rubric already in

use 4 (9%); ownership and responsibility for new PPEE 4 (9%); dedication and time to

new PPEE 3 (7%); PDE mandated for all PA districts 2 (5%); support from

administration at the district level 2 (5%); currently participating in PDE pilot 1 (2%);

and lastly size of the school district 1 (2%). A total of 6 (14%) principals responded as

not sure relative to any perception of most seamless aspects for implementation of the

new statewide PPEE.

Almost every principal 24 (96%) indicated they received feedback on their job

performance through annual 15 (60%); mid-year 5 (20%); quarterly 2

(8%); or other 3 (12%) evaluations (Table 4.8). In addition, the most frequent formats of

evaluative feedback received by these principals (Table 4.9) were self-reflection or self-rating 12 (30%); rubric design or rating scale 11 (28%); written narrative or summary 11 (28%); checklist or inventory items 4 (10%); and lastly other 2 (5%). Most principals

expressed satisfaction 19 (76%) with their district’s current process of evaluation and

feedback (Table 4.10). A majority of the principals communicated that their evaluation feedback was an accurate measurement of efforts to initiate positive change (Table 4.11) in principal leadership practices 22 (88%), teacher instructional practices 23 (92%), and student achievement scores 18 (72%). All 25 principals received satisfactory evaluation feedback from 2010 through 2013, and new principals similarly received satisfactory

evaluation feedback upon entering the position (Table 4.12).

Of the principals in this study, 20 (80%) believed PDE has implemented the new

statewide PPEE to positively impact (Table 4.13) leadership practices, instructional

practices, and achievement scores. Specifically, the principals perceived that principal

leadership practices will improve as a result of the implementation of the new statewide

PPEE (Table 4.26) based on clearer expectations within a common framework 11 (22%);

more accountability 6 (12%); if implemented with fidelity and consistency 5 (10%); more

attention on tasks and responsibilities 5 (10%); more focus on strengths and weaknesses 5

(10%); more reflective process 4 (8%); more formal communication process 3 (6%);

collaboration on best practice 2 (4%); and lastly feedback will increase positive

performance 1 (2%). A total of 6 (12%) principals perceived the new PPEE process will

not improve principal leadership practices and 1 (2%) was not sure relative to any

improvement in principal leadership practices.

Similarly, the principals perceived that teacher instructional practices will

improve as a result of the implementation of the new statewide PPEE (Table 4.27) based

on more attention to support teacher and instruction 8 (17%); more accountability on

supervision of teachers and instruction 7 (15%); clearer expectations within rubric for

leadership linked with teacher and instruction 5 (11%); more focus on constructive

communication with teachers 4 (9%); more attention on PSSA data and test scores 4

(9%); and lastly if implemented with fidelity and consistency 4 (9%). A total of 8 (17%)

principals perceived the new PPEE process will not improve teacher instructional

practices and 6 (13%) were not sure relative to any improvement in teacher instructional

practices.

Likewise, the principals perceived that student achievement scores will improve

as a result of the implementation of the new statewide PPEE (Table 4.28) based on if

implemented with fidelity and consistency 11 (25%); clear expectations in rubric to link

principal accountability with teacher instruction and student growth 8 (18%); and lastly

more attention on PSSA data and test scores 8 (18%). A total of 11 (25%) principals

perceived the new PPEE process will not improve student achievement scores and 6

(14%) were not sure relative to any improvement in student achievement scores.

In addition, the principals perceived that only a minimal percentage (Table 4.14), 0 to 50%, of student performance data should count toward their own job performance related to evaluative feedback in principal leadership practices 19

(76%), teacher instructional practices 24 (96%), and student achievement scores 22

(88%) on the new statewide PPEE.

The principals in this study involved themselves within the professional practices

of Domain 1: Strategic/Cultural Leadership (Table 4.15) of the new PPEE most often in

criteria 1a: Creating an Organizational Vision, Mission, and Strategic Goals via weekly

10 (36%) involvement; criteria 1b: Using Data for Informed Decision Making via weekly

10 (36%) involvement; criteria 1c: Building a Collaborative and Empowering Work

Environment via daily 19 (76%) involvement; criteria 1d: Leading Change Efforts for

Continuous Improvements via weekly 12 (46%) involvement; and lastly criteria 1e:

Celebrating Accomplishments and Acknowledging Failures via weekly 9 (33%)

involvement.

Similarly, the principals involved themselves within the professional practices of

Domain 2: Systems Leadership (Table 4.16) of the new PPEE most often in criteria 2a:

Leveraging Human and Financial Resources via daily 10 (38%) involvement; criteria 2b:

Ensuring School Safety via daily 20 (74%) involvement; criteria 2c: Complying with

Federal, State, and LEA Mandates via daily 14 (50%) involvement; criteria 2d:

Establishing and Implementing Expectations for Students and Staff via daily 14 (48%)

involvement; criteria 2e: Communicating Effectively and Strategically via daily 16 (50%)

involvement; and lastly criteria 2f: Managing Conflict Constructively via daily 15 (58%)

involvement.

Likewise, the principals involved themselves within the professional practices of

Domain 3: Leadership for Learning (Table 4.17) of the new PPEE most often in criteria

3a: Leading School Improvement Initiatives via weekly 10 (36%) involvement; criteria

3b: Aligning Curricula, Instruction, and Assessments via weekly 10 (37%) involvement;

criteria 3c: Implementing High Quality Instruction via daily 14 (48%) involvement;

criteria 3d: Setting High Expectations for All Students via daily 16 (57%) involvement;

and lastly criteria 3e: Maximizing Instructional Time via weekly 11 (41%) involvement.

In the same way, the principals involved themselves within the professional

practices of Domain 4: Professional and Community Leadership (Table 4.18) of the new

PPEE most often in criteria 4a: Maximizing Parent and Community Involvement and

Outreach, split evenly between weekly 11 (42%) and monthly 11 (42%) involvement;

criteria 4b: Showing professionalism via daily 22 (85%) involvement; and lastly criteria

4c: Supporting Professional Growth via daily 10 (33%) involvement. In summary, the

principals involved themselves within the professional practices of all four domains on

the new PPEE most often in 11 criteria via daily and 7 criteria via weekly involvement,

while one criterion was evenly divided between weekly and monthly involvement.

Principals in this study did not perceive the need to alter principal practices that

affect any future level of changes toward their performance (Table 4.19) in Domain 1:

Strategic/Cultural Leadership for criteria 1a: Creating an Organizational Vision, Mission,

and Strategic Goals with 15 (60%); criteria 1b: Using Data for Informed Decision

Making with 15 (60%); criteria 1c: Building a Collaborative and Empowering Work

Environment with 19 (76%); and lastly criteria 1e: Celebrating Accomplishments and

Acknowledging Failures 17 (68%). On the contrary, the principals did perceive the need

to alter their principal practices that affect future level changes toward performance

(Table 4.19) in Domain 1: Strategic/Cultural Leadership of criteria 1d: Leading Change

Efforts for Continuous Improvements with 15 (60%).

Similarly, principals in this study did not perceive the need to alter principal

practices that affect any future level of changes toward their performance (Table 4.20) in

Domain 2: Systems Leadership for criteria 2a: Leveraging Human and Financial

Resources with 18 (72%); criteria 2b: Ensuring School Safety with 22 (88%); criteria 2c:

Complying with Federal, State, and LEA Mandates with 19 (76%); criteria 2d:

Establishing and Implementing Expectations for Students and Staff with 17 (68%);

criteria 2e: Communicating Effectively and Strategically with 15 (60%); and lastly

criteria 2f: Managing Conflict Constructively with 21 (84%).

Likewise, principals in this study did not perceive the need to alter principal

practices that affect any future level of changes toward their performance (Table 4.21) in

Domain 3: Leadership for Learning for criteria 3a: Leading School Improvement

Initiatives with 14 (56%); criteria 3c: Implementing High Quality Instruction with 14

(56%); criteria 3d: Setting High Expectations for All Students with 17 (68%); and lastly

criteria 3e: Maximizing Instructional Time with 17 (68%). On the contrary, the

principals did perceive the need to alter their principal practices that affect future level

changes toward performance (Table 4.21) in Domain 3: Leadership for Learning of

criteria 3b: Aligning Curricula, Instruction, and Assessments with 13 (52%).

In the same way, the principals in this study did not perceive the need to alter

principal practices that affect any future level of changes toward their performance (Table

4.22) in Domain 4: Professional and Community Leadership for criteria 4b: Showing

professionalism with 22 (88%); and lastly criteria 4c: Supporting Professional Growth

with 22 (88%). On the contrary, the same principals did perceive the need to alter their

principal practices that affect future level changes toward performance (Table 4.22) in

Domain 4: Professional and Community Leadership of criteria 4a: Maximizing Parent

and Community Involvement and Outreach with 15 (60%). In summary, the principals

within this study perceived the need to alter their principal practices in 3 out of the 19

criteria of all four domains on the new PPEE.

The principals in this study perceived the new statewide PPEE as achievable

within Domain 1: Strategic/Cultural Leadership via 24 (96%); Domain 2: Systems

Leadership via 23 (92%); Domain 3: Leadership for Learning via 23 (92%), and lastly

Domain 4: Professional and Community Leadership via 25 (100%).

Summary of the Results

Research Question 1. What are the perceptions of principals regarding the

implementation of a statewide Pennsylvania Principal Effectiveness Evaluation and its

impact on principal leadership practices? Survey data reported that 21 (84%) of the principals

indicated their agreement with PDE in the development of a statewide evaluation system

(Table 4.6). Additionally, 20 (80%) of the principals perceived the new PPEE as a fair

rubric tool to rate their job performance (Table 4.7). Of the 25 principals in this study, 20

(80%) believed the implementation of the new statewide PPEE would positively impact

principal leadership practices (Table 4.13). Specifically, the principals perceived that

principal leadership practices would improve as a result of the implementation of the new

statewide PPEE (Table 4.26) based on clearer expectations within a common framework

11 (22%); more accountability 6 (12%); if implemented with fidelity and consistency 5

(10%); more attention on tasks and responsibilities 5 (10%); more focus on strengths and

weaknesses 5 (10%); more reflective process 4 (8%); more formal communication

process 3 (6%); collaboration on best practice 2 (4%); and lastly feedback will increase

positive performance 1 (2%). Only 6 (12%) principals perceived the new PPEE process

would not improve principal leadership practices and 1 (2%) was not sure relative to any

improvement in principal leadership practices.

Additionally, 19 (76%) of the principals perceived that a minimal percentage (Table 4.14), 0 to 50%, of student performance data should count toward their own job performance related to evaluation outcome and feedback within principal leadership practices, compared to 6 (24%) who selected 51 to 100% of student performance data.

The principals perceived themselves already involved with professional practices

of all four domains on the new PPEE in 11 criteria via daily and 7 criteria via weekly,

while one criterion was evenly divided between weekly and monthly involvement. Also,

the principals perceived the need to alter their principal practices in only 3 out of the 19

criteria of all four domains on the new PPEE. In general, the principals perceived the

new statewide PPEE is achievable (Table 4.23) within Domain 1: Strategic/Cultural

Leadership via 24 (96%); Domain 2: Systems Leadership via 23 (92%); Domain 3:

Leadership for Learning via 23 (92%), and lastly Domain 4: Professional and Community

Leadership via 25 (100%).

Interview data from the principals (Appendix O) supported the premise that the new statewide

PPEE would impact principal leadership practices through increased accountability and

holding principals more accountable in terms of “what we are looking for”, “what we are

measuring”, and “what is expected from principals as well as teachers”. In addition, the

interview data proposed that principal leadership practices will now be measured using an

evaluation rubric that provides a set of common language and clearer definitions of “what it should look like in every building” across the state of Pennsylvania. Furthermore,

principal interview data suggested that the PPEE would not impact principal leadership

practices if individuals do not take a serious approach towards the evaluation rubric, and

those responsible for the supervision of principals are inconsistent with their application

of the new statewide PPEE evaluation process.

Research Question 2. What are the perceptions of principals regarding the

implementation of a statewide Pennsylvania Principal Effectiveness Evaluation and its

impact on teacher instructional practices? Survey data reported that the principals perceived

that teacher instructional practices would improve as a result of the implementation of the

new statewide PPEE (Table 4.27) based on more attention to support teacher and

instruction 8 (17%); more accountability on supervision of teachers and instruction 7

(15%); clearer expectations within rubric for leadership linked with teacher and

instruction 5 (11%); more focus on constructive communication with teachers 4 (9%);

more attention on PSSA data and test scores 4 (9%); and lastly if implemented with

fidelity and consistency 4 (9%). Only 8 (17%) principals perceived the new PPEE

process would not improve teacher instructional practices and 6 (13%) were not sure

relative to any improvement in teacher instructional practices.

Additionally, 24 (96%) of the principals perceived that a minimal percentage (Table 4.14), 0 to 50%, of student performance data should count toward their own job performance related to evaluation outcome and feedback within teacher instructional practices, compared to 1 (4%) who selected 51 to 100% of student performance data.

Interview data from the principals (Appendix O) supported the premise that the new statewide

PPEE would impact teacher instructional practices through holding teachers more

accountable for student data and what is happening within their classrooms, which can

now negatively filter down to affect principal evaluation. In addition, the principal

interview data suggested that the new statewide PPEE would not impact teacher instructional

practices if a principal had already established a culture around continued professional

development and communicated with staff about improvement on best practices. Also,

the interview data proposed that principals tend to focus on the good in teachers because

it is the nature of their position. Furthermore, the interview data advised that principals

must show consistency “on what they are looking for” in their own application of the

teacher evaluation process, or the new statewide PPEE will not positively impact

classroom instruction practices.

Research Question 3. What are the perceptions of principals regarding the

implementation of a statewide Pennsylvania Principal Effectiveness Evaluation and its

impact on student achievement scores? Survey data reported that the principals perceived that

student achievement scores would improve as a result of the implementation of the new

statewide PPEE (Table 4.28) based on if implemented with fidelity and consistency 11

(25%); clear expectations in rubric to link principal accountability with teacher

instruction and student growth 8 (18%); and lastly more attention on PSSA data and test

scores 8 (18%). Only 11 (25%) principals perceived the new PPEE process would not

improve student achievement scores and 6 (14%) were not sure relative to any

improvement in student achievement scores.

Additionally, 22 (88%) of the principals perceived that a minimal percentage (Table 4.14), 0 to 50%, of student performance data should count toward their own job performance related to evaluation outcome and feedback within student achievement

scores, compared to 3 (12%) who selected 51 to 100% of student performance data.

Interview data from the principals (Appendix O) supported the premise that the new statewide

PPEE would impact student achievement scores through a trickle down effect, to hold

teachers accountable for their 15% of student growth, while generating a positive effect

on student performance scores. In addition, the interview data suggested that everybody feels

a little more accountable to each other, specifically for student achievement, because it

affects everyone’s evaluation. Furthermore, the interview data suggested that it would be nearly impossible for the new statewide PPEE not to impact student achievement scores, due to the design of the evaluation rubric and its systemic connection to student achievement. In

general, the principal interview data summarized the premise of holding principals more

accountable, to be able to hold teachers more accountable, and teachers then holding their

students more accountable, to improve achievement scores.

Limitations of the Study

A limitation of the study was the procurement of permission from school districts within Pennsylvania to conduct the research. The Research Ethics Review Board (RERB) at Immaculata University approved the addition of further school districts to the study

after only five participants responded to the online survey and none agreed to volunteer

for an interview. A total of 11 school districts finally granted permission for their

principals to participate in the study, which resulted in 25 principals completing the

online survey and five agreeing to be interviewed over the phone. It was not possible to determine whether all 11 school districts were represented in the study.

All five principals who agreed to be interviewed chose to participate only in phone interviews. Four of the five principals who agreed to a phone interview were

from the same school district.

The sample size of only 25 principals must be considered when applying any results from this study to larger populations; therefore, the results are not generalizable.

Relationship to Other Research

Condon and Clifford (2010), Goldring et al. (2009), and Portin, Feldman, and Knapp (2006) examined and questioned the consistency, fairness, effectiveness, and value of current principal evaluation practices. The results of this study aligned with these researchers in revealing that 80% of the principals perceived the PPEE as a fair rubric tool to rate their job performance, based on an evaluation design intended to provide consistency, fairness, and value for leadership practices. Brown-Sims (2010), Condon and Clifford (2010), and Portin et al. (2006) further addressed seven categories that guide effective principal evaluation:

(1) what is the purpose of the evaluation; (2) what is assessed or measured; (3) what are

the sources of evidence; (4) who is assessed; (5) who provides feedback; (6) when does

assessment occur and how is assessment conducted; and (7) what are the psychometric

qualities of the assessment. This study found that principals communicated that evaluation

feedback was an accurate measurement of their principal leadership practices (88%),

teacher instructional practices (92%), and student achievement scores (72%).

Soehner and Ryan (2011) discussed that the role of principals as instructional

leaders is to actively support the quality of teachers within each classroom. This was

evident in the results of the study, as over 80% of the principals perceived that the new

statewide PPEE would impact leadership practices, instructional practices, and achievement scores alike. Horng, Kalogrides, and Loeb (2009) added that effective principals must

work to improve the conditions of classroom instruction and the school culture that

permits their teachers to impact student learning. It would be beneficial for veteran principals to participate in the current Principal Inspired

Leadership (PIL) courses that new principals are required to complete over three years

through the Pennsylvania Department of Education. The PIL induction program would

support professional development for principals on the changing roles and current issues

related to instructional leadership.

Clifford, Hansen, and Wraight (2012) explained the goals of principal evaluation

in terms of formative and summative assessments that provide opportunities for future

performance and improvement within the evaluation process. This was apparent within

the study as 96% of principals indicated they received evaluative feedback on their job

performance through 60% annual, 20% mid-year, and 8% quarterly evaluations.

Reeves (2009) recommended that effective evaluation systems contain clear

definitions along with a detailed rubric related to performance levels for measuring

aspects of principal leadership standards. This was noticeable within the study because

28% of principals reported that their primary format of evaluative feedback had been a rubric

design. Additionally, clear definitions within the rubric help principals to personalize

their perceptions of whether a need exists to alter principal practices relative to criteria

within specific PPEE domains.

Recommendations for Further Research

The primary focus of this study was to determine the perceptions of principals

regarding the implementation of a new statewide Pennsylvania Principal Effectiveness

Evaluation and its impact on principal leadership practices, teacher instructional

practices, and student achievement scores. The research relied upon triangulated data

drawn from principals within 11 school districts via an online Google© Drive survey

using Likert-scale, forced-choice, open-ended questions and phone interviews. The

researcher’s recommendations for further studies on this topic are as follows:

1. Replicate this study with a greater number of principals in Pennsylvania to

compare the results with a larger sample.

2. Replicate this study with assistant principals in Pennsylvania to compare the

results against the principal sample.

3. Examine why over 80% of principals perceived the new statewide PPEE as a

fair rubric tool to rate their job performance.

4. Study if the language within the evaluation rubric tool of the new statewide

PPEE is problematic for principals and their evaluators.

5. Investigate how difficult principals find the process of collecting evidence and

documentation for each domain item within the new statewide PPEE.

6. Explore any trends of the new statewide PPEE to continue with rating all

existing principals and those entering the position of principal as satisfactory.

7. Calculate the frequency of time spent by principals within each domain of the

new statewide PPEE.

8. Determine why principals perceived altering their practices to effect a future

level of change in criteria 1d: Leading Change Efforts for Continuous

Improvements; 3b: Aligning Curricula, Instruction, and Assessments; and 4a:

Maximizing Parent and Community Involvement and Outreach on the new

statewide PPEE.

9. Explain why the majority of principals perceived all four domains of the new

statewide PPEE as extremely achievable.

10. Authenticate the psychometric properties of the new statewide PPEE related

to validity, reliability, feasibility, utility, and fairness.

Conclusion

The purpose of this qualitative study was to investigate the implementation and

impact of a new statewide Pennsylvania Principal Effectiveness Evaluation (PPEE) as

perceived solely by principals. This study found that the new statewide PPEE would

impact principal leadership practices if individuals take a serious approach to the

evaluation rubric that provided principals and evaluators with clear definitions for

accountability and measurements relative to differentiation within job performance

levels. Secondly, this study found that the new statewide PPEE would impact teacher

instructional practices because principals would hold teachers more accountable for

classroom learning and student performance, which has filtered down to affect their own

evaluations. Lastly, this study found that the new statewide PPEE would impact student

achievement scores due to a trickle down effect on principals holding teachers

accountable for student growth, and teachers then holding students more accountable for

improved achievement scores.

References

American Recovery and Reinvestment Act of 2009, Pub. L. No. 111-5, 123 Stat. 115

(2009).

Andrews, R. L., & Soder, R. (1987). Principal instructional leadership and school

achievement. Educational Leadership, 44(6), 9–11.

Anthes, K. (2005). Leader standards. Denver, CO: Education Commission of the States.

Retrieved from http://www.ecs.org/clearinghouse/58/19/5819.doc

Augustine, C. H., Gonzalez, G., Ikemoto, G. S., Russell, J., Zellman, G. L., Constant, L.,

et al. (2009). Improving school leadership: The promise of cohesive leadership

systems. Santa Monica, CA: Rand.

Ballard, K., & Bates, A. (2008). Making a connection between student achievement,

teacher accountability, and quality classroom instruction. The Qualitative Report,

13(4), 560–580.

Bass, B. M. (1998). Transformational leadership: Industrial, military, and educational

impact. Mahwah, NJ: Lawrence Erlbaum Associates.

Berman, P., & McLaughlin, M. W. (1976). Implementation of educational innovation.

The Educational Forum, 40, 345–370.

Beteille, T., Kalogrides, D., & Loeb, S. (2009). Effective schools: Managing the recruitment, development, and retention of high-quality teachers (CALDER

Working Paper 37). Washington, D.C.: Urban Institute. Retrieved from

www.caldercenter.org


Branch, G. F., Hanushek, E. A., & Rivkin, S. G. (2009). Estimating principal

effectiveness (CALDER Working Paper 32). Washington, DC: The Urban

Institute. Retrieved from www.caldercenter.org

Brewer, D. J. (1993). Principals and student outcomes: Evidence from U.S. high schools.

Economics of Education Review, 12(4), 281–92.

Brown-Sims, M. (2010). Evaluating school principals. Tips & Tools. Washington, DC:

National Comprehensive Center for Teacher Quality.

Burns, J. M. (1978). Leadership. New York: Harper & Row Publishers.

Catano, N., & Stronge, J. H. (2007). What do we expect of school principals? Congruence between principal evaluation and performance standards. International Journal of Leadership in Education, 10(4), 379–399.

Cheng, Y. C. (1991). Leadership style of principals and organizational process in

secondary schools. Journal of Educational Administration, 29(2), 25–37.

Clifford, M., Hansen, U. J., & Wraight, S. (2012). A practical guide to designing

comprehensive principal evaluation systems: A tool to assist in the development

of principal evaluation systems. Washington, D.C.: National Comprehensive

Center for Teacher Quality. Retrieved from http://www.tqsource.org

Clifford, M., & Ross, S. (2011). Designing principal evaluation systems: Research to

guide decision-making. Washington, DC: National Association for Elementary

School Principals.

Colorado Department of Education. (2011). Users guide: Colorado model evaluation

system for principals and assistant principals. Retrieved from


http://www.cde.state.co.us/EducatorEffectiveness/downloads/Evaluating%20Prin

cipals/December%205%20Draft%20User%27s%20Guide%20Principal_Assistant

%20Principal_2011.pdf

Condon, C., & Clifford, M. (2010). Measuring principal performance: How rigorous are

commonly used principal performance assessment instruments? Naperville, IL:

Learning Point Associates.

Council of Chief State School Officers (CCSSO). (2008). Educational leadership policy

standards: ISLLC 2008. Washington, D.C.: Council of Chief State School

Officers. Retrieved from www.ccsso.org

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods

approaches. Los Angeles, CA: Sage Publications.

Danielson, C. (2011). Danielson framework for teaching evaluation instrument.

Alexandria, VA: Association for Supervision and Curriculum Development.

Retrieved from http://www.danielsongroup.org

Danielson, C. (2007). Enhancing professional practice: A framework for teaching.

Alexandria, VA: Association for Supervision and Curriculum Development.

Danielson, C. (1996). Enhancing professional practice: A framework for teaching.

Alexandria, VA: Association for Supervision and Curriculum Development.

Davis, S., Darling-Hammond, L., LaPointe, M., & Meyerson, D. (2005). School

leadership study: Developing successful principals (Review of Research).

Stanford, CA: Stanford University, Stanford Educational Leadership Institute.


Davis, S., Kearney, K., Sanders, N., Thomas, C., & Leon, R. (2011). The policies and

practices of principal evaluation: A review of the literature. San Francisco:

WestEd.

DeNisi, A. S., & Kluger, A. N. (2000). Feedback effectiveness: Can 360-degree

appraisals be improved? Academy of Management Executive, 14(1),

129–139.

Denzin, N. K., & Lincoln, Y. S. (2011). The SAGE handbook of qualitative research.

Thousand Oaks, CA: Sage Publications.

Fenton, B., Kelemen, M., Norskog, A., Robinson, D., Schnur, J., Simmons, M., Taliaferro, L., & Walker, R. K. (2010). Evaluating principals: Balancing accountability with professional growth. New York, NY: New Leaders for New Schools. Retrieved from www.nlns.org

Friedman, I. (2002). Burnout in school principals: Role related antecedents. Social

Psychology of Education, 5(3), 229–251. Retrieved from

http://www.springerlink.com/content/abww0kemqeu4tafl/fulltext.pdf

Fullan, M. (2001). The new meaning of educational change (3rd ed.). New York and

London: Teachers College Press.

Gallagher, M. (2012). How principals support teacher effectiveness. Leadership, 41(3),

32-37.

Gardner, J. (2000). The nature of leadership. In The Jossey-Bass reader on educational leadership. San Francisco, CA: Jossey-Bass.

Gastil, J. (1994). A definition and illustration of democratic leadership. Human Relations,

47(8), 953–975.


Gaziel, H. (2007). Re-examining the relationship between principal’s instructional/

educational leadership and student achievement. Journal of Social Science, 15(1),

17–24.

Glasman, N. (1984). Student achievement and the school principal. Educational

Evaluation and Policy Analysis, 6(3), 283–296.

Goe, L., Bell, C., & Little, O. (2008). Approaches to evaluating teacher effectiveness: A

research synthesis. Washington, D.C.: National Comprehensive Center for

Teacher Quality.

Goldring, E., Cravens, X., Murphy, J., Porter, A., Elliott, S., & Carson, B. (2009). The

evaluation of principals: What and how do states and urban districts assess

leadership? Elementary School Journal, 110(1), 19–39.

Goldring, E. B., & Pasternak, R. (1994). Principals’ coordinating strategies and school

effectiveness. School Effectiveness and School Improvement, 5(3), 239–253.

Goldring, E., Porter, A. C., Murphy, J., Elliot, S. N., & Cravens, X. (2007). Assessing

learner-centered leadership: Connections to research, professional standards and

current practices. Nashville, TN: Vanderbilt University.

Hale, E. L., & Moorman, H. N. (2003). Preparing school principals: A national

perspective on policy and program innovations. Washington, DC: Institute for

Educational Leadership, and Edwardsville, IL: Illinois Education Research

Council.

Hallinger, P., & Heck, R. H. (1998). Exploring the principal’s contribution to school

effectiveness: 1980–1995. School Effectiveness and School Improvement, 9(2),

157–191.


Heck, R. H., & Marcoulides, G. A. (1996). Principal assessment: Conceptual problem,

methodological problem, or both? Peabody Journal of Education, 68(1), 124–

144.

Herman, J. L., Heritage, M., & Goldschmidt, P. (2011). Developing and selecting

assessments for student growth for use in teacher evaluation systems. Los

Angeles, CA: University of California, Assessment and Accountability

Comprehensive Center. Retrieved from

http://datause.cse.ucla.edu/DOCS/DSA_long_v6[1].pdf

Holdheide, L., Goe, L., Croft, A., & Reschly, D. (2010). Challenges in evaluating special

education teachers and English language learner specialists. Washington, D.C.:

National Comprehensive Center for Teacher Quality. Retrieved from

http://www.tqsource.org/publications/July2010Brief.pdf

Horng, E. L., Kalogrides, D., & Loeb, S. (2009). Principal preferences and the unequal

distribution of principals across schools (CALDER Working Paper 36).

Washington, D.C.: The Urban Institute. Retrieved from www.caldercenter.org

Illinois Department of Education. (2012). Education reform in Illinois: Non-regulatory

guidance on the Performance Evaluation Reform Act and Senate Bill 7. Retrieved

from http://www.isbe.net/PERA/pdf/pera_guidance.pdf

Joint Committee on Standards for Educational Evaluation. (2010). Personnel evaluation

standards. Iowa City, IA: Author. Retrieved from

http://www.jcsee.org/personnel-evaluation-standards

Kaplan, L., Owings, W., & Nunnery, J. (2005). Principal quality: A Virginia study

connecting interstate school leaders licensure consortium standards with student


achievement. NASSP Bulletin, 89(643), 28–44.

Kimball, S. M., & Milanowski, A. T. (2009). Assessing the promise of standards-based performance evaluation for principals: Results from a randomized trial. Leadership and Policy in Schools, 8(3), 233–263.

Klinker, J. (2006). Qualities of democracy: Links to democratic leadership. Journal of

Thought, 41(2), 51–63.

Knapp, M. S., Copland, M. A., Plecki, M. L., & Portin, B. S. (2006). Leading, Learning

and Leadership Support. Seattle, WA: Center for the Study of Teaching and

Policy, University of Washington.

Lane, S., & Horner, C. (2011). Pennsylvania teacher and principal evaluation pilot final

report. Harrisburg, PA: Commonwealth of Pennsylvania.

Leithwood, K. (1994). Leadership for school restructuring. Educational Administration

Quarterly, 30(4), 498–518.

Leithwood, K., & Jantzi, D. (2008). Linking leadership to student learning: The

contributions of leader efficacy. Educational Administration Quarterly, 44(4), 496–528.

Leithwood, K., Jantzi, D., Silins, H., & Dart, B. (1993). Using the appraisal of school

leaders as an instrument for school restructuring. Peabody Journal of Education,

68(2), 85–109.

Leithwood, K., Louis, K., Anderson, S., & Wahlstrom, K. (2004). How leadership

influences student learning (Review of Research). New York: The Wallace

Foundation. Retrieved from http://www.wallacefoundation.org/knowledge-


center/school-leadership/key-research/Documents/How-Leadership-Influences-

Student-Learning.pdf

Leithwood, K., & Riehl, C. (2005). What we know about successful school leadership. In

W. Firestone & C. Riehl (Eds.), A New Agenda: Directions for Research on

Educational Leadership (pp. 22-47). New York: Teachers College Press.

Levine, A. (2005). Educating school leaders. New York: The Education Schools Project.

Retrieved from http://www.edschools.org/pdf/Final313.pdf

Marshall, C., & Rossman, G. B. (2011). Designing qualitative research. Thousand Oaks,

CA: Sage Publications.

McGuigan, L., & Hoy, W. (2006). Principal leadership: Creating a culture of academic

optimism to improve achievement for all students. Leadership and Policy in

Schools, 5(1), 203–229.

Murphy, J., & Datnow, A. (2003). Leadership lessons from comprehensive school

reform. San Francisco: Corwin Press.

Nettles, S., & Herrington, C. (2007). Revisiting the importance of the direct effects of

school leadership on student achievement: The implications for school

improvement policy. Peabody Journal of Education, 82(4), 724–736.

No Child Left Behind Act of 2001, Pub. L. No. 107-110, 115 Stat. 1425 (2002).

Retrieved from http://www.ed.gov/policy/elsec/leg/esea02/index.html

O’Donnell, R., & White, G. (2005). Within the accountability era: Principals’

instructional leadership behaviors and student achievement. NASSP Bulletin,

89(645), 56–71.


Orr, M. (2011). Evaluating leadership preparation program outcomes: USDOE school

leadership programs. Presented at U.S. Department of Education School

Leadership Program Communication Hub Working Conference “Learning and

Leading: Preparing and Supporting School Leaders,” Virginia Beach, VA.

Pennsylvania Bulletin. (2013). Rules and regulations: Title 22 Pennsylvania Department of Education (22 Pa. Code Chapter 19), Educator Effectiveness Rating Tool for Classroom Teachers. Doc. No. 13-1115, 43(25), 3337–3360.

Pennsylvania Department of Education. (2014). Educator effectiveness project.

Harrisburg, PA: Commonwealth of Pennsylvania.

Pennsylvania Department of Education. (2013a). Statement on Danielson Framework.

Harrisburg, PA: Commonwealth of Pennsylvania.

Pennsylvania Department of Education. (2013b). Classroom teacher approved practice

models for 2013-14. Harrisburg, PA: Commonwealth of Pennsylvania.

Pennsylvania Department of Education. (2013c). Principal and Teacher Effectiveness

Frameworks: How Are They Connected? Harrisburg, PA: Commonwealth of

Pennsylvania.

Pennsylvania Department of Education. (2012a). Rationale for principal effectiveness.

Harrisburg, PA: Commonwealth of Pennsylvania.

Pennsylvania Department of Education. (2012b). Measuring principal effectiveness.

Harrisburg, PA: Commonwealth of Pennsylvania.

Pennsylvania Office of the Governor. (2011). Governor Corbett Outlines Agenda for

Education Reform. Retrieved from http://www.governor.state.pa.us


Porter, A. C., Goldring, E., Murphy, J., Elliot, S. N., & Cravens, X. (2006). A framework

for the assessment of learning-centered leadership. Nashville, TN: Vanderbilt

University.

Portin, B. S., Feldman, S., & Knapp, M. S. (2006). Purposes, uses, and practices of

leadership assessment in education. New York: The Wallace Foundation.

Reeves, D.B. (2009). Assessing educational leaders: Evaluating performance for

improved individual and organizational results. Thousand Oaks, CA: Corwin

Press.

Rice, J. K. (2010). Principal effectiveness and leadership in an era of accountability:

What research says. Washington, D.C.: National Center for Analysis of

Longitudinal Data in Education Research (CALDER). Retrieved from

www.caldercenter.org

Robinson, V. M., Lloyd, C. A., & Rowe, K. J. (2008). The impact of leadership on school

outcomes: An analysis of the differential effects of leadership types. Educational

Administration Quarterly, 44(5), 635–674.

Ross, J., & Gray, P. (2006). School leadership and student achievement: The mediating

effects of teacher beliefs. Canadian Journal of Education, 29(3), 798–822.

Sanders, N., & Kearney, K. (2011). A brief overview of principal evaluation literature:

Implications for selecting evaluation models. San Francisco, CA: California

Comprehensive Assistance Center, WestEd.

Sanders, N.M., & Kearney, K.M. (2008). Performance expectations and indicators for

education leaders. Washington, D.C.: Council of Chief State School Officers.

Retrieved from www.ccsso.org


Seashore Louis, K., Leithwood, K., Wahlstrom, K., & Anderson, S. (2010). Investigating

the links to improved student learning. Washington, DC: Wallace Foundation.

Sebring, P. B., Allensworth, E., Bryk, A. S., Easton, J. Q., & Luppescu, S. (2006). The

essential supports for school improvement. Chicago, IL: Consortium on Chicago School Research at the University of Chicago.

Secretary’s Priorities for Discretionary Grant Programs, 75 Fed. Reg. 47,288 (2010).

Retrieved from http://www.gpo.gov/fdsys/pkg/FR-2010-08-05/pdf/2010-

19296.pdf

Sergiovanni, T. (2005). The principalship: A reflective practice perspective. Boston, MA:

Pearson.

Soehner, D., & Ryan, T. (2011). The interdependence of principal school leadership and

student achievement. Scholar-Practitioner Quarterly, 5(3), 274-288.

Spillane, J., Halverson, R., & Diamond, J. (2004). Towards a theory of school leadership

practice: Implications of a distributed perspective. Journal of Curriculum Studies,

36(1), 3–34.

Steiner, L., Hassel, E., & Hassel, B. (2008). School turnaround leaders: Competencies

for success. Chicago: The Chicago Public Education Fund. Retrieved from

http://www.publicimpact.com/competencies-high-performers

Stewart, J. (2006). Transformational leadership: An evolving concept examined through

the works of Burns, Bass, Avolio, and Leithwood. Canadian Journal of

Educational Administration and Policy, 54(1), 1–29.


Supovitz, J., & Poglinco, S. (2001). Instructional leadership in a standards-based reform.

Philadelphia: Consortium for Policy Research in Education. Retrieved from

http://www.cpre.org/images/stories/cpre_pdfs/AC-02.pdf

Tennessee Department of Education. (2011). Teacher and principal evaluation policy:

Final reading Item: IV. C. Nashville, TN: Author.

Thomas, D., Holdaway, E., & Ward, K. (2000). Policies and practices involved in the

evaluation of school principals. Journal of Personnel Evaluation in Education,

14(3), 215–240.

U.S. Department of Education. (2011). Elementary and Secondary Education Act

(ESEA). Washington, DC: Author. Retrieved from

http://www.ed.gov/esea/flexibility/documents/esea-flexibility.doc

U.S. Department of Education. (2010). Race to the Top application for initial funding, CFDA Number: 84.395A.

Vanderhaar, J., Munoz, M., & Rodosky, R. (2006). Leadership as accountability for

learning: The effects of school poverty, teacher experience, previous achievement,

and principal preparation programs on student achievement. Journal of Personnel

Evaluation in Education, 19(1–2), 17–33.

Vitcov, B., & Bloom, G. (2011). Managing principals. American School Board Journal,

198(2), 26-28.

Wahlstrom, K., & Louis, K. (2008). How teachers experience principal leadership: The

roles of professional community, trust, efficacy, and shared responsibility.

Educational Administration Quarterly, 44(4), 458–495.


The Wallace Foundation (2009). Assessing the effectiveness of school leaders: New

directions and new processes. New York, NY: Author. Retrieved from

http://www.wallacefoundation.org/KnowledgeCenter/KnowledgeTopics/CurrentA

reasofFocus/EducationLeadership/Documents/Assessing-the-Effectiveness-of-

School-Leaders.pdf

The Wallace Foundation. (2006). A Wallace Perspective: Leadership for learning:

Making the connections among state district and school policies and practices.

New York: Author. Retrieved from http://www.wallacefoundation.org/

knowledge-center/school-leadership/district-policy-and-

practice/Documents/Wallace- Perspective-Leadership-for-Learning.pdf

Waters, T., Marzano, R., & McNulty, B. (2003). Balanced leadership: What 30 years of

research tells us about the effect of leadership on student achievement. Denver,

CO: Mid-continent Research for Education and Learning. Retrieved from

http://www.mcrel.org/PDF/LeadershipOrganizationDevelopment/5031RR_

BalancedLeadership.pdf

Watts, M. J., Campell, H. E., Gau, H., Jacobs, E., Rex, T., & Hess, R. K. (2006). Why

some schools with Latino children beat the odds and others don't. Tempe, AZ:

Arizona State University.

Wechsler, M. E., & Shields, P. M. (2008). Teaching quality in California: A new

perspective to guide policy. Santa Cruz, CA: Center for the Future of Teaching

and Learning.


Williams, T., Kirst, M., Haertel, E., et al. (2005). Similar students, different results:

Why do some schools do better? A longitudinal survey of California elementary

schools serving low-income students. Mountain View, CA: EdSource.


Appendix A: Research Ethics Review Board

IMMACULATA UNIVERSITY RESEARCH ETHICS REVIEW BOARD
REQUEST FOR PROTOCOL REVIEW--REVIEWER'S COMMENTS FORM (R1297)

Name of Researcher: Thomas Evert
Project Title: Pennsylvania State Evaluation for Principal Effectiveness: Perceptions of
Reviewer's Comments: Your proposal is approved. You may begin your research or collect your data. PLEASE NOTE THAT THIS APPROVAL IS VALID FOR ONE YEAR (365 days) FROM DATE OF SIGNING.
Reviewer's Recommendations: _____ Exempt   X Approved   _____ Expedited   _____ Conditionally Approve   _____ Full Review   _____ Do Not Approve

Thomas F. O'Brien, Ph.D., Ed.D., RERB Chair          September 4, 2013


Appendix B: Permission to Use PDE Principal Evaluation Domains

Granted by Dr. David W. Volkman


Appendix C: Letter to School District Superintendents


Appendix D: Invitation to Participate and Google© Drive Online Survey


Appendix E: Interview Consent Form

Consent Form for Face-to-Face, Video Conference, and/or Phone Interview


Appendix F: Six Scripted Interview Questions

Six Scripted Questions for Face-to-Face, Video Conference, and/or Phone Interview


Appendix G: Demographic Information For All Surveyed Participants

Questions #1 to #5

Participant | Question #1 | Question #2 | Question #3 | Question #4 | Question #5
P-1 | Elementary | Male | 10+ | 26 to 50 | 301 to 500
P-2 | Elementary | Female | 6 to 9 | 51 to 100 | 501 to 1000
P-3 | Middle | Male | First Year | 51 to 100 | 501 to 1000
P-4 | Elementary | Male | 2 to 5 | 26 to 50 | 301 to 500
P-5 | High | Male | First Year | 101+ | 1001+
P-6 | Middle | Female | 2 to 5 | 26 to 50 | 501 to 1000
P-7 | High | Male | 10+ | 26 to 50 | 501 to 1000
P-8 | Middle | Female | 10+ | 51 to 100 | 501 to 1000
P-9 | High | Male | 10+ | 51 to 100 | 501 to 1000
P-10 | Elementary | Male | 6 to 9 | 26 to 50 | 301 to 500
P-11 | High | Male | 2 to 5 | 51 to 100 | 501 to 1000
P-12 | Middle | Male | 6 to 9 | 26 to 50 | 301 to 500
P-13 | Elementary | Male | 10+ | 51 to 100 | 501 to 1000
P-14 | Middle | Male | 2 to 5 | 26 to 50 | 501 to 1000
P-15 | High | Female | 6 to 9 | 101+ | 1001+
P-16 | Elementary | Male | 2 to 5 | 26 to 50 | 501 to 1000
P-17 | Elementary | Male | 6 to 9 | 26 to 50 | 301 to 500
P-18 | Elementary | Female | 6 to 9 | 26 to 50 | 501 to 1000
P-19 | Elementary | Male | First Year | 26 to 50 | 501 to 1000
P-20 | Middle | Male | 2 to 5 | 26 to 50 | 501 to 1000
P-21 | High | Female | First Year | 51 to 100 | 501 to 1000
P-22 | Elementary | Female | 6 to 9 | 51 to 100 | 501 to 1000
P-23 | Middle | Male | 2 to 5 | 51 to 100 | 1001+
P-24 | Middle | Female | 6 to 9 | 51 to 100 | 501 to 1000
P-25 | Elementary | Female | 6 to 9 | 26 to 50 | 301 to 500


Appendix H: Implementation Plan of PDE Data For All Surveyed Participants

Questions #6 to #9

Participant | Question #6 | Question #7 | Question #8 | Question #9
P-1 | Yes | No | Strongly Disagree | Unfair
P-2 | Yes | No | Agree | Fair
P-3 | Yes | No | Agree | Fair
P-4 | Yes | Yes | Agree | Fair
P-5 | Yes | Yes | Agree | Fair
P-6 | Yes | No | Agree | Fair
P-7 | Yes | Yes | Disagree | Fair
P-8 | Yes | No | Agree | Fair
P-9 | Yes | No | Agree | Fair
P-10 | Yes | No | Strongly Agree | Fair
P-11 | Yes | No | Agree | Fair
P-12 | Yes | No | Agree | Fair
P-13 | Yes | No | Agree | Fair
P-14 | Yes | Yes | Agree | Fair
P-15 | Yes | No | Disagree | Fair
P-16 | Yes | Yes | Agree | Unfair
P-17 | Yes | No | Agree | Fair
P-18 | Yes | Yes | Agree | Unfair
P-19 | Yes | Yes | Agree | Fair
P-20 | Yes | Yes | Agree | Fair
P-21 | Yes | Yes | Agree | Fair
P-22 | Yes | Yes | Agree | Fair
P-23 | Yes | No | Agree | Unfair
P-24 | Yes | No | Agree | Fair
P-25 | Yes | Yes | Disagree | Unfair


Appendix I: Principal Evaluation Feedback Data For All Surveyed Participants

Questions #12 to #15

Participant | Question #12 | Question #13 | Question #14 | Question #15
P-1 | Yes | Quarterly | SR or SR, WN or S | Very Satisfied
P-2 | Yes | Other | RD or RS | Unsatisfied
P-3 | Yes | Quarterly | RD or RS | Satisfied
P-4 | Yes | Mid-Year | SR or SR, RD or RS, WN or S | Satisfied
P-5 | No | Annual | RD or RS | Unsatisfied
P-6 | Yes | Annual | SR or SR, RD or RS, WN or S | Very Satisfied
P-7 | Yes | Annual | SR or RS | Unsatisfied
P-8 | Yes | Annual | SR or SR, CL or II, WN or S | Satisfied
P-9 | Yes | Annual | CL or II | Satisfied
P-10 | Yes | Annual | WN or S | Satisfied
P-11 | Yes | Mid-Year | WN or S | Satisfied
P-12 | Yes | Other | Other | Very Unsatisfied
P-13 | Yes | Annual | RD or RS | Satisfied
P-14 | Yes | Mid-Year | SR or SR | Unsatisfied
P-15 | Yes | Annual | WN or S | Satisfied
P-16 | Yes | Annual | RD or RS | Unsatisfied
P-17 | Yes | Annual | RD or RS | Satisfied
P-18 | Yes | Mid-Year | SR or SR, RD or RS, WN or S | Satisfied
P-19 | Yes | Other | SR or SR, CL or II | Satisfied
P-20 | Yes | Mid-Year | CL or II | Satisfied
P-21 | Yes | Annual | SR or SR, WN or S, Other | Very Satisfied
P-22 | Yes | Annual | SR or SR, RD or RS | Satisfied
P-23 | Yes | Annual | SR or SR, WN or S | Satisfied
P-24 | Yes | Annual | SR or SR, RD or RS | Satisfied
P-25 | Yes | Annual | WN or S | Very Satisfied

SR or SR=Self-Reflection or Self-Rating
CL or II=Checklist or Inventory Items
RD or RS=Rubric Design or Rating Scale
WN or S=Written Narrative or Summary


Principal Evaluation Feedback Data For All Surveyed Participants

Question #16

Participant | Question #16: Principal Leadership Practices | Question #16: Teacher Instructional Practices | Question #16: Student Achievement Scores
P-1 | Strongly Agree | Strongly Agree | Strongly Agree
P-2 | Agree | Agree | Agree
P-3 | Agree | Agree | Disagree
P-4 | Agree | Agree | Agree
P-5 | Disagree | Agree | Disagree
P-6 | Strongly Agree | Strongly Agree | Agree
P-7 | Agree | Agree | Disagree
P-8 | Agree | Agree | Agree
P-9 | Agree | Agree | Strongly Agree
P-10 | Strongly Agree | Agree | Disagree
P-11 | Agree | Agree | Agree
P-12 | Agree | Agree | Agree
P-13 | Agree | Agree | Agree
P-14 | Disagree | Disagree | Disagree
P-15 | Strongly Agree | Strongly Agree | Strongly Agree
P-16 | Disagree | Disagree | Disagree
P-17 | Strongly Agree | Strongly Agree | Agree
P-18 | Agree | Strongly Agree | Strongly Agree
P-19 | Strongly Agree | Strongly Agree | Strongly Agree
P-20 | Agree | Agree | Agree
P-21 | Agree | Agree | Disagree
P-22 | Agree | Agree | Agree
P-23 | Agree | Agree | Agree
P-24 | Agree | Agree | Agree
P-25 | Strongly Agree | Strongly Agree | Strongly Agree


Principal Evaluation Feedback Data For All Surveyed Participants

Question #17

Participant | Question #17: 2010-2011 | Question #17: 2011-2012 | Question #17: 2012-2013
P-1 | SPR | SPR | SPR
P-2 | SPR | SPR | SPR
P-3 | N/A | N/A | N/A
P-4 | SPR | SPR | SPR
P-5 | N/A | SPR | SPR
P-6 | SPR | SPR | SPR
P-7 | SPR | SPR | SPR
P-8 | SPR | SPR | SPR
P-9 | SPR | SPR | SPR
P-10 | SPR | SPR | SPR
P-11 | SPR | SPR | SPR
P-12 | SPR | SPR | SPR
P-13 | SPR | SPR | SPR
P-14 | SPR | SPR | SPR
P-15 | SPR | SPR | SPR
P-16 | N/A | SPR | SPR
P-17 | SPR | SPR | SPR
P-18 | SPR | SPR | SPR
P-19 | N/A | N/A | N/A
P-20 | N/A | N/A | SPR
P-21 | N/A | N/A | N/A
P-22 | SPR | SPR | SPR
P-23 | N/A | N/A | SPR
P-24 | SPR | SPR | SPR
P-25 | SPR | SPR | SPR

SPR=Satisfactory Performance Rating


Appendix J: Relationships within Educational Practice Data

For All Surveyed Participants

Question #18

Participant | Question #18: Principal Leadership Practices | Question #18: Teacher Instructional Practices | Question #18: Student Achievement Scores
P-1 | Disagree | Disagree | Disagree
P-2 | Agree | Agree | Strongly Agree
P-3 | Agree | Agree | Disagree
P-4 | Strongly Agree | Agree | Agree
P-5 | Disagree | Agree | Agree
P-6 | Agree | Agree | Agree
P-7 | Agree | Agree | Agree
P-8 | Agree | Disagree | Disagree
P-9 | Agree | Agree | Strongly Agree
P-10 | Agree | Agree | Agree
P-11 | Agree | Agree | Agree
P-12 | Agree | Agree | Agree
P-13 | Agree | Agree | Agree
P-14 | Strongly Agree | Agree | Agree
P-15 | Disagree | Disagree | Disagree
P-16 | Agree | Agree | Agree
P-17 | Disagree | Disagree | Strongly Agree
P-18 | Agree | Agree | Agree
P-19 | Strongly Agree | Strongly Agree | Strongly Agree
P-20 | Strongly Agree | Agree | Agree
P-21 | Agree | Agree | Agree
P-22 | Agree | Agree | Agree
P-23 | Agree | Agree | Agree
P-24 | Agree | Agree | Agree
P-25 | Disagree | Disagree | Disagree


Relationships within Educational Practice Data For All Surveyed Participants

Question #22

Participant | Question #22: Principal Leadership Practices | Question #22: Teacher Instructional Practices | Question #22: Student Achievement Scores
P-1 | 0 to 25% | 0 to 25% | 0 to 25%
P-2 | 26% to 50% | 26% to 50% | 0 to 25%
P-3 | 26% to 50% | 0 to 25% | 0 to 25%
P-4 | 0 to 25% | 0 to 25% | 0 to 25%
P-5 | 0 to 25% | 0 to 25% | 0 to 25%
P-6 | 51% to 75% | 0 to 25% | 0 to 25%
P-7 | 51% to 75% | 26% to 50% | 26% to 50%
P-8 | 0 to 25% | 26% to 50% | 26% to 50%
P-9 | 26% to 50% | 26% to 50% | 51% to 75%
P-10 | 0 to 25% | 0 to 25% | 0 to 25%
P-11 | 26% to 50% | 26% to 50% | 26% to 50%
P-12 | 0 to 25% | 0 to 25% | 51% to 75%
P-13 | 0 to 25% | 0 to 25% | 26% to 50%
P-14 | 26% to 50% | 26% to 50% | 26% to 50%
P-15 | 51% to 75% | 51% to 75% | 51% to 75%
P-16 | 0 to 25% | 0 to 25% | 26% to 50%
P-17 | 26% to 50% | 26% to 50% | 26% to 50%
P-18 | 51% to 75% | 26% to 50% | 0 to 25%
P-19 | 51% to 75% | 0 to 25% | 26% to 50%
P-20 | 51% to 75% | 26% to 50% | 0 to 25%
P-21 | 0 to 25% | 0 to 25% | 0 to 25%
P-22 | 0 to 25% | 26% to 50% | 26% to 50%
P-23 | 26% to 50% | 26% to 50% | 26% to 50%
P-24 | 0 to 25% | 0 to 25% | 0 to 25%
P-25 | 0 to 25% | 0 to 25% | 0 to 25%


Appendix K: Principal Leadership Performance Effort Data

For All Surveyed Participants

Question #23 to #27

Participant | Question #23 (Domain 1a) | Question #24 (Domain 1b) | Question #25 (Domain 1c) | Question #26 (Domain 1d) | Question #27 (Domain 1e)
P-1 | SN | M | D | W | W
P-2 | M | W | D | D | W
P-3 | W | D | W | W | D
P-4 | M | M | D | M | D
P-5 | W | M | D | D | SN
P-6 | D | D | D | D | D
P-7 | SN | M | W | W | W
P-8 | SN | W | M | SN | M
P-9 | W | D | M | W | W
P-10 | Y | W | D | Y | W
P-11 | M | D | W | W | M
P-12 | D | D | D | W | D
P-13 | W | W | D | W | D
P-14 | W | M | D | W | M
P-15 | W | W | D | D | D
P-16 | W | W | W | W | W
P-17 | M | M | D | M, SN | W
P-18 | Y | M | D | SN | M
P-19 | W | W | D | W | M
P-20 | M | W | D | M | W
P-21 | M | W | D | W | W
P-22 | Y, SN | W, SN | D | M | M, SN
P-23 | W | SN | D | W | SN
P-24 | W, M, SN | D, M, SN | D | D | M, SN
P-25 | D | D | D | D | D

D=Daily W=Weekly M=Monthly Y=Yearly SN=Situational as Needed


Principal Leadership Performance Effort Data For All Surveyed Participants

Question #28 to #33

Participant | Question #28 (Domain 2a) | Question #29 (Domain 2b) | Question #30 (Domain 2c) | Question #31 (Domain 2d) | Question #32 (Domain 2e) | Question #33 (Domain 2f)
P-1 | SN | D | W | W | M | SN
P-2 | M | D | D | D | D | D
P-3 | W | D | W | D, W | D, W, M, Y | SN
P-4 | W | D | M | W | M | D
P-5 | SN | SN | D | Y | D | D
P-6 | D | D | D | D | D | SN
P-7 | D | D | D | D | W | D
P-8 | D | SN | D | D | D | D
P-9 | W | D | M | W | M | D
P-10 | D | D | D | M | D | M
P-11 | D | W | M | D | W | W
P-12 | W | D | D | D | D | D
P-13 | M | D | M | D | D | W
P-14 | M | D | W | W | W | D
P-15 | D | D | SN | D | D | SN
P-16 | W | W | W | W | D | D
P-17 | D | D | D | D | D | D
P-18 | W | D | D | Y, SN | D | D
P-19 | M | D | D, SN | D, SN | D, SN | SN
P-20 | SN | D | D | D | D | D
P-21 | W | D | M | W | W | SN
P-22 | D | D | W, SN | W | W, SN | W
P-23 | SN | SN | D | M | W | D
P-24 | D, SN | D, M, SN | D, SN | D, SN | D, W, SN | D, SN
P-25 | D | D | D | D | D | D

D=Daily W=Weekly M=Monthly Y=Yearly SN=Situational as Needed


Principal Leadership Performance Effort Data For All Surveyed Participants

Question #34 to #38

Participant | Question #34 (Domain 3a) | Question #35 (Domain 3b) | Question #36 (Domain 3c) | Question #37 (Domain 3d) | Question #38 (Domain 3e)
P-1 | D | M | D | D | W
P-2 | W | W | D | D | W
P-3 | W, M, Y | M, Y | D, W, M, Y | D, W, M, SN | W, M
P-4 | W | M | W | W | M
P-5 | D | SN | W | Y | W
P-6 | D | W | D | D | D
P-7 | W | W | W | W | D
P-8 | D | W | D | SN | SN
P-9 | W | W | M | W | W
P-10 | Y | W | D | D | W
P-11 | D | M | D | W | W
P-12 | W | W | W | D | M
P-13 | M | W | W | D | W
P-14 | W | W | W | D | M
P-15 | D | SN | D | D | D
P-16 | M | D | W | D | D
P-17 | SN | SN | D | D | D
P-18 | W, SN | M, Y | D, W | D | Y, SN
P-19 | W | M | W | D | D
P-20 | D | D | D | D | D
P-21 | M | M | W | M | W
P-22 | M | M | W | W | W
P-23 | W | W | D | W | W
P-24 | D | D | D | D | D
P-25 | D | D | D | D | D

D=Daily W=Weekly M=Monthly Y=Yearly SN=Situational as Needed


Principal Leadership Performance Effort Data For All Surveyed Participants

Question #39 to #41

Participant | Question #39 (Domain 4a) | Question #40 (Domain 4b) | Question #41 (Domain 4c)
P-1 | M | M | M
P-2 | W | D | W
P-3 | M, Y | D | W, M
P-4 | W | D | W
P-5 | W | D | D
P-6 | W | D | D
P-7 | W | D | M
P-8 | M | D | D
P-9 | M | M | M
P-10 | M | D | D
P-11 | W | D | D
P-12 | D | D | M
P-13 | W | D | W
P-14 | M | D | M
P-15 | W | D | D
P-16 | W | D | D
P-17 | M | D | D
P-18 | M | D | W, Y
P-19 | M | D | Y, SN
P-20 | W | D | W
P-21 | SN | D | SN
P-22 | M | W, SN | W, SN
P-23 | M | D | M, SN
P-24 | W | D | D
P-25 | D | D | D

D=Daily W=Weekly M=Monthly Y=Yearly SN=Situational as Needed


Appendix L: Principal Leadership Performance and Reflective Change Data

For All Surveyed Participants

Question #42

Participant | Question #42 (Domain 1a) | Question #42 (Domain 1b) | Question #42 (Domain 1c) | Question #42 (Domain 1d) | Question #42 (Domain 1e)
P-1 | No | Yes | No | Yes | No
P-2 | Yes | Yes | No | No | No
P-3 | Yes | No | Yes | Yes | Yes
P-4 | Yes | Yes | No | Yes | Yes
P-5 | No | No | No | No | No
P-6 | Yes | No | No | No | No
P-7 | No | Yes | No | No | No
P-8 | No | No | No | Yes | No
P-9 | No | No | No | Yes | No
P-10 | No | No | No | No | No
P-11 | No | No | No | No | No
P-12 | No | No | Yes | Yes | No
P-13 | Yes | Yes | No | Yes | No
P-14 | Yes | Yes | Yes | Yes | Yes
P-15 | No | No | No | No | No
P-16 | Yes | Yes | Yes | Yes | Yes
P-17 | No | No | No | No | Yes
P-18 | Yes | Yes | No | Yes | No
P-19 | No | No | Yes | Yes | Yes
P-20 | No | No | No | No | No
P-21 | Yes | Yes | No | Yes | No
P-22 | No | No | No | Yes | No
P-23 | No | No | No | Yes | Yes
P-24 | Yes | Yes | Yes | Yes | Yes
P-25 | No | No | No | No | No

Yes=Yes, I would alter my principal practice
No=No, I would not alter my principal practice


Principal Leadership Performance and Reflective Change Data

For All Surveyed Participants

Question #43

Participant | Question #43 (Domain 2a) | Question #43 (Domain 2b) | Question #43 (Domain 2c) | Question #43 (Domain 2d) | Question #43 (Domain 2e) | Question #43 (Domain 2f)
P-1 | No | No | Yes | No | Yes | No
P-2 | No | No | No | No | No | No
P-3 | Yes | No | Yes | Yes | Yes | Yes
P-4 | No | Yes | No | No | Yes | No
P-5 | No | No | No | No | No | No
P-6 | No | No | No | No | No | No
P-7 | No | No | No | No | No | No
P-8 | No | No | No | Yes | Yes | No
P-9 | No | No | Yes | No | No | No
P-10 | No | No | No | No | No | No
P-11 | No | No | No | No | No | No
P-12 | No | No | No | Yes | Yes | No
P-13 | Yes | No | Yes | Yes | No | No
P-14 | No | No | No | Yes | Yes | No
P-15 | No | No | No | No | No | No
P-16 | Yes | Yes | Yes | Yes | Yes | Yes
P-17 | No | No | No | No | No | No
P-18 | No | No | No | No | No | No
P-19 | Yes | No | No | No | No | No
P-20 | No | No | No | No | No | No
P-21 | No | No | No | Yes | Yes | Yes
P-22 | Yes | No | No | No | Yes | No
P-23 | Yes | No | No | No | No | No
P-24 | Yes | Yes | Yes | Yes | Yes | Yes
P-25 | No | No | No | No | No | No

Yes=Yes, I would alter my principal practice
No=No, I would not alter my principal practice


Principal Leadership Performance and Reflective Change Data

For All Surveyed Participants

Question #44

Participant | Question #44 (Domain 3a) | Question #44 (Domain 3b) | Question #44 (Domain 3c) | Question #44 (Domain 3d) | Question #44 (Domain 3e)
P-1 | No | No | Yes | No | No
P-2 | No | Yes | Yes | No | No
P-3 | Yes | Yes | Yes | Yes | No
P-4 | Yes | No | No | No | No
P-5 | No | No | No | No | No
P-6 | No | No | No | No | No
P-7 | No | Yes | Yes | No | Yes
P-8 | No | No | No | No | No
P-9 | Yes | Yes | No | No | No
P-10 | No | No | No | No | No
P-11 | No | No | No | No | No
P-12 | No | Yes | No | No | Yes
P-13 | Yes | Yes | Yes | Yes | Yes
P-14 | Yes | Yes | Yes | Yes | Yes
P-15 | No | No | No | No | No
P-16 | Yes | Yes | Yes | Yes | Yes
P-17 | No | No | No | No | No
P-18 | Yes | Yes | Yes | Yes | No
P-19 | Yes | No | No | No | No
P-20 | No | No | No | No | No
P-21 | Yes | Yes | Yes | Yes | Yes
P-22 | Yes | Yes | Yes | Yes | Yes
P-23 | No | Yes | No | No | No
P-24 | Yes | Yes | Yes | Yes | Yes
P-25 | No | No | No | No | No

Yes=Yes, I would alter my principal practice
No=No, I would not alter my principal practice


Principal Leadership Performance and Reflective Change Data

For All Surveyed Participants

Question #45

Participant | Question #45 (Domain 4a) | Question #45 (Domain 4b) | Question #45 (Domain 4c)
P-1 | No | No | No
P-2 | No | No | No
P-3 | Yes | Yes | No
P-4 | Yes | No | No
P-5 | No | No | No
P-6 | Yes | No | No
P-7 | No | No | No
P-8 | Yes | No | No
P-9 | Yes | No | No
P-10 | Yes | No | No
P-11 | No | No | No
P-12 | Yes | No | No
P-13 | No | No | No
P-14 | Yes | No | Yes
P-15 | No | No | No
P-16 | Yes | Yes | Yes
P-17 | Yes | No | No
P-18 | No | No | No
P-19 | Yes | No | No
P-20 | No | No | No
P-21 | Yes | No | No
P-22 | Yes | No | No
P-23 | Yes | No | No
P-24 | Yes | Yes | Yes
P-25 | No | No | No

Yes=Yes, I would alter my principal practice
No=No, I would not alter my principal practice


Appendix M: Domain Achievability Data For All Surveyed Participants

Question #46

Participant | Question #46 (Domain 1) | Question #46 (Domain 2) | Question #46 (Domain 3) | Question #46 (Domain 4)
P-1 | Achievable | Achievable | Achievable | Achievable
P-2 | Achievable | Achievable | Achievable | Achievable
P-3 | Achievable | Achievable | Achievable | Achievable
P-4 | Achievable | Achievable | Achievable | Achievable
P-5 | Achievable | Achievable | Achievable | Achievable
P-6 | Achievable | Achievable | Achievable | Achievable
P-7 | Achievable | Achievable | Achievable | Achievable
P-8 | Achievable | Achievable | Achievable | Achievable
P-9 | Achievable | Achievable | Achievable | Achievable
P-10 | Achievable | Achievable | Achievable | Achievable
P-11 | Achievable | Achievable | Achievable | Achievable
P-12 | Achievable | Achievable | Achievable | Achievable
P-13 | Achievable | Achievable | Achievable | Achievable
P-14 | Achievable | Achievable | Achievable | Achievable
P-15 | Achievable | Achievable | Achievable | Achievable
P-16 | Achievable | Unachievable | Unachievable | Achievable
P-17 | Achievable | Achievable | Achievable | Achievable
P-18 | Unachievable | Unachievable | Achievable | Achievable
P-19 | Achievable | Achievable | Achievable | Achievable
P-20 | Achievable | Achievable | Achievable | Achievable
P-21 | Achievable | Achievable | Achievable | Achievable
P-22 | Achievable | Achievable | Unachievable | Achievable
P-23 | Achievable | Achievable | Achievable | Achievable
P-24 | Achievable | Achievable | Achievable | Achievable
P-25 | Achievable | Achievable | Achievable | Achievable


Appendix N: Open-ended Response Data For All Surveyed Participants

Question #10

Implementation Plan of PDE Information

Q10: What do you perceive will be the most difficult aspect in the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation for your district? P-1: Understanding the complexities that are woven into the evaluation. P-2: New system working on consistency. P-3: Rolling it out effectively in concert with the teacher effectiveness model and relative to when the state releases all of the components and their expectations. P-4: Common Language of rubric. P-5: N/A. P-6: The number of initiatives being administered by PDE at this time is creating a watered down effect on all initiatives. P-7: Doing everything needed and still having a life outside of school! P-8: Training and information for principals. P-9: Tracking participation. P-10: Having my evaluations of teachers directly linked to my overall score. P-11: Consistent evaluation of performance. P-12: It will be important to communicate expectations to all building principals. P-13: 3b - It is a challenge to harness that within in the building while meeting the expectation for consistency across all elementary schools in the district. P-14: The consistent and therefore fair implementation of the evaluation. While the state has created a more defined rubric and evaluation, it is still being implemented by the same administrators as in the past. P-15: 4a - there is a fine line between inviting community involvement and having members of the community believe that the school must do things the way they want things to be done. Finding the balance between soliciting input but sending


the clear message that educational professionals must be ultimate decision-makers in a high achieving district is sometimes difficult. P-16: Not all the language in the rubrics relates to an individual principal position - each district is different and the rubric language means different things in each district. P-17: I think the rubric will provide Principals and their supervisors with a workable tool for evaluation discussions. The part that I have the most difficulty with is the linking of SPP with teacher evaluations and the impact that may have on Principals' ratings. It is also not necessary to rate teachers and principals across four categories (failing to distinguished). P-18: Implementation of SMART goals with evidence documentation while trying to manage all the other aspects of being a building leaders. P-19: Continuity. Where I believe that initiatives surrounding accountability are appropriate, it is difficult to consistently measure one's performance due to all the variables surrounding their position (i.e. district size, economic stability, population, etc.). It is the same difficulties surrounding teacher effectiveness evaluations. P-20: The time it will take in order to provide adequate evidence and documentation for each of the components. P-21: Consistently documenting evidence for each component on the rubric. P-22: Domain 3b concerns me the most in that not only do we have to be instructional leaders, but now we must also be the curriculum coordinators (depending upon the interpretation of the rubric). As a single person who is responsible for knowing every aspect of every piece of your building and the students in which we are working to support, the burden that 3b puts on principals is unrealistic. P-23: Matching teacher evaluation to the relationship of student growth and performance results prior to having the scores. P-24: Similar to the challenges districts will face with the new teacher evaluation system, one challenge is the timing of the school data in relationship to when evaluations are typically completed. I anticipate the principals will need to have specific evidence to support categories that are different from past practice. P-25: Student performance on PSSA tests.


Question #10 General Themes within the Open-ended Response Data

Participant

T1 T2 T3 T4 T5 T6 T7 T8 T9 T10 T11 T12 P-1 X X X X X X P-2 X P-3 X X X X X P-4 X P-5 P-6 X X X P-7 X X X P-8 X X X X X P-9 X X P-10 X X P-11 X X X X X X X X X P-12 X X X P-13 X X X X X X P-14 X X X P-15 X P-16 X X P-17 X X X P-18 X X X P-19 X X X X P-20 X X P-21 X P-22 X X P-23 X X X P-24 X X X X P-25 X X X

T1=Communication of Information, Expectations, and Training
T2=Consistency in All PA Districts
T3=Roll-Out Logistics vs. Other PDE Initiatives in Progress
T4=Language within the Rubric
T5=Necessary Resources & Other Unforeseen Factors
T6=Tracking Participant Information
T7=Linking All Data Sources
T8=Not Doing the Same Old Thing Under a New Name
T9=Public Input & Involvement
T10=Evidence and Documentation of Domain Items
T11=Amount of Variables in Principal Duties & Responsibilities
T12=Timely Results of PSSA Data


Open-ended Response Data For All Surveyed Participants

Question #11

Implementation Plan of PDE Information

Q11: What do you perceive will be the most seamless aspect in the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation for your district? P-1: Dedicating the necessary time to figure it out and make it work within our district. P-2: Unsure. P-3: I'm not really sure as of yet. I am a first year principal so having not been evaluated as one before makes this a difficult question to answer. P-4: Understanding the expectations of the process. P-5: N/A. P-6: It is very similar to the tool we currently use. P-7: This will not be seamless. P-8: Small district, "quality control." P-9: It is mandatory. P-10: Not sure. P-11: Acceptance of responsibility. P-12: It is realistic with regard to, it reflects much of what we already do, but forces principals to ask themselves what they could do better or to be more effective. P-13: Communication - It is a hallmark of the work done in our district. What we/I are doing matches the descriptors for communication. P-14: A universal evaluation process across all LEAs. P-15: Domain 1d - leads change efforts for continuous improvement - this is already a part of our leadership style at this school. P-16: Not sure.


P-17: Again, I think the linking of teacher ratings with student growth measures and SPP is completely impractical and unfair. There are many high performing teachers in under-performing schools. P-18: Observations by my district office leadership to shadow and know we are doing our job. P-19: We are currently under the pilot for the Principal Evaluation. As far as the most seamless, I believe schools have become extremely accustom to change and put this into the same category. We will go through the process and do the best we can as we usually do and move forward. The most critical aspect, as it is for teachers, is that it is okay to fail. If someone is trying something new to help improve upon the instructional processes of the school and it doesn't work, that's okay. As long as you reflect upon it, work as a team to sort out the details to make improvements, it can be a valuable experience. P-20: Most of the domains/components encompass what principals already do on a daily basis. The work that is done daily will not need to change. P-21: Developing administrative and building goals that align with the rubric. P-22: Many of the pieces of the rubric principals are already doing. Data meetings, goal setting and professional strategies in teams are things that already exist for me. P-23: Using student data to plan initiatives and foster change in the building to improve achievement. P-24: A seamless aspect will be the use of the rubric in principal evaluations. The superintendent utilized it unofficially to review our performance for the 12-13 school year. P-25: We already use the Danielson model.


Question #11 General Themes within the Open-ended Response Data

Participant

T1 T2 T3 T4 T5 T6 T7 T8 T9 T10 T11 T12 P-1

X X X X

P-2 X

P-3 X

P-4 X P-5 X

P-6 X P-7 X

P-8 X P-9 X P-10 X

P-11 X P-12 X P-13 X P-14 X X P-15 X X X P-16 X

P-17 P-18

X X X X X X

P-19

X X X X X P-20 X P-21 X X X P-22 X X P-23 X X P-24 X X P-25 X

T1=Not Sure
T2=Dedication & Time to New PPEE
T3=PDE Communication & Information
T4=Danielson Rubric Already in Use
T5=Size of the School District
T6=PDE Mandated for All PA Districts
T7=Ownership & Responsibility for New PPEE
T8=What Principals Already Are Doing
T9=Positive Outlook for Leading Change Efforts
T10=Currently Participating in PDE Pilot
T11=Support from Administration at the District Level
T12=Positive Outlook for New PPEE & Positive Improvement


Open-ended Response Data For All Surveyed Participants

Question #19

Relationships within Educational Practice Information

Q19: Do you perceive Principal Leadership Practices will improve due to the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation within your district? In what way? P-1: I do not believe Principal Leadership Practices will improve as a result of the Principal Effectiveness Evaluation. P-2: Yes, more accountability. P-3: I believe the process and rubric will cause people to focus more intently on specific areas of strength and weakness. It quantifies good leadership characteristics in a relatively concise manner. P-4: I think they will improve. I think the increased attention to the broad roles of the job of a principal will support an general increase in attention to tasks and responsibilities. P-5: No. P-6: It depends on what the previously utilized tool was in the district. Feedback has a tendency to increase productivity, especially when tied to merit pay increases for administrators. P-7: More academic accountability. P-8: More reflective and purposeful planning. P-9: It is more formal. P-10: I believe it will improve. Principals will be more aware of what they do and how it impacts their performance rating. P-11: Yes, clear expectations for all. P-12: They will only to the degree that school districts takes them seriously and implement them with fidelity. P-13: It provides a clear focus for principals.


P-14: Yes. Principals will be required to maintain better records with the programs that they implement and through those records we might be able to collaborate on the best practices for our teachers. P-15: I do not anticipate that my approach to my job will be impacted by this new format. P-16: Yes, if implemented in a consistent manner. P-17: Maybe marginally. It may cause some Principals to better reflect on their practice. P-18: The new rubric with personal SMART goals will focus leadership and provide a positive accountability for instructional leadership taking some of the focus off of the managerial requirements of the daily job. P-19: Yes, I do. I believe it will improve as it provides an open forum of communication and dialogue among colleagues as opposed to simply being handed a rating. This collaborative process increases the continuity and learning with and from team members. P-20: I don't believe there will be much improvement in performance as a result of this system. Principals who are not performing at a satisfactory level do not last very long in their respective positions, regardless of the evaluation method in place. The new evaluation framework just provides administrators with a common framework by which to focus on and document their work. P-21: Yes. I believe it will provide consistency and identifies practices to develop and ideals to work towards as a principal. P-22: I do not believe that the practice will improve leadership. In fact, I believe that it will create burn out. This is not a one-person job anymore. P-23: Yes. I think it will force principals into action when data warrants attention to the area. P-24: The new evaluation system clearly outlines the expected outcomes for principal effectiveness that did not previously exist in our district. Therefore, it should support principal growth and it should also identify areas of weakness. Improvement is dependent on the principal's recognition of a need to improve, as well as, the goals or improvement plan identified. P-25: I do not.


Question #19 General Themes within the Open-ended Response Data

Participant T1 T2 T3 T4 T5 T6 T7 T8 T9 T10 T11 P-1 X P-2 X P-3 X X X P-4 X X P-5 X P-6 X X P-7 X P-8 X P-9 X X P-10 X X X X P-11 X P-12 X P-13 X P-14 X X P-15 X P-16 X P-17 X X P-18 X X X X X P-19 X X X P-20 X X P-21 X X X X X P-22 X P-23 X X P-24 X X X P-25 X

T1=No, the PPEE Process will not Improve Leadership Practices
T2=Yes, More Accountability
T3=Yes, More Focus on Strengths & Weaknesses
T4=Yes, More Attention on Tasks and Responsibilities
T5=Yes, Feedback will Increase Positive Performance
T6=Yes, More Reflective Process
T7=Yes, More Formal Communication Process
T8=Yes, Clearer Expectations within a Common Framework
T9=Yes, If Implemented with Fidelity and Consistency
T10=Yes, Collaboration on Best Practice
T11=Not Sure


Open-ended Response Data For All Surveyed Participants

Question #20

Relationships within Educational Practice Information

Q20: Do you perceive Teacher Instructional Practices will improve due to the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation within your district? In what way? P-1: No, I don't. Good teachers will teach, reflect, and get guidance from their administrators while setting professional goals for improvement. P-2: Yes, teacher will be more accountable. P-3: I can't say. P-4: I think they will improve. I think the increased attention to the broad roles of the job of a teacher will support a general increase in attention to tasks and responsibilities. P-5: No. P-6: I am not sure if the teacher practices will increase due to the principal tool. P-7: Yes. P-8: Not necessarily. P-9: Yes. We will be judged by test scores. P-10: I do. Principals will be more proactive with teachers in helping them improve. P-11: Possibly, but may depend on individual effort. P-12: They create an opportunity for evaluators to have constructive conversations with teachers. It will be effective to the extent that evaluators seize these opportunities. P-13: That depends on how a principal chooses to market the concept. We are all in it together in terms of the evaluative expectations. P-14: Yes. As principals we will be using better data to help quantify our evaluation. That data can then be used to improve teaching practices within our schools. P-15: No, I do not.


P-16: Possibly. If leadership practices improve due to the rubric, then there should be an increase in positive instructional practices.
P-17: No.
P-18: The focus on instruction with data, in connection with the increased teacher expectations to provide evidence with the Danielson rubric, instructional practices will improve. Accountability to themselves and to the building is higher than it ever has been.
P-19: Yes, I do. For the same reasons as stated in the open-ended response #3.
P-20: No. I believe the new Teacher Effectiveness Evaluation will have more of an impact on teaching practices than the Principal Evaluation system will.
P-21: I believe that teacher instructional practices may improve over time, but I don't believe it will be an immediate change. But, rather will occur as a result of principal effectiveness increasing.
P-22: No. In order to be an instructional leader you must be able to be in classrooms, know your teachers, know your students, know the data about your students, etc. This is an insurmountable task that is being asked of leaders and yet they expect us to create a positive culture and build relationships.
P-23: I believe the new teacher effectiveness system will have a greater impact than the principal evaluation on teacher instructional practices. Teachers don't buy into change based on how their supervisor is rated.
P-24: I believe there is potential to improve teacher instructional practices, but improvement relies on all components from PDE to be complete. At the present time, there is a new principal evaluation system, a new teacher effectiveness system, the recent approval of PA Core Standards, the piloting of a new ELA test, implementation of three Keystone exams, and potential for additional exams. The state presently does not have all components functioning seamlessly, but rather is developing as we go. However, the new system identifies what a principal should do to support teaching and learning.
P-25: I do not.


Question #20 General Themes within the Open-ended Response Data

Participant T1 T2 T3 T4 T5 T6 T7 T8
P-1 X X
P-2 X X X
P-3 X
P-4 X X X
P-5 X
P-6 X
P-7 X X X X X X
P-8 X
P-9 X
P-10 X X X
P-11 X
P-12 X X X
P-13 X
P-14 X
P-15 X
P-16 X X X
P-17 X
P-18 X X X
P-19 X
P-20 X
P-21 X
P-22 X
P-23 X
P-24 X X X X
P-25 X

T1=No, the PPEE Process will not Improve Teacher Practices
T2=Not Sure
T3=Yes, More Accountability on Supervision of Teachers & Instruction
T4=Yes, More Attention to Support Teacher & Instruction
T5=Yes, More Attention on PSSA Data & Test Scores
T6=Yes, More Focus on Constructive Communication with Teacher
T7=Yes, If Implemented with Fidelity & Consistency
T8=Yes, Clearer Expectations within Rubric linking Leadership to Teacher & Instruction


Open-ended Response Data For All Surveyed Participants

Question #21

Relationships within Educational Practice Information

Q21: Do you perceive Student Achievement Scores will improve due to the implementation process of the new statewide Pennsylvania Principal Effectiveness Evaluation within your district? In what way?
P-1: No. I believe because high stakes testing is such an issue, teachers will want to participate in the assignment of students to their classroom.
P-2: Yes, analysis of the data.
P-3: No. I do not think scores will dramatically improve as a result (this year) of implementation. There are some aspects that have nothing to do with student achievement or instructional leadership and will not impact. There are other parts that will impact, but over time.
P-4: I hope so. The rubric and process will focus more attention on the principal's use of data to support student growth.
P-5: No.
P-6: I think the tests mean more now than ever before and that can create increased achievement or increased anxiety.
P-7: Hopefully teachers will check for content retention and understanding.
P-8: If teacher practices do not improve, I am not sure how student scores will improve?
P-9: All are accountable.
P-10: I hope so. I believe with the new teacher evaluation system, the conferences with teachers are more richer. It serves as a learning model for them and a teaching tool for the principal.
P-11: Yes, if our evaluation is based on growth.
P-12: I think it can be a contributing factor. It will not be the driving force.
P-13: It is hard to see the connection other, than generally, the principal establishes and supports the structure, program, and culture that supports student achievement.


P-14: I am not sure. I think any principal who wants to see their students succeed on achievement tests is already doing everything they can to make a difference. I don't think this evaluation will change what principals do just to get a better evaluation. On the other hand, the information collected for the evaluations, and shared could help fellow principals if implemented in their school.
P-15: No, I do not.
P-16: Possibly. Improved leadership practices should correlate to student achievement in the school.
P-17: No.
P-18: I expect they will because no teacher wants to be the one to see failure. The majority of teaching staff are perfectionists. Hopefully this innate characteristic will carry through to the performance of students.
P-19: Yes, I do. If improvement is occurring as a leader, which is improving the instructional practices from teachers, then a direct improvement in achievement scores should equally come as a result.
P-20: No. In my district, I do not believe that scores will increase as a result of the new Principal Effectiveness Evaluation. I believe that scores will increase as a result of the effective leadership already occurring on a daily basis to impact change. The components being addressed in the new evaluation were already included in our localized evaluation form.
P-21: I feel the impact on student achievement will be a longer-term process and will not reflect an immediate change.
P-22: I think that with any new programming change, there is a dip that showcases a transition. I believe this will be the same. If a teacher begins to struggle we must provide opportunities for success. I believe that this will also have to be done for school leaders. Therefore, if and when scores improve, it is not going to be a result of the new evaluation system. It will be a result of the knowledge and the teams in the building that makes that happen.
P-23: I believe that scores will improve as a result of implementing the new SSP, as measured by the new "rules of the game." I don't see greater overall student achievement as a result of this process as compared to the old system.
P-24: Not initially. The state tests are also changing (ELA and Keystones) in addition to the shift to PA Core Standards. At the present time, PDE SAS does not have the necessary materials to support teaching and learning in relation to the PA Core Standards in the same manner as was present with the implementation of PSSAs grades 3-8. At that time, resources were available at the start of the school year, whereas at the present moment there is a lack of PA resources for teachers and administrators. This impacts the ability to lead/guide teachers with any curricular changes expected. Once all components are completed and there is a clearly articulated plan from PDE, then I believe there is the potential to support positive change. Until that time, it will be challenging.
P-25: I do not.


Question #21 General Themes within the Open-ended Response Data

Participant T1 T2 T3 T4 T5
P-1 X
P-2 X
P-3 X X
P-4 X X
P-5 X
P-6 X
P-7 X
P-8 X
P-9 X X
P-10 X X
P-11 X
P-12 X
P-13 X X
P-14 X X X
P-15 X
P-16 X X
P-17 X
P-18 X X
P-19 X X X
P-20 X X X
P-21 X X
P-22 X X
P-23 X X X X
P-24 X X
P-25 X

T1=No, the PPEE Process will not Improve Student Achievement
T2=Not Sure
T3=Yes, More Attention on PSSA Data & Test Scores
T4=Yes, If Implemented with Fidelity & Consistency
T5=Yes, Clear Expectations in Rubric to Link Principal Accountability with Teacher Instruction & Student Growth


Appendix O: Interview Response Data For All Voluntary Participants

Participant #1 Responses to the Six Scripted Questions Interview

Q1: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on principal leadership?
P-1: It is holding us accountable for accurately reporting evaluations of our own staff members. For our teachers and the phenomenon of grade inflation, and not accurately reporting student progress, this will hold principals more accountable to accurately evaluate teachers.

Q2: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on principal leadership practices?
P-1: If principal leadership practices are already in check and in place then it wouldn’t be impacted. Those who would be impacted are the ones that aren’t necessarily following good best practices to begin with.

Q3: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on teacher instructional practices?
P-1: It will impact teacher instructional practices because with good leadership, me being held more accountable, I am then holding teachers more accountable for what is happening in their classrooms, which would be in effect, better instructional practices.

Q4: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on teacher instructional practices?
P-1: Those who are already doing a good job at their instructional leadership, and pushing, and questioning teachers, and have teachers question themselves, and on the path of continued professional development of doing better, and always looking to move forward, those people are not necessarily going to be impacted because they are already following those guidelines. Obviously, you have those that are in the trenches of some pretty dire districts that have bad and poor scores, but at the same time being able to improve with their improvement plans and move forward, and continuing to better themselves, are not going to be affected.

Q5: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on student achievement scores?
P-1: Like the trickle effect, holding everybody more accountable, ultimately will improve achievement scores. Having an effective principal, who is holding teachers accountable for their own effectiveness, ultimately will have a positive effect on student performance.


Q6: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on student achievement scores?
P-1: I don’t think that that’s a possibility. I think that by holding the accountability measure up for everybody it is going to ultimately be better for kids. Holding principals more accountable, to be able to hold teachers more accountable, for teachers to hold students more accountable, is going to improve their performances.


Participant #2 Responses to the Six Scripted Questions Interview

Q1: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on principal leadership?
P-2: I really think it will show principals a better option and will give them more structure in how they can better fine tune what they are doing within their buildings to instruct the leadership of their students and teachers that are in their buildings. It kind of gives us a better guide or perimeter to work off.

Q2: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on principal leadership practices?
P-2: I think we still have the exact same people doing some of the exact same things and until we are better trained on how to implement it, I think we are still going to get some of the same results because we are not forced to make some of those changes and the same people are doing the same jobs. People don't want to have to do the extra things that they don't want to do.

Q3: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on teacher instructional practices?
P-2: I think because principals are now graded on the same principles. So those principals have to look at what areas are they not making their growth in because their outcome, or their evaluation are based on their teachers scores, so those principals are going to look into those teachers and be less adaptive to letting things slide. They are going to say, “look you need to do better so we all do better” and it impacts the principal in that way.

Q4: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on teacher instructional practices?
P-2: I think because, for principals, they inherently believe that teachers are going to do well. And the majority of those teachers will still come back as being proficient. We will find the good in anyone. So even though they may be lacking in certain areas, we focus on the good because that's the nature of our positions, and that's the nature, usually of our demeanors. We try and find the good and don't always look and try to solve the problem areas.

Q5: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on student achievement scores?
P-2: Because I believe that principals will start going after those problematic areas. And in specific cases will say, “Look, we are lacking in this area. Therefore we are going to increase the instructional practices of those kids, or for those teachers that will impact those kids, so those kids get a better education.” Because the principals are being graded on that same program, and you are not making headway in certain areas, therefore the trickle down effect will impact the teachers and will impact the kids.

Q6: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on student achievement scores?
P-2: I believe that when principals are looking at these areas they are going to focus on certain areas that they feel they can make headway. And certain areas where they don't feel like they can get the ground made up, they are going to push aside and say they are going to focus on a certain area where they can make the most headway, and therefore not concentrate in other areas. Therefore, kids will be missing instruction because of priorities not being put into all areas. But, instead they will be looking on where can I get my score up higher and focused in on that only.


Participant #3 Responses to the Six Scripted Questions Interview

Q1: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on principal leadership?
P-3: In having reviewed the rubric and looking over it and now having become part of a pilot to implement it this school year, the rubric will hold us accountable for the domains that the state is saying, “This is what it means to be a principal and to be effective you need to do.” My biggest concern is the documentation of evidence for smart goals along with all the other things that we often do as administrators. Just trying to keep up with that, maintain it, make sure we are documenting good evidence, and we are providing what it may be the state is looking for, as some of us have never done the pilot before, aren't sure exactly what they want to see.

Q2: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on principal leadership practices?
P-3: I think the only way a statewide rubric doesn't impact you is if you just ignore that it exists because the rubric has very delineated guidelines to be proficient, here is what you must do, and it is pretty specifically lined out. And then for the distinguished, the word and you must do these things to be distinguished, to either live there, or even breathe there for the moment because nobody technically resides there forever. So, to not do it and to not be an effective principal, I would say you are just violating that entire rubric. I am not sure how you can't do it unless you ignore that it exists, or if they suddenly say we are not using it anymore.

Q3: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on teacher instructional practices?
P-3: When you look at the domains that we as a principal need to be responding to and working towards, and especially if you are writing smart goals, so many of them are focused on instruction, and if you look at the domains and what it says to be proficient you have to show evidence of, or you have to be able to document what you are doing, between meetings and instructional facilitation. I look just solely at data and data collection, and data meetings, and student growth, and what I am going to have to say to the state here is what I did, and here is why I am proficient. I think that is going to dramatically change how some principals operate because some are great managers but not instructional leaders. You might be a great instructional leader but don't know how to manage. Now, you have to be the whole package, or the state is going to say you are not proficient. I think that is going to dramatically change how teachers are impacted, because now I am more accountable, so I am going to be on them more about instruction, and instructional goals. If my smart goals tied into my instruction, I am going to expect them to know it, and for them to help me drive that goal home.

Q4: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on teacher instructional practices?
P-3: I would say it wouldn't impact it, if you as an administrator are looking at your goals, and think there is nothing there your staff can do to help you. But, if your whole building is a team and 15% of their evaluation is building data, and 15% of my evaluation is how my building performs on the PSSA to impact my instructional goals then I just don't see how it doesn't.

Q5: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on student achievement scores?
P-3: It goes right back to that instructional goals and data. Data collection! How are you helping students to grow, are you making your expected growth under the new school improvement systems, and I know we don't have AYP, but those new systems are saying you must show growth in all of your subgroups if you have them. So it is going to have a significant impact just because we are now holding teachers accountable. They are going to have individual data. You know they are being accountable to each other with 15% of that pie, coming eventually, three years down the road. I think there is a lot more accountability with special education, with the whole group, that we are all in this together. Specialists, you can't ignore it either because some of the pie is yours too. I think student achievement is going to be significantly impacted because now everybody feels a little more accountable to each other, and accountable to what the students are going to do, because it affects their evaluation.

Q6: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on student achievement scores?
P-3: I think some of the categories don't directly relate to student achievement. Some of the categories are more managerial in nature. I don't expect my supervisor to observe me doing things with student achievement. But, with the heightened focus on teacher observation I know a supervisor is coming in and following me for a day or two, and watching what I do. They now have to be more aware of how I run my building and what I am doing, and looking at my observations, and holding me accountable too. The only way it doesn't is if you look at the areas where it doesn't directly attribute back to student achievement.


Participant #4 Responses to the Six Scripted Questions Interview

Q1: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on principal leadership?
P-4: I think it will definitely have an impact on principal leadership because now leadership will be measured in some form, whereas before it was very subjective how principals would demonstrate their leadership district to district. I think the evaluation rubric gives clear definition to what leadership should look like in every building.

Q2: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on principal leadership practices?
P-4: I think it all depends on how people really take it to heart, the principals, and then the administration supervising principals. Will people take it seriously? Will they follow through on the components of it? How will people document? I think it will all come down to how everybody approaches the rubric.

Q3: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on teacher instructional practices?
P-4: I think everything filters down from the principal through the building. The principal obviously is the leader of the building and the one that should really shape the educational climate and culture of the building. I think the impact on teacher instruction and everything that is happening in the classrooms comes from what the principal does. So by putting the rubric into play for the principals, it filters down through to the teachers, and into the classrooms.

Q4: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on teacher instructional practices?
P-4: It goes back to how seriously will people take it. How much will they follow the various components, how much will the evaluations be implemented with fidelity across the district. If the districts are not necessarily following through with how it is supposed to be documented, how you are supposed to be showing your evidence, and it's kind of all done in a rush at the end of the year, it is not going to have as much of an impact in the classroom as it could.

Q5: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on student achievement scores?
P-4: It will impact on student achievement scores because it defines what a principal should do in terms of data. It talks about how the principal is the leader of change. How the principal can build an environment where all the constituents are looking at the data and seeing how they can improve student achievement. Everything that principals do connects ultimately back to student achievement. It gives it some definition, gives it some clarity, and gives it some purpose with it all relating back to student achievement.

Q6: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on student achievement scores?
P-4: It depends on how differently districts implement it. It all goes back to if people are implementing it with fidelity. If you are using it, looking at the results, looking at areas and saying, “Wow! This is an area I need to focus on and improve upon.” If I am not doing as well in that area or if I'm not doing enough in that area, because everything in the rubric does ultimately, I believe, connect back to student achievement. You must look at each component and see if there are areas to improve. If you are not doing that, you are not using it as a point of self-reflection, then you are not going to have the impact that you could as a result of the rubric implementation.


Participant #5 Responses to the Six Scripted Questions Interview

Q1: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on principal leadership?
P-5: It is going to impact the roles of the principal in terms of what we are looking for, what we are measuring for teachers, and it gives us the rubric really to know what we should be looking for and working off, what is expected from principals as well as what is expected for teachers. So it will give common language and expectations across the board.

Q2: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on principal leadership practices?
P-5: Well it will give us the common language in terms of expectations, but it will also then be a matter of principals gathering evidence to support those categories so the challenge I believe is going to be from those supervising principals to make sure there is consistency within a district. And I believe that will be difficult to do.

Q3: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on teacher instructional practices?
P-5: Well with the new principal evaluation now focused on principal accountability for student data, which in turn will come back to teachers practice, it will be a way that I believe principals will be looking to hold teachers accountable for the instructional practices they are delivering within the classrooms.

Q4: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on teacher instructional practices?
P-5: It is going to be dependent upon the principals’ work that they do in leading schools and in leading teachers. How that plays out is going to be dependent on what a principal is looking for and the consistency a principal has in evaluating teachers.

Q5: Why WILL the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation impact on student achievement scores?
P-5: The rubric and what we are being held accountable to, has that piece, of student data that is involved in it. It should work from the principal, to the teacher, to the instructional practices that ultimately impacts student outcomes, and the additional piece of the SLOs that teachers will be writing and therefore monitoring, to also in turn impact student achievement.

Q6: Why will the implementation of a statewide Pennsylvania Principal Effectiveness Evaluation NOT impact on student achievement scores?
P-5: It is really dependent on what happens with the principal. There are some pieces where the students change from year to year in terms of their ability level. Of course, the one piece that is also being looked at in this whole evaluation system is students’ achievement on standardized tests. My hope is that the PVAAS is another piece of data to look at, but it depends on really what measures we are looking at, to see if there is going to be any impact on student achievement.