Implementing Lessons Learned: Establishing A Culture of Evidence-Based Assessment


Transcript

Implementing Lessons Learned: Establishing A Culture of Evidence-Based Assessment

Marilee J. Bresciani, Ph.D.
Associate Professor, Postsecondary Education and
Co-Director of the Center for Educational Leadership, Innovation, and Policy
San Diego State University
3590 Camino Del Rio North
San Diego, California, U.S.A.
619-594-8318
Marilee.Bresciani@mail.sdsu.edu

My Lens

Former K-12 instructor

Doctorate in administration, curriculum, and instruction

Primarily an administrator; teaching 1-2 courses a year; now a full-time faculty member

OBA consultant and reviewer for WASC and Australia

My research explores the systematic implementation of OBA and its effects

My Context

Participants are primarily instructors from 2- and 4-year public institutions, with some representation from proprietary institutions

You teach users of technology

You care about what your learners learn

You use what you know to inform designers of technology and curriculum how to improve their products

Overview

Overview of Outcomes-Based Assessment

Discuss its History and Importance

Share Strategies for Implementing Sustainable Evidence-Based Decision Making within Your Locus of Control

Applying Lessons Learned to Implementation

Questions to Consider as You Return Home


Ask Yourself These Questions

How would you describe what outcomes-based assessment is to your faculty, staff, and students?

How would you describe its importance to what is happening in the classroom, your institution, and the student’s next steps?

The Assessment Cycle (Bresciani, 2006)

The key questions…
• What are we trying to do and why? or
• What is my program supposed to accomplish? or
• What do I want students to be able to do and/or know as a result of my course/workshop/orientation/program?
• How well are we doing it?
• How do we know?
• How do we use the information to improve or celebrate successes?
• Do the improvements we make contribute to our intended end results?

The Iterative Systematic Assessment Cycle (Bresciani, 2006)

Mission/Purposes

Goals

Outcomes

Implement Methods to Deliver Outcomes and Methods to Gather Data

Gather Data

Interpret Evidence

Make decisions to improve programs; enhance student learning and development; inform institutional decision-making, planning, budgeting, policy, and public accountability

Outcomes-Based Assessment Discoveries (Bresciani, 2006)

Most people do capitalize on their innate intellectual curiosity to find out what works

Most people just don’t articulate their intended end results (e.g., outcomes) ahead of time

Most people don’t document the decisions made based on their results

Most people don’t follow up later to see if their decisions made the intended improvement
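These gaps amount to a simple record-keeping problem: each pass through the cycle should leave a written trace the next pass can check. Below is a minimal sketch of that idea in Python. It is not from the presentation, and all the names are illustrative assumptions, but it shows a record that forces the outcome to be articulated up front, documents the decision made from the results, and flags whether a follow-up confirmed the intended improvement.

```python
from dataclasses import dataclass, field

@dataclass
class AssessmentRecord:
    """One pass through the assessment cycle for a single outcome.

    Illustrative sketch only; the field names are assumptions,
    not Bresciani's terminology.
    """
    outcome: str                                  # intended end result, stated ahead of time
    evidence: list = field(default_factory=list)  # data gathered
    decision: str = ""                            # what was changed based on the evidence
    followed_up: bool = False                     # did we check that the change worked?

    def gaps(self):
        """Report which of the common gaps this record still has."""
        problems = []
        if not self.outcome:
            problems.append("outcome not articulated ahead of time")
        if not self.decision:
            problems.append("decision not documented")
        if not self.followed_up:
            problems.append("no follow-up on whether the decision helped")
        return problems

# Example: data was gathered, but no decision was documented and no follow-up done
record = AssessmentRecord(
    outcome="Students can interpret assessment evidence",
    evidence=["rubric scores, fall term"],
)
print(record.gaps())
# ['decision not documented', 'no follow-up on whether the decision helped']
```

Nothing in the sketch is specific to any tool; the point is simply that articulating, documenting, and following up become explicit fields rather than afterthoughts.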

This Process Works for: (Bresciani, 2007)

Evaluating the effectiveness of the technology, via outcomes for the users

Evaluating the effectiveness of the technology training, via evaluating outcomes of student learning

Evaluating how well the technology contributes to the improvement of student learning, via crossing the outcomes

Students can Participate (Bresciani, 2006)

In every phase of the assessment process

Having them do so provides opportunities to evidence their problem-solving, critical thinking, analytical reasoning, and communication skills

It also helps them get a better picture of what you want them to learn and why

And take more responsibility for their learning

The Purpose (Bresciani, 2006)

Outcomes-Based assessment does not exist for assessment’s sake

It is taking what most of us already do, and making it systematic

Its purpose is to reflect on the end result of doing: are we accomplishing that which we say we are?

The Purpose, Cont. (Bresciani, 2006)

It is NOT personnel evaluation
It can be both formative and summative
Planning is incorporated into it
All types of assessment have value (e.g., needs, utilization, satisfaction, learning and development), but we have to pay attention to evaluating student learning

Drivers of Outcomes-Based Assessment (Ewell, 2003)

To improve the underperforming student
Competency Movement in Business and Industry
Bologna Declaration of 1999
Government Conversation, 1985
The Higher Education Re-authorization Act testimonies in the USA, 2002 and 2006

Response to NCLB Legislation

Drivers, Cont.

National Commission on the Future of Higher Education (2006)

Demand for Public Information about Performance
• Transparency of outcomes and results
• Comparable measures of quality
• Demonstration of value-added of the entire educational experience

Drivers, Cont.

Accountability requirements handed back to states
Performance indicators
Standardized tests
Discipline standards could be designed by disciplines
Increased focus on private education
Still no focus on evaluating transferability of learning

How does this context change your answers to these previously posed questions?

How would you describe what outcomes-based assessment is to your faculty, staff, and students?
How would you describe its importance to what is happening in the classroom, your institution, and the student’s next steps?

How do you make this process effective, efficient, and enduring?

Consider how you would apply these criteria to what you do in the classroom or the design of curriculum and/or technology.

Criteria for Good Practices for Outcomes-Based Assessment Program Review (OAPR) (Bresciani, 2006)

Criterion 1: Clear understanding of purpose, goals, & expectations for OAPR
Why, who, what, when, and how?

Criterion 2: Collaboration within and across
Outcomes, evaluation methods, criteria, interpretation of results, decisions and recommendations, values

Criterion 3: Use of results
To improve programs; to inform performance indicators; to address value-added; to inform transferability of learning

Criteria, Cont. (Bresciani, 2006)

Criterion 4: Awards and recognition
Common-sense awards and recognition, position descriptions and hiring criteria, evaluation of job performance criteria

Criterion 5: Resources to support program review
Technological
Professional development
Individual consultation

Criterion 6: Coordination of the process

Criteria, Cont. (Bresciani, 2006)

Criterion 7: Flexibility
Delicate balance between structure and creativity
Criterion 8: Addressing barriers
Criterion 9: Evaluation of the program review process
Short-range and long-range goals
Meta-assessment

Inverted Funnel of Application to Lessons Learned

How would you apply these good practice criteria to the lessons you learned at this conference, or vice versa?

Technical Disclaimer

I have not used any of these products so feel free to help me out here…

Assistance with Input Assessment

SAM 2007
E-Portfolio Management System

Assistance with Formative Assessment

SAM 2007
E-Portfolio Management System
Case Grader Office 2007
Office 2007 “In a Flash”

Assistance with Summative Assessment

E-Portfolio Management System
SAM 2007
Case Grader Office 2007 – if you used course-embedded program assessment

Strategies for Collaboration

E-Portfolio Management System

Strategies for Coordination of the Process

E-Portfolio Management System

Questions to Consider (Bresciani, 2007)

Process
When implementing an effective, efficient, and enduring outcomes-based assessment process in your organization, consider the questions on the handout in the back of the room

Questions to Consider, Cont. (Bresciani, 2007)

When considering the use of technology to promote outcomes-based assessment…
Does the use of this technology reinforce the reasons we are engaged in outcomes-based assessment?

How does the use of this technology promote formative assessment, summative assessment, or the assessment of transferable skills/knowledge?

Questions to Consider, Cont. (Bresciani, 2007)

Is our professional development to train users of our technology coupled with the professional development for outcomes-based assessment or instructional design?

How does the use of this technology enable us to better use the results to improve student learning and development?

Questions to Consider, Cont. (Bresciani, 2007)

How does the use of this technology improve efficiencies? And who benefits from those improved efficiencies?

How does this technology constrain or promote creativity/flexibility within each discipline?

How does this technology aid the coordination of the process? Or does it contribute to isolated thinking and doing?

Questions to Consider, Cont. (Bresciani, 2007)

How can the data that this technological tool gathers help us evaluate the extent to which we are transforming our organization through the use of evidence-based decision-making?

“Keep your eye on the prize”

Challenge Questions to Consider

How does this technology help illustrate the transferability of learning/skills?

How does this technology help us illustrate, evaluate, and improve learning/skills as they relate to the skills required by employers?

What do you need to refine in order to make this happen?

Questions?

Marilee.Bresciani@mail.sdsu.edu

References

Bresciani, M.J. (2006). Outcomes-Based Academic and Co-Curricular Program Review: A Compilation of Institutional Good Practices. Sterling, VA: Stylus Publishing.
