1
Assessment for learning in science:
Issues in learning to interpret student work
Center for the Assessment and Evaluation of Student Learning (CAESL)
University of California, Berkeley
University of California, Los Angeles
Lawrence Hall of Science
2
WestEd: Diane Carnahan, Karen Cerwin, Kathy DiRanna, Jo Topps
U.C. Berkeley: Maryl Gearhart, Jennifer Pfotenhauer, Cheryl Schwab
UCLA: Shaunna Clark, Joan Herman, Sam Nagashima, Ellen Osmundson, Terry Vendlinski
Lawrence Hall of Science: Lynn Barakos, Craig Strang
3
Presentation
• Program and participants
• Research framework
• Design and methods
• Selected findings
• Implications
4
Situation
• assessments in materials of variable quality
• teachers lack expertise to revise
• professional practices not well established
Argument
• science education reform (NRC/NSES)
• known benefits of classroom assessment (e.g., Black & Wiliam, 1998; Sloane & Wilson, 2000)
• value of reflective practice and long-term collaboration (Garet et al., 2001)
Impetus for program
5
Principles
• integrated with practice
• long term
• collaborative
Recent participation in (percentage):
professional organizations  89
assessment  95
Perceived [weak (1) – very strong (5)]:
content knowledge  4.5
knowledge of standards  4.5
knowledge of assessment  4.3
confidence in teaching science  4.7
Academy participants
6
7
Interwoven structures
• district vertical teams w/ administrators
• cross-district grade-level teams
• independent classroom implementation
Series of portfolios
• repeated opportunities to build expertise
Program organization
8
Portfolio: I. Plan
Establish learning goals
• analyze ‘conceptual flow’ of materials
• align with standards
Select assessments
• choose key assessments to track progress: pre -> junctures -> post
• identify the concepts assessed
• anticipate ‘expected student responses’
9
Portfolio: II. Implementation
Interpret student work
• refine assessment criteria
• score
• chart and identify trends
Use evidence
• document instructional follow-up and feedback
10
Portfolio: III. Evaluate & revise
Evaluate using student work
• alignment with goals and instruction
• quality of tasks and criteria
• methods of analysis
Revise and strengthen
11
Rated for completeness
Complete: I, II (some student work), III
Partial: I or III, II (some)
Minimal: I or III only
None (but participating)
Portfolio Completion
12
Level      Fall 2003 (22)   Spring 2004 (23)   Fall 2004 (19)
Complete   .50              .39                .47
Partial*   .32              .30                .16
Minimal    .18              .26                .16
None       .00              .00                .21
*but includes student work
Portfolio Completion
13
Study
Focus
• Growth in understanding and practice
• Supports and barriers