The Bigger Picture: The Next Level of Assessment Practice
Barbara D. Wright, Associate Director, Western Association of Schools and Colleges
[email protected]
March 2, 2007, AAC&U Conference, Miami, FL
Our roadmap . . .
- What are your goals for this workshop? Questions?
- Some key points about assessment
- The hierarchy of specificity: an exercise
- Break
- Supporting structures
- Case studies
- Your plans
- Wrap-up, workshop evaluation
The Assessment Loop (diagram): Goals, questions → Gathering evidence → Interpretation → Use → back to goals, questions
What exactly is assessment?
It’s a systematic process of 1) setting goals for or asking questions about student learning, 2) gathering evidence, 3) interpreting it, and 4) using it to improve the effects of college on students’ learning and development – at any level of analysis from the individual student to the course, program, or institution.
Other (subordinate) steps in the assessment process . . .
- Planning
- Mapping goals onto curriculum
- Adding outcomes to syllabi
- Offering faculty development
- Reporting
- Communicating
- Adding assessment to program review
- Assessing the assessment
Mapping outcomes onto curriculum and pedagogy -- it can reveal . . .
- where the skill is taught
- how it is taught
- how consistently it is reinforced
- where there are intervention points
(But don’t obsess on syllabi or course descriptions. Ultimately, they’re just inputs, not outcomes.)
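To make the mapping idea concrete, here is a minimal sketch (not from the original workshop materials) of a curriculum map held as simple data. The course numbers, outcomes, and the I/R/M coding (introduced, reinforced, mastered) are hypothetical.

```python
# Hypothetical curriculum map: each course lists the outcomes it addresses and how
# (I = introduced, R = reinforced, M = mastered / assessed at the highest level).
# Course numbers, outcomes, and codes are illustrative only.
curriculum_map = {
    "BUS 101": {"written communication": "I", "quantitative reasoning": "I"},
    "BUS 210": {"written communication": "R", "information literacy": "I"},
    "BUS 350": {"written communication": "R", "quantitative reasoning": "R"},
    "BUS 490 (capstone)": {"written communication": "M", "information literacy": "R"},
}

def where_taught(outcome):
    """Show which courses address an outcome and at what level."""
    return {course: levels[outcome]
            for course, levels in curriculum_map.items()
            if outcome in levels}

# Reveals where the skill is taught, how consistently it is reinforced,
# and where the gaps (possible intervention points) are.
print(where_taught("written communication"))
print(where_taught("quantitative reasoning"))
```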
A faculty lament . . .
We already test and assign grades. We flunk the ones who don’t measure up. Why do we have to do assessment?
Testing and Grading vs. Assessment

Testing and Grading                     | Assessment
Evaluation first, feedback second       | Feedback first, evaluation second
Quality assurance                       | Quality improvement
Individuals                             | Samples
Private                                 | Collective, collegial
Follow-up random, serendipitous         | Follow-up systematic, expected
Follow-up not supported                 | Follow-up supported, rewarded
Focus on the student, the course        | Focus on the program, the institution
Levels of Assessment
- Individual student learning within courses
- Individual student learning across courses
- Courses
- Programs
- The institution
From: Levels of Assessment, Miller and Leskes, AAC&U, 2005
From Levels of Assessment, Miller and Leskes, AAC&U, 2005
“Evidence of student learning should be used for multiple levels of assessment … The best evidence … comes from direct observation of student work rather than from an input inventory (e.g., list of courses completed) or summary of self reports … Course-embedded assignments provide the most valid evidence for all levels of analysis.”
From Levels of Assessment, Miller and Leskes, AAC&U, 2005
“The ways of sampling, aggregating, and grouping the evidence for analysis … depend on the original questions posed. The questions will also determine how the data are interpreted to produce action.”
Sample questions at the institutional level:
► Just how information-literate are our graduates?
► Do they make steady progress throughout their college career?
From Levels of Assessment, Miller and Leskes, AAC&U, 2005
“Faculty members and staff accomplish aggregation by describing standards, translating them into consistent scoring scales, and anonymously applying the resulting rubrics to the evidence at hand. Such a process does not assign a grade to an individual student but rather attempts to understand better the learning process and how to improve.”
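As an illustration of the aggregation Miller and Leskes describe, the sketch below (hypothetical criteria, scale, and scores, not from the workshop or the publication) summarizes anonymous rubric scores for a sample of student work per criterion, so the result shows patterns of learning rather than grades for individual students.

```python
from collections import Counter
from statistics import mean

# Anonymous rubric scores for a sample of student work (no names, no grades).
# Each dict holds one reading of one piece of work, scored per criterion
# on a hypothetical 1-4 scale.
scored_sample = [
    {"defines information need": 3, "evaluates sources": 2, "cites sources": 4},
    {"defines information need": 2, "evaluates sources": 2, "cites sources": 3},
    {"defines information need": 4, "evaluates sources": 3, "cites sources": 4},
    {"defines information need": 3, "evaluates sources": 1, "cites sources": 3},
]

# Aggregate per criterion: a score distribution and an average for the sample,
# describing the learning pattern rather than evaluating any individual.
for criterion in scored_sample[0]:
    scores = [work[criterion] for work in scored_sample]
    print(criterion, dict(Counter(scores)), round(mean(scores), 2))
```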
Some complex learning goals --
- Communication
- Critical thinking
- Information literacy
- Quantitative problem-solving
- Team and leadership skills
- Intercultural competence
- Ability to transfer knowledge, skills
- Exercise of civic, social responsibility
Methods for complex outcomes --
- are open-ended
- pose authentic, compelling tasks
- stimulate student engagement, creativity
- require integration of knowledge, skills, dispositions
- demonstrate cumulative learning
- are educative for students and educators alike
- provide meaningful info for improvement
Methods for complex outcomes … at any level of analysis --
- Portfolios
- Capstones
- Performances
- Common assignments
- Secondary readings
- Course management programs
- Local tests
- Student self-assessment
Four dimensions of learning --
What students learn (cognitive learning, skills, dispositions)
How well? (thoroughness, complexity, subtlety, agility, transferability)
What happens over time? (cumulative, developmental effects)
Is this good enough? (federal concern)
The rubric -- it
- defines what we’re looking for
- offers a set of scoring guidelines
- tells where to look
- tells what to look for (criteria)
- and provides descriptors of each level of quality
In other words, it’s a tool for determining “what” and “how well.”
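A minimal sketch, assuming a simple analytic rubric, of what “criteria plus descriptors of each level of quality” can look like when written down in structured form; the criteria, levels, and descriptor wording are all hypothetical.

```python
# Hypothetical analytic rubric: each criterion (what to look for) carries a
# descriptor for each level of quality (how well). All wording is illustrative.
rubric = {
    "thesis": {
        4: "Clear, arguable thesis that frames the whole piece",
        3: "Clear thesis, mostly sustained",
        2: "Thesis present but vague or inconsistently developed",
        1: "No identifiable thesis",
    },
    "use of evidence": {
        4: "Evidence is relevant, well integrated, and correctly cited",
        3: "Evidence is relevant and mostly well used",
        2: "Evidence is thin or loosely connected to claims",
        1: "Little or no supporting evidence",
    },
}

def describe(judgments):
    """Pair each chosen level with its descriptor for one piece of work."""
    return {criterion: (level, rubric[criterion][level])
            for criterion, level in judgments.items()}

# One reader's judgment of a single (anonymous) piece of work.
print(describe({"thesis": 3, "use of evidence": 2}))
```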
The hierarchy of specificity
Institution-wide goals
College-wide goals
Department- & program- wide goals
Course-level goals
The hierarchy of specificity
Oral & written communication
Professional communication
Ability to write for business
Ability to write a business plan
Now you try it --
- Communication
- Critical thinking
- Information literacy
- Quantitative problem-solving
- Team and leadership skills
- Intercultural competence
- Ability to transfer knowledge, skills
- Exercise of civic, social responsibility
- ?
Think horizontally as well as vertically . . .
Oral & written communication
Professional communication
Ability to write for business
Ability to write a business plan
Internship * Student government * Business courses * Gen Ed
Now you try it . . .
- Communication
- Critical thinking
- Information literacy
- Quantitative problem-solving
- Team and leadership skills
- Intercultural competence
- Ability to transfer knowledge, skills
- Exercise of civic, social responsibility
- ?
. . . across the college experience
Remember – when you’ve got data, you’re only halfway there
- Data are not information; information is not knowledge
- An inclusive “community of interpretation” is needed to make meaning of the findings
- Interpretation should lead to plans for improvement, shared commitment
- Reports, plans, and recommendations do not equal action
- Action requires resources
- Faculty can’t do it all on their own
Who needs to be involved?
- Faculty
- Student affairs, library, academic support, IR, director of TLC, etc.
- Students
- Chairs, deans, VPAA
- Advisory board members, external experts, faculty
- ?
“Institutionalizing assessment” – 2 aspects:
The PLAN for assessment (i.e. shared definition, purpose, values, vocabulary, communication, use of findings)
The STRUCTURES and RESOURCES that make the plan doable
Three alternatives for institutionalizing assessment:
Centralized
Positives:
- Ability to focus on desired priorities, level of analysis
- Administrative control
- Efficiency
- Economies of scale
Negatives:
- Central administration grows
- There are administrative, PR costs
- Connection to grassroots lacking
- Faculty are alienated or uninterested: assessment is “an administrator’s job”
Three alternatives for institutionalizing assessment:
Decentralized
Positives:
- Minimal additional costs
- Close to classroom, sites where learning occurs
- High faculty ownership, involvement
- Discipline-appropriate approaches
Negatives:
- Perception: “Sure, just dump it on the faculty”
- Communication challenging
- Duplication, inefficiency are likely
- There are opportunity costs
- There is no uniform approach
- Higher levels of analysis problematic
Three alternatives for introducing and institutionalizing assessment:
A middle course
Identify elements that can be handled at the institutional level, e.g.:
- Overall coordination
- Data storage, reporting (IR)
- Articulation of policy, parameters, levels
- Implementation (e.g., budget for software, consultants, conferences)
- Provision of rewards (individual, program)
- Communication (faculty, publics, accreditors, state, etc.)
Three alternatives for introducing and institutionalizing assessment:
A middle course
Delegate elements best handled “on the ground” in individual programs, e.g.:
- Definition of course-level outcomes
- Development of methods
- Interpretation of findings
- Recommendations for improvement
- Reflection on what has been learned
- Contributions to higher-level analysis
Assessment at higher levels requires --
- A formal structure
- Assessment acknowledged in policy, contracts, other documents
- Dedicated resources:
  - Budget
  - Personnel
  - Location
  - Accountability
  - Etc.
How to institutionalize --
- Make assessment a freestanding function
- Attach to an existing function, e.g.:
  - Accreditation
  - Academic program review
  - Annual reporting process
  - Center for Teaching Excellence
  - Institutional Research
Make assessment freestanding -- Positives and Negatives
Positives:
- Maximum flexibility
- Minimum threat, upset
- A way to start
Negatives:
- Little impact
- Little sustainability
- Requires formalization eventually, e.g., Office of Assessment
- May be difficult to connect to higher levels of analysis
Attach to accreditation -- Positives and Negatives
Positives:
- Maximum motivation
- Likely compliance
- Resources available
- Staff, faculty assigned
- Clear cause/effect
- May be easier to connect
Negatives:
- Resentment of external pressure
- Us/them dynamic
- Episodic, not ongoing
- Reporting, gaming, not improving
- Little faculty involvement
- Little connection to the classroom, learning
- Main focus: inputs
Attach to Center for Teaching Excellence -- Positives and Negatives
Positives:
- Strong impact possible
- Ongoing
- Close connection to faculty, classroom, learning
- Maximum responsiveness to “use” phase
Negatives:
- Impact depends on how broadly assessment is done
- No enforcement
- Little/no reporting, communicating
- Rewards, recognition vary, may be lip service
Attach to program review -- Positives and Negatives
Positives:
- Some impact (depending on stakes)
- Some compliance
- Some resources available
- Staff, faculty assigned
- Cause/effect varies
Negatives:
- Impact depends on how well PR is done
- Episodic, not ongoing
- Inputs, not outcomes
- Reporting, not improving
- Generally low faculty involvement
- Weak connection to the classroom, learning
Attach to annual report -- Positives and Negatives
Positives:
- Some impact (depending on stakes)
- Ongoing
- Some compliance
- Habit, expectation
- Closer connection to classroom, learning
- Cause/effect possible
- Allows flexibility
Negatives:
- Impact depends on how seriously, how well AR is done
- No resources
- Reporting, not improving, unless specified
- Chair writes; faculty involvement varies
How can we increase weighting of learning & assessment in PR? E.g.,
From → To:
- Optional part → Required
- One small part of total PR process → Core of the process (emphasized in instructions)
- “Assessment” vague, left to program → Assessment expectations defined
- Various PR elements of equal value (or no value indicated) → Points assigned to PR elements; student learning gets 50% or more
- Little faculty involvement → Broad involvement
Don’t confuse program-level assessment and program review
Program-level assessment means we look at learning on the program level (not the individual student or course level) and ask what the key learning experiences of a program add up to.
Program review looks for program-level assessment of student learning but goes beyond it, also examining other components of the program, e.g. mission, faculty, facilities, demand, etc.
New trends for PR/assessment (cf. WASC accreditation process)
- Create a program portfolio
- Keep program data continuously updated
- Do assessment on an annual cycle
- Enter assessment findings, uses, by semester or annually
- For periodic PR, review the portfolio and write a reflective essay on student AND faculty learning
Budget items
- Released time, support staff time
- Equipment, supplies (e.g., purchased instruments, software, computers, servers, books, photocopying)
- Development (e.g., consultants, workshops)
- Incentives and rewards (e.g., faculty mini-grants, travel, stipends, merit)
- Communications
So what’s your plan?
- What learning goal is a priority for higher-level analysis?
- What’s your question?
- What’s already in place?
- What do you need?
- Who needs to be involved?
- What’s the timeline?