General Education Assessment AAC&U GE and Assessment Conference March 1, 2007.



Program Assessment

An ongoing process designed to monitor and improve student learning. Faculty:

• Develop learning outcomes

• Verify alignment

• Collect assessment data

• Close the loop

Definitions

• Direct vs. Indirect Assessment

• Embedded Assessment

• Authentic Assessment

• Formative vs. Summative Assessment

• Triangulation

Quotations from the Wise and Experienced

Assessment Steps

1. Define learning outcomes

2. Check for alignment

3. Develop an assessment plan

4. Collect assessment data

5. Close the loop

6. Improve the assessment process

Learning Outcomes

• Clarify what faculty want students to learn

• Clarify how the assessment should be done

Types of Outcomes

• Knowledge

• Skills

• Values

Levels of Outcomes

• Course

• Program

• Institutional

GE Program Outcomes

• Focus on how students can demonstrate their learning

• Should be widely distributed

• Should be known by all stakeholders

• Guide course and curriculum planning

• Encourage students to be intentional learners

• Focus assessment efforts

Goals vs. Outcomes

Examples

Is each a goal or an outcome?

Bloom’s Taxonomy

Review Examples of GE Outcomes

Types of GE Outcomes

• Short list of more general outcomes

• Longer list of outcomes related to specific requirements

Ensuring/Verifying Alignment

• Course Certification

• Course Recertification

• Alignment Projects

GE Alignment Questions

• Curriculum Cohesion

• Pedagogy and Grading

• Support Services

• GE Instructors

• Learning-Centered Campuses

Cohesive Curriculum

• Coherence

• Synthesizing experiences

• Ongoing practice

• Systematically created opportunities to develop increasing sophistication

Alignment Matrix (Curriculum Map)

• I = Introduced

• D = Developed & Practiced with Feedback

• M = Demonstrated at the Mastery Level Appropriate for Graduation

Is this a cohesive curriculum?
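One lightweight way to answer that question is to record each course's I/D/M codes and check that every outcome is introduced, developed, and demonstrated at mastery somewhere in the sequence. A minimal sketch, with hypothetical course names and outcomes:

```python
# Curriculum map: for each course, the level at which each GE outcome
# is addressed: I = Introduced, D = Developed, M = Mastery.
# Course names and outcomes below are hypothetical examples.
curriculum_map = {
    "ENGL 101": {"writing": "I", "critical thinking": "I"},
    "ENGL 201": {"writing": "D"},
    "HIST 210": {"critical thinking": "D", "writing": "D"},
    "CAPSTONE": {"writing": "M", "critical thinking": "M"},
}

outcomes = {"writing", "critical thinking"}

def coverage(cmap, outcomes):
    """Report which I/D/M levels each outcome reaches across the curriculum."""
    levels = {o: set() for o in outcomes}
    for course_levels in cmap.values():
        for outcome, level in course_levels.items():
            levels[outcome].add(level)
    return levels

def is_cohesive(cmap, outcomes):
    """A cohesive map introduces, develops, and brings every outcome to mastery."""
    return all({"I", "D", "M"} <= lv for lv in coverage(cmap, outcomes).values())

print(is_cohesive(curriculum_map, outcomes))  # True: every outcome hits I, D, and M
```

A map that leaves an outcome stuck at "Introduced," or that jumps straight from I to M with no practice and feedback in between, would fail this check and flag a gap for curriculum planning.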

Assessment Plan

• Who?

• What?

• When?

• Where?

• How?

Assessment should be meaningful, manageable, and sustainable.

We don’t have to assess every outcome in every student every year.

Levels of GE Assessment

• Course-Level

• Program-Level

• Institution-Level

Comparisons of the Three Approaches

Sampling

• Relevant Samples

• Representative Samples

• Reasonably-Sized Samples
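For program-level assessment, a simple random sample of relevant student work is often enough. The sketch below is illustrative only (the roster size and sample size are made-up numbers); it draws a reasonably sized sample without replacement so no paper is read twice:

```python
import random

# Hypothetical roster of student papers eligible for the group reading.
papers = [f"paper_{i:03d}" for i in range(1, 501)]  # 500 papers

# A reasonably sized sample: large enough to be representative,
# small enough for a one-day group reading.
SAMPLE_SIZE = 80

random.seed(2007)  # fixed seed so the sample can be reproduced and audited
sample = random.sample(papers, SAMPLE_SIZE)

print(len(sample))       # 80
print(len(set(sample)))  # 80 -- sampled without replacement, no duplicates
```

Sampling keeps the workload manageable and sustainable; the fixed seed matters for the ethical issues below, since it documents exactly how the sample was drawn without identifying individual students.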

Ethical Issues to Consider

• Anonymity

• Confidentiality

• Informed Consent

• Privacy

Sample Assessment Plans

Find examples of:

• Direct assessment

• Indirect assessment

• Formative assessment

• Summative assessment

• Alignment-related assessment

• Triangulation

Assessment Techniques

• Direct Assessment

• Indirect Assessment

Properties of Good Assessment Techniques

• Valid

• Reliable

• Actionable

• Efficient and Cost-Effective

• Engaging to Respondents

• Interesting to Us

• Triangulation

Direct Assessment

• Published Tests

• Locally-Developed Tests

• Embedded Assessment

• Portfolios

• Collective Portfolios

Indirect Assessment

• Surveys

• Interviews

• Focus Groups

Rubrics

• Holistic Rubrics

• Analytic Rubrics
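An analytic rubric rates each criterion separately on the same ordered scale, while a holistic rubric assigns one overall category. As a sketch (the criteria and category labels here are invented for illustration), an analytic rubric can be represented as criteria crossed with ordered performance levels, with categories convertible to numbers when a grade is needed:

```python
# Ordered category labels, low to high (hypothetical labels).
LEVELS = ["Beginning", "Developing", "Proficient", "Exemplary"]

# Analytic rubric: each criterion is judged on the same ordered scale.
CRITERIA = ["thesis", "evidence", "organization", "mechanics"]

def score(ratings):
    """Convert per-criterion category ratings to a number for grading,
    while the categories themselves remain available for assessment."""
    points = {label: i + 1 for i, label in enumerate(LEVELS)}  # 1..4
    return sum(points[ratings[c]] for c in CRITERIA)

# One paper rated criterion by criterion (analytic)
# versus a single overall judgment (holistic).
analytic = {"thesis": "Proficient", "evidence": "Developing",
            "organization": "Proficient", "mechanics": "Exemplary"}
holistic = "Proficient"

print(score(analytic))  # 3 + 2 + 3 + 4 = 12 of a possible 16
```

The analytic form surfaces exactly where the paper is weak (evidence), which is what makes the results actionable for program assessment; the holistic form is faster when only an overall judgment is needed.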

Online Rubrics

Rubric Category Labels

Rubric Strengths

• Efficient use of faculty time

• Precisely define faculty expectations

• Training can be effective

• Criterion-referenced judgments

• Can be used by others

Using Rubrics for Grading and Assessment

• Numbers for grading

• Categories for assessment

• Numbers and other criteria under individual faculty control

• Speed up grading

• Provide formative feedback

Using Rubrics in Courses

1. Hand out rubric with assignment.

2. Use rubric for grading.

3. Develop rubric with students.

4. Students apply rubric to examples.

5. Peer feedback using rubric.

6. Self-assessment using rubric.

Creating a Rubric

1. Adapt an existing rubric

2. Analytic approach

3. Expert-systems approach

Managing Group Readings

• One reader/document

• Two independent readers/document

• Paired readers

Before inviting colleagues:

1. Develop and pilot test rubric.

2. Select exemplars.

3. Develop a recording system.

4. Consider pre-programming a spreadsheet.
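The recording system can be as simple as one row per paper holding each reader's rating; pre-programming it lets the group check agreement rates during calibration and route disagreements for resolution. A sketch with hypothetical paper IDs and ratings:

```python
# One row per paper: (paper id, reader A's rating, reader B's rating).
# Paper ids and ratings are hypothetical.
records = [
    ("paper_01", "Proficient", "Proficient"),
    ("paper_02", "Developing", "Proficient"),
    ("paper_03", "Exemplary",  "Exemplary"),
    ("paper_04", "Developing", "Developing"),
    ("paper_05", "Proficient", "Developing"),
]

def agreement_rate(records):
    """Fraction of papers on which the two independent readers agree exactly."""
    agree = sum(1 for _, a, b in records if a == b)
    return agree / len(records)

# Papers where the readers disagree go to a third reader or a discussion.
disagreements = [pid for pid, a, b in records if a != b]

print(agreement_rate(records))  # 0.6 -- 3 of 5 exact agreements
print(disagreements)            # ['paper_02', 'paper_05']
```

A low agreement rate during calibration signals that the rubric categories need sharper definitions or more exemplar discussion before the full reading proceeds.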

Orientation and Calibration

Closing the Loop

• Celebrate!

• Change pedagogy

• Change curriculum

• Change student support

• Change faculty support

Bringing It All Together

• Campus-wide conversations

• Institution-wide implications for faculty/staff development

• Quality-assurance process

• Reporting structure

• Implications for funding or infrastructure development

Some Friendly Suggestions

1. Focus on what is important.

2. Don’t try to do too much at once.

3. Take samples.

4. Pilot test procedures.

5. Use rubrics.

6. Close the loop.

7. Keep a written record.
