Minot State University : 500 University Avenue West : Minot, ND 58707
January 8 Liaison Meeting Agenda and Minutes
Present: Linda Cresap, Deb Jensen, Lori Willoughby, Mark Singer, Nicole Wald, Paul Markel,
Tim Morris, Robert Crackel, Charlie Young, Heidi Super, Holly Pedersen, Linda Pettersen, Erin
Holt, Lisa Borden-King, Nicole Thom-Arens, Gary Rabe, Jessica Smestad (chair of Academic
Assessment Committee), Laurie Geller (VPAA), Michael Brooks (director of academic
assessment)
Absent:
Agenda:
1. Questions and Comments
2. Follow-up on SLG development
3. Types of Assessment Tools
4. Example of SLG Development – Honors Program
5. Questions and Comments
6. Dismiss
Minutes:
Meeting opened with time for questions and comments. A question was raised about when SLOs would be due; a late February time frame was set so that both SLGs and SLOs could be printed in the 2019-20 catalog. The Director of Assessment gave brief feedback on SLG progress, with all programs making substantive progress. Three distinct resources were presented to liaisons, each describing specific kinds of assessment instruments that could be used to gather assessment data. The first resource distinguished tools in terms of the types of learning desired; the second distinguished the types of instruments that might be used based on level of learning; the third provided examples of direct and indirect instruments. Several questions followed regarding how to choose between types and why overlap existed between direct and indirect instruments. The meeting ended with an example of one program's (Honors) development of SLGs and SLOs, and liaisons were given the task of connecting SLOs to SLGs.
10/07/19
500 University Avenue W, Minot, ND 58707 701-858-3990 1-800-777-0750 MinotStateU.edu
Director of Academic Assessment
Assessment Instruments in Relation to Levels of Learning*
Choose the assessment instrument (assignment, exam, project, performance) based on the level of student learning you want to evaluate. Listed below are appropriate types of assessment instruments for different levels of learning. This is not a complete list of types of assessment instruments.

Levels of Learning and Definitions — Types of Assessment Instruments

Knowledge (remember previously learned information): exam/test using matching, multiple choice, fill-in-the-blank, short answer, and/or essay questions; presentation

Comprehension (demonstrate an understanding of the facts): exam/test using matching, multiple choice, fill-in-the-blank, short
Examples of Direct and Indirect Measures of Student Learning (Course, Program, and Institutional Levels)

Course

Direct Measures
• Course and homework assignments
• Examinations and quizzes
• Standardized tests
• Term papers and reports
• Observations of field work, internship performance, service learning, or clinical experiences
• Research projects
• Class discussion participation
• Rubric (a criterion-based rating scale) scores for writing, oral presentations, and performances
• Artistic performances and products
• Grades that are based on explicit criteria related to clear learning goals

Indirect Measures
• Course evaluations
• Test blueprints (outlines of the concepts and skills covered on tests)
• Percent of class time spent in active learning
• Number of student hours spent on service learning
• Number of student hours spent on homework
• Number of student hours spent at intellectual or cultural activities related to the course
• Grades that are not based on explicit criteria related to clear learning goals

Program

Direct Measures
• Capstone projects, senior theses, exhibits, or performances
• Pass rates or scores on licensure, certification, or subject area tests
• Student publications or conference presentations
• Employer and internship supervisor ratings of students' performance

Indirect Measures
• Focus group interviews with students, faculty members, or employers
• Registration or course enrollment information
• Department or program review data
• Job placement
• Employer or alumni surveys
• Student perception surveys
• Proportion of upper-level courses compared to the same program at other institutions
• Graduate school placement rates

Institutional

Direct Measures
• Performance on tests of writing, critical thinking, or general knowledge
• Rubric (criterion-based rating scale) scores for class assignments in General Education, interdisciplinary core courses, or other courses required of all students
• Performance on achievement tests
• Explicit self-reflections on what students have learned related to institutional programs such as service learning (e.g., asking students to name the three most important things they have learned in the program)

Indirect Measures
• Locally developed, commercial, or national surveys of student perceptions or self-reports of activities (e.g., National Survey of Student Engagement)
• Transcript studies that examine patterns and trends of course selection and grading
• Annual reports including institutional benchmarks, such as graduation and retention rates, grade point averages of graduates, etc.
Reproduced from Middle States Commission on Higher Education (2007). Student Learning Assessment: Options and Resources.
Examples of Indirect Measures of Student Learning
• Course grades provide information about student learning only indirectly, for several reasons: a) because they reflect performance or achievement in an individual class, grades do not indicate learning over a longer period than the duration of that class, or across different courses within a program; b) grading systems vary from class to class; and c) a grading system in one class may be applied inconsistently from student to student
• Grades assigned to student work in one particular course also provide indirect information about student learning, for the reasons above; moreover, graded student work in isolation, without an accompanying scoring rubric, conveys little about overall student performance or achievement in a class or a program
• Comparison between admission and graduation rates
• Number or rate of graduating students pursuing their education at the next level
• Reputation of graduate or post-graduate programs accepting graduating students
• Employment or placement rates of graduating students into appropriate career positions
• Course evaluation items related to the overall course or curriculum quality, rather than instructor effectiveness
• Number or rate of students involved in faculty research, collaborative publications and/or presentations, service learning, or
extension of learning in the larger community
• Surveys, questionnaires, open-ended self-reports, focus-group or individual interviews dealing with current students’ perception
of their own learning
• Surveys, questionnaires, focus-group or individual interviews dealing with alumni’s perception of their own learning or of their
current career satisfaction (which relies on their effectiveness in the workplace, influenced by the knowledge, skills, and/or
dispositions developed in school)
• Surveys, questionnaires, focus-group or individual interviews dealing with the faculty and staff members’ perception of student
learning as supported by the programs and services provided to students
• Quantitative data, such as enrollment numbers
• Honors, awards, scholarships, and other forms of public recognition earned by students and alumni
*Reproduced from Cleveland State University, which adapted it from Maki, P. L. (2004). Assessing for Learning: Building a Sustainable Commitment Across the Institution. Sterling, VA: AAHE; and Suskie, L. (2004). Assessing Student Learning: A Common Sense Guide. San Francisco, CA: Anker Publishing Company, Inc.