2015-2016 Annual Report to College Senate Academic Program Assessment Committee
May 4, 2016

The Academic Program Assessment Committee was active in 2015-2016; this report provides a brief account of our activities. APAC's charge is to oversee and facilitate academic program assessment planning and to assist academic departments and programs in undertaking programmatic assessment and reporting. Additionally, much of APAC's work this year addressed the Middle States Commission on Higher Education accreditation standards (with emphasis on Standards V and VI) on program improvement and institutional effectiveness. In carrying out its charge, the Committee undertook the following activities:

Fall 2015
• Established a shared drive for APAC
• Discussed Student Opinion Survey results
• APAC chair attended the "Becoming an Assessment Facilitator" workshop with the IAC and GEAC chairs
• Discussed the APAC feedback process
  o Determined that the current annual reporting cycle is still confusing to some, but that the timeline works in favor of getting better feedback to departments within the academic calendar year
  o Determined that more face-to-face interactions are beneficial to the process
• Revised the APAC Guidelines documents
  o Consulted with the Graduate Committee and added language that clarifies the differences between graduate and undergraduate assessment practices
  o Added language and a graphic that demonstrate how Academic Program Assessment Plans connect to the new College Strategic Plan
  o Added language to clarify requirements for distance learning
  o Updated plan and report checklists
  o See the appendix titled 'APAC Guidelines' for documentation of completed work in this area
• Reviewed APAC 2014-17 plans (late submissions) and annual reports (late submissions) for all academic departments. All plans and reports were reviewed by two-person peer-review workgroups, the APAC chair, and the Associate Provost for Institutional Effectiveness.
  o The Committee discussed at length the need to reach out to individual departments that are struggling to complete assessment in a timely manner, including face-to-face meetings between department chairs and APAC faculty committee members.
  o See the appendices titled 'APAC Status Report (PLANS)' and 'APAC Status Report (REPORTS) 2014-15' for documentation of completed work in this area
Spring 2016
• The committee discussed membership, expiring terms, and preparation for a new chair for Fall 2016
  o Brenda Seery agreed to stay on APAC and serve a second term
  o John Schaumloffel agreed to stay on APAC and serve a second term
  o No new members need to be solicited for the upcoming year
  o Joshua Palmatier agreed to chair APAC beginning in Fall 2016
• The committee created a sample plan and report for the Defense of the Dark Arts Department at Hogwarts, thanks to an extensive effort by Julia Blau
  o The sample plan and report are currently being reviewed by Associate Provost Wade Thomas and Dean Susan Turell
  o The sample plan and report will be posted on the OIAE website in the fall and sent out in the Annual Report reminder memo along with the APAC Guidelines document
  o See the appendices titled 'Hogwarts Assessment Report' and 'Hogwarts Assessment Plan' for documentation of completed work in this area
• Gathered data and created a spreadsheet that documents the status of current assessment practices and compliance at SUNY for the Audit of Assessment, in preparation for the next Middle States review
• Discussed APAC expectations for programs that have external accreditation (including Business, Human Ecology, Theatre, and Music)
  o Reviewed the SUNY memo to Presidents regarding assessment
    § Section II.C, on the evaluation of academic programs, indicates that "programmatic accreditation by an accrediting body recognized by CHEA (Council for Higher Education Accreditation), or the U.S. Secretary of Education, that includes the assessment of student learning satisfies the SUNY expectation for academic program evaluation."
  o APAC members agreed that all accredited programs are required to submit annual reports to APAC that reflect ongoing data collection
    § These annual reports do not necessarily have to follow the APAC reporting guidelines, but they do need to demonstrate the assessment of SLOs for that year.
• Reviewed annual reports for all academic departments (for data collected during calendar year 2015). All reports were reviewed by two-person peer-review workgroups, the APAC chair, and the Associate Provost for Institutional Effectiveness.
  o The annual reporting cycle is as follows:
    § March 1 – Annual APAC Reports are due to APAC via the Office of Institutional Assessment and Effectiveness
    § April 1 – APAC members submit feedback to deans and department chairs
    § May 1 – Final approval and feedback from deans to department chairs and APAC via the Office of Institutional Assessment and Effectiveness
  o See the appendix titled 'APAC Status Report (REPORTS) 2015-16' for documentation of completed work in this area
Items for discussion next year:
• Continue the discussion about the audit of assessment for the next Middle States visit
• Complete/update the audit of assessment spreadsheet
• Set a meeting with the Deans' Council to discuss their role in reviewing APAC reports and plans
  o Work together to create a template for Dean feedback, or questions for Deans to consider when collecting data for their school's academic program assessment
• Complete and post the sample report and plan, along with completed checklists and Dean comments for both
• Create a statement for the guidelines document that clarifies the annual reporting requirements for accredited programs (with reference to the SUNY memo to Presidents)
2015-2016 Committee Membership

The committee shall be composed of five Faculty members approved by the College Senate and four additional Faculty members appointed by the Provost. Each of the five Senate-approved members shall represent a different one of the five academic schools, and shall be nominated by the Presiding Officer of the Faculty after consultation with the Provost and the College Senate Steering Committee. The Associate Provost for Institutional Effectiveness shall serve as an ex-officio member of the Committee.

Senate Appointees
School | Member Name & Department | End of Term | Term Number
Arts and Humanities | Julie Licata (Music) | 2017 | 2
Economics and Business | Jing Yang (Management, Marketing
Social Science | Miguel Leon (History) | 2018 | 1

Provost Appointees
School | Member Name & Department | End of Term | Term Number
Natural and Mathematical Sciences | John Schaumloffel (Chemistry & Biochemistry) | 2018 | 2
Arts and Humanities | Daniel Nahson (Foreign Languages & Literatures) | 2017 | 1
Social Science | Julia Blau (Psychology) | 2018 | 1
Education and Human Ecology |
Introduction

The Academic Program Assessment Committee (APAC) was established by the Provost and College Senate to facilitate academic program assessment. The committee consists of nine faculty members: the College Senate elects five as representatives, one from each school; the Provost appoints the remaining four members. APAC assists faculty and academic departments in applying best-practice principles to procure meaningful assessment data in the most efficient manner. APAC regards faculty in departments and programs as experts in their fields who are best able to determine meaningful educational experiences for students and are in the best position to assess the impacts of those experiences.

The guidance from APAC is designed to assist the institution in meeting the Middle States Commission on Higher Education Standards for Accreditation and Requirements for Affiliation.1 The SUNY Board of Trustees resolved in March 2010 that all campuses must have in place assessment plans that meet or exceed Middle States standards or those of specialized accreditors. All learning experiences, regardless of modality (such as distance education), program pace/schedule, level, and setting are to be consistent with higher education expectations.2

This document guides members of all academic programs through planning and assessing in a collaborative, inclusive, and participatory process. It encourages alignment with SUNY's Master Plan (a document revisited every four years, as required by NYS Education Law section 354) as well as local college plans. College leaders (e.g., vice presidents, deans) should communicate these comprehensive expectations to academic programs to build and sustain understanding and to advance interactions and cooperative efforts among divisions.
As conditions change, these guidelines and periodic peer reviews are intended to advance the "consideration and use of assessment results for improvement of educational effectiveness."3 Advice and assistance are available upon request from the Academic Program Assessment Committee (APAC) via its representatives from each academic division.
1 Notably Standard V, Educational Effectiveness: Assessment of student learning and achievement demonstrates that the institution's students have accomplished educational goals consistent with their program of study, degree level, the institution's mission, and appropriate expectations for institutions of higher education. (MSCHE Standards for Accreditation and Requirements of Affiliation, thirteenth edition, 2014.)
2 MSCHE Standard III.
3 MSCHE Standard V, 3.
Alignment

Assessment plans and processes are related to the annual reports required of each academic unit at the end of the year and to the programmatic reviews that must be conducted at least every seven years. Assessment plans provide content for the annual report, helping academic programs describe goals and objectives for the year as they relate to overall campus direction and summarize major accomplishments as well as challenges. These complementary documents help academic programs plan ahead and enable them to use feedback to justify adaptations and change. Assessment also guides strategic planning, resource planning, and sustained improvement.
[Graphic: alignment of planning documents]
• SUNY Plans (e.g., Rethinking SUNY, Power of SUNY, SUNY Excels)
• SUNY Oneonta Strategic Plan
• SUNY Oneonta Academic Master Plan
• Program Plan and Review (self-study at 7-year intervals, or according to the approved specialized accreditation review cycle)
Assessment Cycle Timelines and Reporting Deadlines

Long-term Academic Program Assessment Timeline
Fall 2009-Spring 2011 – First APAC plans created
Fall 2011-Spring 2014 – Implemented the first 3-year cycle
Fall 2014 – Created new 3-year assessment plans/timelines, with revisions as necessary
Fall 2014-Spring 2017 – Implementation of the 2nd 3-year cycle
Fall 2017 – Create new 3-year assessment plans/timelines, with revisions as necessary
Fall 2017-Spring 2020 – Implementation of the 3rd 3-year cycle
Fall 2020 – Create new 3-year assessment plans/timelines, with revisions as necessary
Annual Reporting Timeline
March 1 – Annual APAC Reports are due to APAC via the Office of Institutional Assessment and Effectiveness
April 1 – APAC members submit feedback to Deans and Department Chairs (APAC members will schedule face-to-face meetings with departments that receive low rankings before forwarding their recommendation to the Dean)
May 1 – Final approval and feedback from Deans to Department Chairs and APAC via the Office of Institutional Assessment and Effectiveness
New 3-year Assessment Deadlines (2014, 2017, 2020, 2023, 2026, etc.)
October 15 – New or updated 3-year assessment plans/timelines are due to APAC via the Office of Institutional Assessment and Effectiveness
November 15 – APAC members submit feedback to Deans and Department Chairs (APAC members will schedule face-to-face meetings with departments that receive low rankings before forwarding their recommendation to the Dean)
December 15 – Final approval and feedback from Deans to Department Chairs and APAC via the Office of Institutional Assessment and Effectiveness
Developing an Assessment Plan in Four Steps
(For undergraduate and graduate programs)

The Academic Program Assessment Plan establishes the knowledge base, skills, behaviors, and perhaps even attitudes that students of an academic program can be expected to exhibit, hold, master, or demonstrate upon graduation. Each plan should also address how the content and design of the program's curriculum lead to students' achievement of program expectations, how the program has assessed the effectiveness of its curriculum, and how it has used assessment information to improve the academic program.

Mission, Goals, and Objectives

Each academic department should have a clear mission statement that is publicly disseminated and aligned with the College mission. Departments should have program goals that provide a focus for faculty, administrators, and other constituencies on intentions, purposes, and delivery.

Distinguishing Among Goals, Objectives, and Outcomes4
Goals:
• General intentions/purposes that are broad, long-range in scope, and not changing over the planning horizon
• May use words or phrases directly out of the unit mission statement
• Not directly measurable
• Often a "process" statement (i.e., beginning with verbs such as establish, provide, enhance)

Objectives:
• Specific and measurable, based on measures of expected outcomes
• Typically there are multiple objectives for each overall goal
• Often a change-oriented statement that shows directionality, such as moving up/down or maintaining high/low levels when a ceiling/floor exists (i.e., including words such as increase, decrease, improve, maintain)

Outcomes:
• Very specific statements that translate into assessable measures
• Expected outcomes refer to anticipated results and include criteria for determining success
• Actual outcomes refer to the actual results of the assessment
4 The College uses a common operational language and definitions for outcomes assessment.
Step I. Establishing Objectives: "What knowledge and competencies do we expect students to gain from our program?"
(See Appendix B: APAC Plan Checklist, Step 1)

Faculty members should arrive at a consensus around the desired student learning outcomes associated with the programmatic objectives of their discipline, as well as what it means to be aligned with institutional expectations regarding students' intellectual growth. The following question assists in developing the consensus: "What difference do we intend to make in our students as a result of their experiences with us and our curriculum with respect to knowledge, behaviors, skills, and attitudes?" Faculty should:
• Examine and review existing program objectives.
• Elicit and discuss faculty members' perceptions of program objectives (both actual and aspirational).
• Analyze and compare program objectives with stated institutional expectations regarding students' intellectual growth; the College's mission and strategic plan; the Academic Master Plan; programmatic objectives at comparable peer or aspirant institutions; expectations expressed by the field at large (e.g., as determined by examination of current textbooks and communication with national organizations in the discipline); criteria and standards of certification and accreditation agencies and/or national associations in the discipline, if applicable; and results from the most recent program review.
• Make the objectives understandable to students.

The assessment plan should include approximately 4-8 Student Learning Outcomes (SLOs). Note that creating too many learning objectives will make assessment formidable and threaten its success.

Undergraduate programs may focus on objectives that cover:
• Demonstration of knowledge from different areas of subject matter
• Demonstration of writing and presentation skills
• Demonstration of synthesis of various theories
• Demonstration of analytical/critical thinking skills
• Demonstration of research skills and/or original thought
Graduate programs may find it useful to focus on broader objectives such as:5
• Demonstration that students develop as professionals in the field
• Demonstration of mastery of the research skills of the discipline
5 Baker, Marilyn J., Assessment and review of graduate programs: A policy statement. Washington, D.C.: Council of Graduate Schools, 2011, p. 30.
Step II. Activities and Strategies: "How do courses and other experiences built into the curriculum relate to each other and contribute to programmatic goals?"
(See Appendix C: APAC Plan Checklist, Step 2)

Faculty should review all activities that are aimed at accomplishing programmatic objectives. First and foremost, this step requires a focus on a different question: "Do we offer activities and experiences in our curriculum that make it possible for students to achieve programmatic objectives?" In addition, it is important that faculty members reach consensus on the rationale for individual courses, program requirements, and program structure when undertaking this step. (See Appendix E: Sample Curriculum Maps) In attempting to accomplish this step, faculty should consider the following actions:
• Determine the extent to which program objectives are embedded in specific courses and make adjustments as appropriate (e.g., strengthening the coverage of objectives that are not sufficiently addressed, de-emphasizing objectives that are covered excessively).
• Review and analyze curricular coherence, focusing on the role individual courses are intended to serve, the rationale for all program requirements (including distribution requirements in the major and cognates), and rationales for prerequisites.
• As appropriate, review program components that serve different purposes in the curriculum (i.e., major, minor, concentration, service courses).
• Determine strategies for assuring comparability of multiple sections of the same course with respect to programmatic objectives.
• Examine the relationship of the program to other College requirements (e.g., General Education).
• Determine that curricula delivered by distance education are "coherent, cohesive, and comparable in academic rigor to programs offered in traditional instructional formats."6
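A curriculum map of the kind referenced above is, at bottom, a table of courses against SLOs. The following is a minimal, illustrative sketch only; the course numbers and SLO labels are hypothetical, not taken from the Guidelines. It shows one way a department might tabulate which courses address which objectives and flag any objective that no course covers.

```python
# Illustrative curriculum map: each course maps to the set of SLOs it addresses.
# Course numbers and SLO labels below are hypothetical examples.
curriculum_map = {
    "BIOL 100": {"SLO1", "SLO2"},
    "BIOL 201": {"SLO2", "SLO3"},
    "BIOL 390": {"SLO3", "SLO4"},
    "BIOL 499 (capstone)": {"SLO1", "SLO3", "SLO4"},
}

program_slos = {"SLO1", "SLO2", "SLO3", "SLO4", "SLO5"}

# Invert the map: for each SLO, list the courses that address it.
coverage = {slo: [c for c, slos in curriculum_map.items() if slo in slos]
            for slo in sorted(program_slos)}

for slo, courses in coverage.items():
    if courses:
        print(f"{slo}: covered in {', '.join(courses)}")
    else:
        print(f"{slo}: NOT covered -- strengthen coverage or revisit the SLO")
```

A gap flagged this way (here, the hypothetical SLO5) is exactly the signal for "strengthening the coverage of objectives that are not sufficiently addressed."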
Step III. Assessment: "How do we know students are achieving programmatic goals?"
(See Appendix D: APAC Plan Checklist, Full Plan)

Collect information that will provide direct feedback regarding the effectiveness of a program in terms of its stated learning objectives. Implementation involves first asking the question: "What evidence do we have to demonstrate whether students are meeting our expectations for their learning?" Each department should have clear expectations about what constitutes good assessment practice and should have strategies in place to help faculty develop or acquire effective tools for assessing learning outcomes. Faculty members – especially those teaching different sections of the same course – should be encouraged to use comparable methods for assessing student learning outcomes. Relying primarily on course-embedded assessment can be the least time- and labor-intensive approach, is sometimes the most economical, and helps assure student motivation to do well.
6 MSCHE “Distance Education Programs: Interregional Guidelines for Evaluation of Distance Education,” 2011, p. 9.
All departments need to collect and compile student data that are relevant to each programmatic objective. These tasks could be assigned to either an individual or a group (e.g., a departmental assessment committee). (See Appendix F: Sample Aggregated Data)

It is important to note that there are differences between undergraduate and graduate education, and in terms of assessment those differences are most likely reflected in the assessment tools. In research-based graduate programs, a larger portion of student learning takes place outside of the classroom than in undergraduate programs. Therefore, graduate program assessment is seldom as course-based as undergraduate assessment may be.7 Graduate programs may determine that there are many acceptable tools for measuring outcomes outside of the classroom. Some such tools are:8

• Graduate placement information
• Evaluation rubrics from preliminary exams and final defenses
• Number of student publications
• Results of certain exit interview questions
• Surveys of recent graduates
• Updated student CVs
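As one minimal sketch of the compilation described above, a department might tally per-student ratings for a single SLO into the kind of summary counts that go into an aggregated-data table. The ratings and category names below are hypothetical examples, not a prescribed format; programs should use their own rubric categories.

```python
# Illustrative sketch: aggregate per-student ratings for one SLO into
# summary counts. Ratings and category names are hypothetical examples.
from collections import Counter

# One rating per student for a single SLO, collected across course sections.
ratings = ["exceeds", "meets", "meets", "approaches", "meets",
           "does not meet", "exceeds", "meets"]

summary = Counter(ratings)
total = len(ratings)

for category in ["exceeds", "meets", "approaches", "does not meet"]:
    count = summary.get(category, 0)
    print(f"{category}: {count} of {total} students ({100 * count / total:.0f}%)")
```

Reporting counts alongside the total number of students evaluated (as the Annual Report checklist asks) keeps percentages interpretable even when section sizes vary.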
Departments and academic programs should:

• Establish expectations for the measures being used to assess student performance, relying on the existing literature on good assessment practices to assure valid, reliable, and representative data.
• Focus on at most 3-4 student learning outcomes each year.
• Encourage faculty to use a wide variety of quantitative and qualitative measures to assess student performance, including senior thesis/research projects, student portfolios, pre- and post-assessments within courses, departmentally generated exams, standardized tests, oral proficiency exams, and student teaching or internship evaluations.
• Leave the final selection of measures to be administered in a course-embedded fashion up to the faculty members teaching the course.
• Examine program effectiveness through comparisons with information provided by other programs or other groups of interest (e.g., certification agencies, national organizations in the discipline).
• Consider capstone courses as a good place to collect outcomes assessment data.
• Evaluate student perceptions of the program through strategies such as senior exit interviews and alumni surveys.
• Ensure that assessments of student learning in distance education courses and programs follow the processes used in onsite courses or programs, reflect good practice in assessment methods, and are amply supported by analysis and evidence.9
7 Baker, Marilyn J., Assessment and review of graduate programs: A policy statement. Washington, D.C.: Council of Graduate Schools, 2011, p. 36.
8 Baker, Marilyn J., Assessment and review of graduate programs: A policy statement. Washington, D.C.: Council of Graduate Schools, 2011, pp. 31-32.
9 MSCHE "Distance Education Programs: Interregional Guidelines for Evaluation of Distance Education," 2011, p. 10.
Step IV. Closing the Loop: "How can we use assessment of student learning to improve our program?"
(See Appendix D: APAC Plan Checklist, Full Plan)

The assessment process now provides the opportunity to compare expected outcomes with actual outcomes relative to objectives and activities. This final step asks: "What are we doing well, what could we do better, and how can we improve?" Faculty in the program must review assessment data and discuss findings with each other and perhaps with other stakeholders. Decisions should then be made on the continuation of activities that lead to the realization of program objectives and the discontinuation or revision of activities that do not. It is also possible that the assessment process may lead to the revision or elimination of old objectives and/or the development of new ones. Faculty should consider the following actions:
• Provide aggregate data to faculty for review and discussion (individual faculty data should never be shared with other faculty members).
• Reach conclusions regarding program effectiveness for each learning objective, identifying both strengths and weaknesses revealed through the assessment.
• Offer recommendations for changes in curriculum and teaching as appropriate.
• Develop a new statement of departmental objectives for the next assessment round, as appropriate.
Guidelines for Writing Annual APAC Reports

When writing the Annual APAC Report, departments should reference Appendix A: Annual Report Checklist, and provide the following (preferably in the following order):
• The entire 3-year data collection plan timeline (for reference, indicating when each SLO is assessed)
• A summary of how the department participated in assessment within the last two semesters (due to the change in the reporting cycle, this should reflect the previous calendar year)
• Aggregated data for each SLO assessed within the last two semesters
• A narrative describing what the data for each SLO reveal
• A summary of departmental reflections on what the data reveal (positive and negative)
  o For this area, DO NOT be afraid to report 'negative' results; remember that the data are not tied to individual instructors
• A summary of planned curricular or other program-related changes, and justification for the changes
  o Be sure to reference both the old and the new, so the changes are clear to someone not completely familiar with your plan
  o Include justification for the changes (drawn from departmental reflections)
  o Include a timeline for implementing the changes
• A summary of changes that need to happen beyond the department to accommodate the department's needs
  o It is very possible that the data show that students are not meeting expectations in certain areas; if this happens, consider all possible reasons. Are class sizes too large? Is lab equipment out of date?
NOTE: APAC members are available to assist departments at any point during the process of creating a plan, or writing an annual report. If you do not know who the current members are, please contact the Office of Institutional Assessment and Effectiveness or consult the Senate website.
Programmatic Use of Academic Program Assessment at SUNY Oneonta

The development and implementation of assessment plans can easily and advantageously be incorporated into existing planning, evaluation, and reporting requirements, in particular the Annual Reporting process, which is specific to the College, and the Self-Study Process, which is part of SUNY's Assessment of the Major and is required by external accrediting agencies.
Each academic program at the College is required to provide an Annual Report at the end of the academic year, and to use that report to develop plans for the subsequent academic year. According to the guidelines for preparing the Annual Report, academic programs are asked to describe the outcomes of student learning assessments conducted during the year, to include “a summary of the assessment methods that were used and the results, including benchmarking as appropriate. Also, whenever possible, detail course-specific student learning outcomes and their relationship(s) to departmental or programmatic expectations.” Programs ought to summarize other major accomplishments that took place during that year.
The Self-Study Process at SUNY Oneonta takes place for most academic programs on a 7-year schedule as required by SUNY System Administration. This process is comprehensive in nature and focuses on a wide range of issues and questions of interest to the program (e.g., facilities, faculty workload and credentials, resources, faculty presentations and publications, student awards, student enrollment, as well as student retention and graduation rates). As part of this process, according to SUNY guidelines, each academic program must include as part of its self-study the assessment of student learning.
If organized and managed appropriately, these processes all contribute to a single, important goal: enabling an academic program to plan, assess, and document its efforts on an ongoing basis. To be specific, the Academic Program Assessment Plan delineates the specific student learning objectives a program intends to assess in a given year, since it is not necessary for programs to assess all objectives every year. Annual Assessment Reports are also to be included in the departments' Annual Report, presenting the outcomes of student learning assessments and other accomplishments, and stating which student learning outcomes are scheduled for the following year. Finally, assessment results and evidence of program improvement are part of the program review self-study.
Institutional Use of Academic Program Assessment at SUNY Oneonta

APAC intends to pave the way for greater institutional effectiveness. These guidelines are intended to raise the importance and visibility of Academic Program Assessment at the administrative/executive levels: Provost, AMP, resource allocation, SP, President's Executive Council, and the President's Cabinet. Assessment should be utilized in institutional planning (AMP, SP, resource allocation, etc.). In order for that to happen, departments must articulate the data collected in programmatic assessment to indicate and justify their needs in their Annual APAC Reports, and Deans should forward relevant concerns.
2 = approaches expectations: no resubmission needed; approval recommended after addressing suggested changes
1 = does not meet expectations: resubmission required
NA = Not Applicable
Item from APAC Guidelines for Programmatic Assessment | Score | Comments from APAC
Context
The report includes the complete 3-year assessment plan, including a list of all SLOs and the timetable for when each SLO will be assessed.
All changes/amendments to the current assessment plan have been clearly documented, including a statement of the reason for making the changes/amendments, and it is clear that all changes/amendments are consistent with the integrity of the current plan.
Current Year Reporting
The report notes how each SLO is measured—within a course or courses, or using an external measure—and describes how the specific assignments, items within assignments, or other measures are used.
The performance criteria are clearly defined. For example, definitions are given for terms such as “exceeding expectations,” “meeting expectations,” and “approaching expectations,” using language that allows a reader unfamiliar with the discipline to understand the expectations for acceptable performance.
The summary chart provided is complete and includes the total number of students evaluated. The accompanying narrative notes trends or variations in performance as applicable.
Data is of sufficient quality and comparability to allow for meaningful discussion of results.
Results are measured against external and/or longitudinal internal benchmarks, if applicable.
Student perceptions—based on interview or survey results—are discussed, if used.
Evidence is provided that data were used to inform reflection on the program, including discussion(s) by faculty. “Next steps” are noted. Evidence might include influence on curricular decisions, program design, or budget requests. Language must clearly indicate where decisions were influenced by data (even if no change occurred).
Overall Evaluation
Revised December 2015
Appendix B: APAC 3-year Assessment Plan Checklists, Step 1
Item from Guidelines | Y/N | Comments
There are a reasonable number of SLOs.
Collectively, the SLOs are appropriate in scope.
Each SLO covers a distinct competency and is measurable.
The process for arriving at the SLOs is described.
The SLOs are connected to the College Mission Statement, to the goals of the programmatic field/discipline, or to certification agencies/national associations as appropriate.
Revised December 2015
Appendix C: APAC 3-year Assessment Plan Checklists, Step 2
SUNY College at Oneonta APAC 3-Year Assessment Plan, Step 2 Checklist
Item from Guidelines | Y/N | Comments
The process for developing the curriculum map is described.
The document includes a table or tables mapping the SLOs to specific courses.
The curriculum map demonstrates the program is addressing all SLOs.
The narrative provides a rationale for the program’s requirements.
The narrative describes how the program’s courses relate to other programs and the College’s general education requirements.
The narrative describes the steps taken to ensure that SLOs are being consistently met in courses with multiple sections.
Appendix D: APAC 3-year Assessment Plan Checklists, Full Plan
APAC Program Assessment Plan Checklist, Full Plan
Program: _______________________________________ Reviewers: _________________________________
Item from APAC Guidelines for Programmatic Assessment | Y/N | Comments from APAC
The plan includes a contextual narrative that describes the process by which assessment measurements were selected and approved by the faculty.
The plan includes a timetable for the assessment of each SLO during a three-year period.
The plan:
o describes the various methods (qualitative and quantitative) to be used for assessing each SLO;
o describes where in the program each SLO is to be measured and assessed;
o provides assurances that each SLO will be mapped to specific assignments, items within assignments, or other measures, and not to overall course grades.
The plan indicates the benchmarks to be used to help assess program effectiveness (programs at other colleges, related national organizations, professional certification agencies, etc.), if applicable.
The process and method(s) of the student assessment are described, if used.
The plan describes a process for reviewing the results of the assessment process and incorporating them into curriculum, teaching, departmental objectives, and future assessment planning as appropriate.
Revised December 2015
Appendix E: Sample Curriculum Maps
The simplest case: Which courses cover which SLOs?
Introductory Course: X X
History/Theories: X X X
Methods: X X X
Required Course 1: X X X X
Required Course 2: X X X
Required Course 3: X X X
Required Course 4: X X
Capstone: X X X X X
COURSE
SLOs 1 2 3 4 5 6 7
Introductory Course: I I
History/Theories: I R R
Methods: I R R
Required Course 1: R R R R
Required Course 2: R M M
Required Course 3: R R M
Required Course 4: M M
Capstone: M M M M M
Introductory Course: E E
History/Theories: E P P
Methods: E, L E, L L
Required Course 1: E E E, P P
Required Course 2: P P P
Required Course 3: E E P
Required Course 4: I, PO I, PO
Capstone: PO PO PO PO PO
Assessment Key: P = Paper; E = Exam; PO = Portfolio; O = Oral Presentation; L = Lab Assignment; I = Internship
Appendix F: Samples of Aggregated Data
Simplest case (all that is required)

Courses | Assessment Measure(s) | Performance Criteria | # of Students | % Exceeding Standards | % Meeting Standards | % Approaching Standards | % Not Meeting Standards
All | Portfolio | 4-5 = Exceeding; 3 = Meeting; 2 = Approaching; 1 = Not Meeting | 338 | 18% | 59% | 13% | 10%
By Course Level

Courses | Assessment Measure(s) | Performance Criteria | # of Students | % Exceeding Standards | % Meeting Standards | % Approaching Standards | % Not Meeting Standards
All | Portfolio | 4-5 = Exceeding; 3 = Meeting; 2 = Approaching; 1 = Not Meeting | 338 | 18% | 59% | 13% | 10%
By Competency Level

Courses | Assessment Measure(s) | Performance Criteria | # of Students | % Exceeding Standards | % Meeting Standards | % Approaching Standards | % Not Meeting Standards
All | Portfolio | 4-5 = Exceeding; 3 = Meeting; 2 = Approaching; 1 = Not Meeting | 338 | 18% | 59% | 13% | 10%
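The banding arithmetic behind these sample tables (raw 1-5 portfolio scores rolled up into the four performance bands) could be sketched as follows. This is purely illustrative: the `aggregate` function and the toy score list are invented for the example and are not part of the APAC guidelines.

```python
# Illustrative sketch only: roll raw 1-5 rubric scores up into the four
# performance bands used in the sample tables above. The function name and
# the toy score list are invented for this example.
from collections import Counter

BANDS = {5: "Exceeding", 4: "Exceeding", 3: "Meeting",
         2: "Approaching", 1: "Not Meeting"}

def aggregate(scores):
    """Return each band's share of students, as whole percentages."""
    counts = Counter(BANDS[s] for s in scores)
    n = len(scores)
    return {band: round(100 * counts[band] / n)
            for band in ("Exceeding", "Meeting", "Approaching", "Not Meeting")}

# Ten hypothetical portfolio scores (not the real n = 338 data):
summary = aggregate([5, 4, 3, 3, 3, 2, 1, 3, 4, 3])
```

Run on a department's real score list, a routine like this would yield the row of percentages reported in the summary chart.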
APAC STATUS REPORT (PLANS) (teams revised 9/11/15)
IMPLEMENTATION STATUS 2014-2017

Department | Review Team | Received (with revisions/updates) | To Team | Team Review | To Dean | Dean Action | Dean Action Date | Comments

School of Arts & Humanities
Art & Computer Art | JY & CD | 11/20/14 | 11/24/2014; resent 3/4/15 | 3/16/15 | 3/17/15 | approved | 5/1/15 |
English | JY & CD | 10/29/2014; resub. 1/19/15 | 11/14/2014; 1/19/15 | 12/2/14 | 12/18/2014; resub. 1/19/15 | *agree w/APAC; approved | 2/12/2015; 5/1/15 | *revisions necessary; rev. recd. 1/19/15
Foreign Languages & Literatures | JP & ML | 11/5/2014; resub. 3/25/15 | 11/14/14 | 12/2/14 | 12/18/2014; resub. 3/26/15 | *agree w/APAC; approved | 2/12/2015; 5/1/15 | *revisions necessary; rev. recd. 3/25/15
Music & Music Industry | JS & JB | 10/29/14 | 11/14/14 | 2/16/15 | 2/17/15 | approved | 5/1/15 |
Philosophy | JY & CD | 9/28/15; resub. *11/4/15; **resub. 2/8/16 | 11/4/2015 | 12/7/15 | 12/9/15 | approved | 2/8/16 | *Will resubmit by 11/1/15; **team to reeval.
Theatre | JY & CD | 12/22/14 | 12/22/14 | 1/5/15 | 1/7/15 | *agree w/APAC; approved | 2/12/2015; 5/1/15 | *revisions necessary

School of Economics & Business
Economics & Business Division | NA | *6/14/13 | | | | | | *Have AACSB accreditation

School of Education & Human Ecology
Education Division | NA | NA | NA | NA | NA | NA | | *Have NCATE Accreditation

School of Social Sciences
Anthropology | JP & ML | 11/21/2014; rev. 3/9/15 | 11/24/14 | 2/16/15 | 2/17/2015; rev. 3/11/15 | see comments to chair | 2/22/15 |
Communication Arts | JP & ML | 11/3/2014; rev. 2/18/15 & 3/5/15 | 11/14/14 | 12/2/14 | 12/18/2014; resub. 2/18/15 & 3/5/15 | see comments to chair | 2/22/15; 12/22/14 | Proposing to integrate 2 existing majors into one w/new SLOs
Cooperstown Graduate Program | BS & DN | 4/8/15 | 4/9/2015; resent 3/2/16 | 12/21/15 (submitted incorrect form); 4/14/16 (correct form) | 4/14/16 | approved | 4/21/16 | Overhauling curriculum & developing new science museum studies program
Environmental Sustainability | * | 6/4/15 | 6/5/15 | 9/7/15 | 6/4/15 | approved | 6/4/15 | *New major eff. Fall 2016
Geography | BS & DN | 1/27/15 | 1/28/15 | 2/23/15 | 2/23/15 | see comments to chair | 10/2/15 |
History | BS & DN | 11/20/2014; revised 10/30/15 | 11/24/14 | 12/12/14 | 12/18/2014; 11/3/15 | see comments to chair | 12/22/2014; 11/3/15 rev. approved |
Political Science | BS & DN | 3/23/2015; rev. 6/2/15 | 3/23/15 | 3/31/15 | 3/31/2015; resub. 6/2/15 | see comments to chair | 4/8/2015; rev. approved 6/8/15 |
Psychology | JP & ML | 10/31/14 | 11/14/14 | 12/2/14 | 12/18/14 | see comments to chair | 2/10/15 |
Sociology | BS & DN | 10/31/14 | 12/2/14 | 12/2/14 | 12/18/14 | see comments to chair | 12/22/14 |

Last updated: 4/21/16. (Legend: No Report; Still out for review.)
APAC STATUS REPORT (ANNUAL REPORTS) (teams revised 9/11/15)
IMPLEMENTATION STATUS 2014-2015

Department | Review Team | Report Received | To Team | Team Review/Score | To Dean | Dean Action | Dean Action Date | Comments

School of Arts & Humanities
Art & Computer Art | JY & CD | 2/27/15 | 3/4/15 | 3/16/2015; 2 | 3/17/15 | approved | 5/1/15 |
English | JY & CD | 3/19/15 | 3/19/15 | 4/1/15; 3 | 4/1/15 | approved | 5/1/15 |
Foreign Languages & Literatures | JP & ML | 9/29/15 | 9/29/15 | 10/21/15; 3 | 10/21/15 | approved | 10/23/15 |
Music | JS & JB | 9/22/15 | 9/22/15 | 3/9/16; 3 | 3/9/16 | approved | 3/9/16 |
Music Industry | JS & JB | 9/22/2015; rev. 3/10/16 | 9/22/2015; 3/10/16 | 3/9/16; 1 | 3/9/16 | to submit report | 3/9/16 |
Philosophy | JY & CD | will submit asap; dean's approval of plan sent to dept. chair 3/21/16
Theatre | JY & CD | 4/11/16 | | | | | | *will not be submitting an APAC report because Provost indicated NAST accreditation waives need for APAC report; they will still submit their NAST report in lieu of APAC report

School of Economics & Business
Economics & Business Division | NA | | | | | | | *Have AACSB accreditation

School of Education & Human Ecology
Education Division | NA | NA | NA | NA | NA | NA | | *Have NCATE Accreditation
Human Ecology-General | JP & ML | 4/20/15 | 4/20/15 | 4/27/15; 3 | 4/27/15 | approved | 4/27/15 |
Human Ecology-Food Services | JP & ML | 10/7/15 | 10/7/15 | 10/21/15; 3 | | | | submitted; no data available for 2014-2015 per chair

School of Social Sciences
Anthropology | JP & ML | 9/23/15 | *9/24/2015 | 3/29/2016; 3 | 3/30/16 | approved | 4/5/16 |
Communication & Media | JP & ML | 3/22/16 | 3/22/16 | 3/29/2016; 2 | 3/29/16 | approved | 4/19/16 | see dean's comments to chair
Cooperstown Graduate Program | *BS & DN | New plan developed 2015 | | | | | | 2015: overhauling curriculum and developing new science museum studies program
Environmental Sustainability | * | | | | | | | *New major eff. Fall 2016
Geography | BS & DN | 3/9/15 | 3/9/15 | 3/25/15; 3 | 3/25/15 | approved | 3/29/15 | see dean's comments
History | BS & DN | 2/27/15 | 3/4/15 | 3/23/2015; 3 | 3/24/15 | approved | 3/29/15 | see dean's comments
Political Science | BS & DN | NO SEPARATE REPORT RECEIVED; APPROVED REVISED PLAN 6/8/15
Psychology | JP & ML | 3/17/15 | 3/17/15 | 4/1/15; 3 | 4/1/15 | approved | 4/8/15 | see dean's comments
Sociology | BS & DN | 2/27/15 | 3/4/15 | 12/21/15; 3 | 12/21/15 | approved | 12/21/15 | see dean's comments

Last updated: 4/19/16. (Legend: No Report; Still out for review.)
Scoring: 3 = meets expectations; 2 = approaches expectations (no resubmission needed; approval recommended after addressing suggested changes); 1 = does not meet expectations (resubmission required).
APAC STATUS REPORT (ANNUAL REPORTS) (teams revised 9/11/15)
IMPLEMENTATION STATUS 2015-2016

Department | Review Team | Report Received | To Team | Team Review/Score | To Dean | Dean Action | Dean Action Date | Comments

School of Arts & Humanities
Art & Computer Art | JY & JL | 4/8/16 | 4/8/16 | 4/28/16; 2 | 4/29/16 | | |
English | JY & JL | 3/1/16 | 3/1/16 | 3/18/16; 2 | 3/18/2016; resent 4/29/16 | | |
Foreign Languages & Literatures | JP & ML | 4/7/16 | 4/7/16 | 4/14/16; 3 | 4/14/16 | approved | 4/16/16 |
Music | JS & JB | Will submit over summer per R Roman May 3
Music Industry | JS & JB | Will submit over summer per R Roman May 3
Philosophy | JY & JL | Will submit asap; dean's approval of plan sent to dept. chair 3/21/16. Working on it per C Keegan May 3
Theatre | JY & JL | *4/11/2016 | | | | | | APAC is currently reviewing the accreditation documents

School of Economics & Business
Economics & Business Division | APAC | 3/1/16 (acct'g & econ.) | | | | | | *Have AACSB accreditation

School of Education & Human Ecology
Education Division | NA | NA | NA | NA | NA | NA | | *Have NCATE Accreditation
Human Ecology-General | JP & ML | 4/25/16 | 4/25/16 | 4/27/2016; 2 | 4/27/16 | | |
Human Ecology-Food Services | JP & ML | 3/22/16 | 3/22/16 | 3/29/2016; 3 | 3/30/16 | | |
Human Ecology-Fashion/Textiles | JS & JB | will submit by 1st wk. in April per Dr. Yun-Jung Choi
Human Ecology-Child & Fam. Studies | JS & JB | will submit by 1st wk. in April per Rose Avanzato
Human Ecology-Dietetics | N/A

School of Nat. & Mathematical Science

School of Social Sciences
Africana & Latino Studies | JS & JB | chair will send by 3/31/16; will be sent in early-mid May per B Wambui May 3
Anthropology | JP & ML | 3/23/16 | 3/23/16 | 3/29/2016; 3 | 3/30/16 | approved | 4/5/16 |
Communication & Media | JP & ML | 3/22/16 | 3/22/16 | 3/29/2016; 2 | 3/29/16 | approved | 4/19/16 | see dean's comments to chair
Cooperstown Graduate Program (MA in History Museum Studies) | *BS & DN | 3/2/16 | 3/2/16 | 4/13/16; 2 | 4/14/16 | recommended | 4/19/16 | see dean's comments to chair

Last updated: 4/29/16. (Legend: No Report; Still out for review.)
Scoring: 3 = meets expectations; 2 = approaches expectations (no resubmission needed; approval recommended after addressing suggested changes); 1 = does not meet expectations (resubmission required).
Defense Against the Dark Arts
2014-2017 Assessment Plan
Submitted to the
Academic Programs Assessment Committee
By
Elladora Cresswell, N.E.W.T., Chair, Defense Against the Dark Arts Department
October 31, 2014
Defense Against the Dark Arts Assessment Plan 2014-2017
Defense Against the Dark Arts (DADA) will not make any changes to the five student learning objectives (SLOs) developed in the first assessment cycle. Our five learning objectives therefore remain:
SLO1: Students will be able to explain the fundamental differences between the light arts and the dark arts (i.e. Intention, Outcome, and Context). (Assessed in DADA 100: Intro to the Defense Against the Dark Arts)
SLO2: Students will be able to perform the basic defensive spell set (i.e. Expelliarmus, Impedimenta, and Stupefy). (Assessed in DADA 106: Basic Defense Practicum).
SLO3: Students will demonstrate knowledge of Stevenson’s Laws for magical defense. (Assessed in DADA 601: N.E.W.T. Preparatory Course I.).
SLO4: Students will demonstrate the ability to identify threats in a target-rich environment and minimize collateral damage. (Assessed in DADA 206: Intermediate Defense Practicum, and DADA 306: Advanced Defense Practicum).
SLO5: Students will be able to perform the advanced defensive spell set (i.e. Protego Horribilis, Everte Statum, and Expecto Patronum). (Assessed in DADA 361: O.W.L. Preparatory Course II).
One or two student learning objectives are assessed each year across a three-year cycle. Thus, our assessment schedule is:
Assessment Year | Student Learning Objectives Assessed
2014-2015 | SLO1: Students will be able to explain the fundamental differences between the light arts and the dark arts (i.e. Intention, Outcome, and Context). SLO3: Students will demonstrate knowledge of Stevenson's Laws for magical defense.
2015-2016 | SLO2: Students will be able to perform the basic defensive spell set (i.e. Expelliarmus, Impedimenta, and Stupefy). SLO4: Students will demonstrate the ability to identify threats in a target-rich environment and minimize collateral damage.
2016-2017 | SLO5: Students will be able to perform the advanced defensive spell set (i.e. Protego Horribilis, Everte Statum, and Expecto Patronum).
Specific SLO Assessment Strategies
Year 1: SLO1 & SLO3 Assessment

SLO 1: Fundamental Differences
SLO 1 will be assessed in our introductory course, DADA 100: Introduction to the Defense Against the Dark Arts. DADA 100 is taught every Fall semester and is the first class incoming first years take in the DADA sequence. Since there are multiple sections of this class, we will make sure to collect data on more than one section (to avoid inter-house differences); the assessment instrument, however, will be the same across sections.
The introductory class contains both academic and practical sections, and this SLO will be assessed in both. For the written portion, the faculty will work together to create items that will assess this knowledge and will insert them into the exams. There are two exams – midterm and final – and the items will be inserted into both.
For the practical portion, students are involved in training scenarios throughout the semester that demonstrate knowledge of the three fundamentals. The four full-time faculty members in DADA will develop the evaluation rubric for this section of the assignment through consensus agreement. Care will be taken to establish consistency in evaluation across the four faculty members. Inter-rater reliability will be established.
Once the students' written and practical work has been evaluated, the criterion for achieving SLO 1 will be that 80% of the students earn 75% or more of the available points (i.e. 80% of the students will meet or exceed expectations).
SLO 3: Stevenson’s Laws
SLO 3 will be assessed in DADA 601: N.E.W.T. Preparatory Course I. DADA 601 is taught in the Fall semester, and students who pass move on to N.E.W.T. Preparatory Course II, which is taught solely in the Spring semester. A requirement for passing DADA 601 is a term paper unpacking the ramifications of Stevenson's Laws in relationship to the need to keep the Wizarding World a secret from the Muggle World.
Stevenson's Laws are as follows:
1. A wizard may not injure a muggle; or, through inaction, allow one to come to harm.
2. A wizard may protect their own existence as long as such protection does not conflict with the First Law.
3. A wizard must obey the laws of the Ministry of Magic, except where such laws would conflict with the First or Second Laws.
While not all papers follow the same argument or address the laws in the same way, all of the papers require a section explaining the laws themselves: first as Stevenson intended them, then as they were reinterpreted during the Defense Summit ten years later.
The four full-time faculty members in DADA will develop the evaluation rubric for this section of the assignment through consensus agreement. Care will be taken to establish consistency in evaluation across the four faculty members, and inter-rater reliability will be established. Once all papers have been evaluated, the criterion for achieving SLO 3 will be that 80% or more of the students meet or exceed expectations.
Year 2: SLO2 & SLO4 Assessment

SLO 2: Basic Defensive Spell Set

SLO 2 is assessed in DADA 106: Basic Defense Practicum. This class is taught in the Fall semester and is the first (of 4) of our required practical defense series. This series is designed to give witches and wizards a chance to practice all defensive spells they might need in a safe environment. The theory of the basic defensive spell set (i.e. Expelliarmus, Impedimenta, and Stupefy) is taught in the semester prior to DADA 106, so practice on this set begins on the first day of class.
After the Second Wizarding War, the Defensive Decree required that all wizarding schools (as a condition of accreditation) require a practicum on defensive skills and that all students be able to complete the basic defensive spell set before they graduate. As such, our expectations for this particular SLO are higher than for the others; moreover, we are concerned with students' progress in skill level as well as their basic ability to complete the set. That is, not only must every student be able to perform these spells, they must also improve over the course of the semester (i.e. be able to remove wands from the hands of more attackers at once, freeze larger attackers for longer, and stun stronger wizards for longer).
We therefore have two criteria that must be met in order for us to achieve SLO 2. First, 95% of the students must be at or above expectations by the end of the semester. Second, at least 80% of the students must show progress on all three spells. The tests in this class directly assess these exact concerns, so the scores related to the three spells in question can be used as a stand-in measure for the SLO.

SLO 4: Threat Identification
SLO 4 is assessed in two courses, DADA 206: Intermediate Defense Practicum, and DADA 306: Advanced Defense Practicum.
DADA 206 is offered in the Spring semester and is the second (of 4) courses in the required practical defense series. This course focuses largely on the theory of defensive magic application and so contains an entire section on the ethics of bystander management and the responsibility every witch and wizard has to maintain the International Statute of Wizarding Secrecy. Every student must identify an event in wizarding history where these ethics were involved and present it as a case study to the class. In the presentation, they must address the bystander component specifically. The four full-time faculty members in DADA will develop an evaluation rubric for these presentations through consensus agreement. Care will be taken to establish consistency in evaluation across the four faculty members. Inter-rater reliability will be established.
DADA 306 is offered in the Fall semester and is the third (of 4) courses in the required practical defense series. In addition to learning new spells, this course focuses on the application of defensive magic in real-world scenarios. Students run various drills in controlled environments that are meant to mimic possible defensive situations. These simulations pit them against three basic scenarios (with variations week to week):
o a scenario in which multiple magical attackers are present along with a small number (one or two) of muggles (the goal of this exercise is to learn to differentiate between muggle and attacker);
o a scenario in which one magical attacker is present amid a large crowd of muggles (typically this presents as a chase); and
o a scenario in which a large group of magical attackers is pitted against another large group of attackers (this plays out in two versions – one intended to practice tactical ability, the other intended to give students a sense of battle from "eye-level").
SLO 4 is more directly assessed in the first two scenarios presented above. The four full-time faculty members in DADA will develop the evaluation rubric for these two scenario types through consensus agreement. Care will be taken to establish consistency in evaluation across the four faculty members. Inter-rater reliability will be established.
Once all students have been evaluated in both 206 and 306, the criterion for achieving SLO 4 will be that 80% or more of the students earn 75% of the points or higher on the assignment (i.e. 80% of the students will meet or exceed expectations).

Year 3: SLO5 Assessment
SLO 5 is assessed in DADA 361: O.W.L. Preparatory Course II. This course is offered during the spring semester each year. There are three prerequisites for this course: an A (Acceptable) or higher in DADA 351: O.W.L. Preparatory Course I; an A or higher in DADA 306: Advanced Defense Practicum; and an A or higher in DADA 298: Advanced Defensive Theory. As such, the standards for judging the outcomes are higher, but we still expect a similar proportion of students to meet or exceed those expectations.
In DADA 361, students are required to do several drills of a broad range of defensive spells. At three points during the semester, they are given the opportunity to “test out” of the class. That is, if they are able to perform all of the defensive spells required for the course, they are moved to an advanced training program where they learn more complex spells. The expectation is that the vast majority of the students will “test out” by the third test. If they are unable to do so, we recommend additional training before sitting for their O.W.L.s.
The advanced defensive spell set (i.e. Protego Horribilis, Everte Statum, and Expecto Patronum) is not the only set of spells expected to be performed during the test, so overall test scores are not a good stand-in for this SLO. However, during the test, instructors keep records of individual spell completion, so we will be able to use these data to assess the SLO. Once all students have been evaluated, the criterion for achieving SLO 5 will be that 80% or more of the students are able to perform the three identified spells by the end of the semester (i.e. 80% of the students will meet or exceed expectations by the last test).
A central goal of program assessment is to gain knowledge about the successes and shortcomings of a program's curriculum and resources in terms of achieving its student learning objectives. When SLOs were not met successfully, we discussed changes in curriculum or pedagogy, as well as which resources might better help us meet these standards. While some of our difficulties meeting our SLOs in the last round of assessment were due to external circumstances (the Second Wizarding War, Dark Professors, the school being a battleground, etc.), some were due to curricular and pedagogical concerns. For example, we now separate the training of defensive theory and defensive practice into different courses, as we found that students (particularly the younger ones) were distracted by the practice and did not absorb as much of the theory. Separating them has increased performance drastically. In addition, in the past year we have hired two new full-time faculty members – bringing our departmental total to four – which will allow more sections of each course to be taught, thereby lowering class sizes.
COURSE | SLO 1 | SLO 2 | SLO 3 | SLO 4 | SLO 5
DADA 100 | E, D | | | |
DADA 106 | | D | | |
DADA 601 | | | PA | |
DADA 206 | | | | PR |
DADA 306 | | | | D |
DADA 361 | | | | | D
Key: PA = Paper; E = Exam; D = Demonstration; PR = Presentation
Defense Against the Dark Arts Spring 2015 Assessment
Summary of Assessment
Defense Against the Dark Arts: Student Learning Objectives (SLOs)
SLO1: Students will be able to explain the fundamental differences between the light arts and the dark arts (i.e. Intention, Outcome, and Context). (Assessed in DADA 100: Intro to the Defense Against the Dark Arts)
SLO2: Students will be able to perform the basic defensive spell set (i.e. Expelliarmus, Impedimenta, and Stupefy). (Assessed in DADA 106: Basic Defense Practicum).
SLO3: Students will demonstrate knowledge of Stevenson’s Laws for magical defense. (Assessed in DADA 252: History of Magical Defense).
SLO4: Students will demonstrate the ability to identify threats in a target-rich environment and minimize collateral damage. (Assessed in DADA 206: Intermediate Defense Practicum, and DADA 306: Advanced Defense Practicum).
SLO5: Students will be able to perform the advanced defensive spell set (i.e. Protego Horribilis, Everte Statum, and Expecto Patronum). (Assessed in DADA 361: O.W.L. Preparatory Course II).
Our assessment schedule is:
Assessment Year | Student Learning Objectives Assessed
2014-2015 | SLO1: Students will be able to explain the fundamental differences between the light arts and the dark arts (i.e. Intention, Outcome, and Context). SLO3: Students will demonstrate knowledge of Stevenson's Laws for magical defense.
2015-2016 | SLO2: Students will be able to perform the basic defensive spell set (i.e. Expelliarmus, Impedimenta, and Stupefy). SLO4: Students will demonstrate the ability to identify threats in a target-rich environment and minimize collateral damage.
2016-2017 | SLO5: Students will be able to perform the advanced defensive spell set (i.e. Protego Horribilis, Everte Statum, and Expecto Patronum).
2014-‐2015: SLO1 & SLO3 Assessment
SLO1: Fundamental Differences
SLO 1 was assessed in two sections of our introductory course, DADA 100: Introduction to the Defense Against the Dark Arts. DADA 100 is taught every Fall semester and is the first class incoming first years take in the DADA sequence. The introductory class contains both academic and practical sections, and this SLO was assessed in both.
Academic assessment was collected at two points during the semester – the midterm and the final. Three short-answer questions intended to assess the differences between the dark arts and the light arts were inserted into both the midterm and the final. The three questions address the three fundamental components of difference: intention, outcome, and context. Results were pooled across the two sections.
On the midterm, the questions were:
1. Describe the role intention plays in the ability to perform Episkey (heals minor injuries).
2. Name three differences between the outcomes of light spells and the outcomes of dark spells.
3. Some spells (such as Confundus) can be considered both light and dark depending on the context. Explain a context where Confundus would be a light art spell and a context where it would be a dark art spell.
On the final, the questions were:
1. What is meant by intentional drift?
2. What is the more important factor in evaluating the outcome of a particular spell – the external result or the internal result?
3. What is meant by radical context dependency, and what does it have to do with the historical view of witches and wizards? (Give specific examples other than He Who Must Not Be Named.)
On both the midterm and the final, the answers to the questions are rated out of 10 points, which gives a score from 0 to 20 for each student for each content area (intention, outcome, context).
For the practical portion, the students are involved in training scenarios throughout the semester that demonstrate knowledge of the three fundamentals. The four full-time faculty members in DADA created an evaluation rubric for these scenarios that we believed could be applied objectively. Across the semester, each student received a score out of 10 for each content area.
The academic portion of this assessment gave each student a score out of 20 for each content area, and the practical portion gave them a score out of 10 for each content area. Therefore, each student had a final score out of 30 for each content area. Our criterion for achieving SLO 1 was that 80% or more of the students earn 22 points or higher in each category (roughly 75% of the available points). For all measures, n = 65.

Intention:
Not Meeting Expectations | Approaching Expectations | Meeting Expectations | Exceeding Expectations
1% | 2% | 82% | 15%
Total meeting or exceeding expectations = 97%
Outcome:
Not Meeting Expectations | Approaching Expectations | Meeting Expectations | Exceeding Expectations
6% | 19% | 66% | 9%
Total meeting or exceeding expectations = 75%
Context:
Not Meeting Expectations | Approaching Expectations | Meeting Expectations | Exceeding Expectations
6% | 11% | 63% | 20%
Total meeting or exceeding expectations = 83%
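The criterion arithmetic described above (an academic score out of 20 plus a practical score out of 10, with 22 of 30 points counting as meeting expectations, and a target of 80% of students) could be sketched as follows. The function name and the five-student data are hypothetical, not the department's actual n = 65 results.

```python
# Illustrative sketch of the SLO 1 criterion check described above.
# The function name and toy data are invented; the real assessment used n = 65.

def slo1_criterion(academic, practical, cutoff=22, target=0.80):
    """academic: per-student scores out of 20; practical: out of 10.
    Returns (share of students at or above the cutoff, criterion met?)."""
    combined = [a + p for a, p in zip(academic, practical)]
    share = sum(score >= cutoff for score in combined) / len(combined)
    return share, share >= target

# Five hypothetical students:
share, met = slo1_criterion([18, 15, 19, 12, 17], [9, 8, 7, 6, 9])
# combined scores are 27, 23, 26, 18, 26 -> 4 of 5 (80%) reach 22 points
```

In practice the same check would be run once per content area (Intention, Outcome, Context), matching the three tables above.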
For SLO1, students met the assessment criterion for both Intention and Context, but not for Outcome. That criterion was almost met, with 75% of students achieving a score of 22 points or higher. The DADA faculty discussed these outcomes, and the general consensus is that Outcome differences are difficult for a young child (particularly those unused to the magical world) to grasp at the outset of the discussion, and the actual definitions are largely addressed only early in the semester. That is, the theory of Outcome is presented in the first week of the course and discussed in application, but not in definition, for the rest of the semester. We believe periodic reminders of the definitions of terms would both help cement the theory and inform discussions throughout the course. More data are needed to determine whether this change would affect the outcome.
SLO 3: Stevenson’s Laws
SLO 3 was assessed in two sections of DADA 252: History of Magical Defense. Our original assessment plan had us assessing this in DADA 601: N.E.W.T. Preparatory Course I; however, APAC suggested that since only a dozen or so students take that course each year, and this concept is fundamental to our program, we should assess it earlier in the sequence. DADA 252 was chosen because Stevenson's Laws are discussed at length during the presentation of the First Wizarding War.
Stevenson's Laws are as follows:
1. A wizard may not injure a muggle; or, through inaction, allow one to come to harm.
2. A wizard may protect their own existence as long as such protection does not conflict with the First Law.
3. A wizard must obey the laws of the Ministry of Magic, except where such laws would conflict with the First or Second Laws.
In DADA 252, SLO 3 was assessed through an essay asking students to link the First Wizarding
War to Stevenson’s Epiphany, which led to his Laws. Stevenson has declared that the one had nothing to do with the other, but he was present at many of He Who Shall Not Be Named’s Muggle Massacres. The essay prompt requires a section outlining the laws and unpacking what they might mean in various contexts. This section was graded on a nine-point rubric: students earned one point for each law correctly defined, one point for each law correctly understood, and one point for demonstrating a deeper understanding of each law’s complexity. The first two are considered essential; the third is reserved for the best students. As such, we consider 6 points to be mastery and expect that 80% of our students will achieve mastery.
Fall 2014 (n = 32):
Not Meeting Expectations: 2%
Approaching Expectations: 4%
Meeting Expectations: 71%
Exceeding Expectations: 23%
Total meeting or exceeding expectations = 94%
Spring 2015 (n = 35):
Not Meeting Expectations: 1%
Approaching Expectations: 2%
Meeting Expectations: 82%
Exceeding Expectations: 15%
Total meeting or exceeding expectations = 97%
The criterion of 80% or more of students earning 6 or more points was met in both terms. At our 2014 summer retreat we decided to change how this section is taught, and we believe these results reflect that change. However, there is now some concern that the students are being “led” to the right answer; that is, that we are teaching to the test. We continue to struggle to find an assessment measure that accurately reflects students’ knowledge without changing the quality of their education.
Closing the Loop
We assessed SLO 1 and SLO 3 this year. We found that SLO 1 is largely being met; however, there is room for improvement in our presentation of Outcome effects in determining
light versus dark magic. At our retreat in Valhalla this summer we intend to discuss further how we might address these issues with incoming students. We are struggling with two basic issues in that class that make improvement difficult. First, some students are unaware of the wizarding world before entering Hogwarts (e.g., they are Muggle-born) and find themselves overwhelmed by these issues, whereas wizard-born students, having grown up in magical families, have an instinctual understanding of them and grow bored when the issues are covered too slowly. We have discussed offering separate classes for the two groups, but most of us feel this would cause unwanted division. This summer there is a conference on improving Muggle-born and wizard-born relations (particularly regarding issues in education), and we believe our department would benefit from at least one of us (if not all) attending. We lack the requisite funding for such a trip.
The second main issue we struggle with is turnover among the Defense Against the Dark Arts faculty. Between death, madness, dismissal for child abuse, and service to the Dark Lord (not to mention the upheaval after the Second Wizarding War), we have lost many of our faculty in recent years. This has led to unevenness in the education of the first years. Our goal going forward is, first, to hire more quality faculty and, second, to retain those faculty long enough that they can try many approaches with incoming First Years and discover which pedagogy is most effective. To that end, we ask the administration, first, to approve more tenure-track lines and, second, to make the hiring package competitive. Specifically, we believe that while the salary is commensurate with that of other wizarding schools, the expectation of service (particularly around the holidays) is keeping many talented young professors from applying for or accepting a position here.
SLO 3 is being met across the board; however, we have some concerns about teaching to the test. We will continue to revise the curriculum in the hope of addressing that concern going forward.