Annual Institutional Effectiveness Report
2017-2018
Report prepared by the Office for Institutional Effectiveness and Strategic Planning OIEP.COFC.EDU
The Assessment Process
The College of Charleston follows an annual assessment model for the systematic submission and review of academic program and administrative unit assessment reports (plans and results). Key elements of this model include articulated student learning/operational outcomes, multiple assessment measures, performance targets, peer mentoring and review, and broad-based participation. Academic programs and administrative units use the assessment template to guide the required structure of the assessment reports that are housed in Compliance Assist (an assessment planning and management system). The College of Charleston Institutional Effectiveness (IE) Assessment model defines two broad categories: academic programs and administrative units. Academic programs include undergraduate and graduate educational programs, stand-alone minors, certificates, and the general education program. Administrative units include administrative support services, academic and student support services, and centers and institutes.

Assessment Calendar with Deadlines
Academic Programs
• September 1: Data entry for assessment reports (plans and results) completed by all academic programs in Compliance Assist, along with submission of self-completed rubrics for the plans and results. For example, on September 1, 2018, assessment results for 2017-18 and assessment plans for 2018-19 are due.
• September 1-15: Independent reviews conducted and rubrics completed by the respective Deans Assessment Committees (DACs) and returned to the respective assessment coordinator(s).
• September 30: Final edits to plans and results as well as final rubrics uploaded to Compliance Assist.
Administrative Units
• June 30: Data entry for assessment reports (plans and results) completed by all administrative units in Compliance Assist, along with submission of self-completed rubrics for the plans and results. For example, on June 30, 2018, assessment results for 2017-18 and assessment plans for 2018-19 are due.
• July 1-15: Independent reviews conducted and rubrics completed by the respective Administrative Assessment Committees (AACs) and returned to the respective assessment coordinator(s).
• July 30: Final edits to plans and results as well as final rubrics uploaded to Compliance Assist.
Roles and Responsibilities

The College of Charleston IE assessment model engages broad-based participation and encompasses several key faculty, staff, and administrator roles. It is an ongoing, broad-based process that involves collaboration among the assessment coordinators; the Deans Assessment Committee (DAC) members at the school level; the Administrative Assessment Committee (AAC) members at the division level; the chairs of the DACs and AACs, who comprise the Institutional Assessment Committee (IAC); the Provost and Executive Vice Presidents; the President; and the Office for Institutional Effectiveness and Strategic Planning (OIEP).
Assessment coordinators (faculty and staff members) work collaboratively with colleagues in their programs or units to develop an assessment report (plans and results) and coordinate their program’s or unit’s ongoing assessment process. The Deans Assessment Committees (DACs) are school-level assessment committees that exist for each school or college and the General Education Program; the DACs consist of faculty across the varying disciplines. The Administrative Assessment Committees (AACs) are division-level assessment committees that exist for each of the nine divisions and consist of staff members and administrators from the respective divisions. These committees serve as mentors, working collaboratively with their programs/units to assist the assessment coordinators in their assessment efforts and to review the quality of the assessment reports based on established criteria provided in the Institutional Assessment Rubrics. The committees use the IE rubrics to focus discussions on the rubric indicators, so as to increase the quality of assessment reports. The chair of each DAC and AAC serves on the Institutional Assessment Committee (IAC). The Institutional Assessment Committee (IAC) is an institutional-level committee that consists of the chairs from each of the DACs and AACs. It oversees the implementation of the IE assessment process and facilitates campus discussion and reflection on the use of results to make improvements in student learning and operations. The IAC also ensures the quality of the reviews conducted by the DACs and AACs through its oversight of the review process. Annually, each member of the IAC presents a DAC or AAC report to the IAC about the quality of the results and plans. Committee rosters and meeting minutes are archived at the OIEP website. Additional responsibilities of the IAC include:
• helping to create and maintain a culture of assessment at the College of Charleston,
• ensuring the use of assessment results to make improvements,
• motivating faculty and staff participation in all steps of the assessment process,
• providing feedback on assessments to promote continuous improvement,
• involving students by promoting awareness of institutional measures,
• coordinating assessment efforts,
• generating ways to involve external stakeholders in meaningful assessment activities,
• working with other campus entities to incorporate institutional data,
• coordinating and collaborating to provide faculty/professional development, and
• ensuring that new faculty and staff receive information about assessment.

The Executive Vice Presidents (EVPs) and the President review a random sample of completed rubrics for programs and units and provide additional feedback, if necessary. OIEP serves as a support office for assessment coordinators, the DACs, the AACs, the IAC, the EVPs, and the President.
Summary of Completion Rates in the Assessment Cycle
As a part of the assessment process, assessment coordinators (faculty and staff members) from each program or unit work collaboratively with their colleagues to develop the outcomes, select and implement measures, analyze results, and plan for improvements based on the results. There are two phases to this collaborative process, corresponding to the two parts of an assessment report: planning and results. Assessment coordinators: 1) report results from the previous year's assessment plan based on
data analysis, use the results to make changes in curriculum, pedagogy, or operations, and document those changes in the use-of-results and assessment summary sections in Compliance Assist; and 2) develop an assessment plan for the current year, which includes measurement of the effect of changes implemented based on the previous year's results. The assessment plan includes a mission statement, assessment process, outcomes, measures, and a curriculum or functional map.

In order to demonstrate compliance with both the appropriate accrediting standards related to assessment and the College of Charleston’s assessment procedures, OIEP staff conduct an audit of the completion status of all assessment reports, presenting the planning and the results phases separately. The audit uses three ratings as its primary indicators of completion: complete, partial, and missing. A rating of "complete" means that 100% of the required fields within the IE template contain content (text). A rating of "partial" means that more than 0% but less than 100% of the required fields contain content. A rating of "missing" means that 0% of the required fields contain content. The required fields within the planning phase of the audit are program information, mission statement, assessment process, functional map, outcomes, assessment measures, research & service, and the relating of outcomes to strategic initiatives. The required fields within the results phase of the audit are assessment results, meet targets, use of assessment results, impact on budget, and the assessment summary report.

Note: The data presented throughout this report were collected from Compliance Assist within two weeks after the institutional deadlines, so these data may not reflect the current completion status of any given program/school or unit/division.
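The completion criteria above amount to a simple three-way classification rule. The sketch below illustrates that rule in Python; the function name and the example field names are illustrative only and are not drawn from the audit tooling:

```python
def completion_status(fields: dict[str, str]) -> str:
    """Classify a report phase as complete, partial, or missing.

    `fields` maps each required IE-template field to its text content;
    a field counts as filled when it contains any non-whitespace text.
    """
    if not fields:
        return "missing"
    filled = sum(1 for text in fields.values() if text and text.strip())
    if filled == len(fields):
        return "complete"  # 100% of required fields contain content
    if filled > 0:
        return "partial"   # more than 0% but less than 100%
    return "missing"       # 0% of required fields contain content

# Hypothetical results-phase report with some fields left blank
report = {
    "assessment results": "Scores improved on two of three outcomes.",
    "meet targets": "Partially met",
    "use of assessment results": "",
    "impact on budget": "",
    "assessment summary report": "",
}
print(completion_status(report))  # partial
```

Under this rule, a single filled field is enough to move a report from "missing" to "partial", which is consistent with how the audit counts any non-empty content.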
Tables 1-3 show the completion rates for academic programs and schools, and Tables 4-6 show completion rates for administrative units.

Table 1. Completion Rates for 2017-2018 Academic Assessment Reports
School Number of Programs
Plans Results
Complete Partial Missing Complete Partial Missing
SOTA 8 38% 63% 0% 38% 13% 50%
SB 14 71% 21% 7% 50% 43% 7%
EHHP 15 20% 73% 7% 100% 0% 0%
HSS 17 82% 12% 6% 53% 35% 12%
LCWA 24 46% 54% 0% 67% 33% 0%
SSM 23 78% 17% 4% 52% 39% 9%
SPS 3 0% 100% 0% 0% 67% 33%
Interdisciplinary Minors 5 100% 0% 0% 60% 40% 0%
Gen Ed* 1 100% 0% 0% 100% 0% 0%
HONS 1 100% 0% 0% 0% 100% 0%
UCSC 33 33% 55% 12% 27% 18% 24%
Total 144 53% 41% 6% 50% 34% 15%
*General Education has 7 distribution areas; however, it is counted as 1 program for the assessment audit.
Table 2. Comparison of Completion Rates for Academic Programs
Year Number of Programs
Plans Results
Complete Partial Missing Complete Partial Missing
2015-2016 140 91% 8% 1% 81% 11% 8%
2016-2017 144 92% 8% 1% 54% 35% 12%
2017-2018 144 53% 41% 6% 50% 34% 15%
Table 3. Comparison of Complete Academic Assessment Reports
Total 91% 92% 53% 81% 54% 50%
*General Education has 7 distribution areas; however, it is counted as 1 program for the assessment audit.
** EHHP was given an extension by the President to report past the institutional deadline.
Table 4. Completion Rates for 2017-2018 Administrative Assessment Reports
Division Number of Units
Plans Results
Complete Partial Missing Complete Partial Missing
Academic Affairs 18 6% 94% 0% 33% 50% 17%
Business Affairs 7 14% 86% 0% 0% 86% 14%
Enrollment Planning 2 0% 100% 0% 0% 100% 0%
Facilities Management 3 0% 100% 0% 0% 67% 33%
Information Technology 2 0% 100% 0% 0% 0% 100%
Institutional Advancement 1 0% 100% 0% 0% 0% 100%
Marketing and Communication 1 0% 100% 0% 0% 0% 100%
Presidents Division 11 9% 91% 0% 0% 64% 36%
Student Affairs 14 0% 100% 0% 36% 64% 0%
Centers and Institutes 26 23% 77% 0% 12% 35% 4%
Total 85 11% 89% 0% 16% 52% 16%
Table 5. Comparison of Completion Rates for Administrative Units
Year Number of Units
Plans Results
Complete Partial Missing Complete Partial Missing
2015-2016 85 73% 27% 0% 49% 31% 19%
2016-2017 85 66% 34% 0% 27% 67% 6%
2017-2018 85 11% 89% 0% 16% 52% 16%
Table 6. Comparison of Complete Administrative Assessment Reports
Marketing and Communication 100% 100% 0% 0% 100% 0%
Presidents Division 100% 82% 9% 0% 0% 0%
Student Affairs 80% 73% 0% 100% 27% 36%
Centers and Institutes 52% 73% 23% 57% 54% 12%
Total 73% 66% 11% 49% 27% 16%
** Enrollment Planning, Facilities Management, and Information Technology were not formal divisions prior to 2017-2018, thus were not reported as such for assessment.

Due to low completion rates from many academic programs and administrative units, the Office for Institutional Effectiveness and Strategic Planning implemented several strategies to increase completion. The initial strategy used emails to DAC and AAC chairs to encourage completion within their respective schools and divisions. The second strategy used a similar emailing structure but focused directly on the assessment coordinators. The third strategy involved a thorough review of the data to determine which programs and units were missing which sections of the assessment template; the resulting analysis showed the office which programs and units required one-on-one consultation with OIEP. The fourth strategy was to hold meetings with the Deans and EVPs of the respective schools and divisions to highlight missing components. The fifth strategy involved the Provost sending memos to the programs and units with missing reports.
Quality Assurance Process
The assessment coordinators submit the assessment reports (plans and results) for review to the assigned DAC or AAC. The assigned DAC or AAC chair and/or mentor in each school or division reviews the quality of the assessment reports based on established criteria defined in the IE assessment rubrics: the Assessment Plan Rubric and the Assessment Results Rubric. These rubrics are a tool for providing specific feedback to improve the quality and increase the rigor of the assessment reports by setting expectations and promoting discussion, and they enhance the collaborative process to deepen the culture of assessment. Based on feedback from the DACs and AACs, assessment coordinators have the opportunity to revise and improve the quality of their plans and results. Table 7 provides the coding scheme for the rubrics. Table 8 presents the aggregated plan rubric ratings for the programs and units, and Table 9 the aggregated results rubric ratings. Table 10 shows the average rubric rating for both academic programs and administrative units. See Appendix A for plan and results rubric ratings by individual programs and units.

Table 7. Codes for Assessment Rubrics
*Percentages based on those programs/units that completed rubrics.
Table 10. Year-to-Year Comparison of Average Rubric Ratings*
Plans Results
Academic Administrative Total Academic Administrative Total
2015-2016 2.80 2.81 2.80 3.32 2.94 3.18
2016-2017 3.75 3.70 3.72 3.44 3.23 3.42
2017-2018 3.42 2.95 3.27 3.83 3.68 3.79
*Percentages based on those programs/units that completed rubrics.
Evidence of Continuous Improvement
The primary purpose of IE assessment is to collect data to identify gaps in student learning and operations. Improvement is demonstrated when assessment data present an opportunity for improvement and a new strategy is implemented to remove the gap and, ultimately, enhance student learning and/or advance operational effectiveness. Programs and units then collect data to evaluate the impact of an implemented change. The use of a prior year’s results to improve student learning and operations demonstrates a “closed loop” process. As specified by indicator 9 on the plan assessment rubric and indicators 7 and 8 on the results assessment rubric (see Appendix B), closing the loop is essential to an exemplary assessment report. Table 11 provides a summary of the number of programs and units from their respective schools and divisions that received exemplary ratings on their assessment plan and results rubrics, thus closing the loop.

Table 11. Closing the Loop*
Marketing and Communications (N=1)
Missing Missing Missing Missing Missing
President’s Division (N=11, n=1) 0% 0% 0% 0% 100%
Student Affairs (N=14, n=5) 0% 0% 0% 0% 100%
Centers, Institutes, Offices, and Programs within Schools (N=26, n=23)
*Percentages based on those programs/units that completed rubrics. ** Prior to 2016-2017, First-Year Experience was counted with the academic programs/schools, but for 2016-2017 forward will be counted with the administrative units. *** Enrollment Planning, Facilities Management, and Information Technology were not formal divisions prior to 2017-2018, thus were not reported as such for assessment.
Institutional Measures: Surveys
The College of Charleston has utilized a variety of survey datasets to assess student learning, student engagement, and student success. OIEP plans, coordinates, administers, and publishes results from several national and institutional surveys conducted at the College of Charleston. National surveys include the CIRP Freshman Survey, the Your First College Year Survey (YFCY), the College Senior Survey (CSS), the National Survey of Student Engagement (NSSE), the Beginning College Survey of Student Engagement (BCSSE), and the Faculty Survey of Student Engagement (FSSE). In addition, the ETS Proficiency Profile standardized test is administered every three years at the College of Charleston. Surveys at the institutional level are the Senior Exit Survey (SES) and the Post-Graduation Survey (six months, one year, three years, and five years post-graduation). Results from these surveys are communicated via emails and presentations, and previous and current survey analytical reports are published on the OIEP website. The results of these enterprise-level surveys are used to evaluate student learning. In the 2017-2018 academic year, OIEP administered the Senior Exit Survey, the Post-Graduation Survey, and the ETS Proficiency Profile test.
Graduate School 4% 8% 11% 10% 11% 36%
Honors College 0% 100% 100% 100% 100% 100%
General Education 100% 100% 100% 100% 100% 100%
TOTAL 16% 25% 26% 26% 35% 38%
Division/Area
Academic Affairs 6% 0% 11% 11% 30% 0%
Business Affairs 0% 58% 17% 0% 20% Missing
Enrollment Planning *** *** 0% *** *** 0%
Facilities Management *** *** 0% *** *** Missing
Information Technology *** *** 100% *** *** 100%
Institutional Advancement 0% 0% 0% 0% 0% Missing
Marketing and Communications 0% 100% 100% 0% 0% Missing
President’s Division 0% 0% 0% 8% 0% 100%
Student Affairs 0% 40% 0% 7% 31% 100%
Centers, Institutes, Offices, and Programs within Schools 5% 4% 5% 5% 9% 30%
TOTAL 3% 19% 8% 6% 16% 35%
Assessing the College’s Mission

Academic programs and administrative units aligned their outcomes with strategic initiative(s) from the College’s Strategic Plan to assess the institution’s mission. Table 12 summarizes the number of outcomes supporting each strategic initiative for the past three years, while Figure 1 demonstrates the same comparison over the past six years.

Table 12. Strategic Initiatives Being Supported by Assessment Reports
2: Develop nationally recognized academic programs at the graduate level. 52 13 76 23 80 7
3: Develop and support a highly qualified, diverse and stable base of faculty and staff. 6 41 1 49 3 16
4: Identify, attract, recruit, enroll and retain academically distinguished, well-prepared, diverse students. 20 75 38 95 34 41
5: Enhance and support co-curricular and extracurricular programs and facilities to promote and sustain an integrated, campus-wide approach to holistic education of students. 22 94 7 94 5 45
6: Align all aspects of the administrative and academic policies, procedures and practices to support the College’s purpose and achieve its envisioned future. 3 71 4 66 1 40
7: Provide appropriate, up-to-date facilities and infrastructure to support and enhance academic programs and co-curricular opportunities for students. 0 41 0 25 1 10
8: Engage with local, national and international constituents to leverage higher education for a stronger South Carolina. 22 39 22 68 24 38
9: Establish campus wide policies and practices aimed at creating enhanced non-state resources and promoting greater fiscal responsibility and self-sufficiency. 1 35 0 33 0 16
10: Brand the College of Charleston so that it is nationally and internationally recognized for a personalized liberal arts education with specific areas of distinction at the undergraduate and graduate level. 17 28 26 37 0 1
Figure 1. Number of Outcomes Aligned to the College’s Strategic Initiatives
Institutional Effectiveness Consultations and Workshops
Throughout the fall and spring semesters, OIEP employed numerous assessment strategies (sessions and workshops) to improve the quality of assessment at the College of Charleston. There were 229 units/programs eligible to attend these sessions and workshops in the 2017-2018 academic year. One of these strategies was the “Connecting the Dots” sessions, created to increase collaboration between units and enhance the quality of assessment overall. Each session was a two-hour workshop with 2-5 units involved per session, and the process included a Session 1 and a Session 2 for each unit (10 units received a combined Session 1 and Session 2 in one sitting). Unit leaders were encouraged to invite as many contributing faculty/staff members as possible. On the administrative side, 70 units were represented in at least one of the 3 offerings, with 27 units represented at both Session 1 and Session 2. For the academic programs, 49 programs participated in part 1 of the Connect the Dots series, which included 52 faculty members. Another strategy utilized this academic year was a workshop series (3 workshops) crafted for Student Affairs around rubrics, covering both design and application. Attendance at these workshops was as follows: 13 participants in the first, 12 in the second, and 6 in the third; 4 participants attended all 3 workshops. An additional four workshops were offered to train faculty and staff on the use of the IE rubrics; 7 faculty/staff attended those workshops.
Appendix A Detailed Summary of Plan and Results Rubric Ratings
As part of the annual review process, each academic program’s and administrative unit’s assessment report is rated using two developmental rubrics created by OIEP, one for each section (plans and results; see Appendix B). The plan rubric has 9 indicators and the results rubric has 8 indicators. As described in Table 13 and on each rubric, the program or unit receives a rubric rating based on the number of indicators successfully completed. Tables 14-34 provide each program’s and unit’s rubric score.

Table 13. Codes for Assessment Rubrics
Table 34. Centers, Institutes, Offices, and Programs within Academic Schools Assessment Rubrics
Centers, Institutes, Offices, and Programs
(N=26 Administrative Units)
2015-2016 2016-2017 2017-2018
Plans Results Plans Results Plans Results
A Talent Development USDE 1 3 3 Missing 2 2
Afterschool and Summer Learning Resources Center NA NA 3 4 3 4
Art Attack 1 1 4 3 2 3
Autism Project 2 2 4 4 3 3
Call Me MISTER 1 4 3 2 2 2
Carter Real Estate Center 3 3 4 2 2 5
Center for Coastal Environmental and Human Health NA NA NA NA 2 3
Center for Continuing and Professional Education Missing Missing Missing Missing Missing Missing
Center for Entrepreneurship 2 2 2 3 4 5
Center for Partnerships to Improve Education 2 3 3 5 3 5
Center for Public Choice and Market Process 2 3 4 4 4 4
FitCatz 1 1 2 1 3 5
Global Business Resource Center 2 2 2 1 2 2
Grice Marine Laboratory NA NA 2 1 2 5
Halsey Institute of Contemporary Art 2 3 4 1 1 Missing
Lowcountry Graduate Center Missing Missing Missing Missing Missing Missing
N.E. Miles Early Childhood Development Center 2 4 4 4 2 5
Office of Economic Analysis 1 2 2 4 2 4
Office of Professional Development in Education 3 3 4 2 3 4
Office of Student Services and Credentialing 1 2 4 4 2 3
Office of Tourism Analysis 2 3 4 4 4 4
Riley Center for Livable Communities 5 5 5 5 5 5
SCDOE grant 1 1 2 2 Missing 2
Student Success Center 2 4 4 3 4 4
Teacher Leader Program 1 3 4 Missing Missing 2
Teaching Fellows 1 2 3 1 3 1
Tech Fit 1 1 1 4 NA NA
Appendix B College of Charleston Institutional Effectiveness Assessment Plan Rubric
Academic/Administrative Unit: ____________________________ Rubric Completed By: _____________________ Date Completed: _____________ Reviewed with Assessment Coordinator (Initial/Date): _________________ Rating: _____________

Instructions: Please review the assessment plan and check which indicators are met, as well as provide necessary comments. Based on the number of indicators met, please identify the rating (Level) of the assessment plan and indicate it above.
Levels Indicators Comments
Establishing (Level 1)
Three or fewer indicators from the Developing category are met.
Emerging (Level 2)
Five or fewer indicators from the Developing category are met.
Developing (Level 3) ALL of the Developing indicators (#1-6) are met.
1. Program/Unit's mission statement concisely defines the purpose, functions, and key constituents. [The Program/Unit's mission is aligned to the School/Division/College Strategic Plan.]
2. The assessment process describes:
• Strategies to assess the outcomes. A strategy is a plan of action intended to accomplish a specific outcome/measure.
• A plan to use the data for improving student learning and/or operations.
• How the data will be shared within the Program/Unit and the College.
[The assessment process describes how evidence-based decision making leads to improvement for the Program/Unit and how the plan evolves over time. The assessment process description should present a clear understanding of how the Program/Unit utilizes assessment data for continuous quality improvement.]
3. Number of outcomes:
• Administrative Units - minimum of three outcomes.
• Academic Programs (undergraduate, graduate, stand-alone minors, certificates) - minimum of three student learning outcomes.
[The outcomes are specific, measurable, attainable, results oriented, and time bound. The outcomes are clearly related to the mission and focus on activities of the Program/Unit.]
4. Measures and Performance Targets: a minimum of two appropriate, quantitative measures, with at least one being a direct measure, per outcome. Measures for the outcomes define specific performance targets and strategies to achieve the targets. [The measure matches the outcome, uses appropriate direct and indirect methods, indicates desired level of performance, helps identify what to improve, and is based on tested, known methods. The performance target is meaningful; based on existing benchmarks, previous results, and existing standards. Grades and/or GPA should not be used as measures.]
5. The assessment plan directly links outcomes to the School/Division/College Strategic Plan.
6. Relevant assessment instruments (e.g., rubrics, survey instruments, logs, reports, etc.) are uploaded in Compliance Assist (e.g., via URL, as attachments, etc., if not proprietary). [If an instrument is proprietary, please state so in the report.]
Proficient (Level 4) ALL of the Developing indicators plus at least one of the Proficient Indicators (#7 & 8) are met.
7. A clearly defined curriculum or functional map is provided. [Courses/functions are listed and linked to outcomes. Clear levels of learning are defined for all outcomes at all levels (Introduce, Enhance, Reinforce).]
8. The assessment plan promotes continuous quality improvement by having outcomes and measures that are formative in nature. Formative assessments provide ongoing feedback that can be used to improve student learning and operations. [The primary purpose of IE assessment is to collect data to identify gaps in student learning and operations. This is demonstrated when assessment data presents an opportunity for improvement and a new strategy is implemented to remove the gap. For best practices, when a measure has a performance target of 100%, or is constant for 2-3 assessment cycles, it is advisable to conduct a granular (disaggregated) analysis to identify gaps in learning and/or operations.]
Exemplary (Level 5) ALL nine indicators are met.
9. The assessment plan “closes the loop” by linking new strategies (changes) to previous assessment results. [Program/Unit collects data to evaluate the impact of an implemented change to improve student learning and operations. The use of prior year’s results to improve student learning and operations demonstrates a “closed loop” process.]
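Read as a scoring rule, the plan-rubric levels above map the set of met indicators to a rating. The following sketch expresses that mapping in Python; the function is a hypothetical illustration, not part of the official rubric tooling:

```python
def plan_rubric_level(met: set[int]) -> int:
    """Map the set of met plan-rubric indicators (numbered 1-9) to a level.

    Developing indicators are #1-6, Proficient are #7-8, Exemplary is #9.
    Levels are checked from Exemplary downward, since the Level 1 and
    Level 2 wordings ("three or fewer" / "five or fewer") overlap.
    """
    developing = met & {1, 2, 3, 4, 5, 6}
    if met >= {1, 2, 3, 4, 5, 6, 7, 8, 9}:
        return 5  # Exemplary: all nine indicators met
    if developing == {1, 2, 3, 4, 5, 6} and (7 in met or 8 in met):
        return 4  # Proficient: all Developing plus #7 and/or #8
    if developing == {1, 2, 3, 4, 5, 6}:
        return 3  # Developing: all of #1-6 met
    if len(developing) > 3:
        return 2  # Emerging: four or five Developing indicators met
    return 1      # Establishing: three or fewer Developing indicators met

print(plan_rubric_level({1, 2, 3, 4, 5, 6, 9}))  # 3
```

Note that meeting indicator #9 alone does not raise the level: a plan that closes the loop but misses a Proficient indicator still rates Developing, which matches the rubric's requirement that all nine indicators be met for Exemplary.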
College of Charleston Institutional Effectiveness Assessment Results Rubric
Academic/Administrative Unit: ____________________________ Rubric Completed By: ______________________ Date Completed: _____________ Reviewed with Assessment Coordinator (Initial/Date): _______________ Rating: _____________

Instructions: Please review the assessment results and check which indicators are met, as well as provide necessary comments. Based on the number of indicators met, please identify the rating (Level) of the assessment results and indicate it above.
Levels Indicators Comments
Establishing (Level 1)
Three or fewer indicators from the Developing category are met.
Emerging (Level 2)
Five or fewer indicators from the Developing category are met.
Developing (Level 3)
ALL of the Developing indicators (#1-6) are met.
1. Complete, aggregated, and relevant data are provided for each measure. [If there are extenuating circumstances that lead to missing data, an explanation must be provided. Missing data for extenuating circumstances is only permitted for one assessment cycle. If appropriate, data should be disaggregated by distance learning, off-site locations, and mode of delivery.]
2. Data reporting is complete, concise, and well-presented. [Reported data are aligned and appropriate to the outcome and the corresponding measure. Sampling methodology, population size (N), and sample size (n) must be provided.]
3. Results clearly specify whether the performance target (performance expectations) for each measure has been met. [Assessment results are used for comparison of actual vs. expected performance targets. Data provide evidence of performance targets met, partially met, or not met.]
4. Results provide evidence that the assessment findings informed discussion and improvements in the Program/Unit.
5. Results include at least one applied and/or planned change(s) based on the assessment data to improve student learning, program quality, or unit operations. If no changes are provided, results should identify an area of improvement for the next cycle. [The discussion of the results should specifically identify any curricular/operational/budget changes as a result of assessment.]
6. Relevant assessment instruments (e.g., rubrics, survey instruments, etc.) are uploaded in Compliance Assist (e.g., via URL, as attachments, etc., if not proprietary).
Proficient (Level 4)
ALL of the Developing indicators plus indicator #7 are met.
7. The assessment report demonstrates how data analysis “closes the loop” by assessing the impact of applied changes. [Current year's results are compared to the previous year's results to evaluate the impact of a previously reported change to demonstrate use of results to improve student learning and operations.]
Exemplary (Level 5)
ALL eight indicators are met.
8. The impact of “closing the loop” with an improvement is demonstrated by analyzing follow-up data. [Examples of improvement(s) in student learning, program quality, or unit operations are provided and are directly linked to assessment data. The primary purpose of IE assessment is to assess the impact of an implemented change.]