
STATE PERFORMANCE PLAN / ANNUAL PERFORMANCE REPORT: PART B

for STATE FORMULA GRANT PROGRAMS under the Individuals with Disabilities Education Act

For reporting on FFY 2019

South Carolina

PART B DUE February 1, 2021

U.S. DEPARTMENT OF EDUCATION
WASHINGTON, DC 20202


Introduction

Instructions
Provide sufficient detail to ensure that the Secretary and the public are informed of and understand the State’s systems designed to drive improved results for students with disabilities and to ensure that the State Educational Agency (SEA) and Local Educational Agencies (LEAs) meet the requirements of IDEA Part B. This introduction must include descriptions of the State’s General Supervision System, Technical Assistance System, Professional Development System, Stakeholder Involvement, and Reporting to the Public.

Intro - Indicator Data

Executive Summary
The South Carolina Department of Education’s (SCDE) strategic vision includes that all students graduate prepared for success in college, careers, and citizenship. By 2022, districts will have available a system of personalized and digital learning that supports students in a safe learning environment to meet the Profile of the South Carolina Graduate. Defined core priorities include supporting the social-emotional learning, health, and safety needs of students through a whole-child approach; strengthening standards, curriculum, instruction, and assessment alignment within schools and districts; enhancing infrastructures, resources, data, and technology of the State's public educational systems; addressing the equity needs of districts and schools through differentiated supports and school transformation; and promoting educator and school leadership development.

The Office of Special Education Services (OSES) worked with the National Center for Systemic Improvement (NCSI) to develop a shared vision aligned to that of the SCDE. The vision statement describes the grounding assumptions, purpose, and goals for the office and reads: “If the OSES provides consistent, collaborative, proactive direction and support focused in the areas of academics, social emotional learning, early childhood development, and post-secondary transition by using data-based decision making, quality instruction (evidence-based practices), family and community engagement, and fidelity in implementation then local education agencies (LEAs) will have the infrastructure, capacity, and sustainability to provide students with disabilities equitable access and opportunity to meet the profile of the South Carolina graduate (world-class knowledge, world-class skills, and life and career characteristics).”

Additional information related to data collection and reporting
The SCDE continues to work with the South Carolina General Assembly to assist LEAs within the same county with an average daily membership of less than 1,500 students to consolidate in order to conserve resources and provide better services to students within the LEAs. Three LEAs did, in fact, consolidate during the 2018-19 school year and opened the 2019-20 school year as one LEA. Therefore, any data reported for lag years, such as Indicators 2 and 4, will reflect 88 LEAs, while all data for the current reporting year (FY19) will contain data for 86 LEAs. To restate: for the FY19 APR (school year 2019-20), the state had 86 LEAs; for the indicators requiring the use of lag data (FY18, or the 2018-19 school year), the state had 88 LEAs.

On March 15, 2020, in response to the COVID-19 pandemic, the governor of South Carolina issued an emergency proclamation closing all physical public school and government buildings throughout the state beginning the next day, March 16, 2020. This emergency proclamation was extended through the end of the 2019-20 school year and beyond. LEAs had to deploy emergency learning opportunities (ELOs) literally overnight. These ELOs were exactly what the name suggests: opportunities to provide some form of continued learning in response to the unprecedented emergency created by the pandemic. Some LEAs and schools in the state had the infrastructure in place to provide these ELOs online or virtually; others did not, so their ELOs consisted mainly of paper-and-pencil packets of work sent home through any means possible.

These COVID closures impacted children and their families in the following areas: Amount, quality, and type of learning time and content; Mode and frequency of communication between home and school; Access to WiFi, internet, or devices; Student engagement; Attendance/participation in ELOs; Access of students to basic healthcare services (both for physical and mental health); Access to meals; and Families’ financial situation.

COVID closures impacted the OSES in the following areas: Data collection, analysis, and reporting; Assessments and accountability; Finance; In-person learning/instructional modes; Provision of student health and wellness (mental and physical); Student outcomes; Teacher recruitment and retention; and Virtual/hybrid learning.

The SCDE, like all other state departments of education, had to develop and provide guidance on alternative means of providing not only instruction but also other critical services such as meals, healthcare, mental health services, and transition services. The state continues to struggle to get an adequate infrastructure in place to allow equitable access to remote learning for all families. Although the state is beginning to collect data on the impact of the pandemic on children in public schools in the short term (March to June 30), the disruption across the educational systems has continued and the long-term effects are yet to be known.

Number of Districts in your State/Territory during reporting year: 86

General Supervision System
The systems that are in place to ensure that IDEA Part B requirements are met, e.g., monitoring, dispute resolution, etc.
The SCDE has a system of general supervision in place to ensure that the Individuals with Disabilities Education Act (IDEA) Part B requirements are met. This system is designed to ensure that students with disabilities receive a free, appropriate public education (FAPE). This general supervision system includes the State’s Performance Plan; policies, procedures, and effective practices; effective dispute resolution; data on processes and results; integrated monitoring activities; targeted technical assistance (TA) and professional development (PD); corrections and sustaining of improvement; and


fiscal management and accountability. Descriptions of the components of the SCDE’s general supervision system are set forth below along with references and links to forms and detailed information utilized in the various general supervision processes. The focus of all eight elements of the state’s general supervision system is on improving outcomes for students with disabilities and ensuring these students have equitable access and opportunity to meet the Profile of the South Carolina Graduate as described above. The OSES has made this results-based accountability the primary emphasis of all activities within the office. With the assistance of NCSI, the OSES will continue to refine its general supervision system to focus on the requirements and activities that are most closely related to improving outcomes for students with disabilities and their families. The review and revision of the state’s system of general supervision and the development of the OSES mission statement have led to the coalescing of work around four areas to improve outcomes – early childhood, academic, social-emotional, and post-secondary. Staff provide integrated, tiered TA and PD based on LEA data. Additional information about the state's system of general supervision may be found at https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/#sppAndDet.

Technical Assistance System
The mechanisms that the State has in place to ensure the timely delivery of high quality, evidence-based technical assistance and support to LEAs.
The state’s TA and PD systems are based on summative outcomes data that are collected and analyzed annually. OSES staff work with LEAs to conduct root cause analyses to identify success gaps and causes for these gaps. LEAs develop implementation plans that address these causes and are designed to improve outcomes in one or more of the focus areas. South Carolina uses a tiered system of TA and PD provision based on data. The state provides universal TA and PD to all LEAs, parents, students, and other stakeholders. This universal TA is in the form of modules, infographics, links to state and national TA providers, and guidance documents. South Carolina uses all available information, including LEA determination data, profile data, and root cause analysis data, to determine areas in need of improvement at the LEA level. Once LEA-level needs are identified, the state provides targeted and intensive TA, depending on the LEA’s needs, aligned to those needs. Targeted technical assistance is provided to those LEAs with data indicating a need for more focused TA. This TA is provided via modules designed to address specific priority areas, face-to-face and virtual trainings, cohort trainings for LEAs with similar needs, and other methods. The state uses statewide TA mechanisms such as collaborations with the Transition Alliance of South Carolina, the Behavior Alliance of South Carolina, and the South Carolina Partnerships for Inclusion to provide intensive TA to LEAs with data indicating significant needs in one or more of the four focus areas. This intensive TA is typically provided at the LEA or school level and is highly tailored and individualized. Ongoing evaluation of TA and PD looks at fidelity of implementation as well as changing outcomes. This evaluation also includes communication with and from stakeholders to ensure needs are being addressed. The purpose of the evaluation is not only to guide implementation, but also to ensure sustainability of corrections and improvements.

Professional Development System
The mechanisms the State has in place to ensure that service providers have the skills to effectively provide services that improve results for students with disabilities.
The state’s TA and PD systems are based on summative outcomes data that are collected and analyzed annually. OSES staff work with LEAs to conduct root cause analyses to identify success gaps and causes for these gaps. LEAs develop implementation plans that address these causes and are designed to improve outcomes in one or more of the focus areas. South Carolina uses a tiered system of TA and PD provision based on data. The state provides universal TA and PD to all LEAs, parents, students, and other stakeholders. This universal TA is in the form of modules, infographics, links to state and national TA providers, and guidance documents. South Carolina uses all available information, including LEA determination data, profile data, and root cause analysis data, to determine areas in need of improvement at the LEA level. Once LEA-level needs are identified, the state provides targeted and intensive TA, depending on the LEA’s needs, aligned to those needs. Targeted technical assistance is provided to those LEAs with data indicating a need for more focused TA. This TA is provided via modules designed to address specific priority areas, face-to-face and virtual trainings, cohort trainings for LEAs with similar needs, and other methods. The state uses statewide TA mechanisms such as collaborations with the Transition Alliance of South Carolina, the Behavior Alliance of South Carolina, and the South Carolina Partnerships for Inclusion to provide intensive TA to LEAs with data indicating significant needs in one or more of the four focus areas. This intensive TA is typically provided at the LEA or school level and is highly tailored and individualized. Ongoing evaluation of TA and PD looks at fidelity of implementation as well as changing outcomes. This evaluation also includes communication with and from stakeholders to ensure needs are being addressed. The purpose of the evaluation is not only to guide implementation, but also to ensure sustainability of corrections and improvements.

Stakeholder Involvement
The mechanism for soliciting broad stakeholder input on targets in the SPP, including revisions to targets.
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors, and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development.
In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutes of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs,


review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, the Early Childhood Technical Assistance Center (ECTA), and the National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices, as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholder group developed in the Spring included representatives from student and parent advocacy groups throughout the state. This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.

Apply stakeholder involvement from introduction to all Part B results indicators (y/n): YES

Reporting to the Public
How and where the State reported to the public on the FFY 2018 performance of each LEA located in the State on the targets in the SPP/APR as soon as practicable, but no later than 120 days following the State’s submission of its FFY 2018 APR, as required by 34 CFR §300.602(b)(1)(i)(A); and a description of where, on its Web site, a complete copy of the State’s SPP, including any revision if the State has revised the SPP that it submitted with its FFY 2018 APR in 2020, is available.
The SCDE’s report on the performance of each LEA/SOP toward SPP targets may be found at https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/data-collection-and-reporting/district-lea-profiles/ffy-2018/. The most recent SPP/APR (FFY 2018) may be found at https://sites.ed.gov/idea/spp-apr-letters?selected-category=sppapr-part-b&selected-year=&state=South-Carolina.

Intro - Prior FFY Required Actions
The State's IDEA Part B determination for both 2019 and 2020 is Needs Assistance. In the State's 2020 determination letter, the Department advised the State of available sources of technical assistance, including OSEP-funded technical assistance centers, and required the State to work with appropriate entities. The Department directed the State to determine the results elements and/or compliance indicators, and improvement strategies, on which it will focus its use of available technical assistance, in order to improve its performance. The State must report, with its FFY 2019 SPP/APR submission, due February 1, 2021, on: (1) the technical assistance sources from which the State received assistance; and (2) the actions the State took as a result of that technical assistance.

In the FFY 2019 SPP/APR, the State must report FFY 2019 data for the State-identified Measurable Result (SiMR). Additionally, the State must, consistent with its evaluation plan described in Phase II, assess and report on its progress in implementing the SSIP. Specifically, the State must provide: (1) a narrative or graphic representation of the principal activities implemented in Phase III, Year Five; (2) measures and outcomes that were implemented and achieved since the State's last SSIP submission (i.e., April 1, 2020); (3) a summary of the SSIP’s coherent improvement strategies, including infrastructure improvement strategies and evidence-based practices that were implemented and progress toward short-term and long-term outcomes that are intended to impact the SiMR; and (4) any supporting data that demonstrates that implementation of these activities is impacting the State’s capacity to improve its SiMR data.

Response to actions required in FFY 2018 SPP/APR
Consistent with the determination of needs assistance, South Carolina has accessed technical assistance from a variety of sources, including OSEP-funded centers, during FFY19. The various forms of support have included conference calls, use of materials and fact sheets, on-site and virtual consultation and training, webinars, e-newsletters, video conferencing, modules, and social media. The NCSI, the National Center for Intensive Intervention (NCII), the IDEA Data Center (IDC), and ECTA have been the TA centers providing the majority of support for the SPP/APR process, the review and revision of the state’s general supervision system, and the development, review, and revision of fiscal policies, procedures, and practices. The NCII continues to assist the state to expand its SSIP and to develop Tier 3 interventions with an ultimate goal of improving reading proficiency and implementation of the SSIP work. South Carolina is participating actively in cohort 2 of the NCII Data-Based Individualization TA. OSES staff have participated in face-to-face and virtual meetings with NCII staff to support the development of the infrastructure for data-based individualization in schools. The Pyramid Model Consortium and Pyramid Innovation Center also provide significant support to the OSES in implementing this evidence-based model of social-emotional supports for young children at a state level. This work includes building infrastructure in our state to support early childhood educators and parents in teaching children desired behaviors and reducing unwanted behavior so that preschool suspensions and expulsions can be reduced. The Pyramid Model Consortium has provided guidance in developing and implementing the Pyramid Model in South Carolina.

The majority of technical assistance during the FY19 year has come from NCSI, IDC, and NASDSE. These TA centers have been instrumental, not only in continuing support for the review and revision of the state’s system of general supervision, but also in the support of the state during the pandemic. Weekly national calls as well as frequent state-specific calls provided invaluable support during this unprecedented time, not only in terms of guidance and practices, but also for moral support. This support enabled OSES staff to pass along the same supports to LEA staff, parents, and students. NASDSE was able to provide state directors a platform through which to share concerns, strategies, and guidance. South Carolina has used TA from these providers to continue developing goals and strategies within four focus areas to improve outcomes for students with disabilities – early childhood, academic, social-emotional, and post-secondary. South Carolina would like to express its heartfelt appreciation to the staff at these TA centers for their hard work, support, and assistance during this unprecedented time.

The U.S. Department of Education, Office of Special Education and Rehabilitative Services, Office of Special Education Programs (OSEP) continues to provide ongoing guidance with respect to both programmatic and fiscal areas. Over the last year, OSEP staff have been critical in assisting the state. That support has included routine monthly conference calls, periodic calls with respect to fiscal questions, email correspondence, onsite technical assistance, national conference presentations with South Carolina staff, and presentations by OSEP staff at State events.


Intro - OSEP Response
The State's determinations for both 2019 and 2020 were Needs Assistance. Pursuant to section 616(e)(1) of the IDEA and 34 C.F.R. § 300.604(a), OSEP's June 25, 2020 determination letter informed the State that it must report with its FFY 2019 SPP/APR submission, due February 1, 2021, on: (1) the technical assistance sources from which the State received assistance; and (2) the actions the State took as a result of that technical assistance. The State provided the required information.

Due to the circumstances created by the COVID-19 pandemic, and resulting school closures, the State does not have any FFY 2019 data for indicator 17.

Intro - Required Actions
The State's IDEA Part B determination for both 2020 and 2021 is Needs Assistance. In the State's 2021 determination letter, the Department advised the State of available sources of technical assistance, including OSEP-funded technical assistance centers, and required the State to work with appropriate entities. The Department directed the State to determine the results elements and/or compliance indicators, and improvement strategies, on which it will focus its use of available technical assistance, in order to improve its performance. The State must report, with its FFY 2020 SPP/APR submission, due February 1, 2022, on: (1) the technical assistance sources from which the State received assistance; and (2) the actions the State took as a result of that technical assistance.


Indicator 1: Graduation

Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Results indicator: Percent of youth with Individualized Education Programs (IEPs) graduating from high school with a regular high school diploma. (20 U.S.C. 1416 (a)(3)(A))

Data Source
Same data as used for reporting to the Department of Education (Department) under Title I of the Elementary and Secondary Education Act (ESEA).

Measurement
States may report data for children with disabilities using either the four-year adjusted cohort graduation rate required under the ESEA or an extended-year adjusted cohort graduation rate under the ESEA, if the State has established one.

Instructions
Sampling is not allowed. Describe the results of the State’s examination of the data for the year before the reporting year (e.g., for the FFY 2019 SPP/APR, use data from 2018-2019), and compare the results to the target. Provide the actual numbers used in the calculation. Provide a narrative that describes the conditions youth must meet in order to graduate with a regular high school diploma and, if different, the conditions that youth with IEPs must meet in order to graduate with a regular high school diploma. If there is a difference, explain. Targets should be the same as the annual graduation rate targets for children with disabilities under Title I of the ESEA. States must continue to report the four-year adjusted cohort graduation rate for all students and disaggregated by student subgroups, including the children with disabilities subgroup, as required under section 1111(h)(1)(C)(iii)(II) of the ESEA, on State report cards under Title I of the ESEA even if they only report an extended-year adjusted cohort graduation rate for the purpose of SPP/APR reporting.
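For readers unfamiliar with the adjusted cohort graduation rate (ACGR), the sketch below illustrates the general form of the calculation: cohort members with IEPs who earn a regular diploma, divided by the adjusted cohort of youth with IEPs eligible to graduate. The function name and the counts are illustrative assumptions, not South Carolina's reported figures (the actual numerator is suppressed in the tables that follow).

```python
# Illustrative sketch of a four-year adjusted cohort graduation rate (ACGR) calculation.
# Hypothetical counts only; these are not South Carolina's reported figures.
def adjusted_cohort_graduation_rate(regular_diploma_graduates: int, adjusted_cohort: int) -> float:
    """Percent of the adjusted cohort earning a regular high school diploma."""
    if adjusted_cohort <= 0:
        raise ValueError("adjusted cohort must be a positive count")
    return 100.0 * regular_diploma_graduates / adjusted_cohort

print(round(adjusted_cohort_graduation_rate(500, 1_000), 1))  # 50.0 (hypothetical)
```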

1 - Indicator Data

Historical Data
Baseline Year: 2011
Baseline Data: 38.40%

FFY        2014     2015     2016     2017     2018
Target >=  42.30%   44.30%   46.30%   48.30%   50.30%
Data       43.20%   49.02%   52.06%   53.54%   52.10%

Targets
FFY 2019
Target >= 54.40%

Targets: Description of Stakeholder Input The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutes of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholders group developed in the Spring included representatives from student and parent advocacy groups throughout the state. 
This group met weekly initially to identify and address concerns of parents and students with disabilities that resulted during the pandemic.


Prepopulated Data

Source: SY 2018-19 Cohorts for Regulatory Adjusted-Cohort Graduation Rate (EDFacts file spec FS151; Data group 696)
Date: 07/27/2020
Description: Number of youth with IEPs graduating with a regular diploma
Data: * [1]

Source: SY 2018-19 Cohorts for Regulatory Adjusted-Cohort Graduation Rate (EDFacts file spec FS151; Data group 696)
Date: 07/27/2020
Description: Number of youth with IEPs eligible to graduate
Data: 7,909

Source: SY 2018-19 Regulatory Adjusted Cohort Graduation Rate (EDFacts file spec FS150; Data group 695)
Date: 07/27/2020
Description: Regulatory four-year adjusted-cohort graduation rate table
Data: 54.4% [2]

FFY 2019 SPP/APR Data
Number of youth with IEPs in the current year’s adjusted cohort graduating with a regular diploma: * [1]
Number of youth with IEPs in the current year’s adjusted cohort eligible to graduate: 7,909
FFY 2018 Data: 52.10%
FFY 2019 Target: >= 54.40%
FFY 2019 Data: 54.4% [2]
Status: Did Not Meet Target
Slippage: No Slippage

Graduation Conditions
Choose the length of Adjusted Cohort Graduation Rate your state is using: 4-year ACGR

Provide a narrative that describes the conditions youth must meet in order to graduate with a regular high school diploma and, if different, the conditions that youth with IEPs must meet in order to graduate with a regular high school diploma. If there is a difference, explain.
South Carolina uses the following to determine the four-year cohort graduation rate. Section 200.19 of the Title I regulations issued under the Elementary and Secondary Education Act on December 2, 2002, defines graduation rate to mean: the percentage of students, measured from the beginning of high school, who graduate from public high school with a regular diploma (not including a GED or any other diploma not fully aligned with the state's academic standards) in the standard number of years; or another more accurate definition developed by the state and approved by the Secretary in the state plan that more accurately measures the rate of students who graduate from high school with a regular diploma and avoids counting a dropout as a transfer. South Carolina has stringent guidelines for graduation with a diploma, offering only one recognized academic diploma for all students. Graduation with a state-issued regular diploma in South Carolina requires the successful completion of twenty-four units of coursework in specified areas, as prescribed by South Carolina Board of Education regulation and state law. South Carolina high school graduation requirements can be found at https://ed.sc.gov/districts-schools/state-accountability/high-school-courses-and-requirements/.

Are the conditions that youth with IEPs must meet to graduate with a regular high school diploma different from the conditions noted above? (yes/no): NO

Provide additional information about this indicator (optional): NA

1 - Prior FFY Required Actions
None

1 - OSEP Response

1 - Required Actions

[1] Data suppressed due to privacy protection.
[2] Percentage blurred due to privacy protection.


Indicator 2: Drop Out

Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Results indicator: Percent of youth with IEPs dropping out of high school. (20 U.S.C. 1416 (a)(3)(A))

Data Source
OPTION 1: Same data as used for reporting to the Department under section 618 of the Individuals with Disabilities Education Act (IDEA), using the definitions in EDFacts file specification FS009.
OPTION 2: Use the same data source and measurement that the State used to report in its FFY 2010 SPP/APR that was submitted on February 1, 2012.

Measurement
OPTION 1: States must report a percentage using the number of youth with IEPs (ages 14-21) who exited special education due to dropping out in the numerator and the number of all youth with IEPs who left high school (ages 14-21) in the denominator.
OPTION 2: Use the same data source and measurement that the State used to report in its FFY 2010 SPP/APR that was submitted on February 1, 2012.

Instructions
Sampling is not allowed.
OPTION 1: Use 618 exiting data for the year before the reporting year (e.g., for the FFY 2019 SPP/APR, use data from 2018-2019). Include in the denominator the following exiting categories: (a) graduated with a regular high school diploma; (b) received a certificate; (c) reached maximum age; (d) dropped out; or (e) died. Do not include in the denominator the number of youths with IEPs who exited special education due to: (a) transferring to regular education; or (b) who moved, but are known to be continuing in an educational program.
OPTION 2: Use the annual event school dropout rate for students leaving a school in a single year determined in accordance with the National Center for Education Statistics' Common Core of Data. If the State has made or proposes to make changes to the data source or measurement under Option 2, when compared to the information reported in its FFY 2010 SPP/APR submitted on February 1, 2012, the State should include a justification as to why such changes are warranted.
Options 1 and 2: Data for this indicator are “lag” data. Describe the results of the State’s examination of the data for the year before the reporting year (e.g., for the FFY 2019 SPP/APR, use data from 2018-2019), and compare the results to the target. Provide a narrative that describes what counts as dropping out for all youth and, if different, what counts as dropping out for youth with IEPs. If there is a difference, explain.
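As an aid to readers, here is a minimal sketch of the Option 1 measurement described above. South Carolina actually reports under Option 2, and every count in the example is a hypothetical placeholder rather than State data.

```python
# Sketch of the OPTION 1 dropout measurement only (South Carolina reports under Option 2).
# Denominator = youth with IEPs (ages 14-21) who left high school via exiting categories
# (a) regular diploma, (b) certificate, (c) maximum age, (d) dropped out, (e) died.
# Numerator = category (d). All counts below are hypothetical.
def option1_dropout_percent(diploma: int, certificate: int, max_age: int,
                            dropped_out: int, died: int) -> float:
    left_high_school = diploma + certificate + max_age + dropped_out + died
    return 100.0 * dropped_out / left_high_school

print(round(option1_dropout_percent(900, 50, 25, 100, 5), 2))  # 9.26 (hypothetical)
```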

2 - Indicator Data

Historical Data
Baseline Year: 2011
Baseline Data: 4.40%

FFY        2014    2015    2016    2017    2018
Target <=  4.40%   4.20%   4.00%   3.80%   3.60%
Data       3.61%   3.30%   3.52%   4.20%   4.03%

Targets
FFY 2019
Target <= 3.40%

Targets: Description of Stakeholder Input
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in


the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutes of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholders group developed in the Spring included representatives from student and parent advocacy groups throughout the state. This group met weekly initially to identify and address concerns of parents and students with disabilities that resulted during the pandemic.

Please indicate the reporting option used on this indicator: Option 2

Prepopulated Data

Source: SY 2018-19 Exiting Data Groups (EDFacts file spec FS009; Data Group 85)
Date: 05/27/2020
Description: Number of youth with IEPs (ages 14-21) who exited special education by graduating with a regular high school diploma (a)
Data: 3,186

Source: SY 2018-19 Exiting Data Groups (EDFacts file spec FS009; Data Group 85)
Date: 05/27/2020
Description: Number of youth with IEPs (ages 14-21) who exited special education by receiving a certificate (b)
Data: 585

Source: SY 2018-19 Exiting Data Groups (EDFacts file spec FS009; Data Group 85)
Date: 05/27/2020
Description: Number of youth with IEPs (ages 14-21) who exited special education by reaching maximum age (c)
Data: 298

Source: SY 2018-19 Exiting Data Groups (EDFacts file spec FS009; Data Group 85)
Date: 05/27/2020
Description: Number of youth with IEPs (ages 14-21) who exited special education due to dropping out (d)
Data: 1,828

Source: SY 2018-19 Exiting Data Groups (EDFacts file spec FS009; Data Group 85)
Date: 05/27/2020
Description: Number of youth with IEPs (ages 14-21) who exited special education as a result of death (e)
Data: 31

Has your State made or does it propose to make changes to the data source under Option 2, when compared to the information reported in its FFY 2010 SPP/APR submitted on February 1, 2012? (yes/no): NO
Use a different calculation methodology (yes/no): NO
Change numerator description in data table (yes/no): NO
Change denominator description in data table (yes/no): NO

FFY 2019 SPP/APR Data

Number of youth with IEPs who exited special education due to dropping out: 1,677
Total number of High School Students with IEPs by Cohort: 27,856
FFY 2018 Data: 4.03%
FFY 2019 Target: <= 3.40%
FFY 2019 Data: 6.02%
Status: Did Not Meet Target
Slippage: Slippage
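As a quick arithmetic check (not part of the official submission), the FFY 2019 figure above follows directly from the two reported counts:

```python
# Verifying the FFY 2019 dropout percentage reported above from the two counts shown.
dropouts = 1_677
high_school_students_with_ieps = 27_856
print(f"{100.0 * dropouts / high_school_students_with_ieps:.2f}%")  # 6.02%
```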

Provide reasons for slippage, if applicable
The OSES noted that, from 2017-18 to 2018-19, South Carolina’s unemployment rate, according to the SC Department of Employment and Workforce (DEW), decreased from 3.5% to 2.8%. However, the DEW does not report unemployment by disability status and age group. Therefore, although the unemployment rate decreased and students leaving school for employment have the potential to affect the dropout rate, there is no way to determine whether leaving for employment contributed to the increase in dropouts.

An annual report released by the state Department of Education reveals that the number of dropouts for all students decreased in just over half of the public school districts in South Carolina, and the overall statewide dropout rate decreased from 2.4% to 2.2%. The dropout rate for students with disabilities increased during this time. When districts are examined in relation to students with disabilities:

less than half of school districts decreased their dropout rate;
8% of districts experienced no change;


30% experienced less than a 1% change;
less than 14% of districts experienced a 1% or larger change*

*However, the districts with greater than a 1% change had a small n-size, so the percentage change appeared large for those districts but had minor impact at the aggregated state level.

The OSES noted that, at the state level, there was an increase in dropout rates and a comparable increase in teacher vacancies, and it wanted to investigate whether there was a relationship. The OSES examined district data to compare the percentage of student dropouts with the percentage of special education teacher vacancies and/or the number of teachers not certified to teach special education. When examined at the district level, there was no statistically significant correlation between the two variables.

No single defining cause could be found to explain the increase in the dropout rate for students with disabilities. There were small increases across at least 44% of districts statewide, which combined to create a large statewide increase.

Provide a narrative that describes what counts as dropping out for all youth
The State Board of Education defines a dropout as a student who leaves school for any reason, other than death, prior to graduation or completion of a course of studies and without transferring to another school or institution.

Is there a difference in what counts as dropping out for youth with IEPs? (yes/no): YES

If yes, explain the difference in what counts as dropping out for youth with IEPs below.
The Policies and Procedures for the Collection of School Dropout Data, July 2015, has a slightly different definition of "dropout". If a student holds a state certificate or a district special education certificate and has completed the requirements of his/her IEP, has reached the age of twenty-one, or has entered a residential or day care facility, the student is not counted as a dropout. South Carolina has used this method of reporting since 2010 and there have been no changes in reporting. For more information and definitions regarding South Carolina's dropout data, please see the SCDE Policies and Procedures for Dropout: https://ed.sc.gov/districts-schools/school-safety/discipline-related-reports/dropout-data/2019-dropout-policies-and-procedures-manual/. As outlined in the SCDE Policies and Procedures for Dropout, there are varying definitions.

Provide additional information about this indicator (optional)
There was a need to provide an alternate option for students with disabilities who are unable to earn a regular high school diploma to demonstrate their ability to transition into the work community. The uniform, state-recognized South Carolina High School Credential (SCHSC) is aligned with the State's Profile of the South Carolina Graduate and with a course of study for those students with disabilities whose IEP teams determine this course of study is appropriate. The purpose of the SCHSC is to provide equitable job-readiness opportunities for these students throughout the state, ensure they have evidence of employability skills, and honor the work they have undertaken in our public schools. The state has been working closely with TA centers on providing guidance on alternative credentials. The first students should receive the SC High School Credential in 2022. Additional information may be found at https://thesccredential.org/.

2 - Prior FFY Required Actions
None

2 - OSEP Response

2 - Required Actions


Indicator 3B: Participation for Students with IEPs

Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Results indicator: Participation and performance of children with IEPs on statewide assessments:

A. Indicator 3A – Reserved
B. Participation rate for children with IEPs
C. Proficiency rate for children with IEPs against grade level and alternate academic achievement standards.

(20 U.S.C. 1416 (a)(3)(A))

Data Source
3B. Same data as used for reporting to the Department under Title I of the ESEA, using EDFacts file specifications FS185 and 188.

Measurement
B. Participation rate percent = [(# of children with IEPs participating in an assessment) divided by the (total # of children with IEPs enrolled during the testing window)]. Calculate separately for reading and math. The participation rate is based on all children with IEPs, including both children with IEPs enrolled for a full academic year and those not enrolled for a full academic year.

Instructions
Describe the results of the calculations and compare the results to the targets. Provide the actual numbers used in the calculation. Include information regarding where to find public reports of assessment participation and performance results, as required by 34 CFR §300.160(f), i.e., a link to the Web site where these data are reported.
Indicator 3B: Provide separate reading/language arts and mathematics participation rates, inclusive of all ESEA grades assessed (3-8 and high school), for children with IEPs. Account for ALL children with IEPs, in all grades assessed, including children not participating in assessments and those not enrolled for a full academic year. Only include children with disabilities who had an IEP at the time of testing.
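To make the measurement above concrete, here is a minimal sketch of the participation-rate calculation. The counts are hypothetical, since South Carolina has no FFY 2019 assessment data because of the COVID-19 waiver described later in this indicator.

```python
# Sketch of the Indicator 3B participation-rate measurement, computed separately for
# reading and math. Counts below are hypothetical, not South Carolina data.
def participation_rate_percent(ieps_participating: int, ieps_enrolled_during_window: int) -> float:
    return 100.0 * ieps_participating / ieps_enrolled_during_window

print(round(participation_rate_percent(9_700, 10_000), 2))  # 97.0 (hypothetical)
```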

3B - Indicator Data

Reporting Group Selection
Based on previously reported data, these are the grade groups defined for this indicator.

Group A (Overall): Grades 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, and HS

Historical Data: Reading
Group A (Overall); Baseline Year: 2005; Baseline Data: 86.00%

FFY        2014    2015    2016    2017    2018
Target >=  95.00%  95.00%  95.00%  95.00%  95.00%
Actual     96.57%  98.33%  98.87%  99.09%  98.64%

Historical Data: Math
Group A (Overall); Baseline Year: 2005; Baseline Data: 87.00%

FFY        2014    2015    2016    2017    2018
Target >=  95.00%  95.00%  95.00%  95.00%  95.00%
Actual     96.36%  98.20%  98.98%  99.18%  98.70%

Targets
Reading, Group A (Overall): FFY 2019 Target >= 95.00%
Math, Group A (Overall): FFY 2019 Target >= 95.00%

Targets: Description of Stakeholder Input The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other


LEA-level administrators. In addition, faculty from numerous state institutes of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholders group developed in the Spring included representatives from student and parent advocacy groups throughout the state. This group met weekly initially to identify and address concerns of parents and students with disabilities that resulted during the pandemic.

FFY 2019 Data Disaggregation from EDFacts
Include the disaggregated data in your final SPP/APR. (yes/no): YES

Data Source: SY 2019-20 Assessment Data Groups - Reading (EDFacts file spec FS188; Data Group: 589)
Date:

Reading Assessment Participation Data by Grade

Grade 3 4 5 6 7 8 9 10 11 12 HS

a. Children with IEPs

b. IEPs in regular assessment with no accommodations

c. IEPs in regular assessment with accommodations

f. IEPs in alternate assessment against alternate standards

Data Source: SY 2019-20 Assessment Data Groups - Math (EDFacts file spec FS185; Data Group: 588)
Date:

Math Assessment Participation Data by Grade

Grade 3 4 5 6 7 8 9 10 11 12 HS

a. Children with IEPs

b. IEPs in regular assessment with no accommodations

c. IEPs in regular assessment with accommodations

f. IEPs in alternate assessment against alternate standards

FFY 2019 SPP/APR Data: Reading Assessment


Group: A (Overall)
Number of Children with IEPs: (no data reported)
Number of Children with IEPs Participating: (no data reported)
FFY 2018 Data: 98.64%
FFY 2019 Target: >= 95.00%
FFY 2019 Data: N/A
Status: N/A
Slippage: N/A

FFY 2019 SPP/APR Data: Math Assessment

Group: A (Overall)
Number of Children with IEPs: (no data reported)
Number of Children with IEPs Participating: (no data reported)
FFY 2018 Data: 98.70%
FFY 2019 Target: >= 95.00%
FFY 2019 Data: N/A
Status: N/A
Slippage: N/A

Regulatory Information
The SEA (or, in the case of a district-wide assessment, the LEA) must make available to the public, and report to the public with the same frequency and in the same detail as it reports on the assessment of nondisabled children: (1) the number of children with disabilities participating in (a) regular assessments, and the number of those children who were provided accommodations in order to participate in those assessments, and (b) alternate assessments aligned with alternate achievement standards; and (2) the performance of children with disabilities on regular assessments and on alternate assessments, compared with the achievement of all children, including children with disabilities, on those assessments. [20 U.S.C. 1412 (a)(16)(D); 34 CFR §300.160(f)]

Public Reporting Information
Provide links to the page(s) where you provide public reports of assessment results.
Due to the waiver (see below), there were no assessment results to report publicly.

Provide additional information about this indicator (optional)
On March 15, 2020, the governor of SC issued an emergency proclamation closing all physical public school buildings beginning March 16. This executive order was subsequently extended through the end of the 2019-20 school year. On March 17, 2020, Superintendent Spearman sent a letter to the U.S. Deputy Secretary of Education indicating South Carolina’s intent to suspend federally required assessments and giving notice of submitting additional accountability measure waivers. On March 20, 2020, the U.S. Department of Education released a form for states to request such waivers. The SCDE completed and submitted the waiver form on the same day, and the U.S. Department of Education reviewed it and gave South Carolina approval to suspend assessments and waived certain accountability measures. These waivers were requested and granted due to the pandemic. All states and territories were granted similar waivers.

For spring 2020, South Carolina did not administer any of the following assessment programs included in Indicator 3: SC READY (English language arts and mathematics in grades 3–8); the End-of-Course Examination Program (English, Algebra, Biology, United States History and the Constitution), for which the requirement that these examinations count 20 percent was also waived; or Alternate Assessments.

The USED approval letter may be found at https://ed.sc.gov/newsroom/covid-19-coronavirus-and-south-carolina-schools/usde-covid-waiver-approval/.

3B - Prior FFY Required Actions
Within 90 days of the receipt of the State's 2020 determination letter, the State must provide to OSEP a Web link that demonstrates that it has reported, for FFY 2018, to the public, on the statewide assessments of children with disabilities in accordance with 34 C.F.R. § 300.160(f). In addition, OSEP reminds the State that in the FFY 2019 SPP/APR, the State must include a Web link that demonstrates compliance with 34 C.F.R. § 300.160(f) for FFY 2019.
Response to actions required in FFY 2018 SPP/APR
The SCDE has posted the required information related to public reporting on the FY18 statewide assessments of children with disabilities in accordance with 34 C.F.R. § 300.160(f). Because a waiver of all statewide accountability measures was granted for FY19, no data are available to post.

The information may be found at https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/#sppAndDet and is entitled SC IDEA Assessment Indicator 3.

3B - OSEP Response
OSEP's response to the State's FFY 2018 SPP/APR required the State to provide OSEP with a Web link that demonstrates that it has reported, for FFY 2018, to the public, on the statewide assessments of children with disabilities in accordance with 34 C.F.R. § 300.160(f). The State provided the required information.

The State was not required to provide any data for this indicator. Due to the circumstances created by the COVID-19 pandemic, and resulting school closures, the State received a waiver of the assessment requirements in section 1111(b)(2) of the ESEA, and, as a result, does not have any FFY 2019 data for this indicator.

3B - Required Actions


Indicator 3C: Proficiency for Students with IEPs
Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Results indicator: Participation and performance of children with IEPs on statewide assessments:

A. Indicator 3A – Reserved
B. Participation rate for children with IEPs
C. Proficiency rate for children with IEPs against grade level and alternate academic achievement standards.

(20 U.S.C. 1416(a)(3)(A))
Data Source
3C. Same data as used for reporting to the Department under Title I of the ESEA, using EDFacts file specifications FS175 and 178.
Measurement
C. Proficiency rate percent = [(# of children with IEPs scoring at or above proficient against grade level and alternate academic achievement standards) divided by the (total # of children with IEPs who received a valid score and for whom a proficiency level was assigned)]. Calculate separately for reading and math. The proficiency rate includes both children with IEPs enrolled for a full academic year and those not enrolled for a full academic year.
Instructions
Describe the results of the calculations and compare the results to the targets. Provide the actual numbers used in the calculation.
Include information regarding where to find public reports of assessment participation and performance results, as required by 34 CFR §300.160(f), i.e., a link to the Web site where these data are reported.
Indicator 3C: Proficiency calculations in this SPP/APR must result in proficiency rates for reading/language arts and mathematics assessments (combining regular and alternate) for children with IEPs, in all grades assessed (3-8 and high school), including both children with IEPs enrolled for a full academic year and those not enrolled for a full academic year. Only include children with disabilities who had an IEP at the time of testing.
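For illustration only, the proficiency rate formula above can be sketched in a few lines of Python; the function name and the sample counts below are hypothetical and are not drawn from South Carolina's reporting system.

    def proficiency_rate_percent(num_proficient, num_valid_scores):
        # Indicator 3C: children with IEPs scoring at or above proficient
        # (regular and alternate assessments combined) divided by all children
        # with IEPs who received a valid score and were assigned a proficiency
        # level, times 100. Calculated separately for reading and math.
        if num_valid_scores == 0:
            return None  # no assessment data, as in the FFY 2019 waiver year
        return 100 * num_proficient / num_valid_scores

    # Hypothetical counts: 1,500 proficient of 10,000 valid scores -> 15.0
    print(proficiency_rate_percent(1500, 10000))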

3C - Indicator Data
Reporting Group Selection
Based on previously reported data, these are the grade groups defined for this indicator.

Group | Group Name | Grade 3 | Grade 4 | Grade 5 | Grade 6 | Grade 7 | Grade 8 | Grade 9 | Grade 10 | Grade 11 | Grade 12 | HS
A | Elementary/Middle | X | X | X | X | X | X | | | | |
B | High School | | | | | | | X | X | X | X | X

Historical Data: Reading

Group | Group Name | Baseline | FFY | 2014 | 2015 | 2016 | 2017 | 2018
A | Elementary/Middle | 2015 | Target >= | 9.31% | 10.69% | 20.00% | 30.00% | 40.00%
A | Elementary/Middle | 10.69% | Actual | 9.31% | 10.69% | 9.43% | 9.95% | 13.23%
B | High School | 2015 | Target >= | 39.83% | 41.06% | 42.56% | 42.56% | 44.06%
B | High School | 41.06% | Actual | 39.83% | 41.06% | 32.50% | 36.89% | 35.09%

Historical Data: Math

Group | Group Name | Baseline | FFY | 2014 | 2015 | 2016 | 2017 | 2018
A | Elementary/Middle | 2015 | Target >= | 17.68% | 13.32% | 22.32% | 31.32% | 40.32%
A | Elementary/Middle | 13.32% | Actual | 17.68% | 13.32% | 12.37% | 13.46% | 15.56%
B | High School | 2015 | Target >= | 57.96% | 52.55% | 54.05% | 55.55% | 57.05%
B | High School | 52.55% | Actual | 57.96% | 52.55% | 33.55% | 25.77% | 27.30%

Targets

Subject | Group | Group Name | 2019
Reading | A | Elementary/Middle | >= 40.00%
Reading | B | High School | >= 44.06%


Math | A | Elementary/Middle | >= 40.32%
Math | B | High School | >= 57.05%

Targets: Description of Stakeholder Input
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutions of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholder group developed in the Spring included representatives from student and parent advocacy groups throughout the state. This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.

FFY 2019 Data Disaggregation from EDFacts
Include the disaggregated data in your final SPP/APR. (yes/no) YES
Data Source: SY 2019-20 Assessment Data Groups - Reading (EDFacts file spec FS178; Data Group: 584)
Date:

Reading Proficiency Data by Grade

Grade 3 4 5 6 7 8 9 10 11 12 HS

a. Children with IEPs who received a valid score and a proficiency was assigned

b. IEPs in regular assessment with no accommodations scored at or above proficient against grade level

c. IEPs in regular assessment with accommodations scored at or above proficient against grade level

f. IEPs in alternate assessment against alternate standards scored at or above proficient against grade level

Data Source: SY 2019-20 Assessment Data Groups - Math (EDFacts file spec FS175; Data Group: 583)


Date:

Math Proficiency Data by Grade

Grade 3 4 5 6 7 8 9 10 11 12 HS

a. Children with IEPs who received a valid score and a proficiency was assigned

b. IEPs in regular assessment with no accommodations scored at or above proficient against grade level

c. IEPs in regular assessment with accommodations scored at or above proficient against grade level

f. IEPs in alternate assessment against alternate standards scored at or above proficient against grade level

FFY 2019 SPP/APR Data: Reading Assessment

Group | Group Name | Children with IEPs who received a valid score and for whom a proficiency level was assigned | Number of Children with IEPs Proficient | FFY 2018 Data | FFY 2019 Target | FFY 2019 Data | Status | Slippage
A | Elementary/Middle | | | 13.23% | 40.00% | | N/A | N/A
B | High School | | | 35.09% | 44.06% | | N/A | N/A

FFY 2019 SPP/APR Data: Math Assessment

Group | Group Name | Children with IEPs who received a valid score and for whom a proficiency level was assigned | Number of Children with IEPs Proficient | FFY 2018 Data | FFY 2019 Target | FFY 2019 Data | Status | Slippage
A | Elementary/Middle | | | 15.56% | 40.32% | | N/A | N/A
B | High School | | | 27.30% | 57.05% | | N/A | N/A

Regulatory Information
The SEA (or, in the case of a district-wide assessment, the LEA) must make available to the public, and report to the public with the same frequency and in the same detail as it reports on the assessment of nondisabled children: (1) the number of children with disabilities participating in: (a) regular assessments, and the number of those children who were provided accommodations in order to participate in those assessments; and (b) alternate assessments aligned with alternate achievement standards; and (2) the performance of children with disabilities on regular assessments and on alternate assessments, compared with the achievement of all children, including children with disabilities, on those assessments. [20 U.S.C. 1412(a)(16)(D); 34 CFR §300.160(f)]

Public Reporting Information
Provide links to the page(s) where you provide public reports of assessment results.
Due to the waiver, there were no assessment results to report to the public.
Provide additional information about this indicator (optional)


On March 15, 2020, the governor of SC issued an emergency proclamation closing all physical public school buildings beginning March 16. This executive order was subsequently extended through the end of the 2019-20 school year. On March 17, 2020, Superintendent Spearman sent a letter to the U.S. Department of Education Deputy Secretary indicating South Carolina's intent to suspend federally required assessments and giving notice that the state would submit additional accountability waiver requests. On March 20, 2020, the U.S. Department of Education released a form for states to request such waivers. The SCDE completed and submitted the waiver form the same day, and the U.S. Department of Education reviewed the request and approved South Carolina's suspension of assessments and waiver of certain accountability measures. These waivers were requested and granted due to the pandemic. All states and territories were granted similar waivers.

For spring 2020, South Carolina did not administer any of the following assessment programs included in Indicator 3: SC READY (English language arts and mathematics in grades 3–8); the End-of-Course Examination Program (English, Algebra, Biology, United States History and the Constitution), for which the requirement that these examinations count 20 percent was also waived; or Alternate Assessments.

The USED approval letter may be found at https://ed.sc.gov/newsroom/covid-19-coronavirus-and-south-carolina-schools/usde-covid-waiver-approval/.

3C - Prior FFY Required Actions

Response to actions required in FFY 2018 SPP/APR

3C - OSEP Response
The State was not required to provide any data for this indicator. Due to the circumstances created by the COVID-19 pandemic, and resulting school closures, the State received a waiver of the assessment requirements in section 1111(b)(2) of the ESEA, and, as a result, does not have any FFY 2019 data for this indicator.

3C - Required Actions


Indicator 4A: Suspension/Expulsion
Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Results Indicator: Rates of suspension and expulsion:

A. Percent of districts that have a significant discrepancy in the rate of suspensions and expulsions of greater than 10 days in a school year for children with IEPs

(20 U.S.C. 1416(a)(3)(A); 1412(a)(22))
Data Source
State discipline data, including State's analysis of State's Discipline data collected under IDEA Section 618, where applicable. Discrepancy can be computed by either comparing the rates of suspensions and expulsions for children with IEPs to rates for nondisabled children within the LEA or by comparing the rates of suspensions and expulsions for children with IEPs among LEAs within the State.
Measurement
Percent = [(# of districts that meet the State-established n size (if applicable) that have a significant discrepancy in the rates of suspensions and expulsions for greater than 10 days in a school year of children with IEPs) divided by the (# of districts in the State that meet the State-established n size (if applicable))] times 100.
Include State's definition of "significant discrepancy."
Instructions
If the State has established a minimum n size requirement, the State may only include, in both the numerator and the denominator, districts that met that State-established n size. If the State used a minimum n size requirement, report the number of districts excluded from the calculation as a result of this requirement.
Describe the results of the State's examination of the data for the year before the reporting year (e.g., for the FFY 2019 SPP/APR, use data from 2018-2019), including data disaggregated by race and ethnicity to determine if significant discrepancies are occurring in the rates of long-term suspensions and expulsions of children with IEPs, as required at 20 U.S.C. 1412(a)(22). The State's examination must include one of the following comparisons:

--The rates of suspensions and expulsions for children with IEPs among LEAs within the State; or
--The rates of suspensions and expulsions for children with IEPs to nondisabled children within the LEAs

In the description, specify which method the State used to determine possible discrepancies and explain what constitutes those discrepancies.
Indicator 4A: Provide the actual numbers used in the calculation (based upon districts that met the minimum n size requirement, if applicable). If significant discrepancies occurred, describe how the State educational agency reviewed and, if appropriate, revised (or required the affected local educational agency to revise) its policies, procedures, and practices relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards, to ensure that such policies, procedures, and practices comply with applicable requirements.
Provide detailed information about the timely correction of noncompliance as noted in OSEP's response for the previous SPP/APR. If discrepancies occurred and the district with discrepancies had policies, procedures or practices that contributed to the significant discrepancy and that do not comply with requirements relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards, describe how the State ensured that such policies, procedures, and practices were revised to comply with applicable requirements consistent with the Office of Special Education Programs (OSEP) Memorandum 09-02, dated October 17, 2008.
If the State did not ensure timely correction of the previous noncompliance, provide information on the extent to which noncompliance was subsequently corrected (more than one year after identification). In addition, provide information regarding the nature of any continuing noncompliance, improvement activities completed (e.g., review of policies and procedures, technical assistance, training, etc.) and any enforcement actions that were taken.
If the State reported less than 100% compliance for the previous reporting period (e.g., for the FFY 2019 SPP/APR, the data for 2018-2019), and the State did not identify any findings of noncompliance, provide an explanation of why the State did not identify any findings of noncompliance.

4A - Indicator Data
Historical Data

Baseline Year Baseline Data

2009 5.68%

FFY 2014 2015 2016 2017 2018

Target <= 5.58% 4.54% 4.54% 4.54% 3.40%

Data 4.55% 9.09% 9.09% 0.00% NVR

Targets

FFY 2019

Target <= 3.40%

Targets: Description of Stakeholder Input
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators.


Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutions of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholder group developed in the Spring included representatives from student and parent advocacy groups throughout the state. This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.

FFY 2019 SPP/APR Data
Has the state established a minimum n-size requirement? (yes/no) NO

Number of districts that have a significant discrepancy | Number of districts in the State | FFY 2018 Data | FFY 2019 Target | FFY 2019 Data | Status | Slippage
0 | 88 | NVR | 3.40% | 0.00% | Met Target | No Slippage

Choose one of the following comparison methodologies to determine whether significant discrepancies are occurring (34 CFR §300.170(a)): Compare the rates of suspensions and expulsions of greater than 10 days in a school year for children with IEPs among LEAs in the State
State's definition of "significant discrepancy" and methodology
For the purposes of Part B Indicator 4A, South Carolina identifies a significant discrepancy in any LEA with a rate ratio exceeding 2.50, without respect to subgroup or group size, in the out-of-school suspensions/expulsions of students with IEPs (comparing one LEA to all other LEAs in the state). Additional information may be found in South Carolina's Definition of Significant Discrepancy at https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/#sppAndDet. The document is entitled Indicator 4A Definition of Significant Discrepancy.
Provide additional information about this indicator (optional)
As stated in the introduction, the number of LEAs during this lag year was 88.
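As a rough sketch of how the rate-ratio comparison described above could be computed, the hypothetical Python below compares each LEA's long-term removal rate for students with IEPs to the combined rate of all other LEAs and flags ratios above 2.50; the LEA names, counts, and the rate definition (removals per student with an IEP) are illustrative assumptions, not the State's actual calculation file.

    THRESHOLD = 2.50

    # Hypothetical counts per LEA: (students with IEPs suspended/expelled
    # for more than 10 days, total students with IEPs).
    leas = {"LEA_1": (4, 900), "LEA_2": (1, 1200), "LEA_3": (0, 700)}

    def rate(removals, enrollment):
        return removals / enrollment if enrollment else 0.0

    flagged = []
    for name, (removals, enrolled) in leas.items():
        other_removals = sum(r for n, (r, e) in leas.items() if n != name)
        other_enrolled = sum(e for n, (r, e) in leas.items() if n != name)
        comparison_rate = rate(other_removals, other_enrolled)
        ratio = rate(removals, enrolled) / comparison_rate if comparison_rate else 0.0
        if ratio > THRESHOLD:
            flagged.append(name)  # significant discrepancy under the 2.50 rule

    # Indicator 4A percent = flagged LEAs / all LEAs (no minimum n-size in SC).
    print(flagged, round(100 * len(flagged) / len(leas), 2))

With the FFY 2019 data reported above, no LEA exceeded the 2.50 threshold, so the indicator value is 0 divided by 88, or 0.00%.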

Review of Policies, Procedures, and Practices (completed in FFY 2019 using 2018-2019 data)
Provide a description of the review of policies, procedures, and practices relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards.
South Carolina collected data for 88 LEAs for the 2018-19 reporting year. For any LEA above the threshold of 2.50, the OSES requires the completion of self-assessment documents and requires LEAs to provide evidence of their responses to issues relative to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards. The self-assessment focuses on three areas of compliance: 1. Development and implementation of IEPs, related to 34 CFR § 300.304(b)(1), 300.530(a), 300.530(b)(2), 300.530(c), 300.530(d)(1)(i), 300.530(d)(4), 300.530(e)(1), 300.530(e)(1)(i), 300.530(e)(1)(ii), 300.530(e)(3), 300.530(f)(2), 300.530(g), and 300.531; 2. Positive behavioral interventions and supports, related to 34 CFR § 300.324(a)(2)(i), 300.324(a)(3)(i), 300.530(d)(1)(ii), 300.530(e)(1), 300.530(f)(1)(i), and 300.530(f)(1)(ii); and 3. Procedural safeguards, related to 34 CFR § 300.500, 300.501(c)(3), 300.504(c)(4), 300.530(d), and 300.530(h). The LEAs identified with significant discrepancy are given the opportunity to provide additional details as to other factors contributing to the significant discrepancy in the rates of long-term suspensions and expulsions of students with disabilities. After the LEAs submit the required documentation, OSES staff with expertise in policies, procedures, practices, and data analyses review the submissions and conduct follow-up discussions with the identified LEAs for additional or clarifying information.

The State DID NOT identify noncompliance with Part B requirements as a result of the review required by 34 CFR §300.170(b)

Correction of Findings of Noncompliance Identified in FFY 2018


Findings of Noncompliance Identified | Findings of Noncompliance Verified as Corrected Within One Year | Findings of Noncompliance Subsequently Corrected | Findings Not Yet Verified as Corrected
0 | 0 | 0 | 0

Correction of Findings of Noncompliance Identified Prior to FFY 2018

Year Findings of Noncompliance Were Identified | Findings of Noncompliance Not Yet Verified as Corrected as of FFY 2018 APR | Findings of Noncompliance Verified as Corrected | Findings Not Yet Verified as Corrected

4A - Prior FFY Required Actions
The State did not provide valid and reliable data for FFY 2018. The State must provide valid and reliable data for FFY 2019 in the FFY 2019 SPP/APR.

Response to actions required in FFY 2018 SPP/APR
The state has worked with IDC to develop and implement data protocols to ensure all data reported are valid and reliable. The error with the FY18 data for this indicator was that the reporting year was incorrectly identified in the narrative of the APR.

The state has been working with IDC to develop and implement a review process for all data to ensure validity and reliability.

4A - OSEP Response

4A - Required Actions


Indicator 4B: Suspension/Expulsion
Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Compliance Indicator: Rates of suspension and expulsion:

B. Percent of districts that have: (a) a significant discrepancy, by race or ethnicity, in the rate of suspensions and expulsions of greater than 10 days in a school year for children with IEPs; and (b) policies, procedures or practices that contribute to the significant discrepancy and do not comply with requirements relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards.

(20 U.S.C. 1416(a)(3)(A); 1412(a)(22))
Data Source
State discipline data, including State's analysis of State's Discipline data collected under IDEA Section 618, where applicable. Discrepancy can be computed by either comparing the rates of suspensions and expulsions for children with IEPs to rates for nondisabled children within the LEA or by comparing the rates of suspensions and expulsions for children with IEPs among LEAs within the State.
Measurement
Percent = [(# of districts that meet the State-established n size (if applicable) for one or more racial/ethnic groups that have: (a) a significant discrepancy, by race or ethnicity, in the rates of suspensions and expulsions of greater than 10 days in a school year of children with IEPs; and (b) policies, procedures or practices that contribute to the significant discrepancy and do not comply with requirements relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards) divided by the (# of districts in the State that meet the State-established n size (if applicable) for one or more racial/ethnic groups)] times 100.
Include State's definition of "significant discrepancy."
Instructions
If the State has established a minimum n size requirement, the State may only include, in both the numerator and the denominator, districts that met that State-established n size. If the State used a minimum n size requirement, report the number of districts excluded from the calculation as a result of this requirement.
Describe the results of the State's examination of the data for the year before the reporting year (e.g., for the FFY 2019 SPP/APR, use data from 2018-2019), including data disaggregated by race and ethnicity to determine if significant discrepancies are occurring in the rates of long-term suspensions and expulsions of children with IEPs, as required at 20 U.S.C. 1412(a)(22). The State's examination must include one of the following comparisons:

--The rates of suspensions and expulsions for children with IEPs among LEAs within the State; or
--The rates of suspensions and expulsions for children with IEPs to nondisabled children within the LEAs

In the description, specify which method the State used to determine possible discrepancies and explain what constitutes those discrepancies.
Indicator 4B: Provide the following: (a) the number of districts that met the State-established n size (if applicable) for one or more racial/ethnic groups that have a significant discrepancy, by race or ethnicity, in the rates of suspensions and expulsions of greater than 10 days in a school year for children with IEPs; and (b) the number of those districts in which policies, procedures or practices contribute to the significant discrepancy and do not comply with requirements relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards.
Provide detailed information about the timely correction of noncompliance as noted in OSEP's response for the previous SPP/APR. If discrepancies occurred and the district with discrepancies had policies, procedures or practices that contributed to the significant discrepancy and that do not comply with requirements relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards, describe how the State ensured that such policies, procedures, and practices were revised to comply with applicable requirements consistent with the Office of Special Education Programs (OSEP) Memorandum 09-02, dated October 17, 2008.
If the State did not ensure timely correction of the previous noncompliance, provide information on the extent to which noncompliance was subsequently corrected (more than one year after identification). In addition, provide information regarding the nature of any continuing noncompliance, improvement activities completed (e.g., review of policies and procedures, technical assistance, training, etc.) and any enforcement actions that were taken.
If the State reported less than 100% compliance for the previous reporting period (e.g., for the FFY 2019 SPP/APR, the data for 2018-2019), and the State did not identify any findings of noncompliance, provide an explanation of why the State did not identify any findings of noncompliance.
Targets must be 0% for 4B.

4B - Indicator Data

Not Applicable
Select yes if this indicator is not applicable. NO

Historical Data

Baseline Year Baseline Data

2009 2.30%

FFY 2014 2015 2016 2017 2018

Target 0% 0% 0% 0% 0%

Data 0.00% 0.00% 0.00% 0.00% NVR


Targets

FFY 2019

Target 0%

FFY 2019 SPP/APR Data
Has the state established a minimum n-size requirement? (yes/no) YES
If yes, the State may only include, in both the numerator and the denominator, districts that met the State-established n size. Report the number of districts excluded from the calculation as a result of the requirement: 2

Number of districts that have a significant discrepancy, by race or ethnicity | Number of those districts that have policies, procedures, or practices that contribute to the significant discrepancy and do not comply with requirements | Number of districts that met the State's minimum n-size | FFY 2018 Data | FFY 2019 Target | FFY 2019 Data | Status | Slippage
5 | 0 | 86 | NVR | 0% | 0.00% | Met Target | No Slippage

Were all races and ethnicities included in the review? YES
State's definition of "significant discrepancy" and methodology
Significant Discrepancy: A rate ratio exceeding 2.50, with a group/subgroup size of greater than ten students, in the out-of-school suspensions/expulsions of students with IEPs, by each race/ethnicity (comparing one LEA to all other LEAs in the state). Additional information may be found at https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/#sppAndDet in the document entitled Indicator 4B Significant Discrepancy Definition.
Provide additional information about this indicator (optional)
As stated in the introduction, the number of LEAs during this lag year was 88.
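A comparable sketch for Indicator 4B, again hypothetical and simplified, applies the 2.50 rate-ratio test within a single racial/ethnic group and the State's minimum group size of more than ten students; the function name and the counts are illustrative assumptions only.

    THRESHOLD, MIN_GROUP_SIZE = 2.50, 10

    def significant_discrepancy(lea_removals, lea_group_n, other_removals, other_group_n):
        # True when an LEA's long-term removal rate for one racial/ethnic group
        # of students with IEPs exceeds 2.50 times the rate for that same group
        # across all other LEAs, and the group exceeds ten students.
        if lea_group_n <= MIN_GROUP_SIZE or other_group_n == 0 or other_removals == 0:
            return False
        ratio = (lea_removals / lea_group_n) / (other_removals / other_group_n)
        return ratio > THRESHOLD

    # Invented example: 3 removals among 40 students in one LEA versus
    # 25 removals among 5,000 students in the same group in all other LEAs.
    print(significant_discrepancy(3, 40, 25, 5000))  # True (ratio = 15.0)

Note that the reported percentage counts only districts meeting both prongs of the measurement: of the 86 districts that met the minimum n-size, five showed a significant discrepancy by race or ethnicity but none had noncompliant policies, procedures, or practices, so the FFY 2019 value is 0 divided by 86, or 0.00%.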

Review of Policies, Procedures, and Practices (completed in FFY 2019 using 2018-2019 data)
Provide a description of the review of policies, procedures, and practices relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards.
South Carolina collected data for 88 LEAs for the 2018-19 reporting year. For the five LEAs identified as having significant discrepancy in the rates of long-term suspensions and expulsions (i.e., out-of-school suspensions exceeding ten days as found in Table 5) for any race/ethnicity, the OSES required the completion of self-assessment documents and required the LEAs to provide evidence of their responses to issues relative to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards. The self-assessment focuses on three areas of compliance: 1. Development and implementation of IEPs, related to 34 CFR § 300.304(b)(1), 300.530(a), 300.530(b)(2), 300.530(c), 300.530(d)(1)(i), 300.530(d)(4), 300.530(e)(1), 300.530(e)(1)(i), 300.530(e)(1)(ii), 300.530(e)(3), 300.530(f)(2), 300.530(g), and 300.531; 2. Positive behavioral interventions and supports, related to 34 CFR § 300.324(a)(2)(i), 300.324(a)(3)(i), 300.530(d)(1)(ii), 300.530(e)(1), 300.530(f)(1)(i), and 300.530(f)(1)(ii); and 3. Procedural safeguards, related to 34 CFR § 300.500, 300.501(c)(3), 300.504(c)(4), 300.530(d), and 300.530(h). The five identified LEAs were given the opportunity to provide additional details as to other factors contributing to the significant discrepancy in the rates of long-term suspensions and expulsions of students with disabilities. After each LEA submitted the required documentation, OSES staff with expertise in policies, procedures, practices, and data analyses reviewed the submissions and conducted follow-up discussions with the identified districts for additional or clarifying information. The OSES reviewed self-assessment documentation for the LEAs, which were required to collect information and evidence regarding the development and implementation of IEPs, positive behavioral interventions and supports, and procedural safeguards found in the regulations outlined above. As noted above, the LEAs that had a significant discrepancy by race/ethnicity reviewed their policies, procedures, and practices related to the subgroup identified. All LEAs provided evidence that their policies and procedures were compliant according to relevant IDEA regulations.

The State DID NOT identify noncompliance with Part B requirements as a result of the review required by 34 CFR §300.170(b)

Correction of Findings of Noncompliance Identified in FFY 2018


Findings of Noncompliance Identified | Findings of Noncompliance Verified as Corrected Within One Year | Findings of Noncompliance Subsequently Corrected | Findings Not Yet Verified as Corrected
1 | 1 | 0 | 0

FFY 2018 Findings of Noncompliance Verified as Corrected
Describe how the State verified that the source of noncompliance is correctly implementing the regulatory requirements
One LEA reported an issue with practices in relation to the issuance of procedural safeguards following a disciplinary change in placement (§300.504(a)(3) and §300.530(h)). The LEA noted a delay in the issuance of the procedural safeguards ranging from one to fourteen days following the disciplinary change in placement. The OSES issued a written notification of noncompliance and required the LEA to correct the noncompliance as soon as possible, but in no case later than one year. The LEA held meetings with all parents involved to explain the finding and to provide the procedural safeguards, although after the disciplinary action. The LEA developed a monitoring process that ensured the correction of the systemic issue as well as individual student-level noncompliance. The LEA has been monitoring the issuance of procedural safeguards following a disciplinary change in placement on a quarterly basis and reported no issues. The OSES reviewed the monitoring process and found it appropriate to ensure no future noncompliance would occur; the OSES also reviewed subsequent disciplinary changes in placement using the online IEP system and observed that the LEA is implementing all regulatory requirements. The LEA is now implementing all regulatory requirements and is 100% compliant with these requirements.
Describe how the State verified that each individual case of noncompliance was corrected
The LEA held meetings with all parents involved to explain the finding and to provide the procedural safeguards, although after the disciplinary action. The LEA developed a monitoring process that ensured the correction of the systemic issue as well as individual student-level noncompliance. The LEA provided an assurance that procedural safeguards were issued as well as the dates of meetings for the students involved. The subsequent review of data by the OSES through the online IEP system showed that the LEA corrected all individual findings of noncompliance in a timely manner and is now implementing these requirements with 100% compliance.
Correction of Findings of Noncompliance Identified Prior to FFY 2018

Year Findings of Noncompliance Were Identified | Findings of Noncompliance Not Yet Verified as Corrected as of FFY 2018 APR | Findings of Noncompliance Verified as Corrected | Findings Not Yet Verified as Corrected

Findings of Noncompliance Verified as Corrected
Describe how the State verified that the source of noncompliance is correctly implementing the regulatory requirements

Describe how the State verified that each individual case of noncompliance was corrected

4B - Prior FFY Required Actions
The State did not provide valid and reliable data for FFY 2018. The State must provide valid and reliable data for FFY 2019 in the FFY 2019 SPP/APR.
Response to actions required in FFY 2018 SPP/APR
The state has worked with IDC to develop and implement data protocols to ensure all data reported are valid and reliable. The error with the FY18 data for this indicator was that the reporting year was incorrectly identified in the narrative of the APR.

The state has been working with IDC to develop and implement a review process for all data to ensure validity and reliability.

4B - OSEP Response

4B - Required Actions


Indicator 5: Education Environments (children 6-21)
Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Results indicator: Education environments (children 6-21): Percent of children with IEPs aged 6 through 21 served:

A. Inside the regular class 80% or more of the day;
B. Inside the regular class less than 40% of the day; and
C. In separate schools, residential facilities, or homebound/hospital placements.

(20 U.S.C. 1416(a)(3)(A))
Data Source
Same data as used for reporting to the Department under section 618 of the IDEA, using the definitions in EDFacts file specification FS002.
Measurement
Percent = [(# of children with IEPs aged 6 through 21 served inside the regular class 80% or more of the day) divided by the (total # of students aged 6 through 21 with IEPs)] times 100.
Percent = [(# of children with IEPs aged 6 through 21 served inside the regular class less than 40% of the day) divided by the (total # of students aged 6 through 21 with IEPs)] times 100.
Percent = [(# of children with IEPs aged 6 through 21 served in separate schools, residential facilities, or homebound/hospital placements) divided by the (total # of students aged 6 through 21 with IEPs)] times 100.
Instructions
Sampling from the State's 618 data is not allowed.
Describe the results of the calculations and compare the results to the target.
If the data reported in this indicator are not the same as the State's data reported under section 618 of the IDEA, explain.
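Applied to the prepopulated child count data reported later in this section, the three calculations reduce to simple division against a common denominator. The short Python sketch that follows is illustrative only (the function name is invented); the counts are the State's SY 2019-20 section 618 figures shown below.

    def environment_percent(count_in_environment, total_child_count):
        # Indicator 5: children with IEPs aged 6-21 in a given environment,
        # as a percent of all children with IEPs aged 6-21.
        return 100 * count_in_environment / total_child_count

    total = 98_533  # total children with IEPs aged 6 through 21 (SY 2019-20)
    print(round(environment_percent(61_543, total), 2))           # A: 62.46
    print(round(environment_percent(14_832, total), 2))           # B: 15.05
    print(round(environment_percent(482 + 203 + 784, total), 2))  # C: 1.49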

5 - Indicator Data
Historical Data

Part Baseline FFY 2014 2015 2016 2017 2018

A 2005 Target >= 56.00% 57.00% 57.00% 58.00% 59.00%

A 49.31% Data 58.26% 60.71% 61.61% 62.17% 62.16%

B 2013 Target <= 18.48% 18.18% 18.18% 17.88% 17.88%

B 18.48% Data 17.83% 16.31% 15.84% 15.39% 15.15%

C 2005 Target <= 2.19% 2.00% 2.00% 1.70% 1.70%

C 2.36% Data 1.81% 1.71% 1.56% 1.46% 1.49%

Targets

FFY 2019

Target A >= 63.00%

Target B <= 15.50%

Target C <= 1.70%

Targets: Description of Stakeholder Input
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutions of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators.


During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholder group developed in the Spring included representatives from student and parent advocacy groups throughout the state. This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.

Prepopulated Data

Source | Date | Description | Data
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS002; Data group 74) | 07/08/2020 | Total number of children with IEPs aged 6 through 21 | 98,533
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS002; Data group 74) | 07/08/2020 | A. Number of children with IEPs aged 6 through 21 inside the regular class 80% or more of the day | 61,543
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS002; Data group 74) | 07/08/2020 | B. Number of children with IEPs aged 6 through 21 inside the regular class less than 40% of the day | 14,832
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS002; Data group 74) | 07/08/2020 | c1. Number of children with IEPs aged 6 through 21 in separate schools | 482
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS002; Data group 74) | 07/08/2020 | c2. Number of children with IEPs aged 6 through 21 in residential facilities | 203
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS002; Data group 74) | 07/08/2020 | c3. Number of children with IEPs aged 6 through 21 in homebound/hospital placements | 784

Select yes if the data reported in this indicator are not the same as the State's data reported under section 618 of the IDEA. NO

FFY 2019 SPP/APR Data

Education Environments | Number of children with IEPs aged 6 through 21 served | Total number of children with IEPs aged 6 through 21 | FFY 2018 Data | FFY 2019 Target | FFY 2019 Data | Status | Slippage
A. Number of children with IEPs aged 6 through 21 inside the regular class 80% or more of the day | 61,543 | 98,533 | 62.16% | 63.00% | 62.46% | Did Not Meet Target | No Slippage
B. Number of children with IEPs aged 6 through 21 inside the regular class less than 40% of the day | 14,832 | 98,533 | 15.15% | 15.50% | 15.05% | Met Target | No Slippage
C. Number of children with IEPs aged 6 through 21 inside separate schools, residential facilities, or homebound/hospital placements [c1+c2+c3] | 1,469 | 98,533 | 1.49% | 1.70% | 1.49% | Met Target | No Slippage

Use a different calculation methodology (yes/no) NO
Provide additional information about this indicator (optional)


Because Child Count and LRE data were collected prior to the onset of the significant impact of COVID, data for this Indicator were not impacted.

5 - Prior FFY Required Actions
None

5 - OSEP Response

5 - Required Actions


Indicator 6: Preschool Environments
Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Results indicator: Preschool environments: Percent of children aged 3 through 5 with IEPs attending a:

A. Regular early childhood program and receiving the majority of special education and related services in the regular early childhood program; and
B. Separate special education class, separate school or residential facility.

(20 U.S.C. 1416(a)(3)(A))
Data Source
Same data as used for reporting to the Department under section 618 of the IDEA, using the definitions in EDFacts file specification FS089.
Measurement
Percent = [(# of children aged 3 through 5 with IEPs attending a regular early childhood program and receiving the majority of special education and related services in the regular early childhood program) divided by the (total # of children aged 3 through 5 with IEPs)] times 100.
Percent = [(# of children aged 3 through 5 with IEPs attending a separate special education class, separate school or residential facility) divided by the (total # of children aged 3 through 5 with IEPs)] times 100.
Instructions
Sampling from the State's 618 data is not allowed.
Describe the results of the calculations and compare the results to the target.
If the data reported in this indicator are not the same as the State's data reported under section 618 of the IDEA, explain.

6 - Indicator Data
Not Applicable
Select yes if this indicator is not applicable. NO

Historical Data

Part Baseline FFY 2014 2015 2016 2017 2018

A 2013 Target >= 48.88% 48.88% 48.90% 48.90% 49.00%

A 48.88% Data 50.40% 50.73% 49.71% 48.88% 50.00%

B 2013 Target <= 25.32% 24.50% 24.00% 23.50% 23.00%

B 25.32% Data 25.65% 25.72% 25.29% 23.67% 22.73%

Targets

FFY 2019

Target A >= 48.90%

Target B <= 22.50%

Targets: Description of Stakeholder Input
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutions of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholder group developed in the Spring included representatives from student and parent advocacy groups throughout the state. This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.

Prepopulated Data

Source | Date | Description | Data
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS089; Data group 613) | 07/08/2020 | Total number of children with IEPs aged 3 through 5 | 10,399
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS089; Data group 613) | 07/08/2020 | a1. Number of children attending a regular early childhood program and receiving the majority of special education and related services in the regular early childhood program | 5,271
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS089; Data group 613) | 07/08/2020 | b1. Number of children attending separate special education class | 2,266
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS089; Data group 613) | 07/08/2020 | b2. Number of children attending separate school | 104
SY 2019-20 Child Count/Educational Environment Data Groups (EDFacts file spec FS089; Data group 613) | 07/08/2020 | b3. Number of children attending residential facility | 0

Select yes if the data reported in this indicator are not the same as the State's data reported under section 618 of the IDEA. NO

FFY 2019 SPP/APR Data

Preschool Environments | Number of children with IEPs aged 3 through 5 served | Total number of children with IEPs aged 3 through 5 | FFY 2018 Data | FFY 2019 Target | FFY 2019 Data | Status | Slippage
A. A regular early childhood program and receiving the majority of special education and related services in the regular early childhood program | 5,271 | 10,399 | 50.00% | 48.90% | 50.69% | Met Target | No Slippage
B. Separate special education class, separate school or residential facility | 2,370 | 10,399 | 22.73% | 22.50% | 22.79% | Did Not Meet Target | No Slippage

Use a different calculation methodology (yes/no) NO

Provide additional information about this indicator (optional)
Because data for Child Count and LRE were collected prior to the onset of the significant impact of COVID, data for this Indicator were not impacted.
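As a worked check of the percentages reported above, the FFY 2019 values follow directly from the prepopulated counts:

\[ \text{6A} = \frac{5{,}271}{10{,}399} \times 100 \approx 50.69\% \qquad \text{6B} = \frac{2{,}266 + 104 + 0}{10{,}399} \times 100 = \frac{2{,}370}{10{,}399} \times 100 \approx 22.79\% \]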

6 - Prior FFY Required Actions: None

6 - OSEP Response

6 - Required Actions


Indicator 7: Preschool Outcomes
Instructions and Measurement
Monitoring Priority: FAPE in the LRE
Results indicator: Percent of preschool children aged 3 through 5 with IEPs who demonstrate improved:

A. Positive social-emotional skills (including social relationships);
B. Acquisition and use of knowledge and skills (including early language/communication and early literacy); and
C. Use of appropriate behaviors to meet their needs.

(20 U.S.C. 1416(a)(3)(A))
Data Source
State selected data source.
Measurement
Outcomes:

A. Positive social-emotional skills (including social relationships);
B. Acquisition and use of knowledge and skills (including early language/communication and early literacy); and
C. Use of appropriate behaviors to meet their needs.

Progress categories for A, B and C:a. Percent of preschool children who did not improve functioning = [(# of preschool children who did not improve functioning) divided by (# of preschool children with IEPs assessed)] times 100.b. Percent of preschool children who improved functioning but not sufficient to move nearer to functioning comparable to same-aged peers = [(# of preschool children who improved functioning but not sufficient to move nearer to functioning comparable to same-aged peers) divided by (# of preschool children with IEPs assessed)] times 100.c. Percent of preschool children who improved functioning to a level nearer to same-aged peers but did not reach it = [(# of preschool children who improved functioning to a level nearer to same-aged peers but did not reach it) divided by (# of preschool children with IEPs assessed)] times 100.d. Percent of preschool children who improved functioning to reach a level comparable to same-aged peers = [(# of preschool children who improved functioning to reach a level comparable to same-aged peers) divided by (# of preschool children with IEPs assessed)] times 100.e. Percent of preschool children who maintained functioning at a level comparable to same-aged peers = [(# of preschool children who maintained functioning at a level comparable to same-aged peers) divided by (# of preschool children with IEPs assessed)] times 100.

Summary Statements for Each of the Three Outcomes:Summary Statement 1: Of those preschool children who entered the preschool program below age expectations in each Outcome, the percent who substantially increased their rate of growth by the time they turned 6 years of age or exited the program.Measurement for Summary Statement 1: Percent = [(# of preschool children reported in progress category (c) plus # of preschool children reported in category (d)) divided by (# of preschool children reported in progress category (a) plus # of preschool children reported in progress category (b) plus # of preschool children reported in progress category (c) plus # of preschool children reported in progress category (d))] times 100.Summary Statement 2: The percent of preschool children who were functioning within age expectations in each Outcome by the time they turned 6 years of age or exited the program.Measurement for Summary Statement 2: Percent = [(# of preschool children reported in progress category (d) plus # of preschool children reported in progress category (e)) divided by (the total # of preschool children reported in progress categories (a) + (b) + (c) + (d) + (e))] times 100.InstructionsSampling of children for assessment is allowed. When sampling is used, submit a description of the sampling methodology outlining how the design will yield valid and reliable estimates. (See General Instructions on page 2 for additional instructions on sampling.)In the measurement include, in the numerator and denominator, only children who received special education and related services for at least six months during the age span of three through five years.Describe the results of the calculations and compare the results to the targets. States will use the progress categories for each of the three Outcomes to calculate and report the two Summary Statements. States have provided targets for the two Summary Statements for the three Outcomes (six numbers for targets for each FFY).Report progress data and calculate Summary Statements to compare against the six targets. Provide the actual numbers and percentages for the five reporting categories for each of the three outcomes.In presenting results, provide the criteria for defining “comparable to same-aged peers.” If a State is using the Early Childhood Outcomes Center (ECO) Child Outcomes Summary (COS), then the criteria for defining “comparable to same-aged peers” has been defined as a child who has been assigned a score of 6 or 7 on the COS.In addition, list the instruments and procedures used to gather data for this indicator, including if the State is using the ECO COS.
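The two summary statements above are simple functions of the five progress-category counts. The sketch below is a minimal illustration of that calculation (the function name and variables are hypothetical, not part of the State's reporting system); it reproduces the Outcome A results reported later in this section.

def summary_statements(a, b, c, d, e):
    """Return (Summary Statement 1, Summary Statement 2) as percentages,
    given the counts in progress categories a through e for one outcome."""
    ss1 = (c + d) / (a + b + c + d) * 100        # substantially increased rate of growth
    ss2 = (d + e) / (a + b + c + d + e) * 100    # functioning within age expectations on exit
    return round(ss1, 2), round(ss2, 2)

# FFY 2019 Outcome A counts reported below: a=37, b=630, c=1,645, d=2,256, e=980
print(summary_statements(37, 630, 1645, 2256, 980))   # -> (85.4, 58.33)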

7 - Indicator Data
Not Applicable: Select yes if this indicator is not applicable. NO

Historical Data
A1: Baseline Year 2013; Baseline Data 88.45%. Targets (FFY 2014-2018): >= 88.45%, 88.45%, 88.46%, 88.46%, 88.47%. Data (FFY 2014-2018): 87.46%, 89.09%, 89.44%, 88.37%, 86.60%.


A2: Baseline Year 2013; Baseline Data 66.16%. Targets (FFY 2014-2018): >= 66.16%, 66.16%, 66.17%, 66.17%, 66.18%. Data (FFY 2014-2018): 62.37%, 64.44%, 64.16%, 62.57%, 60.74%.
B1: Baseline Year 2013; Baseline Data 86.13%. Targets (FFY 2014-2018): >= 86.13%, 86.13%, 86.14%, 86.14%, 86.15%. Data (FFY 2014-2018): 84.92%, 87.84%, 87.56%, 86.67%, 84.21%.
B2: Baseline Year 2013; Baseline Data 63.25%. Targets (FFY 2014-2018): >= 63.25%, 63.25%, 63.26%, 63.26%, 63.27%. Data (FFY 2014-2018): 59.81%, 63.44%, 61.42%, 58.85%, 58.45%.
C1: Baseline Year 2013; Baseline Data 89.25%. Targets (FFY 2014-2018): >= 89.25%, 89.25%, 89.26%, 89.26%, 89.27%. Data (FFY 2014-2018): 89.91%, 90.43%, 91.20%, 88.90%, 87.54%.
C2: Baseline Year 2013; Baseline Data 77.21%. Targets (FFY 2014-2018): >= 77.21%, 77.21%, 77.22%, 77.22%, 77.23%. Data (FFY 2014-2018): 76.42%, 77.76%, 77.56%, 75.67%, 74.25%.

Targets

FFY 2019

Target A1 >= 88.47%

Target A2 >= 66.18%

Target B1 >= 86.15%

Target B2 >= 63.27%

Target C1 >= 89.27%

Target C2 >= 77.23%

Targets: Description of Stakeholder Input The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutes of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholders group developed in the Spring included representatives from student and parent advocacy groups throughout the state. 
This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.

FFY 2019 SPP/APR Data
Number of preschool children aged 3 through 5 with IEPs assessed: 5,548
Outcome A: Positive social-emotional skills (including social relationships)

Outcome A Progress Category: Number of Children (Percentage of Children)
a. Preschool children who did not improve functioning: 37 (0.67%)
b. Preschool children who improved functioning but not sufficient to move nearer to functioning comparable to same-aged peers: 630 (11.36%)


c. Preschool children who improved functioning to a level nearer to same-aged peers but did not reach it: 1,645 (29.65%)
d. Preschool children who improved functioning to reach a level comparable to same-aged peers: 2,256 (40.66%)
e. Preschool children who maintained functioning at a level comparable to same-aged peers: 980 (17.66%)

A1. Of those children who entered or exited the program below age expectations in Outcome A, the percent who substantially increased their rate of growth by the time they turned 6 years of age or exited the program. Calculation: (c+d)/(a+b+c+d). Numerator: 3,901; Denominator: 4,568; FFY 2018 Data: 86.60%; FFY 2019 Target: 88.47%; FFY 2019 Data: 85.40%; Status: Did Not Meet Target; Slippage: Slippage
A2. The percent of preschool children who were functioning within age expectations in Outcome A by the time they turned 6 years of age or exited the program. Calculation: (d+e)/(a+b+c+d+e). Numerator: 3,236; Denominator: 5,548; FFY 2018 Data: 60.74%; FFY 2019 Target: 66.18%; FFY 2019 Data: 58.33%; Status: Did Not Meet Target; Slippage: Slippage
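A worked check of the Outcome A summary statements, using the progress-category counts above:

\[ \text{A1} = \frac{1{,}645 + 2{,}256}{37 + 630 + 1{,}645 + 2{,}256} \times 100 = \frac{3{,}901}{4{,}568} \times 100 \approx 85.40\% \]
\[ \text{A2} = \frac{2{,}256 + 980}{5{,}548} \times 100 = \frac{3{,}236}{5{,}548} \times 100 \approx 58.33\% \]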

Outcome B: Acquisition and use of knowledge and skills (including early language/communication)

Outcome B Progress Category: Number of Children (Percentage of Children)
a. Preschool children who did not improve functioning: 49 (0.88%)
b. Preschool children who improved functioning but not sufficient to move nearer to functioning comparable to same-aged peers: 651 (11.73%)
c. Preschool children who improved functioning to a level nearer to same-aged peers but did not reach it: 1,753 (31.60%)
d. Preschool children who improved functioning to reach a level comparable to same-aged peers: 1,961 (35.35%)
e. Preschool children who maintained functioning at a level comparable to same-aged peers: 1,134 (20.44%)

B1. Of those children who entered or exited the program below age expectations in Outcome B, the percent who substantially increased their rate of growth by the time they turned 6 years of age or exited the program. Calculation: (c+d)/(a+b+c+d). Numerator: 3,714; Denominator: 4,414; FFY 2018 Data: 84.21%; FFY 2019 Target: 86.15%; FFY 2019 Data: 84.14%; Status: Did Not Meet Target; Slippage: No Slippage
B2. The percent of preschool children who were functioning within age expectations in Outcome B by the time they turned 6 years of age or exited the program. Calculation: (d+e)/(a+b+c+d+e). Numerator: 3,095; Denominator: 5,548; FFY 2018 Data: 58.45%; FFY 2019 Target: 63.27%; FFY 2019 Data: 55.79%; Status: Did Not Meet Target; Slippage: Slippage

Outcome C: Use of appropriate behaviors to meet their needs


Outcome C Progress Category: Number of Children (Percentage of Children)
a. Preschool children who did not improve functioning: 46 (0.83%)
b. Preschool children who improved functioning but not sufficient to move nearer to functioning comparable to same-aged peers: 424 (7.64%)
c. Preschool children who improved functioning to a level nearer to same-aged peers but did not reach it: 1,025 (18.48%)
d. Preschool children who improved functioning to reach a level comparable to same-aged peers: 2,134 (38.46%)
e. Preschool children who maintained functioning at a level comparable to same-aged peers: 1,919 (34.59%)

C1. Of those children who entered or exited the program below age expectations in Outcome C, the percent who substantially increased their rate of growth by the time they turned 6 years of age or exited the program. Calculation: (c+d)/(a+b+c+d). Numerator: 3,159; Denominator: 3,629; FFY 2018 Data: 87.54%; FFY 2019 Target: 89.27%; FFY 2019 Data: 87.05%; Status: Did Not Meet Target; Slippage: No Slippage
C2. The percent of preschool children who were functioning within age expectations in Outcome C by the time they turned 6 years of age or exited the program. Calculation: (d+e)/(a+b+c+d+e). Numerator: 4,053; Denominator: 5,548; FFY 2018 Data: 74.25%; FFY 2019 Target: 77.23%; FFY 2019 Data: 73.05%; Status: Did Not Meet Target; Slippage: Slippage

Part Reasons for slippage, if applicable

A1

The pandemic’s impact in the Spring of 2020 is still being assessed as families, children, and school staff continue to deal with this on-going crisis. The significant changes beginning in March 2020 have continued and, in many cases, multiplied. This has made it very difficult to assess the impact on progress, particularly for this age range. The closure of the physical school buildings and the move to emergency learning opportunities not only halted traditional instruction and learning, thereby impacting progress, but also took away most of the traditional assessment tools and processes that had been used prior to the pandemic. Due to the lack of assessments regarding the impact of the pandemic on all children, with and without disabilities, it is difficult to adequately assess improvement. The closure of physical school buildings and daycares, the quarantining of families, and the emergency deployment of educational learning opportunities led to significant changes in the lives of these children and their families. These changes impacted the social, emotional, and mental health well-being of children and their families. They included changes in routine, changes in continuity of services and supports, changes in continuity of health care, missed significant life events, and loss of safety and security for many children in this age range.

Until more is known about the impact of the pandemic on children, families, schools, and education in general, it will be difficult to draw any conclusions as to whether there was true slippage that could be “corrected” or mitigated by traditional methods at the individual student level or whether there will need to be significant changes at the system’s level to address more permanent, long-term effects caused by the pandemic.

A2

The pandemic’s impact in the Spring of 2020 is still being assessed as families, children, and school staff continue to deal with this on-going crisis. The significant changes beginning in March 2020 have continued and, in many cases, multiplied. This has made it very difficult to assess the impact on progress, particularly for this age range. The closure of the physical school buildings and the move to emergency learning opportunities not only halted traditional instruction and learning, thereby impacting progress, but also took away most of the traditional assessment tools and processes that had been used prior to the pandemic. Due to the lack of assessments regarding the impact of the pandemic on all children, with and without disabilities, it is difficult to adequately assess improvement. Children, even in this age range, were affected by the COVID pandemic. The closure of physical school buildings and daycares, the quarantining of families, and the emergency deployment of educational learning opportunities led to significant changes in the lives of these children and their families. These changes impacted the social, emotional, and mental health well-being of children and their families. They included changes in routine, changes in continuity of services and supports, changes in continuity of health care, missed significant life events, and loss of safety and security for many children in this age range.

Until more is known about the impact of the pandemic on children, families, schools, and education in general, it will be difficult to draw any conclusions as to whether there was true slippage that could be “corrected” or mitigated by traditional methods at the individual student level or whether there will need to be significant changes at the system’s level to address more permanent, long-term effects caused by the pandemic.

B2
The pandemic’s impact in the Spring of 2020 is still being assessed as families, children, and school staff continue to deal with this on-going crisis. The significant changes beginning in March 2020 have continued and, in many cases, multiplied. This has made it very difficult to assess the impact on progress, particularly for this age range. The closure of the physical school buildings and the move to emergency learning opportunities not only halted traditional instruction and learning, thereby impacting progress, but also took away most of the traditional assessment tools and processes that had been used prior to the pandemic. Due to the lack of assessments regarding the impact of the pandemic on all children, with and without disabilities, it is difficult to adequately assess improvement. The changes described above also impacted access to supports, services, and instruction related to acquisition and use of knowledge and skills (including early language/communication). As families' lives changed and priorities shifted to health and safety, the changes in instruction, services, and supports and the emotional effects of the pandemic led to a decline in the acquisition of these skills.

Until more is known about the impact of the pandemic on children, families, schools, and education in general, it will be difficult to draw any conclusions as to whether there was true slippage that could be “corrected” or mitigated by traditional methods at the individual student level or whether there will need to be significant changes at the system’s level to address more permanent, long-term effects caused by the pandemic.

C2

The pandemic’s impact in the Spring of 2020 is still being assessed as families, children, and school staff continue to deal with this on-going crisis. The significant changes beginning in March 2020 have continued and, in many cases, multiplied. This has made it very difficult to assess the impact on progress, particularly for this age range. The closure of the physical school buildings and the move to emergency learning opportunities not only halted traditional instruction and learning, thereby impacting progress, but also took away most of the traditional assessment tools and processes that had been used prior to the pandemic. Due to the lack of assessments regarding the impact of the pandemic on all children, with and without disabilities, it is difficult to adequately assess improvement. The changes described above also impacted access to supports, services, and instruction related to the acquisition of appropriate behaviors to meet needs. The pandemic brought about changes and a reprioritization of needs, shifting those to health, safety, and security. Families’ struggles to meet these basic needs resulted in fewer children who attained performance commensurate with same-aged peers.

Until more is known about the impact of the pandemic on children, families, schools, and education in general, it will be difficult to draw any conclusions as to whether there was true slippage that could be “corrected” or mitigated by traditional methods at the individual student level or whether there will need to be significant changes at the system’s level to address more permanent, long-term effects caused by the pandemic.

Does the State include in the numerator and denominator only children who received special education and related services for at least six months during the age span of three through five years? (yes/no) YES

Sampling Question Yes / No

Was sampling used? NO

Did you use the Early Childhood Outcomes Center (ECO) Child Outcomes Summary Form (COS) process? (yes/no) YES
List the instruments and procedures used to gather data for this indicator.
South Carolina uses a statewide, special education case management and reporting system, called Frontline Enrich Central. In this online platform, the Child Outcomes Summary form is completed for applicable preschoolers, and data are collected at the district and state level. The OSES has robust procedures for collecting, verifying and analyzing the Indicator 7 data. A description of the reporting processes and procedures, training materials and other resources are available online at https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/data-collection-and-reporting/data-collection-instructions/indicator-7-child-outcomes-summary-form-cosf/.

It must be noted that the data collection and verification for this Indicator have been affected by the COVID pandemic. The closure of the physical school buildings impacted not only the delivery of instructional services and supports, but also LEAs’ ability to collect valid and reliable data comparing children with and without disabilities. The lack of data regarding the pandemic’s impact on all children in this age range makes the data reported questionable. It will, therefore, be difficult to draw any accurate conclusions from these data.
Provide additional information about this indicator (optional)
The Governor, in an emergency declaration on March 15, 2020, closed all physical school and district buildings in response to the COVID-19 pandemic. LEAs had to deploy emergency learning opportunities (ELOs) literally overnight. These ELOs were exactly what the name suggests – opportunities to provide some sort of continued learning activities in response to the unprecedented emergency created by the pandemic. Due to the nature of the pandemic, the ELOs were not uniform across the state. Some LEAs and some schools in the state had the infrastructure in place to provide these ELOs virtually; in these instances, the LEAs already had 1:1 devices in place for some grades/schools. In other LEAs and schools, the infrastructure to provide the ELOs virtually was not in place, and so the ELOs consisted mainly of paper-and-pencil packets of work sent home through any means possible (pick-up at the school when school lunches were available, delivery by buses deployed to deliver school lunches, and other means).

LEAs worked to establish alternative means of communication with families other than the traditional “at school” and “face-to-face” formats. Staff made contact with families by telephone, by email, and virtually. Both ELO formats (virtual and paper-and-pencil) were challenging for children of this age and their families. This age group has typically not developed extensive independent work skills and, as is developmentally appropriate, tends to need direct instruction and support. Families were overwhelmed with health and safety needs and were not always able to provide this level of support. LEAs reported the following concerns from parents:

An inability to assist because of the families’ struggles to meet basic needs (food, shelter, clothing);
An inability to assist due to limited time (continuing to work, caring for other family members during the pandemic);
An inability to assist due to limited knowledge and skills; and
An inability to assist due to limited access to technology or devices, or limited knowledge of technology in general.

LEAs reported some parents requested services be reduced or stopped because they were so overwhelmed with meeting their family’s basic needs. This change in priorities for many families led to more focus and effort being placed on meeting basic needs (food, shelter) and less on supporting skill development. This led to fewer children in this age range making progress toward and meeting age-expectancies for same-aged peers. However, the impact of the pandemic on children without disabilities must be taken into account as well. Assessments previously used to measure progress and to

compare performance of children with and without disabilities do not contain normative data on the performance of children during a pandemic. Therefore, the lack of assessments with this sort of normative data must be taken into account when assessing “slippage.” It may very well be that there was no slippage when the effect of the pandemic on children without disabilities is taken into account.

As the state assisted LEAs to put more solid infrastructure in place to transition to more of a virtual ELO delivery model, parents reported this transition was difficult for this age group. They again cited the need for more support and assistance even with the virtual delivery format. LEA special education staff worked to develop and strengthen support and instruction for the ELOs during the spring of 2020.

7 - Prior FFY Required Actions: None

7 - OSEP Response

7 - Required Actions


Indicator 8: Parent involvementInstructions and MeasurementMonitoring Priority: FAPE in the LREResults indicator: Percent of parents with a child receiving special education services who report that schools facilitated parent involvement as a means of improving services and results for children with disabilities.(20 U.S.C. 1416(a)(3)(A))Data SourceState selected data source.MeasurementPercent = [(# of respondent parents who report schools facilitated parent involvement as a means of improving services and results for children with disabilities) divided by the (total # of respondent parents of children with disabilities)] times 100.InstructionsSampling of parents from whom response is requested is allowed. When sampling is used, submit a description of the sampling methodology outlining how the design will yield valid and reliable estimates. (See General Instructions on page 2 for additional instructions on sampling.)Describe the results of the calculations and compare the results to the target.Provide the actual numbers used in the calculation.If the State is using a separate data collection methodology for preschool children, the State must provide separate baseline data, targets, and actual target data or discuss the procedures used to combine data from school age and preschool data collection methodologies in a manner that is valid and reliable.While a survey is not required for this indicator, a State using a survey must submit a copy of any new or revised survey with its SPP/APR.Report the number of parents to whom the surveys were distributed.Include the State’s analysis of the extent to which the demographics of the parents responding are representative of the demographics of children receiving special education services. States should consider categories such as race and ethnicity, age of the student, disability category, and geographic location in the State.If the analysis shows that the demographics of the parents responding are not representative of the demographics of children receiving special education services in the State, describe the strategies that the State will use to ensure that in the future the response data are representative of those demographics. In identifying such strategies, the State should consider factors such as how the State distributed the survey to parents (e.g., by mail, by e-mail, on-line, by telephone, in-person through school personnel), and how responses were collected.States are encouraged to work in collaboration with their OSEP-funded parent centers in collecting data.

8 - Indicator Data
Question Yes / No

Do you use a separate data collection methodology for preschool children? NO

Targets: Description of Stakeholder Input The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development. In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutes of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators. During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, Early Childhood Technical Assistance Center (ECTA), and National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholders group developed in the Spring included representatives from student and parent advocacy groups throughout the state. 
This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.

Historical Data


Baseline Year: 2013; Baseline Data: 84.00%
Targets (FFY 2014-2018): >= 84.00%, 84.50%, 84.50%, 85.00%, 85.00%
Data (FFY 2014-2018): 80.16%, 86.67%, 84.92%, 93.49%, 97.46%

Targets

FFY 2019

Target >= 85.50%

FFY 2019 SPP/APR Data
Number of respondent parents who report schools facilitated parent involvement as a means of improving services and results for children with disabilities: 2,909
Total number of respondent parents of children with disabilities: 3,246
FFY 2018 Data: 97.46%; FFY 2019 Target: 85.50%; FFY 2019 Data: 89.62%; Status: Met Target; Slippage: No Slippage

The number of parents to whom the surveys were distributed: 28,038
Percentage of respondent parents: 11.58%
Since the State did not report preschool children separately, discuss the procedures used to combine data from school age and preschool surveys in a manner that is valid and reliable.
The OSES annually surveys parents of students with IEPs, based upon an approved sampling plan. Surveys are sent to all parents within the specified, sampled school districts, which include parents of students with disabilities ages three through five. To accomplish this, the OSES extracts a base file that contains names and addresses of parents for all children during the reporting year. As such, parents of preschool students with disabilities were included using the same method as parents of children ages six to twenty-one.
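A worked check of the two reported figures, using the counts above:

\[ \text{Indicator 8} = \frac{2{,}909}{3{,}246} \times 100 \approx 89.62\% \qquad \text{Response rate} = \frac{3{,}246}{28{,}038} \times 100 \approx 11.58\% \]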

Sampling Question Yes / No

Was sampling used? YES

If yes, has your previously-approved sampling plan changed? NO

Describe the sampling methodology outlining how the design will yield valid and reliable estimates.
As per the State's approved sampling methodology (see the FY13 APR for a copy of the plan), the LEAs surveyed for the FY19 Indicator 8 were representative of the state. Of the 28,038 surveys distributed, 3,246 were returned fully completed. To determine what response rate would be sufficient to yield valid, reliable results, the state determined what sample size would be needed. Under generally accepted statistical practices, using a standard 5 percent margin of error, a 95 percent confidence level, and a 0.50 response distribution (the highest possible), 379 returned, completed surveys would allow valid inferences to be drawn from a sample of respondents. With 3,246 returned, completed surveys, the state is able to make valid inferences.
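The 379 figure is consistent with a standard sample-size calculation. The sketch below is a minimal illustration assuming Cochran's formula with a finite-population correction (the APR does not state which formula was used); under that assumption it reproduces the reported number for the 28,038 surveys distributed.

import math

z = 1.96        # z-score for a 95 percent confidence level
p = 0.50        # response distribution (most conservative assumption)
e = 0.05        # margin of error
N = 28_038      # number of surveys distributed

n0 = (z ** 2) * p * (1 - p) / (e ** 2)   # infinite-population sample size, about 384
n = n0 / (1 + (n0 - 1) / N)              # finite-population correction
print(math.ceil(n))                      # -> 379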

Survey Question Yes / No

Was a survey used? YES

If yes, is it a new or revised survey? NO

The demographics of the parents responding are representative of the demographics of children receiving special education services.

NO

If no, describe the strategies that the State will use to ensure that in the future the response data are representative of those demographics.
The demographics of the respondents were representative of the demographics of children receiving special education services in South Carolina for age, gender, and primary disability category. All races/ethnicities were representative of the demographics of children receiving special education services except white and African-American. The state will continue to use strategies, such as reaching out directly to races/ethnicities who are under-represented, in an attempt to solicit more responses. This will include publicizing the survey on the state and LEA websites and working with parent/student advocacy groups to publicize the survey to increase the response rate. The OSES is investigating the possibility of including the survey in the statewide IEP software so that it would be more accessible to parents, for instance as they attend their child's annual review.
Include the State’s analyses of the extent to which the demographics of the parents responding are representative of the demographics of children receiving special education services.
As shown in the data (https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/), there was an overrepresentation of respondents who identified as African American and an underrepresentation of respondents who identified as White.


To determine whether the demographics of the parents responding are representative of the demographics of children receiving special education services in the State, the OSES calculated the response rates along four demographic variables: 1. Age of students (based upon the child's age as of the state's Child Count); 2. Gender of students; 3. Race or ethnicity (using the federal reporting categories); and 4. Primary disability category. Next, the State compared the response rates, by demographics, to the State's Child Count for the reporting year (i.e., FY19) and reviewed the difference between the percentages for each demographic variable. The threshold used by the state to determine representativeness was a 10 percent difference. All races/ethnicities were within representative limits other than African-American (over-represented) and white (under-represented). The link to the analysis may be found at https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/indicator-8-representativeness/.
Provide additional information about this indicator (optional)
Since South Carolina began surveying parents for IDEA Part B Indicator 8, the state has used the Part B scale entitled "Schools' Effort to Partner with Parents," developed by the National Center for Special Education Accountability Monitoring (NCSEAM). This scale, used in many other states in measuring IDEA Part B Indicator 8, is considered a valid, reliable measure of parent involvement.
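A minimal sketch of the representativeness check described above, assuming the 10 percent threshold is applied as a simple difference between a category's share of respondents and its share of the Child Count (the function name, data structures, and example values are hypothetical):

def non_representative(respondent_pct, child_count_pct, threshold=10.0):
    """Return categories whose respondent share differs from the Child Count
    share by more than the threshold (in percentage points)."""
    return {
        category: (respondent_pct.get(category, 0.0), child_count_pct[category])
        for category in child_count_pct
        if abs(respondent_pct.get(category, 0.0) - child_count_pct[category]) > threshold
    }

# Hypothetical example: respondent shares vs. Child Count shares by race/ethnicity.
print(non_representative({"White": 38.0, "African American": 52.0},
                         {"White": 50.0, "African American": 40.0}))
# -> {'White': (38.0, 50.0), 'African American': (52.0, 40.0)}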

The state did not see a decrease in response rate. However, there was a decrease in the percent of parents reporting schools facilitated parent involvement as a means of improving services and results for children with disabilities (although the state still met the target). The impact of the COVID pandemic may very well have colored parents' responses to the survey due to the disruption of traditional instruction caused by the pandemic in the Spring. The survey, up to this point, has been anonymous, which has limited the ability of the OSES to follow up with specific parents. The OSES will carefully analyze the data collected for the same time period in 2020-21 to compare results.

8 - Prior FFY Required Actions
In the FFY 2019 SPP/APR, the State must report whether its FFY 2019 data are from a response group that is representative of the demographics of children receiving special education services, and, if not, the actions the State is taking to address this issue. The State must also include its analysis of the extent to which the demographics of the parents responding are representative of the demographics of children receiving special education services.
Response to actions required in FFY 2018 SPP/APR

8 - OSEP Response

8 - Required Actions
In the FFY 2020 SPP/APR, the State must report whether its FFY 2020 data are from a response group that is representative of the demographics of children receiving special education services, and, if not, the actions the State is taking to address this issue. The State must also include its analysis of the extent to which the demographics of the parents responding are representative of the demographics of children receiving special education services.


Indicator 9: Disproportionate RepresentationInstructions and MeasurementMonitoring Priority: DisproportionalityCompliance indicator: Percent of districts with disproportionate representation of racial and ethnic groups in special education and related services that is the result of inappropriate identification. (20 U.S.C. 1416(a)(3)(C))Data SourceState’s analysis, based on State’s Child Count data collected under IDEA section 618, to determine if the disproportionate representation of racial and ethnic groups in special education and related services was the result of inappropriate identification.MeasurementPercent = [(# of districts, that meet the State-established n and/or cell size (if applicable) for one or more racial/ethnic groups, with disproportionate representation of racial and ethnic groups in special education and related services that is the result of inappropriate identification) divided by the (# of districts in the State that meet the State-established n and/or cell size (if applicable) for one or more racial/ethnic groups)] times 100.Include State’s definition of “disproportionate representation.” Please specify in your definition: 1) the calculation method(s) being used (i.e., risk ratio, weighted risk ratio, e-formula, etc.); and 2) the threshold at which disproportionate representation is identified. Also include, as appropriate, 3) the number of years of data used in the calculation; and 4) any minimum cell and/or n-sizes (i.e., risk numerator and/or risk denominator).Based on its review of the 618 data for FFY 2018, describe how the State made its annual determination as to whether the disproportionate representation it identified of racial and ethnic groups in special education and related services was the result of inappropriate identification as required by 34 CFR §§300.600(d)(3) and 300.602(a), e.g., using monitoring data; reviewing policies, practices and procedures, etc. In determining disproportionate representation, analyze data, for each district, for all racial and ethnic groups in the district, or all racial and ethnic groups in the district that meet a minimum n and/or cell size set by the State. Report on the percent of districts in which disproportionate representation of racial and ethnic groups in special education and related services is the result of inappropriate identification, even if the determination of inappropriate identification was made after the end of the FFY 2019 reporting period (i.e., after June 30, 2020).InstructionsProvide racial/ethnic disproportionality data for all children aged 6 through 21 served under IDEA, aggregated across all disability categories.States are not required to report on underrepresentation.If the State has established a minimum n and/or cell size requirement, the State may only include, in both the numerator and the denominator, districts that met that State-established n and/or cell size. If the State used a minimum n and/or cell size requirement, report the number of districts totally excluded from the calculation as a result of this requirement because the district did not meet the minimum n and/or cell size for any racial/ethnic group.Consider using multiple methods in calculating disproportionate representation of racial and ethnic groups to reduce the risk of overlooking potential problems. 
Describe the method(s) used to calculate disproportionate representation.Provide the number of districts that met the State-established n and/or cell size (if applicable) for one or more racial/ethnic groups identified with disproportionate representation of racial and ethnic groups in special education and related services and the number of those districts identified with disproportionate representation that is the result of inappropriate identification.Targets must be 0%.Provide detailed information about the timely correction of noncompliance as noted in OSEP’s response for the previous SPP/APR. If the State did not ensure timely correction of the previous noncompliance, provide information on the extent to which noncompliance was subsequently corrected (more than one year after identification). In addition, provide information regarding the nature of any continuing noncompliance, improvement activities completed (e.g., review of policies and procedures, technical assistance, training, etc.) and any enforcement actions that were taken. If the State reported less than 100% compliance for the previous reporting period (e.g., for the FFY 2019 SPP/APR, the data for FFY 2018), and the State did not identify any findings of noncompliance, provide an explanation of why the State did not identify any findings of noncompliance.

9 - Indicator Data
Not Applicable: Select yes if this indicator is not applicable. NO
Historical Data

Baseline Year: 2005; Baseline Data: 0.00%
Targets (FFY 2014-2018): 0%, 0%, 0%, 0%, 0%
Data (FFY 2014-2018): 0.00%, 0.00%, 0.00%, 0.00%, NVR

Targets

FFY 2019

Target 0%

FFY 2019 SPP/APR Data
Has the state established a minimum n and/or cell size requirement? (yes/no)


YES
If yes, the State may only include, in both the numerator and the denominator, districts that met the State-established n and/or cell size. Report the number of districts excluded from the calculation as a result of the requirement: 2

Number of districts with disproportionate representation of racial and ethnic groups in special education and related services: 0
Number of districts with disproportionate representation of racial and ethnic groups in special education and related services that is the result of inappropriate identification: 0
Number of districts that met the State's minimum n-size: 84
FFY 2018 Data: NVR; FFY 2019 Target: 0%; FFY 2019 Data: 0.00%; Status: Met Target; Slippage: No Slippage

Were all races and ethnicities included in the review? YES
Define “disproportionate representation.” Please specify in your definition: 1) the calculation method(s) being used (i.e., risk ratio, weighted risk ratio, e-formula, etc.); and 2) the threshold at which disproportionate representation is identified. Also include, as appropriate, 3) the number of years of data used in the calculation; and 4) any minimum cell and/or n-sizes (i.e., risk numerator and/or risk denominator).
The OSES uses data collected on Table 1 (Child Count) of Information Collection 1820-0043 (Report of Children with Disabilities Receiving Special Education under Part B of the IDEA, as amended) and EDFacts FS002 for all children with disabilities, ages 6 through 21, served under IDEA for calculations on this indicator. These data are collected annually as part of the October (fourth Tuesday) Child Count reporting.
Disproportionate Representation Methodology
South Carolina used a multitier process to determine the presence of disproportionate representation in special education and related services due to inappropriate identification. The first step was calculation of risk ratios using data submitted by LEAs in the OSEP 618 data tables. Using the electronic spreadsheet developed by Westat, South Carolina calculated the risk ratios for each LEA with regard to its composition of students in special education along the seven federally reported race/ethnicity categories. This risk ratio directly compared the relative size of two risks by dividing the risk for a specific racial/ethnic group by the risk for a comparison group. This determined the specific racial/ethnic group’s risk of being identified as having a disability as compared to the risk for all other students. A risk ratio above the state-established criterion initiated the following process to determine whether the disproportionate representation was due to inappropriate identification. LEAs are determined to have disproportionate representation if they exceed the risk ratio trigger. Based upon feedback from a stakeholder group in 2010, the OSES redefined the trigger to use a risk ratio of greater than 2.50 for overrepresentation. The data used by the state are only data from one reporting year (FY19). South Carolina collected data for all LEAs for the 2019-20 reporting year. South Carolina determined that a disability subgroup size (n-size) of less than 30 and a cell size of less than 10 would not yield valid disproportionality ratios. As such, two LEAs were excluded from consideration for disproportionate representation.
Disproportionate Representation Definition: South Carolina defines disproportionate representation as occurring when an LEA has a risk ratio greater than 2.50 for overrepresentation, with an n-size greater than 30 and a cell size greater than 10.
Describe how the State made its annual determination as to whether the disproportionate representation it identified of racial and ethnic groups in special education and related services was the result of inappropriate identification.
Any LEAs that are determined to have disproportionate representation must undertake the following process to determine whether the disproportionate representation is due to inappropriate identification: using and completing an established rubric to examine LEA policies, procedures, and practices involved in the referral, evaluation, and identification of students with disabilities; completing individual folder reviews for a subset of student records from identified students with disabilities to examine the practices involved in the evaluation and identification of students with disabilities as required by 34 CFR §300.111, §300.201, and §§300.301 through 300.311; and submitting a summary of findings and evidence to the OSES for verification.
LEAs determined to have disproportionate representation had to undergo a guided self-assessment process with assistance from a team in the OSES. The self-assessment monitoring process is a focused review of an LEA’s policies, procedures, and practices. The process guided the LEA in its examination of the procedures used to identify children as students with disabilities. The LEA’s decision-making process and practices were examined to determine to what extent, if any, they contributed to the disproportionate representation of students in disability categories. The LEA’s evaluation practices were reviewed to determine if students of the identified racial and ethnic groups had received appropriate evaluations. The evaluations must have included a variety of assessment tools and strategies to gather relevant functional, developmental, and academic information about the student that may assist in determining the student’s specific classification and the content of the student’s individualized education program (IEP), including information related to enabling the student to access and progress in the general education curriculum. This protocol was developed to guide the LEAs through the process of examining policies, procedures, and practices related to the identification of students with disabilities, collecting data and evidence to support determinations, and evaluating the effectiveness of these practices. The same folders reviewed for the use of eligibility criteria were also reviewed for the use of the evaluation process. The LEAs gathered available evidence that pertained to each regulatory requirement, evaluated the evidence, and determined whether the evidence supported compliance or noncompliance. For FY19, no LEAs were found to have exceeded the permissible risk for disproportionate over-representation; therefore, no further actions were required by LEAs in this area.
Provide additional information about this indicator (optional)
Because the state's Child Count data used to calculate this Indicator were collected in October of 2019, prior to the COVID-related disruption in education, there were no COVID-related impacts to this Indicator for FY19.
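A minimal sketch of the risk-ratio screen described above (an illustration only, not the Westat spreadsheet; the function names, the exact treatment of the n-size and cell-size minimums, and the example values are assumptions):

def risk_ratio(iep_group, enrolled_group, iep_others, enrolled_others):
    """Risk of identification for one racial/ethnic group divided by the risk
    for all other students in the LEA."""
    return (iep_group / enrolled_group) / (iep_others / enrolled_others)

def flag_overrepresentation(iep_group, enrolled_group, iep_others, enrolled_others,
                            trigger=2.50, min_n=30, min_cell=10):
    # Assumed reading of the State's criteria: the disability subgroup (n-size)
    # must exceed 30 and the racial/ethnic cell must exceed 10 before the
    # 2.50 risk-ratio trigger is applied.
    if (iep_group + iep_others) <= min_n or iep_group <= min_cell:
        return False  # excluded from consideration for disproportionate representation
    return risk_ratio(iep_group, enrolled_group, iep_others, enrolled_others) > trigger

# Hypothetical LEA: 40 of 300 students in the group have IEPs vs. 90 of 1,700 others.
print(flag_overrepresentation(40, 300, 90, 1700))   # risk ratio about 2.52 -> True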


Correction of Findings of Noncompliance Identified in FFY 2018

Findings of Noncompliance Identified: 0
Findings of Noncompliance Verified as Corrected Within One Year: 0
Findings of Noncompliance Subsequently Corrected: 0
Findings Not Yet Verified as Corrected: 0

Correction of Findings of Noncompliance Identified Prior to FFY 2018

Year Findings of Noncompliance Were Identified / Findings of Noncompliance Not Yet Verified as Corrected as of FFY 2018 APR / Findings of Noncompliance Verified as Corrected / Findings Not Yet Verified as Corrected

9 - Prior FFY Required Actions
The State did not provide valid and reliable data for FFY 2018. The State must provide valid and reliable data for FFY 2019 in the FFY 2019 SPP/APR.

Response to actions required in FFY 2018 SPP/APR
The state actually reported the correct number of LEAs for this Indicator in FY18 (88 LEAs minus 2 LEAs not meeting the required N size = 86 LEAs). The original error occurred in the FY17 Introduction: FY17 should have reflected 87 LEAs; the addition of 1 new LEA for FY18 led to 88 LEAs; and the consolidation of 3 LEAs into 1 led to 86 LEAs for FY19. The state has subsequently worked with IDC to put checks and reviews in place to ensure the reporting of valid and reliable data for the appropriate reporting year and to clearly describe the changes in the number of LEAs from year to year.

9 - OSEP Response

9 - Required Actions


Indicator 10: Disproportionate Representation in Specific Disability Categories Instructions and MeasurementMonitoring Priority: DisproportionalityCompliance indicator: Percent of districts with disproportionate representation of racial and ethnic groups in specific disability categories that is the result of inappropriate identification. (20 U.S.C. 1416(a)(3)(C))Data SourceState’s analysis, based on State’s Child Count data collected under IDEA section 618, to determine if the disproportionate representation of racial and ethnic groups in specific disability categories was the result of inappropriate identification.MeasurementPercent = [(# of districts, that meet the State-established n and/or cell size (if applicable) for one or more racial/ethnic groups, with disproportionate representation of racial and ethnic groups in specific disability categories that is the result of inappropriate identification) divided by the (# of districts in the State that meet the State-established n and/or cell size (if applicable) for one or more racial/ethnic groups)] times 100.Include State’s definition of “disproportionate representation.” Please specify in your definition: 1) the calculation method(s) being used (i.e., risk ratio, weighted risk ratio, e-formula, etc.); and 2) the threshold at which disproportionate representation is identified. Also include, as appropriate, 3) the number of years of data used in the calculation; and 4) any minimum cell and/or n-sizes (i.e., risk numerator and/or risk denominator).Based on its review of the 618 data for FFY 2019, describe how the State made its annual determination as to whether the disproportionate representation it identified of racial and ethnic groups in specific disability categories was the result of inappropriate identification as required by 34 CFR §§300.600(d)(3) and 300.602(a), e.g., using monitoring data; reviewing policies, practices and procedures, etc. In determining disproportionate representation, analyze data, for each district, for all racial and ethnic groups in the district, or all racial and ethnic groups in the district that meet a minimum n and/or cell size set by the State. Report on the percent of districts in which disproportionate representation of racial and ethnic groups in special education and related services is the result of inappropriate identification, even if the determination of inappropriate identification was made after the end of the FFY 2019 reporting period (i.e., after June 30, 2020).InstructionsProvide racial/ethnic disproportionality data for all children aged 6 through 21 served under IDEA, aggregated across all disability categories.States are not required to report on underrepresentation.If the State has established a minimum n and/or cell size requirement, the State may only include, in both the numerator and the denominator, districts that met that State-established n and/or cell size. If the State used a minimum n and/or cell size requirement, report the number of districts totally excluded from the calculation as a result of this requirement because the district did not meet the minimum n and/or cell size for any racial/ethnic group.Consider using multiple methods in calculating disproportionate representation of racial and ethnic groups to reduce the risk of overlooking potential problems. 
Describe the method(s) used to calculate disproportionate representation. Provide the number of districts that met the State-established n and/or cell size (if applicable) for one or more racial/ethnic groups identified with disproportionate representation of racial and ethnic groups in special education and related services and the number of those districts identified with disproportionate representation that is the result of inappropriate identification.

Targets must be 0%.

Provide detailed information about the timely correction of noncompliance as noted in OSEP’s response for the previous SPP/APR. If the State did not ensure timely correction of the previous noncompliance, provide information on the extent to which noncompliance was subsequently corrected (more than one year after identification). In addition, provide information regarding the nature of any continuing noncompliance, improvement activities completed (e.g., review of policies and procedures, technical assistance, training, etc.) and any enforcement actions that were taken.

If the State reported less than 100% compliance for the previous reporting period (e.g., for the FFY 2019 SPP/APR, the data for FFY 2018), and the State did not identify any findings of noncompliance, provide an explanation of why the State did not identify any findings of noncompliance.

10 - Indicator Data

Not Applicable: Select yes if this indicator is not applicable. NO

Historical Data

Baseline Year Baseline Data

2005 7.00%

FFY 2014 2015 2016 2017 2018

Target 0% 0% 0% 0% 0%

Data 0.00% 0.00% 0.00% 0.00% 1.16%

Targets

FFY 2019

Target 0%


FFY 2019 SPP/APR Data

Has the state established a minimum n and/or cell size requirement? (yes/no) YES

If yes, the State may only include, in both the numerator and the denominator, districts that met the State-established n and/or cell size. Report the number of districts excluded from the calculation as a result of the requirement. 2

Number of districts with disproportionate representation of racial and ethnic groups in specific disability categories: 7
Number of districts with disproportionate representation of racial and ethnic groups in specific disability categories that is the result of inappropriate identification: 0
Number of districts that met the State's minimum n and/or cell size: 84
FFY 2018 Data: 1.16%
FFY 2019 Target: 0%
FFY 2019 Data: 0.00%
Status: Met Target
Slippage: No Slippage

Were all races and ethnicities included in the review? YES

Define “disproportionate representation.” Please specify in your definition: 1) the calculation method(s) being used (i.e., risk ratio, weighted risk ratio, e-formula, etc.); and 2) the threshold at which disproportionate representation is identified. Also include, as appropriate, 3) the number of years of data used in the calculation; and 4) any minimum cell and/or n-sizes (i.e., risk numerator and/or risk denominator).

The OSES uses data collected on Table 1 (Child Count) of Information Collection 1820-0043 (Report of Children with Disabilities Receiving Special Education under Part B of the IDEA, as amended) and EDFacts FS002 for all children with disabilities, ages 6 through 21, served under IDEA for calculations on this indicator. These data are collected annually as part of the October (fourth Tuesday) Child Count reporting.

Disproportionate Representation Methodology
South Carolina used a multitier process to determine the presence of disproportionate representation in special education and related services due to inappropriate identification. The first step was the calculation of risk ratios using data submitted by LEAs in the OSEP 618 data tables. Using the electronic spreadsheet developed by Westat, South Carolina calculated the risk ratios for each LEA with regard to its composition of students in special education across the seven federally reported racial/ethnic categories. The risk ratio directly compares the relative size of two risks by dividing the risk for a specific racial/ethnic group by the risk for a comparison group; it expresses the specific racial/ethnic group’s risk of being identified as having a disability relative to the risk for all other students. A risk ratio above the state-established threshold triggered the process, described below, for determining whether the disproportionate representation was due to inappropriate identification. An LEA was determined to have disproportionate representation if it exceeded the risk ratio trigger. Based upon feedback from a stakeholder group in 2010, the OSES redefined the trigger as a risk ratio above 2.50 for overrepresentation. The calculation uses data from a single reporting year (FY19); South Carolina collected data for all LEAs for the 2019-20 reporting year. South Carolina determined that a disability subgroup size of less than 30 and a cell size of less than 10 would not yield valid disproportionality ratios. As the data show, two LEAs were excluded from consideration for disproportionate representation based on these criteria.
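To make the screening step concrete, the sketch below applies the thresholds described above (risk ratio above 2.50, n-size greater than 30, cell size greater than 10) to hypothetical counts for a single LEA and disability category. It is an illustration only, not the Westat spreadsheet the state actually uses, and the counts and function names are invented for the example.

```python
# Illustrative sketch of the Indicator 10 risk-ratio screen described above.
# This is not the Westat tool; the counts and names below are hypothetical.

def risk_ratio(group_identified, group_enrollment, others_identified, others_enrollment):
    """Risk for the target racial/ethnic group divided by the risk for all other students."""
    return (group_identified / group_enrollment) / (others_identified / others_enrollment)

def flag_for_review(group_identified, group_enrollment, others_identified, others_enrollment):
    """Apply the state thresholds: risk ratio > 2.50, n-size > 30, cell size > 10."""
    n_size = group_identified + others_identified   # all students identified in the category
    cell_size = group_identified                    # students of the target group identified
    if n_size <= 30 or cell_size <= 10:
        return False  # excluded from consideration (minimum n/cell size not met)
    return risk_ratio(group_identified, group_enrollment,
                      others_identified, others_enrollment) > 2.50

# Hypothetical LEA: 40 of 300 students in the target group identified in the category,
# versus 120 of 2,700 students in all other groups (risk ratio = 3.0, so flagged).
print(flag_for_review(40, 300, 120, 2700))  # True
```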
Disproportionate Representation Definition: South Carolina defines disproportionate representation as occurring when an LEA has a risk ratio greater than 2.50 for overrepresentation, with an n-size greater than 30 and a cell size greater than 10.

Describe how the State made its annual determination as to whether the disproportionate overrepresentation it identified of racial and ethnic groups in specific disability categories was the result of inappropriate identification.

Any LEA determined to have disproportionate representation must undertake the following process to determine whether the disproportionate representation is due to inappropriate identification: using and completing an established rubric to examine LEA policies, procedures, and practices involved in the referral, evaluation, and identification of students with disabilities; completing individual folder reviews for a subset of student records from identified students with disabilities to examine the practices involved in the evaluation and identification of students with disabilities as required by 34 CFR §§300.111, 300.201, and 300.301 through 300.311; and submitting a summary of findings and evidence to the OSES for verification.

LEAs determined to have disproportionate representation had to undergo a guided self-assessment process with assistance from a team in the OSES. The self-assessment monitoring process is a focused review of an LEA’s policies, procedures, and practices. The process guided the LEA in its examination of the procedures used to identify children as students with disabilities. The LEA’s decision-making process and practices were examined to determine to what extent, if any, they contributed to the disproportionate representation of students in disability categories. The LEA’s evaluation practices were reviewed to determine if students of the identified racial and ethnic groups had received appropriate evaluations. The evaluations must have included a variety of assessment tools and strategies to gather relevant functional, developmental and academic information about the student that may assist in determining the student’s specific classification and the content of the student’s individualized education program (IEP), including information related to enabling the student to access and progress in the general education curriculum. This protocol was developed to guide the LEAs through the process of examining policies, procedures, and practices related to the identification of students with disabilities, collecting data and evidence to support determinations, and evaluating the effectiveness of these practices. The same folders reviewed for the use of eligibility criteria were also reviewed for the use of the evaluation process. The LEAs gathered available evidence that pertained to each regulatory requirement, evaluated the evidence, and determined whether the evidence supported compliance or noncompliance.

For FY19, no LEAs were found to have policies, procedures, or practices that contributed to disproportionate representation; therefore, no further actions were required by LEAs in this area.


Provide additional information about this indicator (optional)
Because the state's Child Count data used to calculate this Indicator were collected in October of 2019, prior to the COVID-related disruption in education, there were no COVID-related impacts to this Indicator for FY19.

Correction of Findings of Noncompliance Identified in FFY 2018

Findings of Noncompliance Identified: 1
Findings of Noncompliance Verified as Corrected Within One Year: 1
Findings of Noncompliance Subsequently Corrected: 0
Findings Not Yet Verified as Corrected: 0

FFY 2018 Findings of Noncompliance Verified as Corrected

Describe how the State verified that the source of noncompliance is correctly implementing the regulatory requirements
With assistance from the OSES team, the LEA created a team to use the review process described above to assess its policies, procedures, and practices. Based on the review, no issues with policies or procedures were identified. The data showed the issue was with consistent application of the policies and procedures (practice). Practices identified as noncompliant included inconsistent documentation of information and missing required evaluation components. The LEA developed a training process for relevant key participants in the evaluation process, and the training materials were reviewed and approved by the OSES. Once the training was completed, the LEA monitored subsequent evaluations/identifications to ensure the practices across the LEA were consistently compliant. Through a review in the online data system, the OSES determined that the LEA is correctly implementing the regulatory requirements at the system level and is, therefore, demonstrating 100% compliance for this Indicator.

Describe how the State verified that each individual case of noncompliance was corrected
The LEA conducted IEP meetings for all students for whom the review indicated noncompliant practices. The teams used the training worksheet to identify missing evaluation components or missing documentation. When indicated, the teams conducted a reevaluation: they reviewed existing information, determined the need for additional information, obtained consent, gathered the information, and then determined continued eligibility and educational need appropriately. The LEA reviewed all documentation to ensure completeness. The IEP teams then determined whether the noncompliance constituted a denial of FAPE for each student and, if so, provided compensatory services. The OSES reviewed the data in the online system and verified that all findings of noncompliance at the student level were corrected; the LEA is, therefore, demonstrating 100% compliance.

Correction of Findings of Noncompliance Identified Prior to FFY 2018

Year Findings of Noncompliance Were Identified
Findings of Noncompliance Not Yet Verified as Corrected as of FFY 2018 APR
Findings of Noncompliance Verified as Corrected
Findings Not Yet Verified as Corrected

10 - Prior FFY Required Actions
None

10 - OSEP Response

10 - Required Actions


Indicator 11: Child Find

Instructions and Measurement

Monitoring Priority: Effective General Supervision Part B / Child Find

Compliance indicator: Percent of children who were evaluated within 60 days of receiving parental consent for initial evaluation or, if the State establishes a timeframe within which the evaluation must be conducted, within that timeframe. (20 U.S.C. 1416(a)(3)(B))

Data Source
Data to be taken from State monitoring or State data system and must be based on actual, not an average, number of days. Indicate if the State has established a timeline and, if so, what is the State’s timeline for initial evaluations.

Measurement

a. # of children for whom parental consent to evaluate was received.
b. # of children whose evaluations were completed within 60 days (or State-established timeline).

Account for children included in (a), but not included in (b). Indicate the range of days beyond the timeline when the evaluation was completed and any reasons for the delays.

Percent = [(b) divided by (a)] times 100.

Instructions
If data are from State monitoring, describe the method used to select LEAs for monitoring. If data are from a State database, include data for the entire reporting year.

Describe the results of the calculations and compare the results to the target. Describe the method used to collect these data, and if data are from the State’s monitoring, describe the procedures used to collect these data. Provide the actual numbers used in the calculation.

Note that under 34 CFR §300.301(d), the timeframe set for initial evaluation does not apply to a public agency if: (1) the parent of a child repeatedly fails or refuses to produce the child for the evaluation; or (2) a child enrolls in a school of another public agency after the timeframe for initial evaluations has begun, and prior to a determination by the child’s previous public agency as to whether the child is a child with a disability. States should not report these exceptions in either the numerator (b) or denominator (a). If the State-established timeframe provides for exceptions through State regulation or policy, describe cases falling within those exceptions and include in b.

Targets must be 100%.

Provide detailed information about the timely correction of noncompliance as noted in OSEP’s response for the previous SPP/APR. If the State did not ensure timely correction of the previous noncompliance, provide information on the extent to which noncompliance was subsequently corrected (more than one year after identification). In addition, provide information regarding the nature of any continuing noncompliance, improvement activities completed (e.g., review of policies and procedures, technical assistance, training, etc.) and any enforcement actions that were taken.

If the State reported less than 100% compliance for the previous reporting period (e.g., for the FFY 2019 SPP/APR, the data for FFY 2018), and the State did not identify any findings of noncompliance, provide an explanation of why the State did not identify any findings of noncompliance.

11 - Indicator Data

Historical Data

Baseline Year Baseline Data

2005 83.00%

FFY 2014 2015 2016 2017 2018

Target 100% 100% 100% 100% 100%

Data 99.61% 99.52% 99.65% 99.94% 99.74%

Targets

FFY 2019

Target 100%

FFY 2019 SPP/APR Data


(a) Number of children for whom parental consent to evaluate was received: 14,551
(b) Number of children whose evaluations were completed within 60 days (or State-established timeline): 14,262
FFY 2018 Data: 99.74%
FFY 2019 Target: 100%
FFY 2019 Data: 98.01%
Status: Did Not Meet Target
Slippage: Slippage
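For reference, the FFY 2019 percentage above follows directly from the measurement formula, Percent = (b)/(a) times 100. The short sketch below simply reproduces that arithmetic from the counts in the table; it is illustrative only.

```python
# Indicator 11, FFY 2019: reproduce the reported percentage from the counts above.
consent_received = 14_551    # (a) children for whom parental consent to evaluate was received
completed_timely = 14_262    # (b) evaluations completed within the 60-day timeline

percent = completed_timely / consent_received * 100
print(f"{percent:.2f}%")                     # 98.01%
print(consent_received - completed_timely)   # 289 children accounted for in the narrative below
```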

Provide reasons for slippage
As noted in the Introduction, the COVID pandemic has affected every aspect of the educational system due to the significant health and safety concerns resulting from the virus. Physical school, district, and government buildings remained closed through the summer of 2020. Quarantine and mask ordinances were put into place. Although LEAs and school personnel were able to adapt and continue with some aspects of the education process, there were others that could not be adapted. The processes and routines dependent upon face-to-face interaction were affected the most, including some aspects of the evaluation process. Although LEAs were able to continue with and complete some aspects of the initial evaluation process using alternative means (virtual meetings, telephone calls, video calls, emails), other aspects could not be completed because of the face-to-face interaction required. So while LEAs were able, in some situations, to gather parent input, complete rating scales, assess current levels of performance, and observe instructional response, LEAs were unable to gather other information required by the evaluation planning team. Some components requested by the team could only be gathered via close-contact, face-to-face interaction. These included standardized cognitive, language, communication, and achievement assessments as well as other measures. Because the health and safety issues did not allow for this type of interaction, and because these assessment measures could not be adapted or changed to accommodate administration through another format without invalidating the results, the completion of the initial evaluation process was delayed for a small percentage of students.

Of the 289 students for whom the initial evaluation was not completed within the timeline, 255 had evaluations that were delayed unavoidably due to the COVID pandemic. LEAs made good faith efforts to complete every aspect of the evaluation process that could be completed. The aspects that could not be completed within the timeline because of the pandemic were completed as soon as it was safe for both the students and staff. During this time, LEA and school staff communicated frequently with parents and families to keep them abreast of their efforts and to discuss alternatives.

Once face-to-face interactions could begin with all safety precautions in place, LEAs began scheduling the remaining assessment components as quickly as possible. In some instances, parents remained uncomfortable with any face-to-face interaction due to continued health and safety concerns and asked that completion of the components be delayed. LEAs have documented all good faith efforts and communications with parents during this time.

Although OSEP did not grant any waivers for this or any other timelines related to IDEA, the OSES wanted to examine the impact of COVID on this Indicator. When the 255 students with COVID-related delays were removed from the calculations (for analysis purposes only), 34 students remained who were not assessed within the timelines. It appears that slightly over 88% of the delays, despite all good faith efforts by LEAs and families to complete the evaluations within the timeline, were COVID-related.

Number of children included in (a) but not included in (b): 289

Account for children included in (a) but not included in (b). Indicate the range of days beyond the timeline when the evaluation was completed and any reasons for the delays.
For the FY19 APR, the range of days beyond the 60-day timeline was from 1 to 141 days. The reasons for the delays (and subsequent noncompliance) include staff turnover; not ensuring that processes continued during school calendar, winter, and summer breaks; failure to move expeditiously to ensure that an evaluation occurred within sixty calendar days; the inability to engage parents after multiple varied attempts; difficulty scheduling hearing and vision screenings; difficulty arranging for related service providers’ input for assessments; inconsistent oversight of the evaluation process by the LEA; and the inability to complete the evaluation process due to health and safety concerns related to the COVID pandemic. Data analysis showed that 88.24% of the delays were directly related to the pandemic.

Indicate the evaluation timeline used: The State used the 60-day timeframe within which the evaluation must be conducted.

What is the source of the data provided for this indicator? State database that includes data for the entire reporting year

Describe the method used to collect these data, and if data are from the State’s monitoring, describe the procedures used to collect these data.
The OSES collects data from the statewide special education database, SC Enrich IEP, for use in the measurement of IDEA Part B Indicator 11. The data are pushed up from the LEA to the state level. The date range for this collection was July 1, 2019 – June 30, 2020. These data were reflective of all students for whom parental consent was received and who received an evaluation consistent with the requirements of IDEA Part B Indicator 11. A team of OSES staff with expertise in data collection, analyses, and reporting reviewed both quantitative and qualitative data from the SC Enrich IEP spreadsheet reports to determine the categorical analysis of each individual student for whom consent to evaluate was received. These staff also conducted follow-up communication with any LEA that exceeded the timeline for one or more children to determine whether or not there was any noncompliance by LEA. The OSES collected additional data and explanatory documentation to ensure the data and information were valid and reliable.

Provide additional information about this indicator (optional)
The OSES provided guidance and support to LEAs in searching for alternative means to complete evaluations during the pandemic. This included sharing information from test developers as to whether a particular assessment could be administered via alternative means such as virtually. This also included assisting LEAs with the identification, where available, of alternatives to frequently used assessments. In many situations, alternative administration methods or alternative assessments were simply not available. The health and safety concerns for all involved were a priority that prohibited the face-to-face interaction required by much of the evaluation process.
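The per-child "days beyond the 60-day timeline" figures referenced above can be derived from the consent date and the evaluation completion date. The sketch below shows one plausible way to compute that value; the dates and field names are hypothetical, and the actual figures come from the SC Enrich IEP reports.

```python
# Illustrative calculation of days beyond the 60-day evaluation timeline for one child.
# Dates are hypothetical; actual values are drawn from SC Enrich IEP.
from datetime import date

STATE_TIMELINE_DAYS = 60

def days_beyond_timeline(consent_date: date, evaluation_completed: date) -> int:
    """Return 0 if the evaluation was timely, otherwise the number of days late."""
    elapsed = (evaluation_completed - consent_date).days
    return max(0, elapsed - STATE_TIMELINE_DAYS)

# Example: consent received October 1, 2019; evaluation completed December 20, 2019.
print(days_beyond_timeline(date(2019, 10, 1), date(2019, 12, 20)))  # 20 days beyond the timeline
```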

Correction of Findings of Noncompliance Identified in FFY 2018


Findings of Noncompliance Identified: 56
Findings of Noncompliance Verified as Corrected Within One Year: 56
Findings of Noncompliance Subsequently Corrected: 0
Findings Not Yet Verified as Corrected: 0

FFY 2018 Findings of Noncompliance Verified as Corrected

Describe how the State verified that the source of noncompliance is correctly implementing the regulatory requirements
Upon review by the OSES, no LEAs were found to have systemic noncompliance for this Indicator. The noncompliance in each LEA appeared to consist of isolated instances. The OSES reviewed the policies, procedures, and practices related to evaluation for initial eligibility for special education in each LEA and found that the policies and procedures were aligned with all federal and state requirements. The findings of noncompliance were related to distinct situations and were not tied to a particular staff member. All LEAs with findings were required to review practices and, if appropriate, develop additional procedures such as checklists or an additional review to ensure all evaluation team members were adhering to the regulatory requirements. Based on a review of updated data, the OSES determined that each LEA is correctly implementing the regulatory requirements at the system level and is, therefore, demonstrating 100% compliance for this Indicator at the system level.

Describe how the State verified that each individual case of noncompliance was corrected
The LEAs in question submitted evidence, verified by the OSES, that these children had been evaluated, although late, and that each of the affected LEAs had held IEP meetings to determine whether the child had been denied a FAPE under the IDEA. If the team determined that FAPE had been denied, the team determined the amount of compensatory services needed and provided those services. The OSES verified this information in correspondence with each district; in reviews of additional documentation submitted as part of the data validity verification procedures; and in reviews of the online SC Enrich IEP system. Based on these reviews, the OSES determined that each LEA had corrected all individual, student-specific noncompliance and is, therefore, demonstrating 100% compliance at the student level for this Indicator.

Correction of Findings of Noncompliance Identified Prior to FFY 2018

Year Findings of Noncompliance Were Identified
Findings of Noncompliance Not Yet Verified as Corrected as of FFY 2018 APR
Findings of Noncompliance Verified as Corrected
Findings Not Yet Verified as Corrected

11 - Prior FFY Required Actions

Response to actions required in FFY 2018 SPP/APR

11 - OSEP Response

11 - Required Actions
Because the State reported less than 100% compliance for FFY 2019, the State must report on the status of correction of noncompliance identified in FFY 2019 for this indicator. When reporting on the correction of noncompliance, the State must report, in the FFY 2020 SPP/APR, that it has verified that each LEA with noncompliance identified in FFY 2019 for this indicator: (1) is correctly implementing the specific regulatory requirements (i.e., achieved 100% compliance) based on a review of updated data such as data subsequently collected through on-site monitoring or a State data system; and (2) has corrected each individual case of noncompliance, unless the child is no longer within the jurisdiction of the LEA, consistent with OSEP Memo 09-02. In the FFY 2020 SPP/APR, the State must describe the specific actions that were taken to verify the correction.

If the State did not identify any findings of noncompliance in FFY 2019, although its FFY 2019 data reflect less than 100% compliance, provide an explanation of why the State did not identify any findings of noncompliance in FFY 2019.


Indicator 12: Early Childhood Transition

Instructions and Measurement

Monitoring Priority: Effective General Supervision Part B / Effective Transition

Compliance indicator: Percent of children referred by Part C prior to age 3, who are found eligible for Part B, and who have an IEP developed and implemented by their third birthdays. (20 U.S.C. 1416(a)(3)(B))

Data Source
Data to be taken from State monitoring or State data system.

Measurement

a. # of children who have been served in Part C and referred to Part B for Part B eligibility determination.
b. # of those referred determined to be NOT eligible and whose eligibility was determined prior to their third birthdays.
c. # of those found eligible who have an IEP developed and implemented by their third birthdays.
d. # of children for whom parent refusal to provide consent caused delays in evaluation or initial services or to whom exceptions under 34 CFR §300.301(d) applied.
e. # of children determined to be eligible for early intervention services under Part C less than 90 days before their third birthdays.
f. # of children whose parents chose to continue early intervention services beyond the child’s third birthday through a State’s policy under 34 CFR §303.211 or a similar State option.

Account for children included in (a), but not included in b, c, d, e, or f. Indicate the range of days beyond the third birthday when eligibility was determined and the IEP developed, and the reasons for the delays.

Percent = [(c) divided by (a - b - d - e - f)] times 100.

Instructions
If data are from State monitoring, describe the method used to select LEAs for monitoring. If data are from a State database, include data for the entire reporting year.

Describe the results of the calculations and compare the results to the target. Describe the method used to collect these data, and if data are from the State’s monitoring, describe the procedures used to collect these data. Provide the actual numbers used in the calculation.

Category f is to be used only by States that have an approved policy for providing parents the option of continuing early intervention services beyond the child’s third birthday under 34 CFR §303.211 or a similar State option.

Targets must be 100%.

Provide detailed information about the timely correction of noncompliance as noted in OSEP’s response for the previous SPP/APR. If the State did not ensure timely correction of the previous noncompliance, provide information on the extent to which noncompliance was subsequently corrected (more than one year after identification). In addition, provide information regarding the nature of any continuing noncompliance, improvement activities completed (e.g., review of policies and procedures, technical assistance, training, etc.) and any enforcement actions that were taken.

If the State reported less than 100% compliance for the previous reporting period (e.g., for the FFY 2019 SPP/APR, the data for FFY 2018), and the State did not identify any findings of noncompliance, provide an explanation of why the State did not identify any findings of noncompliance.

12 - Indicator Data

Not Applicable: Select yes if this indicator is not applicable. NO

Historical Data

Baseline Year Baseline Data

2005 78.00%

FFY 2014 2015 2016 2017 2018

Target 100% 100% 100% 100% 100%

Data 99.72% 99.73% 100.00% 100.00% 99.56%

Targets

FFY 2019

Target 100%

FFY 2019 SPP/APR Data

a. Number of children who have been served in Part C and referred to Part B for Part B eligibility determination. 3,079


b. Number of those referred determined to be NOT eligible and whose eligibility was determined prior to third birthday. 700

c. Number of those found eligible who have an IEP developed and implemented by their third birthdays. 1,411

d. Number for whom parent refusals to provide consent caused delays in evaluation or initial services or to whom exceptions under 34 CFR §300.301(d) applied. 789

e. Number of children who were referred to Part C less than 90 days before their third birthdays. 26

f. Number of children whose parents chose to continue early intervention services beyond the child’s third birthday through a State’s policy under 34 CFR §303.211 or a similar State option. 0

Measure: Percent of children referred by Part C prior to age 3 who are found eligible for Part B, and who have an IEP developed and implemented by their third birthdays.
Numerator (c): 1,411
Denominator (a-b-d-e-f): 1,564
FFY 2018 Data: 99.56%
FFY 2019 Target: 100%
FFY 2019 Data: 90.22%
Status: Did Not Meet Target
Slippage: Slippage
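As a cross-check, the denominator and percentage above follow from the counts reported in items (a) through (f) and the formula Percent = (c)/(a - b - d - e - f) times 100. The sketch below reproduces that arithmetic; it is illustrative only.

```python
# Indicator 12, FFY 2019: reproduce the denominator and percentage from items (a)-(f) above.
a = 3_079  # served in Part C and referred to Part B
b = 700    # found NOT eligible, with eligibility determined before the third birthday
c = 1_411  # found eligible with an IEP developed and implemented by the third birthday
d = 789    # parent refusal caused delays, or 34 CFR §300.301(d) exceptions applied
e = 26     # referred to Part C less than 90 days before the third birthday
f = 0      # parents chose to continue Part C services past the third birthday

denominator = a - b - d - e - f
print(denominator)                      # 1564
print(f"{c / denominator * 100:.2f}%")  # 90.22%
print(denominator - c)                  # 153 children accounted for in the narrative below
```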

Provide reasons for slippage, if applicable
As noted in the Introduction, the COVID pandemic has affected every aspect of the educational system due to the significant health and safety concerns resulting from the virus. Physical school, district, and government buildings remained closed through the summer of 2020. Quarantine and mask ordinances were put into place. Although LEAs and school personnel were able to adapt and continue with some aspects of the education process, there were others that could not be adapted, particularly for this age group. The processes and routines dependent upon face-to-face interaction were affected the most, including some aspects of the initial evaluation and transition processes. Although LEAs were able to continue with and complete some aspects of the initial evaluation process using alternative means (virtual meetings, telephone calls, video calls, emails), other aspects could not be completed because of the face-to-face interaction required. These types of interactions with children this age are critical to relationship-building and to obtaining valid and reliable data. Therefore, while LEAs were able, in some situations, to gather parent input, complete rating scales, assess current levels of performance, and observe instructional response, LEAs were unable to gather other information required by the evaluation planning and transition team. Some components requested by the team could only be gathered via close-contact, face-to-face interaction. These included standardized cognitive, language, communication, developmental, and pre-academic assessments as well as other measures. Because the health and safety issues did not allow for this type of interaction, and because these assessment measures could not be adapted or changed to accommodate administration through another format without invalidating the results, the completion of the initial evaluation process, and subsequently the transition from Part C to Part B services, was delayed for a small percentage of students. In some instances, the child was able to continue to receive Part C services until the transition could be safely completed; however, as will be reflected in Part C’s APR, the pandemic impacted service delivery for Part C as well.

Of the 153 students for whom the initial evaluation and transition were not completed within the timeline, 142 had evaluations/transitions that were delayed unavoidably due to the COVID pandemic. LEAs made good faith efforts to complete every aspect of the evaluation and transition process that could be completed. The aspects that could not be completed within the timeline because of the pandemic were completed as soon as it was safe for both the students and staff. During this time, LEA and school staff communicated frequently with parents and families to keep them abreast of their efforts and to discuss alternatives.

Once face-to-face interactions could begin with all safety precautions in place, LEAs began scheduling the remaining assessment components and transition meetings as quickly as possible. In some instances, parents remained uncomfortable with any face-to-face interaction due to continued health and safety concerns and asked that completion of the components be delayed. LEAs have documented all good faith efforts and communications with parents during this time.

Although OSEP did not grant any waivers for this or any other timelines related to IDEA, the OSES wanted to examine the impact of COVID on this Indicator. When the 142 students with COVID-related delays were removed from the calculations (for analysis purposes only), 11 students remained who were not assessed/transitioned within the timelines. It appears that slightly over 92% of the delays, despite all good faith efforts by LEAs and families to complete the evaluations/transitions within the timeline, were COVID-related.

Number of children who were served in Part C and referred to Part B for eligibility determination that are not included in b, c, d, e, or f: 153

Account for children included in (a), but not included in b, c, d, e, or f. Indicate the range of days beyond the third birthday when eligibility was determined and the IEP developed, and the reasons for the delays.
For FFY 2019, the range of days beyond the third birthday was from 1 to 361 days. The reasons for the delays (and subsequent noncompliance) include staff turnover; not ensuring that processes continued during school calendar, winter, and summer breaks; failure to move expeditiously to ensure that an evaluation occurred within sixty calendar days; the inability to engage parents after multiple varied attempts; difficulty scheduling hearing and vision screenings; difficulty arranging for related service providers’ input for assessments; inconsistent oversight of the evaluation process by the LEA; and the inability to complete the evaluation process due to health and safety concerns related to the COVID pandemic. Data analysis showed that 92.81% of the delays were directly related to the pandemic.

Attach PDF table (optional)

What is the source of the data provided for this indicator? State database that includes data for the entire reporting year

Describe the method used to collect these data, and if data are from the State’s monitoring, describe the procedures used to collect these data.


The Office of Special Education Services (OSES) collects data from the statewide special education database, SC Enrich IEP, for use in the measurement of IDEA Part B Indicator 12. The date range for this collection was July 1, 2019 – June 30, 2020. These data are reflective of all students who were referred from IDEA Part C providers (BabyNet) in the state for the respective date range. For each local educational agency (LEA) and state-operated program (SOP), OSES staff extracted spreadsheet reports in July 2020. A team of OSES staff with expertise in data collection, analysis, and reporting reviewed both quantitative and qualitative data from the SC Enrich IEP platform reports to determine the categorical analysis of each individual student referred from Part C (a – e in the above table) and whether or not student and/or LEA-level noncompliance existed. The OSES also collected additional data, documentation, and information, as needed; corresponded with LEAs; and ensured that the data and information were valid and reliable.

Provide additional information about this indicator (optional)
The OSES provided guidance and support to LEAs in searching for alternative means to complete evaluations during the pandemic. This included sharing information from test developers as to whether a particular assessment could be administered via alternative means such as virtually. This also included assisting LEAs with the identification, where available, of alternatives to frequently used assessments. In many situations, alternative administration methods or alternative assessments were simply not available, particularly for this age range. The health and safety concerns for all involved were a priority that prohibited the face-to-face interaction required by much of the evaluation and transition process.

Correction of Findings of Noncompliance Identified in FFY 2018

Findings of Noncompliance Identified: 7
Findings of Noncompliance Verified as Corrected Within One Year: 7
Findings of Noncompliance Subsequently Corrected: 0
Findings Not Yet Verified as Corrected: 0

FFY 2018 Findings of Noncompliance Verified as Corrected

Describe how the State verified that the source of noncompliance is correctly implementing the regulatory requirements
Upon review by the OSES, no LEAs were found to have systemic noncompliance for this Indicator. The noncompliance in each LEA appeared to consist of isolated instances. The OSES reviewed the policies, procedures, and practices related to evaluation and transition from Part C to Part B in each LEA and found that the policies and procedures were aligned with all federal and state requirements. The findings of noncompliance were related to distinct situations and were not tied to a particular staff member. All LEAs with findings were required to review practices and, if appropriate, develop additional procedures such as checklists or an additional review to ensure all evaluation/transition team members were adhering to the regulatory requirements. Based on a review of updated data, the OSES determined that each LEA is correctly implementing the regulatory requirements at the system level and is, therefore, demonstrating 100% compliance for this Indicator at the system level.

Describe how the State verified that each individual case of noncompliance was corrected
The LEAs in question submitted evidence, verified by the OSES, that these children had been evaluated and now have an IEP in place, although after their third birthday, and that each of the affected LEAs had held IEP meetings to determine whether the child had been denied a FAPE under the IDEA. If the team determined that FAPE had been denied, the team determined the amount of compensatory services needed and provided those services. The OSES verified this information in correspondence with each district; in reviews of additional documentation submitted as part of the data validity verification procedures; and in reviews of the online SC Enrich IEP system. Based on these reviews, the OSES determined that each LEA had corrected all individual, student-specific noncompliance and is, therefore, demonstrating 100% compliance at the student level for this Indicator.

Correction of Findings of Noncompliance Identified Prior to FFY 2018

Year Findings of Noncompliance Were Identified
Findings of Noncompliance Not Yet Verified as Corrected as of FFY 2018 APR
Findings of Noncompliance Verified as Corrected
Findings Not Yet Verified as Corrected

12 - Prior FFY Required Actions
None

12 - OSEP Response

12 - Required Actions
Because the State reported less than 100% compliance for FFY 2019, the State must report on the status of correction of noncompliance identified in FFY 2019 for this indicator. When reporting on the correction of noncompliance, the State must report, in the FFY 2020 SPP/APR, that it has verified that each LEA with noncompliance identified in FFY 2019 for this indicator: (1) is correctly implementing the specific regulatory requirements (i.e., achieved 100% compliance) based on a review of updated data such as data subsequently collected through on-site monitoring or a State data system; and (2) has corrected each individual case of noncompliance, unless the child is no longer within the jurisdiction of the LEA, consistent with OSEP Memo 09-02. In the FFY 2020 SPP/APR, the State must describe the specific actions that were taken to verify the correction.

If the State did not identify any findings of noncompliance in FFY 2019, although its FFY 2019 data reflect less than 100% compliance, provide an explanation of why the State did not identify any findings of noncompliance in FFY 2019.


Indicator 13: Secondary Transition

Instructions and Measurement

Monitoring Priority: Effective General Supervision Part B / Effective Transition

Compliance indicator: Secondary transition: Percent of youth with IEPs aged 16 and above with an IEP that includes appropriate measurable postsecondary goals that are annually updated and based upon an age appropriate transition assessment, transition services, including courses of study, that will reasonably enable the student to meet those postsecondary goals, and annual IEP goals related to the student’s transition services needs. There also must be evidence that the student was invited to the IEP Team meeting where transition services are to be discussed and evidence that, if appropriate, a representative of any participating agency was invited to the IEP Team meeting with the prior consent of the parent or student who has reached the age of majority. (20 U.S.C. 1416(a)(3)(B))

Data Source
Data to be taken from State monitoring or State data system.

Measurement
Percent = [(# of youth with IEPs aged 16 and above with an IEP that includes appropriate measurable postsecondary goals that are annually updated and based upon an age appropriate transition assessment, transition services, including courses of study, that will reasonably enable the student to meet those postsecondary goals, and annual IEP goals related to the student’s transition services needs. There also must be evidence that the student was invited to the IEP Team meeting where transition services are to be discussed and evidence that, if appropriate, a representative of any participating agency was invited to the IEP Team meeting with the prior consent of the parent or student who has reached the age of majority) divided by the (# of youth with an IEP age 16 and above)] times 100.

If a State’s policies and procedures provide that public agencies must meet these requirements at an age younger than 16, the State may, but is not required to, choose to include youth beginning at that younger age in its data for this indicator. If a State chooses to do this, it must state this clearly in its SPP/APR and ensure that its baseline data are based on youth beginning at that younger age.

Instructions
If data are from State monitoring, describe the method used to select LEAs for monitoring. If data are from a State database, include data for the entire reporting year.

Describe the results of the calculations and compare the results to the target. Describe the method used to collect these data and if data are from the State’s monitoring, describe the procedures used to collect these data. Provide the actual numbers used in the calculation.

Targets must be 100%.

Provide detailed information about the timely correction of noncompliance as noted in OSEP’s response for the previous SPP/APR. If the State did not ensure timely correction of the previous noncompliance, provide information on the extent to which noncompliance was subsequently corrected (more than one year after identification). In addition, provide information regarding the nature of any continuing noncompliance, improvement activities completed (e.g., review of policies and procedures, technical assistance, training, etc.) and any enforcement actions that were taken.

If the State reported less than 100% compliance for the previous reporting period (e.g., for the FFY 2019 SPP/APR, the data for FFY 2018), and the State did not identify any findings of noncompliance, provide an explanation of why the State did not identify any findings of noncompliance.

13 - Indicator Data

Historical Data

Baseline Year Baseline Data

2009 98.92%

FFY 2014 2015 2016 2017 2018

Target 100% 100% 100% 100% 100%

Data 96.60% 88.82% 91.90% 90.48% 96.88%

Targets

FFY 2019

Target 100%

FFY 2019 SPP/APR Data

Number of youth aged 16 and above with IEPs that contain each of the required components for secondary transition: 394
Number of youth with IEPs aged 16 and above: 401
FFY 2018 Data: 96.88%
FFY 2019 Target: 100%
FFY 2019 Data: 98.25%
Status: Did Not Meet Target
Slippage: No Slippage


What is the source of the data provided for this indicator? State monitoring

Describe the method used to collect these data, and if data are from the State’s monitoring, describe the procedures used to collect these data.
The OSES system for review of IEPs for this Indicator operates on a three-year data collection and review cycle. One third of all LEAs have IEPs reviewed annually. Each third contains a similar range of small, medium, and large LEAs in terms of the number of students with disabilities. The number of IEPs per LEA depends on the size of the LEA but typically ranges from 10 to 20. The IEPs are uploaded into the statewide database and then reviewed by OSES staff. Upon completion of the review, each LEA receives feedback and is required to correct all findings of noncompliance as soon as possible, but in no case later than one year. OSES staff review IEPs using a tool based on information and guidance from the National Training and Technical Assistance Center (NTTAC). The tool used for Indicator 13 data collection may be found at: https://ed.sc.gov/districts-schools/special-education-services/post-secondary-outcomes/. A webinar reviewing required information, providing TA specific to this Indicator, and listing links to national TA centers was recorded and posted on the OSES website. LEAs are referred to this universal TA.

Do the State’s policies and procedures provide that public agencies must meet these requirements at an age younger than 16? YES

If yes, did the State choose to include youth at an age younger than 16 in its data for this indicator and ensure that its baseline data are based on youth beginning at that younger age? NO

If no, please explain
Federal regulations require post-secondary planning beginning at age 16 but allow states to set a younger age. South Carolina lowered the age to 13 to align with general education graduation requirements and the development of Individual Graduation Plans for all students. For Indicator 13, OSEP has allowed states to use either the federal age (16) or the state requirement (13) if it is different. The state opted to use the federal age for its data collection to ensure comparability for data reporting purposes.

Provide additional information about this indicator (optional)
To assist LEAs in developing compliant and meaningful transition IEPs and in correcting the areas of noncompliance, OSES developed and posted an online module relating to Indicator 13 and post-secondary transition planning and services. OSES also developed activities, a post-module assessment, and other resources that were included with the module on the OSES website. The webinar reviewing required information, providing TA specific to this Indicator, and listing links to national TA centers has been recorded and posted on the OSES website. LEAs are referred to this universal TA. The module and website resources are available at: https://ed.sc.gov/districts-schools/special-education-services/post-secondary-outcomes/. OSES also provided in-person professional learning opportunities for the LEAs with findings of noncompliance. OSES has collaborated with the Transition Alliance of South Carolina (TASC) to provide guidance to district transition teams and with ABLE-SC in order to strengthen knowledge of transition needs and services for students with disabilities. The links to these partners are: http://transitionalliancesc.org/ and http://www.able-sc.org/. In addition to the efforts being made by OSES and its partners, South Carolina now has an add-on credential for transition that can be obtained by teachers at several colleges and universities. This was created with input from the South Carolina Advisory Council on the Education of Students with Disabilities in conjunction with stakeholders from the universities and other state agencies.

Correction of Findings of Noncompliance Identified in FFY 2018

Findings of Noncompliance Identified: 11
Findings of Noncompliance Verified as Corrected Within One Year: 11
Findings of Noncompliance Subsequently Corrected: 0
Findings Not Yet Verified as Corrected: 0

FFY 2018 Findings of Noncompliance Verified as Corrected

Describe how the State verified that the source of noncompliance is correctly implementing the regulatory requirements
Upon review by the OSES, no LEAs were found to have systemic noncompliance for this Indicator. The noncompliance in each LEA appeared to consist of isolated instances. The OSES reviewed the policies, procedures, and practices related to transition IEPs in each LEA and found that the policies and procedures for all LEAs were aligned with all federal and state requirements. The findings of noncompliance were related to distinct situations and were not tied to a particular staff member. All LEAs with findings were required to review practices and, if appropriate, develop additional procedures such as checklists or an additional review to ensure all IEP team members were adhering to the regulatory requirements. They were also required to complete the review of the module described above, and attendance was tracked. All LEAs received on-site TA from one or both of the state-level TA providers. The TA was specific to the particular needs of each LEA. The OSES reviewed updated data in the online system as well as any corrective practices the LEAs were implementing and determined that each LEA is now correctly implementing the regulatory requirements at the system level and is, therefore, demonstrating 100% compliance for this Indicator at the system level.

Describe how the State verified that each individual case of noncompliance was corrected
As described above, all LEAs had to complete the transition module, and attendance was tracked. The OSES staff as well as TASC provided individualized TA for the LEAs based on the individual findings of noncompliance. In some instances, this included having teachers bring the noncompliant IEPs and work through the issues with the TA providers. The LEAs then held IEP meetings for all students for whom there was a transition-related finding of noncompliance to correct the noncompliance and determine whether the student had been denied a FAPE under the IDEA. If the team determined that FAPE had been denied, the team determined the amount of compensatory services needed and provided those services. The OSES verified this information in correspondence with each district; in reviews of additional documentation submitted as part of the data validity verification procedures; and in reviews of the online SC Enrich IEP system. Based on these reviews, the OSES determined that each LEA had corrected all individual, student-specific noncompliance and is, therefore, demonstrating 100% compliance at the student level for this Indicator.

Correction of Findings of Noncompliance Identified Prior to FFY 2018


Year Findings of Noncompliance Were Identified
Findings of Noncompliance Not Yet Verified as Corrected as of FFY 2018 APR
Findings of Noncompliance Verified as Corrected
Findings Not Yet Verified as Corrected

13 - Prior FFY Required Actions
None

13 - OSEP Response

13 - Required Actions
Because the State reported less than 100% compliance for FFY 2019, the State must report on the status of correction of noncompliance identified in FFY 2019 for this indicator. When reporting on the correction of noncompliance, the State must report, in the FFY 2020 SPP/APR, that it has verified that each LEA with noncompliance identified in FFY 2019 for this indicator: (1) is correctly implementing the specific regulatory requirements (i.e., achieved 100% compliance) based on a review of updated data such as data subsequently collected through on-site monitoring or a State data system; and (2) has corrected each individual case of noncompliance, unless the child is no longer within the jurisdiction of the LEA, consistent with OSEP Memo 09-02. In the FFY 2020 SPP/APR, the State must describe the specific actions that were taken to verify the correction.

If the State did not identify any findings of noncompliance in FFY 2019, although its FFY 2019 data reflect less than 100% compliance, provide an explanation of why the State did not identify any findings of noncompliance in FFY 2019.


Indicator 14: Post-School Outcomes

Instructions and Measurement

Monitoring Priority: Effective General Supervision Part B / Effective Transition

Results indicator: Post-school outcomes: Percent of youth who are no longer in secondary school, had IEPs in effect at the time they left school, and were:
Enrolled in higher education within one year of leaving high school.
Enrolled in higher education or competitively employed within one year of leaving high school.
Enrolled in higher education or in some other postsecondary education or training program; or competitively employed or in some other employment within one year of leaving high school.
(20 U.S.C. 1416(a)(3)(B))

Data Source
State selected data source.

Measurement

A. Percent enrolled in higher education = [(# of youth who are no longer in secondary school, had IEPs in effect at the time they left school and were enrolled in higher education within one year of leaving high school) divided by the (# of respondent youth who are no longer in secondary school and had IEPs in effect at the time they left school)] times 100.

B. Percent enrolled in higher education or competitively employed within one year of leaving high school = [(# of youth who are no longer in secondary school, had IEPs in effect at the time they left school and were enrolled in higher education or competitively employed within one year of leaving high school) divided by the (# of respondent youth who are no longer in secondary school and had IEPs in effect at the time they left school)] times 100.

C. Percent enrolled in higher education, or in some other postsecondary education or training program; or competitively employed or in some other employment = [(# of youth who are no longer in secondary school, had IEPs in effect at the time they left school and were enrolled in higher education, or in some other postsecondary education or training program; or competitively employed or in some other employment) divided by the (# of respondent youth who are no longer in secondary school and had IEPs in effect at the time they left school)] times 100.

Instructions
Sampling of youth who had IEPs and are no longer in secondary school is allowed. When sampling is used, submit a description of the sampling methodology outlining how the design will yield valid and reliable estimates of the target population. (See General Instructions on page 2 for additional instructions on sampling.)

Collect data by September 2020 on students who left school during 2018-2019, timing the data collection so that at least one year has passed since the students left school. Include students who dropped out during 2018-2019 or who were expected to return but did not return for the current school year. This includes all youth who had an IEP in effect at the time they left school, including those who graduated with a regular diploma or some other credential, dropped out, or aged out.

I. Definitions
Enrolled in higher education as used in measures A, B, and C means youth have been enrolled on a full- or part-time basis in a community college (two-year program) or college/university (four or more year program) for at least one complete term, at any time in the year since leaving high school.

Competitive employment as used in measures B and C: States have two options to report data under “competitive employment” in the FFY 2019 SPP/APR, due February 2021:

Option 1: Use the same definition as used to report in the FFY 2015 SPP/APR, i.e., competitive employment means that youth have worked for pay at or above the minimum wage in a setting with others who are nondisabled for a period of 20 hours a week for at least 90 days at any time in the year since leaving high school. This includes military employment.

Option 2: States report in alignment with the term “competitive integrated employment” and its definition, in section 7(5) of the Rehabilitation Act, as amended by the Workforce Innovation and Opportunity Act (WIOA), and 34 CFR §361.5(c)(9). For the purpose of defining the rate of compensation for students working on a “part-time basis” under this category, OSEP maintains the standard of 20 hours a week for at least 90 days at any time in the year since leaving high school. This definition applies to military employment.

Enrolled in other postsecondary education or training as used in measure C, means youth have been enrolled on a full- or part-time basis for at least 1 complete term at any time in the year since leaving high school in an education or training program (e.g., Job Corps, adult education, workforce development program, vocational technical school which is less than a two-year program).

Some other employment as used in measure C means youth have worked for pay or been self-employed for a period of at least 90 days at any time in the year since leaving high school. This includes working in a family business (e.g., farm, store, fishing, ranching, catering services, etc.).
II. Data Reporting
Provide the actual numbers for each of the following mutually exclusive categories. The actual number of “leavers” who are:

1. Enrolled in higher education within one year of leaving high school;
2. Competitively employed within one year of leaving high school (but not enrolled in higher education);
3. Enrolled in some other postsecondary education or training program within one year of leaving high school (but not enrolled in higher education or competitively employed);
4. In some other employment within one year of leaving high school (but not enrolled in higher education, some other postsecondary education or training program, or competitively employed).

“Leavers” should only be counted in one of the above categories, and the categories are organized hierarchically. So, for example, “leavers” who are enrolled in full- or part-time higher education within one year of leaving high school should only be reported in category 1, even if they also happen to be employed. Likewise, “leavers” who are not enrolled in either part- or full-time higher education, but who are competitively employed, should only be reported under category 2, even if they happen to be enrolled in some other postsecondary education or training program.
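As an illustration of this hierarchy, the following short sketch (in Python; the function name and flags are illustrative only and are not part of the State's reporting system) shows how a single leaver's survey responses could be mapped to exactly one of the four mutually exclusive categories:

    def assign_outcome_category(enrolled_higher_ed, competitively_employed,
                                other_postsecondary, other_employment):
        # Categories are checked in the hierarchical order described above,
        # so each leaver is reported in exactly one (the highest applicable) category.
        if enrolled_higher_ed:
            return 1  # enrolled in higher education (even if also employed)
        if competitively_employed:
            return 2  # competitively employed (but not enrolled in higher education)
        if other_postsecondary:
            return 3  # some other postsecondary education or training program
        if other_employment:
            return 4  # some other employment
        return None   # engaged in none of the four outcomes

    # Example: a leaver who is competitively employed and also enrolled in some
    # other training program is reported only in category 2.
    assert assign_outcome_category(False, True, True, False) == 2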


III. Reporting on the Measures/Indicators
Targets must be established for measures A, B, and C.

Measure A: For purposes of reporting on the measures/indicators, please note that any youth enrolled in an institution of higher education (that meets any definition of this term in the Higher Education Act (HEA)) within one year of leaving high school must be reported under measure A. This could include youth who also happen to be competitively employed, or in some other training program; however, the key outcome we are interested in here is enrollment in higher education.

Measure B: All youth reported under measure A should also be reported under measure B, in addition to all youth that obtain competitive employment within one year of leaving high school.

Measure C: All youth reported under measures A and B should also be reported under measure C, in addition to youth that are enrolled in some other postsecondary education or training program, or in some other employment.

Include the State’s analysis of the extent to which the response data are representative of the demographics of youth who are no longer in secondary school and had IEPs in effect at the time they left school. States should consider categories such as race and ethnicity, disability category, and geographic location in the State.

If the analysis shows that the response data are not representative of the demographics of youth who are no longer in secondary school and had IEPs in effect at the time they left school, describe the strategies that the State will use to ensure that in the future the response data are representative of those demographics. In identifying such strategies, the State should consider factors such as how the State collected the data.

14 - Indicator Data
Historical Data

FFY 2014 2015 2016 2017 2018

Measure A (Baseline FFY 2013: 15.11%)
Target >= 15.11% 15.61% 16.00% 17.00% 18.00%
Data 25.55% 22.82% 26.21% 30.87% 24.44%

Measure B (Baseline FFY 2013: 43.20%)
Target >= 43.20% 44.00% 45.00% 46.00% 47.00%
Data 53.64% 56.85% 57.36% 61.04% 54.63%

Measure C (Baseline FFY 2013: 50.24%)
Target >= 54.00% 57.00% 60.00% 62.00% 64.00%
Data 58.10% 69.54% 84.39% 76.44% 69.87%

FFY 2019 Targets

Target A >= 25.00%

Target B >= 50.00%

Target C >= 75.00%

Targets: Description of Stakeholder Input
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors, and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development.

In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutions of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization, were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators.

During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, the Early Childhood Technical Assistance Center (ECTA), and the National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices, as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholder group, developed in the Spring, included representatives from student and parent advocacy groups throughout the state. This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.

FFY 2019 SPP/APR Data

Number of respondent youth who are no longer in secondary school and had IEPs in effect at the time they left school 2,197

1. Number of respondent youth who enrolled in higher education within one year of leaving high school 234

2. Number of respondent youth who were competitively employed within one year of leaving high school 313

3. Number of respondent youth enrolled in some other postsecondary education or training program within one year of leaving high school (but not enrolled in higher education or competitively employed) 1,119

4. Number of respondent youth who are in some other employment within one year of leaving high school (but not enrolled in higher education, some other postsecondary education or training program, or competitively employed). 0

Measure / Number of respondent youth / Number of respondent youth who are no longer in secondary school and had IEPs in effect at the time they left school / FFY 2018 Data / FFY 2019 Target / FFY 2019 Data / Status / Slippage

A. Enrolled in higher education (1)

234 2,197 24.44% 25.00% 10.65% Did Not Meet Target Slippage

B. Enrolled in higher education or competitively employed within one year of leaving high school (1 +2)

547 2,197 54.63% 50.00% 24.90% Did Not Meet Target Slippage

C. Enrolled in higher education, or in some other postsecondary education or training program; or competitively employed or in some other employment (1+2+3+4)

1,666 2,197 69.87% 75.00% 75.83% Met Target No Slippage
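For clarity, the FFY 2019 data values above follow directly from the measurement formulas applied to the reported counts: Measure A = 234 divided by 2,197, times 100, equals 10.65%; Measure B = (234 + 313) divided by 2,197, times 100, equals 24.90%; and Measure C = (234 + 313 + 1,119 + 0) divided by 2,197, times 100, equals 75.83%.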

Part Reasons for slippage, if applicable

A

According to the Education Advisory Board (EAB), a company that focuses on enrollment management, student success, and institutional operations and strategy in higher education, in its research survey How going remote will impact your graduate enrollment funnel (8/27/20): while most postsecondary learners surveyed were open to online instruction, 26.9% indicated that they did not want to pursue a degree online, with 13.5% undecided. EAB advised institutions shifting to online instruction because of the pandemic to be prepared to manage increased turbulence in enrollment.

The 13.79 percentage point drop in enrollment in higher education for students with disabilities is in line with what was being reported by EAB. The n-size for South Carolina's Indicator 14 is a relatively small sample, but the drop is believed to reflect the impact of the COVID-19 pandemic.

B According to the South Carolina Department of Employment and Workforce (DEW), the seasonally adjusted monthly survey of households referencing the week of March 12, 2020 estimated that the number of unemployed South Carolinians increased to 61,898 people, a significant increase of 3,267 people since February 2020. The state's seasonally adjusted unemployment rate increased over the month to 2.6 percent from February's rate of 2.5 percent. Nationally, the unemployment rate moved higher from 3.5 percent in February to 4.4 percent in March, reflecting some of the early effects of the Coronavirus (COVID-19) on the household survey data.

Data from the March 2020 DEW news release stated the state's estimated labor force (people working plus unemployed people looking for work) increased to 2,396,550. That is an increase of 4,695 people since February 2020 and an increase of 25,307 individuals over March 2019.

The statewide unemployment rate for all workers was affected greatly by the COVID-19 pandemic. When the drop in higher education enrollment is combined with the decreased employment opportunities available because of the pandemic, an increased rate of unemployment for first-time workers with disabilities is not surprising; it was expected.

Please select the reporting option your State is using: Option 1: Use the same definition as used to report in the FFY 2015 SPP/APR, i.e., competitive employment means that youth have worked for pay at or above the minimum wage in a setting with others who are nondisabled for a period of 20 hours a week for at least 90 days at any time in the year since leaving high school. This includes military employment.

Sampling Question Yes / No

Was sampling used? NO

Describe the sampling methodology outlining how the design will yield valid and reliable estimates.

Survey Question Yes / No

Was a survey used? YES

If yes, is it a new or revised survey? NO

Include the State’s analyses of the extent to which the response data are representative of the demographics of youth who are no longer in secondary school and had IEPs in effect at the time they left school.
To determine whether the data represent the demographics of the State, the OSES calculated response rates along three demographic variables: gender; race or ethnicity (using the federal reporting categories); and primary disability category. The State then compared the demographic percentages of the respondents to those of the population of youth who left school the previous year and reviewed the difference between the two for each variable. The threshold used by the State to determine representativeness was 10 percentage points: if the difference between the respondents and the population fell within plus or minus 10 percentage points, the OSES considered the values representative (a negative value indicates under-representation; a positive value indicates over-representation). No differences exceeded plus or minus 10.00 percentage points. As a result, the State finds these data to be representative of the demographics of the State. A copy of this analysis may be found at https://ed.sc.gov/districts-schools/special-education-services/data-and-technology-d-t/#sppAndDet.
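As an illustration of this threshold check, the following short sketch (in Python) applies the plus-or-minus 10 percentage point rule; the group labels and counts are hypothetical and are not the State's actual figures:

    def is_representative(respondents, population, threshold=10.0):
        # A group fails the check if its share of respondents differs from its
        # share of the population by more than the threshold (percentage points).
        total_r = sum(respondents.values())
        total_p = sum(population.values())
        for group, pop_count in population.items():
            share_r = 100.0 * respondents.get(group, 0) / total_r
            share_p = 100.0 * pop_count / total_p
            # negative difference = under-representation; positive = over-representation
            if abs(share_r - share_p) > threshold:
                return False
        return True

    # Hypothetical example comparing respondents to all leavers by disability category.
    respondents = {"SLD": 900, "ID": 400, "Autism": 250, "Other": 647}
    leavers = {"SLD": 1500, "ID": 650, "Autism": 380, "Other": 1000}
    print(is_representative(respondents, leavers))  # True for these hypothetical counts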

Question Yes / No

Are the response data representative of the demographics of youth who are no longer in school and had IEPs in effect at the time they left school?

YES

Provide additional information about this indicator (optional)
The state has contracted with a contact management center software-based company since FFY 2013. This company conducts a census of school exiters each year to follow up on post-secondary experiences. Exiters include students who have aged out, graduated with a regular high school diploma, and non-returners who received a state certificate or are dropouts at or above age 17. The company surveys exiters on their postsecondary experiences one year after they leave school. Exiters are identified through the state's online special education student information system, SC Enrich IEP. These students have been verified as having exited with the 618 Table 4 submissions. In order to ensure valid data are provided for exiting students, the Office of Special Education Services follows up with each LEA to ensure up-to-date contact information for students when they graduate, receive a state certificate, drop out of school, or die. For the post-secondary survey, the state provides the company with the population of exiters from the previous school year.

Four options are employed by the company to collect student data, listed in the order in which attempts were made to reach each leaver:
1. Emails with web links to complete the survey and the ability to respond to the email itself (up to 4 attempts);
2. Outbound calls using live agents and call automation, leaving personalized voicemail with callback numbers (up to 8 attempts);
3. Mailers (letters) sent to residential addresses with callback numbers and website information; and
4. Facebook Messaging when a Facebook profile match is found.

In order to appropriately identify students for the particular categories of this indicator, OSES staff conduct additional analyses to ensure that students are correctly counted once in one of four conditions: 1. enrolled in higher education; 2. competitively employed; 3. enrolled in some other postsecondary education or training program; and 4. employed in some other employment. Higher education, as used in measures A, B, and C, means youth who have been enrolled on a full- or part-time basis in a community or technical college (2-year program) or college/university (4 or more year program) for at least one complete term, at any time in the year since leaving high school. Competitively employed, as used in measures B and C, means youth who have worked for pay at or above the minimum wage in a setting with others who are nondisabled for a period of twenty hours per week for at least 90 total days at any time in the year since leaving high school, which includes military employment. Other postsecondary education or training, as used in measure C, means youth who have been enrolled on a full- or part-time basis for at least one complete term at any time in the year since leaving high school in an education or training program, which could include Job Corps, adult education, workforce development programs, on-the-job training, vocational educational programs which are less than two years, and certificate programs (less than a two-year program). Other employment, as used in measure C, means youth who have worked for pay or been self-employed for a period of at least 90 total days at any time in the year since leaving high school, including working in a family business.

Exiters are defined as the population of students who exited school during the school year prior to the reporting year of the Annual Performance Report (APR) for reasons that include: graduating with a South Carolina high school diploma; receiving a South Carolina state certificate; reaching maximum age; dropping out of school at age 17 and above; and not returning to school the subsequent year. South Carolina notes that while students with disabilities who have died are counted in state reporting of exiters, South Carolina does not include them in the definition of “exiters” for Part B SPP Indicator 14. Consequently, their families are not surveyed or interviewed, and these students are not included in the survey process.

The OSES implemented a new pilot strategy for data collection for the FFY 2018 survey as a means of 1) addressing overrepresentation and underrepresentation and 2) increasing the state and districts' response rates. This continued for the FFY 2019 survey as well. Previously, the OSES used only a state-contracted vendor to solicit responses to the surveys (described above). Based on a recommendation from stakeholders, the OSES offered districts an opportunity to conduct the survey themselves using their own data and contact information and to enter the survey data on a form provided by the OSES. The districts used their data to make additional, more personalized contacts with former students in an attempt to improve representativeness and the response rate. Information gathered by the districts was sent to the OSES for analysis. Incentives were provided for districts in the pilot that were able to solicit a minimum number of responses. This pilot appears to have assisted in improving representativeness, but not in increasing response numbers overall. The OSES will continue analyzing response rates and working to improve both of these areas. It is difficult to ascertain what the long-term effects of the pandemic will be for young adults with disabilities in higher education and the workforce.

As discussed under the reasons for slippage, the COVID-19 pandemic affected the ability of students with disabilities to continue their education and employment. The state is directing resources and support in an attempt to mitigate the impact on this population, including working with our national TA centers as well as our in-state partners, the Transition Alliance of South Carolina and Able SC.

14 - Prior FFY Required Actions: None

14 - OSEP Response

14 - Required Actions


Indicator 15: Resolution Sessions
Instructions and Measurement
Monitoring Priority: Effective General Supervision Part B / General Supervision
Results Indicator: Percent of hearing requests that went to resolution sessions that were resolved through resolution session settlement agreements. (20 U.S.C. 1416(a)(3)(B))
Data Source
Data collected under section 618 of the IDEA (IDEA Part B Dispute Resolution Survey in the EDFacts Metadata and Process System (EMAPS)).
Measurement
Percent = (3.1(a) divided by 3.1) times 100.
Instructions
Sampling is not allowed.
Describe the results of the calculations and compare the results to the target.
States are not required to establish baseline or targets if the number of resolution sessions is less than 10. In a reporting period when the number of resolution sessions reaches 10 or greater, develop baseline, targets and improvement activities, and report on them in the corresponding SPP/APR.
States may express their targets in a range (e.g., 75-85%).
If the data reported in this indicator are not the same as the State’s data under IDEA section 618, explain.
States are not required to report data at the LEA level.

15 - Indicator Data
Select yes to use target ranges: Target Range not used

Prepopulated Data

Source: SY 2019-20 EMAPS IDEA Part B Dispute Resolution Survey; Section C: Due Process Complaints. Date: 11/04/2020. Description: 3.1 Number of resolution sessions. Data: 23

Source: SY 2019-20 EMAPS IDEA Part B Dispute Resolution Survey; Section C: Due Process Complaints. Date: 11/04/2020. Description: 3.1(a) Number of resolution sessions resolved through settlement agreements. Data: 10

Select yes if the data reported in this indicator are not the same as the State’s data reported under section 618 of the IDEA: NO

Targets: Description of Stakeholder Input
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors, and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development.

In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutions of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization, were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators.

During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, the Early Childhood Technical Assistance Center (ECTA), and the National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices, as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholder group, developed in the Spring, included representatives from student and parent advocacy groups throughout the state. This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.


Historical Data

Baseline Year Baseline Data

2016 37.50%

FFY 2014 2015 2016 2017 2018

Target >= 60.00% 60.00% 37.50% 40.00% 42.50%

Data 42.86% 77.78% 37.50% 60.00% 63.16%

Targets

FFY 2019

Target >= 42.50%

FFY 2019 SPP/APR Data

3.1(a) Number of resolution sessions resolved through settlement agreements / 3.1 Number of resolution sessions / FFY 2018 Data / FFY 2019 Target / FFY 2019 Data / Status / Slippage

10 23 63.16% 42.50% 43.48% Met Target No Slippage
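For clarity, the FFY 2019 data value follows directly from the measurement formula applied to the prepopulated counts: 10 divided by 23, times 100, equals 43.48%, which meets the FFY 2019 target of >= 42.50%.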

Provide additional information about this indicator (optional)
The COVID-19 pandemic does not appear to have affected the number of resolution sessions resolved through settlement agreements, as the majority of the agreements occurred prior to the onset of the pandemic in March.

15 - Prior FFY Required Actions: None

15 - OSEP Response

15 - Required Actions


Indicator 16: Mediation
Instructions and Measurement
Monitoring Priority: Effective General Supervision Part B / General Supervision
Results indicator: Percent of mediations held that resulted in mediation agreements. (20 U.S.C. 1416(a)(3)(B))
Data Source
Data collected under section 618 of the IDEA (IDEA Part B Dispute Resolution Survey in the EDFacts Metadata and Process System (EMAPS)).
Measurement
Percent = ((2.1(a)(i) + 2.1(b)(i)) divided by 2.1) times 100.
Instructions
Sampling is not allowed.
Describe the results of the calculations and compare the results to the target.
States are not required to establish baseline or targets if the number of mediations is less than 10. In a reporting period when the number of mediations reaches 10 or greater, develop baseline, targets and improvement activities, and report on them in the corresponding SPP/APR.
States may express their targets in a range (e.g., 75-85%).
If the data reported in this indicator are not the same as the State’s data under IDEA section 618, explain.
States are not required to report data at the LEA level.

16 - Indicator Data
Select yes to use target ranges: Target Range is used

Prepopulated Data

Source: SY 2019-20 EMAPS IDEA Part B Dispute Resolution Survey; Section B: Mediation Requests. Date: 11/04/2020. Description: 2.1 Mediations held. Data: 5

Source: SY 2019-20 EMAPS IDEA Part B Dispute Resolution Survey; Section B: Mediation Requests. Date: 11/04/2020. Description: 2.1.a.i Mediation agreements related to due process complaints. Data: 0

Source: SY 2019-20 EMAPS IDEA Part B Dispute Resolution Survey; Section B: Mediation Requests. Date: 11/04/2020. Description: 2.1.b.i Mediation agreements not related to due process complaints. Data: 4

Select yes if the data reported in this indicator are not the same as the State’s data reported under section 618 of the IDEA: NO

Targets: Description of Stakeholder Input
The OSES relies heavily on its partnership with the South Carolina Advisory Council on the Education of Students with Disabilities (ACESD). This partnership is designed to authentically engage this critical group of stakeholders in collaborative activities that are directly aligned with educational results and functional outcomes for children with disabilities in South Carolina. Updates are provided at each quarterly ACESD executive committee and full council meeting regarding review, revision, progress, and outcomes. The Council participated in the extension of targets for SPP/APR indicators. Council members are family members and persons with disabilities, educators, advocates, departmental representatives, university professors, and community members. A majority of members are individuals with disabilities and parents and grandparents of children with disabilities. The four standing committees reflect the focus areas of the OSES - Preschool, Safe Schools, Transition and Self-Advocacy, and Professional Development.

In the Spring and Fall of 2018 at the Special Education Leadership Meeting, over 300 stakeholders received information and updates about the SPP/APR. The OSES staff discussed SPP/APR progress and the alignment with the vision. The leadership meeting provided opportunities designed to solicit recommendations from stakeholders relating to each SPP Indicator. The stakeholders in attendance represented administrators from every LEA in the state. These administrators included local special education directors, coordinators, school psychologists, speech-language pathologists, and other LEA-level administrators. In addition, faculty from numerous state institutions of higher education attended, along with representatives from many of the state's partner nonprofit organizations. Finally, advocates, such as mediators, due process hearing officers, and representatives from the state's parent training organization, were present. Breakout sessions were designed to showcase LEAs using evidence-based practices to improve outcomes in one or more of the critical SPP indicators.

During the Spring of 2020, the OSES convened three additional stakeholder groups specific to supporting district and school staff, students with disabilities, and their parents during the pandemic. The first was an advisory group comprised of key LEA special education directors, technical assistance partners, and student/parent advocacy representatives. This group, the COVID Stakeholder Group, met weekly (virtually) to identify needs, review guidance from national sources, and develop state-specific guidance and support. The OSES also convened weekly calls with all LEA special education directors across the state to share guidance developed by key national technical assistance centers (NCSI, the Early Childhood Technical Assistance Center (ECTA), and the National Association of State Directors of Special Education (NASDSE)) as well as the guidance developed by the COVID Stakeholder Group. The OSES solicited information from LEAs during these calls regarding needs and current practices, as well as through other measures such as surveys and monthly virtual meetings with the South Carolina Association of School Administrators. The third stakeholder group, developed in the Spring, included representatives from student and parent advocacy groups throughout the state. This group initially met weekly to identify and address concerns of parents and students with disabilities that arose during the pandemic.


Historical Data

Baseline Year Baseline Data

2013 100.00%

FFY 2014 2015 2016 2017 2018

Target >= 75.00% 75.00% 75.00% 64.00% - 100.00%

Data 0.00% 50.00% 50.00% 0.00% 100.00%

Targets

FFY 2019 (low) 2019 (high)

Target 64.00% 100.00%

FFY 2019 SPP/APR Data

2.1.a.i Mediation agreements related to due process complaints / 2.1.b.i Mediation agreements not related to due process complaints / 2.1 Number of mediations held / FFY 2018 Data / FFY 2019 Target (low) / FFY 2019 Target (high) / FFY 2019 Data / Status / Slippage

0 4 5 100.00% 64.00% 100.00% 80.00% Met Target No Slippage
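For clarity, the FFY 2019 data value follows directly from the measurement formula applied to the prepopulated counts: (0 + 4) divided by 5, times 100, equals 80.00%, which falls within the FFY 2019 target range of 64.00% to 100.00%.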

Provide additional information about this indicator (optional)
The COVID-19 pandemic did not appear to have a significant impact on the number of mediation agreements (related or not related to due process complaints).

16 - Prior FFY Required Actions: None

16 - OSEP ResponseThe State reported fewer than ten mediations held in FFY 2019. The State is not required to meet its targets until any fiscal year in which ten or more mediations were held.

16 - Required Actions


Indicator 17: State Systemic Improvement Plan


Certification
Instructions
Choose the appropriate selection and complete all the certification information fields. Then click the "Submit" button to submit your APR.
Certify
I certify that I am the Chief State School Officer of the State, or his or her designee, and that the State's submission of its IDEA Part B State Performance Plan/Annual Performance Report is accurate.
Select the certifier’s role: Designated by the Chief State School Officer to certify
Name and title of the individual certifying the accuracy of the State's submission of its IDEA Part B State Performance Plan/Annual Performance Report.
Name: Rebecca C Davis
Title: Director, Office of Special Education Services, South Carolina Department of Education
Email: [email protected]
Phone: 803-734-8028
Submitted on: 04/28/21 4:22:46 PM


ED attachments
