IDEA Part B State Performance Plan (SPP) and Annual Performance Report (APR)

OMB: 1820-0624

1820-0624 60-Day Comments and Discussion 3/12/14

General Comments

Comment: None.

Discussion: As a part of our review of documents in response to comments, we determined that the instructions required revisions to remove references to hard copy forms and hard copy submissions. Beginning with the FFY 2013 State Performance Plan/Annual Performance Report (SPP/APR), due in February 2015, States will submit using the online GRADS 360 system; therefore, there are no paper forms, and States do not have the option of submitting a hard copy of the SPP/APR for the six-year cycle covering FFY 2013 through FFY 2018.

Changes: We have removed the reference to the SPP/APR template in the SPP/APR materials section. We have removed the option for a State to submit a hard copy of the SPP/APR through the mail.

Comment: None.

Discussion: As a part of our review of documents in response to comments, we determined that we had not removed all references to the previously required Improvement Activities. We have revised the measurement table and instructions accordingly.

Changes: We have removed references to “improvement activities” from Indicator 8 in the measurement table, from the Paperwork Burden Statement at the end of the measurement table, and from the end of the instructions document.

Comment: A few commenters asked if the State educational agency (SEA) must re-establish baseline for Indicators 1-16 and set new targets.

Discussion: There is no requirement that an SEA re-establish baseline data for FFY 2013 for each of the results indicators in its FFY 2013 SPP/APR submission, due in February 2015, but the State must make clear, for each results indicator, the FFY data that it previously identified as its baseline data for that indicator. With the exception of Indicators 2 and 17, the data source and measurement for each IDEA Part B results indicator are consistent with the previously approved SPP/APR. Each State will also need to establish measurable and rigorous targets for each results indicator for each reporting year through the end of the six-year cycle (FFY 2013, due in February 2015, through FFY 2018, due in February 2020), and must describe how it included the participation of stakeholders, such as parents of children with disabilities, local educational agencies (LEAs), the State Advisory Panel, and others, when establishing those targets. Targets for these results indicators (Indicators 1, 2, 3, 4A, 5, 6, 7, 8, 14, 15, and 16) may remain flat over a period of reporting years, but the State’s FFY 2018 target for each results indicator must demonstrate improvement over the State’s established baseline data for that indicator, which in most cases will have been established in the previous SPP that covered FFYs 2005-2012.

Changes: None.

Comment: One commenter asked if an SEA must submit a new sampling plan for indicators under which a sampling plan was previously approved by the Department.

Discussion: The Department must ensure that each State using a sampling methodology to collect data for an indicator has a sampling plan that yields valid, reliable, and representative data. If a State will use its currently approved sampling plan and change only the years for which it is used, the State can provide an assurance to this effect. If a State proposes to use a sampling plan that was not previously used and approved, or will revise its current sampling plan, the State must submit the sampling plan for approval.

Changes: None.

Comment: One commenter requested clarification on when a State must report on slippage.

Discussion: As stated in the SPP/APR instructions, a State must report on slippage only for those indicators where the data do not demonstrate progress from the previous year’s data and the State did not meet its target. There is currently no requirement for a State to report on slippage for those indicators where the data do not demonstrate progress from the previous year’s data but the State has nonetheless met its target, although a State may choose to do so. Similarly, there is currently no requirement to report on progress, although a State may choose to do so.
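The reporting rule above reduces to a simple condition. A minimal sketch, assuming nothing beyond the rule as stated (the function and variable names are ours, for illustration only):

    # Hypothetical sketch of the slippage-reporting condition.
    def must_report_slippage(progressed: bool, met_target: bool) -> bool:
        # A State must explain slippage only when its data show no progress
        # over the previous year AND the State did not meet its target.
        return (not progressed) and (not met_target)

    print(must_report_slippage(progressed=False, met_target=False))  # True: report required
    print(must_report_slippage(progressed=False, met_target=True))   # False: reporting optional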

Changes: None.

Comment: Several commenters requested that the Office of Special Education Programs (OSEP) provide more detail on how it calculated the burden hours associated with this collection.

Discussion: We believe that how the Department calculated the burden hours associated with this collection is adequately explained in the response to Item 12 of the Supporting Statement.

Changes: None.

Comment: One commenter requested that the SEA be given the flexibility to publicly report school-level data instead of LEA-level data, when determined more appropriate by the SEA.

Discussion: IDEA section 616(b)(2)(C)(ii)(I) requires the State to annually report to the public on the performance of each LEA located in the State on the targets in the State’s SPP. Therefore, the State must annually report at the LEA level. However, the State may additionally choose to report at the school level when it determines that those data would also be valuable to the public. Under IDEA section 616(b)(2)(C)(iii), the State must ensure that it does not report any performance data that would result in the disclosure of personally identifiable information about individual children, or report performance data when the available data are insufficient to yield statistically reliable information.

Changes: None.

Indicator 1: Graduation

Comment: One commenter requested that Indicator 1 be eliminated because the data are already reported under the Consolidated State Performance Report (CSPR) submitted annually to the Department’s Office of Elementary and Secondary Education (OESE).

Discussion: Section 612(a)(15)(A)(iii) of IDEA requires each State to establish goals for the performance of children with disabilities in the State that address graduation rates. Therefore, we decline to delete Indicator 1 as the commenter requested, even though similar data are submitted to the Department through the CSPR. Additionally, States are required under Indicator 1 to further analyze the data submitted through the CSPR.

Changes: None.

Comment: Several SEAs, a membership organization, and an individual expressed concern with Indicator 1. All commenters noted that many students with disabilities do not graduate from high school in four years, as they are entitled to special education and related services until they graduate with a regular high school diploma or reach the age at which eligibility ceases under the age requirements within the State, whichever comes first. Therefore, the data reported in the CSPR may not accurately reflect outcomes for students with disabilities. Some commenters requested that States be given flexibility to determine the targets for Indicator 1 instead of using the targets reported in the CSPR. Additionally, the commenters believe that targets should be set with stakeholder involvement and should account for “student-centered decision making.”

Discussion: Indicator 1 was previously revised at the urging of stakeholders. There was concern that States were reporting one graduation percentage in the CSPR and another in the SPP/APR. The “double reporting” was perceived as being burdensome and confusing to the public. Targets are established through a State’s accountability workbook and are approved by OESE. We believe that it is important for the SPP/APR requirements to align with the Elementary and Secondary Education Act (ESEA) requirements. However, a State has the flexibility to report additional information in the APR on those students with disabilities who do not graduate with their cohort because they continue to receive special education and related services until they reach the age at which eligibility ceases under the age requirements within the State.

Changes: None.

Indicator 2: Dropout

Comment: Many commenters requested that States be given flexibility when reporting on dropout percentages for Indicator 2. They stated that the indicator, as written, is not a true dropout rate and unfairly inflates the dropout statistic for students with disabilities.

Discussion: The SPP/APR must continue to collect dropout data, as required by IDEA section 612(a)(15)(A)(iii). We agree with the commenters that States should be given flexibility when reporting on dropout percentages for Indicator 2. Previously, States were required to submit the same data that were reported in a State’s CSPR. While dropout data are no longer collected in the CSPR, States can still calculate a dropout rate based on the dropout data previously collected through the CSPR (i.e., calculated using the annual event school dropout rate for students leaving a school in a single year, determined in accordance with the National Center for Education Statistics’ Common Core of Data). Beginning with the FFY 2011 APR, the Department changed the data source for Indicator 2 from data used for reporting to the Department under Title I to data used for reporting to the Department under section 618. However, in response to concerns raised by States regarding the use of these two calculation methodologies, OSEP gave States the option to report Indicator 2 data for their FFY 2011 and FFY 2012 APRs by using the data source and measurement as written (based on section 618 data) or by using the data source and measurement that the State used for its FFY 2010 APR that was submitted on February 1, 2012 (based on Title I data). (See OSEP Memorandum 13-6 on Submission of the FFY 2011 APR, which was due on February 15, 2013, and OSEP Memorandum 14-2 on Submission of the FFY 2012 APR, which was due on February 3, 2014.)

We agree that the indicator as written, which is based only on the data used for reporting to the Department under section 618, is not a true dropout rate for students with disabilities because it is based on the number of students with disabilities, ages 14 through 21, who left high school in the reporting year instead of the number of students with disabilities who were enrolled in school for the reporting year, as previously reported in the CSPR. We also recognize that the indicator as written would yield percentages that are significantly different from the dropout rates previously reported in the CSPR. However, the calculation based on section 618 data has been used by OSEP in its annual report to Congress for more than ten years and has been publicly reported. Therefore, trend data exist and States can set meaningful targets.

In order to continue to provide flexibility to States when reporting under Indicator 2, we have revised Indicator 2 of the Part B Measurement Table to allow each State to choose between two calculation methodologies. A State may either report the percentage using the number of youth with IEPs (ages 14-21) who exited special education due to dropping out in the numerator and the number of all youth with IEPs who left high school (ages 14-21) in the denominator, i.e., OPTION 1, or a State may choose to report using the same data source and measurement that the State used for its FFY 2010 APR that was submitted on February 1, 2012, i.e., OPTION 2.
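To illustrate how the OPTION 1 percentage would be computed, here is a minimal sketch; the counts are invented and the function name is ours, not OSEP’s (States report actual section 618 exiting data):

    # Hypothetical illustration of the Indicator 2 OPTION 1 calculation.
    def indicator2_option1(dropouts: int, all_exiters: int) -> float:
        # Numerator: youth with IEPs (ages 14-21) who exited special education
        # due to dropping out. Denominator: all youth with IEPs (ages 14-21)
        # who left high school in the reporting year.
        return 100.0 * dropouts / all_exiters

    # Example with invented counts: 400 of 5,000 exiters dropped out.
    print(f"{indicator2_option1(400, 5000):.1f}%")  # 8.0%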

Changes: The measurement table has been revised to allow for flexibility in reporting data for Indicator 2.

Indicator 3: Assessment

Comment: One commenter requested that Indicator 3 be eliminated because the data are reported in a State’s CSPR. Another commenter requested that the indicator be eliminated because the information is no longer required under ESEA flexibility.

Discussion: We will not remove Indicator 3 as the commenters requested. This indicator requires reporting data for both Adequate Yearly Progress (AYP) (Indicator 3A) and the assessment data reported for the CSPR (Indicators 3B and 3C). Determining AYP, the focus of Indicator 3A, remains a current requirement under ESEA. However, in September 2011, the Secretary invited each interested SEA to request flexibility from certain ESEA requirements pursuant to the authority granted to the Secretary in the ESEA that allows the Secretary to waive, with certain exceptions, any statutory or regulatory requirement of the ESEA for an SEA that receives funds under a program authorized by the ESEA and requests a waiver. Under this flexibility, SEAs could apply for a waiver of the requirements to determine AYP for LEAs. Many States have been granted such a waiver. Therefore, Indicator 3A provides more details regarding the data source, measurement, and instructions for: 1) States that either did not apply for or did not receive ESEA flexibility, or applied for and received flexibility but did not apply for a waiver of determining AYP; and 2) States with an approved ESEA flexibility request that includes a waiver of determining AYP. Regarding Indicators 3B and 3C, though the assessment data used in those indicators are reported in the CSPR, the analysis and comparison of these data to State-determined targets for the subgroup of children with disabilities are not required by other Federal reporting requirements.

Changes: None.

Comment: A few commenters requested that we revise Indicator 3A to better align with a State’s individual ESEA flexibility request. As an example, one SEA requested that it be allowed to provide data on its growth model.

Discussion: The measurement for Indicator 3A reflects the percentage of districts with a disability subgroup that meets the State’s minimum “n” size and that meet either the State’s AYP or Annual Measurable Objectives (AMO) targets for the disability subgroup. The data are the same data that the State uses for AYP or AMO reporting under ESEA. Therefore, a State could use data from its growth model as its data source if those data are the data used for AYP or AMO reporting under ESEA.
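As a purely illustrative sketch of this measurement, assume a State-established minimum “n” size of 30 and the invented district records below (all figures hypothetical):

    # Hypothetical sketch of the Indicator 3A measurement.
    MIN_N = 30  # assumed State-established minimum "n" size

    # (size of the district's disability subgroup, met the AYP/AMO target?)
    districts = [(120, True), (45, False), (12, True), (80, True)]

    # Districts whose disability subgroup is below the minimum "n" size
    # are excluded from the calculation entirely.
    counted = [met for n, met in districts if n >= MIN_N]
    percent = 100.0 * sum(counted) / len(counted)
    print(f"{percent:.1f}%")  # 66.7% of counted districts met the target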

Changes: None.

Indicator 4: Suspension and Expulsion

Comment: Two commenters expressed concern that Indicator 4A might be eliminated.

Discussion: Indicator 4A will not be eliminated.

Changes: None.

Comment: One commenter requested clarification on why Indicator 4A is considered a “results” indicator and Indicator 4B is considered a “compliance” indicator and is used in making annual determinations.

Discussion: The measurement for Indicator 4B is the “Percent of districts that have: (a) a significant discrepancy, by race or ethnicity, in the rate of suspensions and expulsions of greater than 10 days in a school year for children with IEPs; and (b) policies, procedures or practices that contribute to the significant discrepancy and do not comply with requirements relating to the development and implementation of IEPs, the use of positive behavioral interventions and supports, and procedural safeguards.” Indicator 4B is a compliance indicator because it requires a State to report not only the number of districts that have a significant discrepancy, by race or ethnicity, in its disciplinary rate for children with disabilities, but also the number of districts that have policies, procedures, or practices that contribute to the significant discrepancy and do not comply with specified IDEA Part B requirements. Indicator 4A, however, is a results indicator that measures only the “Percent of districts that have a significant discrepancy in the rate of suspensions and expulsions of greater than 10 days in a school year for children with IEPs,” regardless of whether a district has policies, procedures, or practices that do not comply with IDEA Part B requirements.

Changes: None.

Comment: One commenter recommended that OSEP consider Indicator 4A when making annual determinations.

Discussion: OSEP establishes each year how it will make annual determinations and will soon seek public input on using results indicators in making determinations. We will consider the commenter’s suggestion as a part of that process.

Changes: None.

Indicators 5 and 6: Least Restrictive Environment (LRE) and Preschool LRE

Comment: A few commenters requested that the indicators be removed because the data are already submitted through EDFacts. Several commenters representing private special education schools expressed concern that reporting on educational placements adversely impacts a student’s right to be educated in a non-public setting because States feel pressure to meet the targets established for Indicators 5 and 6.

Discussion: IDEA section 616(a)(3)(A) specifically requires that the Department monitor SEAs, and SEAs monitor LEAs, using quantifiable indicators, and qualitative indicators as necessary, in the priority area of the provision of a free appropriate public education (FAPE) in the LRE. Subsequent to the reauthorization of IDEA in 2004, the Department, with broad stakeholder input, developed Indicator 5: School Age FAPE in the LRE and Indicator 6: Preschool LRE to meet this requirement. We will not remove or revise these indicators. Indicators 5 and 6 reporting considerations should never drive placement decisions. Pursuant to the Part B regulations in 34 CFR §§300.320 through 300.324, a child’s IEP team develops an IEP for that child to ensure that the child is provided FAPE in the LRE. Subsequently, pursuant to 34 CFR §§ 300.116 and 300.327, a group, which must include the parents of the child, determines the educational placement of the child. Educational placement decisions, pursuant to 34 CFR §300.116, must be based on a child’s IEP and be in conformity with the LRE provisions in Part B of the Act and its implementing regulations. Therefore, placement decisions must always be based on the unique needs of the child to ensure the provision of FAPE in the LRE.

Changes: None.

Comment: A few commenters requested guidance on how a State should account for five-year-olds who attend kindergarten rather than preschool. Those commenters also requested that OSEP revise the Indicator 6 language because it is confusing. As an example, the commenters stated that the indicator should read “regular early childhood ‘classroom’” instead of “program.” They also requested that the indicator define or provide examples of inclusive or regular early childhood classrooms. Finally, they stated that the indicator should capture that preschool LRE is about access to typically developing peers and standards-aligned preschool activities.

Discussion: The EDFacts Submission System technical guide for C089 – Children with Disabilities (IDEA) Early Childhood File Specifications (technical guide) provides answers responsive to these comments. The technical guide states that five-year-olds who attend kindergarten are counted as attending a regular early childhood program. We decline to change “program” to “classroom” because the indicator language is consistent with the data collection file specifications. The technical guide also provides a list of settings that would be considered regular early childhood programs. We agree that preschoolers with disabilities should be educated with their typically developing peers to the extent consistent with each preschooler’s IEP, and that these children should have access to the general curriculum. However, we do not believe that Indicator 6 is the appropriate place to collect that information because the indicator captures quantitative information on the settings in which preschoolers with disabilities are educated.

Changes: None.

Indicator 7: Preschool Outcomes

Comment: One commenter requested that this indicator be removed because “the challenges presented by the data collection make reporting burdensome, costly, and the data results questionable.” Another commenter expressed concern that the indicator, as written, is too confusing and should be “condensed or simplified.”

Discussion: Based on significant input from States, the indicator measurement was previously streamlined to reduce burden and confusion. Those revisions reduced the data points from 15 to two summary statements. Further, we do not agree that the “data results are questionable.” Each State has the flexibility to determine the data source and data collection method it will use to yield valid and reliable data for reporting. Additionally, ongoing analysis shows that many States are submitting high-quality data and that the data from other States continue to improve. Therefore, we will not eliminate the indicator as the commenter requested.

Changes: None.

Indicator 8: Parent Involvement

Comment: All commenters agreed that parent input is important and that parent participation is a critical component in improving results for students with disabilities. While one commenter believes that the data are an important improvement tool in the State, many other commenters questioned the utility of the data collected under Indicator 8. Some commenters wondered about representativeness and reliability. Another stated that the data collected for the indicator have not been linked to increased parent involvement in his or her State. A few commenters stated that they do not believe that a survey is the best method for collecting Indicator 8 data. Other commenters suggested that Indicator 8 collect information on a State’s individual strategy for gathering parent input, that this plan be spelled out in the indicator, and that the parent feedback be reported and addressed. One commenter suggested that OSEP encourage States to collect these data electronically. One commenter requested that the indicator be eliminated because it is not required by statute or regulation.

Discussion: We agree with the commenters that parent input and participation are critical to improved outcomes for students with disabilities and do not agree that this indicator should be eliminated. The Act and the Part B regulations encourage parent input and involvement in all aspects of a child’s educational program, including those areas set forth in section 616(a)(3) of the Act as priority areas. In addition, the Secretary recognizes the vital role parents play in the education of their child. Therefore, we believe that it is critical to include an indicator measuring the percent of parents with a child receiving special education services who report that the school facilitated parent involvement as a means of improving services and results for children with disabilities. As we have previously clarified, there is no requirement that a State use a survey to collect data for this indicator; rather, the indicator allows a State to select the data source that it will use to report valid and reliable data for Indicator 8. The instructions provide guidance for those States that choose to use a survey to collect their data for this indicator, but in no way mandate a survey’s use. However, we recognize that the instructions, as written, may be confusing. We will add language to the instructions clarifying that a State is not required to use a survey to collect data for this indicator.

We appreciate the commenters’ suggestions for improving this indicator and believe that the indicator, as written, is structured in such a way to provide a State with the flexibility to report on its individual strategy for gathering and reporting parent input. Although it is not required, we agree with the commenters that it would be beneficial for a State to provide information on how it will address parent feedback. We also support the idea of collecting Indicator 8 data electronically if that is the State-selected data collection method.

Changes: We have revised the instructions for Indicator 8 to read “While a survey is not required for this indicator, a State using a survey must submit a copy of any new or revised survey with its APR.”

Indicators 9 and 10: Disproportionality

Comment: Many commenters suggested that Indicators 9 and 10 are duplicative and requested that OSEP combine the two indicators. Another commenter requested that the indicators be eliminated because they are not required by statute.

Discussion: IDEA section 616(a)(3)(C) specifically requires the Secretary to monitor the States, and SEAs to monitor LEAs, using a quantifiable or a qualitative indicator, as necessary, to adequately measure disproportionate representation of racial and ethnic groups in special education and related services, to the extent the representation is the result of inappropriate identification. Therefore, we decline to eliminate Indicators 9 and 10 as the commenter suggested.

In a previous Information Collection request, OSEP proposed to combine Indicators 9 and 10 to eliminate duplication and reduce reporting burden. In response, we received many comments stating that combining the indicators would actually increase burden because States would have to retool their information collection systems to reformat these data. A few commenters also noted that combining the indicators might unintentionally exclude data from consideration in determining the extent to which an SEA is meeting IDEA requirements. Further, Indicators 9 and 10 are not exactly the same. Indicator 9 collects data on the percent of districts with disproportionate representation of racial and ethnic groups in special education and related services that is the result of inappropriate identification. Indicator 10 disaggregates that information by disability category. In response, Indicators 9 and 10 were maintained as separate indicators. We will continue to maintain both Indicators 9 and 10 because no commenter provided a reason that outweighs the previous concerns.

Changes: None.

Comment: Two commenters requested that the term “mental retardation” be removed from the data source for Indicator 10 because Public Law 111-256 (Rosa’s Law) replaces the term in IDEA and other Federal laws with the term “intellectual disability.”

Discussion: The data used to calculate the measurement for Indicator 10 are collected through EDFacts File C002. That file has replaced the term “mental retardation” with “intellectual disability.” However, the “Instructions for Indicator/Measurement” column of the Indicator Measurement Table for Indicator 10 still included the outdated term. We have replaced it with the term “intellectual disability.” We encourage States to replace the term “mental retardation” with “intellectual disability” if the term is used in the preparation of the SPP/APR.

Changes: We have replaced the term “mental retardation” with the term “intellectual disability” in the “Instructions for Indicator/Measurement” column of the Indicator Measurement Table for Indicator 10.

Comment: A few commenters requested that OSEP define “disproportionate representation.”

Discussion: We maintain that it would not be appropriate to specifically define the term “disproportionate representation” as used in Indicators 9 and 10 given that there are multiple factors at the State level to consider when establishing this definition. However, we recognize that some State-established definitions may be written in such a way that makes it likely that no LEAs will be identified with disproportionate representation. We encourage every State, particularly those in which the State, using its current State-established definition, has not identified any districts with disproportionate representation, to review its definition and, with stakeholder involvement, make any necessary revisions. OSEP will continue to review State definitions to ensure the definitions will yield valid results.

Changes: None.

Indicator 12: Early Childhood Transition

Comment: One commenter remarked that the data for element “e” in Indicator 12 (the number of children determined to be eligible for early intervention services under Part C less than 90 days before their third birthdays) are burdensome to collect. The commenter recognized that the collection often requires collaboration between the SEA and the Part C lead agency, but stressed that the reality of gathering data from another agency is problematic.

Discussion: We appreciate the commenter acknowledging that collaboration between Part B and Part C is critical to ensuring effective early childhood transition. We encourage the SEA and the Part C lead agency to continue to build systems to address the timely and accurate transfer of data. Element “e” allows a State to exclude from its calculation those children who were determined to be eligible for early intervention services under Part C less than 90 days before their third birthday, thereby decreasing the denominator and increasing the compliance percentage. It is therefore to the State’s advantage to exclude these children from the calculation.
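A simplified numerical illustration of this effect (all counts invented; the actual Indicator 12 measurement also excludes other categories of children from the denominator):

    # Sketch: excluding element "e" children lowers the denominator,
    # which raises the resulting compliance percentage.
    timely_ieps = 900     # children with an IEP in effect by their third birthday
    referred = 1000       # children referred from Part C for Part B eligibility
    late_eligible = 60    # element "e": eligible < 90 days before third birthday

    without_exclusion = 100.0 * timely_ieps / referred
    with_exclusion = 100.0 * timely_ieps / (referred - late_eligible)
    print(f"{without_exclusion:.1f}% -> {with_exclusion:.1f}%")  # 90.0% -> 95.7%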

Changes: None.

Indicator 13: Secondary Transition

Comment: A few commenters questioned the cost/benefit of Indicator 13. One stated that the indicator is complex with many elements and requires 100% compliance, which is often not possible. The commenter suggested an alternative means of calculating the data for this indicator. Another commenter stated that monitoring this indicator takes a large amount of staff time and that the paperwork associated with documenting the transition requirements does not necessarily lead to improved outcomes for students with disabilities.

Discussion: IDEA section 616(a)(3)(B) specifically identifies a system of transition services as defined in section 602(34) (definition of “transition services”) as one of the priority areas the Department must measure using quantifiable indicators, and qualitative indicators as necessary. Indicator 13 was reworded at commenter request during a previous approval cycle for 1820-0624 to ensure accurate and complete reporting that is aligned with statutory and regulatory requirements. The IDEA Part B regulations in 34 CFR §300.320(b) require, beginning not later than the first IEP to be in effect when the child turns 16, or younger if determined appropriate by the IEP Team, and updated annually thereafter, that the IEP include: (1) appropriate measurable postsecondary goals based upon age-appropriate transition assessments related to training, education, employment, and, where appropriate, independent living skills; and (2) the transition services (including courses of study) needed to assist the child in reaching those goals. The public agency must invite a child with a disability to attend the child’s IEP team meeting where transition services are to be discussed and, if appropriate, a representative of any participating agency that is likely to be responsible for providing or paying for transition services, with the prior consent of the parent or student who has reached the age of majority, as required by 34 CFR §300.321(b). We continue to believe that Indicator 13, as currently worded, is aligned with the applicable statutory and regulatory requirements and that States collect valid and reliable data consistent with the required measurement for this indicator. Further, we believe that comprehensive and meaningful planning is a key component of the successful transition of a student with a disability to college or career, and that the benefits to a student with a disability of ensuring transition planning that is aligned with the statutory and regulatory requirements outweigh any associated burden.

Changes: None.

Indicator 14: Post-School Outcomes

Comment: One SEA supports the inclusion of Indicator 14 and states that “without measures of post-school engagement, there is little to substantiate the long-term success of the other indicators.” However, a few other commenters recommended that the indicator be eliminated because of the cost associated with the collection and because, as one of the commenters asserts, the collection yields little data of value.

Discussion: We agree with the commenter that data on post-school outcomes are a key measure of the efficacy of the IDEA. One purpose of the IDEA is to ensure that all children with disabilities have available to them a FAPE that emphasizes special education and related services designed to meet their unique needs and prepare them for further education, employment, and independent living. Indicator 14 measures the percent of students with disabilities who are no longer in secondary school, had IEPs in effect at the time they left school, and were: (A) enrolled in higher education within one year of leaving high school; (B) enrolled in higher education or competitively employed within one year of leaving high school; or (C) enrolled in higher education or in some other postsecondary education or training program, or competitively employed or in some other employment, within one year of leaving high school. In short, Indicator 14 is one measure of the result of the FAPE provided to a student with a disability. States have been given the flexibility to determine how best to collect those data in order to yield valid and reliable data that are useful.
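The three measures are cumulative: each adds outcome categories to the one before it. A minimal sketch with invented leaver records (the category labels are ours):

    # Hypothetical sketch of the cumulative Indicator 14 measures.
    leavers = ["higher_ed", "higher_ed", "competitive_employment",
               "other_training", "other_employment", "none"]

    n = len(leavers)
    a = sum(s == "higher_ed" for s in leavers)                   # measure A
    b = a + sum(s == "competitive_employment" for s in leavers)  # B adds employment
    c = b + sum(s in ("other_training", "other_employment")
                for s in leavers)                                # C adds other outcomes
    print(f"14A: {100*a/n:.0f}%  14B: {100*b/n:.0f}%  14C: {100*c/n:.0f}%")
    # 14A: 33%  14B: 50%  14C: 83%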

Changes: None.

Current Indicator 15: Timely Correction

Comment: Several commenters did not agree with the elimination of current Indicator 15: Timely Correction. They believe that school districts need a specific reason to focus on the timely identification and correction of noncompliance. Other commenters strongly supported eliminating current Indicator 15 because it is viewed as duplicative and overly burdensome.

Discussion: IDEA section 616(a)(1) requires the Secretary to monitor implementation of the IDEA through oversight of the exercise of general supervision by the States, and, in turn, for States to monitor implementation of IDEA by LEAs. In order to effectively monitor implementation of IDEA, States must ensure that identified noncompliance is corrected in a timely manner, as required by IDEA sections 612(a)(11) and 616(a), 34 CFR §§300.149 and 300.600, and section 441(b)(3) of the General Education Provisions Act. Therefore, regardless of the fact that Indicator 15 is being eliminated, States and school districts must continue to focus on timely identification and correction of noncompliance.

The SPP/APR is one of the methods that the Department uses to monitor States, and States use to monitor LEAs, to ensure timely identification and correction of noncompliance. IDEA section 616(a)(3)(B) specifically requires that the Department monitor SEAs, and SEAs monitor LEAs, using quantifiable indicators, and qualitative indicators as necessary, in specified priority areas, including the area of State exercise of general supervision. As stated in the explanation and rationale document that accompanied this proposed Information Collection, there will continue to be a focus on the timely identification and correction of noncompliance. The Department is eliminating Indicator 15 to reduce reporting burden. However, States will continue to be required to report on the timely correction of noncompliance under the individual compliance indicators in the SPP/APR. The instructions for these indicators require States to provide detailed information about the timely correction of noncompliance reported for these indicators in the previous year’s APR. We will also monitor States’ compliance with the requirement to ensure timely identification and correction of noncompliance. Under Results Driven Accountability, we will implement a differentiated monitoring system through which we will monitor all States, and provide individualized support to a State that needs it. The data from the compliance indicators and other monitoring efforts provide a reasonable quantifiable basis for OSEP to reach a determination as to whether a State has a monitoring system that is effective in correcting identified noncompliance within one year of identification. OSEP will continue to consider timely correction in its annual determinations process.

Changes: None.

Comment: Several commenters requested guidance on how the timely correction requirements would be best implemented within each of the compliance indicators.

Discussion: Current Indicator 15 has been eliminated, but the requirement to correct noncompliance at both the child-specific and systemic levels remains the same. When reporting on correction of noncompliance under the individual compliance indicators, States must continue to report on timely correction of noncompliance at both the child-specific and systemic levels. OSEP’s September 2008 FAQs and the October 17, 2008 Dear Colleague Memorandum continue to provide States with guidance on how States should verify, and report on, the timely correction of findings of State-identified noncompliance in their APRs. Child-specific correction requires the LEA to correct, and the SEA to verify correction of, each individual case of identified noncompliance. Systemic correction requires the LEA to correctly implement the regulatory requirements, and the SEA to verify that correction by reviewing updated data.

Changes: None.

New Indicators 15 and 16: Resolution Sessions and Mediations

Comment: Several commenters requested that these indicators be eliminated because the data are already submitted through another OMB-approved information collection. They noted that State complaint timelines (previous Indicator 16), due process hearing timelines (previous Indicator 17), and timely and accurate data (previous Indicator 20) were all eliminated because the data were collected through another OMB-approved information collection, and asked OSEP to apply that rationale to proposed Indicators 15 and 16.

Discussion: IDEA section 616(a)(3)(B) requires the Secretary to monitor the States, and States to monitor the LEAs, using quantifiable indicators in a number of priority areas, and specifically references the use of resolution sessions and mediations. Therefore, we decline to remove these indicators even though the data are initially submitted through another OMB-approved information collection.

Changes: None.

Comment: Several commenters recommended that Indicators 15 and 16 measure the timeliness of resolution sessions and mediations, because SEAs have no control over whether disputes are resolved through resolution sessions or mediations. Another commenter requested that there be no targets for Indicators 15 and 16 because of the SEAs’ lack of control over the outcomes of resolution sessions and mediations.

Discussion: As previously stated, IDEA section 616(a)(3)(B) requires that resolution sessions and mediations be measured using quantifiable indicators, and qualitative indicators as necessary, and expressly specifies that the indicators in these two priority areas must measure the use of these dispute resolution methods. Therefore, while we agree that SEAs might have more control over the timeliness of conducting these meetings than over their outcomes, we believe that Indicators 15 and 16 measure the “use” of these dispute resolution options and, thus, the measurement for both of these indicators is consistent with the statute. Additionally, IDEA section 616(b)(2)(A) requires each State to establish measurable and rigorous targets for the indicators established under the priority areas described in section 616(a)(3). IDEA section 616(b)(2)(C)(ii)(II) requires the State to report annually to the Secretary on the performance of the State under the State’s performance plan, which includes measurable and rigorous targets for each indicator. Therefore, we cannot eliminate the requirement to set, and annually report on, measurable and rigorous targets for Indicators 15 and 16.

Changes: None.

New Indicator 17: State Systemic Improvement Plan (SSIP)

Comment: We received multiple comments regarding proposed Indicator 17. Some commenters applauded the proposal as a true step towards results driven accountability. These same commenters expressed that the proposal was well-developed and detailed, and offered enough room for each State to craft and implement an SSIP that will help each State focus on systemic improvement. Other commenters were concerned that new Indicator 17 is duplicative of other improvement efforts already underway in a State and that the burden associated with developing, implementing, and reporting on the SSIP far outweighs any associated benefit.

Discussion: New Indicator 17 is each State’s opportunity to develop and implement a comprehensive, ambitious, yet achievable plan focused on improving results for students with disabilities. OSEP strongly encourages alignment between Indicator 17 and other improvement activities or plans that are already being implemented in the State, as long as the existing plan is based on recent data and infrastructure analyses. Additionally, in order to use an existing plan to meet the requirements of Indicator 17, that plan must have a direct impact on students with disabilities and align with the State-identified Measurable Result for Students with Disabilities. We do not agree that Indicator 17 increases the reporting burden, because a State is no longer required to develop and report in the SPP/APR on a separate set of improvement activities for each indicator. In addition, a State is encouraged to align and integrate existing State-level improvement efforts on which data are already collected and reported. This alignment, when appropriate and used to meet the requirements of Indicator 17, will reduce the reporting burden.

Changes: None.

Comment: Many commenters requested that OSEP further clarify and define the terms used throughout Indicator 17 and provide more guidance on the proposed process for developing the SSIP.

Discussion: We agree with the commenters that some of the terms require additional explanation and that the process for developing and implementing the SSIP should be further clarified. In response, we made several structural and definitional revisions to Indicator 17:

  • We labeled the two sections in the indicator to make clear that the first section is “Overview of the Three Phases of the SSIP” and the second section is the “Specific Content of Each Phase of the SSIP.” Below we discuss the revisions by section.

  • Measurement – The measurement, while not substantively changed, was reworded for clarity. We added a section to clarify that the State must include in its FFY 2013 SPP/APR, due February 2, 2015, the State’s baseline data for FFY 2013 and its targets for FFY 2014 through FFY 2018 for its “State-identified Measurable Result for Children with Disabilities.” We also clarified that the State must include updated data for FFY 2014 through FFY 2018 for its “State-identified Measurable Result for Children with Disabilities” in its respective SPPs/APRs for FFYs 2014 through 2018.

Overview of the Three Phases of the SSIP

  • Stakeholder Involvement – We added a section to clarify that stakeholders must be included throughout the process of developing, implementing, evaluating, and revising the SSIP, and included in establishing the State’s targets under Indicator 17. The SSIP should include information about stakeholder involvement in all three phases.

  • Phase I – We labeled this as the “Analysis” phase. Phase I has five components and is due with the FFY 2013 SPP/APR submitted on February 2, 2015.

    • Data Analysis. This component has not been substantively revised.

    • Analysis of State Infrastructure to Support Improvement and Build Capacity. The title of this component was revised to clarify that the State must provide an analysis, and not just a list, of the State infrastructure that supports improvement and builds capacity.

    • In response to commenters, the proposed “Identification of the Focus of Improvement” was retitled “State-identified Measurable Result for Children with Disabilities” to clarify that the focus for improvement must be a student-level outcome, in contrast to a process outcome.

    • “Selection of Coherent Improvement Strategies” was added to focus States on identifying strategies that are sound, logical, and aligned, and that will lead to a measurable improvement in the State-identified result for students with disabilities.

    • Theory of Action. This component has not been revised in the description of Phase I.

  • Phase II – We labeled this as the “Plan” phase and made changes, as described below, to the section titled “Support for LEA Implementation of Evidence-Based Practices” and the section titled “Evaluation.”

  • Phase III – We labeled Phase III as the “Implementation and Evaluation” phase. We clarified that a State will provide descriptions of its ongoing evaluation and revisions to the SSIP during this phase.

Specific Content of Each Phase of the SSIP

This section provides comprehensive definitions and explanations of all of the terms and concepts used in the Overview section.

Phase I

  • Data Analysis is defined as a description of how the State identified and analyzed key data, including data from SPP/APR indicators, 618 data collections, and other available data as applicable, to: (1) select the State-identified Measurable Result for Children with Disabilities, and (2) identify root causes contributing to low performance. The description must include information about how the data were disaggregated by multiple variables (e.g., LEA, region, race/ethnicity, disability category, placement). As part of its data analysis, the State should also consider compliance data and whether those data present potential barriers to improvement. In addition, if the State identifies any concerns about the quality of the data, the description must include how the State will address these concerns. Finally, if additional data are needed, the description should include the methods and timelines to collect and analyze the additional data.

  • Analysis of State Infrastructure to Support Improvement and Build Capacity is defined as a description of how the State analyzed the capacity of its current infrastructure to support improvement and build capacity in LEAs to implement, scale up, and sustain the use of evidence-based practices to improve results for students with disabilities. State systems that make up its infrastructure include, at a minimum: governance, fiscal, quality standards, professional development, data, technical assistance, and accountability/monitoring. The description must include the current strengths of the systems, the extent to which the systems are coordinated, and areas for improvement of functioning within and across the systems. The State must also identify current State-level improvement plans and initiatives, including special and general education improvement plans and initiatives, and describe the extent to which these initiatives are aligned with the SSIP and how they are, or could be, integrated with it. Finally, the State should identify representatives (e.g., offices, agencies, positions, individuals, and other stakeholders) that were involved in developing Phase I of the SSIP and that will be involved in developing and implementing Phase II of the SSIP.

  • State-identified Measurable Result for Children with Disabilities is defined as a statement of the result(s) the State intends to achieve through the implementation of the SSIP. The State-identified result(s) may, but need not, be an SPP/APR indicator or a component of an SPP/APR indicator. The State-identified result(s) must be clearly based on the Data and State Infrastructure Analyses and must be a student-level outcome, in contrast to a process outcome. The State may select a single result (e.g., increasing the graduation rate for students with disabilities) or a cluster of related results (e.g., increasing the graduation rate and decreasing the dropout rate for students with disabilities).

  • Selection of Coherent Improvement Strategies is defined as an explanation of how the improvement strategies were selected, and why they are sound, logical, and aligned, and will lead to a measurable improvement in the State-identified result(s). The improvement strategies should include the strategies, identified through the Data and State Infrastructure Analyses, needed to improve the State infrastructure and to support LEA implementation of evidence-based practices to improve the State-identified result(s) for students with disabilities. The State must describe how implementation of the improvement strategies will address identified root causes for low performance and ultimately build LEA capacity to achieve the State-identified Measurable Result(s) for Students with Disabilities.

  • Theory of Action is defined as a graphic illustration that shows the rationale of how implementing the coherent set of improvement strategies selected will increase the State’s capacity to lead meaningful change in LEAs, and achieve improvement in the State-identified result(s) for children with disabilities.

Phase II

Phase II was revised to provide additional guidance on developing the plan, based on the Phase I analysis, that will be implemented in the State to achieve the State-identified measurable result for students with disabilities. Phase II includes infrastructure development, support for LEA implementation of evidence-based practices, and evaluation.

  • Infrastructure Development: A State must specify improvements that will be made to the State infrastructure to better support LEAs to implement and scale up evidence-based practices to improve the State-identified result(s) for children with disabilities. Additionally, a State must identify the steps the State will take to further align and leverage current improvement plans and initiatives in the State, including general and special education improvement plans and initiatives, that impact students with disabilities. This section must also identify who will be in charge of implementing the changes to the infrastructure, the resources needed, the expected outcomes, and the timelines for completing improvement efforts. In addition, the State should specify how it will involve multiple offices within the State educational agency (SEA), as well as other State agencies, in the improvement of its infrastructure.

  • Support for LEA Implementation of Evidence-based Practices: A State must specify how it will support LEAs in implementing the evidence-based practices that will result in changes in LEA, school, and provider practices to achieve the State-identified Measurable Result(s) for Children with Disabilities. This section must identify steps and specific activities needed to implement the coherent improvement strategies, including communication strategies and stakeholder involvement; how identified barriers will be addressed; who will be in charge of implementing; how the activities will be implemented with fidelity; the resources that will be used to implement them; how the expected outcomes of the improvement strategies will be measured; and timelines for completion. In addition, the State should specify how it will involve multiple offices within the SEA (or other State agencies) to support LEAs in scaling up and sustaining the implementation of the evidence-based practices once they have been implemented with fidelity.

  • Evaluation: The evaluation must include short-term and long-term objectives to measure implementation of the SSIP and its impact on achieving measurable improvement in the State-identified result(s) for children with disabilities. The evaluation must be aligned to the theory of action and other components of the SSIP, include how stakeholders will be involved, and include the methods that the State will use to collect and analyze data to evaluate implementation and outcomes of the SSIP. The evaluation must specify: (1) how the State will use the information from the evaluation to examine the effectiveness of the implementation of the SSIP and the progress toward achieving intended improvements in the State-identified result(s) for children with disabilities, and to make modifications to the SSIP as necessary; and (2) how information from the evaluation will be disseminated to stakeholders.

A State may also amend previously-submitted information from Phase I to update it and ensure its accuracy.

Phase III

Phase III was revised for clarity. In Phase III, the State must, consistent with the evaluation described in Phase II, assess and report on its progress in implementing the SSIP. This reporting will include data and analysis on the extent to which the State has made progress toward and/or met the State-established short-term and long-term objectives for implementation of the SSIP and its progress in achieving the State-identified Measurable Result for Children with Disabilities. If the State intends to continue implementing the SSIP without modifications, the State must describe how the data from the evaluation support this decision. Also, the State must provide a rationale for any revisions that have been made, or that the State plans to make, to the SSIP in response to evaluation data, and describe how stakeholders were included in the decision-making process.

Changes: As described in the Discussion section directly above, Indicator 17 has been revised to address commenters’ questions and concerns regarding clarifying the components and phases of the State’s development and implementation of the SSIP.

Comment: A few commenters requested that the SSIP not be referred to as an indicator as the structure and content of Indicator 17 is inconsistent with the other SPP/APR indicators.

Discussion: We do not agree that the SSIP should be excluded as an indicator merely because its structure and content differ from those of the other SPP/APR indicators. By designating the SSIP as an indicator, OSEP requires each State to identify its baseline data in FFY 2013 and targets for FFY 2014 through FFY 2018 that reflect improvement over the baseline data.

Changes: None.

Comment: A few commenters asked whether a State must require an SSIP of its LEAs.

Discussion: Indicator 17 is a State-wide indicator, and there is no requirement that the State require its LEAs to develop and implement an SSIP, but the State may do so.

Changes: None.

Comment: Many commenters requested that OSEP provide the criteria by which OSEP will evaluate an SSIP.

Discussion: We are developing the criteria by which OSEP will evaluate an SSIP and will share those criteria with States as a part of the support we provide to States under Results Driven Accountability.

Changes: None.

Comment: Several commenters requested that OSEP provide an SSIP model or template for States to follow.

Discussion: Indicator 17 outlines the specified content the State must include in each of the three phases of the SSIP. Additionally, GRADS 360, the online SPP/APR tool, will provide fields to report on each of the SSIP’s required components. OSEP has collaborated, and will continue to work, with technical assistance providers to provide States with further guidance regarding States’ reporting under each phase and step in the process.

Changes: None.

Comment: A few commenters requested that OSEP provide guidance on how to establish targets for Indicator 17.

Discussion: The FFY 2013 SPP/APR, submitted in February 2015, must include FFY 2013 data as the baseline data, and measurable and rigorous targets for FFY 2014 through FFY 2018 for Indicator 17 that are: expressed as percentages; and reflect a measurement that is aligned with the State-identified Measurable Result for Children with Disabilities, i.e., the desired child-level outcome that is clearly based on the State’s Data and State Infrastructure Analyses. For example, a State might report that, after conducting its Data and State Infrastructure Analyses, the State has determined that its State-identified Measurable Result for Children with Disabilities will be how well it improves third grade reading test results for children with disabilities. For Indicator 17, the State would provide baseline data for FFY 2013 (expressed as a percentage) on the third grade reading assessment results for children with disabilities. The State would provide annual targets for each of the five years from FFY 2014 through FFY 2018 (expressed as percentages), and the State’s end target for FFY 2018 under this SPP/APR would have to demonstrate improvement over the State’s FFY 2013 baseline data.
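Continuing the hypothetical reading example, here is a sketch of what a State’s Indicator 17 baseline and targets might look like (every percentage below is invented; each State sets its own targets with stakeholder input):

    # Invented baseline and targets for the third grade reading example.
    baseline_ffy2013 = 42.0  # percent proficient, expressed as a percentage
    targets = {"FFY 2014": 42.0,   # targets may remain flat for some years...
               "FFY 2015": 44.0,
               "FFY 2016": 46.0,
               "FFY 2017": 48.0,
               "FFY 2018": 50.0}   # ...but the end target must exceed baseline

    assert targets["FFY 2018"] > baseline_ffy2013, \
        "FFY 2018 target must demonstrate improvement over FFY 2013 baseline"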

Changes: We have added guidance to the Part B State Performance Plan (SPP)/Annual Performance Report (APR) Instruction Sheet on establishing measurable and rigorous targets for Indicator 17.

Comment: A few commenters asked how Indicator 17 would impact a State’s determination under IDEA section 616(d).

Discussion: Indicator 17 will not impact the Department’s determinations made under IDEA section 616(d) in 2015, which are based on the FFY 2013 SPP/APR. Beginning with 2016 determinations, the Department will include the State’s Indicator 17 and SSIP data among the data it considers, as part of Results Driven Accountability.

Changes: None.

Current Indicator 20: Timely and Accurate Data

Comment: A few commenters expressed concern that data quality will suffer because States would no longer be required to report on timely and accurate data.

Discussion: We do not agree that data quality will be compromised because States are no longer required to report on Indicator 20. We will continue to consider data accuracy and the timeliness of a State’s submission when making annual determinations under section 616(d).

Changes: None.








