Impact Evaluation of RTT and SIG

CONTENTS

PART A. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION
A. Justification
1. Circumstances Necessitating the Collection of Information
2. Purposes and Uses of the Data
3. Use of Technology to Reduce Burden
4. Efforts to Avoid Duplication of Effort
5. Methods to Minimize Burden on Small Entities
6. Consequences of Not Collecting Data
7. Special Circumstances
8. Federal Register Announcement and Consultation
9. Payments or Gifts
10. Assurances of Confidentiality
11. Additional Justification for Sensitive Questions
12. Estimates of Hours Burden
13. Estimates of Cost Burden to Respondents
14. Estimates of Annual Costs to the Federal Government
15. Reasons for Program Changes or Adjustments
16. Plan for Tabulation and Publication of Results
17. Approval Not to Display the OMB Expiration Date
18. Explanation of Exceptions
APPENDIX A: STATE LETTER
APPENDIX B: STUDY INFORMATION SHEET
APPENDIX C: RECRUITING PROTOCOL WITH STATE ADMINISTRATOR
APPENDIX D: RECRUITING PROTOCOL WITH DISTRICT ADMINISTRATOR (DISTRICTS IN THE SCHOOL TURNAROUND MODEL [STM] SAMPLE)
APPENDIX E: RECRUITING PROTOCOL WITH DISTRICT ADMINISTRATOR (DISTRICTS IN THE RACE TO THE TOP [RTT] SAMPLE)
APPENDIX F: DISTRICT LETTER
TABLES
A.1 Research Questions and Data Sources
A.2 Study Timetable
A.3 Burden in Hours to Respondents
FIGURES
A.1 Venn Diagram of Study Sample
PART A. SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSION
This Office of Management and Budget (OMB) package requests clearance to recruit 50 states, the District of Columbia, approximately 240 school districts, and approximately 1,200 schools for inclusion in an evaluation of the Race to the Top (RTT) and School Improvement Grants (SIG) programs. The RTT-SIG evaluation will provide important information on the implementation and impacts of school turnaround efforts and educational reforms funded through these two federal grant programs. The Institute of Education Sciences (IES) at the U.S. Department of Education (ED) has contracted with Mathematica Policy Research and its subcontractors, the American Institutes for Research and Social Policy Research Associates, to conduct this important evaluation.
The RTT-SIG evaluation will include implementation and impact components. For the evaluation of RTT, the implementation component will include semi-structured interviews with state and district officials, while the impact component will be based on an interrupted time series (ITS) design. For the evaluation of RTT- and SIG-funded school turnaround models (STMs), the implementation component will include semi-structured interviews with state and district officials and a web survey of school principals. The impact evaluation of STMs will be based on a regression discontinuity design (RDD).
This OMB clearance request is the first of two for this evaluation and includes materials that will be used in the study’s recruitment process. We are submitting two clearance requests because recruitment efforts must begin before all of the study’s data collection instruments can be developed. Included in this first OMB clearance request are drafts of the state recruitment letter (Appendix A), the RTT/SIG study information sheet (Appendix B), protocols for recruitment calls and site visits (Appendices C, D, and E), and the district recruitment letter (Appendix F). We include an overview of the study’s design and eventual data collection plans to provide context, but they are not the focus of this request. A later request will seek clearance for activities to collect information from the states, districts, and schools included in the evaluation, and will include data collection instruments for the study.
a. Statement of Need for a Rigorous Evaluation of RTT and SIG
The investments being made by the U.S. Department of Education in Race to the Top and School Improvement Grants are unprecedented in scope and scale. The SIG program was funded in fiscal year 2009 with $546.6 million and received an additional $3 billion from the American Recovery and Reinvestment Act (ARRA) of 2009 (Pub. L. 111-5). SIG funds go to states based on their share of Title I funding; states then distribute the funds to districts with the lowest-achieving Title I schools that demonstrate need and a strong commitment to implement one of four models (turnaround, restart, closure, and transformation) aimed at improving or closing these persistently lowest-achieving schools.
The RTT program was allocated $4 billion in ARRA funding to encourage and reward states already implementing significant education reforms in four priority areas—(1) standards and assessments; (2) data systems; (3) effective teachers and school leaders; and (4) turning around persistently lowest-achieving schools—to extend their efforts and advance comprehensive and coherent education reforms across districts for the purpose of improving student outcomes. RTT grants were awarded competitively in two phases. Phase I awards were announced in March 2010 to Tennessee ($500 million) and Delaware ($100 million). Phase II awards were made in September 2010 to New York ($700 million); Florida ($700 million); Georgia ($400 million); North Carolina ($400 million); Ohio ($400 million); Massachusetts ($250 million); Maryland ($250 million); Rhode Island ($75 million); Hawaii ($75 million); and the District of Columbia ($75 million).
Given the scale and scope of these federal investments, findings from the RTT-SIG evaluation will be highly anticipated and critically scrutinized by a broad audience of policymakers, educators, and other citizens. These constituents will want to know if these programs accomplished their goals: Are struggling schools initiating reforms? Are states improving their data systems? Are common standards and assessments being adopted? Are teachers and principals being supported in their attempts to turn around lowest-achieving schools? In addition to these and other questions of program implementation, there is the bottom-line question of whether these reforms affect students’ academic achievement and progress beyond high school.
Legislative authorization for the RTT-SIG evaluation is found in the Education Sciences Reform Act of 2002, Part D, Section 171(b)(2), which authorizes IES to “conduct evaluations of Federal education programs administered by the Secretary (and as time and resources allow, other education programs) to determine the impact of such programs (especially on student academic achievement in the core academic areas of reading, mathematics, and science).”
b. Research Questions
The RTT-SIG evaluation will examine the following research questions:
1. How are RTT and SIG implemented at the state, district, and school levels?
2. Does receipt of RTT and/or SIG funding to implement a school turnaround model have an impact on outcomes for lowest-achieving schools?
3. Are RTT reforms related to improvement in student outcomes?
4. Is implementation of the four school turnaround models, and strategies within those models, related to improvement in outcomes for lowest-achieving schools?
c. Study Design
The RTT-SIG evaluation is designed to provide a descriptive account of the implementation of RTT and SIG; the most rigorous possible estimates of the effects of RTT and SIG; and the contextual information needed to fully understand and interpret those effects. The study will be based on two samples of school districts, strategically selected both to provide information on RTT and SIG implementation and to support a rigorous analysis of program impacts. To estimate the impact of STMs on student achievement, the evaluation’s first choice is to use a rigorous RDD, exploiting approaches for awarding STM funds to schools that involve a continuous measure. The second choice design, which would be used if an RDD were not feasible, is ITS. The evaluation will also assess the correlation between turnaround models—and the specific turnaround strategies used within such models—and improvements in school outcomes. Separately, to assess the relationship between RTT and student outcomes, the evaluation will use an ITS analysis.
The study will involve two samples, one for the evaluation of STMs and one for the evaluation of RTT (see Figure A.1). The sample for the evaluation of STMs (referred to throughout as the STM sample) will consist of approximately 1,200 schools within an estimated 120 school districts across 30 states (roughly 600 schools will form the treatment group, and roughly 600 schools will form the comparison group). The districts in the STM sample will be purposefully selected based on suitability for the RDD. The sample for the evaluation of RTT (referred to throughout as the RTT sample) will include all 50 states and the District of Columbia and, within the 12 RTT-winning states and the 12 states with the highest application scores among the losing states,1 a sample of approximately 120 school districts. To be eligible for this component of the study, a district must have been identified as a “participating district” in the state’s RTT application (these applications are available both for the RTT winners and losers). From among those eligible districts we will draw our stratified random sample. We anticipate that there will be some overlap between the RTT and STM samples, based on a preliminary examination of districts that may be suitable for the RDD and states that received RTT grants.
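To illustrate how the stratified random sample of approximately 120 RTT districts could be drawn from the lists of participating districts, a minimal sketch follows. The input file, stratum definitions, and column names are hypothetical placeholders, not the study's actual sampling frame or allocation rules.

import pandas as pd

# Hypothetical frame of "participating districts" compiled from state RTT applications.
# Column names (state, stratum, district_id) are illustrative only.
frame = pd.read_csv("rtt_participating_districts.csv")

TARGET_SAMPLE_SIZE = 120  # approximate RTT district sample size described above

# Allocate the sample across strata in proportion to stratum size
# (rounding may require a small manual adjustment to hit the target exactly).
stratum_sizes = frame.groupby("stratum").size()
allocation = (stratum_sizes / stratum_sizes.sum() * TARGET_SAMPLE_SIZE).round().astype(int)

# Draw a simple random sample of districts within each stratum.
sample = pd.concat(
    frame[frame["stratum"] == stratum].sample(n=n_draw, random_state=2011)
    for stratum, n_draw in allocation.items()
)
print(sample["district_id"].tolist())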
Figure A.1. Venn Diagram of Study Sample
Research Question 1: How are RTT and SIG implemented at the state, district, and school levels?
The implementation study will gather information to both answer this research question and support answering research question 4. From interviews with representatives from all 50 states and the District of Columbia, we will learn about RTT-related reforms, such as the steps states are taking to develop standards for college and career preparedness, to improve data systems, to promote an equitable distribution of effective teachers, and to support school turnaround. From interviews with district representatives in the RTT sample, we will learn how RTT-related reforms affect districts and their schools, such as through improved professional development opportunities, use of data systems in shaping policies, approaches to evaluating teachers, and school turnaround strategies. From interviews with district representatives in the STM sample, we will learn about the school turnaround efforts that have been implemented in districts and the nature and type of supports provided by districts to turnaround schools. Through surveys administered to school administrators in the STM sample, we will learn about the specific STM strategies that are being implemented in these schools.
The evaluation will use several strategies to ensure that the implementation data collected through these activities are comparable and analyzed in a systematic way. A uniform protocol will be used for each data collection activity. We will also prioritize the use of closed-ended questions in the data collection instruments to ensure we capture quantitative data on the percentage of states, districts, and schools that are implementing particular RTT and SIG reforms. For the open-ended questions, we plan to use Atlas.ti or NVivo software to help organize and classify the qualitative information gathered into themes and categories. This information will be further summarized through indicator or categorical variables amenable to quantitative analysis. This approach will permit the study team to objectively and systematically describe the implementation of RTT and SIG and examine the relationship between patterns in outcomes and key implementation variables.
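As a simple illustration of how coded open-ended responses could be summarized with indicator variables, the sketch below converts hypothetical theme codes (exported from qualitative coding software) into 0/1 indicators and tabulates the percentage of states coded with each theme. The state labels and theme names are invented for illustration.

import pandas as pd

# Hypothetical export of theme codes applied to open-ended interview responses.
coded = pd.DataFrame({
    "state": ["State A", "State A", "State B", "State C"],
    "theme": ["data_system_upgrade", "teacher_evaluation_reform",
              "teacher_evaluation_reform", "data_system_upgrade"],
})

# One row per state, one 0/1 indicator column per coded theme.
indicators = pd.crosstab(coded["state"], coded["theme"]).clip(upper=1)

# Percentage of states whose interviews were coded with each theme.
print((indicators.mean() * 100).round(1))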
Research Question 2: Does receipt of RTT and/or SIG funding to implement a school turnaround model have an impact on outcomes for lowest-achieving schools?
The evaluation’s first choice for estimating the impacts of STMs is to use an RDD that meets the evidence standards of the What Works Clearinghouse (WWC) and provides as much statistical power as possible. If an RDD is infeasible,2 the contractors will be prepared to estimate impacts using an ITS design. Because ED has a particular interest in the effects of the restart turnaround model, and because relatively few schools (approximately 30) are implementing that model, we plan to use an ITS analysis to estimate the impacts of restart schools.
To assess the feasibility of an RDD, the contractor will carefully review state applications for RTT and SIG and, as part of the study’s recruiting efforts (the focus of this request), talk to state and district officials to identify cases where allocation of STM funding is based on a clear cutoff value on a continuous variable. For example, one opportunity may be to use the programs’ priorities for schools in need, based on their definitions of Tier I and Tier II, to implement a school-level RDD.3 Under this approach, the 5 percent cutoff on each state’s school-level measure of achievement and the 60 percent graduation rate cutoff for high schools, both part of the Tier I and Tier II RTT and SIG eligibility requirements, would serve as the RDD “assignment variable” for the two groups of schools, respectively. (Additional eligibility requirements, described below, are not based on continuous measures and therefore cannot support an RDD.) Schools below these cutoffs will form the treatment group, and schools above them will form the comparison group.
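The sketch below illustrates how schools on a state's ranked list might be assigned to the RDD treatment and comparison groups using the 5 percent achievement cutoff and, for high schools, the 60 percent graduation rate cutoff described above. The data and column names are hypothetical; each state's actual ranking variable and cutoff will be confirmed during recruitment.

import pandas as pd

# Hypothetical state list of schools ranked on the continuous eligibility measures.
schools = pd.DataFrame({
    "school_id": [101, 102, 103, 104],
    "achievement_percentile": [3.2, 7.5, 6.1, 12.0],  # school-level achievement rank
    "graduation_rate": [None, None, 55.0, 72.0],      # high schools only
    "is_high_school": [False, False, True, True],
})

ACHIEVEMENT_CUTOFF = 5.0  # lowest-achieving 5 percent of schools
GRAD_RATE_CUTOFF = 60.0   # high school graduation rate cutoff

# Schools below either cutoff form the treatment group; schools above form the comparison group.
below_achievement = schools["achievement_percentile"] < ACHIEVEMENT_CUTOFF
below_grad_rate = schools["is_high_school"] & (schools["graduation_rate"] < GRAD_RATE_CUTOFF)
schools["rdd_group"] = (below_achievement | below_grad_rate).map(
    {True: "treatment", False: "comparison"}
)
print(schools[["school_id", "rdd_group"]])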
For every RDD mini-study (that is, each unique combination of RDD assignment variable and cutoff, outcome, and grade4), we will conduct a full RDD analysis aligned with WWC evidence standards. Specifically, we will estimate impacts within an optimal bandwidth around the assignment variable’s cutoff value and conduct a full set of diagnostic analyses to assess the performance of the RDD. The overall impact of STMs will be a weighted average of the mini-study impacts, where each mini-study’s weight is the inverse of the variance of its impact estimate.
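For concreteness, the sketch below computes this precision-weighted average from hypothetical mini-study estimates; the numbers are illustrative only, and the standard error shown assumes the mini-study estimates are independent.

import numpy as np

# Hypothetical mini-study RDD impact estimates and the variances of those estimates.
impacts = np.array([0.08, 0.15, -0.02])
variances = np.array([0.004, 0.009, 0.006])

# Weight each mini-study by the inverse of the variance of its impact estimate.
weights = 1.0 / variances
overall_impact = np.sum(weights * impacts) / np.sum(weights)

# Standard error of the weighted average, assuming independent mini-studies.
overall_se = np.sqrt(1.0 / np.sum(weights))
print(round(overall_impact, 3), round(overall_se, 3))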
Student-level data used to answer research question 2 will come from two sources: (1) the school districts recruited for the study and (2) extant data from states that have adequate data systems. The recruited school districts will include approximately 600 schools that are below the cutoff value on a continuous variable used to allocate STM funds, which means that they will be in the RDD treatment group. We also assume that in these same districts, there will be at least 600 schools that qualify for the RDD comparison group (that is, they are above the cutoff values but otherwise would be eligible for funds). We will need states to provide us with lists of schools that are ranked by the continuous variables used to determine eligibility. The availability of those variables for schools on both sides of the cutoff will be essential for conducting the RDD analysis.
Research Question 3: Are RTT reforms related to improvement in student outcomes?
We will use a quasi-experimental ITS design to assess how student outcomes change following the receipt of RTT grants. The primary data source for this analysis will be state-level scores on the National Assessment of Educational Progress (NAEP). The ITS model projects the outcomes that would have been expected in the absence of RTT funding and compares the projections with the pattern of outcomes actually observed in the post-intervention period; the effect of the intervention is estimated as the difference between the two.
We will implement a “short” ITS model with data from multiple cohorts of students in the states in our study. These multiple cohorts will be “stacked” into a single data set. We will identify the effect of RTT by comparing NAEP scores (separately for each subject) for cohorts prior to RTT implementation with the NAEP scores of cohorts that experience RTT implementation. Importantly, the ITS design cannot establish causal relationships between the reforms implemented and estimated changes in student outcomes. Thus, appropriate caution will be used when interpreting these results.
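A minimal sketch of such a stacked comparison is shown below using ordinary least squares; the input file, variable names, and specification (a linear cohort trend, a post-RTT indicator, and state fixed effects, with standard errors clustered by state) are simplified illustrations rather than the study's final analysis model.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical stacked file: one row per state and NAEP administration year,
# with post_rtt = 1 for cohorts tested after RTT implementation began in that state.
stacked = pd.read_csv("stacked_naep.csv")  # columns: state, year, naep_score, post_rtt

# Project the pre-RTT trend and estimate how far post-RTT cohorts deviate from it.
model = smf.ols("naep_score ~ year + post_rtt + C(state)", data=stacked).fit(
    cov_type="cluster", cov_kwds={"groups": stacked["state"]}
)
print(model.params["post_rtt"])  # estimated deviation of post-RTT scores from the projected trend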
Research Question 4: Is implementation of the four school turnaround models, and strategies within those models, related to improvement in outcomes for lowest-achieving schools?
To examine the correlation between improvements in school outcomes and specific turnaround strategies, we will draw on the implementation data collected from schools implementing an STM. We will use two correlational approaches to examine the relationship between school-level outcomes and specific turnaround models and strategies within those models. For the first approach, we will examine correlations across mini-studies between RDD impacts and the characteristics of the average STM school at the cutoff value of the assignment variable in each mini-study. For the second approach, we will conduct an ITS analysis within each school and then correlate the ITS impacts with school-level turnaround models/strategies. As with the ITS design described for research question 3, this correlational analysis cannot establish a causal relationship between turnaround models/strategies and estimated changes in school outcomes; caution must therefore be used when interpreting these results because the specific turnaround models/strategies may not have caused the observed changes in outcomes.
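The sketch below illustrates the first approach with hypothetical values: across mini-studies, correlating the RDD impact estimates with one characteristic of the average STM school at the cutoff (here, an invented measure of the share of treatment schools replacing the principal).

import numpy as np
from scipy.stats import pearsonr

# Hypothetical mini-study RDD impact estimates and a school characteristic at the cutoff.
rdd_impacts = np.array([0.08, 0.15, -0.02, 0.05, 0.11])
share_replacing_principal = np.array([0.40, 0.75, 0.20, 0.35, 0.60])

r, p_value = pearsonr(rdd_impacts, share_replacing_principal)
print(f"correlation = {r:.2f}, p-value = {p_value:.2f}")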
Outcomes to be examined include student achievement on state assessments, high school graduation rates, and (to the extent data are available) college enrollment rates and rates of completion of at least a year’s worth of college credit. Data sources for turnaround models, strategies, and practices include state and district interviews and school surveys.
d. Recruitment Plan
(i) Recruitment for the STM Component
The recruitment plan for the STM component of the evaluation comprises two stages: (1) state recruitment and (2) district recruitment. The STM component of the evaluation may include schools implementing STMs using either SIG or RTT funds.
State Recruitment. In the first phase of recruitment, we will contact each state and the District of Columbia to introduce the study, gauge appropriateness for inclusion in the RDD study, and generate interest in the study. We will send introductory FedEx packages to the State Contacts for the SIG and RTT programs. Each package will include:
A state notification letter (Appendix A), which will explain the study’s importance, provide an overview of the study, and indicate that a member of the study team will call to provide more details. This letter will be printed on ED letterhead and signed by IES Director John Easton to underscore the study’s high-level federal support.
A study information sheet (Appendix B), which will provide a nontechnical description of the study, including a summary of evaluation activities, the study’s research questions, and a timeline for study activities.
The recruiter assigned to the state will follow up with the State Contacts for the SIG and RTT programs to confirm receipt of the mailing and arrange appointments with the appropriate agency staff for a telephone discussion guided by a recruiting protocol (Appendices C, D, and E). This protocol will guide recruiters in providing a nontechnical description of the study design and data collection activities, confirming the process the state used to define STM eligibility categories and to rank STM-eligible schools (for both SIG- and RTT-funded STMs), reviewing the type of student-level data maintained by the state’s data system and the state’s willingness to provide data to the study team, and securing participation in the study. The recruiter will also highlight the study’s importance and address questions or concerns.
District and School Recruitment. Based on these discussions, we will determine whether each state is well suited for the RDD component of the evaluation and, if so, which of its districts and schools should be included in the STM sample. For these districts and schools, we will send FedEx mailings to the District Contacts for the SIG program and/or the RTT program (where RTT-funded STMs are being included). These mailings will include the district notification letter (Appendix F) and the nontechnical study information sheet (Appendix B).
Recruiters will make follow-up calls to the District Contacts to introduce themselves and the study and to arrange a time for further discussion. We anticipate that recruiting communications will predominantly take place by phone and e-mail, but in-person meetings will be arranged where they are deemed necessary to facilitate recruitment efforts. Recruitment discussions with districts will be guided by a modified version of the state recruitment protocol, with a greater emphasis on district and school participation. During these discussions, recruiters will review the study design and planned data collection activities (such as interviews with district administrators, principal surveys, and the collection of student-level data if the state cannot provide these data), identify the schools in the district that would be included in the STM component, and discuss what the district’s participation in the study would entail. We anticipate that the bulk of recruiting discussions will take place with the targeted district staff (and that these district staff will facilitate the participation of their schools in the study). However, we will follow up with individual schools as requested by the district.
Research Applications and MOUs. In states or districts with policies concerning external research projects, the contractor will gain the necessary approvals and abide by the relevant guidelines for conducting the study. The contractor will seek expedited research application reviews where possible and emphasize that the evaluation has received prior review by its institutional review board (IRB).
After state and district officials make an oral commitment to participate in the study, they will be sent a memorandum of understanding (MOU) to sign; the MOU will also be signed by a representative of the evaluation contractor. The MOU will describe the agreed-upon roles and responsibilities of the study team and of participating states, districts, and schools.
Recruitment Training and Tracking. One primary recruiter will be assigned to each state and district to facilitate rapport between the recruiter and the state and district officials and their support staff. Recruiters will participate in a one-and-a-half-day training session covering the following topics: a study overview; a review of RDD requirements; recruitment procedures and timeline; effective, nontechnical communication of the study’s needs, importance, and methodological issues; dealing with issues that might arise; and task management (such as the schedule, weekly meetings, and use of the tracking system). Trainers will also share key strategies based on the contractor’s extensive experience recruiting districts and schools for IES studies.
All contacts will be documented in an electronic tracking system to minimize redundant contacts and to provide the Contracting Officer’s Representative (COR) with up-to-date information on recruiting progress. The web-based system will be accessible to recruiters from Mathematica and its subcontractors.
(ii) Recruitment for the RTT Component
No separate state-level recruitment contacts are planned for the RTT component of the evaluation. Instead, as part of the STM state recruitment contacts described above (after gathering the information needed to assess suitability for the RDD study component), we will discuss with state administrators the evaluation’s plan to gather information about their implementation of RTT-related reforms and solicit their cooperation with these activities.
RTT-specific recruitment calls are planned for school districts in the RTT sample. The goals of these calls are to (1) provide basic information about the study; (2) obtain buy-in for planned data collection; and (3) identify the appropriate respondents for the RTT district interviews, according to area of expertise. Protocols are included in Appendices C, D, and E.
e. Data Collection Plans
To address the study’s research questions, the evaluation will collect and analyze data from several sources. The present OMB request seeks clearance only for the study’s recruitment effort. An overview of the study’s design and data collection plans is included to provide context, but is not the focus of this request. In a subsequent OMB package, ED will request OMB clearance to collect information through interviews with state staff, interviews with district staff, surveys of school administrators, and extracts from administrative records from states and districts. Table A.1 lists the study’s research questions and the data sources that will be used to answer them. We describe the study’s planned data collection activities in more detail below.
Interviews with State Representatives. To thoroughly document the extent to which states have implemented RTT and SIG systems and requirements, we will conduct semi-structured telephone interviews with representatives from the state education agency (SEA) in every state and the District of Columbia. States in the RTT sample that did not receive RTT grants will be asked about their implementation of RTT-related reforms. These interviews will take place in the spring of 2012, 2013, and 2014.
Interviews with District Representatives. We will also conduct semi-structured telephone interviews with staff in each district included in the study (approximately 240 districts). These interviews will document RTT and STM implementation at the district level. Interviews with districts in the STM sample will focus on school turnaround. Interviews with districts in the RTT sample will focus on implementation of RTT-related reforms. To facilitate comparisons, districts in the RTT sample from RTT-winning and losing states will be asked the same questions about implementation of RTT-related reforms.
As noted above, some districts in our study may be in both the RTT and STM samples and, therefore, will need to participate in both the RTT-focused district interviews and the STM-focused district interviews (which would be coordinated in these cases to minimize response burden). These interviews will take place in the spring of 2012, 2013, and 2014.
Table A.1. Research Questions and Data Sources
Research Question | Data Sources
1. How are RTT and SIG implemented at the state, district, and school levels? | Surveys of school administrators; interviews with state and district staff
2. Does receipt of RTT and/or SIG funding to implement a school turnaround model have an impact on outcomes for lowest-achieving schools? | State and district extant data
3. Are RTT reforms related to improvement in student outcomes? | NAEP data
4. Is implementation of the four school turnaround models, and strategies within those models, related to improvement in outcomes for lowest-achieving schools? | State and district extant data; surveys of school administrators; interviews with state and district staff
Survey of School Administrators. We will conduct a web survey of school administrators (principals, assistant principals, or other staff knowledgeable about school turnaround activities) at the approximately 1,200 schools that are part of the STM sample. To ease burden on respondents, we will limit the length of the survey to between 30 and 60 minutes. Because the information we need to obtain from schools is considerable, our goal will be to develop a set of items that captures the areas of interest through closed-ended questions with specific, mutually exclusive response options. These surveys will be conducted in the spring of 2012, 2013, and 2014.
Administrative Data from Districts and States. The outcomes for the impact analyses will come from administrative data the study team collects from states and districts, as well as from NAEP data the study team obtains from ED. (Student-level data will be collected for the STM impact analysis only; the RTT impact analysis will rely on administrative data aggregated to the state, district, or school levels.) The outcomes of interest for this study are student standardized test scores (on both state proficiency assessments and the NAEP) from the 2011-2012, 2012-2013, and 2013-2014 school years;5 high school graduation rates; and (to the extent data are available) college enrollment rates and completion of at least a year of college credit.
f. Study Activities and Timeline
The RTT-SIG evaluation is expected to be completed in five years. Table A.2 shows the timing of major study activities. Since this package is requesting clearance for study recruitment activities, only the first activity listed in Table A.2 applies to this request. We also show the timeline for other major evaluation activities to provide an overview of the study.
Table A.2. Study Timetable

Activity | Date
2011 |
Recruit 50 States (and the District of Columbia), 240 Districts, and 1,200 Schools | Date of OMB approval through 12/30/2011
2012 |
Collect Interview Data | 3/2012 through 6/2012
Collect Survey Data | 3/2012 through 6/2012
Collect (Extant) Student Assessment Data | 7/2012 through 10/2012
2013 |
Collect Interview Data | 3/2013 through 6/2013
Collect Survey Data | 3/2013 through 6/2013
Collect (Extant) Student Assessment Data | 7/2013 through 10/2013
2014 |
Collect Interview Data | 3/2014 through 6/2014
Collect Survey Data | 3/2014 through 6/2014
Collect (Extant) Student Assessment Data | 7/2014 through 10/2014
No data are being collected as part of recruitment activities.
The recruitment plan for which clearance is being sought in this submission is designed to minimize burden on the states and districts being recruited. An electronic recruiting tracking database will be designed and used to store contact history with states and districts and relevant recruiting data collected in the course of those communications. Effective use of this management tool will minimize redundant contacts with states and districts and allow for specific requests for information to be consolidated and coordinated among study team members. A summary of non-confidential recruiting information will also be entered into the contractor’s cross-study recruiting database, allowing further coordination of requests across ongoing education studies, as well as coordination of decisions on the timing of such requests, to further minimize burden while meeting the requirements of each study.
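As an illustration of the kind of record such a tracking database might store for each communication, a minimal sketch follows; the fields shown are hypothetical and do not describe the actual system's design.

from dataclasses import dataclass
from datetime import date

@dataclass
class RecruitingContact:
    """One logged communication with a state or district (illustrative fields only)."""
    site_id: str           # state or district identifier
    site_type: str         # "state" or "district"
    contact_date: date
    mode: str              # "phone", "email", or "in person"
    recruiter: str
    summary: str           # non-confidential notes on the discussion
    follow_up_needed: bool

# Example entry a recruiter might log after an introductory call.
entry = RecruitingContact(
    site_id="district-042", site_type="district", contact_date=date(2011, 10, 3),
    mode="phone", recruiter="Recruiter 1", summary="Introduced study; scheduled follow-up call.",
    follow_up_needed=True,
)
print(entry.site_id, entry.contact_date)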
Additional efforts will be made to rely on information gathered from existing sources to the extent possible. Available information will be culled and analyzed to develop a highly targeted sample of states and districts for recruitment, focusing on those with a large number of schools implementing STMs, the availability of high-quality extant data, and other factors suggesting a high probability of eligibility for the study. This approach will limit contact with districts that are ineligible or unsuitable for participation in the study.
No other national study has been conducted or is underway to address the same research questions as this study. ED determined that an in-depth national study examining the implementation and impacts of the RTT and SIG programs is needed. ED will coordinate efforts between this evaluation and several other ongoing studies of ARRA, including the Integrated Evaluation of ARRA Funding, Implementation, and Outcomes (IEA) and the Study of School Turnaround (SST), in order to minimize burden on study participants and avoid duplication of effort.
With regard to the IEA, some overlap among respondents is inevitable, given that the ARRA evaluation is collecting data from state officials, district administrators, and school principals in a nationally representative sample drawn from all 50 states. However, the topics of data collection and the data collection strategies are notably different. For example, the IEA will administer a closed-ended survey to state officials. While the state-level interview for the RTT-SIG evaluation addresses some of the same broad topics covered in the IEA survey, the RTT-SIG interviews will probe more deeply than is possible in a survey. In addition, the study team for the Impact Evaluation of RTT and SIG is currently comparing its draft data collection instruments to those from the Integrated ARRA Evaluation and will explore deleting any duplicative questions from this study’s instruments.
With respect to the SST, we anticipate little overlap among respondents for these two studies, for two reasons:
The Study of School Turnaround began data collection in the spring of 2011, while the Impact Evaluation of RTT and SIG seeks to begin data collection in spring of 2012, thus avoiding one year of simultaneous data collection efforts.
There is limited overlap in respondent groups. In the few cases in which there is overlap, the study teams could investigate the feasibility of conducting joint interviews. That is, a researcher from one study team could conduct the interview while a representative from the other study team listens, adding questions only as necessary to address study requirements. The teams will explore this option once the extent of sample overlap is known, keeping in mind that the focus of the two studies differs in important ways (the SST focuses more on the change process and how reforms are implemented over time, while the Impact Evaluation of RTT and SIG focuses more on documenting the reforms that were implemented, including RTT reforms, which are not a focus of the SST).
Whenever possible, the evaluation contractor will use existing data including EDFacts; state SIG and RTT grant and subgrant applications; Consolidated State Performance Reports (CSPRs); Office of Elementary and Secondary Education (OESE) monitoring reports; and federal, state, and local administrative files. This will help to further reduce respondent burden and minimize duplication of data collection efforts.
The primary small entities for the study are districts and states that have received RTT and/or SIG funds, and schools implementing STMs with SIG and/or RTT support. To minimize burden, recruitment staff will be trained to make their contacts as straightforward and concise as possible. The recruitment mailings and protocol are designed to be clear, brief, and informative. We will suggest that all relevant staff participate in telephone recruiting meetings so that state and district officials will not be required to convey information individually to their staff members. State recruitment will be used to identify the districts eligible for the STM sample, minimizing the screening of districts needed to form this sample.
The recruitment plan described in this submission is necessary for ED to conduct a rigorous evaluation of the implementation and impacts of RTT and SIG programs. Without this evaluation, ED will not know if its investment in RTT and SIG has resulted in improved student outcomes. Moreover, RTT and SIG together represent the largest investment in school turnaround in American history; failing to conduct this evaluation would mean missing the opportunity to learn lessons relevant to future school improvement efforts.
There are no special circumstances involved with recruitment activities.
a. Federal Register Announcement
The 60-day notice to solicit public comments was published in Volume 75, page 78230 of the Federal Register on December 15, 2010. The 30-day notice was published in Volume 76, page 13136 of the Federal Register on March 10, 2011. No public comments relevant to the collection were received.
b. Consultations Outside of the Agency
The evaluation team will work with ED to identify experts in evaluation methods and data analysis, state assessment programs, and education reform to become members of a technical working group (TWG) advising on the evaluation. Once these individuals have been determined, the contractor will seek their input on the evaluation’s design.
c. Unresolved Issues
There are no unresolved issues.
The study does not plan to give gifts to states, districts, or schools for participating in the recruitment process.
No confidential data will be sought during the recruitment phase of the study, for which clearance is being sought in this package.
The following statement applies to procedures to take place during the data collection phase of the study, for which clearance will be sought in a separate OMB submission. A consistent and cautious approach will be taken to protect all information collected during the data collection phase of the study. This approach will be in accordance with all relevant regulations and requirements. These include the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, which requires “[a]ll collection, maintenance, use, and wide dissemination of data by the Institute … to conform with the requirements of section 552a of Title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment. In addition, for student information, the project director will ensure that all individually identifiable information about students, their academic achievements and families, and information with respect to individual schools remains confidential in accordance with section 552a of Title 5, United States Code, the confidentiality standards of subsection (c), and sections 444 and 445 of the General Education Provisions Act.
Subsection (c) of Section 183, referenced above, requires the director of IES to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” The study will also adhere to requirements of subsection (d) of Section 183 prohibiting disclosure of individually identifiable information as well as making the publishing or inappropriate communication of individually identifiable information by employees or staff a felony.
Mathematica and its subcontractors, the American Institutes for Research (AIR) and Social Policy Research Associates (SPRA), will use the information collected in the study for research purposes only. When reporting the results, data will be presented only in aggregate form, such that individuals and institutions will not be identified. A statement to this effect will be included with all requests for data. All members of the study team with access to the data will be trained and certified on the importance of privacy and data security. All data will be kept in secured locations, and identifiers will be destroyed as soon as they are no longer required.
The following safeguards are routinely employed by Mathematica to carry out privacy assurances during the study:
All Mathematica employees sign a privacy pledge emphasizing the importance of privacy and describing their obligations to protect it.
Identifying information is maintained on separate forms and files, which are linked only by sample identification number.
Access to hard copy documents is strictly limited. Documents are stored in locked files and cabinets. Discarded materials are shredded.
Computer data files are protected with passwords and access is limited to specific users.
Especially sensitive data are maintained on removable storage devices that are kept physically secure when not in use.
No sensitive questions will be asked during the course of recruitment.
Representatives from all 50 states and the District of Columbia, and officials and staff from approximately 240 school districts,6 will participate in initial phone calls, in-person meetings as necessary, and follow-up communications that will occur during the recruitment phase of the study. We estimate that we will need to contact all 50 states and the District of Columbia to learn about their processes to identify and award turnaround grants to Tier I and Tier II schools. We estimate that we will need to contact 240 districts to discuss data collection activities, with in-person meetings required in 40 districts to effectively describe the study requirements.
We estimate that the total number of state and district staff involved in recruitment activities will be 1,210 and the total number of recruitment hours will be 4,762, for an average of 3.9 hours per person (Table A.3). See Appendices C, D, and E for protocols to be used in recruitment phone calls.
Although there are no instruments for the recruitment phase of the study because no data are being collected, future data collection instruments (for which approval will be sought in a separate OMB package) will include the following appropriately tailored Public Burden Statement:
According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless such collection displays a valid OMB control number. Public reporting burden for this collection of information is estimated to average XX minutes/hours per response, including time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. The obligation to respond to this collection is mandatory (citing authority)/required to obtain or retain benefit (citing authority)/ voluntary. Send comments regarding the burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to the U.S. Department of Education, (Program Sponsor mailing address here), or email (PO email address here) and reference the OMB Control Number XXXX_XXXX. Note: please do not return the completed XXXX (cite form or other applicable reporting mechanism) application to this address.
Table A.3. Burden in Hours to Respondents
Respondents/Activity | Total Number of Respondents | Number of Responses per Respondent | Total Number of Responses | Burden Hours per Response | Total Burden Hours
States | | | | |
Recruiting communications | 102 | 1 | 102 | 3 | 306
Review of research applications and/or MOUs | 60 | 1 | 60 | 7 | 420
Total State Staff | 162 | 1 | 162 | | 726
Districts | | | | |
Recruiting communications (STM sample) | 660 | 1 | 660 | 3 | 1,980
Recruiting communications (RTT sample) | 120 | 1 | 120 | 1.5 | 180
Review of research applications and/or MOUs | 268 | 1 | 268 | 7 | 1,876
Total District Staff | 1,048 | 1 | 1,048 | | 4,036
Overall Total | 1,210 | 1 | 1,210 | | 4,762
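As an arithmetic cross-check, the totals in Table A.3 follow directly from the per-activity respondent counts and hours per response, as the short sketch below shows.

# Respondent counts and burden hours per response, taken from Table A.3.
rows = {
    "State recruiting communications":      (102, 3),
    "State review of applications/MOUs":    (60, 7),
    "District recruiting (STM sample)":     (660, 3),
    "District recruiting (RTT sample)":     (120, 1.5),
    "District review of applications/MOUs": (268, 7),
}

total_respondents = sum(count for count, _ in rows.values())
total_hours = sum(count * hours for count, hours in rows.values())

print(total_respondents)                          # 1210 staff
print(int(total_hours))                           # 4762 hours
print(round(total_hours / total_respondents, 1))  # about 3.9 hours per person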
There are no start-up costs related to recruitment.
The current budget to recruit states and districts for participation in the evaluation is $1,225,604. The total budget for the evaluation (base period activities only) is $6,552,061, including the $1,225,604 for recruitment activities. The estimated average annual cost of study recruiting activities over 3.58 years (that is, the 40-month duration of the evaluation’s base period) is $342,347.
This is a new collection. IES research and development funds are being used to study the impact on student outcomes of the RTT and SIG programs. In order to conduct the study, states, districts, and schools need to be recruited to participate. This OMB package describes and accounts for the various activities that will take place during the recruitment phase of the study.
There are no tabulation or publication plans based on this package because no data are being collected.
a. Tabulation Plans
There are no tabulation plans based on this package because no data are being collected.
b. Publication Plans
There are no publication plans based on this package because no data are being collected.
The recruitment materials will display the OMB expiration date.
No exceptions are being sought.
1 ED recently announced that the nine States that were closest to winning RTT Phase 2 grants are eligible to compete for $200 million in additional funds. To compete, States will propose specific parts of their Phase 2 plans that they would implement with the new funds. While the new funding might have implications for interpretation and analysis, we do not currently see a need to change the study’s design or sampling plans. When we know which states win these additional funds and exactly what they intend to use them for, we will reassess our design and analysis plans.
2 Estimating RDD impacts requires that we observe a discontinuous change in the proportion of schools receiving RTT/SIG at a cutoff value on a continuous variable. If we do not observe such a breakpoint, then an RDD analysis will be infeasible.
3 In each state, Tier I includes any school among the lowest-achieving 5 percent of (or five) Title I schools in improvement, corrective action, or restructuring plus Title I high schools in improvement, corrective action, or restructuring that have a graduation rate lower than 60 percent. Tier II includes any secondary school that is eligible for, but does not receive, Title I funds that is below the 5 percent achievement cutoff or the 60 percent graduation rate for high schools. States choose which achievement measure to use when ranking schools to determine the 5 percent cutoff. Both tiers can include schools that fall under expanded eligibility rules established in early 2010, which also involve allocating funds using a cutoff value on a continuous variable.
4 We plan to estimate impacts separately by grade because the relationship between the assignment variable and outcomes could differ by grade, and accurate modeling of the relationship between the outcome and the assignment variable is an essential component of RDD analysis.
5 The NAEP tests in reading and mathematics are administered every other year (in odd years). Hence, state-level NAEP results will only be available for the 2010-2011 and 2012-2013 school years.
6 These 240 districts include the 120 in the evaluation’s STM sample and another 120 districts in the evaluation’s RTT sample.