Comprehensive School Reform under the No Child Left Behind Act of 2001, P.L. 107‑110. This request is for OMB approval of a revised data collection associated with the Longitudinal Assessment of the Comprehensive School Reform Program Implementation and Outcomes (LACIO). Sec. 1606 of the Elementary and Secondary Education Act (ESEA), as reauthorized by the No Child Left Behind Act (P.L. 107‑110), mandates activities to be conducted by K‑12 schools across the country under “comprehensive school reform,” and Sec. 1607 mandates the National Evaluation.1
Along with the new authorization, Congress appropriated $235 million for the Comprehensive School Reform program in fiscal year 2002.2 This level of funding supported reform activities at an estimated 2,000 schools. The vast majority of these schools were Title I schools “in need of substantially improving” their student achievement levels.3
The federal funds were distributed on a formula basis to the states, which in turn made grants to districts to support the schools.
Each school received an average of over $70,000 per year for three years. The modest award amounts and limited award durations were intended to signal a catalytic role for the federal funds (helping a school to initiate or advance its reform efforts) rather than a long‑term subsidy.
As of September 2005, the schools in the cohort studied through LACIO had completed the original funding cycle. Consequently, this request for revision of OMB approval reflects a changed focus for the LACIO, which now addresses two concerns: (1) the extent to which the federal funds actually played the anticipated catalytic role and schools continued their reforms after CSR funding ended; and (2) the lessons from CSR that can be applied to school actions to successfully stimulate improved student achievement. This revision to the existing LACIO evaluation is for a new evaluation that builds on the survey of a nationally representative sample of CSR and comparison schools approved by OMB (approval number 1875-0222). It draws on the findings from that study and poses additional questions that deepen knowledge of the role of comprehensive reform in changing outcomes in high-poverty, low-achieving schools.
The Longitudinal Assessment of the Comprehensive School Reform Program Implementation and Outcomes (LACIO) has involved three types of data collection and analysis:
Quantitative analyses of the relationship between scientifically based model adoption and school-level achievement using all 2002 CSR grantees;
Quantitative analysis of the longitudinal relationship between CSR awards and school-level achievement in the universe of 2002 Title I CSR grantees;
Quantitative and qualitative analyses of a survey of 500 CSR schools that received awards in 2002 and a matched set of 500 non-CSR schools, as well as case study data on 15 CSR and 15 matched non-CSR schools.
This extension will increase the number of sites for case studies and will place less emphasis on discrete programs and more on the role of CSR approaches in improving school performance. It will gather new data to determine what strategies and practices are implemented in improved schools. The contractor will collect the new data through document review and site visits.
The contractor will carry out the work described in this modification, culminating in a final report due October 27, 2008. The work to be completed is described below, along with a timeline for deliverables and other activities to support the data collection, analysis and reporting.
The No Child Left Behind legislation stipulates two broad goals for the LACIO:
1) To evaluate the implementation and results achieved by schools after three years of implementing comprehensive school reforms; and
2) To assess the effectiveness of comprehensive school reform in schools with diverse characteristics.
The original U.S. Department of Education (ED) solicitation for the National Evaluation articulated these two goals in terms of three specific evaluation questions:
1) To what extent have CSR schools made progress on state assessments in comparison to gains for schools in their state with similar characteristics?
2) How effective are various school reform activities, especially in diverse settings, and to what extent can school progress be linked to CSR reforms?
3) How have district policies and state policies affected CSR program implementation and comprehensive school reform?
In 2005, ED revised the LACIO to collect follow-up information on the extent to which reforms were sustained and to identify approaches for improving Title I schoolwide programs.
In 2006, ED determined that the LACIO could conduct additional analyses and on-site data collection about schools that changed from low-performing to meeting NCLB goals. Under the current contract modification, the contractor will examine the policies and processes implemented, actions taken, role of the 11 CSR components, and characteristics of lead actors in turnaround schools. The study will compare schools that are more or less successful in improving student achievement, focusing on the following new questions:
1) What is the role of the 11 CSR components in successful schools that turned around academically and narrowed achievement gaps between subgroups?
2) What combinations of strategies and promising practices did successful schools use?
3) Who or what was responsible for implementing these strategies?
4) How have successful schools overcome challenges in implementing turnaround strategies?
The contractor will gather data from a variety of sources to answer each question during site visits to successful turnaround schools and a small group of comparison schools.
The purpose of the new data collection is to identify promising practices in turnaround schools by documenting the role of CSR approaches and other strategies associated with improved student performance. Such schools demonstrate higher achievement levels and are meeting their Adequate Yearly Progress (AYP) targets.
The deliverables will include a series of ten site-specific reports that highlight and describe promising practices. The findings from these ten reports will, in turn, inform the final cross-site analysis and report, which will examine the extent to which the 11 CSR components and other features identified in the Turnaround Evidence Review and the contractor’s supplement are apparent in successfully reforming schools.
The contractor’s supplement, also referred to as the Addendum to the Turnaround Evidence Review, focused on staff capacity, development, and leadership; external assistance; and effective schools. Specifically, the literature referenced in the Addendum included the final report of the National Longitudinal Evaluation of Comprehensive School Reform (Aladjem et al., 2006), studies of the New American Schools (Berends et al., 2001 and 2002; Bodilly, 1996), as well as more general literature on comprehensive school reform and leadership (Desimone, 2000; Elmore, 2000; Hallinger, Bickman, & Davis, 1996).
Site visit protocols. The contractor developed a set of case study protocols to guide data collection on site. The instruments drew on existing research on CSR, LACIO findings and data collection experience, AIR’s data collection for the Study of State Implementation and Accountability and Teacher Quality under NCLB (SSI-NCLB), as well as the Turnaround Evidence Review.
The case study protocols will involve interviews and/or focus groups with district officials, principals, teachers, parents, community members, instruction/curriculum specialists, School Improvement Plan (SIP)/leadership teams, department chairs, assistant principals, and guidance counselors.
A Technical Working Group (TWG) reviewed draft data collection instruments for clarity and precision and gave feedback on the site selection criteria and process. The contractor pilot tested all data collection instruments. During these tests, which were administered to no more than nine respondents, the team assessed item comprehension, the effectiveness of the proposed strategies for gaining cooperation, and the length of time respondents needed to answer the questions in the instruments. Such information is critical for estimating the burden associated with each instrument, which must be disclosed to respondents before any federally sponsored research questionnaire is administered to more than nine respondents.
The contractor delivered an initial set of protocols to ED in January 2006. Following ED review, the contractor pilot tested the revised protocols and, in February 2007, submitted a report to ED detailing the pilot test results and suggested changes to the protocols. The contractor submitted a final set of protocols to ED in April 2007 and, under Budget Service direction, revised the data collection protocols in May 2007.
Identify case study sites. The goal is to select schools that experienced a significant increase in student achievement and sustained it over time. The sites selected for the case studies of schools that show marked improvement in student achievement outcomes will be, to the extent feasible, sites that received CSR funds in any year of the program’s history. The contractor will give preference to schools showing rapid (within one to two years) or steady (over three or more years) gains that were then sustained for three to four years. Schools with sustained academic growth over shorter periods (one to two years) will also be considered.
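The trajectory screen described above can be sketched as a simple rule over a school's yearly (standardized) scores. This is a hypothetical illustration only: the threshold, function name, and labels are assumptions, not study specifications.

```python
# Hypothetical sketch of the gain-trajectory screen; the minimum-gain
# threshold and labels are illustrative assumptions, not study criteria.
def classify_gain(scores, min_gain=0.25):
    """Label a school's score trajectory (one score per year, oldest first).

    A 'rapid' gain arrives within two years of the baseline year; a
    'steady' gain takes three or more; either must then hold, with no
    later score falling back below baseline plus the minimum gain.
    """
    baseline = scores[0]
    # First year in which the score exceeds baseline by the minimum gain
    hit = next((i for i, s in enumerate(scores) if s >= baseline + min_gain), None)
    if hit is None:
        return "no gain"
    if not all(s >= baseline + min_gain for s in scores[hit:]):
        return "not sustained"
    return "rapid" if hit <= 2 else "steady"

print(classify_gain([0.0, 0.1, 0.4, 0.45, 0.5]))   # rapid
print(classify_gain([0.0, 0.1, 0.2, 0.3, 0.35]))   # steady
print(classify_gain([0.0, 0.5, 0.1]))              # not sustained
```

In practice the contractor would apply such a rule to the within-year standardized scores described later in this statement, rather than to raw scale scores.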
To select the sites, the contractor will use several existing databases. The contractor merged data from the National Longitudinal School-Level State Assessment Score Database (NLSLSASD) and the Common Core of Data (CCD) to obtain annual school-level student assessment scores from 2000 through 2005, as well as school demographic information. These databases have also been linked to the Southwest Educational Development Laboratory (SEDL) CSR awards database to ensure the sample includes schools that received CSR awards. Once schools are selected, the contractor will verify their Adequate Yearly Progress (AYP) status with the National AYP and Identification database (NAYPI).
More detail on the selection process is presented in section one of Part B: Respondent Universe and Sampling Methods.
Additional Details on Data Sources. The NAYPI includes 2004 and 2005 AYP data from all 50 states and the District of Columbia for 88,160 schools. Some data elements, such as the applicability of subgroups, were not available for all states. The database also does not include 2,529 schools for which states reported AYP as “not determined,” and about 4,000 schools that were not included in state-provided data files. The NAYPI is a source of AYP status rather than performance data.
The number of states included in the NLSLSASD varies by school level and academic year (Exhibit 1). The largest number of states is represented in the 2001 through 2005 school years. Although the original suggestion was to examine gains from 2001 through 2005, we will also include data from 2000 in the site selection. Including 1999 data would limit the number of states represented and would require separate selection criteria; therefore, the contractor will not include 1999 achievement data. The high school sample will be reduced to the 22 states with consistent achievement data from 2000 to 2005.
Exhibit 1
Number of States with Valid Standardized Achievement Scores, by School Level and Academic Year
| Year | Elementary Math | Elementary Reading | Middle Math | Middle Reading | High Math | High Reading |
|------|-----------------|--------------------|-------------|----------------|-----------|--------------|
| 1999 | 26 | 26 | 26 | 26 | 15 | 13 |
| 2000 | 38 | 39 | 39 | 39 | 22 | 21 |
| 2001 | 43 | 43 | 43 | 43 | 26 | 25 |
| 2002 | 45 | 45 | 45 | 45 | 26 | 26 |
| 2003 | 49 | 49 | 49 | 49 | 26 | 26 |
| 2004 | 48 | 48 | 48 | 48 | 28 | 29 |
| 2005 | 45 | 46 | 45 | 45 | 25 | 26 |
Disaggregated data are available in the NLSLSASD for the majority of states from 2003 forward. In states with well-established assessment systems, disaggregated data are also available for years prior to 2003, but the data are less consistent than data provided after 2003. To the extent feasible, we will use these data as selection criteria.
Conduct site visits. The contractor will conduct site visits at a sample of schools with evidence of turnaround. The case studies will determine the role of CSR approaches, as described in the 11 components, and of other strategies and practices implemented in successfully turned-around schools. Evaluators will pay close attention to the combinations of strategies and promising practices successful schools used to turn around academically and narrow achievement gaps between subgroups. The site visits will also examine how districts support turnaround efforts at each case study school.
The case study sample will include 20 successful turnaround schools, with as many as feasible (at least 10) being schools that received CSR awards, and 10 additional schools that have not been as successful. A two-person team of evaluators will visit each site for three to four days. After the initial visits, the contractor and ED will identify 10 compelling sites for a second visit. The contractor believes some turnaround sites’ success may reflect more complex phenomena than others, and that a follow-up visit will be necessary to collect all the data needed to describe those schools’ stories. The second visit will involve a single staff member for a shorter period; the intent of this additional time on site is to support in-depth descriptions of the turnaround phenomena.
The contractor developed a protocol for the site visits, which includes issues to be addressed during the visit and documents to collect. In addition, site visitors will meet prior to any visits to ensure there is a common understanding of the framework, questions, and issues to be addressed during the visits.
LACIO and other national studies of CSR have found limited relationships between CSR and improvement in student achievement. The studies found no clear differences between CSR and non-CSR schools in either achievement gains or the extent of implementation of the 11 components. Consequently, these cases are an opportunity to generate hypotheses about the reform features, which may not be fully captured by the 11 CSR components, that are associated with success. The sample is purposive, focusing on schools that have successfully turned around such that their students’ achievement increased markedly. The case studies will examine practices associated with increased student achievement. The framework for examining these features will include the 11 CSR components, augmented by research, particularly the Turnaround Evidence Review, which has identified additional features that may account for changes in schools that result in positive outcomes. Some of these are:
The extent to which the school has autonomy in allocating resources and selecting staff
The extent to which curriculum and instructional practices are aligned within the school and with district and state policies
The extent to which school and district leaders monitor practices within the school
The extent to which the community is engaged in school reform
The actions and capabilities of the school leader
To understand how districts support improvement efforts at each case study school, evaluators will arrange interviews with district staff who were involved in planning for turnaround or are currently involved with managing turnaround schools (including those responsible for managing contracts with EMOs [Educational Management Organizations] and/or charter schools). The contractor will also collect archival documents from the district and school relevant to the turnaround process, including but not limited to improvement plans, assessment descriptions, and staff resumes. Whenever possible, such documents will be secured prior to the site visit so that evaluators can begin the data analysis and focus the data collection while on-site.
The contractor will conduct the site visits after OMB approval, between September 25, 2007 and March 17, 2008. Findings from the case studies will be delivered to ED in a final report, which will identify promising practices in case study schools and include short vignettes describing each practice. Prior to the report, the contractor will provide ED a report outline for approval. The contractor will also submit 10 site-specific reports on the compelling turnaround experiences.
The contractor will submit to ED a draft outline for the 10 site-specific reports on February 1, 2008, the first drafts on April 1, 2008, and the final reports on August 15, 2008. For the cross-site report, the contractor will submit to ED a summary of findings and arrange to brief ED on key findings no later than May 15, 2008. The contractor will submit a draft report to the COR by June 23, 2008. After receiving comments from ED, the contractor will submit the first revised report by August 4, 2008, the second revised report by September 1, 2008, the third revised report by September 29, 2008, and the final report by October 27, 2008.
ED and other interested parties will use the data from the LACIO to assess the sustainability of student achievement outcomes from the comprehensive school reform provisions as stated in Sec. 1606 of the ESEA. It will also provide information that ED can use to strengthen schoolwide Title I programs by pointing to important processes and lessons learned from efforts to turn around failing schools. This information is intended to be useful to state and local school systems, including individual schools, in their efforts to achieve Adequate Yearly Progress (AYP) under the No Child Left Behind Act.
The evaluation team will use technology in a variety of ways, especially to reduce burden on schools. First, the contractor has obtained the student achievement data for the sample from ED databases, which contain outcome measures for schools in almost every state. Access to these secondary data allows researchers to reserve school- and district-level data collection for only the most necessary data elements.
Communication between the evaluation team and selected school and district officials will occur through email, fax, and conference calls to take advantage of information technology and reduce burdens associated with paperwork. The communication will cover initial inquiries, the exchange of preliminary information, and scheduling and planning of site visits.
Throughout the evaluation, the contractor will provide a toll-free number and email addresses to respondents allowing them to contact the evaluation team with any questions or requests for assistance.
The LACIO design is built upon the survey and research questions posed by previous research efforts, including the National Longitudinal Survey of Schools (NLSS), the Field-Focused Study, the National Study of Title I Schools (NSTS), and the Longitudinal Evaluation of the Effectiveness of School Interventions (LEESI). However, the LACIO is unique in that it combines elements of each study into a comprehensive data collection effort that allows for comparisons of successful and unsuccessful efforts to improve low-performing schools. LACIO researchers will also describe how the implementation of such efforts is linked to education reform and student achievement.
The LACIO will collect data from few small entities, as most of the data sources will be public school organizations (districts and schools). The few small entities are likely to be the external consultants and community members who are helping a school implement its reform. Only minimal information will be needed from these small entities; therefore, no significant impact on these data sources is expected.
The revision of LACIO will provide ED with a complete picture of the implementation and results achieved by low-performing schools after several years of implementing turnaround strategies and the extent to which both reforms and student achievement gains were achieved and sustained in failing schools. In addition, the evaluation will outline the effectiveness of school reform in schools with diverse characteristics. Such answers are necessary to understand how federal and state funds can better serve as a stimulus for school reform.
The data collection focusing on turnaround schools will allow researchers and policy makers to better understand what appears to turn around schools that were failing the students they serve. The LACIO's combination of student achievement data and intensive field-based study yields a design that builds on the strengths of both methods.
This information collection fully complies with 5 CFR 1320.5(d)(2).
The 60-day notice was published in the Federal Register on May 7, 2007.
To date, no public comments have been received.
The evaluation team will seek the expertise of persons outside the agency through the creation of a Technical Working Group (TWG). The TWG will advise the evaluation team on issues of school reform from the perspective of various stakeholders as well as methodological issues in evaluating CSR. We expect that the TWG will meet several times during the course of the study, with such meetings being tied to important events or tasks within the evaluation. Each TWG member will receive an honorarium of $750 per day. The time commitment is relatively small, but the TWG will play an important role in providing insight and guidance to support the evaluation. The TWG members are listed in Exhibit 2.
Exhibit 2
Members of the LACIO Technical Working Group
| Member | Affiliation | Areas of Expertise |
|--------|-------------|--------------------|
| Carolyn Temple Adger | Center for Applied Linguistics | English Language Learners |
| Geoffrey Borman | University of Wisconsin-Madison | Quantitative methods; comprehensive school reform |
| H.J. Green | Executive Director | District policy; school improvement |
| Bryan Hassel | Center for Improvement and Innovation | Research on organizational turnaround |
| Elsie Leak | North Carolina Department of Public Instruction | State policy; school improvement |
| Valerie Lee | University of Michigan | Quantitative methods; school reform |
| Paul Ruiz | Education Trust | State and national policy; school improvement; student assessment |
| Jean Rutherford | National Center for Educational Accountability | Accountability; assessment |
| Sam Stringfield | University of Louisville | School improvement; comprehensive school reform |
| Malik Stewart | Delaware Department of Education | State policy; school improvement |
| Ken Wong | Brown University | Research on organizational turnaround |
All TWG members offer project-specific expertise and experience. The contractor and ED selected these TWG members for their breadth of expertise across multiple disciplines, including methodology, statistical analysis, education context and special issues, and knowledge of CSR. Several of the TWG members have expertise in CSR, including Elsie Leak, Valerie Lee, Geoffrey Borman, and Sam Stringfield. Elsie Leak provides a first-hand perspective on school reform at the state level as the Associate State Superintendent for Curriculum and School Reform Services in the North Carolina Department of Public Instruction. Valerie Lee has conducted extensive research on school reform, including a current study of high school curriculum reform in Chicago; in 2007, she published a book with Douglas D. Ready entitled Schools Within Schools: Possibilities and Pitfalls of High School Reform. Geoffrey Borman is also an active researcher on comprehensive school reform issues; most significantly, in 2003 he conducted a meta-analysis of the impact of comprehensive school reform on student achievement entitled Comprehensive School Reform and Achievement: A Meta-Analysis. Finally, Sam Stringfield is an accomplished researcher in the field of comprehensive school reform. He is Co-Director of the Program on Systemic Supports for School Improvement at the Center for Research on the Education of Students Placed At Risk (CRESPAR) and a senior scientist at the Center for Social Organization of Schools at Johns Hopkins University. He authored Choosing Success, which provides guidance to schools considering a CSR program based on specific program objectives and research evidence.
The enormous pressures on school systems, in part due to increased assessment and accountability requirements, lead schools to assign a lower priority to participating in data collection efforts such as the LACIO. To underscore the importance of the evaluation for informing federal, state, and district policies and practices on turnaround and reform, schools participating in the LACIO will receive a special monetary payment. Past research shows such payments are a cost‑effective way of substantially increasing participation rates (e.g., Dillman, 1991).
Each school will receive an honorarium of $200 to be used for purposes such as the purchase of books for the school library. This amount was approved by OMB in the existing data collection.
The evaluators will carefully handle all data in a responsible manner so they are not seen by or released to anyone not working on the project, except as required by law. For the cross-site report, the evaluation team will ensure all data are reported in a summary fashion, so no specific individual or school may be identified. For the individual case-study reports, WestEd will not use school or individual respondent’s names. Finally, the evaluation team will maintain all data in secure and protected files that do not include personally identifying data.
The evaluators will not collect any information that would identify individual participants. Therefore, the evaluation team will not reference participants by name. The contractor will communicate an explicit statement regarding these processes to protect the data to any and all participants. Similarly, the student achievement data extracted from ED databases are aggregate school-level data and do not contain records of individual students.
The contractor will not ask questions that are of a sensitive nature.
A revision is being made to the existing collection to add a new evaluation with 1,220 burden hours. The existing data collection (with 10,774 burden hours) has been completed, with the exception of 10 burden hours that will be completed this fall. (We anticipate that as of December 2007, an 83-C can be completed to delete these 10 hours.) The 1,220 hours requested for the added evaluation, plus the 10 burden hours remaining under the existing collection, total the 1,230 hours being requested.
Exhibit 3
Hour Burden for Additional Data Collection
| Task | Number of Respondents | Number of Responses | Hour Burden |
|------|-----------------------|---------------------|-------------|
| Field-Based Study | 1,140 | 1,200 | 1,220 |
| Existing Study | 10 | 10 | 10 |
| TOTAL | 1,150 | 1,210 | 1,230 |
Thirty sites will be identified for the field-based component of the study utilizing the procedures previously outlined. The contractor will initiate cooperation through a telephone call explaining the study.
The targeted sample of schools selected for the field-based study consists of 20 turnaround schools and 10 schools that have not been successful based upon their student achievement levels. Before the site visit, the contractor will request documents related to school improvement from the principal (estimated at .25 hours). The contractor will conduct individual interviews with the principal who led the school during the turnaround period, the current principal (if different), the assistant principal, one or two district officials responsible for decisions and curriculum at the school, a school specialist or coach for mathematics and English/language arts (ELA), the math or ELA department chair (if applicable), and the guidance counselor. Exhibit 4 presents the burden associated with this data collection.
The evaluation team will also conduct focus group interviews with teachers, parents, community members, and the School Improvement Plan (SIP) or leadership team at each school site. Each focus group is expected to last one hour. Teacher focus groups will include four sets of interviews: two focus groups with experienced teachers (at the school for five years or more) and two with less experienced teachers (at the school less than five years). Three to four teachers will participate in each focus group. For the other focus groups (e.g., parents, community members, leadership team), the contractor will ask for three to four participants with a longstanding relationship with the school (five or more years).
All site visit interview protocols are presented in the Appendix with a crosswalk that demonstrates the link between protocol questions for each instrument and the 11 CSR components.
Exhibit 4
Estimated Burden for Participants in Field-Based Study for Additional Data Collection
| Individual Interview Respondent | Number of Respondents | Number of Times They Respond | Number of Responses | Time Estimate (hours) | Total Hours | Hourly Rate | Estimated Cost of Burden |
|---------------------------------|-----------------------|------------------------------|---------------------|-----------------------|-------------|-------------|--------------------------|
| Experienced Principal Interview | 45 | 1 | 45 | 1.25 | 56.25 | $36 | $2,025 |
| Current Principal/Assistant Interview1 | 35 | 1 | 35 | 1.25 | 43.75 | $36 | $1,575 |
| (follow-up visit) | 10 | 2 | 20 | 1 | 20 | $36 | $720 |
| Experienced Teacher Focus Group2 | 220 | 1 | 220 | 1 | 220 | $30 | $6,600 |
| (follow-up visit) | 20 | 2 | 40 | 1 | 40 | $30 | $1,200 |
| New Teacher Focus Group3 | 220 | 1 | 220 | 1 | 220 | $30 | $6,600 |
| (follow-up visit) | 20 | 2 | 40 | 1 | 40 | $30 | $1,200 |
| Curriculum/Instructional Specialist Interview Protocol | 60 | 1 | 60 | 1 | 60 | $30 | $1,800 |
| English/Language Arts/Mathematics Department Chair Interview | 30 | 1 | 30 | 1 | 30 | $30 | $900 |
| Parent Focus Group | 120 | 1 | 120 | 1 | 120 | $0 | $0 |
| Community Member Focus Group | 120 | 1 | 120 | 1 | 120 | $0 | $0 |
| District Official Interview4 | 25 | 1 | 25 | 1 | 25 | $52 | $1,300 |
| (follow-up visit) | 5 | 2 | 10 | 1 | 10 | $52 | $520 |
| District Curriculum Specialist | 25 | 1 | 25 | 1 | 25 | $52 | $1,300 |
| (follow-up visit) | 5 | 2 | 10 | 1 | 10 | $52 | $520 |
| School Improvement Plan (SIP)/Leadership Team Focus Group | 120 | 1 | 120 | 1 | 120 | $30 | $3,600 |
| Guidance Counselor Interview Protocol | 30 | 1 | 30 | 1 | 30 | $33 | $990 |
| Document Review Checklist5 | 30 | 1 | 30 | 1 | 30 | $0 | $0 |
| Existing Study | 10 | 1 | 10 | 1 | 10 | $36 | $360 |
| TOTAL | 1,150 | 23 | 1,210 | - | 1,230 | - | $31,210 |
1 10 of the 45 Current Principal/Assistant Principal Interviews will be conducted during the follow-up site visit.
2 20 of the 240 Experienced Teacher Focus Group Interviews will be conducted during the follow-up site visit.
3 20 of the 240 New Teacher Focus Group Interviews will be conducted during the follow-up site visit.
4 10 of the 60 District Official Interviews will be conducted during the follow-up site visit.
5 Document Review Checklist is for site visitor use only. The burden estimate of 1 hour reflects the time a school staff member will spend collecting the documents.
There are no annual costs associated with recordkeeping or data reporting other than that reported in section A12. All other data will be collected directly by the contractor from existing ED data sources.
The total cost for this part of the evaluation is $911,627. Since a three-year clearance has been requested for purposes of ROCIS, we are showing an annual cost of $303,875.
There is a program change of –9,544 burden hours. All but 10 hours of the burden under the current data collection are complete. The revision of the evaluation adds 1,220 burden hours to the 10 remaining hours from the current collection. When this total of 1,230 burden hours is subtracted from the current OMB inventory of 10,774 burden hours, the result is a reduction of 9,544 burden hours.
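The burden-hour bookkeeping above reduces to simple arithmetic, sketched here as a minimal check (the variable names are illustrative only):

```python
# A minimal check of the burden-hour change described above.
current_inventory = 10_774   # hours approved under the existing collection
remaining_existing = 10      # hours still to be completed this fall
new_evaluation = 1_220       # hours requested for the added evaluation

requested_total = remaining_existing + new_evaluation
program_change = requested_total - current_inventory
print(requested_total, program_change)  # 1230 -9544
```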
The study will produce 10 site-specific reports and one cross-site report describing promising turnaround practices and policies. The timeline for the publication of these findings is outlined in Exhibit 6. The analyses for each data collection method are detailed below.
Exhibit 6
Schedule for Dissemination of Study Results
Activity | Due Date
10 Site-Specific Reports |
Draft outline | February 1, 2008
Revised outline | March 3, 2008
Draft reports | April 1, 2008
First revision of reports | June 2, 2008
Second revision of reports | July 1, 2008
Final reports | August 15, 2008
Cross-site Report |
Draft outline | April 15, 2008
Revised outline and summary of initial findings | May 15, 2008
Draft of report | June 23, 2008
First revised report | August 4, 2008
Second revised report | September 1, 2008
Third revised report | September 29, 2008
Final report | October 27, 2008
In this case, student achievement data are assessed prior to the selection of the turnaround schools. Schools with the highest levels of performance will be selected for study and compared with non-improving schools to identify the factors associated with improved student achievement, as well as those that did little to facilitate turnaround efforts. School-level performance data will be gathered from the most recent ED databases, which contain achievement data from nearly every school in the country.
Because states use different tests, comparisons of student achievement are problematic. The variability in norms across test publishers makes comparisons of absolute performance across states difficult, and neither the content nor the criteria for determining proficiency are directly comparable from state to state. In addition, standards, assessments, and proficiency criteria often change, making scores within states difficult to compare over time. Therefore, to select the appropriate schools while accounting for these inconsistencies, the contractor will rely on standardized school-level achievement scores computed from the NLSLSASD database. The contractor will standardize average scale scores and percent-proficient measures by calculating z-scores within each year. In those instances in which percentile ranks are the only available achievement measure, the percentile ranks will be converted into normal curve equivalents (NCEs) and subsequently standardized.
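The two standardization steps just described can be sketched in a short example (the data, grouping logic, and function names here are hypothetical illustrations; the actual NLSLSASD processing is more involved):

```python
from statistics import NormalDist, mean, stdev

# Hypothetical school-level records: (year, school_id, percent proficient).
records = [
    (2004, "A", 35.0), (2004, "B", 52.0), (2004, "C", 47.0), (2004, "D", 61.0),
    (2005, "A", 40.0), (2005, "B", 55.0), (2005, "C", 49.0), (2005, "D", 66.0),
]

def zscores_within_year(records):
    """Standardize an achievement measure within each year, so scores
    remain comparable even when tests or cut scores change across years."""
    by_year = {}
    for year, school, score in records:
        by_year.setdefault(year, []).append((school, score))
    z = {}
    for year, pairs in by_year.items():
        scores = [s for _, s in pairs]
        m, sd = mean(scores), stdev(scores)  # mean and SD for that year only
        for school, score in pairs:
            z[(year, school)] = (score - m) / sd
    return z

def percentile_to_nce(pr):
    """Convert a percentile rank (1-99) to a normal curve equivalent:
    NCE = 50 + 21.06 * z, where z is the standard normal quantile."""
    return 50.0 + 21.06 * NormalDist().inv_cdf(pr / 100.0)
```

Converting percentile ranks to NCEs before standardizing matters because NCEs, unlike percentile ranks, form an equal-interval scale; by construction, `percentile_to_nce(50)` returns 50.0, and ranks of 1 and 99 map to NCEs of (approximately) 1 and 99.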
The planned field-based study analysis will address both within-school and across-school issues. ED will prepare reports in a common format to ensure that all relevant elements of the conceptual framework focusing on CSR approaches, as augmented by concepts in the Turnaround Evidence Review and its supplement, are captured.
The 10 site-specific reports will detail the individual school’s experience to illustrate the policy and processes within the complex school context. These field-based studies will integrate information from the documents reviewed, interviews, and focus groups held on site. As such, they can serve as stand-alone documents as well as the basis for the cross-site analysis.
The contractor is not requesting an exemption from displaying the expiration date.
This collection of information involves no exceptions to the Certification for Paperwork Reduction Act Submissions.
Aladjem, D. K., LeFloch, K. C., Zhang, Y., Kurki, A., Boyle, A., et al. (2006). Models matter: The final report of the National Longitudinal Evaluation of Comprehensive School Reform. Washington, DC: American Institutes for Research.
Berends, M., Chun, J., Schuyler, G., Stockley, S., & Briggs, R. J. (2002). Challenges of conflicting school reforms: Effects of New American Schools in a high-poverty district. Santa Monica, CA: RAND.
Berends, M., Kirby, S. N., Naftel, S., & McKelvey, C. (2001). Implementation and performance in New American Schools: Three years into scale-up. Santa Monica, CA: RAND.
Bickman, L. (1987). The functions of program theory. In L. Bickman (Ed.), New directions for program evaluation, No. 33 (pp. 5-17). San Francisco: Jossey-Bass.
Bodilly, S. J. (1996). Lessons from New American Schools Development Corporation’s demonstration phase. Santa Monica, CA: RAND.
Center for Innovation and Improvement (CII). (2007). Turnaround evidence review. Report submitted to the U.S. Department of Education on February 28, 2007.
Desimone, L. (2000). Making comprehensive school reform work. (Urban Diversity Series No. 112). East Lansing, MI: Clearinghouse on Urban Education, Institute for Urban and Minority Education. (ERIC Document Reproduction Service No. 1637)
Dillman, D. A. (1991). The design and administration of mail surveys. Annual Review of Sociology, 17, 225-249.
Elmore, R.F. (2000). Building a new structure for school leadership. Washington, DC: The Albert Shanker Institute.
Hallinger, P., Bickman, L., & Davis, K. (1996). School context, principal leadership, and student reading achievement. The Elementary School Journal, 96(5), 527-549.
Experienced Principal Interview
Current Principal/Assistant Principal Interview
Experienced Teacher Focus Group
New Teacher Focus Group
Curriculum/Instructional Specialist Interview Protocol
English/Language Arts/Mathematics Department Chair Interview
Parent Focus Group
Community Member Focus Group
District Official Interview
District Curriculum Specialist
School Improvement Plan (SIP)/Leadership Team Focus Group
Guidance Counselor Interview Protocol
Document Review Checklist
Exhibit 7
Crosswalk between Protocol Questions and 11 CSR Components
11 CSR Components | Experienced Principal Interview | Current Principal/Assistant Principal Interview | Experienced Teacher Focus Group | New Teacher Focus Group | Curriculum/Instructional Specialist Interview | ELA/Mathematics Department Chair Interview
Proven methods | 2, 6, 14 | 2, 11 | 2 | 2, 3 | 2, 6, 14 | 2, 4, 9
Comprehensive design | 2, 14 | 2, 11 | 2 | 2 | 2, 4, 14 | 2, 9
Professional development | 2, 10 | 9, 10 | 8, 12 | 2, 7 | 9, 10 | 6, 7
Measurable goals | 5, 11 | 5, 7 | 9 | 9 | 3, 5, 11 | 8
Support from staff | 3, 8 | 4 | 7 | 3 | 3, 8 | 5
Support for staff | 9, 10, 12 | 9, 10 | 8, 11, 13, 14 | 6, 10-13 | 9, 10, 12 | 6, 7
Parent and community involvement | 13 | 2, 4, 11 | 16 | 15 | 3, 13 |
External assistance | 9 | 9 | 8 | 6 | 8, 9 | 6
Evaluation | 5, 11 | 4, 7 | 5 | 9 | 6, 11 | 8
Coordination of resources | 7 | 2, 11 | 12 | | 7 |
Scientifically based research | 2, 14 | 2, 11 | 2 | 2, 3 | 2, 14 | 2, 4, 9

11 CSR Components | Parent Focus Group | Community Member Focus Group | District Official Interview | District Curriculum Specialist | SIP/Leadership Team Focus Group | Guidance Counselor Interview
Proven methods | | | 2 | 1, 7 | 2 | 3
Comprehensive design | | | 2 | 1, 7 | 2 | 3
Professional development | | | 2, 6 | 1, 7, 8 | 6, 7 | 5
Measurable goals | | | 10 | 9 | 5 | 9
Support from staff | | | 3 | | 3, 4 | 4
Support for staff | | | 6 | 5, 10 | 7 | 5
Parent and community involvement | 1, 4-7 | 1-3, 6-7 | 3 | | | 8
External assistance | | | 2, 6 | 1, 5, 8 | 7 | 5
Evaluation | | | 8, 10 | 9 | 5 |
Coordination of resources | | | 7 | 5 | 6 |
Scientifically based research | | | 2 | 1, 7 | 2 | 3
1 The new legislation continues comprehensive school reform initiatives started in FY1998, which were originally established through appropriations rather than authorizing legislation (the Appropriations Act for the U.S. Department of Education, P.L. 105‑78).
2 Total support for comprehensive school reform activities consisted of this appropriation plus another $75 million from an account under the Fund for the Improvement of Education (FIE). Thus, the total amount of funds available to support the activities was actually $310 million.
3 The earlier appropriations (covering both comprehensive school reform and FIE allocations; see previous footnote) and the number of schools supported are as follows: FY1998: $145 million (about 1,800 schools); FY1999: $145 million (to continue 1,800 schools along with about 441 new awards); FY2000: $220 million (to continue about 2,241 schools and also to fund about 568 new awards); FY2001: $260 million (to continue all the previous schools and fund about 139 new awards); and FY2002: $310 million (to continue all the previous schools and fund an estimated 2,000 new awards).