Supporting Statement for Paperwork Reduction Act Submissions for the Evaluation of 21st Century Community Learning Centers State Competitions
Submitted by:
American Institutes for Research
1000 Thomas Jefferson Street, NW
Washington, DC 20007
(202) 403-5000
Policy Studies Associates, Inc.
1718 Connecticut Avenue, NW
Suite 400
Washington, DC 20009
(202) 939-9780
February 6, 2012
Transmitted via email
Supporting Statement for Paperwork Reduction Act Submissions for the Evaluation of 21st Century Community Learning Centers State Competitions
Part A. Justification
A.1 Circumstance Requiring Collection of Information
The 21st Century Community Learning Centers (CCLC) program was introduced in 1997 to provide expanded learning opportunities and anti-drug and anti-violence programming for rural and inner-city children and youth. Authorized under the Elementary and Secondary Education Act (ESEA), the program was administered by the U.S. Department of Education (ED) and distributed awards totaling $40 million in 1998. Originally, grants were awarded only to elementary and secondary schools, school consortia, or local education agencies (LEAs) applying on their behalf. The 2001 amendments to ESEA, which placed the program under Title IV, Part B, made significant changes. First, administration of the grants was transferred from ED to state education agencies (SEAs). In addition, the law significantly expanded and strengthened opportunities for out-of-school time (OST) programming for at-risk children and youth, and the pool of eligible grantees was widened to include community-based organizations, faith-based organizations, and other public or private organizations working with a local school or district. Today, 21st CCLCs serve approximately 1.5 million children and youth through 4,163 grants to 10,366 school-based and community-based centers. Approximately nine in ten 21st CCLCs are located in schools, and approximately half primarily serve elementary-grade students. About two-thirds of 21st CCLCs serve elementary and/or middle school youth; the remaining centers serve a combination of middle school and high school youth, with 15 percent serving high school youth exclusively.
SEAs bear major responsibility for administering the 21st CCLC program and have wide latitude in the design and execution of their states’ programs. However, little is known about states’ capacity to conduct competitions for, award, monitor, and support the local grants they fund, particularly in light of SEA funding and staffing constraints. Indeed, SEAs’ diverse responsibilities under the 21st CCLC program (conducting grant competitions that set priorities for the type and amount of 21st CCLC work undertaken in support of the academic needs of educationally disadvantaged children and youth, controlling the distribution of grant awards within the state, monitoring the progress of grantees, and supporting center work) attest to the importance of each SEA’s administrative effectiveness in helping grantees address the program’s priorities. Assessing states’ implementation of federal requirements for the 21st CCLC program, including their successes and challenges, will give federal officials insight into how to improve federal support for SEA administration of the 21st CCLC program and, more generally, of other ESEA programs.
ED’s Policy and Program Studies Service (PPSS) in the Office of Planning, Evaluation and Policy Development (OPEPD) is sponsoring a study intended to provide ED with information about SEA implementation of the 21st CCLC program and the administrative conditions and capacities that may challenge or enhance implementation and improve program outcomes, and to offer guidance on program approaches and strategies that create the conditions for success. Improving state capacity is a priority area identified by the administration. The study is descriptive in nature and consists of interviews with state and 21st CCLC personnel. It is not a program evaluation and does not purport to assess program outcomes. Data will be collected from a purposive sample in order to capture relevant program practices; sites are therefore not representative of all states. The sample is drawn from a distribution of candidate states that differ with respect to each of the following: the type of priorities set for grant awards; the extent to which regular grant competitions are held; the extent to which new programs/grantees are funded with each competition; and the selectivity of the grant competitions. All participation in the data collection effort will be voluntary.
Through case studies of nine states implementing the 21st CCLC program, the study will examine and assess the capacity of states to administer 21st CCLC grant competitions and identify lessons learned from SEA experiences in order to (a) inform state administration of this and other federal programs; and (b) provide federal officials with the information needed to craft guidance and technical assistance to states in this crucial area of state responsibility. The evaluation questions that guide the study include the following:
How do states conduct their 21st CCLC competitions? What systems are used to inform the grant competition process? How might states improve their grant competition processes? Do states have sufficient analytic capacity to analyze and synthesize data to inform improvements in programs funded through 21st CCLC grants?
What are the key factors or conditions related to state capacity (e.g., structural, human, organizational, systemic) that are needed to carry out the 21st CCLC program and other state-administered discretionary programs? What lessons can be applied to state competitions for other federal education programs?
The research team, made up of staff of Policy Studies Associates (PSA) and American Institutes for Research (AIR), will develop and execute plans for the nine case studies, which will include (a) interviews with leaders and staff responsible for administering the 21st CCLC program and (b) review of key documents and artifacts from each of the study sites.
Authorization to conduct this study is provided by the Elementary and Secondary Education Act (ESEA), as reauthorized by the No Child Left Behind Act (NCLB) (Public Law 107-110), Section 1501.
A.2 Indicate How, by Whom, and for What Purpose the Information Is to be Used
The study will produce a report that presents findings from the nine case studies. The target audience for the report will be federal, state, and local program administrators who are implementing the 21st CCLC program. The report will be disseminated on the Web. The research team will also provide briefings for policy audiences. The study results are intended to inform ED about the administrative conditions that may enhance program implementation and improve program outcomes. In addition, the study is intended to provide federal officials insights into how to improve federal support for SEA administration of the 21st CCLC program and other ESEA discretionary grant programs. Finally, the case studies are intended to provide more guidance to state and local directors of 21st CCLC programs on approaches and strategies that create the conditions for success in state administration.
A.3 Use of Information Technology to Reduce Burden
The research team will conduct extensive online searches of the websites of SEAs (including portions of the websites that may be dedicated to 21st CCLC activities and other relevant policy and program areas), LEAs, and relevant community-based learning centers prior to each site visit. We anticipate that these searches will yield documents pertaining to the history, structure, and operation of the 21st CCLC program in each of the sites. We also anticipate that program materials and artifacts will be identified as the research team works with sites to plan the data collection visits. Thorough analysis of these documents will allow the research team to use this extant information to develop at least partial answers to the study’s research questions and to refine and streamline interview protocols prior to the site visits so that the interviews focus only on issues and topics not covered in the documents.
A.4 Avoidance of Duplication
The research team has determined that no previous federal studies have examined the 21st CCLC state competitions or the capacities of states to administer the 21st CCLC program.
A.5 Methods to Minimize Burden on Small Entities
All entities participating in this data collection effort are SEAs, LEAs, or community-based organizations (CBOs). No small businesses will be involved in any way, and the SEAs, LEAs, and CBOs included in the study are large enough that they do not meet the definition of small entities.
A.6 Consequences of Not Collecting Information
Without the report from this study, ED will not have access to key information that ED leaders and staff need to help guide and assist states in their efforts to manage their 21st CCLC grant competitions effectively. SEA administrative effectiveness, particularly the way in which SEAs manage their grant competitions, is a critical variable affecting local program quality. In the years since administrative responsibility was transferred from ED to SEAs, however, the broad latitude afforded states in the design and execution of their 21st CCLC programs has produced a wide variety of strategies for managing grant competitions. In addition, ED is aware that SEA downsizing since the mid-1990s has reduced state capacity to administer federal education programs. Understanding the capacity issues that states face in managing the 21st CCLC program would therefore inform ED’s approach to supporting SEAs, through technical assistance and monitoring, in shoring up the quality of state competitions and related state administrative actions. ED is particularly interested in better understanding the state capacities that shape design and management decisions for state 21st CCLC competitions. Specifically, to what extent are states able to (a) launch regular grant competitions; (b) assemble and manage a peer review team to evaluate the quality of applications; and (c) provide information to, and encourage applications from, eligible entities in high-need communities across the state? In addition, to what extent do states have the capacity to monitor the activities of grantees, ensure that grantees are implementing effective strategies for serving the academic needs of participating students, and provide grantees with technical assistance to guide and assist program implementation?
The report will describe how states manage their competitions and the challenges they face in this era of scarce resources and reduced administrative capacities. Based on these findings, the report will offer states and districts strategies to inform proposals and competitions to ensure that the highest quality 21st CCLCs are ultimately funded. These “lessons learned” are also intended to inform the state administration of other discretionary grants programs and assist federal officials in improving the guidance and technical assistance offered to states in this crucial area of state responsibility.
A.7 Explain Any Special Circumstances
No special circumstances apply to the planned case studies.
A.8 Consultation Outside the Agency
The research team will pre-test each interview protocol with a few respondents appropriate to that protocol, including SEA staff, three district or other 21st CCLC grantee staff, and three local 21st CCLC managers. The pre-test results will be used to refine the interview protocols and finalize estimates of respondent burden.
The 60-day Federal Register notice was published on February 15, 2012 (Vol. 77, p. 8847).
No comments were received.
A.9 Payments or Gifts
No payments or gifts will be made to SEA, LEA, or other 21st CCLC grantee staff who participate in interviews for the study.
A.10 Assurances of Confidentiality
Responses to this data collection will be used to develop aggregate findings across groups of sites or to provide examples of program implementation. In no case will reporting from this evaluation associate responses with a specific site or individual. In the report, pseudonyms will be used for each site. The research team may refer to the generic title of an individual (e.g., “project director” or “peer reviewer”), but neither the site name nor the individual name will be used. All efforts will be made not to disclose the true name or identity of the site or individuals at the site. The evaluator will not provide information that associates responses or findings with a subject or district to anyone outside the research team, except as required by law.
Prior to each individual interview, the site visitors will explain the purpose of the study, the topics to be covered in the interviews, and the measures taken to provide confidentiality assurances discussed here. Site visitors will also explain that participation in the study and responses to individual interview questions are voluntary and that respondents may decide not to participate or may end their participation at any time.
The research team has extensive experience in protecting the privacy and confidentiality of interview respondents. Safeguards to protect the privacy and confidentiality of all respondents—in addition to the ones discussed above—include the following:
All individual contact information that may be available to the research team will be used for scheduling interviews only and will be destroyed as soon as the interviews and necessary follow-ups are completed.
All audiotapes and notes from individual interviews, as well as all documents that contain sensitive, personally identifiable information, will be maintained in secure files accessible only by members of the research team.
All audiotapes, interview notes, and sensitive documents will be destroyed upon submission of the final report on the case studies.
Training for site visits will familiarize team members with the confidentiality provisions discussed above and with their responsibilities for explaining those provisions to respondents and for maintaining the necessary safeguards in storing and using interview data for analysis and reporting.
A.11 Justification of Sensitive Questions
The interview protocols do not include any questions of a sensitive nature.
A.12 Estimates of Respondent Hour Burden and Annualized Cost
Data collection for the nine case study sites will include up to 81 individual interviews with state-level respondents and up to 45 in-person interviews with LEA staff and local stakeholders. In addition, the research team will conduct individual interviews with members of state peer review teams, including up to three peer reviewers per state. The evaluation team estimates that the individual interviews will last 45-60 minutes.
Estimates of the number of interviews and the amount of time required to conduct them are displayed in Exhibit 1.
There are no direct monetary costs to respondents for this activity. At an estimated 153 hours and an average of $45 per labor hour, the overall cost burden for the information collected through the interviews will be $6,885.
Exhibit 1: Number of Respondents and Labor Hours Expected for Each Participating Site

Respondent Category | Interviews Per Site (60-minute interviews) | Total Labor Hours (All Nine Sites)
SEA 21st CCLC program staff (e.g., SEA liaison, other 21st CCLC staff, administrative consultant(s), budget/finance director) | Up to 6 respondents (6 hours) | 54 hours
SEA administrators of other federal discretionary grants programs (e.g., Improving Teacher Quality State Grants, School Improvement Grants, Math and Science Partnership Grants) | Up to 3 respondents (3 hours) | 27 hours
21st CCLC peer review team members | Up to 3 respondents (3 hours) | 27 hours
LEAs and other local 21st CCLC grantees (1 LEA or other local grantee per state) | Up to 5 respondents (5 hours) | 45 hours
Total for All Nine Sites | 153 respondents | 153 hours
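For reference, the burden figures above can be recomputed directly from Exhibit 1. The brief calculation below is a consistency check only, assuming (as stated above) that each interview is counted as one labor hour valued at an average of $45 per hour.

\begin{align*}
\text{Respondents per site} &= 6 + 3 + 3 + 5 = 17 \\
\text{Total respondents (and labor hours)} &= 17 \times 9 \text{ sites} = 153 \\
\text{Total respondent cost} &= 153 \text{ hours} \times \$45/\text{hour} = \$6{,}885
\end{align*}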
A.13 Estimates of Annual Cost Burden to Respondents
There is no capital or start-up cost component to these data collection activities, nor is there any operations, maintenance, or purchase cost associated with the evaluation.
A.14 Estimates of Annual Cost Burden to Federal Government
The estimated cost to the federal government is $582,900. This total includes costs already invoiced, plus budgeted future costs charged to the government by PSA and AIR for preparation of the study design, site selection, data collection (including travel for site visits), data analysis, and reporting.
A.15 Program Changes in Burden/Cost Estimates
This request is for a new information collection and no changes apply.
A.16 Plans/Schedules for Tabulation and Publication
This study will generate a project report that presents study findings and lessons learned from state 21st CCLC grant competitions, along with their potential relevance to other programs. The report will describe the sample states’ practices in conducting grant competitions for 21st CCLC programs; the capacity issues (including organizational structure, funding, and staff numbers, availability, and expertise) that affect the quality of those competitions; and the extent to which states assess local applications and plans. The report will also describe the experiences of local 21st CCLC grant recipients with the selection and monitoring processes and the extent to which they believe the grant requirements reflect the needs of the community. The report will highlight the particular successes and challenges that yield lessons generalizable to other ESEA programs; for example, it will include vignettes illustrating state-level strategies used to run grant competitions, monitor local grantee performance and identify areas of need, and provide technical assistance. Finally, the report will highlight areas in which states may need additional support and technical assistance to bolster their capacity to conduct rigorous grant competitions that result in high-quality 21st CCLC programs.
The research team will conduct case studies from June 18, 2012, to July 27, 2012. The final draft of the case study report will be completed by May 31, 2013.
A.17 Expiration Date Omission Approval
Not applicable. All data collection instruments will include the OMB control number and the data collection expiration date.
A.18 Exceptions
Not applicable. No exceptions are requested.