Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Culture of Continuous Learning Project: Case Study
OMB Information Collection Request
0970-0605
Supporting Statement
Part A
July 2023
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer:
Nina Philipsen
Part A
Executive Summary
Type of Request: This Information Collection Request is for a nonmaterial/non-substantive change.
Description of Request: This request is for a study to assess the feasibility of implementing continuous quality improvement methods in Early Care and Education programs and systems to support the use and sustainability of evidence-based practices. Three Breakthrough Series Collaboratives (BSCs), a specific quality improvement methodology designed to support the implementation of continuous quality improvement methods in organizations, will be implemented in Head Start and child care centers. The implementation of the BSCs will be evaluated using a case study design that will involve focus groups, interviews, surveys, and classroom observations, in addition to examining artifacts of the BSCs’ implementation (e.g., work that the BSC teams do as part of the BSC itself). We do not intend for this information to be used as the principal basis for public policy decisions.
A1. Necessity for Collection
Head Start and child care programs aim to provide children and families with high quality early care and education (ECE), and often strive to provide their workforce with effective quality improvement training and ongoing supports. However, we know that not all quality improvement efforts in ECE result in sustainable change (Derrick-Mills et al., 2014).
The Office of Planning, Research, and Evaluation (OPRE) within the Administration for Children and Families (ACF) at the Department of Health and Human Services (HHS) is conducting the Culture of Continuous Learning (CCL) project to assess the feasibility of implementing continuous quality improvement methods in ECE programs and systems to support the use and sustainability of evidence-based practices. In particular, the CCL project aims to evaluate the feasibility and sustainability of innovative quality improvement strategies that benefit both Head Start and child care programs.
Breakthrough Series Collaboratives (BSCs) are a specific quality improvement methodology designed to support the implementation of continuous quality improvement methods in organizations. The BSC methodology has been studied extensively in health care and other fields but has limited evidence as an effective quality improvement methodology in the ECE field. The CCL project is proposing a new information collection to implement three BSCs in Head Start and child care centers in three different sites across the country. The findings will be of broad interest to ECE programs as well as training and technical assistance providers and researchers, all of whom are interested in improving the quality of services young children receive.
There are no legal or administrative requirements that necessitate this collection. ACF is undertaking the collection at the discretion of the agency. ACF has contracted with Child Trends to carry out this study.
A2. Purpose
Purpose and Use
This primary data collection is intended for research purposes. The CCL team proposes conducting a descriptive case study to document the factors that contribute to the feasibility of BSC implementation within existing ECE quality improvement structures in states (e.g., a state quality rating and improvement system) and/or regions (e.g., a professional development or technical assistance system within a region or state, or a cross-state region such as Head Start regional technical assistance areas). In the future, embedding the BSC into these existing quality improvement structures could increase the reach of this quality improvement methodology. Head Start and child care centers that voluntarily participate in the BSCs will be asked to complete a number of tools designed to facilitate implementation of the BSC. The implementation of the BSCs will be evaluated using a case study design that will involve focus groups, interviews, surveys, and classroom observations.
Findings from the case study will inform hypotheses and study measures that will be useful in designing an evaluation for a possible future study of BSCs in ECE systems. The case study will also help determine what additional capacity ECE systems may need to adopt the BSC methodology and offer it within their system at a larger scale. ACF may use the findings to help inform continuous quality improvement efforts and future research agendas. Researchers may use the findings to inform their understanding of BSC implementation in Head Start and child care centers and to provide context for designing future studies related to BSC implementation.
The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker, and is not expected to meet the threshold of influential or highly influential scientific information.
Guiding Questions
This data collection is designed to answer the following guiding questions:
What needs to be in place to pilot a BSC within state, Head Start, and regional ECE quality improvement (QI) systems?
How do state/regional factors facilitate or hinder effective implementation of a BSC conducted within state, Head Start, and regional QI systems?
How do ECE center-level factors facilitate or hinder effective implementation of a BSC conducted within state, Head Start, and regional QI systems?
What do state/regional/Head Start leaders perceive they would need to integrate the BSC methodology’s use within state, Head Start, and regional QI systems to ensure sustainability?
Is there evidence of BSC participant engagement, activation of change mechanisms, and observed and/or perceived changes in short-term outcomes of a BSC piloted within state, Head Start, and regional QI systems?
Within participating centers, do we see spread of social and emotional learning (SEL) and QI practices beyond staff who directly participated in the BSC?
Are SEL and QI practices sustained 6 months after the end of participation in a BSC?
Study Design
The CCL team plans to use surveys, focus groups, classroom observations, and interviews at several time points over the course of the implementation of the BSC to document and assess the feasibility of implementing BSCs within a state and/or regional QI system, professional development (PD) system, or technical assistance (TA) system (see Table A12 for detail about the proposed total number of responses per respondent for each instrument). This approach is appropriate to answer the proposed research questions and meet the project’s objectives (see Supporting Statement Part B1 for details).
We will select the three locations (i.e., states and/or regions) for the BSCs, and each BSC will comprise up to eight ECE sites (i.e., child care or Head Start centers), for a total of 24 participating ECE centers (see Supporting Statement Part B2 for details). Each site will select up to seven individuals to be part of the core BSC team. This team will include an administrator, one parent, and several teachers and support staff. Further, among these team members, one person will be selected to serve as the senior team leader, and another will be selected as the team data manager.
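These maximums imply the following headcounts (a worked tally, assuming every site fields a full seven-person team):

\[
3 \text{ BSCs} \times 8 \text{ centers per BSC} = 24 \text{ ECE centers}, \qquad 24 \text{ centers} \times 7 \text{ team members} = 168 \text{ core BSC team members}
\]

This upper bound of 168 matches the respondent counts for the whole-team instruments in Table A12.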
The team data manager will be responsible for uploading various implementation documents (e.g., data collection planning worksheet, PDSAs, monthly metrics) to an online shared learning platform to provide information about the BSC implementation process. Individual team members will also make use of the online shared learning platform for team or full BSC discussions and information sharing; this information will be used, in part, to document BSC participation levels among BSC teams. These pieces of information are considered artifacts of the BSC process.
Data collection will take place over approximately 36 months. Both qualitative and quantitative data will be collected to test hypotheses associated with each of the research questions. All data collected and used will be analyzed for emergent themes related to each of the seven research questions. Information from each instrument covered by this OMB package will inform the overarching study objective: to assess the feasibility of implementing BSCs in Head Start and child care centers. Table A2 summarizes all data collection activities. Information collected through these data collection activities will be purely descriptive; findings from these activities are not intended to be generalizable.
One notable limitation of the study design is the small sample size. All limitations to how the information can be used will be described in any publications.
Table A2. Summary of data collection activities
BSC Implementation Instruments

BSC Selection Application Questionnaire (Instrument 1)
Respondents: Two to five people per center or program would be involved in completing the BSC selection application, including but not limited to the director and teacher(s)
Content: Staff and population-served demographics, prior staff training in and knowledge about Pyramid Model/SEL practices, understanding of the BSC time commitments, goals for participation, and organizational capacity for improvement
Purpose: To assess the capacity of applicants to successfully participate in the BSC
Mode: Web-based or paper-based
Duration: 1.5 hours

Pre-Work Assignment: Data Collection Planning Worksheet (Instrument 2)
Respondents: Team Data Manager and Senior Team Leader
Content: A plan for how to collect the metrics
Purpose: To establish a concrete plan for how to collect the metrics data for the BSC
Mode: Web-based or paper-based
Duration: 2 hours

Plan, Do, Study, Act (PDSA) Form & Tracker (Instrument 3)
Respondents: Entire BSC team
Content: The six PDSA topics include: 1) primary driver, 2) strategy that will be tested, 3) detail about the plan, 4) how the plan went, 5) what was learned, and 6) what will be done next
Purpose: To test small adjustments in practice as the team strives to implement, spread, and sustain the improvements across their organization
Mode: Web-based or paper-based
Duration: 15 minutes

Monthly Metrics (Instrument 4)
Respondents: BSC Team Leader/Manager
Content: Example metrics include: 1) child attendance, 2) teacher attendance, 3) number of child behaviors perceived as challenging, 4) number of children making progress in a social-emotional domain of an approved assessment
Purpose: To capture monthly snapshots of BSC implementation and track trends over time
Mode: Web-based
Duration: 1.5 hours

Implementation Discussion Forum Prompts (Instrument 5)
Respondents: Every member of each BSC team
Content: Topics will include ideas, strategies, and problems related to BSC implementation
Purpose: To create a forum for ongoing sharing of ideas and collaborative problem-solving for improving practices and organizational capacity
Mode: Web-based
Duration: 15 minutes

Learning Session Feedback Form (Instrument 6)
Respondents: Every member of each BSC team
Content: Perceptions of the learning session
Purpose: To help team members reflect on their experience and provide feedback that the implementation staff and faculty can use to improve the BSC
Mode: Web-based or paper-based
Duration: 15 minutes

Action Planning Form (Instrument 7)
Respondents: Every member of each BSC team
Content: Topics related to strategizing on the PDSAs BSC teams will try next
Purpose: To help BSC teams focus their thinking on next steps for improvement
Mode: Web-based or paper-based
Duration: 15 minutes

BSC Overall Feedback Form (Instrument 8)
Respondents: Every member of each BSC team
Content: Perceptions of and experiences with the BSC
Purpose: To help team members reflect on their experience in the BSC as a whole and provide feedback that the implementation staff and faculty can use to improve the BSC
Mode: Web-based or paper-based
Duration: 15 minutes

Organizational Self-Assessment (Instrument 9)
Respondents: Every BSC team
Content: Level of a center or program’s functioning (on a 4-point Likert scale) across each domain’s goals
Purpose: To help BSC teams review the practices and systems they currently have in place to support social and emotional learning practices in their center or program, and to identify priorities and goals for improvement
Mode: Web-based or paper-based
Duration: 1.5 hours

BSC Evaluation Instruments

Key Informant Interviews with BSC Faculty Members Affiliated with the States/Regions Discussion Guide (Instrument 10)
Respondents: BSC faculty members who are affiliated with the case study states/regions
Content: Example interview topics include: 1) faculty member background and 2) perceptions of the BSC experience compared to other QI experiences
Purpose: To collect information about the process of implementing a BSC and the systemic facilitators and barriers to BSC implementation
Mode: Web-based
Duration: 1 hour

BSC Implementation Staff and Faculty Focus Group Discussion Guide (Instrument 11)
Respondents: BSC implementation staff and faculty
Content: Example focus group topics include: 1) factors that helped or hindered BSC participation, 2) perceptions of participant goals, needs, and expectations of the BSC, and 3) reflections on BSC implementation
Purpose: To facilitate implementation staff and faculty’s reflections on the facilitators and barriers to implementing their particular BSCs and on the perceived changes in BSC participants’ knowledge, attitudes, and practices through BSC participation
Mode: Web-based
Duration: 1.5 hours

BSC Implementation Staff and Faculty Background Survey (Instrument 12)
Respondents: BSC implementation staff and faculty
Content: Faculty and staff demographics and background information
Purpose: To gather faculty and staff demographics and background information
Mode: Web-based
Duration: 10 minutes

Key Informant Interviews with BSC Center Administrators Discussion Guide (Instrument 13)
Respondents: BSC team members who are directors or assistant directors
Content: Example interview topics include: 1) state, regional, and Head Start context, 2) center context, and 3) changes in program culture, practices, and distributed leadership
Purpose: To gather information about facilitators and barriers for BSC implementation at both the system and center levels
Mode: Web-based
Duration: 1 hour

BSC Teachers and Support Staff Focus Group Discussion Guide (Instrument 14)
Respondents: BSC team members who are teachers and other center or program staff
Content: Example focus group topics include: 1) state and center-level factors that may have helped or hindered BSC participation, 2) participant goals, needs, and expectations of the BSC, and 3) how the BSC compared to other QI experiences
Purpose: To gather information on facilitators and barriers to BSC participation and on how the BSC compares to respondents’ other experiences with quality improvement
Mode: Web-based
Duration: 1.5 hours

BSC Parent Focus Group Discussion Guide (Instrument 15)
Respondents: BSC team members who are parents
Content: Example focus group topics include: 1) BSC elements that were most helpful and most challenging to parents, 2) satisfaction with and perceptions of the BSC’s value, and 3) spread and sustainability
Purpose: To gather information on motivation, facilitators, and barriers to BSC participation
Mode: Web-based
Duration: 1.5 hours

Individual BSC Teams Focus Group Discussion Guide (Instrument 16)
Respondents: Every member of each BSC team
Content: Example focus group topics include: 1) feedback on the case study itself (implementation and evaluation) and 2) state and center-level factors that may have helped or hindered BSC participation
Purpose: To learn about perceived changes following BSC participation at the individual and center or program levels
Mode: Web-based
Duration: 1.5 hours

Administrator Surveys (Instrument 17a)
Respondents: Administrators in participating BSC centers or programs
Content: Questions related to topics such as staff turnover and support, use of data, individual well-being, perceptions of implementation, and team self-efficacy
Purpose: To gather information on changes in knowledge, attitudes, practices, and center-level administrative data over the course of BSC implementation
Mode: Web-based
Duration: 30 minutes

Teacher Surveys (Instrument 17b)
Respondents: Teachers, assistant teachers, and classroom aides in participating BSC centers or programs, regardless of whether the individuals participated in the BSC themselves
Content: Questions related to topics such as data use, individual well-being, perceptions of implementation, inter- and intra-organizational learning, and team self-efficacy
Purpose: To gather information on changes in knowledge, attitudes, and practices over the course of BSC implementation
Mode: Web-based
Duration: 30 minutes

Other Center Staff Surveys (Instrument 17c)
Respondents: Other center or program staff (besides teachers) in participating BSC centers or programs, regardless of whether the individuals participated in the BSC themselves
Content: Questions related to topics such as data use, individual well-being, perceptions of implementation, inter- and intra-organizational learning, and team self-efficacy
Purpose: To gather information on changes in knowledge, attitudes, and practices over the course of BSC implementation
Mode: Web-based
Duration: 30 minutes

Non-BSC Parent Surveys (Instrument 17di)
Respondents: All parents in participating BSC centers or programs, regardless of whether the individuals participated in the BSC themselves
Content: Questions related to topics such as family engagement and parent demographics
Purpose: To gather information on parents’ perceptions of their level of collaboration with their child’s program or center
Mode: Web-based
Duration: 15 minutes

BSC Parent Surveys (Instrument 17dii)
Respondents: BSC team members who are parents
Content: Questions related to topics such as team self-efficacy, family engagement, time and resources, and demographics
Purpose: To gather information on parents’ perceptions of their level of collaboration with their child’s program or center
Mode: Web-based
Duration: 30 minutes

Classroom Observations (Instrument 18)
Respondents: Within each participating center or program, one teacher who is participating in the BSC and one teacher who is not
Content: Observations will be conducted using the Swivl video recording system; preparation will involve connecting with teachers and administrators to facilitate observation scheduling
Purpose: To capture changes in teachers’ classroom practices over the course of BSC implementation
Mode: Web-based and in-person
Duration: 20 minutes

Administrative Data Survey (Instrument 19)
Respondents: Center or program administrators
Content: Staff rosters
Purpose: To capture monthly snapshots of BSC implementation and track trends over time
Mode: Web-based
Duration: 15 minutes
Other Data Sources and Uses of Information
When available, we will use administrative data (e.g., staff rosters) from participating Head Start or child care centers to minimize the amount of active data collection with respondents. Additionally, the CCL team will examine artifacts of the BSCs’ implementation (e.g., work that the BSC teams do as part of the BSC itself, meeting notes). These administrative data and artifacts will be used to help understand the feasibility of BSC implementation.
A3. Use of Information Technology to Reduce Burden
The research team will employ information technology in the form of online surveys administered through REDCap, our secure online data collection platform. Links to each survey will be distributed electronically. Conducting surveys online will allow respondents to complete the survey on their own time and take pauses as needed, thereby minimizing respondent burden. Interviews and focus groups will be conducted over a secure web-based video conference platform. Classroom observations will be conducted using the Swivl video recording system which will focus on the teacher’s movements. Video recordings will reduce burden by allowing for minimally intrusive classroom observations. Lastly, the online shared learning platform will be used to provide BSC teams with information, tools, and resources, as well as provide a space for teams to share their successes and challenges, ask questions, and receive feedback from other centers and the CCL project team. Providing BSC teams with an online shared learning platform will allow respondents to collaboratively engage with resources and complete materials on their own time.
A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency
None of the study instruments will ask for information that can be reliably obtained from alternative data sources, including administrative data. Furthermore, the design of the study instruments ensures that duplication of data collected through each instrument is minimized. Finally, the case study will use data collected as part of the BSC itself (e.g., how people interact with the online shared learning website, attendance records for phone calls and meetings associated with BSC implementation, notes taken during meetings) and secondary analysis to minimize duplication.
A5. Impact on Small Businesses
Most of the organizations in the study will be small businesses, including child care centers and Head Start/Early Head Start programs. We will minimize the burden to these respondents by limiting the length of the instruments and by providing most instruments in a web-based format that respondents can complete at their convenience. Burden will also be minimized for respondents by convening interviews, focus groups, and classroom observations virtually.
A6. Consequences of Less Frequent Collection
To understand change over time in individual beliefs and behaviors, as well as organizational change in culture around continuous improvement efforts and supports for children’s social and emotional learning, we will collect some information (e.g., surveys, classroom observations, focus groups) at various stages of the BSC process (i.e., the beginning, mid-points, end, and follow-up). Planned data collection activities aim to gather information only as frequently as needed to meet study objectives. Reducing any of the proposed data collection activities would compromise the CCL team’s ability to address key research questions. For example, the action planning forms will be collected on an ongoing basis (up to weekly during action periods) to monitor and inform implementation of the BSCs. More frequent collection of action forms allows for a strong implementation because real-time data can inform ongoing technical assistance, training, and coaching needs. Including this level of detail and frequency of data collection strengthens the added value of the CCL project to the ECE field.
A7. Now subsumed under 2(b) above and 10 (below)
A8. Consultation
Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on September 12, 2022, Volume 87, Number 175, page 55819, and provided a sixty-day period for public comment. During the notice and comment period, we did not receive any comments.
The CCL team plans to consult with experts to complement the knowledge and experience of the team (Table A8). Consultants have specialized knowledge in continuous quality improvement methods, implementation science, evaluation, social-emotional development, Head Start administration, organizational contexts, and coaching.
The purpose of engaging with experts in continuous quality improvement, social and emotional learning, and Head Start and Child Care and Development Fund (CCDF) policies is that these experts may recommend effective approaches for recruitment and implementation of the Breakthrough Series Collaborative, offer advice about connecting with possible locations for BSC implementation, and help the project team anticipate questions, concerns, and barriers to participation in the BSC. It is expected that experts will be engaged in discussions focused on each of the two main topics of interest (methods/content and policy); fewer than 10 experts will be asked the same question during these discussions.
Table A8. CCL Project expert consultants
| | Name and Affiliation | Expertise |
| | Methodological and Content Experts | |
| 1. | Zelda Boyd, National Center on Early Childhood Quality Assurance | CCDF quality initiatives |
| 2. | Rob Corso, Senior Research Associate at Vanderbilt University and Executive Director of the Pyramid Model Consortium | Pyramid Model, Head Start T/TA; previous Project Coordinator for the Center on the Social and Emotional Foundations for Early Learning (CSEFEL) |
| 3. | Allyson Dean, Cultivate Learning, University of Washington | ECE professional development, Head Start T/TA |
| 4. | Gayle Kelly, MN Head Start Association | Head Start T/TA, Head Start systems, Head Start within a state context |
| 5. | Mary Mackrain, EDC | Home visiting COIIN |
| 6. | Lauren Smith, Maternal and Child Health Division, Vermont Department of Health | BSC in a state health system (focus: health screenings) |
| 7. | Yvette Rodriguez, ABCD Head Start | Head Start T/TA, Head Start systems, Head Start within a state context |
| 8. | Sherri Killins Stewart, Director of Systems Alignment and Integration, BUILD Initiative | State ECE systems, equity |
| | Head Start and State CCDF Administrators, Experts, and Advisors | |
| 9. | Rachel Brown-Kendall, Washington Dept of Children, Youth and Family Services | QRIS and quality improvement |
| 10. | Sarah Neville-Morgan, California Dept of Education | QRIS and quality improvement |
| 11. | CCDF state administrator, TBD | CCDF policy |
| (duplicate) | Gayle Kelly, MN Head Start Association | Head Start T/TA |
| 12. | Regional Head Start, TBD | Head Start policy, Head Start T/TA |
| 13. | State HS Collaboration Director, TBD | Head Start policy, Head Start T/TA |
A9. Tokens of Appreciation
To collect data that are as representative as possible, it is important to maximize our response rates to surveys, interviews, and focus groups. We are using tokens of appreciation to increase participants’ engagement with the data collection efforts.
Focus group, interview, and survey data will not be representative in a statistical sense, in that they will not be used to make statements about the prevalence of experiences for the population of BSC participants (i.e., BSC participants in other studies). It is, however, important to secure participants with a range of background characteristics and personal circumstances to capture a variety of possible perspectives on the BSC experience in the current study. Additionally, maintaining engagement and participation longitudinally (i.e., completing multiple requests over the course of the BSC) is crucial to the study design; a lack of consistent engagement from a variety of participants (regardless of income level) would reduce the overall quality and utility of the data collection efforts.
Previous research has shown that tokens of appreciation improve survey response rates regardless of modality (i.e., web, mail, phone) and can help mitigate nonresponse bias, particularly from low-income respondents (Singer & Ye, 2013). This is relevant for our primary study population, as ECE educators are paid low wages, which means that many ECE educators have household incomes around the federal poverty level and use public supports such as SNAP and Medicaid (Whitebook & McLean, 2017). Various studies with low-income individuals have found that not offering a token of appreciation degrades the quality of a study, while providing a token of appreciation improves participant engagement. For example, FACES (OMB #0970-0151) offered $35 tokens of appreciation in their 2006 and 2009 cohorts to parents/guardians who completed baseline information forms and reports about their children participating in the study. These tokens of appreciation were reduced to $15 in FACES 2014-2018, which resulted in a drop in response rates from 93.1% to 77.5%, and differential response rates across different demographic groups. The sample for the Project LAUNCH Cross-Site Evaluation (OMB #0970-0373) included preschool and ECE settings and did not initially offer tokens of appreciation to parents completing a 30-minute web-based survey. Early results indicated that respondents were not representative of their communities; individuals with low incomes and those who did not have full-time employment were underrepresented. OMB approved a $25 token of appreciation after data collection had started, which improved both the completion rate and representativeness of responses (LaFauve et al., 2018).
Prior research has also found that providing a prepaid token of appreciation for surveys, followed by an additional token of appreciation upon survey completion, is an effective method for increasing response rates and addressing nonresponse bias when compared to only providing a post-completion token (Mercer et al., 2015; Singer et al., 2013). This finding was supported in a recent survey of child care staff. As part of the Assessing the Implementation and Cost of High Quality Early Care and Education (ECE-ICHQ; OMB #0970-0499) project, Albanese, Edwards, Weiss, and Gonzalez (2022) conducted an experiment to assess the effect of prepaying a portion of the token of appreciation on response rates. They found that giving a portion of the token amount with the initial request, prior to survey completion, increased the response rate among child care staff by 20 percentage points compared to giving the full amount after completion. The study team believes that creating a token of appreciation structure with pre- and post-survey completion tokens will help achieve the study’s goal of securing participants (including child care staff and BSC parents) with a range of background characteristics and personal circumstances, which is vital to capture the necessary variety of participant perspectives on the BSC experience.
Administrators, teachers, other center staff, and parents who are part of the core BSC team will participate in a survey at three time points during the project. At each time point, respondents will receive an initial $5 prepaid token of appreciation, followed by an additional $20 token of appreciation for completing the 30-minute survey. All parents whose children attend the centers that participate in the CCL project will be asked to complete a survey at two timepoints. Parents who complete the survey will be entered into a raffle to receive one of twenty $25 tokens of appreciation; twenty such tokens will be given out at each of the two timepoints. Because of the raffle structure, the study team will not offer prepaid tokens of appreciation to non-BSC parents.
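Under this structure, the token amounts work out as follows (a sketch assuming a BSC team member completes all three survey waves):

\[
3 \times (\$5 \text{ prepaid} + \$20 \text{ post-completion}) = \$75 \text{ per BSC survey respondent over the study}, \qquad 20 \text{ winners} \times \$25 \times 2 \text{ timepoints} = \$1{,}000 \text{ in raffle tokens}
\]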
Lastly, in line with similar studies (e.g., ECE-ICHQ, OMB #0970-0499; HS2K, OMB #0970-0581; HS REACH, OMB #0970-0508), the CCL team will provide a $50 token of appreciation to individuals who participate in 60- to 90-minute qualitative data collection activities (i.e., focus groups and interviews). Similar to the ECE-ICHQ project (OMB #0970-0499), the CCL project team determined that the initial level of appreciation ($10) for individuals who participate in focus groups and interviews was not aligned with the level of effort requested. The project team reviewed the Head Start to Kindergarten (HS2K) and Head Start REACH: Strengthening Outreach, Recruitment and Engagement Approaches with Families (HS REACH) projects as recent, comparable data collection efforts in terms of burden. The HS2K team recommended a $50 token of appreciation for parents or other family members who participated in a 75-minute focus group. Similarly, HS REACH recommended a $40 token of appreciation for parents who participated in a 90-minute focus group. Both projects determined these levels of appreciation (i.e., $50 and $40, respectively) to be necessary to achieve needed response rates.
Table A9. Tokens of appreciation
| Instrument | Avg. Burden per Response (in hours) | Previous Token of Appreciation per Response | Prepaid Token of Appreciation per Response | Post-Activity Token of Appreciation per Response | Total Token of Appreciation per Response |
| BSC Teachers and Support Staff Focus Groups | 1.5 | $10 | $0 | $50 | $50.00 |
| BSC Parent Focus Groups | 1.5 | $10 | $0 | $50 | $50.00 |
| Individual BSC Teams Focus Groups | 1.5 | $10 | $0 | $50 | $50.00 |
| Key Informant Interviews with BSC Center Administrators | 1 | $10 | $0 | $50 | $50.00 |
| Administrator Surveys | 0.5 | $20 | $5 | $20 | $25.00 |
| Teacher Surveys | 0.5 | $20 | $5 | $20 | $25.00 |
| Other Center Staff Surveys | 0.5 | $20 | $5 | $20 | $25.00 |
| Non-BSC Parent Surveys | 0.25 | $20 | $0 | $25 | $25.00 lottery ($25 token of appreciation given to up to 20 respondents at each of the 2 timepoints) |
| BSC Parent Surveys | 0.5 | $20 | $5 | $20 | $25.00 |
A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing
Personally Identifiable Information
For all surveys, email addresses will initially be associated with survey responses to track who has completed the surveys and thus who needs a follow-up reminder. During data collection, participants will complete all surveys using an online platform, REDCap. The information in REDCap, including participants’ contact information, is hosted on a FedRAMP-compliant Microsoft Azure server. A BSC participant’s name, email address, role, and center name will be associated with their unique web-based dashboard, allowing them to fully engage with the online shared learning platform. Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.
Assurances of Privacy
Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. As specified in the contract, the Contractor will comply with all Federal and Departmental regulations for private information. The CCL team has obtained Institutional Review Board approval for all aspects of this information collection, including recruitment, data collection, and analysis procedures. Once data collection is complete, the data will be de-identified and downloaded into a dataset stored on Child Trends’ secure server. We will record all interviews and focus groups using Microsoft Teams. For interviews and focus groups, to minimize the effort required for participants, we will read consent language and give participants the option to opt out of the activity at the start of the call. For focus groups, if participants do not wish to participate in a recorded group call, we will offer them the option of having a one-on-one call with a member of our team in which we will only take notes and will not record. Recordings will be stored on Child Trends’ secure server. Recordings will be transcribed and the transcription will be stored on the secure server. Recordings will be identifiable as they will contain the participants’ voice and image (if they choose to have the video on). Once transcription is complete, has gone through a quality assurance process, and the final report is complete, recordings will be permanently deleted. Transcriptions will be de-identified of participants’ names but will retain information about location and role.
Data Security and Monitoring
As specified in the contract, the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The Contractor has developed a Data Safety and Monitoring Plan that assesses all protections of respondents’ personally identifiable information (PII). The Contractor shall ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor, who perform work under this contract/subcontract, are trained on data privacy issues and comply with the above requirements.
As specified in the evaluator’s contract, the Contractor shall use Federal Information Processing Standards (FIPS) compliant encryption (Security Requirements for Cryptographic Modules, as amended) to protect all instances of sensitive information during storage and transmission. The Contractor shall securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with FIPS. The Contractor shall ensure that this standard is incorporated into the Contractor’s property management/control system and establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable Federal and Departmental regulations. In addition, the Contractor must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records and for the protection of any paper records, field notes, or other documents that contain sensitive or personally identifiable information, ensuring secure storage and limits on access.
A11. Sensitive Information 1
We will not collect sensitive information.
A12. Burden
Explanation of Burden Estimates
The estimated annual burden for respondents is shown in Table A12. We estimated burden for the pre- and post-surveys by piloting with fewer than ten individuals. We estimated the burden for all other instruments by considering the number and types of questions asked, as well as the time needed for respondents to review instructions, search data sources, complete and review their responses, and transmit or disclose information.
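For example, the Table A12 figures for Instrument 1 follow from the per-response estimate, with annual burden computed by dividing total burden over what we take to be the three-year request period (consistent with the 36-month data collection window):

\[
225 \text{ respondents} \times 1 \text{ response} \times 1.5 \text{ hours} = 337.5 \approx 338 \text{ total hours}, \qquad 338 / 3 \approx 113 \text{ annual hours}
\]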
Estimated Annualized Cost to Respondents
The estimated annual cost to respondents is shown in Table A12. The mean hourly wage for each respondent type is based on information from the Bureau of Labor Statistics, Occupational Employment and Wages, May 2021 and the U.S. Department of Labor’s Wage and Hour Division.
For BSC faculty members affiliated with the state/region, the mean hourly wage of $36.02 was used based on the wage for “Instructional Coordinators” in State Government, excluding schools and hospitals (https://www.bls.gov/oes/current/oes259031.htm).
For Head Start and center directors, the mean hourly wage of $25.87 was used, based on the wage for “Education and Childcare Administrators, Preschool and Daycare: Child Day Care Services” (https://www.bls.gov/oes/current/oes119031.htm).
For teachers and assistant teachers, we averaged the mean hourly wage for “Preschool Teachers, Except Special Education” ($17.53; https://www.bls.gov/oes/current/oes252011.htm) and “Childcare Workers” ($13.31; https://www.bls.gov/oes/current/oes399011.htm), which yielded an estimated teacher/assistant teacher mean hourly wage of $15.42. This hourly wage was then adjusted for overtime pay (1.5 times the estimated hourly wage) to account for the fact that teachers and assistant teachers may complete the data collection instruments outside their typical working hours, yielding an estimated mean hourly overtime wage of $23.13.
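The computation implied by the text is:

\[
\frac{\$17.53 + \$13.31}{2} = \$15.42, \qquad \$15.42 \times 1.5 = \$23.13
\]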
The federal minimum wage of $7.25 was used to calculate the hourly wage for parents/guardians (https://www.dol.gov/agencies/whd/minimum-wage).
Table A12. Estimated annual and total burden and cost
| Instrument | No. of Respondents (total over request period) | No. of Responses per Respondent (total over request period) | Avg. Burden per Response (in hours) | Total Burden (in hours) | Annual Burden (in hours) | Average Hourly Wage Rate* | Total Annual Respondent Cost |
| BSC Implementation Instruments | | | | | | | |
| 1. BSC Selection Application Questionnaire | 225 | 1 | 1.5 | 338 | 113 | $23.68 | $4,001.58 |
| 2. Pre-Work Assignment: Data Collection Planning Worksheet | 48 | 1 | 2 | 96 | 32 | $25.87 | $1,241.76 |
| 3. Plan, Do, Study, Act (PDSA) Form & Tracker | 168 | 34 | 0.25 | 1,428 | 476 | $21.25 | $15,174.54 |
| 4. Monthly Metrics | 48 | 8 | 1.5 | 576 | 192 | $25.87 | $7,450.56 |
| 5. Implementation Discussion Forum Prompts | 168 | 34 | 0.25 | 1,428 | 476 | $21.25 | $15,174.54 |
| 6. Learning Session Feedback Form | 168 | 4 | 0.25 | 168 | 56 | $21.25 | $1,785.24 |
| 7. Action Planning Form | 168 | 4 | 0.25 | 168 | 56 | $21.25 | $1,785.24 |
| 8. BSC Overall Feedback Form | 168 | 1 | 0.25 | 42 | 14 | $21.25 | $446.31 |
| 9. Organizational Self-Assessment | 168 | 5 | 1.5 | 1,260 | 420 | $21.25 | |
| BSC Evaluation Instruments | | | | | | | |
| 10. Key Informant Interviews with BSC Faculty Members Affiliated with the States/Regions | 9 | 1 | 1 | 9 | 3 | $36.02 | $180.10 |
| 11. BSC Implementation Staff and Faculty Focus Groups | 30 | 2 | 1.5 | 90 | 30 | $36.02 | $1,620.90 |
| 12. BSC Implementation Staff and Faculty Background Survey | 30 | 1 | 0.17 | 5 | 2 | $36.02 | $108.06 |
| 13. Key Informant Interviews with BSC Center Administrators | 24 | 2 | 1 | 48 | 16 | $25.87 | $620.88 |
| 14. BSC Teachers and Support Staff Focus Groups | 120 | 2 | 1.5 | 360 | 120 | $23.13 | $4,163.40 |
| 15. BSC Parent Focus Groups | 24 | 2 | 1.5 | 72 | 24 | $7.25 | $261.00 |
| 16. Individual BSC Teams Focus Groups | 168 | 2 | 1.5 | 504 | 168 | $21.25 | $5,355.72 |
| 17a. Administrator Surveys | 24 | 3 | 0.5 | 36 | 12 | $25.87 | $465.66 |
| 17b. Teacher Surveys | 240 | 3 | 0.5 | 360 | 120 | $23.13 | $4,163.40 |
| 17c. Other Center Staff Surveys | 96 | 3 | 0.5 | 144 | 48 | $23.13 | $1,665.36 |
| 17di. Non-BSC Parent Surveys | 2,136 | 2 | 0.25 | 1,068 | 356 | $7.25 | $3,871.50 |
| 17dii. BSC Parent Surveys | 24 | 3 | 0.5 | 36 | 12 | $7.25 | $130.50 |
| 18. Classroom Observations | 48 | 3 | 0.33 | 48 | 16 | $23.13 | $555.12 |
| 19. Administrative Data Survey | 24 | 4 | 0.25 | 24 | 8 | $25.87 | |
| Total | | | | 8,308 | 2,770 | | $83,921.11 |
*Note: Due to the intrinsically collaborative nature of BSCs, several instruments will be completed by multiple respondent types. Average hourly wage rates for these instruments are therefore averaged across the different respondent wage categories.
A13. Costs
There are no additional costs to respondents.
A14. Estimated Annualized Costs to the Federal Government
The total cost for the data collection activities under this current request will be $2,020,567.02. Annual costs to the Federal government will be $673,522.34 for the proposed data collection under this OMB clearance number. This includes direct and indirect costs of data collection.
| Cost Category | Estimated Costs |
| Data Collection | $1,275,295.08 |
| Analysis | $637,647.54 |
| Publications/Dissemination | $107,624.40 |
| Total costs over the request period | $2,020,567.02 |
| Annual costs | $673,522.34 |
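These figures reconcile as follows, with annual costs computed over the three-year request period:

\[
\$1{,}275{,}295.08 + \$637{,}647.54 + \$107{,}624.40 = \$2{,}020{,}567.02, \qquad \$2{,}020{,}567.02 \div 3 = \$673{,}522.34 \text{ per year}
\]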
A15. Reasons for changes in burden
This is a new information collection request.
A16. Timeline
Table A16. Study timeline
| Task | Months After OMB Approval |
| Begin recruitment | Within 1 month |
| Data collection | Months 1 through 36 (36-month window) |
| Data analysis | Months 27 through 36 (9-month window) |
| Draft report | Months 34 through 39 (5-month window) |
| Final report | Month 40 |
A17. Exceptions
No exceptions are necessary for this information collection.
Attachments
Instrument 1: BSC Selection Application Questionnaire
Instrument 2: Pre-Work Assignment: Data Collection Planning Worksheet
Instrument 3: Plan, Do, Study, Act (PDSA) Form & Tracker
Instrument 4: Monthly Metrics
Instrument 5: Implementation Discussion Forum Prompts
Instrument 6: Learning Session Feedback Form
Instrument 7: Action Planning Form
Instrument 8: BSC Overall Feedback Form
Instrument 9: Organizational Self-Assessment
Instrument 10: Key Informant Interviews with BSC Faculty Members Affiliated with the States/Regions Discussion Guide
Instrument 11: BSC Implementation Staff and Faculty Focus Group Discussion Guide
Instrument 12: BSC Implementation Staff and Faculty Background Survey
Instrument 13: Key Informant Interviews with BSC Center Administrators Discussion Guide
Instrument 14: BSC Teachers and Support Staff Focus Group Discussion Guide
Instrument 15: BSC Parent Focus Group Discussion Guide
Instrument 16: Individual BSC Teams Focus Group Discussion Guide
Instrument 17a-dii: Pre-post Surveys with Administrators, Teachers, Staff, and Parents
Instrument 18: Classroom Observations
Instrument 19: Administrative Data Survey
Appendix A: BSC information session announcement
Appendix B: Information session registration confirmation
1 Examples of sensitive topics include (but are not limited to): social security number; sex behavior and attitudes; illegal, anti-social, self-incriminating and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships, e.g., family, pupil-teacher, employee-supervisor; mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities which indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment or WIC or SNAP); immigration/citizenship status.