
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Case Studies of Child Care and Early Education Supply-Building and Sustainability Efforts





Formative Data Collections for ACF Research


0970-0356





Supporting Statement

Part B

February 2025


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers: Amanda Coleman and Bonnie Mackintosh


Part B


B1. Objectives

Study Objectives

The objective of this information collection is to complete case studies with up to 12 sites to document strategies states, territories, and tribes use to build supply and sustain child care and early education (CCEE).


This study has three primary objectives:

  1. Assess the readiness of selected strategies for evaluation and recommend potential future evaluation designs

  2. Document the implementation details of selected supply-building and sustainability strategies and understand implementation challenges and successes

  3. Understand child care providers’ and other strategy recipients’ perspectives on and experiences participating in or receiving the strategies, including their perspectives on what is working well and recommendations for improvement


Generalizability of Results

The study is intended to present an internally valid description of CCEE supply-building and sustainability strategies implemented by up to 12 sites. It is not designed to yield statistical generalization to other sites or service populations.


Appropriateness of Study Design and Methods for Planned Uses

The research team will use a case study design with purposive selection of sites to achieve the study’s objectives. A purposive sample will ensure the study includes sites relevant to (and respondents with perspectives on) a range of strategies designed to build supply and sustain CCEE. Because this project aims to learn about approaches, processes, challenges, and facilitators to implementing the strategies, qualitative methods will promote in-depth examination of constructs of interest, using flexible instruments that can respond to variability across the strategies.


The data collected are not meant to be representative. This study does not include an impact evaluation and will not be used to assess the effectiveness of the strategies or the participants’ outcomes. All publicly available products will clearly describe key limitations.


As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.



B2. Methods and Design

Target Population

In each of the up to 12 case study sites, the research team will collect data from Child Care and Development Fund (CCDF) Administrators, strategy implementation lead staff from the Lead Agency or partner organizations, and data and evaluation lead staff from the Lead Agency or partner organizations. In addition, in each site, the team will collect data from child care providers with experience participating in or receiving the strategy. In some sites, the team may conduct interviews with other strategy recipients (who may include staff from community-based or intermediary agencies who are leading local projects supported by the site’s strategy or community engagement group members who helped plan and/or oversee implementation of the local projects).


The sampling frame for each site will be the CCDF Administrator; the roster of staff that lead implementation of the selected strategy; the roster of staff who lead data collection and evaluation activities related to the strategy; (in most sites) child care providers who participate in or receive the strategy; and (in some sites) recipients of strategies other than child care providers (i.e., staff from community-based or intermediary agencies who are leading local projects supported by the site’s strategy or community engagement group members who helped plan and/or oversee implementation of the local projects).


Sampling and Site Selection

The research team will use non-probability, purposive sampling to identify potential respondents who can answer the study’s key questions. Because participants will be purposively selected, they will not be representative of the population of CCDF Administrators, strategy implementation lead staff, data and evaluation lead staff, child care providers with experience participating in or receiving the strategy, or other strategy recipients (who may include staff from community-based or intermediary agencies who are leading local projects supported by the site’s strategy or community engagement group members who helped plan and/or oversee implementation of the local projects). Instead, the team aims to gather a variety of perspectives and experiences from the respondents they interview, to understand each respondent’s unique perspective on the CCEE supply-building and sustainability efforts in their site.



The research team used the strategies identified through the project’s environmental scan of CCDF Lead Agencies’ supply-building and sustainability efforts as the sampling frame for the case studies (related information collections approved under OMB #0970-0356; Title: Survey of Child Care and Early Education Supply-Building and Sustainability Efforts; approved 2/1/2023). Through the scan, the team identified 722 strategies. They narrowed the pool of strategies through a multi-step process developed by the research team and ACF, in which they eliminated strategies from consideration if available information from the scan indicated:

  • The strategy ended in or before 2023 (given the team’s interest in interviewing for the case studies individuals who are or recently were actively involved in planning and implementation of the strategies);

  • The state did not collect information to track implementation or progress towards goals (given that an objective of the case studies is to identify strategies that may be ready for future evaluation);

  • The strategy was not led or implemented by the state CCDF Lead Agency (given this project’s focus on state-level efforts); and

  • The strategy was focused on workforce issues (since other ACF-funded projects were focused on this topic and the research team wanted to avoid the potential for overlap or duplication).

After applying these criteria, the research team grouped the remaining 74 strategies by the primary aim or goal of the strategy. They selected six groups of strategies of primary interest for the case studies, resulting in a pool of 24 potential strategies.



The research team compiled information about the strategies available from the environmental scan and supplemented it with information the team gathered through a web search for publicly available information about the strategies conducted in spring 2024. The team identified eight strategies with limited information available and conducted screening calls with the CCDF Administrators in these states to gather information about implementation status. (Because the research team conducted the calls with fewer than nine individuals, they are not included as part of this information collection request.) Using all available information (collected through the project’s environmental scan, spring 2024 web search, and screening calls), the team identified 12 sites to invite to participate in case studies (along with alternative sites in case some states are unable or unwilling to participate). Federal staff reviewed the list of recommended sites to inform final site selection.


Recruitment

To recruit sites, the research team will send a recruitment email (Appendix A) and/or invite CCDF Administrators to participate in a recruitment call (Appendix A includes an outreach email inviting administrators who have not already participated in screening calls with the team to join a recruitment call, and Instrument 1 is the protocol for the recruitment call). The team anticipates that one to two individuals per site will participate in recruitment calls and that they will need to reach out to 16 sites in order to recruit 12.


For sites that agree to participate in a case study, the team will recruit the CCDF Lead Agency Administrator and invite any key agency staff working on efforts to build and sustain CCEE to participate in an interview (N=1-3 per site). They will also recruit and interview one or more staff responsible for implementation of the strategy; these staff may be from the Lead Agency or partner organizations and may work at the state or local level depending on the strategy (N=1-3 per site). To gather information on data available about the strategy and/or any evaluation efforts planned, underway or completed related to the strategy, the team will recruit and interview key data and evaluation staff from the Lead Agency and/or partner organizations (N=1-3 per site). To understand their experiences participating in or receiving the strategies, in most sites (N=10 sites) the team will recruit child care providers, including either center directors or family child care providers or both depending on the strategy, to participate in focus groups (N=6-8 per site). In other sites (N=2 sites), the team may interview strategy recipients other than child care providers. In these sites, instead of focus groups with providers, the team will conduct interviews with staff from community-based or intermediary agencies who are leading local projects supported by the site’s strategy or community engagement group members who helped plan and/or oversee implementation of the local projects (N=3 per site).



B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The research team developed the data collection instruments based, in part, on the findings from this project’s environmental scan task. They designed the instruments to efficiently address the research questions and capture constructs of interest. The instruments include a protocol for recruitment calls with CCDF Administrators, four open-ended interview guides, and a focus group facilitator guide.



Once the instruments were drafted, the research team and federal staff worked together to refine them. The instruments each address some, if not all, of the study objectives described in B1 (Exhibit B1).


In drafting these instruments, the research team considered whether they could gather information from other sources to minimize the burden placed on respondents. For instance, for each site participating in the study, the team will review the publicly available information on its strategy gathered in spring 2024 prior to the interviews, to minimize the number of questions they need to ask respondents. They have also carefully considered which interviewee(s) are best suited to address which question(s) to avoid asking multiple people for the same information.



Exhibit B1. Alignment of Study Objectives and Instruments

Study Objective 1. Assess the readiness of selected strategies for evaluation and recommend potential future evaluation designs

  • Instrument 2. CCDF Lead Agency Staff Interview Protocol

  • Instrument 3. Strategy Implementation Lead Interview Protocol

  • Instrument 4. Data and Evaluation Staff Interview Protocol

Study Objective 2. Document the implementation details of selected supply-building and sustainability strategies and understand implementation challenges and successes

  • Instrument 2. CCDF Lead Agency Staff Interview Protocol

  • Instrument 3. Strategy Implementation Lead Interview Protocol

  • Instrument 4. Data and Evaluation Staff Interview Protocol

  • Instrument 6. Other Recipient Interview Protocol

Study Objective 3. Understand child care providers’ and other strategy recipients’ perspectives on and experiences participating in or receiving the strategies, including their perspectives on what is working well and recommendations for improvement

  • Instrument 5. Child Care Provider Focus Group Guide

  • Instrument 6. Other Recipient Interview Protocol




B4. Collection of Data and Quality Control

Mode of Data Collection

The research team from the Urban Institute will collect all data virtually using a video conferencing platform. Exhibit A1 in Supporting Statement A summarizes the data collection that will be conducted for the study. The team will assign a two-person site liaison team to each site and aim to have that team perform all recruitment and data collection for that site.


Recruitment Protocol

To recruit sites, the research team will send a recruitment email (Appendix A) and/or invite CCDF Administrators to participate in a recruitment call (Appendix A includes an outreach email inviting administrators to participate in a call, and Instrument 1 is the protocol for the recruitment call). Specifically, if they are recruiting a site for which they conducted a screening call, they will send an email only. If they are recruiting a site that did not participate in a screening call, they will invite the CCDF Administrator to participate in a recruitment call so they have an opportunity to share details about the case study and answer questions. The team will attach an overview of the project to both emails (Appendix B). Regardless of the recruitment approach, if the recipient does not reply within about one week, the research team will send a reminder email (see Appendix A). They will follow up by telephone using a prepared script if the administrator is not responsive to email (see Appendix A). If the administrator declines the invitation to include the site in the study, the team will select a site from their list of alternates.


In each site that agrees to participate in a case study, the team will request that the administrator connect the team with a point of contact from the CCDF Lead Agency to assist with planning the case study. They will email the point of contact to schedule a planning meeting (Appendix C). In addition to the project overview (Appendix B), they will attach an overview of data collection activities (Appendix D) to the email. They will follow up by email if the point of contact has not responded within about one week (see Appendix C). The research team will meet with the point of contact to discuss who to interview based on the topics planned for each interview and logistics for scheduling interviews and inviting child care providers to focus groups (Appendix C includes a guide the research team will follow for the planning meeting).


The research team will work with the site’s point of contact to schedule interviews with the CCDF Lead Agency staff, strategy implementation lead staff, and the data and evaluation lead staff. They will follow the point of contact’s preference as to whether the point of contact will contact the individuals and schedule the interviews or whether the team will contact the individuals directly. If the latter, the team will ask the point of contact to send an introductory email on the team’s behalf to make a “warm hand off.” If the research team conducts outreach directly to recruit interview participants, they will send an initial outreach email (Appendix E). If the recipient does not reply within about one week, the research team will send a reminder email (see Appendix E). If they are not able to reach potential interview participants, they will follow up with the site’s point of contact for assistance.


The research team will also work with the site’s point of contact to determine the best approach to recruit child care providers to participate in a focus group or other strategy recipients (who may include staff from community-based or intermediary agencies who are leading local projects supported by the site’s strategy or community engagement group members who helped plan and/or oversee implementation of the local projects) to participate in interviews. Based on past experience, the team anticipates working with local intermediaries or service organizations (e.g., child care resource and referral agencies or family child care networks) to recruit providers. If this is the approach recommended by the point of contact, the team will email the staff at the intermediary or local service organizations (Appendix F) to request their assistance distributing a flyer to potentially eligible providers (Appendix G). To express interest in participating in a focus group, child care providers will have the option to contact the research team by email or telephone or by filling out an online contact form (Appendix H). The research team will contact providers to confirm their eligibility and schedule the focus group.


Quality Control

Each research team member will participate in a training prior to data collection which will focus on strategies to ensure they collect high-quality data in the least burdensome way for respondents, including: (1) how to prepare for the interview (for example, identifying where documents the team has gathered already provide the needed information); (2) how to efficiently move through the interview protocols and focus group guide while collecting high-quality information (for example, how to make decisions about which probes are critical, based on answers received to that point in the interview); and (3) how to synthesize notes after each interview to confirm completeness of the data.


For all interviews and focus groups (Instruments 1-6) one member of the site liaison team will conduct the interview or facilitate the focus group and one will take notes. With the permission of respondents, the team will also record the interviews and focus groups for the purpose of audio transcription.


The site liaison team will meet briefly after each interview and focus group to debrief on the information shared and review any questions or gaps in notes taken during the data collection activity. They will use transcriptions of interviews and focus groups as the primary data source, and they will use notes taken during the interviews and focus groups to clarify any information unclear in the transcriptions. Throughout the data collection period, site liaisons and project leaders will hold weekly meetings to share updates, identify and troubleshoot challenges, and ensure that all data are collected as intended.



B5. Response Rates and Potential Nonresponse Bias

Response Rates

The interviews and focus groups are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported. The research team will track refusals to invitations to participate and document reasons for refusal when known. For most sites, the research team will conduct focus groups with child care providers who have experience participating in or receiving a selected strategy. In those cases, the research team will have access to providers’ names and contact information for active recruitment. They will track refusals and calculate a refusal rate for each site as a percentage of the number of providers they attempt to recruit in that site, including both direct declines and soft refusals (i.e., no-shows and unresponsive providers).
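The refusal-rate calculation described above reduces to simple arithmetic. The sketch below is illustrative only: the outcome labels and counts are hypothetical assumptions, not the team’s actual tracking scheme, but the computation follows the definition in the text (direct declines plus soft refusals, divided by all providers the team attempted to recruit in a site).

```python
# Hypothetical recruitment tracking for one site. Each provider the team
# attempts to recruit receives one outcome label; "no_show" and
# "unresponsive" count as soft refusals per the definition above.
outcomes = (
    ["participated"] * 7
    + ["declined"] * 2      # direct refusals
    + ["no_show"] * 1       # soft refusal
    + ["unresponsive"] * 2  # soft refusals
)

SOFT_REFUSALS = {"no_show", "unresponsive"}

# Count direct declines and soft refusals together.
refusals = sum(o == "declined" or o in SOFT_REFUSALS for o in outcomes)

# Refusal rate = refusals / providers the team attempted to recruit.
refusal_rate = refusals / len(outcomes)  # 5 of 12, about 41.7%
```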


Nonresponse

Since interview and focus group participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated.



B6. Production of Estimates and Projections

This study is descriptive. The research team does not intend for this information to be used as the principal basis for public policy decisions. Data will not be used to generate population estimates, either for internal use or dissemination.



B7. Data Handling and Analysis

Data Handling

The research team will conduct the interviews and focus groups through Zoom.gov, acquiring participants’ consent to be audio recorded. Following each case study, the research team will have the audio recordings transcribed by a trusted vendor (i.e., not with the use of Artificial Intelligence). As detailed in the project’s data security plan, the contractor (research team) will store these recordings and transcripts on a protected drive designed for confidential data in a project-specific folder set up for this data collection that will only be accessible to research team members. No data will be saved on the cloud or individual computers, and no raw data files will be shared via email.


For analysis, the research team will rely on the transcriptions of audio recordings. In cases where an interview or focus group participant does not consent to an audio recording, the team will rely on near-verbatim notes taken by a member of the research team. Data collectors will review the interview and focus group transcripts for clarity, and a senior member of the team will review a subset of the notes to ensure that data are complete and error free.


Data Analysis

The research team’s approach to data analysis is designed to yield highly trustworthy findings related to each of the case study’s research questions. With respect to each question, the research team will examine each site independently (site-specific analysis), and may conduct analysis across sites in the study (cross-site analysis). When possible, within sites, the team will triangulate responses between the various interviewees to develop a more thorough understanding of the strategy, its implementation, and any relevant outcomes.


To facilitate secure data management and efficient coding, the team will use NVivo, a qualitative coding and analysis software program. The interview and focus group transcripts will be loaded into the program, and members of the research team will code each transcript in accordance with a codebook. The team will use deductive coding, meaning they will use a predefined set of codes (Bingham and Witkowsky 2022). This method is appropriate since the primary purpose of coding for this study is to organize the data to facilitate analysis across interviews and focus groups within a site. The codebook will include codes linked to the case study research questions and primary topics from the study’s instruments. No Artificial Intelligence will be used to code or analyze the transcripts.


Senior members of the research team will train the coding and analysis team on the codebook and coding procedures. They will check reliability across coders by having each coder code the same transcript and then calculating kappa coefficients in NVivo for codes and subcodes. The team will meet to discuss discrepancies and misalignment and to identify specific rules to improve reliability. If any codes or subcodes have kappa values below 0.3 (considered low, or not coded consistently), the team will repeat the exercise with a second transcript. The team will aim to achieve acceptable reliability, on average, across all codes and subcodes (kappa values above 0.8), with all codes and subcodes at or above 0.4 (considered medium or moderate; O’Connor and Joffe 2020).
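The reliability check described above can be sketched in a few lines. This is a minimal, dependency-free illustration of Cohen’s kappa for one code applied by two coders, not the NVivo computation itself; the coder vectors and the code name are hypothetical, and the decision rules mirror the thresholds stated in the paragraph above (below 0.3 low; 0.4 moderate; 0.8 high).

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' decisions on the same excerpts
    (e.g., 1 = code applied to the excerpt, 0 = not applied)."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: share of excerpts coded identically.
    po = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Chance agreement, from each coder's marginal rates.
    ca, cb = Counter(coder_a), Counter(coder_b)
    pe = sum((ca[k] / n) * (cb[k] / n) for k in set(coder_a) | set(coder_b))
    if pe == 1:
        return 1.0
    return (po - pe) / (1 - pe)

# Hypothetical: whether each of 12 excerpts received the code
# "funding_challenges", as judged independently by two coders.
coder_1 = [1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 0, 0]
coder_2 = [1, 1, 0, 1, 0, 1, 1, 1, 0, 1, 0, 0]
kappa = cohens_kappa(coder_1, coder_2)

# Illustrative decision rules based on the study's stated thresholds.
if kappa < 0.3:
    decision = "recode a second transcript"
elif kappa < 0.4:
    decision = "discuss discrepancies and refine coding rules"
else:
    decision = "acceptable"
```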


Site-specific analysis will take place on a rolling basis, after each case study is completed. For each site, the team that conducted the case study will apply the codes in the codebook.


Each site liaison team will draft a site summary memo that provides an overview of the strategy (including data gathered as part of the web scan or survey), addresses the research questions, and provides an assessment of whether the strategy is ready to be rigorously evaluated and what evaluation design might be appropriate. The memos will be free of personal identifiers or distinguishing information, though they will identify the site and strategy by name. They will serve as the main memo(s) of findings for the case study task.


The research team may conduct cross-site analysis when all case studies are complete and create a report or brief presenting findings across sites for internal ACF purposes. The research team may also publish a report or brief presenting findings across sites, under specific circumstances (detailed under “Data Use” section below and in alignment with the specifications in the umbrella generic clearance). They will assign each research question to a member of the team; that team member will be responsible for examining relevant coded excerpts from all case study sites and making meaning. These team members will first compare themes across sites. They may also compare what is emerging within and across strategies with similar objectives or components, to understand if there are similarities across sites implementing similar strategies. The team will meet several times to discuss findings and draw out the cross-site narrative about the different approaches to building and sustaining the supply of CCEE. The final report will present cross-site findings and highlight areas for future research and evaluation of the supply-building and sustainability strategies.


Data Use

The research team will produce a site-specific 12-to-15-page memo for each case study site, addressing the research questions for that site individually. These memos will be drafted on a rolling basis. The team may also prepare a final report (25 to 30 pages in length) presenting the findings from the cross-site analysis.


The research team may also present findings during presentations or briefings to federal staff at ACF, technical assistance providers, and CCDF Lead Agency staff. Information from the case studies may also be made public in other ways, in line with the specifications in the umbrella generic, which provides the following examples: research design documents or reports; research or technical assistance plans; background materials for technical workgroups; concept maps, process maps, or conceptual frameworks; contextualization of research findings from a follow-up data collection that has full PRA approval; informational reports to TA providers; or project-specific reports or other documents relevant to the field and its audiences, such as federal leadership and staff, grantees, and local implementing agencies. In sharing findings, the team will describe the study methods and limitations with regard to generalizability and as a basis for policy.





B8. Contact Persons

Amanda Coleman, OPRE project officer, Amanda.Coleman@acf.hhs.gov

Bonnie Mackintosh, OPRE project officer, Bonnie.Mackintosh@acf.hhs.gov

Heather Sandstrom, Project Director, HSandstrom@urban.org

Gina Adams, Principal Investigator, GAdams@urban.org

Sarah Prendergast, Project Manager and Co-Task Leader, SPrendergast@urban.org

Tricia DelGrosso, Co-Task Leader, TDelgrosso@urban.org



Attachments


Instrument 1. CCDF Lead Agency Staff Recruitment Call Protocol

Instrument 2. CCDF Lead Agency Staff Interview Protocol

Instrument 3. Strategy Implementation Lead Interview Protocol

Instrument 4. Data and Evaluation Staff Interview Protocol

Instrument 5. Child Care Provider Focus Group Guide

Instrument 6. Other Recipient Interview Protocol


Appendix A. Recruitment Emails to CCDF Lead Agencies and Follow-up Recruitment Phone Scripts

Appendix B. Project Overview of Data Collection Outreach

Appendix C. Email to Point of Contact to Schedule Planning Call and Guide for Planning Call

Appendix D. Overview of Data Collection Activities

Appendix E. Recruitment Emails to Interview Participants

Appendix F. Outreach Email to Intermediary or Local Service Organizations to Help Recruit Focus Group Participants

Appendix G. Focus Group Recruitment Flyer

Appendix H. Focus Group Contact Form

Appendix I. Thank You Emails to Interview and Focus Group Participants

Appendix J. Urban Institute IRB Approval



References

Bingham A. J., & Witkowsky P. (2022). Deductive and inductive approaches to qualitative data analysis. In Vanover C., Mihas P., Saldaña J. (Eds), Analyzing and interpreting qualitative data: After the interview. Sage Publications.

O’Connor, C., & Joffe, H. (2020). Intercoder Reliability in Qualitative Research: Debates and Practical Guidelines. International Journal of Qualitative Methods, 19. https://doi.org/10.1177/1609406919899220.




