
Alternative Supporting Statement for Information Collections Designed for Research, Public Health Surveillance, and Program Evaluation Purposes





Culture of Continuous Learning Project: Case Study



OMB Information Collection Request

0970 - New Collection





Supporting Statement

Part B



December 2022








Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officer:

Nina Philipsen

Part B


B1. Objectives

Study Objectives

The primary objective of the Culture of Continuous Learning (CCL) case study is to collect information about the factors that contribute to the feasibility of implementing the Breakthrough Series Collaborative (BSC) within a state quality improvement system (e.g., a state quality rating and improvement system) and/or a regional professional development or technical assistance system (e.g., a region within a state, or a cross-state region such as a Head Start regional technical assistance area). This information will allow us to refine the hypotheses and study measures needed to design an evaluation for a future study of BSCs in early care and education (ECE) systems. The case study will also help determine what additional capacity ECE systems may need to adopt the BSC methodology and offer it within their systems at a larger scale. Specific CCL project objectives include:

  1. Understand what factors facilitate and hinder the implementation of a BSC within an existing state, Head Start, or regional ECE quality improvement (QI) system.

  2. Identify what is needed to further integrate a BSC into existing state, Head Start, or regional QI systems in ways that promote sustainability.

  3. Add to the evidence base for the BSC as an approach to continuous quality improvement in early childhood settings.


Generalizability of Results

This study is intended to present an internally valid description of BSCs in the chosen sites, not to support statistical generalization to other sites or service populations. Information collected with the case study design cannot be generalized to the broader population of child care centers and Head Start programs. The results of this case study will be used to inform the design of a future, larger-scale project on implementing the BSC methodology in early childhood systems at the state and/or regional level, and to identify appropriate research approaches for evaluating such an effort.


Appropriateness of Study Design and Methods for Planned Uses

We plan to use a case study design to document and assess the feasibility of implementing BSCs within a state quality improvement system and/or a regional professional development or technical assistance system. Using a case study design is appropriate for our intended purposes. Case studies are well-suited for collecting data at different timepoints from a variety of respondents, through multiple means (e.g., interviews, surveys, focus groups, observations). This approach will yield rich descriptions of BSC implementation in Head Start and child care centers, including the facilitators and barriers to implementation, and the overall extent to which it is a feasible approach for promoting changes in practice at multiple levels of the organization that ultimately support children’s social and emotional learning.


As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information. This case study does not include an impact evaluation and should not be used to assess participant outcomes. Publicly available written products will clearly describe study limitations.



B2. Methods and Design

Target Population

We will target up to 24 ECE centers across three locations to participate. From each selected center, we will collect information from child care or Head Start directors, administrators, and staff; teachers, assistant teachers, and teacher's aides; and parents/guardians.


Sampling and Site Selection

Site Selection

We will select the three locations (i.e., states and/or regions) for the BSCs through active engagement of fewer than nine individuals. The locations will be selected to represent varied conditions for implementing a BSC and to demonstrate states' and regions' readiness to partner in implementation. In each of the three locations we will conduct one BSC comprising up to eight ECE sites (i.e., child care or Head Start centers), for a total of 24 participating ECE centers.



In each location, we will work with a local partner to identify ECE sites (i.e., child care or Head Start centers) to invite to apply to participate in the BSC. The CCL implementation team will use an application and selection process to identify the centers that will ultimately be invited to participate. A BSC coordinator (i.e., a member of the CCL implementation team) will work with the local partner to share information about the CCL project and to invite ECE centers to complete a BSC Selection Application Questionnaire (Instrument 1). Centers will be identified through purposive sampling based on the joint decisions of the BSC implementation staff and the local partner, and will likely already participate in existing statewide systems (e.g., QRIS, Head Start) that support centers' professional development (PD). Up to 45 ECE centers will be invited to complete an application, with up to 5 people per center involved in completing it. Up to 8 centers will be selected for each of the three BSCs, for a total of up to 24 participating ECE centers. Selection is a mutual process: the CCL Implementation Team will select the centers it judges most "ready" and capable of fully participating in the BSC, based on their responses to the selection application questionnaire.



Participant Sampling

Below we outline how we will select participants for each set of proposed respondents. All samples below represent convenience samples from the ECE centers that are selected for participation in the BSCs. Because participants will be selected through purposive and convenience sampling, they will not be representative of the population of state administrators or child care/Head Start staff. Instead, we aim to obtain variation in perspectives to understand the range of experiences regarding the BSC.



BSC Implementation Staff and Faculty: The faculty, quality improvement advisors, implementation managers and coordinators, and the CCL implementation leads comprise the BSC implementation staff and faculty who lead and implement the BSC. All implementation staff and faculty who are employed as part of the CCL contract (N = 39; 13 per BSC) will be invited to participate in the evaluation. Implementation staff and faculty will be asked to participate in limited evaluation activities for the purpose of process (or implementation) evaluation. Primarily, they will be asked to reflect on their experiences supporting BSC teams.



Core BSC Team Members: Each participating ECE center will internally select a Core BSC Team whose members participate in all BSC activities and have first-hand experience with the BSC elements. Within each ECE center, up to 7 individuals (e.g., directors, lead teachers, assistant teachers, teacher aides, parents, curriculum specialists) will be part of the Core BSC Team, meaning that up to 168 individuals will participate. We will invite all BSC participants, including center administrators, staff, and parents, to take part in evaluation activities.

Non-BSC Participant Teachers and Staff at Participating Centers: We will invite all other teachers (N = 240; 10 per ECE center) and support staff (N = 96; 4 per ECE center) at each participating ECE center to respond to surveys.

Parents in the BSC-participating Centers: We will invite all parents at each participating ECE center (N = 2136; approximately 89 per ECE center) to respond to surveys.
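
To make the sample arithmetic above easy to verify, the following minimal sketch (written in Python purely for illustration; it is not part of the study design) tallies the "up to" respondent maximums stated in this section.

    # Minimal sketch tallying the maximum planned respondent counts from
    # Section B2. Actual participation is voluntary and may be lower.
    N_LOCATIONS = 3          # one BSC per state/region
    CENTERS_PER_BSC = 8      # up to 8 ECE centers per BSC
    centers = N_LOCATIONS * CENTERS_PER_BSC   # 24 participating centers

    planned = {
        "BSC implementation staff and faculty": 13 * N_LOCATIONS,  # 39
        "Core BSC team members": 7 * centers,                      # 168
        "Non-BSC teachers": 10 * centers,                          # 240
        "Support staff": 4 * centers,                              # 96
        "Parents": 89 * centers,                                   # 2,136
    }

    for group, n in planned.items():
        print(f"{group}: {n}")
    print("Total potential respondents:", sum(planned.values()))  # 2,679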



B3. Design of Data Collection Instruments

Development of Data Collection Instruments

Each instrument aims to collect unique information about the characteristics of ECE centers; perceptions, experiences, and activities of parents and staff (administrators, teachers, assistant teachers); and implementation fidelity. There is limited research and data on BSC implementation in ECE settings to inform our constructs. As such, we plan to collect data from various sources to strengthen our ability to adequately measure our constructs. See Table B3 for information on which project objectives will be addressed by each instrument.


All BSC implementation instruments were drawn from the previous Culture of Continuous Learning Project: A Breakthrough Series Collaborative for Improving Child Care and Head Start Quality (CCL) study (OMB #0970-0507) and tailored to meet the current study's objectives. The observation protocol, focus group guides, and interview guides were developed to capture relevant constructs in collaboration with experts in BSC methodology, child care and Head Start systems, and research evaluation methodology. The surveys combine existing validated scales [e.g., BASiC-QI (Brown et al., 2019)] with newly developed items aligned with the CCL project theory of change, including survey items developed for ACF's ExCELS study (OMB #0970-0582). The surveys were also finalized through expert consultation and pilot testing with fewer than 9 individuals.


All instruments were developed by identifying the constructs required to adequately answer each guiding question (see Supporting Statement A) and address the study objectives.


Table B3. Study objectives addressed by each instrument

Objective 1: Understand what factors facilitate and hinder the implementation of a BSC within an existing state, Head Start, or regional system

Instruments:

  • BSC Selection Application Questionnaire

  • Pre-Work Assignment: Data Collection Planning Worksheet

  • Plan, Do, Study, Act (PDSA) Form & Tracker

  • Monthly Metrics

  • Implementation Discussion Forum Prompts

  • Learning Session Feedback Form

  • Action Planning Form

  • BSC Overall Feedback Form

  • Organizational Self-Assessment

  • Key Informant Interviews with BSC Faculty Members Affiliated with the States/Regions

  • BSC Implementation Staff and Faculty Focus Groups

  • BSC Teachers and Support Staff Focus Groups

  • BSC Parent Focus Groups

  • Individual BSC Teams Focus Groups

  • Administrator Surveys

  • Teacher Surveys

  • Administrative Data Survey

  • BSC Implementation Staff and Faculty Background Survey

Objective 2: Identify what is needed to further integrate a BSC into existing state, Head Start, or regional systems in ways that promote sustainability

Instruments:

  • Key Informant Interviews with BSC Faculty Members Affiliated with the States/Regions

  • BSC Implementation Staff and Faculty Focus Groups

  • Key Informant Interviews with BSC Center Administrators

Objective 3: Add to the evidence base for the BSC as an approach to continuous quality improvement in early childhood settings

Instruments:

  • BSC Selection Application Questionnaire

  • Pre-Work Assignment: Data Collection Planning Worksheet

  • Plan, Do, Study, Act (PDSA) Form & Tracker

  • Monthly Metrics

  • Implementation Discussion Forum Prompts

  • Learning Session Feedback Form

  • Action Planning Form

  • BSC Overall Feedback Form

  • Organizational Self-Assessment

  • BSC Implementation Staff and Faculty Focus Groups

  • BSC Teachers and Support Staff Focus Groups

  • BSC Parent Focus Groups

  • Individual BSC Teams Focus Groups

  • Key Informant Interviews with BSC Center Administrators

  • Administrator Surveys

  • Teacher Surveys

  • Other Center Staff Surveys

  • Non-BSC Parent Surveys

  • BSC Parent Surveys

  • Classroom Observations



B4. Collection of Data and Quality Control

Who will be collecting the data?

The Contractor will collect BSC selection application data, administer surveys, and conduct interviews, focus groups, and classroom observations.


A member of the CCL project team will support BSC teams by introducing participants to their online shared learning platform, which will house the implementation instruments (i.e., the pre-work assignment data collection planning worksheet, PDSAs, monthly metrics template, discussion forum prompts, learning session and BSC overall feedback forms, action planning form, and the organizational self-assessment). Each BSC team member will learn how to interact with the shared online learning platform, including how to access materials and engage in discussion forums. Each BSC team will also designate a team data manager responsible for uploading completed implementation instruments to the team's shared site.


What is the recruitment protocol?

Recruitment of ECE Centers

As described in Section B2, we will first conduct outreach to ECE centers in each location to describe the CCL project and invite centers to complete the BSC Selection Application Questionnaire. Outreach materials and activities will be finalized in collaboration with our state/regional partners and may include informational meetings to share project information and expectations, as well as emails, flyers, and phone calls. See Appendix A for sample outreach materials.


Recruitment of Individuals

Individual participants will be invited to participate in each research activity (surveys, interviews, and focus groups) via email, verbal reminders during BSC activities, and reminders on the online shared learning platform. Participants will be able to provide informed consent for each individual research activity.


What is the mode of data collection?

Data collection will be conducted through multiple means: implementation instruments hosted on our secure web-based learning platform; online surveys hosted through REDCap, our secure online data collection platform; interviews and focus groups conducted through Microsoft Teams; and classroom observations conducted using the Swivl video recording system.


What data evaluation activities are planned as part of monitoring for quality and consistency in this collection, such as re-interviews?

While respondents are taking a survey, REDCap’s built-in validation functions will ensure responses are within expected ranges. If the response does not pass validation, the participant will be prompted to correct the response. If participants start the survey but do not complete it, reminder emails will be sent as part of the outreach efforts. The CCL team will monitor survey responses and conduct quality assurance checks.
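
For illustration only, the logic of such a range check resembles the minimal Python sketch below; in practice the checks are configured through REDCap's built-in field validation settings, and the field names and ranges shown here are hypothetical, not actual CCL survey items.

    # Illustrative sketch of range validation. REDCap performs this kind of
    # check natively; field names and ranges below are hypothetical examples.
    EXPECTED_RANGES = {
        "years_experience": (0, 60),    # hypothetical numeric field
        "children_enrolled": (0, 500),  # hypothetical numeric field
    }

    def validate_response(field: str, value: float) -> bool:
        """Return True if the value falls within the expected range."""
        low, high = EXPECTED_RANGES[field]
        return low <= value <= high

    # A failing check would prompt the respondent to correct the entry:
    if not validate_response("years_experience", 75):
        print("Please enter a value between 0 and 60.")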



B5. Response Rates and Potential Nonresponse Bias

Response Rates

The instruments are not designed to produce statistically generalizable findings, and participation is wholly at the respondent's discretion. Response rates will not be calculated or reported.


Nonresponse

As participants will not be randomly sampled and findings are not intended to be representative, nonresponse bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with the data collection.



B6. Production of Estimates and Projections

The data will not be used to generate population estimates, either for internal use or dissemination.



B7. Data Handling and Analysis

Data Handling

Any electronic notes and audio recordings from the interviews and focus groups will be stored in a secure, password-protected location. A quality improvement advisor from the CCL implementation team will check in with teams whose members are not engaging with or uploading information to their online shared learning platform.


For all surveys, the CCL team will build validation checks into the REDCap survey platform to ensure responses are within expected ranges. Skip logic will ensure respondents see only the questions relevant to them. The survey team will also conduct ongoing reviews of the data, including frequencies and cross tabulations, to ensure each survey is running as expected. The data will be stored on REDCap's secure server. Only research team members who have completed human subjects research and data security training will have access to data collected through REDCap.
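
As an illustration of these ongoing reviews, the minimal pandas sketch below checks frequencies and cross tabulations on a survey export; the file and column names ("role", "site_id", "completed") are hypothetical placeholders, not actual CCL variables.

    import pandas as pd

    # Minimal sketch of quality-review checks on a hypothetical survey export.
    df = pd.read_csv("ccl_survey_export.csv")

    # Frequencies: confirm each response category appears as expected.
    print(df["role"].value_counts(dropna=False))

    # Cross tabulation: confirm completion patterns look reasonable by site.
    print(pd.crosstab(df["site_id"], df["completed"]))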



Data Analysis

Both qualitative and quantitative data will be collected as evidence to test the hypotheses associated with the stated research questions. For quantitative measures, we will report means and ranges on items and scales for the overall sample and for subgroups of interest. Data from standardized observational tools and questionnaires, both within and across case study locations, will be examined for statistically significant patterns of change over time in attitudes, beliefs, or practices among individuals and teams participating in the BSC, and for the "spread" of continuous learning practices beyond those immediately involved in the BSC.
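
As an illustration of the planned descriptive summaries, the minimal pandas sketch below computes means and ranges overall and by subgroup and timepoint; the column names ("location", "role", "timepoint", "scale_score") are hypothetical placeholders, not actual instrument variables.

    import pandas as pd

    # Minimal sketch of the planned descriptive analyses, assuming survey data
    # exported from REDCap to a flat file. Column names are hypothetical.
    df = pd.read_csv("ccl_survey_export.csv")

    # Means and ranges for the overall sample.
    print(df["scale_score"].agg(["mean", "min", "max"]))

    # The same summaries by subgroup and timepoint, to describe patterns of
    # change over the course of the BSC.
    print(
        df.groupby(["location", "role", "timepoint"])["scale_score"]
          .agg(["mean", "min", "max", "count"])
    )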


All data collected and used for the evaluation study—both quantitative and qualitative data from multiple participants and data sources (e.g., administrative data, surveys, in-depth interviews, focus groups, and field notes from observations)—will be analyzed for emergent themes related to each of the seven research questions.


There are four general analytic strategies used in case study research: (1) testing theoretical propositions (i.e., deductive analysis); (2) "ground up" analysis (i.e., inductive or "grounded theory" analysis); (3) case description; and (4) examining plausible, rival explanations (Yin, 2014). In addition, there are five specific analytic techniques used in case study research: (1) pattern matching, (2) explanation building, (3) time-series analysis, (4) logic models, and (5) cross-case synthesis (Yin, 2014). For this evaluation study, we will use a combination of analytic strategies, including case description, theory testing, and examination of rival explanations, along with a combination of analytic techniques, including pattern matching and cross-case (or cross-location) synthesis. We will use pattern matching to detect whether the evidence supports the hypotheses regarding how the BSC facilitates change in individual and organizational practice, while remaining open to alternative explanations for the patterns we find, and we will provide a general description of how the BSC was implemented in this set of early care and education locations. Using cross-location analysis and synthesis, we will document patterns that emerge across the up to eight centers participating in each BSC regardless of auspice (e.g., Head Start or child care), and look for patterns across other characteristics of the participating centers.


Data Use

We will produce a final report describing the methods and findings of the case studies, as well as several shorter summary documents for various target audiences (e.g., a brief about lessons learned for ECE system leaders, an additional tool for BSC implementers at the ECE system level, conference presentations or webinars for research and policy audiences). The findings will also be used to inform the design of a potential future research study about implementing a BSC in ECE systems.



B8. Contact Persons

Kathryn Tout

Vice President of Early Childhood Research and Partnerships

708 North First Street, Suite 333 | Minneapolis, MN 55401

ktout@childtrends.org

(612) 250-1592


Anne Douglass

Executive Director, Institute for Early Education Leadership & Innovation

Professor and Program Director, College of Education & Human Development

University of Massachusetts Boston

anne.douglass@umb.edu


Nina Philipsen

Senior Social Science Research Analyst, Division of Child and Family Development

Office of Planning, Research, and Evaluation

Administration for Children and Families

US Department of Health and Human Services

330 C Street, SW | Washington, DC 20201

nina.philipsen@acf.hhs.gov







Attachments

Instrument 1: BSC Selection Application Questionnaire

Instrument 2: Pre-Work Assignment: Data Collection Planning Worksheet

Instrument 3: Plan, Do, Study, Act (PDSA) Form & Tracker

Instrument 4: Monthly Metrics

Instrument 5: Implementation Discussion Forum Prompts

Instrument 6: Learning Session Feedback Form

Instrument 7: Action Planning Form

Instrument 8: BSC Overall Feedback Form

Instrument 9: Organizational Self-Assessment

Instrument 10: Key Informant Interviews with BSC Faculty Members Affiliated with the States/Regions Discussion Guide

Instrument 11: BSC Implementation Staff and Faculty Focus Group Discussion Guide

Instrument 12: BSC Implementation Staff and Faculty Background Survey

Instrument 13: Key Informant Interviews with BSC Center Administrators Discussion Guide

Instrument 14: BSC Teachers and Support Staff Focus Group Discussion Guide

Instrument 15: BSC Parent Focus Group Discussion Guide

Instrument 16: Individual BSC Teams Focus Group Discussion Guide

Instrument 17a-dii: Pre-post Surveys with Administrators, Teachers, Staff, and Parents

Instrument 18: Classroom Observations

Instrument 19: Administrative Data Survey

Appendix A: BSC information session announcement

Appendix B: Information session registration confirmation



