Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Home-Based Child Care
Practices and Experiences Study
OMB Information Collection Request
0970 - New Collection
Supporting Statement
Part B
May 2023
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officers: Ann Rivera and Bonnie Mackintosh
Part B
B1. Objectives
Study Objectives
The Home-Based Child Care (HBCC) Practices and Experiences Study is designed to shed light on the experiences of a particular group of family, friend, and neighbor (FFN) providers—those who are involved, or might later become involved, in child care and early education (CCEE) systems. We define FFN providers as those who are legally exempt from state licensing or other state regulations for child care providers that govern non-custodial care of children in the provider's own home. The objective of this study is to inform understanding of the experiences, strengths, resources, and strategies FFN providers use to serve children and families and support equitable outcomes, and of how the experiences of home-based providers intersect with providers' culture, race, ethnicity, language, and geographic location.
Generalizability of Results
This study will use qualitative methods to generate rich detail about the experiences of FFN providers. The study's approach to identifying sites and providers through trusted partners means that the findings will shed light on the experiences of some FFN providers—those who are involved or might later become involved in CCEE systems. The study is intended to present an internally valid description of this population in the chosen sites, not to support statistical generalization to other sites or service populations. Accordingly, we will not be able to conclude whether any findings are representative of these subgroups of FFN providers throughout the country, nor will the findings be representative of the experiences of all FFN providers. Publications resulting from the study will acknowledge this limitation. Nonetheless, these data will provide information about a population of great policy interest that is not well represented in existing research (Bromer et al. 2021; Del Grosso et al. 2021). This study will help us understand more about the experiences and needs of FFN providers who are participating in or might participate in CCEE systems.
Appropriateness of Study Design and Methods for Planned Uses
By drawing on ethnographic methods—including in-depth interviews, photographs, and audio journals—the study will help researchers, policymakers, program administrators, and other stakeholders to develop a fuller understanding of the circumstances, challenges, and opportunities that FFN providers face, and the strategies, strengths, and priorities they bring to these experiences. The study’s approach will be primarily open-ended to avoid imposing any preconceived ideas about what constitutes quality in child care on the study participants. Instead, it is important to learn directly from providers about their experiences and the practices that they think are most important for children and families, without weighting the discussion toward conceptions of quality that do not take into account HBCC perspectives, especially among providers from historically excluded and/or marginalized communities.
As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information. The data collected are not intended to be representative, and findings may not necessarily apply to all sites or providers. This study does not include an impact evaluation and will not be used to assess participants’ outcomes. All publicly available products associated with this study will clearly describe key limitations.
B2. Methods and Design
Target Population
For each of four sites, the study team will collect data from FFN providers, family members of children cared for by the FFN providers, and community members. The study team will use non-probability, purposive sampling to identify potential respondents. Because participants will be purposively selected, they will not be representative of their populations. Instead, the study aims to obtain variation in providers’ experiences to explore how these experiences intersect with culture, race, ethnicity, language, and geographic location.
Respondent Recruitment and Site Selection
The study team will conduct the study in four purposively selected sites: Alabama, Arizona, the Bay Area in California, and New York City in New York. When considering possible sites, the team prioritized sites with:
A trusted partner, such as a community organization that offers support and quality improvement opportunities to FFN providers, that is willing to support the study and has the capacity to work with the study team to recruit enough providers.
Providers from a range of backgrounds, including those more likely to live in under-resourced communities.
Different state policy contexts, such as child care licensing thresholds (the number of children allowed in home-based child care settings), policies on subsidy eligibility, participation of FFN providers in Quality Rating and Improvement Systems (QRIS), and participation in the Child and Adult Care Food Program (CACFP).
Different quality improvement supports available for FFN providers in the state.
For each of the four sites selected, the study team will work with trusted partners to screen 30 FFN providers with the goal of recruiting 15 (60 providers total) to participate in the study. The study team will seek providers who are at least 18 years old and who (1) provide care for at least 15 hours per week, (2) provide care in their own home, (3) do not have a current license to provide HBCC, and (4) have at least one of the following: (a) receive payment for providing care, (b) have considered licensing, and/or (c) have prior CCEE work or volunteer experience or related training. The study team will work with each participating provider to recruit and interview 2 family members (120 total) and 1 community member (60 total).
B3. Design of Data Collection Instruments
Development of Data Collection Instrument(s)
The HBCCSQ Project's conceptual framework (forthcoming) guided the development of the nine data collection instruments. Specifically, the instruments aim to explore several topics while retaining an open-ended style of questioning and discussion. Topics include features of quality that might be implemented differently, or are more likely to occur, in HBCC than in other CCEE settings. When feasible and appropriate, the study team drew on or adapted questions from previous research, including qualitative and ethnographic research with FFN and other HBCC providers and with families. Some questions are drawn from the National Survey of Early Care and Education (NSECE), both to provide context and to identify topics to ask about in more depth. Other questions are new and were developed by the study team to answer the study's guiding questions.
The study team also involved trusted partners in the instrument development process. The team shared draft instruments with trusted partners and asked for their written feedback. The team also asked four academic experts who have already contributed to this project to review and comment on the protocols and instruments (see Section A8). Experts provided written and verbal feedback.
All instruments and protocols will be translated into Spanish.
Aligning Instruments with Objectives
Table B.1, below, details how each data source aligns with each of the study’s five guiding questions.
Table B.1. Alignment of study guiding questions and data collection activities
Guiding question 1: Provider interview #1 (Instrument 2) and provider interview #2 (Instrument 6) will gather information from providers about the aspects of care they prioritize. Provider photo journals (Instrument 4) and provider audio journals (Instrument 5) will document what providers prioritize and think is important to share; they may also illuminate some of the stressors and working conditions that put pressure on FFN providers. Provider interview #2 (Instrument 6) will focus on provider-child interactions, supporting provider-family relationships, and how providers' racial, ethnic, and linguistic identities might intersect with their views about caregiving.
Guiding question 2: Provider photo journals (Instrument 4) and provider audio journals (Instrument 5) will document how each provider implements the features of quality that they think are most important, provide context, and explain why they think these are important (the study team will provide prompts to help inform their decisions). Provider interview #2 (Instrument 6) will ask providers to elaborate on certain quality features shown in their photos and will ask about other quality features that do not appear in their photos (such as features related to provider-child interactions and provider-family relationships).
Guiding question 3: Provider interview #1 (Instrument 2) will gather information about the resources and the sources of knowledge and strength providers use or access. Community member interviews (Instrument 8) will provide information about how community resources support the work of FFN providers.
Guiding question 4: Family member interviews (Instrument 7) will provide information about which features of quality families most prioritize and appreciate, why they find them important, and how they experience these features in FFN settings.
Guiding question 5: Comparing data from provider interview #1 (Instrument 2) and provider interview #2 (Instrument 6) with data from family member interviews (Instrument 7) for each provider will allow for exploration of the alignment between provider and family priorities and views of quality.
In addition to the data collection activities that directly answer the guiding questions, the provider screener (Instrument 1) will identify FFN providers who are eligible and interested in participating in the study. The provider logistics call (Instrument 3) will provide support and guidance for using the study smartphone and software application and will cover the providers' role in asking family and community members for permission to share their contact information for interviews. The provider feedback focus groups (Instrument 9) will allow the study team to share preliminary findings and themes with providers and obtain their feedback.
Pre-testing of Data Collection Instruments
Prior to submitting the data collection request, the study team completed a two-phase pre-test process: (1) an initial pre-test of the photo and audio journals and (2) a pre-test of the full data collection process. The team conducted pre-tests with multiple respondents per protocol but did not request the same information from more than nine respondents during the process. The study team conducted pre-tests in communities where the study will be conducted but did not conduct pre-tests with providers, families, or community members who will participate in the study.
Pre-testing of photo and audio journals
The goal of this step was to determine how best to execute the photo and audio journals in the full study and to confirm that the prompts are clear and understandable and that providers can respond to them with relevant information.
The study team conducted the pre-test with three providers (identified by a trusted partner) from one site over a three-week period. The team asked providers to record and submit photo and audio journal entries in response to prompts, using the phones and software that the study team lent to providers. The team used a separate call to focus on the logistics of the phones and software to ensure providers were comfortable using them to submit photo and audio journal entries.
At the end of the pre-test period, the study team conducted a cognitive interview by video or telephone with each provider to gather feedback on the logistics, the ease of recording audio journal entries and collecting photographs, the clarity of the prompts, and how the prompts led providers to choose what to record for the journal entries and what to capture in the photos.
Pre-testing of full data collection process
The study team tested the full data collection process for the provider interviews and photo and audio journals (6 providers), the family member interview (3 family members), and the community member interview (2 community members). For the provider data collection process, the team followed the full study procedures, including the interviews, the logistics call, and reminders and check-ins, with two exceptions. First, two of the providers completed only the interviews and did not participate in the photo and audio journal part of the study. Second, the team asked the other providers to complete only two weeks of photo and audio journal entries instead of entries for all four weeks. Throughout the pre-test, interviewers tracked the length of each interview, recorded any questions and assistance needed (related to content, the equipment, the journal software application, or the video conference call), and assessed the process of having providers assist with recruiting family and community members.
In addition, these pre-tests focused on the content of each instrument. Drawing on cognitive interviewing techniques, the study team asked participants for feedback on the overall flow of the interviews and the clarity of the questions and probes. The team asked providers for their opinions about recruiting family members and community members. Following provider interview #2, the team also asked providers for feedback on the process of collecting photo journals and recording audio journals, including the ease of understanding the prompts, the logistics of recording and uploading entries, and their ability to use video conferencing and screen sharing.
B4. Collection of Data and Quality Control
Selecting and Training Data Collectors
Selecting data collectors. Data will be collected by members of the study team, which includes staff at Mathematica and Erikson Institute, and the project’s consultant Toni Porter from Early Care and Education Consulting. Study team members will have experience conducting qualitative data collection and expertise in conducting semi-structured telephone interviews. Because data collection activities link to and build upon each other in important ways, the same data collector will be assigned to carry out all communication and data collection activities for a given provider.
Training data collectors. The qualitative and open-ended nature of the data collection activities and the population of interest require substantial attention to examining implicit biases. Data collectors will participate in multiple trainings. Trainings on recruitment and data collection will focus on building rapport; raising awareness around participant experiences with inequities, including racism; and other aspects of qualitative interviewing, including the need to maintain objectivity and to avoid allowing assumptions or preconceptions about participants to affect how the interviewers ask questions. Training on analysis and reporting will focus on coding open-ended data and interpreting the findings.
Recruitment Protocol
The study team will work with trusted partners to recruit providers and will work with participating providers to recruit family and community members. For all potential participants, the team has developed flyers and other recruitment materials that explain the study clearly and concisely. These materials are in Appendix A: Participant recruitment materials.
Recruiting providers. After OMB approval is received, the study team will ask trusted partners to help identify providers who meet the study criteria. These recruitment criteria include a mix of the following:
Are at least 18 years old and (1) provide care for at least 15 hours per week, (2) provide care in their own home, (3) do not have a current license to provide HBCC, and (4) have at least one of the following: (a) receive payment for providing care, (b) have considered licensing, and/or (c) have prior CCEE work or volunteer experience or related training.
Receive child care subsidies, participate in the Child and Adult Care Food Program (CACFP), or do not participate in either system.
Have a mix of individual and provider characteristics, such as the number and ages of children cared for, relationship to the children in care, race/ethnicity, and languages spoken.
Live in rural and urban areas, including high-poverty areas.
Per our trusted partner, are likely to agree to participate in the study.
The study team will obtain permission to collect provider contact information and work with liaisons identified by the trusted partner to contact the potential provider. Study team members will then call the providers to complete the 20-minute provider screener over the phone (rescheduling for a more convenient time if the provider prefers). The screener protocol includes giving an overview of the study and answering any questions from the provider.
The study team will select providers purposively and on a rolling basis, to minimize time between the provider screener and scheduling provider interview #1. For provider call scripts and email template materials see Appendix B: Participant scheduling scripts and supplemental contact materials.
Identifying and contacting family members of children cared for by the provider. During the provider logistics call, the study team will discuss with the provider the process for interviewing two family members (from two different families, who are parents or guardians of children cared for by the provider). The study team will ask the provider to give a contact form (and a flyer about the study) to a family member of each child in the provider's care. The form (see Appendix D: Consent statements and interview contact forms) documents the family member's agreement to be contacted for an interview. Providers will ask family members to sign the form if they agree, then collect the forms and return them to the study team. So that the study team receives the forms promptly and securely, providers will be asked to take photos of the forms and submit them through the journal software application. At the end of the study, providers will return the hard copies along with the study phone. (If a provider prefers, they can give the study team the contact information over the telephone.)
Once the provider has returned the signed contact forms (or otherwise given the contact information), the study team will select the two family members and reach out to them for a brief conversation in which the team will explain the study, confirm the family members' interest in participating, and schedule interview times that are convenient for them. For family member call scripts and email template materials see Appendix B: Participant scheduling scripts and supplemental contact materials.
Identifying and contacting community members. During provider interview #1, the study team will ask the provider to identify community members who are sources of support, either formal or informal. Based on this information, interviewers will discuss with the provider which community members (who are frequent or primary sources of support) to consider interviewing; the study team will prioritize formal sources of support.
The study team will ask the provider to reach out to the top three community member options (and give them a flyer about the study). During the provider logistics call, the study team will briefly review a form included in the materials; the form will be used by the provider to fill out contact information for the community members who have given permission to the provider for the study team to contact them. As with the family member forms, the provider will submit a photo of the community member form (which is part of Appendix D: Consent statements and interview contact forms) through the journal software. (If a provider prefers, they can give the study team the contact information over the telephone.)
After the provider has returned the form (or otherwise given the contact information), the study team will select one community member and contact them to confirm their interest in participating and arrange a time for the interview. For community member call scripts and email template materials see Appendix B: Participant scheduling scripts and supplemental contact materials.
Data Collection
Mode. Data collection will be conducted virtually. The COVID-19 pandemic led to logistical constraints and changes in data collection, but it also showed that virtual approaches can work and may even be preferred by participants because they are less burdensome, carry a lower risk of exposure to disease, and are more convenient. For these reasons, the study team plans to use a variety of virtual strategies, including photo and audio journals and telephone/video interviews. With consent, the study team will record the interviews so that, once the recordings are transcribed, the team can capture participants' exact wording and other nuances that would be harder to infer from notes.
Monitoring for data quality and consistency. To monitor data quality and consistency throughout data collection, the study lead will hold weekly check-ins with data collectors so that staff can discuss progress, ask questions, troubleshoot problems, and share successful strategies. In addition, study team members will monitor the completion of providers' photo and audio journal entries and offer feedback to providers as needed, depending on whether providers complete the activities and follow the instructions. The team will monitor submissions through the journal software application for quality, relevance, and completeness and will follow up with providers as needed to address any issues. See Appendix C: Instructions for providers to use study tools for the information the study team will give providers on using the smartphone and software application.
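As a simple illustration of this kind of completion monitoring, the sketch below (written in Python, with entirely hypothetical provider labels and submission counts) flags provider-weeks with no journal submissions; the actual monitoring will rely on the journal software application's own records.

```python
# Hypothetical weekly journal-entry counts by provider; real monitoring
# would draw these figures from the journal software application's records.
journal_entries = {
    "provider_01": {"week 1": 3, "week 2": 0, "week 3": 2},
    "provider_02": {"week 1": 2, "week 2": 2, "week 3": 3},
}

# Flag any provider-week with no photo or audio journal submissions so the
# assigned data collector can follow up with that provider.
for provider, weeks in journal_entries.items():
    missing = [week for week, count in weeks.items() if count == 0]
    if missing:
        print(f"{provider}: follow up needed for {', '.join(missing)}")
```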
B5. Response Rates and Potential Nonresponse Bias
Respondent Selection
The data collection for this study is not designed to produce statistically generalizable findings. Participation is wholly at each respondent’s discretion. Response rates will not be calculated or reported.
However, because the study team will screen potential respondents for eligibility and interest in participating, the team will calculate and report basic information about how many of the potential respondents screened participated, were not eligible, or declined to participate. The team will also report the percentage of respondents who complete each study activity. As noted in A13, the study team expects that 50 percent of the potential respondents screened will be eligible and agree to participate.
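As an illustration of the disposition reporting described above, the short Python sketch below uses invented counts to show how the shares of screened respondents who participated, were not eligible, or declined could be tabulated; it is not part of the study's procedures.

```python
# Hypothetical screener disposition counts for one site; the actual figures
# would come from the study's screening records.
dispositions = {
    "participated": 15,
    "not eligible": 8,
    "declined to participate": 7,
}

total_screened = sum(dispositions.values())
print(f"Potential respondents screened: {total_screened}")
for outcome, count in dispositions.items():
    share = 100 * count / total_screened
    print(f"  {outcome}: {count} ({share:.0f}% of those screened)")
```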
Nonresponse
As participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. Basic demographics of all potential respondents screened—including those who participate, are not eligible, or decline to participate—will be documented and reported in written materials associated with the data collection.
B6. Production of Estimates and Projections
The data will not be used to generate population estimates, either for internal use or dissemination. Information reported will clearly state that results are not meant to be generalizable.
B7. Data Handling and Analysis
Data Handling
Interview data: To ensure the quality and completeness of these data, the study team will develop standard templates for interview notes and train interviewers on the process for developing comprehensive notes. Interviewers will record initial written notes during and immediately after the interview and will immediately confirm that the interview recorded properly and that the dialogue is audible. The study team will have all interviews transcribed by an external service based on the recordings. Interviews conducted in Spanish will be transcribed and translated into English. Once a transcript is available, interviewers will briefly supplement the transcript using their notes (and the recording if needed). Each set of notes will be reviewed by either the study lead or the study consultant for quality and completeness. The reviewer will advise the interviewer whether any individualized follow-up is needed with the respondent to gather additional information or clarify existing information.
Photo and audio journal entries: The study team will view photo journal entries and listen to audio journal entries each week, in order to provide feedback to providers as needed, and then to prepare for provider interview #2, which will include follow-up questions about the photos.
Data Analysis
The study team will rely on NVivo qualitative software to analyze interview and photo and audio journal data. The analyses will draw on grounded theory and other approaches to qualitative data analysis (Maxwell 2013; Strauss and Corbin 1998). Triangulation of data sources will allow the team to examine themes that align with the study's guiding questions from multiple perspectives and to validate themes that emerge from any single data source.
The team will follow a systematic and iterative process of coding and analysis throughout and after the data collection process. Analytic memos will document preliminary themes, theories, and observations that emerge during the data collection and analysis process. The team will integrate coding across data sources using matrices and frameworks in NVivo to map coding and memos back to the study’s guiding questions.
Analyses will focus on the study’s five guiding questions (as listed in Section B3 and in Section A2 of Supporting Statement Part A). For these questions, the study team will develop preliminary codebooks based on the HBCCSQ conceptual framework for quality and the study instruments, and also allow new codes to emerge from the transcripts that might not be tied to interview questions or prompts. Methods will vary based on the guiding question; for example, for the question about sources of knowledge and support, the team will use the categories of support described by providers as well as data from community member interviews to better understand the types of supports that FFN providers prioritize.
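As a simplified, hypothetical illustration of the matrix-style summaries described above (the actual coding and matrices will be built in NVivo), the Python sketch below tallies invented coded excerpts by guiding question and data source.

```python
from collections import Counter

# Hypothetical coded excerpts as (data source, guiding question, code) tuples.
# The actual coding will be done in NVivo; this only illustrates the idea of
# mapping codes back to the guiding questions across data sources.
coded_excerpts = [
    ("provider interview #1", "guiding question 3", "support: family network"),
    ("community member interview", "guiding question 3", "support: community organization"),
    ("provider interview #2", "guiding question 1", "priority: warm relationships"),
    ("family member interview", "guiding question 4", "priority: trust"),
]

# Tally excerpts for each (guiding question, data source) cell of the matrix.
matrix = Counter((question, source) for source, question, _ in coded_excerpts)
for (question, source), count in sorted(matrix.items()):
    print(f"{question} | {source}: {count} coded excerpt(s)")
```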
The study team will also explore different profiles of FFN providers based on the ways they describe their caregiving, perceptions of quality, and sources of support. For example, some providers might use a family-based perspective in describing their work while others might use more of a school-based framework. Examples of within-person narratives that combine photo, journal, and interview data as well as interview data from family and community members would be used to illustrate these different approaches to child care.
The study team will pre-register the study before data collection begins. The team plans to use osf.io.
Data Use
The study team will share findings from the analysis based on the five guiding questions with the public, including dissemination to community experts and partners, researchers, and program partners, in a series of materials and presentations. These materials will include a summary of the methods used in the study.
The study team plans to archive the transcripts of the interviews and audio journals with the Child and Family Data Archive. Data will be de-identified and redacted for potential disclosure risks according to best practices outlined in the Data Producer’s Guide to the Child and Family Data Archive and in adherence with the project’s data archive plan.
B8. Contact Persons
Ann Rivera
Project Officer
Office of Planning, Research, and Evaluation
Administration for Children and Families
Bonnie Mackintosh
Project Officer
Office of Planning, Research, and Evaluation
Administration for Children and Families
Ashley Kopack Klein
Project Director
Mathematica
AKopackKlein@mathematica-mpr.com
Ann Li
Deputy Project Director
Mathematica
Juliet Bromer
Co-Principal Investigator
Erikson Institute
Laura Kalb
Survey Director
Mathematica
Attachments
Instruments
Instrument 1. Provider screener
Instrument 2. Provider interview #1
Instrument 3. Provider logistics call
Instrument 4. Provider photo journals
Instrument 5. Provider audio journals
Instrument 6. Provider interview #2
Instrument 7. Family member interview
Instrument 8. Community member interview
Instrument 9. Provider feedback focus group
Appendix A: Participant recruitment materials
Appendix B: Participant scheduling scripts and supplemental contact materials
Appendix C: Instructions for providers to use study tools
Appendix D: Consent statements and interview contact forms
Appendix E: 60-Day Federal Register Notice
Appendix F: Comments received on 60-day Federal Register notice and ACF Response
References
Bromer, Juliet, Toni Porter, Christopher Jones, Marina Ragonese-Barnes, and Jaimie Orland. “Quality in Home-Based Child Care: A Review of Selected Literature.” OPRE Report 2021-136. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2021.
Del Grosso, Patricia, Juliet Bromer, Toni Porter, Christopher Jones, Ann Li, Sally Atkins-Burnett, and Nikki Aikens. “A Research Agenda for Home-Based Child Care.” OPRE Report 2021-218. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2021.
Maxwell, Joseph A. Qualitative Research Design: An Interactive Approach. Third Edition. Thousand Oaks, CA: SAGE Publications, 2013.
Strauss, Anselm, and Juliet Corbin. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. Second Edition. Thousand Oaks, CA: SAGE Publications, 1998.