Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Head Start REACH: Strengthening Outreach, Recruitment and Engagement Approaches with Families
OMB Information Collection Request
New Collection
Supporting Statement
Part B
September 2021
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officers:
Amanda Coleman
Mary Mueggenborg
Part B
B1. Objectives
Study Objectives
The Head Start REACH: Strengthening Outreach, Recruitment, and Engagement Approaches with Families study, funded by the Administration for Children and Families (ACF) Office of Planning, Research, and Evaluation (OPRE), proposes to collect in-depth qualitative case study data to meet the following objectives:
To understand how Head Start programs decide which families to focus on for their recruitment, selection, enrollment, and retention (RSER) activities
To identify RSER approaches used by Head Start programs and the extent to which programs tailor those approaches to families facing adversities
To understand which approaches are the most promising for recruiting, selecting, enrolling, and retaining families experiencing adversities
Adversities is a broad term that refers to a wide range of circumstances or events that pose a threat to a child’s or caregiver’s physical or psychological well-being. The adversities that families experience are often intertwined with poverty, may co-occur, and are affected by systemic factors, such as structural racism. Common examples include (but are not limited to) families experiencing homelessness; families involved in child welfare, including foster care; and families affected by substance use, mental health issues, and domestic violence. Promising RSER approaches are those that are supported by descriptive research and/or endorsed by key early care and education (ECE) stakeholders as contributing to programs’ ability to serve families facing adversities; these could include building collaborative relationships with partner agencies and supporting program staff in acquiring skills and knowledge related to serving such families.
Generalizability of Results
The study is intended to present an internally valid description of promising RSER approaches for up to six purposively selected cases, not to promote statistical generalization to a wider population. Publications resulting from the study will acknowledge this limitation.
Appropriateness of Study Design and Methods for Planned Uses
We have proposed to select programs purposively for the case studies and to use qualitative methods to collect data, as such methods are optimal for achieving the study’s objectives. Case studies are an appropriate method for this study because they provide multiple sources of data collected from a variety of respondents, such as through interviews, focus groups, and document reviews, in order to gain an in-depth, multi-faceted understanding of complex policies or interventions within a real-life context (Yin 2003; 2017). The case study approach captures information on more explanatory questions, such as how a program is being implemented and experienced by service providers, families, and children (Harrison et al. 2017). This approach can offer additional insights into what gaps exist in a program’s recruitment or delivery and why some implementation strategies might be more successful than others.
A purposive sample will ensure that we include programs that will provide an in-depth understanding of the promising approaches that are likely to be successful for the RSER of families experiencing specific adversities such as homelessness, involvement in child welfare/foster care, substance use and mental health issues, and domestic violence.
Qualitative methods, such as semistructured interviews, focus groups, and document reviews, will promote an in-depth examination of the implementation of RSER, the barriers to participation in Head Start faced by families experiencing adversities, the RSER-related training and support that Head Start staff receive, and the types of partnerships programs form with community organizations to support families experiencing adversities. The interviews and focus groups will use flexible instruments that are easily adaptable to specific situations and respondent groups (more details about respondent types appear in Section B2 under Target Population). Perspectives from the respondent groups will provide an in-depth understanding of which approaches are likely to be the most successful in the RSER of families experiencing adversities.
As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information. The data collected are not intended to be representative. This study does not include an impact evaluation and will not be used to assess participants’ outcomes. All publicly available products associated with this study will clearly describe key limitations.
B2. Methods and Design
Target Population
To achieve the study’s objectives, we will conduct case studies in six sites. We define a site as a Head Start program and the community organizations with which it partners for the RSER of families experiencing adversities. For each of the six sites, the target population includes (1) Head Start program directors and staff involved in implementing eligibility, recruitment, selection, enrollment, and attendance (ERSEA) activities at the center and/or program level; (2) staff in community organizations with which the Head Start program partners to serve families experiencing adversities; (3) families enrolled in the selected Head Start programs; and (4) families who are eligible for but not enrolled in Head Start programs.
The research team will use nonprobability, purposive sampling to select case study programs and identify potential respondents who can provide information on the study’s key constructs. Given that participants will be purposively selected, they will not be representative of the population of Head Start programs, staff, community partner organizations, or families experiencing adversities.
Respondent Recruitment and Site Selection
To select programs and respondents, we will apply a set of program selection criteria that will lead to an appropriate level of variation in programs for meeting the study’s goals and answering the research questions. Each site will include a Head Start program that has demonstrated success, as perceived by key ECE stakeholders, in the RSER of families experiencing adversities and up to four of its community partner organizations that serve families experiencing adversities. We will purposively select programs that will permit us (1) to examine a variety of approaches that have shown success with families experiencing different types of adversities and (2) to interview respondents within sites who can provide lessons relevant to the RSER of families experiencing adversities. In this section, we describe the program and respondent selection and recruitment steps.
Program identification
Goal: Identify a pool of programs that use promising RSER approaches to achieve success in the RSER of families experiencing adversities.
In February and March 2021, the Head Start REACH team identified 39 programs that key ECE stakeholders suggest use promising approaches to achieve success in reaching and supporting families experiencing adversities (more details on the process of identifying programs appear in Section A8 in Part A under Consultation with Experts Outside of the Study).
Program selection
Goal: Identify six programs (and six backup programs) with the strongest potential for enabling us to answer the study research questions.
The Head Start REACH team will assess each of the 39 programs by using the site selection criteria in Exhibit 1 to ensure that case study programs demonstrate adequate variation in RSER approaches and other program characteristics to answer the research questions. We will supplement the information collected on the programs with information available from administrative sources such as the Program Information Report (PIR) and the Head Start Enterprise System (HSES). Looking across this information, we will make a final selection of six Head Start programs (Tier 1 programs) and an additional six programs that will serve as backups in the event that any of our selected programs declines to participate (Tier 2 programs).
We will prioritize programs for which we have the most RSER-related information and those serving the largest proportions of families experiencing adversities. We will also aim to ensure variation in program characteristics, such as urban/rural designation, whether the program is Head Start only or includes Early Head Start, and adversities targeted by the program. We will exclude programs that do not have an existing Head Start grant.
Exhibit 1. Site selection criteria
Types of RSER approaches (program identification process)
Characteristics of the Head Start programs to ensure variation in the sample
Program recruitment
Goal: Recruit six programs for the case studies.
Following OMB approval, we will provide project team liaisons with a list of programs containing six pairs of programs; each pair will include a Tier 1 program and its backup Tier 2 program. The two programs in each pair will have similar characteristics for each of the selection criteria. Project team liaisons will first contact the Tier 1 program in each pair to present information about the study, indicate that the program was identified because of its strong RSER practices with families experiencing adversities, and secure the program’s participation in the study (using Instrument 1). If a Tier 1 program is unwilling to participate, the project team liaison will contact and try to recruit a Tier 2 program in its place.
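As a rough illustration of how ranked candidates could be paired with similar backups, the following sketch uses hypothetical program IDs, a hypothetical priority score, and a simplified pairing rule; the study's actual selection applies the Exhibit 1 criteria qualitatively rather than through any scoring formula.

```python
# Illustrative sketch only: candidate programs, scores, and the pairing
# rule are hypothetical, not part of the study protocol.
candidates = [
    # (program_id, rser_info_score, pct_families_experiencing_adversity)
    ("A", 5, 0.60),
    ("B", 5, 0.55),
    ("C", 4, 0.50),
    ("D", 4, 0.45),
    ("E", 3, 0.40),
    ("F", 2, 0.35),
]

def priority(candidate):
    """Prioritize programs with the most RSER-related information and the
    largest share of families experiencing adversities."""
    _, info_score, pct_adversity = candidate
    return (info_score, pct_adversity)

# Rank the pool, then pair each Tier 1 program with the next-ranked
# (and therefore similar) program as its Tier 2 backup.
ranked = sorted(candidates, key=priority, reverse=True)
tier1 = ranked[::2]
tier2 = ranked[1::2]
pairs = list(zip(tier1, tier2))

for t1, t2 in pairs:
    print(f"Tier 1: {t1[0]}  backup Tier 2: {t2[0]}")
```

The pairing rule here (adjacent ranks form a pair) is only one way to operationalize "similar characteristics"; in practice similarity is judged across all the selection criteria.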
Identification and recruitment of respondents
Goal: Identify and recruit Head Start program staff, parents enrolled in the selected programs, staff from up to four of the program’s partner organizations, and parents from one of the program’s partner organizations.
Once a program has agreed to participate in the study, project team liaisons will ask the program director (using Instrument 1) to identify up to four staff who are involved in the program’s ERSEA activities. We will then ask program staff to help us identify and recruit 8 to 10 families—all experiencing adversities and enrolled in the Head Start program—to participate in a focus group. These families, experiencing a variety of adversities, will be able to discuss their different reasons for enrolling in Head Start and staying enrolled as well as their experiences with the recruitment process and the Head Start program overall. Project team liaisons will also ask the program director to identify up to four community partner organizations that the program targets as part of its ERSEA efforts (such as health agencies, homelessness services providers, social service and child welfare/foster care agencies, and special education providers).
Then, we will ask the program to join us in a call with each identified community partner organization. During the call, we will (using Instrument 4) recruit up to four organizations to participate in the study. We will also recruit one of the (up to four) community organizations to help us identify 8 to 10 parents who are eligible for but not enrolled in Head Start to participate in a focus group. These families will be able to discuss their various barriers to enrollment in Head Start as well as their reasons for choosing alternative early education and child care options. We will purposively select organizations to ensure that they are serving families experiencing different types of adversities. We will contact up to 24 community partner organizations (using Instrument 4). Twenty-four community partner staff will complete the community partner staff interview (Instrument 5). Six Head Start program directors and 24 ERSEA staff will complete the staff interview (Instrument 2). We will conduct focus groups with up to 60 families enrolled in Head Start (Instrument 3) and up to 60 families not enrolled in Head Start (Instrument 6). We recognize that it may be more challenging to recruit families not enrolled in Head Start for focus groups; if necessary, we will offer these families the option of participating in a one-on-one interview (in person or over the phone) in lieu of a focus group. Because of the increased analysis burden associated with interviews, the goal is for most, if not all, respondents to participate in a focus group rather than an interview.
B3. Design of Data Collection Instruments
Development of Data Collection Instruments
We developed six data collection protocols for the Head Start REACH study. Table A.1 in Supporting Statement A provides details about the respondent, content, purpose, mode, and duration for each. The data collection protocols include two recruitment protocols to secure participation in the study: one for Head Start programs (Instrument 1) and the other for community organizations the Head Start programs partner with in their RSER work with families experiencing adversities (Instrument 4). The set also includes two semistructured interview protocols: one for the program director and ERSEA staff at the program and center levels (Instrument 2) and one for community partner staff (Instrument 5). Finally, it includes two semistructured focus group guides: one for families enrolled in Head Start (Instrument 3) and one for families not enrolled in Head Start (Instrument 6).
All questions in the recruitment protocols (Instruments 1 and 4), interview protocols (Instruments 2 and 5), and focus group guides (Instruments 3 and 6) are new because they reflect a new area of research, and the constructs under study cannot be measured using existing instruments. Prior to data collection, we pretested the recruitment protocol for Head Start programs (Instrument 1) via telephone with two program directors and the interview protocol for program directors and ERSEA staff (Instrument 2) with two program directors and two ERSEA staff; we also conducted a debrief after each pretest. The final instruments reflect the adjustments we made to clarify language and address length issues following the pretests.
This effort fills a gap in the knowledge base about the RSER of families facing adversities and will answer the range of research questions in Exhibit 1 in Part A. The core semistructured interview protocols (Instruments 2 and 5) include different modules that capture the range of topics addressed in the study’s research questions. Within each module, interview questions align with the key constructs relevant to the research questions. Respondents will answer only the subset of questions that align with their own areas of experience and knowledge.
Exhibit 2. Program staff interview protocol: Program director and ERSEA staff (Instrument 2)
Each row of the exhibit pairs a protocol module with its relevant research question(s); modules not tied to a research question are marked NA. The mapped questions are: 1a. How do they prioritize families for enrollment in communities where there are more eligible families than slots? 1b. To what extent are these decisions influenced by program, community, and systems level factors (e.g., community needs assessment, availability of other ECE options)? 2a. To what extent are these approaches tailored to families facing adversities (such as families experiencing homelessness, involvement in child welfare, including foster care, and affected by substance use, mental health issues, and domestic violence)?
Exhibit 3. Head Start enrolled families focus group guide (Instrument 3)
Each row of the exhibit pairs a module of the focus group guide with its relevant research question(s); modules not tied to a research question are marked NA. The mapped questions are: 2a. To what extent are these approaches tailored to families facing adversities (such as families experiencing homelessness, involvement in child welfare, including foster care, and affected by substance use, mental health issues, and domestic violence)? 2b. How do families perceive the approaches programs use for recruitment, selection, enrollment, and retention?
Exhibit 4. Community partner staff interview protocol (Instrument 5)
Each row of the exhibit pairs a module of the community partner staff interview protocol with its relevant research question(s); modules not tied to a research question are marked NA. The mapped questions are: 1a. How do they prioritize families for enrollment in communities where there are more eligible families than slots? 1b. To what extent are these decisions influenced by program, community, and systems level factors (e.g., community needs assessment, availability of other ECE options)? 2a. To what extent are these approaches tailored to families facing adversities (such as families experiencing homelessness, involvement in child welfare, including foster care, and affected by substance use, mental health issues, and domestic violence)? 2b. How do families perceive the approaches programs use for recruitment, selection, enrollment, and retention?
Exhibit 5. Families not enrolled in Head Start focus group guide (Instrument 6)
Each row of the exhibit pairs a module of the focus group guide with its relevant research question(s); modules not tied to a research question are marked NA. The only mapped question is 2b. How do families perceive the approaches programs use for recruitment, selection, enrollment, and retention?
B4. Collection of Data and Quality Control
Case study teams, each consisting of two people, will handle recruitment and conduct all interviews and focus groups at a given site.
Training
To ensure that we collect high quality data, we will train teams to conduct the recruiting, interviewing, and focus group activities. We will train case study teams in six topics:
Program recruitment to prepare teams for activities needed to recruit programs (Instrument 1)
Program and center staff semistructured interviews to prepare teams to implement the program director and ERSEA staff semistructured interview protocol (Instrument 2), including selection of questions appropriate for each interviewee
Head Start enrolled families focus groups to prepare teams to implement the Head Start enrolled families focus group guide (Instrument 3)
Community partner organization recruitment to prepare teams for activities needed to recruit partner organizations (Instrument 4)
Community partner staff semistructured interviews to prepare teams to implement the community partner staff interview protocol (Instrument 5), including selection of questions appropriate for each interviewee
Non-Head Start families focus groups to prepare teams to implement the focus group guide for families not enrolled in Head Start (Instrument 6). Teams will also be trained in using the guide to conduct an interview when necessary.
Before beginning the recruitment process, team members will be trained on how to establish rapport with program and partner organization staff, including refusal conversion techniques. Data collection training will include a thorough review of the interview protocol and focus group guides and will focus on strategies to ensure that we collect high quality data while minimizing burden on respondents, including (1) how to prepare for the interview (for example, asking only relevant questions; identifying already collected documents that provide needed information); (2) how to move efficiently through the interview protocols and focus group guides while collecting high quality information (for example, how to make decisions about which probes are critical based on answers received to that point in the interview/focus group); and (3) how to synthesize notes after each interview/focus group to confirm completeness of the data.
Recruitment
Recruitment materials and protocol. Section B3 describes the development of the recruitment protocols. To ensure buy-in, we have also prepared an engaging recruitment packet that includes a recruitment letter describing the study to program directors (Appendix C), endorsement letters from the Administration for Children and Families (Appendix B), aesthetically pleasing flyers to encourage the participation of families in the focus groups (Appendices H and I), and an attractive frequently asked questions (FAQ) sheet that includes study details (Appendix D).
The recruitment protocols will ensure that project team liaisons clearly communicate the purpose of the study and that they are prepared to answer any questions or address any concerns of potential participants. Project team liaisons will use the program director recruitment call protocol (Instrument 1) to ensure that they systematically collect information needed to make decisions about the respondents best suited for participation in the semistructured interviews and the partner organizations best suited for inclusion in the study.
Recruitment process for study respondents. After we have selected Head Start programs for the study sample, we will move to the formal recruitment of respondents. Project team liaisons will start by sending program directors the recruitment letter describing the study (Appendix C), endorsement letters from the Administration for Children and Families (Appendix B), and the FAQ sheet (Appendix D). They will follow up by telephone and use the program director recruitment call protocol (Instrument 1) to secure the program’s participation and identify ERSEA staff and up to four community organizations with which the program partners for the RSER of families experiencing adversities. We will work with the designated on-site coordinator at the Head Start program to identify enrolled parents for a focus group. We will also work with the on-site coordinator to make initial contact with the partner organizations (Instrument 4) to present information about the study, secure their participation in the study, and gauge their willingness to help us recruit for a focus group for Head Start-eligible but not enrolled families.
Collecting data
A two-member case study team will conduct a two-day site visit at each of the six selected sites, during which it will collect documents for a document review, administer semistructured interviews, and conduct focus groups to answer the study’s research questions. Although we have planned to conduct site visits, we will be prepared to conduct all data collection remotely via telephone and video conferencing software, if necessary.
For all interviews and focus groups, one member of the team will conduct the interview, and the other member will take notes. With the permission of respondents, we will also audio record the interviews for later transcription. The case study site team will confer after each interview and focus group (using recordings as needed) to ensure completeness of data. Throughout the data collection period, the whole case study team will conduct weekly meetings to report on and exchange information and strategies, help troubleshoot challenges, and ensure that all data are collected uniformly.
Details about each information collection follow. To perform an in-depth document review, we will request documents from the program, before or during the site visit, on its ERSEA policies, eligibility criteria scoring, recruitment materials, enrollment forms, and recent community needs assessment.
We will conduct a 60-minute semistructured interview with program directors and 90-minute semistructured interviews with up to four program staff who are responsible for ERSEA-related efforts. We plan to conduct the program director and staff interviews in-person; however, they could take place by telephone or videoconference if more convenient for the respondent or if we need to conduct data collection remotely.
We will conduct a 45-minute semistructured interview with a representative from the community partner organization from which we recruit families for a focus group. We will conduct the interview in-person, if possible, but could conduct it by telephone or videoconference if more convenient for the respondent or if we need to conduct data collection remotely. We will conduct 45-minute semistructured telephone interviews with representatives from up to three of the other community partner agencies either before or after the site visit.
We will conduct one 90-minute focus group with 8 to 10 families (one representative per family) who are enrolled in Head Start. This focus group will take place in-person or via video conference if we need to collect data remotely.
We will conduct one 90-minute focus group with 8 to 10 families (one representative per family) who are not enrolled in Head Start. The community partner organization will help us identify those non-enrolled families. This focus group will take place in-person (or via video conference if we need to collect data remotely). If necessary, we will offer the option of administering the focus group protocol as a 45-minute one-on-one interview (in person or over the telephone) to maximize participation.
We will regularly hold team meetings throughout data collection. These meetings will be an opportunity to identify and address any data collection issues. If data are not being collected uniformly, we will make adjustments during the data collection to ensure the protocols are being followed and data collectors are using comparable strategies in conducting interviews or focus groups.
As needed, a senior team member will provide additional training to the data collector, such as reviewing training materials, or observe the data collector and provide feedback on their interviewing technique so they can make corrections and apply a more consistent approach to data collection.
B5. Response Rates and Potential Nonresponse Bias
Response Rates
The interviews and focus groups are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates will not be calculated or reported.
Nonresponse
Based on experience with similar methods and respondents, we do not expect substantial nonresponse for the interviews or focus groups. Given that we will not randomly select participants and that we do not intend the findings to be representative, we will not calculate nonresponse bias. As part of study reporting, however, we will present information about the characteristics of the participating programs and organizations.
B6. Production of Estimates and Projections
This study is intended to present an in-depth description of promising RSER approaches at exemplary Head Start sites, not to promote statistical generalization to other programs. We do not plan to make policy decisions based on data that are not representative and we will not publish population estimates. Information reported will clearly state that results are not meant to be generalizable.
B7. Data Handling and Analysis
Data Handling
The interviews and focus groups will be transcribed, with the transcripts used for data analysis. Reliance on transcripts rather than on a summary allows us to include respondents’ own voices and words in the final report. To ensure the completeness and accuracy of the transcripts, the data collectors will review the audio recordings to fill in any words, phrases, or portions of text indicated as inaudible by the transcriber. When possible, the team member who conducted the interview or focus group will be the one who listens to the recording and fills in missing text. Because focus groups are usually held in a conference room or large room, we will use two recorders, placed in different parts of the room, to capture the conversation. Such an arrangement will also ensure that, if one recorder fails, there is a backup of the recording and that there is an additional recording for the interviewer to consult to fill in missing transcript text.
Data Analysis
Qualitative data from both the interviews and focus groups and from the collected documents will provide comprehensive and rich information for analyzing factors associated with approaches that are most promising for recruiting, selecting, enrolling, and retaining families experiencing adversities. We will build a structure for organizing and coding the data across all the data collection sources to facilitate efficient analysis across respondents.
Coding and analysis approach. Using the same standard procedures for organization, coding, triangulation, and theme identification, we will code and analyze interview, focus group, and document data from the case studies. First, we will conduct primary coding, a deductive process in which we apply predetermined codes to the data (Crabtree and Miller 1999). Primary coding allows large amounts of data to be categorized and organized into more manageable portions of text. Next, we will conduct secondary coding (synonymous with the analysis step), applying an inductive method called constant comparative analysis (CCA; Lewis-Beck et al. 2004). With this inductive method, we will compare codes or themes that arise from the data with existing data or other identified codes. CCA will enable us to identify themes and categories that emerge from the data rather than searching the data for confirmation of themes or hypotheses. Such themes are sometimes collapsed into broader themes or further refined into subthemes for reporting.
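As a rough illustration of the primary (deductive) coding step, the sketch below applies a predetermined codebook to transcript segments via keyword matching. The codebook, keywords, and segments are hypothetical examples; in practice, coders apply codes by judgment in qualitative software rather than by keyword match.

```python
# Illustrative sketch only: keyword matching stands in for coder judgment,
# and the codebook and segments are hypothetical, not study data.
from collections import defaultdict

codebook = {
    "recruitment_approach": ["flyer", "outreach", "door-to-door"],
    "partner_referral": ["referral", "shelter", "agency"],
    "enrollment_barrier": ["transportation", "paperwork", "waitlist"],
}

segments = [
    "Families told us the paperwork was the hardest part of enrolling.",
    "The shelter sends us a referral whenever a family with young children arrives.",
    "We do door-to-door outreach in neighborhoods near each center.",
]

def primary_code(segments, codebook):
    """Return a mapping of code -> list of segments the code applies to."""
    coded = defaultdict(list)
    for seg in segments:
        text = seg.lower()
        for code, keywords in codebook.items():
            if any(kw in text for kw in keywords):
                coded[code].append(seg)
    return dict(coded)

coded = primary_code(segments, codebook)
for code, segs in sorted(coded.items()):
    print(code, len(segs))
```

Grouping the data under predetermined codes in this way is what makes the later, inductive secondary-coding pass manageable: themes are compared within and across the retrieved segments for each code.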
Preparing data sources for coding. First, the documents gathered from the case study sites will require an additional step of preparation before primary coding. The team will develop a template to extract data from the collected documents. We will create a standardized template with fields for recording information that we expect to be important or particularly relevant to RSER. The study team will review each document and complete a template for each document. We will use the completed templates, rather than the documents themselves, for coding and analysis.
Second, to prepare for analysis, we will load all interview and focus group transcripts and completed document review templates into NVivo 12, or a similar qualitative analysis software package, in order to conduct primary coding. We will create codebooks for a set of deductive (predetermined) codes to be applied to the data sources during primary coding. For the interview and focus group data, we will derive the codes closely from the interview protocols. For the documents, the codebook will contain codes that map to the fields in the document review template.
Finally, we will train a small team of coders to conduct the primary coding. To ensure consistency in coding, a lead coder will develop the coding scheme, train the coders, oversee the work, and ensure reliability. All coders will code the first transcript and document together and discuss any differences to establish coder agreement. When new themes or concepts emerge, we will create new primary codes and apply them to all coded data sources. A senior coder will provide quality assurance by reviewing up to 25 percent of the coded transcripts and documents.
Analyzing coded data. In the next step, we will conduct secondary, or analytic, coding to identify emergent themes and patterns in the data. Using the data coded in NVivo, we will run reports from the software to obtain aggregated output on each primary code across data sources and, using the CCA method, will conduct secondary coding of the data. We will refine and summarize emergent themes and findings both within and across data sources. In addition, the lead coder will hold meetings with the coders and other study team members, as needed, to build consensus about whether specific findings were classified under the correct thematic labels and to confirm that the labels accurately reflect the underlying concept.
Data Use
Summarizing data on specific research questions and concepts to generate findings. Once the data are coded, the team will be able to retrieve and sort data linked to specific research questions and constructs, improving understanding of the implementation of approaches for the RSER of families experiencing adversities into Head Start. We will synthesize data pertaining to a specific research question across all respondents or for specific types of respondents. Dissemination of findings may include a report, research brief, and presentations or briefings. All disseminated materials will note limitations of the data.
We will also provide data files that will be made available publicly or through restricted download. Because the case study data will be qualitative, we will use the following two primary approaches to archiving them: (1) interview and focus group summaries prepared to ensure the confidentiality and privacy of the participants or (2) spreadsheets of the coded qualitative data, including data from the document review. Before archiving the data, we will screen the content to reduce the risk of confidentiality breaches, whether direct or through deductive disclosure, and we will ensure the data are stripped of any identifying information, such as a uniquely identifying detail. We will prepare a data-use manual to accompany the data archive files that describes the study design, data collection procedures, and analysis approaches employed by the study team, so that data users understand how to properly interpret, analyze, and evaluate the information collected. The manual will also describe study limitations.
B8. Contact Persons
The following individuals at ACF and Mathematica are leading the study team:
Amanda Coleman
Senior Social Science Research Analyst
Office of Planning, Research, and Evaluation
Mary Mueggenborg
Senior Social Science Research Analyst
Office of Planning, Research, and Evaluation
Louisa Tarullo, Project Director
Mathematica
Laura Kalb, Survey Director
Mathematica
Harshini Shah, Deputy Project Director and Deputy Survey Director
Mathematica
Appendices
Study Recruitment Materials
Appendix A.1. Responses to comments received on 60-day Federal Register Notice
Appendix A.2. Expression of support in response to 60-day Federal Register Notice
Appendix B: ACF Endorsement Letter
Appendix C: Program Director Recruitment Letter
Appendix D: Study FAQs for Head Start Staff
Appendix E: Consent Form for Head Start-Enrolled Parents’ Focus Group
Appendix F: Consent Form for Non-Enrolled Parents’ Focus Group
Appendix G: Consent Form for Non-Enrolled Parents’ Interview
Appendix H: Study Recruitment Flyer for Head Start-Enrolled Parents’ Focus Group
Appendix I: Study Recruitment Flyer for Non-Enrolled Parents’ Focus Group
Attachments (Instruments)
Instrument 1: Program director recruitment call protocol
Instrument 2: Program staff interview protocol: Program director and ERSEA staff
Instrument 3: Head Start enrolled families focus group guide
Instrument 4: Community partner recruitment call protocol
Instrument 5: Community partner staff interview protocol
Instrument 6: Families not enrolled in Head Start focus group guide
References
Crabtree, B., and W. Miller. “A Template Approach to Text Analysis: Developing and Using Codebooks.” In Doing Qualitative Research, edited by B. Crabtree and W. Miller. Newbury Park, CA: Sage, 1999.
Harrison, H., M. Birks, R. Franklin, and J. Mills. “Case Study Research: Foundations and Methodological Orientations.” Forum: Qualitative Social Research, vol. 18, no. 1, 2017.
Lewis-Beck, M.S., A. Bryman, and T. Futing Liao. The SAGE Encyclopedia of Social Science Research Methods. Thousand Oaks, CA: Sage Publications, 2004.
Yin, R.K. Case Study Research: Design and Methods. 3rd ed. Thousand Oaks, CA: Sage, 2003.
Yin, R.K. Case Study Research and Applications: Design and Methods. Thousand Oaks, CA: Sage Publications, 2017.
1 OMB #0970-0427
2 OMB #0970-0207