Descriptive Study of Early Head Start (Early Head Start Family and Child Experiences Survey; Baby FACES)
Supporting Statement for OMB Clearance Request
Part A: Justification for the Collection of Baseline Data
This section provides supporting statements for each of the points outlined in Part A of the Office of Management and Budget (OMB) guidelines for the collection of information in the Early Head Start Family and Child Experiences Survey (Baby FACES).1 The study will be carried out in two phases:
Phase 1: Pilot test two instruments (the Preschool Language Scale, 4th Edition, and the Bayley Scales of Infant and Toddler Development, Third Edition, Screening Test), sample, and recruit programs.
Phase 2: Collect data, and analyze and report findings.
We have received partial OMB clearance under a separate package for the Phase 1 pilot test (NOA 0970-0347, dated July 3, 2008, expiring July 31, 2011). This submission requests clearance for conducting Phase 2 and for the Phase 1 sampling and recruiting activities. The successful completion of Phase 1 sampling and recruiting activities will depend on timely review of this package, as noted in the terms of clearance.
The Administration for Children and Families (ACF) of the Department of Health and Human Services is requesting OMB clearance for instruments to be used in recruiting and data collection for a descriptive study of Early Head Start. ACF has contracted with Mathematica Policy Research, Inc. (MPR), and its subcontractors to collect descriptive information on the services provided to children and families, the characteristics of the children and families served, and key child and family outcomes.
There are two legislative bases for the Baby FACES data collection. The Government Performance and Results Act (GPRA) of 1993 (PL 103-62) required that the Office of Head Start move expeditiously to develop and implement Head Start Performance Measures. The 2007 reauthorization of Head Start (Improving Head Start for School Readiness Act of 2007, PL 110-134, Section 649(h)) requires a study of the status of dual language learner children and their families participating in Head Start and Early Head Start programs. The Baby FACES study will allow the Office of Head Start to document the status of dual language learner children and families in Early Head Start and to understand how the population of enrolled children and families fares over time. This information will satisfy the accountability and program improvement goals mandated by the Act.
Study Rationale and Objectives
Since the inception of Early Head Start, research has been an integral part of the program. At the same time that the first programs began operating, the Administration for Children, Youth, and Families (ACYF) initiated an evaluation of Early Head Start. The Early Head Start Research and Evaluation Project (EHSREP) findings showed that the programs had a broad range of impacts on the development of 3-year-olds and on the parents of those children (ACYF 2002a). Overall, children performed better on measures of cognition, language, and social-emotional functioning than their peers who did not receive Early Head Start. Additionally, they were less likely to be in the "at risk" category on cognitive and language functioning. Early Head Start parents were more supportive of their children's emotional, cognitive, and language development and were more likely to be in education or job training. In addition, the pattern of impacts varied by program characteristics. Programs that fully implemented key elements of the Head Start Program Performance Standards (HSPPS; ACF 1996) early in the evaluation period demonstrated the strongest pattern of impacts and a broader range of impacts. In addition to broad impacts on children's development, parents in these programs were more likely to support their child's language and literacy development through activities such as daily reading and providing enhanced literacy environments.
Given the variations in program impacts by service delivery and implementation observed in the EHSREP and the growth of the Early Head Start program, it is important to understand how programs currently operate, the services that they deliver, how these vary by program and family characteristics, and how children and families enrolled in the program fare over time. This knowledge can then inform technical assistance and actions aimed at improving the program. Accordingly, ACF launched a planned series of descriptive studies of Early Head Start, beginning with the Survey of Early Head Start Programs (SEHSP; OMB Control No. 0970-0008; Vogel et al. 2006), and continuing with the current study. Baby FACES goes beyond the SEHSP by providing information at the level of the individual child and family in addition to data on programs. It builds upon the template used in the Head Start Family and Child Experiences Survey (FACES) using ACF’s previously approved approach and methodology.
Research questions that Baby FACES will explore include (1) identifying services offered, their frequency, and their intensity; (2) identifying the key characteristics of children and families currently served in Early Head Start (with a focus on dual language learners); (3) investigating how programs individualize services to meet family needs and the match between identified needs and services; (4) showing how children and families are faring over time; and (5) exploring associations between the type and quality of Early Head Start services and child and family well-being. The focus on program services is an area of particular value. Whereas the SEHSP told us in general terms what programs provided (for example, the number of home visits that home-based programs typically provided to families each month), it was not possible to link service provision to individual families. Baby FACES will go beyond these general program measures to provide specific information on the number of home visits provided to specific families, along with child and family outcomes.
Baby FACES will be a descriptive study of Early Head Start programs with a representative sample of programs and children in two age cohorts: perinatal (subjects will be pregnant women within two months of their due date and the mothers and their infants up to two months after birth) and age 1 (children between 10 and 14 months of age); both cohorts will be followed until children reach age 3. Major study activities will include:
Selecting a nationally representative sample of 90 Early Head Start programs and recruiting them to participate in the study
Sampling and recruiting families of approximately 2,000 children across two birth cohorts (perinatal and age 1) to participate in the study
Collecting data on the research sample annually until children reach age 3, with a followup at age three and one-half for children in the age 1 cohort only (to learn about transitions out of Early Head Start). The types of data collection activities to be undertaken in the study are listed below. Data collection procedures are described more fully in section B.3, and all proposed measures are included in Appendix D.
Direct child assessments at ages 2 and 3
Annual parent interviews
Parent-child interactions at ages 2 and 3
Annual primary caregiver/home visitor interviews and ratings
Annual program director interviews
Weekly family services tracking information on sample members
Annual classroom/home visit observations (no burden for study members)
Analyzing and reporting findings
Baby FACES findings will help guide ACF, the Office of Head Start, national and regional training and technical assistance (T/TA) providers, and local programs in supporting policy development and program improvement at all levels. The data collected will provide information to be used in the national program’s performance measures review. The study will also provide guidance to Early Head Start programs for their own performance measurement by providing examples of measures that programs could use and national benchmarks for those measures, and by helping them to incorporate regular data collection into their routines.
The main goals are to:
Describe the national Early Head Start program, including variations in and patterns of services provided to families, training and credentials of staff, and quality of services provided.
Describe the population served by Early Head Start, including demographic characteristics, strengths and needs of families over time, and information on children’s development over time.
Explore how change over time in child and family functioning is related to specific aspects of the program and services received.
This information is critical to understanding the Early Head Start program as it exists today, including the experiences of the growing segment of dual language learners (as mandated by the Head Start reauthorization). The information is central to national program planning and training and technical assistance activities.
TABLE A.1

INSTRUMENT COMPONENTS, TYPE OF ADMINISTRATION, FREQUENCY AND PURPOSE

Instrument | How survey is administered | How often survey is administered | Overall Goal of Instrument
Parent Interview | Age 1: CATI; Ages 2 and 3: CATI/CAPI | Annually, until child is 3 years old | Describe the population served by EHS, including demographic characteristics, strengths and needs of families over time, and information on children's development over time.
Program Director Interview | Semi-structured telephone interview with MPR staff | Annually | Describe the national EHS program, including variations and patterns of services provided to families, training and credentials of staff, and quality of services provided. Link change over time in child and family functioning to specific aspects of the program and services received.
Primary Caregiver Interview | CATI | Annually | Describe the national EHS program, including variations and patterns of services provided to families, training and credentials of staff, and quality of services provided. Link change over time in child and family functioning to specific aspects of the program and services received.
Home Visitor Interview | CATI | Annually | Describe the national EHS program, including variations and patterns of services provided to families, training and credentials of staff, and quality of services provided. Link change over time in child and family functioning to specific aspects of the program and services received.
Primary Caregiver/Home Visitor Child Rating | SAQ | Annually | Describe the population served by EHS, including demographic characteristics, strengths and needs of families over time, and information on children's development over time. Link change over time in child and family functioning to specific aspects of the program and services received.
Family Service Tracking | Web/SAQ | Weekly | Describe the national EHS program, including variations and patterns of services provided to families, training and credentials of staff, and quality of services provided. Link change over time in child and family functioning to specific aspects of the program and services received.
Child Direct Assessment | CADE | Annually when children are 2 and 3 | Describe the population served by EHS, including demographic characteristics, strengths and needs of families over time, and information on children's development over time. Link change over time in child and family functioning to specific aspects of the program and services received.
Parent-Child Interaction | Videocoding | Annually when children are 2 and 3 | Describe the population served by EHS, including demographic characteristics, strengths and needs of families over time, and information on children's development over time. Link change over time in child and family functioning to specific aspects of the program and services received.
The proposed data collection will use a variety of information technologies to reduce the burden of participation on respondents. Parent interviews, administered at each wave of data collection, will be conducted using computer-assisted telephone interviewing (CATI) during the first wave and a combination of CATI and computer-assisted personal interviewing (CAPI) during follow-up waves. Primary caregivers and home visitors will be interviewed using CATI at all waves.
Child assessments will be administered using computer-assisted data entry (CADE) when the cohort children reach 2 and 3 years of age. This technology will automate item routing and the application of basal and ceiling rules, thereby reducing the amount of time required to administer the assessments and reducing burden on the child.
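To make concrete the kind of routing logic CADE automates, the sketch below applies a generic discontinue ("ceiling") rule: items are presented in order and administration stops after a fixed run of consecutive failures. This is a minimal illustration only; the function name, the three-failure stopping run, and the scoring are assumptions for exposition, not the published basal and ceiling rules of the instruments used in Baby FACES.

```python
# Minimal illustration of a generic discontinue ("ceiling") rule of the kind
# CADE automates. The stopping run of 3 and the scoring are placeholders, not
# the published rules of any Baby FACES instrument.

def administer_with_ceiling(items, score_item, stop_after=3):
    """Present items in order; stop after `stop_after` consecutive failures.

    `score_item` is a callable returning True if the child passes the item.
    Returns the raw score (number of items passed before discontinuing).
    """
    raw_score = 0
    consecutive_failures = 0
    for item in items:
        if score_item(item):
            raw_score += 1
            consecutive_failures = 0
        else:
            consecutive_failures += 1
            if consecutive_failures >= stop_after:
                break  # ceiling reached; remaining items are not administered
    return raw_score
```

A basal rule works analogously in reverse: items below an age-based start point are credited once a run of consecutive passes is observed, so the assessor never has to track these runs by hand.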
Family service tracking data will be collected using a web-based interface allowing the electronic submission of responses. Primary caregivers and home visitors will be given the option of completing a weekly paper and pencil inventory of the services provided to each of the sampled families they work with or completing the inventory online. If these personnel complete the paper and pencil inventory, we will encourage the program to identify a person to be responsible for entering the responses into the web-based interface.
The program director interview will be conducted as a semi-structured telephone discussion, a format that is not conducive to computer-assisted telephone interviewing.
There is no evidence of any other study or administrative data source that offers information as comprehensive as Baby FACES will provide.
No comparable data have been collected from a national sample on the characteristics of Early Head Start children and families, their program participation, service use, or key child and family outcomes. No existing data source provides the breadth of description that will result from Baby FACES, especially because it involves a nationally representative sample of Early Head Start children across two age cohorts.
No small businesses are impacted by the data collection in this project.
Addressing the study's research objectives about how children fare over time in Early Head Start requires annual data collection. Collecting data over children's full experience in Early Head Start will allow us to examine growth over time, patterns of needs and the match of services to those needs, and exit patterns.
There are no special circumstances that might require deviation from the guidelines of 5 CFR 1320.5.
The first Federal Register notice for Phase 2 of the Baby FACES study was published on April 1, 2011 (Volume 76). During the notice and comment period, one request was received for copies of the instruments; the request was fulfilled, and no public comments were received.
Many individuals and organizations, including the Baby FACES Technical Work Group, have been contacted for advice on various aspects of the design of the study and data collection instruments. Their feedback was obtained through in-person meetings and telephone conversations. Members of the Baby FACES Technical Work Group are listed in Table A.2.
TABLE A.2

MEMBERSHIP OF BABY FACES TECHNICAL WORK GROUP

Margaret Burchinal | University of California, Irvine
Judy Carta | University of Kansas
Catherine Ayoub | Harvard University
Lori Roggman | Utah State University
Helen Raikes | University of Nebraska
Jeanne Brooks-Gunn | National Center for Children and Families, Teachers College, Columbia University
Carol Hammer | Pennsylvania State University
Ellen Eliason Kisker | Twin Peaks Partners, LLC
Michael Foster | University of North Carolina, Chapel Hill
Judith Jerald | Save the Children
Tammy Mann | ZERO TO THREE
Brenda Jones Harden | University of Maryland, College Park
Lisa Berlin | Center for Child and Family Policy, Duke University
We recognize that participation in the Baby FACES study will place some burden on participating program staff, families, and children. We have attempted to minimize this burden through our data collection procedures and our use of carefully constructed instruments and assessments. Nevertheless, we believe it is important to acknowledge the burden that participation entails at each round of the study. Our incentive structure is based on approaches used effectively in previous projects and attempts to acknowledge respondents' efforts in a respectful way. ACF has structured the incentives to be provided at each level of data collection: the program level, the caregiver level, and the family level.
The study protocol calls for home visitors and primary caregivers to complete a weekly family services tracking form for each of the Baby FACES families they serve. In addition, program staff will support Baby FACES data collection by providing rosters that enable us to select the sample from the program, updating these rosters yearly, scheduling home visit and classroom observations, and providing updated contact information on our sample each year. Because completing this form on a weekly basis throughout the year is burdensome, and because we will be contacting caregivers, home visitors, and staff to support Baby FACES in these additional ways, we propose to provide a yearly incentive of $500 directly to programs as a sign of appreciation for their participation, in order to sustain their participation and improve the quality of the information gathered. Because the sampling strategy begins at the program level, we believe it is essential to provide programs with an incentive for participation to encourage their cooperation as "gatekeepers" of access to caregivers, home visitors, and staff.
An incentive of this amount is consistent with previous information collections approved by OMB. OMB's approval of cash incentives of this amount is particularly important to collecting high quality data in Baby FACES for two reasons. First, and perhaps most importantly, the $500 incentive to programs is the same as was provided to the gatekeepers in the FACES 2006 information collection. The $500 incentive payment helped to ensure a response rate of over 80 percent in FACES 2006, and we believe that providing an incentive of a similar level in Baby FACES is critical to ensuring equally high response rates and the highest quality data. Second, FACES 2006 not only establishes a baseline from which to draw inferences about the effect of incentives on response rates, but it is also particularly relevant to this study because eight of the 90 programs identified in the sample overlap with the FACES 2006 sample. Thus, we believe that parity between the two studies will be essential to obtaining cooperation and high quality data.
Primary caregivers and home visitors (employees of the program) will receive a gift of supplies valued at $25 at each of the four annual visits, to be used in their classrooms or on their home visits. Finally, because we anticipate that primary caregivers and home visitors will complete the weekly report forms on their own time, we will provide them with a $5 incentive for each form they complete (we estimate that each primary caregiver or home visitor will complete these forms for 3.2 children each week).
Although the sample is drawn at the program level, families are essential respondents to this project. Parents are asked to participate in Baby FACES through a one-hour parent interview in the first round, conducted either during the last few months of pregnancy or the first few months postpartum for the perinatal cohort, or in the months surrounding the focal child's first birthday for families in the age 1 cohort. In subsequent data collection rounds, when children are 2 and 3 years old, families will also participate in the home visit portion of the study. At these age points, parents have the option of completing the parent interview by phone or in person. Families will also be asked to schedule the in-home direct child assessment and a 15-minute parent-child interaction task. This visit will last one hour and 15 minutes at a minimum, and up to 2 hours if parents elect to complete the parent interview in person. We propose to give families a $35 incentive for completing the parent interview (either by telephone or in person). This amount is comparable to the incentives used in FACES ($35 for a 45- to 60-minute interview) and Building Strong Families ($50 for completing two 50-minute parent interviews). As an additional incentive for the home visits, we propose to give families an age-appropriate book worth $5 to $7. This amount is also in line with the incentives used in FACES (a $5 to $7 book for completing a 45-minute child assessment) and Building Strong Families (a $5 book and a $5 toy for participating in two 15-minute parent-child interaction tasks). This book will be a token of our appreciation to families for allowing field staff into their home.
Respondents will receive information about privacy protections when they consent to participate in the study. Information about privacy will be repeated in the introductory comments of interviewers and in-home data collectors. All interviewers and data-collectors will be knowledgeable about privacy procedures and will be prepared to describe them in detail or to answer any related questions raised by respondents.
We have crafted carefully worded consent forms (Appendix C) that explain in simple, direct language the steps we will take to protect the privacy of the information each sample member provides. For the purposes of Baby FACES, we refer to the child’s parent or legal guardian as “parent,” the Early Head Start staff member providing home-based services as “home visitor”, and the Early Head Start staff member providing center-based child development services as “primary caregiver.” Assurances of privacy related to the parent interviews, home observations, and child assessments will be given to each parent as he or she is recruited for the study and before each round of data collection. Parents will be assured that their responses will not be shared with the Early Head Start program staff, their child’s primary caregiver, or the program, and that their responses will be reported only as part of aggregate statistics across all participating families. We will not share any information collected from the parent interviews or parent-child interaction with any Early Head Start staff member. Moreover, no scale scores from direct child assessments will be reported back to primary caregivers. MPR will obtain signed, informed consent from all parents prior to their participation and obtain their consent to assess their children. The Baby FACES fact sheet makes it clear that parents may withdraw their consent at any time.
We are also in the process of obtaining a certificate of confidentiality to help ensure the privacy of study participants. We have recently received the IRB clearance needed before applying for the certificate, and we expect to obtain the certificate of confidentiality in approximately 4 to 6 weeks. On this timetable, the certificate of confidentiality will be in place well before we begin approaching parents for consent to participate in the study.
To achieve its primary goal of describing the characteristics of the children and families served by Early Head Start, the study will ask some sensitive questions, including some aimed at assessing feelings of depression; use of services for emotional or mental health problems or for family violence or substance abuse; family involvement with the criminal justice system; and the child's exposure to neighborhood or domestic violence. Such questions are necessary in a study of a program designed to affect family outcomes. The questions employed are drawn from standardized measures or have been used extensively in prior studies with no evidence of harm (for example, in the Fragile Families Study and in the Early Head Start Research and Evaluation Project).
The parent and primary caregiver interviews also contain a question about race. The race categories are never read to the respondent; rather, the question is asked as an open-ended question. The response categories are shown on the CATI/CAPI screen for the interviewer's ease of coding. The categories specified in this item are taken directly from the FACES 2006 parent and primary caregiver interviews and are the most common responses given in previous FACES interviews. This approach is also used in other large federal studies, such as the Early Childhood Longitudinal Study (Kindergarten and Birth cohorts) and the National Household Education Survey. By showing these categories on the interviewer's screen, we lower respondent burden by reducing the amount of coding that must be done during the interview and reduce the possibility of recording errors. Fewer than 5 percent of the parent and teacher responses in FACES 2006 did not fall into one of the listed categories. For responses that do not fit, interviewers record exactly what the respondent says. This approach allows our analysts to back code responses to the OMB standard race categories. In FACES 2006, we were able to successfully back code almost all responses to the OMB standards; the majority of the "other" responses that were ultimately coded as missing were cases in which the respondent insisted on Hispanic/Latino as a race. In the end, FACES 2006 reported fewer than 1 percent of respondents as other race and fewer than 4 percent as missing.
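To illustrate what the back coding step involves, the sketch below maps verbatim "other, specify" race responses to the OMB minimum race categories. The mapping and function are hypothetical examples for exposition, not the coding scheme actually used in FACES or Baby FACES.

```python
# Hypothetical illustration of back coding open-ended race responses to the
# OMB minimum categories; this small mapping is an example only, not the
# actual FACES/Baby FACES coding scheme.

BACK_CODE_MAP = {
    "african american": "Black or African American",
    "caucasian": "White",
    "filipino": "Asian",
    "navajo": "American Indian or Alaska Native",
    "samoan": "Native Hawaiian or Other Pacific Islander",
    # "hispanic"/"latino" is an ethnicity under the OMB standards, not a race,
    # so such responses cannot be back coded and remain missing on race.
}

def back_code(verbatim_response):
    """Return the OMB race category for a verbatim response, or None if missing."""
    return BACK_CODE_MAP.get(verbatim_response.strip().lower())

# Example: back_code("Caucasian") returns "White"; back_code("Latino") returns None.
```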
As part of the consent process, participating parents will be informed that sensitive questions will be asked and asked to sign a consent to participate form acknowledging that their participation is voluntary. All respondents will be informed that their identity will be kept private and that they do not have to answer questions that make them uncomfortable.
To further ensure privacy, personal identifiers that could be used to link individuals with their responses will be removed from all completed questionnaires and stored under lock and key at the research team's offices. Data on laptop computers will be secured through the operating and survey system configuration and password protection. The use of common Windows utilities, such as Explorer, will be prevented, and all communication utilities will be disabled except those required to communicate with the home office. Any computer files that contain this information will also be locked and password-protected. The project director of the contractor handling data collection will control access to information in the locked files. Interview and data management procedures that ensure the security of data and the privacy of information will be a major part of training.
The proposed data collection does not impose a financial burden on respondents nor will respondents incur any expense other than the time spent completing the interviews and direct assessments.
The estimated annual burden for study respondents—parents, children, and program staff—is listed in Table A.3. Response times are derived from previous studies using the same instruments with a similar population. For copyrighted measures, published estimates of the administration times were also used. The total annual burden is expected to be 12,460 hours for all of the instruments.
TABLE A.3

ESTIMATED ANNUAL RESPONSE BURDEN

Instrument | Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Total Burden Hours
Parent Interview | 1,554 | 1 | 0.95 | 1,479
Program Director Interview | 90 | 1 | 0.67 | 60
Child Care Provider Interview | 180 | 1 | 0.25 | 45
Home Visitor Interview | 270 | 1 | 0.25 | 68
Primary Caregiver/Home Visitor Child Rating | 450 | 3.2 | 0.333 | 480
Family Service Tracking | 450 | 166.4 | 0.125 | 9,360
Child Direct Assessment | 774 | 1 | 1 | 774
Parent-Child Interaction | 774 | 1 | 0.25 | 194
Estimated Total Annual Burden Hours | | | | 12,460
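Each row of Table A.3 is the product of the number of respondents, the number of responses per respondent, and the average burden per response. A minimal sketch of that calculation for the Family Service Tracking row, where the 166.4 annual responses per respondent correspond to the roughly 3.2 sampled children per staff member noted earlier taken over 52 weeks, is shown below.

```python
# Worked check of one Table A.3 row (Family Service Tracking); the other rows
# follow the same respondents x responses x hours-per-response formula.
respondents = 450                  # primary caregivers and home visitors
responses_per_respondent = 166.4   # about 3.2 sampled children x 52 weeks
hours_per_response = 0.125         # 7.5 minutes per weekly tracking form

total_burden_hours = respondents * responses_per_respondent * hours_per_response
print(total_burden_hours)          # 9360.0, the 9,360 hours shown in Table A.3
```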
Estimates of Annualized Costs
To compute the total estimated annual cost, the total burden hours were multiplied by the average hourly wage for each adult participant, according to the Bureau of Labor Statistics, Current Population Survey, 2008. The results appear in Table A.4 below. For child care providers, primary caregivers, and home visitors, we used the mean hourly wage for child care providers ($9.05 per hour). For program directors, we used the mean hourly wage for full-time employees over age 25 with a bachelor's degree ($25.30 per hour). For parents, we used the mean hourly wage for full-time employees over the age of 25 who are high school graduates with no college experience ($15.30 per hour).
TABLE A.4

TOTAL ESTIMATED ANNUAL COST

Instrument | Total Burden Hours | Average Hourly Wage | Total Annual Cost
Parent Interview | 1,479 | $15.30 | $22,629
Program Director Interview | 60 | $25.30 | $1,518
Child Care Provider Interview | 45 | $9.05 | $407
Home Visitor Interview | 68 | $9.05 | $615
Primary Caregiver/Home Visitor Child Rating | 480 | $9.05 | $4,344
Family Service Tracking | 9,360 | $9.05 | $84,708
Child Direct Assessment | 774 | |
Parent-Child Interaction | 194 | $15.30 | $2,968
Total | 12,460 | | $129,674
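Each cost in Table A.4 is the instrument's total burden hours from Table A.3 multiplied by the applicable hourly wage and rounded to the nearest dollar. A minimal worked check, using the Program Director Interview row, follows.

```python
# Worked check of one Table A.4 row (Program Director Interview); the other
# rows follow the same burden-hours x hourly-wage formula.
burden_hours = 60      # from Table A.3
hourly_wage = 25.30    # mean hourly wage applied to program directors

annual_cost = round(burden_hours * hourly_wage)
print(annual_cost)     # 1518, the $1,518 shown in Table A.4
```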
Not applicable.
The total cost to the federal government for Baby FACES under the terms of the contract with MPR is $16,886,110.00. The cost of the data collection elements is $11,260,477.00, or $1,851,775.00 per year. These costs include sampling, data collection, participant tracking, data processing, and data coding. Respondent incentives and gifts are also included in these costs.
Analytic strategies for Baby FACES will be tailored to address each of the study’s research objectives. Specifically, the analyses will aim to (1) describe the characteristics of Early Head Start programs and the prevalence of each type of service delivery option at the individual family level as well as at the program level; (2) describe key characteristics of the children and families currently served in Early Head Start (with a focus on dual language learners); (3) identify services provided to families, their frequency, and their quality; (4) investigate how programs individualize services to meet family needs and the match between identified needs and services; (5) show how children and families are faring over time; and (6) explore associations between the type and quality of Early Head Start services and child and family well-being.
Analyses will employ a variety of methods, including descriptive statistics (means, percentages), simple tests of differences across subgroups and over time (t-tests, chi-square tests), trend analysis (growth curve modeling), and multivariate analysis (regression analysis, hierarchical linear modeling). In this descriptive study, many of the research questions can be answered by calculating averages and percentages of children, classrooms, or programs falling into various categories; comparisons of these averages across time or across subgroups; and changes in outcomes over time. More complex analysis of relations among program characteristics and child and family outcomes can be done through hierarchical linear modeling (HLM). Growth curve models will allow comparison of developmental trajectories across children in different types of programs and with different family characteristics—for example, dual language learners compared to non–dual language learners.
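As a purely illustrative sketch of the growth curve approach described above, the snippet below fits a simple two-level growth model with the Python statsmodels package, with repeated assessments nested within children and a random intercept and age slope for each child. The data file, variable names (score, age_months, dll, child_id), and specification are assumptions for illustration, not the study's actual analysis plan or code.

```python
# Illustrative growth curve (mixed-effects) model sketch; the data file and
# variable names below are hypothetical, not Baby FACES deliverables.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("child_assessments.csv")  # one row per child per assessment

# Random intercept and age slope per child; the dll x age interaction asks
# whether dual language learners' developmental trajectories differ.
model = smf.mixedlm(
    "score ~ age_months * dll",
    data=data,
    groups=data["child_id"],
    re_formula="~age_months",
)
result = model.fit()
print(result.summary())
```

Program-level clustering could be incorporated into a model of this kind through program covariates or additional variance components, along the lines of the HLM analyses described below.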
Interviews with program staff will provide descriptive data about programs, such as enrollment, program model, program approach, and staff background and qualifications. Analyses focused on child and family characteristics will draw on data from interviews with parents, primary caregivers, and home visitors. These informants will provide information on children’s and families’ demographic characteristics, the needs families have as they enter the program, and changes in family needs over time. Cross-tabulations of program characteristics and family characteristics will also provide important information about differences across programs in the kinds of families they serve and how different families are served.
Analyses of the types, intensity, and quality of services families receive will rely on weekly staff reports and on structured observations of classrooms and home visits. Descriptive analysis techniques will be used to summarize the services families receive (including referrals to providers outside the program), the frequency of those services, and the content of classroom and home visit experiences. These descriptive analyses will also support comparisons of service quality and intensity across program subgroups (such as program model) and family subgroups (such as demographic and household characteristics) and over time. Services provided to families will be compared through cross-tabulations with identified family needs in order to assess the extent of service individualization.
Changes over time in child and family well-being will be analyzed with data from direct child assessments, primary caregiver and home visitor ratings of children, and parent interviews. Using graphs and tables, including growth curves where possible, we will describe children's development from the first assessment or interview through the final followup. Norm-referenced tests will allow us to compare Early Head Start children with the average child in the United States and to determine whether children make progress relative to norms over time. Particular care will be taken, however, to ensure that the longitudinal results and comparisons are not misconstrued as impacts of the program on child and family outcomes or children's developmental trajectories.
We will also collect data on caregiver changes over the course of the study (for both the parent and the Early Head Start primary caregiver/home visitor), making it possible to analyze caregiver effects. The parent interview is designed to allow for a change in respondent and will gather information about when a change in guardianship took place, the reasons for it, and the characteristics of the new respondent. The weekly family services tracking allows us to pinpoint the exact week of a change in the child's primary caregiver or home visitor; the child will then be associated in our sample management system with the new caregiver, who will be interviewed at the next annual data collection.
Finally, the question of how program services relate to child and family outcomes will be addressed using HLM techniques that take into account the nesting of children and families within a given service approach (center-based or home-based) within programs. In particular, the analysis will explore links between types of services received and outcomes, service intensity and outcomes, and service quality and outcomes. These analyses will include appropriate controls and covariates to account for differences in child, family, and program characteristics.
The Baby FACES study began on September 30, 2007, and will be completed by September 29, 2013. Four project reports are planned, presenting key findings as data and analyses become available from each wave of data collection. The first report, planned for publication in early 2010, will include descriptions of child and family characteristics and program quality at the time of the first wave of data collection. The project's second (planned for early 2011) and third (planned for early 2012) reports will describe program services and how children and families are faring at ages 0 and 1 and at age 2, respectively. A final report, planned for 2013, will describe services and outcomes for families and children at ages 3 and 3.5, provide a summary of changes in family characteristics and program implementation, and explore associations between family needs and outcomes and services received over time.
The OMB number and expiration date will be displayed at the top of the cover page or first Web page for each instrument used in the study. For CATI or CAPI instruments, we will offer to read the OMB number and expiration date at the start of the interview. The OMB number and expiration date will also appear on each of the site recruitment materials and the consent forms.
No exceptions are necessary for this data collection.
1 The study has appeared in the Federal Register as the Descriptive Study of Early Head Start. We are now referring to the study as the Early Head Start Family and Child Experiences Survey, or Baby FACES.