Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
The Early Head Start Family and Child Experiences Survey (Baby FACES)—2020
OMB Information Collection Request
0970-0354
Supporting Statement
Part A
SEPTEMBER 2019
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer: Amy Madigan, Ph.D.
Part A
Executive Summary
Type of Request: This Information Collection Request is for a revision. We are requesting two years of approval.
Progress to Date: ACF's Baby FACES study periodically collects nationally representative information about Early Head Start (EHS) programs, their staff, and the families they serve to inform program planning and technical assistance and to enable research. OMB approved the 2009 and 2018 Baby FACES data collections under this control number (0970-0354, previously approved October 21, 2008 and September 24, 2017, respectively). Data collection for Baby FACES 2018, which took place in winter and spring 2018, is now complete. See Table B.1 in Supporting Statement B for the full sample sizes and response rates for Baby FACES 2018.
Timeline: This project is progressing on schedule, and the request for Baby FACES 2020 approval is submitted as described in the prior information collection request under this control number. Data collection for Baby FACES 2020 is scheduled to begin in fall 2019, pending OMB approval.
Previous Terms of Clearance: There were no terms of clearance included in the NOA for the September 24, 2017 approval.
Summary of changes requested: This ICR will support the collection of Baby FACES 2020. Like Baby FACES 2018, Baby FACES 2020 will collect detailed information about centers, staff, and families through interviews, self-administered questionnaires, observations of classrooms, and administrative data sources. While Baby FACES 2018 took an in-depth look at center-based classrooms, Baby FACES 2020 will focus on how home visits and classrooms support infant–toddler development through responsive relationships. To understand how home visit quality is associated with the quality of parent–child relationships, we request new observations of home visits and parent–child interactions for Baby FACES 2020.
A1. Necessity for Collection
The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval to collect descriptive information for the Early Head Start Family and Child Experiences Survey 2020 (Baby FACES 2020). The goal of this information collection is to provide updated nationally representative data on Early Head Start (EHS) programs, staff, and families to guide program planning, technical assistance, and research.
Study Background
ACF’s Baby FACES study periodically collects nationally representative information about Early Head Start (EHS) programs, staff, and families to guide program planning, technical assistance, and research. Baby FACES 2009 included a sample of 89 programs and nearly 1,000 children from two birth cohorts (newborns and 1-year-olds), following them annually throughout their enrollment in the program (2009‒2012).
For 2018, Baby FACES was redesigned to collect repeated cross-sectional data. The Baby FACES 2018 and 2020 data collections offer the first nationally representative information about teachers, home visitors, and classroom/home visit quality in the EHS program. Available administrative data do not provide the depth or richness necessary to answer key research questions. By linking information about staff and service quality to information about activities in the sampled programs, we will be able to examine associations between program processes, supports for staff, and staff relationships with children and families.
Legal or Administrative Requirements that Necessitate the Collection
There are no legal or administrative requirements that necessitate the collection. ACF is undertaking the collection at the discretion of the agency.
A2. Purpose
Purpose and Use
The overarching purpose of the Baby FACES studies is to provide knowledge about EHS children and families, and the EHS programs and staff who serve them. The Baby FACES collection of information on EHS programs extends the work of the Family and Child Experiences Survey (FACES)1, which serves a similar purpose for Head Start programs. The ongoing series of Baby FACES data collections aims to maintain up-to-date core information on EHS over time while also focusing on areas of timely topical interest. The Baby FACES studies began with the longitudinal Baby FACES 2009 and continued with the redesigned, cross-sectional Baby FACES 2018 and Baby FACES 2020.
The findings from Baby FACES 2018 and 2020 will provide information about program processes and how program supports are associated with intermediate and longer-term outcomes, and will contribute to ACF’s evidence-based planning, training and technical assistance, management, and policy development efforts. This information is particularly timely given the implementation of new Head Start Program Performance Standards that require grantees to implement program and teaching practices aligned with the Head Start Early Learning Outcomes Framework. A restricted use data set and data documentation will enable secondary research use of the data.
Previously Approved Requests
Baby FACES 2009 (approved October 2008). The 2009 study was designed to produce nationally representative information on EHS services offered to families, training and credentials of staff, and the quality of services provided. The study also described the EHS population, examining changes over time in child and family functioning and possible associations with aspects of the program and services they received. Baby FACES 2009, which concluded in 2015, provided rich descriptive information on the EHS program, families’ participation in it, and the amount and quality of services provided (see Vogel et al. 2011, 2015a, and 2015b).
Baby FACES 2018 (approved September 2017). For 2018, Baby FACES was reconceptualized as a repeated cross-sectional study. The descriptive information Baby FACES 2018 collected allowed ACF to answer new questions about the full age range of children participating in EHS; the characteristics of and professional development supports for EHS classroom teachers and home visitors; and how EHS services support infant-toddler development through responsive relationships. In particular, Baby FACES 2018 provided an in-depth look at the processes and teacher-child relationships in EHS center-based classrooms. It also provided information on EHS-Child Care Partnership grantees, which will inform a separate sub-study with EHS Partnership grantees (we will submit an additional information collection request for this).
Responsive relationships are those in which caregivers are respectful of infants and toddlers and interact with them by reading their cues and responding in a way that makes them feel heard and valued. Examples include talking to infants and toddlers, asking questions, responding to their verbal and non-verbal cues, and using strategies to engage children. These relationships are critical to infants’ and toddlers’ development and learning.
Relationship-based approaches to supporting infant-toddler development are approaches that support relationships between caregivers and the infants and toddlers in their care. They are based on caregivers’ being sensitive to the child’s cues and responding contingently to them, and thereby helping to support their physical-motor, social-emotional, language, and cognitive development.
Current Request
Baby FACES 2020 will build upon and extend information from 2018 with a new nationally representative cross-section of programs, and their associated centers, home visitors, teachers, children, and families. The descriptive information gathered through Baby FACES 2020 will allow ACF to examine national-level changes in center-based service provision and quality between Baby FACES 2018 and 2020. Additionally, Baby FACES 2020 will collect new information about home visiting quality and the parent–child relationships associated with home visiting. When combined with information from ACF’s FACES study, which describes Head Start programs and the children they serve (ages 3 to 5), Baby FACES 2020 will fill out the birth to 5 age spectrum.
Research Questions or Tests
Working collaboratively with ACF and the Baby FACES technical work group (see section A.8), Mathematica developed a broad conceptual framework for EHS that hypothesizes how and why program services are expected to lead to positive outcomes for infants and toddlers and their families (see Appendix C). The conceptual framework depicts hypothesized pathways from inputs into EHS program operation to the program’s goals of improving outcomes for children and families.
The overarching research question for both Baby FACES 2018 and Baby FACES 2020 is: How do EHS services support infant/toddler growth and development in the context of nurturing, responsive relationships? Baby FACES 2018 focused on EHS classrooms, while Baby FACES 2020 will collect in-depth information on home visits.
Table A.1 lists high-level research questions that align with the study’s conceptual framework, regarding program processes, program functioning, and classroom/home visit processes hypothesized to be associated with responsive relationships, enhanced infant/toddler outcomes, and family well-being. Detailed lists of specific research questions for the center-based and home-based questionnaires are in Appendix C (Tables C.1 and C.2, respectively). The research questions in those tables map to the research question numbers in the conceptual sub-frameworks in Appendix C (Figures C.2 and C.3). These questions address gaps in the research literature identified at the conclusion of Baby FACES 2009 (Xue et al. 2015).
Table A.1. Research questions for Baby FACES 2018 and 2020
Service characteristics: How do EHS classrooms and home visits support infant/toddler growth and development in the context of nurturing, responsive relationships?
Program processes and functioning: How do program-level processes and functioning support the development of nurturing, responsive relationships in classrooms and home visits?
Infant/toddler outcomes and family well-being: How are EHS infants and toddlers faring in key domains of development and learning (e.g., language and social-emotional development)? How are EHS families functioning (e.g., social/economic well-being, family resources and competencies)?
Study Design
Baby FACES 2009 was the first nationally representative descriptive study of EHS programs. Using a longitudinal cohort design, it included a sample of 89 programs and nearly 1,000 children from two birth cohorts (newborns and 1-year-olds) and followed them annually throughout their enrollment in the program (2009‒2012). Baby FACES 2018 employed a cross-sectional approach and included a nationally representative sample of 137 EHS programs, 871 classrooms and teachers, 611 home visitors, and 2,868 children and families.
Baby FACES 2020 will continue the cross-sectional sample of ECE programs established in 2018, capturing descriptive data at a single point in time on EHS programs, centers, home visitors, classrooms and teachers, and the families, children, and pregnant women they serve. The study will involve collecting quantitative information at each of these levels to enable nationally representative estimates and the testing of hypothesized associations across study levels.
Universe of Data Collection Efforts
Data collection instruments for Baby FACES 2020 measure constructs similar to those used in Baby FACES 2018, with revisions to individual items or measures based on their performance in 2018.
To reflect 2020’s focus on in-depth measurement of home visiting, we propose adding an in-home, observation-based measure of the parent–child relationship, as well as observation-based measures of home visit quality. The instruments and forms (Instruments 1-10) are annotated to identify the sources of questions from prior studies and the new questions developed for Baby FACES 2020 (Appendix C). Appendix C also lists the research questions, constructs, and measures, and indicates the instruments in which each measure appears.
Below we describe the data collection instruments/sources of information in the current request:
Classroom/home visitor sampling form from EHS staff (Instrument 1)
Respondents: EHS staff (On-Site Coordinators or Center Directors)
Mode: CADE
Purpose: We will ask staff at each sampled EHS program to provide information in this form, listing all of the centers and home visitors, along with characteristics such as the number of classrooms (for centers) and the size of caseload and whether they provide services to pregnant women (for home visitors).

Child roster form from EHS staff (Instrument 2)
Respondents: EHS staff (On-Site Coordinators or Center Directors)
Mode: CADE
Purpose: After sampling centers, classrooms, and home visitors, we will ask EHS program staff to provide information on the child roster form, listing all children in the sampled classrooms and all children and pregnant women receiving services from the sampled home visitors. Information from this form will be used to select EHS-funded families for inclusion in the study.

Parent consent form (Instrument 3)
Respondents: Parents and pregnant women
Mode: Paper with web option
Purpose: After sampling children and pregnant women, we will ask each child’s parent and each pregnant woman to fill out and sign a form giving their consent to participate in the study.

Parent survey (Instrument 4)
Respondents: Parents and pregnant women
Mode: CATI
Purpose: We will ask parents about child and family socio-demographic characteristics; child and family health and well-being; household activities, routines, and climate; and parents’ relationships with EHS staff and their engagement with and experiences in the program. This will provide information at the child/family level that will be important for understanding linkages and associations among family characteristics, program experiences, and outcomes.

Parent Child Report (Instrument 5)
Respondents: Parents
Mode: Paper SAQ
Purpose: The Parent Child Report will collect information about the child’s language and social-emotional development; parenting stress; parents’ perceptions of their relationship with their child; social support; household drug and alcohol use; and household income.2

Staff survey (Teacher survey and Home Visitor survey) (Instruments 6a and 6b)
Respondents: Teachers and home visitors
Mode: PAPI
Purpose: These surveys will provide information about the staff development and training their program offers, the curricula and assessments they use, the organizational climate of their program, the languages they speak, and their health and background. In addition, teachers will provide information about the characteristics of their classrooms, the routines they use, and the languages spoken in their classrooms. We will link the information gathered in the teacher survey to observed quality in the classroom. We will report data gathered from the staff surveys descriptively as well as in analyses examining associations among different sample levels and moderators. Field staff who are on-site for data collection will administer the paper surveys in person.

Staff Child Report (Instruments 7a and 7b)
Respondents: Teachers and home visitors
Mode: Web and paper SAQ
Purpose: These reports gather information on each child’s language and social-emotional development, developmental screenings and referrals, the staff member’s perceived relationship with the child’s parents, and the family’s engagement with the program. In addition, teachers will report on their perceptions of their relationship with the child, and home visitors will provide information about the services they offered to families in the past four weeks (including topics and activities covered, referrals, alignment of visit content to planned goals, and frequency and modes of communication). Home visitors will complete a briefer version for pregnant women that excludes the reports of the child’s development. Field staff will collect the paper forms before they leave the program site.

Program director survey (Instrument 8)
Respondents: Program directors
Mode: Web SAQ with PAPI follow-up
Purpose: This survey gathers information about program goals, plans, decision making, training and professional development, staff supports, and use of data. The survey will also ask program directors to provide information about home visiting curricula and home visitor professional development, parent involvement, and program processes for supporting responsive relationships.

Center director survey (Instrument 9)
Respondents: Center directors
Mode: Web SAQ with PAPI follow-up
Purpose: This survey will gather information about aspects of the center such as use of curricula in classrooms, organizational climate, staff qualifications, and teacher professional development.

Parent–child interaction (Instrument 10)
Respondents: Parents and children
Mode: Paper data entry
Purpose: For children over 12 months of age who receive home-based services, we will use a parent–child interaction task in which we will ask parents and their children to interact with one another in a book reading task and a semi-structured free-play task with toys. Staff will video record the interaction, which will subsequently be coded for attributes such as sensitivity, positive regard, stimulation of cognitive development, intrusiveness, detachment, negative regard, and quality of the relationship.
Classroom observations. We will use a classroom observation tool to capture teacher-child relationships: the Quality of Caregiver-Child Interactions for Infants and Toddlers (Q‑CCIIT) measure (Atkins-Burnett et al. 2015). The Q-CCIIT is a new measure developed under contract with ACF (OMB #0970-0513). As in Baby FACES 2018, we will use the Q-CCIIT for Baby FACES 2020 to advance knowledge about the quality of EHS classrooms and expand information about the validity of the measure. The Q-CCIIT assesses the quality of child care settings for infants and toddlers in center-based settings and family child care homes—specifically, how a given caregiver interacts with a child or group of children in nonparental care. The Q-CCIIT measures caregivers’ support for social-emotional, cognitive, and language and literacy development, as well as areas of concern (such as harshness, ignoring children, and health and safety issues). At the end of the observation, observers will complete the Structural Features and Practices form, in which they rate the room arrangement of the classroom; indicate the presence of a variety of materials and activities for children; note whether information for parents is posted anywhere in the setting, whether a quiet space is available to children, and whether a separate area for napping (with cribs, cots, or mats) is available in the classroom; and describe the nature of transitions between activities. There is no burden to study participants associated with the collection of data using the observations.
Home visit observations. For families with children receiving home-based services, we will capture the quality of the interactions between home visitors and families by conducting home visit observations using the Home Visitor Practices subscale from the Home Visit Rating Scales 3rd edition (HOVRS-3) and the Home Visit Content and Characteristics Form. The HOVRS was initially developed from field-based descriptions of successful home visits and is supported by home visiting research in multiple disciplines. The four home visiting practice scales include indicators of relationship building with families, responsiveness to family strengths, facilitation of parent-child interaction, and collaboration with parents. The Home Visit Content and Characteristics Form is an observational measure that documents the content of the home visit (e.g., topics discussed) and its characteristics (e.g., who was present, the level of distraction from TV, and so on). Study staff will accompany home visitors on visits to study families. These observations do not impose any burden on respondents.
Other Data Sources and Uses of Information
The sample of ECE programs will be drawn using data from the most recent Head Start Program Information Report (PIR)3, using administrative data on program characteristics as explicit and implicit stratification variables. We describe this approach in detail in Supporting Statement Part B. During data analysis, we will incorporate program characteristics data from the PIR, including program size, location, population served, and percentage of children who have a medical home. There is no burden to study participants associated with using PIR data for Baby FACES.
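The explicit/implicit stratification described above can be sketched in miniature. The field names below (region, size, urban) are hypothetical stand-ins for PIR variables, and the actual Baby FACES selection procedure is the one specified in Supporting Statement Part B; this sketch only illustrates the common device of sorting the frame by implicit stratifiers within each explicit stratum and drawing a systematic sample:

```python
import random

def systematic_sample(frame, explicit_key, implicit_keys, n_per_stratum, seed=1):
    """Illustrative stratified systematic sample: explicit strata are sampled
    independently; implicit stratifiers only order the frame within a stratum."""
    rng = random.Random(seed)
    # Group records by the explicit stratification variable.
    strata = {}
    for rec in frame:
        strata.setdefault(rec[explicit_key], []).append(rec)
    sample = []
    for key in sorted(strata):
        # Sort by implicit stratifiers so the systematic draw spreads
        # the sample across their values.
        records = sorted(strata[key], key=lambda r: [r[k] for k in implicit_keys])
        n = min(n_per_stratum, len(records))
        interval = len(records) / n
        start = rng.uniform(0, interval)  # random start within the first interval
        sample.extend(records[int(start + i * interval)] for i in range(n))
    return sample

# Hypothetical PIR-like frame: region is explicit; size and urbanicity are implicit.
frame = [{"region": r, "size": s, "urban": u, "id": i}
         for i, (r, s, u) in enumerate(
             (r, s, u) for r in ("NE", "S") for s in (40, 80, 120) for u in (0, 1))]
picked = systematic_sample(frame, "region", ["size", "urban"], n_per_stratum=2)
print(len(picked))  # prints 4: two programs from each of the two explicit strata
```

Because implicit stratifiers only sort the frame, they spread the sample across their categories without forcing fixed sample sizes per category, which is why they are useful when many crossing variables would make explicit strata too small.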
A3. Use of Information Technology to Reduce Burden
The data collection will use a variety of information technologies to reduce the burden of participation on respondents. Program director surveys, center director surveys, and Staff Child Reports will be offered with a web-based option. Parent surveys will be administered using computer-assisted telephone interviewing to reduce respondent burden and data entry errors. Parents will have the option to access an electronic version of the consent form (all paper consent packets will include log-in information for completing the electronic form). Study staff will collect missing consent forms during in-person data collection visits by accessing the electronic version of the consent forms on tablets. Staff surveys (teacher and home visitor surveys) will be administered in person as part of the on-site data collection; as a result, a web-based mode is not necessary for these surveys.
A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency
Wherever possible, we will use existing administrative information from PIR about EHS program characteristics to prevent duplication, minimize burden, and increase efficiency. No study instruments ask for information that is available from alternative data sources, including administrative data.
A5. Impact on Small Businesses
Most of the EHS programs and child care centers included in the study will be small organizations, including community-based organizations and other nonprofits. We will minimize burden for respondents by restricting the length of survey interviews as much as possible, conducting survey interviews on-site or via telephone at times that are convenient to the respondent, and providing some instruments in a web-based format.
A6. Consequences of Less Frequent Collection
No nationally representative information has been collected on EHS classrooms, home visitors, families, or children since the conclusion of Baby FACES 2018. In the past two years, EHS has undergone program expansion and other policy changes that warrant measurement to describe the status of implementation efforts.
A7. Now subsumed under 2(b) above and 10 (below)
A8. Consultation
Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on April 24, 2019, Volume 84, Number 79, page 17167, and provided a 60-day period for public comment. A copy of this notice is attached as Appendix A. During the notice and comment period, we received one set of substantive comments which are attached with our comments on how we addressed each suggestion (Appendix B).
We consulted with experts to complement our team’s knowledge and experience (Table A.2). Consultants included researchers with expertise in EHS and child care more broadly, child development, family engagement, and classroom and home visit processes. We also engaged experts with specialized knowledge and skills in the areas of home visit quality and parent–child interactions relevant to this work.
Table A.2. Baby FACES 2020 technical work group members and outside experts
Rachel Chazan Cohen: Department of Curriculum and Instruction, College of Education and Human Development, University of Massachusetts Boston
Mary Dozier: University of Delaware
Anne Duggan: Department of Population, Family and Reproductive Health, Bloomberg School of Public Health, Johns Hopkins University
Beth Green: Portland State University
Erika Lunkenheimer: Pennsylvania State University
Anne Martin: Columbia University
Carla Peterson: Iowa State University
Lori Roggman: Utah State University
Daniel Shaw: University of Pittsburgh
Catherine Tamis-LeMonda
A9. Tokens of Appreciation
Given the complex study design and nested analysis plan for Baby FACES 2020, respondents’ participation in the study activities is key to ensuring the study’s success. High levels of participation among the sampled EHS programs, staff, and families are essential to help ensure that estimates are nationally representative and to increase the comparability of the data with those collected in Baby FACES 2018.
Similar studies of low-income young families, such as FACES (OMB control number 0970-0151, expires August 31, 2021) and PACT (OMB control number 0970-0403, expired December 31, 2016), include incentives for participating families and children as part of an overall successful strategy to increase data quality in a complex study design. In contrast, the Project LAUNCH Cross-Site Evaluation (OMB control number 0970-0373, expires October 31, 2019) did not initially offer an incentive to respondents completing the web-based parent survey. The study team found that early respondents (pre-incentive) were not representative of their communities: minorities, individuals with lower incomes, and those who worked part time or were unemployed were underrepresented. Following OMB approval of a $25 post-pay incentive after data collection had started, completion rates and representativeness both improved (LaFauve et al. 2018).
Table A.3 lists proposed tokens of appreciation for programs, staff, and families participating in Baby FACES 2020 data collection. For comparison, the table also reports approved gift amounts and response rates from prior rounds of Baby FACES.
Table A.3. Structure of gifts of appreciation for Baby FACES 2020 and prior rounds
Parent survey (respondent: parent)
Baby FACES 2020: 30 minutes; $20
Baby FACES 2018: 30 minutes; $20; response rate 81.9%
Baby FACES 2009: $35; response rate 79.6%

Parent Child Report (PCR) (respondent: parent)
Baby FACES 2020: 15 minutes; $5
Baby FACES 2018: 15 minutes; $5; response rate 88.0%
Baby FACES 2009: response rate 86% (the PCR was administered in the home after the child assessment; the parent–child interaction was part of the in-home visit)

Parent–child interaction (and in-home observation of home visitor) (respondent: parent)
Baby FACES 2020: 10 minutes for the parent–child interaction, up to 90 minutes in the home; $35
Baby FACES 2018: n.a.
Baby FACES 2009: response rate 83.7%

Staff survey (respondents: teachers and home visitors)
Baby FACES 2020: 30 minutes; children’s book ($10 value)
Baby FACES 2018: 30 minutes; children’s book ($10 value); response rate 97.5%
Baby FACES 2009: children’s book ($5 value); response rate 98.7%

Staff Child Report (respondent: teacher or home visitor)
Baby FACES 2020: 15 minutes per sampled child; $5 per report
Baby FACES 2018: 15 minutes per sampled child; $5 per report; response rate 94.4%
Baby FACES 2009: $5 per report; response rate 96.2%
n.a. = not applicable.
A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing
Personally Identifiable Information
This collection requests personally identifiable information (PII), such as names, dates of birth, due dates, and contact information. All electronic data will be stored on a secure network drive at Mathematica offices and will never be in the possession of ACF; data will be backed up on our secure servers for 60 days for disaster recovery purposes. Sixty days after the primary data files are securely deleted, the backed-up data will be automatically and securely deleted, as required by our contract (i.e., “The Contractor shall dispose of the primary data and files created during the course of the study in accordance with specifications provided by ACF”). These plans are described in more detail in a data security plan, also required by the contract. Systems will be accessible only to staff working on the project, through individual passwords and logins.
Our hard copy data collection instruments (Staff Child Reports, staff surveys, and classroom and home visit observation booklets) will temporarily include teacher/home visitor/child names because respondents need to know about whom they are providing information when completing these instruments. Field staff will be trained to guard hard copy documents containing PII that are shared between team members. All hard copy documents will be inventoried and sent to and from the field using the FedEx shipping service. FedEx shipments are logged and tracked from the moment of package pick-up to the time of delivery, including the name of the person who received the package. We will also use our sample management system to track hard copy documents sent to and from the field. Hard copy materials will be stored in locked cabinets during the study. Following the end of the project, and when no longer required, hard copy materials and other physical media containing sensitive data will be destroyed using a cross-cut shredder.
Following data collection, we will remove all PII from the instruments and the de-identified data will be exported for analysis. Neither analysis staff nor ACF will have access to any PII; only de-identified data will be available. Once the analysis is complete all electronic databases will be deleted, and as mentioned above, after 60 days the data will no longer be able to be retrieved.
Information will not be maintained in a paper or electronic system from which they are actually or directly retrieved by an individual’s personal identifier.
Assurances of Privacy
Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. The consent statement that all study participants will receive provides assurances that the research team will protect the privacy of respondents to the fullest extent possible under the law, that respondents’ participation is voluntary, and that they may withdraw their consent at any time without any negative consequences.
As specified in the contract signed by ACF and Mathematica (referred to as the Contractor in this section), the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The Contractor developed a Data Safety Plan that assesses all protections of respondents’ PII and submitted it to ACF on October 30, 2015. The Contractor shall ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract/subcontract are trained on data privacy issues and comply with the above requirements. All of the Contractor’s staff sign the Contractor’s confidentiality agreement when they are hired.
Due to the sensitive nature of part of this research (see A.11 for more information), the evaluation has obtained a Certificate of Confidentiality, attached in Appendix D. The Certificate of Confidentiality helps assure participants that their information will be kept private to the fullest extent permitted by law. Further, all materials to be used with respondents as part of this information collection, including consent statements and instruments, will be submitted to the Health Media Lab Institutional Review Board (the Contractor’s IRB) for approval.
Data Security and Monitoring
As specified in the evaluator's contract, the Contractor shall use Federal Information Processing Standard (currently, FIPS 140-2, Security Requirements for Cryptographic Modules, as amended) compliant encryption to protect all instances of sensitive information during storage and transmission. The Contractor shall securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Information Processing Standard. The Contractor shall ensure that this standard is incorporated into the Contractor's property management/control system and establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. Any data stored electronically will be secured in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable Federal and Departmental regulations. In addition, the Contractor must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records and for the protection of any paper records, field notes, or other documents that contain sensitive data or PII, ensuring secure storage and limits on access.
For each round of the study, we will create a de-identified restricted use data file and a data user’s guide to inform and assist researchers who would like to use the data in future analyses.
A11. Sensitive Information
To achieve its primary goal of describing the characteristics of the children and families EHS serves, we ask parents and staff (teachers and home visitors) a limited number of sensitive questions. Responses to these items collected during Baby FACES 2009 and 2018 were used to describe the EHS population, their needs, parent outcomes, and families' circumstances over time. Sensitive questions for parents include potential feelings of depression, use of services for emotional or mental health problems, reports of family violence or substance abuse, household income, and receipt of public assistance. Staff will only be asked about symptoms of depression.
The invitation to participate in the study will inform parents and staff that the survey will ask sensitive questions (these materials are in Appendix E). The invitation will also inform parents and staff that they do not have to answer questions that make them uncomfortable and that the responses they provide will not be reported to program staff.
A12. Burden
Explanation of Burden Estimates
Table A.4 presents the current request for data collection activities that enable sampling classrooms, home visitors, and families; surveys with sampled EHS staff and families; and observation of parent–child interactions during home visits. The estimates include time for respondents to review instructions, search data sources, complete and review the responses, and transmit or disclose information. This information collection request covers a period of two years. There are no remaining approved burden hours from the Baby FACES 2009 or 2018 data collections. We expect the total annual burden to be 1,970 hours for all of the instruments in the current information collection request.
Classroom/home visitor sampling form from EHS staff (Instrument 1). For each selected center, a member of the Baby FACES study team will request a list of all Early Head Start (EHS) classrooms from EHS staff (typically the On-Site Coordinator or center director), with a parallel request for home visitor caseloads, for a total of 407 responses across centers and programs with home visitors. We expect it will take approximately 10 minutes for the EHS staff member to complete this sampling form.
Child roster form from EHS staff (Instrument 2). For each selected classroom or home visitor caseload, a Baby FACES study team member will request from EHS staff (typically the On-Site Coordinator) the name, date of birth (or due date for pregnant women), and enrollment date of each child or family enrolled in the selected classroom or home visitor caseload. The sampling program will identify sibling groups and then randomly drop all but one member of each sibling group, leaving one child per family. We expect this form to be completed 252 times, and that it will take about 20 minutes for EHS staff to provide the information requested.
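The sibling-group step above can be sketched as a simple grouping-and-random-selection routine. This is an illustrative sketch only; the field names and function are hypothetical and do not reflect the study's actual sampling program.

```python
# Hypothetical sketch of the sibling-drop step: group the roster by family,
# then randomly keep exactly one child per family.
import random
from collections import defaultdict

def drop_siblings(roster, seed=None):
    """Return a roster containing one randomly chosen child per family."""
    rng = random.Random(seed)  # seeded for a reproducible draw
    by_family = defaultdict(list)
    for child in roster:
        by_family[child["family_id"]].append(child)
    return [rng.choice(children) for children in by_family.values()]

roster = [
    {"child_id": 1, "family_id": "A"},
    {"child_id": 2, "family_id": "A"},  # sibling of child 1
    {"child_id": 3, "family_id": "B"},
]
kept = drop_siblings(roster, seed=0)
# One child from family A is dropped; family B's single child remains.
assert len(kept) == 2
```

Grouping before selection guarantees each family contributes exactly one sampled child, regardless of how many siblings appear on the roster.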
Parent consent form (Instrument 3). We will ask parents of all 2,495 selected children to provide their consent via a parent consent form. We expect it will take parents about 10 minutes to complete the form.
Parent survey (Instrument 4). We will conduct a 32-minute telephone survey with parents of sampled children or with pregnant women. We expect responses from a total of 2,084 parents across the 123 programs, about 16.9 per program.
Parent Child Report (Instrument 5). The Parent Child Report is a 20-minute self-administered questionnaire that we expect 2,008 parents of sampled children to complete.
Staff survey (Teacher survey and Home Visitor survey) (Instruments 6a and 6b). We will conduct 30-minute in-person staff surveys with 609 classroom teachers and 706 home visitors.
Staff Child Report (Instruments 7a and 7b). The Staff Child Report is a 15-minute self-administered survey that asks home visitors to report on all of their sampled children and a subsample of teachers to report on their sampled children, for a total of 1,046 staff completing 2,230 Staff Child Reports.
Program director survey (Instrument 8). The 30-minute program director survey will be administered via the web with the option of in-person follow-up for those who do not respond on the web. We expect 120 program directors to participate in this survey.
Center director survey (Instrument 9). The 30-minute center director survey will be web-based with the option of in-person follow-up for those who do not respond on the web. We expect 294 center directors to complete this survey.
Parent–child interaction (Instrument 10). For children over 12 months who receive home-based services, we will use a 10-minute parent–child interaction task. We expect that 996 families will complete the parent–child interaction task.
Table A.4. Total burden requested under this information collection
| Instrument | No. of Respondents (total over request period) | No. of Responses per Respondent (total over request period) | Avg. Burden per Response (in hours) | Total Burden (in hours) | Annual Burden (in hours) | Average Hourly Wage Rate | Total Annual Respondent Cost |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Classroom/home visitor sampling form (from EHS staff) | 407 | 1 | 0.17 | 70 | 35 | $33.50 | $1,172.50 |
| Child roster form (from EHS staff) | 252 | 1 | 0.33 | 84 | 42 | $33.50 | $1,407.00 |
| Parent consent form | 2,495 | 1 | 0.17 | 424 | 212 | $18.65 | $3,953.80 |
| Parent survey | 2,084 | 1 | 0.53 | 1,104 | 552 | $18.65 | $10,294.80 |
| Parent Child Report | 2,008 | 1 | 0.33 | 662 | 331 | $18.65 | $6,173.15 |
| Staff survey (Teacher survey and Home Visitor survey) | 1,317 | 1 | 0.50 | 660 | 330 | $33.50 | $11,021.50 |
| Staff Child Report | 1,046 | 2.13 | 0.25 | 558 | 279 | $33.50 | $9,346.50 |
| Program director survey | 120 | 1 | 0.50 | 60 | 30 | $33.50 | $1,005.00 |
| Center director survey | 294 | 1 | 0.50 | 148 | 74 | $33.50 | $2,479.00 |
| Parent–child interaction | 996 | 1 | 0.17 | 170 | 85 | $18.65 | $1,585.25 |
| Total | | | | 3,940 | 1,970 | | |
Estimated Annualized Cost to Respondents
We expect the total annual cost to be $42,265.35 for all of the instruments in the current information collection request.
Average hourly wage estimates for deriving total annual costs are based on Current Population Survey data for the fourth quarter of 2018 (Bureau of Labor Statistics 2019). For each instrument included in Table A.4, we calculated the total annual cost by multiplying the annual burden hours and the average hourly wage.
For program directors, center directors, and staff (teachers and home visitors), we used the median usual weekly earnings for full-time wage and salary workers age 25 and older with a bachelor’s degree or higher ($33.50 per hour). For parents, we used the median usual weekly earnings for full-time wage and salary workers age 25 and older with a high school diploma or equivalent and no college experience ($18.65). We divided weekly earnings by 40 hours to calculate hourly wages.
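As an illustrative check (not part of the study's methodology), the wage derivation and cost arithmetic described above can be reproduced from the figures in Table A.4. The weekly earnings amounts below are back-computed from the hourly rates quoted in the text.

```python
# Illustrative check of the burden-cost arithmetic. Weekly earnings are
# back-computed from the hourly rates in the text ($33.50 and $18.65);
# all other figures come from Table A.4.
HOURS_PER_WEEK = 40
staff_wage = 1340.00 / HOURS_PER_WEEK   # bachelor's degree or higher -> $33.50
parent_wage = 746.00 / HOURS_PER_WEEK   # high school diploma -> $18.65

# (instrument, annual burden hours, applicable hourly wage)
rows = [
    ("Classroom/home visitor sampling form", 35, staff_wage),
    ("Child roster form", 42, staff_wage),
    ("Parent consent form", 212, parent_wage),
    ("Parent survey", 552, parent_wage),
    ("Parent Child Report", 331, parent_wage),
    ("Staff survey", 330, staff_wage),
    ("Staff Child Report", 279, staff_wage),
    ("Program director survey", 30, staff_wage),
    ("Center director survey", 74, staff_wage),
    ("Parent-child interaction", 85, parent_wage),
]

total_annual_hours = sum(hours for _, hours, _ in rows)
assert total_annual_hours == 1970  # matches the total annual burden in Table A.4

# Cost per instrument = annual burden hours x hourly wage,
# e.g. the parent survey: 552 x $18.65 = $10,294.80
parent_survey_cost = round(552 * parent_wage, 2)
assert parent_survey_cost == 10294.80
```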
A13. Costs
With OMB approval, the study team will offer each participating center an honorarium of $250 in recognition of the time and expertise that center staff contribute to the data collection, including their assistance in scheduling data collection site visits and gathering parent consent forms. The honorarium is intended both to encourage centers' initial participation and to recognize their efforts to coordinate a timely and complete data collection.
The proposed honorarium matches the site payments approved for Baby FACES 2018.
A14. Estimated Annualized Costs to the Federal Government
| Cost Category | Estimated Costs |
| --- | --- |
| Instrument Development and OMB Clearance | $21,395 |
| Field Work | $3,467,209 |
| Publications/Dissemination | $238,736 |
| Total costs over the request period | $3,727,340 |
| Annual costs | $1,863,670 |
A15. Reasons for changes in burden
This is an additional information collection request under OMB control number 0970-0354. The burden requested reflects the Baby FACES 2020 instruments, including the newly requested observations of home visits and parent–child interactions.
A16. Timeline
Table A.5 contains the timeline for the data collection and reporting activities. Recruitment will begin in fall 2019, after obtaining OMB approval. Data collection is expected to occur through spring 2020. Mathematica will produce several publications based on analysis of data from Baby FACES 2020:
Descriptive tables of findings from all surveys. The intention is to quickly produce findings that Federal agencies can use.
A final report including information from the descriptive tables and additional narrative explanation of the findings. This report will be accessible to a broad audience, using graphics and figures to communicate key findings.
Briefs on specific topics of interest to the government. These briefs will be focused and accessible to a broad audience.
Restricted-use data files and documentation that will be available for secondary analysis.
Table A.5. Schedule for Baby FACES 2020 data collection and reporting
| Activity | Timing^a |
| --- | --- |
| Recruitment | |
| Program recruitment | Fall 2019 |
| Data collection | |
| Parent survey (by telephone) | Spring/summer 2020 |
| Program and center director surveys | Spring 2020 |
| On-site classroom observations and staff surveys | Spring 2020 |
| In-home visits for home visit observations and parent–child interactions | Spring 2020 |
| Analysis | |
| Data processing and analysis for data tables | Spring/summer 2020 |
| Data processing and analysis for final report | Winter 2020/spring 2021 |
| Reporting | |
| Data tables | Fall 2020 |
| Final report on the 2020 data collection | Spring 2021 |
| Briefs on specific topics | Spring/summer 2021 |
| Restricted-use data file | Spring 2021 |

^a After obtaining OMB approval.
A17. Exceptions
No exceptions are necessary for this information collection.
Attachments
Appendices
Appendix A. 60-Day Federal Register Notice
Appendix B. Comments Received on 60-Day Federal Register Notice
Appendix C. Conceptual Frameworks and Research Questions
Appendix D. NIH Certificate of Confidentiality
Appendix E. Advance Materials
Appendix F. Brochure
Appendix G. Screen Shots
Instruments
Instrument 1. Classroom/home visitor sampling form from Early Head Start staff
Instrument 2. Child roster form from Early Head Start staff
Instrument 3. Parent consent form
Instrument 4. Parent survey
Instrument 5. Parent Child Report
Instrument 6a. Staff survey (Teacher survey)
Instrument 6b. Staff survey (Home Visitor survey)
Instrument 7a. Staff Child Report (Teacher)
Instrument 7b. Staff Child Report (Home Visitor)
Instrument 8. Program director survey
Instrument 9. Center director survey
Instrument 10. Parent–child interaction
References
Bureau of Labor Statistics. “Usual Weekly Earnings of Wage and Salary Workers: Fourth Quarter 2018.” USDL-19-0077. Washington, DC: Bureau of Labor Statistics, January 2019.
Horm, D., D. Norris, D. Perry, R. Chazan-Cohen, and T. Halle. “Developmental Foundations of School Readiness for Infants and Toddlers: A Research to Practice Report.” OPRE Report No. 2016-07. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2016.
LaFauve, K., K. Rowan, K. Koepp, and G. Lawrence. “Effect of Incentives on Reducing Response Bias in a Web Survey of Parents.” Presented at the American Association of Public Opinion Research Annual Conference, Denver, CO, May 16–19, 2018.
Mack, S., V. Huggins, D. Keathley, and M. Sundukchi. "Do Monetary Incentives Improve Response Rates in the Survey of Income and Program Participation?" In Proceedings of the American Statistical Association, Survey Research Methods Section, 1998, pp. 529–534.
Martin, E., and F. Winters. “Money and Motive: Effects of Incentives on Panel Attrition in the Survey of Income and Program Participation.” Journal of Official Statistics, vol. 17, no. 2, 2001, pp. 267–284.
Office of Management and Budget, Office of Information and Regulatory Affairs. “Questions and Answers When Designing Surveys for Information Collections.” Washington, DC: Office of Management and Budget, 2006.
Singer, E., N. Gebler, T. Raghunathan, J. V. Hoewyk, and K. McGonagle. "The Effect of Incentives on Response Rates in Interviewer-Mediated Surveys." Journal of Official Statistics, vol. 15, no. 2, 1999, pp. 217–230.
Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, pp. 105–128. Washington, DC: National Academy Press, 2002.
Singer, E., J. Van Hoewyk, and M.P. Maher. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly, vol. 64, no. 2, 2000, pp. 171–188.
Sosinsky, L., K. Ruprecht, D. Horm, K. Kriener-Althen, C. Vogel, and T. Halle. "Including Relationship-Based Care Practices in Infant-Toddler Care: Implications for Practice and Policy." Brief prepared for the Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, 2016.
Vogel, Cheri A., Kimberly Boller, Yange Xue, Randall Blair, Nikki Aikens, Andrew Burwick, Yevgeny Shrago, Barbara Lepidus Carlson, Laura Kalb, Linda Mendenko, Judy Cannon, Sean Harrington, and Jillian Stein. “Learning As We Go: A First Snapshot of Early Head Start Programs, Staff, Families, and Children.” OPRE Report No. 2011-7. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, February 2011.
Vogel, Cheri A., Pia Caronongan, Jaime Thomas, Eileen Bandel, Yange Xue, Juliette Henke, Nikki Aikens, Kimberly Boller, and Lauren Bernstein. “Toddlers in Early Head Start: A Portrait of 2-Year-Olds, Their Families, and the Programs Serving Them.” OPRE Report No. 2015-10. Washington, DC: Administration for Children and Families, U.S. Department of Health and Human Services, 2015a.
Vogel, Cheri A., Pia Caronongan, Yange Xue, Jaime Thomas, Eileen Bandel, Nikki Aikens, Kimberly Boller, and Lauren Murphy. “Toddlers in Early Head Start: A Portrait of 3-Year-Olds, Their Families, and the Programs Serving Them.” OPRE Report No. 2015-28. Washington DC: Office of Planning, Research, and Evaluation, and Princeton, NJ: Mathematica Policy Research, April 2015b.
Xue, Yange, Kimberly Boller, Cheri A. Vogel, Jaime Thomas, Pia Caronongan, and Nikki Aikens. “Early Head Start Family and Child Experiences Survey (Baby FACES) Design Options Report.” OPRE Report No. 2015-99. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services, September 2015.
1 The Family and Child Experiences Survey (FACES) information collection is approved under OMB #0970-0151.
2 Pregnant women in the study sample do not complete the Parent Child Report. However, they will report on their perceptions of social support, household drug and alcohol use, and household income in the parent survey.
3 The PIR is an administrative data system for the Head Start program as a whole that includes data collected annually from all programs. Head Start programs collect the information as approved under OMB control number 0970-0427.