
Head Start Family and Child Experiences Survey (FACES 2019): Recruitment of Programs and Selecting Centers



OMB Information Collection Request

0970-0151




Supporting Statement

Part A

February 2018



Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Mary Mueggenborg and Meryl Barofsky

CONTENTS

A.1 Necessity for the Data Collection

A.2. Purpose of Survey and Data Collection Procedures

A.3. Improved Information Technology to Reduce Burden

A.4. Efforts to Identify Duplication

A.5. Involvement of Small Organizations

A.6. Consequences of Less Frequent Data Collection

A.7. Special Circumstances

A.8. Federal Register Notice and Consultation

A.9. Incentives for Respondents

A.10. Privacy of Respondents

A.11. Sensitive Questions

A.12. Estimation of Information Collection Burden

A.13. Cost Burden to Respondents or Record Keepers

A.14. Estimate of Cost to the Federal Government

A.15. Change in Burden

A.16. Plan and Time Schedule for Information Collection, Tabulation and Publication

A.17. Reasons Not to Display OMB Expiration Date

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

REFERENCES



APPENDICES

A PROGRAM INFORMATION PACKAGES

B OMB HISTORY

C MATHEMATICA CONFIDENTIALITY PLEDGE

D AI/AN FACES 2019 CONFIDENTIALITY AGREEMENT

E AI/AN FACES 2019 AGREEMENT OF COLLABORATION AND PARTICIPATION

F AI/AN FACES 2019 TRIBAL PRESENTATION TEMPLATE

G PUBLIC COMMENTS


ATTACHMENTS

1 TELEPHONE SCRIPT AND RECRUITMENT INFORMATION COLLECTION FOR PROGRAM DIRECTORS, REGIONS I THROUGH X

2 TELEPHONE SCRIPT AND RECRUITMENT INFORMATION COLLECTION FOR PROGRAM DIRECTORS, REGION XI

3 TELEPHONE SCRIPT AND RECRUITMENT INFORMATION COLLECTION FOR ON-SITE COORDINATORS, REGIONS I THROUGH X

4 TELEPHONE SCRIPT AND RECRUITMENT INFORMATION COLLECTION FOR ON-SITE COORDINATORS, REGION XI




TABLES

A.1. FACES study structure

A.2. FACES 2019 Core studies, by level of data collected

A.3. FACES 2019 previously approved incentive structure compared to structure of prior rounds

A.4. Total burden requested under this information collection



A.1 Necessity for the Data Collection

The Administration for Children and Families (ACF) at the U.S. Department of Health and Human Services (HHS) seeks approval for a new round of the Head Start Family and Child Experiences Survey (FACES). Beginning in 2019, two parallel studies will commence. Each study will provide data on a set of key indicators for Head Start programs. FACES 2019 focuses on Head Start Regions I through X (which are geographically based); AI/AN (American Indian and Alaska Native) FACES 2019 focuses on Region XI (which funds Head Start programs that serve federally recognized American Indian and Alaska Native tribes1). As with FACES 2014-2018, FACES 2019 features a ‘‘Core Plus’’ study design: Core studies will capture key indicators; potential Plus studies afford the flexibility to address new policy and programmatic issues.

In this package, we present the sampling plans and the information collection activities necessary to recruit Head Start programs and centers into FACES 2019 and AI/AN FACES 2019. The sample design and recruitment strategy for Regions I through X will differ slightly from those for Region XI. ACF proposes to contact up to 230 Head Start programs in Regions I through X and up to 30 Head Start programs in Region XI to participate in FACES 2019 or AI/AN FACES 2019, respectively.2 This information collection request (ICR) covers the gathering of information that we will use to develop a sampling frame of Head Start centers in each program. We will submit a separate ICR for the FACES 2019 and AI/AN FACES 2019 data collections, covering the selection of classrooms and children for the study, parental consent procedures, data collection instruments and procedures, data analyses, and reporting of study findings.

Study Background

ACF has contracted with Mathematica Policy Research and its subcontractors, Juárez and Associates and Educational Testing Service, under contract number HHSP233201500035I/HHSP23337024T, to collect information on Head Start program performance measures. FACES 2019 extends a previously approved data collection program (OMB number 0970-0151) to a new sample of Head Start programs, families, and children. As with previous rounds, both FACES 2019 and AI/AN FACES 2019 will collect information from a national probability sample of Head Start programs to ascertain what progress Head Start has made toward meeting program performance goals. AI/AN FACES 2019 builds on AI/AN FACES 2015 (also OMB number 0970-0151).

Legal or Administrative Requirements that Necessitate the Collection

There are two legislative bases for these data collection efforts: the Government Performance and Results Act of 1993 (P.L. 103-62), which requires that the Office of Head Start (OHS) move expeditiously toward developing and testing Head Start performance measures, and the Improving Head Start for School Readiness Act of 2007 (P.L. 110-134), which outlines requirements for monitoring, research, and standards for Head Start. FACES provides the mechanism for collecting data on nationally representative samples of the programs, children, and families that Head Start serves in Regions I through X in order to provide OHS, other federal government agencies, local programs, and the public with valid and reliable national information. Similarly, AI/AN FACES 2019 collects data on a nationally representative sample in Region XI to provide data to federal, local, and tribal stakeholders.

A.2. Purpose of Survey and Data Collection Procedures

Overview of Purpose and Approach

In 2019, FACES will enter its 22nd year of serving as a source of timely, periodic, contextualized data about the national Head Start program and its participants. The study design recognizes that the needs of the Head Start program and the broader early childhood education field are continually evolving and that study measurement must respond to those shifts. It consists of a core set of data collection activities to capture key characteristics and indicators relating to programs, classrooms, and child and family outcomes. We refer to these activities occurring in Head Start Regions I through X as FACES 2019 and to comparable activities in Region XI as AI/AN FACES 2019. The FACES 2019 design (just as with FACES 2014) has the flexibility to integrate new measurement and content—known as Plus studies—to inform emerging programmatic questions. In Table A.1, we provide a brief overview of the 2019 study structure. Key characteristics include (1) two rounds of data collection (fall 2019 through spring 2020 and spring 2022) in a nationally representative sample of programs, centers, classrooms, and children;3 (2) working with ACF and other key stakeholders to finalize elements of the Core and Plus studies; and (3) providing reporting products in a timely fashion after the completion of each FACES 2019 and AI/AN FACES 2019 wave.

Research Questions

The goals of FACES 2019 are to describe: (1) the quality and characteristics of Head Start classrooms, programs, and staff for specific program years; (2) the changes or trends in the quality and characteristics of classrooms, programs, and staff over time; and (3) the factors or characteristics that predict differences in classroom quality. The study also will focus on describing (4) the school readiness skills and family characteristics of children who participate in Head Start during specific program years, (5) the changes or trends in children’s outcomes and family characteristics over time, and (6) the factors or characteristics at multiple levels that predict differences in children’s outcomes. Across the two Core studies, we will address several types of questions, including:

  1. What are the characteristics and needs of the children and families Head Start serves?

  2. What gains in school readiness skills and developmental outcomes do children make during a year of Head Start?

  3. What are the home and community-based activities available to children and families?

  4. What are the key features of Head Start programs and the services that children and families receive?

  5. What are the characteristics and qualifications of staff who provide services to families? What kinds of support (training, mentoring, and supervision, for example) do staff receive?

  6. What is the quality of the services Head Start provides to families?

  7. Does classroom quality vary by program, classroom, and teacher characteristics? Do children and families, staff, and services differ by programs with different features (program size, auspice, and staff satisfaction, for example)?

AI/AN FACES 2019 addresses the following research questions:

  1. What are the demographic characteristics and home environments of children and families Region XI Head Start serves? What are the strengths and needs of the children and families who receive services?

  2. What are the home and community-based activities (in particular around storytelling, native language, and cultural or traditional ways) available to children and families? What supports for native language and culture are Region XI programs providing?

  3. What are the average school readiness skills of Region XI Head Start children in fall and spring of the Head Start year? How do Head Start children compare with children of similar ages in the general population?

  4. What characteristics of children’s Head Start experiences and home life are associated with better child outcomes?


As we will describe in Part B, AI/AN FACES 2019 focuses on children and families. We collect information on programs and classrooms to provide context for children’s experiences.

Study Design

Table A.1. FACES study structure

FACES 2019 (Regions I–X): Design with ACF, fall 2017 through spring 2019 (recruitment begins spring 2019); Classroom + child data collection, fall 2019 and spring 2020; Classroom data collection, spring 2020 and spring 2022.

AI/AN FACES 2019 (Region XI): Design with ACF, fall 2017 through spring 2019 (recruitment begins fall 2018); Classroom + child data collection, fall 2019 and spring 2020.

Plus Study (Regions I–X): Potential design with ACF, fall 2020 through spring 2021; potential Plus study, fall 2021 through spring 2022.

Note: We will prepare reporting products beginning three months after each Core study and AI/AN FACES 2019 wave.

Table A.2 provides an overview of the periodicity, data sources, and number of programs included in each Core study. The Classroom + Child Outcomes Core will occur in fall 2019 and spring 2020. At both time points, FACES 2019 will assess the school readiness skills of 2,400 Head Start children in Regions I through X, survey their parents, and ask their Head Start teachers to rate the children’s social and emotional skills. We will sample these children from 240 classrooms within 120 centers in 60 programs in Regions I through X. In spring 2020, we will also conduct classroom observations in sampled programs, expanding from the 60 programs used to collect data on children’s school readiness outcomes to 180 programs so that we can observe 720 Head Start classrooms within 360 centers. We will also conduct surveys with the program director, center director, and teacher in spring 2020. Please see Part B for more detailed information about data collection.

Table A.2. FACES 2019 Core studies, by level of data collected


Classroom + Child Outcomes Core
Program/center level: program and center director surveys, spring 2020; 60 programs
Classroom level: teacher surveys and classroom observations, spring 2020; 60 programs
Child level: direct child assessments, parent surveys, and teacher reports on children, fall 2019 and spring 2020; 60 programs
Parent/family level: parent surveys, fall 2019 and spring 2020; 60 programs

Classroom Core
Program/center level: program and center director surveys, spring 2020 and spring 2022; 180 programs*
Classroom level: teacher surveys and classroom observations, spring 2020 and spring 2022; 180 programs*
Child level: NA
Parent/family level: NA

* The 180 programs in the Classroom Core include the 60 programs above with child-level data in spring 2020.

NA = not available. These data are not collected as part of this Core study.


AI/AN FACES 2019 is similar in structure to the FACES 2019 Classroom + Child Outcomes Core with a nationally representative sample of Region XI Head Start programs, classrooms, and the children and families they serve. AI/AN FACES 2019 represents a much smaller population than the Core studies, with a correspondingly smaller sample. We will conduct child assessments and parent surveys with 800 families in fall 2019 and spring 2020. In spring 2020, we will conduct approximately 80 Head Start classroom observations. Please see Part B for more detailed information about data collection.

To achieve an efficient, nationally representative sample of Head Start classrooms and children, FACES 2019, like all earlier rounds, uses a multistage sample design with four stages: (1) Head Start programs, (2) centers within programs, (3) classrooms within centers, and (4) children within classrooms. Ultimately, the FACES 2019 sample will include about 180 programs, 360 centers, 720 classrooms, and, from 60 of the programs, 2,400 3- and 4-year-olds and their families. We will select Head Start programs in late winter 2019 from the Head Start Program Information Report (PIR) database for program year 2017–2018 (the most current PIR available at the time of sampling). For the 60 programs in Regions I through X that participate in child-level data collection, we will begin recruitment activities in spring 2019. We will contact the other 120 programs in fall 2019 so that data collection can commence in spring 2020.
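To make the nested sample sizes concrete, the sketch below is purely illustrative: it derives the FACES 2019 targets from the average per-stage counts implied by the totals above (two centers per program, two classrooms per center, and, in the 60 child-level programs, ten children per classroom). These per-unit averages are inferences from the stated totals, not sampling rules specified by the study, and actual counts will vary by program.

```python
# Illustrative sketch of the FACES 2019 multistage sample targets.
# The per-stage averages are inferred from the totals in the text;
# actual per-program counts will vary with program size.

PROGRAMS_TOTAL = 180          # programs sampled in Regions I-X
PROGRAMS_CHILD_LEVEL = 60     # subset with child-level data collection
CENTERS_PER_PROGRAM = 2       # implied average, 360 centers / 180 programs
CLASSROOMS_PER_CENTER = 2     # implied average, 720 classrooms / 360 centers
CHILDREN_PER_CLASSROOM = 10   # implied average, 2,400 children / 240 classrooms

centers = PROGRAMS_TOTAL * CENTERS_PER_PROGRAM                   # 360
classrooms = centers * CLASSROOMS_PER_CENTER                     # 720
child_level_classrooms = (PROGRAMS_CHILD_LEVEL * CENTERS_PER_PROGRAM
                          * CLASSROOMS_PER_CENTER)               # 240
children = child_level_classrooms * CHILDREN_PER_CLASSROOM       # 2,400

print(f"Centers: {centers}, Classrooms: {classrooms}, "
      f"Child-level classrooms: {child_level_classrooms}, Children: {children}")
```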

AI/AN FACES 2019 will follow the same four-stage sample design. Ultimately, AI/AN FACES 2019 will include 22 programs, 37 centers, 80 classrooms, and 800 3- and 4-year-olds and their families. Program recruitment in Region XI will begin in fall 2018 to allow a full year to complete the additional processes involved in tribal review and approval. Because we will begin recruiting programs for AI/AN FACES 2019 from Region XI in fall 2018, we will likely use the 2016–2017 PIR to select those programs.

Universe of Data Collection Efforts

No frame of Head Start centers currently exists for FACES 2019 and AI/AN FACES 2019, so we will ask each program in the sample to provide, for each of its centers, (1) the number of classrooms in the center, (2) the total number of children enrolled in the center, and (3) the approximate percentage of dual language learner children enrolled in the center. We will also ask the program for information—such as hours of operation—that will help us prepare for the data collection site visits and identify a person who will help study staff select the classroom and child samples and obtain consent from the parents of sampled children. For the 60 programs participating in the FACES 2019 child-level data collection, it will be necessary to collect this information in spring 2019, while program offices are open, in order to stay on schedule for a fall baseline data collection. We will gather this information using:

  • Telephone Script and Recruitment Information Collection For Program Directors, Regions I Through X (Attachment 1)

  • Telephone Script and Recruitment Information Collection For Program Directors, Region XI (Attachment 2)

  • Telephone Script and Recruitment Information Collection For On-Site Coordinators, Regions I Through X (Attachment 3)

  • Telephone Script and Recruitment Information Collection For On-Site Coordinators, Region XI (Attachment 4)

We describe in detail in Part B the steps we will take to recruit programs and to select center samples in Regions I through X and in Region XI. We will follow the same steps regardless of whether a program is selected for child-level data collection. For AI/AN FACES 2019, recruitment materials reflect previous advice from members of the AI/AN FACES 2015 Workgroup, which included Region XI Head Start directors and researchers with experience working with tribal communities. Members provided guidance that helped inform the design of the study and ensure that the study will be responsive to the unique characteristics of Region XI and to tribal priorities. Members of the AI/AN FACES 2019 Workgroup will build on this design, informing any updates (for example, to measurement).

A.3. Improved Information Technology to Reduce Burden

The burden on program directors and on-site coordinators is minimal, as we will gather information over the phone. We will, however, offer the option of providing the information that we are requesting (for example, names and addresses of Head Start centers, numbers of classrooms, and numbers of children) electronically if that is less burdensome for them.

A.4. Efforts to Identify Duplication

No frame exists that we can use to sample Head Start centers. In each round of FACES and AI/AN FACES, we have developed the center frame after selecting the program sample, based on the information those programs provided. The recruitment process is therefore key to developing the frame.

A.5. Involvement of Small Organizations

No small businesses are impacted by the data collection in this project.

A.6. Consequences of Less Frequent Data Collection

The first wave of data collection for FACES 2019 and AI/AN FACES 2019 is scheduled for fall 2019. To adhere to this schedule, we must sample and contact programs for FACES 2019 in spring 2019 to develop the center sampling frames. We will recruit for AI/AN FACES 2019 in fall 2018, six months earlier than FACES 2019, to allow a full year of outreach and ensure that we obtain the required tribal approvals. Based on experience with AI/AN FACES 2015, this additional time is vital to securing tribal approvals. We will sample Head Start centers on a rolling basis as programs are recruited in 2019. We will contact centers to schedule data collection team visits beginning in summer 2019.4

A.7. Special Circumstances

There are no special circumstances for the proposed data collection efforts.

A.8. Federal Register Notice and Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. The notice was published on October 20, 2017 (82 FR 48819–48820; document number 2017-22713). A copy of the notice is attached as Appendix B. During the notice and comment period, we received one comment in support of the study, which is attached (Appendix G). Since submitting the 60-day notice, we have refined recruitment plans; as a result, the estimated number of responses per respondent has decreased.

Consultation with Experts Outside of the Study

We have not consulted with experts outside the study for the purposes of this information collection request for recruitment.

A.9. Incentives for Respondents

Participation in FACES 2019 and AI/AN FACES 2019 will place some burden on program staff, families, and children. The current information request focuses on work with program staff for recruitment. To offset this burden and to acknowledge respondents’ efforts in a respectful way, we are requesting nominal monetary incentives to respondents based on those we have used effectively in previous rounds of FACES and AI/AN FACES.

As part of the recruitment activities proposed in this information collection, the call scripts (Attachments 1 through 4) and information packages (Appendix A) describe the incentives for upcoming data collection activities. Future ICRs will provide detailed information on the procedures and instruments, including incentives.

We plan to ask Head Start teachers to complete a 10-minute Teacher Child Report (TCR) form for each sampled and consented FACES child in their classrooms. We propose to offer each teacher a $10 gift card for each TCR he or she completes. We plan to ask parents to participate in a survey (25 minutes for FACES 2019; 30 minutes for AI/AN FACES 2019) in fall 2019 and spring 2020. We propose to offer parents a gift card in each wave for participating. In addition, the children will participate in a 45-minute child assessment in fall 2019 and spring 2020. We propose to offer participating children a gift (for example, a book) worth approximately $10 each time the child participates. Table A.3 provides an overview of the proposed incentives for data collection.

Table A.3. FACES 2019 previously approved incentive structure compared to structure of prior rounds



FACES component: Teacher child report (respondent: teacher)
FACES 2006 incentive: Fall and spring, $7 per web form; $5 per paper form
FACES 2009 incentive: Fall and spring, $7 per web form; $5 per paper form
FACES 2014-2018 incentive: Fall and spring, $10 per form
FACES 2019 incentive: Fall and spring, $10 per form

FACES component: Parent survey (respondent: parent)
FACES 2006 incentive: Fall and spring, $35
FACES 2009 incentive: Fall and spring, $35
FACES 2014-2018 incentive: FACES fall 2014 and spring 2015, $15 (additional $5 if completed within 2 to 3 weeks of receiving the survey; additional $5 if completed on the web); AI/AN FACES fall 2015 and spring 2016, $25
FACES 2019 incentive: Fall and spring, gift card

FACES component: Child assessment (respondent: child)
FACES 2006 incentive: Fall and spring, children’s book (valued at $10)
FACES 2009 incentive: Fall and spring, children’s book (valued at $10)
FACES 2014-2018 incentive: Fall and spring, children’s book (valued at $10)
FACES 2019 incentive: Fall and spring, children’s book (valued at $10)

Taking into consideration OMB guidance (2006), we propose to provide participants with these incentives for the following reasons:

  1. They should increase response rates and mitigate nonresponse bias. The knowledge that they will receive an incentive for completion will likely increase respondents’ probability of completing the data collection activities. This has been found in particular for low-income and minority populations, which resemble the populations Head Start serves. For example, in their meta-analysis, Singer et al. (1999) found that in three studies, using incentives was useful in achieving higher response rates from respondents who might otherwise be underrepresented in surveys, such as those from low-income and minority populations. Singer and Kulka (2002) examined a number of studies showing that incentives reduce differential response rates and the potential for nonresponse bias. Although response rates tend to decrease over multiple rounds of a study, incentives can mitigate that nonresponse, particularly among low-income and minority populations (Mack et al. 1998; Martin et al. 2000; Singer et al. 2000).

FACES 2014-2018 tried a tiered incentive approach for the parent survey, lowering amounts relative to the prior FACES study to a $15 base (with add-ons for a potential $25 total), and we saw lower response rates than in previous studies. We conducted nonresponse bias analyses, which showed significant differences between respondents and nonrespondents at baseline (fall 2014) in terms of teacher-reported child disability status (with children with disabilities having a higher response rate than those without disabilities), child language (with non-English speakers more likely to respond than English speakers), parent access to unlimited cell phone minutes (with parents with limited cell phone minutes more likely to respond than those with unlimited minutes), and program-level reports of the percentage of enrolled children who are Black and White (with children in programs with 20 percent or less Black child enrollment, and those in programs with more than 50 percent White child enrollment, more likely to respond than children in other types of programs). Although the nonresponse adjustments incorporated in the analysis weights were able to mitigate these significant differences, the experience raises concern about nonresponse bias when no incentive is offered. For AI/AN FACES 2015, the child-level response rate was sufficiently high using a standard incentive approach. Therefore, our nonresponse bias analysis of AI/AN FACES in 2015–2016 was carried out at the program level only.

  2. Complex study design. Additionally, for longitudinal studies such as FACES, offering incentives has been shown to be an effective way to retain participants over time (James 2001; Mack et al. 1998; Martin et al. 2001). Therefore, we believe that using a similar approach for FACES 2019 is the best way to achieve high response rates in the current study. The participation of respondents in the study activities is key to ensuring the information gathered is of high quality. High levels of participation among the sampled Head Start programs, staff, and families reduce the risk of nonresponse bias, which in turn helps produce nationally representative estimates. We have historically achieved high response rates on FACES. Prior rounds used incentives within $5 of our proposed amounts,5 with response rates above 80 percent, and in many cases over 90 percent. It is difficult to find complex studies with populations similar to FACES that did not use incentives. In addition, comparable studies of low-income young families, such as Baby FACES (OMB Control Number 0970-0354, Expires 09/30/19) and PACT (OMB Control Number 0970-0403, Expired 12/31/16), have included incentives for families and children. However, the Project LAUNCH Cross-Site Evaluation (OMB Control Number 0970-0373, Expires 10/31/2019) initially did not offer an incentive to respondents who completed the web-based Parent Survey. Early respondents (pre-incentive) were found not to be representative of their communities: minorities, individuals with lower incomes and education levels, and those who worked part-time or were unemployed were underrepresented. OMB then approved a $25 post-pay incentive after data collection had started. Completion rates and representativeness both improved following the addition of the incentives (Lafauve et al. 2018).

  3. Equity. AI/AN FACES 2019 will follow the same incentive structure as FACES 2019, as was done for FACES 2014 and AI/AN FACES 2015. Although they are two distinct studies, we kept the incentive structure the same for both after discussing the issue with the AI/AN FACES 2015 Workgroup and determining that incentives are necessary to be responsive to the population’s needs.

A.10. Privacy of Respondents

Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, that they may withdraw consent at any time without negative consequences, and that their information will be kept private to the extent permitted by law.

As specified in the contract signed by ACF and Mathematica (referred to as the Contractor in this section), the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The Contractor has developed a Data Safety and Monitoring Plan that assesses all protections of respondents’ personally identifiable information (PII). The Contractor shall ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract/subcontract receive training on data privacy issues and comply with the above requirements. All of the Contractor’s staff sign the Contractor’s confidentiality agreement when joining the company. We have attached a copy of the agreement, called the Mathematica Confidentiality Pledge (Appendix C). Staff who work on AI/AN FACES 2019 must sign an additional confidentiality pledge (Appendix D).

The study will obtain a Certificate of Confidentiality from the National Institutes of Health. The study team will provide this certificate to OMB upon receipt. The Certificate of Confidentiality helps assure participants that their information will be kept private to the fullest extent the law permits. Further, all materials to be used with respondents as part of this information collection, including consent statements and instruments, will be submitted to Health Media Lab IRB (the contractor’s institutional review board) for approval.

As specified in the evaluator’s contract, the Contractor shall use Federal Information Processing Standard (currently, FIPS 140-2) compliant encryption (Security Requirements for Cryptographic Module, as amended) to protect all instances of sensitive information during storage and transmission. The Contractor shall securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Processing Standard. The Contractor shall incorporate this standard into the Contractor’s property management/control system and establish a procedure to account for all laptop computers, desktop computers, and other mobile devices and portable media that store or process sensitive information. The Contractor will secure any data stored electronically in accordance with the most current National Institute of Standards and Technology (NIST) requirements and other applicable federal and departmental regulations. In addition, the Contractor must submit a plan for minimizing, to the extent possible, the inclusion of sensitive information on paper records and for the protection of any paper records, field notes, or other documents that contain sensitive data or PII, ensuring secure storage and limits on access.
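The sketch below is a minimal illustration, not the Contractor’s actual implementation, of what encrypting a sensitive record with a FIPS-approved algorithm (AES-256-GCM) can look like using the Python cryptography package. Whether a given deployment actually meets FIPS 140-2 depends on using a validated cryptographic module and on sound key management, both of which are assumed here rather than shown.

```python
# Illustrative only: AES-256-GCM encryption of a sensitive record.
# AES-GCM is a FIPS-approved algorithm, but FIPS 140-2 compliance also
# requires a validated cryptographic module and managed keys -- operational
# details outside this sketch.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # in practice, managed by a secure key service
aesgcm = AESGCM(key)

record = b"example respondent record (no real PII)"
nonce = os.urandom(12)                      # 96-bit nonce, unique per encryption
ciphertext = aesgcm.encrypt(nonce, record, associated_data=None)

# Decryption requires the same key and nonce; tampering raises InvalidTag.
plaintext = aesgcm.decrypt(nonce, ciphertext, associated_data=None)
assert plaintext == record
```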

Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.

We are currently working with the ACF Office of the Chief Information Officer to complete the Privacy Impact Assessment (PIA) for FACES 2019.

A.11. Sensitive Questions

There are no sensitive questions in this data collection.

A.12. Estimation of Information Collection Burden

Program directors and on-site coordinators will review study materials and speak with a study team member about the centers in their Head Start program. These individuals will not incur any expense other than the time spent answering a small number of questions.

The estimated annual burden for program directors and on-site coordinators is in Table A.4. We expect the total annual burden for this information gathering activity to be 288 hours.

Table A.4. Total burden requested under this information collection

Telephone script and recruitment information collection for program directors, Regions I–X: 230 total respondents; 77 annual respondents; 2 responses per respondent; 1 hour per response; 154 estimated annual burden hours; $29.10 average hourly wage; $4,481.40 total annual cost

Telephone script and recruitment information collection for program directors, Region XI: 30 total respondents; 10 annual respondents; 1 response per respondent; 1 hour per response; 10 estimated annual burden hours; $29.10 average hourly wage; $291.00 total annual cost

Telephone script and recruitment information collection for on-site coordinators, Regions I–X: 230 total respondents; 77 annual respondents; 2 responses per respondent; 0.75 hours per response; 116 estimated annual burden hours; $29.10 average hourly wage; $3,375.60 total annual cost

Telephone script and recruitment information collection for on-site coordinators, Region XI: 30 total respondents; 10 annual respondents; 1 response per respondent; 0.75 hours per response; 8 estimated annual burden hours; $29.10 average hourly wage; $232.80 total annual cost

Estimated total: 288 estimated annual burden hours; $8,380.80 total annual cost

Total Annual Cost

To compute the total estimated annual cost, we multiplied total burden hours by the average hourly wage for Head Start staff, based on median weekly wages from the Bureau of Labor Statistics, Current Population Survey estimates (third quarter of 2017). The results are in Table A.4. For program directors and on-site coordinators, we used the median salary for full-time employees over age 25 with a bachelor’s degree ($29.10 per hour).
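As a purely illustrative check, the short sketch below recomputes the Table A.4 burden hours and costs from the table’s inputs. Rounding fractional hours up to whole hours reproduces the table’s figures; the exact rounding convention is our assumption.

```python
# Illustrative recomputation of Table A.4 (not an official calculator).
import math

HOURLY_WAGE = 29.10  # median hourly wage used for Head Start staff

rows = [
    # (instrument, annual respondents, responses per respondent, hours per response)
    ("Program directors, Regions I-X",    77, 2, 1.00),
    ("Program directors, Region XI",      10, 1, 1.00),
    ("On-site coordinators, Regions I-X", 77, 2, 0.75),
    ("On-site coordinators, Region XI",   10, 1, 0.75),
]

total_hours = 0
total_cost = 0.0
for name, respondents, responses, hours_per_response in rows:
    burden_hours = math.ceil(respondents * responses * hours_per_response)
    cost = burden_hours * HOURLY_WAGE
    total_hours += burden_hours
    total_cost += cost
    print(f"{name}: {burden_hours} hours, ${cost:,.2f}")

print(f"Estimated total: {total_hours} hours, ${total_cost:,.2f}")  # 288 hours, $8,380.80
```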

A.13. Cost Burden to Respondents or Record Keepers

On-site coordinators (OSCs) in the Classroom + Child Outcomes Core and in AI/AN FACES 2019 will receive $500 in fall 2019 for their critically needed assistance in recruiting families, securing parents’ informed consent, and scheduling the multiple data collection activities that must be completed during only one week on site at each program. The OSC’s familiarity with families, and the families’ trust in this local staff member, will be imperative for a successful data collection effort. In spring 2020 (and in spring 2022 for FACES 2019 only), OSCs will receive $250 for their assistance in scheduling data collection visits, which include classroom observations and staff surveys. For all studies, the OSC plays a critical role in communicating study information to program and center staff, gathering the data we need to perform sampling activities, and communicating center information back to FACES study staff.

A.14. Estimate of Cost to the Federal Government

The total cost to the federal government of recruiting and gathering information from Head Start program directors and on-site coordinators in Regions I through X for FACES 2019 is estimated to be $599,523, including direct and indirect costs and fees. The annual cost to the federal government is $199,841.

The total cost to the federal government of recruiting and gathering information from Region XI Head Start program directors and on-site coordinators in AI/AN FACES 2019 is estimated to be $371,481, including direct and indirect costs and fees. The annual cost to the federal government is $123,827.

The total annual cost for FACES 2019 and AI/AN FACES 2019 recruitment is $323,668 ($199,841 + $123,827).
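The annual figures above equal the totals divided by three, consistent with annualizing costs over a standard three-year clearance period (our assumption; the document does not state the annualization period explicitly). A minimal check:

```python
# Illustrative annualization, assuming a three-year clearance period.
CLEARANCE_YEARS = 3  # assumption: standard OMB clearance period

faces_total = 599_523
aian_total = 371_481

faces_annual = faces_total / CLEARANCE_YEARS   # 199,841
aian_annual = aian_total / CLEARANCE_YEARS     # 123,827
combined_annual = faces_annual + aian_annual   # 323,668

print(f"${faces_annual:,.0f} + ${aian_annual:,.0f} = ${combined_annual:,.0f}")
```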

A.15. Change in Burden

This request is for a new round of data collection under 0970-0151. All previously approved data collections under 0970-0151 are complete.

A.16. Plan and Time Schedule for Information Collection, Tabulation and Publication

There are no plans to tabulate and publish the information we gather from program directors and on-site coordinators. The information we collect will be for internal use only.

For FACES, all contacts with program directors and on-site coordinators for the purpose of collecting basic information on Head Start centers will start approximately four months prior to each round of data collection (spring 2019 for programs selected for child-level data collection and fall 2019 for programs selected for classroom-only data collection). For AI/AN FACES 2019, we will begin reaching out to program directors (and, once programs have been recruited, on-site coordinators) starting fall 2018, approximately 10 months prior to data collection. Program directors will receive letters and a subsequent telephone call from a member of the study team6 in spring for FACES 2019 (or fall for AI/AN FACES 2019). Mailings and calls to on-site coordinators will occur on a rolling basis as we identify those individuals during conversations with program directors. We expect to complete all calls with program directors and on-site coordinators by summer 2019 (or winter 2019–2020 for those programs not selected for child-level data collection). We will contact all programs participating in FACES 2019 again in fall 2021 to update the program and center information in preparation for spring 2022 data collection.

A.17. Reasons Not to Display OMB Expiration Date

The OMB number and expiration date will be displayed at the top of the first page of the program director and on-site coordinator scripts. We will offer to read the OMB number and expiration date at the beginning of the call.

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this data collection.


REFERENCES

Bureau of Labor Statistics. “Usual Weekly Earnings of Wage and Salary Workers: Third Quarter 2017.” USDL-17-1402. Washington, DC: Bureau of Labor Statistics, October 2017.

James, T. “Results of the Wave 1 Incentive Experiment in the 1996 Survey of Income and Program Participation.” Paper presented at the Proceedings of the Section on Survey Research Methods, Alexandria, VA, 2001.

Lafauve, K., K. Rowan, K. Koepp, & G. Lawrence. “Effect of Incentives on Reducing Response Bias in a Web Survey of Parents.” Presented at the American Association of Public Opinion Research Annual Conference: Denver, CO, May 16-19, 2018.

Mack, S., V. Huggins, D. Keathley, and M. Sundukchi. “Do Monetary Incentives Improve Response Rates in the Survey of Income and Program Participation?” In Proceedings of the American Statistical Association, Survey Research Methods Section, pp. 529–534, 1998.

Martin, E., and F. Winters. “Money and Motive: Effects of Incentives on Panel Attrition in the Survey of Income and Program Participation.” Journal of Official Statistics, vol. 17, no. 2, 2001, p. 267.

Office of Management and Budget, Office of Information and Regulatory Affairs. “Questions and Answers When Designing Surveys for Information Collections.” Washington, DC: Office of Management and Budget, 2006.

Singer E., N. Gebler, T. Raghunathan, J.V. Hoewyk, and K. McGonagle. “The Effect of Incentives In Interviewer-Mediated Surveys.” Journal of Official Statistics, vol. 15, no. 2, 1999, pp. 217‑230.

Singer, E., and R.A. Kulka. “Paying Respondents for Survey Participation.” In Studies of Welfare Populations: Data Collection and Research Issues, edited by Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, pp. 105–128. Washington, DC: National Academy Press, 2002.

Singer, E., J. Van Hoewyk, and M.P. Maher. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly, vol. 64, no. 2, 2000, pp. 171–188.



1 In this document, we use the terms American Indian and/or Alaska Native (AI/AN), tribal, tribe, and Native to refer inclusively to the broad and diverse groups of American Indian and Alaska Native tribes, villages, communities, corporations, and populations in the United States, acknowledging that each tribe, village, community, corporation, and population is unique from others with respect to language, culture, history, geography, political and/or legal structure or status, and contemporary context.

2 This package requests approval to contact 230 Head Start programs in Regions I through X and 30 AI/AN Region XI Head Start programs, with the goal of obtaining participation from 180 Head Start programs and 22 AI/AN Region XI Head Start programs.

3 As in FACES 2014, data for the Core studies will be nationally representative at each of these levels; however, in AI/AN FACES 2019, data will be representative only at the child level.

4 Program recruiting will commence upon OMB approval of this information collection request.

5 Differences in incentives reflect changes in length and mode.

6 In AI/AN FACES, this initial call includes a Workgroup member.

