Personal Responsibility Education Program (PREP) Multi-Component Evaluation – Extension
OMB Information Collection Request
0970 - 0398
Supporting Statement
Part B
November 2017
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer: Caryn Blitz
B1. Respondent Universe and Sampling Methods
From the universe of PREP grantees, ACF selected four program sites to participate in the Impact and In-Depth Implementation Study component of the PREP evaluation. The selected sites are not meant to be representative of PREP-funded programs as a whole. Rather, site selection focused on programs that (1) are large enough to support a rigorous evaluation of program effectiveness, (2) are implementing programs in a way that is amenable to random assignment, and (3) address priority gaps in the existing research literature on evidence-based approaches to teen pregnancy prevention. These gaps include evidence on effective programs for high-risk populations, such as pregnant and parenting teens, and for under-studied youth populations, such as youth living in rural areas.
Consent for study data collection was obtained upon enrollment in the study; enrollment has been completed for all sites (Table B1.1). Each site is testing a different teen pregnancy prevention program and thus has a different target population. In the three school-based sites, all students in the participating schools and grade levels were invited to participate. In the community-based site, participants were recruited on a volunteer basis through a local social service provider. The resulting samples are not intended to be representative of a broader population. Data collection is complete in New York.
Table B1.1. Sample enrollment in PREP Impact and In-Depth Implementation study sites

Site | Respondent universe | Sampling method | Total enrollment
Iowa | 7th grade males | All 7th grade males in participating schools were invited to participate | 737
Kentucky | 9th grade students | All students taking the required health class were invited to participate | 2,220
New York | Youth enrolled in alternative schools | All students in the participating alternative schools were invited to participate | 465
Texas (HFSA) | Pregnant and parenting young women | All pregnant and parenting young women seeking services from Healthy Families San Angelo were invited to participate | 595
B2. Procedures for Collection of Information
The mode of follow-up data collection varies by site. Wherever possible, the evaluation team administers the follow-up surveys in groups using a paper-and-pencil instrument (PAPI). When necessary to increase response rates or accommodate specific populations, this method has been augmented with or replaced by computer-assisted telephone interviewing (CATI) or a telephone follow-up with hard copy. For the telephone follow-up with hard copy, trained interviewers read the PAPI survey aloud to respondents over the phone and record the respondents' answers on a hard copy survey.
For group administration, the evaluation team begins by handing out pre-identified survey packets to the youth whose names are on the packets and obtaining youth assent (Attachment D). Each packet consists of the PREP follow-up survey and a sealable return envelope. The survey includes a label with a unique ID number; no personally identifying information appears on the survey or return envelope. Youth self-administer the survey. The instrument has three parts (Part A, Part B1, and Part B2) to avoid asking youth who are not sexually experienced detailed questions about sexual activity. Part A asks for background information and concludes with a single screening question about sexual experience. Youth with sexual experience complete Part B1; those without complete Part B2. Two members of the evaluation team monitor activities in each survey room. At the end of the administration, youth place the entire survey in the return envelope, seal it, and return it to a member of the evaluation team. Completed surveys are shipped immediately via FedEx to Mathematica's Survey Operations Center, where they are receipted and checked for completeness. Any forms with identifying information (assent forms) are shipped separately from the surveys. All surveys that pass the completeness check are sent to a vendor for scanning, and the scanned data are electronically transmitted back to the evaluation team.
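The branching between Parts B1 and B2 follows a simple routing rule. The sketch below illustrates that skip logic in Python; the function and field names are hypothetical, since the actual instrument is administered on paper and the routing is followed by the respondent from written instructions, not by software.

```python
# Minimal sketch of the instrument's skip logic (hypothetical names).
# "ever_had_sex" stands in for the single screening question that
# concludes Part A; the real survey is paper-and-pencil.

def parts_to_complete(ever_had_sex: bool) -> list[str]:
    """Return the survey parts a respondent should complete."""
    if ever_had_sex:
        # Sexually experienced youth answer the detailed questions in Part B1.
        return ["Part A", "Part B1"]
    # Youth without sexual experience skip to the alternate Part B2.
    return ["Part A", "Part B2"]

# Example: a respondent who answers "no" to the screener completes
# Part A and Part B2, and never sees the detailed Part B1 questions.
assert parts_to_complete(False) == ["Part A", "Part B2"]
```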
For youth who do not attend group administrations, or when group administration is not feasible, the evaluation team works collaboratively with each site to determine the best alternative mode of survey administration. Two options are available: (1) individual administration of the PAPI survey over the telephone, used when a small number of respondents cannot attend a group administration; and (2) individual administration through CATI, used when group administration is not feasible for most or all respondents. For example, one of the PREP Impact and In-Depth Implementation study sites is assessing the effectiveness of a home visiting program for teen mothers. Because the home visiting program does not provide a natural group setting for survey administration, follow-up surveys in that site are conducted via CATI.
B3. Methods to Maximize Response Rates and Deal with Nonresponse
Expected Response Rates
To date, response rates have either equaled or exceeded the rates projected at the start of the study. For the 12-month follow-up survey, response rates have ranged across sites from 84 percent for the New York and Texas sites to 94 percent for the Iowa site. The projected rate was 80 percent. For the 24-month follow-up, response rates have ranged across sites from 76 percent for the Texas site to 91 percent for the Iowa site. The projected rate was 75 percent. ACF expects to achieve similar response rates for the remaining follow-up data collection.
Dealing with Nonresponse
Even with high survey response rates, the evaluation team will take steps to understand the nature of any nonresponse and to account for the threat it may pose to the validity of the study's impact estimates. Using data from the baseline survey, evaluation team members will first test for statistically significant differences in demographic and baseline outcome variables between respondents and nonrespondents. Any such differences will be documented in the site-specific impact reports. The team will also test for differences in baseline characteristics between the research groups and control for these differences using covariates when estimating program impacts.
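As an illustration of these checks, the sketch below shows one way the respondent/nonrespondent comparison and the covariate-adjusted impact estimate could be carried out. The file name, variable names, and covariates are hypothetical; this is not the evaluation team's actual analysis code.

```python
# Illustrative sketch of the nonresponse checks described above.
# File, column, and covariate names are hypothetical.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

# One row per sample member, with baseline covariates, a treatment
# indicator, a follow-up response flag, and the follow-up outcome.
df = pd.read_csv("baseline_and_followup.csv")

# 1. Test for baseline differences between follow-up respondents and
#    nonrespondents; any significant differences would be documented.
for var in ["age", "female", "baseline_outcome"]:
    resp = df.loc[df["responded"] == 1, var].dropna()
    nonresp = df.loc[df["responded"] == 0, var].dropna()
    t, p = stats.ttest_ind(resp, nonresp, equal_var=False)
    print(f"{var}: t = {t:.2f}, p = {p:.3f}")

# 2. Estimate the program impact among respondents, controlling for
#    baseline characteristics as covariates.
impact = smf.ols(
    "outcome ~ treatment + age + female + baseline_outcome",
    data=df[df["responded"] == 1],
).fit()
print(impact.summary())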
Maximizing Response Rates
For the school-based sites, the evaluation team has worked closely with school contacts to locate respondents in their new classrooms. Evaluation team members have also asked schools to post reminders and make announcements before and on the day of the survey administration to maximize attendance. On the day of the survey administration, contractor staff have taken attendance before beginning the administration and immediately followed up with the school contact about any unexpected absentees. Sample members who have transferred schools or moved out of the area have been tracked and given the option to complete the survey by telephone.
In sites where group-based administration is not possible, contractor staff have sent advance letters to sample members, notifying them of the data collection and providing the information needed to complete the survey over the phone. Additional telephone, email, and text prompts to youth and parents have been conducted as needed (Attachment E).
Additionally, gift cards are provided to respondents in the amounts previously approved by OMB. For group survey administrations, respondents receive $15 gift cards for completing a 12-month follow-up survey and $20 gift cards for completing a 24-month follow-up survey. For surveys completed by telephone, respondents receive $20 gift cards for completing a 12-month follow-up survey and $25 gift cards for completing a 24-month follow-up survey. Slightly larger gifts are offered to respondents who complete surveys outside of group administration because of the additional burden associated with phone administration, which requires greater initiative and cooperation on the part of the respondent as well as additional time outside of school or their ordinary day. For both group and telephone administrations, slightly larger gifts are offered for the 24-month follow-up surveys to promote high response rates: attrition from surveys tends to increase over time because of participant mobility and study fatigue, and higher incentives help maintain participation. Research has shown that gifts of this size are effective at increasing response rates for populations similar to those participating in this study.1,2 Throughout the study, the evaluation team has offered these incentives and maintained response rates of more than 80 percent for all sites and for each of the two follow-up waves.
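The approved gift card schedule varies on two dimensions, administration mode and follow-up wave. A minimal lookup sketch of that schedule appears below; the amounts come from the text above, while the data structure itself is only illustrative.

```python
# Gift card amounts (in dollars) previously approved by OMB, keyed by
# (administration mode, follow-up wave). Amounts are taken from the
# text above; the lookup structure is only illustrative.
GIFT_CARD_AMOUNTS = {
    ("group", "12-month"): 15,
    ("group", "24-month"): 20,
    ("telephone", "12-month"): 20,
    ("telephone", "24-month"): 25,
}

# Example: a respondent completing the 24-month follow-up by phone
# receives a $25 gift card.
assert GIFT_CARD_AMOUNTS[("telephone", "24-month")] == 25
```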
B4. Tests of Procedures or Methods to be Undertaken
ACF is not proposing any changes to the previously approved data collection instruments or procedures as part of this request for a one-year extension without change. ACF has not conducted any additional testing of the data collection procedures or methods since receiving initial OMB approval.
B5. Individual(s) Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Follow-up survey data for the PREP Impact and In-Depth Implementation Study are being collected and analyzed by ACF's prime contracting organization, Mathematica Policy Research.
Attachment B lists the individuals whom ACF consulted on the follow-up survey.
1 Singer, Eleanor, and Richard A. Kulka. 2002. Paying respondents for survey participation. In Studies of welfare populations: Data collection and research issues, eds. Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, 105–28. Washington, DC: National Academy Press.
2 Kirsch, Irwin, Kentaro Yamamoto, Norma Norris, Donald Rock, Ann Jungeblut, Patricia O'Reilly, Martha Berlin, Leyla Mohadjer, Joseph Waksberg, Huseyin Goksel, John Burke, Susan Rieger, James Green, Merle Klein, Anne Campbell, Lynn Jenkins, Andrew Kolstad, Peter Mosenthal, and Stéphane Baldi. 2001. Technical report and data file user's manual for the 1992 National Adult Literacy Survey. NCES 2001–457. Project Officer: Andrew Kolstad. Washington, DC: U.S. Department of Education, National Center for Education Statistics.