U.S. Department of Health and Human Services, Office of Population Affairs
1101 Wootton Parkway, Suite 700, Rockville, MD 20852
Contact person: Tara Rice, tara.rice@hhs.gov, 240-453-8123
Part A: Justification for Extension and Revision of OMB Clearance of the Collection of Follow-up Survey Data - Federal Evaluation of Making Proud Choices!
OMB Control Number 0990-0452
February 2020
CONTENTS

A.1. Circumstances Making the Collection of Information Necessary
1. Legal or Administrative Requirements that Necessitate the Collection
2. Study Objectives
A.2. Purpose and Use of Information Collection
A.3. Use of Improved Information Technology and Burden Reduction
A.4. Efforts to Identify Duplication and Use of Similar Information
A.5. Impact on Small Businesses or Other Small Entities
A.6. Consequences of Collecting the Information Less Frequently
A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A.8. Comments in Response to the Federal Register Notice/Outside Consultation
A.9. Explanation of any Payment/Gift to Respondents
A.10. Assurance of Confidentiality Provided to Respondents
A.11. Justification for Sensitive Questions
A.12. Estimates of Annualized Hour and Cost Burden
A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers/Capital Costs
A.14. Annualized Cost to Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
1. Analysis Plan
2. Time Schedule and Publications
A.17. Reason(s) Display of OMB Expiration Date is Inappropriate
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions
SUPPORTING REFERENCES
TABLES

Table A1.1. Summary of Outcome Domains and Constructs
Table A9.1. OMB-approved incentives on similar Federal projects
Table A9.2. Thank You Payments for the Follow-up Data Collection
Table A11.1. Summary of Sensitive Topics to be Included on the Follow-up Survey and Their Justification
Table A.12.A. Calculations of Annual Burden Hours and Cost for Youth Participants for the Follow-up Survey
Table A.12.B. Estimated Annualized Burden Costs for Youth Participants for the Follow-up Survey
ATTACHMENTS

ATTACHMENT A: QUESTION-BY-QUESTION SOURCE TABLE FOR THE FOLLOW-UP SURVEY
ATTACHMENT B: SOURCES REFERENCED FOR THE FOLLOW-UP SURVEY
ATTACHMENT C: PERSONS CONSULTED ON INSTRUMENT DEVELOPMENT AND/OR ANALYSIS OF THE FOLLOW-UP SURVEY
ATTACHMENT D: CONFIDENTIALITY PLEDGE
ATTACHMENT E: ANALYSIS PLAN
ATTACHMENT F: SECTION 301 OF THE PUBLIC HEALTH SERVICE ACT
ATTACHMENT G: 60-DAY FEDERAL REGISTER NOTICE
INSTRUMENTS
Instrument 1: Follow-up Survey
A.1. Circumstances Making the Collection of Information Necessary
This package is an extension and revision of an existing approval (OMB Control #0990-0452) for the Making Proud Choices! (MPC!) Evaluation study to complete 9-month follow-up survey data collection for the last cohort of study participants. This follow-up data collection involves surveys administered to the individual youth enrolled in the impact study. To increase study power at the cluster level and achieve our original target of approximately 39 clusters, an additional fourth cohort was enrolled into the evaluation in Fall 2019 and Spring 2020. To complete all follow-up activities with these final participants, an extension beyond the current expiration date of August 31, 2020 is needed.
OPA is requesting a three-year extension to the current expiration date to complete the follow-up data collection as planned. Additionally, this submission describes a revision to the study design, dropping the 15-month follow-up survey administration, as detailed in Section A.15. Therefore, under the revision, survey data would be collected only at 9 months post baseline survey administration.
OMB approved the initial information collection request for activities related to the MPC! Evaluation in January 2017 (OMB No. 0990–0452). OPA subsequently requested and received approval for a revision to the information collection in August 2017. These prior approvals covered the following: (1) collection of baseline data for the impact study through the baseline survey; (2) collection of information on program implementation; and (3) collection of follow-up data for the impact study through the 9-month follow-up survey.
Research on programs to prevent teen pregnancy is at a turning point. Much of the available research evidence dates to the late 1980s and early 1990s, when public health officials were facing the twin threats of the emerging HIV/AIDS epidemic and a sharp, unexpected increase in the teen birth rate in the United States. In response to these threats, researchers launched a broad, sustained effort to identify and test new programs and curricula with the potential to reduce high rates of teen pregnancy, sexually transmitted diseases (STDs), and associated sexual risk behaviors.
Much has changed in the intervening years. The teen birth rate ultimately peaked in the early 1990s and has now plunged to historic lows (Ventura et al. 2014). Researchers have succeeded in identifying dozens of prevention programs with demonstrated evidence of success in reducing adolescent sexual risk behaviors (Goesling et al. 2014), and the federal government has invested millions of dollars in disseminating knowledge of the programs and implementing them in communities around the country (Kappeler and Farb 2014; Zief et al. 2013). Overall rates of adolescent sexual activity have also declined since the early 1990s, but there has been less progress in addressing disparities in rates by race and ethnicity. This context shifts the research agenda toward a new primary challenge: how to use existing evidence-based programs to sustain the ongoing decline in teen birth rates in the United States and reduce remaining disparities in rates across communities and between racial/ethnic groups.
In response to this shifting research agenda, the Office of Adolescent Health (OAH), now the Office of Population Affairs (OPA), seeks to launch a “second generation” of evaluation activities, one that addresses a more targeted set of research questions of significant relevance to OPA and the broader field.
To meet this objective, OPA designed a large-scale, multisite randomized controlled trial (RCT) of a commonly used but understudied teen pregnancy prevention program, Making Proud Choices! (MPC!). The MPC! curriculum aims to increase students’ knowledge of STDs and HIV, as well as their understanding of the effectiveness of abstinence, condoms, and contraceptives in reducing STDs and pregnancy. MPC! is a very popular program across federal grant programs, implemented by over 100 providers nationwide. The program’s evidence of effectiveness is limited to a single study that met the HHS Teen Pregnancy Prevention (TPP) evidence review standards in 2010 (Jemmott et al. 1998). That study is now over 20 years old and was conducted in a highly controlled implementation context by the program developers. New evidence is needed on the effectiveness of the program as it is replicated nationwide and in schools. School-based programs like MPC! are widely used across TPP grantees. In addition to understanding the impact of the MPC! curriculum, the evaluation also offers opportunities to share lessons learned and best practices around offering programming and conducting impact evaluations in school settings.
The federal emphasis on evidence-based approaches to teen pregnancy prevention began in 2010 with congressional authorization of the TPP program and creation of OAH, now under OPA. The TPP program was one of six early evidence-based initiatives authorized by Congress to increase the use of data and evidence in social policy (Haskins and Margolis 2015). The program provides roughly $100 million annually to state and local organizations to implement evidence-based and promising new teen pregnancy prevention programs. As with several of the other federal evidence-based initiatives, the TPP program features a “tiered evidence” grant structure: the majority of funding goes to disseminate and scale up Tier 1 programs that have some existing evidence of effectiveness, whereas a smaller amount supports Tier 2 demonstration projects, which support innovation in the field by developing and rigorously testing promising new approaches to teen pregnancy prevention.
The first cohort of TPP grantees was announced in fall 2010 and consisted of five-year awards running through fall 2015 (Kappeler and Farb 2014). A total of 75 organizations received funding under Tier 1 of the TPP program, with awards ranging from roughly $400,000 to $4 million annually. In line with the program’s emphasis on evidence-based approaches, grantees were required to select from a list of 28 existing programs and curricula that the U.S. Department of Health and Human Services (HHS) had identified as having demonstrated evidence of effectiveness in reducing teen pregnancy, STDs, or associated sexual risk behaviors. More than three-quarters of these eligible programs (23 of 28) were selected for use by at least one grantee. The TPP program was successful in reaching a very large segment of the population, with about 100,000 youth per year receiving services across a broad network of schools and other community organizations (Wilson and Lawson 2014). In addition, nearly 20 of these grantees conducted impact evaluations of their TPP program.
The experience of the first cohort of TPP grantees highlighted challenges local communities can face when implementing evidence-based programs (Margolis and Roper 2014). For example, grantees needed practical guidance on how to replicate evidence-based programs with fidelity within the time and scheduling constraints of their local schools and community-based organizations. In other cases, grantees found that the content of some of the older evidence-based programs was outdated or did not resonate with local youth. Implementation fidelity was often difficult to maintain, and varied based on the setting and mode.
This evaluation, designed to provide new evidence to guide the identification of evidence-based TPP programs, is authorized under Section 301 of the Public Health Service Act (42 U.S.C. 241); see Attachment F.
The study has been designed to address this question:
Does MPC!, implemented by health educators in schools, change youth sexual behavioral outcomes, relative to a business as usual sexual health program?
The evaluation is being conducted in 15 schools across four school districts and in required health classes or the school’s equivalent when health is not offered.
It is expected that most youth in the study will be 9th and 10th graders enrolled in a school’s required health class. Schools will be randomized to one of two conditions: (1) a treatment group taught MPC! by an outside health educator from a local health department or community-based organization, or (2) a control group that receives the health curriculum the school’s health teacher normally provides (i.e., a business-as-usual control condition). Eligible evaluation youth will be those who are expected to take a required health class.
Survey data will be collected from youth study participants at baseline (before MPC! programming begins for treatment youth) and approximately 9 months after baseline. The baseline survey data will be used to describe the evaluation sample, to define subgroups of interest (gender and sexual experience at baseline), and as a source of covariates to be used in the impact estimation models. The follow-up survey data will be used to estimate program impacts on knowledge, attitudes, beliefs, and behaviors such as sexual initiation and contraception use (see Table A1.1 for information on outcome domains and constructs). Survey items that measure each outcome construct will be used as the dependent variable in a regression analysis used to estimate intent-to-treat program impacts of the MPC! program. The impact study will be complemented by the implementation and fidelity assessment. This study component will take a detailed look at program operations along four key aspects: (1) inputs required for implementation to succeed and be sustained, (2) contextual factors that influence implementation, (3) fidelity and quality of program implementation, and (4) participants’ responsiveness to services.1
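To make the estimation approach concrete, the following stylized model illustrates an intent-to-treat impact regression of the kind described above. The notation is illustrative only; the actual model specification, including its handling of the clustering of youth within randomized schools, is detailed in the analysis plan (Attachment E).

\[
y_{ij} = \alpha + \beta T_j + \gamma' x_{ij} + \varepsilon_{ij}
\]

where \(y_{ij}\) is the outcome for youth \(i\) in school \(j\); \(T_j\) is an indicator equal to 1 if school \(j\) was randomized to receive MPC!; \(x_{ij}\) is a vector of baseline covariates; and \(\beta\) is the intent-to-treat impact estimate. Standard errors would account for the clustering of youth within randomized schools.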
Table A1.1. Summary of Outcome Domains and Constructs

Outcome Domain | Outcome Construct
Exposure to information | Attended classes on reproductive health topics
 | Received information about birth control from a doctor, nurse, or clinic
Knowledge | Knowledge about condoms
 | Knowledge about birth control pills
 | Knowledge about STIs
 | Knowledge about IUDs
 | Knowledge about other hormonal methods of birth control
 | Knowledge about pregnancy
Attitudes | Support for abstinence
 | Support for condom use
Refusal skills | Perceived refusal skills
Communication with parents | Communication about romantic relationships and sex
Intentions | Intentions to have sexual intercourse
 | Intentions to use birth control
Sexual risk behavior | Sexual initiation
 | Sex in the past three months
 | Sex without a condom in past three months
A.2. Purpose and Use of Information Collection
As a component of the “second generation” of OPA TPP evaluation activities, this impact study of MPC! addresses a more targeted set of research questions of significant practical relevance to OPA and the broader field. In particular, the impact study seeks to advance the existing evidence base by identifying and testing a replication of a commonly used but understudied teen pregnancy prevention program.
Findings from the impact study can be used:
By OPA and other agencies in HHS to plan and inform future TPP funding opportunity announcements (FOAs).
To inform the TPP field at large.
To build the evidence base behind MPC!, a popular TPP program model that has outdated evidence associated with it.
Data collected on the Federal Evaluation of MPC! follow-up survey will be used as a central component of the impact study. The follow-up data collection will focus on two types of outcomes outlined in Table A1.1, both of which can be measured only through surveys of youth participating in the evaluation. The first are sexual risk outcomes, including the extent and nature of sexual activity, use of contraception (if sexually active), pregnancy, and testing for and diagnoses of STDs. The second are a series of intermediate outcomes that may be associated with the sexual risk outcomes and thus important to measure as potential pathways of any program effects on sexual risk behavior. Examples of these intermediate outcomes include participation in and exposure to pregnancy prevention programs and services; intentions and expectations regarding sexual activity; knowledge of contraception, condom use self-efficacy and negotiation skills, and sexual risks; dating behavior; and alcohol and drug use. In addition, the survey includes a small number of questions that identify socio-demographic or other characteristics of youth in the study sample, which may be used for descriptive purposes. Finally, for sample youth who report not being sexually active, the survey includes questions to support a descriptive analysis of these youth and a future investigation of their propensity for later risky behaviors.2
Follow-up data will be used to address the following research questions on program impact:
Is MPC! effective at meeting its immediate objectives, such as improving exposure, knowledge, and attitudes?
What is the effect of MPC! on sexual behavior outcomes, such as postponing sexual activity, and reducing or preventing sexual risk behaviors and STDs?
Does MPC! work better for some groups of adolescents than for others?
The primary impact analysis will focus on those who provide follow-up survey data, regardless of their level of participation in the program or whether they completed the baseline survey. Doing so enables the team to conduct a rigorous, intent-to-treat impact analysis that meets the standards of the HHS Teen Pregnancy Prevention Evidence Review. We also plan to conduct analyses on subgroups defined by baseline measures. These analyses will be considered exploratory, and will not be used as a primary test of the effectiveness of the intervention. Instead, they are intended to help program providers and practitioners understand whether the pattern of the findings for the full sample is similar to or different from trends observed for particular subgroups. We will observe trends for subgroups defined by (1) gender, and (2) sexual experience at baseline.
We acknowledge that statistical power for these exploratory analyses may be insufficient due to smaller sample sizes within the subgroups. For that reason, these analyses are intended not as a primary test of the intervention’s effectiveness, but as a means to understand whether the overall pattern of findings is similar to trends observed within and across particular subgroups.
A.3. Use of Improved Information Technology and Burden Reduction
OPA is not proposing any changes to the previously approved data collection procedures as part of this three-year extension request. The follow-up data collection plan reflects sensitivity to issues of efficiency, accuracy, and respondent burden. Whenever possible, the follow-up will be a web-based survey administered in school, in a group setting.3 Trained Mathematica field staff will provide participants with smartphones, along with unique login information to access the survey from the device.
Web-based surveys are an attractive option for surveys of adolescents and young adults, especially surveys that ask sensitive questions and have various pathways based on responses to those questions. Web-based surveys can decrease respondent burden and improve data quality. The web-based application will include built-in skips and will route respondents to the next appropriate question based on their answers, automatically skipping any questions not relevant to them and thus reducing the burden on respondents of navigating through various paths. Additionally, data checks can be programmed into the survey to reject out-of-range responses as well as conflicting responses.
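As an illustration of the kind of routing and validation described above, the following minimal sketch (Python; the question IDs, answer values, and routing rules are hypothetical, not the actual survey instrument code) shows how built-in skips and range checks might work.

# Illustrative sketch of built-in skip logic and range checks.
# Question IDs and routing rules below are hypothetical examples.

SKIP_RULES = {
    # (question ID, answer) -> next question ID
    ("Q10_ever_had_sex", "yes"): "Q11_sex_past_3_months",
    ("Q10_ever_had_sex", "no"): "Q20_attitudes",  # skip the sensitive module
}

RANGE_CHECKS = {
    "Q1_age": lambda a: 10 <= a <= 25,  # reject implausible ages
}

def validate(question_id, answer):
    """Return True if the answer passes the question's range check, if any."""
    check = RANGE_CHECKS.get(question_id)
    return check(answer) if check else True

def next_question(question_id, answer, default_next):
    """Route to the next appropriate question based on the current answer."""
    if not validate(question_id, answer):
        return question_id  # re-ask: out-of-range response
    return SKIP_RULES.get((question_id, answer), default_next)

# Example: a youth answering "no" to the screener skips the sensitive module,
# and an out-of-range age triggers a re-prompt.
assert next_question("Q10_ever_had_sex", "no", "Q11_sex_past_3_months") == "Q20_attitudes"
assert next_question("Q1_age", 42, "Q2_grade") == "Q1_age"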
A.4. Efforts to Identify Duplication and Use of Similar Information
The information collection requirements for the Federal Evaluation of MPC! have been carefully reviewed to avoid duplication with existing studies. Although the one prior study of MPC! (from 1998) that meets the HHS evidence review standards adds to our understanding of the effectiveness of this curriculum on behavioral outcomes, OPA does not believe that it provides current information on program effectiveness, or information on effectiveness with the broader population of youth participating in schools. The data collection for the Federal Evaluation of MPC! is a critical step in providing essential information on the effectiveness of this very popular program as it is implemented in today’s schools.
A.5. Impact on Small Businesses or Other Small Entities
No small businesses are expected to be impacted. Mathematica staff will work with the study sites (schools) to lead and coordinate the data collection activities. The data collection plan is designed to minimize burden on schools by providing staff from Mathematica to lead the follow-up data collection activities.
A.6. Consequences of Collecting the Information Less Frequently
Outcome data are essential to conducting a rigorous evaluation of the MPC! program. Without outcome data, we cannot estimate the effect of the intervention following program implementation.
A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
There are no special circumstances for the proposed data collection efforts.
A.8. Comments in Response to the Federal Register Notice/Outside Consultation
The 60-day Federal Register Notice was published on December 18, 2019 (Vol. 84, No. 244, page 69755) and is included with this submission (Attachment G). No public comments were received.
The names and contact information of the persons consulted in the drafting and refinement of the follow up survey instrument are in Attachment C.
A.9. Explanation of any Payment/Gift to Respondents
OPA is not requesting any change to the previously approved payments provided to respondents. We propose continuing to offer a combination of non-monetary gifts and gift cards (detailed below) to study participants in appreciation of their ongoing participation in the study by responding to the follow-up surveys. Our surveys include highly sensitive questions, and thus impose some burden on respondents. Research has shown that such payments are effective at increasing response rates for populations similar to those participating in this study.4,5
Achieving a high survey response rate at follow-up is critical for three reasons. First, it is necessary for achieving the minimum sample size needed to demonstrate the statistical significance of the findings; the federal government makes great efforts to ensure that it is investing in well-powered studies. Second, rigorous evidence reviews (such as the HHS Teen Pregnancy Prevention Evidence Review, which will eventually assess the evidence from this study) assess study attrition using follow-up survey response rates when determining the evidence rating for a study. High attrition can result in a low evidence rating, which would indicate that the federal government had invested in a study that lacked sufficient internal validity to draw conclusions about program effectiveness. Third, and finally, OPA has directed its contractor to study the effectiveness of MPC! in schools in low-income, disadvantaged areas, and OPA intends that the study population be representative of the youth in these school districts. If high response rates are not achieved, the study sample could be biased toward higher-achieving, higher-income, more highly motivated youth with more supportive parents, the population most likely to complete a follow-up survey early and with little or no incentive.6 Unlike other large-scale federal survey efforts, such as the Census, the study sample of a randomized controlled trial is set at the beginning of the study; there is no opportunity to add sample members over time to replace survey non-respondents. Therefore, a plan to achieve high response rates among randomized study participants is a critical part of a follow-up data collection plan.
The incentive structure proposed in this ICR is a modified version of the incentive structure used on other federally funded studies with similar populations, including the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA, OMB Control Number 0990-0382), PREP (OMB Control Number 0970-0398), and the Strengthening Relationship Education and Marriage Services (STREAMS) Evaluation (OMB Control Number 0970-0481, approved July 5, 2016); see Table A9.1. These studies have achieved high response rates at follow-up. On PREP, the first follow-up response rates in two of the school-based sites are 90 to 94 percent.7
This ICR reflects OMB’s efforts to lower the amount of the incentives and to provide non-monetary incentives for students completing surveys in schools.
Table A9.1. OMB-approved incentives on similar Federal projects

Project (OMB Control No.) | ICR Reference # | Follow-up: | Follow-up:
PREP (0970-0398) | 201401.0970.001 | $15 | $20
STREAMS (0970-0481) | 201703.0970.011 | $15 | $20
PPA (0990-0382) | 201502.0990.002 | $10 | $25
In an evaluation such as Making Proud Choices!, students completing surveys 9 months after the initial baseline survey are no longer organized in the class groupings in which the study and intervention initially took place. Instead, at the time of the follow-up surveys, schools gather students during congregate times, such as lunch, study halls, and advisory periods. Youth have discretion as to whether they meet with the survey team or not.
Since students in school can easily be deterred from attending a follow-up survey session during lunch or study hall, it is important to incentivize them to attend the survey administration and to be fully informed about the opportunity to participate. Therefore, we propose to use a small non-monetary incentive to encourage students to attend the survey administration, regardless of whether they then assent to participate.
We will work with the schools to identify the most appropriate and accessible activity for a non-monetary incentive. The incentive will be valued at no more than $5 per respondent, which is consistent with the value of the non-monetary incentive OMB approved to provide to students to return the parent permission form at the time of study enrollment.
Maintaining high response rates among adolescents who are not available for in-school administration is much more challenging. For example, on the OPA-sponsored study of the Pregnancy Assistance Fund, where survey respondents are not available for in-school administration, 85 percent response rates were achieved only after the use of $25 gift cards (OMB approval 0990-0424). For the Making Proud Choices! study, we estimate that up to 30 percent of our sample may respond to the survey outside of school time. Students we are unable to reach in school, either because they have moved or are chronically absent, must put in added time and effort outside of the school day to complete the survey.
For those participants who complete the survey outside of school, either because group administration is not feasible or they are not able to attend a group administration, a $10 gift card will be provided for completing the follow-up survey.
Table A9.2 summarizes the non-monetary incentives and gift cards to be provided to participants for the follow-up survey data collection, based on where the survey will be administered.
Table A9.2. Thank You Payments for the Follow-up Data Collection

Type of Administration | Length of Activity (minutes) | Thank You Payment
Group Administration (In school) | 30 minutes | Non-monetary incentive valued at $5
Individual Administration (Out of school) | 30 minutes | $10 gift card
A.10. Assurance of Confidentiality Provided to Respondents
The MPC! study and all related instruments have been approved by New England IRB.
Before collecting any data, Mathematica received active consent from a parent or legal guardian. The consent form explained the purpose of the study, the data being collected, and the way the data will be used. As with the baseline survey, prior to the administration of the follow-up survey the evaluation team will seek assent from youth with parental consent.8 The assent form states that (1) answers will be kept private and will not be seen by anyone outside the study team, (2) participation is voluntary, and (3) youth may refuse to participate at any time without penalty. Participants will be told that, to the extent allowable by law, individual identifying information will not be released or published; rather, results will be published only in summary form with no identifying information at the individual level. In addition, our protocol during the self-administration of the web instrument and computer-assisted telephone interviews (CATI) will provide reassurance that we take the issue of privacy seriously. It will be made clear to respondents that identifying information will be kept separate from questionnaires. To access the web survey application, each questionnaire will require a unique login, and respondents will have to enter a verification code to begin the survey; this will prevent unauthorized users from accessing the web application. Any personally identifiable information (PII) will be stored in secure files, separate from survey and other individual-level data. Field staff will collect the smartphones used for survey administration at the end of the survey and will be trained to keep the devices in a secure location at all times. All field staff and phone interviewers are required to sign a confidentiality pledge when hired by Mathematica (Attachment D).
Mathematica has established security plans for handling data during all phases of the data collection. The plans include a secure server infrastructure for online data collection of the web-based survey, which features HTTPS encrypted data communication, user authentication, firewalls, and multiple layers of servers to minimize vulnerability to security breaches. Hosting the survey on an HTTPS site ensures that data are transmitted using 128-bit encryption; transmissions intercepted by unauthorized users cannot be read as plain text. This security measure is in addition to standard user unique login authentication that prevents unauthorized users from accessing the web application.
All electronic data will be stored in secure files, with identifying information kept in a file separate from survey and other individual-level data. Survey responses will be stored on a secure, password-protected computer shared drive.
Privacy Act Considerations
Based on the following two considerations, the Privacy Act does not apply to this information request. First, the records collected in this study will not be retrieved by personal identifiers; second, according to the 1975 OMB Privacy Act Guidance, which was reaffirmed in the recent issuance of Circular A-108, the Privacy Act applies only to systems of records that are required to be managed by the agency, and the data collection for this study is discretionary.
For the first consideration, each sample member in the study is assigned a unique study identification (ID) number. Only Mathematica team members have access to these study ID numbers. When creating a survey data file, only the study ID number is included in the file, not the student’s name or any other PII. Student names and other PII are kept separate from the survey data. When retrieving information about a case, team members pull up the case using its study ID number, not the student’s name.
All PII data are stored separately and securely from de-identified survey data. Any files containing PII are stored on Mathematica’s network in a secure project folder whose access is limited to select project team members. Only the principal investigator, project director and key study staff have access to this folder. Furthermore, approved study team members can only access this folder after going through multiple layers of security.
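As a minimal illustration of the separation described above, the sketch below (Python; the file names and fields are hypothetical, not Mathematica’s actual data system) shows survey data keyed only by study ID, with the PII crosswalk held in a separate, restricted file.

# Illustrative sketch: survey data are keyed only by study ID; the PII
# crosswalk lives in a separate, restricted-access file. File names and
# fields are hypothetical.
import csv

# PII crosswalk, stored separately with restricted access.
with open("pii_crosswalk.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["study_id", "name", "phone"])
    w.writerow(["S-0001", "Jane Doe", "555-0100"])

# The survey data file contains only the study ID, never names or other PII.
with open("survey_responses.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["study_id", "q1", "q2"])
    w.writerow(["S-0001", "yes", "3"])

def find_case(study_id):
    """Retrieve a case by study ID, never by name."""
    with open("survey_responses.csv") as f:
        return [row for row in csv.DictReader(f) if row["study_id"] == study_id]

print(find_case("S-0001"))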
For the second consideration, the data collection is contracted and discretionary, and therefore not covered by the Privacy Act. The proposed data collection will not be conducted by OAH but rather through its contractor, Mathematica. OMB Privacy Act Implementation: Guidelines and Responsibilities (July 9, 1975) describes the terms under which data collected by a contractor under contract to the Federal Government is covered by the Privacy Act:
“Not only must the terms of the contract provide for the operation (as opposed to design) of such a system, but the operation of the system must be to accomplish an agency function. This was intended to limit the scope of the coverage to those systems actually taking the place of a Federal system which, but for the contract, would have been performed by an agency and covered by the Privacy Act.” (40 FRN 28976)
The proposed data collection does not create a system to “accomplish an agency function,” and is not a system that is “taking the place of a Federal system which, but for the contract, would have been performed by an agency.” OAH has discretion as to whether and how to carry out this data collection. Thus, the proposed data collection is discretionary, not required, and the Privacy Act does not apply.
A.11. Justification for Sensitive Questions
The sensitive questions included in the follow-up survey have already been approved by OMB (OMB Control #0990-0452), and OPA is not requesting any change to the previously approved data collection instruments. A key objective of MPC! is to prevent teen pregnancy through a delay in sexual initiation, a decrease in sexual activity, and/or an increase in contraceptive use. Because this is the primary focus of the program, some questions on the OMB-approved follow-up survey are necessarily related to these sensitive issues.
Table A11.1 lists the sensitive topics found on the follow-up survey, along with a justification for their inclusion. Questions about sensitive topics are drawn from previously successful youth surveys and similar federal surveys (see Attachments A and B). Careful selection of these items was guided by experience in weighing the benefits of each measure against concerns about heightened sensitivity to specific issues among sample members, parents, and program staff. Although these topics are sensitive, they are commonly and successfully asked of high school youth similar to those who will be in the Federal Evaluation of MPC!
Table A11.1. Summary of Sensitive Topics to be Included on the Follow-up Survey and Their Justification

Topic | Justification | Similar Federally Funded Surveys9
Gender identity10 | The MPC! program aims to be sensitive and inclusive of all students. This question asks for the student’s self-identified gender, and includes options for transgender, unsure, and other. | BRFSS
Sexual orientation | OAH has a strong interest in improving programming that serves lesbian, gay, bisexual, and questioning youth. This question will provide documentation of the proportion of youth in the study that are part of this subpopulation. | PREP, STREAMS, YRBSS, PPA, TPP Replication Study
Sexual activity, incidence of pregnancy, and contraceptive use | Sexual activity, incidence of pregnancy, and contraceptive use are all key outcomes for the evaluation. | Title V Abstinence Study, PPA, PREP, PAF, STREAMS, YRBSS, ADD Health, TPP Replication Study
Intentions regarding sexual activity | Intentions regarding engaging in sex and other risk-taking behaviors are extremely strong predictors of subsequent behavior (Buhi and Goodson, 2007). Intentions are strongly related to behavior and will be an important mediator predicting behavior change. | Title V Abstinence Study, PREP, PAF, PPA, TPP Replication Study
Drug and alcohol use and violence | There is a substantial body of literature linking various high-risk behaviors of youth, particularly drug and alcohol use, sexual intercourse, and risky sexual behavior. The effectiveness of various program strategies is expected to differ for youth who are and are not experimenting with or using drugs and alcohol (Tapert et al., 2001; Li et al., 2001; Boyer et al., 1999; Fergusson and Lynskey, 1996; Sen, 2002; Dermen et al., 1998; Santelli et al., 2001). | Title V Abstinence Study, PPA, PREP, PAF, YRBSS, ADD Health, TPP Replication Study
In addition, the follow-up survey instrument has been designed so that only sexually active youth will receive most of these sensitive questions. The survey will ask all youth for background information and will include a screening question about sexual experience. The survey will route youth who report ever having sexual experience to additional questions about sexual behavior; those who report never having sex will be routed to other questions. Thus, many of the sensitive items related to sexual activity will be asked only of sample members who report being sexually active. This structure was approved under the current OMB clearance for this information collection request, and has been used successfully in other federally funded evaluations of teen pregnancy prevention programs, such as the Evaluation of the Title V, Section 510 Abstinence Education Program (OMB Control Numbers 0990-0233 and 0990-0237), the Evaluation of Adolescent Pregnancy Prevention Approaches (PPA-OMB Control Number 0990-0382), the TPP Replication Study (OMB Control Number 0990-0394), and the Personal Responsibility Education Program (PREP) Multi-Component Evaluation (OMB Control Number 0970-0398). As an added protection and to make respondents feel more comfortable answering these sensitive questions, the smartphones will be equipped with privacy screens.
A.12. Estimates of Annualized Hour and Cost Burden
The previously approved burden for the 9-month post-baseline follow-up survey data collection was estimated to be 2,497 hours.
We estimate that through August 2020, a total of 1,998 burden hours will have been used for follow-up data collection, with 499 hours remaining. It is expected that 600 youth will complete follow-up surveys between September 2020 and August 31, 2023, or an estimated 200 respondents per year. Based on previous experience with the follow-up questionnaire, it is estimated that it will take youth 30 minutes (30/60 hour) to complete the follow-up survey. Therefore, the total annualized burden for the follow-up survey during the requested three-year extension period is estimated to be 100 hours (see Table A.12.A).
Table A.12.A. Calculations of Annual Burden Hours and Cost for Youth Participants for the Follow-up Survey

Instrument | Type of respondent | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Annual Burden Hours
1. Follow-up survey | Youth study participants | 200 | 1 | 30/60 | 100
Estimated Annual Burden for Youth Follow-up Survey Participants | | | 1 | | 100
The annualized cost for all participants in the burden hour estimate would be $725.00, using the federal minimum wage. However, based on the ages of sample members at sample intake, we assume that 10 percent of the remaining respondents will be 18 or older at the time of the follow-up survey (200 × 10% = 20 respondents). Therefore, the annual cost of this burden is estimated to be $72.50, or $217.50 for the complete three-year period (see Table A.12.B).
Table A.12.B. Estimated Annualized Burden Costs for Youth Participants for the Follow-up Survey

Instrument | Type of respondent | Total Number of Respondents | Annual Number of Respondents | Number of Responses per Respondent | Average Burden Hours per Response | Annual Burden Hours | Annual Burden Hours for Youth Age 18 or Older | Hourly Wage Rate | Total Annual Costs
1. Follow-up survey | Youth study participants | 600 | 200 | 1 | 30/60 | 100 | 10 | $7.25 | $725.00
Estimated Annual Burden Costs for Youth Follow-up Survey Participants | | | | | | 100 | 10 | | $725.00
NOTE: The burden estimate in the table is for all participants. Based on baseline data, we assume 10% of the sample will be 18 or older at the 9-month survey, for total annual costs of $72.50. The federal minimum wage was used to calculate annual costs (https://www.dol.gov/general/topic/wages/minimumwage).
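For transparency, the following minimal sketch (Python, illustrative only) reproduces the burden and cost arithmetic in Tables A.12.A and A.12.B from the figures stated above.

# Reproduces the burden and cost arithmetic above (illustrative only).
TOTAL_RESPONDENTS = 600      # follow-up surveys, Sept 2020 - Aug 2023
YEARS = 3
MINUTES_PER_SURVEY = 30
FEDERAL_MIN_WAGE = 7.25      # dollars per hour
SHARE_ADULT = 0.10           # assumed share of respondents age 18 or older

annual_respondents = TOTAL_RESPONDENTS // YEARS                      # 200
annual_burden_hours = annual_respondents * MINUTES_PER_SURVEY / 60   # 100.0
cost_all_participants = annual_burden_hours * FEDERAL_MIN_WAGE       # 725.00
annual_adult_hours = annual_burden_hours * SHARE_ADULT               # 10.0
annual_cost_adults = annual_adult_hours * FEDERAL_MIN_WAGE           # 72.50
three_year_cost = annual_cost_adults * YEARS                         # 217.50

print(annual_respondents, annual_burden_hours, cost_all_participants,
      annual_cost_adults, three_year_cost)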
A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers/Capital Costs
These information collection activities do not place any capital cost or cost of maintaining requirements on respondents.
A.14. Annualized Cost to Federal Government
The total cost for data collection for MPC! is estimated at $1,417,405. Annual costs to the federal government will be $236,234 for the data collection (across six years of data collection), with $132,014 specifically for the 9-month follow-up data collection.
A.15. Explanation for Program Changes or Adjustments
OMB gave approval on January 17, 2017 for the impact study baseline data collection and the implementation and fidelity assessment data collections (OMB Control Number 0990-0452), and approval for a revision on August 15, 2017 for the impact study follow-up data collection. OPA now seeks approval for an extension to complete the 9-month follow-up data collection. This extension is necessary to complete the follow-up data collection after enrolling a fourth and final cohort into the study. To increase study power at the cluster level and achieve our original target of approximately 39 clusters, an additional fourth cohort was enrolled into the evaluation in Fall 2019 and Spring 2020. To complete all follow-up activities with these final participants, an extension beyond the current expiration date of August 2020 is needed.
With this ICR, OPA is also describing a modest revision to the study design. The original study design planned for follow-up survey administrations both 9 months and 15 months post baseline. The 15-month follow-up survey is no longer part of the study design, as OPA opted not to fund this additional, optional round of data collection. In addition, six data collections (baseline survey, staff interviews, staff surveys, program attendance data collection, program fidelity checklist, and youth focus groups) included in the original information collection request have already been completed under the current OMB control period, or will be completed in Spring 2020 prior to the current expiration date. Therefore, this requested revision decreases the original burden estimate.
A.16. Plans for Tabulation and Publication and Project Time Schedule
Data from the follow-up survey will be used to estimate the effect of the intervention on the outcomes of interest: both the sexual behavior measures as distal outcomes and the more proximal, mediating variables (knowledge, attitudes, and intentions).
As noted in Section A.2., the primary impact analysis will focus on those who provide follow-up survey data, regardless of their level of participation in the program or whether they completed the baseline survey. Doing so enables the team to conduct a rigorous, intent-to-treat impact analysis that meets the standards of the HHS Teen Pregnancy Prevention Evidence Review. Many baseline measures will be measured again at follow-up; for those with both baseline and follow-up data, the baseline values can be included as covariates in the impact models to improve the precision of impact estimates.
We also plan on conducting exploratory analyses on subgroups defined by baseline measures. These analyses will be considered exploratory, and will not be used as a primary test of the effectiveness of the intervention. Instead, they are intended to help program providers and practitioners understand whether the pattern of the findings for the full sample is similar to or different from trends observed for particular subgroups. We will observe trends for subgroups defined by (1) gender, and (2) sexual experience at baseline.
We acknowledge that statistical power for these exploratory analyses may be insufficient as a result of smaller sample sizes within the subgroups. For that reason, these analyses are not intended as a primary test of the intervention’s effectiveness, but instead as a means to understanding whether the overall pattern of findings is similar to trends observed within and across particular subgroups.
A detailed analysis plan is in Attachment E.
A schedule of the data collection efforts and reporting for the follow-up survey follows.
Spring 2020: Final study enrollment (consent and baseline survey administration)
Fall 2021-Winter 2022: Final follow up instrument administrations
Fall 2022: Reporting
We anticipate publishing a final impact report, which would be available through OPA’s website. Additionally, other reporting may include a study brief and dissemination activities at relevant professional conferences.
A.17. Reason(s) Display of OMB Expiration Date is Inappropriate
All instruments, consent and assent forms and letters will display the OMB Control Number and expiration date.
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions
No exceptions are necessary for this information collection.
SUPPORTING REFERENCES

Boyer, Cherrie B., Jeanne M. Tschann, and Mary-Ann Shafer. “Predictors of Risk for Sexually Transmitted Diseases in Ninth Grade Urban High School Students.” Journal of Adolescent Research, vol. 14, no. 4, 1999, pp. 448–465.
Buhi, Eric R., and Patricia Goodson. “Predictors of Adolescent Sexual Behavior and Intention: A Theory-Guided Systematic Review.” Journal of Adolescent Health, vol. 40, no. 1, 2007, pp. 4–21.
Dermen, K. H., M. L. Cooper, and V. B. Agocha. “Sex-Related Alcohol Expectancies as Moderators of the Relationship Between Alcohol Use and Risky Sex in Adolescents.” Journal of Studies on Alcohol, vol. 59, no. 1, 1998, pp. 71–77.
Fergusson, David M., and Michael T. Lynskey. “Alcohol Misuse and Adolescent Sexual Behaviors and Risk Taking.” Pediatrics, vol. 98, no. 1, 1996, pp. 91–96.
Jemmott, J. B., L. S. Jemmott, and G. T. Fong. “Abstinence and Safer Sex HIV Risk-Reduction Interventions for African American Adolescents: A Randomized Controlled Trial.” Journal of the American Medical Association, vol. 279, no. 19, 1998, pp. 1529–1536.
Li, Xiaoming, Bonita Stanton, Lesley Cottrell, James Burns, Robert Pack, and Linda Kaljee. “Patterns of Initiation of Sex and Drug-Related Activities among Urban Low-Income African-American Adolescents.” Journal of Adolescent Health, vol. 28, no. 1, 2001, pp. 46–54.
Santelli, John S., Leah Robin, Nancy D. Brener, and Richard Lowry. “Timing of Alcohol and Other Drug Use and Sexual Risk Behaviors Among Unmarried Adolescents and Young Adults.” Family Planning Perspectives, vol. 33, no. 5, 2001, pp. 200–205.
Sen, Bisakha. “Does Alcohol Use Increase the Risk of Sexual Intercourse Among Adolescents? Evidence from the NLSY97.” Journal of Health Economics, vol. 21, no. 6, 2002, pp. 1085–1093.
Tapert, Susan F., Gregory A. Aarons, Georganna R. Sedlar, and Sandra A. Brown. “Adolescent Substance Use and Sexual Risk-Taking Behavior.” Journal of Adolescent Health, vol. 28, no. 3, 2001, pp. 181–189.
1 Data collection instruments associated with the implementation and fidelity assessment were approved by OMB on 1/17/2017 (OMB Control No. 0990-0452).
2 To ensure the privacy of survey respondents, we have timed the series of questions for non-sexually active youth to approximate the length of the series for sexually active youth.
4 Berlin, Martha, Leyla Mohadjer, Joseph Waksberg, Andrew Kolstad, Irwin Kirsch, D. Rock, and Kentaro Yamamoto. 1992. An experiment in monetary incentives. In JSM proceedings, 393–98. Alexandria, VA: American Statistical Association.
5 James, Jeannine M., and Richard Bolstein. 1990. The effect of monetary incentives and follow-up mailings on the response rate and response quality in mail surveys. Public Opinion Quarterly 54 (3): 346–61.
6 Singer, Eleanor, and Richard A. Kulka. 2002. Paying respondents for survey participation. In Studies of welfare populations: Data collection and research issues, eds. Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, 105–28. Washington, DC: National Academy Press.
7 Response rates provided are for two school-based sites on PREP.
8 The consent and assent forms were approved by OMB on January 17, 2017 (OMB# 0990-0452).
9 PPA OMB Control Number 0990-0382; PREP OMB Control Number 0970-0398; PAF OMB Control Number 0990-0424; STREAMS OMB Control Number 0970-0481; Centers for Disease Control and Prevention (CDC) YRBSS: 2017 Youth Risk Behavior Surveillance System Questionnaire; the Eunice Kennedy Shriver National Institute of Child Health and Human Development, with co-funding from 17 other federal agencies, National Longitudinal Study of Adolescent to Adult Health (ADD Health); Title V Abstinence Evaluation OMB Control Numbers 0990-0233 and 0990-0237; 2016 Behavioral Risk Factor Surveillance System Questionnaire, sponsored by the Centers for Disease Control and Prevention; and the Teen Pregnancy Prevention (TPP) Replication Study, OMB Control Number 0990-0397.
10 In consultation with OMB, we are retaining on the follow-up survey the version of the gender identity question that was approved on the baseline survey. In future ICRs we will incorporate current OMB guidance for gender identity items in baseline submissions.