Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Sexual Risk Avoidance Education National Evaluation: Nationwide Study of the National Descriptive Study
OMB Information Collection Request
Supporting Statement
Part B
August 2022
Submitted by:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, DC 20201
Project Officers:
Calonie Gray
Megan Hill
Part B
B1. Objectives
Study Objectives
The objective of the Sexual Risk Avoidance Education (SRAE): Nationwide Study (NWS) is to collect survey data from SRAE grant-recipients, program providers, and facilitators about their experiences with delivering their SRAE programs, including implementation successes, challenges, and lessons learned. In addition, the study will seek youth feedback through focus groups on the SRAE programming they have received. Finally, the study will examine the relationships between program implementation features and youth outcomes.
The data collected from the surveys and focus groups will do the following:
Describe grant-recipients’, providers’, and facilitators’ experiences with delivering SRAE curricular content
Describe how grant-recipients and providers interpret, understand, and address the A to F topics outlined in SRAE legislation
Describe youth’s experiences with receiving the SRAE curricular content
Determine if any features of implementation (such as the setting, facilitator’s background and cultural match with youth, and the level of youth interaction prescribed by the curriculum) are more strongly associated with youth outcomes than others
Assess which provider characteristics (such as the number of different funding sources, level of experience implementing SRAE, and funding amount) are associated with a greater number of youth served and positive youth outcomes
Title V State, Competitive, and General Departmental SRAE grant-recipients are required to report performance measures on attendance, reach, and dosage as well as on structure, cost, and support for program implementation. As part of the performance measures, these grant-recipients also administer entry and exit surveys to middle and high school students before and upon completion of the SRAE program. These surveys focus on participant characteristics, pre- and post-program behaviors, perception of program effects, and program experiences.1 This study will leverage these existing data and will not collect any new data covered in the performance measures.
The data collected from the surveys and focus groups will provide the Administration for Children and Families (ACF) with (1) up-to-date information on how grant-recipients nationwide implement their SRAE programming and (2) an exploratory analysis of the implementation and provider features that are associated with program outputs and youth outcomes. The information will inform future ACF support and programming for grant-recipients.
Generalizability of Results
This study will present an internally valid description of the implementation experiences of SRAE grant-recipients and the relationships between program implementation and youth outcomes. It is not designed to support statistical generalization to other grant-recipients or service populations. The information will be used to inform support and programming at the sites from which it was collected.
Appropriateness of Study Design and Methods for Planned Uses
We will use data from surveys and focus groups to address the research questions for the NWS (See Supporting Statement A, Section A2). Data from the surveys will help us better understand program implementation experiences and lessons learned from multiple perspectives—that of the grant-recipients, their sub-recipient program providers, and the frontline facilitators. Qualitative data from the youth focus groups will supplement the survey data and provide further context on implementation through the youth’s perceptions of and reactions to the programming. Using existing performance measures submitted by all SRAE grant-recipients, the study will also analyze the associations between provider and implementation characteristics and program outputs and youth outcomes. The study will not make any causal claims about these relationships. Any written products associated with the study will describe key limitations of the data and analytic approach.
As noted in Supporting Statement Part A, this information is not intended for use as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.
B2. Methods and Design
Target Population
Nationwide grantee survey: We will administer the grantee survey to all SRAE grant-recipients who are not also providers. This group includes primarily the Title V State grantees, and the number of grant-recipients at the time of survey administration is expected to be 40. We anticipate that the primary respondent for the survey will be the project director at the grant-recipient organization, but some topics may be more relevant to other staff members. For instance, a staff member in charge of partner outreach may be in a better position to answer questions on school and community receptivity to the program or on how the partnerships have evolved over time. The primary respondent may delegate sections of the survey to other staff who have more detailed knowledge of the particular topic covered, which is similar to the administration of the prior grantee survey, the Early Implementation Study (EIS) survey.
Nationwide provider survey: The target population is a census of all SRAE providers that bring the grant-funded program directly to youth – a mix of General Departmental Grantees, Title V Competitive grantees, and Title V State sub-recipients—about 500 organizations, based on EIS data and a review of subsequently funded grant-recipient applications. ACF maintains a list of contact information for all grant-recipients and their program provider organizations; the study team will use that information to construct the provider survey frame.
Nationwide facilitator survey: The target population is a census of all facilitators for each provider, for a total of about 2,000 facilitators across the 500 providers (an average of 4 facilitators per provider, which is consistent with information collected by the Sexual Risk Avoidance Education National Evaluation [SRAENE] to date). To construct the facilitator survey frame, we will capture the names and contact information of the facilitators through a question on the provider survey.
Youth focus groups: We will conduct 20 focus groups with youth. We will create a sample by conducting four groups in each of five geographic regions – West, Midwest, Southwest, Southeast, and Northeast. In each region, we will conduct two middle school and two high school focus groups. The selected groups in each region will reflect the different curricula being used in that region. Each focus group will include up to 10 youth.
Sampling
Nationwide grantee survey: This survey will be administered to the full census of grant-recipients who are not also providers, which is necessary to understand how different grant types may affect program implementation and to ensure that the data collected fully represent the variation in grant-recipients. The ACF Family and Youth Services Bureau SRAE Project Officers will provide a current list of all SRAE grant-recipients who are not also providers at the time of survey administration in fall 2022.
Nationwide provider survey: We will gather the names of all providers using data from the EIS survey, the applications of grant-recipients funded since the EIS survey, and the PAS. ACF is developing a list of contact information for these providers. The survey will be administered to the full census of program providers associated with the SRAE grant-recipients. Program providers vary substantially in SRA program delivery, including in content and curriculum taught, setting where youth are served, dosage of content, geographic location, and number of youth served. Currently, there are no up-to-date data at the provider level that ACF could use to draw a sample. A census of providers will prevent the introduction of sampling error and strengthen the credibility of the findings, which is important given the relative nascence of SRAE program funding and the limited existing evidence on similar programs. Because a key objective of the data collection from providers is to estimate the relationship between these features of implementation and youth outcomes, it is critical to capture all dimensions of program provision for all providers.
Nationwide facilitator survey: During the provider survey, we will also ask for the names and contact information of each provider’s facilitators. We will use this information to compile a list of all SRAE facilitators, which will serve as the data collection frame for the facilitator survey. We plan to conduct a census of the SRAE program facilitators to ensure that the facilitator data represent the program providers in the study. We estimate an 80 percent response rate to this survey, with an estimated four facilitators per provider. A full census will eliminate sampling error. As with the provider survey, because program plans vary across the country and there are no precise data on how providers’ plans differ from one another, it is important to survey the census of facilitators to ensure representation across all program implementation experiences.
Youth focus groups: Grant-recipients implement programs throughout the nation, in both middle and high schools, and in both in-school and out-of-school settings. The sampling strategy for the youth focus groups is purposive, seeking variation across variables of interest – such as region and age. We will conduct a total of 20 focus groups: 4 focus groups in each of 5 regions of the U.S., of which 2 groups will be with middle school students and 2 groups with high school students. Groups will be formed to obtain feedback from youth in SRAE programs occurring in schools and out of schools and across a range of curricula. Each group will consist of up to 10 youth being served by a provider at the time of the focus groups, for a total of no more than 200 youth focus group participants. Youth will receive a parental consent form from their program facilitator. All youth younger than age 18 who participate in the focus group must have signed parental consent and youth assent forms (Appendix B). Any youth who are age 18 and older must sign a consent form.
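The purposive design described above works out to 5 regions by 2 school levels by 2 groups each, or 20 groups, with at most 10 youth per group. As a minimal illustrative sketch only (the enumeration below is an assumption about how the design could be laid out, not part of the study’s data collection tools):

```python
from itertools import product

# Enumerate the purposive focus-group design: 5 regions x 2 school levels x 2 groups each.
regions = ["West", "Midwest", "Southwest", "Southeast", "Northeast"]
school_levels = ["middle school", "high school"]

design = [
    {"region": region, "school_level": level, "group_number": number}
    for region, level, number in product(regions, school_levels, (1, 2))
]

# 20 focus groups, each capped at 10 youth, for no more than 200 participants.
print(len(design), "focus groups;", len(design) * 10, "maximum participants")
```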
B3. Design of Data Collection Instruments
Development of Data Collection Instruments
As described below, several topics will be addressed on all three surveys and in the youth focus groups. To fully understand program implementation experiences, it is important to gather perspectives from all levels that bring the SRAE programs to youth—from grant administrators to program supervisors to the facilitators who interact directly with the youth themselves. ACF and Mathematica staff developed the survey instruments and the youth focus group topic guide to address the domains being measured, all of which map directly to the study research questions. During the development of the data collection instruments, the study team surveyed additional sources of information and reviewed existing data collection instruments that could be built upon. Because the research questions are highly specific to implementation of the SRAE grants, the most relevant survey was the EIS Grantee Survey, and two survey items were drawn from that survey. A set of questions for the provider and facilitator surveys was also developed based on questions from the Cross-site Study Data for Improving Implementation Evaluation among Office of Population Affairs (OPA) Teen Pregnancy Prevention (TPP) Grantees to inform National Implementations (IMAGIN) Front Line survey.2 All other items (most of the questions across the instruments) were newly developed to meet the specific objectives of this study. Appendix C: SRAENE NWS Surveys Crosswalk and Research Questions shows each survey question in each instrument, mapped to the research question and domain the item measures, and indicates the item source.
Nationwide grantee survey: The grantee survey will be used to obtain information on grant-recipients’ experiences with delivering their SRAE programs and their understanding and perceptions of the Title V–required Topics A to F.3 Specifically, the survey will ask respondents about (1) the receptivity of their communities and schools to SRAE content; (2) whether they use different content for youth, depending upon their age or other characteristics; (3) modifications to their chosen curricula, supplemental content, information on contraception, target population, setting, and mode of delivery and why they made these modifications; and (4) how the grant-recipient interprets, understands, and implements the A to F content and whether this has changed over time. This instrument uses items developed for the EIS Grantee Survey as well as questions developed to meet the specific research questions that the survey data will address. The instrument will be pre-tested with no more than nine grant-recipient directors prior to fielding.
Nationwide provider survey: With approximately 500 different providers—a mix of General Departmental and Title V Competitive grantees and State sub-recipients—it is critical to obtain direct information about their experiences with delivering the content. The survey will first ask providers about the receptivity of their facilitators, communities, and youth to the curricular content in their SRAE program. Providers will then be asked about the modifications they have made over time to their content, service population, setting, delivery mode, or types of facilitators used to deliver the curricula and, in the case of State sub-recipients, whether they have the flexibility and discretion to adjust programming as needed for the youth. Next, the survey will ask how providers have implemented the A to F topics, including the relative emphasis of each topic within their curriculum and their interpretation and understanding of the topics. Finally, the survey will ask for specific information on features of implementation such as the setting, the primary curriculum and supplemental content provided to youth, and the characteristics of facilitators, including their understanding of co-regulation strategies and predictors of sexual delay. This instrument uses items developed to meet the specific research questions that the survey data will address. A number of the questions are designed to be asked on the grantee survey and the facilitator survey to allow for comparisons by respondent type. The instrument will be pre-tested with no more than nine program provider directors prior to fielding.
Nationwide facilitator survey: The facilitator survey is designed to collect data from those who deliver the SRAE program content directly to youth. The survey will ask about facilitators’ receptivity to the SRAE curricular content they deliver, along with the receptivity of youth and others in the schools and communities they serve. The survey will also ask about the specific content that resonates well with youth and the content that is more challenging for youth, and whether opinions among different groups of youth are similar. The survey will collect detailed information on facilitators’ professional backgrounds and interactions with youth. A number of the questions are designed to be asked on the grantee survey and the provider survey to allow for comparisons by respondent type. The instrument will be pre-tested with no more than nine facilitators prior to fielding.
Youth focus groups: The youth focus group protocol was developed to gain insight into youth’s perceptions, receptivity, and understanding of the SRAE content they receive, across regions, grade levels, settings, and curricula. This will add valuable perspective to the survey data collected from grant-recipients, providers, and facilitators because youth have not yet been represented in the SRAENE data collections to date. During the focus groups, the moderator will ask about youth’s receptivity to the curricular content they have received through the SRAE programming, which content resonates well with them, and which content was more challenging for them. The topics covered in the topic guide are tailored to the research questions that the focus groups are intended to address. The topic guide was developed using other youth focus group topic guides from studies of similar educational programming. The protocol will not be pre-tested prior to fielding.
The data collection instruments are designed to prevent measurement error (see section B4 for information about programming).
B4. Collection of Data and Quality Control
ACF has contracted with Mathematica to conduct the data collections. The three survey instruments—the grantee, provider, and facilitator surveys—will be administered electronically using the web-based survey platform Confirmit. Programming the instruments in Confirmit will ensure that skip patterns are followed correctly—that is, that a specific response to a question routes the respondent to the appropriate follow-up question. Edit checks will also prevent measurement error by ensuring that out-of-range or atypical responses are either disallowed or prompt the respondent to confirm the entry. During data collection, the survey leads will review data frequencies to check for errors and ensure that the instruments correctly capture the data.
All potential survey respondents will receive an email inviting them to participate in the survey and containing a link to the survey. Administering the surveys via the web allows respondents to complete them on their own time using their preferred Internet-connected device (smartphone, tablet, laptop, or desktop computer). Web-based surveys also allow respondents to pause at any time and return with all responses automatically saved, as well as to go back and change responses if desired. Mathematica staff will oversee the survey data collection efforts and monitor response rates throughout the data collection period.
Data collection for the youth focus groups is proposed to occur in person with two members of the evaluation team. For the youth focus group topic guide, measurement error will be prevented by having the moderator probe unclear responses in real time to clarify them and ensure a common understanding. The focus groups will be recorded, and a notetaker will be present at each focus group. Depending on local, state, school, or district requirements due to the COVID-19 pandemic, we will have contingency plans in place to conduct focus groups virtually.
The grantee and provider surveys and the youth focus groups will be administered in fall 2022. The facilitator survey will be administered in winter 2022 through spring 2023. The data collection period for each of the three surveys will be six weeks. The focus groups will take place over 12 weeks.
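As a minimal illustrative sketch of the kind of skip-pattern and edit-check logic described above (the field names and thresholds below are hypothetical, and the production checks will be programmed directly in the Confirmit platform rather than in code like this):

```python
# Minimal sketch of skip-pattern and range edit checks, assuming hypothetical
# field names; the actual checks are configured in the Confirmit platform.

def validate_response(response: dict) -> list[str]:
    """Return soft-error messages for a single survey response."""
    errors = []

    # Skip pattern: the follow-up item applies only if the gate question is "yes".
    if response.get("uses_supplemental_content") != "yes" and response.get("supplemental_content_list"):
        errors.append("Supplemental content was listed, but the gate question was not answered 'yes'.")

    # Range check: out-of-range or atypical values trigger a confirmation prompt.
    n_facilitators = response.get("number_of_facilitators")
    if n_facilitators is not None and not (0 <= n_facilitators <= 50):
        errors.append("Number of facilitators is outside the expected range (0-50); please confirm.")

    return errors


# Example: this hypothetical response would trigger both checks.
print(validate_response({
    "uses_supplemental_content": "no",
    "supplemental_content_list": "Healthy relationships module",
    "number_of_facilitators": 120,
}))
```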
Before each data collection activity, the study’s survey director will train the staff members who will work on the respective data collections. The training will cover topics of data privacy and security, monitoring of data collection and nonresponse follow-up activities, and data cleaning and coding to ensure the consistency and quality of data. Throughout each data collection, the study staff will assess the response rates daily and conduct follow-up to ensure high response rates.
B5. Response Rates and Potential Nonresponse Bias
Response Rates
The surveys will be administered to all SRAE grant-recipients, providers, and facilitators nationwide. We anticipate a 100 percent response rate to the grantee survey because this survey is required as part of the grant agreement. We also expect a 100 percent response rate to the provider survey because many providers are also grantees, and the grantees are required to participate in the information collection. For Title V State sub-recipients, grant-recipients are responsible for ensuring that their providers respond to data collection requests. Our assumptions are based on our experience collecting data specifically from these grant-recipients. For example, on the EIS Grantee Survey conducted in 2020 (OMB Control No. 0970-0530), all but one of 114 grantees responded, for a response rate of 99 percent. On the current Grantee COVID-19 Interviews data collection (OMB Control No. 0970-0531), we have received responses from all but one of 101 grant-recipients, for a current response rate of 99 percent. It will be more challenging for grant-recipients to ensure that all facilitators complete a survey. Facilitators are often teachers, either in a school classroom or as community-based educators, and thus we anticipate that their response rate will be similar to the response rates obtained on other federal studies that survey classroom teachers. For example, the spring 2021 teacher survey for the Impact Evaluation of Departmentalized Instruction, conducted by Mathematica for the U.S. Department of Education (OMB Control No. 1850-0942), obtained a response rate of 83 percent from teachers. Therefore, we expect an 80 percent response rate for the facilitator survey.
The survey topics are highly salient to all respondent groups. The study team will use several strategies to establish good rapport with respondents and utilize the nested nature of each group to gain support for completion of the survey. For example, because the providers work directly with the grant-recipients, the grant-recipients can encourage the providers to complete the survey.
For the youth focus groups, we do not anticipate low participation rates because we plan to conduct them on-site at the program locations the youth normally attend. The focus groups are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion. Response rates for the focus groups will not be calculated or reported.
Nonresponse
As noted, we do not expect nonresponse on the grantee and provider surveys. To encourage a high response rate on the facilitator survey of about 2,000 staff, we propose the use of a token of appreciation, as described in the Supporting Statement Part A. The study team will send an advance email to potential respondents prior to the onset of data collection that explains the NWS, the respective survey, how it relates to the broader goals of SRAENE, and the importance of their participation. Throughout data collection, the study team will examine item nonresponse across all surveys. We will follow up with nonresponders via email reminders to reduce nonresponse.
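To illustrate the response-rate monitoring and reminder process described in this section, the following minimal sketch uses a hypothetical sample frame and column names (not the study’s actual files):

```python
import pandas as pd

# Hypothetical sample frame with one row per invited facilitator.
frame = pd.DataFrame({
    "respondent_id": [101, 102, 103, 104, 105],
    "provider_id":   [1, 1, 2, 2, 3],
    "completed":     [True, False, True, False, False],
})

# Overall and per-provider response rates, reviewed during fielding.
overall_rate = frame["completed"].mean()
rate_by_provider = frame.groupby("provider_id")["completed"].mean()

# Nonrespondents queued for an email reminder.
reminder_list = frame.loc[~frame["completed"], "respondent_id"].tolist()

print(f"Overall response rate: {overall_rate:.0%}")
print(rate_by_provider)
print("Send reminders to:", reminder_list)
```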
To encourage participation, youth will receive a reminder delivered to them in school the day before and on the day of the focus group.
B6. Production of Estimates and Projections
The data will not be used to generate population estimates, for either internal use or dissemination.
B7. Data Handling and Analysis
Data Handling
Grantee, provider, and facilitator surveys
The grantee, provider, and facilitator surveys will be programmed with Mathematica’s Confirmit software. Error messages will be programmed into Confirmit to alert respondents to inconsistencies between data elements, values beyond the expected range, and similar issues. Respondents will have an opportunity to correct such errors before the data are submitted. The use of a web-based survey will eliminate the need for an additional step for data entry, thus minimizing potential errors that may occur during that process.
Once a sufficient number of responses have been received, we will conduct an initial quality check to identify any potential issues with the data. Additional data quality checks will be conducted throughout the study.
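As a minimal sketch of the kind of frequency review and quality check described above (the column names and flagging threshold are hypothetical and shown for illustration only):

```python
import pandas as pd

# Hypothetical extract of submitted provider-survey responses.
responses = pd.DataFrame({
    "setting": ["in-school", "in-school", "out-of-school", "in-school", None],
    "youth_served": [120, 85, 4000, 60, 75],
})

# Frequency review: distribution of a categorical item, including missing values.
print(responses["setting"].value_counts(dropna=False))

# Flag atypical numeric values for follow-up (the threshold is illustrative only).
flagged = responses[responses["youth_served"] > 1000]
print("Responses flagged for review:")
print(flagged)
```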
Youth focus groups
The youth focus groups will be selected for variation across a number of variables, including geographic region, age (middle versus high school), setting (in-school versus out-of-school), and SRAE program curricula. The findings from the youth study are not intended to be generalizable to, or representative of, a broader target population. The study team will save recordings from the youth focus groups on a secure drive accessible only to Mathematica study team members. For each focus group, one member of the study team will generate a transcript, and another will review the transcript for accuracy and clarity. The transcripts will also be saved on a secure drive accessible only to the Mathematica study team members. The recordings will be saved only until transcribed. The transcriptions will be securely destroyed following U.S. Department of Health and Human Services and National Institute of Standards and Technology guidance.
Data Analysis
Grantee, provider, and facilitator surveys
To better understand the characteristics and implementation experiences of grant-recipients, providers, and facilitators, we will compute simple summary statistics for relevant survey items. To assess the associations between provider and implementation characteristics and program outputs and youth outcomes, we will link the survey data with existing grant-recipient performance measures and youth exit survey data. We will estimate correlations between the identified provider and implementation characteristics of interest and the outputs and outcomes. We will also estimate regression models using these data. In the regressions, the implementation and provider characteristics (measured through the facilitator and provider surveys) will be the independent variables, and the performance measures and youth exit survey outcomes will be the dependent variables. This analysis will not be causal in nature; rather, it is an exploratory analysis that will highlight the associations between various implementation and provider characteristics and youth outcomes. We will identify the characteristics with the strongest associations with youth outcomes. To protect the anonymity of respondents, we will not report results for cells with fewer than five respondents.
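As a minimal sketch of the exploratory, non-causal analysis described above, assuming hypothetical variable names and a toy analytic file (the actual model specifications will depend on the final linked data):

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical provider-level analytic file linking survey measures to
# aggregated youth exit-survey outcomes; all names and values are illustrative.
df = pd.DataFrame({
    "outcome_intent_to_delay": [0.62, 0.71, 0.55, 0.80, 0.67, 0.74],
    "facilitator_experience_yrs": [2, 5, 1, 8, 4, 6],
    "in_school_setting": [1, 0, 1, 1, 0, 1],
    "n_youth_respondents": [40, 3, 55, 120, 60, 35],
})

# Suppress cells with fewer than five respondents before any analysis or reporting.
reportable = df[df["n_youth_respondents"] >= 5]

# Pairwise correlation between one implementation feature and one outcome.
print(reportable["facilitator_experience_yrs"].corr(reportable["outcome_intent_to_delay"]))

# Exploratory regression: implementation features as independent variables,
# the youth outcome as the dependent variable (associational, not causal).
model = smf.ols(
    "outcome_intent_to_delay ~ facilitator_experience_yrs + in_school_setting",
    data=reportable,
).fit()
print(model.summary())
```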
Youth focus groups
Qualitative data from the focus groups will provide a rich, in-depth source of information about how youth experience SRAE programming across regions, ages, curricula, and settings, all of which will inform ACF’s understanding of program implementation directly from the voices of the intended recipients. The focus group transcripts will be reviewed for overarching themes voiced by the youth respondents. The study team will develop a coding scheme based on the research objectives and focus group topics. The team will apply the coding scheme to each transcript.
The instrument lead for the focus groups will monitor coding and thematic analysis across the team to ensure accuracy and consistency. During the initial stages of coding, the members of the study team will jointly review the focus group transcripts and code the data. To ensure reliability across coders, each study team member will independently code a transcript. The team will then meet to compare the codes applied to the transcript to identify and resolve discrepancies. The team will continue this process until consistency in the application of codes across coders is achieved.
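The plan does not specify a particular agreement statistic; as an illustrative assumption only, intercoder consistency could be quantified with percent agreement and Cohen’s kappa, as in the following sketch with hypothetical codes:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes applied by two coders to the same ten transcript excerpts.
coder_a = ["receptivity", "challenge", "receptivity", "setting", "challenge",
           "receptivity", "setting", "challenge", "receptivity", "setting"]
coder_b = ["receptivity", "challenge", "challenge", "setting", "challenge",
           "receptivity", "setting", "receptivity", "receptivity", "setting"]

# Percent agreement and Cohen's kappa; coding rounds would continue until
# agreement reaches an acceptable level.
agreement = sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)
kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Percent agreement: {agreement:.0%}; Cohen's kappa: {kappa:.2f}")
```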
Data Use
The contractor team will develop an internal report for ACF that outlines the findings from the surveys and focus groups. Additional dissemination products will also be generated for public use, such as briefs, interim reports, and fact sheets with infographics that share specific themes emerging from the analysis. For example, products could focus on lessons learned from providing SRA programming to different populations of youth; ACF is particularly interested in how the programming is provided to and received by youth in out-of-home care settings. We will also be able to compile best-practice recommendations for grantees, such as approaches for integrating the A to F topics required by the SRAE legislation into programming. The SRAE legislation also requires a report to Congress, and ACF will use the reports and data collected from this study as one component of that report.
B8. Contact Persons
In Table B.1, we list the federal and contract staff responsible for the study, including their affiliation and email address.
Table B.1. Individuals Responsible for Study
Name | Affiliation | Email address
Calonie Gray | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services | Calonie.Gray@acf.hhs.gov
Megan Hill | Family and Youth Services Bureau, Administration for Children and Families, U.S. Department of Health and Human Services | Megan.Hill@acf.hhs.gov
Tia Brown | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services | Tia.Brown@acf.hhs.gov
Susan Zief | Mathematica | SZief@mathematica-mpr.com
Hande Inanc | Mathematica | HInanc@mathematica-mpr.com
Betsy Keating | Mathematica | EKeating@mathematica-mpr.com
Tiffany Waits | Mathematica | TWaits@mathematica-mpr.com
Stacie Feldman | Mathematica | SFeldman@mathematica-mpr.com
Erin Boyle | Mathematica | EBoyle@mathematica-mpr.com
Attachments
Appendices
Appendix A: Study Notifications and Reminders
Appendix B: Parent Consent and Youth Assent for Youth Focus Groups
Appendix C: SRAENE NWS Surveys Crosswalk and Research Questions
Instruments
Instrument 1: Nationwide Study Grantee Survey
Instrument 2: Nationwide Study Provider Survey
Instrument 3: Nationwide Study Facilitator Survey
Instrument 4: SRAE Program Youth Focus Group Protocol
1 OMB Control Number 0970-0536.
2 OMB Control Number 0990-0469.
3 The Title V Competitive SRAE Program was authorized and funded by Section 510 of the Social Security Act (42 U.S.C. § 710), as amended by Section 50502 of the Bipartisan Budget Act of 2018 (Pub. L. No. 115-123) and extended by the CARES Act of 2020 (Pub. L. No. 116-136). See https://www.ssa.gov/OP_Home/ssact/title05/0510.htm.