Alternative Supporting Statement Instructions for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Sexual Risk Avoidance Education National Evaluation: Nationwide Study of the National Descriptive Study
OMB Information Collection Request
Supporting Statement
Part A
August 2022
Submitted by:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, DC 20201
Project Officers:
Calonie Gray
Megan Hill
Part A
Executive Summary
Type of Request: This information collection request is for a new information collection. We are requesting one year of approval.
Progress to Date: The Nationwide Study (NWS) is part of the Sexual Risk Avoidance Education National Evaluation (SRAENE) National Descriptive Study (NDS). It builds upon the findings from the NDS Early Implementation Study (EIS), which took place in 2020 (OMB No. 0970-0530).
Description of Request: The purpose of SRAENE is to provide information on the design and implementation of Sexual Risk Avoidance Education (SRAE) programs, the effectiveness of program components, and the ways grant recipients can use data and evidence to improve SRAE programming. SRAENE involves three main parts (two sub-studies and the provision of technical assistance): (1) the NDS of SRAE program implementation and youth outcomes, (2) a Program Components Impact Study to identify effective SRAE program components (OMB information collection request forthcoming), and (3) data capacity building and local evaluation support for SRAE grant recipients (feedback surveys approved through generic clearances). This request is specific to the NDS; the goal is to learn about program implementation experiences and outcomes of the SRAE grant program. The data collection would include conducting one-time surveys of SRAE grant recipients, program providers, and program facilitators, as well as holding focus groups with youth from various SRAE programs. Taken together, these data will be used to understand program implementation experiences and to examine relationships between program implementation and outcomes. The data collected for the NDS are not intended to be generalized to a broader population. We do not intend for this information to be used as the principal basis for public policy decisions.
Timing: SRAE programming is typically conducted during the school year; thus, the goal is to collect information between fall 2022 and spring 2023.
A1. Study Background
As part of the federal government’s efforts to support youth in making healthy decisions about their relationships and behaviors, Congress reauthorized Title V, Section 510 of the Social Security Act in February 2018,1 funding the Sexual Risk Avoidance Education (SRAE) grant program. SRAE, administered by the Family and Youth Services Bureau (FYSB) within the Administration for Children and Families (ACF) of the U.S. Department of Health and Human Services, funds programs that teach adolescents to refrain from sexual activity. The SRAE programs also provide education on personal responsibility, self-regulation, goal setting, healthy relationships, a focus on the future, and preventing drug and alcohol use. SRAE replaces the Title V, Section 510 Abstinence Education grant program, which Congress had passed as part of welfare reform in the mid-1990s. In 2018, the Sexual Risk Avoidance Education National Evaluation (SRAENE) began, with the goals of conducting a comprehensive and rigorous evaluation of the design and implementation of the SRAE programs and improving SRAE programming. SRAENE comprises two sub-studies, the National Descriptive Study (NDS) and the Program Components Impact Study.2 SRAENE also includes a third element, which is not a study: the provision of Data and Evaluation Support technical assistance.
This information collection request focuses on a sub-study under the NDS, the Nationwide Study (NWS), and seeks approval to collect survey and focus group data between fall 2022 and spring 2023 for the NWS to learn about program implementation experiences and outcomes of the SRAE grant program. The data collected through the NDS will be used to describe the implementation of the SRAE grant program nationally, and will also be included in ACF’s required report to Congress.
ACF is undertaking the collection at the discretion of the agency. The ACF Office of Planning, Research, and Evaluation (OPRE) contracted with Mathematica to conduct the NDS as a component of SRAENE.
A2. Purpose
The NDS is the centerpiece of the SRAE National Evaluation and has two sub-studies: the Early Implementation Study (EIS)3 and the NWS.
Sub-Study #1: Early Implementation Study: The EIS described the programs’ structure, the context in which the programs were designed, and how they prepared for implementation—including their use of sub-recipients and other program partners; their plans for training, technical assistance, and monitoring; and their targeted geographic areas, populations, and settings. The EIS identified the inputs and characteristics of program implementation, including program messages and plans to address Topics A to F in the SRAE legislation.
Sub-Study #2: Nationwide Study: The NWS, the focus of this request, builds on the EIS and allows for a deeper dive into program implementation experiences and outcomes of the SRAE grant program.
The purpose of the NWS is to conduct a more detailed, mixed-methods study of program implementation and youth outcomes and to perform analyses to identify promising approaches to program implementation. The NWS will build on the EIS in two ways. First, the NWS will collect detailed information on grant-recipient program implementation experiences, through surveys of grant recipients, their SRAE program providers, and the program facilitators, and through focus groups with the program recipients themselves, the youth. Second, the NWS will make use of extant data from grant-recipient performance measures on program outputs and outcomes. Combined with data on program implementation, the NWS will examine associations among implementation, outputs, and outcomes.
The information collected is meant to contribute to the body of knowledge on ACF programs, and the data will be used in ACF’s required report to Congress. Improved knowledge on a specific component of SRAE program implementation, program facilitation strategies, will not only be useful for ACF’s reporting on the SRAE program to Congress, but will also be critical to understanding how to implement programming in a way that best serves the youth participants. Data collected can be used by ACF to improve specific and focused elements of the SRAE program, cycling the information learned back to the grantees, providers, and facilitators in the form of webinars and direct technical assistance. Additional dissemination products will also be generated for public use, such as briefs, interim reports, and fact sheets with infographics that share specific themes emerging from the analysis. The information collected is not intended to be used as the principal basis for a decision by a federal decision-maker and is not expected to meet the threshold of influential or highly influential scientific information.
Research Questions
ACF proposes to examine the following guiding research questions for this study:
What are grant-recipients’ and providers’ experiences with delivering SRAE curricular content? What are youth’s experiences with receiving the SRAE curricular content?
How did grant recipients and providers interpret, understand, and address the A to F topics in the SRAE legislation?
Are some features of implementation more strongly associated with youth outcomes than others?
What provider characteristics are associated with a greater number of youth served and with youth outcomes?
Study Design
The NWS will gather data from SRAE grant recipients who are not providers (primarily the State SRAE grantees); program provider organizations, which are a mix of grantees (General Departmental and Competitive) and State SRAE sub-recipients; the program facilitators who work directly with youth; and the youth program participants, to gain an understanding of program implementation and how youth perceive the SRAE information taught to them. The study also entails use of existing data sources, namely the SRAE Performance Analysis Study (PAS) Performance Measures4 that SRAE grant recipients are required to report. These measures include structure, cost, and support for implementation; attendance, reach, and dosage; and participant characteristics, behaviors, perceptions of program effects, and program experiences. The study team will use these measures to analyze program outputs and youth outcomes in relation to program implementation.
The grantee survey focuses on the experiences of grant recipients (who are not also providers) with managing and overseeing the delivery of SRAE curricular content, providing a high-level perspective on grant implementation and structure. The provider survey captures a different perspective, through the lens of program providers and their experiences with delivering the content and working directly with facilitators. Both data collections, which will be conducted via the web, will survey the full census of (1) grant recipients who are not providers and (2) SRAE program providers, who are a mix of General Departmental and Competitive grant recipients and State sub-recipients. Section B2 of Supporting Statement Part B provides further detail on the survey target populations. For both surveys, the respective directors will first be contacted about the study via an advance email (see Appendix A, Study Notifications and Reminders) that describes the purpose of the survey and the importance of their participation. This will be followed by a survey invitation email containing the link to their web survey (Appendix A).
The facilitator survey provides the on-the-ground perspective of implementation of the program, including interaction and engagement with the intended program recipients, the youth, which neither the grantee nor the provider surveys can accurately offer. The sample for the facilitator survey will be constructed through a series of survey questions on the provider survey. The web-based survey will focus on their direct implementation experiences, including their receptivity to the SRAE content and their perceptions of student receptivity and engagement. Facilitators will receive an advance email notification and the survey invitation (Appendix A).
The study team also plans to conduct focus groups of youth ages 12 to 18 in five regions of the U.S. We plan to conduct up to four focus groups per region (two for middle schoolers and two for high schoolers) across a diversity of SRAE curricula, with up to ten youth per group. Table A.1 provides the purpose and content for these groups. The focus groups will take place in person, on-site at the program sites. The in-person format is critical for ensuring that focus group participation does not vary demographically due to differences in student digital technologies and online access.5,6 Conducting the groups in person avoids the potential exclusion of youth who lack personal devices such as laptop computers or tablets with cameras, have limited data-use plans, or rely on internet bandwidth and service that may be unreliable in rural settings. Focus group engagement and interaction, critical for procuring verbal data from youth participants, will be more effective in person than remotely. A recent survey shows socioeconomic differences in student self-reported engagement with remote learning,7 and data from the current SRAENE Grantee COVID-19 Interviews (OMB Control No. 0970-0531) consistently show SRAE program grant directors reporting that remote learning did not provide the same level of youth engagement in the programming as in-person delivery. A virtual video conference option will be available, depending on the COVID-19 safety measures that might be in place at sites. Supporting Statement Part B, Section B2 further describes the study’s methods, design, and sample.
Table A.1 includes each of the data collections by instrument, participant, content, purpose, and mode and duration of the data collection. To fully understand program implementation experiences, it is important to gather the perspectives from all levels that bring the SRAE programs to youth—from grant administrators to program supervisors to the facilitators who interact directly with the youth themselves.
Table A.1. Study design summary
| Instruments | Participant, content, purpose of collection | Mode and duration |
| --- | --- | --- |
| Instrument 1: NWS Grantee Survey | Respondents: Grant directors who are not program providers. Purpose and content: Collect data to examine grant recipients’ experiences with delivering SRAE curricular content through their sub-recipient providers and the changes they may have made to improve the delivery of this content. Focuses on topics including receptivity of communities and schools to SRAE content; whether grant recipients use different content for youth based on age; modifications to the grant-recipients’ original plans regarding chosen curricula, supplemental content, information on contraception, target population, setting, and mode of delivery; why grant recipients made modifications to plans; how grant recipients interpret, understand, and implement the A–F content, and whether this has changed over time. | Mode: Web-based survey. Duration: 10 minutes |
| Instrument 2: NWS Provider Survey | Respondents: Provider organization director. Purpose and content: Collect data from providers (a mix of General Departmental grantees, Competitive grantees, and State sub-recipients) who work directly with facilitators to implement the SRAE curricula. Focuses on topics including experiences with delivering SRAE content; receptivity of youth; modifications made to content, target population, setting, delivery mode, or type of facilitator over time; provider flexibility to adjust programming; implementation of the A–F topics; program implementation features (setting, primary and supplemental curricula, facilitator characteristics). | Mode: Web-based survey. Duration: 45 minutes |
| Instrument 3: NWS Facilitator Survey | Respondents: Program facilitators working directly with youth. Purpose and content: Collect data on program implementation from program facilitators who work most directly with youth and communities. Focuses on topics including program implementation and receptivity of facilitators and youth participants to SRAE program content. | Mode: Web-based survey. Duration: 45 minutes |
| Instrument 4: NWS Youth Focus Group Topic Guide | Respondents: Program participants ages 12–18. Purpose and content: Obtain the perspectives of program recipients. Focuses on topics including youth receptivity to SRAE program content, setting, and delivery. | Mode: In person or virtual |
NWS = Nationwide Study; SRAE = Sexual Risk Avoidance Education.
Other Data Sources and Uses of Information
As noted in A2, the NWS builds upon the EIS (OMB Control No. 0970-0530: SRAE NDS-EIS). The NWS also will use existing data on performance measures that SRAE grant recipients are required to report (OMB Control No. 0970-0536: SRAE PAS Performance Measures). The study team will use these measures to analyze program outputs and youth outcomes in relation to program implementation.
A3. Use of Information Technology to Reduce Burden
The study team plans to use information technology wherever possible. The surveys will be available via web and can be completed using a tablet, smartphone, desktop computer, or laptop. The study team plans to conduct youth focus groups in person but is prepared to conduct them remotely—via videoconference if necessary—because of the COVID-19 pandemic.
A4. Use of Existing Data: Efforts to Reduce Duplication, Minimize Burden, and Increase Utility and Government Efficiency
ACF has carefully reviewed the information collection requirements; no other federal or nonfederal studies are collecting the same type of data requested here, which is specifically focused on learning about program implementation experiences and outcomes of the SRAE grant program. Through our review, we have identified relevant questions from other surveys to tailor for this study (see Supporting Statement Part B, Section B3 for more information). Those studies do not provide the same information as this proposed data collection.
A5. Impact on Small Businesses
We expect most of the respondents in the study to be employees of small, nonprofit organizations. The study team will only request information required for the intended use. The burden for respondents will be minimized by restricting the survey length to the minimum required and by using a web survey platform that allows participants to complete the survey at a time most convenient to them and automatically saves their responses; survey participants do not have to complete all questions at one time. Burden is also reduced because record keeping on the part of the programs is not required.
A6. Consequences of Less Frequent Collection
This is a one-time data collection.
A7. Now subsumed under 2(b) above and 10 (below)
A8. Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on May 26, 2022 (87 FR 32030) and provided a 60-day period for public comment. During the notice and comment period, no comments were received.
We will not be seeking consultation from experts outside of the study.
A9. Tokens of Appreciation
Tokens of appreciation are planned for one of the four data collections: the facilitator survey. To obtain representation of facilitators and help ensure a high survey completion rate, we will send facilitators a $10 electronic gift card as a pre-paid token of appreciation, followed by a $30 post-paid gift card after survey completion. The tokens of appreciation are an important part of a broader plan for addressing and mitigating nonresponse bias (see Supporting Statement Part B for details on other strategies that will be used to increase survey response and reduce nonresponse bias). Unlike the grant directors or the sub-recipient program provider directors, the facilitators have not previously had a direct connection to SRAENE activities; this will be the first time the study team is in contact with them. Also, unlike the grant and program directors, the facilitators are not required to participate in the survey as part of their role as sexual health educators (grant-recipient participation in the survey is a condition of their grant). Tokens of appreciation have consistently been found to increase response rates and reduce nonresponse bias for studies that are similar in mode (web-based)8 and population (teachers)9 to this proposed research study.
The use of pre-paid tokens of appreciation, in particular, has been shown to be more effective than post-paid tokens alone across a variety of respondents, with one study using regression modeling within a meta-analysis finding that prepaid tokens of appreciation “have the largest per-dollar impact on responsiveness among a variety of facets (including the use of promised incentives).”10,11 This approach was recently approved and used on the Implementation and Cost of High Quality Early Care and Education study (ECE-ICHQ) as part of its strategy to reduce nonresponse bias due to low response rates for classroom teaching staff (OMB Control No. 0970-0499). That study conducted two experiments to test the effectiveness of prepaying for survey participation. Though the study’s results have not yet been published, preliminary results have been shared with ACF. In the first experiment, administrators and teachers were assigned to two groups: one group received a $10 prepay followed by a $10 post-pay after completing a 30-minute survey, and the second group received only a $20 post-pay (no prepay). The group receiving the prepay had a 20-point increase in its response rate compared with the post-pay-only group (81 vs. 61 percent). The second experiment used a higher token of appreciation because the survey was 45 minutes, which is the estimated time for our facilitator survey. The second experiment, conducted only with teachers (who are similar to the facilitators in our study), tested two prepay/post-pay amounts: (1) $10 prepay/$40 post-pay and (2) $25 prepay/$25 post-pay. The group that received the $10 prepay and the higher $40 post-pay had a higher response rate (93 percent) than the group that received the $25 prepay and $25 post-pay (87 percent).
Our proposed approach for using tokens of appreciation to achieve high response rates from the SRAE facilitators follows the strategy used in that experiment: offering a $10 prepay followed by a higher $30 post-pay upon survey completion.
There will not be a token of appreciation for the grantee or provider surveys or for the youth focus groups: grant-recipient program directors are required to complete the survey as a condition of their grant; program provider directors are expected to participate based on their role in the SRAE grant; and the youth will already be at their program, with the focus group taking place during an SRAE class period.
A10. Privacy: Procedures to Protect Privacy of Information, While Maximizing Data Sharing
Personally Identifiable Information
This data collection effort will collect personally identifiable information (PII) from providers on facilitators (names, work email addresses, and telephone numbers), which is information needed from the providers in order to contact the facilitators for the survey. For youth focus groups, the youth’s name and parent signature will be obtained on the parent consent form and the youth’s name and their own signature will be collected on the youth assent form—both of which are necessary to obtain consent to participate in data collection activities.
Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.
Assurances of Privacy
All study participants in the facilitator survey and youth focus groups will be informed of the planned uses of data, that their participation is voluntary, and that the study team will keep their information private to the extent permitted by law. Participants in the grantee and provider surveys will be provided the same assurances of privacy; however, because grant-recipient participation in the survey is a condition of their grant, they will be informed that the grantee’s participation is required but that they may designate any knowledgeable individual to complete the survey, and that they may choose not to respond to specific questions. Grant recipients are also required to ensure that their providers respond to the survey.
The study team will discuss issues of privacy during training sessions with staff who work on the project. The contractor, Mathematica, requires that staff complete online security awareness training when they are hired and then participate in annual refresher training thereafter. Training topics include the security policies and procedures outlined in the Mathematica Corporate Security Manual. Any transfer of records between Mathematica and ACF will occur using a secure file transfer protocol site in case the files contain PII. As specified in the contract, Mathematica will protect respondents’ privacy to the extent permitted by law and will comply with all federal and departmental regulations for private information. In addition, the study leaders at Mathematica will conduct project-specific trainings for all staff who work on the study to communicate the expectations on privacy, informed consent, and data security procedures.
Parent consent and youth assent forms (Appendix B) inform parents and youth that the youth are invited to participate in focus groups and that participation is voluntary, that the information requested from them is for program improvement purposes only, and that their identities will not be disclosed to anyone outside the study team. With participants’ permission, the focus groups will be recorded. Participants will be assured that their recorded comments will be saved only until transcribed and that the transcription summaries will not reveal their identities. All youth (and their parent or legal guardian) must read and acknowledge the form before participating in the data collection. The study will be reviewed by Mathematica’s institutional review board (IRB), the Health Media Lab. Outreach and data collection will not begin until IRB approval has been received.
Data Security and Monitoring
As specified in the contract, the contractor shall protect respondent privacy to the extent permitted by law and will comply with all federal and departmental regulations for private information. The contractor has developed a Data Security Plan that assesses all protections of respondents’ PII. The contractor will ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor who perform work under this contract and subcontract receive training on data privacy issues and comply with the above requirements. All Mathematica staff must sign an agreement to (1) maintain the privacy of any information from individuals, businesses, organizations, or families participating in any projects conducted by Mathematica; (2) complete online security awareness training when they are hired; and (3) participate in a refresher training annually.
As specified in the evaluator’s contract, the contractor will use encryption compliant with the Federal Information Processing Standard (Security Requirements for Cryptographic Module, as amended) to protect all sensitive information during storage and transmission. The contractor will securely generate and manage encryption keys to prevent unauthorized decryption of information, in accordance with the Federal Information Processing Standard. The contractor will incorporate this standard into its property management and control system and establish a procedure to account for all laptop and desktop computers and other mobile devices and portable media that store or process sensitive information. The contractor will secure any data stored electronically in accordance with the most current National Institute of Standards and Technology requirements and other applicable federal and departmental regulations. In addition, the contractor’s data safety and monitoring plan includes strategies for handling sensitive information on paper records and for protecting any paper records, field notes, or other documents that contain sensitive information to ensure secure storage and limits on access.
No information will be given to anyone outside the SRAENE study team and ACF. All PII, typed notes, and audio recordings will be stored on restricted, encrypted folders on Mathematica’s network, which is accessible only to the study team. All data collected for the project, including recordings, will be securely destroyed following the U.S. Department of Health and Human Services and the National Institute of Standards and Technology guidance.
A11. Sensitive Information
There are no sensitive questions in the grantee, provider, or facilitator surveys (Instruments 1, 2, and 3). Although the youth focus group protocol (Instrument 4) also contains no sensitive questions, the groups will be conducted during an SRAE class period, so focus group facilitators will be trained to handle any sensitive issues, such as sexual behavior, that youth may raise on their own. An IRB will review all instruments and related materials, such as the consent and assent forms.
Before the start of the focus groups, participants and their parents will read the consent and assent forms (Appendix B), which describe the purpose of the study, explain their rights as research participants, and acknowledge their consent before participating. Participants will be informed that they do not have to respond to any questions that make them uncomfortable and that their participation is voluntary. The moderator will also state this at the start of the focus group to ensure that all participants understand their rights, as well as the voluntary nature of their participation and responses to questions.
A12. Burden
Explanation of Burden Estimates
In Table A.2 we summarize the estimated reporting burden and costs for each instrument. The survey estimates include time for respondents to review the instructions, complete and review their responses, and transmit their responses. The focus group time includes time for participants to review the instructions and participate in an in-person or virtual focus group. The study team expects the total annual burden to be 1,748 hours for all of the instruments in this information collection request. Figures are estimated as follows:
Nationwide Study Grantee Survey. The survey will be administered to all SRAE grant-recipient directors who are not also providers, primarily the State SRAE grantees (N = 40). Because participation in data collection activities is a requirement for receipt of the grant, we anticipate a response from all 40 grantees. The survey is estimated to take 10 minutes to complete via web.
Nationwide Study Provider Survey. All SRAE program providers will be asked to complete this 45-minute, web-based survey. There are approximately 500 providers, including a mix of State sub-recipients and General Departmental and Competitive grantees. We assume a 100 percent response rate, for a total of 500 completed surveys.
Nationwide Study Facilitator Survey. The study team plans to conduct a survey with all SRAE program facilitators. We estimate there are, on average, 4 facilitators per SRAE program. The facilitator contact information will be obtained from the 500 completed provider surveys, for a total sample of 2,000 facilitators (500 completed provider surveys * 4 facilitators per provider = 2,000 facilitators). We anticipate that 80 percent of the facilitators will complete the 45-minute survey via web, for a total of 1,600 completed facilitator surveys (2,000 * 0.8 = 1,600).
SRAE Program Youth Focus Groups. Teen recipients of SRAE programming who have provided assent and have parental consent to participate will be selected for focus groups lasting no more than 45 minutes per group. We will conduct a total of 20 youth focus groups: four groups in each of five geographic regions of the U.S. (4 groups * 5 regions = 20 groups). Within each region, two groups will be held with middle school students and two with high school students. Groups will vary by the SRAE curricula in use and by setting (in school, out of school, etc.), but the questions asked will be the same for all groups, curricula, and settings. Each focus group will have up to 10 students, for a total of 200 students (5 regions * 4 groups * 10 students = 200 total students).
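The sample-size arithmetic stated in this section can be double-checked with a short script. This is an illustrative sketch only, not part of the information collection; all figures are those given above.

```python
# Illustrative check of the sample-size arithmetic stated in A12.
providers = 500                  # completed provider surveys
facilitators_per_provider = 4    # average facilitators per SRAE program
facilitator_sample = providers * facilitators_per_provider
facilitator_completes = round(facilitator_sample * 0.80)  # 80% expected response rate

regions = 5
groups_per_region = 4            # 2 middle school + 2 high school groups
youth_per_group = 10
focus_groups = regions * groups_per_region
youth_participants = focus_groups * youth_per_group

print(facilitator_sample, facilitator_completes)   # 2000 1600
print(focus_groups, youth_participants)            # 20 200
```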
Estimated Annualized Cost to Respondents
The study team expects the total annual cost to be $47,163 for all instruments in the current information collection request. The Occupational Employment Statistics (2021)12 from the Bureau of Labor Statistics have been used to estimate the average hourly wage for the participants of this study and derive total annual costs. For each instrument listed in Table A.2, the study team calculated the total annual cost by multiplying the annual burden hours by the average hourly wage, as follows:
The mean hourly wage of $43.70 for social scientists and related workers (Occupational Code 19-3099) was used for grant-recipient directors and program provider directors who complete their respective surveys.
The mean hourly wage of $23.36 for educational instruction and library workers, all other, at the elementary and secondary school level (Occupational Code 25-9099) was used for the program facilitators who complete the facilitator survey.
The average hourly wage for high school–age youth was estimated at $14.68. The hourly wage was based on median weekly earnings of $587 for youth ages 16 to 19 who work a 40-hour workweek.15
The estimated burden results appear in Table A.2.
Table A.2. Total burden requested under this information collection
Instrument | No. of participants (total over request period) | No. of responses per participant (total over request period) | Avg. burden per response (hours) | Total/annual burden (hours) | Avg. hourly wage rate | Total annual participant cost
Instrument 1. Nationwide Study Grantee Survey | 40 | 1 | 0.17 | 7 | $43.70 | $306
Instrument 2. Nationwide Study Provider Survey | 500 | 1 | 0.75 | 375 | $43.70 | $16,388
Instrument 3. Nationwide Study Facilitator Survey | 1,600 | 1 | 0.75 | 1,200 | $23.36 | $28,032
Instrument 4. SRAE Program Youth Focus Group Topic Guide | 200 | 1 | 0.83a | 166 | $14.68 | $2,437
Estimated total annual burden | | | | 1,748 | | $47,163
SRAE = Sexual Risk Avoidance Education.
a Average burden per response includes 5 minutes to complete the consent and assent forms.
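The arithmetic behind Table A.2 can be checked directly: for each instrument, total burden hours are the number of participants multiplied by the average burden per response (rounded to whole hours), and the participant cost is the burden hours multiplied by the hourly wage. A minimal illustrative sketch, using only the figures stated above:

```python
# Illustrative check of the Table A.2 burden and cost arithmetic.
# (participants, avg. burden hours per response, hourly wage) per instrument,
# taken from the figures stated in the text above.
instruments = {
    "Grantee Survey":     (40,   0.17, 43.70),
    "Provider Survey":    (500,  0.75, 43.70),
    "Facilitator Survey": (1600, 0.75, 23.36),
    "Youth Focus Groups": (200,  0.83, 14.68),
}

total_hours = 0
total_cost = 0
for name, (n, hours_per_response, wage) in instruments.items():
    burden_hours = round(n * hours_per_response)  # one response per participant
    cost = round(burden_hours * wage)             # annual cost per instrument
    total_hours += burden_hours
    total_cost += cost
    print(f"{name}: {burden_hours} hours, ${cost:,}")

print(f"Total: {total_hours:,} hours, ${total_cost:,}")
```

Running this reproduces the per-instrument rows (e.g., 7 hours and $306 for the grantee survey) and the totals of 1,748 burden hours and $47,163 in annual participant cost.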
A13. Costs
There are no additional costs to respondents.
A14. Estimated Annualized Costs to the Federal Government
The estimated total cost to the federal government for this study is $1,046,000 (Table A.3). This includes the costs incurred for administering all collection instruments, processing and analyzing the data, and preparing reports.
Table A.3. Estimated total cost by category
Cost category | Estimated costs
Fieldwork | $600,000
Analysis and summary brief | $446,000
Total/annual costs over the request period | $1,046,000
A15. Reasons for Changes in Burden
This is a new information collection request.
A16. Timeline
Table A.4 contains the timeline for data collection, analysis, and reporting activities for the NWS. Pending OMB approval, the study team expects to collect data from fall 2022 through winter 2022, conduct analysis in spring and summer 2023, and complete reporting in summer 2023.
Table A.4. Schedule for Nationwide Study data collection and reporting
Activity | Timinga
Data collection |
Grantee survey | Fall 2022
Provider survey | Fall 2022
Facilitator survey | Winter 2022
Youth focus groups | Fall 2022
Analysis |
Key indicators | Spring 2023
Data tables | Spring 2023
Reporting |
Report | Summer 2023
Data files and documentation | Summer 2023
a After obtaining OMB approval.
A17. Exceptions
No exceptions are necessary for this information collection.
Attachments
Appendices
Appendix A: Study Notifications and Reminders
Appendix B: Parent Consent and Youth Assent for Youth Focus Groups
Appendix C: SRAENE NWS Surveys Crosswalk and Research Questions
Instruments
Instrument 1: Nationwide Study Grantee Survey
Instrument 2: Nationwide Study Provider Survey
Instrument 3: Nationwide Study Facilitator Survey
Instrument 4: SRAE Program Youth Focus Group Protocol
1 The Title V Competitive SRAE Program was authorized and funded by Section 510 of the Social Security Act (42 U.S.C. § 710), as amended by Section 50502 of the Bipartisan Budget Act of 2018 (Pub. L. No. 115-123) and extended by the CARES Act of 2020 (Pub. L. No. 116-136). See https://www.ssa.gov/OP_Home/ssact/title05/0510.htm.
2 An information collection request related to this study will be submitted separately.
3 The EIS was approved and conducted under OMB #0970-0530.
4 SRAE PAS Performance Measures are approved and conducted under OMB #0970-0536.
5 Domina, T., Renzulli, L., Murray, B., Garza, A.N., and Perez, L. 2021. Predicting Successful Engagement with Online Learning during COVID-19. Socius, vol. 7, pp. 1–15.
6 Vigdor, J.L., Ladd, H.F., and Martinez, E. 2014. Scaling the digital divide: Home computer technology and student achievement. Economic Inquiry, 52(3), pp. 1103–1119.
7 Barnum, M. and Bryan, C. 2020. America’s great remote-learning experiment: What surveys of teachers and parents tell us about how it went. Chalkbeat.
8 Dillman, Don A. Mail and Internet Surveys: The Tailored Design, second edition, 2007 update. Hoboken, NJ: John Wiley, 2007. ISBN: 0-470-03856-x.
9 Robbins, Michael, and Jennifer Hawes-Dawson. “The Effect of Incentives and Mode of Contact on the Recruitment of Teachers into Survey Panels.” Survey Practice, vol. 12, no. 1, November 18, 2020.
10 Singer, Eleanor, and Cong Ye. “The Use and Effects of Incentives in Surveys.” The ANNALS of the American Academy of Political and Social Science, vol. 645, no. 1, January 2013, pp. 112–141.
11 Mercer, Andrew, Andrew Caporaso, David Cantor, and Reanne Townsend. “How Much Gets You How Much? Monetary Incentives and Response Rates in Household Surveys.” Public Opinion Quarterly, vol. 79, no. 1, Spring 2015, pp. 105–129.
12 U.S. Bureau of Labor Statistics. “May 2021 National Occupational Employment and Wage Estimates.” Available at https://www.bls.gov/oes/current/oes_nat.htm.
15 U.S. Bureau of Labor Statistics. “Median usual weekly earnings of full-time wage and salary workers by age, race, Hispanic or Latino ethnicity, and sex, first quarter 2022 averages, not seasonally adjusted.” Table 3, 2022 Q1 results, bls.gov.
Author: Susan Zief
File Created: 2022-09-08