
Alternative Supporting Statement for Information Collections Designed for Research, Public Health Surveillance, and Program Evaluation Purposes



Measuring Self- and Co-Regulation in Sexual Risk Avoidance Education Programs



Pre-testing of Evaluation Data Collection Activities


0970-0355





Supporting Statement

Part B

August 2024


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Calonie Gray

Tia Brown

Kathleen McCoy

MeGan Hill

Nakia Martin-Wright


Part B


B1. Objectives

Study Objectives

The proposed pre-test data collection supports the development and testing of a new survey instrument, the Youth Self-Assessment Survey, which measures youth’s understanding of Sexual Risk Avoidance Education (SRAE) topics and their perceptions of their own self-regulation skills and of co-regulation in their SRAE classroom. This data collection will further ACF’s learning agenda on self- and co-regulation by supporting ACF’s goal of prioritizing the development of a reliable and valid survey instrument for youth. New survey measures are critical to ACF’s self- and co-regulation learning agenda1 and will contribute to the body of knowledge on ACF program design and intended outcomes.

Generalizability of Results

The proposed pre-test is intended to inform instrument development by producing descriptive findings about the reliability and validity of a survey instrument for the focus population. It is not designed to promote statistical generalization to other sites or service populations.

Appropriateness of Study Design and Methods for Planned Uses

Data collected under this generic clearance will be used to inform the development of a new survey instrument, which, if found to be valid and reliable through pre-testing, will further ACF’s learning agenda. This study’s three-phase pre-testing approach will allow us to understand different aspects of the survey necessary for validity and reliability. Cognitive interviews (Phase 1, Instrument 2: Cognitive Interview Protocol and Instrument 3, version A: Youth Self-Assessment Survey) will explore how youth interpret the items on the Youth Self-Assessment Survey, including how they understand and operationalize self- and co-regulation and related topics in their daily lives. This phase will include teens recruited through a market research vendor, who test new content and engagement strategies for the vendor, and teens who have participated in an SRAE class. A one-time administration of the survey with a large group of youth (Phase 2, Instrument 3, version B: Youth Self-Assessment Survey) will allow us to assess the psychometrics of the survey with a diverse sample.

Phase 3 (Instrument 3, version C: Youth Self-Assessment Survey) will be a pre-post survey administration that will enable us to (1) test the survey instrument in a classroom using an SRAE curriculum with co-regulation facilitation strategies, (2) evaluate the sensitivity of the survey to detect change between the two data collection points, and (3) assess the validity and reliability of the items specifically as they relate to their intended use. This design and data collection approach is appropriate for ensuring we have the data necessary to assess the items piloted, and that ultimately the instrument will provide high-quality data for its intended uses.

Because this is an initial pilot study, the data collected will not be representative and cannot be used to directly assess student outcomes. This information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information. Limitations will be described in written products associated with this pilot study.


B2. Methods and Design

Target Population

The target population for this generic information collection request is youth in grades 9 through 12.

Sampling and Site Selection

For phase 1, the study team will conduct up to 10 small-group cognitive interviews, with each group comprising 3-4 youth. To maximize participation and the ability to engage a diverse sample, cognitive interviewees will be recruited by a market research vendor. The vendor manages a non-probability-based panel of members across the U.S. from various geographical areas and with a wide range of demographic characteristics. Selecting study participants through the vendor allows for highly targeted sampling that is efficient and cost-effective. The contractor will also work directly with FYSB to conduct outreach to youth who have participated in an SRAE program.

For youth recruited through the market research vendor, the vendor will first send an email to potential groups of participants (Appendix A: Pretest Phase 1: Outreach to participants) to inform them of this study and link them to the eligibility screener (Instrument 1: Youth Screener for Cognitive Interviews and Pilot Survey). The vendor will share the de-identified screener responses with the study team, who will carefully review the screener data and conduct purposive, non-probability-based sampling to ensure that participants are selected based on key criteria (that is, they must be in grades 9-12 and have participated in a sexual education and/or healthy relationship class within the current or prior school year). We will also use the screener to ensure that a diverse range of demographic characteristics is represented in each of the small discussion groups, including age/grade level, gender, race, and ethnicity.
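To illustrate the selection step, a minimal sketch of the purposive, non-probability-based selection from de-identified screener data appears below. The file name, column names, and per-cell target are hypothetical and shown only to make the procedure concrete; the actual selection will follow the criteria described above.

```python
import pandas as pd

# Hypothetical de-identified screener extract shared by the vendor.
screener = pd.read_csv("screener_responses.csv")

# Eligibility: grades 9-12 and participation in a sexual education and/or
# healthy relationship class within the current or prior school year.
eligible = screener[
    screener["grade"].between(9, 12)
    & screener["recent_sex_ed_or_healthy_rel_class"].eq(True)
]

# Purposive selection: draw from each demographic cell so the discussion
# groups vary on grade level, gender, race, and ethnicity.
cells = eligible.groupby(["grade", "gender", "race", "ethnicity"], group_keys=False)
selected = cells.apply(lambda cell: cell.sample(n=min(len(cell), 1), random_state=7))

print(f"{len(selected)} youth selected from {len(eligible)} eligible screeners")
```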

For youth who have participated in an SRAE program, the contractor will work with FYSB to share the opportunity to review and provide feedback on materials and instruments.

For phase 2, the study team will again work with the market research vendor to recruit a diverse sample of up to 350 youth using the vendor’s extensive panel. The vendor will follow the same initial recruitment procedures described under phase 1, including sending an email to potential groups of participants (Appendix B: Pretest Phase 2: Outreach to participants) inviting them to take a study eligibility screener (Instrument 1: Youth Screener for Cognitive Interviews and Pilot Survey). The vendor will share the de-identified screener data with the study team. The study team will then conduct a purposive, non-probability-based selection of study participants to ensure diversity across geographic regions of the U.S. and on key individual demographic characteristics: age/grade level, gender, race, and ethnicity.2

For both phase 1 and phase 2, the vendor will email parents/guardians of eligible youth a description of the study and the planned use of the data, and will confirm their consent prior to emailing study information to youth (Appendix C: Participant consent).

For phase 3, the study team will revise the Youth Self-Assessment Survey (Instrument 3, version B) and administer the revised survey (Instrument 3, version C: Youth Self-Assessment Survey) to youth who participate in ACF-funded SRAE programs run by up to three grant recipients. Eligible SRAE grant recipients are those implementing the Love Notes curriculum in classroom settings with facilitators who are using co-regulation strategies. ACF will select up to three grant recipients for this pre-test study based on the following criteria: (1) the program serves large numbers of youth during spring 2025; (2) existing procedures for administering the SRAE program performance measures allow for the addition of a new survey (that is, the survey undergoing pre-testing); and (3) parental consent is already in place and/or the program is exempt from collecting consent. Where active parental consent or youth assent is needed, the study team will work with the grant recipient and its IRB to ensure all specific consent requirements are met.

All youth participating in the programs offered by the three selected grant recipients will be eligible to complete the survey, provided they assent and that the consent requirements of the site and its IRB are met. We anticipate that the youth program participants will be high school students between the ages of 14 and 19. Program participants involved in data collection will be from a convenience sample. Consequently, they may not be representative of the population that all SRAE grant recipients serve.

We expect that across the three selected grant recipients, approximately 250 youth will be invited to participate in phase 3 of the pre-test. We expect up to about 20 percent attrition by the post-test, for a total of approximately 200 youth completing both the pre- and post-test.


B3. Design of Data Collection Instruments

Development of Data Collection Instruments

The Youth Self-Assessment Survey (Instrument 3) focuses on youth perceptions of self- and co-regulation and their knowledge of the skills and concepts taught in the SRAE program. The survey includes five areas: (1) knowledge of SRAE topics, (2) perceptions of the classroom environment, (3) perceptions of the facilitator-youth relationship, (4) perceptions of using self-regulation skills, and (5) knowledge or awareness of self-regulation skills reinforced through the co-regulation strategies.

These topic areas were selected because they cover the components of programming that may influence youth’s outcomes in the areas of self-regulation and sexual risk avoidance. Strong SRAE program facilitation that includes supporting co-regulation in the classroom can influence how youth engage with the SRAE content and how much they are able to benefit from the program.3

To develop items for the first three topic areas (knowledge of SRAE topics, the classroom environment, and the facilitator-youth relationship), we drew on items from existing survey instruments (such as the Learning Alliance Inventory, the National Center on Safe Supportive Learning Environments compendium, measures used under the Self-Regulation Training Approaches and Resources to Improve Staff Capacity for Implementing Healthy Marriage Programs for Youth, and the Core Components Evaluation of Real Essentials Advanced for the Office of Population Affairs). These existing instruments have been used to measure these topic areas across similar research studies and with similar populations.

The items that ask about youth perceptions of their own self-regulation skills (topic area 4) are based on the survey developed in the initial pilot for this work. To develop that survey, the study team conducted a measures scan to identify existing measures of youth self-regulation and consulted substantive experts for their input on the existing measures and the areas of focus for this survey. An initial set of items was pilot tested in summer 2023 under a separate generic Information Collection Request approved under this umbrella generic in April 2023. Based on the results of this pilot, the study team further refined the self-regulation questions on the Youth Self-Assessment Survey to focus on youth’s knowledge, perceptions, and confidence in using self-regulation skills rather than how often they use those skills. Given that many of these programs occur over a short period, items tapping youth’s awareness of self-regulation skills and perceptions of their ability to use them may be more sensitive to change.

We also expanded the items about self-regulation skills to ask about how often youth use specific skills that are reinforced by facilitators through the co-regulation strategies used in the classroom (topic area 5). For example, we included items that ask about how often youth use breathing exercises to refocus in times of arousal or dysregulation.


B4. Collection of Data and Quality Control

ACF is contracting with Mathematica for this data collection, and Mathematica will oversee all data collection and ensure quality control measures are in place and followed for each phase of the pre-test study. To ensure an efficient and standardized data collection process for each phase, Mathematica study staff will participate in a project-specific data collection training. The training will cover the expectations and process for working with the market research vendor as well as all aspects of the study data collection. Because phase 3 data collection will occur in the winter of 2025, a separate training will be held to cover the pre-post data collection process. All staff will be trained on best practices for collecting high-quality data and on procedures for data privacy and security.

Participant screening for phases 1 and 2. To recruit participants for phases 1 and 2 of the pre-test, Mathematica will work closely with the market research vendor. The vendor will conduct outreach to its panel members and administer a short, 5-minute eligibility screener via a web survey. This screener (Instrument 1: Youth Screener for Cognitive Interviews and Pilot Survey) is designed to allow the Mathematica study team to purposively select study participants based on key criteria, ensuring all participants are eligible for the study, that participants come from geographic regions across the U.S., and that variation exists in their age/grade level, gender, race, and ethnicity. The data collection process will be monitored to maintain high standards of participant selection and to ensure that the study participants reflect the variation needed for the pre-testing analysis.

Phase 1: Cognitive Interviews. For phase 1 data collection, Mathematica will collect the data by moderating all small-group cognitive interviews, following a cognitive interview protocol (Instrument 2: Cognitive Interview Protocol). These groups may be held in person or virtually. If conducted in person, Mathematica will send experienced cognitive interviewers to the market research vendor’s focus group facilities in up to two geographic locations. Groups conducted virtually will be held on a virtual platform such as WebEx or Zoom, and all participants, including the experienced cognitive interviewers, will log into the platform. We will conduct the cognitive interview groups with up to 32 participants total, split across up to 10 groups. During each cognitive interview discussion group, the participants will begin by taking the survey, followed by a guided discussion using the protocol. The in-person groups will take the survey on paper, and the virtual groups will take a web-based version that they will access via a link provided in the chat box. A notetaker from Mathematica will take live notes, and the sessions will be audio recorded to ensure accurate collection of data. Following each discussion group, the moderator and notetaker will debrief on the session together. The notetaker will finalize the notes, which the moderator will review for accuracy and completeness.

Phase 2: Youth Self-Assessment Survey: Single Administration. For phase 2, Mathematica will again work closely with the market research vendor on study participant recruitment, using the same screener instrument used in phase 1, the Youth Screener for Cognitive Interviews and Pilot Survey (Instrument 1). The vendor employs staff who are experienced recruiters with extensive experience appropriately obtaining parental consent before engaging youth in a study. After youth are recruited, deemed eligible, and purposively selected by Mathematica to participate in the survey (Instrument 3, version B: Youth Self-Assessment Survey), the vendor will email youth a link to access their survey. We plan to complete 350 surveys during a 5-week data collection period. Mathematica will closely monitor the survey data, reviewing completed cases daily to assess quality and completeness. This continuous monitoring will help the study team immediately identify and address any issues that may arise.

Phase 3: Youth Self-Assessment Survey: Pre-Post Survey Data Collection. For phase 3, Mathematica staff will work with up to three selected grant recipients to coordinate the distribution and collection of the surveys (Instrument 3, version C: Youth Self-Assessment Survey) at each of the two administration periods (pre-program implementation and post-program). Depending on the preferences of the grant recipients, the program facilitators may distribute and collect the surveys, or Mathematica study team members may go to the site to support the collection. The mode of data collection may be paper or web, again depending on the preferences of the grant recipient. The study team will immediately review collected data to determine whether quality issues exist (for example, whether the number of completed surveys matches the number of youth in the classroom expected to complete it, and whether the surveys contain little item-level missing data). Senior project staff will monitor completion rates daily during data collection. The study team will work closely with facilitators and with study team members going to the site to address data collection issues as they arise. To increase data quality, facilitators will ask students who are absent during either data collection to take the survey at the next session they attend, within a specified time frame. As noted, the study team will aim to survey up to 250 youth participants for phase 3, but actual numbers may differ depending on how many youth the selected programs serve. We will not exceed the burden requested.
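As a concrete illustration of the daily quality checks described above, the sketch below compares completed phase 3 surveys against a classroom roster to compute completion rates; the file names and column names are hypothetical.

```python
import pandas as pd

# Hypothetical extracts: one row per youth on the roster, one row per completed survey.
roster = pd.read_csv("classroom_roster.csv")      # columns: class_id, youth_id
completed = pd.read_csv("completed_surveys.csv")  # columns: class_id, youth_id, item responses

# Expected versus received completions, by classroom.
expected = roster.groupby("class_id")["youth_id"].nunique()
received = completed.groupby("class_id")["youth_id"].nunique()
completion_rate = (received.reindex(expected.index, fill_value=0) / expected).rename("completion_rate")

# Classrooms falling behind can be flagged for follow-up with facilitators.
print(completion_rate.sort_values())
```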

Table B.1 lists all data collection activities proposed for this pre-test study.

Table B.1. Data collection activities

Data collection | Total participants | Mode | Time | Frequency
Participant screener for Phase 1 and Phase 2 | Up to 450 | Web-based survey | 5 minutes | 1
Phase 1. Cognitive interviews | Up to 32 | In-person or virtual discussion groups | 90 minutes | 1
Phase 2. Youth survey, single administration | Up to 350 | Web-based | 10 minutes | 1
Phase 3. Youth survey, pre- and post-administration | Up to 250 | Paper-based or web-based | 10 minutes | 2


All study team members will receive project-specific training in addition to the standard human subjects research training all team members undergo. This will help ensure that all data collected from youth across the three pre-test phases are gathered consistently and are of high quality. The study team will meet weekly or biweekly during each data collection period to discuss progress and troubleshoot issues as they arise.





B5. Site/Respondent Selection

Response Rates

For phases 1 and 2 of the pre-test study, respondents will be purposively selected from a pool of eligible respondents (with eligibility determined by Instrument 1: Youth Screener for Cognitive Interviews and Pilot Survey). During phase 2, which involves a single administration of the Youth Self-Assessment Survey (Instrument 3, version B), we will track survey completions and monitor the incoming data for quality. In phase 3, which involves pre- and post-program data collection in classrooms, we expect that most youth will have consent and agree to take the survey (Instrument 3, version C: Youth Self-Assessment Survey). The survey will be administered after the performance measures entrance and exit tickets4, a standard component of their SRAE program, unless the site asks for it to be administered at a different time to ease burden. We will track the total number of youth in each classroom to determine the expected number of completions at both the pre- and post-program data collection stages.

Non-Response

As participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with the data collection.

During phase 2, the study team will examine item non-response as an indicator of the sensitivity and age appropriateness of survey questions. This information will be used in revisions of questions across the phases of the pre-test, with a goal of minimizing item non-response in the final Youth Self-Assessment Survey instrument.
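A minimal sketch of this item non-response check is shown below, assuming the phase 2 responses are in a data frame with one column per survey item; the file name, item-column prefix, and 10 percent flag threshold are illustrative only.

```python
import pandas as pd

# Hypothetical phase 2 response file with one column per survey item (q1, q2, ...).
responses = pd.read_csv("phase2_survey_responses.csv")

# Share of respondents skipping each item.
item_nonresponse = responses.filter(like="q").isna().mean()

# Items with unusually high skip rates become candidates for revision.
flagged = item_nonresponse[item_nonresponse > 0.10].sort_values(ascending=False)
print(flagged)
```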

During phase 3, we will monitor the total number of expected completes based on each classroom roster. We will work with the onsite facilitator to determine which students are missing either the pre- or post-program Youth Self-Assessment Survey, with the goal of having all students complete the survey at both points. However, we anticipate that up to 20 percent of youth will attrit and not appear in the final study data. We will closely monitor completes at the classroom level during data collection.


B6. Production of Estimates and Projections

The goal of collecting information across the three pre-test phases is to support ACF’s priority of developing a reliable and valid survey instrument that measures youth’s perceptions of their self-regulation skills and of co-regulation in the classroom, thereby furthering ACF’s learning agenda. During phase 1 (cognitive interviewing), the data collected for this survey development phase will document (1) whether the questions are appropriate for high school–age youth participating in SRAE programs where facilitators are using co-regulation strategies in classroom settings, (2) whether respondents can interpret the questions, and (3) how long the survey takes to complete. During phase 2 (single survey administration), the study team will examine the reliability and validity of the items. During phase 3 (pre- and post-program testing), the study team will assess whether the survey is sensitive to change between the pre-program and post-program data collections.

The data will not be used to generate population estimates, either for internal use or for dissemination. Policy decisions will not be made based on the data collected for this pre-test study, as the data are not representative.


B7. Data Handling and Analysis

Data Handling

No personally identifiable information will be collected during this study. Data handling will vary for each phase of the pre-test. All data will be destroyed at the end of the study.

  • Data from the recruitment screener for phases 1 and 2 will be collected by the market research vendor through their secure platform. They will share the de-identified data with Mathematica weekly, using a secure transfer site administered by the vendor’s information technology team.

  • For the phase 1 cognitive interview groups, Mathematica will collect data, audio record the group discussions and take live notes. All data will reside in Mathematica’s secure and restricted folder that only assigned study team members can access.

  • During phase 2, the single survey administration, Mathematica will collect data using a secure, federally approved data collection platform. Surveys administered to the panel occur at one point in time and will not need any information to link responses over time. The vendor will provide de-identified demographic information on each survey respondent. These data will be linked with the survey data using identification numbers, in preparation for analysis.

  • Data collected during phase 3 will include paper surveys pre-labeled with an identification number. The surveys also include a set of survey questions for matching pre- and post-surveys. The surveys will be distributed and collected in the classroom, with paper copies sent securely to Mathematica for data entry and secure storage. If a web survey is necessary at a site, data handling will follow the same secure process described for phase 2.
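To illustrate the linking steps above, the sketch below merges vendor-provided demographic data with phase 2 survey responses by identification number and pairs phase 3 pre- and post-program surveys the same way; the file and column names are hypothetical.

```python
import pandas as pd

# Phase 2: attach de-identified demographics to survey responses by ID.
survey = pd.read_csv("phase2_responses.csv")           # hypothetical; includes youth_id
demographics = pd.read_csv("phase2_demographics.csv")  # hypothetical; includes youth_id
analysis_file = survey.merge(demographics, on="youth_id", how="left", validate="one_to_one")

# Phase 3: pair pre- and post-program surveys on the pre-labeled ID.
pre = pd.read_csv("phase3_pre.csv")
post = pd.read_csv("phase3_post.csv")
matched = pre.merge(post, on="youth_id", suffixes=("_pre", "_post"), how="inner")
print(f"{len(matched)} youth with both a pre- and a post-program survey")
```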

Data Analysis

This project will not employ complex data analytic techniques.

Following phase 1 of data collection, the study team will analyze the cognitive interview data qualitatively, using the feedback from the participants to assess and improve the clarity and understandability of the survey questions and response options. This will include coding the interview notes to identify recurring themes or patterns across the questions, as well as comments that may indicate potential issues with a particular survey question. The analysis will focus on understanding the context and reasons behind any challenges in understanding or answering survey questions, and will determine what can be done to improve the clarity, relevance, and meaningfulness of each survey question.

For the phase 2 single-administration survey data, which are expected to yield 350 completes, the study team will conduct standard descriptive analyses, including measures of central tendency and ranges; exploratory factor analysis to discover whether particular items group together conceptually; and standard psychometric tests of validity and reliability (for example, Cronbach’s alpha).
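A minimal sketch of these analyses is shown below: item-level descriptive statistics, a rough factor-count check based on eigenvalues of the item correlation matrix, and Cronbach’s alpha computed from its standard formula. The file and column names are hypothetical, and a full exploratory factor analysis would use a dedicated statistical package.

```python
import numpy as np
import pandas as pd

# Hypothetical phase 2 responses; keep only the survey item columns (q1, q2, ...).
items = pd.read_csv("phase2_responses.csv").filter(like="q").dropna()

# Descriptive statistics: central tendency and range for each item.
print(items.agg(["mean", "median", "std", "min", "max"]).T)

# Rough factor-count check: eigenvalues of the item correlation matrix
# greater than 1 suggest how many factors to explore.
eigenvalues = np.linalg.eigvalsh(items.corr().to_numpy())
print("Eigenvalues greater than 1:", int((eigenvalues > 1).sum()))

# Cronbach's alpha for a hypothesized scale (all items here, for illustration).
k = items.shape[1]
alpha = (k / (k - 1)) * (1 - items.var(ddof=1).sum() / items.sum(axis=1).var(ddof=1))
print(f"Cronbach's alpha: {alpha:.2f}")
```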

For the phase 3 data, which are expected to yield 200 post-test completes, we will examine whether individuals’ post-test scores can be predicted from their pre-test scores to determine how related the pre- and post-test scores are. We will also calculate the effect size of the paired differences (using paired t-tests) to determine whether there are significant changes from youth’s pre-test responses to their post-test responses.
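The sketch below illustrates these pre-post analyses under the assumption that matched pre- and post-test scale scores are available in a single data frame; the file and column names are hypothetical.

```python
import pandas as pd
from scipy import stats

# Hypothetical matched file with one row per youth completing both surveys.
matched = pd.read_csv("phase3_matched_scores.csv")  # columns: score_pre, score_post
pre, post = matched["score_pre"], matched["score_post"]

# How well do pre-test scores predict post-test scores?
regression = stats.linregress(pre, post)
print(f"slope = {regression.slope:.2f}, r = {regression.rvalue:.2f}")

# Paired t-test and a standardized effect size for the paired differences.
t_test = stats.ttest_rel(post, pre)
differences = post - pre
cohens_d = differences.mean() / differences.std(ddof=1)
print(f"t = {t_test.statistic:.2f}, p = {t_test.pvalue:.3f}, d = {cohens_d:.2f}")
```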

Data Use

Findings will be used to assess whether the Youth Self-Assessment Survey instrument is understandable to youth, provides valid and reliable data, and can detect change between pre- and post-program implementation on five domains of interest to ACF. Analyses will assess the preliminary effectiveness of the instrument at measuring youth’s self-assessment of self-regulation skills before and after participating in an SRAE program where facilitators are using co-regulation strategies in a classroom setting. The findings from this pre-test study will inform the development of the final Youth Self-Assessment Survey, which will be used to support ACF’s learning agenda on self- and co-regulation and may be used in future ACF evaluations. The study team will develop an internal ACF memorandum describing the pre-test methodology and results from each of the three pre-test phases, including changes made to the measures and justification for the changes based on each phase, and suggestions for further use.

B8. Contact Persons

Table B.2 lists the federal and contract staff responsible for the study, including their affiliation and email address.

Table B.2. Staff responsible for study

Name | Affiliation | Email address
Calonie Gray | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services | Calonie.Gray@acf.hhs.gov
MeGan Hill | Family and Youth Services Bureau, Administration for Children and Families, U.S. Department of Health and Human Services | Megan.Hill@acf.hhs.gov
Tia Brown | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services | Tia.Brown@acf.hhs.gov
Nakia Martin-Wright | Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services | Nakia.Martin-Wright@acf.hhs.gov
Heather Zaveri | Mathematica | HZaveri@mathematica-mpr.com
Elizabeth Cavadel | Mathematica | ECavadel@mathematica-mpr.com
Avery Hennigar | Mathematica | AHennigar@mathematica-mpr.com
Melissa Thomas | Mathematica | MThomas@mathematica-mpr.com
Jennifer Walzer | Mathematica | JWalzer@mathematica-mpr.com




Attachments

Instruments

Instrument 1: Youth Screener for Cognitive Interviews and Pilot Survey

Instrument 2: Cognitive Interview Protocol

Instrument 3: Youth Self-Assessment Survey


Appendices

Appendix A: Pretest Phase 1: Outreach to participants

Appendix B: Pretest Phase 2: Outreach to participants

Appendix C: Participant consent



1 McKenzie, K.J., Meyer, A., and OPRE Self-Regulation Learning Agenda Team. “Co-Regulation and Connection in Human Services: Developing a Learning Agenda.” U.S. Department of Health and Human Services, Administration for Children and Families, Office of Planning, Research and Evaluation. https://www.acf.hhs.gov/opre/blog/2022/03/co-regulation-connection-human-services-developing-learning-agenda.

2 Diversity of region and characteristics will ensure the sample includes respondents from a variety of backgrounds, but the sample will not be representative, and subgroups will likely not be large enough for rigorous subgroup testing.

3 Tingey, L., R. Piatt, A. Hennigar, C. O’Callahan, S. Weaver, and H. Zaveri. (2023). The Sexual Risk Avoidance Education National Evaluation: Using Co-regulation in Youth Programs. OPRE Report 2023-281, Washington, DC: Office of Planning, Research and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.


4 Surveys approved under OMB Control Number 0970-0536, expiration date 1/31/2025.


