SS Part B-National YRBS - 4.17.2024

[NCCDPHP] 2025 and 2027 NATIONAL YOUTH RISK BEHAVIOR SURVEY

OMB: 0920-0493


SUPPORTING STATEMENT FOR THE

2025 and 2027 NATIONAL YOUTH RISK BEHAVIOR SURVEY



Reinstatement With Change: OMB No. 0920-0493, expiration 11/30/2023



PART B

















Submitted by:

Nancy Brener, PhD, Health Scientist

Division of Adolescent and School Health

National Center for Chronic Disease Prevention and Health Promotion
4770 Buford Highway, Mailstop S107-6

Atlanta, GA 30341

Phone: 404-718-8133

Email: nad1@cdc.gov

Centers for Disease Control and Prevention
Department of Health and Human Services

[SUBMISSION DATE HERE]



TABLE OF CONTENTS





LIST OF ATTACHMENTS

  A. Authorizing Legislation

  B. Justification

  C. 60-Day Federal Register Notice

  D. 60-Day Federal Register Notice Comment(s)

  E. Rationale for Survey Questions

  F. Expert Reviewers for the 1989 Consultations

  G. Report on the 2018 YRBS External Peer Review

  H. Permission Form Tracking Log

H1. Permission Form Tracking Log for the national YRBS

H2. Permission Form Tracking Log for the Validation Study

  I. Survey Administrator Script

I1. Survey Administrator Script for the national YRBS

I2. Survey Administrator Script for the Validation Study

  J. Parental Permission Forms and Supplemental Documents

J1. Parental Permission Form and Fact Sheet for the national YRBS (English Version)

J2. Parental Permission Form and Fact Sheet for the national YRBS (Spanish Version)

J3. Parental Permission Form Distribution Script for the national YRBS

J4. Parental Permission Form Reminder Notice for the national YRBS (English Version)

J5. Parental Permission Form Reminder Notice for the national YRBS (Spanish Version)

J6. Parental Permission Form and Fact Sheet for the Validation Study

J7. Parental Permission Form Distribution Script for the Validation Study

  K. IRB Approval Letters

  L. Questionnaire

L1. Youth Risk Behavior Survey Questionnaire for the national YRBS

L2. Dietary Behavior Questionnaire for the Validation Study

L3. 24-Hour Dietary Recall Interview for the Validation Study

  M. Recruitment Scripts for the Youth Risk Behavior Survey

M1. State-level Recruitment Script for the national YRBS

M2. District-level Recruitment Script for the national YRBS

M3. School-level Recruitment Script for the national YRBS

M4. School-level Recruitment Script for the Validation Study

  N. Example Table Shells

  O. Sampling and Weighting Plan for the national YRBS

  P. Permission Form Tracking Log Supplemental Documents

P1. Letter to Teachers in Participating Schools for the national YRBS

P2. Make-up List and Instructions for the national YRBS

  Q. Letters of Invitation

Q1. Letter of Invitation to States for the national YRBS

Q2. Letter of Invitation to School Districts for the national YRBS

Q3. Letter of Invitation to School Administrators for the national YRBS

Q4. YRBS Fact Sheet for Schools for the national YRBS

Q5. Letter to Agreeing Schools for the national YRBS

Q6. Letter of Invitation for the Validation Study


B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS

National YRBS

The universe for the national YRBS will consist of all regular public and private school students in grades 9, 10, 11, and 12 in the 50 states and the District of Columbia.

The sampling frame for schools combines data files obtained from MDR Inc. (Market Data Retrieval, Inc.) and from the National Center for Education Statistics (NCES). The MDR file contains school information including enrollments, grades, race distributions within the school, district, and county, and other contact information for public and non-public schools across the nation. The NCES file includes the Common Core of Data (CCD) for public schools and the Private School Survey (PSS) for non-public schools. When combining data sources to form a sampling frame, duplicates are eliminated so that each school is represented once on the final frame.

Table B-1 displays the current distribution of schools nationally by urban status and type of school.

Table B-1

Distribution of Schools Nationally by Urban Status and School Type

For each urban status, the first row shows the number of schools and the second row shows the percentage of all schools nationally.

Urban Status       Public   Private  Catholic     Total
City-Large           2835       739       236      3810
                   10.04%     2.62%     0.84%    13.49%
City-Midsize         1165       333        90      1588
                    4.12%     1.18%     0.32%     5.62%
City-Small           1400       363       116      1879
                    4.96%     1.29%     0.41%     6.65%
Suburb-Large         4676      1355       279      6310
                   16.55%     4.80%     0.99%    22.34%
Suburb-Midsize        666       146        16       828
                    2.36%     0.52%     0.06%     2.93%
Suburb-Small          409        82        21       512
                    1.45%     0.29%     0.07%     1.81%
Town-Fringe           732        77        15       824
                    2.59%     0.27%     0.05%     2.92%
Town-Distant         1673       166        52      1891
                    5.92%     0.59%     0.18%     6.69%
Town-Remote          1148        89        33      1270
                    4.06%     0.32%     0.12%     4.50%
Rural-Fringe         2602       587        59      3248
                    9.21%     2.08%     0.21%    11.50%
Rural-Distant        3192       238         5      3435
                   11.30%     0.84%     0.02%    12.16%
Rural-Remote         2580        70         4      2654
                    9.13%     0.25%     0.01%     9.40%
Total               23078      4245       926     28249
                   81.69%    15.03%     3.28%   100.00%



Sampling or other respondent selection method used: Students will be selected using the procedures described in detail below. To briefly summarize, for each YRBS cycle, a nationally representative sample of students will be selected using a three-stage stratified cluster sample. Primary Sampling Units (PSUs - counties, a portion of a county, or a group of counties) and Secondary Sampling Units (SSUs - schools) within sampled PSUs will be selected with probability proportional to size (PPS) selection methods. Within each selected school, one class in each grade will be selected to participate, except in high minority schools where two classes per grade will be selected. All students in selected classes are eligible to participate, with the exception of students who cannot complete the survey independently (e.g., for language or cognitive reasons). A Spanish translation of the questionnaire will be made available to any students who need it.

Expected response rates for the data collection: The average participation rates over the 18 prior cycles of the YRBS are 75% for schools and 85% for students. Over the five most recent cycles, the average participation rates are 68% for schools and 80% for students. In preparing the sample design for the 2025 and 2027 YRBS, we give particular weight to the rates from the five most recent cycles as the best reflection of current school and student participation.

Actual response rates achieved during the last collection period: During the most recent cycle of the YRBS, conducted in 2023, the participation rates were 50% for schools and 72% for students.

Statistical justification for all sample sizes: The expected student sample size is approximately 39,060 students before nonresponse and is necessary to meet study precision requirements. The sample size is calculated by inflating the sample size that would be required under the assumptions of simple random sampling by historical design effects (to account for the complex sampling design) and participation rates to account for nonresponse at both the student and school levels.
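The inflation logic described above can be sketched as follows. This is an illustrative calculation only: the design effect of 2.0 and the worst-case prevalence p = 0.5 are assumptions for demonstration, while the 68% school and 80% student rates are the five-cycle averages cited in B.1; the actual YRBS computation uses historically observed design effects by reporting domain.

```python
from math import ceil

def required_sample(moe=0.05, z=1.96, p=0.5, deff=2.0,
                    school_rate=0.68, student_rate=0.80):
    """Inflate a simple-random-sampling size by a design effect
    (for clustering) and expected participation rates (for nonresponse)."""
    n_srs = (z ** 2) * p * (1 - p) / moe ** 2   # SRS size for +/- moe
    n_complex = n_srs * deff                     # account for complex design
    return ceil(n_complex / (school_rate * student_rate))

# Students to select per reporting domain under these assumptions
print(required_sample())
```

Summing such per-domain requirements across the grade, sex, and race/ethnicity domains is what drives the total selected sample into the tens of thousands.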

Validation study

The validation study will use a convenience sample of approximately 10 public schools to yield an estimated 600 participants, which is necessary to meet study precision requirements. Students of all races and ethnicities are invited to participate, and we will not select students based on any demographic characteristics. However, the priority population for this study is Black, White, or Hispanic students who are enrolled in grades 9 through 12.

B.2 PROCEDURES FOR THE COLLECTION OF INFORMATION

National YRBS

Statistical Methodology for Stratification and Sample Selection

For each YRBS cycle, a probability sample will be selected that will support national estimates among students in grades 9-12 overall and by grade, sex, and race/ethnicity (white, Black, Hispanic). The design also will support sex-specific estimates by grade or race/ethnicity and racial/ethnic-specific estimates by grade. A detailed description of the sampling design may be found in Attachment O.

Sampling Frame. The sampling frame covers the 50 states and the District of Columbia and will be stratified by urbanicity and minority composition. The frame is structured into geographically defined units, Primary Sampling Units (PSUs), each defined as a county, a portion of a county, or a group of contiguous counties. The stratification by minority composition will divide the PSUs into eight groups based on the percentages of Black and Hispanic students in the PSU. This is accomplished in two steps. First, each PSU is assigned to either the Hispanic stratum or the Black stratum based on whether the percentage of Hispanic or Black enrolled students in the PSU is higher. Each stratum is then subdivided into four strata depending on the percentage of Black or Hispanic enrolled students, as appropriate, in the PSU. The eight racial/ethnic-oriented strata will each be further divided by urban status, defined as being in one of the 54 largest Metropolitan Statistical Areas (MSAs) versus not. In addition, the first-stage PSU sample will be implicitly stratified by geography using 5-digit zip code areas.

Selection of PSUs. Sixty PSUs will be selected with probability proportional to the student enrollment in the PSU within strata. The allocation of PSUs to the first-stage strata will be approximately in proportion to the total enrollment in the PSU. A proportional allocation tends to maximize the precision of overall survey estimates.
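Probability-proportional-to-size selection within a stratum can be sketched with the common systematic method over cumulative enrollment. The PSU names and enrollments below are hypothetical; the actual design applies PPS within each of the strata described above and draws 60 PSUs in total.

```python
import random

def pps_systematic(units, n):
    """Select n units with probability proportional to size using
    systematic sampling over cumulative size (units: (id, size) pairs)."""
    total = sum(size for _, size in units)
    interval = total / n
    start = random.uniform(0, interval)          # random start in first interval
    points = [start + i * interval for i in range(n)]
    chosen, cum, idx = [], 0, 0
    for uid, size in units:
        cum += size
        while idx < n and points[idx] <= cum:    # selection point falls in unit
            chosen.append(uid)
            idx += 1
    return chosen

# Hypothetical PSUs with enrollments; select 3 with PPS
random.seed(1)
psus = [("PSU-A", 12000), ("PSU-B", 4000), ("PSU-C", 9000),
        ("PSU-D", 2500), ("PSU-E", 6500)]
print(pps_systematic(psus, 3))
```

Larger PSUs are hit by a selection point more often, which is exactly the "probability proportional to enrollment" property; the later weighting step (reciprocal of selection probability) undoes this unequal selection.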

Selection of Secondary Sampling Units (SSUs). SSUs are comprised of either a single school (if the school includes each of grades 9-12) or multiple schools “linked” together. An SSU is comprised of multiple “linked” schools when the physical schools do not include all of grades 9-12. This is done to form school-based SSUs that provide coverage for all four grades in each unit. SSUs will be grouped by size as either large or small, depending upon whether they have 28 students or more per grade. In each selected PSU, at least three large SSUs (28 students or more per grade) will be selected with probability proportional to an aggregate enrollment measure, resulting in 180 selected SSUs (60 PSUs * 3 SSUs). In addition, from a sub-sample of 20 PSUs, one small SSU (fewer than 28 students per grade) will be randomly selected to represent those attending small schools. A total of 200 SSUs will be selected (180 large and 20 small). These 200 SSUs will include approximately 214 physical schools, to account for “linked” schools that are combined during sampling to provide the full span of the grades of interest.

Selection of Classes. Classes in each school are randomly selected based on two specific scientific parameters to ensure a nationally representative sample.  First, classes must be selected in such a way that all students in the desired grade(s) within the school have a chance to participate.  Second, all classes must be mutually exclusive so that no student is selected more than once.   In each school, once we have determined the type of class or time period from which classes will be selected, we randomly select the appropriate number of intact classes within each grade.  To maintain acceptable school participation rates, it is essential that each school have input into the decision regarding which classes will be sampled in their school.  Examples of class sampling frames that have been used in past surveys include a required subject course such as English or all 2nd period classes.  As long as the scientific sampling parameters are met, we work with each school to identify a classroom sampling frame that will work best for the school.

Selection of Students. As stated above, all students in a selected classroom are eligible to participate, with the exception of students who cannot complete the survey independently (e.g., for language or cognitive reasons). As stated above, we will draw a sample of 60 PSUs, with 3 large SSUs (“full” schools) selected from each PSU, for a total of 180 large SSUs. Based on historical averages, a PSU will supply a sample of 300 students across all of grades 9-12 before non-response (3 SSUs * 4 grades/school * 25 students per grade). The estimated sample yield from these large schools will be 25,200 students before school and student non-response.


To provide adequate coverage of students in small schools (those with an enrollment of fewer than 28 students per grade), we also will select one small SSU in each of 20 subsample PSUs, adding 20 SSUs to the sample. Based on historical averages, small SSUs are expected to add 1,600 students before non-response.

Refusals. School districts, schools, and students who refuse to participate in the study, and students whose parents refuse to give permission, will not be replaced in the sample. We will record the characteristics of schools that refuse for analysis of potential study biases. Accounting for school and student nonresponse, we expect approximately 14,740 participating students.
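The expected participant count ties these yield figures together arithmetically; the 55% overall response rate applied below is the design assumption stated in section B.3.

```python
# Expected yields before nonresponse (figures from this section)
large_yield = 25_200   # students from the 180 large SSUs
small_yield = 1_600    # students from the 20 small SSUs
overall_response = 0.55  # assumed overall response rate (section B.3)

participants = (large_yield + small_yield) * overall_response
print(round(participants))  # 14740
```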

Estimation and Justification of Sample Size

The YRBS is designed to produce estimates with error margins of ±5 percent:

  • 95 percent confidence for domains defined by grade, sex, or race/ethnicity;

  • 95 percent confidence for domains defined by crossing grade by sex, and race/ethnicity by sex; and

  • 90 percent confidence for domains formed by crossing grade with race/ethnicity.
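These precision targets can be checked with the usual approximation that inflates simple-random-sampling variance by a design effect. The domain sizes and the design effect of 2.0 below are illustrative assumptions, not YRBS design parameters.

```python
from math import sqrt

def margin_of_error(n, p=0.5, deff=2.0, z=1.96):
    """Approximate CI half-width for a prevalence estimate under a
    complex design: SRS standard error inflated by a design effect."""
    return z * sqrt(deff * p * (1 - p) / n)

# Illustrative domain sizes, checking against the +/-5 percent target
for n in (800, 1500, 3000):
    print(n, round(margin_of_error(n), 3))
```

Under these assumptions, a domain of roughly 800 respondents is about the smallest that still meets the ±5 percent target at 95 percent confidence, which is why the smaller grade-by-race/ethnicity cells carry the relaxed 90 percent standard.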

During the design of the initial YRBS cycles, CDC’s contractor conducted a series of simulation studies that investigated the relationship of various weighting functions to the resulting numbers and percentages of minority students in the obtained samples. New simulation studies are performed periodically to determine opportunities for efficiency while maintaining the target yields across grade, sex, and race/ethnicity to meet the levels of precision required for CDC’s purposes. The 2025 and 2027 YRBS sample size and design will be consistent with the parameters developed for the 2021 and 2023 cycles. Minor design refinements are made to account for the changing demographics of the in-school population of students, primarily, the growing number of Hispanic students.



Estimation and Statistical Testing Procedures

Sample data will be weighted by the reciprocal of the probability of case selection and adjusted for non-response. The resulting weights will be trimmed to reduce mean-squared error. Next, the strata weights will be adjusted to reflect true relative enrollments. Finally, the data will be post-stratified to match national distributions of high school students by race/ethnicity and grade.
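The weighting steps above can be sketched as a simplified pipeline. This is a minimal illustration with made-up records, a single overall nonresponse adjustment, and a hard trimming cap; the production YRBS weighting applies these adjustments within weighting classes and uses iterative trimming.

```python
def weight_pipeline(records, trim_cap, controls):
    """Sketch of survey weighting: base weight = 1/selection probability,
    nonresponse adjustment, trimming, then poststratification so each
    stratum's weights sum to a known control total."""
    # 1) base weights, then a single overall nonresponse adjustment
    for r in records:
        r["w"] = 1.0 / r["p_select"]
    resp = [r for r in records if r["responded"]]
    adj = sum(r["w"] for r in records) / sum(r["w"] for r in resp)
    for r in resp:
        r["w"] *= adj
    # 2) trim extreme weights to reduce mean-squared error
    for r in resp:
        r["w"] = min(r["w"], trim_cap)
    # 3) poststratify: scale each stratum to its control total
    for stratum, total in controls.items():
        members = [r for r in resp if r["stratum"] == stratum]
        s = sum(r["w"] for r in members)
        for r in members:
            r["w"] *= total / s
    return resp

# Illustrative records: selection probability, response status, stratum
records = [
    {"p_select": 0.01, "responded": True,  "stratum": "grade9"},
    {"p_select": 0.02, "responded": True,  "stratum": "grade9"},
    {"p_select": 0.01, "responded": False, "stratum": "grade10"},
    {"p_select": 0.02, "responded": True,  "stratum": "grade10"},
]
controls = {"grade9": 150.0, "grade10": 80.0}  # assumed enrollment totals
weighted = weight_pipeline(records, trim_cap=200.0, controls=controls)
for r in weighted:
    print(r["stratum"], round(r["w"], 2))
```

After poststratification, the weights in each stratum sum exactly to the control totals, which is what aligns the weighted sample with national distributions of students by race/ethnicity and grade.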

Variances will be computed using linearization methods. YRBS data are also used for trend analyses where data for successive cycles are compared using statistical testing techniques. Statistical testing methods are used also to compare subgroup prevalence rates (e.g., male versus female students) for each cross-sectional survey.

Confidence intervals vary from estimate to estimate depending upon whether the estimate is for the full population or for a subset such as a particular grade or sex. Within a grouping, confidence intervals also vary depending on the level of the estimate and the design effect associated with the measure. Based on prior YRBS cycles with similar designs and sample sizes, we can expect the following:

  • Estimates among students overall or by grade, sex, or race/ethnicity (white, Black, Hispanic) will be accurate at ±5 percent at 95 percent confidence.

  • For racial/ethnic estimates by grade (e.g., 11th grade Hispanic students), about 70% will be accurate to within ±5 percent at 90 percent confidence.1

Experience with these data indicates that the levels of sampling error are appropriate given the uses of the data for descriptive reporting and trend analysis.

Survey Instrument

The YRBS questionnaire (Attachment L1) contains 107 items, which can be roughly divided into seven categories. The first category includes five demographic questions. The remaining questions address health risk behaviors in six categories: unintentional injuries and violence; tobacco use; alcohol and other drug use; sexual behaviors that contribute to HIV infection, other sexually transmitted diseases, and unintended pregnancy; unhealthy dietary behaviors; and physical inactivity. Obesity (assessed by self-reported height and weight) and other health behaviors also are assessed. All questions are in a multiple-choice format.

Beginning with the 2023 cycle, the YRBS transitioned to an electronic self-administered questionnaire to reduce burden and improve information gathering via advances in information technology. For both the 2025 and 2027 YRBS cycles, the YRBS methodology will continue as an electronic data collection. The initial survey administration will be via tablet, and make-up efforts with eligible students who are absent on the day of initial administration will be via web. Both the tablet application and the web survey share the same code base and will be identical in survey content and function.

Data Collection Procedures

Data will be collected by a small staff of professional data collectors, specially trained to conduct the YRBS. The time during the school day in which the survey is administered varies by school.  This decision is made in coordination with each school to ensure that the type of class or period of the day selected for sampling 1) meets the scientific sampling parameters to ensure a nationally representative sample and 2) results in the least burden/highest possible acceptability for the school. The data collector will have direct responsibility for administering the survey to students. Data collectors will follow a survey administrator script (Attachment I1).

Teachers will be asked to remain at the front or back of the classroom and not to walk around the room monitoring the aisles during survey administration because doing so could affect honest responses and compromise anonymity. Teachers also will be asked to identify students allowed to participate in the survey and to make sure non-participating students have appropriate alternative activities. The rationale for this is to increase the candor and comfort level of students. The only direct responsibility of teachers in data collection is to distribute and follow up on parental permission forms sent out prior to the scheduled date for data collection in the school. Teachers are provided with a parental permission form distribution script (Attachment J3) to follow when distributing permission forms to students.

The Permission Form Tracking Log (Attachment H1) is completed by teachers to track which students have received parental permission to participate in the data collection. The teachers receive instructions on completing the Permission Form Tracking Log in the “Letter to Teachers in Participating Schools” (Attachment P1). The data collector will utilize the information on the Permission Form Tracking Log to identify students eligible for a make-up survey administration; this information will be recorded by the data collector on the “Make-up List and Instructions” document (Attachment P2).

In general, our data collection procedures have been designed to ensure that:

  • Protocol is followed in obtaining access to schools

  • Everyday school activity schedules are disrupted minimally

  • Administrative burden placed on teachers is minimal

  • Parents give informed permission to participate in the survey

  • Anonymity of student participation is maintained, with no punitive actions against nonparticipants

  • Alternative activities are provided for nonparticipants

  • Control over the quality of data is maintained

CDC’s contractor will rent or purchase electronic devices (i.e., tablets or similar devices) on which the survey application will be loaded and used by students to complete the survey. At the start of survey administration, professionally trained YRBS data collectors will remind students that their responses will be captured anonymously (Attachment I1). Students will be instructed to hand their tablet to the data collector at the conclusion of survey administration. Following data collection at each school, student data will be uploaded to a central repository using secure protocols and erased from the tablets.

Obtaining Access to and Support from Schools

All initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by Kathleen Ethier, PhD, Director, Division of Adolescent and School Health, National Center for Chronic Disease Prevention and Health Promotion.  The procedures for gaining access to schools will have three major steps:

  • Notify state education agencies (SEAs) in states with sampled schools and inform states of their schools’ selection into the national YRBS sample. Obtain names of supportive school district contacts and general guidance on working with the selected school districts and schools in the state, and request state-level support for the survey prior to sending district invitations.

  • Invite school districts in which selected schools are located to participate in the study. For Catholic schools and other private schools, invite the office comparable to the school district office (e.g., diocesan office of education). Obtain approval to invite sampled schools to participate. Verify existence of schools, grade ranges, and other information as needed. Request that the school district notify schools that they may anticipate being contacted about the survey. Request general guidance on working with the selected schools and district scheduling information.

  • Once cleared at the school district level, invite selected schools to participate. Verify information previously obtained about the school. Present the burden and benefits of participation in the survey. Obtain approval for participation at the school level. After a school agrees to participate, develop a customized plan for collection of data in the school (e.g., select classes and schedule survey date). Ensure that all pre-survey materials reach the school well in advance of when they are needed. Maintain contact with schools until all data collection activities have been completed.

Prior experience suggests the process of working with each state education agency, school district, and school will have unique features. Discussions with each education agency will recognize the organizational constraints and prevailing practices of the state. Scripts for use in guiding these discussions may be found in Attachments M1 (state-level), M2 (district-level), and M3 (school-level). Attachment Q contains copies of letters of invitation to states (Attachment Q1), school districts (Attachment Q2), and school administrators (Attachment Q3). Attachment Q also contains the YRBS Fact Sheet for Schools (Attachment Q4). A copy of the letter to be sent to schools once they have agreed to participate is found in Attachment Q5.

Informed Consent

The parental permission form and fact sheet (Attachments J1 and J2) inform both the student and the parent about an important activity in which the student has the opportunity to participate. By providing adequate information about the activity, these materials ensure that permission is informed. The permission form indicates that a copy of the questionnaire will be available for review by parents at their child’s school. The parental permission forms will be made available in both English and Spanish.

A waiver of written student assent was obtained for the participation of children because this research presents no more than minimal risk to subjects, parental permission is required for participation, the waiver will not adversely affect the rights and welfare of the students because they are free to decline to take part, and it is thought that some students may perceive their responses are not anonymous if they are required to provide stated assent and sign a consent/assent document. Students are told “Participating in this survey is voluntary and your grade in this class will not be affected, whether you answer the questions or not.” Completion of the survey implies student assent.

Quality Control

Table B.2 lists the major means of quality control for the electronic data collection methodology. As shown, the task of collecting quality data begins with a clear and explicit study protocol. Data collection quality control begins with accurate programming of the YRBS questionnaire and concludes with the regular submission of data records to a secure central repository.

Because the ultimate aim is production of a high-quality data set and reports, various quality assurance activities will be applied during the data collection phase. Subsequent to data collector training, measures must be taken to reinforce training, to assist field staff who express/exhibit difficulties completing data collection activities, and to verify compliance with data collection protocols. Also, early inspection of a preliminary data set is necessary to ensure data integrity.

Table B.2 - Major Means of Quality Control

Survey Step

Quality Control Procedures

Mailing to Districts and Schools

  • Validate district and school sample to verify/update contact information of district/diocese/school leadership (100%)

  • Check inner vs. outer label for agreement in correspondence (5% sample)

  • Verify that any errors in packaging were not systematic (100%)

  • Determine if local approval processes require a formal research proposal (100% of districts)

  • Review all formal research applications and confirm they are in accordance with local requirements (100%)

Telephone Follow-up Contacts

  • Monitor early sample of calls to ensure that the recruiter follows procedures, elicits proper information, and has proper demeanor (10%)

  • Perform spot checks on recruiters’ class selection outcomes to confirm procedures were implemented according to protocol (10%)

Pre-visit Logistics

Verification

  • Review data collection procedures with school personnel in each school to ensure that all preparatory activities are performed properly in advance of data collector arrival (e.g., distribution of permission forms) (100%)

Data Collector Training and Supervision of School Visits

  • Issue quizzes during data collector training to ensure that key concepts are understood (daily during training)

  • Maintain at least one weekly telephone monitoring of all field staff throughout data collection (100% of field staff)

  • Reinforce training and clarify procedures through periodic conference calls with field staff (100% of field staff)

  • Verify by telephone with a 10% sample of schools that data collection procedures are being followed

Questionnaire Programming and Testing

  • Ensure displayed text matches verbatim the analyst/programmer version of the questionnaire (100% of question and instructional text)

  • Verify skip patterns are functioning properly to 1) correctly advance to the appropriate question from the user perspective and 2) apply appropriate value codes in the data set to distinguish legitimately skipped questions and other missing data

  • Create “dummy data set” to verify that all entered responses are correctly captured in the data set as intended (minimum 50 records)

Receipt Control

  • Verify syncing of data from the field is occurring no later than 48 hours after data collection concludes (100% of schools)

  • Verify number of data records received in the data base match the number of expected records reported by field staff (100% of schools)

  • Capture date/time stamps and staff credentials in the centralized system for all transactions (100%)

Data Review



  • During fielding, extract records from the first three schools visited in the fielding window to verify the data set is capturing and storing records as expected. Inspect the data set at least weekly after that.

  • Verify that all anticipated schools are represented in the data set and frequencies of records by school match reported student participation rates (100% of schools)



Validation study

For the validation study, a convenience sample of schools will be recruited using the School-level Recruitment Script for the Validation Study (Attachment M4), and classes will be selected to ensure that students are distributed approximately equally across grades 9-12. Procedures for obtaining parental permission will be the same as described for the national YRBS, except for the use of a parental permission form specific to the validation study. This form will be distributed to students in the selected classes. All students in selected classrooms are eligible to participate in the study, except those who cannot complete the survey independently (e.g., for language or cognitive reasons). Data will be collected by a small staff of professional data collectors trained to conduct the study. The study questionnaire will be self-administered in classrooms using electronic devices (i.e., tablets or similar devices), and the information collected from the surveys will be uploaded to a central repository using secure protocols. The 24-hour recall interviews will be conducted at school via a virtual meeting platform by individuals trained in the use of Nutrition Data System for Research (NDSR) software.



B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE

National YRBS

Expected Response Rates

Historically for the national YRBS, the overall response rate (product of the school response rate and the student response rate) has ranged from 55% to 71%, with an 18-cycle average of 64%. For the purposes of the 2025 and 2027 YRBS sample design, we have conservatively assumed an overall response rate of 55%, the average over the five most recent survey cycles. These participation rates result from the application of established procedures for maximizing school and student participation and minimizing non-response, as described below.
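The assumed 55% overall rate also follows directly from the product of the five-cycle average school and student rates reported in B.1:

```python
# Five-cycle average participation rates reported in section B.1
school_rate, student_rate = 0.68, 0.80
overall = school_rate * student_rate
print(round(overall, 3))  # 0.544, consistent with the assumed 55%
```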

Methods for Maximizing Responses

To increase the likelihood of an affirmative decision to participate, we will:

(1) work through the SEA and/or state health agency to communicate state-level support

(2) indicate that the survey is sponsored by CDC and has support of Federal and state agencies

(3) solicit endorsement from and convey to school districts and schools that the survey has the endorsement of many key national education and health associations, such as the American Academy of Pediatrics, American Psychological Association, American School Counselor Association, Association of State and Territorial Health Officials, Boys and Girls Clubs of America, Council of Chief State School Officers, National Association of School Nurses, National Association of School Psychologists, National Association of Secondary School Principals, National Association of State Boards of Education, National Catholic Educational Association, National Education Association, National Coalition of STD Directors, National PTA, National Rural Health Association, National School Boards Association, National Urban League, Sexuality Information and Education Council of the US, and The School Superintendents Association (formerly the American Association of School Administrators).

(4) maintain a toll-free hotline to answer questions from school district and school officials, teachers, parents, and students throughout the process of recruiting schools and obtaining parental permission for student participation

(5) comply with district requirements in preparing written proposals for survey clearance

(6) convey a willingness to appear in person, if needed, to present the survey before a school board, research committee, or other local entity tasked with reviewing the survey

(7) offer schools a monetary token of appreciation of $500, consistent with previous OMB recommendations; this practice has been implemented on the national YRBS since 2001.

Once a school has agreed to participate, we collaborate with the school to determine a class selection method that fits its environment while meeting the scientific protocol, and to schedule a survey administration date convenient for its calendar. As indicated in A.16, it is highly desirable to complete data collection before the final 1-2 months of the school year, when schools are typically focused on testing and attendance can be unstable, particularly among twelfth-grade students. To further encourage participation among students in selected classes, we will recommend that schools help advertise the survey through the principal's newsletter, PTA meetings, and other established channels of parental communication.

Methods for Handling Non-Response

We distinguish among six potential types of nonresponse: refusal to participate by a selected school district, school, teacher, parent, or student; and collection of incomplete information from a student. To minimize refusals at every level, from school district to student, we will use a variety of techniques that emphasize the importance of the survey and the value of the data in allowing continued monitoring of factors that influence youth health. All participating districts and schools will be notified when the survey results are published and the data are available for download from CDC's website; districts and schools may use the results to support grant applications.

Dealing with refusals from parents, teachers, and students requires different strategies. Parental permission form reminders (Attachments J4 and J5) will be sent to parents who have not returned permission forms within an agreed-upon time period (e.g., 3 days); those who do not respond to the reminder will be sent a second and final reminder. The permission form will provide a telephone number at CDC that parents may call to have questions answered before agreeing to give permission for their child's participation. Permission forms will be available in English, Spanish, and other languages as required by the dominant languages spoken by parents in selected schools. Field staff will be available on location to answer questions from parents who remain uncertain about granting permission.

Teacher refusals to cooperate with the study are not expected to be a cause for concern because school leadership will already have agreed to participate. Refusals by students who have parental permission are expected to be minimal. No punitive action will be taken against a student who declines to participate, and nonconsenting students will not be replaced.

To minimize the likelihood of missing values on the questionnaire, students will be reminded within the instrument instructions and verbally by the survey administrator to review their answers prior to submitting their electronic questionnaire. On the submission page, they are shown a list of any questions that were intentionally or accidentally left blank and can link directly to that question to provide a response, if they so desire. Missing values for an individual student on the survey will not be imputed.

Following the completion of the YRBS, non-response bias analyses will be conducted that examine school participation rates by school characteristics (e.g., school size, school type), census region, and student characteristics (e.g., race/ethnicity, socioeconomic status). The results of these analyses will be described when reporting YRBS results.
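A non-response bias analysis of this kind can be sketched as a comparison of school participation rates across strata defined by a school characteristic. The code below is a minimal illustration using hypothetical counts (not YRBS data) and hypothetical size strata:

```python
# Minimal sketch of a non-response bias check: do school participation
# rates differ by a school characteristic (here, hypothetical size strata)?
# All counts below are invented for illustration only.

# (participating schools, refusing schools) per stratum.
strata = {
    "small":  (30, 10),
    "medium": (45, 15),
    "large":  (25, 25),
}

rates = {}
for name, (participating, refusing) in strata.items():
    rates[name] = participating / (participating + refusing)
    total = participating + refusing
    print(f"{name:>6}: {rates[name]:.0%} participation ({participating}/{total})")
```

A large gap between strata (e.g., large schools participating at a markedly lower rate) would signal potential bias to describe when reporting results; in practice, a formal test of independence and weighting adjustments would accompany such a comparison.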

Validation study

Because the validation study is designed to examine the concordance between responses to a survey and responses to a dietary recall interview, rather than to produce estimates generalizable to a population, response rates are less of a concern. Recruitment of schools in the convenience sample will continue until the desired number of respondents has been reached.

B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN

National YRBS

YRBS questionnaire items were originally tested by the NCHS laboratories. The 1993 special issue of Public Health Reports on the development of the Youth Risk Behavior Surveillance System describes the development and testing process. A limited pretest of the questionnaire with nine respondents was conducted in November 1989 by the contractor in the Prince George's County, Maryland, school system, in accord with OMB guidelines. The pretest was conducted to:

  • Quantify respondent burden

  • Test survey administrator instructions and procedures

  • Verify the overall feasibility of the survey approach

  • Identify needed changes in the instruments or instructions to control/reduce burden

The pretest sharpened the wording of certain survey questions and produced an empirical estimate of respondent burden.

The YRBS questionnaire has been used extensively in 18 prior national school-based surveys approved by OMB, as well as at the state and local levels. Further pilot testing in accord with OMB guidelines has been performed on new and potential questions.

Validation study

The questions included in the validation study have been widely used and previously validated (Thiagarajah et al. 2008; Hoelscher et al. 2003; Eaton et al. 2013), but never in direct comparison with one another. The items, which use a 7-day recall period, have been included on the YRBS since 1999 and were previously validated (Eaton et al. 2013). The Texas School Physical Activity and Nutrition (SPAN) survey instruments (https://span-interactive.sph.uth.edu/) have been found to be valid and reliable in students as young as 4th grade and serve as the basis for the new questions being tested as part of this validation study (Thiagarajah et al. 2008; Penkilo et al. 2008).

Since the YRBS is designed to monitor trends in risk behavior (e.g., not eating vegetables at least once a day) and to identify differences in mean intakes across demographic groups, rather than to estimate usual intake, a single 24-hour recall is sufficient for validation purposes (Byers and Sedjo 2012).

The protocol used to collect and analyze 24-hour recall data will follow methods established by the Nutrition Coordinating Center at the University of Minnesota, which develops the Nutrition Data System for Research (NDSR) software used in this validation study and conducts the trainings that data collectors will complete. The analytic methods used to 1) convert data from a single 24-hour recall and the survey items into a comparable "times/day" structure, 2) calculate correlations between the measures, and 3) assess mean differences between the measures follow those previously used in validation studies for dietary items in the YRBS (Eaton et al. 2013; O'Malley Olsen et al. 2014).
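The three analytic steps above can be sketched as follows. The per-student values are hypothetical (not study data), and pure-Python formulas stand in for the statistical software that would actually be used:

```python
# Sketch of the validation-study analysis on hypothetical data:
# 1) both measures are already expressed as "times/day",
# 2) Pearson correlation between the two measures,
# 3) paired mean difference (survey minus 24-hour recall).
import math

# Hypothetical times/day values for 8 students (illustration only).
survey = [1.0, 2.0, 0.0, 3.0, 1.0, 2.0, 1.0, 0.0]   # survey items
recall = [1.2, 1.8, 0.3, 2.5, 0.9, 2.2, 1.4, 0.2]   # 24-hour recall

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(survey, recall)
mean_diff = sum(s - c for s, c in zip(survey, recall)) / len(survey)
print(f"Pearson r = {r:.2f}; mean difference = {mean_diff:+.2f} times/day")
```

A high correlation with a mean difference near zero would suggest that the survey items track the recall-based measure; significance testing of the paired difference would follow in the full analysis.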



B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA

National YRBS

Under OMB's prior review of the YRBS (OMB No. 0920-0493, expiration 9/30/2019), a Notice of Action was issued requesting that the study undergo an external peer review before the next package was submitted for approval. To ensure the continued scientific rigor of the sample design, best practices for recruitment, and efficient strategies to maximize participation rates, a panel of four experts in survey methodology, school-based data collection, and health surveys was convened in April 2018 to comment on the YRBS methodology and offer recommendations for improvement. The topics of discussion were frame development and sampling design, maximizing participation, the transition to a mixed-mode methodology, and the YRBS strategy for addressing emerging topics. A summary of the panel's recommendations and CDC's concurrence with those recommendations can be found in Attachment G.

Statistical aspects of the study have been reviewed by the individuals listed below.

  • Ronaldo Iachan, PhD

ICF

530 Gaither Road, Suite 500

Rockville, Maryland 20850

Phone: (301) 572-0538

E-mail: Ronaldo.Iachan@icf.com

  • Yangyang Deng, MS

ICF

530 Gaither Road, Suite 500

Rockville, Maryland 20850

Phone: (301) 572-0553

E-mail: Yangyang.Deng@icf.com



Within the agency, the following individual will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:

  • Nancy Brener, PhD
    Team Lead, Survey Operations and Dissemination Team
    Division of Adolescent and School Health
    Centers for Disease Control and Prevention

4770 Buford Highway

Mailstop S107-6

Atlanta, GA 30341

Phone: 404-718-8133
Email: nad1@cdc.gov



The representative of the contractor responsible for conducting the planned data collection is:

  • Alice Roberts, MS

Project Director

ICF

530 Gaither Road, Suite 500

Rockville, Maryland 20850

Phone: (301) 572-0290

E-mail: Alice.Roberts@icf.com



Validation study

The following individuals have been consulted on statistical aspects of the validation study and are responsible for collecting and analyzing data:



Name | Contact Info | Organization | Role
---- | ------------ | ------------ | ----
Michele Sadler | misadler@deloitte.com | Deloitte | Statistical Consultation
John Zimmerman | jzimmerman@deloitte.com | Deloitte | Statistical Consultation
Kate Brouse | kbrouse@deloitte.com | Deloitte | Data Collection and Analysis Lead


REFERENCES


Byers T, Sedjo R. Nutrition monitoring and surveillance. In: Willett W, (ed) Nutritional Epidemiology. 3rd ed. New York, NY: Oxford University Press 2012; 344-56.


Eaton DK, Olsen EOM, Brener ND, et al. A Comparison of Fruit and Vegetable Intake Estimates from Three Survey Question Sets to Estimates from 24-Hour Dietary Recall Interviews. Journal of the Academy of Nutrition and Dietetics. 2013; 113(9): 1165-74.


Hoelscher DM, Day RS, Kelder SH, Ward JL. Reproducibility and validity of the secondary level School-Based Nutrition Monitoring student questionnaire. Journal of the American Dietetic Association. 2003; 103(2): 186-94.


O'Malley Olsen E, Eaton DK, Park S, Brener ND, Blanck HM. Comparing methods for assessing beverage intake among high school students. Am J Health Behav. 2014; 38(1): 114-23.


Penkilo M, George GC, Hoelscher DM. Reproducibility of the School-Based Nutrition Monitoring Questionnaire among fourth-grade students in Texas. J Nutr Educ Behav. 2008; 40(1): 20-7.


Thiagarajah K, Fly AD, Hoelscher DM, et al. Validating the Food Behavior Questions from the Elementary School SPAN Questionnaire. Journal of Nutrition Education and Behavior. 2008; 40(5): 305-10.

1 Based on empirical results and simulations.
