SUPPORTING STATEMENT FOR THE
NATIONAL YOUTH PHYSICAL ACTIVITY AND NUTRITION STUDY
PART B
Submitted by:
Nancy Brener, PhD, Project Officer
Division of Adolescent and School Health
National Center for Chronic Disease Prevention and Health Promotion
4770 Buford Hwy, NE, MS K-33
Atlanta, GA 30341
770-488-6184 (voice); 770-488-6156 (fax)
Nad1@cdc.gov
Centers for Disease Control and Prevention
Department of Health and Human Services
November 12, 2009
TABLE OF CONTENTS
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
1. Respondent Universe and Sampling Methods
2. Procedures for the Collection of Information
a. Statistical Methodology for Stratification and Sample Selection
b. Estimation and Justification of Sample Size
c. Estimation and Statistical Testing Procedures
d. Use of Less Frequent than Annual Data Collection
e. Survey Instrument
f. Data Collection Procedures
g. Obtaining Access to and Support from Schools
h. Informed Consent
i. Quality Control
3. Methods to Maximize Response Rates and Deal with Nonresponse
a. Expected Response Rates
b. Methods for Maximizing Response and Handling Non-Response
4. Tests of Procedures or Methods to be Undertaken
5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or
Analyzing Data
a. Statistical Review
b. Agency Responsibility
c. Responsibility for Data Collection
LIST OF APPENDICES
A. Authorizing Legislation
B1. 60-Day Federal Register Notice
B2. 60-Day Federal Register Notice Comment and CDC Response
C. National Youth Physical Activity and Nutrition Survey Questionnaire
C1. Questionnaire Administration Guide – NYPANS only
C2. Questionnaire Administration Guide – NYPANS and 24-Hour Dietary Recall
C3. Parental Permission Form and Fact Sheet - NYPANS only (English Version)
C4. Parental Permission Form and Fact Sheet - NYPANS only (Spanish Version)
C5. Parental Permission Form and Fact Sheet - NYPANS and 24-hour Dietary Recall (English Version)
C6. Parental Permission Form and Fact Sheet - NYPANS and 24-hour Dietary Recall (Spanish Version)
C7. Parental Permission Form Reminder Notice (English Version)
C8. Parental Permission Form Reminder Notice (Spanish Version)
D. Height and Weight Record Form
E. Student Contact Form for 24-Hour Dietary Recall
F. 24-Hour Recall Interview Script
F1. Food Amounts Booklet
G. State Recruitment Script for the National Youth Physical Activity and Nutrition Study
G1. State Letter of Invitation
H. District Recruitment Script for the National Youth Physical Activity and Nutrition Study
H1. District Letter of Invitation
I. School Recruitment Script for the National Youth Physical Activity and Nutrition Study
I1. School Letter of Invitation
I2. School Fact Sheet
I3. Letter to Agreeing Schools
J. Data Collection Checklist for the NYPANS and Make-up List and Instructions
J1. Letter to Teachers in Participating Classes
K1. Parental Permission Form Distribution Script – NYPANS only
K2. Parental Permission Form Distribution Script – NYPANS and 24-Hour Dietary Recall
L. Detailed Sampling and Weighting Plan
M. IRB Approval Letter
N. Sample Table Shells
O. List of Previously Fielded Questions in the NYPANS Questionnaire
P. Data Collector Confidentiality Agreement
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
The proposed study includes administration of an in-school survey to a nationally representative sample of high school students, including height and weight measurements of these students. Sampling methods will be similar to those used in the 2009 national Youth Risk Behavior Survey (YRBS).
The proposed study also includes a follow-up sub-study with a subsample of students from a subsample of schools participating in the in-school survey. The sub-study is not designed to be generalizable to the national student population and is based only partly on statistical sampling methods.
B.1 RESPONDENT UNIVERSE AND SAMPLING METHODS
The respondent universe for the National Youth Physical Activity and Nutrition Survey (NYPANS) is the universe of all private and public high schools nationwide (i.e., in the 50 states and the District of Columbia) and their students. Excluded are schools normally excluded from the national YRBS: alternative schools, schools serving only a special education population, vocational education schools serving only a pull-out population enrolled in other schools, continuing education schools, and Department of Defense schools. The sampling frame for schools has been obtained from Quality Education Data (QED), Inc. QED data encompass both private and public schools and include the latest data from the Common Core of Data from the National Center for Education Statistics. Data on enrollments by grade and minority enrollments at the school level are available in this dataset. Table B.1 displays the current national distribution of high schools by metropolitan status and school type using the three types in the QED database: Catholic schools, non-Catholic private schools, and public schools.
For the follow-up sub-study, the sample will be selected from a subset of the schools participating in the main sample (NYPANS).
B.2 PROCEDURES FOR COLLECTION OF INFORMATION
B.2.a Statistical Methodology for Stratification and Sample Selection
For the NYPANS, a probability sample will be selected that will support national estimates by age or grade and gender for secondary school students. The design also will support separate estimates of the characteristics of white, Hispanic, and black students. A detailed description of the sampling design may be found in Appendix L.
Table B-1
Distribution of Schools by Urban Status and School Type

Metro Status  | Catholic | Private |  Public |  Total
--------------|----------|---------|---------|-------
Unclassified  |        1 |      35 |      19 |     55
Urban         |      556 |   1,670 |   3,770 |  5,996
Suburban      |      592 |   2,859 |   7,573 | 11,024
Rural         |       76 |   1,104 |   8,051 |  9,231
Total         |    1,225 |   5,668 |  19,413 | 26,306

Cell entries are frequencies (numbers of schools) by school type.
To ensure that all key estimates will meet target precision levels with the projected sample sizes, we implemented a simulation study with two separate arms described in Appendix L.
For the follow-up sub-study, estimates will be based on the subset of students who participate in the follow-up interviews. The subsample students will be recruited from a subset of those schools participating in the NYPANS with one class selected randomly per school. The sub-study estimates will be a composite of the measures collected across up to three follow-up interviews per respondent. These estimates will be compared with the estimates obtained from the NYPANS.
Sampling Frame. The sampling frame will stratify the 50 states and the District of Columbia by region, urbanicity, and minority composition. The sample is structured into geographically defined units, called primary sampling units (PSUs), each defined as a county, a group of contiguous counties, or an unaffiliated (independent) city. The stratification by minority composition will divide the PSUs into eight groups based on the percentages of non-Hispanic blacks (referred to as "blacks" in the rest of this document) and Hispanics of any race (referred to as "Hispanics" in the rest of this document) in the PSU. "High Hispanic" strata will have higher percentages of Hispanics than blacks; "High black" strata will have the reverse. Each of these two groups will then be subdivided into four strata depending on the percentage of Hispanics or blacks, as appropriate, in the PSU. The racial/ethnic strata will be further divided by urban status into two strata: Metropolitan Statistical Area (MSA) versus non-MSA. In addition, the first-stage PSU sample will be implicitly stratified by geography using 5-digit zip code areas.
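The stratum assignment just described can be illustrated with a short sketch (Python). The percentage cut-points that form the four subdivisions are specified in Appendix L and are not reproduced here; the values below are placeholders for illustration only.

    # Illustrative sketch of the PSU stratum assignment described above.
    # The cut-points (CUTS) are placeholders; the actual boundaries are
    # defined in the detailed sampling plan (Appendix L).

    CUTS = [0.10, 0.20, 0.30]  # hypothetical percentage boundaries

    def psu_stratum(pct_black, pct_hispanic, is_msa):
        # Return a stratum label for a PSU based on its minority
        # composition and metropolitan status.
        # Step 1: high-Hispanic vs. high-black classification
        if pct_hispanic >= pct_black:
            group, pct = "High Hispanic", pct_hispanic
        else:
            group, pct = "High black", pct_black
        # Step 2: subdivide into four strata by the relevant percentage
        level = sum(pct > c for c in CUTS) + 1          # 1..4
        # Step 3: split by metropolitan status (MSA vs. non-MSA)
        metro = "MSA" if is_msa else "non-MSA"
        return f"{group} / level {level} / {metro}"

    # Example: a PSU that is 12% black, 28% Hispanic, inside an MSA
    print(psu_stratum(0.12, 0.28, True))   # High Hispanic / level 3 / MSA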
Selection of PSUs. Seventy-five PSUs will be selected within strata with probability proportional to a measure of size based on student enrollment in the PSU, with disproportionate weight given to black and Hispanic enrollment. The PSUs will be allocated to the first-stage strata in proportion to the sum of the measures of size of the PSUs in each stratum. This procedure will over-allocate PSUs to the high-minority strata and will increase the chances of high-minority PSUs being selected.
Selection of Schools. Schools will be classified as large or small depending on whether they have at least 25 students per targeted grade. Among large schools, two schools will be selected in each PSU with probability proportional to a weighted measure of enrollment by race/ethnicity. In addition to the sample of large schools (approximately 150 selections anticipated), a random sample of 10 small schools will be drawn to represent the roughly 5.7% of students nationwide who attend small schools.
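Both the PSU draw and the within-PSU school draw described above use probability-proportional-to-size (PPS) selection. The sketch below shows a standard systematic PPS draw from a list of units given a measure of size; the race/ethnicity weighting of the measure of size is defined in Appendix L, and the school names and enrollments below are hypothetical.

    import random

    def pps_systematic(units, sizes, n):
        # Select n units with probability proportional to size using
        # systematic sampling with a random start (a standard PPS method,
        # shown here only to illustrate the selection mechanism).
        total = sum(sizes)
        interval = total / n
        start = random.uniform(0, interval)
        points = [start + k * interval for k in range(n)]
        selected, cum, i = [], 0.0, 0
        for unit, size in zip(units, sizes):
            cum += size
            while i < n and points[i] <= cum:
                selected.append(unit)
                i += 1
        return selected

    # Example: select 2 large schools in a PSU, where each school's
    # measure of size is its (hypothetically weighted) enrollment.
    schools = ["School A", "School B", "School C", "School D"]
    weighted_enrollment = [480, 950, 310, 720]
    print(pps_systematic(schools, weighted_enrollment, 2))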
For the sub-study, a subset of 60 schools will be selected with convenience sampling from the schools participating in the NYPANS. It is anticipated that this sub-sample will represent different regions of the country and varying school sizes including non-public as well as public schools. It also is expected that the sub-sample of schools will serve a student population that is broadly diverse in terms of race and ethnicity.
Selection of Classes. For the NYPANS, classes will be selected randomly from a unit of organization that includes each student exactly once; i.e., the units are mutually exclusive and collectively exhaustive. Usually, a list of sections of a mandatory subject, such as English, will be used. One class will be selected in each eligible grade in every school.
For the sub-study, only one class will be selected for the chosen grade. After the subset of 60 participating schools is identified for the sub-study, we will select the grade and class within each school in a way that balances the grade distribution across schools, and at the same time, conforms to the class availability within schools.
Selection of Students. All students in a selected classroom will be selected for the NYPANS. For the sub-study, all students within a sub-sampled class will be recruited to participate in the follow-up interviews.
Refusals. School districts and schools that refuse to participate in the study, and students whose parents refuse to give permission, will not be replaced in the sample. We will record the characteristics of schools that refuse for analysis of potential study biases.
B.2.b Estimation and Justification of Sample Size
B.2.b.1 NYPANS Sample Size
The national YRBS is designed to produce most estimates accurate to within 5 percent at 95 percent confidence. Overall estimates and estimates by grade, by gender, or by race/ethnicity meet this standard, as do certain finer-grained analyses, such as estimates by grade and gender jointly. Because some of the precision requirements are not as tight for the 2010 NYPANS as for the regular YRBS, we can consider sample sizes smaller than those of the typical YRBS.
Minor design refinements may be expected in future surveys driven by the changing demographics of the in-school population. Current trends of increasing minority percentages, particularly for Hispanic students, will continue to influence the design in several areas:
The weighting function that over-samples minority students has been adjusted downwards over the history of the national YRBS to give less weight to minority students. In this way, the statistical efficiency of the survey will improve.
The stratum boundaries based on the percentage of minority students are being re-computed to minimize variances using new frame data on minority composition.
The allocation of PSUs to high-minority strata is being changed, making it more statistically desirable for whole population estimates, while preserving the ability to produce estimates by race/ethnicity.
The proposed sample will consist of 75 primary sampling units (PSUs). At each grade level, at least two different schools will contribute classes of approximately 25 students each, so a minimum of two schools will be selected within each PSU. The actual number of schools will exceed 2 x 75 = 150, however, because the sample also includes (a) schools that span only part of the grades of interest and hence are combined to form sampling units, and (b) small schools selected in addition to the large schools. As a result, approximately 159 schools will be selected into the sample. We anticipate that approximately 127 schools will participate in the study (a projected 80% school participation rate).
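As a rough check on these counts, the arithmetic below (illustrative only; the exact figure of 159 schools reflects the grade-span combinations and small-school supplement described in Appendix L) reproduces the anticipated numbers of selected and participating schools.

    # Illustrative arithmetic for the school counts described above.
    psus = 75
    large_school_selections = 2 * psus       # 150 large-school selections
    small_schools = 10                       # supplemental small-school sample
    print(large_school_selections + small_schools)  # about 160 selections
                                                    # (159 after combining schools
                                                    # that span only part of the
                                                    # grade range)
    print(round(159 * 0.80))                 # about 127 participating schools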
We will select one class per school per eligible grade. We expect that a final sample of approximately 8,000 respondents will be obtained on the NYPANS. Appendix L also presents results of a simulation study conducted to estimate the sample sizes and precision that result from dropping the double section sampling in high-minority schools.
Table B-2 presents the precision expected for the key subgroup estimates defined for racial/ethnic minority groups. These estimates assume a design effect of 2.0 and subgroup sample sizes of approximately n = 1,200, the range expected for these subgroups. Other key subgroups, defined by grade and by gender, will have comparable or larger sample sizes. Derivations are included in Appendix L. The table shows that confidence intervals will be within +/- 3 percentage points for all key subgroups.
Table B-2
Precision expected for racial subgroup estimates: standard error of estimated percentages and associated 95% confidence intervals

Estimated Percentage | Standard Error | 95% Confidence Interval
---------------------|----------------|------------------------
5%                   | 0.63%          | 1.23%
10%                  | 0.87%          | 1.70%
15%                  | 1.03%          | 2.02%
20%                  | 1.15%          | 2.26%
50%                  | 1.44%          | 2.83%
B.2.b.2 Sub-study Sample Size
The focus of the sub-study will be on comparing the measures based on the 24-hour recalls administered over the three follow-up data collections (CATI interviews) with the measures computed from the NYPANS data collected from the same subset of participating students.
Consent will be sought from as many as 1,200 students for the follow-up sub-study, which is designed to generate approximately 900 completed interviews at the first follow-up, 750 at the second follow-up, and 600 at the third follow-up. Composite measures that combine data from all three follow-ups will therefore be available for the n = 600 students completing all three follow-up surveys as well as the NYPANS in-school survey (including height and weight measurements).
These sample sizes are premised on a pool of 60 participating schools sub-sampled for the sub-study from the set of NYPANS participating schools, and the selection of one class per school.
The discussion of expected precision begins with the variance and standard error anticipated for estimated proportions and percentages based on the composite (24-hour recall) measures. Example estimates include the composite percentage of students who drink milk daily or who eat a prescribed amount of fruits and/or vegetables. Table B-3 presents the precision expected for these estimates.
Table B-3
Precision expected for sub-study estimates: standard error of estimated percentages and associated 95% confidence intervals¹

Estimated Percentage | Standard Error | 95% Confidence Interval
---------------------|----------------|------------------------
5%                   | 0.89%          | 1.74%
10%                  | 1.22%          | 2.40%
15%                  | 1.46%          | 2.86%
20%                  | 1.63%          | 3.20%
50%                  | 2.04%          | 4.00%
Table B-3 shows that confidence intervals will be within +/- 4 percentage points.
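The entries in Tables B-2 and B-3 follow the standard design-effect-adjusted variance formula for an estimated proportion. The short sketch below (Python, illustrative only) reproduces the Table B-3 row for an estimated percentage of 5%, using the design effect of approximately 1.0 noted in the table footnote and the n = 600 students expected to complete all three follow-ups.

    import math

    def precision(p, n, deff=1.0, z=1.96):
        # Standard error and 95% confidence interval half-width for an
        # estimated proportion p from n respondents under a given design effect
        se = math.sqrt(deff * p * (1 - p) / n)
        return se, z * se

    se, half_width = precision(p=0.05, n=600, deff=1.0)
    print(f"SE = {se:.2%}, 95% CI half-width = {half_width:.2%}")
    # SE = 0.89%, 95% CI half-width = 1.74%  (matches the first row of Table B-3)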
For comparisons between the NYPANS measures and the follow-up measures, precision will be better than shown because estimated differences capitalize on the positive correlation between the two measures for the same student (i.e., the intra-subject correlation between repeated measures). The sample sizes proposed for the sub-study will therefore support powerful comparisons with the NYPANS measures; smaller sample sizes would not provide sufficient precision for estimates that use data from all three waves of the sub-study data collection.
School and Student Non-response.
The average participation rates over the ten prior cycles of YRBS are 77% for schools and 86% for students. In 2007, the YRBS achieved one of its highest school participation rates (81%). However, student participation rates generally are trending downward; the student participation rate was 84% in 2007. We are assuming maintenance of historical average participation rates in preparing the sample design for the NYPANS.
For the sub-study, we will consent up to 1,200 students who completed the in-school NYPANS in the expectation that 900 will participate in the first follow-up. We also anticipate that 750 students will participate in the second follow-up, and 600 students will participate in the third follow-up.
B.2.c Estimation and Statistical Testing Procedures
NYPANS sample data will be weighted by the reciprocal of the probability of case selection and adjusted for non-response. The resulting weights will be trimmed to reduce mean-squared error. Next, the strata weights will be adjusted to reflect true relative enrollments rather than relative weighted enrollment. Finally, the data will be post-stratified to match national distributions of high school students by race/ethnicity and grade. Variances will be computed using linearization methods. The estimation process for the NYPANS will use statistical software developed for analyses of survey data arising from complex sampling designs (e.g., SUDAAN). These estimation procedures will appropriately account for the effects of non-response, unequal probability sampling, stratification, and clustering.
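To make the sequence of weighting steps concrete, the sketch below illustrates them with simple cell-based adjustments and hypothetical inputs; the production procedures (trimming rules, stratum-weight adjustments, and linearization-based variance estimation in SUDAAN) are specified in Appendix L and are more involved than this outline.

    # Illustrative sketch of the weighting steps described above.
    # All numeric inputs are hypothetical.

    def base_weight(p_psu, p_school, p_class):
        # Reciprocal of the overall probability of selection
        return 1.0 / (p_psu * p_school * p_class)

    def nonresponse_adjust(weight, cell_response_rate):
        # Inflate respondent weights to account for nonrespondents
        # in the same adjustment cell
        return weight / cell_response_rate

    def poststratify(weight, cell_weighted_total, cell_population_total):
        # Ratio-adjust so weighted totals match known national counts of
        # high school students by race/ethnicity and grade
        return weight * (cell_population_total / cell_weighted_total)

    w = base_weight(p_psu=0.02, p_school=0.10, p_class=0.25)
    w = nonresponse_adjust(w, cell_response_rate=0.86)
    w = poststratify(w, cell_weighted_total=950_000, cell_population_total=1_000_000)
    print(round(w, 1))   # final analysis weight for this respondent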
Confidence intervals vary from estimate to estimate depending upon whether the estimate is for the full population or for a subset such as a particular grade or gender. Within a grouping, confidence intervals also vary depending on the level of the estimate and the design effect associated with the measure. Based on prior YRBS surveys which had similar designs and slightly larger sample sizes, we can expect the following:
Estimates by grade or by gender, or pooling grades and genders, will be accurate to within 5 percent at 95 percent confidence.
Minority group estimates will be accurate to within 5 percent at 95 percent confidence.
Experience with these data indicates that the levels of sampling error involved are appropriate for the descriptive analyses for which the data are used.
The analytic focus of the sub-study is on comparing the measures computed from the follow-up data, based on 24-hour recall, with the NYPANS measures, which are based primarily on 7-day and "yesterday" recalls. These analyses do not need to be weighted, nor do they need to reflect the complex survey design.
B.2.d Use of Less Frequent Than Annual Data Collection
This is a one-time data collection.
B.2.e Survey Instrument
The NYPANS questionnaire, contained in Appendix C, consists of 120 items that can be roughly divided into three groups. The first seven questions are demographic items and include self-reported height and weight. Of the remaining questions, 61 relate to physical activity and 52 relate to dietary intake. The questions are all in a multiple-choice format and will be administered as a 12-page optically scannable questionnaire booklet.
Measured height and weight will be recorded on an optically scannable worksheet, contained in Appendix D. This worksheet is completed by the data collector and is designed to capture anthropometric data for an individual student. For each student, height and weight are recorded. If height and/or weight cannot be measured, data collectors can indicate the reason (i.e., "refused" or "measurement problems").
The 24-hour recalls are conducted using software developed by the Nutrition Coordinating Center at the University of Minnesota. The interview is respondent-driven and collects information on dietary intake during the previous day. Standardization of the data collection process is facilitated by the multiple-pass interview methodology, which uses four distinct passes to elicit information about an individual's food intake. During the first pass, the participant is asked to recall everything consumed during the 24-hour period of interest. The second pass involves a review of the "quick list" from the first pass to identify and enter missed foods and/or eating occasions. The third pass prompts for information about additions to foods and for complete detail about each food and addition. Standardization of data collection through interviewer prompts, which have been integrated into the software, ensures the collection of complete food descriptions, variable recipe ingredients, and food preparation methods. A final review of the data, as entered, with the participant provides a fourth pass and the opportunity to make corrections and additions.
B.2.f Data Collection Procedures
NYPANS Questionnaire. Data will be collected by a small staff of professional data collectors, specially trained to conduct the NYPANS. The data collector will have direct responsibility for administering the survey to students. Teachers will be asked to remain at the front or back of the classroom and not to walk around the room monitoring the aisles during survey administration because doing so could affect candor and compromise anonymity. They also will be asked to identify students allowed to participate in the survey and to make sure non-participating students have appropriate alternative activities. The rationale for this is to increase the candor and comfort level of students. The only direct responsibility of teachers in data collection is to distribute and follow up on parental permission forms sent out prior to the scheduled date for data collection in the school. In general, our data collection procedures have been designed to ensure that:
Protocol is followed in obtaining access to schools
Everyday school activity schedules are disrupted minimally
Administrative burden placed on teachers is minimal
Parents give informed permission to participate in the survey
Anonymity of student participation is maintained, with no punitive actions against nonparticipants
Alternative activities are provided for nonparticipants
Control over the quality of data is maintained
Height and weight measurement. All students who complete the in-school student questionnaire will be measured for height and weight by professional data collectors trained to use a standardized protocol. Data collectors will conduct height and weight measurements during a subsequent visit to schools after survey administration. The standardized protocol will be one used in previous studies, including a methodological study of the YRBS conducted in 2000 (OMB No. 0920-0464: expiration 12/2000).
Measurements will be taken in a location that provides adequate student privacy, in an uncarpeted area with at least one wall that has no measurable molding. The following materials will be used to perform these measurements:
Tanita electronic scale
Weighted measuring tape and triangle
Duct tape
Bowl for personal items
Clipboard
To measure height, a weighted measuring tape will be mounted on a wall by securing the tape at the top and allowing the weight at the bottom to pull the tape vertical, with the inch side of the tape facing out. Students will be asked to remove heavy outer clothing (such as coats, jackets, and vests), purses, shoes, and hair accessories on the top of the head to the extent possible. Students will be instructed to stand with heels, buttocks, and upper back against the wall with their feet together, to look straight ahead with arms at sides, and to take a deep breath and stretch up as far as possible while keeping their heels on the ground.
To measure weight, a digital scale will be placed on a hard, uncarpeted floor. Students will be instructed to stand with their feet together, hands at sides, and their weight distributed equally on both feet. Height will be recorded to the last whole inch (e.g., 5'5 ¼" is recorded as 5'5"), and weight will be recorded to the last whole pound (e.g., 112.7 lbs is recorded as 112 lbs).
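The "last whole number" rule amounts to truncation rather than rounding. A minimal sketch of the recording rule follows (the function names are illustrative, not part of the study protocol).

    import math

    def record_height(feet, inches):
        # 5'5 1/4" is recorded as 5'5": the fractional inch is dropped
        return feet, math.floor(inches)

    def record_weight(pounds):
        # 112.7 lbs is recorded as 112 lbs: the fractional pound is dropped
        return math.floor(pounds)

    print(record_height(5, 5.25))   # (5, 5)
    print(record_weight(112.7))     # 112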
24-hour dietary recall interviews. For classes selected to participate in the 24-hour dietary recall follow-up study, students will be asked immediately prior to the start of the in-school survey administration to complete a form with their name and contact information. To increase student candor and sense of confidentiality, this information will be collected before students begin answering questions so that they do not think their responses will be connected with their names. At this time, students will be given a reference tool called the “Food Amounts Booklet” (Appendix F1) that they will use during the 24-hour dietary recall interviews to help estimate portion sizes of foods and beverages consumed.
Student contact information will be collected by the data collector visiting the school and securely shipped to the contractor's headquarters. Once at headquarters, student unique IDs, student names, and contact information will be entered into a password-protected system that will be accessed by the CATI center.
Dietary recalls will be administered following the established protocol used by the Nutrition Coordinating Center at the University of Minnesota. Professional interviewers at a call center will be trained by University of Minnesota staff in the use of this protocol. Following training, staff from Macro, with assistance from the University of Minnesota, will supervise the interviewers to ensure that the protocol is being followed. Standardization of the data collection process is facilitated by the multiple-pass interview approach described in Section B.2.e.
Each participant will be interviewed up to three times in total. To capture variability of dietary intake across weekdays and weekends, two of these interviews will cover weekdays and one will cover a weekend day. Because 24-hour recalls are conducted retrospectively, recalls of weekend food intake will be conducted on Sundays and Mondays, while recalls of weekday food intake will be conducted Tuesdays through Saturdays. The protocol will not specify an order of completion; that is, the two weekday recalls need not be completed before the weekend-day recall.
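The calling-day rule follows directly from the fact that each interview covers the previous day's intake. A small sketch of the mapping is shown below (day names only; the actual CATI scheduling logic is more elaborate and is described in the following paragraphs).

    def recall_type(interview_day):
        # Classify an interview day by the kind of intake day it covers,
        # i.e., the previous calendar day.
        weekend_recall_days = {"Sunday", "Monday"}   # cover Saturday and Sunday intake
        if interview_day in weekend_recall_days:
            return "weekend-day recall"
        return "weekday recall"                      # Tuesday-Saturday calls

    for day in ["Sunday", "Monday", "Tuesday", "Friday", "Saturday"]:
        print(day, "->", recall_type(day))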
The data collection procedures for the 24-hour dietary recall interviews comprise several components: (1) loading the sample; (2) managing call attempts; (3) conducting the interviews; (4) handling busy signals and no-answers; (5) attempting call-backs; (6) managing refusals and interrupted interviews; (7) recording call dispositions; and (8) seeking updated contact information from schools.
Loading the Sample: The sample will be loaded on an ongoing basis as contact information is collected in the field. We anticipate a delay of no more than one week between completion of data collection at a school and the first contact attempts for that school's students.
Managing Call Attempts: Each call attempt will be given a minimum of five rings, with a maximum of 10 attempts for each provided telephone number. Persistent “ring - no answers” will be attempted a minimum of four times at different times and days of the week, excluding traditional school hours when students are known to be unavailable. An exception is when a student has indicated that one of his/her “best times” to be reached is prior to the end of the traditional school day. Each number will be called a maximum of 10 times distributed over the fielding period or until a completed interview is achieved. If a respondent is contacted on the last call, and an interview cannot be completed, another attempt will be made.
Conducting the Interview: Interviewers will follow an administration script while conducting the 24-hour recall. Interviewers will verify that they are speaking with the correct student, that the student is available to be interviewed, and that the student has the Food Amounts Booklet with them, as it will be referred to throughout the interview. If the student is not available or if he/she does not have the Food Amounts Booklet, the interviewer will attempt to schedule an interview for later that day. If the student has lost the booklet, the interviewer will send them another copy and attempt the interview again once the student has received the booklet.
Dealing with Busy and No-Answer: Lines that are busy will be called back a minimum of five times at 10-minute intervals. If the line is still busy after the fifth attempt, the number will be attempted again on different calling occasions until the record is resolved.
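The attempt limits described in the two preceding paragraphs can be summarized in a simple dispatch rule. The constants below come directly from the protocol above; the function itself is only an illustration and is not part of the CATI system specification.

    # Constants taken from the calling rules above.
    MAX_ATTEMPTS = 10            # total attempts per telephone number
    BUSY_REDIALS = 5             # busy lines redialed at 10-minute intervals
    BUSY_INTERVAL_MINUTES = 10

    def next_action(disposition, attempts_made, busy_redials_made):
        # Decide what to do with a record after a call attempt.
        if disposition == "completed":
            return "resolve record"
        if attempts_made >= MAX_ATTEMPTS:
            return "retire number (maximum attempts reached)"
        if disposition == "busy":
            if busy_redials_made < BUSY_REDIALS:
                return f"redial in {BUSY_INTERVAL_MINUTES} minutes"
            return "retry on a different calling occasion"
        if disposition == "no answer":
            return "retry at a different time or day, outside school hours"
        return "retry later"

    print(next_action("busy", attempts_made=3, busy_redials_made=2))
    # redial in 10 minutes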
Attempting Call-backs: The NYPANS calling system optimizes queuing for definite call-backs by continuously comparing station sample activity with the index of definite call-back records. When a definite appointment time arrives for a participant who has not yet begun or completed the first interview, the system finds the next available station and delivers the record as the next call. The call history screen that accompanies each record informs the interviewer that the next call is a definite appointment and describes the circumstances of the original contact. All subsequent attempts to reach this participant will be handled by the initial interviewer. The handling of call-backs is crucial to the success of any telephone survey project, as effective management of call-backs increases the response rate. Perhaps more importantly, scheduling an appointment later in the day, ensuring that the appointment is kept, and providing the familiar voice of an interviewer who has already spoken to the participant offer a basic courtesy to someone who has agreed to assist with the study.
Managing Interrupted Interviews: Given the nature of the 24-hour recall, interrupted interviews can only be resumed later on the same day on which the interview began. If a respondent indicates a time later the same day, the interview will be restarted using a definite call-back strategy: a definite call-back for an exact time will be set, and the interview can resume where it left off. If the interviewer who began the interview is available at the prescribed time, the system will send the call back to that station; otherwise, the call will be reassigned to another interviewer. If a respondent is unable to resume a started interview on the same day, the data collected thus far are deleted, but the respondent remains eligible to complete a new interview.
Recording Call Dispositions: Dispositions of each call attempt on all records in the sample will be automatically stored in the CATI system. This provides a complete call history for each record in the sample. The call history is displayed on the interviewer’s screen during each new attempt.
Obtaining Updated Student Contact Information: In the event that contact information provided by students yields out-of-service or incorrect telephone numbers, we will confirm contact information with the school and obtain updated information if not prohibited by district or school policy.
B.2.g Obtaining Access to and Support From Schools
All initial letters of invitation will be on CDC letterhead from the Department of Health and Human Services and signed by Howell Wechsler, Ed.D., M.P.H., Director, DASH, NCCDPHP, CDC. The procedures for gaining access to schools will have the following major steps:
Notify SEAs in states with sampled schools and invite states to endorse the study. Obtain written endorsement for participation at the SEA level and request general guidance on working with selected school districts and schools in the state. Request that the state notify school districts that they may anticipate being contacted about the survey.
Once discussed with SEAs, invite school districts in which selected schools are located to participate in the study. Include invitation materials for selected schools. For Catholic schools and other private schools, invite the office comparable to the school district office (e.g., diocesan office of education). Obtain written approval for participation at the district level. Request that the school district forward invitation packets to selected schools with a note of approval for their participation or willingness to allow the schools to consider participating. Request general guidance on working with the selected schools.
Once cleared at the school district level, invite selected schools to participate. Verify information previously obtained about the school. Present the burden and benefits of participation in the survey. After a school agrees to participate, develop a tailor-made plan for collection of data in the school. Obtain written approval for participation at the school level. Inquire with schools agreeing to participate whether they also would like to volunteer to participate in the 24-hour dietary recall sub-study. Once they have volunteered, randomly select one of the four previously selected classes to participate in the 24-hour dietary recall interviews. If schools are not inclined to volunteer, assure them that their participation in the in-school survey is not jeopardized. Ensure that parental permission forms, Data Collection Checklists, and other materials that allow schools to prepare for the in-school survey reach the school well in advance of the survey administration date. Maintain contact with schools until all in-school data collection activities have been completed.
Re-establish contact with schools as needed for follow-up assistance in conducting the 24-hour dietary recalls in the event that student contact information has changed.
Prior experience suggests the process of working with each state education agency, school district, and school will have unique features. Discussions with each education agency will recognize the organizational constraints and prevailing practices of the agency. Scripts for use in guiding these discussions may be found in Appendices G, H, and I. Copies of letters to states, school districts, and school officials and teachers and a fact sheet are contained in Appendices G1, H1, I1, I2, and I3.
B.2.h Informed Consent
Two permission forms will be utilized: one for students participating in the in-school survey only (which includes the self-administered questionnaire and the height/weight measurement) and one for students participating in the in-school survey and the 24-hour dietary recalls. The permission forms inform both the student and the parent about an important activity in which the student has the opportunity to participate. By providing adequate information about the activities, the forms ensure that permission will be informed. Copies of the permission forms used for the school survey and the 24-hour dietary recall are contained in Appendices C3, C4, C5, and C6. In accord with the No Child Left Behind Act, the permission form indicates that a copy of the student questionnaire will be available for review by parents at their child's school.
B.2.i Quality Control
Tables B-4 and B-5 list the major means of quality control for the NYPANS and the 24-hour dietary recall interviews, respectively. As shown, the task of collecting quality data begins with a clear and explicit study protocol and ends with procedures for the coding, entry, and verification of collected data. Between these activities, and subsequent to data collector training, measures must be taken to reinforce training, to assist field staff and CATI interviewers who encounter problems, and to check on data collection and interviewing techniques. Because the ultimate aim is production of a high-quality database and reports, various quality assurance activities will be applied during the data collection phase.
Table B-4
Major Means of Quality Control for Student Survey

Survey Step                           | Quality Control Procedures
--------------------------------------|---------------------------
Mail Out of Materials Sent to Schools |
Previsit Logistics Verification       |
Receipt Control                       |
Telephone Contacts                    |
Manual Editing                        |
Computer Scanning                     |
B.3 METHODS TO MAXIMIZE RESPONSE RATES AND DEAL WITH NONRESPONSE
B.3.a Expected Response Rates
Response rates are an important indicator of data quality. While we aim for an 80% school participation rate based on recent experience with the national YRBS, we have conservatively assumed school and student response rates of 77% and 86%, respectively, for purposes of sample design. The 77% school and 86% student participation rates represent the averages experienced over ten completed cycles of the YRBS. The addition of a monetary incentive for each school (as suggested by OMB in 1999) has helped maintain and perhaps slightly increase school participation rates.
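For planning purposes, the overall response rate implied by these assumptions is the product of the school and student rates, a standard calculation for clustered school surveys; the sketch below is illustrative arithmetic only.

    school_rate = 0.77
    student_rate = 0.86
    overall_rate = school_rate * student_rate
    print(f"{overall_rate:.0%}")   # about 66%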
Table B-5
Quality Control Procedures for CATI Interviews

Survey Step               | Quality Control Procedures
--------------------------|---------------------------
Testing of CATI program   |
CATI pretest              |
Food Amounts Booklet      |
CATI quality assurance    |
Preparation of data files |
OMB generally regards studies with higher response rates as offering more representative data. At the same time, OMB has acknowledged repeatedly in its own guidance documents that the range of feasible response rates is largely a function of the objectives of a study and of the methodology required. OMB also sets no predetermined minimum response rate across surveys of all types, recognizing that some types of surveys, such as population-based CATI surveys, should be expected to achieve lower response rates than surveys using many other data collection methods. Moreover, OMB has recognized that CATI survey response rates have been declining in recent years for a variety of reasons, but that such surveys serve an important purpose and need to be included in the mix of methods used to gather population-based data.
For the CATI interviews, we anticipate that 1,200 students will initially be consented to participate in the sub-study. Of these 1,200 students, we expect that 900 will complete at least one interview, 750 will complete two interviews, and 600 will complete all three interviews. In consultation with experts in conducting 24-hour dietary recalls, we will offer a $10 incentive to participating students for each completed interview, and an additional $10 if they complete all three interviews. Payments will be sent via the United States Postal Service to the address provided by the student, either their home or another location.
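Under this incentive structure, the maximum payment per student is $40 (three $10 interview payments plus the $10 bonus). The arithmetic below is illustrative only, applying the projected completion counts stated above.

    completes = {"follow-up 1": 900, "follow-up 2": 750, "follow-up 3": 600}
    per_interview = 10                       # $10 per completed interview
    bonus_all_three = 10                     # additional $10 for completing all three

    interview_payments = per_interview * sum(completes.values())   # $22,500
    bonus_payments = bonus_all_three * completes["follow-up 3"]    # $6,000
    print(interview_payments + bonus_payments)                     # $28,500 projected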
It is highly desirable to complete the in-school data collection at least two months before the end of the school year. Schools are very busy with testing at that time, and during the final two months attendance can be unstable, especially among twelfth-grade students. Even though the 24-hour dietary recalls will be conducted outside of school, it is important that interviewing be completed before the end of the school year. Completing the recalls while schools are still in session ensures that the data are comparable across sites in terms of the circumstances under which students are accessing meals, and that the recall data can appropriately be compared with the data from the in-school student questionnaire.
B.3.b Methods for Maximizing Responses and Handling Non-Response
We discuss methods for maximizing responses and handling non-response separately for in-school survey (including height and weight measurement) and for the 24-hour recall follow-up interviews.
B.3.b.1 Maximizing Response on the In-School Survey.
We distinguish among six potential types of nonresponse problems in school-based surveys: refusal to participate by a selected school district, school, teacher, parent, or student, and collection of incomplete information from a student.
To minimize refusals at all levels--from school district to student--we will use a variety of techniques, emphasizing the importance of the survey. We expect that some school districts or schools will need to place the issue of survey participation before the school board. To increase the likelihood of an affirmative decision, we will: (1) work through the State Education Agency to communicate its support of the survey to school districts and schools; (2) indicate that the survey is being sponsored by CDC and has the support of Federal and state agencies; (3) convey to school districts and schools that the survey has the endorsement of many key national educational and health associations, such as the National PTA, American Medical Association, National Association of State Boards of Education, Council of Chief State School Officers, the National Education Association, and the National School Boards Association; (4) maintain a toll-free hotline to answer questions from school district and school officials, teachers, parents, and students throughout the process of recruiting schools and obtaining parental permission for the student’s participation; (5) comply with all requirements from school districts in preparing written proposals for survey clearance; (6) convey a willingness to appear in person, if needed, to present the survey before a school board, research committee, or other local entity tasked with reviewing the survey; (7) offer a package of educational products to each participating school, as recommended and approved by OMB in approving the 1998 YRBS in alternative schools (OMB No. 0920-0416: expiration 12/1998), and continued ever since; and (8) offer schools a monetary incentive of $200.
The sampling plan does not allow for the replacement of schools that refuse to participate due to concern that replacing schools would introduce bias. All participating SEAs, school districts, and schools also will be promised and sent a copy of the published survey results.
Maximizing responses and dealing with refusals from parents, teachers, and students require different strategies. To maximize responses, we will recommend that schools help to advertise the survey through the principal's newsletter, PTA meetings, and other established means of communication. Reminder forms (Appendices C7 and C8) will be sent to parents who have not returned parental permission forms within an agreed-upon time period (e.g., 3 days); those who do not respond to the reminder will be sent a second and final reminder. The permission form will provide a telephone number at CDC that parents may call to have questions answered before agreeing to give permission for their child's participation. Permission forms will be available in English, Spanish, and other languages as needed based on the dominant languages spoken by parents in selected schools. Field staff will be available on location to answer questions from parents who remain uncertain about granting permission. Bilingual field staff will be used in locations with high Hispanic concentrations (e.g., California, Florida, New York City, and Texas).
Teacher refusals to cooperate with the study are not expected to be a problem because schools already will have agreed to participate and burden to teachers is minimal.
Refusals by students whose parents have consented also are expected to be minimal. No punitive action will be taken against a nonconsenting student. Nonconsenting students will not be replaced. Data will be analyzed to determine if student nonresponse introduces any biases.
To minimize the likelihood of missing values on the in-school student survey, students will be reminded in writing in the questionnaire booklet and verbally by the survey administrator to review the optically scannable questionnaire before turning it in to verify that: (1) each question has been answered, (2) only one oval is filled in for each question, with the exception of the question on race/ethnicity, and (3) each response has been entered with a No. 2 pencil, fills the oval, and is dark. A No. 2 pencil will be provided to each survey participant to reduce the likelihood that responses will not scan properly, which would produce missing values. In addition, when completed questionnaires are visually reviewed later at project headquarters, any oval that is lightly filled in will be darkened (unless it appears to be an erasure) and stray marks will be erased before the forms are scanned. Missing values for an individual student will not be imputed.
To maximize participation in height/weight measurement, students will be measured in a private location where they cannot be observed by their peers or teacher. If a student decides that they do not wish to be measured for height and/or weight, they may refuse one or both measurements. To the extent that it does not over-burden schools, attempts will be made to follow-up with students who are absent or unavailable when height and weight measurements are initially taken.
B.3.b.2 Maximizing Response in the 24-hour Recall Follow-up Survey
Participation in the 24-hour recall interviews depends on schools first agreeing to participate in the in-school NYPANS. Methods similar to those described above, including securing SEA endorsement of the study; conveying the sponsorship of CDC, other Federal agencies, and key national educational and health associations (National PTA, American Medical Association, National Education Association, etc.); and maintaining a toll-free hotline to answer questions from districts, schools, parents, and teachers, will be employed while recruiting volunteer schools for the follow-up survey. We recognize that some schools may have additional procedures, distinct from those for the in-school survey, that must be followed in order to participate in the follow-up survey, since that component draws on the student population but takes place outside of school. Again, we will comply with all requirements from school districts and schools to obtain clearance for the follow-up study if schools indicate a willingness to volunteer.
We will approach schools agreeing to participate in the in-school survey about their participation in the follow-up study until approximately 60 volunteer schools have been identified. Schools that are unwilling to participate in the follow-up survey will be assured that their participation in the in-school survey is not jeopardized.
Methods for maximizing responses and dealing with refusals from parents, teachers, and students for the 24-hour dietary recall follow-up will largely mirror those used for the in-school survey. Similar methods to advertise the follow-up survey will be suggested. The version of the permission form used in classes selected for the 24-hour dietary recall will again include the toll-free number at CDC that parents may call to have questions answered before agreeing to give permission for their child's participation and will be available in English, Spanish, and other dominant languages. Reminder forms will be employed.
Teacher refusals to cooperate with the study are not expected to be a problem because schools already will have agreed to participate and burden to teachers is minimal. Although one class at each volunteer school will be selected to participate in the 24-hour dietary recall follow-up, teachers of these classes are not asked to take on any additional tasks.
Parental refusal in the classes selected to participate in both the in-school survey and the 24-hour dietary recall interviews is expected to be somewhat higher than in classes selected to participate in the in-school survey only.
Refusals by students whose parents have consented also are expected to be minimal. No punitive action will be taken against a nonconsenting student. Nonconsenting students will not be replaced. Data will be analyzed to determine if student nonresponse introduces any biases.
For the 24-hour recall interviews, we will provide phone coverage on afternoons, evenings, and weekends to offer a range of times accommodating differences in personal schedules. Our automated calling system will manage calling times to ensure that respondents who cannot be reached at one time of day are tried at other times. If a persistent busy signal is encountered at one time of day, we will switch to another time of day. When feasible, a caller who previously spoke with a selected respondent will be given the call to complete the actual interview. At each attempt, the interviewer can see the complete call history of call times and dispositions. Up to 10 attempts will be made on each provided telephone number.
Additional efforts to achieve maximum participation in the 24-hour dietary recall interviews will include: (1) utilizing a dedicated team of specially trained interviewers adept at conducting the interview; (2) making scheduled call-backs the highest calling priority; (3) conducting weekly refresher trainings for all data collection staff; (4) leaving messages on persistent "answering machine" dispositions; and (5) requesting updated student contact information from schools when all other attempts to reach students have been exhausted.
An important component in maximizing response for the CATI interviews is having strategies for dealing with non-response, both through refusal conversion efforts and through analyses of data to detect biases. The underlying philosophy behind refusal conversion is that a large proportion of initial refusals are situational (e.g., the respondent is on another call or has just gotten home from school or work and is eating dinner). If attempted again, at another time of day, the person may be more responsive and accept the interview. A non-response conversion team, specifically trained in refusal conversion, will call back 100% of respondents who make an initial refusal. The team will have the benefit of detailed notes, taken by the caller who encountered the initial refusal, about the articulated reason for the refusal. Respondents who refuse at this point will be considered "hard" refusals and will not be called back again. Staff will be assigned to the non-response conversion team based on experience and performance.
B.4 TESTS OF PROCEDURES OR METHODS TO BE UNDERTAKEN
The NYPANS was developed in 2008 and 2009 based on 20 years of experience with similar school-based surveys. The NYPANS questionnaire uses the existing physical activity and diet questions from the YRBS as its backbone, around which expert panelists in these fields built additional questions. As part of this process, an initial pool of possible NYPANS questionnaire items was subjected to cognitive interviewing (N = 9) and analysis by the contractor in the fall of 2008. This cognitive analysis resulted in the revision, addition, or deletion of response options and the revision or deletion of certain questions, with the overall effect of improving the clarity of the questions and lowering respondent burden. Following cognitive interviewing, the finalized questionnaire underwent a limited pretest (N = 9) in Prince George's County, Maryland, with a racially and ethnically diverse group of students, in accord with OMB guidelines. The pretest sharpened the wording of certain survey questions and confirmed the empirical estimate of respondent burden.
B.5 INDIVIDUALS CONSULTED ON STATISTICAL ASPECTS AND INDIVIDUALS COLLECTING AND/OR ANALYZING DATA
B.5.a Statistical Review
Statistical aspects of the study have been reviewed by the individuals listed below.
Ronaldo Iachan, Ph.D.
Macro International Inc.
11785 Beltsville Drive, Suite 300
Beltsville, MD 20705
Phone: (301) 572-0538
Fax: (301) 572-0986
E-mail: Ronaldo.Iachan@macrointernational.com
Maxine M. Denniston, M.S.P.H.
Centers for Disease Control and Prevention (CDC)
National Center for Chronic Disease Prevention and Health Promotion
Division of Adolescent and School Health
Surveillance and Evaluation Research Branch
4770 Buford Highway, Mailstop K-33
Atlanta, Georgia 30341
Phone: (770) 488-6212
Fax: (770) 488-6156
E-mail: maxine.denniston@cdc.hhs.gov
B.5.b Agency Responsibility
Within the agency, the following individual will be responsible for receiving and approving contract deliverables and will have primary responsibility for data analysis:
Nancy Brener, Ph.D., Project Officer
Centers for Disease Control and Prevention (CDC)
National Center for Chronic Disease Prevention and Health Promotion
Division of Adolescent and School Health
4770 Buford Hwy, NE, MS K-33
Atlanta, GA 30341
Phone:
(770) 488-6148
Fax: (770) 488-6156
E-mail: Nad1@cdc.gov
B.5.c Responsibility for Data Collection
The representatives of the contractor responsible for conducting the planned data collection are:
Alice M. Roberts
Project Director
Macro International Inc.
11785 Beltsville Drive, Suite 300
Beltsville, Maryland 20705
Phone: (301) 572-0290
Fax: (301) 572-0986
E-mail: Alice.M.Roberts@macrointernational.com
James G. Ross
Director, Applied Research Division
Macro International Inc.
11785 Beltsville Drive, Suite 300
Beltsville, Maryland 20705
Phone: (301) 572-0208
Fax: (301) 572-0986
E-mail: James.G.Ross@macrointernational.com
¹ The variances were computed based on design effects of approximately 1.0. The design effect is defined as the variance under the actual design divided by the variance that would be attained by a simple random sample of the same size. Design effects are not essential for the sub-study analyses that focus on differences between the various measures. For these analyses, weights and the complex sampling design may be ignored.