National Center for Education Statistics
National Assessment of Educational Progress
Volume I
Supporting Statement
National Assessment of Educational Progress (NAEP) 2021
COVID-19 Educational Experiences Student, Teacher, and School Administrator Pretesting
OMB# 1850-0803 v.270
June 2020
1) Submittal-Related Information
2) Background and Study Rationale
3) Recruitment and Data Collection
4) Consultations outside the Agency
5) Justification for Sensitive Questions
7) Assurance of Confidentiality
8) Estimate of Hourly Burden
9) Costs to Federal Government
This material is being submitted under the generic National Center for Education Statistics (NCES) clearance agreement (OMB# 1850-0803), which provides for NCES to conduct various procedures (such as pilot tests, cognitive interviews, and usability studies) to test new methodologies, question types, or delivery methods to improve survey and assessment instruments and procedures.
The National Assessment of Educational Progress (NAEP) is a survey of student achievement at grades 4, 8, and 12 in various subject areas, such as mathematics, reading, writing, science, U.S. history, civics, geography, economics, and the arts, federally authorized by the National Assessment of Educational Progress Authorization Act (20 U.S.C. §9622). NAEP is conducted by NCES, which is part of the Institute of Education Sciences, within the U.S. Department of Education. The primary purpose of NAEP is to assess student achievement in the different subject areas and collect survey questionnaire (i.e., non-cognitive) data to provide context for the reporting and interpretation of assessment results.
This request is to pretest student, teacher, and school administrator survey questionnaire items planned for the 2021 NAEP Assessments that ask about educational experiences during school closures due to the COVID-19 (coronavirus) pandemic. The student, teacher, and school administrator questions aim to capture data related to:
technology access and use;
resources for learning and instruction;
organization of instruction;
teacher preparation; and
self-efficacy.
Various forms of pretesting are used to obtain data about new assessment and survey items during the NAEP item development process, and the results are used to make the development of data collection instruments more efficient. Pretesting helps to identify and eliminate potential issues with items and tasks, often resulting in fewer challenges in scoring and analysis and leading to higher pilot item survival rates.
Student questions will be pretested using cognitive interviews. Because teacher and school administrator participation has historically been more difficult to obtain, teacher and school administrator items will be pretested by soliciting general feedback from NAEP/NCES expert committees and panels; this approach reduces the impact and burden on teachers and school administrators and increases the chances of their participation in NAEP projects overall.
Cognitive interviews allow researchers to gather qualitative data about how participants work through items and to probe potential sources of construct irrelevance. The objective is to determine whether items appear to elicit the targeted knowledge and skills and/or to reduce construct irrelevance, based on either evidence that can be scored or qualitative data consisting of participant responses and reactions.
In cognitive interviews, an interviewer uses a structured protocol in a one-on-one interview drawing on methods from cognitive science. A verbal probing technique will be used for this cognitive interview study, where the interviewer asks probing questions, as necessary, to explore issues that have been identified a priori as being of particular interest. This interview technique has proven to be productive in previous NAEP studies and will be the primary approach in the NAEP student cognitive interviews in this study.
The main purposes of this pretesting activity are to:
Identify potential problems with the items (i.e., ensure each item is understood by all participants and confirm that items are not sensitive in nature and do not make participants uncomfortable); and
Find ways to improve the wording of draft items where possible.
The results from this study will also be used to inform which items should be administered as part of the 2021 NAEP assessments.
Recruitment and Sample Characteristics
ETS will recruit a maximum of 20 students, 20 teachers, and 20 school administrators to participate in this pretesting activity. Participants will be recruited from around the U.S.
Although the sample will include a mix of participant characteristics, the results will not explicitly measure differences by those characteristics. Student participants who are or were enrolled in 4th or 8th grade for the 2019-2020 school year will be recruited. In addition, if possible, student participants will be recruited to meet the following criteria:
A mix of gender;
A mix of race/ethnicity (Black, Asian, White, Hispanic, etc.); and
A mix of socioeconomic backgrounds.
Please note that socioeconomic status (SES) will be given higher priority than other respondent characteristics during recruitment, while also ensuring sufficient balance on the other criteria. ETS will document the information collected in the screeners using a tracking sheet, which will be used to determine the targeted sample and diversification on key characteristics (see Appendix J). Additionally, it should be noted that the sample is not large enough to support subgroup analyses. Table 1 summarizes the numbers and types of cognitive interviews that are planned for students. A minimum of five respondents per subgroup is recommended to identify major problems with an item and to allow a meaningful analysis of data from exploratory cognitive interviews.1 Students will be oversampled to better ensure identification of confusion or sensitivity issues.
Table 1. Sample Size for Student Cognitive Interviews
Respondent Group | Grade 4 | Grade 8 | Total
Students | 5-10 | 5-10 | 10-20
Overall Total | 5-10 | 5-10 | 10-20
Teacher and school administrator participants will be recruited from the NCES Principals and Teachers panels and NAEP expert standing committees (Appendix D). Table 2 summarizes the number of teachers and school administrators that we expect to recruit for this pretesting activity.
Table 2. Sample Size for Teacher and School Administrator Pretesting Activity
Respondent Group | Grade 4 | Grade 8 | Total
Teachers | 5-10 | 5-10 | 10-20
School Administrators | 5-10 | 5-10 | 10-20
Overall Total | 10-20 | 10-20 | 20-40
While ETS will use various outreach methods (see Appendices A and B) to recruit student participants, the bulk of the recruitment will be conducted by email and will be based on acquisition of targeted lists. ETS researchers will draw on a database of respondents used in previous pretesting efforts. ETS will also use a participant recruitment strategy that integrates multiple outreach methods, such as community organizations (e.g., Boys and Girls Clubs, Parent-Teacher Associations) and mass media recruitment (e.g., postings on the ETS website).
Interested parents of students under 18 years of age will be screened (see Appendix C) to ensure that the recruited individuals meet the criteria for participation in the study (i.e., that the participants are from the targeted demographic groups outlined above). When recruiting participants, ETS will email the parents/legal guardians of interested minors; the email will include information about the study and a link to the online screener. After confirming that an individual is qualified, willing, and available to participate in this study, ETS will send him or her a confirmation email (Appendix G).
Written, informed consent (see Appendices E and F) will be obtained for all respondents who are interested in participating in the study. Cognitive interviews lasting 45 minutes2 will be conducted with students. We estimate that the teachers' and school administrators' independent review of items, followed by a 60-minute virtual meeting to discuss collective feedback, will take approximately 2-3 hours. The virtual meeting could take place with the committee/panel or be a one-on-one meeting between the committee/panel member and a researcher.
Data Collection Process
Students
Student cognitive interviews will be conducted via videoconferencing (e.g., Zoom) to comply with social distancing mandates. To build rapport, the interviewer will share their video; however, students will not be required to share their video. Each interview will include an interviewer and an observer. Each cognitive interview session will last 45 minutes.
Each participant will first be welcomed by staff, introduced to the interviewer and the observer, and told that he or she is there to help answer research questions about how people answer survey questions. Then, the interviewer will explain the cognitive interview process.
Protocols for cognitive interviews will include probes for use as students work through item sets and probes for use after students finish answering items (see Volume II). Probes will include a combination of pre-planned questions identified before the session and ad hoc questions that the interviewer identifies as important from observations during the interview, such as clarifications or expansions on points raised by the student. For example, if a student paused for a long time over a particular item, appeared to be frustrated at any point, or indicated an “aha” moment, the interviewer might probe these kinds of observations further, to find out what was going on. To minimize the burden on the student, efforts will be made to limit the number of verbal probes that can be used in any one session or in relation to any set of items. The welcome script, cognitive interview instructions, and probes for the interviewers are provided in Volume II.
The cognitive interviews will be audio recorded. Interviewers will also record their own notes separately, such as behaviors (e.g., “the participant appeared confused”), questions posed by students, and observations of how long various items take to complete.
The types of data collected will include:
student reactions and responses to items;
responses to generic questions;
responses to targeted questions specific to the item(s);
additional volunteered participant comments; and
answers to debriefing questions.
Teachers and School Administrators
To reduce the impact and burden on teachers and school administrators and increase the chances of participation, these participants will be asked to review the items independently on their own time, send feedback via email, and join a virtual meeting via Zoom or another videoconferencing platform. The virtual meeting could take place with the committee/panel or be a one-on-one meeting between the committee/panel member and a researcher. To facilitate teacher and school administrator feedback and discussion, guiding questions will be provided. The welcome script, pretesting instructions, and guiding questions are provided in Volume II.
Analysis Plan
After each session, the notes and recordings will be summarized into main findings and illustrative statements, which will be analyzed by the NAEP questionnaire development team. The 2020 cognitive interview results will be used to improve the tested survey items and to inform which items should be administered during the 2021 assessments.
Educational Testing Service (ETS) is the item development, data analysis, and reporting contractor for NAEP and will develop the items, analyze results, and draft a report of the results. ETS research scientists will recruit participants and administer the cognitive interviews. CRP, Incorporated is the NAEP logistics contractor and will support the recruitment and payment of some participants.
Throughout the item and interview protocols development processes, effort has been made to avoid asking for information that might be considered sensitive or offensive.
To encourage participation in a 45-minute cognitive interview session, a $25 virtual gift card will be offered to each student who participates, as a thank-you for their time and effort.
Teachers and school administrators from the NCES Principals and Teachers panels and NAEP expert standing committees will be offered a $250 gift card from a major credit card company (e.g., Visa), which is commensurate with the rate paid for their expertise as committee members and in line with incentives previously offered to major stakeholders (see the NAEP 2015 National Indian Education Study (NIES) Report Focus Group package (OMB #1880-0542) from April 2019). Participants will be paid by ETS or CRP, Incorporated, depending on which organization has a pre-existing relationship with them.
The study will not retain any personally identifiable information. Prior to the start of the study, participants will be notified that their participation is voluntary and that all of the information they provide may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law (20 U.S.C. §9573 and 6 U.S.C. §151).
Before students can participate in the study, written consent will be obtained from the parents or legal guardians of students under 18 years of age. Participants will be assigned a unique student identifier (ID), which will be created solely for data file management and used to keep all participant materials together. The participant ID will not be linked to the participant's name in any way. The consent forms, which include the participant's name, will be separated from the participant interview files, secured for the duration of the study, and destroyed after the final report is released. Cognitive interviews may be audio recorded. The only identification included on the recorded files will be the participant ID; the files will be secured for the duration of the study and destroyed after the final report is completed.
The estimated burden for recruitment assumes attrition throughout the process.3 All student cognitive interviews will be scheduled for no more than one hour (45 minutes), and teacher and school administrator pretesting activities will entail 2-3 hours. Table 3 details the estimated burden for the survey questionnaire pretesting activities.
Table 3. Estimated Hourly Burden for Students, Teachers, and School Administrators for the Coronavirus Pandemic Pretesting Activities
Respondent | Number of Respondents | Number of Responses | Hours per Respondent | Total Hours
Recruitment | | | |
Student Recruitment via Youth Organizations | | | |
Initial contact | 20 | 20 | 0.05 | 1
Follow-up and identify students | 15+ | 15 | 1.0 | 15
Sub-Total | 20 | 35 | | 16
Parent or Legal Guardian for Student Recruitment | | | |
Initial contact | 99 | 99 | 0.05 | 5
Follow-up via phone | 66+ | 66 | 0.15 | 10
Consent and confirmation | 33+ | 33 | 0.15 | 5
Sub-Total | 99 | 198 | | 20
Teacher and School Administrator Recruitment | | | |
Initial contact | 75 | 75 | 0.05 | 4
Follow-up via phone or e-mail | 50+ | 50 | 0.15 | 8
Consent and confirmation | 40+ | 40 | 0.15 | 6
Sub-Total | 75 | 165 | | 18
Participation | | | |
Cognitive Interviews | | | |
Students | 20** | 20 | 0.75 | 15
Sub-Total | 20 | 20 | | 15
Pretesting | | | |
Teachers | 20+** | 20 | 3.0 | 60
School Administrators | 20+** | 20 | 3.0 | 60
Sub-Total | 40 | 40 | | 120
Total Burden | 214 | 458 | | 189
+ Strictly a subset of the initial contact group
**Figure represents the maximum of the range of participants.
Note: Numbers have been rounded, which may affect totals.
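As an aid to the reader, the brief sketch below (a minimal illustration, not part of the clearance materials) shows how the recruitment counts in Table 3 follow from the attrition rates stated in footnote 3, and how total hours follow from respondents multiplied by hours per respondent. The helper function name and the rounding to whole respondents and whole hours are assumptions made for illustration only.

```python
# Illustrative sketch: reproduce the Table 3 recruitment counts from the
# attrition rates in footnote 3. Rounding to whole respondents is an
# assumption made for illustration.

def apply_attrition(initial_contacts, attrition_rates):
    """Apply successive attrition rates to an initial contact count."""
    counts = [initial_contacts]
    for rate in attrition_rates:
        counts.append(round(counts[-1] * (1 - rate)))
    return counts

# Parents/legal guardians: initial contact -> follow-up -> consent -> participating students
print(apply_attrition(99, [0.33, 0.50, 0.40]))  # [99, 66, 33, 20]

# Teachers or school administrators: initial contact -> follow-up -> confirmation -> participation
print(apply_attrition(75, [0.33, 0.20, 0.00]))  # [75, 50, 40, 40]

# Youth organizations: initial contact -> follow-up
print(apply_attrition(20, [0.25]))              # [20, 15]

# Burden hours are respondents times hours per respondent, rounded to whole hours
# (e.g., 99 x 0.05 = 4.95 -> 5; 20 x 0.75 = 15).
print(round(99 * 0.05), round(20 * 0.75))       # 5 15
```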
The total cost of the study is $45,500 as detailed in Table 4.
Table 4. Costs to the Federal Government
Activity | Provider | Estimated Cost
Design and prepare for cognitive interviews; administer cognitive interviews (including recruitment, incentive costs, data collection, data entry); analyze findings and prepare report | ETS | $43,000
Support recruitment and payment of participants | CRP | $2,500
The schedule for this study, including all activities, is provided in Table 5.
Table 5. Project Schedule
Activity (each activity includes recruitment, data collection, and analyses) | Dates
Student, teacher, and school administrator pretesting activity | June-July 2020
Cognitive interview report submitted | July-August 2020
1 Roach, A. T., & Sato, E. (2009). White paper: Cognitive interview methods in reading test design and development for alternate assessments based on modified academic achievement standards (AA-MAS). Dover, NH: Measured Progress and Menlo Park, CA: SRI International.
2 Please note that the 45 minutes includes time for introductions, conducting the interview, debriefing, and/or additional questions/feedback from the participants.
3 Based on our experiences in other similar NAEP studies, the estimated attrition rates for direct student participant recruitment are 33 percent from initial contact to follow-up, 50 percent from follow-up to confirmation, and 40 percent from confirmation to participation. We estimate the attrition rates for direct adult participant recruitment for this study are 33 percent from initial contact to follow-up, 20 percent from follow-up to confirmation, and no attrition from confirmation to participation. The estimated attrition rate for the initial youth organization contact for student identification is 25 percent from contact to follow-up.