
Trends in International Mathematics and Science Study (TIMSS:11) and Progress in International Reading Literacy Study (PIRLS:11)

ED Response to OMB Comments

OMB: 1850-0645


memorandum

to: Shelly Wilkie Martinez, OMB

from: Stephen Provasnik and Patrick Gonzales, NCES

subject: Responses to OMB Comments Received October 28, 2009

PIRLS:11 and TIMSS:11 Field Test emergency clearance, OMB# 1850-0645 v.5

date: October 29, 2009

cc: Kashka Kubzdela, NCES




1. Please clarify in SS A that the clearance request is (can be) only for activities from November through April (i.e., 6 months), so that the rest of the information presented on the field test instruments, the benchmarking study, etc., is for informational purposes but not for approval.

 

We will revise the language in the Preface of SS A to make this explicit. The revised passage reads:

 

…The international schedule calls for the joint TIMSS and PIRLS 2011 field test data collection in the United States to occur between March 1 and April 15, 2010, with the full-scale data collection scheduled to occur in April–May 2011. Included with this request are copies of the non-cognitive data collection instruments, which may be subject to minor editing; all of these will be finalized by the IEA by November 1, 2009. Other materials required for OMB approval, such as recruitment letters, brochures, and other advance materials explaining the studies, are included in Appendixes A and B of the accompanying documentation. Additional information, such as burden estimates for the main study and the state benchmarking study, is also included for informational purposes, but not for approval.

NCES is requesting that the Office of Management and Budget (OMB) approve on an emergency basis a six-month clearance (from November 2, 2009, through May 1, 2010) for the PIRLS/TIMSS March–April 2010 field test, including recruitment of selected schools, school districts, and state education agencies for the field test starting in November 2009.

  

2. Please clarify whether student IDs leave the school or not. We found some references that suggested no and others that suggested yes (see pages 12, 24, and the cover of the questionnaires). Please make the text more consistent.


Student names never leave the school; only student IDs appear on the test booklets. A secure list that matches student names to the student IDs stays at the school only during the data collection period and is destroyed when the data collection and verification for a given school are completed. At that point the personally identifiable student data are irretrievable.


In more detail, the student IDs are generated randomly through the Win3s sampling software used for the TIMSS and PIRLS studies. These IDs are placed on each booklet (with sticky labels) because every student identified as belonging to the sampled classroom is assigned a booklet, regardless of his or her eventual participation status. These IDs are unique identifiers that are connected to student names on a separate document, which remains with the School Coordinator in a secure location at the school. In some rare cases, the School Coordinator may need to clarify information with a participating student (or teacher); without the list of names linked to each unique student ID, this would be impossible. At the end of the assessment, booklets for those students who did not actually participate (due to absence or parent refusal, for example) are marked with a special code to indicate that the student did not participate, and all booklets are collected and sent for processing. At this point student IDs are on the booklets, but student names are not.


The cover of the student questionnaires (which has an “Identification Label” box with a place for “student ID” and “student name”) is the internationally designed cover (to which we added “OMB # to go here”). The U.S. sticky identification label fully covers this Identification Label box and does not have a student name. 


We will revise the text on p. 24 to make it clearer:


Names are associated with unique student identification numbers in lists given to the School Coordinator to ensure that missing information can be obtained through follow-up sessions, if necessary, and so that teachers are cognizant of which students participated in the TIMSS or PIRLS assessments. It is important to note that the names of students do not leave the school, and under no circumstances are the names of students or teachers included in the international or national database, nor is this information forwarded to any organization. After all data collection is complete, the School Coordinator is instructed to destroy the list of names associated with the unique IDs to ensure complete confidentiality and privacy of respondents, per NCES practice. Neither the contractor nor NCES retains these lists.

 

Under no circumstances are the names of any participants (student, teacher, or principal) included in any dataset or form that becomes part of the national or international dataset.  

 

3. Please clarify how ungraded classes or schools are treated in TIMSS and PIRLS.

 

Past experience indicates that ungraded schools/classrooms are quite rare in the TIMSS and PIRLS samples. For TIMSS, schools are asked to list all mathematics classes that have fourth- or eighth-graders as students, regardless of whether other-grade students are also in the same mathematics class. Schools are asked to do the same for PIRLS, listing instead all reading/language arts classes that have fourth-graders. Based on the sampling procedures devised by the international group, those classrooms are *eligible* to be sampled. If actually sampled, all students in the identified classrooms are asked to participate in the study. This could result in some students outside the defined grade levels (fourth or eighth grade) being included in the sample. Again, this is a rare occurrence in the relatively small school samples selected for TIMSS and PIRLS.

  

4. Why is NCES limiting cognitive labs to no more than 9? We would much prefer "sufficient" testing under the NCES Generic clearance.

  

Based on internal NCES conversations, we are limiting the number of participants in the proposed cognitive labs to 9 or fewer so that the labs fall below the Paperwork Reduction Act threshold for clearance and approval (OMB clearance is required for collections of information from ten or more persons). We included this request with the TIMSS and PIRLS clearance package because we understood that all aspects of the proposed studies should be included, rather than seeking clearance and approval for different aspects of the studies through different means. We can explore conducting the cognitive labs for TIMSS and PIRLS under the NCES generic clearance if that is the preferred approach. However, the use of cognitive labs (as explained in B.4) is only an option if questions arise about the performance of an item in the field test.

   

5. Please clarify who sends the "sample notification letters" to parents. It would appear to be a school administrator, but then the confidentiality language of "the data provided by your schools, staff, and student may…" doesn't seem to fit.

 

The letters will be provided to the School Coordinator, who will then distribute them or ask school-based personnel to do so (depending on the process in place at each school or district). The confidentiality language in the letters will be revised as follows:

 

All of the information collected is kept completely confidential, as required by law. NCES is authorized to conduct this study under the Education Sciences Reform Act of 2002 (Public Law 107-279, Section 153). Under that law, the data provided by schools, staff, and students may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose (Public Law 107-279, Section 183, and Title V, Subtitle A of the E-Government Act of 2002 (P.L. 107-347)). Students and schools are never identified in any reports. All reported statistics refer to the United States as a whole.

  

6. Since the email to get more information about PIRLS is a Westat (rather than government) email address, shouldn't Westat's role be made explicit in the Fact sheet?

 

Westat's role as the primary contractor for data collection is stated in the initial contact letters sent to states, districts, and schools. However, we can add information to the Fact sheets just before "Where can I find out more about [TIMSS]?" (with the question adjusted to each Fact sheet accordingly), as follows:

 

Who administers TIMSS?

The entire assessment is administered by trained staff from Westat, a research organization under contract to the U.S. Department of Education’s National Center for Education Statistics.

  

7. PIRLS grade 4 questionnaire, item 5: Please clarify what "i) Three or more cars…" is measuring, and please reassure us that there is not a "non-urban bias" inherent in it?

 

This item has been administered in prior TIMSS studies and is included to allow for trend analyses (by secondary researchers). Some research has shown that the presence of three or more cars is associated with families of lower socioeconomic status. As this is but one of several items that may be used to construct a composite SES variable, we propose retaining it for researchers who may be interested in pursuing more nuanced analyses. NCES does not use this item in its own analyses or reporting.

  

8. Do fourth graders know what "e.g." means? We noted it in question response categories.

 

Both TIMSS and PIRLS have regularly included items that use "e.g." for both fourth- and eighth-graders. To the best of our knowledge, there have been no reports of students having difficulty responding to these questions. Most assuredly, students do not know what "e.g." stands for in Latin (most adults do not), but there is nothing to suggest that this impedes their understanding that an example is being provided after "e.g." in the questions.

  

9. Do fourth graders know what "please specify" means? We noted it in question response categories.

  

We have no evidence that students at either grade level have difficulty understanding "please specify" in the instruments. Where it appears, the vast majority of students have provided appropriate responses, which suggests no difficulty in understanding.

  

10. School questionnaire, item 20 ("At which grade do the following reading skills …") – we find the headings for the response categories very confusing. Has this been cognitively tested?

 

11. Teacher questionnaire, item 1 – is this the standard NCES wording? We were thinking that there was usually some caveat about excluding student teaching, etc.

 

This response addresses questions 10 and 11.


The response categories for all questionnaire items are devised at the international level through a year-long consensus-building process. Participating countries can submit suggested revisions to the PIRLS or TIMSS questionnaires between 12 and 6 months before the international versions of the questionnaires are finalized, and before NCES is able to seek the first OMB clearance for the study. The United States does not have control over the final content and formatting of the international portions of the questionnaires, but it can submit comments during that early period.


NCES routinely makes suggestions for the improvement of the instruments at international study meetings, based on internal discussions as well as discussions with consultants. However, not all of these suggestions are adopted by the international group.  Once the questionnaires are "set" by the international group, it is not possible for NCES to alter the international portions of the instruments. (As shown in the draft questionnaires, there are items that are inserted by NCES for analyses of national interest; these items can be revised up to the time that NCES submits the instruments for international verification and finalization.) However, any suggestions for improvement of the instruments from OMB can be raised by NCES at future meetings for the next cycle of each study.


Once the questionnaire content is set, participating countries can suggest national adaptations (i.e., translations that do not change the content but use country-appropriate phrasing, e.g., replacing "lift" with "elevator" for the U.S.). Once the questionnaires are finalized, no participating country is at liberty to make any content or formatting changes, as this would prevent direct comparisons between countries.
