
December 2010




Education Longitudinal Study: 2002
(ELS:2002)


Third Follow-up 2011 Field Test


OMB Supporting Statement

Part A









OMB# 1850-0652 v.7






National Center for Education Statistics

Institute of Education Sciences

U.S. Department of Education





TABLE OF CONTENTS


LIST OF APPENDIXES

Appendix 1: Draft Field Test Questionnaire

Appendix 2: Cognitive Labs Report Summary

Appendix 3: Commissioned Paper: Randall J. Olsen

Appendix 4: Commissioned Paper: Michael Shanahan

Appendix 5: Commissioned Items: Robert Lent

Appendix 6: Data Collection Materials (Brochure, Lead Letters)

Appendix 7: Linkages to Extant Data Sources

Appendix 8: Reliability Re-interview Questionnaire



Preface

This request concerns the third follow-up of the Education Longitudinal Study: 2002 (ELS:2002), an ongoing longitudinal study with a field test in 2011 and a full-scale data collection in 2012. This document requests clearance for field test and main study data collection activities and supplements earlier requests concerned with direct locating and contacting of individual respondents or their parents. ELS:2002 is being conducted by RTI International under contract to the U.S. Department of Education (Contract number ED-04-CO-0036/0004).

More specifically, this clearance request is made to obtain Office of Management and Budget (OMB) approval for the field test questionnaires and for incentive experiments to be implemented in the data collection phase of the project. The request document includes estimated burden to respondents for the field test and full-scale studies. Additionally, the document contains a request for a waiver of a 60-day Federal Register notice for the full-scale study clearance to be submitted in 2011. It should be noted that generic clearance for cognitive testing of new and revised questionnaire items was requested separately, in a June 2010 submission (field test) and may be requested, if needed, in the September 2011 submission (full-scale) under OMB# 1850-0803.

The ELS:2002 study involves computer-assisted data collection (web, telephone, and field) with sample members who participated in the base-year or first follow-up ELS:2002 study (a subset of whom also participated in the second follow-up). The study may also involve the collection of financial aid information and postsecondary education transcripts for the cohorts in 2013–14. If the two optional components are approved, full details will be submitted to OMB in a future clearance package.

In this supporting statement, we report the purposes of the study, review the overall design, describe the field test and full-scale data collection procedures, and explain how the collected information addresses the statutory provisions of the Education Sciences Reform Act of 2002 (P.L. 107-279). Subsequent sections of this document respond to OMB instructions for preparing supporting statements. Section A addresses OMB’s specific instructions for justification and provides an overview of the study’s design. The draft questionnaire is appended to this submission and is represented by topic area in the justifications portion of Section A. Section B describes the collection of information and statistical methods.

A. Justification of the Study

A.1 Circumstances Making Collection of Information Necessary

A.1.a Purpose of This Submission

The materials in this document support a request for clearance to conduct the third follow-up of ELS:2002. The basic components and key design features of ELS:2002 are summarized below.

Base Year

  • Baseline survey of high school sophomores, in spring term 2002 (field test in spring term 2001).

  • Assessments in Reading and Mathematics.

  • Parents, English teachers, and math teachers were surveyed in the base year; school administrator questionnaires were also collected.

  • Additional components for this study included a school facilities checklist and a media center (library) questionnaire.

  • Sample sizes were about 750 schools and approximately 17,600 students (15,300 base-year respondents). Schools were the first-stage unit of selection, with sophomores randomly selected within schools.

  • Oversampling of Asian American students and of private schools.

  • Design linkages (test concordances) with other assessment programs: the Program for International Student Assessment (PISA), the National Assessment of Educational Progress (NAEP), and test score reporting linkages to the prior longitudinal studies.

First Follow-up

  • Follow-up in spring 2004 (spring 2003 for field test), when most sample members were seniors, but some were dropouts or enrolled in other grades.

  • Student questionnaires, dropout questionnaires, cognitive tests, and school administrator questionnaires administered.

  • Returned to the same schools for data collection, but separately followed transfer students.

  • Sample members who were no longer in school were followed by telephone (computer-assisted telephone interview; CATI) or field (computer-assisted personal interview; CAPI) data collection.

  • Freshening for a nationally representative senior cohort.

  • High school transcript component in fall/winter 2004–05 (2003–04 for field test).

Second Follow-up

  • Follow-up in spring 2006 (spring 2005 for field test) using web-based self-administered instrument with telephone (CATI) and field (CAPI) data collection for nonresponse follow-up.

  • Focus on transition to postsecondary education, labor force participation, and family formation, with emphasis on postsecondary access and choice.

Third Follow-up

  • Follow-up in summer 2012 (summer 2011 field test) using web-based self-administered instrument with telephone (CATI) and field (CAPI) data collection for nonresponse follow-up.

  • Options may be exercised to collect postsecondary transcripts and financial aid records.

  • Focus on postsecondary education, labor force participation, and family formation, with emphasis on college persistence and attainment.

The third follow-up study will provide data to map and understand the outcomes of the high school cohorts’ transition to adult roles and statuses at about age 26. For the cohort as a whole, the third follow-up will obtain information that will permit researchers and policymakers to better understand issues of postsecondary persistence and attainment, as well as sub-baccalaureate (and, to a more limited degree, baccalaureate) rates of economic and noneconomic return on investments in education. The third follow-up will also provide information about high school completion (for students who dropped out or were held back) and the status of dropouts, late completers, and students who have obtained an alternative credential, such as the GED. Finally, for both college-bound and non–college-bound students, the third follow-up will map their labor market activities and family formation.

For many cohort members, complex pathways, with alternative timings and durations for work and postsecondary enrollment, will be followed at this point of transition. In the 6-year period since the previous round, a sample member may both have worked and attended school, either serially or simultaneously; a cohort member may have attended school part-time or full-time and combined education and work spells with marriage and family formation. The singular strength of longitudinal studies is their power to provide data on transitions that are both complex and of some duration. The transition from adolescence to adult roles—in particular, the transitions to and through postsecondary education, to labor force activity, and to family formation—is of this very type.

A.1.b Legislative Authorization

The National Center for Education Statistics (NCES) of the Institute of Education Sciences (IES), U.S. Department of Education, is conducting this study, as authorized under Section 151 of the Education Sciences Reform Act of 2002 (P.L. 107-279), which states:

  1. Establishment—There is established in the Institute a National Center for Education Statistics (in this part referred to as the “Statistics Center”).

  2. Mission—The mission of the Statistics Center shall be—

    1. to collect and analyze education information and statistics in a manner that meets the highest methodological standards;

    2. to report education information and statistics in a timely manner; and

    3. to collect, analyze, and report education information and statistics in a manner that—

      1. is objective, secular, neutral, and nonideological and is free of partisan political influence and racial, cultural, gender, or regional bias; and

      2. is relevant and useful to practitioners, researchers, policymakers, and the public.

  3. General Duties—The Statistics Center shall collect, report, analyze, and disseminate statistical data related to education in the United States and in other nations, including—

    1. collecting, acquiring, compiling (where appropriate, on a state-by-state basis), and disseminating full and complete statistics (disaggregated by the population characteristics described in paragraph (3)) on the condition and progress of education, at the preschool, elementary, secondary, postsecondary, and adult levels in the United States, including data on—

      1. state and local education reform activities;

      2. state and local early childhood school readiness activities;

      3. student achievement in, at a minimum, the core academic areas of reading, mathematics, and science at all levels of education;

      4. secondary school completions, dropouts, and adult literacy and reading skills;

      5. access to, and opportunity for, postsecondary education, including data on financial aid to postsecondary students;

      6. teaching, including—

        1. data on in-service professional development, including a comparison of courses taken in the core academic areas of reading, mathematics, and science with courses in noncore academic areas, including technology courses; and

        2. the percentage of teachers who are highly qualified (as such term is defined in section 9101 of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 7801)) in each state and, where feasible, in each local educational agency and school;

      7. instruction, the conditions of the education workplace, and the supply of, and demand for, teachers;

      8. the incidence, frequency, seriousness, and nature of violence affecting students, school personnel, and other individuals participating in school activities, as well as other indices of school safety, including information regarding—

        1. the relationship between victims and perpetrators;

        2. demographic characteristics of the victims and perpetrators; and

        3. the type of weapons used in incidents, as classified in the Uniform Crime Reports of the Federal Bureau of Investigation;

      9. the financing and management of education, including data on revenues and expenditures;

      10. the social and economic status of children, including their academic achievement;

      11. the existence and use of educational technology and access to the Internet by students and teachers in elementary schools and secondary schools;

      12. access to, and opportunity for, early childhood education;

      13. the availability of, and access to, before-school and after-school programs (including such programs during school recesses);

      14. student participation in and completion of secondary and postsecondary vocational and technical education programs by specific program area; and

      15. the existence and use of school libraries;

    2. conducting and publishing reports on the meaning and significance of the statistics described in paragraph (1);

    3. collecting, analyzing, cross-tabulating, and reporting, to the extent feasible, information by gender, race, ethnicity, socioeconomic status, limited English proficiency, mobility, disability, urban, rural, suburban districts, and other population characteristics, when such disaggregated information will facilitate educational and policy decisionmaking;

    4. assisting public and private educational agencies, organizations, and institutions in improving and automating statistical and data collection activities, which may include assisting state educational agencies and local educational agencies with the disaggregation of data and with the development of longitudinal student data systems;

    5. determining voluntary standards and guidelines to assist state educational agencies in developing statewide longitudinal data systems that link individual student data consistent with the requirements of the Elementary and Secondary Education Act of 1965 (20 U.S.C. 6301 et seq.), promote linkages across states, and protect student privacy consistent with section 183, to improve student academic achievement and close achievement gaps;

    6. acquiring and disseminating data on educational activities and student achievement (such as the Third International Math and Science Study) in the United States compared with foreign nations;

    7. conducting longitudinal and special data collections necessary to report on the condition and progress of education;

    8. assisting the Director in the preparation of a biennial report, as described in section 119; and

    9. determining, in consultation with the National Research Council of the National Academies, methodology by which states may accurately measure graduation rates (defined as the percentage of students who graduate from secondary school with a regular diploma in the standard number of years), school completion rates, and dropout rates.

Activities for ELS:2002 are included in Part 1 (A, C–K, M–O), Part 2, Part 3, Part 6, and Part 7.

The Center assures participating individuals and institutions that any data collected under the ELS:2002 study shall be in total conformity with NCES’s standards for protecting the privacy of individuals. Section 183 states that:

  1. In General—All collection, maintenance, use, and wide dissemination of data by the Institute, including each office, board, committee, and center of the Institute, shall conform with the requirements of section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).

  2. Student Information—The Director shall ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provisions Act (20 U.S.C. 1232g, 1232h).

  3. Confidentiality Standards—

    1. In General—

      1. The Director shall develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data under this title.

      2. This section shall not be construed to protect the confidentiality of information about institutions, organizations, and agencies that receive grants from, or have contracts or cooperative agreements with, the federal government.

    2. Prohibitions—No person may—

      1. use any individually identifiable information furnished under this title for any purpose other than a research, statistics, or evaluation purpose;

      2. make any publication whereby the data furnished by any particular person under this title can be identified; or

      3. permit anyone other than the individuals authorized by the Director to examine the individual reports.

Any person who uses any data provided by the Director, in conjunction with any other information or technique, to identify any individual student, teacher, administrator, or other individual, and who knowingly discloses, publishes, or uses such data for a purpose other than a statistical purpose, or who otherwise violates subparagraph (A) or (B) of subsection (c)(2), shall be found guilty of a class E felony and imprisoned for not more than 5 years, or fined as specified in section 3571 of title 18, United States Code, or both.

The confidentiality of ELS:2002 data is further regulated by the E-Government Act of 2002, as well as the Privacy Act of 1974 and the Computer Security Act of 1987.

A.1.c Prior and Related Studies

In 1970, NCES initiated a program of longitudinal high school studies. Its purpose was to gather time-series data on nationally representative samples of high school students that would be pertinent to the formulation and evaluation of educational policies.

Starting in 1972 with the National Longitudinal Study of the High School Class of 1972 (NLS:72), NCES began providing longitudinal data to educational policymakers and researchers that linked educational experiences with later outcomes such as early labor market experiences and postsecondary education enrollment and attainment. The NLS:72 cohort of high school seniors was surveyed five times (in 1972, 1973, 1974, 1979, and 1986). A wide variety of questionnaire data were collected in these follow-up surveys, including data on students’ family background, schools attended, labor force participation, family formation, and job satisfaction. In addition, postsecondary transcripts were collected.

Almost 10 years later, in 1980, the second in a series of NCES longitudinal surveys was launched, this time starting with two high school cohorts. High School and Beyond (HS&B) included one cohort of high school seniors comparable to the seniors in NLS:72. The second cohort within HS&B extended the age span and analytical range of NCES’s longitudinal studies by surveying a sample of high school sophomores. With the sophomore cohort, information became available to study the relationship between early high school experiences and students’ subsequent educational experiences in high school. For the first time, national data were available showing students’ academic growth over time and how family, community, school, and classroom factors promoted or inhibited student learning. In a leap forward for educational research, researchers, using data from the extensive battery of cognitive tests within HS&B, were also able to assess the growth of cognitive abilities over time. Moreover, data were now available to analyze the school experiences of students who later dropped out of high school. These data became a rich resource for policymakers and researchers over the next decade and provided an empirical base to inform the debates of the educational reform movement that began in the early 1980s. Both cohorts of HS&B participants were resurveyed in 1982, 1984, and 1986. The sophomore cohort was also resurveyed in 1992. Postsecondary transcripts also were collected for both cohorts.

The third longitudinal study of students sponsored by NCES was the National Education Longitudinal Study of 1988 (NELS:88). NELS:88 further extended the age and grade span of NCES longitudinal studies by beginning the data collection with a cohort of eighth-graders. Along with the student survey, it included surveys of parents, teachers, and school administrators. It was designed not only to follow a single cohort of students over time (as had NCES’s earlier longitudinal studies NLS:72 and HS&B), but also, by “freshening” the sample at each of the first two follow-ups, to follow three nationally representative grade cohorts over time. Eighth-grade, 10th-grade, and 12th-grade cohorts were thus included in the study series. This not only provided comparability of NELS:88 to existing cohorts, but it enabled researchers to conduct both cross-sectional and longitudinal analyses of the data. Additionally, in 1993, high school transcripts were collected for each student, further increasing the analytic potential of the survey system. Consequently, NELS:88 represents an integrated system of data that tracked students from middle school through secondary and postsecondary education, labor market experiences, and marriage and family formation.

In design, ELS:2002 recapitulates the sophomore cohort of HS&B. However, in terms of the richness of its contextual data sources, particularly its coverage of school-level, curricular, and home environmental factors, ELS:2002 is most similar to NELS:88, and for this reason a more detailed description of the 1988 study is provided below.

The base-year survey for NELS:88 was carried out during the spring semester of the 1987–88 academic year. The study employed a clustered, stratified national probability sample of 1,052 public and private eighth-grade schools. Almost 25,000 students across the United States participated in the base-year study. Questionnaires and cognitive tests were administered to each student in the NELS:88 base year. The student questionnaire covered school experiences, activities, attitudes, plans, selected background characteristics, and language proficiency. School principals completed a questionnaire about the school; two teachers of each student were asked to answer questions about the student, about themselves, and about their school; and one parent of each student was surveyed regarding family characteristics and student activities.

The first follow-up of NELS:88, conducted in 1990, 2 years after the base-year study, included the same components as the base-year study, with the exception of the parent survey. Additionally, a “freshened” sample was added to the student component to achieve a representative sample of the nation’s sophomores. Some 18,221 students participated (of 19,363 selected), with 1,043 dropouts taking part (of 1,161 identified), for a total of 19,264 participating students and dropouts. In addition, 1,291 principals took part in the study, as did nearly 10,000 teachers.

The second follow-up for the cohort took place early in 1992, when most sample members were in the second semester of their senior year of high school. The second follow-up provided a culminating measurement of learning in the course of secondary school, and also collected information that facilitated the investigation of the transition into the labor force and postsecondary education after high school. Because the NELS:88 longitudinal sample was freshened to represent the 12th-grade class of 1992, trend comparisons were possible between the senior cohorts from the 1972, 1980, and 1982 school years from NLS:72 and HS&B. The NELS:88 second follow-up resurveyed students who were identified as dropouts in 1990, and identified and surveyed the additional students who had left school since the prior wave.

NELS:88/1994, the third follow-up wave of the eighth-grade class of 1988, took place during the spring semester of the 1993–94 school year. In 1994, most of the sample members had already graduated from high school, and many had begun postsecondary education or entered the workforce. The study addressed issues of employment and postsecondary access, and was designed to allow continuing trend comparisons with other NCES longitudinal studies. For the first time in the sequence of NELS:88 studies, the primary form of data collection was individual CATI interviews, with personal interviews completed with selected respondents requiring intensive tracking and nonresponse refusal conversion.

The fourth follow-up of the eighth-grade class of 1988 (NELS:88/2000) interviewed the sample cohort in the spring and summer of 2000 when the respondents were typically 25 to 26 years old, approximately 12 years after the base-year data collection. Postsecondary transcripts for this cohort were collected primarily in the autumn of 2000, with the last cases worked early in 2001. Data collection commenced approximately 6 years after the last contact with the sample, enabling researchers to explore a new set of educational and social issues about the NELS:88 respondents. At the time of the fourth follow-up, most of the participants in the various cohorts of NELS:88 had been out of high school for 8 years. At this age, most students who intend to enroll in postsecondary education have done so. A large proportion had achieved an undergraduate degree by 2000; some had completed graduate or professional programs. A postsecondary transcript component was added to NELS:88/2000 to collect the educational records of sample members who entered postsecondary education. By then many of these young people had married and had children of their own, some were divorced, some had become successful in the marketplace, and some were still struggling to transition to the work force and to develop their own career.

HSLS:09. Finally, although not a prior study, the High School Longitudinal Study of 2009 is a related NCES study, and indeed, the successor study to ELS:2002. It began with a nationally representative sample of public and private schools in the fall of 2009, and a student sample of entering high school freshmen. HSLS:09 ninth-graders will be resurveyed in 2012, 2013, 2015, and 2021. The base-year survey included a survey and math assessment of students as well as surveys of school administrators, counselors, science teachers, math teachers, and parents. The first follow-up includes another survey and math assessment for the same students (some of whom may have transferred or left school entirely). It also will include surveys of school administrators, counselors, and parents. HSLS:09 is similar in its objectives to the other high school longitudinal studies, but places greater emphasis on choice behaviors associated with coursetaking and careers in science, technology, engineering, and mathematics than did prior studies.

A.2 Purposes and Use of ELS:2002

A.2.1 Overview

ELS:2002 is intended to be a general purpose data set; that is, it is designed to serve multiple policy objectives. Broadly speaking, the third follow-up interview will focus on postsecondary education, work experiences, family formation, community involvement, and other life course outcomes. Topics related to education will build on the theme of collecting data on access to postsecondary education that was initiated in the second follow-up, where extensive data on all college applications submitted by sample members were obtained. The third follow-up focus includes a range of new issues concerning students’ attainment in postsecondary education. These new data will include, if the financial aid supplement is funded, information about the amounts and types of student aid received from various sources over the entire college experience, and, if the transcript collection option is exercised, a complete record of the courses sample members enrolled in, and the grades they received, at all colleges attended. New data will also be collected, through job summary measures, on the dynamics of respondents’ employment and their progress in finding and forming a promising career. In addition, special attention will be given to high school dropouts’ progress toward a high school diploma, GED, or other equivalency, including GED test score information. Because some sample members will have chosen not to continue their education in the 8 years following high school, a series of questions will focus on experiences in the workforce. Yet, because another group of respondents will have been going to school and working, work and educational summaries must be collected, covering the 6 years since the last interview. In addition to collecting factual information about educational enrollments and work experiences, the interview will collect information on respondents’ basic life goals.
As sample members turn 26 years of age, the modal age of participants at the time of the interview, marriage and parenthood become more common. Therefore, the third follow-up is the appropriate time to determine which participants have started forming families. With regard to community involvement, participation in volunteer work and the political process will be examined. All of these outcomes must be collected in this round within a relatively brief (35-minute) interview.

The objectives of ELS:2002 also encompass the need to support both longitudinal and cross-cohort analyses, and to provide a basis for important descriptive cross-sectional analyses. ELS:2002 is first and foremost a longitudinal study, hence survey items are chosen for their usefulness as outcome measures, particularly in the context of previously collected predictor items. At the same time, ELS:2002 content should, to the extent possible, be kept comparable to that of the prior NCES high school studies, to facilitate cross-cohort comparisons (for example, trends over time can be examined by comparing 1980, 1990, and 2002 high school sophomores; or 1972, 1980, 1982, 1992, and 2004 high school seniors). The 2012 (third follow-up) round of ELS:2002 can be compared to the year 2000 round of NELS:88, when cohorts from both studies will be, typically, 8 years beyond high school graduation.

It should be noted that a fourth follow-up (at a modal age of about 31, in 2017) is currently contemplated, subject to available funding. If another round is pursued, it would make possible far deeper analyses of emergent careers and of rates of return on investments in education. A fourth follow-up would also provide for comparisons between ELS:2002 and the final data collection of NLS:72 (1986, at about age 32).



A.2.2 Content Justification

While past studies (including both the prior ELS:2002 rounds and the NELS:88/2000 study of 26-year-olds) provide a rich source of questions of policy interest, planning for the ELS:2002 third follow-up has also sought to update the study by exploring possible new topics. In developing the questionnaire for this round, which is included with this submission as Appendix 1, important extra steps were taken to solicit systematic input from distinguished researchers. Specifically, two reviews were commissioned, one reflecting the perspective of longitudinal research in labor economics and the other that of life course theory. Additionally, new items were written by a specialist to reflect recent developments in social-cognitive career theory. The focus of the systematic reviews was research topic prioritization and identification of new items. The reviews were conducted by an economist, Randall J. Olsen of Ohio State University, and a sociologist, Michael Shanahan of the University of North Carolina at Chapel Hill. The new items drawing on social-cognitive career theory were written for the study by Professor Robert Lent of the University of Maryland. Their contributions are appended to this submission.

The questionnaire and its justifications are presented in terms of key research areas. The research areas reflect the research agenda for the third follow-up study, based both on the precedent of the prior secondary longitudinal studies and new considerations and measures, as identified by the commissioned papers, Technical Review Panel (TRP), and project staff.

Status: Current Activities

The first questionnaire area (not in itself a research area, but a description of respondent status) is current activities. The activity module defines “current” in two ways. First, it will ask about the respondent’s main activities at the time of the interview. Second, it will ask about the respondent’s main activities as of June 30, 2012 (or 2011 for the field test). The interview date is used as one reference point for this set of data elements because it minimizes recall error. Information on current/June 30 activities provides a summary measure of status vis-à-vis the common transitional pathways. This section will provide the foundation and path logic for much of the remainder of the survey instrument. The items within it reflect both time use (main activity) and a means of routing each respondent through the questionnaire according to these activities.

Although on the surface it might seem somewhat duplicative, the rationale for also asking about activity as of a specific date (June 30, 2012) is to anchor the snapshot of activities to a single point in time shared by all sample members. Linking to a universal common date is a tack that has been taken in the NCES secondary longitudinal studies from NLS:72 onward; it helps the analyst adjust for the divergent calendars that arise from a lengthy data collection period, in which the first and last interviews may be separated by 6 to 7 months. As noted above, information in the current activity module will be useful in subsequent sections for identifying important subsets of the population and their distinctive pathways through the transition from high school to adulthood. In prior surveys, this section has begun with an item about sample members’ current activity status as a student or employee, proceeded to the reference-date activities question, and then collected additional information about sample members who might be unemployed. This is a high-priority section of the questionnaire. It is preceded by the required formal PRA statement:

According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number of this information collection is 1850-0652 and it is completely voluntary. The time required to complete this information collection is estimated to average around 35 minutes per response. If you have any comments concerning the accuracy of the time estimate or suggestions for improving the interview, please write to: U.S. Department of Education, Washington, DC 20202-4537. If you have comments or concerns regarding the status of your individual interview, write directly to: Education Longitudinal Study (ELS), National Center for Education Statistics, 1990 K Street NW, Washington, DC 20006.


First, I would like to ask you some questions about your current activities. Are you currently…

Response options: Yes, No (for each)

  • Working for pay at a full-time job (35 hours/week or more)

  • Working for pay at a part-time job or jobs (less than 35 hours/week)

  • Taking vocational or technical courses at any school or college

  • Taking academic courses at a 2- or 4-year college, including graduate or professional schools

  • Serving in another work experience, such as an apprenticeship, training program, or internship

  • Full-time manager of my own household

  • Caring for dependent children and/or adults

  • Currently serving in the armed forces, active duty, reserves, or National Guard?

Were your work and school activities during the last week in June 2011 the same as they are now?

Response options: Yes, No

During the last week in June 2011 were you…

Response options: Yes, No (for each)

  • Working for pay at a full-time job (35 hours/week or more)

  • Working for pay at a part-time job or jobs (less than 35 hours/week)

  • Taking vocational or technical courses at any school or college

  • Taking academic courses at a 2- or 4-year college, including graduate or professional schools

  • Serving in another work experience, such as an apprenticeship, training program, or internship

  • Full-time manager of my own household

  • Caring for dependent children and/or adults

  • Currently serving in the armed forces, active duty, reserves, or National Guard?

Research Area: High School Completion

The second area of questionnaire content is the research area “high school completion.” This section is high priority for the relatively small (but highly policy-relevant) number of students who had not completed their secondary schooling (or its equivalent) by 2006, or whose high school completion status was not known as of the second follow-up. Completion of high school is one of the key milestones in a young person’s life. In earlier rounds, ELS:2002 captured dropouts and students who fell behind the modal grade progression of their cohort. High school completion is likely to continue through the third follow-up, whether through completing a GED or obtaining an adult high school diploma. We expect that by 2012 additional respondents will have completed high school, by either earning a diploma or an equivalency certificate. In this wave of the survey, we will update high school completion information for those who had not completed by 2006 or who were not interviewed in 2006.

Ancillary records source: For those students who completed high school by obtaining a GED, we will ask their reasons for completing their high school programs by this alternative path. Additional information may be obtained through linkage with the GED Testing Service, a tack that was followed in the prior round. (GED information could include test scores or pass/fail status.) For the questionnaires, standard questions are available from past rounds of ELS:2002 (which in many cases were also used in NELS:88) to address completion status and reasons for completion through an alternate route.

Have you received a high school diploma, certificate of attendance, or a GED or other equivalency certificate?

Response options: Yes, No

Are you currently working toward a GED or equivalent?

Response options: Yes, No

What type of high school diploma or certificate did you complete? Did you receive a…

Response options: diploma; certificate of attendance; GED or other equivalency certificate

In what month and year did you receive your [diploma/certificate of attendance/GED or other equivalency]?

  • Month

  • Year

How did you earn the GED or equivalency, or in other words, what program or school were you enrolled in, if any?

Response options: No program, you just took the exam; part of a job training program; enrolled through adult education; part of a child care program or early childhood program; some other program

From what state did you receive your GED or equivalency?

Why did you decide to complete your GED or equivalency? Was it…

Response options: Yes, No

  • To improve, advance, or keep up to date on your current job?

  • To train for a new job/career?

  • To improve basic reading, writing, or math skills?

  • To meet requirements for additional study?

  • Required or encouraged by your employer?

  • For personal, family, or social reasons?



Research Area: Postsecondary Enrollment

The second follow-up of ELS in 2006 provided detailed information on issues related to early access to postsecondary education relative to high school experiences and family background. The timing of the third follow-up is ideal for capturing degree attainment among students who enrolled in postsecondary education within a year or two of high school graduation. Because it takes an average of almost 6 years to complete a bachelor’s degree, conducting the survey 8 years after most ELS participants graduated from high school should capture most of the college graduates as well as many postsecondary education dropouts and stopouts.

It should be noted, however, that the 8-year period since high school graduation (or the 6-year period since the last interview) may present significant recall problems for persons asked to provide detailed information about numerous postsecondary education experiences. Collecting detailed information on schools attended, enrollment spans and periods of nonenrollment, degrees/certificates obtained, programs of study, and reasons for leaving without a degree or certificate may impose a significant data collection burden on sample members with multiple postsecondary experiences. Therefore, our approach to data collection will focus on using the interview to identify the names and locations (and, in turn, Integrated Postsecondary Education Data System [IPEDS] codes) of all postsecondary institutions attended by sample members since high school graduation (with school information collected during the ELS:2002 second follow-up preloaded and confirmed with respondents). We believe this approach will provide more valid and reliable information on student-reported enrollment events, while still providing the information required to collect postsecondary institution transcripts.

In sum, to achieve greater accuracy with decreased burden, instead of depending on the reports of sample members about enrollment and grades in detail, transcripts will be collected from all the colleges attended, and highly accurate enrollment and grades information derived from academic records. This same strategy was successfully applied in the postsecondary transcript collection of NELS:88/2000. On the basis of spring 2000 questionnaire information, transcript data were collected from initial postsecondary enrollment through any enrollment up to summer 2000. About 92 percent of requested transcripts were received. A minor limitation of this approach is that it excludes institutions not in IPEDS (e.g., foreign universities and military training programs) from which transcripts were neither requested nor obtained.

NCES has been collecting postsecondary transcripts for the high school longitudinal studies since 1984 (NLS:72). The NCES postsecondary longitudinal studies have collected them as well. ELS:2002/12 will take advantage of recent NCES postsecondary transcript efforts to use updated course coding lists, fully specified new derived variables, and revamped coding engines.

Coupling the interview data with postsecondary transcripts will provide a strong basis for investigating a number of important topic areas: ultimate educational attainment, the grade-measured quality of academic performance, educational persistence, intensity of enrollment, and transfer. However, comparatively few measures of the college experience were taken at the contemporaneous point in time represented by the ELS:2002 second follow-up (2006, or 2 years out of high school). The richness of the high school quality and experiential data is not matched in the postsecondary institutional or schooling context data (although IPEDS institutional data are of some help), and the individual experience of higher education, in all its diversity and importance, is hardly measured. This raises the question of whether, in 2012, some elements of the college experience might be captured retrospectively. We address this question by proposing, later in this document, a separate module of postsecondary educational experience items to complement the continuous longitudinal data on enrollment, grades, and so on, captured in the transcript study.

The postsecondary education enrollment section will also include items related to the less-than-4-year institutions that many of the cohort have attended. It will also avail itself of preloaded information from the 2006 interview, for those who participated. But for 2006 nonrespondents, the time frame will begin with high school completion.

Social-cognitive items were considered for the postsecondary education section but are no longer included. Owing to space limitations, the comparatively small proportion of the sample enrolled in postsecondary education in the third follow-up year, and that group’s lack of representativeness of the cohort as a whole (the modal college-attendance experience of the cohort occurred at the time of the 2006 interview), the payoff for such items would be modest, especially in comparison to the value of the social-cognitive career theory scale items to be asked of the employed.

[When we spoke with you in 2005, you indicated that you had attended [XXXX, XXXX, etc.] after high school. Since that time, have you attended any other college, university, or vocational, technical, or trade school for academic credit? / Since leaving high school, have you attended any college, university, or vocational, technical, or trade school for academic credit?]

Response options: Yes, No

[Schools we know about so far are: XXXX, XXXX, XXXX, etc.] What [other] college, university, or vocational, technical, or trade school have you attended since leaving high school?

Response options: text boxes for school name and city (plus dropdown list for state), along with an IPEDS coder

In what month and year did you first attend [XXXX]?

  • Month

  • Year

Schools we know about so far are: [XXXX]. Did you attend elsewhere?

Response options: Yes, No

Which institution did you last attend?

Response options: [all PS schools indicated in F2/F3]

In what month and year did you last attend [last/only PS school attended]?

  • Month

  • Year

Have you earned a degree or certificate from [last/only PS school attended]?

Response options: Yes, No

What types of degrees or certificates did you receive from [last/only PS school attended]? (Check all that apply)

Response options: Certificate; Associate’s Degree; Bachelor’s Degree; Master’s Degree; Ph.D. or equivalent (e.g., Ed.D., D.P.H.); Professional doctorate (M.D., J.D., LL.B., D.D.S., etc.)

In what month and year did you receive your [credential] from [last/only PS school attended]?

  • Month

  • Year

What was your primary major or program of study for your [credential] from [last/only PS school attended]?

Response options: textbox for major, plus CIP coder

Did you have a secondary major or program of study for your [credential] from [last/only PS school attended]?

Response options: Yes, No

What was your secondary major or program of study for your [credential] from [last/only PS school attended]?

Response options: textbox for major, plus CIP coder

[You have already told us about your [credential] from [XXXX]]. What other degrees or certificates, if any, do you have?

Response options: Do not have any other degrees or certificates; Certificate; Associate’s Degree; Bachelor’s Degree; Master’s Degree; Ph.D. or equivalent (e.g., Ed.D., D.P.H.); Professional doctorate (M.D., J.D., LL.B., D.D.S., etc.)

From what institution did you earn your [credential]?

Response options: [all PS schools indicated in F2/F3]

In what month and year did you receive your [credential] from [XXXX]?

  • Month

  • Year

What was your primary major or program of study for your [credential] from [XXXX]?

Response options: textbox for major, plus CIP coder

Did you have a secondary major or program of study for your [credential] from [XXXX]?

Response options: Yes, No

What was your secondary major or program of study for your [credential] from [XXXX]?

Response options: textbox for major, plus CIP coder



Research Area: The College Experience

A fourth content area pertains to the college experience. For what is likely to be a quite small proportion of sample members, the college experience will be contemporaneous with the interview. For the vast majority on the postsecondary educational pathway, however, the college experience must be explored through retrospective questions.

The comparative richness of the high school academic and institutional data in ELS:2002 and the comparative thinness of the data on the experience of schooling in a higher education context make for a marked contrast. Although students are influenced by their high school educational experience, postsecondary settings exert influence as well. The extent of students’ engagement in college education, and its perceived contributions to their overall development of knowledge and skills, may also relate to subsequent career development and other life course outcomes. The college experience is a research area that may well support substantial numbers of questions that are new to the secondary longitudinal studies series. The items that follow, drawn from the National Survey of Student Engagement, NELS F4, and/or B&B:93/03, gather information on high-impact college practices and reasons for college dropout and lack of persistence; they also inquire into respondents’ current beliefs about the value of their postsecondary education.

Which of the following did you do as a part of your postsecondary education?

Response options: Yes, No

  • Practicum, internship, field experience, co-op experience, or clinical assignment

  • Work on a research project with a faculty member outside of course or program requirements

  • Study abroad

  • Participated in a community-based project (e.g., service learning) as part of a regular course

  • Culminating senior experience (capstone course, senior project or thesis, comprehensive exam, etc.)

  • Participated in an Honors program

  • Participated in Case-Studies competition(s)

  • Participated in a program in which you were mentored

[You told me earlier that you are no longer enrolled in any school and that you did not obtain a degree or certificate. / You told me earlier that you had once attended a 4-year school, are no longer enrolled in any school, and that you did not obtain a 4-year degree.] Why did you leave school? (Check all that apply).

  • Done taking the desired classes

  • Financial reasons

  • Change in family status (e.g., marriage, baby, death in family)

  • Personal problems/injury/illness

  • Conflicts with demands at home

  • Academic problems

  • Not satisfied with program/school/campus/faculty

  • Classes not available/class scheduling not convenient

  • Job/military considerations

  • Moved from the area

  • Decided to take time off from studies

  • Enrollment doesn’t suit lifestyle/boredom with school

  • School/program closed/lost accreditation

How satisfied are you with the following aspects of your [highest credential earned] at [name of school at which it was earned]?

Response options: Likert scale

  • Faculty/teaching

  • Courses offered

  • Course availability

  • Career preparation

How important would you say your undergraduate education was in preparing you for the following aspects of your life?

Response options: Likert scale

  • Work and career

  • Further education

  • Establishing your financial security

  • Civic participation

  • Your overall quality of life

Research Area: Educational Aspirations and Expectations

A fifth content area is that of educational aspirations and expectations. Although there may not be a future round of data collection in which fulfillment of expectations can be measured, it is nevertheless of interest to know, in this final round, the degree to which the cohort holds further educational attainment expectations and aspirations. In prior waves, a question was asked about expected education at age 30. Regardless of whether the third follow-up is the final round of the study, this item (whether the respondent expects to get more education) remains analytically relevant. A major use of expectation items would be to address shifts in sample members’ educational aspirations and expectations (e.g., reasons for not completing their educational programs), although any such measures would need to be shown not to be notably subject to biases of ex post facto rationalization of behavior.

As things stand now, what is the highest level of education you ever expect to complete?

Response options:

  • Less than high school graduation

  • GED or other equivalency only

  • High school graduation only

  • Complete a 1- or 2-year program in a community college or vocational school

  • Graduate from college (4- or 5-year degree)

  • Obtain a Master’s degree or equivalent

  • Obtain a Ph.D., M.D., or other advanced degree

  • Don’t know

Research Area: Educational Debt and Finance

Educational debt is a key construct for ELS:2002, and it appears to be a critical area for the ELS:2002/12 survey, particularly in the context of the study’s potential financial records component. ELS:2002 data should enable research on the relationships among educational goals and attainment, course of study, financial aid received, career path and plans, family formation, wealth, and debt.

Questions on outstanding loan balances were not asked in the ELS:2002 second follow-up, although respondents were asked about their willingness to incur indebtedness through financial aid. Debt, then, can be viewed as a new priority area for the ELS:2002 third follow-up.

Olsen has made the following suggestion: “For students who incurred debt for their schooling, the final interview could be used to elicit how that debt is being repaid. That is, ask the R what fraction of the debt has been forgiven by one program or another, what fraction has been paid off by them or their family, and what fraction remains to be paid off.”

For the related matter of financial aid, the questionnaires should not attempt to elicit aid histories retrospectively from the student. Rather, these data will be obtained through a records supplement. A feasibility study is necessary, however, to assess the quality and completeness of data collected at some remove in time; in other words, a pilot is required to determine whether financial aid records are kept over a span of years such that they could serve ELS:2002/12. Financial aid data are desirable for examining access and choice, persistence, and attainment.

Other than money you may have borrowed from family or friends, did you take out any type of education loans to help pay for your education since high school?

Response options: Yes, No

How much of this amount that you borrowed do you still owe? (If you are unsure of the amount, provide your best estimate. If you have already repaid these loans in full, please enter ‘0’.)

How much do you pay each month for these loans? (If none, please enter 0.)

Has any of the student loan debt you have incurred been paid off by you or your family, or been forgiven by a loan forgiveness program? (Check all that apply.)

Response options: Paid none of the debt, Paid some of the debt, Paid all of the debt

  • You

  • Your family

  • Forgiven by a loan forgiveness program

In which of the following ways has your student loan debt influenced your employment plans and decisions? (Check all that apply.)

  • Took job outside field of study or training

  • Took less desirable job

  • Had to work more hours than desired

  • Had to work more than one job at the same time

  • Other

  • Student loan debt has not influenced my employment plans or decisions

Research Area: Military Occupations

Military job training and occupational roles potentially require some special questions. Although active duty was gathered as a past or present status, by design, NELS:88 did not explore military service as a job or career experience; only the civilian labor market was so treated. This approach may be difficult to justify, however, given the role of the armed services in employment and training for a cohort coming of age in wartime. In any event, for those formerly or currently in the military, it is desirable to collect information on location of service, branch of the military, component, entry time (and exit time, if applicable), and military pay grade. For this group, on-the-job training is of central importance.

Have you ever been in the military?

Response options: Yes, No

[Was your military service/Has your military service been] in the U.S., outside the U.S., or both?

Response options: Service in the U.S.; Service outside the U.S.; Both

In which branches of the military have you served? You may select more than one answer. (Check all that apply)

  • Army

  • Air Force

  • Marine Corps

  • Navy

  • Coast Guard

In which branch are you currently serving?

Response options: Army; Air Force; Marine Corps; Navy; Coast Guard

In which component are you currently serving?

Response options: Active duty; Reserves; National Guard

In what month and year did your first military service begin?

  • Month

  • Year

In what month and year did your most recent military service end?

  • Month

  • Year

What is the highest military pay grade you have achieved?

Response options: E-1; E-2; E-3; E-4; E-5; E-6; E-7; E-8; O-1; O-1E; O-2; O-2E; O-3; O-3E; W-1; W-2

What is the total amount of time you (have) served on active duty? (If none, please enter 0.)

  • Years

  • Months

What is the total amount of time you (have) served in a combat zone? (If none, please enter 0.)

  • Years

  • Months

Research Area: Employment

A further content area for the study is current or most recent job. One of the primary goals of ELS:2002 has been the collection of information about young people’s entry into the labor force, and especially the examination of information on the longer-term individual and institutional effects of secondary and postsecondary education, dropout and stopout behaviors, and aspirations.

Three constructs in particular seem to mark distinctive features of employment. One is current job (or most recent). A second is the notion of career, both as a status and as an animating plan. A third is employment history.

Capturing employment, both for ELS participants who did not enroll in postsecondary education (“non–college bound”) and for those who did enroll, is important to better understand the rate of economic (and noneconomic) return to individuals and society for various levels of education. In addition to determining the employment outcomes of the non–college-bound population, examining the early labor market experiences of ELS participants who obtained postsecondary education, from short-term vocational credentials to advanced degrees, will help researchers and policymakers discern the benefits of various levels of postsecondary education. Although the economic returns of a bachelor’s degree relative to a high school diploma, both in terms of occupations and earnings, have been well documented, the benefits of sub-baccalaureate credentials have been less clearly demonstrated, and many noneconomic benefits could usefully be documented as well.

Job benefits (both monetary and nonmonetary) are a further dimension of employment, including medical insurance, retirement plans, intellectual challenges, and earnings.

Since high school, have you ever held a job for pay?

Response options: Yes, No

You mentioned before that you are not currently working for pay at [a full-time / either a full-time or a part-time] job.

Do you want a [full-time / full- or part-time] job for pay at this time?

Response options: Yes, No

[Including your military service, how / How] many full-time and part-time jobs have you held for pay since you last attended high school?

  • Number of full-time jobs (35 hours/week or more) since last attending high school

  • Number of part-time jobs (less than 35 hours/week) since last attending high school

[Including your active duty military service, in / In] what month and year were you last working for pay?

  • Month

  • Year

Do you currently have more than one [full-time/part-time/military] job?

Response options: Yes, No

Altogether, how many full-time and part-time jobs do you currently have?

  • Current number of full-time jobs (35 hours/week or more)

  • Current number of part-time jobs (less than 35 hours/week)

[I would like you to answer the following questions for your primary/most important/military job. For your primary job, what is your job title? / For your most recent job, what was your job title?]
What [do/did] you do as an [XXXX]?

Response options: A textbox for job title and another for job duties, along with an O*NET coder

In what month and year did you begin your [job as a XXXX/primary job/current job/military job/most recent job]?

  • Month

  • Year

For your [job as a XXXX/primary job/current job/military job/most recent job] [are/were] you working for yourself or someone else?

Response options: Self-employed, Someone else

What type of organization or business [employs/employed] you? [Is/Was] it a…

Response options: Private, for-profit, company; Nonprofit or not-for-profit company; Local government; State government; Federal government, including civilian employees of the military; Military, including National Guard

How many hours per week, in a typical week, do you currently work for pay in your [job as a XXXX/primary job/current job/military job/most recent job]?

[Now I would like you to consider all of your current jobs for pay.] How many hours per week do you work for pay in a typical week at these jobs?

Which one of the following four statements best describes your [job as a XXXX/primary job/current job/military job/most recent job]?

Response options: Someone else decided what you did and how you did it; Someone else decided what you did, but you decided how to do it; You had some freedom in deciding what you did and how you did it; You were basically your own boss

[Which of the following benefits [does/did] your [primary/current/most recent] employer offer? / As a self-employed [XXXX], which of the following do you have? / In your military position, which of the following do you have?] (Check all that apply.)

  • Medical insurance and/or other health insurance such as dental or vision

  • Life insurance

  • Retirement or other financial benefits, such as a 401(k)/403(b)

Please indicate to what extent the following job characteristics [apply/applied] to your [job as a XXXX/primary job/current job/military job/most recent job].

Response options: 1=Not at all an aspect of the job; 2; 3; 4; 5=Very much an aspect of the job

  • Job security

  • Opportunity to learn new things

  • High earnings

  • New challenges

  • Enough time for leisure activities

  • Chance of doing something useful for society

  • Good chance to combine work with family tasks

Now we would like to ask you a few questions concerning your earnings at your [job as a XXXX/primary job/current job/most recent job]. For your [job as a XXXX/primary job/current job/most recent job], what is the easiest way for you to report your total earnings before taxes or other deductions? We use this information to compare the amount that people earn in different types of jobs.

Response options: per hour; per day; per week; biweekly (every 2 weeks); bimonthly (twice a month); per month; per year; other (SPECIFY)

Even though you told me it is easier to report your earnings per [other unit], [are/were] you paid at an hourly rate on your [job as a XXXX/primary/current/most recent] job?

Response options: Yes, No

For your [job as a XXXX/primary/current/most recent] job about how much [per hour / per [other unit]] [do/did] you earn before taxes and other deductions?

Research Area: Career (Conceptualized Through Social Cognitive Career Theory)

The ELS:2002/12 questionnaire also seeks to understand the role of career and approaches this construct through the perspective of social cognitive career theory. With the assistance of Professor Robert Lent of the University of Maryland, a number of items reflecting social cognitive career theory have been written for ELS:2002/12, a new content area for the study. Within the occupational domain, the items measure several key constructs: self-efficacy, outcome expectations, interests, supports, domain satisfaction, and persistence intentions. These variables are intended to complement and build on predictors and intermediate outcomes already included in ELS:2002, including earlier attempts to gather information on self-efficacy and self-directed learning. The items were recently cognitively tested so that they could be revised for the field test. The field test will enable scale reliabilities to be assessed.

Please indicate the extent to which you agree or disagree with each of the following statements with respect to your [job as a XXXX/primary job/current job/military job]:

Response options: 1=Strongly disagree; 2; 3; 4; 5=Strongly agree

  • I’m confident that I can perform the job at a very high level of skill

  • I’m certain that I can solve big problems that occur at work

  • I’m confident that I can reach the goals I set for myself at work

  • I’m certain that I can do my work well despite time pressures

  • I’m confident that I can do my work well even when I need to juggle work with nonwork responsibilities (e.g., in my family or community)

Remaining at my [job as a XXXX/primary job/current job/military job] will allow me to…

Response options: 1=Strongly disagree; 2; 3; 4; 5=Strongly agree

  • Get respect from other people

  • Do work that I find satisfying

  • Earn enough money for the lifestyle I want to have

  • Work with other people who share my interests and values

Please indicate the extent to which you agree or disagree with each of the following statements:

Response options: 1=Strongly disagree; 2; 3; 4; 5=Strongly agree

  • I am really interested in my work

  • I often get totally absorbed in my job tasks

  • I rarely get bored when I am doing my job

  • People at work are pretty supportive of me

  • There are people I can learn from at work

  • There are people I can turn to for help in solving a work problem

  • I feel fairly well satisfied with my present job

  • Most days I am enthusiastic about my work

  • I find real enjoyment in my work

  • I plan to remain in my current job over the next year

  • I don’t usually think about leaving this job

  • I feel pretty strongly committed to my current job

Which of the following best describes your [job as a XXXX/primary job/current job/military job]?

Response options: It is part of my long-term career or work goals; It is preparation for my long-term career or work goals; It is not related to my long-term career or work goals; I do not have long-term career or work goals

Research Area: Job Training, Certification, and Licensure

Another key aspect of employment is job-related training, along with certificates and licenses. Recent history has seen a growing consensus about the skill requirements of the “21st Century workforce.” Along with formal educational training, the new flexible workforce will need workers who are continuously learning new skills and competencies, some of which may be validated with formal state or professional licensure and certification. By 2012, many members of the ELS cohort will have been in the workforce and will have been exposed to on-the-job training and further job-related education through their employers. How and why this training is taking place is of great concern to employers, employees, and policymakers who are creating programs to facilitate this type of training.

Similar issues arise for the pursuit of licenses and certificates. One instrumentation issue for the licensing items is whether it is sufficient to collect information only on the most recent license. Because of sequential licensing (“licensing ladders”), we propose to collect the number of licenses received since leaving high school, with follow-up questions collecting further detail on the last license received. We assume that a license is salient enough to be remembered, and that reporting licenses will not be greatly burdensome.

To ensure accurate recall periods and to more closely target specific opportunities for training, we will ask about job-related training received in the current (or most recent) job. We will also limit the reference period to the last 12 months, again to avoid recall difficulties. For members of the sample cohort who received such training, we will ask about motivation, form, purpose, and results of their job-related training activities.

Job Training/Employer-Provided Training

In the last 12 months, have you received any of the following types of formal training from your employer? (Mark all that apply. Include training from past as well as current employers.)

  • Safety and compliance training, which includes information on company or professional procedures and regulations concerning legal, ethical, and safety issues

  • Basic skills training, which includes training in elementary reading, writing, arithmetic and English language skills (including English as a second language)

  • Communication or team training, which includes training to improve communication in the workplace, foster teamwork, or to reorganize work teams and work flow

  • Management training, which includes training in supervising employees and in implementing employment practices, regulations, and policies

  • Position-specific skills training, which includes training to develop the skills you need to do your work, including sales and customer relations training, professional or technical skill development, use of computer applications, and other practical job skills

In the past 12 months, have you had one formal training, or more than one formal training?

Response options: One, More than one

[The next few questions ask about your most recent training.] What type of training did you [most recently] have? (Mark one.)

Response options: Safety and compliance training; Basic skills training; Communication or team training; Management training; Position-specific skills training

In what month did [your/your most recent] training end?

  • Month

  • Still enrolled

What was the total number of weeks this training lasted?

Response options: 1 week or less; 1 to 2 weeks; 2 to 4 weeks; 4 weeks to 3 months; 3 to 6 months; more than 6 months

How many hours per week did this training last?

Was [this/this most recent] training required by your employer?

Response options: Yes, No

Was [this/this most recent] training required for a professional certification or for a state or industry license? (A professional certification or license verifies that you are qualified to perform a specific job. It includes things like licensed realtor, certified medical assistant, certified construction manager, or Cisco Certified Network Associate.)

Response options: Yes, required to obtain a certification or license; Yes, required to maintain a certification or license; No

Was [this/this most recent] training taught or provided by…

(Mark one. If more than one instructor, indicate the main provider of instruction.)

Response options: An employee of your company; A private vendor, product supplier, or other business firm; A professional association or labor union; An instructor from a community college or other college or university; Don’t know who taught or provided the training

Did you receive college credit from your participation in this training? (Do not count continuing education units/credits)

Response options: Yes, No

What benefits did you receive or do you plan to receive from [your/your most recent] training?

Response options: Received; Not received but expect to receive; Not received and not expected to receive

  • Higher pay or bonus

  • Promotion upon completion of the training

  • Future advancement opportunities

  • Improved job performance

  • Remain current with new regulations, laws, or technologies

  • Change job or career field

  • Other



Certification and Licensure

Now I’d like to ask you about professional certification and licensure. Do you have a professional certification or a state or industry license? (A professional certification or license verifies that you are qualified to perform a specific job. It includes things like licensed realtor, certified medical assistant, certified construction manager, or Cisco Certified Network Associate.)

Response options: Yes, No

Do you have more than one certification or license?

Response options: Yes, No

How many do you have?

[Let’s talk about your most recent certification or license.] Is it a certification, license, or both? (“Both” typically occurs when someone gets a license upon completion of a certification program. If “both,” ask rest of questions about certification.)

Response options: Certification, License, Both

Did you have to do any of the following to get this [certification/license]?

Response options: Yes, No

  • Demonstrate skills while on the job?

  • Pass a test or exam?

  • Submit a portfolio of your work?

To maintain this [certification/license], do you have to…

Response options: Yes, No

  • Take continuing education classes or earn CEUs?

  • Take periodic tests?

Can this [certification/license] be…

Response options: Yes, No

  • Revoked or suspended for any reason?

  • Used if you wanted to get a job with any employer in that field? (Answer “yes” for credentials that are recognized state-wide or regionally)

What benefits did you receive or do you plan to receive from earning the [certification/license]?

Response options: Received; Not received but expect to receive; Not received and not expected to receive

  • Higher pay or bonus

  • Promotion upon completion of the training

  • Future advancement opportunities

  • Improved job performance

  • Remain current with new regulations, laws, or technologies

  • Change job or career field, enter the workforce, or start own business

  • Other

Would you say your [job as a XXXX/primary job/current job/military job/most recent job] [is/was] related to the major or field of study you had when you were last enrolled at [last PS school / PS school from which respondent earned a credential]?

Response options: Yes, No

[Would it be/Was it] difficult for you to do your [job as a XXXX/primary job/current job/military job/most recent job] without having had the courses you took at [last PS school / PS school from which respondent earned a credential]?

Response options: Yes, No

[[Are/Were] any of the following required by your [primary/current/most recent] employer as a condition for working? / [Are/Were] any of the following required for your [job as a XXXX/primary job/current job/military job/most recent job]?] (Check all that apply.)

  • An industry certification or occupational license

  • A vocational or technical certificate or diploma

  • A 2-year college degree

  • A 4-year college degree

  • None of the above

Research Area: Employment History

Employment history must be captured in some viable way. Because ELS:2002/12 and its 2011 field test must cover a 6-year interview gap in about 35 minutes, detailed employment event histories are unfortunately not possible. Summary items covering both employment and unemployment spells are, however, the next best alternative.

For the next items, I want to ask about [your employment/any military or civilian employment] last year in 2010, and in the 2 years before that. Across all your jobs during the 2010 calendar year, how many weeks did you work for pay? Please include all paid time off such as vacations, sick leave, and family leave in your weeks spent working. (Do not include the time you have spent out of work, between jobs, or without pay.)

How many hours did you work for pay at all jobs in a typical week in 2010?

Now, I would like you to think back to the year before last. During the 2009 calendar year, were you employed [either by the military or in the civilian workforce] for 6 months or more during the year?

Response options: Yes, No

For [this/any] employment in 2009, were you employed primarily full time or part time?

Response options: Full Time, Part Time, Not employed at all during 2009

Now, I would like you to go back a year further to 2008. During the 2008 calendar year, were you employed [either by the military or in the civilian workforce] for 6 months or more during the year?

Response options: Yes, No

For [this/any] employment in 2008, were you employed primarily full time or part time?

Response options: Full Time, Part Time, Not employed at all during 2008

Have you ever been unemployed (that is, not employed and seeking employment) since 2008?

Response options: Yes, No

Since 2008, approximately how many times have you been unemployed (not employed and seeking employment), and for approximately how many months?

  • Total number of times

  • Total number of months

[Since 2008, / Of the [X] number of times you mentioned being unemployed,] what was the longest period of time you were unemployed and looking for a job?

Research Area: Obstacles to Career Goals

One important perspective on employment and career goals is the presence or absence of perceived or actual barriers. Thus, it is desirable to inquire about obstacles to career goals in the 6 years since the last interview.

Since [high school / the last interview / XXXX], have any of the following interfered with your work or career plans? (Check all that apply)

  • Grades not high enough

  • Lack of ability to get training or a degree

  • Lack of money to complete education or get started in my chosen career field

  • I was considered “overqualified”

  • Illness, accident, or disability

  • Lack of openings in my field

  • Relocation would have been difficult or impossible

  • Marriage

  • Children

  • Caring for a sick parent or relative

  • Discrimination against persons of my race or ethnic background

  • Discrimination against persons of my gender

  • Transportation problems—difficulty in getting to or from work

  • None of the above

What job or occupation do you plan to have when you are age 30?

How much education do you think you need to get the job you expect or plan to have when you are 30 years old?

Response options:

  • Less than high school graduation

  • GED or other equivalency only

  • High school graduation only

  • Attend or complete a 1- or 2-year program in a community college or vocational school

  • Attend college, but not complete a 4- or 5-year degree

  • Graduate from college (4- or 5-year degree)

  • Obtain a Master’s degree or equivalent

  • Obtain a Ph.D., M.D., or other advanced degree

  • Don’t know

Research Area: Living and Family Arrangements and Configurations

Living arrangements, family structure, and family formation have been the subject of many past questions in the later stages of the secondary longitudinal studies. Some statuses, such as single motherhood or number of dependents, may be of particular interest in interpreting outcomes at age 26.

How many of each of the following live with you? If you live by yourself, please indicate so.

  • You live alone (yes/no)

  • Your spouse

  • Your partner in a marriage-like relationship

  • Your mother or female guardian

  • Your father or male guardian

  • Friends or roommates (including girlfriends/boyfriends)

  • Brothers or sisters (including adoptive, step, and foster siblings)

  • Children (biological, step, or adopted)

  • Others not already listed

Do you live in your parent/guardian’s home, or [do they / does he/she] live in your home?

Now I would like to get some information about your current dependents. Excluding yourself, [and excluding your spouse / and excluding your partner,] how many of each of the following types of dependents do you currently support? Enter ‘0’ where appropriate. (A dependent is a person for whom you pay at least half of their expenses: food, shelter, clothing, health care, and schooling. This may include your children, parents, or others. Note that a dependent does not have to live with you.)

  • Number of dependent children

  • Number of dependent adults

  • Number of other dependents

Next, I’m going to ask you a few questions about your family life. What is your current marital status?

Response options: Single, never married; Married; Divorced; Separated; Widowed; Partner, significant other, not married, but in a marriage-like relationship

What is the highest level of education your spouse/partner has completed?

Response options: Less than high school; High school diploma or GED; Associate’s degree; Bachelor’s degree; Master’s degree; PhD, MD, law degree, or other high-level professional degree

Have you been married more than once?

Response options: Yes, No

How many times have you been married?

What is the month and year of your [first/second/etc.] marriage?

  • Month

  • Year

Did your [first/second/etc.] marriage end in a…

Response options: Divorce or annulment; Permanent or legal separation; Death

Have you had any biological children [, that is, children born to you/, that is, children for whom you are the natural father/mother]?

Response options: Yes, No

How many biological children have you had?

In what month and year was your [first/second/etc.] biological child born?

  • Month

  • Year

At the time of your [first/second/etc.] biological child’s birth, were you married to or partnered with your child’s father?

Response options: Yes, No

Have you ever adopted a child?

Response options: Yes, No

How many children have you adopted?

In what month and year did you [first/next] adopt a child?

  • Month

  • Year

Research Area: Income and Assets

Income and assets can be viewed as part of the return on investments in education, and as a resource that can be leveraged to enhance positive life changes. Considering the substantial earnings advantages of education, economic returns are one of the most important outcomes of education to record for ELS:2002. It is necessary to collect income information from sample members on all pathways, and although hard to measure, it is desirable to capture some basic information about assets as well. Hourly rate of pay is extremely important in labor market analysis.

Including all of the wages, salaries, and commissions you earned in 2010, about how much did you earn from employment before taxes and all other deductions?

From which of these sources did you receive income during 2010?

Response options: You received income from this source; Your spouse received income from this source; [Neither you nor your spouse received/You did not receive] income from this source

  • Wages, salaries, commissions, or tips

  • Net income from a business or farm

  • Dividends, interest, rental income, or investment income

  • Social Security benefits

  • Child support

  • Veterans benefits

  • Unemployment compensation

  • Public assistance, welfare, AFDC, etc.

  • Income in the form of gifts from relatives or friends

  • Scholarships, fellowships, grants, loans, etc.

  • Any nontaxable income not included above

  • No income

Next, about how much did your spouse earn from employment before taxes and all other deductions in 2010? Please include all wages, salaries, and commissions.

[Without considering the 2010 earnings from employment that you just reported,] approximately how much did [you / you and your spouse / you and your partner] receive from other sources of income in 2010? (These sources might include stocks and bonds, savings interest, insurance, alimony or child support, family members, and disability payments.)

Have you started a savings account or a retirement fund such as an IRA, 403b, or 401k?

Response options: Yes, No

About how much money do you have in all your savings or retirement accounts combined?

Do you…

Response options: Pay mortgage toward or own [your residence]; Rent [your residence]; or Have some other arrangement?

What is the approximate value of [your residence]?

About how much do [you/you and your spouse/you and your partner] owe on the mortgage for your house, apartment, or residence? (If none, enter ‘0’).

Now, think about your debts [besides any mortgage on your home.] How much do you and others in your household owe altogether? Include all debts, including all types of loans [except mortgage loans], credit card debt, medical or legal bills, etc.

Response options: Less than $1000; $1000 to $4999; $5000 to $9999; $10,000 to $24,999; $25,000 to $49,999; $50,000 to $99,999; $100,000 to $249,999; $250,000 or more

Suppose you and others in your household were to sell all of your major possessions (including your home), turn all of your investments and other assets into cash, and pay off all your debts. Would you have something left over, break even, or be in debt?

Response options: 1=Have something left over; 2=Break even; 3=Be in debt

How much would you [have left over / be in debt]?





Research Area: Civic Participation

Civic participation is a major lifecourse marker of adult status. Civic engagement questions, particularly on voter participation, have been asked in various rounds of NLS:72, HS&B, and NELS:88. Other aspects of citizenship, such as community service, have been asked about in some of the surveys. These items are relatively quick and simple to collect, and have some trend value. Items asked of 26-year-olds in NELS:88/2000 exemplify the potential of this area: for example, whether the respondent performed volunteer work, the types of organizations through which community service was rendered, voter registration, and voting in the most recent presidential election.

Next, tell me how many days you did each of the following activities in a typical 30-day month. In the past year, how many days in a typical month did you…

  • Visit a public library

  • Go to a play, concert, or museum

  • Participate in organized religious activities

  • Participate in group or team sports and recreation

During the past 2 years, have you performed any unpaid volunteer or community service work through such organizations as youth groups, service clubs, church clubs, school groups, or social action groups?

Response options: Yes, No

Which of the following types of organizations have you been involved with in your unpaid volunteer or community service work during the past 2 years? (Check all that apply)

  • Youth organization

  • School/community organizations

  • Political organization

  • Church-related group

  • Neighborhood/social action associations

  • Hospital or nursing home

  • Education organizations

  • Conservation/environmental group

During the past 2 years, how often did you spend time volunteering or performing community service?

Response options: Less than once a month; At least once a month, but not weekly; At least once a week

Are you currently registered to vote?

Response options: Yes; No; Ineligible to vote

[Even if you are not currently registered to vote], did you vote in the 2008 presidential election?

Response options: Yes, No

In the last 2 years, have you voted in any local, state, or national election?

Response options: Yes, No



Research Areas: Life Events and Values

Other topics to be included in the ELS third follow-up are life events and values. Significant life events (such as the death of a loved one or being the victim of a serious crime) may have serious effects on a respondent’s life course, and knowing whether such events have occurred gives researchers another tool for examining barriers to positive education or employment outcomes. Sample members’ values (e.g., having strong friendships, finding steady work) have been collected in prior rounds of ELS, as well as in NELS:88, and collecting them again in the third follow-up allows for both inter- and intra-cohort comparisons.

Since 2005, have any of the following happened to you?

Response options: Has not happened; Has happened once; Has happened more than once

  • Your parents or guardians got divorced or separated

  • One of your parents or guardians lost his or her job

  • You lost your job

  • One of your parents or guardians died

  • A close relative or friend died

  • You became seriously ill or disabled

  • A family member became seriously ill or disabled

  • You were the victim of a violent crime


In what month and year did [event X] [occur / first occur / last occur]?


How important is each of the following to you in your life?

Response options: Not important; Somewhat important; Very important

  • Being successful in my line of work

  • Finding the right person to marry and having a happy family life

  • Having lots of money

  • Having strong friendships

  • Being able to find steady work

  • Helping other people in my community

  • Being able to give my children better opportunities than I've had

  • Living close to parents and relatives

  • Getting away from this area of the country

  • Working to correct social and economic inequalities

  • Having children

  • Having leisure time to enjoy my own interests

  • Becoming an expert in my field of work

  • Getting a good education



Future Follow-up: Locator Items

Locating items should also receive consideration for inclusion in the third follow-up questionnaire (a decision is pending on whether to fund a fourth follow-up). Proposed locating items are based on those used successfully in the past (for example, in the prior ELS:2002 round, the 2006 second follow-up).



Locating

L1. We would like to make sure our records accurately reflect your full name. (Please verify or update your name.)

(Source: ELS F2)

  • First name:

  • Middle name:

  • Last name:

  • Suffix:


L2. What is your spouse's/partner's full name [including maiden name]?

(Source: ELS F2)

  • First Name:

  • Last Name:

  • Suffix:

  • Maiden name:


L3. [Please verify, update, or complete your mother's/female guardian’s contact information. / Please provide your mother's/female guardian’s name, address, telephone number, and email address.]

(Source: ELS F2)

  • Last Name:

  • First Name:

  • Address:

  • City:

  • State:

  • Telephone number:

  • Cell phone number:

  • Email address:

  • Check here if your mother/female guardian is deceased.

  • If your father's/male guardian's address is the same as your mother's/female guardian's, please check this box.


L4. [Please verify, update, or complete your father's/male guardian’s contact information. / Please provide your father's/male guardian’s name, address, telephone number, and email address.]

(Source: ELS F2)

  • Last Name:

  • First Name:

  • Address:

  • City:

  • State:

  • Telephone number:

  • Cell phone number:

  • Email address:

  • Check here if your father/male guardian is deceased.



L5. Please provide contact information for another person who will always know how to contact you and is not someone you have already mentioned.

(Source: ELS F2)

  • Last Name:

  • First Name:

  • Address:

  • City:

  • State:

  • Telephone number:

  • Cell phone number:

  • Email address:

  • Relationship:



L6. What is your permanent address?

(Source: ELS F2)

  • [Mother/female guardian’s address – from L3, if provided]

  • [Father/male guardian’s address – from L4, if provided]

  • [Another person’s address – from L5, if provided]

  • Your permanent address is different from these



L7. Please verify, update, or complete your contact information.

(Source: ELS F2)

  • Address:

  • City:

  • State:

  • Telephone number:

  • Cell phone number:

  • Email address:



L8. May we contact you for the next round of the study by sending a text message to your cell phone?

  • Yes

  • No



L9. What is your Social Security number?

(Under Title 20 of the General Education Provisions Act, we may collect your Social Security number for the purpose of confirming information abstracted from postsecondary educational records. In addition, we may use your Social Security number to help locate you for future interviews. Giving us your Social Security number is completely voluntary and there is no penalty for not disclosing it.)

(Source: ELS F2)



L10. We’d like to thank you for completing the survey today. Which address would you like your [$] check sent to?

(Source: ELS F2)

  • [address from L3, if provided]

  • [address from L4, if provided]

  • [address from L5, if provided]

  • [address from L7, if provided]

  • Send to a different address

  • You prefer not to be paid for your participation



L11. Please provide the address you would like your check sent to.

(Source: ELS F2)

  • Address:

  • City:

  • State:

  • Zip code:



L12. This is the address as it will appear on the mailing label. Please make any necessary changes in the fields below. If the address is correct, please click the “Continue” button.

(Source: ELS F2)

  • Address:

  • City:

  • State:

  • Zip code:



A.3 Improved Information Technology

The same innovative, web-based data collection technology employed in the ELS:2002 second follow-up will be used again in the third follow-up. With this web technology, the survey instruments have been carefully designed to be virtually indistinguishable from one another in screen text and skip patterns across all three modes of data collection: self-administered web, CATI, and CAPI. It is expected that in the third follow-up, over 40 percent of responses will be web self-administered. The advantages of a web-based instrument include real-time data capture and access, including data editing in parallel with data collection, and increased efficiency in timely delivery. The field test collection, however, will not employ CAPI, because of the smaller yield needed to meet the objectives of the field test. The CATI component will begin in the fourth week of data collection.

Additional features of the system include (1) online help for selected screens to assist in question administration (in all three modes); (2) full documentation of all instrument components, including variable ranges, formats, record layouts, labels, question wording, and flow logic; (3) capability for creating and processing hierarchical data structures to eliminate data redundancy and conserve computer resources; (4) a scheduler system to manage the flow and assignment of cases to interviewers by time zone, case status, appointment information, and prior cases disposition; (5) an integrated case-level control system to track the status of each sample member across the various data collection activities; (6) automatic audit file creation and timed backup to ensure that, if an interview is terminated prematurely and later restarted, all data entered during the earlier portion of the interview can be retrieved; and (7) a screen library containing the survey instrument as displayed to the respondent (or interviewer).

A.4 Efforts to Identify Duplication

Since the inception of its secondary education longitudinal program in 1970, NCES has consulted with other federal offices to ensure that the data collected in the series do not duplicate other national data sources. The inclusion on the Technical Review Panels for ELS:2002 both of members of the research community and of other government agencies helps to focus study and instrument design on features of youth transition that ELS:2002 uniquely can illuminate.

ELS:2002 does not duplicate, but temporally extends, the prior NCES longitudinal studies—NLS:72, HS&B, and NELS:88.

Other NCES studies involve assessments of similar age groups to ELS:2002 (PISA 15-year-olds; NAEP eighth-graders and high school seniors), but they are not longitudinal and do not collect data from parents. By the time of the second follow-up (2006, when most sample members had been out of high school for 2 years), there is some similarity in sample to the NCES Beginning Postsecondary Students (BPS) study. However, the BPS longitudinal study focuses only on beginning postsecondary students, including late entrants into the system. In contrast, ELS:2002 includes both cohort members who go on to postsecondary education and those who do not—but misses many late entrants to the system, even if it follows sample members to age 31. Thus BPS and ELS:2002 are fundamentally complementary, not duplicative.

The only non-NCES federal study that would appear to be comparable to ELS:2002 is the BLS National Longitudinal Survey of Youth (NLSY). Both the NLSY79 and the NLSY97 (whose respondents are closer in age to the ELS:2002 cohort) share with ELS:2002 and the prior NCES high school cohorts the goal of studying the transition of adolescents into adult roles. However, NLSY is an age cohort while ELS:2002 is a grade cohort, and NLSY is household based while ELS:2002 is school based. Although both studies are interested in education and labor market experiences (and their interrelationship), ELS:2002 puts more emphasis on postsecondary education, while NLSY stresses labor market outcomes and collects detailed employment event histories. Thus, as with BPS, ELS:2002 and the two NLSY cohorts are complementary rather than duplicative.

A.5 Methods Used to Minimize Burden on Small Businesses

This section has limited applicability to the proposed data collection effort. Target respondents for ELS:2002 are individuals, and direct data collection activities via web-based self-administration, CATI, and CAPI will involve no burden to small businesses or entities. Small entities such as high schools are no longer included in the data collection scheme. However, should the financial aid and postsecondary transcripts options be exercised, the data collection would involve some small entities (defined as proprietary or not-for-profit postsecondary institutions enrolling fewer than 1,000 students). The update memo covering these options would also address issues of burden minimization for small entities.

A.6 Frequency of Data Collection

This submission describes activities for the field test and full-scale survey of ELS:2002 third follow-up, in the larger context of the purposes and procedures of the study. One design element that is central to fulfilling the purpose of the study is the frequency or periodicity of data collection.

The rationale for conducting ELS:2002 is based on a historical national need for information on academic and social growth, school and work transitions, and family formation. In particular, recent education and social welfare reform initiatives, changes in federal policy concerning postsecondary student support, and other interventions necessitate frequent studies. Repeated surveys are also necessary because of rapid changes in the secondary and postsecondary educational environments and the world of work. Indeed, longitudinal information provides better measures of the effects of program, policy, and environmental changes than would multiple cross-sectional studies.

To address this need, NCES began the National Longitudinal Studies Program approximately 40 years ago with NLS:72. This study collected a wide variety of data on students’ family background, schools attended, labor force participation, family formation, and job satisfaction at five data collection points through 1986. NLS:72 was followed approximately 10 years later by HS&B, a longitudinal study of two high school cohorts (10th- and 12th-grade students). NELS:88 followed an eighth-grade cohort, for which the round conducted at a modal age of 26 years probably represents the final data collection point. With the addition of ELS:2002, a 32-year trend line will be available. Taken together, these studies provide much better measures of the effects of social, environmental, and program and policy changes than would a single longitudinal study or multiple cross-sectional studies.

It could be argued that more frequent data collection would be desirable; that is, there would be a gain in having a program of testing and questionnaire administration that is annual throughout the high school years. However, the 2-year interval was employed with both the HS&B sophomore cohort and NELS:88, and proved sufficient to the realization of both studies’ primary objectives. Although there would be benefits to more frequent data collection in the high school years, it must also be considered that the effect would be to greatly increase the burden on schools and individuals, and that costs would also be greatly increased. Probably the most cost-efficient and least burdensome method for obtaining continuous data on student careers through the high school years comes through the avenue of collecting school records. High school transcripts were collected for a subsample of the HS&B sophomore cohort, as well as for the entire NELS:88 cohort retained in the study after eighth grade. A similar academic transcript data collection (covering grades 9 through 12) was conducted for the first follow-up of ELS:2002.

Periodicity of the survey after the high school years (at the very terminus of the study) may also be questioned—there is a 6-year gap between the 2006 round (2 years out of high school) and the final round in 2012 (8 years out of high school). Undoubtedly, more process and postsecondary education context information could be obtained if there were surveys in the intervening years (say at age 22, which would optimally capture the college experience). However, the strategy of waiting until about age 26 for the third follow-up interview is extremely cost-effective, in that the information collected at that time includes both final outcomes and statuses, and provides a basis for identifying the postsecondary institutions that individual sample members have attended. In turn, postsecondary transcripts are then obtained that provide continuous enrollment histories for specific courses taken, and provide records of course grades and other information needed to analyze postsecondary persistence and attainment. There is also currently consideration for a fourth follow-up (around age 31), which would provide better information on early career formation and rate of return on educational investment.

A.7 Special Circumstances of Data Collection

All data collection guidelines in 5 CFR 1320.5 are being followed. No special circumstances of data collection are anticipated.

A.8 Consultants Outside the Agency

The 60-day Federal Register notice was published on December 20, 2010 (75 FR, No. 243, p. 79352). No public comments were received in response to this notice. In recognition of the significance of ELS:2002, several strategies have been incorporated into the project’s work plan that allow for the critical review and acquisition of comments regarding project activities, interim and final products, and projected and actual outcomes. These strategies include consultations with persons and organizations both internal and external to the National Center for Education Statistics, the U.S. Department of Education, and the federal government.

ELS:2002 project staff have established a Technical Review Panel (TRP) to review study plans and procedures. The third follow-up TRP includes some of the earlier ELS:2002 panelists for continuity with prior phases of the study. However, the membership has been reconstituted to reflect the shift in focus from high school experiences to postsecondary and labor market transitions that mark the final outcomes of the study. See Exhibit A-1 for a list of the TRP membership and their affiliations. The TRP met to discuss the ELS:2002/12 field test study design, research priorities, and survey content September 30–October 1, 2010.

ELS:2002 project staff also enlisted three academic consultants as part of a research area plan to offer advice on priorities and new topic areas. Two of these individuals wrote position papers (Olsen and Shanahan) and a third wrote new items (Lent).

Exhibit A-1. Education Longitudinal Study:2002 (ELS:2002) Third Follow-up Technical Review Panel


Participants and Staff Contact List



Technical Review Panelists

Sara Goldrick-Rab
University of Wisconsin-Madison
1025 West Johnson Street, 575K
Madison, WI 53706
Phone: (608)265-2141
E-mail: SRab@education.wisc.edu

Robert Gonyea
Indiana University
Center for Postsecondary Research
107 S. Indiana Avenue, Eigenmann 443
Bloomington, IN 47405
Phone: (812)856-5824
E-mail: rgonyea@indiana.edu

Robert Lent
University of Maryland
RM 3214D Benjamin Building
College Park, MD 20742
Phone: (301)774-6390
E-mail: boblent@umd.edu

Amaury Nora
The University of Texas at San Antonio
College of Education and Human Development
One UTSA Circle
San Antonio, TX 78249
Phone: (210)458-4370
E-mail: Amaury.Nora@utsa.edu

Randall Olsen
The Ohio State University
921 Chatham Lane, Suite 100
Columbus, OH 43221
Phone: (614)442-7348
E-mail: olsen.6@osu.edu

Aaron Pallas
Columbia University, Teachers College
464 Grace Dodge Hall
New York, NY 10027
Phone: (212)678-8119
E-mail: Amp155@columbia.edu

Kent Phillippe
American Association of Community Colleges
One Dupont Circle, NW, Suite 410
Washington, DC 20036
Phone: (202)728-0200
E-mail: kphillippe@aacc.nche.edu

Barbara Schneider
Michigan State University
516B Erickson Hall
East Lansing, MI 48824
Phone: (517)432-0300
E-mail: bschneid@msu.edu

Michael Shanahan
University of North Carolina at Chapel Hill
Department of Sociology
CB#3210, Hamilton Hall
Chapel Hill, NC 27599
Phone: (919)843-9865
E-mail: mjshan@e-mail.unc.edu

Marvin Titus
University of Maryland
EDHI
Room 2200 Benjamin
College Park, MD 20742
Phone: (301)405-2220
E-mail: mtitus@umd.edu

U.S. Department of Education and other Federal and Non-Federal Invitees

Elise Christopher
U.S. Department of Education, NCES
1990 K Street, NW, Room 9021
Washington, DC 20006
Phone: (202)502-7899
E-mail: Elise.Christopher@ed.gov

Stephanie Cronen
American Institutes for Research
Education Statistics Services Institute
1990 K Street, NW, Suite 500
Washington, DC 20006
Phone: (202)403-6419
E-mail: Scronen@air.org

Bruce Daniel
Kforce Government Solutions
2750 Prosperity Avenue, Suite 300
Fairfax, VA 22031
Phone: (703)245-7350
E-mail: BDaniel@kforcegov.com

Sandy Eyster
American Institutes for Research
Education Statistics Services Institute
1990 K Street, NW, Suite 500
Washington, DC 20006
Phone: (202)403-6149
E-mail: seyster@air.org

Mary Frase
National Science Foundation
Directorate of Social, Behavioral and Economic Sciences
Science Resources Statistics
4201 Wilson Blvd. Suite 965 S
Arlington, VA 22230
Phone: (703)292-7767
E-mail: mfrase@nsf.gov

Brian Harris-Kojetin
Office of Management and Budget
725 17th Street NW, Room 10201
Washington, DC 20503
Phone: (202)395-7314
E-mail: Brian_A._Harris-Kojetin@omb.eop.gov

Lisa Hudson
U.S. Department of Education, NCES
1990 K Street, NW, Room 8104
Washington, DC 20006
Phone: (202)502-7358
E-mail: lisa.hudson@ed.gov

Tracy Hunt-White
U.S. Department of Education, NCES
1990 K Street, NW, Room 8113B
Washington, DC 20006
Phone: (202)502-7438
E-mail: tracy.hunt-white@ed.gov

Stuart Kerachsky
U.S. Department of Education, NCES, IES
1990 K Street NW, Room 9116
Washington, DC 20006
Phone: (202)502-7442
E-mail: stuart.kerachsky@ed.gov

Kashka Kubzdela
U.S. Department of Education, NCES
1990 K Street, NW, Room 9014
Washington, DC 20006
Phone: (202)502-7411
E-mail: Kashka.Kubzdela@ed.gov

Laura LoGerfo
U.S. Department of Education, NCES
1990 K Street NW, Room 9022
Washington, DC 20006
Phone: (202)502-7402
E-mail: Laura.LoGerfo@ed.gov

Rochelle Martinez
Office of Management and Budget
725 17th Street, NW, Room 10202 NEOB
Washington, DC 20503
Phone: (202)395-3147
E-mail: Rochelle_W._Martinez@omb.eop.gov

David Miller
American Institutes for Research
Education Statistics Services Institute
1990 K Street, NW, Suite 500
Washington, DC 20006
Phone: (202)403-6533
E-mail: dmiller@air.org

Isaiah O’Rear
U.S. Department of Education, NCES
1990 K Street, NW
Washington, DC 20006
Phone: (202)502-7378
E-mail: Isaiah.o’rear@ed.gov

Jeffrey Owings
U.S. Department of Education, NCES
1990 K Street, NW, Room 9105
Washington, DC 20006
Phone: (202)502-7423
E-mail: jeffrey.owings@ed.gov

Leslie Scott
American Institutes for Research
Education Statistics Services Institute
1990 K Street, NW, Suite 500
Washington, DC 20006
Phone: (202)654-6542
E-mail: lscott@air.org

Marilyn Seastrom
U.S. Department of Education, NCES
1990 K Street, NW, Room 9051
Washington, DC 20006
Phone: (202)502-7303
E-mail: marilyn.seastrom@ed.gov

Matthew Soldner
U.S. Department of Education, NCES
1990 K Street, NW, Room 8121
Washington, DC 20006
Phone: (202)219-7025
E-mail: Matthew.Soldner@ed.gov

Tom Weko
U.S. Department of Education, NCES
1990 K Street, NW, Room 8099
Washington, DC 20006
Phone: (202)502-7643
E-mail: tom.weko@ed.gov

Andrew White
U.S. Department of Education, NCES, IES
1990 K Street, NW, Room 9105
Washington, DC 20006
Phone: (202)502-7472
E-mail: andrew.white@ed.gov

John Wirt
U.S. Department of Education, NCES
El/Sec Sample Survey Studies Program-ESLSD
1990 K Street, NW, Room 9028
Washington, DC 20006
Phone: (202)502-7478
E-mail: john.wirt@ed.gov

Contractor and Subcontractor Staff

Mark Dennis
Millennium Services 2000+ Incorporated
8121 Georgia Ave., Suite LL2
Silver Spring, MD 20910
Phone: (240)839-5113
E-mail: mdennis@ms2kplus.com

Steven Ingels
RTI International
701 13th Street NW, Suite 750
Washington, DC 20005
Phone: (202)974-7834
E-mail: sji@rti.org

Donna Jewell
RTI International
P.O. Box 12194
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: (919)541-7266
E-mail: dmj@rti.org

Erich Lauff
RTI International
P.O. Box 12194
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: (919)990-8492
E-mail: erichlauff@rti.org

Tiffany Mattox
RTI International
P.O. Box 12194
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: (919)485-7791
E-mail: tmattox@rti.org

Daniel Pratt
RTI International
P.O. Box 12194
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: (919)541-6615
E-mail: djp@rti.org

John Riccobono
RTI International
P.O. Box 12194
3040 Cornwallis Road
Research Triangle Park, NC 27709
Phone: (919)541-7006
E-mail: jar@rti.org


A.9 Provision of Payments or Gifts to Respondents

Overview of Incentive Plans for the ELS:2002 Third Follow-up. We propose using respondent incentives at two junctures in the forthcoming study: first, at the locating stage, when sample members are first contacted (approved under OMB# 1850-0652 v.5); second, at the data collection stage, when they are expected to complete the questionnaire. The rationale and design of the locating incentive can be succinctly stated. Because of the importance of reaching a very high percentage of sample members, RTI has proposed a field test experiment in which half of the sample (the locating sample comprises both student sample members and their parents) will be offered a $10 incentive check, to be sent upon receipt of updated or confirmed contact information. The other half of the sample will not be offered an incentive. The results of the experiment will be evaluated in determining the use of an address-update incentive for the full-scale sample the following year.

We also propose incentives for questionnaire completion. There are two possible models for the use of questionnaire completion incentives in the full-scale study; which is chosen depends on field test results. First, there is the possibility of simply repeating the successful OMB-approved incentives of the ELS:2002 second follow-up. Indeed, the respondent panel may have been conditioned to expect amounts similar to what they received before for what is, 6 years later, a similar task and burden.

However, we also propose an alternative and an experiment to test the alternative. The alternative is to use prior-round information to model response propensities, and to prioritize cases based on that information, such that response bias will be minimized. The experimental method is similar in concept to the incentive plan offered in the second follow-up, but is a statistically well-grounded refinement of that approach. In the second follow-up plan, all respondents were given an incentive, but certifiably hard-to-get groups such as past-round nonrespondents and high school dropouts were given a larger incentive. (There was also an incentive for early response.) Because the propensity-modeling plan uses more data, the individuals in need of a higher incentive can, in theory, be more accurately determined. (Note that the “need” is relative to bias minimization, and not the need of the respondent.) Because the propensity-modeling plan is able to consider respondent information (such as a full range of response and sociodemographic characteristics) more inclusively and broadly, it is posited that it will also be able to determine which cases would contribute most to bias in estimates, and ensure that these cases receive priority.

In the next portion of this memorandum, the history of incentive use in the second follow-up of ELS:2002 is recounted. The portion that follows discusses the experimental method for achieving a reduction in bias. A discussion and conclusions section concludes the memorandum.

History of OMB-approved incentive payments in ELS:2002, with special reference to the second follow-up. We present a brief history of OMB-approved incentive payments in the follow-ups of ELS:2002. Then we sketch a proposal for use of propensity modeling to reduce nonresponse bias through case prioritization and optimal assignment of differential monetary incentives. The proposed new method—the efficacy of which would be established in a field test experiment—should provide more sophisticated modeling of the patterns of unit nonresponse. It would more effectively address the direct threat to the validity of study data—bias—and obviate the need to depend on use of a nonresponse criterion that is but a crude proxy for (or indirect measure of) bias.

The text that follows draws, in somewhat abbreviated form, on the ELS:2002 second follow-up data file documentation (NCES 2008-347). Incentive payments to respondents were a major feature of the data collection plan for the ELS:2002 2006 study. The results of the 2003 field test experiments and the success of the 2004 round of data collection provided evidence of the value of respondent incentives in achieving high response rates (see NCES 2006-344, appendix J). A number of important factors were considered in developing and implementing the incentive plan:

  • Almost all first follow-up sample members received an incentive, including both those who participated in school and those who participated outside of school. Paying incentives to almost all first follow-up participants may have raised expectations among the sample cohort that they would again receive payment for participating in the 2006 round.

  • Between the F1 and F2 surveys, the ELS:2002 sample cohort became further dispersed. In both the 2004 main study and the second follow-up field test (2005), providing incentives was effective in making contact with sample members who were difficult to reach.

  • Although cell sizes for important analytic subgroups were satisfactory after the success of the 2004 data collection, significant attrition among these subgroups was a threat to the analytic value of the second follow-up. The two most important subgroups that were offered higher incentives in the first and second follow-ups were high school dropouts and prior-wave nonrespondents. Paying differential incentives to both dropouts and first follow-up nonrespondents in 2006 was designed to ensure sufficient inclusion of these critically important subgroups.

The second follow-up incentive plan was designed to maximize respondent participation by meeting their expectations of compensation for their time and efforts, helping to locate widely dispersed sample members, and offering greater incentives to particular subgroups with limited representation in the sample. In addition, the incentive plan was generally similar to the 2004 plan and also incorporated elements of similar education studies, including NPSAS:04 and the BPS longitudinal study. In this way, the 2006 plan was as consistent as possible with both the prior round of ELS:2002 and other current education surveys of the young adult population. It should be noted that incentives were one critical part of an ensemble of approaches followed to minimize nonresponse; the wider context of persuasive activities is further detailed in NCES 2008-347.

The 2006 incentive plan was designed to address five key features of survey context:

  • First follow-up participation status—F1 respondent or F1 nonrespondent.

  • High school dropout status—identified in F1 as ever having dropped out or not.

  • Timing of participation—during the first 4 weeks of data collection or beyond this period.

  • Difficulty in contacting or enlisting the cooperation of the sample member—meeting the criteria for difficult cases or not. The criteria for the “difficult” status increase included the following:

      – more than 20 calls were made to contact the sample member without completing an interview;

      – sample member refused to participate during an initial contact;

      – others refused multiple times on behalf of the sample member;

      – sample member could not be located through any of the telephone numbers previously provided, so the case was submitted for intensive tracing;

      – case was sent to a field interviewer for tracing; or

      – sample member had still not completed the interview as of June 15, 2006.

  • During the final 8 weeks of data collection, partial prepayment of the incentive was sent to sample members who had not yet participated.

The first four of these five elements were approved by OMB and established prior to the start of the data collection period. The fifth element was implemented as a contingency during data collection based on discussions with and approval from OMB.

Because multiple criteria applied to many sample members, the incentive plan elements were combined to determine the appropriate payment level at each point of the study. To ensure that survey notification materials and interviewer statements matched respondents’ expectations on how much they would be paid at each point in the data collection period, consistency was maintained across all points of contact with respondents regarding the amount of their incentive payments. This consistency was achieved initially and maintained throughout the study by using the same predetermined variables—dropout status, F1 participation status, difficult case status, and current date—in all study materials and computer programs to indicate the appropriate incentive amount. Materials included mailed letters and instructions and e-mail messages. Computer programs included web/CATI/CAPI scripts and instruments as well as the sample database. The same procedures used in the 2006 round to ensure consistency had also been employed effectively in the 2004 data collection.

Exhibit A‑2 summarizes the specific elements of the 2006 incentive plan. The regular or “base” incentive amount for all ELS:2002 sample members who had never been identified as dropouts and had participated in the F1 data collection was $20. For those sample members who participated in the base-year study but did not participate in 2004, the regular incentive was higher at $40. Likewise, those who had ever been identified as dropouts through the 2004 round were offered $40 as a base incentive.

Exhibit A-2. Second Follow-up Full-Scale Respondent Incentive Plan: 2006

Respondent type             Regular incentive   Early completer   Difficult case   Final difficult ($10 prepaid)
F1 nonrespondent            $40                 $50               $50              $60
Ever dropout                40                  50                50               60
F1 respondent, nondropout   20                  30                30               40
NOTE: F1 = First follow-up.

SOURCE: U.S. Department of Education, National Center for Education Statistics, Education Longitudinal Study of 2002 (ELS:2002), Second Follow-up, 2006.

To encourage sample members to participate early in the data collection period, either through web self-administration or by calling the toll-free number to complete a CATI interview, those who completed the survey (by either mode) prior to the start of outbound CATI calling were paid an additional $10 on top of the regular incentive. The early incentive period ran from the opening day of data collection on January 25, 2006, through February 19, 2006, when outbound calling began. This element was designed to offer the most responsive sample members a bonus for participating prior to when more intensive and more expensive data collection procedures were implemented.

A further addition to the incentive payment plan was to offer an additional $10 over the regular amount to those sample members who proved extremely difficult to contact or enlist in the study during the course of the 2006 data collection period. This increase was implemented independently of each sample member’s high school completion status or F1 participation status. Once a case met one (or more) of the difficult case criteria (listed previously), all computer programs and databases were updated with the higher incentive amount.

The preceding elements of the respondent incentive plan were all implemented at the beginning of the 2006 data collection period. On July 6, 2006, one final revision to the incentive plan was implemented for the final 8 weeks (or about 2 months) of data collection. All sample members who had not yet completed the survey were sent an express mail package with an additional $10 check as a prepayment of the full incentive amount. The remainder of the incentive was payable upon completion of the survey. If mailed packages did not reach the intended sample members and at least one alternative address was available in the sample members’ records, data collection staff re-mailed the $10 prepaid check to these sample members. The purpose of the prepaid incentive was to assure remaining sample members that NCES and RTI were serious about obtaining their participation in the survey and compensating them for completing the survey. A total of 3,200 packages with the prepaid incentive check were mailed. Another 10 sample members who had not yet completed the F2 interview did not have a current, valid address to be included in this mailing.
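The payment logic summarized above and in Exhibit A-2 can be sketched as a short function. This is an illustrative reconstruction, not the production program; the function name and flags are hypothetical, and the amounts come from the 2006 plan described in the text ($40 base for F1 nonrespondents or ever-dropouts, $20 otherwise; $10 more for early completion or difficult-case status; $20 more in the final prepayment phase, of which $10 was prepaid).

```python
def incentive_2006(f1_nonrespondent: bool, ever_dropout: bool,
                   early: bool, difficult: bool, final_phase: bool) -> int:
    """Total 2006 incentive amount, per Exhibit A-2 (illustrative sketch)."""
    # Base amount depends on F1 participation and dropout history.
    base = 40 if (f1_nonrespondent or ever_dropout) else 20
    if final_phase:
        # Final 8 weeks: $10 prepaid plus $10 over the regular amount.
        return base + 20
    if early or difficult:
        # Early-completion bonus or difficult-case increase: +$10.
        return base + 10
    return base
```

For example, an F1 respondent nondropout who completed early would be offered $30, matching the second row of amounts in Exhibit A-2.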

Throughout the 2006 data collection period, all incentive payments were provided in the form of checks. The data file for incentive payments was created at the beginning of each week and the incentive checks and thank you letters were mailed to participants at the address indicated during the last section of the interview. Because address information was occasionally incomplete or inaccurate, data collection staff investigated returned incentive checks to find an accurate mailing address so that these checks could be re-mailed. We next contrast the approach taken to the 2006 data collection with our proposal for an experiment in an alternative, and we believe superior, method for the 2012 round.

For the ELS Third Follow-Up Field Test, we propose experimentally testing a new methodology designed to focus on and minimize nonresponse bias in the final survey estimates. This experimental approach aims to reduce nonresponse bias by using multiple sources of data to produce models that estimate a sample member’s response propensity prior to the commencement of data collection. After we empirically identify those sample members with the lowest response propensities, we implement a differential incentive structure and telephone prompting schedule in an attempt to encourage these sample members’ participation. We conduct earlier telephone prompting calls and offer larger incentives to low-propensity cases because these cases often disproportionately contribute to nonresponse bias and can be harmful to the precision of survey estimates.

The experiment will be carried out as follows:

Step 1—Estimate response propensities for ELS 2011 field test sample members. Prior to the start of the field test data collection, we will estimate response propensities by predicting the second follow-up response outcome for all 2011 ELS field test sample members. The variables we are using in the propensity modeling preparation and analysis are listed below:1

  • Response status at each prior data collection

  • Mode of response in past rounds

  • Timing of response in past rounds (e.g., during early completion period)

  • Time of day of prior response

  • Number of call attempts

  • Panel maintenance update request response status

  • Completeness of contact information including address, phone, email

  • Level of recency of contact information including address, phone, email

  • High school completion status

  • Postsecondary enrollment status

  • Type of postsecondary institution attended (e.g., private, public, 4-year, 2-year)

  • Employment status

  • Family formation status

  • Assessment performance in prior rounds

  • High school academic performance (e.g., GPA)

Predicted probabilities of a completed interview in the second follow-up will be used to divide the 2011 field test sample into two equal groups: a low- and a high-response-propensity group (n = 530 for each). The low-propensity group will consist of those sample members we predict to be least likely to be interviewed. This approach represents a significant refinement of the nonresponse avoidance approaches previously used for ELS and described in detail above. In prior waves, “difficult cases” were identified using only a single characteristic, such as prior nonresponse or high school dropout status. We now propose to take advantage of the longitudinal nature of ELS, employing data from the current as well as prior rounds to produce more precise estimates of a sample member’s response propensity from a richer sampling of prior-round data.
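Step 1 can be sketched in code as follows. This is a minimal illustration, not the project’s actual model: it assumes a simple logistic regression fit by gradient ascent, with synthetic predictors standing in for the prior-round variables listed above, and then performs the median split into equal low- and high-propensity groups.

```python
import numpy as np

def estimate_propensities(X, y, lr=0.1, n_iter=2000):
    """Fit a simple logistic-regression response-propensity model.

    X: (n, k) matrix of prior-round predictors (e.g., past response
    status, contact-information recency); y: 0/1 prior-round response
    outcome. Returns a predicted response propensity for each case.
    """
    X1 = np.hstack([np.ones((X.shape[0], 1)), X])  # add intercept column
    beta = np.zeros(X1.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X1 @ beta))
        beta += lr * X1.T @ (y - p) / len(y)      # gradient ascent step
    return 1.0 / (1.0 + np.exp(-X1 @ beta))

def split_by_propensity(propensities):
    """Median split: indices of the low- and high-propensity halves."""
    order = np.argsort(propensities)
    half = len(order) // 2
    return order[:half], order[half:]
```

With a field test sample of 1,060 cases, the split yields the two groups of n = 530 described above; a production model would of course be fit with the full set of prior-round ELS variables rather than synthetic data.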

Step 2—A multiphased experiment to test different interventions with low propensity cases. The approach we propose targets low-propensity cases with different interventions throughout data collection. In the third follow-up field test, we will implement these interventions experimentally. As described above, response propensities for all cases will be calculated before the start of data collection and the cases will be assigned to a low- or high-propensity status. Low-propensity cases will be further split into treatment and control groups (n=265 for each group).

Phase 1 (early response period prompting calls)—The first 3 weeks of data collection represent the early response period. Our first intervention for low-propensity cases will occur roughly halfway through the early response period when we begin outbound prompting calls. Cases in the treatment group will receive prompts.

Phase 2 (differential incentives)—After the early response period, we propose differential incentives for nonrespondent cases based on their response propensity. Significantly larger response incentives will be offered to low-propensity treatment cases: the response incentive will be $45 for treatment cases and $25 for control cases.

Phase 3 (large incentives for all remaining nonrespondents)—Toward the end of data collection (beginning at week 10), we propose to offer large incentives to all remaining nonrespondents: $55 for treatment cases and $35 for control cases. This is a final effort to bring as many sample members into the response as we can. The incentive structure is presented by data collection phase and propensity group in Exhibit A‑3 for the field test and Exhibit A‑4 for the main study.
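The phase-by-group offer schedule for the field test can be expressed as a simple lookup. This is a sketch only; the names are hypothetical, and the amounts are those stated in the three phases above ($25 for everyone in Phase 1; $45/$25 treatment/control in Phase 2; $55/$35 in Phase 3, with high-propensity cases always receiving the control amount).

```python
# Field test offer schedule, keyed by data collection phase.
FIELD_TEST_INCENTIVES = {
    1: {"treatment": 25, "control": 25},
    2: {"treatment": 45, "control": 25},
    3: {"treatment": 55, "control": 35},
}

def incentive_amount(phase: int, low_propensity: bool, treatment: bool) -> int:
    """Offer for a nonrespondent at a given phase.

    Only low-propensity cases assigned to the treatment group receive
    the larger offers; high-propensity cases and low-propensity control
    cases receive the control amount.
    """
    group = "treatment" if (low_propensity and treatment) else "control"
    return FIELD_TEST_INCENTIVES[phase][group]
```

A low-propensity treatment case still outstanding in Phase 2 would thus be offered $45, while a high-propensity case at the same point would be offered $25.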

We understand that splitting the sample based on response propensities could result in some critical subgroups (e.g., low-income sample members, minorities) disproportionately receiving larger incentives should they fall disproportionately into the low-response-propensity group. After response propensities are calculated, we will evaluate them across demographic groups. Should there be disproportionate representation, we are prepared to poststratify the propensity groups accordingly, if NCES and OMB so desire.

Exhibit A-3. Incentives by Data Collection Phase and Propensity Group for Field Test

Phase and group                         Percent of total respondents   Number of respondents   Incentive amount
Phase 1                                 30%                            150
  High Prop. and Low Prop. – Control                                   112                     $25
  Low Prop. – Treatment                                                38                      $25
Phase 2                                 44%                            220
  High Prop. and Low Prop. – Control                                   165                     $25
  Low Prop. – Treatment                                                55                      $45
Phase 3                                 26%                            130
  High Prop. and Low Prop. – Control                                   97                      $35
  Low Prop. – Treatment                                                33                      $55



Exhibit A-4. Incentives by Data Collection Phase and Propensity Group for Main Study

Phase and group     Percent of total respondents   Number of respondents   Incentive amount
Phase 1             30%                            4,243
  High Propensity                                  3,182                   $25
  Low Propensity                                   1,061                   $25
Phase 2             44%                            6,222
  High Propensity                                  4,666                   $25
  Low Propensity                                   1,556                   $45
Phase 3             26%                            3,677
  High Propensity                                  2,758                   $35
  Low Propensity                                   919                     $55



Step 3—Evaluating results. The evaluation of the modeling experiment will be conducted using methods established in Schouten et al. (2009). We intend to evaluate the experimental results by examining how well our model predicts response outcomes and by investigating whether our treatments minimized bias. First, we will look at the response rates for groups defined by estimated response propensity (i.e., how well our assigned response propensities actually predict the survey outcome). We will then address whether the variance of the response propensities, S^2(ρ), was lowered and whether the association between the response propensity and any survey variables y we choose to examine, Cov(ρ, y), was reduced, thus minimizing nonresponse bias in survey estimates of means and proportions.
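As an illustrative sketch only (not part of the study plan), the evaluation quantities described above can be computed directly from the estimated propensities. The R-indicator form R = 1 - 2·S(ρ) follows Schouten et al. (2009); the function names are assumptions.

```python
import numpy as np

def propensity_dispersion(rho):
    """Sample standard deviation S(rho) of estimated response propensities.
    Lower dispersion indicates a more balanced response."""
    rho = np.asarray(rho, dtype=float)
    return float(rho.std(ddof=1))

def r_indicator(rho):
    """Representativeness indicator R = 1 - 2*S(rho), per Schouten et al. (2009).
    R = 1 means every case has the same propensity (fully balanced response)."""
    return 1.0 - 2.0 * propensity_dispersion(rho)

def propensity_outcome_association(rho, y):
    """Correlation between propensities and a survey variable y.
    Values near 0 suggest limited nonresponse bias in estimates based on y."""
    return float(np.corrcoef(np.asarray(rho, float), np.asarray(y, float))[0, 1])
```

A successful intervention would show the treatment group's propensity dispersion (and hence 1 - R) shrinking relative to the control group's, alongside weaker propensity-outcome correlations.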

Discussion and Conclusions. This propensity-modeling experiment can be viewed as a viable yet constrained component of the field test; that is, the generalizability of the field test results may be limited by several factors:

  (1) Some individuals who received higher incentives in the 2005 field test may not be identified for high incentives in the 2011 field test and may be negatively influenced by the unfulfilled expectations of their past experience. Although this is a reasonable intuition, empirical data suggest that panel participants are not conditioned by past incentives (Singer 1998).

  (2) The field test differs somewhat from the full-scale study in the premium put on either a high response rate or minimization of bias. In the field test, the data collection goal is driven by the need to achieve a yield sufficient for item analysis, which can be met by about 500 cases. Many of the hardest cases could therefore be pursued more aggressively in the full-scale study than in the field test; relatedly, the data collection period will be shorter in the field test. This may somewhat distort experimental results in the field test and lessen their generalizability.

  (3) The field test yield of 500 cases, once divided into low- and high-propensity groups, with the low-propensity group split into treatment and control, and further cross-classified by demographic variables, may provide only small n's for examining results by subgroup. This caveat must be noted, but the wider context is that the treatment and control groups should be numerically sufficient for purposes of the experiment, and there has been no expectation that separate results for rare policy-relevant subgroups would be meaningfully reportable.

  (4) As noted earlier, the experiment does not attempt to test incentive levels, nor is the sample large enough to support such a test.

  (5) There is some ambiguity about the basis for choosing between the historical approach and the propensity-modeling approach. The latter offers a clear and clean contrast between a treatment group and a control group. But the fact that the treatment group fares better than the control group in response prediction or in a bias analysis would not conclusively demonstrate that the treatment results are better than those of the historical plan. At best this may be assumed as probable; it would not be scientifically proven. Because the propensity-modeling approach is a more sensitive and sophisticated device for allocating incentives and other interventions (e.g., outbound prompting calls), the presumption favors this approach, subject to field test results, over the historical policy.

  (6) Nor does the experiment provide a test of paradata. Paradata comprise a large number of data elements that describe the process of data collection, including, for example, the dates and times of prior noncontacts and information surrounding refusals. In the third follow-up field test experiment, we will not test which elements of paradata are most effective in encouraging responses from sample members. Rather, we will use paradata elements available from prior rounds of ELS to enhance our ability to predict the response propensity of cases in the current round.
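To illustrate how prior-round data and paradata feed the propensity prediction described above, the following is a minimal sketch only; the predictor names, function names, and fitting method (plain gradient ascent rather than production survey software) are assumptions for illustration, not the study's actual model.

```python
import numpy as np

def fit_propensity_model(X, responded, lr=0.1, n_iter=2000):
    """Fit a simple logistic regression predicting response (1) vs.
    nonresponse (0) from prior-round predictors and paradata (e.g., counts
    of prior noncontacts, prior refusal flags). Hypothetical sketch; a
    production model would use survey weights and standard software."""
    X = np.column_stack([np.ones(len(X)), np.asarray(X, float)])  # intercept
    y = np.asarray(responded, float)
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-X @ beta))
        beta += lr * X.T @ (y - p) / len(y)  # gradient ascent on log-likelihood
    return beta

def predict_propensity(beta, X):
    """Estimated response propensity for each case."""
    X = np.column_stack([np.ones(len(X)), np.asarray(X, float)])
    return 1.0 / (1.0 + np.exp(-X @ beta))
```

Cases would then be ranked by predicted propensity, with the bottom of the distribution forming the low-propensity group eligible for treatment.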

Despite these limitations, it appears to us that a propensity-modeling approach could, with appropriate caveats, be tested and, if successful, become the basis for an improved full-scale apparatus for minimizing bias in the ELS:2002 third follow-up estimates. If the propensity-modeling approach is not successful, the obvious default is the incentive apparatus employed in the 2006 second follow-up.

A.10 Assurance of Confidentiality

RTI has prepared a data security plan (DSP) for the ELS:2002 third follow-up data collection. The plan strengthens the confidentiality protection and data security procedures developed for prior rounds of ELS:2002 and represents best-practice survey systems and procedures for protecting respondent confidentiality and securing survey data. An outline of the plan is provided in Exhibit A-5. The ELS:2002 third follow-up data collection DSP will

  • establish clear responsibility and accountability for data security and the protection of respondent confidentiality with corporate oversight to ensure adequate investment of resources;

  • detail a structured approach for considering and addressing risk at each step in the survey process and establish mechanisms for monitoring performance and adapting to new security concerns;

  • include technological and procedural solutions that mitigate risk and emphasize the necessary training to capitalize on these approaches; and

  • be supported by the implementation of data security controls recommended by the National Institute of Standards and Technology for protecting federal information systems.

Exhibit A-5. ELS:2002 Third Follow-up Data Security Plan Outline

ELS:2002 Data Security Plan Summary
  Maintaining the Data Security Plan
  Information Collection Request
  Our Promise to Secure Data and Protect Confidentiality
  Personally Identifying Information That We Collect and/or Manage
  Institutional Review Board Human Subject Protection Requirements
  Process for Addressing Survey Participant Concerns
Computing System Summary
  General Description of the RTI Networks
  General Description of the Data Management, Data Collection, and Data Processing Systems
    Integrated Monitoring System
    Receipt Control System
    Instrument Development and Documentation System
    Data Collection System
    Document Archive and Data Library
Employee-Level Controls
  Security Clearance Procedures
  Nondisclosure Affidavit Collection and Storage
  Security Awareness Training
  Staff Termination/Transfer Procedures
  Subcontractor Procedures
Physical Environment Protections
System Access Controls
Survey Data Collection/Management Procedures
  Protecting Electronic Media
    Encryption
    Data Transmission
    Storage/Archival/Destruction
  Protecting Hard-Copy Media
    Internal Hard-Copy Communications
    External Communications to Respondents
    Handling of Mail Returns, Hard-Copy Student Lists, and Parental Consent Forms
    Handling and Transfer of Data Collection Materials
  Tracing Operations
  Software Security Controls
Data File Development: Disclosure Avoidance Plan
Data Security Monitoring
  Survey Protocol Monitoring
  System/Data Access Monitoring
  Protocol for Reporting Potential Breaches of Confidentiality
Specific Procedures for Field Staff



Under this plan, the ELS:2002 third follow-up data collection will conform fully to federal privacy legislation, including the Privacy Act of 1974 (5 U.S.C. 552a) and Section C of the Education Sciences Reform Act of 2002 (P.L. 107-279). Consistent with the Privacy Act, these data will constitute a system of records, per the system of records notice 18-13-01, National Center for Education Statistics Longitudinal Studies and the School and Staffing Surveys (64 FR, No. 107, pp. 30181-82, June 4, 1999).

More specifically, it is expected that ELS:2002 will conform to the NCES Restricted Use Data Procedures Manual and NCES Standards and Policies. The plan for maintaining confidentiality includes obtaining signed confidentiality agreements and notarized nondisclosure affidavits from all personnel who will have access to individual identifiers. Each individual working on ELS:2002 will also complete the e-QIP clearance process. The security plan includes annual personnel training on the meaning of confidentiality and the procedures for maintaining it, particularly as they relate to handling requests for information and providing assurance to respondents about the protection of their responses. The training will also cover controlled and protected access to computer files, built-in safeguards in the status monitoring and receipt control systems, and a secured, operator-manned in-house computing facility.

Immediately prior to field test data collection, contacting materials will be sent to sample members and a parent to initiate data collection and offer access to the web survey (see appendix 6). Sample members are more transient at this age than their parents, so we want to engage the parents in case the sample member has relocated since our last contact with him/her. The letter to parents thanks them for their past assistance with the study, informs them that we are trying to reach their children for the third follow-up, and requests their assistance in contacting and communicating with their children about the study. We will provide parents with complete information about the third follow-up data collection except for their children’s study ID and password. This exception will protect sample members’ privacy and help ensure data security.

The letters to both sample members and parents will describe the voluntary nature of the survey. The materials sent will include a brochure that describes the study and the ways the data will be used, and that conveys the extent to which the identity of the respondents and their responses will be kept confidential. The prenotification letter will contain the following statement:

“Your answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002) Public Law 107-279, Section 183].”

During the telephone interview, the following informed consent statement will be read verbatim. We have slightly modified the language used in this passage to more accurately reflect a telephone/personal contact.

“As mentioned in the letter, you previously participated in ELS:2002 with about 15,000 other students across the country who were selected from 10th-grade classes in 2001 or 12th-grade classes in 2003. This survey is part of an education research study sponsored by the U.S. Department of Education. The purpose of ELS:2002 is to provide information that will be used to improve the quality of education in America. The interview will ask questions about your further schooling and work experiences. On average, it takes about 35 minutes to complete, depending on your responses.

“Participation is voluntary. Your answers may be used only for statistical purposes and may not be disclosed, or used, in identifiable form for any other purpose except as required by law [Education Sciences Reform Act of 2002 (ESRA 2002) Public Law 107-279, Section 183]. You may withdraw from the study at any point. However, your answers are very important because they represent many others who were not selected to take part. You may skip any question that you don’t want to answer.”

Data files, accompanying software, and documentation will be delivered to NCES at the end of the project. Neither names nor addresses will be included on any data file. A separate locator database for these sample members will be maintained in a secure location. All hard-copy tracing directory updates will be destroyed after they are entered into magnetic form and verified.

A.11 Sensitive Questions

The student interview contains items about earnings, assets, and debts. Federal regulations governing the administration of such questions, which might be viewed as “sensitive” because they concern personal or private information, require (a) clear documentation of the need for the information as it relates to the primary purpose of the study, (b) provisions that clearly inform respondents of the voluntary nature of their participation in the study, and (c) assurances of confidential treatment of responses. Earnings and assets are vital labor force variables and provide important indicators of the rate of return of educational experiences to the respondent.

If a sample member’s SSN is unknown despite the prior rounds of data collection, it will be collected in the student interview. This information is needed to obtain data from a variety of extant data sources, including student financial aid data from the Central Processing System (CPS), data from the National Student Loan Data System (NSLDS) loan and Pell grant files, and GED test results. A description of the matching procedures and the security measures in place for the linkages to extant data sources is provided in appendix 7.

We also plan to verify or collect locating information for the sample member and contact persons in case further data collection of this sample occurs in the future.

A.12 Estimates of Hour Burden for Information Collection for the Field Test and Full-scale Study

Estimates of response burden for the ELS:2002 third follow-up field test and full-scale study sample maintenance (tracing) and data collection activities are shown in Exhibit A-6. (The sample maintenance activities have already been approved by OMB.)

The field test administration will also include a reinterview with a randomly selected subset of 50 respondents. The purpose of this reinterview is to evaluate the temporal stability (in effect, the test-retest reliability) of selected interview items. In choosing items for reinterview, preference is given to items that meet the following criteria: (1) newly designed items for the study, or other new items, such as those borrowed from non-NCES studies, for which measurement properties are not well known; (2) radically revised versions of items previously used in ELS:2002 or its predecessor studies; and (3) items that are factual rather than attitudinal.

Exhibit A-6. Estimated Sample Maintenance and Data Collection Burden on Respondents for Field Test Study (2011) and Main Study (2012)

                               Sample   Expected        Number of     Average burden/      Range of response   Total burden
                                        response rate   respondents   response (minutes)   times (minutes)     (hours)
Sample maintenance
  Field test (2011)             1,060   20%                   212      5                   ----                     18
  Full-scale study (2012), 1   16,200   20%                 3,240      5                   ----                    270
  Full-scale study (2012), 2   16,200   20%                 3,240      5                   ----                    270
Data collection
  Field test (2011)             1,060   50%                   530     35                   25 to 45                309
  Student reinterview              63   80%                    50     10                   5 to 15                   8
  Full-scale study (2012)      16,200   90%                14,580     35                   25 to 45              8,505

NOTE: Table does not include optional activities (financial aid and transcript collections, which, if approved, will take place in 2013-14 and will be submitted in a separate package).
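The total burden-hour figures in Exhibit A-6 follow from the number of respondents multiplied by the average minutes per response, divided by 60. A quick arithmetic cross-check (illustrative only; the function name is hypothetical):

```python
# Cross-check of the total burden-hour figures in Exhibit A-6:
# total hours = number of respondents * average minutes per response / 60,
# rounded to the nearest hour.

def burden_hours(n_respondents: int, avg_minutes: float) -> int:
    return round(n_respondents * avg_minutes / 60)
```

For example, the field test data collection row gives 530 respondents at 35 minutes each, or about 309 hours, and the full-scale row gives 14,580 at 35 minutes, or 8,505 hours, matching the exhibit.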

Included in the notification letter and on the entry page to the online survey will be the following burden statement:

“According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 1850-0652, and participation is completely voluntary. The time required to complete this information collection is estimated to average about 35 minutes per response. If you have any comments concerning the accuracy of the time estimate or suggestions for improving the interview, please write to: U.S. Department of Education, Washington, DC 20202-4537. If you have comments or concerns regarding the status of your individual interview, write directly to: Education Longitudinal Study (ELS), National Center for Education Statistics, 1990 K Street NW, Washington, DC 20006.”

A.13 Estimates of Costs

There are no capital, startup, or operating costs to respondents for participation in the project. No equipment, printing, or postage charges will be incurred.

Estimated costs to the federal government for ELS:2002 are shown in Exhibit A-7. The estimated costs to the government for data collection for the third follow-up field test and full-scale studies are presented separately. Included in the contract estimates are all staff time, reproduction, postage, and telephone costs associated with the management, data collection, analysis, and reporting for which clearance is requested.

A.14 Costs to Federal Government

Exhibit A-7. Total Costs to NCES

Costs to NCES                      Amount (in $)
Total ELS:2002/12 costs
  Salaries and expenses                  200,000
  Contract costs                       9,647,075
Total annual ELS:2002/12 cost          3,282,358

NOTE: All costs quoted are exclusive of incentive fee. Table does not include optional activities (financial aid and transcript collections).

A.15 Reasons for Changes in Response Burden and Costs

Projected estimates for response burden and costs are based on experiences from the second follow-up study and more recent studies, including BPS:04/09. The increase in burden from the last approved clearance package is due to the fact that the last clearance was for address updates, while this clearance also requests approval for the field test data collection burden.

A.16 Publication Plans and Time Schedule

The ELS:2002/12 field test will be used to test and improve the instrumentation and associated procedures. Publications and other significant provisions of information relevant to the data collection effort will be a part of the reports resulting from the full-scale study, and both public use and restricted use data files will be important products. The ELS:2002 data will be used by public and private organizations to produce analyses and reports covering a wide range of topics. With the third follow-up, ELS:2002 data will add a fourth point in time for longitudinal analysis, and extend the cross-cohort comparison to predecessor cohorts (NELS:88, HS&B, and NLS-72).

Data files will be distributed to a variety of organizations and researchers, including offices and programs within the U.S. Department of Education, the Congressional Budget Office, the Department of Health and Human Services, Department of Labor, Department of Defense, the National Science Foundation, the American Council on Education, and a number of other education policy and research agencies and organizations. The ELS:2002 contract requires the following reports, publications, or other public information releases:

  • detailed methodological reports (one each for the field test and full-scale survey—in the form of a comprehensive Data File Documentation Report covering the base year through the third follow-up, with an appendix for the field test) describing all aspects of the data collection effort;

  • complete restricted-use, longitudinal full-scale study data files and documentation for research data users, including postsecondary institution transcript data and potentially financial aid information;

  • corresponding public-use data files for public access to ELS:2002 base-year to third follow-up results; and

  • a “first look” summary of significant descriptive findings for dissemination to a broad audience (the analysis deliverable will include technical appendices).

Final deliverables for the third follow-up are scheduled for completion in 2013. (Final deliverables for the transcript study are scheduled for completion in 2015.) The operational schedule for the ELS:2002 third follow-up field test and full-scale study is presented in Exhibit A-8.

Exhibit A-8. Operational Schedule for ELS:2002/12 Field Test and Full-Scale Activities

Activity                                           Start      End
Field test
  Panel maintenance: contact updates for sample    10/2010    6/2011
  First round of cognitive testing of items         8/2010    9/2010
  Data collection                                   7/2011   12/2011
  Second round of cognitive testing                10/2011   12/2011
Full-scale study
  Panel maintenance: contact updates for sample     9/2010    6/2012
  Data collection                                   7/2012    1/2013
Transcript and student aid data
  Pilot testing of operations                       2/2013    8/2013
  Transcript and student aid data collection        8/2013    3/2014
  Transcript keying and coding                     11/2013    8/2014


A.17 Approval to Not Display Expiration Date for OMB Approval

The expiration date for OMB approval of the information collection will be displayed on data collection instruments and materials. No special exception to this requirement is requested.

A.18 Exception to Certification for Paperwork Reduction Act Submissions

No exceptions are requested to the certification statement identified in the Certification for Paperwork Reduction Act Submissions of OMB Form 83-I.

1 Other variables may be added as the model is finalized, though race/ethnicity, gender, income, and socioeconomic status will not be included in the model.
