B. Collection of Information Employing Statistical Methods
This submission requests clearance for both the BPS:04/09 field test and the full-scale study. The sampling plan for the field test is presented below and describes the base-year and first follow-up samples as well as the proposed sample for the second follow-up. Also discussed is the process used to determine BPS eligibility in the full-scale administrations of the base-year and first follow-up studies, which informs sampling and eligibility determination for the BPS:04/09 full-scale study. Specific plans for BPS:04/09 field test activities are also provided below. These activities are intended to fully test all procedures, methods, and systems of the study in a realistic operational environment before they are implemented in the full-scale study.
The respondent universe for the BPS:04/09 field test (to be conducted in 2008) consists of all students who began their postsecondary education for the first time during the 2002–03 academic year at any eligible postsecondary institution in the United States or Puerto Rico. The sample students were first-time beginners (FTBs) interviewed during the field test of NPSAS:04. Similarly, the respondent universe for the full-scale data collection (to be conducted in 2009) consists of FTBs enrolled for the first time during the 2003–04 academic year who were identified during the NPSAS:04 full-scale data collection. The institution and student universes are described in greater detail in the subsections that follow.
BPS-eligible sample members were selected from among NPSAS-eligible institutions. Institutions eligible for NPSAS:04 were required to:
offer an educational program designed for persons who have completed secondary education;
offer at least one academic, occupational, or vocational program of study lasting at least 3 months or 300 clock hours;
offer courses that are open to more than the employees or members of the company or group (e.g., union) that administers the institution;
be eligible to distribute Title IV aid;
be located in the 50 states, the District of Columbia, or Puerto Rico; and
be other than a U.S. Service Academy.
Institutions providing only vocational, recreational, or remedial courses or only in-house courses for their own employees are excluded. U.S. Service Academies were excluded because of their unique funding/tuition base.
The above institutional eligibility conditions are consistent with those used in previous NPSAS studies, with two exceptions. The requirement to be eligible to distribute Title IV aid was implemented beginning with NPSAS:2000.1 Also, previous NPSAS studies excluded institutions that offered only correspondence courses; NPSAS:04 includes such institutions if they are eligible to distribute Title IV student aid.
The institutional sampling frame for the NPSAS:04 field test was constructed from the 2001 Integrated Postsecondary Education Data System (IPEDS) Institutional Characteristics (IC) file, the 2001 IPEDS Completions file, and the 2001 Fall Enrollment file.
Two hundred institutions were selected for the field test; of these, 173 provided lists for selection of sample students. The field test sample was selected purposively from the complement of the institutions selected for the full-scale study. This ensured that no institution would be burdened with participation in both the field test and full-scale samples, without affecting the representativeness of the full-scale sample. The field test sample of institutions was selected to approximate the distribution by institutional strata planned for the full-scale study. The distribution of the field test institutional sample is presented in table 7. Overall, about 98 percent of the sampled institutions met the NPSAS eligibility requirements; of those, about 89 percent provided enrollment lists for student sampling.
Table 7. NPSAS:04 field test institution sample sizes and yield by sampling strata
Institutional sampling strata (sector) | Frame | Sample | Eligible | Percent of sample | Provided list | Percent of eligible
--- | --- | --- | --- | --- | --- | ---
Total | 6,674 | 200 | 195 | 97.5 | 173 | 88.7
Public less-than-2-year | 321 | 3 | 2 | 66.7 | 2 | 100.0
Public 2-year | 1,225 | 71 | 70 | 98.6 | 59 | 84.3
Public 4-year non-doctorate-granting | 358 | 22 | 22 | 100.0 | 21 | 95.5
Public 4-year doctorate-granting | 276 | 12 | 12 | 100.0 | 11 | 91.7
Private not-for-profit 2-year or less | 379 | 6 | 5 | 83.3 | 5 | 100.0
Private not-for-profit 4-year non-doctorate-granting | 1,076 | 46 | 45 | 97.8 | 38 | 84.4
Private not-for-profit 4-year doctorate-granting | 537 | 15 | 15 | 100.0 | 13 | 86.7
Private for-profit less-than-2-year | 1,390 | 15 | 14 | 93.3 | 14 | 100.0
Private for-profit 2-year or more | 1,112 | 10 | 10 | 100.0 | 10 | 100.0
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Postsecondary Student Aid Study (NPSAS:04), “Field Test Methodology Report.”
The students selected for BPS:04/09 are those students eligible to participate in NPSAS:04 who were FTBs at NPSAS sample institutions. Consistent with previous NPSAS studies, students eligible for NPSAS:04 were those enrolled in eligible institutions who satisfied the following eligibility requirements:
were enrolled in either (a) an academic program; (b) at least one course for credit that could be applied toward fulfilling the requirements for an academic degree; or (c) an occupational or vocational program that required at least 3 months or 300 clock hours of instruction to receive a degree, certificate, or other formal award;
were not concurrently enrolled in high school; and
were not enrolled solely in a General Equivalency Diploma (GED) or other high school completion program.
NPSAS-eligible students enrolled in a postsecondary institution for the first time during the “NPSAS year” (July 1, 2002–June 30, 2003 for the field test and July 1, 2003–June 30, 2004 for the full-scale data collection) after completing high school were considered “pure” FTBs and eligible for participation in the BPS:04 longitudinal series of studies. NPSAS-eligible students who enrolled for at least one course after completing high school but never completed a postsecondary course before the targeted academic year, considered “effective” FTBs, were also eligible for membership in the cohort. The BPS student universe includes both pure and effective FTBs.
The BPS student sample was identified as part of the base year study—NPSAS:04. BPS eligibility for NPSAS:04 nonrespondents who were potential FTBs was determined in the BPS:04/06 interview. The sections below describe the steps taken to create and evaluate the student sample for the BPS:04 field test data collections in each phase of the longitudinal study.
The student sample sizes for the NPSAS:04 field test were set to approximate the distribution planned for the NPSAS:04 full-scale study, with the exception that additional FTBs were selected to have more available for the BPS:04/06 field test. As shown in table 8, the NPSAS:04 field test was designed to sample 1,288 students, including 807 FTBs; 356 other undergraduate students; and 125 graduate and first-professional students. There were eight student sampling strata:
four sampling strata for undergraduate students:
FTB in-state tuition students,
FTB out-of-state tuition students,
other undergraduate in-state tuition students, and
other undergraduate out-of-state tuition students;
three sampling strata for graduate students:
master’s,
doctoral,
other graduate students; and
a sampling stratum for first-professional students.
The numbers of FTBs shown in table 8 include both “pure” FTBs, who began their postsecondary education for the first time during the NPSAS field test year, and “effective” FTBs, who had not completed a postsecondary class prior to the NPSAS field test year. Unfortunately, postsecondary institutions cannot readily identify their FTB students. Therefore, the NPSAS sampling rates for students identified as FTBs and as other undergraduate students by the sample institutions were adjusted to achieve the expected counts after accounting for expected false positive (students identified as FTBs who were not) and false negative (students not identified as FTBs who were) rates. The false positive and false negative FTB rates experienced in NPSAS:96 (i.e., the most recent NPSAS to include a BPS base-year cohort) were used to set appropriate sampling rates for the NPSAS:04 field test.2 The overall expected and actual student sample sizes are shown in table 8.
Table 8. Expected and actual field test student samples, by student type and level of institutional offering: NPSAS:04 field test
Student type and institutional offering level | Expected student sample¹ | Actual student sample
--- | --- | ---
Total | 1,288 | 1,281
Potential FTB | 807 | 787
  Less-than-2-year | 196 | 84
  2-year | 357 | 406
  4-year | 254 | 297
Other undergraduate | 356 | 361
  Less-than-2-year | 25 | 10
  2-year | 81 | 72
  4-year | 250 | 279
Master’s (4-year) | 57 | 26
Doctoral (4-year) | 36 | 31
Other graduate (4-year) | 11 | 61
First-professional (4-year) | 21 | 15
1 Based on sampling rates, Fall 2001 IPEDS Fall Enrollment file counts, and Fall 2001 IPEDS Completions file counts.
NOTE: FTB = first time beginner.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Postsecondary Student Aid Study (NPSAS:04), “Field Test Methodology Report.”
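The rate adjustment described before table 8 amounts to simple arithmetic on the misclassification rates. The sketch below is illustrative only, not the project's sampling program: it uses the NPSAS:96 false positive and false negative rates cited in footnote 2, and the target count of true FTBs and the planned draw of listed other undergraduates are hypothetical round numbers.

```python
# Illustrative sketch (not the project's sampling program) of adjusting the
# number of institution-listed FTBs to sample so that the expected number of
# true FTBs reaches a target, after accounting for misclassification.
# The rates are the NPSAS:96 values cited in footnote 2; the target and the
# other-undergraduate draw below are hypothetical.

FALSE_POSITIVE_RATE = 0.276  # listed as FTB but not actually an FTB (NPSAS:96)
FALSE_NEGATIVE_RATE = 0.091  # listed as other undergraduate but actually an FTB (NPSAS:96)


def expected_true_ftbs(n_listed_ftb: float, n_listed_other: float) -> float:
    """Expected number of true FTBs among the sampled students."""
    return (n_listed_ftb * (1 - FALSE_POSITIVE_RATE)
            + n_listed_other * FALSE_NEGATIVE_RATE)


def listed_ftbs_needed(target_true_ftbs: float, n_listed_other: float) -> float:
    """Number of listed FTBs to sample to hit the target, given the planned
    draw of listed other undergraduates."""
    from_other = n_listed_other * FALSE_NEGATIVE_RATE
    return (target_true_ftbs - from_other) / (1 - FALSE_POSITIVE_RATE)


if __name__ == "__main__":
    target, n_other = 600, 356  # hypothetical target of true FTBs; illustrative other-undergraduate draw
    n_ftb = listed_ftbs_needed(target, n_other)
    print(f"Sample about {n_ftb:.0f} listed FTBs "
          f"(check: {expected_true_ftbs(n_ftb, n_other):.1f} expected true FTBs)")
```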
To create student sampling frames, each participating institution was asked to provide a list of eligible students. The requests for student lists specifically indicated how to handle special cases such as students taking only correspondence or distance learning courses, foreign exchange students, continuing education students, extension division students, and nonmatriculated students. The data elements required for each enrollee were the student’s name and ID/Social Security number (for abstracting student records), the student’s level during the last term of enrollment (undergraduate, master’s, doctoral, other graduate program, or first-professional), and FTB status. Contact information, such as school and home telephone numbers and addresses, was also requested.
The student samples were selected from the lists provided by the sample institutions. To reduce data processing costs associated with selection of sample students, we attempted to obtain electronic lists (e.g., via e-mail, upload to a project website, or on diskette or CD-ROM) whenever possible. When electronic lists could not be provided, hard-copy lists printed in either alphabetical or student ID number order were requested to facilitate unduplicating the sample selected from multiple hard-copy lists. Unduplicated electronic lists, lists by term of enrollment, and lists by type of student (e.g., FTB, other undergraduate, graduate, and first-professional) were all accepted.
Student sample results from the NPSAS:04 field test. Student samples for the NPSAS:04 field test were selected only from the first 77 institutions that provided lists. These 77 institutions provided sufficient variation and numbers of sample students for the field test. If the 1,288 expected sample students were selected from all 173 participating institutions, the sample size per institution would have been too small to adequately test procedures during the field test. The student lists from the institutions that were not used for the NPSAS:04 field test were used to supplement the field test sample for BPS:04/09, as described below.
Table 9 provides the interview results from the NPSAS:04 field test for each of the institutional strata. Of the 1,281 students sampled for the field test, 1,159 were determined to be NPSAS-eligible. There were 824 student interview respondents, and 310 of these were confirmed as FTBs in the student interview.
Table 9. NPSAS:04 field test student sample, by institutional sector, eligibility, response status, and FTB status
Institutional sector | Number sampled | Number eligible | Respondents | Respondents confirmed as FTB | Nonrespondents
--- | --- | --- | --- | --- | ---
Total | 1,281 | 1,159 | 824 | 310 | 335
Public |  |  |  |  | 
  Less-than-2-year | 35 | 30 | 22 | 8 | 8
  2-year | 383 | 317 | 197 | 97 | 120
  4-year non-doctorate-granting | 187 | 179 | 138 | 59 | 41
  4-year doctorate-granting | 196 | 187 | 136 | 33 | 51
Private not-for-profit |  |  |  |  | 
  2-year or less | 59 | 57 | 38 | 8 | 19
  4-year non-doctorate-granting | 226 | 216 | 168 | 60 | 48
  4-year doctorate-granting | 88 | 87 | 70 | 24 | 17
Private for-profit |  |  |  |  | 
  Less-than-2-year | 59 | 43 | 24 | 7 | 19
  2-year or more | 48 | 43 | 31 | 14 | 12
NOTE: First-time beginner (FTB) status was determined by student interview. NPSAS = National Postsecondary Student Aid Study.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004 National Postsecondary Student Aid Study (NPSAS:04), “Field Test Methodology Report.”
The BPS:04/06 field test sample was drawn from the NPSAS:04 field test interview study respondents who confirmed their FTB status, and from among the nonrespondents who were identified as potential FTBs by their institutions. However, to obtain the 1,000 interviews needed to test the questionnaires and procedures across the institutional strata, the field test sample also included a supplemental sample of potential FTBs who had not yet been contacted. Each of these three groups is described below. Table 10 provides the details of the field test sample distribution.
Confirmed FTBs who responded to NPSAS:04—All 310 students from the NPSAS:04 field test sample who responded to the NPSAS field test student interview and verified their FTB status were included in the BPS:04/06 follow-up field test sample.
Potential FTBs who were NPSAS:04 nonrespondents—There were 335 students who were sampled for, but did not respond to, the NPSAS:04 field test student interview. Of these nonrespondents, 213 were identified as FTBs by their sample institution and had a valid Social Security number.3 As an additional step to improve the likelihood that base-year nonrespondents would be eligible for inclusion in the BPS:04/06 field test cohort, the indicator for FTB status according to the Central Processing System (CPS) 4 was considered whenever possible. Students who matched to CPS (2002/03) and were identified as FTBs (92 students) were included in the sample. Base-year nonrespondents identified as potential FTBs by their institution who did not match to CPS were also included in the field test sample (83 students) for a total of 175 students. Because of the difficulty in locating and interviewing nonrespondents to prior studies, any student who was identified as an FTB by the institution, but who matched to CPS and was not identified as FTB (38 students), was excluded from the sample.
Potential FTBs not yet contacted—It was necessary to supplement the BPS:04/06 field test sample because the NPSAS:04 field test did not yield enough confirmed FTBs to adequately test the questionnaire and procedures for BPS. Thus, a supplemental sample of students who were selected for the NPSAS:04 field test but were not included in the final base-year student sample were also included in the BPS:04/06 field test sample. As noted earlier in this section, only 77 of the 173 institutions that provided lists were actually used for the NPSAS field test. To increase the likelihood of locating and interviewing an FTB from this group of students who had not yet been contacted, the students in the supplemental sample were restricted to those identified as FTBs by institution indicators, with a valid Social Security number, and with locating information either from CPS or Telematch.
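The inclusion rules for these three groups can be restated schematically. The sketch below is a plain-language restatement of the rules just described, not project code; the record fields are hypothetical.

```python
# Schematic restatement of the BPS:04/06 field test inclusion rules described
# above; this is not project code, and the record fields are hypothetical.

def include_in_bps0406_field_test(rec: dict) -> bool:
    # Group 1: NPSAS:04 field test respondents who confirmed FTB status.
    if rec["npsas_respondent"] and rec["interview_confirmed_ftb"]:
        return True

    # Group 2: NPSAS:04 nonrespondents flagged as FTBs by their institution,
    # with a valid SSN. A CPS match that contradicts FTB status excludes the case.
    if (not rec["npsas_respondent"] and rec["institution_listed_ftb"]
            and rec["valid_ssn"]):
        if rec["cps_matched"]:
            return rec["cps_says_ftb"]  # keep only if CPS also indicates FTB
        return True                      # no CPS match: keep on the institution flag alone

    # Group 3: supplemental sample of potential FTBs not yet contacted,
    # restricted to cases with a valid SSN and locating data from CPS or Telematch.
    if (rec["supplemental_not_contacted"] and rec["institution_listed_ftb"]
            and rec["valid_ssn"] and rec["has_cps_or_telematch_locating_info"]):
        return True

    return False
```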
The field test sample for BPS:04/06 was designed to yield a total of 1,060 respondents, distributed as shown in table 10 across strata defined by type of institution. This stratum definition for the field test was a collapsed version of the NPSAS:04 institutional strata.
Table 10. Actual distribution of respondents to the BPS:04/06 field test, by institution type
Institution type | Number of respondents
--- | ---
Total | 1,060
Public less-than-2-year | 10
Public 2-year | 250
Public 4-year non-doctorate-granting | 100
Public 4-year doctorate-granting | 140
Private not-for-profit less-than-4-year | 20
Private not-for-profit 4-year non-doctorate-granting | 220
Private not-for-profit 4-year doctorate-granting | 80
Private for-profit less-than-2-year | 150
Private for-profit 2-year or more | 100
NOTE: Numbers have been rounded to the nearest 10. Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2004/2006 Beginning Postsecondary Students Longitudinal Study (BPS:04/06), “Field Test Methodology Report.”
The BPS:04/09 field test sample will consist of the approximately 1,060 respondents to the BPS:04/06 field test plus an additional 80 cases who responded to the NPSAS:04 field test but not the BPS:04/06 field test, for a total of 1,137 cases. Table 11 shows the number of cases in the sample and the expected response rate by NPSAS:04 and BPS:04/06 field test response status. The expected response rates are based on the rates obtained in the BPS:96/01 field test and full-scale studies. The BPS:04/09 field test sample is expected to yield 865 respondents.
Table 11. BPS:04/09 field test sample sizes
NPSAS:04 field test response status | BPS:04/06 field test response status | Number of cases to be included in BPS:04/09 sample | Expected BPS:04/09 response rate (percent) | Expected BPS:04/09 respondents
--- | --- | --- | --- | ---
Total | Total | 1,137 |  | 865
Respondent | Respondent | 230 | 84 | 193
Respondent | Nonrespondent | 80 | 63 | 50
Nonrespondent | Respondent | 43 | 79 | 34
BPS supplemental sample (not in NPSAS:04) | Respondent | 784 | 75 | 588
NOTE: BPS = Beginning Postsecondary Students. NPSAS = National Postsecondary Student Aid Study.
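The expected yield in table 11 follows directly from multiplying the cases in each stratum by the response rate expected from BPS:96/01 experience. A minimal sketch of that arithmetic, using the figures copied from table 11, is shown below.

```python
# Minimal sketch of the expected-yield arithmetic behind table 11: cases in
# each stratum multiplied by the expected response rate, then rounded.
# All figures are copied from table 11.

strata = [
    # (NPSAS:04 status, BPS:04/06 status, cases, expected response rate)
    ("Respondent",    "Respondent",    230, 0.84),
    ("Respondent",    "Nonrespondent",  80, 0.63),
    ("Nonrespondent", "Respondent",     43, 0.79),
    ("Supplemental",  "Respondent",    784, 0.75),
]

total_cases = sum(n for _, _, n, _ in strata)
expected_respondents = sum(round(n * rate) for _, _, n, rate in strata)

print(f"Total cases: {total_cases}")                     # 1,137
print(f"Expected respondents: {expected_respondents}")   # 865
```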
The following section describes the process of cohort identification from the full-scale study, beginning with NPSAS through the first follow-up. Identifying FTBs for membership in the BPS:04/06 cohort required an extensive process involving data collected across NPSAS:04 and BPS:04/06. Data were collected from a number of sources, including
lists of students enrolled during the targeted academic year (2002–03 for the field test and 2003–04 for the full-scale study) at the NPSAS-eligible institutions that provided the lists;
student-level data abstracted from the student’s institutional record using a computer-assisted data entry (CADE) system;
records matches, conducted across academic years, to two extant databases: the CPS and the National Student Loan Data System (NSLDS);
student interviews conducted as part of NPSAS:04 and as part of BPS:04/06; and
a one-time record match to the National Student Clearinghouse (NSC) StudentTracker database conducted in September 2006 (full-scale only).
The following section describes the process by which sample members were identified for the full-scale study and ultimately classified as FTBs across these multiple data sources and time periods.
To begin the NPSAS:04 data collection, NPSAS-eligible institutions were asked to submit to RTI lists of all students enrolled at the institution at any time during the 2003–04 academic year. Students were classified by their institutions as FTBs, other undergraduates, or graduate and first-professional students. Students were to be identified as FTBs if they were undergraduates enrolled at some time between July 1, 2003 and June 30, 2004 and, prior to July 1, 2003, had not earned any postsecondary degrees or completed any postsecondary classes toward a degree or formal award since completing high school.5 Table 12 presents the number of NPSAS-eligible FTBs and other undergraduate, graduate, and first-professional students sampled from institution lists, according to how they were initially listed by the institutions.
Table 12. Distribution of first-time beginners (FTBs) and other undergraduate, graduate, and first-professional students as listed initially by NPSAS institutions: 2004
Initial institution classification | Unweighted count | Weighted count | Unweighted percent | Weighted percent
--- | --- | --- | --- | ---
Total, NPSAS-eligible sample | 101,010 | 17,267,520 | 100.0 | 100.0
Listed FTB students | 42,400 | 3,336,030 | 42.0 | 19.3
Listed other undergraduate and graduate students | 55,690 | 13,610,990 | 55.1 | 78.8
Unknown classification | 2,920 | 320,510 | 2.9 | 1.9
NOTE: Detail may not sum to totals because of rounding. NPSAS = National Postsecondary Student Aid Study.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04) and 2004/06 Beginning Postsecondary Students Longitudinal Study (BPS:04/06).
Identification by their institutions as FTBs was only the first step in confirming students’ eligibility for membership in the BPS cohort. Additional information was obtained by matching the entire NPSAS:04 sample to two extant databases. The first, the CPS, contains the records of all students who applied for federal financial aid using the Free Application for Federal Student Aid (FAFSA) for the 2003–04 academic year. Question 24 (Q24) of the FAFSA asked applicants about their year in postsecondary education:
What will be your grade level when you begin the 2003–2004 school year?
Sample members who answered Q24 as “first year/never attended college before” were considered FTBs according to CPS and, therefore, potentially eligible for membership in the BPS cohort.
In addition to the CPS, the NPSAS:04 student sample was matched to the NSLDS, which, as the central data system for federal student aid, contains records of Pell Grants and federal student loans (including subsidized and unsubsidized Stafford, Perkins, and PLUS loans). As a history file, the NSLDS contains aid records for all years of a student’s funding, not just the current academic year. Although NSLDS records do not contain an FTB indicator, it was assumed that any student with a record of federal financial aid receipt prior to the 2003–04 academic year could not have been an FTB in 2003–04.
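Taken together, the CPS and NSLDS screens reduce to two simple rules. The sketch below restates them for clarity; the function and field names are hypothetical, not CPS or NSLDS record layouts.

```python
# Schematic restatement of the two record-match screens described above; the
# function and field names are hypothetical, not CPS or NSLDS record layouts.

def ftb_per_cps(fafsa_q24_answer: str) -> bool:
    """FAFSA Q24: applicants answering 'first year/never attended college
    before' are treated as FTBs according to CPS."""
    return fafsa_q24_answer == "first year/never attended college before"

def excluded_by_nslds(aid_award_years: list[int]) -> bool:
    """Any federal aid record before the 2003-04 award year rules out FTB
    status in 2003-04 (NSLDS carries no FTB flag, so only prior receipt is
    informative). Years are the starting calendar year of each award year."""
    return any(year < 2003 for year in aid_award_years)
```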
Two additional sources of student data were involved in the determination of eligibility for the BPS cohort. First, as part of NPSAS:04, records at the NPSAS institutions were abstracted for the entire sample using a CADE methodology. The CADE abstraction instrument contained one item, Question 8, that could help identify a sample member as an FTB according to the definition reported under IPEDS:
Is this student classified as a first-time, first-year degree-seeking student for IPEDS reporting purposes? [y/n]
The response provided for CADE Question 8 helped confirm a student’s eligibility as an FTB but, given the additional requirement of full-time status under the IPEDS definition, could not be used to exclude sample members from the BPS cohort.
In addition to CADE, attempts were made to interview students selected for NPSAS:04 using either a self-administered web interview or computer-assisted telephone interviewing (CATI). Several items in the NPSAS:04 student interview helped to clarify a student’s status as an FTB. Depending on whether the NPSAS institution was the first postsecondary institution attended in the 2003–04 academic year, students were asked either N4FSTSTR or N4SCHSTR (see table 13), which determined when the sample member first enrolled at any postsecondary institution after completing high school requirements. If the sample member reported enrollment prior to the 2003–04 academic year, N4CMPCLS determined whether credit was earned for the prior enrollment. As long as the student did not earn transferable credit for postsecondary enrollment between high school completion and July 1, 2003, he or she would still be considered potentially eligible for the BPS cohort.
Table 13. NPSAS:04 student interview items for determining student status as first-time beginner (FTB): 2004
Variable | Item | Administered to
--- | --- | ---
N4SCHSTR | In what month and year did you first attend [NPSAS] after completing high school requirements? | Undergraduate respondents whose first school was NPSAS
N4FSTSTR | In what month and year did you first attend any college, university, or trade school after high school? | Undergraduate respondents whose first school was not NPSAS
N4CMPCLS | Did you complete one or more postsecondary classes (at a college or trade school) toward a degree or formal award between the time you completed high school and July 1, 2003? | Undergraduates who first enrolled at a postsecondary institution prior to July 1, 2003 and are either in the first or second year of a degree program, or not in a degree program
NOTE: NPSAS = National Postsecondary Student Aid Study.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04) and 2004/06 Beginning Postsecondary Students Longitudinal Study (BPS:04/06).
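The interview routing in table 13 can be summarized as a small decision rule. The sketch below is a schematic restatement using the variable names from the table; the simplified date handling and the choice to leave eligibility open when data are missing are illustrative assumptions, not the study's actual edit rules.

```python
# Schematic restatement of the interview routing in table 13, using the
# variable names from the table. The date handling and missing-data behavior
# are illustrative assumptions, not the study's actual edit rules.

from datetime import date
from typing import Optional

CUTOFF = date(2003, 7, 1)  # start of the 2003-04 NPSAS year

def potentially_eligible_ftb(first_school_was_npsas: bool,
                             n4schstr: Optional[date],
                             n4fststr: Optional[date],
                             n4cmpcls: Optional[bool]) -> bool:
    """True if the interview responses leave the student potentially
    eligible for the BPS:04 cohort."""
    # Date of first postsecondary enrollment, from whichever item was asked.
    first_enrolled = n4schstr if first_school_was_npsas else n4fststr
    if first_enrolled is None:
        return True  # missing data: leave eligibility open for later review
    if first_enrolled >= CUTOFF:
        return True  # first enrolled during or after the 2003-04 NPSAS year
    # Earlier enrollment: still potentially eligible only if no class toward a
    # degree or formal award was completed before July 1, 2003 (N4CMPCLS).
    return n4cmpcls is False
```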
At the end of the NPSAS:04 data collection, all available information for sample members (classification on the institution lists, student interview responses, CADE records, and CPS and NSLDS record matches) was reviewed to make a final determination of BPS eligibility. Because these sources were sometimes contradictory, a judgment was made as to the likely eligibility of each sample member. The outcome of this review is shown in table 14, according to the institution’s original classification of the student. Institution listings of students were correct for about 86 percent6 of the NPSAS:04 eligible sample. About 35 percent of the students listed as FTBs were determined not to be FTBs (false positives), while about 10 percent of the listed other undergraduate, graduate, and first-professional students were determined actually to be FTBs (false negatives). About 17 percent of those whose FTB status was unknown at the time of listing were ultimately classified as FTBs.
Table 14. First-time beginner (FTB) status following NPSAS:04 interview, record abstraction, and record matching, by initial institutional classification: 2004
Initial institution classification | Total count (unweighted) | Total count (weighted) | Percent confirmed (unweighted) | Percent confirmed (weighted) | Percent error in FTB status (unweighted) | Percent error in FTB status (weighted)
--- | --- | --- | --- | --- | --- | ---
Total | 101,010 | 17,267,520 | 78.2 | 85.5 | 21.8 | 14.5
FTB | 42,400 | 3,336,030 | 61.9¹ | 65.3 | 38.2 | 34.7
Other undergraduate, graduate, or first-professional | 55,690 | 13,610,990 | 90.5 | 90.5 | 9.6 | 9.6
Unknown classification² | 2,920 | 320,510 | 83.1 | 83.2 | 16.9 | 16.8
1 Includes 340 cases listed by the NPSAS institution as FTBs who were later determined to be FTBs at another institution. Because these cases were ultimately retained for the BPS:04 cohort, they were counted among the confirmed FTBs.
2 Students whose status was unknown according to the initial list classification were assumed to be non-FTBs.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04) and 2004/06 Beginning Postsecondary Students Longitudinal Study (BPS:04/06).
In preparation for the BPS:04/06 full-scale data collection, the 32,170 FTBs identified during NPSAS:04 were subsampled to yield a starting sample of 23,090 sample members. NPSAS:04 interview nonrespondents were asked the same set of base-year interview items, described above, to determine eligibility for the BPS:04 cohort. In addition, a subset of the base-year respondents whose eligibility as FTBs remained in question, despite their interview responses and the results of the CPS and NSLDS record matching, were rescreened. Table 15 presents the FTB status of the BPS:04/06 sample following the BPS:04/06 interview, according to the original classification of the sample member by the NPSAS institution. Of the sample interviewed during the BPS:04/06 study, 99 percent were confirmed to be FTBs.
Table 15. First-time beginner (FTB) status following BPS:04/06 interview according to initial FTB listing by NPSAS institution: 2006
Initial institution classification | Total count (unweighted) | Total count (weighted) | Percent confirmed¹ (unweighted) | Percent confirmed¹ (weighted) | Percent error in FTB status (unweighted) | Percent error in FTB status (weighted)
--- | --- | --- | --- | --- | --- | ---
Total | 23,090 | 2,770,780 | 98.6 | 98.5 | 1.4 | 1.5
FTB | 18,010 | 1,579,170 | 98.9 | 99.1 | 1.1 | 0.9
Other undergraduate, graduate, or first-professional | 4,530 | 1,133,010 | 97.5 | 97.6 | 2.5 | 2.4
Unknown classification² | 550 | 58,220 | 98.6 | 99.6 | 1.4 | 0.4
1Includes those students who were confirmed to be FTBs as well as those who were nonrespondents to the BPS:04/06 interview.
2 Students whose status was unknown according to the initial list classification were assumed to be non-FTBs.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04).
As part of the BPS:04/06 data collection, the BPS:04 cohort was again matched to the CPS (for every application year since 2004) and to the NSLDS. In addition, in 2006, the cohort sample was matched to a new source, the NSC StudentTracker database, which contains enrollment and degree completion data for students enrolled in NSC-participating institutions.7 A record match for a student’s enrollment at the NPSAS institution was obtained for about 60 percent of FTBs.
As a history file, the NSC shows current and prior postsecondary enrollment, as well as degrees attempted and earned at all known institutions for an individual student. Consequently, the NSC data provided another opportunity to evaluate whether sample members were appropriately classified as FTB students in the 2003–04 academic year, in light of what was already known from the CPS, NSLDS, and interviews (sources that were sometimes contradictory). If the NSC data confirmed enrollment prior to July 2003, or indicated degrees earned prior to 2003, the sample member was concluded to be ineligible for the BPS:04 cohort. Table 16 presents the final determination of BPS:04 cohort eligibility as a result of records matching to the CPS, NSLDS, and NSC databases in 2006. Based on the combination of information known about sample members across sources, another 7 percent of the sample initially classified as FTB was determined to be ineligible for the BPS:04 cohort.
Table 16. First-time beginner (FTB) status following records matching to CPS, NSLDS, and NSC databases according to initial FTB listing by NPSAS institution: 2006
Initial institution classification | Total count¹ (unweighted) | Total count¹ (weighted) | Percent confirmed (unweighted) | Percent confirmed (weighted) | Percent error rate (unweighted) | Percent error rate (weighted)
--- | --- | --- | --- | --- | --- | ---
Total | 22,760 | 2,728,190 | 79.8 | 73.5 | 20.2 | 26.5
FTB | 17,800 | 1,564,860 | 89.4 | 91.1 | 10.6 | 8.9
Other undergraduate, graduate, or first-professional | 4,420 | 1,105,360 | 47.6 | 51.0 | 52.4 | 49.0
Unknown classification² | 550 | 57,970 | 25.3 | 26.4 | 74.7 | 73.6
1The total count of FTBs decreased when additional information collected from records matching determined the students were not actually FTBs during the 2003–04 academic year.
2 Students whose status was unknown according to the initial list classification were assumed to be non-FTBs.
NOTE: Detail may not sum to totals because of rounding. CPS = Central Processing System. NSC = National Student Clearinghouse. NSLDS = National Student Loan Data System.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04) and 2004/06 Beginning Postsecondary Students Longitudinal Study (BPS:04/06).
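The NSC screen described before table 16 reduces to a simple exclusion rule. The sketch below restates it for clarity; the field names are hypothetical, not the StudentTracker layout.

```python
# Schematic restatement of the NSC-based exclusion rule described before
# table 16; the record fields are hypothetical, not the StudentTracker layout.

from datetime import date

def ineligible_per_nsc(enrollment_start_dates: list[date],
                       degree_award_years: list[int]) -> bool:
    """A sample member is ruled out of the BPS:04 cohort if NSC shows
    enrollment prior to July 2003 or a degree earned prior to 2003."""
    prior_enrollment = any(d < date(2003, 7, 1) for d in enrollment_start_dates)
    prior_degree = any(year < 2003 for year in degree_award_years)
    return prior_enrollment or prior_degree
```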
Identification of FTBs for the BPS:04 cohort, therefore, began with the NPSAS institution’s classification of students and ended with the completion of records matching following data collection for BPS:04/06. Table 17 shows the final false positive and false negative rates of the initial institutional classification. From among the students initially classified by their institutions as FTBs for the 2003–04 academic year, 48 percent were ultimately determined not to be FTBs (false positives). Among those classified by their NPSAS institution as other undergraduate, graduate, and first-professional students, about 4 percent were determined to be FTBs and retained for the BPS:04 cohort (false negatives) with another 13 percent of those with unknown classification determined to be FTBs as well.
Table 17. Final false positive and false negative rates for classification of first-time beginners (FTBs) by NPSAS institution following NPSAS:04 and BPS:04/06 student interviewing and records matching: 2006
Initial institution classification | Count (unweighted) | Count (weighted) | Percent error rate (unweighted) | Percent error rate (weighted)
--- | --- | --- | --- | ---
Total, NPSAS-eligible sample | 101,010 | 17,267,520 | † | †
Listed FTB students (false positives) | 42,400 | 3,336,030 | 53.4 | 47.9
Listed other undergraduate and graduate students (false negatives) | 55,690 | 13,610,990 | 4.2 | 4.1
Unknown classification¹ (false negatives) | 2,920 | 320,510 | 14.0 | 13.3
† Not applicable.
1 Students whose status was unknown according to the initial list classification were assumed to be non-FTBs.
NOTE: Detail may not sum to totals because of rounding.
SOURCE: U.S. Department of Education, National Center for Education Statistics, 2003–04 National Postsecondary Student Aid Study (NPSAS:04) and 2004/06 Beginning Postsecondary Students Longitudinal Study (BPS:04/06).
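For clarity, the error rates in table 17 are defined as shares of the initial institution classification. The sketch below simply states those definitions; the example counts are invented for illustration and are not published figures.

```python
# Definitions of the error rates reported in table 17; the example counts
# below are invented for illustration and are not published figures.

def false_positive_rate(n_listed_ftb: int, n_listed_ftb_not_ftb: int) -> float:
    """Share of institution-listed FTBs ultimately determined not to be FTBs."""
    return n_listed_ftb_not_ftb / n_listed_ftb


def false_negative_rate(n_listed_non_ftb: int, n_listed_non_ftb_actually_ftb: int) -> float:
    """Share of students listed as non-FTBs ultimately determined to be FTBs."""
    return n_listed_non_ftb_actually_ftb / n_listed_non_ftb


# Illustration with invented counts: 1,000 listed FTBs of whom 534 were
# reclassified, and 1,000 listed non-FTBs of whom 42 turned out to be FTBs.
print(false_positive_rate(1_000, 534))  # 0.534
print(false_negative_rate(1_000, 42))   # 0.042
```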
Response rates for BPS:04/09 will be a function of success in two basic activities: tracing and locating the sample members and, once contacted, gaining their cooperation and time to complete the interview in one of three modes: a self-administered web-based interview, a telephone interview, or an in-person field interview. We will use an advance tracing procedure initially and follow this by more intensive procedures, including field interviewing and the use of incentives, as necessary. Given the tracing, self-administration, telephone interviewing, and field interviewing plans we discuss below, we are confident that we will be able to achieve an effective response rate of 85 percent for the BPS:04/09 full-scale study. For the field test, due to its shorter time frame, the goal is to yield 865 (76 percent) completed interviews, rather than an 85 percent response rate.
One of the main issues for the BPS:04/09 data collection effort will be locating the members of the sample cohort. While most of these individuals have not been contacted since the BPS:04/06 first follow-up interview, some will not have been contacted since the NPSAS:04 base-year interview. Members of the cohort are highly mobile and may have completed their degrees, transferred to a different college, or moved. The high mobility rate of this population presents challenges to the BPS:04/09 tracing effort.
A successful tracing operation depends on many factors, including the characteristics of the population to be located, the age of the locating information for that population (approximately 2 to 5 years old at the start of data collection), and the completeness and accuracy of that information. To maximize our location rate, sufficient resources will be devoted to tracing operations both in-house and in the field, with careful consideration given to identifying and implementing the most effective, yet cost-efficient, tracing strategies for this population. The locator database for the cohort includes critical tracing information for most of the sample members, including their previous residences and telephone numbers. Moreover, Social Security numbers are available for virtually all of the sample members (99 percent), as well as other information useful for tracing.
To achieve the desired yield in the field test and the desired response rate in the full-scale study required by NCES standards, we propose a multistage tracing approach that will capitalize on the availability of locating data obtained in the previous rounds and the continuing cooperation of sample members. This multistage approach will consist of several steps designed to yield the maximum number of locates with the least expense. During the field test, we will evaluate the effectiveness of these procedures for the full-scale survey effort. The steps of our multistage tracing plan include the following elements.
Advance tracing. We propose to employ an advance tracing operation prior to field test and full-scale data collection that will update the addresses of sample members. Advance tracing will include searches of the U.S. Department of Education’s CPS for information on financial aid recipients. We will also conduct computerized searches of other databases, including the National Change of Address (NCOA), Telematch and ComServ's Death Information System. We will compare all sample member addresses obtained from the locator database (with information from NPSAS:04 and BPS:04/06) against the NCOA and Telematch databases to identify sample members who have moved since the previous follow-up. Updated addresses and telephone numbers produced by these advance tracing activities will be entered into the BPS:04/09 locator database and made available to data collection personnel at the start of data collection.
Advance interactive tracing. After the completion of advance tracing, cases without good locating information (primarily cases that participated only in NPSAS:04) will be directed for additional interactive tracing. Specially trained tracing staff will perform intensive tracing to locate additional contact information for these cases. In many instances, this will involve an interactive credit bureau search and may involve other interactive databases of locator information.
Parent mailing. In December 2007, we will mail a letter to the parents of all sample members informing them that their child’s participation will be requested. This letter will also include a study brochure, address update form, and a business reply envelope.
Initial contact mailing to sample members. Beginning in January 2008, we will mail a personalized letter (signed by the NCES Commissioner), study brochure, address update form, and business reply envelope to all sample members. This letter will include the study's website address and toll-free telephone number, and will request that sample members update their postal and electronic mail addresses. Undeliverable mailings will be recorded, and the next best address will be used to resend the materials. Once all potential addresses for a sample member are exhausted, we will contact other information sources for the sample member (e.g., a parent, other relative, or designated contact).
Data collection announcement mailing to sample members. Once we have the most current contact information for the sample members, we will mail a postcard to the sample cohort announcing the start of data collection. The postcard will include information about the study, and will describe the various ways to complete the interview. The postcard will also include the website address for the project, and the sample member’s unique username and password for the site. The postcard will be folded and secured with a mailing tab so personal information cannot be viewed until the tab has been broken.
Intensive in-house tracing. The goal of intensive tracing is to obtain a telephone number at which the sample member can be reached so that field interviewing will not be required. Tracing procedures may include (1) Directory Assistance for telephone listings at various addresses, (2) criss-cross directories to identify (and contact) the neighbors of sample members, (3) calling persons with the same unusual surname in small towns or rural areas to see if they are related to or know the sample member, and (4) contacting the current or last known residential sources such as the neighbors, landlords, and current residents of the last known address. Other more intensive tracing activities could include (1) database checks for sample members, parents, and other contact persons, (2) credit database and insurance database searches, (3) drivers’ license searches through the appropriate state departments of motor vehicles, (4) calls to colleges, military establishments, and correctional facilities to follow up on leads generated from other sources, (5) calls to alumni offices and associations, and (6) calls to state trade and professional associations based on information about field of study in school and other leads.
Field tracing and interviewing. One of the challenges presented by both the BPS:04/09 field test and the full-scale data collection efforts is the need for in-person tracing and interviewing nationwide. We will use a two-tiered tracing strategy for cases that could not be completed through either the self-administered web interview or CATI. Using the best available address for the nonresponding sample members, cases will be clustered into geographic areas. Field interviewers will then be assigned to areas with a high concentration of sample members (e.g., a major metropolitan area) and will locate and interview the sample members residing in that cluster. Cases in areas without assigned field interviewers (e.g., cases not clustered with other cases) will receive additional intensive tracing. Cases for which additional telephone contact information is obtained will be returned to data collection by telephone.
The self-administered web-based data collection effort will begin immediately after the initial postcards are distributed to sample members, enabling the sample cohort to access the BPS:04/09 website to complete the survey online. The web option will remain available throughout data collection for the study, and we estimate that approximately 65 percent of the BPS:04/09 interviews will be completed in this mode. To assist sample members who choose the self-administered web option, a help desk will be available during the data collection period. Help desk staff will complete the interview by telephone, if requested, when a respondent calls in for assistance.
All cases not completed by self-administration within 4 weeks of the start of data collection will be eligible for telephone interviewing. The data collection effort will involve locating, contacting, and then interviewing the sample cohort members who have not already responded to the web-based self-administered interview. As noted earlier, data collection for the field test (and full-scale study) will include both respondents and nonrespondents from the NPSAS:04 and BPS:04/06 studies (as long as they participated in one or the other study). Members of the sample who do not respond to the BPS:04/09 self-administered interview or attempts to conduct a telephone interview will receive special attention to ensure that appropriate response rates are achieved.
The costs of field tracing and interviewing nonresponding sample members are very high, especially relative to self-administered data collection. As a result, only a subset of nonrespondent cases will be sent to RTI field interviewers for in-person contact attempts. Clustering is proposed to reduce field expenses (such as interviewer travel time and other expenses). Cases will be assigned to the field on a flow basis, beginning several weeks after the start of telephone data collection.
Recognizing and avoiding potential refusals is critical to maximize the response rate. We will emphasize this and other topics related to obtaining cooperation during data collector training. Supervisors will carefully monitor interviewers during the early days of CATI data collection and provide retraining as necessary. In addition, supervisors will review daily interviewer production reports produced by the CATI system to identify and retrain any data collectors with high refusal rates.
Whenever a refusal is encountered, the data collector will enter comments into the CATI record. These comments will include all pertinent data regarding the refusal situation, including any unusual circumstances and any reasons given by the sample member for refusing. Supervisors will review these comments to determine what action should be taken with each refusal. Refusals and partial interviews will require supervisory review and approval prior to being finalized.
If there is a clear indication that follow-up would be inappropriate (e.g., there are extenuating circumstances, such as illness or the sample member clearly and firmly requested that no further contact be made), the case will be coded as final and will not be recontacted. If the case appears to be a “soft” refusal, follow-up will be assigned to a member of a special refusal conversion team made up of interviewers who have proven to be especially adept at dealing with refusals.
Refusal conversion efforts will be delayed for at least one week in order to give the respondent some time after the initial refusal. Conversion attempts made too soon are often more difficult for the second interviewer. We will not attempt refusal conversion with individuals who become verbally abusive or who threaten to take legal or other action, and refusal conversion efforts will not be conducted to a degree that would constitute harassment. We will respect a sample member's right to decide not to participate and will not infringe on this right by carrying conversion efforts beyond the bounds of propriety.
Incentives to convert refusals and difficult or unable-to-locate cases. We propose to offer incentive payments to nonresponding members of the sample population. We expect that nonrespondents will comprise three primary groups: individuals who have refused to participate in web/CATI interviewing, persons who have proven difficult to interview (e.g., persons who repeatedly break CATI appointments), and sample members who cannot be located or contacted by telephone. Our approach to maximizing the response of these persons, and thereby limiting potential nonresponse bias, involves an incentive payment as a token of appreciation for respondent time and expenses.
As described in the justification section (section A), we have proposed to offer incentive payments to nonresponding members of the sample population. All respondents during the early response period will be paid a $30 incentive, although how that incentive is paid (promised or partially prepaid and promised) will be manipulated during the field test. Respondents will also be offered an incentive for interview completion during the final phase of data collection—the nonresponse conversion phase.
Any respondents who were prior round nonrespondents will be paid an additional incentive amount to compensate for the additional burden of providing background information otherwise collected during the base-year interview. We will offer a $20 differential above the incentive amount for which they would otherwise be eligible. For example, a sample member who was a nonrespondent to BPS:04/06 who completed during the early response phase of BPS:04/09 would be eligible for the $30 early response incentive plus the $20 differential for prior round nonresponse status for a total of $50.
The following sections will briefly discuss four areas of data collection believed to affect overall study participation that will be evaluated during the BPS:04/09 field test data collection. These areas are (1) visibility of mailout materials, (2) notification by cell phones/text messaging, (3) prepaid incentives, and (4) nonresponse conversion incentives. This section will also introduce plans for experiments in these areas during the BPS:04/09 field test.
Much research on survey response has focused on how the type of outgoing mail affects response rates. In particular, the method of mail delivery has been found to be an important factor. For instance, Abreu and Winters (1999) found that Priority Mail was effective in increasing response rates among nonresponding cases. Similarly, Moore and An (2001) found the use of Priority Mail in a prenotification mailing and a reminder mailing to be most effective in their mail questionnaire survey. Additionally, first-class mail has been found to yield higher response rates than bulk mail (Fox, Crask, and Jonghoon 1988).
The reason is straightforward: content is ineffective if the envelope is ignored or assumed to be junk mail. Couper, Mathiowetz, and Singer (1995) found that mail is usually sorted by only one person in half of all households and that 60 percent of people discard mail without opening it; it is therefore imperative that researchers maximize the chances of their mailings being read. Increasing the appearance of legitimacy can help ensure that the mail is opened by the intended recipient, thereby increasing the likelihood of survey response.
Although research on the effect of mailout package size on response rates is not extensive, some evidence suggests that packaging size is important. Dillman, Mahon-Haft, and Parson (2004) conducted cognitive interviews on the use of larger packages in comparison to traditional-size packages and found that larger packages may improve response rates: respondents appeared more likely to notice and open the larger packages, which, in turn, seemed to motivate them to read the enclosed correspondence.
We believe that using larger, more visible envelopes will signal the importance of the information contained in the package, increasing the likelihood that the materials will be read, and in turn, the likelihood of survey participation. We propose to test the impact of the visibility of mailout materials on participation rates.
Prior to the start of data collection, the field test sample will be randomly assigned to two groups: one group will receive the initial study materials in a large envelope via regular mail, and the other group will receive the same materials in a large envelope via Priority Mail. The initial mailing will contain information about the study, including how to log on to complete the interview. Results will be measured by comparing the participation rates of the two groups at the end of the early response period to determine whether participation differs between those who receive the materials via Priority Mail and those who receive them via regular mail.
Ho: There will be no difference in early participation rates between those who receive the initial mailing in a large envelope via Priority Mail and those who receive it in a large envelope via regular mail.
The use of cell phone calling and text messaging is a relatively new means of contacting sample members, and little research has been conducted on the effects of text messaging on participation rates. Research by Brick et al. suggests that text messaging as a method of prenotifying sample members yields response rates nearly equal to those of control groups (Brick, Brick, Dipko, Presser, Tucker, and Yuan 2007). According to Lambries et al., households using primarily cell phones required more contact attempts than households using both landline and cell phones and households using landlines only: on average, 1.1 more attempts than landline-only households and 0.8 more attempts than households with both landlines and cell phones (Lambries, Link, and Oldendick 2006).
However, text messaging has some advantages as the first means of contacting sample members. Text messaging may help identify working numbers and, in turn, increase the efficiency of the calling process (Steeh, Buskirk and Callegaro 2007). The research by Steeh et al. concludes that text messages have two advantages as the first means of contact: outcome rates are substantially improved and information about the working status of the number is obtained.
Further research in this field is needed to better understand the effects of cell phone calling and text messaging on participation rates.
Ho: The use of text messages and instant messaging as additional means of contacting sample members will have no effect on early participation rates.
Much evidence suggests that prepaid incentives increase survey response more than promised incentives alone. Although operational challenges exist, the BPS:04/09 field test provides an opportunity to test the effect of no prepayment with the promise of a $30 check upon interview completion versus a $5 cash prepayment with the promise of a $25 check upon interview completion.
Ho: Prepaid incentives, $5 cash, will have no effect on the proportion of respondents who complete the web-based, self-administered interview during the first 3 weeks of data collection.
No incentive will be offered or paid during production interviewing.
A summary of the experiments proposed for the BPS:04/09 field test is provided below. We also provide detail about the field test sample and its allocation to each of the cells, and discuss the assumptions made in developing the design.
One-way comparisons
In a comparison of participation rates during the early response period, there will be no difference between those who receive the study materials and survey invitation via Priority Mail and those who receive the study materials in a large envelope via regular mail.
There will be no difference in early response period participation rates between those who are notified about the start of data collection using text messages/instant messages during the early response period and those who are not.
Comparing participation rates during the early response period, there will be no difference observed between those offered a promised incentive of $30 and those offered $5 cash plus a promise of a $25 incentive paid by check upon interview completion.
Two-way comparisons
There will be no difference in participation rates observed during the early response period between those who receive the study materials and survey invitation via Priority Mail and are notified about the start of data collection using text/instant messages when compared with all others.
There will be no difference in participation rates observed during the early response period between those who receive the study materials and survey invitation via Priority Mail and are offered $5 cash and a promise of a $25 incentive when compared with all others.
There will be no difference in participation rates observed during the early response period between those who are notified about the start of data collection using text/instant messages and are offered $5 cash and a promise of a $25 incentive when compared with all others.
Three-way comparison
There will be no difference in participation rates observed during the early response period between those who receive the study materials and survey invitation via Priority Mail, are notified about the start of data collection using text/instant messages, and are offered $5 cash and a promise of a $25 incentive when compared with all others.
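One way to operationalize the design behind these comparisons is a simple randomized assignment to the three treatment factors. The sketch below is hypothetical, not the study's assignment program; it approximates the stated assumption of equal allocation across cells and restricts the text messaging treatment to cases with a known cell phone number.

```python
# Hypothetical sketch of randomizing the 1,137 field test cases into the
# mailing, incentive, and text messaging treatment cells implied by the
# comparisons above. Independent coin flips give approximately equal
# allocation; this is not the study's actual assignment program.

import random

def assign_treatments(sample_ids, has_cell_number, seed=2008):
    rng = random.Random(seed)
    assignments = {}
    for sid in sample_ids:
        assignments[sid] = {
            "priority_mail": rng.random() < 0.5,      # vs. regular mail, large envelope
            "prepaid_5_dollars": rng.random() < 0.5,  # vs. promised $30 only
            # Text message notification is possible only when a cell number is known.
            "text_message": has_cell_number.get(sid, False) and rng.random() < 0.5,
        }
    return assignments

# Example: 1,137 cases, with cell numbers known for roughly two-thirds of them.
ids = list(range(1, 1138))
cell_known = {sid: (sid % 3 != 0) for sid in ids}  # illustrative two-thirds
groups = assign_treatments(ids, cell_known)
```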
As part of the planning process for developing the field test experiment design, the participation rate differences between the control and treatment groups necessary to detect statistically significant differences will be estimated. That is, we will estimate how large a difference is necessary to state that the participation rates between the two groups are different. Table 18 shows the expected sample sizes and statistically significant detectable difference for each of the eight hypotheses. Several assumptions were made regarding participation rates and sample sizes. In general, the closer the participation rate is to 50 percent (either less than or greater than), the larger the detectable difference. Likewise, the smaller the sample size, the larger the detectable difference.
Assumptions:
The sample will be equally distributed across experimental cells.
All ineligible cases will be included in the analysis because ineligibility will be determined after the interview begins.
All 1,137 sample members will be included in the mailout and prepaid incentive experiments, and all sample members for whom a cell phone number was obtained during NPSAS or the January address update mailing to students and parents will be included in the text messaging experiment.
Cell phone numbers will be known for two-thirds of the sample members.
The participation rate for the control group for hypotheses 1 through 7 will be 35 percent.8
Table 18. Detectable differences for field test experiment hypotheses
Hypothesis | Control group | Control group initial sample size | Treatment group | Treatment group initial sample size | Detectable difference with 95% confidence
--- | --- | --- | --- | --- | ---
One-way comparisons |  |  |  |  | 
1. Visibility of mailout materials | Regular mail, large envelope | 568 | Priority Mail, large envelope | 569 | 4.8
2. Text messaging | Not notified via text messaging | 379 | Notified via text messaging | 379 | 5.8
3. Use of prepaid incentives | Promise of $30 | 568 | $5 cash and $25 promise | 569 | 4.8
Two-way comparisons |  |  |  |  | 
4. Visibility of mailout materials and text messaging | All others (with cell phone number obtained) | 569 | Priority Mail, large envelope, text message | 190 | 6.7
5. Visibility of mailout materials and use of prepaid incentives | All others | 853 | Priority Mail, large envelope, $5 cash and $25 promise | 284 | 5.5
6. Text messaging and prepaid incentives | All others (with cell phone number obtained) | 569 | Text message, $5 cash and $25 promise | 190 | 6.7
Three-way comparison |  |  |  |  | 
7. Visibility of mailout materials, text messaging, and prepaid incentives | All others (with cell phone number obtained) | 663 | Priority Mail, large envelope, text message, $5 cash and $25 promise | 95 | 8.9
NOTE: The detectable difference is the smallest difference in response rate (compared to the control) that will be statistically significant in a one-sided hypothesis test with level alpha = 0.05. The control group is assumed to have 35 percent response rate for hypotheses 1-7, and 25 percent response rate for hypothesis 8.
NOTE: BPS = Beginning Postsecondary Students.
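The detectable differences in table 18 can be approximated with a standard two-proportion calculation. The sketch below assumes the one-sided z-test described in the table note and solves for the smallest significant difference by fixed-point iteration; this is an assumed reconstruction that approximately reproduces the table values, not necessarily the study's exact computation.

```python
# A sketch of the detectable-difference calculation summarized in table 18:
# the smallest treatment-versus-control difference in participation rates that
# reaches significance in a one-sided two-proportion z-test at alpha = .05.
# The fixed-point iteration is an assumption about the method; it only
# approximately reproduces the table values.

from math import sqrt

Z_ONE_SIDED_05 = 1.645  # critical value for a one-sided test at alpha = .05

def detectable_difference(p_control: float, n_control: int, n_treatment: int,
                          tol: float = 1e-6) -> float:
    d = 0.0
    for _ in range(100):  # iterate because d appears on both sides
        p_treat = p_control + d
        se = sqrt(p_control * (1 - p_control) / n_control
                  + p_treat * (1 - p_treat) / n_treatment)
        d_new = Z_ONE_SIDED_05 * se
        if abs(d_new - d) < tol:
            break
        d = d_new
    return d

# Hypothesis 2 (text messaging): 35 percent control rate, 379 cases per group.
print(f"{100 * detectable_difference(0.35, 379, 379):.1f} percentage points")  # about 5.8
```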
Names of individuals consulted on statistical aspects of study design along with their affiliation and telephone numbers are provided below.
Name Affiliation Telephone
Dr. Sara Wheeless RTI 919/541-5891
Dr. Jennifer Wine RTI 919/541-6870
Ms. Melissa Cominole RTI 919/990-8456
Dr. Karol Krotki RTI 202/728-2485
Mr. Peter Siegel RTI 919/541-6348
Dr. Lutz Berkner MPR 510/849-4942
In addition to these statisticians and survey design experts, the following statisticians at NCES have also reviewed and approved the statistical aspects of the study: Dr. Dennis Carroll, Dr. James Griffith, Dr. Paula Knepper, Dr. Tom Weko, and Dr. Tracy Hunt-White.
The study is being conducted by the Postsecondary Longitudinal and Sample Survey Studies unit of NCES, U.S. Department of Education. NCES’s prime contractor is RTI. RTI is being assisted through subcontracted activities by MPR Associates. Principal professional staff of the contractors not listed above, who are assigned to the study, are provided below:
Name Affiliation Telephone
Ms. Donna Anderson RTI 919/990-8399
Mr. John Doherty RTI 919/541-7120
Ms. Vicky Dingler MPR 510/849-4942
Ms. Kristin Dudley RTI 919/541-6855
Ms. Emily Forrest-Cataldi MPR 515/270-8457
Mr. Jeff Franklin RTI 919/485-2614
Mr. Joe Simpson RTI 919/541-5941
1An indicator of Title IV eligibility has been added to the analysis files from earlier NPSAS studies to facilitate comparable analyses.
2The NPSAS:96 false positive rate was 27.6 percent for students identified as potential FTBs by the sample institutions, and the false negative rate was 9.1 percent for those identified as other undergraduate students.
3 To conserve resources, the follow-up sample of base-year nonrespondents was restricted to those with a valid Social Security number to increase the likelihood that they could be matched to sources used for batch locating.
4 This designation indicates that students were FTBs during the 2002–03 academic year, as were base-year interview respondents.
5 College credit earned while in high school did not affect FTB status.
6 Weighted percentages are cited in the text, while both unweighted and weighted values are provided in the tables.
7 Of the 1,280 NPSAS-eligible institutions enrolling FTBs, 830 (65 percent) participated in the NSC.
8 35 percent is used here as a baseline because it is consistent with participation rates obtained during the early response period from past studies.