B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS
1. RESPONDENT UNIVERSE AND SAMPLING METHODS
The 2008 NSRCG target population consists of persons who received a bachelor’s or master’s
degree in sciences, engineering, or health fields (SEH) from a postsecondary educational institution
located in the United States or in a U.S. territory (Puerto Rico, Guam, Virgin Islands) between July
1, 2005, and June 30, 2007. In addition, eligible persons must be age 75 or younger, not be
institutionalized, and be living in the United States or a U.S. territory as of October 1, 2008.
The 2008 NSRCG sample is based on a two-stage sample design. The first stage is a sample of
colleges and universities offering bachelor’s and master’s degrees in SEH fields; the first stage
sample was selected in the fall of 2007 with probability proportional to size (PPS), where the size
measure was a linear combination of desirable sampling rates for the domains of interest and the
number of graduates reported for schools in each domain. The sample comprises 302
postsecondary institutions. The second stage sample is a stratified random sample of graduates
from those institutions within strata based on the degree received, year of degree, major field of
study, race/ethnicity, and gender. In the spring of 2008, the second stage sample of 18,000
graduates will be selected from lists of graduates supplied by the institutions sampled in the first
stage. The sample design is described in more detail below.
The sampling frame for the 2008 NSRCG institutional sample was constructed based on the
2005–2006 Integrated Postsecondary Education Data System (IPEDS) Completions File[3] developed
by the U.S. Department of Education (ED), National Center for Education Statistics (NCES).
Institutions in the frame were classified by type of control (public, private); region (northeast,
north central, southeast, west); and the percentage of minority graduates in SEH fields. These
characteristics were used to sort (implicitly stratify) the institutions for sampling.
The 2008 NSRCG institution sample consists of the 298 institutions selected for the 2006
NSRCG which remained in scope for 2008 and four new institutions that were selected by
probability proportional to a composite measure of school size to represent the newly eligible
institutions since the 2006 survey round. The composite measure of size is related to the number of
graduates reported by schools and the sampling rates for the analytic domains.
The composite measure of size for each institution required knowledge of the population counts
for the analytic domains and expected sampling rates for the domains. Domains used for the
composite measure of size calculation were the following:
• Two degree levels: bachelor’s and master’s
[3] The Completions File contains the number of degrees and other awards granted by each postsecondary institution in each field of study (CIP code), by level of award/degree, and by race/ethnicity and gender of the recipient. The 2005–2006 IPEDS Completions File is the most recent file available from NCES for the selection of the 2008 NSRCG school sample.
• Twenty-one major field categories: chemistry, physics/astronomy, other physical
sciences, mathematics/statistics, computer sciences, agricultural/food/environmental
sciences, aerospace engineering, chemical engineering, civil engineering, electrical
engineering, industrial engineering, mechanical engineering, other engineering,
biological sciences, psychology, economics, sociology/anthropology, other social
sciences, political science, and two health fields
• Six demographic groups: non-Hispanic white male; non-Hispanic white female; non-Hispanic Asian male; non-Hispanic Asian female; minority (black, Hispanic, and American Indian/Alaska Native) male; and minority (black, Hispanic, and American Indian/Alaska Native) female
The measure of size for institution i, MOS_i, is defined as

    MOS_i = \sum_{d=1}^{2} \sum_{k=1}^{21} \sum_{j=1}^{6} f_{djk} N_{idjk},

where f_{djk} is the expected sampling rate for degree d, major sampling category k, and demographic group j, and N_{idjk} is the total number of graduates of institution i with degree d, major sampling category k, and demographic group j.
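For illustration, the following minimal Python sketch computes the composite measure of size from the formula above; the sampling rates, domain keys, and graduate counts shown are hypothetical placeholders rather than actual NSRCG values.

```python
# Minimal sketch of the composite measure of size (MOS) for one institution.
# The full NSRCG calculation runs over 2 degree levels x 21 major categories
# x 6 demographic groups; the rates and counts below are hypothetical.

def composite_mos(rates, counts):
    """MOS_i = sum over cells (d, k, j) of f_djk * N_idjk."""
    return sum(rates[cell] * n for cell, n in counts.items())

# Illustrative sampling rates f_djk, keyed by (degree, major, demographic group)
rates = {
    ("bachelors", "chemistry", "minority female"): 0.30,
    ("bachelors", "chemistry", "white male"): 0.10,
    ("masters", "computer sciences", "asian female"): 0.174,
}

# Illustrative graduate counts N_idjk for the same cells
counts = {
    ("bachelors", "chemistry", "minority female"): 12,
    ("bachelors", "chemistry", "white male"): 85,
    ("masters", "computer sciences", "asian female"): 20,
}

print(composite_mos(rates, counts))  # 0.30*12 + 0.10*85 + 0.174*20 = 15.58
```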
Of the total sample of 302 institutions,[4] 217 were selected with probability proportional to size, and 85 institutions were selected with certainty, that is, with probability equal to unity. The major criterion for being selected with certainty was the number of SEH graduates in an institution.
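The sketch below gives a rough sense of how certainty selections arise in a PPS design: any institution whose size measure would give it a selection probability of 1 or more is taken with certainty, and the remainder are sampled by a simple systematic PPS routine. The data and the routine are illustrative assumptions, not the program actually used to draw the NSRCG institution sample.

```python
import random

def pps_sample(mos, n):
    """Select n institutions with probability proportional to size (PPS).
    Institutions with expected selections n * MOS / total >= 1 are taken with
    certainty; the rest are drawn by systematic PPS (illustrative sketch)."""
    certainty, remaining = [], dict(mos)
    while True:  # removing a certainty unit changes the remaining total
        total = sum(remaining.values())
        k = n - len(certainty)
        new = [i for i, s in remaining.items() if k * s / total >= 1.0]
        if not new:
            break
        certainty.extend(new)
        for i in new:
            del remaining[i]
    k = n - len(certainty)
    if k == 0:
        return certainty, []
    total = sum(remaining.values())
    step = total / k
    start = random.uniform(0, step)
    targets = [start + j * step for j in range(k)]
    picks, cum, t = [], 0.0, 0
    for i, s in remaining.items():
        cum += s
        while t < len(targets) and targets[t] < cum:
            picks.append(i)
            t += 1
    return certainty, picks

# Hypothetical size measures; the dominant school falls into the certainty stratum.
random.seed(0)
mos = {"U-A": 900.0, "U-B": 120.0, "U-C": 80.0, "U-D": 60.0, "U-E": 40.0}
print(pps_sample(mos, n=3))
```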
2. STATISTICAL PROCEDURES
The sampling frame for the SEH graduates is formed from lists of graduates from the sampled
universities. Each institution’s list will be stratified by (1) two graduate cohorts—one cohort from
the 2005–06 academic year (July 1, 2005-June 30, 2006) and the other cohort from the 2006–07
academic year (July 1, 2006-June 30, 2007); (2) two degree levels—bachelor’s and master’s; (3) the
20 major fields of study sampling categories identified above;[5] (4) the three race/ethnicity groups—
non-Hispanic white, non-Hispanic Asian, and minority (black, Hispanic, and American
Indian/Alaska Native); and (5) two gender groups. All graduates will be selected in such a way as
to create equal probability of selection within full frame strata. A total of 504 different strata will
be developed for the cross-classification of the above-mentioned domains. Underrepresented
minorities will be selected at 3 times the rate of white graduates; Asian and unknown-race cases will be selected at 1.74 times the rate of white graduates. The total sample size will be 18,000.
[4] The 2003 and 2006 NSRCG selected 300 institutions. The sample size was increased to reflect the increase in graduate sample size related to the addition of newly eligible schools.
[5] The two health fields will be combined to be consistent with the level of the analytic domains; that is, all health fields are reported in the same reporting cell.
Appendix C shows the proposed sample sizes for each stratum of the cohorts from the 2005–
2006 and 2006–2007 academic years. The proposed sample sizes are based on the same sampling
rates used for composite size measure calculation for the school sample selection. With these
proposed sample sizes, the corresponding sampling rates, defined as the ratio of the sample sizes to
the IPEDS counts, are calculated. These sampling rates by stratum will be applied within each
eligible responding institution and should result in sampling 18,000 graduates. The domain specific
sample sizes are random variables that depend on how closely the number of graduates in the
eligible fields as reported by the institutions corresponds to the IPEDS counts used for sampling;
minor variation in the achieved sample size is expected.
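As a concrete illustration of how the fixed stratum sampling rates translate into within-institution sample sizes, the sketch below applies a hypothetical base rate, with underrepresented minorities sampled at 3 times and Asian or unknown-race cases at 1.74 times that rate; the counts and the base rate are assumptions, not the Appendix C values.

```python
# Sketch: apply fixed stratum sampling rates within one institution.
# The base rate and graduate counts are hypothetical.

white_rate = 0.05
rates = {
    "white": white_rate,
    "asian or unknown": 1.74 * white_rate,   # oversampled relative to whites
    "minority": 3.0 * white_rate,            # black, Hispanic, AI/AN
}

# Graduates reported by the institution for one degree/cohort/major/gender cell
counts = {"white": 400, "asian or unknown": 120, "minority": 60}

expected = {group: rates[group] * n for group, n in counts.items()}
print(expected)                # {'white': 20.0, 'asian or unknown': 10.44, 'minority': 9.0}
print(sum(expected.values()))  # expected number of graduates sampled from this cell
```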
The target response rate for the 2008 NSRCG is 80 percent. This target is higher than the response rates achieved in the 2003 and 2006 survey cycles. The plans for maximizing the response rate are
presented in Section 3.
The analysis of survey data from the 2008 NSRCG requires survey weights to account for
unequal probabilities of selection, unit nonresponse, duplicates on the sampling frame, extreme
weights, and coverage errors.
Constructing the Institution-Level Weight. The first step of the 2008 NSRCG weighting
process will begin with the construction of the sampling weights for the postsecondary institutions.
All sampled institutions will have a sampling weight equal to the inverse of the institution’s
probability of selection. The nonresponse adjustment cells at the school level will be formed by a
cross-classification of institutional control (public and private), region, representation (whether the
institution is self-representing or non-self-representing), and percentage of minority graduates.
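The following sketch, using hypothetical institutions, illustrates the two steps just described: an institution base weight equal to the inverse of its selection probability, and a cell-level nonresponse adjustment that inflates responding institutions' weights to account for nonresponding institutions in the same cell.

```python
from collections import defaultdict

# Hypothetical institutions: selection probability, adjustment cell, response status.
institutions = [
    {"id": "A", "p_sel": 1.00, "cell": ("public", "west", "self-rep", "high"), "resp": True},
    {"id": "B", "p_sel": 0.25, "cell": ("public", "west", "non-self-rep", "low"), "resp": True},
    {"id": "C", "p_sel": 0.25, "cell": ("public", "west", "non-self-rep", "low"), "resp": False},
    {"id": "D", "p_sel": 0.50, "cell": ("private", "northeast", "non-self-rep", "low"), "resp": True},
]

# Base weight = inverse of the probability of selection.
for inst in institutions:
    inst["w0"] = 1.0 / inst["p_sel"]

# Nonresponse adjustment factor per cell = (total base weight) / (responding base weight).
total_w, resp_w = defaultdict(float), defaultdict(float)
for inst in institutions:
    total_w[inst["cell"]] += inst["w0"]
    if inst["resp"]:
        resp_w[inst["cell"]] += inst["w0"]

for inst in institutions:
    if inst["resp"]:
        inst["w_nr"] = inst["w0"] * total_w[inst["cell"]] / resp_w[inst["cell"]]
        print(inst["id"], round(inst["w_nr"], 2))
```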
Constructing the Graduate-Level Sampling Weights. The graduate sampling weight is the
product of the institution-level, nonresponse-adjusted weight and the inverse of the conditional
probability of selecting the graduate, given that the individual’s institution was selected. The next
step will be a weighting adjustment to account for graduate nonresponse. The graduates will be
classified as eligible respondents, eligible nonrespondents, ineligible, or eligibility unknown. In
addition, the sample can also be partitioned into two groups: located and not located. The graduate-
level nonresponse adjustment will be computed in three steps: adjustment for not-located cases,
adjustment for eligibility unknown cases, and adjustment for eligible nonrespondents.
Consequently, the graduate, nonresponse-adjusted weight is the product of these three factors
(factor 1 for not-located cases, factor 2 for eligibility unknown cases, and factor 3 for
nonrespondents) and the base weight.
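A simplified numeric sketch of the graduate weight construction described above; the institution weight, conditional selection probability, and the three adjustment factors are hypothetical values.

```python
# Sketch of graduate-level weighting with hypothetical values.

inst_weight_nr = 8.0        # institution nonresponse-adjusted weight
p_grad_given_inst = 0.05    # conditional probability of selecting the graduate

base_weight = inst_weight_nr * (1.0 / p_grad_given_inst)

# Three nonresponse adjustment factors, each computed within weighting cells as
# (weight of all cases in the cell) / (weight of cases retained at that step):
f_located     = 1.10   # factor 1: redistributes weight of not-located cases
f_eligibility = 1.05   # factor 2: redistributes eligibility-unknown cases
f_response    = 1.20   # factor 3: redistributes eligible nonrespondents

adjusted_weight = base_weight * f_located * f_eligibility * f_response
print(base_weight, round(adjusted_weight, 2))   # 160.0 221.76
```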
Adjustment for Multiple Chances of Selection (Multiplicity Adjustment). The next adjustment
to the graduate weight involves those responding graduates who could have been sampled more
than once. For example, a person who obtained a U.S. bachelor’s degree in June 2006 and a U.S.
master’s degree in June 2007 (both in eligible fields) could have been sampled for either degree. If
a respondent has multiple degrees within or across sampled schools, he or she will very likely be identified before sample selection so that no graduate is sampled more than once. Multiple-degree holders are therefore expected to be identified at the weighting stage only when they report eligible degrees from nonsampled schools in addition to sampled schools. To make the
survey estimates essentially unbiased, the weights of all responding graduates who could have been
sampled multiple times (but not identified at the time of sampling) will be divided by the number of
times of possible selection.
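In practice the multiplicity adjustment reduces to dividing a responding graduate's weight by the number of eligible degrees through which he or she could have entered the sample, as in this small hypothetical example.

```python
def multiplicity_adjust(weight, n_chances):
    """Divide the weight by the number of times the graduate could have been sampled."""
    return weight / max(1, n_chances)

# Hypothetical respondent with an eligible 2006 bachelor's and an eligible 2007
# master's degree, not identified as a duplicate at sampling: two chances of selection.
print(multiplicity_adjust(221.76, 2))   # 110.88
```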
Raking Adjustment. As in the 2003 and 2006 NSRCG weighting, a raking procedure will be
applied to enhance the precision of the 2008 NSRCG estimates after adjusting for multiple degrees.
Raking is a method of adjustment that ensures the adjusted weights of the respondents conform to
each of the marginal distributions of the auxiliary variables (Deming and Stephan 1940). Raking
involves an iterative adjustment of the weights in which fitting methods—such as an iterative
proportional fitting algorithm or least squares—are used.
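A generic iterative proportional fitting (raking) loop is sketched below on toy data; the raking variables and control totals are illustrative and do not represent the NSRCG raking dimensions.

```python
# Generic raking (iterative proportional fitting) sketch with toy data.

cases = [  # each respondent: weight plus two categorical raking variables
    {"w": 10.0, "degree": "BS", "sex": "F"},
    {"w": 12.0, "degree": "BS", "sex": "M"},
    {"w": 15.0, "degree": "MS", "sex": "F"},
    {"w": 13.0, "degree": "MS", "sex": "M"},
]
controls = {"degree": {"BS": 30.0, "MS": 20.0}, "sex": {"F": 26.0, "M": 24.0}}

for _ in range(50):  # a fixed number of passes converges for this toy example
    for var, totals in controls.items():
        sums = {level: 0.0 for level in totals}
        for c in cases:
            sums[c[var]] += c["w"]
        for c in cases:  # scale weights so each margin matches its control total
            c["w"] *= totals[c[var]] / sums[c[var]]

print([round(c["w"], 2) for c in cases])
print(sum(c["w"] for c in cases))  # equals the common grand total, 50.0
```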
Trimming of Outlier Weights. Raked weights will be evaluated for the existence of outlier
weights. To do this, weighted counts for the present and past survey year will be compared for rare
populations subject to oversampling (that is, black, Hispanic, and American Indian/Alaska Native).
When rare populations are oversampled, excessive variation can occur in the population counts
from year to year, particularly when members of rare populations are unexpectedly encountered in
sampling a “non-rare” stratum. The large weight given to these rare cases when sampled from a
non-rare stratum can cause even one such selection to distort rare population counts from one year
to the next. The increase in sampling error can be substantial if the range of weights is large. In
particular, extremely large sampling weights can seriously reduce survey precision.
To correct outlier problems, the weight of the outliers will be trimmed by investigating weight
distributions for each analytic domain of interest. The raking adjustment will be repeated after
weight trimming. This second iteration of raking will serve as a smoothing adjustment to recover
the amount trimmed from the outlier weights.
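The sketch below shows one common trimming rule, capping weights at a multiple of the median; the cap rule and the values are assumptions for illustration, since the specific NSRCG trimming criterion is set by examining the weight distributions within each analytic domain.

```python
import statistics

def trim_weights(weights, cap_multiple=3.5):
    """Cap weights at cap_multiple times the median (an illustrative rule only)."""
    cap = cap_multiple * statistics.median(weights)
    return [min(w, cap) for w in weights]

weights = [20.0, 22.0, 25.0, 24.0, 310.0]   # one extreme outlier weight
trimmed = trim_weights(weights)
print(trimmed)                               # the outlier is capped at 3.5 * 24 = 84.0

# A second raking pass (as sketched earlier) would then redistribute the trimmed
# amount, sum(weights) - sum(trimmed), across the remaining respondents.
print(sum(weights) - sum(trimmed))
```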
Constructing the Final Weight and Replicate Weights. The final analysis weight will be
constructed by implementing the above-mentioned procedures: sampling, nonresponse adjustment,
multiplicity adjustment, raking, and trimming. A set of replicate weights will be produced based on
the jackknife replication method. The entire weighting process applied to the full sample will then
be applied separately to each of the replicates to produce a set of replicate weights for each record.
Standard Errors. Variance estimation procedures similar to those used for the 2003 and 2006
NSRCG will be used in 2008: the jackknife replication method and generalized variance function
(GVF) method. Appendix C shows the proposed sample size allocation that would provide
statistically reliable estimates for a substantial number of domains in the 2008 NSRCG.
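To illustrate the replication idea, the sketch below forms delete-one-group jackknife replicate weights on toy data and computes the corresponding variance of a weighted total; in the actual survey, the replicate groups respect the sample design and every weighting step is rerun on each replicate.

```python
# Delete-one-group jackknife sketch with toy data.

weights = [10.0, 12.0, 9.0, 11.0, 10.0, 8.0]
y       = [ 1.0,  0.0, 1.0,  1.0,  0.0, 1.0]   # analysis variable
groups  = [0, 1, 2, 0, 1, 2]                   # random replicate groups, R = 3
R = 3

def weighted_total(w):
    return sum(wi * yi for wi, yi in zip(w, y))

full = weighted_total(weights)
replicate_totals = []
for r in range(R):
    # Drop group r and inflate the remaining weights by R / (R - 1).
    w_r = [0.0 if g == r else wi * R / (R - 1) for wi, g in zip(weights, groups)]
    replicate_totals.append(weighted_total(w_r))

var_jk = (R - 1) / R * sum((t - full) ** 2 for t in replicate_totals)
print(full, replicate_totals, round(var_jk ** 0.5, 2))   # estimate and its standard error
```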
3. METHODS TO MAXIMIZE RESPONSE RATES
Maximizing Response Rates
A critical issue for the NSRCG is dealing with response rates that declined from around 85
percent in the early 1990s to 68 percent in 2003 and 2006. The approach for the 2008 survey will be to
reduce the number of nonrespondents through improvements in the data collection strategy.
Nonresponse to most surveys is caused by two factors: (1) the inability to locate the sample
member and (2) the inability to gain cooperation from the located sample member.
The lower response rates in 2003 and 2006 may reflect not only sample members' reduced willingness to respond to the survey but also the increased difficulty of locating and contacting them. The nonresponse rate for the 2003 and 2006 NSRCG was about 30 percent, with about 16
percent in 2003 and 20 percent in 2006 due to nonlocated cases. The NSRCG population is recent
college graduates who are highly mobile and more likely to have only a cell phone.
Methods of maximizing response rates include offering multiple modes for completing the
interview, offering incentives, addressing the cell phone issue, converting refusals effectively, and
applying intensive locating efforts. The 2008 NSRCG is a multimode study, with web, mail and
telephone modes offered. We are planning to emphasize first the lowest marginal cost mode, the
web, followed by the second lowest cost mode, mail, and finally telephone, the most expensive. In
addition to this emphasis, MPR will use a number of techniques to try to ensure early participation
in the 2008 NSRCG, which reduces follow-up costs.
To maximize the response rate and timely completion of data collection, we plan to offer
a monetary incentive strategy that will build on the incentive experiment results from the 2003 and
the 2006 NSRCG. The incentive plan will be designed to determine if incentives are cost effective
when offered more widely and earlier during the data collection. If this strategy does not result in
higher response rates early in the data collection, a different type of incentive strategy, with smaller
dollar amounts to larger numbers of sampled graduates, will be considered.
Locating
We will start locating the sample members early by obtaining the latest contact information
from the alumni offices of the institution from which they received their sampled degree. These
offices are often the best source of current information because they have a vested interest in
maintaining contact with alumni. Early locating will mostly involve various nonintrusive
locating resources to collect the best contact information on the sample members prior to the
data collection.
All survey mailings will utilize the “Return Service Requested” option to ensure that the
postal service will provide a forwarding address for any undeliverable mail. During the data
collection field period, all cases still lacking a valid address or telephone number will be
handled by the most experienced locators who will: (1) search more extensive (often more
expensive) electronic databases for contact information, (2) conduct individually customized
Internet searches, and (3) contact school departments from which the sample member graduated
or associations in which he or she might have memberships. In addition, emerging sources of
information, such as cell phone directories and search engines, will be monitored for possible
use in locating NSRCG sample members.
Addresses Outside the United States
If a sample member has a current address outside the United States, we will institute special
procedures to try to confirm that the person is still outside the United States on the reference date of
October 1, 2008, and therefore ineligible for the study. This will include calling the sampled
graduate and all available contacts during the week of October 1, 2008. We will do this before
mailing any initial invitation. If we can identify the sampled graduate as ineligible, we can code the
case ineligible without expending additional resources.
Telephone and Address Verification Form (TAVF)
The advance letter will be mailed to the sampled graduates four weeks prior to October 1, 2008.
To increase initial contact rates, a telephone and address verification form (TAVF) will accompany
the advance letter. The TAVF will collect the usual contact information, cell phone information (the
service provider), and the sampled graduate’s email address(es). MPR’s toll-free telephone number
and email address will also be included for sample members who have questions. A postage-paid
return envelope will facilitate returning completed TAVFs. See Appendix E for the TAVF.
Data Collection
A multimode data collection protocol will be used to improve the likelihood of gaining
cooperation from sample cases that are located. Sample cases will be offered a choice for
responding—either by web, mail, or telephone. Offering choices to the respondent communicates
flexibility and consideration for the respondent, which may help obtain an increased number of
responses. In addition, offering a choice gives respondents who do not have a telephone, who have
an invalid telephone number, or who have a call-screening device other avenues of responding to
the survey. Recent graduates are highly web-literate, so offering a web response option is apt to be
appealing to NSRCG respondents.
In addition to these procedures, the following steps will be taken to maximize response rates
and minimize nonresponse:
• Developing “user friendly” survey materials that are simple to understand and use
• Sending attractive, personalized material using priority mail, making a reasonable
request of the respondent’s time, and making it easy for the respondent to comply
• Using priority mail for targeted mailings to improve the chances of reaching respondents
and convincing them that the survey is important
• Devoting significant time to interviewer training on how to deal with problems related to
nonresponse and ensuring that interviewers are appropriately supervised and monitored
• Using refusal-conversion strategies that specifically address the reason why a potential
respondent has initially refused, and then training conversion specialists in effective
counterarguments
See Appendices E and F for survey mailing materials.
Dealing with Issues of Nonresponse Bias
To minimize the potential nonresponse bias in the NSRCG, weighting procedures were
executed to compensate for nonrespondents in the final weighted estimates. Multivariate logistic
regression analyses were conducted to identify the sampling frame variables that might have
affected the sample members’ response propensity.
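The sketch below illustrates this kind of response-propensity analysis with a logistic regression fit on simulated data; the frame variables, coefficients, and data are assumptions for illustration only (it relies on numpy and statsmodels).

```python
# Sketch: logistic regression of response status on sampling frame variables,
# using simulated data rather than the actual NSRCG frame.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
masters  = rng.integers(0, 2, n)   # 1 = master's degree, 0 = bachelor's
minority = rng.integers(0, 2, n)   # oversampled minority indicator

# Simulated response propensity that depends on the frame variables.
p = 1.0 / (1.0 + np.exp(-(0.4 - 0.3 * masters + 0.2 * minority)))
responded = rng.binomial(1, p)

X = sm.add_constant(np.column_stack([masters, minority]))
result = sm.Logit(responded, X).fit(disp=False)
print(result.summary(xname=["const", "masters", "minority"]))
```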
However, NSF was still concerned with the lower than expected survey response rate in the
NSRCG and contracted with the Census Bureau (Survey Research Division) to study the
nonresponse bias issues in the 2003 NSRCG data. Research results found no significant bias in the
final data and any small differences were properly addressed in the nonresponse weighting
adjustments.
In 2008, the base weights for nonresponse will be adjusted using the procedures described
above. Also, NSF may consider looking at a few other sampling variables for the weighting
strategy, such as school or respondent location, to see if the nonresponse adjustment weighting can
be fine-tuned. Careful selection of factors for constructing the weighting classes will reduce the
potential for nonresponse bias. Weights will also be adjusted to control distributions for some
variables to known totals from the sample frame, as described above. An assessment will be made
of the extent of remaining bias by comparing weighted estimates for the survey sample that can be
observed in the sample frame (e.g., degree field, degree level, and gender) to estimates for the
population that the weighted sample is intended to represent.
4. TESTING OF PROCEDURES
Because data from all three SESTAT surveys are combined into a unified data system, the
surveys must be closely coordinated to provide comparable data from each survey. Most
questionnaire items in the three surveys are the same.
All content items underwent an extensive review before they were included in the final version of the SESTAT questionnaires. The changes made to the questionnaires resulted from a variety of activities, including an extensive review of the entire content of each SESTAT survey questionnaire and additional research on specific items, undertaken to inform final decisions on the placement and wording of each item. Content evaluation and testing activities for the 2003 and 2006 surveys included the following:
• External and internal consultation with questionnaire design experts on questionnaire
layout and formatting to improve user-friendliness and minimize respondent reporting
errors
• External consultation on improving the messages in the survey contact materials
• A two-stage pretest of the survey questionnaires consisting of mail and telephone
The activities described below contributed to the development of the NSRCG questionnaire.
Survey Questionnaire Review and Research
The SESTAT survey questionnaire items are divided into two types of questions: core and
module. Core questions are defined as those considered to be the base for all three SESTAT
surveys. These items are essential for sampling, respondent verification, basic labor force
information, and/or robust analyses of the science and engineering workforce in the SESTAT
integrated data system. They are asked of all respondents each time they are surveyed, as
appropriate, to establish the baseline data and to update the respondents’ labor force status and
changes in employment and other demographic characteristics. Module items are defined as special
topics that are asked less frequently on a rotational basis of the entire target population or some
subset thereof. Module items tend to provide the data needed to satisfy specific policy, research or
data user needs.
After identifying the core and module items that would be included in the SESTAT surveys,
SRS reviewed and identified content items needing improvement and engaged in research to craft
new questions. SRS conducted separate studies on six core items and one study on a module for the
2003 survey questionnaires. The core item research covered the following topics on the SESTAT
questionnaires: employer’s main business, academic positions, academic institutions, work
activities, marital status, and degrees earned abroad.
The core item research resulted in some wording changes to those questions on the SESTAT
questionnaires and a revision of how the occupation code frame is presented. The module item
research led to the addition of questions on community college experiences in the 2008 NSRCG
questionnaire.
The NSRCG questionnaire currently contains a set of core questions on community college
experience (e.g. attendance at community college, associate’s degree attainment). In the 1990s, the
NSRCG also fielded questions on the reasons for attending a community college. The role of
community colleges in postsecondary education has increased over the decade. In order to better
understand the influence of these institutions on the science and engineering educational pathway,
SRS engaged in research to improve the core and module questions on community college
experience that have been fielded in the past in the NSRCG, as well as to develop a new set of
questions for the 2008 NSRCG.
SRS fielded a series of cognitive interviews to improve the battery of questions on community
college attendance in the NSRCG. Through two rounds of cognitive interviews consisting of 20
interviews in the first round and an additional 14 interviews in the second round, SRS was
able to test the performance of previously asked questions on community college experience, as
well as to refine new questions for 2008. The new questions related to the timing of community
college attendance, and the influence of such attendance on educational and career pathways. The
goal of testing these new questions in the cognitive interviews was to ensure that the questions
performed as intended and were clearly understood by the respondents.
Cognitive interviews helped identify some issues that the respondents had with these questions.
Based on the interview results, SRS was able to fine-tune the community college questions and
response category options.
For 2008, the NSRCG questionnaire content will be revised from 2006 as follows:
• Survey reference date changed from April 1, 2006 to October 1, 2008.
• Removed a 2006 module on collaborative activities (it has not yet been decided if this will be rotated back in at a future time).
• Rotated in a module on second job (status, job description, job category, relatedness of second job to highest degree), which was asked in 1993-2001.
• Rotated in a module on respondent's and spouse's areas of technical expertise, which was asked in 1993-2003.
A complete list of questions proposed to be added, dropped, or modified in the 2008 NSRCG
questionnaire is included in Appendix D.
The 2008 NSRCG questionnaire retains all content changes that were tested and implemented
for the 2006 SESTAT questionnaires. In 2005, SRS conducted an extensive pretest under a generic
clearance (OMB No. 3145-0174) that consisted of two phases: (1) two rounds of in-depth cognitive
interviews, and (2) a small-scale field test of the mail questionnaires.
Pretest Phase I – Cognitive interviews
MPR and the U.S. Census Bureau (Survey Research Division) were contracted to conduct in-depth
cognitive interviews on the 2006 NSRCG and the other two SESTAT survey questionnaires.
Cognitive interviews were conducted in two waves, with the waves being scheduled during the
same time period at MPR and the Census Bureau. MPR tested the full-length questionnaires for the
three surveys, while the Census Bureau was asked to focus on the employment section of the
NSRCG. In addition to the questionnaires, the cognitive interviews were also used to test
improvements to the cover letters for the 2006 survey administration.
The first round of cognitive interviews was conducted between February 2 and February 25, 2005.
During this period, MPR and the Census Bureau each interviewed 30 respondents. The second round of
cognitive interviews was conducted between March 25 and May 2, 2005. MPR interviewed 40
respondents (28 in-person and 12 via telephone) and the Census Bureau interviewed 30
respondents. Based on the results of the cognitive interviews, MPR and NSF worked together to
develop a series of experiments to test in the mail portion of the pretest.
Pretest Phase II – Mail Field Test
The field test consisted of two mailings of NSRCG and the other two SESTAT surveys with a
reminder postcard in between; no further nonresponse follow-up was conducted due to time
constraints. The NSRCG mail pretest included a sample of 1,500 selected from a frame of 600,000
records of recent college graduates that were not selected for the 2003 NSRCG. To mimic the
typical characteristics of the NSRCG sample to the extent possible, 30 percent of the pretest sample
consisted of those who held master’s degrees and 70 percent who held bachelor’s degrees. In
addition, minority graduates were over-sampled as follows: 15 percent selected for the sample were
Hispanic, 15 percent black or African American, and 70 percent all others. These 1,500 cases were
randomly assigned a number from 1 to 4 designating the version of the questionnaire each was to
receive, and assigned to one of four control or experimental groups.
Pretest questionnaires were mailed on June 24, 2005 using first class mail. Although mailing a
reminder was not part of the original pretest plan, a postcard reminder was sent to all nonrespondents because of the low response (12 percent) to the first mailing. The postcard was mailed
on July 20, 2005, and provided an additional boost of about 2 percentage points, bringing the cumulative response rate to the first mailing across all three SESTAT surveys to 14 percent. A second mailing was sent on August 3, 2005, with a cover letter urging participation with
a “respond by” date in a Priority Mail envelope. Mail returns were accepted until August 26, 2005.
The final response rate to the NSRCG mail pretest was about 20 percent; the final response rate across all three surveys was 27 percent.
The primary goal of the field pretest was to test the various recommended questionnaire changes
from the cognitive interviews. Specific test conditions were incorporated to obtain research data
that might further improve the questionnaires. These are described below:
1) Testing the placement of the sample person’s name and address label on the questionnaire
(front versus back cover).
2) Testing the Field of Study and Job Category Code Lists in a new format.
3) Testing a different approach to “anchoring” the reference date in the employment questions.
4) Testing a new wording and format of the principal employer type question.
In addition, the experimental versions of the questionnaires had small wording and formatting
changes for some questions of interest such as work activity categories, employer name and
location, supervising, etc. The control versions of the questionnaire retained the same wording for
most questions of interest and Field of Study/Job Category Code Lists used in 2003. Testing the
label placement by the presence versus absence of the content changes created a two-by-two design,
shown in the table below.
Mail Pretest Design

Address Label    Content, Anchor, and Code List
                 Old Content (Control)       New Content (Experimental)
Back             Questionnaire Version 1     Questionnaire Version 3
Front            Questionnaire Version 2     Questionnaire Version 4
The mail pretest also included testing of a new 2006 module on the method and means of collaboration; use of "Yes/No" response options in the few remaining questions that had used the "Mark All That Apply" response option in 2003; and moving the part-time employment questions to a different section and revising the work-related training reasons, to fine-tune the measurement of the concepts for these two items.
Based on the mail pretest results, decisions were made to keep the sample person's name and address labels on the front cover of the questionnaire; use the revised wording and format of the employer sector question; use the new Field of Study/Job Category Code Lists; no longer use the "Mark All That Apply" response option; and not use the reference week "anchoring" question but instead use consistent question wording in all references to the principal job.
Questionnaire Layout
SRS had previously engaged the services of Dr. Don Dillman to further improve the visual
presentation of the 2003 and 2006 SESTAT questionnaires. An SRS staff member with expertise in
visual design theory was also involved in this process. The suggested revisions to the questionnaires
included the standardization and consistent use of formatting, placement of instructions, and
placement of privacy act notices. Also revised were items whose format required the respondent to
review a long list of items before reporting a response, to make the selection process easier for the
respondents.
Web-Based Survey Instrument Tests
The 2008 NSRCG web survey instrument will be updated from the Blaise web instrument that MPR developed for the 2003 NSRCG when it conducted the survey for NSF. In 2006, the NSRCG did not offer a web survey option (the Census Bureau conducted that survey round).
The 2003 NSRCG web instrument went through usability testing that led to changes in layout,
question numbering, and question wording. With layout, participants were asked to comment on
the display of questions that were difficult to fit on a single screen due to lengthy instructions or
numerous response choices. Usability testing was also instructive in determining the best way to
format instructions for individual questions so respondents would be most likely to read them. In
addition to the usability testing, the performance of the Web survey was tested in several different
versions of various internet browsers such as Netscape and Internet Explorer to ensure its
functionality across various browsers. A series of user login IDs were also created for testing the
instrument. Project staff familiar with both the paper and CATI questionnaires conducted the testing so that any unintended inconsistencies between the modes could be detected.
2006 Survey Methodology Tests
Prepaid Incentive Experiment
In 2006, the Bureau of the Census conducted a prepaid incentive experiment in the NSRCG.
This experiment was designed to increase the response rate of traditionally low responding
demographic groups. These groups were offered two levels of incentive in the form of a prepaid
gift card in the amount of $5 or $10. This activated card was included in the mailing with their first
questionnaire. There was also a control group that did not receive an incentive.
The experiment found that both the $5 and $10 incentive groups had significantly higher response rates than the no-incentive group. However, the $10 incentive group did not have a significantly higher response rate than the $5 incentive group.
Postpaid Incentive Experiment
In 2006, the Bureau of the Census also conducted a postpaid incentive experiment on the
NSRCG. This experiment was designed to increase the response rate of the late respondents who
were either classified as refusals (both soft and hard) or elusive nonrespondents (contact
information confirmed to be correct but unable to reach the sample person) by offering a postpaid
monetary incentive in the form of an unactivated $20 gift card. The respondents were told that the
gift card would be activated within two business days once the interview was completed. This
unactivated card was included in the final questionnaire mailing and also offered during the CATI
calls to the incentive treatment group of respondents. There was also a control group that did not
receive an incentive.
The experiment found that the incentive increased the response rate by about 11 percent among previous NSRCG refusal cases; the difference in response rates was statistically significant. However, little difference was found between the incentive and control groups for the elusive nonrespondents, perhaps because contact information for these recent graduates was less accurate.
Brochure Experiment
Historically, the NSRCG advance mailing included a Telephone and Address Verification
Form (TAVF). In 2006, NSRCG tested the effect of mailing the traditional TAVF as opposed to a
colorful brochure that included the same information as the TAVF and results from previous
NSRCG data collections. Three different brochures were developed and tested: (1) one targeted
historically low-responding degree fields, (2) another targeted low-responding racial minorities, and
(3) the last provided information of interest to all respondents. This experiment showed that the
brochure did not perform as well as the traditional TAVF. In 2008, NSRCG will use the TAVF for
the advance mailing.
Survey Methodology Tests to be Undertaken
As described in Section A, an incentive experiment is being considered to assess the effectiveness of (1) offering an incentive early, before resistance sets in, and (2) offering one later, in the second mailing, after the initial respondents have responded. We would also like to look at ways
to help minimize data collection costs by offering a differential incentive that favors completing the
questionnaire on the web. The NSRCG survey contractor, MPR, has successfully used a differential
incentive to encourage web responses in a 2007 survey of college students (Ladinsky et al. 2007) as
well as in the 2003 NSRCG, but at that time the incentive was implemented only during
nonresponse followup. Response rates for the various groups can be tracked, as well as the rate of returns and the chosen completion mode.
Details on the incentive experiment plan are currently under development. This incentive plan
will emphasize both these strategies and build on the results of the 2003 and the 2006 NSRCG. The
incentive plan will be designed to determine if incentives are cost effective when offered more
widely and earlier during the data collection. If this strategy does not result in higher response rates
early in the data collection, a different type of incentive strategy, with smaller dollar amounts to
larger numbers of sampled graduates, will be considered.
NSF plans to conduct additional methodological tests in the current and future rounds of the survey, under the burden hours requested in this clearance, to reduce burden and increase the utility of the survey in the next survey cycle. Proposals for these additional tests are still under consideration and will be submitted for OMB approval.
5. CONTACTS FOR STATISTICAL ASPECTS OF DATA COLLECTION
The individuals consulted on technical and statistical issues related to the data collection are
listed in Table B.1. As mentioned, the data will be collected by MPR, a research contractor selected
through an open competition.
Table B.1. Individuals Consulted on Technical and Statistical Issues

Name                                 Affiliation                                    Telephone Number
Kelly Kang, NSRCG Project Officer    National Science Foundation, Arlington, VA     703-292-7796
Steve Cohen, Chief Statistician      National Science Foundation, Arlington, VA     703-292-7767
Donsig Jang, Senior Statistician     Mathematica Policy Research, Washington, DC    202-484-4246
Anne Ciemnecki, Survey Director      Mathematica Policy Research, Princeton, NJ     609-275-2323
REFERENCES
Brick, J. M., and G. Kalton. “Handling Missing Data in Survey Research.” Statistical Methodology
in Medical Research, vol. 5, 1996, pp. 215–238.
Deming, W. E., and F. F. Stephan. “On a Least Squares Adjustment of a Sample Frequency Table
When the Expected Marginal Totals Are Known.” Annals of Mathematical Statistics, vol. 11,
1940, pp. 427–444.
Herron, A., M. Henly, M. White, and A. Zukerberg. “First Impression: An Advance Contact
Experiment to Locate and Engage Potential Respondents.” Washington, DC: US Census
Bureau, July 31, 2007.
Iannacchione, V. G., J. G. Milne, and R. E. Folsom. “Response Probability Weight Adjustments
Using Logistic Regression.” Proceedings of the Section on Survey Research Methods,
American Statistical Association. Alexandria, VA: American Statistical Association, 1991, pp.
637–642.
Jang, D., D. Edson, and E. Friedman. “Sampling Errors for SESTAT: 1993, 1995, 1997, 1999, and
2001.” Washington, DC: Mathematica Policy Research, Inc., December 22, 2003.
Jang, D., A. Sukasih, W. Sukasih, and X. Lin. “Sampling Errors for 2003 SESTAT.” Washington, DC:
Mathematica Policy Research, Inc., February 2007.
Ladinsky, J., L. Kalb, and G. Mooney. “One Mode or Two? Does Offering Less Yield More?” Paper
presented at the American Association for Public Opinion Research Annual Meeting, Anaheim,
CA, May 17, 2007.
Lessler, J. T., and W. D. Kalsbeek. Nonsampling Errors in Surveys. New York: Wiley, 1992.
Little, R. “Survey Nonresponse Adjustments for Estimates of Means.” International Statistical
Review, vol. 54, 1986, pp. 139–157.
Potter, F. J., V. G. Iannacchione, W. D. Mosher, R. E. Mason, and J. D. Kavee. “Sample Design,
Sampling Weights, Imputation, and Variance Estimation in the 1995 National Survey of Family
Growth.” National Center for Health Statistics. Vital and Health Statistics, series 2, no. 124,
1998.
Singer, E., R. Groves, and A. D. Corning. “Differential Incentives: Beliefs About Practices,
Perceptions of Equity, and Effects on Survey Participation,” Public Opinion Quarterly, vol. 63,
1999, pp. 251–260.
Tambay, J., I. Schiopu-Kratina, J. Mayda, D. Stukel, and S. Nadon. “Treatment of Nonresponse in
Cycle Two of the National Population Health Survey,” Survey Methodology, vol. 24, 1998,
pp. 147–156.
Wilson, C., D. Jang, T. Barton, M. Pierzchala, K. Kang, and J. Tsapogas. “2003 National Survey of
Recent College Graduates: Methodology Report.” Report Submitted to National Science
Foundation. Washington, DC: Mathematica Policy Research, Inc., November 2005.
Wolter, K. Introduction to Variance Estimation. New York: Springer-Verlag, 1985.
Woodruff, R. S. “A Simple Method for Approximating the Variance of a Complicated Estimate.”
Journal of the American Statistical Association, vol. 66, 1971, pp. 879–884.
Yansaneh, I., and J. Eltinge. “Construction of Adjustment Cells Based on Surrogate Items or
Estimated Response Propensities.” Proceedings of the Survey Research Methods Section of the
American Statistical Association. Alexandria, VA: American Statistical Association, 1993, pp.
538–543.
Zukerberg, A. L. “Results of 2006 NSRCG Prepaid Incentive Experiment.” Washington, DC: U.S.
Census Bureau, Demographic Surveys Division, August 17, 2007.