Attachment F: Analysis Plan for DIS Pilot Study
Research Questions
1. Does the use of dependent interviewing reduce the time to administer the SDR?
a. Compare overall timing, section timing, and timing on individual questions across the DI-1, DI-2,
and INDI conditions (from paradata; a sketch of this comparison follows the research questions)
b. Compare the DI-1, DI-2, and INDI conditions on respondents' perceived speed of moving
through the survey (from item RAS1)
2. Does the use of dependent interviewing affect response quality?
a. Compare the DI-1, DI-2, and INDI conditions on rates of reported change in employment status,
employer, or occupation (the literature suggests DI approaches reduce reports of spurious change),
overall and by specific characteristics (as sample size permits):
1. By cohort (defined as categories based on years since degree)
2. By other demographic characteristics (e.g., gender, citizenship)
b. Compare internal consistency of reported changes in employer and occupation across the DI-1,
DI-2, and INDI conditions (e.g., responses to A9 vs. B1/B2 in the DI instruments)
c. Compare the number of characters in verbatim responses for principal job title and job duties
across DI-1, DI-2, and INDI.
d. Compare the DI-1, DI-2, and INDI conditions on item nonresponse rates
e. Evaluate negative response patterns suggestive of poor data quality, specific to each DI version:
1. For DI-2, changing the answer to the first part after being presented with the screen to enter
updated information
2. For DI-1, editing prefilled data to indicate a change, but also checking the "no change" box
3. What do respondents think of the experience of dependent interviewing (in comparison to
independent interviewing)? (These questions are answered by analyzing RAS response data.)
a. (For all conditions) Overall, was this survey experience similar to most other web survey
experiences? (RAS2)
b. (For all conditions) How enjoyable was the survey experience? (RAS3)
c. (For all conditions) Perceived sensitivity of the survey (RAS4)
d. (For all conditions) Confidence in the protection of the data on the survey (RAS5)
e. (For DI conditions) Have respondents participated in a survey (web or other) where the vendor
had historical information from them or another source? (RAS8)
f. (For DI conditions) Reactions to seeing their historical information displayed on the
survey/(For INDI condition) How they might react if they saw their historical information
displayed on the survey (RAS9)
g. (For DI conditions) Do they remember doing the SDR in the past? (RAS6)
h. (For DI conditions) Did they remember their responses from last time they completed the SDR?
(RAS7)
i. (For DI conditions) Did displaying their historical information help them to provide more
accurate data?/(For INDI condition) Would displaying their historical information help them to
provide more accurate data? (RAS11)
j. (For DI conditions) How did respondents decide when to update information versus leave it as-is? (RAS12, RAS13)
k. (For DI conditions) Did displaying their historical information change the perceived burden of
the survey experience/(For INDI condition) Would displaying their historical information
change the perceived burden of the survey experience (RAS14)
l. (For all conditions) Do they think dependent interviewing is a good or bad idea? (RAS16)
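As a rough illustration of the research question 1a timing comparison, the sketch below computes average full-survey completion time by condition from a paradata extract and applies a rank-based three-group test. The column names ("condition", "total_minutes") and the example values are hypothetical placeholders, not pilot data.

```python
# Hedged sketch of the RQ1a timing comparison; not the production analysis.
import pandas as pd
from scipy import stats

# Hypothetical paradata extract: one row per completed case.
paradata = pd.DataFrame({
    "condition":     ["DI-1", "DI-1", "DI-2", "DI-2", "INDI", "INDI"],
    "total_minutes": [11.2, 9.8, 10.5, 9.1, 13.4, 12.7],
})

# Average time to complete by questionnaire version (Table 1-style summary).
print(paradata.groupby("condition")["total_minutes"].mean())

# Completion times are usually right-skewed, so a rank-based test is one
# reasonable way to compare the three conditions.
groups = [g["total_minutes"].to_numpy() for _, g in paradata.groupby("condition")]
h_stat, p_value = stats.kruskal(*groups)
print(f"Kruskal-Wallis H = {h_stat:.2f}, p = {p_value:.3f}")
```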
Method: Small-scale DIS Pilot Study
• Embed an experiment that randomly assigns sampled cases to one of three questionnaire
conditions: dependent interviewing with a one-stage approach (DI-1), dependent interviewing
with a two-stage approach (DI-2), or independent interviewing (INDI) (see the assignment sketch
after this list).
o The DI conditions will display data from a subset of items from the most recent SDR
completed and ask for confirmation or updates.
o The INDI condition will be an abbreviated version of the current SDR questionnaire
that is used in production.
• Include a brief response analysis survey (RAS): questions asked at the end of the survey to
gather information about the experience of completing it.
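A minimal sketch of the random assignment step described in the first bullet above, assuming a simple one-stage assignment with a fixed seed for reproducibility; the frame and variable names ("sample_frame", "case_id") are hypothetical.

```python
# Illustrative random assignment of sampled cases to the three questionnaire
# conditions; the sampling frame here is a placeholder.
import numpy as np
import pandas as pd

rng = np.random.default_rng(seed=2020)  # fixed seed so the assignment can be reproduced

sample_frame = pd.DataFrame({"case_id": range(1, 301)})  # 300 hypothetical cases

# Build a balanced vector of condition labels and shuffle it so each case
# has an equal chance of landing in any condition.
labels = np.tile(["DI-1", "DI-2", "INDI"], reps=len(sample_frame) // 3 + 1)
sample_frame["condition"] = rng.permutation(labels[: len(sample_frame)])

print(sample_frame["condition"].value_counts())
```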
Analysis
The tables will include measures of statistical significance when sample sizes permit. Tables
that use survey response data will have weighted and unweighted versions.
Employer changes will be measured by change in employer name alone, change in department/division
but not employer name, change in employer name and address, and change in either name or address,
along with the respondent's self-categorized change in employer.
Employment status change will be measured by a change from working for pay or profit on February 1
of the prior cycle to not working for pay or profit in the DIS Pilot Study instrument, including changes
to retired or not in the labor force.
Other demographic characteristics shown in the tables will include citizenship status and broad field of
degree, as sample size permits.
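The sketch below illustrates one way the change measures defined above could be derived by comparing prior-cycle (prefilled) values with DIS Pilot Study responses, and how unweighted and weighted change rates by condition might be tabulated. All field names (prior_working, current_working, prior_employer, current_employer, weight) and values are hypothetical.

```python
# Hedged sketch of deriving change indicators and change rates by condition;
# the data frame, variable names, and values are placeholders.
import pandas as pd

resp = pd.DataFrame({
    "condition":        ["DI-1", "DI-2", "INDI", "INDI"],
    "prior_working":    [True,  True,   True,   True],   # working for pay/profit, prior cycle
    "current_working":  [True,  False,  True,   True],   # working for pay/profit, pilot reference date
    "prior_employer":   ["Acme Labs", "Acme Labs", "Initech", "Initech"],
    "current_employer": ["Acme Labs", "Acme Labs", "Globex",  "Initech"],
    "weight":           [1.4, 0.8, 1.1, 0.9],
})

# Employment status change: working in the prior cycle but not working
# (including retired or not in the labor force) at the pilot reference date.
resp["emp_status_change"] = resp["prior_working"] & ~resp["current_working"]

# Employer name change: one component of the employer-change measure.
resp["employer_name_change"] = resp["prior_employer"] != resp["current_employer"]

# Unweighted and weighted change rates by questionnaire condition.
unweighted = resp.groupby("condition")["employer_name_change"].mean()
weighted = resp.groupby("condition").apply(
    lambda g: (g["employer_name_change"] * g["weight"]).sum() / g["weight"].sum()
)
print(unweighted)
print(weighted)
```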
Table 1. Average time to complete (minutes) by questionnaire version and employment status.
                          Currently employed              Not currently employed
                          DI-1     DI-2     INDI          DI-1     DI-2     INDI
Full survey
Question #
Question #
Question #
Table 2a. Number and percent of respondents with a measured employment change, by questionnaire
version
                               DI-1     DI-2     INDI     DI-1     DI-2     INDI
                               Count    Count    Count    %        %        %
Change in emp. status
Change in employer
Change in job title
Change in OCC code
Change in top level OCC code
Table 2b. Number and percent of respondents for whom self-categorized change in employer or job type
(B1/B2) is consistent with measured change in employer or job type, by version
Self-categorized matches       DI-1     DI-2     INDI     DI-1     DI-2     INDI
measure for:                   Count    Count    Count    %        %        %
Change in employer
Change in job type
Change in both
No change in either
Table 3. Item nonresponse, by questionnaire version
                     DI-1 (6)   DI-2 (7)   INDI       DI-1     DI-2     INDI
                     Count      Count      Count      %        %        %
Question #
Question #
Question #
Overall average
6 For DI-1 (one stage), item nonresponse occurs if the respondent neither made an update to the prefilled
data nor marked the box indicating no changes (i.e., the displayed data reflect the current situation).
7 For DI-2 (two stage), responses to both the first and second parts are needed for the data to be complete.
Item nonresponse occurs when the respondent does not answer both parts of the question.
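As a rough illustration of the item nonresponse definitions in the footnotes to Table 3, the sketch below derives a nonresponse flag for a single DI item under each design; the indicator names (edited_prefill, no_change_box, part1_answered, part2_answered) are hypothetical, not actual instrument fields.

```python
# Hedged sketch of the DI-1 and DI-2 item nonresponse definitions above;
# the flags and example values are placeholders.
import pandas as pd

# DI-1 (one stage): nonresponse if the respondent neither edited the prefilled
# data nor marked the "no change" box confirming it.
di1 = pd.DataFrame({
    "edited_prefill": [False, True, False],
    "no_change_box":  [False, False, True],
})
di1["item_nonresponse"] = ~di1["edited_prefill"] & ~di1["no_change_box"]

# DI-2 (two stage): nonresponse if the respondent did not answer both parts
# of the question.
di2 = pd.DataFrame({
    "part1_answered": [True, True, False],
    "part2_answered": [True, False, False],
})
di2["item_nonresponse"] = ~(di2["part1_answered"] & di2["part2_answered"])

print("DI-1 item nonresponse rate:", di1["item_nonresponse"].mean())
print("DI-2 item nonresponse rate:", di2["item_nonresponse"].mean())
```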
Table 4a. Percentage of DI-2 respondents who initially indicated a change since the last cycle (first part
of the DI question), but reversed that response when presented with the screen to enter updated
information (second part of the DI question).
              (1)        (2)        (3)        (4)
Question #
Question #
Question #
(1) Final count of "no change" from last cycle (first part of DI question)
(2) Count of respondents indicating "no change" and backing up at the second part of DI
(3) % of "no change" from last cycle (first part of DI question)
(4) % of respondents indicating "no change" who also backed up at the second part of DI
Table 4b. Percentage of DI-1 respondents who entered edits to a prefilled answer but also checked the
box indicating "no change since prior cycle".
              (1)        (2)        (3)        (4)
Question #
Question #
Question #
(1) Count of respondents editing a prefilled response
(2) Count of respondents editing a prefilled response and also indicating "no change"
(3) % of respondents editing a prefilled response
(4) % of respondents editing a prefilled response who also indicated "no change"
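The sketch below illustrates how the Table 4b pattern (editing a prefilled answer while also checking the "no change" box) could be flagged and summarized per question; the question identifiers, indicator names, and values are hypothetical.

```python
# Hedged sketch for the Table 4b negative response pattern in DI-1;
# question IDs, flags, and counts are placeholders.
import pandas as pd

di1_items = pd.DataFrame({
    "question":       ["A1", "A1", "A2", "A2"],
    "edited_prefill": [True, True, True, False],
    "no_change_box":  [True, False, True, True],
})

# Conflicting pattern: prefilled answer was edited AND "no change" was checked.
di1_items["conflicting"] = di1_items["edited_prefill"] & di1_items["no_change_box"]

summary = di1_items.groupby("question").agg(
    n_edited=("edited_prefill", "sum"),
    n_conflicting=("conflicting", "sum"),
)
summary["pct_conflicting"] = 100 * summary["n_conflicting"] / summary["n_edited"]
print(summary)
```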
Table 5a. Average length of gap reported between end of job reported in the previous SDR response
and DIS Pilot Study reference date (September 1, 2020), for cases reporting a job change and currently
employed
                          Time between end of last job and September 1, 2020
                          Employer changed      Job title changed      OCC code changed
DI-1 questionnaire
DI-2 questionnaire
INDI questionnaire
Table 5b. Average length of gap reported between end of job reported in the previous SDR response
and DIS Pilot Study reference date (September 1, 2020), for cases reporting not currently employed
                          Time between end of last job and September 1, 2020
                          Currently Retired     Currently Unemployed     Currently Not in Labor Force
DI-1 questionnaire
DI-2 questionnaire
INDI questionnaire
Table 6A. Rates of change in employment status from previous SDR response to 2020 by
questionnaire condition, and by other characteristics
Characteristics (sample size dependent)          DI-1 %     DI-2 %     INDI %
Overall rate of change: employment status
Year of graduation:
- 1970-1979
- 1980-1989
- 1990-1999
- 2000-2009
- 2010-2013
- 2014 – 2018
By sex:
- Male
- Female
By race/ethnicity:
- (categories)
By other demographic
characteristic(s)
SDR response year
- 2015
- 2017
- 2019
Table 6B. Rates of Employer name change from previous SDR response to 2020 by questionnaire
condition, and by other characteristics
Characteristics (sample size dependent)          DI-1 %     DI-2 %     INDI %
Overall rate of change: employer name
Year of graduation:
- 1970-1979
- 1980-1989
- 1990-1999
- 2000-2009
- 2010-2013
- 2014 - 2018
By sex:
- Male
- Female
By race/ethnicity:
- (categories)
By other demographic
characteristic(s)
SDR response year
- 2015
- 2017
- 2019
Table 6C. Rates of Principal Job Title change from prior SDR response to 2020 by questionnaire, and
other characteristics
Characteristics (sample size dependent)          DI-1 %     DI-2 %     INDI %
Overall rate of change: job title
Year of graduation:
- 1970-1979
- 1980-1989
- 1990-1999
- 2000-2009
- 2010-2013
- 2014 - 2018
By sex:
- Male
- Female
By race/ethnicity:
- (categories)
By other demographic
characteristic(s)
SDR response year
- 2015
- 2017
- 2019
Table 6D. Rates of OCC change from prior SDR response to 2020 by questionnaire, and other
characteristics
Characteristics (sample size dependent)          DI-1 %     DI-2 %     INDI %
Overall rate of change: OCC code
Year of graduation:
- 1970-1979
- 1980-1989
- 1990-1999
- 2000-2009
- 2010-2013
- 2014 - 2018
By sex:
- Male
- Female
By race/ethnicity:
- (categories)
By other demographic
characteristic(s)
SDR response year
- 2015
- 2017
- 2019
Analysis of Response Analysis Survey (RAS) responses
Table 7. Distribution of responses to RAS question concerning perceived speed of completing the
questionnaire
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Very slow
Somewhat slow
Somewhat fast
Very fast
Table 8. Distribution of responses to RAS question on “How would you describe today’s survey
relative to other SDR surveys you’ve completed?”
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Very similar
Somewhat similar
Somewhat dissimilar
Very dissimilar
Table 9. Distribution of responses to RAS question on “To what extent did you enjoy completing
today’s survey?”
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Did not enjoy at all
Enjoyed a little
Enjoyed somewhat
Enjoyed a great deal
Table 10. Distribution of responses to RAS question “How sensitive did you think the questions on this
survey were?”
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Not sensitive at all
A little sensitive
Somewhat sensitive
Very sensitive
Table 11. Distribution of responses to RAS question “How confident are you that NCSES will protect
your answers?”
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Not securely at all
Not very securely
Somewhat securely
Very securely
Table 12. Distribution of responses to RAS question “Today’s survey referenced the 201x Survey of
Doctorate Recipients. Do you remember completing that survey?”
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Yes
No
Table 13. Distribution of responses to RAS question “Prior to answering today’s survey, did you
remember what your responses to the 201x Survey of Doctorate Recipients had been?”
                      DI-1     DI-2     DI-1     DI-2
                      Count    Count    %        %
Yes
No
Table 14. Distribution of responses to RAS question gathering reactions to DI approach
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Surprised:     Very much
               A little
               Not at all
Confused:      Very much
               A little
               Not at all
Appreciative:  Very much
               A little
               Not at all
Comfortable:   Very much
               A little
               Not at all
Annoyed:       Very much
               A little
               Not at all
Concerned:     Very much
               A little
               Not at all
Relieved:      Very much
               A little
               Not at all
Table 15. Distribution of responses to RAS question gathering reactions to whether DI (made/would
make) survey responses more or less accurate
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
More accurate
Less accurate
No impact
Table 16. Distribution of responses to RAS question asking "Were there any questions in today's
survey where you felt that the answer displayed from 201x was 'accurate enough' and you decided to
leave it as-is, rather than updating it with potentially more accurate information?"
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Yes, for one question
Yes, for more than one question
No
Table 17. Distribution of responses to RAS question asking “Were there any questions in today’s
survey where the answer displayed from 201x was no longer true, and you decided to leave it as-is,
rather than updating it with current information?”
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Yes, for one question
Yes, for more than one question
No
Table 18. Distribution of responses to RAS question on “Do you think that pre-filling some of your
answers from 201x and asking you to confirm or update them (made/would make) this survey…?”
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Much more burdensome
A little more burdensome
Neither more nor less burdensome
A little less burdensome
Much less burdensome
Table 20. Distribution of responses to RAS question on “Do you think pre-filling your most recent
answers to Survey of Doctorate Recipients is a…?”
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Very bad idea
Somewhat bad idea
Neither a good nor a bad idea
Somewhat good idea
Very good idea
Table 21. Distribution of responses to RAS question on whether or not records were used to answer
questions.
                      DI-1     DI-2     INDI     DI-1     DI-2     INDI
                      Count    Count    Count    %        %        %
Yes
No
Additional Analyses
• Frequency distributions for all questions, including RAS.
• Comparison of coded RAS open-ended responses across the three instruments
• Interaction of RAS items with reported employment change for currently employed respondents
• Interaction of RAS items with reported change in employment status for respondents not
currently employed
• Survey breakoffs by condition (see the sketch below)
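For the breakoff comparison in the last bullet, one simple approach is a chi-square test on a condition-by-outcome contingency table, sketched below; the counts shown are placeholders, not pilot results.

```python
# Hedged sketch of a breakoff-by-condition comparison; counts are placeholders.
import numpy as np
from scipy import stats

# Rows: DI-1, DI-2, INDI; columns: completed, broke off.
counts = np.array([
    [95,  5],
    [92,  8],
    [90, 10],
])

chi2, p_value, dof, expected = stats.chi2_contingency(counts)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```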