An Evaluation of the National Science Foundation’s Graduate Research Fellowship Program: Overview of Proposed Study
Program Overview
As part of the National Science Foundation’s (NSF) continued commitment to graduate student education in the U.S., the Graduate Research Fellowship Program (GRFP), which began in 1952, seeks to promote and maintain advanced training in Science, Technology, Engineering and Mathematics (STEM) fields by annually awarding roughly 2,000 fellowships1 to U.S. citizens, nationals, and permanent residents for graduate study in research-based programs. The goals of the program are:
To select, recognize, and financially support individuals early in their careers with the demonstrated potential to be high achieving scientists and engineers, and
To broaden the participation of underrepresented groups, including women, minorities, and persons with disabilities, in science and engineering fields.
Underpinning the program goals are NSF’s broader strategic goals, including that of performing as a model organization. To achieve this goal and to become a model Federal steward, representing excellence in management and fiscal responsibility, NSF seeks to “learn through assessment and evaluation of NSF programs, processes, and outcomes; continually improve them; and employ outcomes to inform NSF planning, policies, and procedures” (http://www.nsf.gov/news/strategicplan/nsfstrategicplan_2011_2016.pdf, pp. 16-17, italics in original). Thus, excellence in management is an underlying goal of each NSF program, including the GRFP.
Purpose and Need for Study
NSF is seeking to conduct a study that has three purposes:
Provide descriptive information, related to the GRFP program goals, on the demographics, educational decisions, career preparation, aspirations and progress, and professional productivity of GRFP Fellows, comparable non-recipient applicants, and national populations of graduate students and doctorate recipients.
Provide rigorous evidence of the impact of the GRFP on individuals’ educational decisions, career preparation, aspirations and progress, and professional productivity.
Provide an understanding of how the program is implemented by universities and whether and how specific program policies could be adjusted to make the program more effective in meeting its goals.
Previous studies of the GRFP were largely completed in the mid-1970s to the mid-1990s. The most recent study, published in 2002, examined GRFP Fellow cohorts through 1993 and is now dated. The NSF GRF program collects data on an ongoing basis through multiple sources that are used for program management and accountability purposes. These sources include reports from the GRFP Committee of Visitors, annual surveys of the review panelists, comments from Fellows and university GRFP coordinating officials, and data compiled from the applications. In addition, GRFP Fellows submit annual activity reports, the format for which was revised in 2010 to include activities that contribute to career preparation (such as acquisition of research skills and other professional skills), data on career plans, internships, and other sources of financial support. The data are tracked over multiple years to examine trends and identify gaps that need to be addressed in subsequent competitions. However, these data, while useful, offer limited information for prior years, and they do not address program impact or implementation.
Thus, NSF needs current information on several fronts to inform future decisions about program structure and design that cannot be addressed either with NSF data or existing national databases. These include: (a) how GRFP Fellows differ from their peers in terms of demographics, educational trajectories, and career outcomes; (b) the impact of the GRF program on Fellows in terms of educational trajectories, career outcomes, and professional productivity; (c) the effect of the GRF program on institutions in terms of student diversity and quality in STEM graduate programs; and (d) program implementation. The current study, being conducted for NSF by NORC at the University of Chicago, is intended to address each of these areas.
Research Questions and Study Approach
Research Questions
The study focuses on the following research questions:
RQ1. What is the impact of the GRFP fellowship on the graduate school experience?
RQ2. What is the impact of the GRFP fellowship on career outcomes?
RQ3. What are the effects of the GRFP on institutions?
RQ4. Is the program design effective in meeting program goals?
While RQ1 and RQ2 are framed in terms of impact, a necessary component of the research is examining how the Fellows compare with peers in terms of demographics, aspirations, educational trajectories, career outcomes, and professional productivity, to help address the program goals. RQ3 and RQ4 are designed to address both the GRF program goals and the underlying NSF strategic goal of excellence in management.
Data Sources
To address the research questions, the study will use both primary and secondary data sources. In terms of primary data collection, the study will:
Collect data from Fellows and carefully-matched counterparts (QG2 Honorable Mentions)2 through a survey (GRFP Follow-Up Survey) that asks about graduate school experiences, educational attainment, career outcomes, employment characteristics, and professional productivity (RQ1 and RQ2). The survey also asks Fellows about the influence of program elements (choice, flexibility, and monetary value) on their decision to enroll in and successfully complete STEM graduate programs (RQ4).
Collect data from two samples of institutions:
In-depth data from six institutions (“institutional site visit sample”) gathered through site visits, which will encompass in-person interviews with administrators, faculty, and staff to understand: (1) the current climate, (2) perceived impact of the program on Fellows, institutions, and programs, (3) program implementation, and (4) GRFP policies. The in-depth data will be used to address RQ3 and RQ4.
Targeted data from a larger sample of 20 institutions (“institutional phone sample”) gathered from shorter phone interviews more narrowly focused on implementation and specific GRFP policies (RQ4 and, to a more limited extent, RQ3).
These two institutional samples serve two different purposes, each suited to the type of data to be collected. By limiting the site visit sample to six institutions and broadening the institutional phone sample to include 20 institutions, we balance the need for in-depth data collection with the goals of minimizing respondent burden and collecting data from a broader pool of institutions.
Review and analyze similar federal fellowship programs using data collected from websites, program materials, and interviews with program officers managing these programs. This part of the study will help inform GRFP policies and best practices (RQ4). The findings will be valuable in understanding how best to support Fellows and help develop a more diverse STEM workforce.
Secondary data sources include the Doctorate Records File to provide a national context and national comparison group.
Analysis
Descriptive analysis will be used to examine the composition, experiences, and outcomes of Fellows, non-recipients, and national peers and will provide evidence of GRFP participation of underrepresented groups and trends in program selection, recognition, and financial support of early career scientists and engineers. To measure impact, the study will model outcomes using quasi-experimental methods to compare outcomes of the treatment group (Fellows) with outcomes of plausibly similar control groups (QG2 Honorable Mentions). These methods are widely accepted as the best methods on which to base causal inferences in the absence of a randomized experiment (i.e., when it is not feasible to randomly assign participants to treatment and control groups). To examine implementation, the study will use qualitative methods to code and analyze the institutional interviews and to draw out lessons learned from the interviews with program officers.
Findings and Dissemination
The data collected in this study and the analytic reports will provide a comprehensive look at the GRFP, its impact on Fellows, institutions, and the science and engineering workforce, and the extent to which the program is meeting its goals. In particular, the findings will provide information on:
The influence the GRFP has had on the decisions, experiences, academic attainment, and career outcomes of Fellows compared with carefully-matched peers;
The extent to which the program has broadened the participation of underrepresented groups in STEM fields at the graduate level;
The perceived effects on institutions in terms of student financing, enrollment, diversity and quality (among others);
Whether specific design elements—choice, flexibility, and monetary value—are working as intended and the extent to which they are valued; and
The need (if any) for changes in the way the program is structured to make it more effective.
Overall, the study findings will provide valuable insights to NSF on the impact of its investments in the GRFP and inform its program management. In conjunction with findings from the review of similar federal fellowship programs, the findings may prove valuable to the larger community of program officers administering these programs as well as to the graduate education policy community in understanding how best to support graduate education and help develop a more diverse STEM workforce.
Study results will be reported to the Division of Graduate Education, Education and Human Resources (EHR)/NSF, distributed within the community of universities that participate in the GRFP, and published on the NSF website. Limited print copies of the full report will be made available to NSF, along with 500 copies of a printed executive summary that can be disseminated more widely and will be useful to a variety of public audiences. A policy brief reporting on the review and analysis of federal fellowship programs will be made available to all federal fellowship program managers and may be of interest to the larger foundation community as well. Findings of more general policy or methodological interest will be distributed more broadly, through conference presentations and submissions for publication in peer-reviewed journals.
An Evaluation of the National Science Foundation’s Graduate Research Fellowship Program
As part of the National Science Foundation’s (NSF) commitment to graduate student education in the U.S., the Graduate Research Fellowship Program (GRFP), which began in 1952, seeks to promote and maintain advanced training in Science, Technology, Engineering and Mathematics (STEM) fields3 by annually awarding approximately 2,000 fellowships4 to U.S. citizens, nationals, and permanent residents for graduate study in research-based programs. Accompanying this award is the expectation that NSF Graduate Fellows (referred to as “Fellows” in the remainder of the document) complete their degree and become scientists and engineers with the skills and knowledge necessary to contribute to research, teaching, and/or innovation in STEM fields.
NSF is seeking to conduct a study that has three purposes:
Provide descriptive information, related to the GRFP program goals, on the demographics, educational decisions, career preparation, aspirations and progress, and professional productivity of GRFP Fellows, comparable non-recipient applicants, and national populations of graduate students and doctorate recipients.
Provide rigorous evidence of the impact of the GRFP on individuals’ educational decisions, career preparation, aspirations and progress, and professional productivity.
Provide an understanding of how the program is being implemented by universities and whether and how specific program policies could be adjusted to make the program more effective in meeting its goals.
There have been several previous studies of the GRFP. The National Research Council conducted four major studies from the mid-1970s to the mid-1990s, focusing on traditional measures of academic career success such as completion rates, time to degree, faculty appointment, success in obtaining research grants, and publications and citations.5 These studies used secondary data sources (such as NSF’s annual Survey of Earned Doctorates (SED), Survey of Doctorate Recipients (SDR), and the NSF/NIH (National Institutes of Health) postdoctoral and research grant files) to examine completion rates and career plans of several cohorts of Fellows (the 1952-1972, 1967-1976, 1972-1981, and 1979-1981 cohorts, respectively). The authors of these studies acknowledged limitations in the data used in the reports, including the limited measures of career outcomes, the lack of a credible comparison group, and the need for primary data collection from students and faculty.
The most recent comprehensive evaluation of the GRFP (Goldsmith, Presley, and Cooley, 2002)6 was based on a mixed-method analysis of data from several sources. This included: (a) secondary data analysis of the 1979-1993 Fellows using data from the SED and NSF’s Cumulative Index (CI); (b) analysis of a graduate student follow-up survey of the 1989-1993 cohort of Fellows and graduate student peers in four disciplines at the Fellows’ institutions; and (c) analysis of interviews with administrators, faculty, staff, and students conducted during site visits to six research universities. The 2002 study compared the highest quality award recipients (QG1) to lower-quality recipients (QG2) and non-recipients and used the survey data to compare Fellows to their disciplinary peers. While the study produced important and useful information, it was primarily descriptive and is now dated.
NSF currently collects data from different sources to inform program management and for accountability purposes. First, it collects data on the composition of program applicants by outcome of the application process (awardees (QG1 and QG2; Fellows), those ranked as high quality but not offered the award (QG2; Honorable Mentions), and declinations) and reports trends over time in terms of distribution by field of study, level of academic preparation, demographics, geographic representation, and baccalaureate institution (disaggregated by minority-serving institution status). Second, NSF collects data on the composition of the GRFP panelist pool by demographics, institution type, professional rank, geography, and by new versus return panelist status. The panelists complete a survey that is used to inform the review process and future outreach efforts to underrepresented populations. Third, NSF collects annual activity reports from Fellows while enrolled in graduate school on their research, professional productivity (including papers, patents, and inventions), research and teaching appointments, activities that integrate research and education, international achievements, and their activities to help broaden participation. The report format was revised in 2010 to gather additional information on career plans, internships, sources of financial support during the Fellowship, and acquisition of research skills and professional skills. Fellows are also asked to summarize, for public dissemination, their accomplishments over the past year and the intellectual merit and broader impact of their work. The new format implemented in 2010 will make these activity reports a rich source of data going forward. There is limited data for prior years.
There are three major drivers for the new data collection. First, the overall climate for graduate education has changed over the past two decades, along with the characteristics of students enrolled in college,7 and we need to better understand the current environment and its effect on program outcomes and on institutions hosting Fellows. Second, addressing issues of impact requires a more rigorous and sophisticated modeling approach than has been used in previous studies. Better estimates of the program’s impact can help inform NSF’s policies and program review. This cannot be addressed with the data that NSF collects from Fellows because issues of impact require a counterfactual (a comparison group) against which the experiences and outcomes of Fellows can be compared. Third, there have been changes to the program since 2010 (including an increased number of annual awards and policies related to permitted service, no concurrent federal fellowships, and affiliation with U.S. institutions, among others), largely based on earlier studies and program review. NSF now needs information on the implementation of the revised policies and their effect on institutions and students to inform future decisions about program structure and design. Again, these needs cannot be addressed by the data currently collected by NSF because that is not the focus of the activity reports submitted by Fellows.
To address these needs, the study will use both primary and secondary data sources and a mix of rigorous quantitative and qualitative analytic techniques. In terms of primary data collection, the study will:
Collect data from Fellows and carefully-matched counterparts (QG2 Honorable Mentions)8 through a survey (GRFP Follow-Up Survey) that asks about graduate school experiences, educational attainment, career outcomes, employment characteristics, and professional productivity. The survey also asks Fellows about the influence of program elements (choice, flexibility, and monetary value) on their decision to enroll in and successfully complete STEM graduate programs.
Collect data from two samples of institutions:
In-depth data from six institutions (“institutional site visit sample”) gathered through site visits, which will encompass in-person interviews with administrators, faculty, and staff to understand: (1) the current climate, (2) perceived impact of the program on Fellows, institutions, and programs, (3) program implementation, and (4) GRFP policies.
Targeted data from a larger sample of 20 institutions (“institutional phone sample”) gathered from shorter phone interviews more narrowly focused on implementation and specific GRFP policies.
These two institutional samples serve two different purposes, each suited to the type of data to be collected. By limiting the site visit sample to six institutions and broadening the institutional phone sample to include 20 institutions, we balance the need for in-depth data collection with the goals of minimizing respondent burden and collecting data from a broader pool of institutions.
Review and analyze similar federal fellowship programs using data collected from websites, program materials, and interviews with program officers managing these programs. This part of the study will help inform GRFP policies and best practices. The findings will be valuable in understanding how best to support Fellows and help develop a more diverse STEM workforce. Note that, because the respondents will be interviewed in their official capacity as program officers, this data-collection effort is exempt from OMB review and is not discussed further in this OMB submission. It is mentioned here because it is an important piece of the study that will help inform the program.
Secondary data sources include the Survey of Earned Doctorates (SED) and the Survey of Doctorate Recipients (SDR). These will be used to provide a national context and national comparison groups.
Descriptive analysis will be used to examine the composition, experiences, and outcomes of Fellows, non-recipients, and national peers and will provide evidence of GRFP participation of underrepresented groups and trends in program selection, recognition, and financial support of early career scientists and engineers. To measure impact, the study will model outcomes using quasi-experimental methods to compare outcomes of the treatment group (Fellows) with outcomes of plausibly similar control groups (QG2 Honorable Mentions). These methods are widely accepted as the best methods on which to base causal inferences in the absence of a randomized experiment (i.e., when it is not feasible to randomly assign participants to treatment and control groups). To examine implementation, the study will use qualitative methods to code and analyze the institutional interviews and to draw out lessons learned from the interviews with program officers.
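To illustrate the quasi-experimental approach, the sketch below shows one common implementation of propensity score matching in Python. It is a minimal sketch only: the column names (`fellow`, `outcome`, and the covariate list) are hypothetical placeholders rather than the study's actual measures, and the matching specification shown is not necessarily the one the study team will use.

```python
# Minimal propensity-score-matching sketch (hypothetical column names).
# Fellows (treatment group) are matched to QG2 Honorable Mentions
# (comparison group) on observed covariates, and mean outcomes are
# compared across the matched pairs.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def psm_effect(df: pd.DataFrame, covariates: list) -> float:
    """Average treatment effect on the treated via 1:1 nearest-neighbor matching."""
    # 1. Model the probability of holding the fellowship given covariates.
    model = LogisticRegression(max_iter=1000).fit(df[covariates], df["fellow"])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df["fellow"] == 1]
    control = df[df["fellow"] == 0]

    # 2. Match each Fellow to the Honorable Mention with the closest score.
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_control = control.iloc[idx.ravel()]

    # 3. Difference in mean outcomes across matched pairs.
    return treated["outcome"].mean() - matched_control["outcome"].mean()
```

In practice, matching would likely be performed within application cohort and field of study, with balance diagnostics on the matched sample before any effects are estimated.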
The GRFP evaluation will be the first comprehensive evaluation of this program since 2002 and the first to examine more recent cohorts (1994-2011) of Fellows and peers. While other NSF data collections such as the SED and SDR examine the graduate experience and career trajectories of doctoral recipients in STEM research fields, the GRFP evaluation is the only study to specifically assess the impact of this program on Fellows (both those in doctoral programs and those in master’s degree programs) and on institutions. In addition, the GRFP evaluation is the only current study that will examine the program’s impact on Fellows from multiple angles (e.g., graduate experience, career trajectories) while also gaining an external perspective from academic institutions. Although the previous studies noted above inform this study, the present study’s approach will contribute to and significantly advance the current state of knowledge regarding the program, its implementation, outcomes, and impact.
A.2. Purpose and Uses of the Data
GRFP’s goals are:
To select, recognize, and financially support individuals early in their careers with the demonstrated potential to be high achieving scientists and engineers, and
To broaden participation in STEM fields of underrepresented groups, including women, minorities, and persons with disabilities.
Underpinning the program goals are NSF’s broader strategic goals, including that of performing as a model organization. To achieve this goal and to become a model Federal steward, representing excellence in management and fiscal responsibility, NSF seeks to “learn through assessment and evaluation of NSF programs, processes, and outcomes; continually improve them; and employ outcomes to inform NSF planning, policies, and procedures” (http://www.nsf.gov/news/strategicplan/nsfstrategicplan_2011_2016.pdf, pp. 16-17, italics in original). Thus, excellence in management is an underlying goal of each NSF program, including the GRFP.
To be conducted by NORC at the University of Chicago, the overall purpose of the study is to help NSF evaluate the impact of the program on Fellows, institutions, and the science and engineering workforce and the extent to which the program elements are effective in meeting the program goals. More specifically, the research questions (RQ) addressed by the study include:
RQ1. What is the impact of the GRFP fellowship on the graduate school experience?
RQ2. What is the impact of the GRFP fellowship on career outcomes?
RQ3. What are the effects of the GRFP on institutions?
RQ4. Is the program design effective in meeting program goals?
While RQ1 and RQ2 are framed in terms of impact, a necessary component of the research is examining how the Fellows compare with peers in terms of demographics, aspirations, educational trajectories, career outcomes, and professional productivity, to help address the program goals. RQ3 and RQ4 are designed to address both the GRF program goals and the underlying NSF strategic goal of excellence in management.
The study approach is summarized in Table A.2.1, which presents a crosswalk between the RQs and the data sources and analyses to be used to address them.
Table A.2.1. An Overview of the Study Approach: Crosswalk between Research Questions and Proposed Data Sources and Analyses
| Research Question | Data Source | Analysis |
| --- | --- | --- |
| RQ1. What is the impact of the GRFP fellowship on the graduate school experience? | Primary data: GRFP Follow-Up Survey of Fellows and QG2 Honorable Mentions1. Secondary data: SED and SDR (national context and comparison groups). | Descriptive analysis; multivariate analysis (including regression discontinuity and propensity score matching). |
| RQ2. What is the impact of the GRFP fellowship on career outcomes? | Primary data: GRFP Follow-Up Survey. Secondary data: SED and SDR. | Descriptive analysis; multivariate analysis. |
| RQ3. What are the effects of the GRFP on institutions? | Primary data: institutional site visit and phone interview samples. | Qualitative coding and analysis of interview data. |
| RQ4. Is the program design effective in meeting program goals? | Primary data: GRFP Follow-Up Survey; institutional site visit and phone interview samples; review of similar federal fellowship programs. | Descriptive analysis of survey responses; qualitative coding and analysis of interviews and program materials. |
Note: 1. Our samples will be drawn from the GRFP applicant files, which have information on each applicant’s final award status. This allows students to be classified as Fellows or QG2 non-recipients. Since a small number of students refuse the fellowship, the GRFP Follow-Up Survey includes a screening question asking awardees whether they refused the Fellowship. Applicants who refused the Fellowship will also be asked why, and then screened out of the survey. Because no other data will be collected from them, they will be excluded from any subsequent analyses. Data on why some students refused the Fellowship may be useful to NSF. See Section B.1 on p.16 for additional information on sampling.
The data will be useful in two major ways. First, the study will provide information regarding how the GRFP influences: educational decisions, experiences, and graduate degree attainment of U.S. students enrolled in STEM graduate programs; workforce participation and career outcomes; professional productivity; and graduate school institutions in terms of recruitment, funding, reputation, diversity and quality of students participating in STEM fields, and professional development opportunities offered to students. The analyses and findings will help NSF evaluate how the program is meeting its mission and help inform future program policies and initiatives. Graduate institutions may also find the information useful in understanding how best to support their Fellows and to help develop a robust and diverse U.S. science and engineering workforce.
Second, the study will shed light on how institutions are implementing the program and the extent to which specific design elements are valued or working as intended by NSF. In addition, the review and analysis of other similar federal fellowship programs may help point to best practices in terms of program design and structure. Both of these should inform NSF and the managers of similar federal fellowship programs in terms of program evaluation, review, and improvement.
The analysis and sampling plans have been designed to optimize reliability and validity throughout the study. Reliability (or, the consistency of the measures used) will be enhanced by utilizing data reduction techniques such as factor analysis to group questionnaire items into scaled measures of like concepts, and by adapting items from instruments previously used in education studies. Validity is often defined as the best available approximation to the truth or falsity of a given inference or conclusion9 and researchers need to be concerned with both internal and external validity. Internal validity refers to the rigor with which the study was conducted and the extent to which the study has taken into account alternative explanations for causal relationships that are the focus of the study. External validity refers to the extent to which the results of a study are generalizable. In our study, internal validity will be addressed in two important ways. First, the sampling plan focuses on awarded Fellows from the highest two quality groupings (QG1 and QG2) and non-awarded Honorable Mention applicants of similar quality (QG2) so that resulting program or treatment effects are not confounded by variability in the backgrounds or academic preparation of the sample. Second, quasi-experimental analytic methods such as regression discontinuity (RD) and propensity score matching (PSM) will reduce bias from non-random assignment of individuals to treatment and control groups. In terms of external validity, it is important to note that the results of the study will be generalizable to similar populations of academically accomplished graduate students in the STEM disciplines. Conclusions based on the study results should therefore not be applied to broader graduate student populations.
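To make the reliability point concrete, the sketch below shows how related survey items might be grouped with factor analysis and how the internal consistency of a resulting scale could be checked with Cronbach's alpha. The item columns are hypothetical placeholders and the approach is illustrative only, not the study's specified procedure.

```python
# Sketch: group questionnaire items into scales and check their reliability.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def factor_loadings(items: pd.DataFrame, n_factors: int) -> pd.DataFrame:
    """Loadings used to decide which items measure the same underlying concept."""
    fa = FactorAnalysis(n_components=n_factors).fit(items)
    return pd.DataFrame(fa.components_.T, index=items.columns,
                        columns=[f"factor_{i + 1}" for i in range(n_factors)])

def cronbach_alpha(scale_items: pd.DataFrame) -> float:
    """Internal-consistency reliability of the items forming one scale."""
    k = scale_items.shape[1]
    item_variances = scale_items.var(axis=0, ddof=1)
    total_variance = scale_items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)
```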
Study results will be reported to the Division of Graduate Education, EHR/NSF, distributed within the community of universities that participate in the GRFP, and published on the NSF website. Limited print copies of the full report will be made available to NSF, along with 500 copies of a printed executive summary that can be disseminated more widely and will be useful to a variety of public audiences. A policy brief reporting on the review and analysis of federal fellowship programs will be made available to all federal fellowship program managers and may be of interest to the larger foundation community as well. Findings of more general policy or methodological interest will be distributed more broadly, through conference presentations and submissions for publication in peer-reviewed journals.
A.3. Use of Information Technology to Reduce Burden
In order to reduce respondent burden, internet-based surveys will be used to collect information from participants. As the populations being surveyed in this study are graduate students in STEM fields, or professionals trained as scientists and engineers, they are expected to have easy access to and be fluent in the use of web-based technologies. The use of web-based systems facilitates accuracy, completeness, and speed of data entry, and helps reduce respondent burden. Web-based surveys employ user-friendly features, such as automated tabulation, data entry with custom controls such as checkboxes, data verification with error messages for online correction, standard menus, and predefined charts and graphics. Survey skip patterns reduce time burden on respondents by automatically moving them to the next appropriate section, simplifying the survey-taking experience. Web-based surveys also allow for easy identification of non-respondents and facilitate follow-up.
In addition, data entered by participants can be automatically uploaded into standard analysis software, eliminating the additional data entry step, thus increasing the efficiency of the analysts conducting the study. Email will be used to invite participants to complete the survey and to follow up with the non-respondents to encourage their participation.
The survey will offer the same accommodation for those with disabilities as the Survey of Doctorate Recipients. There will be added navigation functionality on the Web survey so a mouse is not necessary for responding to the survey. Those with disabilities will be offered the option of a telephone or paper survey.
This evaluation does not duplicate other NSF efforts [See response to A.1].
As this is the first program evaluation of GRFP since the last study published in 2002 and the only evaluation to date to examine more recent cohorts (i.e., from 1994 on), failure to conduct the study will leave a knowledge gap at a time when the GRFP is experiencing growth and increased attention as a Foundation-wide program to support STEM workforce development. If this information is not collected, NSF will not have needed evidence to meet its accountability requirement for independent evaluations to document the effectiveness and broader impacts of STEM education programming. It would also prevent NSF from learning what policies and practices are effective in meeting GRFP program goals, identifying effective strategies adopted by other federal fellowship programs, disseminating lessons learned to the broader STEM community, and obtaining valuable information about implementation and specific design elements to help inform future policies and programs.
The scope of the current proposed study is a one-time data collection effort; thus, the issue of less frequent data collection is moot.
Two notices to the public soliciting comments on this information collection prior to OMB submission were published in the Federal Register (75 FR 36697, Monday, June 28, 2010; 75 FR 56596, Thursday, September 16, 2010). A copy of the text of both notices is attached as Appendix A. No public comments were received in response to the notice during the 60 days that each appeared in the Federal Register.
NSF contracted NORC at the University of Chicago to design and conduct the study of the GRFP. NSF and an External Advisory Panel provided consultation on the study design. The GRFP evaluation team convened one meeting of its advisory group in March 2009. Advisory group members come from research organizations and universities and include the following individuals:
Dr. Ronald G. Ehrenberg - Irving M. Ives Professor of Industrial and Labor Relations and Economics; Director - Cornell Higher Education Research Institute, Cornell University
Dr. Lisa Frehill - Executive Director, Commission on Professionals in Science and Technology (CPST)
Dr. Lewis Siegel - Dean of the Graduate School and Vice Provost for Graduate Education, Duke University
Dr. William Trent - Professor, Department of Educational Policy Studies, University of Illinois at Urbana-Champaign.
The advisory group was responsible for reviewing the evaluation plan and the framework for the questionnaire.
The GRFP Follow-Up Survey was piloted with five individuals in June-July 2010; respondents took an average of 35 minutes to complete the survey. Since then, the survey has been revised and shortened. Further pilot testing of the GRFP Follow-Up Survey with an additional nine or fewer respondents will be conducted in December 2011; respondents will be asked to comment on the clarity of directions and survey items, and the overall logic of the programmed Web survey. Results from this pilot test will be used to refine the survey.
The interview protocols for administrators, faculty, and staff (for both the institutional site visit and the phone interview samples) were based on similar interview protocols used in previous studies and have yet to be piloted. Estimated times to complete the interviews used to calculate respondent burden in Section A.12 are based on results in earlier studies and the contractor’s experience with similar data collection efforts. A pilot test of these interview protocols will be conducted with nine or fewer respondents in December 2011 and respondents will be asked to comment on the clarity of questions and the flow of the interview. Results from this pilot test will be used to refine the protocols.
No payments or gifts will be provided to respondents.
All respondents will be advised that any information on specific individuals will be maintained in accordance with the Privacy Act of 1974. Data collected will be made available to the study contractors, contractors hired to manage data and data collection software, and, at the aggregate level, to NSF staff. Data will be processed in accordance with federal and state privacy statutes. Detailed procedures for making information available to various categories of users are specified in the Education and Training System of Records (63 Fed. Reg. 264, 272, January 5, 1998). The system limits access to personally identifiable information to authorized users. Data submitted will be used in accordance with criteria established by NSF for monitoring research and education grants, and in response to Public Law 99-383 and 42 USC 1885c. The information requested may be disclosed to qualified contractors in order to coordinate programs and to a federal agency, court or party in court, or federal administrative proceeding, if the government is a party.
Individuals responding to the GRFP Follow-Up Survey—Fellows and similar but non-awarded GRFP applicants—will be assured that the information they provide will not be released in any form that identifies them, and that their responses will be kept confidential to the extent provided by law. The contractor will be expected to maintain the confidentiality, security, and integrity of the survey data. The web-based survey data will be maintained on a secure server with appropriate levels of password and other types of protection. Proposed procedures for protecting the data and privacy of respondents will be reviewed by the contractor’s Institutional Review Board (IRB) prior to data collection.
Individuals interviewed as part of the institutional data collection (i.e. institutional administrators, faculty, and staff who are selected to participate in the institutional site visit and the phone interview samples) will be asked for informed oral consent. They will be assured the information they provide will not be attributed to them and that all data will be reported in aggregated form. Direct quotations will not be attributed to any individuals or their institutions. These data will be identified only by site and interviewee codes and will be kept in locked cabinets or password-protected data files. In addition, any crosswalk between the interviews and identifying information will be maintained separately from the actual interview notes and files.
The GRFP Follow-Up Survey contains very few sensitive questions, with perhaps the exception of salary. However, a respondent can choose to not answer this or other questions they deem sensitive. All survey questions will be reviewed by the contractor’s IRB prior to fielding. Any public reporting of sensitive data will be in aggregate form.
The interview protocols (for both the institutional site visit and phone interview samples) also contain few sensitive questions because they focus on program implementation and the effect of hosting GRFP Fellows on the institutions. Respondents will be informed of their right to not answer specific questions if they so wish.
Copies of the survey and protocols can be found in Appendix B. These include:
GRFP Follow-Up Survey
Institutional site visit sample: Interview protocols for (a) university administrators; (b) faculty; and (c) university staff
Institutional phone interview sample: Interview protocol (one common protocol).
The total estimated number of respondents is shown below in Table A.12.1. Things to note about the table include: (a) Some students are likely to be still enrolled in the programs at the time the survey is administered, while others may have graduated or dropped out of their program. We refer to these groups as “current” and “former” graduate students, respectively. The table assumes a 65% response rate among the two survey groups; (b) Among respondents who will be interviewed during site visits or over the telephone, the table assumes a 100% response rate; (c) There is likely to be a mix of professors of different ranks (full, associate, perhaps assistant) involved with the GRFP Fellows.

Since average salaries differ by rank, we wanted to calculate an upper bound for the cost of the response burden by assuming that the institutional interviews would include faculty only at the full and associate professor ranks and would be more heavily weighted towards the more senior faculty. Thus, the site visit respondent sample assumes a mix of 4 full and 3 associate professors at each institution, and the phone interview respondent sample includes 2 full professors and 1 associate professor at each institution.
To calculate the respondent burden, the following assumptions regarding completion times were used:
GRFP Follow-Up Survey:
0.50 hours for current graduate students
0.67 hours for former graduate students
Institutional Site Visit Sample:
0.33 hours for university administrators
1 hour for full professors
1 hour for associate professors
0.75 hours for university staff
Institutional phone interview sample:
0.33 hours for university administrators
0.50 hours for full professors
0.50 hours for associate professors
0.50 hours for university staff
Table A.12.1 below indicates the total sample size and expected number of responses for each category of respondent type, and the time demand these instruments will place on each individual respondent and on all respondents in aggregate. The total number of respondents is estimated to be 8,728, resulting in an estimated response burden for the study of approximately 5,112 person hours.
As each respondent will complete the survey or interview once, the annual burden and the aggregate burden will be the same as shown in Table A.12.1.
Table A.12.1. Number of Respondents, Burden Hours per Respondent, and Total Person Hours, by Respondent Type
| Type of Data-Collection and Respondent Type | Total Sample Size | Total Number of Respondents* | Burden Hours Per Respondent | Total Person Hours** |
| --- | --- | --- | --- | --- |
| GRFP Follow-Up Survey | | | | |
| Current Graduate Students | 6594 | 4284 | 0.50 | 2142 |
| Former Graduate Students | 6594 | 4284 | 0.67 | 2870 |
| Institutional Site Visit Sample | | | | |
| University administrators | 6 | 6 | 0.33 | 2 |
| Full professors | 24 | 24 | 1.00 | 24 |
| Associate professors | 18 | 18 | 1.00 | 18 |
| University staff | 12 | 12 | 0.75 | 9 |
| Institutional Phone Interview Sample | | | | |
| University administrators | 20 | 20 | 0.33 | 7 |
| Full professors | 40 | 40 | 0.50 | 20 |
| Associate professors | 20 | 20 | 0.50 | 10 |
| University staff | 20 | 20 | 0.50 | 10 |
| Total | 13348 | 8728 | -- | 5112 |
Notes: *The table assumes a 65% response rate for the survey groups based on previous studies and a 100% response rate among the interviewees. The table also assumes that the total sample and the number of respondents will be evenly split between current and former graduate students.
**Rounded to nearest whole number.
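As a quick consistency check, the aggregate burden in Table A.12.1 can be reproduced from the respondent counts and per-respondent burden hours; the short script below simply sums the rounded row totals.

```python
# Reproduce the aggregate respondent burden shown in Table A.12.1.
# Each entry: (number of respondents, burden hours per respondent).
rows = {
    "current graduate students": (4284, 0.50),
    "former graduate students": (4284, 0.67),
    "site visits: administrators": (6, 0.33),
    "site visits: full professors": (24, 1.00),
    "site visits: associate professors": (18, 1.00),
    "site visits: university staff": (12, 0.75),
    "phone interviews: administrators": (20, 0.33),
    "phone interviews: full professors": (40, 0.50),
    "phone interviews: associate professors": (20, 0.50),
    "phone interviews: university staff": (20, 0.50),
}

total_respondents = sum(n for n, _ in rows.values())               # 8728
total_person_hours = sum(round(n * h) for n, h in rows.values())   # 5112
print(total_respondents, total_person_hours)
```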
The overall annualized cost to respondents is $96,510. Table A.12.2 shows the estimated total annual costs to each group of respondents over one year for the surveys and for the interviews. The assumptions underlying the table are discussed below.
Table A.12.2. Annualized Cost to Respondents, by Respondent Type
| Respondent Type | Total Number of Respondents | Burden Hours Per Respondent | Total Person Hours | Hourly Salary Estimate | Estimated Cost per Respondent | Estimated Overall Cost* |
| --- | --- | --- | --- | --- | --- | --- |
| GRFP Follow-Up Survey | | | | | | |
| Graduate Students | 4284 | 0.50 | 2142 | $16.00 | $8.00 | $17,136 |
| Graduates | 4284 | 0.67 | 2870 | $39.49 | $26.33 | $75,567 |
| Institutional Site Visit Sample | | | | | | |
| University administrators | 6 | 0.33 | 2 | $89.72 | $29.57 | $59 |
| Full professors | 24 | 1.00 | 24 | $66.30 | $66.30 | $1,591 |
| Associate professors | 18 | 1.00 | 18 | $44.11 | $44.11 | $794 |
| University staff | 12 | 0.75 | 9 | $24.07 | $18.05 | $162 |
| Institutional Phone Interview Sample | | | | | | |
| University administrators | 20 | 0.33 | 7 | $89.72 | $29.57 | $197 |
| Full professors | 40 | 0.50 | 20 | $66.30 | $33.15 | $663 |
| Associate professors | 20 | 0.50 | 10 | $44.11 | $22.06 | $221 |
| University staff | 20 | 0.50 | 10 | $24.07 | $12.04 | $120 |
| Total | 8728 | -- | 5112 | -- | -- | $96,510 |
Notes: *Rounded to nearest whole dollar.
The assumptions used in the table are the following:
A work year consists of 240 days.
For graduate students, the table uses the hourly salary paid to graduate assistants. These range from about $12 to $17—the table uses $16 as a reasonable approximation (see, for example, http://finweb.rit.edu/controller/graduate/job_classifications.html)
For graduates, the table uses the average of the 2006 median salaries for all full-time doctoral scientists and engineers employed 0-5 years and 6-10 years, adjusted to 2011 dollars, using the Bureau of Labor Statistics CPI calculator. (Available at http://www.nsf.gov/statistics/nsf09317/ and http://www.bls.gov/data/inflation_calculator.htm).
For university administrators, the table uses the average salary for deans of graduate programs in doctoral institutions (available at http://chronicle.com.proxy.uchicago.edu/article/Median-Salaries-of-Senior/126455/).
For university senior faculty, the table uses the average salary of full professors; for junior faculty, the table uses the average salary of associate professors (available at http://chronicle.com.proxy.uchicago.edu/article/Faculty-Salaries-Vary-by/127073).
For university staff, the table uses the average salary for academic-support-center coordinating officials (available at http://chronicle.com.proxy.uchicago.edu/article/Median-Salaries-of-Midlevel/126834/).
The above sources were all accessed on June 20, 2011.
There is no overall annual cost burden regarding capital, operation, or maintenance costs to respondents that results from this study, other than the time spent responding to the survey.
The total estimated cost of the GRFP evaluation is $2,639,512.17. This cost includes development of data-collection instruments (GRFP Follow-Up Survey and interview protocols for both the institutional site visit and phone interview samples), management of data-collection efforts, data collection through surveys, site visits, and phone interviews, cleaning and preparation of data files for evaluation, and data analysis and report writing to summarize the findings, implications, and lessons learned from the evaluation.
The GRFP evaluation is a new, one-time data collection from respondents.
The four research questions the study is designed to address are:
RQ1. What is the impact of the GRFP fellowship on the graduate school experience?
RQ2. What is the impact of the GRFP fellowship on career outcomes?
RQ3. What are the effects of the GRFP on institutions?
RQ4. Is the program design effective in meeting program goals?
Appendix C (Tables C.1-C.4) provides a crosswalk between the research questions underlying the study and the data being collected through the surveys, the site visits, and the telephone interviews. The tables also briefly outline the kinds of analyses that will be used to address each question. More details can be found in the discussion below and the next section on Sampling and Estimation.
Data from the GRFP Follow-Up Survey: As shown in Table A.2.1, these data will be used to address RQ1, RQ2, and, to some extent, RQ4. The analyses of the survey data will include both descriptive and multivariate analyses. Part of this evaluation is to provide information describing the demographic composition, graduate school experiences, educational outcomes, and career progression of GRFP participants in the national context. Descriptive approaches will be used to examine the characteristics of the overall sample population, as well as racial/ethnic, sex, and other sub-populations (e.g., STEM field, graduate degree program, graduate institution type). In addition to calculating the relevant means, standard deviations, and frequencies of the variables under investigation, we will use cross-tabulations for categorical outcomes and ANOVA for continuous outcomes to examine whether significant differences occur across groups and across variations in educational settings. Descriptive analysis will also present information on the sample and sub-sample populations in comparison to national benchmarks, such as those obtained from the SED, SDR, and other data sources. This phase of the analysis will provide a general understanding of group differences and will inform the interpretation of results from the multivariate stage of analysis.
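As an illustration of these descriptive comparisons, the sketch below runs a cross-tabulation with a chi-square test for a categorical outcome and a one-way ANOVA for a continuous outcome. The column names (`award_status`, `completed_degree`, `years_to_degree`) are hypothetical placeholders rather than the study's actual variables.

```python
# Sketch of descriptive group comparisons (hypothetical column names).
import pandas as pd
from scipy import stats

def compare_groups(df: pd.DataFrame) -> None:
    # Categorical outcome: cross-tabulate award status by degree completion
    # and test for independence with a chi-square test.
    xtab = pd.crosstab(df["award_status"], df["completed_degree"])
    chi2, p_cat, dof, expected = stats.chi2_contingency(xtab)
    print(f"chi-square = {chi2:.2f}, p = {p_cat:.3f}")

    # Continuous outcome: one-way ANOVA on time to degree across award groups.
    groups = [g["years_to_degree"].dropna() for _, g in df.groupby("award_status")]
    f_stat, p_cont = stats.f_oneway(*groups)
    print(f"F = {f_stat:.2f}, p = {p_cont:.3f}")
```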
Multivariate techniques will enable us to isolate the effects of the GRFP Fellowship award on program outcomes by statistically controlling for differences in a variety of confounding factors. Multivariate methods will include logistic and linear regression, as well as regression discontinuity (RD) and propensity score matching (PSM). Logistic and linear regression will facilitate the testing of differences among groups, while PSM and RD will account for the non-random, non-experimental program design in estimating the effects of the GRFP Fellowship award.
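A stylized regression discontinuity sketch is shown below. It assumes, for illustration only, a continuous application score with awards made above a cutoff; the study's actual running variable, cutoff, and bandwidth choices are not specified here, and the variable names are hypothetical.

```python
# Stylized regression-discontinuity sketch (hypothetical variable names).
import pandas as pd
import statsmodels.formula.api as smf

def rd_estimate(df: pd.DataFrame, cutoff: float, bandwidth: float) -> float:
    """Local linear RD: estimated jump in the outcome at the award cutoff."""
    d = df[(df["score"] - cutoff).abs() <= bandwidth].copy()
    d["centered"] = d["score"] - cutoff
    d["award"] = (d["centered"] >= 0).astype(int)
    # Allow separate slopes on each side of the cutoff; the coefficient on
    # `award` is the estimated discontinuity (treatment effect at the cutoff).
    fit = smf.ols("outcome ~ award + centered + award:centered", data=d).fit()
    return fit.params["award"]
```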
Data from Interviews with Participants in the Institutional Site Visit and Phone Interview Samples: These data will be used to address RQ3 and RQ4. The interview data will be analyzed using qualitative methods. Each interview will be recorded, transcribed, and coded for content relevant to the research questions underpinning the study. Mock interviews, staff training, and monitoring of inter-rater reliability will be used to maintain consistency in coding the interview data. Once coded, NORC staff will review the responses associated with each research question to identify the major type or types of answers, as well as any interesting individual responses. The analysis will indicate not only major trends, but also the strength of each trend (the proportion of interviewees with similar responses), the presence of any responses counter to that trend, and the source of the trend (whether it comes from one particular type of respondent, for example, faculty advisors, or from multiple sources). Discussions of trends will include particularly illuminating or otherwise interesting quotes, when available. In addition to analysis based solely on site visit interviews, NORC’s analysis will be informed by responses to the GRFP Follow-Up Survey described above to understand the perspectives of students.
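To illustrate the inter-rater reliability monitoring mentioned above, the sketch below compares the codes two coders assign to the same set of interview excerpts using Cohen's kappa; the code labels shown are hypothetical, not the study's actual coding scheme.

```python
# Sketch: monitor coding consistency between two interview coders.
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned by two coders to the same ten excerpts.
coder_a = ["impact", "implementation", "climate", "impact", "policy",
           "policy", "climate", "impact", "implementation", "policy"]
coder_b = ["impact", "implementation", "impact", "impact", "policy",
           "policy", "climate", "impact", "implementation", "climate"]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa = {kappa:.2f}")  # values near 1 indicate strong agreement
```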
The contractor, NORC at the University of Chicago, will prepare a major technical report on the results of the study that provides details regarding the sampling, methodology, and analysis. In addition, the contractor will produce a short 3-5 page research brief that provides highlights of the study targeted towards the research questions and that can be disseminated to policymakers. The contractor will also prepare a separate policy brief that will report on the findings of the review and analysis of federal fellowship programs, supplemented by data from the institutional interviews on implementation of the GRFP program.
As stated earlier, GRFP’s goals are (a) to select, recognize, and financially support individuals early in their careers with the demonstrated potential to be high achieving scientists and engineers, and (b) to broaden the participation of underrepresented groups, including women, underrepresented minorities, and persons with disabilities, in science and engineering fields. In addition, each NSF program seeks excellence in management and continuous improvement. The data collected in this study and the analytic reports will provide a comprehensive look at the GRFP, its impact on Fellows, institutions, and the science and engineering workforce, and the extent to which it is meeting its goals; the findings thus speak directly to the program goals as well as to NSF’s strategic goal of performing as a model organization. In particular, the findings will provide information on:
The influence the GRFP has had on the decisions, experiences, academic attainment, and career outcomes of Fellows compared with carefully-matched peers;
The extent to which the program has broadened the participation of underrepresented groups in STEM at the graduate level;
The perceived effects on institutions in terms of student financing, enrollment, diversity and quality (among others);
Whether specific design elements—choice, flexibility, and monetary value—are working as intended and the extent to which they are valued; and
The need (if any) for changes in the way the program is structured to make it more effective.
Thus, overall, the study findings will provide valuable insights to NSF on the impact of its investments in the GRFP and will inform program management. In conjunction with findings from the review of similar federal fellowship programs, the findings may prove valuable to the larger community of program officers administering these programs as well as to the graduate community in understanding how best to support Fellows and help develop a more diverse STEM workforce.
Table A.16 shows the timeline for the study.
The data collection instruments will display the OMB clearance number and expiration date.
No exceptions are sought.
Table A.16. Timeline for the Study

| Task | Start Date | End Date |
| --- | --- | --- |
| Sample draw | 7/20/11 | 7/27/11 |
| OMB submission | 9/29/11 | 12/29/11 |
| Review of Federal fellowship programs | | |
| Phone calls with program officers; review of materials | 10/12/11 | 11/14/11 |
| Analysis of data and drafting policy brief | 11/17/11 | 1/17/12 |
| Submit policy brief | 1/18/12 | 1/18/12 |
| GRFP Follow-Up Survey | | |
| Pilot test with 9 or fewer respondents | 11/12/11 | 12/14/11 |
| Pre-field locating for GRFP Follow-Up Survey | 11/12/11 | 1/11/12 |
| GRFP Follow-Up Survey programming | 11/12/11 | 1/4/12 |
| GRFP Follow-Up Survey: Advance letter | 1/13/12 | 1/13/12 |
| GRFP Follow-Up Survey: Advance email | 1/17/12 | 1/17/12 |
| GRFP Follow-Up Survey: Postcard reminder | 1/27/12 | 1/27/12 |
| GRFP Follow-Up Survey: 1st email prompt | 2/10/12 | 2/10/12 |
| GRFP Follow-Up Survey: 1st prompt letter | 3/2/12 | 3/2/12 |
| GRFP Follow-Up Survey: Data collection | 1/16/12 | 6/16/12 |
| Institutional Phone Interview Sample | | |
| Selection of sample of 20 institutions | 12/2/11 | 12/14/11 |
| Initial contact (via phone and email) | 12/30/11 | 1/11/12 |
| Phone interviews and data collection | 1/17/12 | 3/30/12 |
| Analysis of data and report-writing | 4/2/12 | 5/14/12 |
| Submit Preliminary Implementation Findings (based on phone interviews, review of federal fellowship programs, completed site visits, and early survey data) | 5/15/12 | 5/15/12 |
| Institutional Site Visit Sample | | |
| Selection of sample of 6 institutions | 12/5/11 | 12/19/11 |
| Initial contact (via phone and email) | 1/3/12 | 1/13/12 |
| Site visits and data collection | 1/17/12 | 6/15/12 |
| Analysis of survey and interview data and report-writing | 6/15/12 | 11/26/12 |
| Submit Draft Report | 11/27/12 | 11/27/12 |
| Submit Final Report and Research Brief | 2/28/13 | 2/28/13 |
There are two major types of data collection: (I) GRFP Follow-Up Survey administered to Fellows and their counterparts; and (II) institutional interviews that encompass in-person interviews with several respondents at 6 institutions (institutional site visit sample) and telephone interviews with respondents at 20 institutions (institutional phone interview sample). The sections below discuss subsections B.1-B.4 for each of the two data collections separately. A final section, B.5, provides key contact information for the study.
Providing information about the study helps legitimize it and recruit potential respondents. It also serves to keep the public informed of the study’s purpose and scope. As such, NORC will develop a project-specific web page that briefly describes the goals of the project, the different data collection efforts, and the expected reports on findings. This page will be hosted on NORC’s corporate web site (http://www.norc.org) and will be accessible to the public and interested respondents and potential respondents, who may navigate to it through NORC’s website or by using a search engine. The project page will contain links to additional information about the GRFP (on NSF’s website) as well as contact information for NORC’s team.
The study will survey several cohorts of GRFP Fellows and their peers. To define the sample population, GRFP application records will be used to identify Fellows and peers (“Honorable Mentions,” discussed below). To be considered eligible for the GRFP Fellowship award, all applicants in the sampling file must meet the following three eligibility criteria:10
Applicants must be U.S. citizens, U.S. nationals, or permanent residents of the U.S.
Applicants must be in the early stages of their graduate education, having completed no more than twelve months of full-time graduate study at the point of program application. This limit applies to all graduate education, not just the current program of enrollment. If the applicant has previously completed a Master’s degree, he or she is ineligible unless it is documented that the applicant completed a one-year Master’s degree program.
Applicants must be seeking a research-focused Master’s or doctoral degree in an NSF-supported field.11
All applicants meeting the above eligibility requirements are reviewed by panelists with disciplinary expertise and assigned one of four quality group rankings (QG1 to QG4) based on NSF merit review criteria. In addition, reviewers take into consideration applicants’ background characteristics, including their personal, professional, and educational experiences, as well as letters of reference.12 Applicants who receive a rating of QG1 receive fellowships. Applicants receiving a QG2 rating are split into two groups: one group receives the fellowship award, while the other is awarded the title “Honorable Mention” without the fellowship. Recommendations for awards within QG2 help the program meet the Congressional mandate regarding geographic representation and the program goal of broadening participation.
In the vast majority of cases, applicants who receive a rating of QG3 receive the title “Honorable Mention.” Finally, applicants who receive a QG4 ranking do not receive a fellowship award or the title of “Honorable Mention.” This evaluation will focus exclusively on fellowship and honorable mention award recipients in QG1 and QG2. Fellowship award recipients are further restricted to those who have accepted their award and are GRFP Fellows.
The sampling data file will contain unit-record identifiers, application information and QG rankings for all eligible GRFP applicants who received the fellowship or honorable mention award from four cohorts based on program application year: 1994-1998 (Cohort 1), 1999-2004 (Cohort 2), 2005-2008 (Cohort 3), and 2009-2011 (Cohort 4).
The main sampling frame is the list of GRFP Fellows and Honorable Mentions for 1994-2011. As shown in Table B.1.1, we propose to randomly select a total sample of 13,188 cases from Cohorts 1 through 4 (1,099 QG1 Fellows, 1,099 QG2 Fellows, and 1,099 QG2 Honorable Mentions per cohort). An assumed 65% overall response rate for GRFP Fellows and Honorable Mentions will yield 8,568 completed questionnaires (714 QG1 Fellows, 714 QG2 Fellows, and 714 QG2 Honorable Mentions per cohort). Table B.1.2 additionally shows the expected completes for specific sub-group populations, pooling the four cohorts and assuming no oversampling of minorities or persons with disabilities. The fewest expected completes are from persons with disabilities: an expected 574 respondents with disabilities and 7,994 others across the entire sample.
The sampling plan was designed to select a sample large enough to make statistically valid estimates of program outcomes in answering RQ1 (What is the impact of the GRFP fellowship on the graduate school experience?), RQ2 (What is the impact of the GRFP fellowship on career outcomes?), and RQ4 (Is the program design effective in meeting program goals?). A variety of analytic techniques will be used to address the research questions. While the size of the analytic sample, minimum detectable effects, and statistical power vary with the specifications of a given comparison, it is important to broadly assess statistical power and minimum detectable effects for a given sample to determine if each is sufficient for answering the research questions.
In some cases, comparisons will be based on the full sample pooled across all four cohorts, such as when examining the overall impact of the GRFP fellowship on the graduate school experience (RQ1). Here comparisons will be made among all current and former graduate students, spanning all four cohorts. In other cases, comparisons will be between sub-samples defined according to specific cohorts or population characteristics. For example, when examining the impacts of the GRFP fellowship on career outcomes (RQ2), comparisons will be made within a sub-sample comprised of former graduate students (Cohort 1 and Cohort 2 applicants). When examining GRFP program goals (RQ4), comparisons will be made within a given cohort, or between sub-populations defined according to population characteristics (e.g., minority vs. other, female vs. male, disabled vs. other).13
Table B.1.3 provides information on the effect sizes detectable based on the expected number of respondents for pooled samples and sub-samples, following conventional standards of an 80 percent level of statistical power and a 95 percent confidence level (alpha=0.05) for different comparisons. For comparisons based on the pooled sample of 2,856 completed questionnaires in each comparison group (Cohorts 1 – 4 QG1 Fellows, QG2 Fellows, and QG2 Honorable Mentions) and an expected estimate of 50 percent for a particular outcome across the full sample of cases, we would be able to detect a 3.7 percentage point difference between two groups. If the expected estimate for a particular variable of interest is 90 percent, we could detect a 2.1 percentage point difference.
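For readers who wish to verify or extend these figures, the minimum detectable differences in Table B.1.3 can be approximated with a standard two-sample proportion calculation at 80 percent power and alpha = 0.05. The Python sketch below is illustrative only: the mde_two_proportions helper is ours, not part of the study's analysis tooling, and because the published table was generated with a separate online calculator, results can differ by roughly a tenth of a percentage point.

```python
from scipy.stats import norm

def mde_two_proportions(n1, n2, p=0.50, alpha=0.05, power=0.80):
    """Approximate minimum detectable difference between two groups, using the
    normal approximation and assuming the outcome prevalence p is roughly the
    same in both groups."""
    z_alpha = norm.ppf(1 - alpha / 2)  # two-sided test
    z_beta = norm.ppf(power)
    se = (p * (1 - p) * (1 / n1 + 1 / n2)) ** 0.5
    return (z_alpha + z_beta) * se

# Pooled-sample comparison groups (2,856 expected completes per group)
print(round(100 * mde_two_proportions(2856, 2856, p=0.50), 1))  # 3.7 points
print(round(100 * mde_two_proportions(2856, 2856, p=0.90), 1))  # 2.2 points (Table B.1.3 reports 2.1)

# Within-cohort comparison groups (714 expected completes per group)
print(round(100 * mde_two_proportions(714, 714, p=0.50), 1))    # 7.4 points

# Disabled vs. other respondents in the pooled sample
print(round(100 * mde_two_proportions(574, 7994, p=0.50), 1))   # 6.1 points (Table B.1.3 reports 6.0)
```

The same calculation can be applied to the differences reported in the prior GRFP studies discussed below to judge whether effects of that size would be detectable within a single cohort or only in pooled comparisons.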
Table B.1.1. Population Counts and Sample Estimates
Cohort | Fellowship Status | Population | Sample | Expected Number of Respondents
C1: 1994-1998 cohort (GRF+MGF)a | QG1 Fellows | 2,580 | 1,099 | 714
C1: 1994-1998 cohort (GRF+MGF)a | QG2 Fellowsb | 2,038 | 1,099 | 714
C1: 1994-1998 cohort (GRF+MGF)a | QG2 Honorable Mentions | 1,490 | 1,099 | 714
C2: 1999-2004 cohort (GRF) | QG1 Fellows | 2,897 | 1,099 | 714
C2: 1999-2004 cohort (GRF) | QG2 Fellowsb | 2,561 | 1,099 | 714
C2: 1999-2004 cohort (GRF) | QG2 Honorable Mentions | 2,220 | 1,099 | 714
C3: 2005-2008 cohort (GRF) | QG1 Fellows | 2,035 | 1,099 | 714
C3: 2005-2008 cohort (GRF) | QG2 Fellowsb | 1,725 | 1,099 | 714
C3: 2005-2008 cohort (GRF) | QG2 Honorable Mentions | 2,233 | 1,099 | 714
C4: 2009-2011 cohort (GRF) | QG1 Fellows | ~2,708 | 1,099 | 714
C4: 2009-2011 cohort (GRF) | QG2 Fellowsb | ~2,536 | 1,099 | 714
C4: 2009-2011 cohort (GRF) | QG2 Honorable Mentions | ~5,228 | 1,099 | 714
Total | | TBD | 13,188 | 8,568
Source: 1989-2009 merged data from NSF applicant record data files, 1989 - 1998 Ci data non-pii resort.xls; 1989 – 1998; CI 99-04 data -Non- PII.xlsx; 05-09 data -Non-PII.xlsx. Population counts for the 2010 and 2011 application years were based on personal correspondence with NSF-GRFP program directors.
a Through the 1998 GRFP application year, students could apply for the Graduate Research Fellowship and the Minority Graduate Research Fellowship (MGF) programs. For sampling purposes, no distinction will be made between the two groups. The MGF was discontinued in 1998.
b A small share of GRFP QG3 applicants were awarded the fellowship due to program exceptions. For sampling purposes, these cases will be treated as QG2 Fellows.
Table B.1.2. Expected Sample Composition by Sub-group Populations
Variable | Level | Rate | Expected Number
Sex | Men | 54.73% | 4689
Sex | Women | 45.27% | 3879
Race/Ethnicity | Minority | 8.82% | 756
Race/Ethnicity | Other | 91.18% | 7812
Ph.D. completion by 2012 | Completer | 70.00% | 5998
Ph.D. completion by 2012 | Non-completer | 30.00% | 2570
Disability | Disabled | 6.70% | 574
Disability | Other | 93.30% | 7994
Table B.1.3. Minimum Detectable Effect by Comparison Group and Estimated Outcome Value
Comparison Groupsa | Expected Number of Respondents within Groups | Estimated Value for Outcome | Minimum Detectable Effect (percentage points)
Pooled sample | | |
QG1 F vs. QG2 F vs. QG2 HMb | 2856 per group | 50% | 3.7
 | | 90% | 2.1
Male vs. Female | 4689 vs. 3879 | 50% | 3.0
 | | 90% | 1.8
Minority vs. Other | 756 vs. 7812 | 50% | 5.3
 | | 90% | 3.0
Disabled vs. Other | 574 vs. 7994 | 50% | 6.0
 | | 90% | 3.2
Sub-sample: Former graduate students | | |
QG1 F vs. QG2 F vs. QG2 HM | 1428 per group | 50% | 5.5
 | | 90% | 3.1
Sub-sample: Within cohort | | |
QG1 F vs. QG2 F vs. QG2 HM | 714 per group | 50% | 7.4
 | | 90% | 4.0
Source: http://www.dssresearch.com/KnowledgeCenter/toolkitcalculators/statisticalpowercalculators.aspx
a The pooled sample includes all 8568 Cohort 1 – 4 applicants. The sub-sample of former graduate students includes Cohort 1 and 2 applicants. The within-cohort sub-sample represents applicants within a single cohort.
b QG1=Quality Group 1, QG2=Quality Group 2, F=Fellows, HM=Honorable Mentions
The fewest expected completes for a key comparison group are among respondents with disabilities (n=574), compared with nondisabled respondents (n=7,994). Based on these estimates, we could detect a 6.0 percentage point difference if the expected outcome estimate is 50 percent and a 3.2 percentage point difference if the expected outcome estimate is 90 percent. Table B.1.3 provides equivalent information on other comparison groups.
To provide context for the minimum detectable effects in our study, we examined past studies of GRFP applicants to see what differences have been reported in the literature and thus to determine whether our sample size will be sufficient to support the planned analyses. While the following review focuses on Ph.D. completion rates as a useful point of reference to past studies, it is important to note that this evaluation will examine a sizable number of different outcomes related to graduate education, careers, and professional productivity.
The most recent comprehensive evaluation of the GRFP (Goldsmith, et al., 2002) provided evidence of mean differences between QG1 Fellows, QG2 Fellows, and QG2 Honorable Mention recipients. Results indicate that, for example, Ph.D. completion rates by 1999 among 1979-1988 GRFP applicants were 75.3 percent for QG1 Fellows, 69.4 percent for QG2 Fellows, and 66.0 percent for QG2 Honorable Mentions, suggesting a 3.4 percentage point program effect among comparable QG2 applicants.14 This difference would not be significant at the 95% confidence level, given the power of the proposed sample to detect differences between quality groups within a single cohort. However, if differences of similar size were found in two or more of the 5-year cohorts, the pooled cohort comparisons would have sufficient power for the difference to be significant at the 95% confidence level. The GRFP evaluation will examine Ph.D. completion rates among all but the most recent cohort of applicants.
The Goldsmith, et al. (2002) evaluation also reported gender differences in Ph.D. completion rates by 1999 among the 1979-1988 GRFP applicants. Among males, 70.5 percent of QG1 Fellows completed their Ph.D. by 1999, while 70.2 percent of QG2 Fellows and 67.5 percent of QG2 Honorable Mentions did so, indicating a 2.7 percentage point program effect. A difference of this magnitude among QG2 males would not attain 95% confidence even if all cohorts were pooled. The differences in Ph.D. completion rates were larger among females, with a 9.8 percentage point difference between QG2 Fellows and Honorable Mentions (68.3 vs. 58.5 percent); female completion rates were 73.3 percent for QG1 Fellows, 68.3 percent for QG2 Fellows, and 58.5 percent for QG2 Honorable Mentions.15 In contrast to males, differences of this size would be detected with 95% confidence even within cohorts.
Baker (1998) also examined gender and race differences in Ph.D. completion by 1988 among 1972-1981 GRFP applicants. Baker's findings indicate that rates of Ph.D. completion favored male GRFP applicants over their female counterparts by 4.5 percentage points among QG1 Fellows (77.4 vs. 72.9 percent), 9.1 percentage points among QG2 Fellows (73.6 vs. 64.5 percent), and 9.8 percentage points among QG2 Honorable Mentions (67.0 vs. 57.2 percent). Among QG2 applicants, male Fellows differed from male Honorable Mentions by 6.6 percentage points (73.6 vs. 67.0 percent), while female Fellows and Honorable Mentions differed by 7.3 percentage points (64.5 vs. 57.2 percent), again indicating a larger program effect among female applicants. Differences of this magnitude would be detected with the pooled sample of Cohort 1 and Cohort 2 in the current design.
Because the evaluation will include several different outcome measures and the analyses will compare a variety of different groups, Table B.1.4 presents the sample sizes required to detect a range of anticipated effect sizes based on mean differences between groups. Cohen's d metric for effect sizes (calculated as the mean difference between groups divided by the overall sample standard deviation) is frequently used to estimate sample sizes, where a smaller value necessitates a larger sample size. Following Cohen’s conventions, a mean difference between two groups of 0.20 is considered small and would be detectable for samples of 393 or more cases within each comparison group.
In combination with the previous tables, it is evident that the proposed sample will be sufficient to detect effect sizes (mean differences between two groups) as small as 0.10 for pooled sample comparisons between QG1 Fellows, QG2 Fellows, and QG2 Honorable Mentions, and for comparisons between males and females. For pooled sample comparisons between disabled versus others, minorities versus other cases, or other sub-sample comparisons, the proposed sample will be sufficient to detect effect sizes between 0.10 and 0.20.
Table B.1.4. Sample Size Requirements by Effect Size Based on Mean Differences between Groups
Detectable effect sizesa | Required sample size per comparison groupb, c
0.10 | 1571
0.20 | 393
0.30 | 175
0.40 | 99
0.50 | 64
0.60 | 45
0.70 | 33
0.80 | 26
a An effect size of 0.20 is considered small, 0.50 is considered medium, and 0.80 is considered large (Cohen, 1988).
b Assumptions: power=0.80, alpha=0.05.
c The sample sizes shown are for continuous variables. Comparable numbers exist for dichotomous variables (1560 for d =0.10, 392 for d=0.20, 174 for d=0.30, etc.).
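The per-group sample sizes in Table B.1.4 follow from a conventional two-sample power calculation and can be reproduced with standard software. The short Python sketch below is a minimal illustration using statsmodels' independent-samples t-test power routine; it is not part of the study's analysis code.

```python
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Required respondents per comparison group for each detectable effect size
# (Cohen's d), assuming power = 0.80, alpha = 0.05, and equal group sizes.
for d in (0.10, 0.20, 0.30, 0.40, 0.50, 0.60, 0.70, 0.80):
    n_per_group = analysis.solve_power(effect_size=d, alpha=0.05, power=0.80,
                                       ratio=1.0, alternative='two-sided')
    print(f"d = {d:.2f}: about {n_per_group:.0f} per group")

# Prints approximately 1571, 393, 175, 99, 64, 45, 33, and 26 per group,
# matching Table B.1.4.
```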
An important component of this evaluation is to compare GRFP participants who ultimately completed a doctoral program to a nationally representative sample of "other Ph.D. recipients." For this purpose we will identify a comparison group from subsets of the Doctorate Records File (DRF) and the 2006 Survey of Doctorate Recipients (SDR) as national peer groups of "other Ph.D. recipients." The DRF, collected through the annual Survey of Earned Doctorates (SED), is a census of research doctorate recipients from U.S. universities. The SED data are collected at the point of doctorate receipt and explore a number of issues related to recipients' graduate experience (e.g., interdisciplinary activities, time-to-degree, sources of financial support and indebtedness).
These data provide excellent benchmarks against which to compare similar data gathered from the GRFP Follow-Up Survey. The SDR is a national sample survey of doctoral scientists and engineers that focuses on career paths, further education, and employment-related data. As such, it can provide a solid comparison of "other Ph.D. recipients'" career-related data with those gathered from the career outcomes component of the GRFP Follow-Up Survey. To establish as close a comparison group as possible to the sample in the GRFP surveys, subsets of these datasets that approximate the cohorts and criteria for application to the NSF Fellows program will be used. The median elapsed time from graduate school entry (roughly the application point for NSF Fellows) to Ph.D. receipt for STEM graduates is 8.2 years (Hoffer, Welch, et al., 2006).16 We assume that SED and SDR respondents who received their doctorates in academic years 1996 to 200617 will approximate the NSF Fellow cohorts of interest. Further selection criteria to be applied are completion of a degree in a STEM field of study and U.S. citizenship or permanent resident status. In addition, we will drop from these datasets any records that we can match to the sample of GRFP participants. Using these filters, we believe we can create a national peer group of "other Ph.D. recipients" to provide a valid comparison to the surveyed Fellows who completed doctoral programs. Because these comparisons will require record matching based on the restricted-use SED and SDR microdata files, NORC will work closely with the GRFP COTR, SED COTR, and SDR COTR in following the National Center for Science and Engineering Statistics guidelines for obtaining and executing a restricted-use license (see http://www.nsf.gov/statistics/license/start.cfm).
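To make the selection criteria concrete, the sketch below shows how the national comparison group of "other Ph.D. recipients" could be filtered once the restricted-use files are in hand. It is a simplified illustration only: the file and column names (phd_year, field_stem, citizenship, person_id) are hypothetical stand-ins for the actual SED/SDR variables, and the real extract will be built under the NCSES restricted-use license procedures described above.

```python
import pandas as pd

# Hypothetical file and column names; the restricted-use SED/SDR files use
# their own variable naming conventions and licensing requirements.
sed = pd.read_csv("sed_restricted_extract.csv")          # one row per Ph.D. recipient
grfp_ids = pd.read_csv("grfp_sample.csv")["person_id"]   # matched GRFP sample members

comparison = sed[
    sed["phd_year"].between(1996, 2006)                    # approximate cohort window
    & (sed["field_stem"] == 1)                             # NSF-supported STEM field flag
    & sed["citizenship"].isin(["US_CITIZEN", "PERM_RES"])  # citizenship criterion
    & ~sed["person_id"].isin(grfp_ids)                     # drop matched GRFP participants
]
print(len(comparison), "national 'other Ph.D. recipients' retained")
```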
Note that this comparison excludes those who graduated with master’s degrees. For this group, comparisons will focus on Fellows and Honorable Mentions, and extant data sources are being researched to determine the feasibility of obtaining a nationally representative sample of M.A. recipients in research-focused STEM fields.
We will draw from other available data resources as needed. Among the other data we will incorporate are the Integrated Postsecondary Education Data System (IPEDS) for measures of institutional characteristics and Carnegie Classification, Barron’s Profiles of American Colleges for measures of institutional selectivity or prestige, and possibly the National Faculty Directory18 for measures of faculty employment status.
The GRFP Follow-Up Survey will be conducted as a Web-based instrument, accessible to respondents using a combination of personalized PIN and password. NORC will administer the GRFP Follow-Up Survey to Fellows and Honorable Mentions sampled in Cohorts 1 through 4 (1994-2011). A copy of the GRFP Follow-up Survey is provided in Appendix B.
In preparation for data collection, NORC will prepare and mail letters inviting sample members to participate in the study. Letters will be sent through the U.S. Postal Service. The mailings will describe the study and its purpose, and the measures taken to assure confidentiality. Included in the mailings will be a unique PIN and password to use for accessing the survey online. Upon receiving this advance letter, sample members will be able to go to the survey website and complete the questionnaire. The advance letter will also include a study toll-free number and email address through which respondents can directly contact project staff to verify study authenticity, ask questions about their participation, or receive technical assistance.
The survey will offer the same accommodations for respondents with disabilities as the Survey of Doctorate Recipients. Added navigation functionality in the web instrument will allow respondents to complete the survey without a mouse. For those with disabilities, we will also offer the option of a telephone or paper survey.
To target a 65% response rate overall, and no less than a 60% response rate within each cohort, NORC will follow up the advance letter mailings with a series of letter and postcard prompts as reminders to complete the survey, with special emphasis on monitoring and prompting the Honorable Mention sample members to ensure adequate response rates. NORC will consider offering incentives to Honorable Mention sample members if necessary to obtain a sufficient response rate. When NORC receives information that a sample member no longer resides at a particular location, additional steps will be taken to locate the individual (see Locating section below).
In addition to the standard letter prompts, NORC will employ both phone and email prompting. Approximately one month after data collection begins, NORC will use phone prompting to encourage sample members to participate in the survey. NORC will also use two methods of email prompting to gain participation. Weekly batch emails will be sent out during data collection to all non-respondents. These emails will come from a GRFP study email address and will contain the respondent’s personalized PIN and password. The second method will involve a more targeted and personalized email strategy to boost response rates for those groups having lower than expected response rates. These emails will come from a personal email address and will contain the respondent’s PIN, password, and a direct link into the survey.
To ensure the confidentiality of sample members, the survey Web page and all advance and prompting materials will use generic branding referring to a "graduate student follow-up study" rather than referring to any particular group (e.g., Fellows and Honorable Mentions). This will not affect the types of questions included in the survey; specific paths through the survey will be based on participants' fellowship status, determined up front by the survey login PIN and password each sample member is assigned.
To further enhance user-friendly access for sample members, NORC will maintain a survey-specific web page throughout data collection. In addition to a link to the secure web instrument, the study’s web site will serve as a source of information for potential respondents. This web site will be cited in all advance and prompting materials sent to sample members and will also be accessible through the main NORC website and general search engines, such as Google. The Web page will contain links allowing a sample member to:
Access a detailed description of the survey
Review the GRFP Follow-Up Survey’s Frequently Asked Questions
Obtain contact information for NORC survey staff
Review the Privacy Policy for the GRFP Follow-Up Survey
Review citations and/or publications of previous surveys’ data
Link to the GRFP Follow-Up Survey Web Questionnaire
Send an e-mail message to the GRFP Survey in-box
Call a 1-800 number for information on the survey.
Locating
Accurate address and telephone contact information are essential for notifying sample members of their selection into the study and for further prompting for survey completion. Because the GRFP applicant records, captured at the time of application to the program, will be the data source for sample member contact information, the locating strategy has been designed to handle varying degrees of outdated information. Past NORC studies have found that 80% of located cases ultimately go on to complete the survey when following such a prompting strategy. Therefore, to reach a 65% response rate, NORC will need to locate roughly 75-85% of sample members within each cohort (0.65 / 0.80 ≈ 81%). To accomplish this location rate, NORC will use a multi-stage strategy for locating sample members that will be responsive to varying amounts of locating information within any given cohort.
As noted, NSF will provide contact information from GRFP applicant records, including names and birthdates for all cases. For many cases, available information will also include social security numbers, address information, phone numbers, email addresses, and educational institutions (i.e., intended graduate school, current or previous college or university). Cases with incomplete contact information and cases with outdated contact information will be submitted for locating.
Our primary locating tools will be Accurint and LinkedIn searches. Accurint is a locating service that maintains a database of national information; when there is a match, an Accurint search yields address, phone, and/or email information. Previous NORC studies with populations similar to GRFP applicants have been able to locate 60% of their sample using Accurint searches. Because Accurint searches rely on SSN and birthdate information, LinkedIn will additionally be used for cases that lack these critical fields. LinkedIn is a professional networking web site where individuals create profiles listing their academic and professional credentials; these profiles often list current employers and/or contact emails and phone numbers. LinkedIn searches have been found to be highly successful in locating profiles of professionals when using academic institution information, and NORC estimates that an additional 15-20% of the sample can be located through them. NORC has developed methods of searching LinkedIn profile pages using the educational institution information listed on those pages. These searches will be used to identify cases where sample members' profiles list a current employer, a current educational institution, or contact information. Locators will then enter this information into our case management system. This information will also be used to guide academic directory and employer directory searches.
These locating strategies will be employed during two phases of the study: prior to data collection (i.e., Pre-field), and during data collection.
Pre-field Locating
The locating strategy is based on the assumption that current address information may be incomplete for a portion of the sample. To account for this, pre-field locating will be conducted using Accurint. In addition, with NSF’s support, NORC may contact coordinating officials at GRFP-sponsoring institutions to request updated contact information for more recent cohorts who may still be enrolled graduate students.
Locating during Data Collection
Once the survey is in the field, we will mail letters to addresses obtained through Accurint searches or from applicant data, and send out an email prompt. Cases with mail returned as undeliverable, or with invalid email addresses or phone numbers, will be designated for more intensive locating treatments. These cases will be forwarded to NORC's locating department, where staff will conduct additional individual Accurint searches.
In addition, NORC will conduct LinkedIn searches using educational institution data contained in the applicant files (including BA institution, current institution, and intended graduate institution at the time the GRFP application was submitted) to locate sample members through automated search techniques. Locators will manually review LinkedIn profiles for new contact information, including email, phone, current employer, and current location. For older cohorts, LinkedIn searches will be necessary to determine updated location information. For example, where LinkedIn searches produce an affiliation with an educational institution, that information will be used to conduct academic directory searches.
NORC expects that more recent cohorts will have more up-to-date information and will require less intensive locating efforts than older cohorts. For older cohorts, LinkedIn searches will likely be necessary to locate 75% to 85% of sample members. Location rates within cohorts, as well as within awardee status groups, will be monitored throughout the data collection period, and locating strategies will be adjusted as necessary.
Case Management
All sample member information, updated locating data, case history and status will be maintained in NORC’s Case Management System (CMS). This comprehensive database maintains records for all incoming and outgoing contacts with sample members along with complete address history information. Every update in the CMS records the status and date of the update, as well as the staff member who made the update. The CMS acts as a central ‘brain’ of the GRFP system design, holding much of the case data and directing the processing flow of the other component systems.
Pilot and Cognitive Testing
The survey was time tested with five individuals. Complete pilot testing with up to nine respondents will occur in December and will gather respondent comments on directions, clarity of items and overall logic of the programmed Web survey. Results from this pilot test will be used to refine the survey.
Cognitive testing will also be used as a tool to explore respondents' understanding of the survey questions and the cognitive processes they use to formulate an answer. Scripted and unscripted cognitive probing during the interview will be directed toward understanding these issues. NORC will conduct five cognitive interviews. After each interview, respondents will be asked to provide feedback on the interview, including their overall interview experience and suggestions for improving the survey, followed by an open question-and-answer period for the respondent and interviewer.
The factors that will be examined during the cognitive interviews include respondents' understanding of the task and questions, respondent burden, interview timings, and feedback from interviewers and respondents on problems with the instruments. Some specific questions that will guide the cognitive testing include:
Do respondents have any difficulty comprehending the survey questions?
Are there any survey questions that can be improved, clarified?
Are there any additional survey questions that should be included?
How burdensome is the survey and has burden been reduced as much as possible?
Can respondents provide accurate responses to survey questions that ask about events that may be more than a few years in the past?
Has all relevant feedback from respondents and cognitive interviewers been incorporated?
Is the timing of the instruments within the appropriate parameters?
The cognitive interviews will bring to light problems that exist with the GRFP survey. Some of these problems will be ones that we anticipated based on our review of the instruments, while other issues may be revealed by the cognitive interviews. Following the set of five interviews, data will be examined and materials will be revised in order to address the issues that emerge from testing.
Nonresponse Bias Analysis
High nonresponse overall or differential response rates between the treatment and control groups, and/or between older and newer cohorts can jeopardize the integrity of a study. The contractor plans to conduct a series of comparisons to assess the extent to which nonresponse has resulted in the respondent sample being different from the original baseline sample.
NORC will employ different approaches to examining non-response bias and accounting for it in the final analysis. Two potential options for our non-response bias tests are the Confidence Interval Bias Test and Cochran’s Bias Test.19 For the Confidence Interval Bias Test, variables such as demographic characteristics will be compared for the original baseline sample and the respondent sample. A confidence interval (CI) around the mean value among the responders will be calculated and compared to the mean value of the baseline sample—if the mean of the baseline sample falls within the CI, this suggests that the responders are sufficiently similar to the baseline sample and that it is not necessary to correct for nonresponse bias.
A more rigorous test to detect bias is the Cochran Bias Test, which is calculated by taking the difference of the mean of the responders and the mean of the baseline sample and dividing it by the standard error of the responders. A resulting bias of greater than 0.10 is considered problematic. It is important to note, however, that the Cochran test is extremely sensitive and leads to the conclusion of bias on most factors being tested. Thus, it is important to consider more than one type of bias test to determine if bias exists.
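As a minimal sketch of the two bias checks described above, assuming a single numeric baseline variable and illustrative variable names (the actual tests will be run across the full set of frame characteristics):

```python
import numpy as np

def ci_bias_test(baseline, respondents, z=1.96):
    """Confidence Interval Bias Test: does the baseline-sample mean fall inside
    the 95% CI around the respondent mean? True suggests no nonresponse
    correction is needed for this variable."""
    resp_mean = np.mean(respondents)
    resp_se = np.std(respondents, ddof=1) / np.sqrt(len(respondents))
    return resp_mean - z * resp_se <= np.mean(baseline) <= resp_mean + z * resp_se

def cochran_bias(baseline, respondents):
    """Cochran Bias Test statistic as described above: the respondent minus
    baseline mean difference divided by the respondents' standard error.
    Values above 0.10 are treated as problematic."""
    resp_se = np.std(respondents, ddof=1) / np.sqrt(len(respondents))
    return abs(np.mean(respondents) - np.mean(baseline)) / resp_se

# Purely illustrative data: a full baseline sample and an observed respondent subset.
rng = np.random.default_rng(0)
baseline = rng.normal(27.0, 4.0, size=13188)
respondents = rng.choice(baseline, size=8568, replace=False)
print(ci_bias_test(baseline, respondents), round(cochran_bias(baseline, respondents), 2))
```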
If non-response bias appears to be an issue, NORC plans to re-weight the sample data according to each respondent's likelihood, or propensity, of being a respondent. A logistic regression using baseline characteristics will be used to predict the probability of responding. Respondents with characteristics that are most often associated with nonresponse would effectively receive a higher weight to make up for their low incidence among respondents. These steps are essential to ensuring that our final estimates are not biased by the under-representation of any important subgroups, particularly the older cohorts. Where such methods are used, they will, of course, be carefully noted in the final report.
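If the tests indicate bias, the re-weighting step could proceed along the following lines. This is a simplified sketch assuming a hypothetical frame file with a 0/1 response indicator and a few baseline covariates; the actual model specification, covariate set, and any weight trimming rules will be decided during analysis.

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical analysis frame: one row per sampled case, with baseline
# covariates from the application records and a 0/1 response indicator.
frame = pd.read_csv("grfp_sample_frame.csv")
covariates = ["cohort", "quality_group", "female", "minority", "disabled"]

X = pd.get_dummies(frame[covariates], drop_first=True)
model = LogisticRegression(max_iter=1000).fit(X, frame["responded"])

# Nonresponse adjustment: each respondent is weighted by the inverse of the
# predicted response propensity, so under-represented groups are weighted up.
frame["propensity"] = model.predict_proba(X)[:, 1]
respondents = frame[frame["responded"] == 1].copy()
respondents["nr_weight"] = 1.0 / respondents["propensity"]
```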
The next section discusses sections B.1-B.4 for the institutional data collection.
As noted earlier, two different samples of institutions will be selected to address RQ3 and RQ4 that focus on the effects of hosting GRFP Fellows on institutions and program implementation. The first, the Institutional Site Visit Sample, consists of six institutions that will participate in a site visit during which the NORC team will conduct in-depth, in-person interviews with up to 10 administrators, faculty and staff to ask about the effect of the GRFP on the institution and students as well as implementation of the GRFP and recommended changes. The second, the Institutional Phone Interview Sample, consists of 20 institutions where up to five administrators, faculty, and staff will be asked—via short telephone interviews—about implementation of specific design and policy elements of the GRFP and recommended changes to the GRFP.
The contractor will work with NSF to put together a sampling frame for institutions where Fellows in Cohorts 1-4 enrolled. This sampling frame will contain the name of the institution, location, and total number of Fellows enrolled at that institution from 1994 through 2011. The contractor will add institutional characteristics to this sampling frame—size of graduate student population, Census region, type of institution in terms of public/private and Carnegie classification, among others.
Institutional site visit sample: We hypothesize that effects on faculty, students, and the institution are likely to require some threshold number of Fellows. Thus, institutions will be ranked according to the total number of Fellows they have hosted in each of the cohorts. NSF and NORC will then jointly select a purposive sample of six institutions from among these institutions with perhaps greater weight given to those that rank highest for the most recent cohorts.
Institutional phone interview sample: We wish to obtain a more representative and diverse sample so that we can understand more broadly how the specific policies are affecting institutions with different characteristics. Thus institutions will be ranked on several dimensions (for example, size of graduate enrollment in STEM fields, number of Fellows enrolled, type, reputation, and geographical location, among others) and selected into the sample based on characteristics of interest. Although not a random sample, the sample will be balanced to some degree with respect to the variables where we might expect some variation in responses. Greater weight may be given to institutions that are hosting Cohort 4 Fellows because we need to capture information on recently-changed policies and their effect on institutions and students. NSF and NORC staff will jointly select the sample of 20 institutions.
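As a simplified illustration of the ranking step for both institutional samples (the file and column names are hypothetical, the extra weight on recent cohorts is arbitrary here, and the actual selections will be purposive and made jointly with NSF):

```python
import pandas as pd

# Hypothetical institutional sampling frame built from GRFP records and IPEDS.
frame = pd.read_csv("grfp_institution_frame.csv")
# Illustrative columns: institution, region, control, carnegie_class,
#                       fellows_1994_2011, fellows_cohort4

# Site visit sample: rank institutions by Fellows hosted, with extra weight on
# recent (Cohort 4) counts; six will then be chosen purposively from the top.
frame["rank_score"] = frame["fellows_1994_2011"] + 2 * frame["fellows_cohort4"]
site_visit_candidates = frame.sort_values("rank_score", ascending=False).head(12)

# Phone interview sample: check balance across region and control before NSF
# and NORC jointly select 20 institutions with varied characteristics.
balance_check = frame.groupby(["region", "control"])["institution"].count()
print(site_visit_candidates[["institution", "rank_score"]], balance_check, sep="\n")
```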
For both samples: NSF will send a letter to the graduate dean/GRFP coordinating official informing them of the study and the purpose of the data collection and encouraging them to participate. NORC will then follow up with the dean/GRFP coordinating official through an email that reiterates the purpose of the study and asks for assistance in identifying relevant faculty and staff and a contact person in the graduate office with whom NORC could work to set up the interviews. If needed, the dean will receive a follow-up phone call. As noted, we propose to interview up to five faculty and staff (institutional phone interview sample) and up to 10 faculty and staff (institutional site visit sample) at each institution. These will likely include deans, program chairs, program administrators, and faculty.
Rather than rely entirely on the dean, NORC will focus on representing different departments and faculty and staff positions, with the goal of recruiting those potential participants most likely to have insights on how the presence of Fellows affects the institution (deans, department chairs, and GRFP Fellows’ faculty advisors). We will look to Principal Investigators of grants and GRFP coordinating officials on campus, and will refer to information from GRFP’s administrative records (such as those gathered from the FastLane web page) for current representatives at each institution with existing knowledge of the GRFP from a management perspective. We will target these individuals to gain cooperation on each campus. In addition, when selecting respondents we will focus on individuals who have a history of interacting with Fellows.
Institutional phone interview sample: Each of the potential respondents will be contacted and informed of the purpose of the study. Sample respondents will be told that this is an information-gathering exercise aimed at understanding how GRFP policies affect institutions and students, how institutions implement those policies, and whether they have recommendations for changes in policy. Once a mutually convenient time is decided, the interviews will be conducted via phone by a team of two NORC staff members. NORC will use a semi-structured protocol with potential follow-up questions and probes targeted to the type of respondent (Appendix B). Respondents will be asked for verbal informed consent and assured of the confidentiality of their responses. They will be informed that neither institutions nor respondents will be identified in the final report and briefs, and that data will be presented only in aggregate form. Illustrative quotes will be presented with the speaker described in a non-identifiable fashion. Interviewers will take detailed notes and pre-code many of the responses by identifying whether they fit one of the expected response types for each question.
Institutional site visit sample: Sample respondents will be contacted and informed of the purpose of the study and that NSF is interested in understanding the value of the GRFP to both students and institutions and the larger effects of GRFP on institutions, students, and career outcomes. Once a mutually-convenient time for the site visit is decided, an agenda and interview schedule will be developed. For the site visits, the team will consist of two to three NORC staff members. The interviews will begin with an introduction to the study and ask for verbal informed consent. NORC will use semi-structured protocols with potential follow-up questions and probes (Appendix B). In-person interviews will be recorded and transcribed for analysis, with the participants’ permission (if participants refuse, a team member will take detailed notes about the interview in lieu of a recording and transcription).
We believe the contractor can obtain a 100% response rate among the sampled institutions, especially with the cooperation of the dean or the GRFP coordinating official and given that both samples are fairly small. Institutions benefitting from the GRFP are generally willing to participate in studies that may help inform the program. If a particular institution declines to participate, it will be replaced, with NSF input.
Interview protocols will be tested via cognitive interviews with faculty and administrators at graduate institutions similar to those selected for the site visit. This iterative cognitive interviewing process will allow NORC’s qualitative research experts to quickly identify which questions yield answers relevant to the identified research questions, and which need to be revised or replaced to improve clarity and flow. NORC will pilot test the three interview protocols with at least two participants per protocol (i.e., at least six total participants) prior to the site visits. NORC will consult with NSF before selecting an appropriate local institution for pilot testing to make sure that we do not select an institution that would more appropriately be included in the full study.
The section below provides key contact information for the study.
Key personnel who have been involved in the statistical aspects and who will be involved in collecting and analyzing data are presented in the table below (Table B.5). The contractor for collection and analysis of data in this study is NORC at the University of Chicago, Chicago, IL. Staff with experience in evaluation of research programs, expertise in scientific research, and knowledge of statistical methods were involved in the design. NSF program staff members familiar with the programs have been included in the design of the evaluation.
Table B.5. Individuals Consulted
Name | Role | Phone
NORC at the University of Chicago | |
Marie Halverson | Project Director | (312) 759-4041
Hee-Choon Shin | Sampling Statistician | (773) 256-6150
 | Task Leader | (312) 759-2356
Jake Bartolone | Task Leader | (312) 759-4002
Lisa Setlak | Task Leader | (312) 357-3774
Tom Hoffer | Senior Scholar | (773) 256-6097
Sheila Nataraj Kirby | Senior Scholar |
National Science Foundation | |
Carol Stoel | Program Officer, Division of Graduate Education | (703) 292-8630
Gisele Muller-Parker | Program Director, Division of Graduate Education | (703) 292-7468
Gilbert John | Program Director, Division of Graduate Education | (703) 292-2343
Roosevelt Johnson | Program Director & COTR, Division of Research and Learning | (703) 292-5152
Appendix A. Federal Register Notice
Federal Register / Vol. 75, No. 123 / Monday, June 28, 2010 / Notices, p. 36697
NATIONAL SCIENCE FOUNDATION
Notice of Intent To Seek Approval To Establish an Information Collection
AGENCY: National Science Foundation.
ACTION: Notice and request for comments.
SUMMARY: The National Science Foundation (NSF) is announcing plans to request clearance of this collection. In accordance with the requirement of Section 3506(c)(2)(A) of the Paperwork Reduction Act of 1995 (Pub. L. 104-13), we are providing opportunity for public comment on this action. After obtaining and considering public comment, NSF will prepare the submission requesting that OMB approve clearance of this collection for no longer than three years.
DATES: Written comments on this notice must be received by August 27, 2010 to be assured of consideration. Comments received after that date will be considered to the extent practicable.
FOR FURTHER INFORMATION CONTACT: Suzanne Plimpton, Reports Clearance Officer, National Science Foundation, 4201 Wilson Boulevard, Suite 295, Arlington, Virginia 22230; telephone (703) 292-7556; or send e-mail to splimpton@nsf.gov. Individuals who use a telecommunications device for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 between 8 a.m. and 8 p.m., Eastern time, Monday through Friday. You may obtain a copy of the data collection instruments and instructions from Ms. Anderson.
SUPPLEMENTARY INFORMATION:
Title of Collection: Graduate Research Fellowship Program Follow-up Survey.
OMB Number: 3145-NEW.
Expiration Date of Approval: Not Applicable.
Type of request: New.
Abstract: The purpose of this study is to provide evidence on the impact of the GRFP on individuals' educational decisions, career preparations, aspirations and progress, as well as professional productivity. This includes the study design and data collection as well as subsequent analysis and report writing. As part of NSF's commitment to graduate student education in the U.S., the GRFP seeks to promote and maintain advanced training in science, technology, engineering, and mathematics (STEM) fields by annually awarding roughly 1,000 fellowships to graduate students in research-based programs. As the first program evaluation since 2002, the GRFP evaluation comes on the heels of increased funding by NSF to support additional fellowship awards. NSF contracts with the National Opinion Research Center (NORC) at the University of Chicago to design, implement, and assess a study that will address relevant procedures and components of the GRFP in regards to the application and award process and support for Fellows and sponsoring institutions, with an aim towards measuring and increasing the program's effectiveness.
There are four goals of the GRFP evaluation. The first goal is to maintain a high quality evaluation through consultation with an advisory group of national experts. The second goal is to assess impacts of the GRFP on graduate school experiences through a follow-up study of GRFP award recipients and other applicants. The third goal is to assess impacts of the GRFP on career and professional outcomes through analysis of GRFP participants and comparable national populations. The fourth goal is to assess the benefits of the GRFP on institutions that enroll GRFP Fellows. The evaluation is designed to address research questions that focus on the following topics:
Educational decisions, experiences, and graduate degree attainment of STEM graduate students.
Career preparation and aspirations.
Career activities, progress, and job characteristics following graduate school.
Professional productivity.
Workforce participation and career outcomes.
Graduate school institutions and student recruitment at GRFP-sponsoring institutions.
Faculty attitudes at GRFP-sponsoring institutions.
Diversity of students participating in STEM fields at GRFP-sponsoring institutions.
This survey would address two separate components of the planned GRFP evaluation. First, it will assess the influence of GRFP awards on recipients' graduate school experience and outcomes, which includes program of study and institution attended, professional productivity (e.g., published papers, conference presentations) during graduate school, and career aspirations. Second, the survey will evaluate the impact of participation in the GRFP on subsequent career options, progress, and contributions to respondents' professional fields. This will be conducted as a web-based survey.
Estimate of Burden: Public reporting burden for this collection of information is estimated to average 30 minutes per current graduate student and 40 minutes per graduate.
Respondents: Individuals.
Estimated Number of Responses per Form: 2,826 graduate students; 6,429 graduates.
Estimated Total Annual Burden on Respondents: 5,699 hours (2,826 graduate student respondents at 30 minutes per response = 1,413 hours + 6,429 graduate respondents at 40 minutes per response = 4,286 hours).
Frequency of Response: One time.
Comments: Comments are invited on (a) whether the proposed collection of information is necessary for the proper performance of the functions of the NSF, including whether the information shall have practical utility; (b) the accuracy of the NSF's estimate of the burden of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information on respondents, including through the use of automated collection techniques or other forms of information technology; and (d) ways to minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical or other technological collection techniques or other forms of information technology.
Dated: June 22, 2010.
Suzanne H. Plimpton,
Reports Clearance Officer, National Science Foundation.
[FR Doc. 2010-15569 Filed 6-25-10; 8:45 am]
BILLING CODE 7555-01—P
Federal Register / Vol. 75, No. 179 / Thursday, September 16, 2010 / Notices, p. 56596
NATIONAL SCIENCE FOUNDATION
Agency Information Collection Activities: Comment Request
AGENCY: National Science Foundation. ACTION: Notice; Submission for OMB Review; Comment Request.
SUMMARY: The National Science Foundation (NSF) has submitted the following information collection requirement to OMB for review and clearance under the Paperwork Reduction Act of 1995, Public Law 104-13. This is the second notice for public comment; the first was published in the Federal Register at 75 FR 36697, and no substantial comments were received. NSF is forwarding the proposed renewal submission to the Office of Management and Budget (OMB) for clearance simultaneously with the publication of this second notice. The full submission may be found at: http://www.reginfo.gov/public/do/PRAMain.
Comments regarding (a) whether the collection of information is necessary for the proper performance of the functions of the agency, including whether the information will have practical utility; (b) the accuracy of the agency's estimate of burden, including the validity of the methodology and assumptions used; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology, should be addressed to: Office of Information and Regulatory Affairs of OMB, Attention: Desk Officer for National Science Foundation, 725 17th Street, NW., Room 10235, Washington, DC 20503, and to Suzanne Plimpton, Reports Clearance Officer, National Science Foundation, 4201 Wilson Boulevard, Room 295, Arlington, VA 22230, or by e-mail to splimpton@nsf.gov. Comments regarding these information collections are best assured of having their full effect if received within 30 days of this notification. Copies of the submission(s) may be obtained by calling 703-292-7556.
NSF may not conduct or sponsor a collection of information unless the collection of information displays a currently valid OMB control number, and the agency informs potential persons who are to respond to the collection of information that such persons are not required to respond to the collection of information unless it displays a currently valid OMB control number.
Under OMB regulations, the agency may continue to conduct or sponsor the collection of information while this submission is pending at OMB.
ADDRESSES: Submit written comments to Suzanne Plimpton, Reports Clearance Officer, National Science Foundation, 4201 Wilson Boulevard, Room 295, Arlington, VA 22230, or by e-mail to splimpton@nsf.gov.
FOR FURTHER INFORMATION CONTACT: Call or write Suzanne Plimpton, Reports Clearance Officer, National Science Foundation, 4201 Wilson Boulevard, Room 295, Arlington, VA 22230, or by e-mail to splimpton@nsf.gov. Individuals who use a telecommunications device for the deaf (TDD) may call the Federal Information Relay Service (FIRS) at 1-800-877-8339 between 8 a.m. and 8 p.m., Eastern time, Monday through Friday.
SUPPLEMENTARY INFORMATION:
Title of Collection: Graduate Research Fellowship Program Evaluation.
OMB Approval Number: 3145-NEW.
Abstract: The purpose of this study is to provide evidence on the impact of the GRFP on individuals' educational decisions, career preparations, aspirations and progress, as well as professional productivity. This includes the study design and data collection as well as subsequent analysis and report writing. As part of NSF's commitment to graduate student education in the U.S., the GRFP seeks to promote and maintain advanced training in science, technology, engineering, and mathematics (STEM) fields by annually awarding roughly 1,000 fellowships to graduate students in research-based programs. As the first program evaluation since 2002, the GRFP evaluation comes on the heels of increased funding by NSF to support additional fellowship awards. NSF contracts with the National Opinion Research Center (NORC) at the University of Chicago to design, implement, and assess a study that will address relevant procedures and components of the GRFP with regard to the application and award process and support for Fellows and sponsoring institutions, with the aim of measuring and increasing the program's effectiveness.
There are four goals of the GRFP evaluation. The first goal is to maintain a high-quality evaluation through consultation with an advisory group of national experts. The second goal is to assess impacts of the GRFP on graduate school experiences through a follow-up study of GRFP award recipients and other applicants. The third goal is to assess impacts of the GRFP on career and professional outcomes through analysis of GRFP participants and comparable national populations. The fourth goal is to assess the benefits of the GRFP on institutions that enroll GRFP Fellows. The evaluation is designed to address research questions that explore the influences of the GRFP on the following broad sets of variables:
Educational decisions, experiences, and graduate degree attainment of STEM graduate students;
Career preparation and aspirations;
Career activities, progress, and job characteristics following graduate school;
Professional productivity;
Workforce participation and career outcomes;
Graduate school institutions and student recruitment at GRFP-sponsoring institutions;
Faculty attitudes at GRFP-sponsoring institutions;
Diversity of students participating in STEM fields at GRFP-sponsoring institutions.
This survey addresses two separate components of the planned GRFP evaluation. First, the survey will assess the influence of GRFP awards on recipients' graduate school experiences and outcomes, including program of study and institution attended, professional productivity (e.g., published papers, conference presentations) during graduate school, and career aspirations. Second, the survey will evaluate the impact of participation in the GRFP on subsequent career options, progress, and contributions to respondents' professional fields. This will be conducted as a web-based survey.
Estimate of Burden: Public reporting burden for this collection of information is estimated to average 30 minutes for current graduate students and 40 minutes for graduates.
Respondents: Individuals.
Estimated Number of Responses per Form: 2,826 graduate students; 6,429 graduates.
Estimated Total Annual Burden on Respondents: 5,699 hours (2,826 graduate student respondents at 30 minutes per response = 1,413 hours + 6,429 graduate respondents at 40 minutes per response = 4,286 hours).
Frequency of Response: One time.
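For readers checking the figures, the short Python sketch below simply re-derives the 5,699-hour total from the respondent counts and per-response times stated above; the function and variable names are illustrative only and are not part of the submission.

# Re-derive the estimated total annual burden from the figures stated above.
def burden_hours(respondents: int, minutes_per_response: float) -> float:
    """Total burden in hours for one respondent group."""
    return respondents * minutes_per_response / 60.0

current_students = burden_hours(2826, 30)   # 1,413 hours
graduates = burden_hours(6429, 40)          # 4,286 hours
total = current_students + graduates        # 5,699 hours
print(f"{current_students:,.0f} + {graduates:,.0f} = {total:,.0f} hours")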
Comments: Comments are invited on (a) whether the proposed collection of information is necessary for the proper performance of the functions of the NSF, including whether the information shall have practical utility; (b) the accuracy of the NSF's estimate of the burden of the proposed collection of information; (c) ways to enhance the quality, utility, and clarity of the information to be collected; and (d) ways to minimize the burden of the collection of information on those who are to respond, including through the use of appropriate automated, electronic, mechanical, or other technological collection techniques or other forms of information technology.
Dated: September 13, 2010.
Suzanne H. Plimpton,
Reports Clearance Officer, National Science Foundation.
[FR Doc. 2010-23170 Filed 9-15-10; 8:45 am] BILLING CODE 7555-01-P
Appendix B. Data Collection Instruments
NORC at the University of Chicago
GRFP Follow-Up Survey
June 21, 2011
Revised November 23, 2011
The following table summarizes the source keys used throughout the Follow-up Survey to identify and reference items or series of items adapted for the GRFP Evaluation from other survey research projects. Items accompanied by superscript notes have been adapted from other projects, while items without superscript notes were developed for the present GRFP Evaluation or were considered general enough not to warrant a citation. Please note that the final survey to be administered to the sample of GRFP Fellows and Honorable Mentions will not contain this documentation.
Key of GRFP survey items adapted from other survey research projects
Key | Source
1 | Goldsmith, S.S., Presley, J.B., & Cooley, E.A. (2002). National Science Foundation Graduate Research Fellowship Program: Final Evaluation Report (REC 9452969 and 9912174). Available on-line at: http://www.nsf.gov/pubs/2002/nsf02080/nsf02080.pdf
2 | Survey of Doctorate Recipients (2010). Available on-line at: http://www3.norc.org/sdr/2010_SDR_Survey_Form.pdf
3 | Survey of Earned Doctorates (2010). Available on-line at: http://www.norc.org/NR/rdonlyres/D46C147D-7247-40B7-A087-24B0B390A1EC/0/SED0910_frn.pdf
4 | Weidman, J.C., & Stein, E.L. (2003). Socialization of doctoral students to academic norms. Research in Higher Education, 44, 641-658.
5 | Anderson, M.S., & Louis, K.S. (1994). The graduate student experience and subscription to the norms of science. Research in Higher Education, 35, 273-299.
6 | Anderson, M.S., & Swazey, J.P. (1998). Reflections on the graduate student experience: An overview. New Directions for Higher Education, 101, 3-13.
7 | Cohort I Second Follow-up Questionnaire of the Washington State Achievers Tracking and Longitudinal Study (2007). Bill & Melinda Gates Foundation. Questionnaire available upon request.
8 | College Senior Survey (2011). Cooperative Institutional Research Program, UCLA. Available on-line at: http://www.heri.ucla.edu/researchers/instruments/FUS_CSS/2011CSS.pdf
9 | Cohort 5 First Follow-up Instrument of the Gates Millennium Scholars Tracking and Longitudinal Study (2007). Bill & Melinda Gates Foundation. Questionnaire available upon request.
Note. Sources are listed in descending order, from the most often cited to the least often cited.
GRFP Fellowship status
The first set of questions addresses your GRFP Fellowship status and experiences with the program.
******************************************************************************
Programming note: Section I (Part A) will be asked of GRFP Fellowship Recipients only.
Honorable Mention recipients will be skipped ahead to Section II (Part B).
******************************************************************************
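As a rough illustration of the routing described in the programming note above, the short Python sketch below maps a respondent's award status to the first survey section presented. It is a schematic aid for readers, not the actual survey-software configuration; the function name and status labels are placeholders.

# Illustrative sketch of the Section I / Section II routing described above.
def first_section(award_status: str) -> str:
    """Route respondents by GRFP award status (placeholder labels)."""
    if award_status == "Fellow":
        return "Section I (Part A): GRFP Fellowship status"
    if award_status == "Honorable Mention":
        return "Section II (Part B): Graduate school history"
    raise ValueError(f"Unexpected award status: {award_status}")

print(first_section("Fellow"))             # -> Section I (Part A)
print(first_section("Honorable Mention"))  # -> Section II (Part B)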
Did you accept the NSF Graduate Research Fellowship you were awarded? 1
Choose one.
Yes (Skip to Question A.4)
No (Proceed to Question A.2)
Why did you not accept the NSF Graduate Research Fellowship?
Choose all that apply.
I received another fellowship that offered a higher stipend1
I received another fellowship that offered better non-stipend support (expenses for research, travel, etc.)1
I received another financial award (e.g., scholarship, grant, etc.) that offered a higher stipend
I received another financial award (e.g., scholarship, grant, etc.) that offered better non-stipend support (expenses for research, travel, etc.)
I accepted a research assistantship instead of the GRFP award
I accepted a teaching assistantship instead of the GRFP award
I decided not to pursue my graduate studies at that time1
Other (Please specify) ________________________________
Did the NSF Graduate Research Fellowship program requirements influence your decision to not accept the award?
Choose one.
Yes (Proceed to Question 3a)
No (Skip to Question 4)
3a. Please indicate which program requirement discouraged you from accepting the award (check all that apply):
The requirement that I attend a graduate program at a U.S. institution
The service requirement
The three year duration of the award
Not being allowed to concurrently accept other federal fellowship money
Other (Please specify) ________________________________
Yes
No (Skip to Question 4)
In your field, are there other fellowships or other sources of student support that are more desirable than the NSF Graduate Research Fellowship? 1
Yes
No (Skip to Section II)
5a. If yes, please select from the drop-down list and identify why that award was more desirable.
Fellowship or other source (please write in) | Larger stipend | Longer duration | More prestige | Other (please specify)
1. See below list | | | |
2. See below list | | | |
3. See below list | | | |
Insert the following list as a drop-down menu for Question A.4:
Bell Labs Graduate Fellowship
Dept of Defense SMART
Dept of Energy
Dept of State Fulbright Program
EPA Star Fellowship
Ford Foundation Fellowship
GAANN
Hertz Fellowship
Dept of Homeland Security
Jacob Javits Fellowship
LSAMP Bridge to the Doctorate
Marshall Scholarship Program
NASA GSRP
NASA Aeronautics Scholarship Program
NDSEG Fellowship
National Physical Science Consortium Fellowship
Rhodes Scholarship
University Fellowship
Whitaker
Arctic Research Opportunities
Centers of Research Excellence in Science and Technology (CREST)
HBCU Research Infrastructure for Science and Engineering (RISE)
Developing Global Scientists and Engineers (International Research Experiences for Students (IRES) and Doctoral Dissertation Enhancement Projects (DDEP))
Doctoral Dissertation Improvement Grants in the Directorate for Biological Sciences
Dynamics of Coupled Natural and Human Systems
East Asia and Pacific Summer Institutes for U.S. Graduate Students
Ethics Education in Science and Engineering
Federal Cyber Service: Scholarship for Service
Graduate Research Fellowship Program
Minority Graduate Research Fellowship
Integrative Graduate Education and Research Traineeship Program
Graduate STEM Fellows in K-12 Education
Bridge to the Doctorate (LSAMP-BD)
Alliances for Graduate Education and the Professoriate (AGEP) Program
International Research and Education: Planning Visits and Workshops
National STEM Education Distributed Learning
NSF Astronomy and Astrophysics Postdoctoral Fellowships
Pan-American Advanced Studies Institutes Program
Partnerships for International Research and Education
Postdoctoral Fellowships in Polar Regions Research
Presidential Awards for Excellence in Science, Mathematics and Engineering Mentoring
Other
The next set of questions addresses your graduate school history.
Institution name __________________________
Branch or city __________________________
State or province __________________________
Country __________________________
(Programming note: We will employ a drop down menu for this item based on the GRFP application, see Appendix A)
Institution name __________________________
Branch or city __________________________
State or province __________________________
Country __________________________
Primary field of study from Appendix A __________________________
(Programming note: we will employ a drop down menu for this item based on Appendix A)
Less than one academic year
One academic year
Between one and two academic years
Two or more academic years
1d.2 What were the main factors that led you to take a leave of absence from your graduate program? Choose all that apply.
Financial
Family-related
Work-related
Health
Other…Please specify ___________
If yes, select the most important reason for changing fields. Choose one.
C. Experiences during Graduate School
Please respond to the following items in terms of your experience at REFERENCE PROGRAM:
How would you rate REFERENCE PROGRAM on each of the following? 1
(Response scale: Excellent | Above average | Average | Below average | Extremely poor | Not applicable. Individual grid items not preserved in the source.)
How often did the following occur while you were a graduate student at REFERENCE PROGRAM?
(Response scale: Very often | Often | Sometimes | Rarely | Never. Individual grid items not preserved in the source.)
To what extent would you agree with the following statements about your experience at REFERENCE PROGRAM?
3a. Resources and support
(Response scale: Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree. Individual grid items not preserved in the source.)
3b. To what extent would you agree with the following statements about your REFERENCE PROGRAM?
Opportunities provided by your program
(Response scale: Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree. Individual grid items not preserved in the source.)
3c. To what extent would you agree with the following statements about your faculty at REFERENCE PROGRAM?
Relationship with faculty
(Response scale: Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree. Individual grid items not preserved in the source.)
To what extent did your graduate program provide you with the following?
(Response scale: To a great extent | Somewhat | Very little | Not at all. Individual grid items not preserved in the source.)
To what extent did you participate in the following graduate school activities?
(Response scale: To a great extent | Somewhat | Very little | Not at all. Individual grid items not preserved in the source.)
When thinking about your intended career path, how important were the following considerations?
(Response scale: Essential | Very important | Somewhat important | Not important. Individual grid items not preserved in the source.)
Reflecting on your graduate school enrollment decision, to what extent do you agree or disagree with the following statements?
I decided to enroll in my particular graduate institution:
(Response scale: Strongly agree | Agree | Neither agree nor disagree | Disagree | Strongly disagree. Individual grid statements not preserved in the source.)
D. Professional Productivity and Financial Support during Graduate School
The next series of questions addresses your employment patterns and finances during graduate school.
Which of the following were sources of financial support during the course of your overall graduate school experience? 3
Choose all that apply
Fellowship, scholarship
Grant
Teaching assistantship
Research assistantship
Other assistantship
Internship
Loans (from any source)
Personal savings
Personal earnings during graduate school (other than sources listed above)
Spouse’s, partner’s, or family’s earnings or savings
Employer reimbursement/assistance
Foreign (non-U.S.) support
Other…Please specify ___________
Did you work for pay during your REFERENCE PROGRAM? Please do not include teaching and research assistantships, traineeships, fellowships, and internships.
Yes (Go to Question D.2a)
No (Skip to Question D.2e)
If Yes:
2a. On average, how many hours per week did you work? _______
(Programming note: We will employ a drop down menu with response options in 0-4 hour increments.)
2b. Did you work on-campus, off-campus, or both?
On-campus
Off-campus
Both on-campus and off-campus
2c. Was your job related to your graduate school major field of study?
Yes
No
2d. Why did you work? (Please check all that apply) 7, 9
Required as part of my teaching, research, or graduate assistantship
Living expenses
Experience
Support family
Pay back loans or other education-related debt
Pay for school
Help pay for social activities
Fits into career path/internship/Gain experience or skills
Make money/create savings
Other
2e. Did you have an internship during your time at REFERENCE PROGRAM?
Yes
No (Skip to Question D.3)
If yes, was (were) your internship(s):
Paid
Unpaid
I held both paid and unpaid internships while at REFERENCE PROGRAM
If yes, was (were) your internship(s):
In an academic setting
In a non-academic setting (e.g., industry, government)
I held internships in both academic and non-academic settings while at REFERENCE PROGRAM
How many papers did you present while in graduate school? 1
National meetings or conferences | International meetings or conferences
□ None | □ None
□ 1-4 | □ 1-4
□ 5-8 | □ 5-8
□ 9-12 | □ 9-12
□ 13-16 | □ 13-16
□ 17-20 | □ 17-20
How many of the following publications did you produce in graduate school?
(Include publications in press.) 1
| Primary Author (Number) | Other Co-Author (Number)
Refereed journal articles | Drop down range 0-20 | Drop down range 0-20
Non-refereed articles (i.e., newspaper and magazine articles, book reviews) | Drop down range 0-20 | Drop down range 0-20
Book chapters/edited books | Drop down range 0-20 | Drop down range 0-20
Books published | Drop down range 0-20 | Drop down range 0-20
How many patents did you apply for while in graduate school? (Number) 1 ______
Drop down range 0-10
Did you apply to any of the following types of grants/contracts as a Principal Investigator (PI) or Co-PI while in graduate school? (Mark Yes or No for each) 1
Yes No
□ □ Federal government
□ □ State government
□ □ Local government
□ □ Foundation
□ □ Business/industry
□ □ Employing organization
□ □ Not-for-profit agency
□ □ Professional society or association
E. Job History
Here we will be gathering more specific information regarding your employment history. Please note that for the purposes of this section, a postdoctoral appointment should be considered employment.
At the present time, which of the following best describes your employment status? 1, 2
Employed (including self-employment, postdoctoral appointment, or on any kind of paid or unpaid leave, including vacation) (Skip to Question E.3)
Not currently working for pay (Proceed to Question E.2a)
2a. What best describes your reason for not currently working? (Select one) 2
Further education (this EXCLUDES postdoctoral study)
Retired
On layoff from job
Family responsibilities
Medical condition (chronic illness, disability, etc)
Seeking employment
Do not need or want to work
2b. What was the most recent year of employment? ______
(PROGRAMMING NOTE: We will employ a drop down menu with range 1990-2011)
How many jobs have you held since leaving REFERENCE PROGRAM? _____
(Programming note: We will employ a drop down menu with range 0 – 20)
4. Thinking of your current / most recent employment, using the job categories listed, please choose the code that best describes your current or most recent job:2
Code _ _ _ DROP DOWN MENU
(Programming note: we will employ a drop down menu for this item based on Appendix B)
5. Please consider your current / most recent employer. Counting all locations that your employer operates, how many people work for your principal employer? Your best estimate is fine. 2
Choose one
10 or fewer employees
11 - 24 employees
25 - 99 employees
100 - 499 employees
500 - 999 employees
1,000 - 4,999 employees
5,000 - 24,999 employees
25,000+ employees
6. For what type of employer are you/were you most recently working? 2
Choose one
EDUCATION (if selected, go to Question E.7)
U.S. 4-year college or university other than medical school
U.S. medical school (including university-affiliated hospital or medical center)
U.S. university-affiliated research institute
U.S. community or two-year college
U.S. preschool, elementary, middle, secondary school or school system
Foreign educational institution
GOVERNMENT - other than education institution (if selected, skip to Question E.9)
Foreign government
U.S. federal government
U.S. state government
U.S. local government (e.g., city, county, school district)
U.S. Military service
U.S. national laboratory
PRIVATE SECTOR - other than education institution (if selected, skip to Question E.9)
Not for profit organization
Industry or business (for profit)
Start-up company
OTHER
Self-employed
Other…Please specify _______________________________________
7. (If EDUCATION was selected in Question E.6) Is your current / most recent job an academic position? 2
Yes (Go to Question E.8)
No (Skip to Question E.9)
8. What type of academic position(s) do you currently hold / did you most recently hold? 2
Choose all that apply
President, Provost, or Chancellor (any level)
Dean (any level), department head or chair
Research faculty, scientist, associate, or fellow
Non-tenure-track faculty (e.g., instructor, lecturer, etc.)
Tenure-track faculty (e.g., assistant professor, etc.)
Tenured faculty (e.g., associate professor, etc.)
Adjunct faculty
Postdoc (e.g., postdoctoral fellow or associate)
Research assistant
Teaching assistant
Other position... Please specify_______________________________
9. Please indicate your basic annual salary for your current / most recent job (in the current or most recent year you were employed). Do not include bonuses or additional compensation for summertime teaching or research. If you are not salaried, please estimate your earned income. 3
$ __________
OR: If you prefer not to report an exact amount, please indicate into which range you expect your salary to fall. Choose one. (Salary range options not preserved in the source.)
10. Was this salary based on a 52-week year, or less than that? 2
Include paid vacation and sick leave.
□ 52-week year
□ Less than 52 weeks…Please specify number of weeks _____________
11. What are your primary and secondary work activities at your current / most recent job? 3
Choose one in each column PRIMARY SECONDARY
Research and development □ □
Teaching □ □
Management or administration □ □
Professional services to individuals □ □
Other □ □
If Other, Please specify _____________
Mark if no secondary work activities □
12. To what extent is your work on your current /most recent job related to your field of graduate studies at REFERENCE PROGRAM? 2
Choose one answer.
Closely related
Somewhat related
Not related
13. Thinking about your current / most recent employment, do/did you work for pay at a second job (or business), including part-time, evening, or weekend work? 2
Yes
No
14. Is your current / most recent job the same as your first job after leaving REFERENCE PROGRAM? 2
□ Yes, current / most recent job is same as first job (Skip to Question E.24)
□ No, current / most recent job is different from first job (Go to Question E.15)
15. Thinking of the first job you held after leaving REFERENCE PROGRAM, when did you hold this job? 2
Month and year you started working this job __________________________
Month and year you last worked at this job __________________________
15a. Thinking of the first job you held after leaving REFERENCE PROGRAM, please choose the code that best describes that first job. 2
Code _ _ _DROP DOWN MENU
(Programming note: we will employ a drop down menu for this item based on Appendix B)
16. Please consider your first employer. Counting all locations that your employer operates, how many people work for your principal employer? Your best estimate is fine. 2
Choose one
10 or fewer employees
11 - 24 employees
25 - 99 employees
100 - 499 employees
500 - 999 employees
1,000 - 4,999 employees
5,000 - 24,999 employees
25,000+ employees
17. For what type of employer did you work at your first job after REFERENCE PROGRAM? 2
Choose one
EDUCATION (if selected, see Question E.18)
U.S. 4-year college or university other than medical school
U.S. medical school (including university-affiliated hospital or medical center)
U.S. university-affiliated research institute
U.S. community or two-year college
U.S. preschool, elementary, middle, secondary school or school system
Foreign educational institution
GOVERNMENT - other than education institution (if selected, skip to Question E.20)
Foreign government
U.S. federal government
U.S. state government
U.S. local government (e.g., city, county, school district)
U.S. Military service
U.S. national laboratory
PRIVATE SECTOR - other than education institution (if selected, skip to Question E.20)
Not for profit organization
Industry or business (for profit)
Start-up company
OTHER
Self-employed
Other…Please specify _______________________________________
18. (If EDUCATION was selected in Question E.17) Was your first job after leaving REFERENCE PROGRAM an academic position? 2
Yes (Proceed to Question E.19)
No (Skip to Question E.20)
19. What type of academic position(s) did you have at your first job? 2
Choose all that apply
President, Provost, or Chancellor (any level)
Dean (any level), department head or chair
Research faculty, scientist, associate, or fellow
Non-tenure-track faculty (e.g., instructor, lecturer, etc.)
Tenure-track faculty (e.g., assistant professor, etc.)
Tenured faculty (e.g., associate professor, etc.)
Adjunct faculty
Postdoc (e.g., postdoctoral fellow or associate)
Research assistant
Teaching assistant
Other position… Please specify______________________________
20. What was your basic annual salary for your first job after REFERENCE PROGRAM? Do not include bonuses or additional compensation for summertime teaching or research. If you were not salaried, please estimate your earned income. 3
$ __________
If you prefer not to report an exact amount, please indicate into which range you expect your salary to fall. Mark (X) one. (Salary range options not preserved in the source.)
21. Was this salary based on a 52-week year, or less than that? 2
Include paid vacation and sick leave.
52-week year
Less than 52 weeks…Please specify number of weeks _____________
22. What were your primary and secondary work activities at your first job after leaving REFERENCE PROGRAM? 3
Choose one in each column PRIMARY SECONDARY
Research and development □ □
Teaching □ □
Management or administration □ □
Professional services to individuals □ □
Other □ □
Mark if no secondary work activities □
23. To what extent was your work on your first job related to your field of graduate studies at REFERENCE PROGRAM? 2
Choose one answer.
Closely related
Somewhat related
Not related
24. Since leaving graduate school, how many papers have you presented? 2
| National meetings | International meetings
Number of Papers Presented | Range 0-100 | Range 0-100
25. Since leaving graduate school, how many of the following publications have you produced (include in press)? 2
| Primary Author (Number) | Other Co-Author (Number)
Refereed journal articles | Range 0-50 | Range 0-50
Non-refereed articles (i.e., newspaper and magazine articles, book reviews) | Range 0-50 | Range 0-50
Book chapters/edited books | Range 0-50 | Range 0-50
Books published | Range 0-50 | Range 0-50
26. Since leaving graduate school…2
| Number
How many applications for U.S. patents have named you as an inventor? | Range 0-100
How many U.S. patents have been granted to you as an inventor? | Range 0-100
How many of the patents recorded as granted (in category 2 above) have resulted in commercialized products or processes or have been licensed? | Range 0-100
27. Since leaving graduate school, how many and what types of grants/contracts have you been awarded as Principal Investigator? 2
If you have not received a grant as PI, mark here and go to the next question.
Type of Agency | Number of Grants or Contracts | Total Amount (Including Overhead)
Federal government | Range 0-50 |
State government | Range 0-50 |
Local government | Range 0-50 |
Foundation | Range 0-50 |
Business/industry | Range 0-50 |
Employing organization | Range 0-50 |
(Programming note: Total Amount will be open ended with range checks)
28. What teaching activities have you undertaken since graduate school? 1
Mark all that apply. (Teaching activity response options not preserved in the source.)
29. What professional services have you undertaken since leaving graduate school?
Choose all that apply.
Conference presentation proposal reviewer1
Manuscript/chapter reviewer1
Departmental committee1
Institutional/company-wide committee1
Professional organization committee1
Local community/government committee/panel1
State-level committee/panel1
National committee/panel1
Off-campus peer review panel, accreditation and certification team1
Member of editorial board of professional journal1
Editor of professional journal1
Professional peer review of grant proposals
Involved in K-12 STEM policy
Outreach to K-12 professionals
Participated in professional development activities for K-12 teachers
Other (Please specify)_________________________________________
We are also interested in your experiences as an undergraduate.
Did you attend community college at any point during your undergraduate education? 3
□ Yes
□ No
□ Yes (Proceed to F.2a)
□ No (Skip to F.3)
2a. If yes, please check all that apply:
Advanced Technological Education
Arctic Research Opportunities
Centers of Research Excellence in Science and Technology (CREST) and HBCU Research Infrastructure for Science and Engineering (RISE)
Developing Global Scientists and Engineers (International Research Experiences for Students (IRES) and Doctoral Dissertation Enhancement Projects (DDEP))
Dynamics of Coupled Natural and Human Systems
Federal Cyber Service: Scholarship for Service
Historically Black Colleges and Universities Undergraduate Program
Integrative Graduate Education and Research Traineeship Program
Interdisciplinary Training for Undergraduates in Biological and Mathematical Sciences
International Research and Education: Planning Visits and Workshops
NSF Scholarships in Science, Technology, Engineering, and Mathematics
Louis Stokes Alliance for Minority Participation (LSAMP)
Tribal Colleges and Universities Program (TCUP)
NSF Computer Science, Engineering, and Mathematics Scholarships (CSEMS)
Partnerships for International Research and Education
Presidential Awards for Excellence in Science, Mathematics and Engineering Mentoring
Research Experiences for Undergraduates
Research in Disabilities Education
Research in Undergraduate Institutions
Robert Noyce Teacher Scholarship Program
Science, Technology, Engineering, and Mathematics Talent Expansion Program
Other
Finally, we have a few background questions that will help us understand and analyze the data.
Are you…
Male
Female
Which of the following best describes your ethnicity? 3
Hispanic or Latino (go to Question G.3)
Not Hispanic or Latino (go to Question G.4)
Which of the following best describes your Hispanic or Latino origin or descent? 3
Mark one.
Mexican or Chicano
Puerto Rican
Cuban
Other Hispanic
Do you consider yourself (choose one)… 1, 2
American Indian or Alaska Native
Asian
Black or African American
Native Hawaiian or Other Pacific Islander
White
Two or more races
4a. If two or more races, please select the racial groups to which you belong (choose all that apply)…
American Indian or Alaska Native
Asian
Black or African American
Native Hawaiian or Other Pacific Islander
White
Two or more races
What is your marital status? (choose one) 3
Married
Living in a marriage-like relationship
Widowed
Separated
Divorced
Never married
Not including yourself or your spouse/partner, how many dependents (children or adults) do you have - that is, how many others receive at least one half of their financial support from you? 3
Drop down 0-10 _______
If 0 Skip to G.7
6a. How many of these dependents are under the age of 18? 3
Drop down 0-10 _______
What is the highest educational attainment of your mother and father? 3
Mother Father
□ □ Less than high/secondary school graduate
□ □ High/secondary school graduate
□ □ Some college
□ □ Bachelor’s degree
□ □ Master’s degree (e.g., MA, MS, MBA, MSW, etc.)
□ □ Professional degree (e.g., MD, DDS, JD, D.Min, Psy.D., etc.)
□ □ Research doctoral degree or Ph.D.
□ □ Do not know
Which best described your citizenship when entering graduate school? 1
U.S. Citizen
Non-U.S. Citizen with a Permanent U.S. Resident Visa (“Green Card”)
Non-U.S. Citizen with a temporary U.S. Visa
8a. Which best describes your current citizenship status? 1
U.S. Citizen (Skip to Question G.9)
Non-U.S. Citizen with a Permanent U.S. Resident Visa ("Green Card") (Proceed to Question G.8b)
Non-U.S. Citizen with a temporary U.S. Visa (Proceed to Question G.8b)
8b. Of which country are you currently a citizen? 2, 3
__________________________
What is your date of birth? 1
____Month (1-12)
____Day (1-31)
____Year (19__)
Do you have the following disabilities? Please mark Yes or No for each. 1
Yes No
□ □ Hearing impairment
□ □ Visual impairment
□ □ Mobility/orthopedic impairment
□ □ Learning/Cognitive Disability
□ □ Vocal/Speech Disability
□ □ Other (Please specify) ______________________
Thank you for your participation! Please hit the submit button before you close out of the survey.
Institutional Site Visit Sample (Six Institutions)
INTERVIEW PROTOCOLS
I. Departmental Staff Protocol
Questions for relevant departmental staff members (e.g., graduate student office or student affairs staff members who have worked with GRFP Fellows):
What is your overall impression of the NSF Graduate Research Fellowship Program (GRFP)? How does it compare, in reputation, with other fellowship programs?
How do Fellows benefit from their GRFP Fellowship? How does your department benefit from hosting GRFP Fellows?
Let’s talk about the program’s enrollment patterns in terms of gender, ethnicity, and Master’s/Ph.D. student ratios. Do Fellows differ from other graduate students in terms of these characteristics? To what extent does the GRFP promote diversity among graduate students enrolled in your department?
What kinds of supports are offered to Fellows that are different from those offered to other graduate students? In your opinion, are these helpful to Fellows in terms of timely progress towards degree or better integration into the department?
How does your department financially support its graduate students? For example, how many students receive full support through the completion of their degree, and how is aid awarded? How would the department be affected if GRFP funding were to disappear? Does the GRFP figure into the financial planning of the department?
Now let’s talk about how the Fellows in your department actually use their Fellowships. When do most Fellows use the three years of the Fellowship? How common is it for Fellows to place their GRFP Fellowship on reserve for one or two years? How do most GRFP students secure funding when they are not receiving GRFP support? What supplemental funding, if any, is provided to Fellows by the department? How do the guidelines on when Fellows may use their funding affect the experiences of the Fellows and the department? How has this changed over the past few years?
What are the expectations and opportunities for TAing and RAing in the department? Do Fellows participate in these opportunities to the same degree as their peers? How has this changed over the past few years?
Over the past few years, how would you say the GRFP has changed, whether in terms of regulations, how it affects Fellows, how it affects your institution, or in some other way?
How could the GRFP be improved? What ideas would you like to communicate to NSF? [If perceived problems are reported:] What solutions would you propose?
II. Departmental Faculty Protocol
Questions for relevant departmental faculty members (e.g., graduate program coordinating officials, department chairs, and faculty who have worked with GRFP Fellows):
What is your overall impression of the NSF Graduate Research Fellowship Program (GRFP)? How does it compare, in reputation, with other fellowship programs? What does it mean to faculty members that a student is a GRFP Fellow?
How does a GRFP Fellowship influence the admissions decisions of your department? How does receiving a GRFP Fellowship influence faculty members’ willingness to work with a prospective student?
How do the experiences of Fellows differ from those of other students in the program? Probe for:
whether Fellows are fully integrated into the program or if their source of funding isolates them;
whether the GRFP funding provides greater autonomy/flexibility since it is not tied to an advisor or lab;
whether program guidelines affect Fellows' service to the department in terms of TAing/RAing.
Compared to the other students in your department, do Fellows differ in the length of time they need to finish? What are the career goals of your GRFP Fellows, and do they differ from those of the other students in your department? Compared to other students, to what extent are the Fellows developing the personal and professional skills necessary for success in their chosen field after graduating?
How do Fellows benefit from their GRFP Fellowship? How does your department benefit from hosting GRFP Fellows?
To what extent do Fellows contribute to the research activity of the department? Are the educational and research experiences of Fellows similar to those of other students? How has this changed over the past few years? Do Fellows have different opportunities or make different choices compared to other students? If there are differences, what are they?
How would your department be affected if GRFP funding were to disappear? Does the GRFP figure into the financial planning of the department?
How could the GRFP be improved? What changes to the program might benefit the Fellows and the department?
III. University Administrators Protocol
Questions for graduate studies deans and other relevant university administrators (e.g., directors of student financial support, external fellowship advisors, designated Coordinating Officials (COs) of the GRFP):
What is your overall impression of the NSF Graduate Research Fellowship Program (GRFP)? How does it compare, in reputation, with other fellowship programs?
What trends, if any, have you noticed in the granting of GRFP Fellowships? Has the recent increase in the number of Fellowships awarded contributed to these trends? [If needed: for example, in terms of quality of students, racial, ethnic, and gender diversity, field of study, etc.] How has this increase affected your graduate program, if at all?
We are interested in how the GRFP affects the university. To what extent does the program help:
Recruit students to STEM programs at your university?
Offset the costs necessary to fund students?
Diversify the student body of STEM programs?
How would your university be affected if GRFP funding were to disappear? Does the GRFP figure into the financial planning of the graduate studies office or any of your graduate programs?
Does the current amount of funding provided by the GRFP adequately meet the needs of graduate students at your university? How is the cost-of-education allowance provided by the GRFP Fellowship used by the university? Are Fellows provided any kind of supplemental funding if the allowance cannot cover their financial needs?
How many staff members are involved in the administration of GRFP Fellowships? Are there supports or activities provided by the university to the Fellows that are separate from those provided to other graduate students?
Compared to other students, to what extent are Fellows contributing to the research endeavors of the university while they are in graduate school? To what extent are they supporting the department through service and teaching? How has this changed over the past few years? (Probe specifically for changes in Fellows’ participation in teaching and research.) To what extent are they succeeding in STEM fields upon graduation?
Over the past few years, how would you say the GRFP has changed, whether in terms of regulations, how it affects Fellows, how it affects your institution, or in some other way?
How could the GRFP be improved? What changes to the program would most benefit your university?
Institutional Phone Interview Sample (20 Institutions)
Interview Protocol
NSF is interested in learning how some particular policies of the GRFP are working and the extent to which they could be improved. We are interested in both your experiences with these policies as well as your opinions, suggestions for improvement, and ideas.
How would you describe the goals of the GRFP program?
Let’s talk about how the Fellows in your department actually use their Fellowships. When do most Fellows use the three years of the Fellowship? How common is it for Fellows to place their Fellowship on reserve for one or two years? Has this pattern changed over the past few years?
How are most Fellows funded when they are not receiving GRFP support? What supplemental funding, if any, is provided to Fellows by the department? How do the GRFP policies on when Fellows may utilize their funding affect the experiences of the Fellows and the department? How do the policies affect the Fellows’ progress to degree completion?
Does the current amount of funding provided by the GRFP adequately meet the needs of graduate students at your university? How is the cost-of-education allowance provided by the Fellowship used by the university? How does the institution cover tuition if the cost-of-education allowance of $10,500 is insufficient?
How do the experiences of Fellows differ from those of other students in the program? Probe for:
whether Fellows are fully integrated into the program or if their source of funding isolates them;
whether the GRFP funding provides greater autonomy/flexibility since it is not tied to an advisor or lab;
What kinds of supports are offered to Fellows that are different from those offered to other graduate students? In your opinion, are these helpful to Fellows in terms of timely progress towards degree or better integration into the department?
What are the requirements and opportunities for TAing and RAing in the department? Do Fellows participate in these opportunities to the same degree as their peers? How do the program guidelines about the amount of service Fellows may provide to the institution while funded by the GRFP affect the experiences of Fellows and the department? Could this policy be improved for the Fellows? How has the service provided by Fellows changed over the past few years?
The program requires that the status of Fellows is decided on an annual basis—i.e. whether they are in a “Tenure” or “Reserve” status for the following GRFP Fellowship year. How do you think this policy works? Is there any need to change it?
The program also requires that Fellows are affiliated with a U.S. institution. Are there instances (for example, in particular fields) where you would suggest revisiting this policy?
Is there anything about the program policies [refer to the Administrative Guide if needed] that, if changed, would improve the program or be beneficial for your institution, the graduate programs, or Fellows?
Appendix C. Crosswalk between Research Questions and Data Sources
Primary data sources:
GRFP Follow-Up Survey of current and former graduate students (Fellows and Honorable Mentions) [termed "Survey (Current)" and "Survey (Former)" in the tables below]
In-person interviews with institutional administrators, faculty, and staff during site visits [termed "Interviews (Site Visits)" in the tables below]
Phone interviews with institutional administrators, faculty, and staff [termed "Interviews (Phone)" in the tables below]
Phone interviews with program officers of federal fellowship programs similar to the GRFP and review of program materials [exempt from OMB review; see page 2]
Secondary data sources: Survey of Doctorate Recipients (SDR), Survey of Earned Doctorates (SED), Integrated Postsecondary Education Data System (IPEDS), Barron’s Profiles of American Colleges. These data sources will be used to:
Define a comparison group of national peers
Obtain characteristics of institutions, including reputation and ranking, to be used in the modeling or to look at differences in selectivity of institutions hosting Fellows, Honorable Mentions, and the national peers
Calculate outcomes in terms of degree attainment and time to degree
Examine career trajectories and characteristics of academic and non-academic employment
Examine future professional productivity
Tables C.1-C.4 provide a crosswalk of data sources and analysis for each of the research questions.
Table C.1. Research Question 1: Data Sources and Analysis
RQ 1: What is the impact of the GRFP fellowship on the graduate school experience?
Data Sources:
1. Survey (Current); Survey (Former): Section I (A): GRFP Award Status; Section II (C): Experiences with Program; Section II (D): Professional Productivity and Financial Support During Graduate School; Section IV (F): Educational Background; Section IV (G): Demographics
2. Survey (Former): Section B: Graduate School Background Information (change in primary field of study and why)
3. Interviews (Site Visits): Faculty; Senior University Administrators; Departmental/Graduate Studies Staff
4. Interviews (Phone)
Analysis: This question is addressed by comparing graduate student experiences (such as participation in STEM graduate study, selection of institution, professional productivity, career aspirations, graduate degree attainment, and time to degree) of Fellows with those of a matched comparison group of similar but non-awarded GRFP applicants. In addition, Fellows' experiences will be compared with those of a matched comparison group of doctoral students nationally. (A schematic sketch of the matching step follows this table.)
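As referenced in the analysis entry above, the comparison groups are formed by matching Fellows to similar but non-awarded applicants. The Python sketch below is a schematic illustration of one way such a matched comparison group could be built, using exact matching on application year and field and nearest-neighbor matching on an application quality score. The column names ("year", "field", "quality_score", "awarded") and the specific matching variables are hypothetical; the evaluation's actual matching procedure is defined in its analysis plan.

# Schematic sketch of forming a matched comparison group of non-awarded
# applicants for the Fellows. Assumes a pandas DataFrame with hypothetical
# columns: "year", "field", "quality_score", and a boolean "awarded" flag.
import pandas as pd

def match_comparison_group(applicants: pd.DataFrame) -> pd.DataFrame:
    """For each Fellow, pick the non-awarded applicant in the same
    application year and field whose quality score is closest."""
    fellows = applicants[applicants["awarded"]]
    pool = applicants[~applicants["awarded"]].copy()
    matches = []
    for _, fellow in fellows.iterrows():
        candidates = pool[(pool["year"] == fellow["year"]) &
                          (pool["field"] == fellow["field"])]
        if candidates.empty:
            continue  # no comparable non-awardee; this Fellow is left unmatched
        best = (candidates["quality_score"] - fellow["quality_score"]).abs().idxmin()
        matches.append(pool.loc[best])
        pool = pool.drop(index=best)  # match without replacement
    return pd.DataFrame(matches)

A propensity-score approach over the same covariates could serve the same purpose; the sketch is only meant to show the shape of the matching step, not the estimator the study will use.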
Table C.2. Research Question 2: Data Sources and Analysis
RQ 2: What is the impact of the GRFP fellowship on career outcomes?
Data Sources:
1. Survey (Former): Section III (E): Job History; Section III (E): Professional Productivity Since Leaving Graduate School
Analysis: This question is addressed by comparing the career outcomes (for example, in terms of academic and non-academic career choices, science and engineering careers versus careers in other fields, job characteristics, and professional productivity) of Fellows with those of a matched comparison group of similar but non-awarded GRFP applicants, and other national populations of doctoral students.
Table C.3. Research Question 3: Data Sources and Analysis
RQ 3: What are the effects of the GRFP on institutions?
Data Sources:
1. Interviews (Site Visits): Senior University Administrators; Faculty; Department/Graduate Studies Staff
2. Survey (Current); Survey (Former): Section I (A): GRFP Award Status; Section II (D): Professional Productivity and Financial Support During Graduate School
Analysis: Possible effects of the GRFP on graduate institutions are assessed through a series of interviews focusing on financial aspects, including adequacy of the cost-of-education allowance and the ability to free up resources to provide funding to other students; the extent to which Fellows participate in departmental teaching and research ("service to the department"); effects on student diversity and (to the extent feasible) student quality; and effects, if any, on scholarly productivity and research.
Table C.4. Research Question 4: Data Sources and Analysis
RQ 4: Is the program design effective in meeting program goals?
Data Sources:
1. Survey (Current); Survey (Former): Section I (A): GRFP Award Status; Section II (D): Professional Productivity and Financial Support During Graduate School
2. Survey (Former): Section III (E): Job History; Section III (E): Professional Productivity Since Leaving Graduate School (types of professional services undertaken)
3. Interviews (Phone): How particular GRFP policies are working and the extent to which they could be improved, drawing on respondents' experiences with these policies as well as their opinions, suggestions for improvement, and ideas
4. Interviews (Site Visits): Senior University Administrators; Faculty; Department/Graduate Studies Staff
Analysis: This question is addressed by asking (a) Fellows about the impact of the GRFP on the decision to go to graduate school in a STEM field, the impact of particular program elements on choice, flexibility, and the ability to fund and complete their graduate programs, and future employment and professional productivity in STEM fields; (b) students who declined the Fellowship about the role that particular requirements played in the decision; (c) institutional administrators and faculty about whether the program could be improved; and (d) program officers managing similar federal fellowship programs about what they have learned from their programs regarding implementation and promising practices.
1 The annual number of fellowships awarded increased from approximately 1,000 to 2,000 in 2010.
2 The quality groupings (QG) refer to the categories assigned to each GRFP participant upon applying to the fellowship program. QG1 is the highest ranking an applicant can receive. The study sample includes the highest two categories: QG1 and QG2 applicants.
3 For a full list of NSF-supported fields of study, see http://www.nsf.gov/pubs/2010/nsf10604/nsf10604.htm#appendix
4 The number of annual fellowships awarded increased from 1,000 to 2,000 in 2010.
5 Harmon, L. R. (1977). Career achievements of NSF graduate fellows: The awardees of 1952-1972. Washington, D.C.: Commission on Human Resources, NRC; Baker, J. (1994). Career paths of the National Science Foundation graduate fellows of 1972-1981. Washington, D.C.: Office of Scientific and Engineering Personnel (OSEP), National Research Council (NRC); Baker, J. (1995). Minority science paths: National Science Foundation Minority Graduate Fellows of 1979-1981. Washington, D.C.: OSEP, NRC; Snyder, J. (1988). Early career achievements of National Science Foundation graduate fellows, 1967-1976. Washington, D.C.: OSEP, NRC.
6 Goldsmith, S.S., J.B. Presley, & E.A. Cooley (2002). National Science Foundation Graduate Research Fellowship Program final evaluation report. Arlington, VA: National Science Foundation.
7 Wendler, C., Bridgeman, B., Cline, F., Millett, C., Rock, J., Bell, N., and McAllister, P. (2010). The path forward: The future of graduate education in the United States. Princeton, NJ: Educational Testing Service.
8 The quality groupings (QG) refer to the categories assigned to each GRFP participant upon applying to the fellowship program. QG1 is the highest ranking an applicant can receive. The study sample includes the highest two categories: QG1 and QG2 applicants.
9 Cook, T.D. and Campbell, D.T. (1979). Quasi-Experimentation: Design and Analysis for Field Settings. Rand McNally, Chicago, Illinois.
10 http://www.nsfgrfp.org/how_to_apply/eligibility_guide; http://www.nsf.gov/pubs/2010/nsf10604/nsf10604.pdf
11 For NSF-supported fields of study, see http://www.nsf.gov/pubs/2010/nsf10604/nsf10604.pdf
12 http://www.nsf.gov/pubs/2010/nsf10604/nsf10604.pdf
13 See Table C.1 for additional details on data sources per specific analysis.
14 See Goldsmith, et al. (2002), Table G14, p.141. http://www.nsf.gov/pubs/2002/nsf02080/nsf02080.pdf.
15 See Goldsmith, et al. (2002), Table G9, p.136. http://www.nsf.gov/pubs/2002/nsf02080/nsf02080.pdf
16 Hoffer, T.B., V. Welch, Jr., K. Webber, K. Williams, B. Lisek, M. Hess, D. Loew, & I. Guzman-Barron (2006). Doctorate Recipients from United States Universities: Summary Report 2005. Chicago: National Opinion Research Center.
17 The 2006 SDR includes sample members who received their doctorates through AY 2005. The SDR 2008, sampling doctorate cohorts up to AY 2007, was launched in the fall of 2008. If the SDR 2008 data are available in time for the GRF Survey comparison study, we propose using 2008 SDR instead of the 2006 SDR.
18 http://www.gale.cengage.com/pdf/facts/NFDonGDL.pdf
19 Cochran, William G. (1977). Sampling techniques (Third ed.). NY: Wiley.