Attachment A:
Data Collection Instrument
[Survey Introduction]
Thank you for your participation. This survey is designed to help NSF understand the factors that affect researchers as they submit proposals to or review proposals for NSF, and the impact of various approaches to proposal review. Your responses will help NSF to improve its service to the community of proposers and reviewers.
All of the results will be reported in such a way that no single individual can be identified. Your answers will be used only for research and evaluation of the NSF proposal process. Your response is voluntary and you may skip any answer you do not wish to answer. Deciding not to take part in the survey will not adversely affect consideration of your pending or future proposals. A summary of the results of this survey will be posted on NSF's web site after the survey's conclusion.
This survey should take approximately 20 minutes to complete. We recommend that you do not try to complete this survey on a mobile phone. The survey should be straightforward on a tablet, laptop or desktop.
If you should encounter a “security certificate” error, or have any other difficulty taking this survey, please try accessing the survey on a home or public network. If the problem persists, please contact surveyhelp@insightpolicysurvey.com for assistance.
Paperwork Burden Statement
This information is collected under the authority of the National Science Foundation Act of 1950, as amended. According to the Paperwork Reduction Act of 1995, no persons are required to respond to a collection of information unless it displays a valid OMB control number. The valid OMB control number for this information collection is 3145-0215. The time required to complete this voluntary information collection is estimated to average 30 minutes, including the time to review instructions, search existing data resources, gather the data needed, and complete and review responses. If you have any comments or concerns about the contents or the status of your individual submission of this questionnaire, e-mail Suzanne Plimpton at splimpto@nsf.gov.
This survey consists of two sections. The first asks about your experiences as someone who has reviewed proposals for NSF (if applicable) and the second asks about your experiences as someone who has submitted proposals to NSF (if applicable). Someone who submits a proposal to NSF, a proposer, is also called a Principal Investigator (PI).
Some questions ask what sort of changes you have seen between the periods before and after October 1, 2013 (roughly 3.5 years ago). This date will be referenced throughout this survey.
For the purpose of this survey, please do not count post-doctoral fellowship applications or student fellowship applications as proposals. For example, if you have only submitted a student fellowship application to NSF, you would select ‘No’ as the answer to Question 1A; if you have only reviewed graduate research fellowship applications for NSF, you would select ‘No’ as the answer to Question 1B.
Q1A. *(MASTER FILTER A): Since October 1, 2013, have you reviewed a proposal for NSF, other than a post-doctoral or student fellowship application?
1 Yes
0 No
Q1B. *(MASTER FILTER B): Since October 1, 2013, have you submitted a proposal to NSF, other than a post-doctoral or student fellowship application? (Do not include your experience as a co-investigator.)
1 Yes
0 No
[If answers to both Q1A and Q1B are ‘No’, apologize for sending the survey in error and exit.]
Q2A-F. Since October 1, 2013, with which NSF Directorate(s) and Division(s) have your scholarly activities been most closely affiliated? (Note: If your work aligns with more than one, select up to three Directorate/Division combinations in the drop-down menus below.)
Q2A Directorate 1
Q2B Division 1
Q2C Directorate 2
Q2D Division 2
Q2E Directorate 3
Q2F Division 3
Drop-down list of NSF Directorates:
BIO = Biological Sciences
CISE = Computer & Information Science & Engineering
EHR = Education & Human Resources
ENG = Engineering
GEO = Geosciences
MPS = Mathematical & Physical Sciences
SBE = Social, Behavioral & Economic Sciences
-8 = Skip
-9 = Missing
Drop-down list of NSF Divisions:
DBI = Biological Infrastructure
DEB = Environmental Biology
IOS = Integrative Organismal Systems
MCB = Molecular & Cellular Biosciences
ACI = Advanced Cyberinfrastructure (Division or Office)
CNS = Computer & Network Systems
CCF = Computing & Communication Foundations
IIS = Information & Intelligent Systems
DGE = Graduate Education
HRD = Human Resource Development
DRL = Research on Learning in Formal & Informal Settings
DUE = Undergraduate Education
CBET = Chemical, Bioengineering, Environmental, and Transport Systems
CMMI = Civil, Mechanical & Manufacturing Innovation
ECCS = Electrical, Communications & Cyber Systems
EEC = Engineering Education & Centers
IIP = Industrial Innovation & Partnerships
AGS = Atmospheric & Geospace Sciences
EAR = Earth Sciences
OCE = Ocean Sciences
PLR = Polar Programs
AST = Astronomical Sciences
CHE = Chemistry
DMR = Materials Research
DMS = Mathematical Sciences
PHY = Physics
BCS = Behavioral & Cognitive Sciences
SES = Social & Economic Sciences
-8 = Skip
-9 = Missing
If “No” selected for Q1A, and “Yes” for Q1B, skip to 37. [I.e. jump to questions for investigators.]
If “Yes” selected for Q1A, continue to 3.
EXPERIENCES AS A REVIEWER
[Visible only if answered ‘Yes’ to question 1A]
The following questions ask about your experiences reviewing proposals. For these questions, please use the definitions below.
There are two types of reviewers:
An ad hoc reviewer is someone who submits a written review of a proposal but does not participate in a discussion of the proposal with other reviewers.
A panelist, or panel reviewer, is someone who participates in a discussion of a proposal (usually more than one proposal) with other reviewers. A panelist may or may not prepare a written review.
There are two types of panelists:
A face-to-face panelist is someone who gathers with other reviewers at a common location (often NSF) to discuss proposals.
A remote panelist is someone who participates in the panel discussion via telephone, video-conference, web-based virtual meeting technology, or similar.
[REVIEWER WORKLOAD]
Q3A. Approximately how many reviews of individual proposals have you written for NSF since October 1, 2013, regardless of whether as an ad hoc reviewer or a panelist? (Your best estimate is fine.) [text box]
Q3B. Approximately how many reviews of individual proposals or applications have you written for other organizations since October 1, 2013? (Your best estimate is fine.) [text box]
Q3C. How many reviews would you be willing to undertake in an average year for NSF?
Q3C1 As an ad hoc reviewer [text box]
Q3C2 As a panelist [text box]
*During the past 12 months, when asked, did you ever decline to…
Yes 1 |
No 0 |
Q4A Serve as an ad hoc reviewer for NSF?
Q4B Serve as a face-to-face panelist on an NSF review panel?
Q4C Serve as a remote panelist on an NSF review panel?
[Show 5 if “yes” to any option in Q4]
Thinking about the most recent time you declined to participate in a review, to what extent did the following factors influence your decision?
To a Great Extent 3 |
To a Moderate Extent 2 |
To a Small Extent 1 |
To No Extent 0 |
Q5A. Proposal or program was not related to my professional interests
Q5B. Lack of time
Q5C. Conflict of interest
Q5D. Too many NSF review requests
Q5E. Competing professional pressures (including teaching, administrative or organizational service, etc.)
Q5F. Dissatisfaction with the proposal review process
Q5G. Increasing commitments as a reviewer to other funding agencies
Q5H. [Visible only if Q4B is selected] Unable to travel to a face-to-face panel
Q5I. [Visible only if Q4B is selected] Unwilling to travel to a face-to-face panel
Q5J. [Visible only if Q4C is selected] Dislike participating in discussions over phone, video-conference, or web-based meeting technology
Q5K1. (text box) Other factor? If so, please describe the factor:
Q6. Thinking about the most recent time you wrote a review of an NSF proposal, please estimate the amount of time (rounded to the nearest hour) it took you to read the proposal and to write and submit that single written review. Please do not count time spent traveling to or sitting in panels.
(Please enter a whole number in the box below.) [text box]
Q7. When do you typically read proposals and write reviews of NSF proposals?
1 During your normal work-day
2 Mainly outside of your normal working hours
3 Both during the work-day and outside your normal working hours
Q8. How does your employer view your participation as a reviewer (for NSF or other agencies)?
1. My employer considers my participation as a reviewer to fall within the scope of my normal work duties.
2. My employer considers my participation as a reviewer to fall outside the scope of my normal work duties.
3. I am unsure whether my employer considers my participation as a reviewer to fall within or outside the scope of my normal work duties.
[REVIEW QUALITY]
Q9. Which of the following best applies to you?
1. I have reviewed proposals for NSF only before October 1, 2013. Skip to 12
2. I have reviewed proposals for NSF both before and after October 1, 2013.
3. I have reviewed proposals for NSF only since October 1, 2013. Skip to 12
NSF made some changes to its proposal submission and review process in recent years. The next few questions ask whether you have changed the way you approach reviews and whether you have noticed a change in the nature of the proposals you have reviewed since October 1, 2013.
How have the following changed from before October 1, 2013 to the present?
Greatly Increased Compared to Those I reviewed Before Oct 1, 2013 5 |
Somewhat Increased Compared to Those I reviewed Before Oct 1, 2013 4 |
Stayed the Same Compared to Those I reviewed Before Oct 1, 2013 3 |
Somewhat Decreased Compared to Those I reviewed Before Oct 1, 2013 2 |
Greatly Decreased Compared to Those I reviewed Before Oct 1, 2013 1 |
Q10A. The time you are able to devote to each review
Q10B. The thoroughness you bring to each review
Q10C. The overall quality of proposals
NSF would like to gain a better understanding of how reviewers weigh different factors in forming their assessment of a proposal’s merit.
When you form a judgment of the intellectual merit of a research proposal, please indicate the relative weight you give to each of the following factors:
Very High 5 |
High 4 |
Medium 3 |
Low 2 |
Very Low 1 |
Q11A. Originality of the research question
Q11B. The project’s potential to change our understanding of an important existing scientific or engineering concept
Q11C. The extent to which the research may open a new field in science or engineering
Q11D. The extent to which the research challenges current understanding
Q11E. The appropriateness of the proposed methodology
Q11F. Qualifications of the principal investigator and any co-investigators to implement the research plan
Q11G. Adequacy of the budget
Q11H. Presence of a mechanism to assess the project's progress
Q11I. The likelihood that the proposed project will be completed successfully
Q11J. The quality of the data management plan
When you form a judgment of the likely broader impacts of a research proposal, please indicate the relative weight you give to each of the following factors:
Very High 5 |
High 4 |
Medium 3 |
Low 2 |
Very Low 1 |
Q12A. Originality of the character of the broader impacts
Q12B. The significance of the potential broader impacts
Q12C. The clarity and detail with which the proposal explains its broader impacts
Q12D. Integration of research and education within the project
Q12E. The project’s potential contribution to broadening participation in research
Q12F. The project’s potential contribution to enhancing local, regional or national infrastructure to support future research
Q12G. Plans for disseminating the results of the proposed research
Q12H. Past record of the principal investigator and co-investigators (if any)
Q12I. Adequacy of the budget
Q12J. The quality of the data management plan
Sometimes, research proposals include specific education, outreach or broadening participation components. In such cases, please indicate the relative weight you give to each of the following factors:
Very High 5 |
High 4 |
Medium 3 |
Low 2 |
Very Low 1 |
Q13A. The significance of the potential impacts of these specific components
Q13B. The extent to which these specific components use evidence-based practices
Q13C. The presence of a mechanism to assess the impacts of these specific components
Q13D. The qualifications of the principal investigator and any co-investigators to implement the specific education, outreach or broadening participation components
[VIRTUAL PANEL]
NSF is interested in learning whether you have ever participated in an NSF review panel that was wholly virtual. NSF holds three types of review panels:
Face-to-face panels. All panelists gather at the same location to discuss proposals.
Wholly virtual panels. All panelists participate remotely.
Hybrid panels. Some panelists gather at a common location and others “join” remotely.
Q14. *[WHOLLY VIRTUAL PANEL FILTER] Have you ever participated in a wholly virtual NSF proposal review panel?
1 Yes, once only Continue to 15
2 Yes, more than once Continue to 15
0 No Skip to 19
NSF would like to learn more about your experiences participating in wholly virtual proposal review panels, referred to hereafter as virtual panels.
Which of the following technologies have you used in NSF virtual panels? (Check all that apply)
Q15A = 1 Teleconferencing
Q15B = 1 Web-based virtual meeting software (e.g. WebEx, BlueJeans)
Q15C = 1 Video-conferencing, whether web-based or otherwise (e.g. Skype, iChat)
Q15D = 1 Virtual worlds (e.g., Second Life)
Q16. *For NSF, I have served as a reviewer:
1 Only on virtual panels Skip to 19
2 In both virtual panels and face-to-face panels Continue
Compare your experience as a virtual panelist to your experience as a face-to-face panelist on the following dimensions.
Significantly Better in Virtual Panel format 1 |
Somewhat Better in Virtual Panel format 2 |
About the same 3 |
Somewhat Better in Face-to-Face Panel format 4 |
Significantly Better in Face-to-Face Panel format 5 |
Q17A. Quality of panel briefing/training
Q17B. Quality of group discussions
Q17C. Quality of the panel summaries
Compare your experience as a virtual panelist to your experience as a face-to-face panelist on the following dimensions.
Significantly More in Virtual Panel format 1 |
Somewhat More in Virtual Panel format 2 |
About the same 3 |
Somewhat More in Face-to-Face Panel format 4 |
Significantly More in Face-to-Face Panel format 5 |
Q18A. Time spent on preparing reviews
Q18B. Time spent preparing for panel
Q18C. Overall time commitment
Q18D. Average amount of time spent discussing each proposal
Q18E. Number of proposals discussed by the panel
Q18F. Overall satisfaction
Q19. *Have you declined to participate in a face-to-face panel?
1 Yes
0 No
Which of the following were factors in your (most recent) decision to decline to participate in a face-to-face panel? (Select all that apply)
Q20A= 1 Scheduling time away from my research and/or teaching commitments is too difficult
Q20B = 1 Scheduling time away has too much of an impact on my work/life balance
Q20C = 1 I prefer interacting with other co-panelists in a virtual capacity
Q20D = 1 I was otherwise unable to travel
Q20E = 1 I was otherwise unwilling to travel
Q20F1 = 1 Other (Q20F2 please describe in 10 words or less):
Q21. Based on your experience reviewing proposals for NSF, to what extent do you agree or disagree with the following statement?
Strongly Disagree 1 |
Disagree 2 |
Agree 3 |
Strongly Agree 4 |
Not Applicable 0 |
Q21A. Overall, the majority of proposals I have reviewed in recent years have been of high quality
[REVIEWER ORIENTATION]
Q22. *[REVIEWER ORIENTATION FILTER] NSF recently began offering reviewer orientation sessions that are conducted using a web-meeting format, including a 20-minute video with hints about how to prepare a high-quality review. Have you ever participated in one of these reviewer orientation sessions?
1 Yes
0 No Skip to 37
2 Unsure Skip to 37
Q23. The video in the reviewer orientation included three segments:
hints for how to prepare an analytical review,
a description of the merit review criteria that included NSF guidance on the broader impact criterion, and
information about strategies to mitigate the effects of unconscious cognitive biases.
Please indicate the degree to which you found the information in these segments to be helpful:
Very Helpful 4 |
Moderately Helpful 3 |
Slightly Helpful 2 |
Not Helpful 1 |
Do Not Recall 0 |
Q23A. Hints on how to prepare an analytical review
Q23B. Guidance to reviewers on the broader impact criterion
Q23C. Information about strategies to mitigate the effects of unconscious cognitive biases
Q24. Did you find the orientation helpful when you prepared your reviews?
1 Yes
0 No
Q25. Do you now recall any of the hints provided in the video?
1 Yes
0 No
EXPERIENCES AS A PRINCIPAL INVESTIGATOR
[Visible only if answered ‘Yes’ to question 1B]
ENTRY POINT FOR Q26: If “No” is selected for 1B, skip to 39.
NSF is interested in the factors that influence your decision to seek funding from NSF compared to other sources. For the purposes of this survey, please answer the following questions based on your experience as a principal investigator (PI), not on any experience that you may have had as a co-investigator. Please think only of the proposals you have submitted to NSF since October 1, 2013.
[DEMAND MANAGEMENT]
Q26. Beyond the goal of making contributions to your area of science, to what extent do the following factors motivate you to submit research proposals to any funding source?
To a Great Extent 3 |
To a Moderate Extent 2 |
To a Small Extent 1 |
To No Extent 0 |
Q26A. Building/maintaining a record of submitting proposals for academic tenure and/or promotion
Q26B. Contributing to my employing organization's research status/reputation
Q26C. Securing funding to pay for my own salary
Q26D. Supplementing my salary
Q26E. Being able to continue to pay the salaries of staff (non-students) who currently work with me in a professional capacity (e.g. post-doctoral associates, technicians, lab managers)
Q26F. Being able to continue to pay the stipends of students (graduate or undergraduate) who currently work with me
Q26G. Being able to involve students (graduate, undergraduate or high school) in research
Q26H. Paying for the acquisition, development, maintenance, or operation of laboratory equipment and/or instrumentation
Q26I. Funding travel to conferences
Again, thinking only of the proposals you have submitted to NSF since October 1, 2013, to what extent did the following factors influence your decision to submit a proposal?
To a Great Extent 3 |
To a Moderate Extent 2 |
To a Small Extent 1 |
To No Extent 0 |
Q27A. Decreased funding available from other sources
Q27B. Better chance of funding at NSF than other agencies
Q27C. Need to submit proposals for tenure and/or promotion
Q27D. Need to obtain grants for tenure and/or promotion
Q27E. Need to build and maintain research facilities, centers or programs
Q27F. NSF is the major source of funding for my area of research
Q27G. The NSF budget in my area of research has increased
Q27H. Interesting and relevant new funding opportunities
Q27I. Opportunities for funding inter-, cross-, or multidisciplinary research
Q27J. Opportunities for funding collaborative research
Q27K. Encouragement from NSF staff
Q28. Over the next 5 years, I view NSF as the primary source of potential funding for the following percentage of my research:
1 10% or Less
2 11-25%
3 26-50%
4 51-75%
5 76-100%
Q29. In general, after how many declines of a proposed project would you...
Q29A. Stop submitting the project to any agency?
1 1 decline
2 2-3 declines
3 4-6 declines
4 7 or more declines
Q29B. Stop submitting the project anywhere within NSF?
1 1 decline
2 2-3 declines
3 4-6 declines
4 7 or more declines
Q29C. Stop submitting the project to a particular NSF program?
1 1 decline
2 2-3 declines
3 4-6 declines
4 7 or more declines
Q30A. *Have you ever submitted a proposal to NSF that was declined?
1 Yes
0 No Skip to 31
Q30B. To what extent did the written reviews that accompanied the declination of one of your NSF proposals:
To a Great Extent 3 |
To a Moderate Extent 2 |
To a Small Extent 1 |
To No Extent 0 |
Q30B1. Improve your understanding of the proposal process?
Q30B2. Provide useful information for revising and improving your next proposal?
Q30B3. Influence you to submit to another funding agency?
Q30B4. Discourage you from revising and submitting your proposals to NSF?
Q31. *When did you first begin submitting proposals to NSF?
1 After October 1, 2013 Skip to 33
2 Before October 1, 2013
The next few questions ask about whether you have noticed changes in the nature of reviews you have received on proposals submitted in recent years.
Q32. Thinking back to funding decisions and reviews you received prior to October 1, 2013 and those you received after that date (between October 1, 2013 and the present), how much, if any, have the following changed?
Greatly Increased Compared to Those Received Before October 1, 2013 5 |
Somewhat Increased Compared to Those Received Before October 1, 2013 4 |
Stayed the same Compared to Those Received Before October 1, 2013 3 |
Somewhat Decreased Compared to Those Received Before October 1, 2013 2 |
Greatly Decreased Compared to Those Received Before October 1, 2013 1 |
Q32A. The overall quality of feedback in the written reviews of your proposals
Q32B. The overall quality of feedback from NSF staff about your proposals
Q32C. The timeliness of the decision to award or decline funding
Q32D. The timeliness of responses by NSF staff to your inquiries
Q32E. The quality of your interaction with NSF staff
Q33. For the NSF program to which you most frequently submit proposals, how many submission deadlines does the program have per year?
1 No submission deadlines or target dates (i.e., accepts proposals at any time).
2 One submission deadline or target date each year.
3 Two or more submission deadlines or target dates each year.
4 There is no single NSF program to which I most frequently submit proposals.
Q34. Since October 1, 2013, I have submitted ____ proposals to NSF. (Note: Please enter a whole number in the box below.) [textbox]
[PI SATISFACTION]
For the following questions, please refer to the most recent proposal that you submitted to NSF since October 1, 2013 for which you have received an award or decline decision.
Q35. How satisfied or dissatisfied were you with...
Very Satisfied 5 |
Somewhat Satisfied 4 |
Neither Dissatisfied nor Satisfied 3 |
Somewhat Dissatisfied 2 |
Very Dissatisfied 1 |
Q35A. The quality of the information NSF provided during the proposal submission process (e.g., FastLane, FAQs, web site content)
Q35B. The timeliness of the decision to award or decline funding
Q35C. Your interaction with NSF staff
Q35D. The overall quality of NSF’s merit review process
[PI WORKLOAD]
Q36. Compared to other federal agencies' proposal submission systems, how much effort does it take a researcher to prepare a proposal in the required format and submit it to NSF?
3 More Effort
2 Nearly the Same Effort
1 Less Effort
0 Not applicable because I have not submitted proposals to other agencies
Q37. Thinking about the most recent full proposal you submitted to NSF, how much of your own time did you spend preparing (writing, formatting and submitting) the proposal?
1 40 hours or less
2 41 - 80 hours
3 81 - 120 hours
4 121 - 160 hours
5 161 - 200 hours
6 More than 200 hours
[REVIEW QUALITY]
Q38. Based on your experience submitting proposals to NSF, to what extent do you agree or disagree with the following statements?
Strongly Disagree 1 |
Disagree 2 |
Agree 3 |
Strongly Agree 4 |
Not Applicable 0 |
Q38A. Researchers submitting proposals are treated fairly
Q38B. Written reviews are thorough
Q38C. Written reviews are technically sound
Q38D. Overall, written reviews are of high quality
Q38E. The panel summary or summaries are of high quality
Q38F. The information provided regarding the outcomes of the competition is of high quality
Q38G. The PO Comments I viewed in FastLane helped me understand the decision to decline or award my proposal
Q38H. The conversations (email, phone, face-to-face) I had with my program officer provided me with helpful feedback about my proposal
[ALL RESPONDENTS]
Q39. Please indicate whether you agree or disagree with the following statement:
Strongly Disagree 1 |
Disagree 2 |
Neither Agree nor Disagree 3 |
Agree 4 |
Strongly Agree 5 |
Q39A. Overall, I am satisfied with NSF’s merit review process
Q40. This survey has asked about your experiences with NSF’s merit review process. In your opinion, improving which one of the following factors in that process will have the most significant effect in fostering the progress of science? Please select one.
1 Timeliness of decisions about, and responsiveness to, proposals by NSF staff
2 Quality of feedback to PIs in the form of comments in written reviews
3 Quality of feedback to PIs in the form of comments in panel summaries
4 Quality of PI conversations with, and written comments from, program officers
5 Quality of information available during proposal submission
6 Quality of the review process from the perspective of a reviewer
Q41. Please enter any additional comments you may have about NSF’s merit review process in the space below: ____
Q42. NSF intends to conduct a survey on the Intergovernmental Personnel Act (IPA) program, under which scientists and other professionals “rotate” through the Foundation for a period of up to four years. Your participation in that survey would provide important feedback for the Foundation on how to conduct the program. Please indicate below whether you would be willing to participate in a brief anonymous survey about this program.
Q42A. I am willing to respond to a brief anonymous survey regarding the IPA program and can be reached at the following email address: ____
Q42B. I do not wish to be contacted for the survey on the IPA program.