Supporting Statement Part B: Collection of Information Employing Statistical Methods
B.1. Respondent Universe and Selection Methods
Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The statistical method employed in the NRT monitoring task collection is a census that includes all active NRT awards. Because this collection will primarily be used for accountability and program management purposes, including responding to queries from Committees of Visitors1 (COVs) and other scientific experts, a census rather than a sampling design is necessary. This project has five respondent types (Principal Investigators (PIs), Co-PIs, Project Coordinators (PCs), Faculty, and Trainees) and comprises five survey instruments (the Project, PI, Co-PI, Faculty, and Trainee surveys). Table B.1.1 below shows the surveys and the respondents charged with completing each.
Table B.1.1 NRT Instrument Surveys and Respondents Responsible for Survey Completion

| Survey         | PI | Co-PI | PC | Faculty | Trainee |
|----------------|----|-------|----|---------|---------|
| Project Survey | X  | X*    | X  |         |         |
| PI Survey      | X  |       |    |         |         |
| Co-PI Survey   |    | X     |    |         |         |
| Faculty Survey |    |       |    | X       |         |
| Trainee Survey |    |       |    |         | X       |

*The PI and PC can both respond to the Project Survey. In addition, the PI can grant permission to a Co-PI to access and complete the Project Survey if they wish. Only the PI can submit the completed report to NSF for review.
Data collection for the project involves all PIs, Co-PIs, PCs, Faculty, and Trainees in the NRT program. The universe of respondents is estimated at 2,880, based on 24 respondents per award across 120 total awards. A typical award comprises one PI, two Co-PIs, one PC, ten Faculty, and ten Trainees. Note that the PC is tasked with assisting the PI in completing the Project Survey; PCs are not asked to complete a separate PC survey, as the only information collected about PCs is their contact details, which are entered by the PI and confirmed by the PC.
The universe of respondents is expected to remain stable over time as new awards are added and other awards conclude.
B.2. Procedures for the Collection of Information
The NRT data collection employs web-based instruments, and respondents (PIs, Co-PIs, PCs, Faculty, and Trainees) enter data annually to satisfy their usual Research Performance Progress Report (RPPR) requirements. To ensure that data are entered in a timely manner, respondents will receive an initial email from the contractor with the link to their specific survey(s), directions for setting up their login credentials, and a note encouraging them to log in to the monitoring system and begin entering data. Requirements for login, data entry, and timelines will also be shared at annual program and new PI orientation meetings and during virtual training sessions held at the beginning of each collection cycle.
B.2.1. Statistical Methodology for Stratification and Sample Selection
Not applicable.
B.2.2. Estimation Procedures
Not applicable.
B.2.3. Degree of Accuracy Needed
Not applicable.
B.2.4. Unusual Problems Requiring Specialized Sampling Procedures
Not applicable.
B.2.5. Any Use of Periodic (Less Frequent than Annual) Data Collection Cycles to Reduce Burden
Not applicable.
B.3. Methods to Maximize Response Rates and the Issue of Nonresponse
Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
The data collection methodology described above incorporates strategies intended to maximize response rates for the target population of respondents. The expected response rate for the NRT program’s Project Survey is 100%, as data reporting for the NRT program will satisfy awardees’ usual RPPR requirements. The following strategies will be used to help achieve the expected response rate:
Requiring completion of all survey instruments prior to submission of the Project Survey: As noted above, the survey’s completion is required to satisfy the awardee’s RPPR requirements with NSF. In accordance with NSF policy, failure to submit reports on time will delay NSF processing of pending proposals for all identified PIs and Co-PIs on a given award2. PIs are unable to submit the Project Survey until all associated survey instruments (e.g., the Trainee, Faculty, and Co-PI surveys) have been completed. Requests to override this requirement must be made, in writing, to the cognizant Program Officer assigned to the award, and decisions to waive the survey completion requirement for certain individuals are made on a case-by-case basis. Waivers for individual Faculty or Trainees are typically approved because the individual has been unresponsive over a long period despite repeated outreach by the PI, PC, and/or contractor; these cases occur when the individual has left the institution due to graduation or a job change, or is deceased.
Convenience of web survey instrument: A web-based data collection system was selected to reduce respondent burden and improve overall response. The NRT program monitoring system of online surveys allows respondents to complete the surveys at their convenience, within the annual window between the system’s opening and close. To further reduce burden, the system will carry over any data that can be pre-populated after the initial award year, and skip patterns automatically route respondents to the questions that apply to them and their experiences. Respondents can save and resume the survey as needed within the allowed timeframe.
Targeted outreach to non-respondents: While PIs and PCs are responsible for communicating submission requirements to the individuals involved in the project, the system has built-in features to encourage response and minimize burden on PIs and PCs. For example, PIs and PCs have access to status information via the web-based system indicating whether the individual respondents on their projects have completed their data entry. Respondents also receive auto-generated emails when a PI or PC adds them to, or updates them in, the system. Based on user feedback, an email feature was added to the system so that PIs and PCs can send friendly reminders to non-respondent Faculty and Trainees. In addition, NRT staff have access to online monitoring tools to check the status of reporting by award. The contractor sends targeted email messages to follow up with PI and PC respondents to ensure that all necessary data are reported in a timely manner, and also employs a system “broadcast” to draw attention to deadlines, events, and new or updated information. The contractor will implement the outreach plan shown in Table B.3.1:
Table B.3.1 NRT Targeted Outreach Timeline

| Notification Timeline | Notification Purpose |
|---|---|
| Initial survey invitation email | Upon collection period launch, all PIs will receive an email invitation from the contractor containing a unique link to the web survey, details on creating their login credentials, and a request that they begin the survey. PCs will also be notified via communication with the NRT PC Network. |
| Follow-up email | At the end of the first month, the contractor will send emails to PIs who have not yet logged in and PIs who have logged in but have not made any progress with data entry. |
| Phone call | If a PI has not responded to contractor emails after the one-month mark, the contractor will reach out to that PI via phone call, while continuing to send reminder emails. |
| NSF involvement | If a PI has not responded to any emails or phone calls by one month prior to the award’s reporting deadline, the contractor will escalate to NSF NRT Program Staff. |
| One month prior to deadline | The contractor will send a reminder email about the approaching deadline to PIs who have not submitted the survey. |
| One week prior to deadline | The contractor will send a deadline reminder email to PIs who have not submitted the survey. |
| One day prior to deadline | The contractor will send a final deadline reminder email to PIs who have not submitted the survey. |
| One day post deadline | The contractor will send an email informing the PI of the missed deadline and asking them to respond with their immediate plans to complete and submit the survey. The contractor will inform NSF NRT Program Staff of the PIs who did not submit their report by their reporting deadline. |
Survey support (for technical issues and survey content questions): A helpdesk/technical support email and phone number will be provided to respondents on the survey and in all outreach emails, so that they may contact a representative with questions about the survey or with troubleshooting issues such as login requirements. The contractor will aim to respond to all initial questions within four hours of receipt and to resolve those issues within 48 hours of receipt. Both NSF and the contractor expect that most issues will be resolved on the same day they are received; thus far, as the majority of requests relate to password creation/reset or clarifying questions, 95% of requests have been resolved on the same day when received during technical support hours. Individuals who submit requests can rate their experience, and the contractor currently holds a 4.9/5-star rating across both collection years. The contractor staff uses a ticket management system to log all requests and to track completed, partially completed, and nonresponse tickets throughout the data collection period; the contractor uses the system daily to manage open tickets through to resolution. The ticket management system is also available to the Contracting Officer Representative (COR), and the contractor submits monthly status reports to the COR covering the number of open tickets, the number of tickets resolved, and the types of requests submitted (e.g., login issues, system bugs, clarifying questions). In addition, the contractor will hold monthly Office Hours while the system is open for data collection to provide ongoing support for respondents and to serve as another platform for asking questions and receiving clarification when needed, especially for first-time PIs and other respondents. During these virtual Office Hours, survey respondents may drop in and ask questions regarding both technical aspects and programmatic requirements of their surveys. The availability of Office Hours will be shared with the community through emails to the PIs, the NRT Coordinator Network, a monitoring system broadcast, and the NRT Knowledge Base, which is accessible to all respondents.
B.4. Tests of Procedures or Methods
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
Prior to the collections administered under OMB Control No. 3145-0263, the NRT data collection was pilot tested to obtain user feedback, ensuring that the questions could be answered in a reasonable amount of time (i.e., that the burden estimates are accurate) and that the directions and question content were easy to understand and follow. The pilot test involved nine individuals who were either PIs or PCs. Members of the contracting team and NSF moderated video “Think Aloud” exercises with two of the nine pilot testers. Data obtained from the pilot test revealed some areas needing clarification, which have since been addressed.
The system continues to be monitored and updated by the contractor throughout each data collection cycle, and respondents may provide comments and feedback entered directly in the web system. Any user input on the system will be shared with NSF, and any needed changes will be implemented as the system is upgraded in the future. Other opportunities to obtain feedback for improvement purposes include collecting feedback from PIs during meetings and conferences; comprehensive reviews by NSF staff; recommendations submitted through tech support requests; and ongoing testing performed by the system developers. In addition, the items and response categories used by this data collection follow formats that are already in use by other current and former NSF data collections, including ETAP (OMB Control No. 3145-0248), and are consistent with OMB standards as currently defined in the 1997 Statistical Policy Directive No. 153.
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
NSF has contracted with Creative Business Solutions, Inc. (CBS) to produce and conduct the data collection using the NRT monitoring system. Table B.5.1 identifies specific individuals who consulted on survey design and development and who will be responsible for collecting and analyzing the data. The COR, Elizabeth Webber, will be responsible for receiving and approving all contract deliverables. CBS’s project manager, Melissa Strickland, (see below for contact details) will work with the COR on any data analysis needs.
Table B.5.1 Individuals Responsible for Statistical Aspects, Data Collection, and Analysis

| Name | Title (Project Role) | Organizational Affiliation and Address | Phone Number |
|---|---|---|---|
| Elizabeth Webber | NSF Contracting Officer Representative | National Science Foundation, 2415 Eisenhower Ave., Alexandria, VA 22314 | 703-292-4316 |
| Daniel Denecke | NSF-NRT Program Director | National Science Foundation, 2415 Eisenhower Ave., Alexandria, VA 22314 | 703-292-8072 |
| Melissa Strickland | CBS-NRT Project Manager | Creative Business Solutions, Inc., 13003 Winding Creek Rd., Bowie, MD 20721 | 912-282-0842 |
| Jim Biancolo | CBS-NRT Engineer Lead | Creative Business Solutions, Inc., 13003 Winding Creek Rd., Bowie, MD 20721 | 413-822-5702 |
| Travis Myers | CBS-NRT Web Support | Creative Business Solutions, Inc., 13003 Winding Creek Rd., Bowie, MD 20721 | 765-318-1886 |
| Flos Mathavan | CBS-NRT Programmer | Creative Business Solutions, Inc., 13003 Winding Creek Rd., Bowie, MD 20721 | 860-416-3062 |
| Trevor English | CBS-NRT Technical Support | Creative Business Solutions, Inc., 13003 Winding Creek Rd., Bowie, MD 20721 | 240-743-7607 |
2 For more information, see the NSF Proposal and Award Policies and Procedures Guide (PAPPG) 23-1, Chapter VII.D.4, Compliance with Technical Reporting Requirements, available at https://new.nsf.gov/policies/pappg/23-1.
3 The 1997 Statistical Policy Directive No. 15: Standards for Maintaining, Collecting, and Presenting Federal Data on Race and Ethnicity is available at https://www.govinfo.gov/content/pkg/FR-1997-10-30/pdf/97-28653.pdf.