Survey of Science and Engineering Research Facilities

OMB: 3145-0101




Part B. Description of Statistical Methodology

B.1. Statistical Design and Estimation

B.1.1 Survey Population

For academic institutions, the Facilities survey is designed to provide national estimates for U.S. colleges and universities with research expenditures equal to or greater than $1 million in the prior academic fiscal year (i.e., FY 2006 for the FY 2007 cycle and FY 2008 for the FY 2009 cycle). The FY 2007 cycle will be a census of approximately 475 institutions. The listing of eligible institutions will be derived from the NSF Survey of Research and Development Expenditures at Universities and Colleges. No sampling will be performed.


For biomedical institutions, the Facilities survey is designed to provide national estimates for U.S. nonprofit biomedical research organizations and hospitals that received research awards equal to or greater than $1 million from NIH in the prior fiscal year. The FY 2007 cycle will be a census of approximately 190 institutions. The listing of eligible institutions will be derived from the administrative records of NIH. No sampling will be performed.
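
For illustration only, frame construction under these eligibility rules amounts to a simple dollar-threshold filter over administrative records. The sketch below assumes hypothetical record layouts and field names (e.g., rd_expenditures, nih_research_awards); it is not NSF's production system.

ELIGIBILITY_THRESHOLD = 1_000_000  # $1 million in the prior fiscal year

def build_academic_frame(expenditure_records):
    """Keep U.S. colleges and universities at or above the R&D
    expenditure threshold (frame source: the NSF expenditures survey).
    Field names are illustrative assumptions."""
    return [r for r in expenditure_records
            if r["rd_expenditures"] >= ELIGIBILITY_THRESHOLD]

def build_biomedical_frame(nih_award_records):
    """Keep nonprofit biomedical research organizations and hospitals
    at or above the NIH research award threshold (frame source: NIH
    administrative records). Field names are illustrative assumptions."""
    return [r for r in nih_award_records
            if r["nih_research_awards"] >= ELIGIBILITY_THRESHOLD]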


Because no sampling will be performed, sampling variances will not be calculated.


NSF is seeking a 94% response rate in this survey. The response rate on the FY 2005 survey was 94%.


B.1.2 Estimation Procedures

No sampling weights will be required because the survey is a census. However, adjustments will be performed for both unit nonresponse and item nonresponse, with the approach depending on the level of nonresponse and, for item nonresponse, on the characteristics of the particular item involved.


Adjustments for Unit Nonresponse

Since some nonresponse is likely, provisions will be made to compensate for the missing data in the survey estimates. Unit nonresponse (failure of an institution to respond to the survey as a whole) occurs when no information is available for a surveyed unit, most often because of refusal to participate in the survey or failure to contact the respondent.


In the FY 2003 and FY 2005 survey cycles, unit nonresponse for the research space section of the survey (Part 1) was handled by assigning weights to the participating institutions. The nonresponse weight was the ratio of the number of eligible institutions in the survey to the number of responding eligible institutions. The weights were computed separately for the academic and biomedical institutions.


The weights for the academic institutions were adjusted to the known number of academic institutions within cells defined by R&D expenditure category (quintiles of the distribution), census region, control (public/private), whether the institution was a historically black college or university, and whether the institution granted Ph.D. degrees. For the biomedical institutions, the only adjustment variables were the NIH grant dollar amounts (quintiles of the distribution) and census region. The minimum weights for both academic and biomedical institutions were constrained to be at least 1.0. NSF anticipates using a similar weighting approach for the research space section in the FY 2007 and FY 2009 cycles of the survey.
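
As a rough sketch of this weighting-class approach (illustrative only; the cell variables and field names are assumptions, and production steps such as collapsing sparse cells are not shown), the nonresponse adjustment could be computed as follows:

from collections import defaultdict

def nonresponse_weights(institutions, cell_keys):
    """Within each adjustment cell, weight = eligible count / responding
    count, constrained to be at least 1.0. Each institution is a dict
    carrying the cell variables plus a boolean "responded" field
    (illustrative assumptions)."""
    eligible = defaultdict(int)
    responding = defaultdict(int)
    for inst in institutions:
        cell = tuple(inst[k] for k in cell_keys)
        eligible[cell] += 1
        if inst["responded"]:
            responding[cell] += 1
    return {cell: max(1.0, eligible[cell] / responding[cell])
            for cell in eligible if responding[cell] > 0}

# Hypothetical cell variables mirroring the adjustments described above.
ACADEMIC_CELLS = ["exp_quintile", "region", "control", "hbcu", "phd_granting"]
BIOMEDICAL_CELLS = ["nih_quintile", "region"]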


The data in Part 2 of the survey (networking and computing) were not weighted in the FY 2003 and FY 2005 cycles because of the potential for measurement error within the survey responses. Substantially greater measurement error may exist in the Part 2 data because the majority of the Part 2 questions change with each survey cycle, reflecting extremely rapid developments in cyberinfrastructure. For example, approximately two-thirds of the Part 2 questions will be implemented for the first time in the FY 2007 survey cycle. In addition, extensive variability in the cyberinfrastructure environments and expertise at different institutions may lead to greater measurement error. NSF anticipates using a similar approach for the Part 2 data in the FY 2007 and FY 2009 survey cycles, and plans to consider weighting for unit nonresponse beginning with the FY 2011 cycle.


Adjustments for Item Nonresponse

Item nonresponse occurs when there is no information for a respondent on an individual item on the questionnaire, most often because of refusal to answer that item or the provision of an invalid response (e.g., one that falls outside the possible range of values). We will use imputation on selected Part 1 variables to adjust for item nonresponse and will not use imputation for the Part 2 items (see previous section).


The method of imputation will depend on the characteristics of the variable. In some cases logical imputation might be used, with the response to one item being used to infer the response to another. For example, an institution that indicates on one questionnaire item that it does not have a medical school can be assumed to make equivalent responses elsewhere on the questionnaire, even if an item is left blank. In other cases, statistical imputation might be used, based on a statistical model that predicts the expected response of the institution (e.g., from responses elsewhere in the questionnaire, responses to previous cycles of the survey, or responses of similar institutions). In some cases, such as when a large amount of data is missing, it may not be advisable to use imputation. The imputation plan will be finalized after more information is available on the amount of missing data.


Flags will be created to indicate all instances of imputed values.
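
A minimal sketch of how logical and statistical imputation might be applied to a single item, with an imputation flag recorded as described above. The medical-school item, the hot-deck donor rule (a stand-in for whatever statistical model is ultimately used), and all field names are assumptions for illustration.

def impute_med_school_space(record, donor_pool, flags):
    """Fill a missing hypothetical item ("med_school_space") in place
    and record how it was imputed."""
    if record.get("med_school_space") is not None:
        return  # item was reported; nothing to impute

    if record.get("has_med_school") == "no":
        # Logical imputation: no medical school implies no such space.
        record["med_school_space"] = 0
        flags["med_school_space"] = "logical"
    else:
        # Statistical imputation (hot-deck stand-in for a model-based
        # prediction): borrow from a similar responding institution.
        donors = [d for d in donor_pool
                  if d["exp_quintile"] == record["exp_quintile"]
                  and d.get("med_school_space") is not None]
        if donors:
            record["med_school_space"] = donors[0]["med_school_space"]
            flags["med_school_space"] = "statistical"
        # If no suitable donor exists (or missingness is extensive),
        # the item is left missing, consistent with the caveat above.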


B.2. Survey Procedures

The Facilities survey is a mixed-mode mail and web survey, with telephone and email follow-up.


The president of each institution will be mailed a copy of the questionnaire, a cover letter, and a copy of the report from the FY 2005 survey cycle. In addition, the president will receive one of two institutional coordinator forms, depending on whether the institution planned to retain the previous cycle's coordinator for the upcoming cycle (see below). NSF has found through experience that different sections of the questionnaire are often completed by different offices within the institution, so it is important to have an institutional coordinator who can delegate sections of the questionnaire to the appropriate individuals and, when needed, prepare a composite response from their individual answers. The coordinator also acts as the central communication point for NSF and the contractor collecting the data.


During the FY 2005 survey cycle, each president was asked whether he or she wished to keep the FY 2005 institutional coordinator for the FY 2007 data collection. If the president indicated that the coordinator would remain the same, the president will be sent, along with the previously mentioned materials, a pre-filled form naming the FY 2007 coordinator (and providing an opportunity to designate a new coordinator if the president wishes to do so). Simultaneously, the prior cycle's coordinator will receive a letter indicating that data collection is beginning and that his or her name has been provided to the president as the past coordinator. The coordinator will also receive a copy of all materials at this time.


If the president indicated in the FY 2005 data collection that he or she did not wish to keep the same coordinator for the FY 2007 data collection, the president will receive a blank form on which to name the FY 2007 coordinator. To aid the president in selecting a new coordinator, the letter to the president will indicate who acted as the coordinator in the previous survey cycle (if the institution responded in that cycle). Simultaneously, the prior cycle's coordinator will receive a letter indicating that data collection is beginning, that a letter has been sent to the institution's president requesting a coordinator, and that his or her name has been provided to the president as the past coordinator. The coordinator will also receive a copy of all materials at this time.


If no response is received from the president's office within a week, telephone prompts will be used to determine the name and contact information of an institutional coordinator. Once designated, the coordinator will be notified of his or her appointment.


Regular email and/or telephone prompts will be used to encourage the institution to respond. Institutions will have the option of completing a paper copy of the questionnaire or providing the data on the web through a designated web site. Based on past experience, we expect over 85% of the institutions to respond using the web. Returned questionnaires will be examined for quality and completeness using both visual and computerized edits. For questionnaires completed on the web, computerized edits will check for quality and completeness as the data are entered and prompt the respondents if problems are found. If key items have missing data or other problems appear in the data (e.g., two responses appear to be inconsistent), respondents will be recontacted to resolve the issues.
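
The sketch below illustrates the kinds of computerized edits described above: completeness checks on key items, range checks, and cross-item consistency checks. All rules and field names are illustrative assumptions, not the survey's actual edit specifications.

def edit_checks(response):
    """Return a list of problems to raise with the respondent."""
    problems = []

    # Completeness: key items must be answered.
    for item in ("total_research_space", "has_med_school"):
        if response.get(item) in (None, ""):
            problems.append(f"missing key item: {item}")

    # Range: values must fall within plausible bounds.
    space = response.get("total_research_space")
    if isinstance(space, (int, float)) and space < 0:
        problems.append("total_research_space cannot be negative")

    # Consistency: related items must agree.
    if (response.get("has_med_school") == "no"
            and (response.get("med_school_space") or 0) > 0):
        problems.append("medical school space reported without a medical school")

    return problems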


B.3. Methods for Maximizing Response Rates

NSF is seeking a 94% response rate in this survey.


A key to achieving this response rate is the tracking of the response status of each included institution, with telephone follow-up of those institutions that do not respond in a timely manner. The survey responses will be monitored through an automated receipt control system. Approximately three weeks after the initial mailout, the contractor will begin calling nonrespondents to verify that they received the questionnaire and to prompt them to respond. Additional telephone or email prompts will be made as the data collection period continues.
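
As an illustration of this receipt-control logic (the timing rule and status codes are assumptions based on the description above), nonrespondents could be queued for telephone prompting as follows:

from datetime import date, timedelta

def telephone_prompt_queue(cases, mailout_date, today):
    """Flag cases for prompting: calls begin about three weeks after
    the initial mailout and target institutions that have not yet
    responded. Status codes are illustrative assumptions."""
    if today < mailout_date + timedelta(weeks=3):
        return []
    return [c for c in cases if c["status"] not in ("complete", "ineligible")]

# Hypothetical usage: queue as of today for a September 4 mailout.
# queue = telephone_prompt_queue(cases, date(2007, 9, 4), date.today())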


Several other steps will be taken to maximize the response rate. The survey materials will provide a toll-free 800 number that people may call to resolve questions about the survey. Respondents may seek help by email. In addition, standard survey techniques that have proven successful in other academic survey efforts will be employed to achieve a maximum response rate. These techniques include:


  • Cover letters signed by the Director of NSF and by the Director of the National Center for Research Resources (NIH) will accompany the questionnaire.


  • Institutional coordinators will be contacted by telephone prior to the intended closeout of the survey. This contact is intended both to offer assistance to respondents and to encourage their speedy response.


  • Follow-up telephone calls will be made to nonrespondent institutions as required. These follow-up calls are expected to achieve significant improvements in response rates.


Finally, institutions will be informed in their materials that institution-level survey responses are currently available for the FY 2003 and FY 2005 surveys and will also be available for the FY 2007 survey. These data will be available in a publicly accessible database on the World Wide Web. NSF believes that making the data publicly available will maximize response rates because institutions will be more likely to participate if they believe the data will be useful to them.


B.4. Tests of Procedures and Methods

The questionnaire is based on versions of the survey used in previous cycles. As part of survey improvement efforts, 22 institutions were visited for cognitive interviews to determine how the questionnaire was interpreted and how respondents arrived at their answers. An additional 110 respondent debriefings and 10 usability tests were conducted. The issues addressed in the visits and debriefings varied according to the sets of draft survey questions slated to appear on the FY 2007 survey. Many of the interviews were conducted iteratively: when a problem was discovered with a question, the question was modified and then retested through cognitive interviews with other institutions.


In addition to the site visits, two workshops were held with a total of 20 FY 2003 and FY 2005 respondent institutions. Participants provided feedback based on their expertise and experiences with the survey and made recommendations for improvements. Finally, two informal pretests of the FY 2007 survey were conducted with a total of 12 workshop participants. Pretest respondents were asked to participate in a debriefing that included questions on completion time, problem questions, undefined terms, and other comments about the questionnaire (such as its content, format, and appearance).


The pretest respondents indicated that the survey was improved over previous cycles, with greater clarity in the questions. Respondents were able to provide the data requested. A number of changes (e.g., changes in question wording and survey definitions) were made to the questionnaire to prevent potential problems that were identified in the pretests.


B.5. Individuals Responsible for Study Design and Performance

The individuals listed below participated in the study design.


Leslie Christovich, NSF 703-292-7782

Fran Featherston, NSF 703-292-4221

John Jankowski, NSF 703-292-7781

Timothy Smith, Westat 240-314-2305

Cynthia Gray, Westat 301-251-4336


The contractor for the FY 2007 and FY 2009 data collections will be Macro International. Leslie Christovich of NSF will oversee the Macro International contract.





