REQUEST FOR NEW EHR PROGRAM MONITORING DATA COLLECTIONS
Forms Clearance Package
Submitted by:
National Science Foundation
4201 Wilson Boulevard
Arlington, VA 22230
The National Science Foundation (NSF) is the primary federal agency supporting research at the frontiers of knowledge, across all fields of science and engineering (S&E) research and all levels of S&E education (NSF, "Empowering the Nation: Through Discovery and Innovation," NSF Strategic Plan for Fiscal Years (FY) 2011-2016). NSF awards grants, contracts, and cooperative agreements to more than 2,000 colleges, universities, and other eligible institutions, and provides graduate fellowships to individuals in all parts of the United States.[1]
NSF provides nearly 20 percent of federal funding for basic research to academic institutions.[2] Within NSF, the Directorate for Education and Human Resources (EHR) has primary responsibility for promoting rigor and vitality within the Nation’s science, technology, engineering, and mathematics (STEM) education enterprise to further the development of the 21st century’s STEM workforce and public scientific literacy. In order to support the development of a diverse and well-prepared workforce of scientists, technicians, engineers, mathematicians, and educators and a well-informed citizenry that has access to the tools of science and engineering, EHR’s mission includes identifying means and methods to promote excellence in U.S. STEM education at all levels and in all settings (both formal and informal). To these ends, EHR provides support for research and implementation activities that may improve STEM learning and education from pre-school through postdoctoral studies, in traditional and non-traditional venues, among all United States citizens, permanent residents, and nationals. EHR also focuses on broadening participation in STEM learning and careers, particularly among those individuals traditionally underrepresented and underemployed in the STEM workforce, including but not limited to, women, persons with disabilities, and racial and ethnic minorities.
In March 2011, EHR requested that the Office of Management and Budget (OMB) renew EHR's Generic Clearance, OMB 3145-0136. The collection was renewed for one year with the provision that EHR review its procedures and processes for collecting and utilizing the data collected through these processes.
Subsequent information from OMB directed EHR to submit a new request for clearance for data collections associated with its program monitoring systems that was not to be based on the OMB generic clearance model. EHR has identified a number of opportunities to increase the efficiency and effectiveness of its current practices in data collection. This request for a new clearance for 11 data collections is based on the similarity of the items collected in each of the various programs, which fall into two categories (i.e., scholarship/fellowship programs, and implementation and research programs).
The committee report was not completed in time to allow submission of a renewal of OMB 3145-0136. EHR requested and OMB granted an extension of OMB 3145-0136 for six months, from July 2012 to January 2013.
This request seeks approval for 11 data collections that have similar elements and purposes and will provide essential information for program monitoring purposes. It also includes a report on our progress in reviewing procedures and processes for collecting and utilizing monitoring data, our purposes for the collections, and how we use the monitoring data to inform our project/program management and evaluations. Additionally, we report on progress to date in analyzing the effectiveness, efficiency, and feasibility of creating a common set of data elements to be collected and our proposed plan to pilot test this model with scholarship and fellowship programs.
Responses and actions to three other items raised in the earlier review are also provided. Those items are:
Do monitoring systems collect data needed to assess programs?
To what extent are monitoring data used to shape questions for a third-party evaluation?
Provide changes to disabilities reporting formats as needed in accordance with OMB policy
Data collected by EHR program monitoring systems are used for program planning, management, evaluation, and audit purposes. Summaries of monitoring data are used to respond to queries from Congress, the public, NSF's external merit reviewers who serve as advisors, including Committees of Visitors (COVs), and NSF's Office of the Inspector General. These data are needed for effective administration, program and project monitoring, evaluation, and for measuring attainment of NSF's program and strategic goals, as identified by the President's Accountable Government Initiative, the Government Performance and Results Act (GPRA) Modernization Act of 2010, and NSF's Strategic Plan.
The 11 program-specific collections included in this request (see attachments A1 through K4) are designed to assist in management of specific programs, divisions, or multi-agency initiatives and to serve as data resources for current and future program evaluations.
Program | Type of Program
Centers of Research Excellence in Science and Technology (CREST) and Historically Black Colleges and Universities Research Infrastructure for Science and Engineering (HBCU-RISE) Monitoring System | Implementation & Development & Research
Graduate STEM Fellows in K-12 Education (GK-12) Monitoring System | Scholarships and Fellowships
Integrative Graduate Education and Research Traineeship Program (IGERT) Monitoring System | Scholarships and Fellowships
Advancing Informal STEM Learning (AISL) Monitoring System, formerly Informal Science Education (ISE) | Implementation & Development & Research
Louis Stokes Alliances for Minority Participation (LSAMP) Monitoring System | Implementation & Development & Research; Scholarships and Fellowships
Louis Stokes Alliances for Minority Participation Bridge to the Doctorate (LSAMP-BD) Monitoring System | Scholarships and Fellowships
Robert Noyce Teacher Scholarship Program (Noyce) Monitoring System | Scholarships and Fellowships
Research in Disabilities Education (RDE) Monitoring System | Implementation & Development & Research; Scholarships and Fellowships
Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Monitoring System | Scholarships and Fellowships
Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) Monitoring System | Implementation & Development & Research
Transforming Undergraduate Education in Science, Technology, Engineering, and Mathematics (TUES) Monitoring System | Implementation & Development & Research
[1] National Science Foundation. (2012). How we work. Retrieved from http://www.nsf.gov/about/how.jsp
[2] National Science Foundation. (2012). NSF at a glance. Retrieved from http://www.nsf.gov/about/glance.jsp
The NSF Directorate for Education and Human Resources is responsible for analyzing and evaluating STEM education and human resource development activities and research in NSF’s Education and Training (E&T) portfolio.
History of the EHR Monitoring Systems Clearance
In 1995, at the request of OMB and in response to the Government Performance and Results Act of 1993, an EHR Generic Clearance (EHR Generic) was established to integrate management, monitoring, and evaluation information pertaining to NSF’s E&T portfolio. Under this generic survey clearance (OMB 3145-0136), data from NSF administrative databases were incorporated with information gathered through initiative-specific, division-specific, and program-specific data collections. EHR has used these data for monitoring, managing, and evaluating its investment in E&T programs, initiatives, and activities.
When first approved in 1998, the EHR Generic Terms of Clearance (TOC) (appendix A) specified how individual packages would be handled. Those terms stated that “All . . . individual tasks associated with this generic . . . must be submitted to OMB for clearance prior to implementation. If approved those individual approvals will expire, at the latest, when this generic expires in 9/2001 . . . When NSF seeks to add additional tasks to 3145-0136 other than those previously mentioned, the additional request will be accompanied by an 83-C burden change sheet so that the appropriate burden total for the generic clearance can be changed accordingly. Further, each additional request shall contain a cover memo which describes why the specific task is appropriate to include in the generic. Consistent with past procedures under this generic clearance, submissions of individual tasks are done informally (i.e., sent directly to the desk officer rather than to the docket library) and OMB will attempt to complete the review expeditiously.”
The 2001 TOC further prescribed a “…cross-walk that was provided by NSF …” and specified that the cover memos submitted with new requests “…should contain a similar crosswalk that details how the new questions fit into the three categories given” of "Staff and Project Participant Characteristics, Project Implementation Characteristics, and Project Outputs" (See appendix B - October 31, 2001, Memo from Mary Sladek to Lauren Wittenberg. The memo refers to these items as “the common core.”). In addition, the 2001 TOC stated that “NSF has agreed to consider this clearance to encompass only ‘monitoring’ surveys, and no program evaluations will be completed under this generic clearance. Evaluations will need to go through a full clearance review under the Paperwork Reduction Act (PRA). All monitoring studies must conform to the three-category configuration explained in the memo of 10/24.” In accordance with the 2001, 2005, and 2008 TOC, NSF primarily uses the data from its monitoring efforts for program planning, management, and audit purposes. Evaluation studies are submitted to OMB under separate information collection requests. However, information from the data monitoring has been used to inform the design of evaluations, as a source in survey design, and as data sets in contractual evaluations.
As noted above, the EHR Generic was renewed for one year, from July 2011 to July 2012. The TOC of the current EHR Generic (appendix C) specify that "...the EHR [Evaluation] committee will work to determine whether there are areas to streamline/or improve [EHR monitoring systems]." Additionally, the TOC include a provision that "...NSF will make the recommended changes to disabilities reporting, adopting the currently approved RPPR [(Research Performance Progress Report)] format." The former clearance was extended for six months, from July 2012 to January 2013.
In July 2011, the EHR Evaluation Committee formed a Monitoring Working Group specifically charged to:
Identify all monitoring activities underway in EHR
Determine how the monitoring information is used
Determine whether common questions are being asked to enable aggregating across programs
Determine whether minimal changes in question format and survey design would enable questions to be standardized and aggregated across programs into common data elements
Determine whether aggregated monitoring systems could be developed to apply to all EHR programs
Determine how to link monitoring data to program and thematic evaluations and longitudinal studies
In June 2012, the Working Group submitted to the Evaluation Committee and EHR management a report summarizing its work and findings:
Monitoring systems continue to be important in managing programs effectively because they provide information on implementation aspects of programs and outcomes of individual projects that, when aggregated, provide a view of the program as a whole.
Given the number of monitoring systems currently in use in EHR, both covered by the former EHR Generic and stand-alone collections, there are efficiencies to be gained from including all monitoring data collections under a single clearance while developing, within each, common data elements that enable EHR to report across programs.
These common data elements can increase the effective utilization of the data from the collection systems, specifically with respect to cross-program data collection, reporting, and fine tuning the unique information needs of each program.
Monitoring data should be explicitly incorporated in evaluation plans and activities.
The Working Group has identified a number of options for testing the use of common data elements:
Conduct a pilot study to test the effectiveness, efficiency, and feasibility of creating common data elements for scholarship and fellowship programs by utilizing a standard format for items previously identified in the TOC three common data category configuration in the crosswalk of items.
Identify similar commonalities for implementation programs and conduct a pilot study to test the effectiveness, efficiency, and feasibility of common core items for these collections.
Expand the type of questions in the collections' monitoring systems to include those that might be used to identify effective practices or processes in some or all parts of individual projects and programs as demonstrated by the Transforming Undergraduate Education in Science, Technology, Engineering, and Mathematics, Research in Disabilities Education, and Advancing Informal STEM Learning (formerly Informal Science Education) collections included in this request.
Establish an internal EHR review process for each monitoring system collection request to identify common core questions and to enforce standardization across program collections.
Include as a part of each statement of work for a program evaluation the requirement that an evaluation include the assessment and use of data from EHR monitoring systems as a part of the evaluation plan.
Collaborate on NSF-wide initiatives intended to coordinate evaluation and monitoring efforts across NSF led by the NSF Office of Integrative Activities, including methods and processes for collection of standard data about projects currently limited to annual and final reports as well as new reporting requirements of the RPPR.
Examine the usefulness of emerging systems such as STAR Metrics for collecting and reporting on standard project metrics across NSF programs.
These actions reflect current understanding of data needs for managing programs and for use in evaluation activities, and document changes to the original TOC for this collection. They include:
Emphasis on reporting program progress based on changes in program focus as determined by EHR's 2013 budget request
Development of metrics associated with program or agency progress toward the NSF strategic goals in coordination with EHR and NSF efforts
The most recent request from OMB to accommodate consolidated or common core items as well as items unique to individual programs that may aid in conducting internal evaluations or more readily inform external evaluations, limiting the extent of additional data collection by external evaluators
Approval of this request will provide EHR with the opportunity to identify efficiencies in new processes for requesting and collecting information, and to identify and test a common core of items that would serve all programs that currently have monitoring systems and provide economies not now realized with a variety of individual program-level systems. To test this consolidation effort, EHR includes, as part of this request, the Research in Disabilities Education collection, previously cleared as OMB No. 3145-0164, which expired October 31, 2012. While the RDE collection contains several of the common elements identified in the fellowship and scholarship collections, it also contains questions that seek more in-depth information about institutional and participant experiences in RDE projects, which also fit into the three categories defined in the 2001 TOC. The integration of the RDE collection will serve as a test for the development of a potential common core of open-ended evaluative questions for all programs in the EHR program monitoring data collections.
Also included in this request is the Advancing Informal STEM Learning collection, formerly the Informal Science Education collection, OMB 3145-0158, which expired September 30, 2012. Like the RDE collection, the AISL collection is a monitoring system with more of the attributes of the development and implementation systems than of those that identify individual participants. Participants in the AISL collection are identified in the aggregate for each project by specific target audience type. A notable feature of the AISL system is that each project identifies the specific impacts its activities are expected to have, along with indicators that illustrate and guide the development of strategies to measure those impacts. In this way, its design may serve as a model for similar collections in EHR.
Two recent developments within EHR are expected to lead to improvements in how data from these systems are used. The first is a joint pilot program between EHR and the NSF Directorate for Engineering in which the NSF Division of Information Systems will integrate data from monitoring applications from both organizations into the NSF data system as part of a data warehouse. EHR has chosen the STEP program as its test case. Data from the monitoring system will be integrated with NSF corporate data and corporate data management and analysis applications similar to efforts that were initiated when OMB 3145-0136 was first approved. Technological developments have made the integration of these data much more efficient and productive.
The second is the development of a comprehensive evaluation plan for consideration by the EHR Advisory Committee in November 2012, with two goals that specifically address the use of monitoring data. The goals and objectives are:
"Develop and implement state of the art information systems for documenting, visualizing, analyzing, synthesizing, and communicating EHR's STEM education portfolio
Objective 1: EHR will implement effective utilization of existing data information systems with the objective of achieving consistency and coherence.
Objective 2: EHR will support, test, and implement innovative and well-conceived information systems.
Educate EHR users about tools that are already available
Develop prototype systems that integrate and visualize internal NSF data and external EHR monitoring system data
Create and maintain an IT infrastructure that serves the needs of EHR users"
and:
"Use best practices in education evaluation to institute consistent and coherent processes and policies for gathering and using performance data, where performance data is defined as information from monitoring systems and findings from evaluations
Objective 1: EHR will develop and use mechanisms and processes that support the integration of performance data and evaluation findings.
Objective 2: EHR directorate decision-making activities will be based on linked monitoring data and evaluation findings.
Examine the ways in which monitoring data could be integrated into evaluation throughout NSF and in other federal programs
Investigate the feasibility of creating a common core of indicators for EHR monitoring data collection systems
Explore internal systems that allow for more effective access and use of performance data
Examine the usefulness of emerging systems and strategies for collecting and reporting performance data
Expand the range of questions in the monitoring collection systems to anticipate future evaluations and include items that might be used to identify effective practice or process outcomes"
Additional Issues Raised By Earlier Review
In addition to the issue of common data elements to enable aggregation of data across programs, prior review of this collection raised several other concerns that were clarified over the course of the interim clearance period.
Do monitoring systems collect data needed to assess programs?
These monitoring systems provide data required to assess the progress of projects in each program. The monitoring data contribute to the overall assessment of program performance.
In the case of programs that are primarily fellowship or scholarship programs, collection of information about participants in those programs is essential to any future tracking of their progress and determination of the impact of participation in the program.
Programs whose goals are implementation or development require that detailed information about the initial efforts of individual projects be identified in order to track the potential impact of those efforts in successive locations.
The monitoring systems collect project-level information on the scale, scope, and state of each project along with information on types of activities implemented; results, such as publications and number of students and/or faculty involved in the project; and partners. This information is essential for documenting the development, implementation, adaptation, dissemination, and results of supported activities in institutions of higher education and across STEM disciplines. In addition to program management and reporting purposes, the program monitoring system data set has been designed as a primary source of information for a future evaluation of each program.
As further evidence of the integral part that monitoring data play in the administration of programs and projects, the STEM Talent Expansion Program in the Division of Undergraduate Education is implementing an effort to better align data collection across key points in the life cycle of projects. The program is developing a requirement in the program solicitation to include the reporting of common data elements in proposals that align with the data collected by the monitoring system. Consideration of the proposal for award will be partially based on the baseline data submitted with the proposal and verified during award negotiation. New awardees will be notified that these data elements are to be included in the required third-year review of projects through information describing the review that is sent to PIs. These data are an important source of information for the review and form the basis for recommendations.
To what extent are monitoring data used to shape questions for a third-party evaluation?
As noted in the previous clearance request, and reiterated above, EHR relies on the program monitoring data to contribute to and inform third-party evaluations. Without these data, third-party evaluators would be required to collect data about program participants and program projects mainly after awards had been completed rather than during the period of performance of an award.
Because of the time lag in initiating, conducting, and completing third-party evaluations, programs like TUES and RDE have expanded the type of questions they include in the monitoring system to enable them to more directly document and describe program outputs as well as program progress.
Make Changes in Disabilities Reporting Format Where Needed
Additionally, this request responds to the TOC of the current collection requiring EHR to make changes to the disabilities reporting format by incorporating the standard RPPR format in each collection that includes disabilities as a characteristic in reporting on individuals, including the Centers of Research Excellence in Science and Technology, Graduate STEM Fellows in K-12 Education, Integrative Graduate Education and Research Traineeship Program, Louis Stokes Alliances for Minority Participation, LSAMP Bridge to the Doctorate, Robert Noyce Teacher Scholarship Program, and Scholarships in Science, Technology, Engineering, and Mathematics Program monitoring systems. The monitoring systems have incorporated the new format, consisting of three mutually exclusive answer options (i.e., Yes; No; Do not wish to provide - see appendix D), as part of the next release for each system, as follows:
CREST—October 2012
GK-12—May 2012
LSAMP—June 2012
LSAMP-BD—October 2012
Noyce—September 2012
S-STEM—September 2012
The IGERT system will be updated at its next scheduled release and survey opening in March 2013.
Circumstances of Data Collection
To fulfill its planning and management responsibilities, and to answer queries from Congress, OMB, and NSF management, EHR needs current and standardized information about projects in NSF's E&T portfolio. This information is especially important for supporting studies and evaluations by EHR and by other NSF organizational units, for project monitoring, and for effective program administration. The information is retained in accordance with the Education and Training System of Records (63 Fed. Reg. 264, 272 January 5, 1998). The Education and Training System of Records has several purposes, including:
Providing a source of information on the demographic and educational characteristics and employment plans of participants in NSF-funded educational projects, in keeping with the Foundation's responsibility to monitor scientific and technical resources. This enables NSF to monitor the effectiveness of NSF-sponsored projects and to identify the outputs of projects funded under NSF awards, both for management purposes and for reporting to the Administration and Congress, especially under the GPRA Modernization Act of 2010, 5 U.S.C. 306 and 39 U.S.C. 2801-2805, the President's Accountable Government Initiative, and the Performance Improvement Guidance represented by OMB's guidance to agencies (M-10-24)
Creating public use files (which contain no personally identifiable information) for research purposes
The data collected under this request are focused on initiative-specific, division-specific, and program-specific quantitative and qualitative data collection activities. Data from these collections are focused on participant demographic detail (particularly for scholarship and fellowship programs) and activities and outputs (i.e., the accomplishments of program grantees (projects) in terms of specific objectives). These descriptive data collections provide essential information for documenting progress toward NSF’s major performance goals, as described in NSF’s Strategic Plan. (The Foundation’s FY 2011-2016 Strategic Plan describes three strategic goals: Transform the Frontiers, Innovate for Society, and Perform as a Model Organization. See NSF's Strategic Plan.)
The information collected under this request is required for effective program administration, program and project monitoring, evaluation, and for measuring attainment of NSF's program and strategic goals as laid out in NSF's Strategic Plan. This section describes how the data to be collected under the new clearance authority will be used for internal program management and administration; as a data source for NSF's performance assessment activities, including Committees of Visitors and Directorate and Office Advisory Committees (ACs); for documenting the attainment of NSF's program and strategic goals in accordance with the President's Accountable Government Initiative and GPRA reporting; and as a foundation for the rigorous research required to evaluate the effectiveness of STEM education programs. For more general information about NSF's performance assessment activities see NSF Performance Activities.
Program Management and Administration
One of the primary uses of data from the EHR Program Monitoring Clearance is for the general oversight of project and program activities by EHR staff. Because EHR has a limited number of staff members who must monitor hundreds of projects, large-scale data collection is the only way by which program officers can track project activities. The monitoring systems that fall under OMB 3145-NEW allow program officers and other NSF staff to integrate pre-existing data from the NSF administrative data system and newly generated data in a coherent and timely manner, giving them the information needed to make adjustments to the program portfolio. The pilot project currently underway to integrate the STEP monitoring data with the NSF business information systems will make this process more efficient and transparent. This kind of monitoring can lead respondents to make corrections to their project activities, may facilitate changes in program guidelines and/or NSF funding levels for a particular project, and may result in improved benefits to participants in NSF projects.
In guidance from the Director of OMB, M-10-32, the need for rigorous evaluations and the objectives of program evaluations were clearly outlined, including the use of evaluation resources. Because the collection of data contained in these monitoring efforts contributes to the formal evaluation of programs and provides regular measures of program performance by accumulating operating information from each project in the programs included in this request, this guidance is particularly pertinent. In this regard, the OMB guidance provides a rationale both for the collections covered under this request and for the activities undertaken in developing it.
“Improving and coordinating the use of existing evaluation resources. In addition to the voluntary evaluation initiative, agencies should continue to carefully assess, report on, and allocate the base funds and resources that the agencies have for conducting evaluation. Agencies are encouraged to share information beyond what is requested in guidance and consult with OMB’s Resource Management Offices (RMOs) to coordinate and improve the design, implementation, and utilization of evaluations.”
This directive reinforces the need for EHR to engage in an integrative process of collecting information about its programs to improve program evaluation assessment processes.
Data for NSF’s Performance Assessments and Committees of Visitors
Data from the monitoring systems contribute to NSF’s performance assessment activities, and support the larger NSF evaluation model. NSF relies on the judgment of external experts to maintain high standards of program management and to provide advice for continuous improvement of NSF performance. COVs for divisions or programs meet once every three years. COV reviews provide NSF with external expert judgments in two areas: (1) assessments of the quality and integrity of program operations and program-level technical and managerial matters pertaining to proposal decisions; and (2) comments on how the results generated by awardees have contributed to the attainment of NSF’s mission and strategic outcome goals. Data collected in the monitoring systems are often used in these reviews. For example, the April 2011 S-STEM program COV materials included summary data about the program that had been collected via the S-STEM monitoring system (attachments I1 and I2). COV reports are available at http://www.nsf.gov/od/oia/activities/cov/covs.jsp.
GPRA Reporting
Another central use of the EHR Program Monitoring Clearance data is to document attainment of NSF’s program and strategic goals and to report on the attainment of these goals. NSF’s performance assessment is guided by three elements: the GPRA Modernization Act of 2010, the President’s Accountable Government Initiative, and NSF’s Strategic Plan. The Foundation’s FY 2011-2016 Strategic Plan describes three strategic goals: Transform the Frontiers, Innovate for Society, and Perform as a Model Organization. EHR’s portfolio of E&T programs is a critical part of the Foundation’s goals to Innovate for Society and Transform the Frontiers. Under the Innovate for Society goal specifically, EHR programs contribute to the performance goals of “Building the capacity of the nation’s citizenry for addressing societal challenges through science and engineering” (p. 10) and "Support the development of innovative learning systems" (p. 11). Under the Transform the Frontiers goal, EHR programs "Prepare and engage a diverse STEM workforce motivated to participate at the frontiers" (p. 7). Much of the information that enables EHR to report on these developments is derived from the data elements collected in the monitoring systems to be cleared under OMB 3145-NEW. Monitoring systems and the data they collect identified in this request enable the successful reporting and use of these performance assessments, which is essential in meeting GPRA requirements.
Monitoring data are being used to address the current NSF performance goal of “Public Understanding and Communication of Science,” which calls for identifying a “common set of evidentiary standards” for measuring the effect of projects that aim to increase public understanding of science and to communicate science to public audiences. The AISL (formerly ISE) program monitoring data serve as a primary model for the categories in which programs might establish methods to identify project influence.
A Foundation for Future Evaluations
Finally, a key measure of NSF’s success at achieving its goals is the effectiveness of its STEM education programs. NSF is committed to implementing program evaluation in accordance with the President’s Accountable Government Initiative. While the monitoring systems used to collect data under the EHR Generic Clearance currently play a role in this work, they are not themselves evaluative studies. NSF does conduct program-level management reviews to ensure that programs are administered properly and in accordance with federal guidelines and agency missions; this is currently one use of data from the EHR monitoring systems. Going forward, EHR will emphasize the use of monitoring data in future evaluation activities, creating a foundation for the kind of evaluation the President’s Accountable Government Initiative requires of federal agencies. While data collected under the prior clearance were not used to evaluate program effectiveness, some of the data contributed to programmatic evaluations. For example, experimental and quasi-experimental research studies on STEM education interventions require researchers to identify individual-, organizational-, or project-level treatment, control, or comparison groups. NSF-funded contract or grantee researchers and evaluators have used descriptive data gathered through OMB 3145-0136 to identify such groups within NSF’s E&T portfolio and to conduct well-designed, rigorous research and portfolio evaluation studies.
Two examples of third-party evaluations that used EHR OMB 3145-0136 data to inform study design are: OMB No. 3145-0187 (Expired 8/2011) Evaluation of the NSF’s Graduate STEM Fellows in K-12 Education (GK-12) Program and OMB No. 3145-0182 (Expired 3/2011) Evaluation of the NSF’s Integrative Graduate Education and Research Traineeship (IGERT) Program: Follow-up Study of IGERT Graduates, both conducted by Abt Associates.
The IGERT program has an ongoing descriptive evaluation of interdisciplinary training in IGERT, and the evaluators used IGERT monitoring data to shape the evaluation. The use of the monitoring system data expedited the evaluation considerably.
The AISL, formerly ISE, program also has an ongoing evaluation in which the monitoring data have informed both the design of the evaluation and the form of the survey questions to projects.
The Noyce program has incorporated the data from the monitoring system into the design and execution of two program evaluations. In the first of two program evaluations of the Noyce program, a team of evaluators that included science and mathematics education faculty and staff at the University of Minnesota designed and conducted an evaluation of the program incorporating the information from the program monitoring system in the following ways:
Collection and archiving of the characteristics of recipients, with annual updates of population of recipients
Descriptive analysis and verification of information on characteristics of recipients that were reported through the monitoring system compared to characteristics reported by recipients on the program evaluation, “Scholar Survey”
Combination of data from the monitoring system with data collected by the program evaluation to produce quantitative models of the program
In the current evaluation, OMB 3145-0217, by Abt Associates, the use of information collected by the monitoring system is a requirement of the evaluation. Before developing a set of surveys for the current program evaluation, the evaluation team compared the information needs for the evaluation with information collected through the monitoring system as a way to reduce burden hours, to avoid redundancy in data collection, to coordinate data collection, and to maintain data quality for the current evaluation. For this mixed-methods evaluation, information from the Noyce program monitoring system is one of several data sources for addressing the following questions:
RQ 2: How do stakeholders perceive the Noyce award and Noyce recipients? (Descriptive Study/Descriptive Analyses)
RQ 4: What are the relationships between the types of supports, activities, and training that Noyce recipients receive, the types of Noyce recipients, and the recipients' plans to enter and/or remain in teaching and leadership roles? (Descriptive Study/Relational Analyses)
RQ 5: What is the impact of Noyce on teacher recruitment, retention, and student achievement? (Comparative Study/Impact Analyses)
All of the collections included under this clearance request use Web-based data collection systems to minimize data duplication and respondent burden. EHR favors Web-based systems because they facilitate respondents’ data entry across computer platforms. One innovative feature of many of the individual Web systems is the thorough reviewing and editing of all submitted data for completeness, validity, and consistency. Editing and validation are performed as data are entered. Most invalid data cannot be entered into the system, and questionable or incomplete entries are called to respondents’ attention before they are submitted to NSF.
EHR Program Monitoring Clearance Web-based data collection systems employ user-friendly features such as automated tabulation, data entry with custom controls such as checkboxes, data verification with error messages for easy online correction, standard menus, and predefined charts and graphics. All of these features facilitate the reporting process, provide useful and rapid feedback to the data providers, and reduce burden.
All collections in the EHR Program Monitoring Clearance comply with Section 508, the 1998 amendment to the Federal Rehabilitation Act, which mandates that the electronic and information technology used by federal agencies be made accessible to all people with disabilities.
The EHR Program Monitoring Clearance does not duplicate efforts undertaken by the Foundation, other federal agencies, or other data collection agents. For example, NSF grants require the submission of annual and final project reports in accordance with OMB 3145-0058. Recipients of NSF grants, such as principal investigators (PIs), must create and submit annual and final project reports using NSF’s nationally recognized FastLane Web template. (For more information on FastLane, see the FastLane Demo Site.) Data collected under the EHR Program Monitoring Clearance are unique and not available in either the NSF annual or final reporting system. The planned introduction of the new annual and final reports based on the RPPR format will improve the submission of project information, but does not change the need for additional information that monitoring systems provide on a program-specific basis.
Of the 11 collections in the EHR Program Monitoring Clearance, only TUES collects information from small businesses. TUES collects only a small amount of data from small business organizations, with the total small business response burden accounting for less than one percent of the total TUES response burden. Based on current data, fewer than five small businesses are affected by the TUES data collection. Together these businesses hold fewer than five awards in total, and each small business would spend no more than 4 hours responding per award.
Data collected for the EHR Program Monitoring Clearance are used to manage programs, monitor projects, inform project and program evaluations, coordinate with federal and non-federal education partners, provide Congress with information about government-supported activities, and report for GPRA and other requirements. In many cases, the data need to be collected annually to inform the NSF management and evaluation processes. Data collected under the EHR Program Monitoring Clearance can be used by NSF management to document and measure NSF’s success at achieving both Strategic Outcome Goals and internal Annual Performance Goals.
If the information were not collected, NSF would be unable to document the implementation of project activities and outcomes of its programs. It would be unable to meet its accountability requirements or assess the degree to which projects and programs are meeting their goals.
All data collections will comply with 5 CFR 1320.6. All collections under the EHR Program Monitoring Clearance ask respondents for data annually, with the exception of the S-STEM monitoring system (attachments I1 and I2), which asks respondents to submit data each semester/quarter. See attachment I1 for more information on the frequency of this collection.
The notice inviting comments on the EHR Program Monitoring Clearance (OMB 3145-NEW) was published in the Federal Register June 7, 2012, Volume 77, Number 110, pages 33774-33776. No comments were received. A copy of the notice can be found at the end of this document.
When developing collection instruments, EHR routinely consults with research and evaluation experts, PIs, and educators affected by EHR investments. The purpose of these consultations is to assess the relevance, availability, and clarity of items. As suggested by OMB guidelines, these consultations also enable EHR staff to obtain a reliable estimate of the respondent burden generated by new instruments. When a new collection is added or when an existing collection is modified to add new instruments, each instrument is pretested with nine or fewer individuals and revised following debriefings with participating respondents.
For data collections conducted earlier under the EHR Generic Clearance, consultations have included knowledgeable outsiders such as representatives of EHR contractors responsible for technical and evaluation tasks and fellows who work at the Foundation as guests under programs such as the Einstein Fellows Program or the American Association for the Advancement of Science Washington Fellows Program.
To date no payments or gifts have been provided to respondents. There are no plans to provide incentives to respondents, because program and project monitoring surveys are of value to the respondents as well as to NSF. Program monitoring can be used by projects as a foundation for project-level evaluation.
Respondents are informed that any information on specific individuals is maintained in accordance with the Privacy Act of 1974. Every data collection instrument displays both OMB and Privacy Act notices.
Respondents are told that data collected for the EHR Program Monitoring Clearance are available to NSF officials and staff, evaluation contractors, and the contractors hired to manage the data and data collection software. Data are processed according to federal and state privacy statutes. Detailed procedures followed by EHR for making information available to various categories of users are specified in the Education and Training System of Records (63 Fed. Reg. 264, 272 January 5, 1998). This system limits access to personally identifiable information to authorized users. Data submitted are used in accordance with criteria established by NSF for monitoring research and education grants and in response to Public Law 99-383 and 42 USC 1885c.
The information requested through NSF monitoring systems may be disclosed to qualified researchers and contractors in order to coordinate programs and to a federal agency, court, or party in court or federal administrative proceedings, if the government is a party.
Seven of the proposed collections in the EHR Program Monitoring Clearance request information from respondents that may include name, address, Social Security number (SSN), date of birth (DOB), and/or grade point average (GPA). These data are collected in order to monitor the award sites and evaluate the success of the award programs. Information of this nature is also used to track recipients of funding and training. For example, in the IGERT survey (attachments C1, C2, and C3), trainees’ SSNs are used as a tracking mechanism to permit follow-up studies that examine the long-term effect of the IGERT program on individuals’ success. However, in the IGERT collection and in all collections that request SSN, SSN is a voluntary field. Responses to all items of a sensitive nature are voluntary. Respondents may choose not to provide information that they deem privileged, such as SSN, address, or DOB. Any individual-level data that are collected are provided only to program staff and consultants conducting studies using the data as authorized by NSF. Any public reporting of data is in aggregate form.
The table below shows which individual collections include questions of a sensitive nature.
Table 1. Questions of a Sensitive Nature
Attachments | Collection Title | Address | DOB | GPA | Name | SSN
A1-A2 | Centers of Research Excellence in Science and Technology (CREST) and Historically Black Colleges and Universities Research Infrastructure for Science and Engineering (HBCU-RISE) Monitoring System | X | | | X |
B1-B4 | Graduate STEM Fellows in K-12 Education (GK-12) Monitoring System | X | | | X | X
C1-C3 | Integrative Graduate Education and Research Traineeship Program (IGERT) Monitoring System | X | | X* | X | X
D1-D4 | Advancing Informal STEM Learning (AISL) Monitoring System, formerly Informal Science Education (ISE) | X*** | | | X*** |
E1-E2 | Louis Stokes Alliances for Minority Participation (LSAMP) Monitoring System | | | X | X | X
F1-F2 | Louis Stokes Alliances for Minority Participation Bridge to the Doctorate (LSAMP-BD) Monitoring System | X | | X | X | X
G1-G3 | Robert Noyce Teacher Scholarship Program (Noyce) Monitoring System | | X | X | X |
H1-H3 | Research in Disabilities Education (RDE) Monitoring System | X | X** | X | X |
I1-I2 | Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Monitoring System | X | X | X | X |
J1-J3 | Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) Monitoring System | X*** | | | X*** |
K1-K4 | Transforming Undergraduate Education in Science, Technology, Engineering, and Mathematics (TUES) Monitoring System | | | | X**** |
*IGERT does not collect GPAs, but does collect the Graduate Record Exam (GRE) scores of individual trainees.
**RDE collects just the birth year as opposed to the full date of birth.
***AISL and STEP collect names and addresses for PIs/respondents but not for individual students.
****TUES collects names for PIs/data collection personnel but not for individual students.
As shown in appendix E, and in table 2 below, the annual response burden for the 11 collections under OMB 3145-NEW is 62,649 hours (for 9,345 respondents and 9,845 responses). Given the diversity of respondent types, the methods used to arrive at individual collection burden estimates are described in detail in attachments A1 through K1.
Table 2. Respondents, Responses, and Annual Hour Burden
Attachment | Collection Title | No. of Respondents | No. of Responses | Annual Hour Burden
A1 | Centers of Research Excellence in Science and Technology (CREST) and Historically Black Colleges and Universities Research Infrastructure for Science and Engineering (HBCU-RISE) Monitoring System | 37 | 37 | 1,374
B1 | Graduate STEM Fellows in K-12 Education (GK-12) Monitoring System | 1,626 | 1,626 | 3,941
C1 | Integrative Graduate Education and Research Traineeship Program (IGERT) Monitoring System | 4,658 | 4,658 | 12,156
D1 | Advancing Informal STEM Learning (AISL) Monitoring System, formerly Informal Science Education (ISE) | 157 | 157 | 2,047
E1 | Louis Stokes Alliances for Minority Participation (LSAMP) Monitoring System | 518 | 518 | 17,094
F1 | Louis Stokes Alliances for Minority Participation Bridge to the Doctorate (LSAMP-BD) Monitoring System | 50 | 50 | 3,600
G1 | Robert Noyce Teacher Scholarship Program (Noyce) Monitoring System | 316 | 316 | 4,108
H1 | Research in Disabilities Education (RDE) Monitoring System | 31 | 31 | 1,439
I1 | Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Monitoring System | 500 | 1,000 (500 respondents × 2 responses/yr.) | 6,000
J1 | Science, Technology, Engineering, and Mathematics Talent Expansion Program (STEP) Monitoring System | 242 | 242 | 6,050
K1 | Transforming Undergraduate Education in Science, Technology, Engineering, and Mathematics (TUES) Monitoring System | 1,210 | 1,210 | 4,840
| Total | 9,345 | 9,845 | 62,649
EHR estimates that approximately three new collections will need to be cleared under the EHR Program Monitoring Clearance during the next three years, dependent on budgetary limitations and Congressional mandates. The overall response burden in any year should not exceed 90,000 hours. The burden associated with each new collection will be outlined in the individual requests that will be submitted to OMB with a burden change request form.
Below is an example that shows how the hour burden was estimated for the CREST monitoring system (attachment A1).
The estimated average number of annual respondents is 37 (23 CREST center PIs/program coordinators and 14 HBCU-RISE award PIs/program coordinators), with an estimated annual response burden of 1,374 hours. The Web-based data collection is an annual activity of the CREST program. The respondents are either PIs or program coordinators. One PI or program coordinator per award completes the questionnaire. The estimated annual hour burden per respondent was determined using the burden information reported by respondents from the last three collection cycles.
The burden estimate is outlined below:
Respondent | Estimated Average Annual No. of Respondents | Estimated Average Annual Burden Hours Per Respondent | Estimated Annual Burden Hour Total
CREST center PIs/program coordinators | 23 | 50 | 1,150
HBCU-RISE award PIs/program coordinators | 14 | 16 | 224
Total | 37 | 37.14 | 1,374
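The burden arithmetic above can be reproduced as a quick check. The following sketch is purely illustrative; the variable names are ours and are not part of any NSF system:

```python
# Reproduce the CREST/HBCU-RISE annual burden-hour estimate:
# (count of respondents, estimated hours per respondent) for each group.
respondents = {
    "CREST center PIs/program coordinators": (23, 50),
    "HBCU-RISE award PIs/program coordinators": (14, 16),
}

total_respondents = sum(n for n, _ in respondents.values())
total_hours = sum(n * hrs for n, hrs in respondents.values())
# Weighted average burden per respondent, as shown in the Total row.
avg_hours_per_respondent = round(total_hours / total_respondents, 2)

print(total_respondents, total_hours, avg_hours_per_respondent)
# 37 1374 37.14
```

The 37.14 figure in the Total row is thus the weighted average across both respondent groups (1,374 hours / 37 respondents), not a per-group estimate.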
Details on the burdens of each form can be found in attachments A1 through K1. The table below is an example of how this burden was estimated for the CREST monitoring system (attachment A1):
Form Type | Respondent Type | No. of Respondents | Burden Hours Per Respondent | Total Burden Hours
CREST data collection form | PIs/program coordinators | 37 | 37.14 | 1,374
Total | | 37 | | 1,374
As shown in appendix E, the total annual cost to respondents generated by the 11 ongoing data collections is currently estimated to be $1,975,294. Below is an example of the method used to calculate cost burden for the CREST monitoring system (attachment A1):
The overall annualized cost to the respondents is estimated to be $56,341. The following table shows the annualized estimate of costs to PI/program coordinator respondents, who are generally university professors. This estimated hourly rate is based on a report from the American Association of University Professors, “Annual Report on the Economic Status of the Profession, 2011-12,” Academe, March–April 2012, Survey Report Table 4. According to this report, the average salary of an associate professor across all types of doctoral-granting institutions (public, private-independent, religiously affiliated) was $86,319. When divided by the number of standard annual work hours (2,080), this calculates to approximately $41 per hour.
Respondent Type | No. of Respondents | Burden Hours Per Respondent | Average Hourly Rate | Estimated Annual Cost
PIs/Program Coordinators | 37 | 37.14 | $41 | $56,341
Total | 37 | | | $56,341
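The cost calculation above follows directly from the salary figure and the burden hours. A minimal sketch of the arithmetic (variable names are ours, for illustration only):

```python
# Reproduce the CREST respondent-cost estimate.
avg_salary = 86_319          # AAUP average associate professor salary, 2011-12
work_hours_per_year = 2_080  # standard annual work hours
hourly_rate = round(avg_salary / work_hours_per_year)  # approximately $41

respondents = 37
burden_hours_each = 37.14    # weighted average burden per respondent
annual_cost = round(respondents * burden_hours_each * hourly_rate)

print(hourly_rate, annual_cost)
# 41 56341
```

Note that the $56,341 total is computed from the rounded per-respondent average (37 × 37.14 × $41) rather than from the 1,374-hour total directly, which is why it differs slightly from 1,374 × $41.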
The costs to respondents generated by each data collection are described in attachments A1 through K1.
There is no overall annual cost burden to respondents or record-keepers that results from the EHR Program Monitoring Clearance other than the time spent responding to online questionnaires that are described in specific detail in attachments A1 through K4. It is usual and customary for individuals involved in education and training activities in the United States to keep descriptive records. The information being requested is from records that are maintained as part of normal educational or training practice. Furthermore, the majority of respondents are active or former grantees or participants in programs or projects funded by NSF. In order to receive funding, institutions must follow the instructions in the NSF Grant Proposal Guide (GPG) that is cleared under OMB 3145-0058. The GPG requires that all applicants submit requests for NSF funding and that all active NSF awardees do administrative reporting via FastLane, an Internet-based forms system, or via Research.gov. Thus, PIs, K-12 administrators, faculty members, and college students, who are the primary respondents to the individual data collections within the EHR Program Monitoring Clearance, make use of standard office equipment (e.g., computers), Internet connectivity that is already required as a startup cost and maintenance cost under OMB 3145-0058, and free Web browser software (e.g., Netscape or Microsoft Internet Explorer) to respond.
As shown in appendix E, the total annual cost to the Federal government of the 11 ongoing data collections is currently estimated to be $3,631,771. Details of the costs of each collection can be found in appendix E.
Below is an example of the costs to the Federal government from the CREST data collection (attachment A1):
Computing the annualized cost to NSF for the CREST data collection was done by taking the budgets for three years and calculating the costs for each of the following operational activities involved in producing, maintaining, and conducting the data collection:
Operational Activities | Cost Over Three Years
System Development (includes initial development of the database and Web-based application, and later changes requested by the program, e.g., increased reporting tools, additional validations) | $357,166
System Maintenance, Updates, and Technical Support (system requires updates each year before opening the collection; maintenance is required to keep the system current with technology, e.g., database servers, operating systems) | $178,583
Data Collection Opening and Support (e.g., online and telephone support to respondents and contacting respondents to encourage completion of the questions), Reporting (as defined by HRD), and Followup Activities (e.g., providing data to other consultants) | $134,157
Three-Year Total for All Operational Activities | $669,906
The annualized cost was computed as one-third of the total three-year costs; thus, the annualized cost to NSF for the CREST data collection is $223,302.
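The annualization step can be verified with the three-year figures above. This sketch is illustrative only; the category labels are abbreviated from the table:

```python
# Reproduce the annualized federal cost for the CREST data collection.
three_year_costs = {
    "System development": 357_166,
    "Maintenance, updates, and technical support": 178_583,
    "Collection support, reporting, and follow-up": 134_157,
}

three_year_total = sum(three_year_costs.values())  # three-year total
annualized = three_year_total // 3                 # one-third of the total

print(three_year_total, annualized)
# 669906 223302
```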
More details on the costs of existing collections can be found in attachments A1 through K1.
Not applicable
Like many agencies, NSF no longer relies on formal (i.e., traditional) publication methods and publication formats. News media advisories, notices of funding opportunities for colleges and universities, and results from survey collections are all examples of the types of publications that NSF regularly publishes without putting ink to paper.
For content authored by NSF or by a third party at NSF’s request, the agency rarely uses paper to publish the information. NSF publishes most documents, from requests for proposals to evaluation and statistical reports, electronically only, on the agency’s Web site through an archive called the On-Line Document System (ODS).
In addition, NSF runs a Custom News Service, an e-mail and Web-based alert service that sends documents newly published in the ODS (e.g., vacancy announcements, calls for proposals, statistical reports) to subscribers. Subscribers receive electronically those NSF documents of interest and not the agency’s entire publications line.
The other major venue for NSF publications is FastLane. The NSF FastLane system collects and publishes information from NSF’s clients (i.e., applicants for NSF funding) using the Web. When an applicant’s proposal has been funded, that applicant’s name and other key data are published on NSF’s Web site. Each week the FastLane Web site publishes a list of new awards using data gathered from the application process.
Like NSF itself, the scope of publication plans and practices by the OMB 3145-NEW EHR Program Monitoring Clearance has a dual nature. Some individual collections contribute to formal products (e.g., analytical reports) that can be published by NSF’s ODS. Some collections produce only the respondents’ replies that are posted verbatim on the EHR share of the NSF Web site for anyone to download.
Most of what the EHR Program Monitoring Clearance collects, however, is not published as a stand-alone product, because the data are an input to how NSF manages, documents, evaluates, and measures its performance as an agency. NSF’s GPRA Performance Report or an individual division’s annual report to the NSF Director uses information from the collection to report to Congress. This is an annual cycle.
The data collection efforts included under this request are administered by third-party contractors that deliver (1) analytical reports, (2) the raw data from the collections, or (3) both. Third parties are contractually forbidden from publishing results unless NSF has made a specific exception. In short, all products of the collections are the property of NSF. After the products are delivered, NSF determines whether the quality of the products deserves publication verbatim by NSF; i.e., NSF typically is the exclusive publisher of the information collected by the collections. Often it is only after seeing the quality of the information the collection delivers that NSF decides the format (raw or analytical) and manner (in the ODS or simply a page on the NSF Web site) in which to publish.
EHR recurring studies based on monitoring data are requested by program staff and are done to monitor, manage, and communicate with and about the clients funded by NSF’s investment in education and training. In most cases the primary purpose for each recurring study is program management. These studies generate data that enable both NSF and the funded education and training projects to improve management and performance. Typically, recurring studies generate information that NSF uses as inputs to other reports, and therefore EHR cites no specific publication plans other than internal or general use to meet reporting requirements.
EHR uses data from recurring studies to provide information that can be mined for program evaluation purposes, such as identifying best practices in the education of graduate and undergraduate students, or as a baseline for summative evaluation reports. In the past, using data in part, but not exclusively, from OMB 3145-0136, the following evaluative or descriptive analysis research reports have been produced:
A Description and Analysis of Best Practice Findings of Programs Promoting Participation of Underrepresented Undergraduate Students in Science, Mathematics, Engineering and Technology (Westat) (NSF 01-31)
Summary Report on the Impact Study of the National Science Foundation’s Program for Women and Girls (The Urban Institute) (NSF 01-27)
At this time, NSF has no set timeline for publishing reports from these recurring studies, but plans that a summary or descriptive report be produced within two years of completion of the data collections for each recurring study.
Not applicable
No exceptions apply.