S-STEM OMB Cover Memo
National Science Foundation
Directorate for Education and Human Resources
Division: Division of Undergraduate Education (DUE)
Scholarships in Science, Technology, Engineering and Mathematics (S-STEM)
Overview of the S-STEM Program Evaluation
Prime Contractor: Abt Associates Inc.
Program Purpose:
The National Science Foundation’s (NSF) Scholarships in Science, Technology, Engineering, and
Mathematics (S-STEM) program awards grants to institutions of higher education (IHEs) that then
provide scholarships for academically talented students, in science and engineering disciplines, who
have demonstrated financial need. Individuals may be granted S-STEM scholarships for up to five
years and receive up to $10,000 per year depending on financial need. The institutions also provide
resources and support services (e.g. academic support, career counseling, recruitment, research
opportunities) to students to support them in becoming and/or remaining engaged in science and
engineering through successful pursuit of associate, baccalaureate, or graduate-level degrees.
S-STEM awards to institutions may last up to five years. The maximum S-STEM request is normally
not to exceed $600,000 in total direct costs; annual budgets are limited to $225,000 direct costs. As
part of the direct costs, institutions may request funds up to 5 percent of the total scholarship amount
for expenses related to program administration, and up to 10 percent of the total scholarship amount
for student support services.
The program’s goals are: (1) improved educational opportunities for students; (2) increased retention
of students to degree achievement; (3) improved student support programs at institutions of higher
education; and (4) increased numbers of well-educated and skilled employees in technical areas of
national need. 1 Successful outcomes of the program include graduation with a STEM major, transfer
of STEM students from two-year to four-year colleges, pursuit of STEM graduate degree studies, and
employment in the STEM workforce. Funding for S-STEM comes from H-1B visa fees, which were
reauthorized in FY 2005 through Public Law 108-447. NSF receives 40 percent of the H-1B
funding, and the agency uses 75 percent of its portion of these funds for the S-STEM program. 2 In
2006, S-STEM expanded to include technology and science fields beyond the original computer
science, engineering, and mathematics fields included in its precursor program, the Computer
Science, Engineering, and Mathematics Scholarship (CSEMS) Program, targeting instead the
development of the broader STEM workforce.

1 National Science Foundation. NSF Scholarships in Science, Technology, Engineering, and Mathematics (S-STEM) Program Solicitation (NSF 12-529). Retrieved 5/21/13 from http://www.nsf.gov/pubs/2012/nsf12529/nsf12529.htm
2 National Science Foundation (February 14, 2011). National Science Foundation: FY 2012 Budget Request to Congress. Retrieved 5/1/11 from http://www.nsf.gov/about/budget/fy2012/pdf/fy2012_rollup.pdf
A descriptive study of the precursor CSEMS was conducted in 2003-2004, 3 but this study is the first
evaluation to be conducted of the S-STEM program. The current evaluation has convened a
committee of Subject Matter Experts (SMEs) to provide input to the evaluation design,
implementation, and interpretations of findings. SMEs reviewed and commented on the: (1) general
evaluation design; (2) study’s research questions; (3) outcomes being investigated; (4) strategies,
practices, and characteristics of implementation that were being explored; (5) sources of data; (6)
draft instruments; (7) sampling plan; and (8) approach to analysis.
The evaluation will draw on extant data as well as require new data collection efforts. This package
seeks OMB approval for the new data collection efforts, which include a survey of S-STEM principal
investigators, a survey of S-STEM scholarship recipients, and site visit interview and focus group
protocols. An overview of the evaluation is presented below.
Evaluation Questions:
The evaluation is designed to answer four overarching research questions and related specific
subquestions.
1. How do individual awardees implement their S-STEM projects?
a. What are promising practices in key areas (e.g. recruitment, academic support,
retention) of and/or lessons learned from highly effective and successful S-STEM
projects?
2. What are the educational and career outcomes of the S-STEM scholarship recipients? How do
outcomes of S-STEM recipients compare to an appropriate comparison group or national
trends?
a. What is the effect of the S-STEM program scholarships on recipients?
b. To what extent do S-STEM scholarship recipients transfer to a four-year program as
compared to an appropriate comparison or national trends?
c. To what extent do S-STEM scholarship recipients join the STEM workforce after
graduation as compared to an appropriate comparison or national trends?
d. To what extent do S-STEM scholarship recipients apply for and attend a STEM
graduate program as compared to an appropriate comparison group or national
trends?
e. How effectively does the program meet the needs of academically talented
financially needy STEM students as compared to other need-based opportunities
and/or mechanisms?
3. Are there student or programmatic outcomes associated with receipt of an S-STEM award?
a. What is the effect of the S-STEM program on student outcomes (e.g. recruitment,
retention) in STEM at institutions that have received an award compared to an
appropriate comparison or national trends?

3 Temple University Institute for Survey Research and Caliber Associates, Inc. (no date). Evaluation of the National Science Foundation’s Computer Science, Engineering, and Mathematics Scholarship (CSEMS) Program: Phase Two – Survey Findings 2003-2004 Summary Report.
b. What is the added value of the S-STEM program on institutions that receive awards?
c. What is the added value of the S-STEM program on student support and educational
opportunities for recipients in institutions receiving S-STEM program funding?
d. What is the added value of the S-STEM program on STEM programs, STEM
departments, and/or IHEs that have received S-STEM funding?
4. What is the relationship between specific project features or practices and student outcomes?
Are there promising practices or lessons learned about implementation of S-STEM projects?
a. Are there any unintended positive and negative consequences/outcomes that can
inform project and program management and design?
b. To what extent can outcomes be attributed to components supported by the S-STEM
program?
c. Are there other NSF-funded education-oriented projects at the institution?
d. What is the relationship among these NSF-funded education-oriented efforts at the
institution?
Overview of the Evaluation Approach and Design:
Because of the nature of the S-STEM program and the type of information being sought, a mixed-methods evaluation design will be employed. The evaluation will collect data through web surveys
and site visits (consisting of interviews and focus groups), as well as draw on data from extant
sources. The study components include: a descriptive study of project implementation; a
relational study exploring the associations between project characteristics and practices and
recipient outcomes; benchmark comparisons of recipients’ educational experiences to national trends;
and a comparative, quasi-experimental study comparing individual educational and career outcomes
of S-STEM scholarship recipients to a matched sample of respondents from national surveys.
One purpose of the evaluation is to determine, to the extent feasible, the effects of receiving an S-STEM scholarship. When a program evaluation precludes use of a randomized controlled trial, a quasi-experimental method with an appropriate and credible comparison group is the next best alternative.
Quasi-experimental designs use statistical controls and a comparison group rather than random
assignment to provide a counterfactual against which to measure program impacts. In this evaluation
of S-STEM, it is not possible to randomly assign students (or institutions) to scholarship awardee
status. Instead, our evaluation design uses respondents from the Beginning Postsecondary Students
(BPS) longitudinal study as the comparison group. To mitigate selection bias issues, we will use a
quasi-experimental statistical method called propensity score matching (PSM).4 The advantages and
limitations of the evaluation approach are discussed in Section B.3 of Supporting Statement B.
The sections below contain brief descriptions first of the data sources and then the analyses that will
be employed to answer the study’s research questions. More details are provided in Supporting
Statements A and B. Exhibit 1 below lists the relevant sections of the OMB package that contain
descriptions of all new data collections, extant data sources, and analyses.
4 Rosenbaum, P.R., & Rubin, D.B. (1984). Reducing bias in observational studies using subclassification on the propensity score. Journal of the American Statistical Association, 79: 516-524.
Exhibit 1: Relevant Sections of OMB Package

New Data Collections (described in Part A, Section A.1):
  PI survey: Appendix C
  Recipient survey: Appendix A
  Site visit interview/focus groups 5: Appendix D

Existing Data Sources (described in Part A, Section A.1):
  S-STEM Monitoring System and PI Annual Reports
  Integrated Postsecondary Education Data System (IPEDS)
  Beginning Postsecondary Students (BPS) data
  National Survey of Student Engagement (NSSE) data

Analyses (described in Part B, Section B.3): Descriptive, Comparative, Relational
Data Collection and Sources:
While approval is sought only for the new data that will be collected for the study, below we describe
both the new data collection and the extant data that will serve the evaluation.
New data collection
Primary data sources will include web surveys of S-STEM Principal Investigators (PIs) and S-STEM
scholarship recipients, and in-depth interviews or focus groups with a series of respondents during
site visits to a subset of awardee institutions.
PI surveys will gather information on how the S-STEM projects operate and how the S-STEM
awards are implemented, including specific components that are offered to students (see Appendix C
for the PI survey). Data from these surveys will be used in descriptive analyses to describe the S-STEM projects as well as institutional outcomes.
Recipient surveys will gather information on recipients’ postsecondary education enrollment and
graduation, work experiences, educational experiences, and financial aid including S-STEM
scholarship receipt and use (see Appendix A for the recipient survey and Appendix B for sources of
specific items and a mapping of items to the study’s research questions). Many of the questions
on the survey are drawn from the extant surveys described below. These surveys will collect outcome
data from S-STEM recipients that will be used in the comparative analyses using propensity score
matching (PSM). (The approach to PSM is detailed in Appendix E of the Supporting Statement.) Data
from these surveys also will be used to describe STEM students’ experiences and in relational
analyses investigating associations between S-STEM experiences and observed outcomes.

5 To determine the feasibility of the matching, NSF’s evaluation contractor has obtained access to the extant datasets, processed the data, and identified the samples.
Site visits will consist of a series of interviews or focus groups that gather in-depth information
about the implementation of the S-STEM program at a given institution, individuals’ experiences
with the program, and outcomes related to the program (see Appendix D for the interview protocols).
Site visits will contribute to the depth of understanding about the implementation of the S-STEM
projects as well as possible mechanisms between the program and associated outcomes. 6 Interviews
or focus groups will be conducted with PI/co-PIs, STEM faculty, other campus officials, and current
and former S-STEM recipients. Descriptive analyses of these data will provide an in-depth
understanding of the program activities and experiences. See Supporting Statement A, Section A.1 for
more information on site visits.
Extant data
Extant data sources include both sources that provide information about the S-STEM program
directly as well as sources that will provide data for comparisons.
Extant NSF program data sources that will serve the evaluation include the annual reports prepared
by PIs as part of their grant requirements and the program monitoring system. PIs submit Annual
Reports to NSF that provide an overview of the current year’s activities and scholarship recipients.
The S-STEM Monitoring System, administered by ICF International (approved under OMB generic
clearance #3145-0226), contains information entered by PIs each year about project activities,
scholarship recipients, and recipient characteristics. Data from the monitoring system and annual
reports will be used in descriptive analyses to describe the services offered by S-STEM as well as the
characteristics of S-STEM participants.
The Integrated Postsecondary Education Data System (IPEDS) is a system of interrelated surveys
conducted annually by the U.S. Department of Education’s National Center for Education Statistics
(NCES) that gathers information from every college, university, and technical and vocational
institution that participates in federal student financial aid programs. Institutional characteristics data
from IPEDS (types of degrees granted; public versus private control; nonprofit or for-profit status;
demographic characteristics of the student body, e.g., race, ethnicity, gender; undergraduate and
graduate student enrollment; retention and degree completion rates; percentage of students majoring
in academic fields; tuition and cost of attendance; percentage of students receiving financial aid; and
expenditures on instruction, research and development, and operations) will be used to identify
institutions from which a matched comparison group of individuals can be selected, as necessary, for
the comparative analyses.
The Beginning Postsecondary Students (BPS) Longitudinal Study follows the academic progress
of a nationally representative sample of first-time beginning postsecondary students and contains
information about academic progress and persistence in postsecondary education, degree completion,
financing of education, and entry into the workforce. Outcomes (e.g. degree attainment, time to
degree) for the matched comparison will be drawn from the BPS and used in the comparative
analyses. An overview of the study’s approach to matching, using propensity score matching, is
provided later in this memo.

6 Yin, R. K. (2009). Case Study Research: Design and Methods (4th ed.). Thousand Oaks, CA: Sage Publications; Stake, R. E. (1995). The Art of Case Study Research. Thousand Oaks, CA: Sage Publications.
The National Survey of Student Engagement (NSSE) is a survey of first year and senior
undergraduate students from participating institutions of higher education in the US and Canada. The
survey asks about how students allocate time and effort to coursework and other activities, levels of
academic challenge and collaboration, the nature of students’ interactions with faculty, and other
aspects of the campus climate. Data from NSSE will provide benchmark comparisons for the
experiences on campus of S-STEM recipients.
Exhibit 2 shows how each of these data sources will be used to address the evaluation questions. See
Appendix B for sources of specific items and a mapping of items to the study’s research questions.
Exhibit 2: Map of Research Questions to Data Sources and Analyses

Exhibit 2 is a matrix mapping each of the four research questions and their subquestions (as listed above) to the data sources that address them (new collections: PI survey, recipient survey, site visit interviews/focus groups; extant sources: Annual Reports/Monitoring System, IPEDS, BPS data, NSSE data) and to the analyses applied to each (descriptive, relational, comparative, benchmarking).
Exhibit 3 below lists the recipient outcomes of interest. The surveys include questions that explore
recipients’ educational experiences, educational outcomes, and career goals and outcomes. Wherever
feasible, surveys have been constructed to include items administered in nationally-representative
surveys to allow for comparisons to national estimates of student outcomes (see Appendix B for
sources of specific items and a mapping of items to the study’s research questions). When it is not
possible to compare S-STEM recipient outcomes to a nationally representative estimate, the study
will present benchmarks from large-scale surveys to provide a meaningful context for descriptive data
on recipient outcomes. In particular, we will compare outcomes for S-STEM recipients to similar
outcomes for participants in the Beginning Postsecondary Students Longitudinal Study (BPS) and to
the National Survey of Student Engagement (NSSE).
Both current and former recipients will be surveyed so that the study can describe individuals who are
at varying stages of their educational and career trajectories and who have had varying amounts of
time post-scholarship to achieve certain milestones. Separate representative samples will be drawn
from those pursuing Associate’s and Bachelor’s degrees. Projects that solely support graduate students
(n=19) are not included in the study. 7
Exhibit 3: S-STEM Scholarship Recipient Outcomes

For S-STEM recipients in an Associate’s or Bachelor’s degree program when the S-STEM scholarship was received:

Primary Outcomes (Source)
1. Attainment of degree pursued at S-STEM institution of higher education (Recipient survey/BPS item)
2. Time to degree (Recipient survey/BPS item)
3. Employment in a STEM occupation (Recipient survey/BPS item)

Secondary Outcomes (Source)
4. Major field of study at S-STEM institution of higher education (Recipient survey/BPS item)
5. Graduate degree attained/pursued in a STEM field (Recipient survey/BPS item)
6. Hours per week employed during enrollment at S-STEM institution of higher education (Recipient survey/NSSE item)
7. Amount borrowed for undergraduate education (Recipient survey/BPS item)

Additional outcomes for S-STEM recipients in an Associate’s degree program when the S-STEM scholarship was received:

Primary Outcomes (Source)
1. Transfer from two- to four-year institution of higher education (IHE) for a bachelor’s degree, or transfer from Associate’s to Bachelor’s degree program if same IHE (Recipient survey/BPS item)
2. Bachelor’s degree attained (Recipient survey/BPS item)

Secondary Outcomes (Source)
3. Time to bachelor’s degree (Recipient survey/BPS item)
7 Some S-STEM projects support graduate students in addition to undergraduate students, but the evaluation will focus on recipients pursuing an undergraduate degree at the time they received an S-STEM scholarship. The Recipient Survey includes appropriate items should we reach students who are no longer enrolled in higher education or who are currently enrolled in a graduate degree program.
Analyses
The research questions of this study will be addressed through a combination of descriptive,
relational, benchmarking, and quasi-experimental comparative analyses. An overview of each of
these is presented below, and additional details, including the advantages and limitations of the
evaluation approach, are discussed in Supporting Statement B.
Descriptive analyses of strategies of S-STEM projects will describe the ways in which S-STEM
projects (i.e., grantee institutions) recruit and retain students in STEM fields, allocate scholarship
funds, and provide educational and support programming for scholarship recipients. Data from the S-STEM Monitoring System will be analyzed to describe variation in student support services (e.g.
academic support, career counseling, recruitment, research opportunities) offered by the awardee
institution. PI surveys and interviews during site visits will further probe the activities that are offered
as part of the S-STEM program and other supports that are available to S-STEM students.
Relational analyses of associations between academic strategies (e.g. student support services)
of S-STEM projects and outcomes will explore the relationships between S-STEM program
services and supports and outcomes of interest. The PI web surveys will provide data on program
characteristics and the recipient web surveys will provide data on recipient outcomes for these
analyses.
Benchmark comparisons of S-STEM recipient educational and academic support experiences
will use items and data from the NSSE survey to provide a non-matched comparison group against
which to benchmark selected outcomes for currently-enrolled S-STEM recipients. The variables that
will be used in the benchmark comparison include the items from the 2011 NSSE survey. 8 These
include measures of students’ allocation of time and effort to curricular and co-curricular activities
and interactions with faculty members.
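To make the benchmarking step concrete, the sketch below shows one conventional way to compare a sample of currently enrolled S-STEM recipients on an NSSE-aligned item against a fixed national value: a one-sample t statistic. The memo does not prescribe a particular test, so the function and the choice of a t statistic here are illustrative assumptions, not the study’s specified method.

```python
import math

def one_sample_t(values, benchmark):
    """t statistic for testing whether the sample mean of `values` differs
    from a fixed external benchmark (e.g. a published NSSE national
    estimate). Illustrative only; not the study's stated method."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return (mean - benchmark) / math.sqrt(var / n)        # divide by standard error
```

For example, hours per week spent on coursework reported by S-STEM respondents could be compared against the corresponding NSSE value; a large |t| would flag a gap worth discussing, though with a non-matched benchmark group such a gap is descriptive, not causal.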
Quasi-experimental comparative analyses of S-STEM recipients to a matched comparison
group will provide estimates of effects on key program outcomes. We will use propensity-scorematching (PSM) to compare responses of S-STEM recipients to a comparison group of participants in
the Beginning Postsecondary Study (BPS) surveys (NPSAS:04, BPS:04/09). Propensity score
matching allows a comparison of the S-STEM scholarship recipients (treatment group) to BPS
respondents (comparison group) selected based on their similarity to the S-STEM scholarship
recipients.9 PSM is a common quasi-experimental design approach that, under appropriate conditions,
has been shown to produce unbiased estimates of the difference between the treatment and comparison groups.10 A detailed
exposition of these methods is presented in Supporting Statement B and Appendices.
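To illustrate the mechanics behind PSM (the study’s actual estimation procedure is the one specified in Supporting Statement B and its appendices, not this sketch), the code below fits a simple logistic model for the probability of scholarship receipt given observed covariates, then pairs each recipient with the comparison-pool member whose estimated propensity score is closest. All data, names, and tuning values are hypothetical.

```python
import math
import random

def fit_propensity(X, t, lr=0.1, epochs=500):
    """Fit logistic-regression weights for P(treated | x) by stochastic
    gradient ascent on the log-likelihood. X: covariate vectors; t: 0/1
    treatment flags. Returns weights with the intercept first."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        for xi, ti in zip(X, t):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = ti - p  # gradient term for the Bernoulli log-likelihood
            w[0] += lr * err
            for j, xj in enumerate(xi):
                w[j + 1] += lr * err * xj
    return w

def propensity(w, xi):
    """Estimated probability of treatment for covariate vector xi."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

def matched_difference(X_t, y_t, X_c, y_c, w):
    """1:1 nearest-neighbor matching on the propensity score, with
    replacement; returns the mean treated-minus-matched outcome gap."""
    scores_c = [propensity(w, x) for x in X_c]
    gaps = []
    for x, y in zip(X_t, y_t):
        s = propensity(w, x)
        j = min(range(len(X_c)), key=lambda k: abs(scores_c[k] - s))
        gaps.append(y - y_c[j])
    return sum(gaps) / len(gaps)
```

With real data the covariates would be those listed in Exhibit 4 and the outcomes those in Exhibit 3; a production analysis would also check covariate balance after matching and trim units lacking common support, steps omitted here for brevity.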
8 We have secured both permission to use NSSE 2011 survey items and a data license from the Center for Postsecondary Research at the Indiana University School of Education.
9 Angrist, J. D. (1998). Estimating the labor market impact of voluntary military service using social security data on military applicants. Econometrica, 66: 249-288; Heckman, J., Ichimura, H., Smith, J., and Todd, P. (1998). Characterizing selection bias using experimental data. Econometrica, 66: 1017-1098.
10 Rosenbaum, P.R., and Rubin, D.B. (1984). Reducing bias in observational studies using subclassification on the propensity score. Journal of the American Statistical Association, 79: 516-524; Heckman et al. (1998). Characterizing selection bias using experimental data. Econometrica, 66: 1017-1098; Cook, T.D., Shadish, W.R., and Wong, V.C. (2008). Three conditions under which experiments and observational studies produce comparable causal estimates: New findings from within-study comparisons. Journal of Policy Analysis and Management, 27(4): 724-750.
The treatment group for this quasi-experiment will consist of S-STEM scholarship recipients. The
sample of S-STEM recipients will be restricted to those enrolled in an associate’s or bachelor’s
degree program who received an S-STEM scholarship from either a two- or four-year institution that
was awarded an S-STEM grant between 2006 and 2010. The comparison group will consist of BPS
survey respondents. Within each S-STEM awardee institution,11 S-STEM recipients will be matched
on student level characteristics (see Exhibit 4 below) to BPS survey respondents from the same
institution. Items on the study instruments were designed to align with items from BPS in order to
make matching and comparison of outcomes possible. If the matching is not possible within an
awardee institution, we will match students from institutions with similar institutional characteristics
measured in the Integrated Postsecondary Education Data System (IPEDS).12
Exhibit 4: List of individual characteristics for matching
Financial aid
Received Federal Stafford Loan
Received Pell Grant
Received school grant/scholarship
Received State grant/scholarship
Received any other financial aid for education
Academic information
SAT I math score
SAT I verbal score
ACT composite score
Cumulative GPA (or an estimate of GPA) through the end of the first school year
Demographic characteristics
Gender
Age
Race and Ethnicity
Citizenship
Disability
Other characteristics
Type of degree (Associate’s or Bachelor’s degree)
Major (Current field of degree)
Full-time enrollment status
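The two-stage matching preference described above, first to BPS respondents at the recipient’s own institution and only then to respondents at institutions with similar IPEDS characteristics, can be sketched as follows. The record layout (dicts with hypothetical 'institution' and precomputed propensity 'score' keys) and the function name are illustrative assumptions, not drawn from the study’s materials.

```python
def match_recipient(recipient, bps_pool, similar_institutions):
    """Pick a BPS comparison record for one S-STEM recipient.

    Prefer BPS respondents from the recipient's own institution; if none
    exist, fall back to respondents at institutions flagged as similar on
    IPEDS characteristics. Returns None when no credible match exists."""
    same = [r for r in bps_pool
            if r["institution"] == recipient["institution"]]
    fallback = [r for r in bps_pool
                if r["institution"] in similar_institutions.get(
                    recipient["institution"], set())]
    pool = same or fallback
    if not pool:
        return None
    # nearest neighbor on the estimated propensity score
    return min(pool, key=lambda r: abs(r["score"] - recipient["score"]))
```

In the actual analysis the similarity sets would be derived from the IPEDS characteristics listed in footnote 11 (sector, level, Carnegie classification, enrollment size, and so on), and matching would draw on the full covariate list in Exhibit 4 rather than a single precomputed score.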
11 Examples of these characteristics are: state and geographic information; sector of institution (public, private, non-profit); level of institution (2-year vs. 4-year); historically black college status; degree of urbanization; Carnegie classification; cost of attendance; selectivity of the institution; enrollment size; and enrollment characteristics.
12 S-STEM recipients and BPS respondents will be matched on selected characteristics during their first year of enrollment, either at their S-STEM institution (S-STEM recipients) or at their first-ever postsecondary institution (BPS respondents).

Monitoring Data

This study capitalizes on existing data maintained in the S-STEM Program Monitoring System,
managed by ICF International and cleared separately by OMB. Data from the S-STEM Monitoring
System (e.g. characteristics of students served, supports and services provided) will be analyzed to
provide a descriptive picture of S-STEM scholars across institutions. This monitoring system
provides yearly data on project components, support services, and participants, allowing NSF to
monitor the progress of the S-STEM projects at the various institutions. At the project level, PIs are
required to collect and report a standard set of information regarding their S-STEM projects on an
annual basis. PIs report information on their S-STEM activities and S-STEM recipients.
As part of the design process for the current study, the data in the S-STEM Monitoring System have
been reviewed and analyzed. The S-STEM Monitoring System does not provide all of the data
needed for the proposed evaluation. Accordingly, the study makes extensive use of extant data on
undergraduate student outcomes from the national Beginning Postsecondary Students study and
original items from two of the surveys fielded as part of this study. By using these existing data, the
study minimizes the number of respondents to new data collection efforts (i.e., the comparison group
data already exist).
Expected Contributions of S-STEM Program Evaluation
Accountability. The Program Evaluation will allow S-STEM to investigate the outcomes of
individuals supported by S-STEM who are academically talented, have demonstrated financial need,
and are science, technology, engineering, and mathematics (STEM) majors. Their outcomes will be
compared to similar students using available data from national data collections.
Inform Program Management. Program Officers will utilize and disseminate the evaluation findings
about successful systematic approaches and best practices for S-STEM projects that support
academically talented students with demonstrated financial need as they pursue STEM degrees and
subsequent work in STEM fields. This evaluation will produce findings and methods that are needed
to further the study of the effects of targeted financial and student support programs.
Inform the Field. Findings from the program evaluation could contribute to the knowledge base on
factors that are associated with increased participation in STEM by individuals from disadvantaged
backgrounds.
File type: application/pdf
File title: Microsoft Word - S-STEM OMB Memo 05-22-13.docx
Author: martineza1
File modified: 2013-06-18
File created: 2013-05-23