SUPPORTING STATEMENT B
B.1. Respondent Universe and Sampling Methods
Respondent Universe
Since 2003, NSF has made nearly 200 awards to universities, school districts, and nonprofit
organizations across the United States to implement different types of ITEST projects. The
majority of the awards have been made to projects that directly provide experiences to students and teachers, but awards have also supported a variety of other efforts, including technical assistance providers, research on STEM education issues, conferences, and workshops. In the most recent solicitation for cohort nine, NSF is funding three categories of projects: strategies, scale-up, and research. The "strategies" category, which includes the majority of projects, comprises projects that work directly with teachers and students. Existing projects that fit ITEST program goals and have evidence of effectiveness are eligible to receive an award in the "scale-up" category. The research projects vary in topic but are broadly aimed at understanding issues
related to expanding the STEM workforce.
At the time of this OMB submission, 95 projects currently receive funding from the ITEST
program.5 Exhibit 8 provides a breakdown of the current universe of active ITEST projects, by project type. The project types have been collapsed into the most current ITEST project categories.6
Exhibit 8: Number of Active ITEST Projects, by Project Type
Project Type      Number of Projects
Strategies        80
Research          9
Scale-Up          6
Source: NSF Award Database
Given the small number of research projects and their indirect link to teacher and student
outcomes, case study projects will be selected from the “scale-up” and “strategies” project type
categories. Further, we will only include a scale-up project if we are able to collect the full
complement of data at a single location.
Exhibit 8 enumerates the number of active projects as of February 2012. The ninth cohort of
ITEST projects is currently being awarded and also will be available for inclusion in the sample.
5 There is a distinction between awards and projects. In some cases, multiple organizations receive an award as part of one project. These 95 projects correspond with 104 active awards. The sample will include 24 projects.
6 The projects have been collapsed into the current ITEST project categories. Comprehensive and youth-based, two categories from earlier cohorts that were eventually collapsed into a category called "strategies," are represented under the "strategies" category. The one project classified as a "conference/workshop" was collapsed into the "research" category.
Twenty-six projects from earlier cohorts are currently set to end by August 31, 2012, although
ITEST projects frequently receive no-cost extensions. Depending on when we receive OMB
approval and whether these projects receive no-cost extensions, these 26 projects may or may not
be available for inclusion in the sample.
In addition to these 95 projects, a smaller number of “continuing” projects are still in
operation despite the expiration of their ITEST awards. The sample for the case studies will be
drawn primarily from the universe of active projects, but as will be described in the following
section, it is possible that a few projects may be drawn from these continuing projects.
Sample Selection
We will select the sample of 24 case studies using a two-step process.
Step 1. The first step will involve selecting the 12 projects with the strongest evidence of
effectiveness. These 12 projects will be identified using the data gathered through the evaluation
review as part of the portfolio review task and nominations from ITEST program officers
(described in the introduction to Supporting Statement A). All ITEST projects, regardless of
whether they are currently receiving funding, are included in the evaluation review. As a result, a
small number of projects that no longer receive funding, but which have been identified as highly
successful and which have adequate data from which to draw, may be included in the sample.
The research team has already conducted interviews with ITEST program officers and collected
nominations of projects with evidence of effectiveness. These nominations will be weighted more heavily when selecting projects from the most recent cohorts, since newer projects are
less likely to have evidence of effectiveness in the NSF files.
Step 2. After identifying 12 promising projects, the remaining 12 sites will be selected from
a list of the remaining active strategies and scale-up projects. Notably under-performing projects,
identified from conversations with program officials and our review of project evaluations, will
be removed from the list before the sample is selected since these projects will not be useful in
answering the research questions. Based on information developed through the portfolio review
(described in the introduction of Supporting Statement A), the remaining projects will be
categorized by their topic area, project components, and project format. Projects will be
purposively sampled so that the full sample of 24 case studies reasonably approximates the
full range of ITEST projects on these three criteria. To the extent possible, we also will attempt
to ensure that the sample reflects the range of geographic regions, urbanicity, and grantee
organizations (e.g., universities, museums, school districts, other non-profit organizations)
represented by active ITEST projects.
While the small number of cases and the non-random selection process preclude a sample that is statistically representative on these criteria, every effort will be made to ensure that the sample of 24 cases has reasonable face validity in reflecting the breadth of the ITEST portfolio.
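For illustration only, the two-step selection logic described above can be summarized in a short sketch. The sketch below is written in Python and relies on hypothetical portfolio-review fields (type, evidence_score, under_performing, topic, format); it is not part of the approved procedures, and it simplifies the purposive balancing to two stratification variables.

# Illustrative sketch of the two-step case study selection described above.
# Field names (type, evidence_score, under_performing, topic, format) are
# hypothetical stand-ins for information produced by the portfolio review.
import random
from collections import Counter

def select_case_studies(projects, n_total=24, n_evidence=12, seed=0):
    """Pick the 12 strongest-evidence projects, then purposively fill the
    remaining 12 slots to balance topic and format across the sample."""
    rng = random.Random(seed)

    # Only strategies and scale-up projects are eligible for the case studies.
    eligible = [p for p in projects if p["type"] in ("strategies", "scale-up")]

    # Step 1: the 12 projects with the strongest evidence of effectiveness.
    step1 = sorted(eligible, key=lambda p: p["evidence_score"], reverse=True)[:n_evidence]

    # Step 2: drop notably under-performing projects, then add projects one at
    # a time, always favoring the least-represented (topic, format) stratum.
    remaining = [p for p in eligible if p not in step1 and not p["under_performing"]]
    selected = list(step1)
    while len(selected) < n_total and remaining:
        counts = Counter((p["topic"], p["format"]) for p in selected)
        least = min(counts[(p["topic"], p["format"])] for p in remaining)
        candidates = [p for p in remaining
                      if counts[(p["topic"], p["format"])] == least]
        pick = rng.choice(candidates)  # random tie-break within the stratum
        selected.append(pick)
        remaining.remove(pick)
    return selected

In the actual selection, this balancing would also be weighed informally against geographic region, urbanicity, and grantee organization type, as described above.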
B.2. Procedures for Collecting Information
Once the data collection is fully approved by OMB, the respective site visit teams will begin
to contact sites to schedule the visits. The site visits will last three days each and will be
conducted by two-person teams. While as many of the 24 site visits as possible will be done in
the months immediately following OMB approval (likely fall and winter of 2012), it is possible
some visits may not take place until the summer of 2013. This would only be the case if the
timing of OMB approval did not allow for site visits in the summer of 2012.
To maximize the reliability of case study data, all site visitors will be trained before going
into the field and will receive a manual containing all materials relevant to case study data
collection (e.g., selection criteria for respondents, protocols, consent forms, debriefing forms). In
addition to general case study training, site visitors will also receive detailed background
information on their sites drawn from the portfolio review task.
The training will help team members develop a common understanding of the data collection
and analysis goals. SRI has used a similar training model in other evaluations and has found that
it increases the reliability of data collected by multiple researchers because shared understanding
maintains consistency in data collection across projects and facilitates cross-project comparisons.
Giving the site visitors detailed background information on each of their sites also allows them to
probe more effectively on key items of interest and maximize their time on site.
The site visit team will work closely with the Principal Investigator at each visited project to
ensure all relevant respondents are interviewed. All respondents must be actively involved in, or very knowledgeable about, the site's ITEST project. At each case study site we will interview up to
38 participants. Interviews will be conducted with eight types of informants:
• Principal investigators
• Co-PIs
• Project staff
• Local evaluator
• Project partners
• Teachers (focus group)
• Students (focus group)
• Parents (focus group)
Exhibit 5 in Supporting Statement A provides a full account of the number of each
respondent type we expect to interview, although the number will vary based on each project’s
size and organizational structure. We have developed protocols for each group of respondents; these are included in Appendix A.
Degree of Accuracy Needed
The research team will do everything possible to maximize the accuracy of the data collected
for each of the case studies. All interviews (subject to the permission of the respondent) will be
recorded to improve the accuracy of reporting. Furthermore, site visitors will attend detailed
training and will review background information prior to planning their visit to ensure efficient,
consistent, and accurate data collection.
Use of Periodic Data Collection
The case studies will be conducted one time at each site, in the months immediately following OMB approval.
B.3. Methods for Maximizing the Response Rate and Dealing with Nonresponse
The contractor (SRI International) has extensive experience in gaining access to schools,
universities, and educational programs for research purposes. Because the potential programs
and schools to be visited are current (or former) ITEST grantees, it is not expected that there will be
difficulty in obtaining permission for the site visits. Key access strategies that the research team
will use include having one researcher be the primary contact with the principal investigator;
using multiple methods (phone, email, mail if necessary) to communicate with the principal
investigator; providing ample opportunities for the principal investigator to ask questions about
the study; building in flexibility to work with multiple coordinators for scheduling if necessary;
selecting mutually convenient dates; and providing easy-to-use tools such as scheduling
templates to minimize the burden on the site.
To ensure that each relevant respondent group is represented in each case study, the research
team will conduct interviews by phone at a later date in any case where respondents are unable to
schedule a meeting during the site visit or become unavailable on short notice. Because the
research team will work closely with the principal investigator to select respondents based on
their role and will be flexible in scheduling the time and location of the interviews, a 100 percent
response rate is anticipated.
B.4. Tests of Procedures or Methods
The case study interview protocols were not piloted; however, they were reviewed by both
the evaluation team’s panel of consultative experts (see Exhibit 4) and the NSF ITEST Program
Officers and were found to be appropriate for use in the case study interviews.
B.5. Names and Telephone Numbers of Individuals Consulted
Agency
Monya Ruffin, COTR, National Science Foundation, 703.292.7322
Contractors
SRI International will be responsible for data collection and analysis, under the direction of
Patrick Shields, 650.859.3503.