Short Form: NASA SCIS Educator Instruments


Generic Clearance for the NASA Office of Education Performance Measurement and Evaluation (Testing)


OMB: 2700-0159

REQUEST FOR APPROVAL under the Generic Clearance for NASA Education Performance
Measurement and Evaluation, OMB Control Number 2700-0159, expiration 04/20/2018
_____________________________________________________________________________________
I. TITLE OF INFORMATION COLLECTION:
NASA Office of Education STEM Challenges Impact Surveys: Educator Retrospective Instrument
II. TYPE OF COLLECTION:
☒ Attitude/Behavior Scale
☒ Baseline Survey
☐ Cognitive Interview Protocol
☐ Consent Form
☐ Focus Group Protocol
☐ Follow-up Survey
☐ Instructions
☐ Satisfaction Survey
☐ Usability Protocol
GENERAL OVERVIEW: The NASA Office of Education's Science, Technology, Engineering, and Mathematics
(STEM) Engagement line of business activities are designed to provide participatory and experiential
learning opportunities that connect learners to NASA-unique resources. These activities are based on
best practices in motivation, engagement, and learning in formal and informal settings and include the
following areas:
o Public Education Activities that foster interactions with learners of all ages to spark an interest
in STEM disciplines using NASA-unique materials and resources. These may be part of a larger
public event and are often shorter in duration than Experiential Learning Opportunities and
STEM Challenges. Public Education Activities often require close coordination with the NASA
Office of Communications.
o Experiential Learning Opportunities that enable learners to acquire knowledge, understand
what they have learned, and apply that knowledge through inquiry-based and project-based
activities. NASA opportunities include participatory activities designed to increase involvement,
knowledge, understanding/comprehension, and application of learning in one or more STEM
disciplines using NASA’s resources.
o STEM Challenges that provide creative applications of NASA-related science, technology,
engineering, mathematics, and cross-cutting concepts. They challenge existing assumptions and
encourage learners to demonstrate their knowledge of STEM subjects while enhancing
innovation, critical thinking, and problem-solving skills. (nasa.gov, 2016)
This baseline information collection is specific to determining the impact of engineering design
and scientific research STEM Challenge activities on middle school students (grades 5 through 8,
depending on the school system of record in the U.S.).
III. INTRODUCTION AND PURPOSE: STEM Challenge activities are based on best practices in motivation,
engagement, and learning for students and educators in formal and informal settings (e.g., Farland-Smith,
2012; Gasiewski, Eagan, Garcia, Hurtado, & Chang, 2012; Kim et al., 2015; Leblebicioglu,
Metin, Yardimci, & Cetin, 2011; Maltese & Tai, 2011). The constructs of interest for these baseline
surveys are the engineering design and scientific research processes. In a NASA engineering design
challenge (EDC) activity, the focus is a design task in which students must meet certain criteria through
a series of steps that engineers follow to arrive at a solution to a problem. The engineering problem is
set in the context of NASA-unique content and supported by subject matter experts. Similarly, in a
scientific research challenge (SRC) activity, students are connected with opportunities to participate in
science data collection by conducting real, hands-on science according to the scientific method: a body
of techniques for investigating phenomena, acquiring new knowledge in an empirical or measurable
manner, and then correcting and/or integrating previous knowledge subject to specific principles of
scientific reasoning.
While other related surveys explore our interest in understanding why, how, and in what ways
students are impacted in the short-, intermediate-, and long-term by participation in STEM Challenge
activities with an engineering design or scientific research process focus, these instruments explore the
impact on educators. Thus, the purpose of pilot testing is to develop valid instruments that reliably
explain the ways in which educator participants' attitudes and behaviors are affected by participation
in these challenge activities. Guided by current STEM education and measurement methodologies, this
rigorous instrument development and testing procedure is intended to provide information that feeds
the iterative assessment and feedback process for the NASA STEM Engagement line of business.
Hence, the goals of this cycle of pilot testing are as follows:
o Determine the clarity, comprehensibility, and preliminary psychometric properties (e.g.,
validity, reliability) of these instruments; explore individual item functioning; and make any
necessary adjustments in preparation for large-scale testing as the basis for more sophisticated
statistical analysis (see the illustrative sketch after this list).
o Determine an accurate response burden for these instruments.
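
For illustration of the kind of preliminary reliability check named in the first goal, the Python sketch below computes Cronbach's alpha for a pilot item matrix. The data and the cronbach_alpha helper are hypothetical, not part of this collection; they only show the form such a check could take.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]                         # number of items on the scale
        item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
        total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scale scores
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Hypothetical pilot responses: 12 educators x 4 Likert items (1-5).
    pilot = np.array([
        [4, 5, 4, 4], [3, 3, 4, 3], [5, 5, 5, 4], [2, 3, 2, 3],
        [4, 4, 5, 4], [3, 2, 3, 3], [5, 4, 4, 5], [1, 2, 2, 1],
        [4, 4, 4, 3], [3, 3, 3, 4], [5, 5, 4, 5], [2, 2, 3, 2],
    ])
    print(f"alpha = {cronbach_alpha(pilot):.2f}")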
IV. RESEARCH DESIGN OVERVIEW: NASA Education is pilot testing a retrospective survey and a short
version of the survey. Despite the absence of a control group, the retrospective design can still support
strong causal inferences when effort is made to satisfy the requirements of quasi-experimentation, such
as identifying and reducing the plausibility of alternative explanations for the intervention-as-treatment
effect (Shadish, Cook, & Campbell, 2002), identifying conceivable threats to internal validity, and
statistically probing the likelihood of treatment-outcome covariation (Mark & Reichardt, 2009).
Empirical research (e.g., Howard, 1980; Drennan & Hyde, 2008; Nimon, 2014) suggests that a
retrospective pretest (then-test) may provide a more accurate pre-intervention measure than a
traditional pretest if it happens that respondents change their perceptions of their initial level of
functioning as a consequence of the intervention. In other words, respondents change their internal
standards of measurement having gained in experience or familiarity with the self-rating dimension(s)
(Nimon, 2014). According to Norman (2003), “[r]esponse shift theory presumes that [participants’]
prior state is adjusted in retrospective judgment on the basis of new information acquired in the
interim, so that the retrospective judgment is more valid” (p. 243). The statistical manifestation of
rating oneself on a different dimension or metric at post-test results in a mismatch between pre- and
post-test scores known as response shift bias (Goedhart & Hoogstraten, 1992). The retrospective
pretest is considered to be a valid assessment tool when respondents cannot be expected to know
what they do not know at the onset of an intervention (Pelfrey & Pelfrey, 2009). Such may be the
case with respondents who are participating in a NASA activity and/or are completing an attitude and
behavior or knowledge survey for the very first time.
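
As a hedged illustration of how response shift could be probed in pilot data, the Python sketch below compares a traditional pretest with a retrospective ("then") pretest from the same respondents using a paired t-test. All values are invented for the example and do not come from this collection.

    import numpy as np
    from scipy import stats

    # Hypothetical self-ratings (1-5) from the same ten educators: a
    # traditional pretest taken before the activity, and a retrospective
    # "then" pretest collected after it.
    traditional_pre = np.array([4, 4, 5, 3, 4, 5, 4, 3, 4, 5])
    retrospective_pre = np.array([3, 4, 4, 2, 3, 4, 3, 3, 3, 4])

    # A systematic difference between the two pre-intervention measures is
    # the statistical signature of response shift (Howard, 1980; Nimon, 2014).
    shift = traditional_pre - retrospective_pre
    t_stat, p_value = stats.ttest_rel(traditional_pre, retrospective_pre)
    print(f"mean shift = {shift.mean():.2f}, t = {t_stat:.2f}, p = {p_value:.4f}")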


Following this pilot phase of testing and the subsequent determination of instrument psychometric
properties, NASA Education has tentative research questions and hypotheses to test regarding the
impact of challenge activity training on NASA STEM Challenge educator participants. Thus, this
work is integral to the iterative assessment and feedback process for the NASA STEM Engagement line
of business.
V. TIMELINE: Pilot testing of surveys will take place approximately September 1, 2016 through February
28, 2017, coordinated with the implementation periods of the STEM Challenge activities.
VI. SAMPLING STRATEGY: Since the number of educator participants in each activity is 200 or below,
NASA Education will administer the surveys for testing to the census of educator participants.
Table 1. Calculation chart to determine statistically relevant number of respondents

Column key: (N) = Population Estimate for FY16; (A) = Sampling Error +/-5% (.05); (Z) = Confidence
Level 95%/Alpha 0.05; (P) = *Variability (based on consistency of intervention administration) 50%.

Data Collection Source   (N)   (A)   (Z)   (P)   Base Sample Size   Response Rate   (n) Number of Respondents
EDC Educators            200   N/A   N/A   N/A   200                N/A             200
SRC Educators            10    N/A   N/A   N/A   10                 N/A             10
TOTAL                                                                               210
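
The column headings in Table 1 correspond to the standard sample-size calculation for a proportion with a finite-population correction. The Python sketch below shows how those quantities would combine for a larger population; it is illustrative only, since the census approach above makes the calculation unnecessary here.

    import math

    def required_sample_size(N: int, A: float = 0.05, Z: float = 1.96, P: float = 0.50) -> int:
        """Sample size for a proportion, with finite-population correction.

        N = population estimate, A = sampling error, Z = z-score for the
        confidence level (1.96 for 95%), P = assumed variability.
        """
        n0 = (Z ** 2) * P * (1 - P) / (A ** 2)     # infinite-population size (~384)
        return math.ceil(n0 / (1 + (n0 - 1) / N))  # shrink for small populations

    # With only 200 EDC educators, the corrected size (~132) is so close to
    # the population that surveying the full census is the simpler choice.
    print(required_sample_size(200))    # 132
    print(required_sample_size(20000))  # 377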

VII. BURDEN HOURS: Burden calculation is based on a respondent pool of individuals as follows:
Data Collection Source   Number of Respondents   Frequency of Response   Total Minutes per Response   Total Response Burden in Hours
EDC Educators            200                     1                       20                           67*
SRC Educators            10                      1                       20                           3
TOTAL                                                                                                 70

*If the decision is made to test the short version of the Educator instrument, then the total response
burden will be lower.
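
The burden figures above follow directly from respondents x frequency x minutes per response; the minimal Python check below reproduces that arithmetic using only the table's values.

    # Burden = respondents x frequency x minutes per response, in whole hours.
    sources = {"EDC Educators": (200, 1, 20), "SRC Educators": (10, 1, 20)}

    total_hours = 0
    for name, (respondents, frequency, minutes) in sources.items():
        hours = round(respondents * frequency * minutes / 60)  # 66.7 -> 67, 3.3 -> 3
        total_hours += hours
        print(f"{name}: {hours} hours")
    print(f"TOTAL: {total_hours} hours")  # matches the 70-hour total above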
VIII. DATA CONFIDENTIALITY MEASURES: Any information collected under the purview of this clearance
will be maintained in accordance with the Privacy Act of 1974, the e-Government Act of 2002, the
Federal Records Act, and, as applicable, the Freedom of Information Act in order to protect
respondents' privacy and the confidentiality of the data collected.
IX. PERSONALLY IDENTIFIABLE INFORMATION:
1. Is personally identifiable information (PII) collected? ☒ Yes ☐ No
2. If yes, will any information that is collected be included in records that are subject to the
Privacy Act of 1974? ☒ Yes ☐ No
3. If yes, has an up-to-date System of Records Notice (SORN) been published?
☒ Yes ☐ No
Published March 17, 2015, the applicable System of Records Notice is NASA 10EDUA, NASA
Education Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html.


APPLICABLE RECORDS:
4. Applicable System of Records Notice: SORN: NASA 10EDUA, NASA Education Program
Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html
5. Completed surveys will be retained in accordance with NASA Records Retention Schedule 1,
Item 68D. Records will be destroyed or deleted when ten years old, or no longer needed,
whichever is longer.
X. PARTICIPANT SELECTION APPROACH:
1. Does NASA Education have a respondent sampling plan? ☒ Yes ☐ No
If yes, please define the universe of potential respondents and describe the sampling plan.
The universe of potential respondents is the census of educator participants in the
engineering design and scientific research STEM Challenge activities.
If no, how will NASA Education identify the potential group of respondents and how will
they be selected? Not applicable.
XI. INSTRUMENT ADMINISTRATION STRATEGY
Describe the type of Consent: ☐ Active ☐ Passive
6. How will the information be collected?
☒ Web-based or other forms of Social Media (95%)
☐ Telephone
☐ In-person
☐ Mail
☒ Other (5%)
If multiple approaches are used for a single instrument, state the projected percent of
responses per approach. The feedback forms will be administered via the web. Because it is
preferable that all baseline surveys be administered at the start of an activity, hard copy surveys
will be made available in the event web access is temporarily unavailable. In the past, no more
than 5% of respondents were asked to complete hard copy surveys due to internet or computer
difficulties.
7. Will interviewers or facilitators be used? ☐ Yes ☐ No
XII. DOCUMENTS/INSTRUMENTS ACCOMPANYING THIS REQUEST:
☐ Consent form
☒ Instrument (attitude & behavior scales, and surveys)
☐ Protocol script
☒ Instructions NOTE: Instructions are included in the instrument
☐ Other (Specify ________________)


XIII. GIFTS OR PAYMENT: ☐ Yes ☒ No If you answer yes to this question, please describe and provide a
justification for the amount.
XIV. ANNUAL FEDERAL COST: The estimated annual cost to the Federal government is $168. The cost is
based on an annualized effort of 4 person-hours at the evaluator’s rate of $42/hour for administering
the survey instruments, collecting and analyzing responses, and editing the survey instruments for
ultimate approval through the methodological testing generic clearance with OMB Control Number
2700-0159, exp. 04/30/2018.
XV. CERTIFICATION STATEMENT:
I certify the following to be true:
1. The collection is voluntary.
2. The collection is low burden for respondents and low cost for the Federal Government.
3. The collection is non-controversial and does not raise issues of concern to other federal
agencies.
4. The results will be made available to other federal agencies upon request, while maintaining
confidentiality of the respondents.
5. The collection is targeted to the solicitation of information from respondents who have
experience with the program or may have experience with the program in the future.
Name of Sponsor: Richard Gilmore
Title: Educational Programs Specialist/Evaluation Manager, NASA GRC Office of Education
Email address or Phone number: richard.l.gilmore@nasa.gov
Date: 10/21/2016


Bibliography
Drennan, J., & Hyde, A. (2008). Controlling response shift bias: The use of the retrospective pre-test
design in the evaluation of a master's programme. Assessment & Evaluation in Higher Education,
33(6), 699-709.
Farland-Smith, D. (2012). Personal and social interactions between young girls and scientists: Examining
critical aspects for identity construction. Journal of Science Teacher Education, 23(1), 1-18.
Gasiewski, J. A., Eagan, M. K., Garcia, G. A., Hurtado, S., & Chang, M. J. (2012). From gatekeeping to
engagement: A multicontextual, mixed method study of student academic engagement in
introductory STEM courses. Research in Higher Education, 53(2), 229-261.
Goedhart, H., & Hoogstraten, J. (1992). The retrospective pretest and the role of pretest information in
evaluative studies. Psychological Reports, 70(3), 699-704.
Howard, G. S. (1980). Response-shift bias: A problem in evaluating interventions with pre/post
self-reports. Evaluation Review, 4(1), 93-106.
Kim, C., Kim, D., Yuan, J., Hill, R. B., Doshi, P., & Thai, C. N. (2015). Robotics to promote elementary
education pre-service teachers' STEM engagement, learning, and teaching. Computers &
Education, 91, 14-31.
Leblebicioglu, G., Metin, D., Yardimci, E., & Cetin, P. S. (2011). The effect of informal and formal
interaction between scientists and children at a science camp on their images of scientists.
Science Education International, 22(3), 158-174.
Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational
experiences with earned degrees in STEM among US students. Science Education, 95(5), 877-907.
Mark, M. M., & Reichardt, C. S. (2009). Quasi-experimentation. In L. Bickman & D. J. Rog (Eds.), The
SAGE handbook of applied social research methods (2nd ed., pp. 182-214). Thousand Oaks, CA:
SAGE Publications, Inc.
nasa.gov. (2016, January). NASA Education Implementation Plan 2015-2017. Retrieved from
http://www.nasa.gov/sites/default/files/atoms/files/nasa_education_implementation_plan_ve4_2015-2017.pdf
Nimon, K. (2014). Explaining differences between retrospective and traditional pretest self-assessments:
Competing theories and empirical evidence. International Journal of Research & Method in
Education, 37(3), 256-269.
Norman, G. (2003). Hi! How are you? Response shift, implicit theories and differing epistemologies.
Quality of Life Research, 12, 239-249.
Pelfrey, W. V., Sr., & Pelfrey, W. V., Jr. (2009). Curriculum evaluation and revision in a nascent field: The
utility of the retrospective pretest-posttest model in a Homeland Security program of study.
Evaluation Review, 33(1), 54-82.


