REQUEST FOR APPROVAL under the Generic Clearance for NASA Education Performance
Measurement and Evaluation, OMB Control Number 2700‐0159, expiration 04/20/2018
_____________________________________________________________________________________
I. TITLE OF INFORMATION COLLECTION:
NASA Office of Education STEM Challenges Impact Surveys: Student Baseline Instruments
II. TYPE OF COLLECTION:
Attitude/Behavior Scale
Baseline Survey
Cognitive Interview Protocol
Consent Form
Focus Group Protocol
Follow‐up Survey
Instructions
Satisfaction Survey
Usability Protocol
GENERAL OVERVIEW: NASA Office of Education Science, Technology, Engineering, and Mathematics
(STEM) Engagement line of business activities are designed to provide participatory and experiential
learning opportunities that connect learners to NASA‐unique resources. These activities are based on
best practices in motivation, engagement, and learning in formal and informal settings and include the
following areas:
o Public Education Activities that foster interactions with learners of all ages to spark an interest
in STEM disciplines using NASA‐unique materials and resources. These may be part of a larger
public event and are often shorter in duration than Experiential Learning Opportunities and
STEM Challenges. Public Education Activities often require close coordination with the NASA
Office of Communications.
o Experiential Learning Opportunities that enable learners to acquire knowledge, understand
what they have learned, and apply that knowledge through inquiry‐based and project‐based
activities. NASA opportunities include participatory activities designed to increase involvement,
knowledge, understanding/comprehension, and application of learning in one or more STEM
disciplines using NASA’s resources.
o STEM Challenges that provide creative applications of NASA‐related science, technology,
engineering, mathematics, and cross‐cutting concepts. They challenge existing assumptions and
encourage learners to demonstrate their knowledge of STEM subjects while enhancing
innovation, critical thinking, and problem‐solving skills.
This information collection of baseline instruments is specific to determining the impact of engineering
design and scientific research STEM Challenge activities on middle school students (grades 5 through
8, depending on the school system of record in the U.S.).
III. INTRODUCTION AND PURPOSE: STEM Challenge activities are based on best practices in motivation,
engagement, and learning for students and educators in formal and informal settings (e.g., Farland‐
Smith, 2012; Gasiewski, Eagan, Garcia, Hurtado, & Chang, 2012; Kim et al., 2015; Leblebicioglu,
Metin, Yardimci, & Cetin, 2011; Maltese & Tai, 2011). The constructs of interest for these baseline
surveys are the engineering design and scientific research processes. In a NASA engineering design
challenge (EDC) activity, the focus is a design task in which students must meet certain criteria through
a series of steps that engineers follow to arrive at a solution to a problem. This engineering problem is
framed within NASA‐unique content and supported by subject matter experts. Similarly, in a scientific
research challenge (SRC) activity, students are connected with opportunities to participate in science
data collection by conducting real, hands‐on science according to the scientific method, a body of
techniques for investigating phenomena, acquiring new knowledge in an empirical or measurable
manner, and then correcting and/or integrating previous knowledge subject to specific principles of
scientific reasoning.
Our interest is in understanding why, how, and in what ways students are impacted in the short‐,
intermediate‐, and long‐term by participation in STEM Challenge activities with an engineering design
or scientific research process focus. Thus, the purpose of pilot testing is to develop valid instruments
that reliably measure the ways in which participants’ attitudes and behaviors are impacted by
participation in these activities. Guided by the most current STEM education and measurement
methodologies, it is the goal of this rigorous instrument development and testing procedure to provide
information that becomes part of the iterative assessment and feedback process for the NASA STEM
Engagement line of business.
Hence, the goals of this cycle of pilot testing are as follows:
o Determine clarity, comprehensibility, and preliminary psychometric properties (e.g.,
validity, reliability) of these instruments; explore individual item functioning; and make any
necessary adjustments in preparation for large‐scale testing as the basis for more
sophisticated statistical testing.
o Determine an accurate response burden for these instruments.
To assuage any concerns about the respondents being able to progress from the testing scenario
description, through the survey being tested, and then to the questions on clarity and
comprehensibility, a truncated baseline survey usability testing (web‐based) protocol was
implemented with nine students in grades 5 through 8, under parental and/or adult supervision:
o 5th graders: 2
o 6th graders: 2
o 7th graders: 3
o 8th graders: 2
The respondent pool of nine students was composed of 3 African‐Americans, 3 Hispanic or
Latino/Latina, 2 White/Caucasian, and 1 Asian, corresponding to four male and five female students.
Please see Appendix A for a summary of the pre‐testing results that support this research.
IV. RESEARCH DESIGN OVERVIEW: NASA Education is using a one‐group pretest‐posttest quasi‐
experimental design. Responses will be used to validate these baseline surveys for clarity,
comprehensibility, and to determine psychometric properties with the respondent pool.
Following this pilot phase of testing, NASA Education has tentative research questions and
hypotheses to test regarding the impact of STEM Challenge activities on all participants—students and
teachers alike. Thus, this work is integral to the iterative assessment and feedback process for the
STEM Engagement line of business.
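For reference, the one‐group pretest‐posttest design can be expressed in conventional quasi‐experimental
notation as O1 X O2, where O1 is the baseline survey, X is participation in the STEM Challenge activity,
and O2 is the follow‐up survey; responses are compared across O1 and O2 for the same group of participants.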
V. TIMELINE: Pilot testing of surveys will take place approximately September 1, 2016 through February
28, 2017, coordinated with the implementation periods of the STEM Challenge activities.
VI. SAMPLING STRATEGY: NASA Education employed an estimation procedure to determine the
statistically adjusted number of respondents for the final sample size that meets the minimum criteria
for number of respondents (N ≥ 200) necessary to determine preliminary item characteristics
(Kromrey & Bacon, 1992; Reckase, 2000). This estimation procedure accounts for the potential
respondent universe, estimated variance in the respondent universe, precision desired, confidence level,
and the prior observed response rate for the category of respondents (Watson, 2001). Watson’s (2001)
sample size formula, as applied to the respondent estimates in Table 1, demonstrates the number of
respondents this pilot effort should reach in order to collect the base sample size. In brief, the
formula suggests that this pilot effort oversample EDC students by 218 respondents. NASA Education
will randomly sample EDC sites to meet the 545‐respondent minimum, but because the number of
participants in the SRC activity is less than 200, NASA Education will administer surveys for testing
to the census of participants.
Table 1. Calculation chart to determine statistically relevant number of respondents

Data Collection Source | (N) Population Estimate for FY16 | (A) Sampling Error +/‐5% (.05) | (Z) Confidence Level 95% / Alpha 0.05 | (P) Variability* 50% | Base Sample Size | Response Rate | (n) Number of Respondents
EDC students  | 2,200 | 0.0025 | 3.8416 | 0.5 | 327 | 0.6 | 545
SRC students  | 110   | N/A    | N/A    | N/A | 110 | N/A | 110
EDC Educators | 200   | N/A    | N/A    | N/A | 200 | N/A | 200
SRC Educators | 10    | N/A    | N/A    | N/A | 10  | N/A | 10
TOTAL         |       |        |        |     |     |     | 865

*Variability is based on consistency of intervention administration.
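For readers who want to reproduce the Table 1 figures, below is a minimal sketch of Watson's (2001)
sample size calculation in Python. The function and variable names are illustrative rather than part of
any NASA system, and the inputs are taken from the EDC student row of Table 1.

```python
# Minimal sketch of Watson's (2001) sample size formula (Tipsheet #60):
#   n = P(1 - P) / (A^2/Z^2 + P(1 - P)/N)
# Note that Table 1 reports the squared terms A^2 (0.0025) and Z^2 (3.8416).

def base_sample_size(population, sampling_error_sq, z_sq, variability):
    """Return the base sample size n for a finite population."""
    p_term = variability * (1 - variability)  # P(1 - P)
    return p_term / (sampling_error_sq / z_sq + p_term / population)

n_base = base_sample_size(population=2200,           # (N) FY16 estimate
                          sampling_error_sq=0.0025,  # (A^2) +/-5% error, squared
                          z_sq=3.8416,               # (Z^2) 95% confidence
                          variability=0.5)           # (P) 50% variability

n_needed = round(n_base) / 0.6  # adjust for the 0.6 prior observed response rate

print(round(n_base))                    # 327: base sample size
print(round(n_needed))                  # 545: respondents to reach
print(round(n_needed) - round(n_base))  # 218: EDC oversample
```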
VII. BURDEN HOURS: Burden calculation is based on a respondent pool of individuals as follows:

Data Collection Source | Number of Respondents | Frequency of Response | Total Minutes per Response | Total Response Burden in Hours
EDC students  | 545 | 1 | 10 | 91
SRC students  | 110 | 1 | 10 | 18
EDC Educators | 200 | 1 | 5  | 17
SRC Educators | 10  | 1 | 5  | 1
TOTAL         |     |   |    | 127
*Burden for Educators, in this instance, is calculated to determine the amount of time spent reading
instructions to student survey respondents.
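As a cross‐check on the figures above, each burden value is simply respondents × minutes per response
÷ 60, rounded to whole hours. A minimal sketch follows; the dictionary merely restates the table:

```python
# Reproduces the burden-hour arithmetic from the table above:
# respondents * minutes per response / 60, rounded to whole hours.

rows = {
    "EDC students":  (545, 10),
    "SRC students":  (110, 10),
    "EDC Educators": (200, 5),
    "SRC Educators": (10, 5),
}

total_hours = 0
for source, (respondents, minutes) in rows.items():
    hours = round(respondents * minutes / 60)
    total_hours += hours
    print(f"{source}: {hours} hours")

print(f"TOTAL: {total_hours} hours")  # 91 + 18 + 17 + 1 = 127
```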
VIII. DATA CONFIDENTIALITY MEASURES: Any information collected under the purview of this clearance
will be maintained in accordance with the Privacy Act of 1974, the e‐Government Act of 2002, the
Federal Records Act, and as applicable, the Freedom of Information Act in order to protect
respondents’ privacy and the confidentiality of the data collected.
IX. PERSONALLY IDENTIFIABLE INFORMATION:
1. Is personally identifiable information (PII) collected? Yes No
2. If yes, will any information that is collected be included in records that are subject to the
Privacy Act of 1974? Yes No
3. If yes, has an up‐to‐date System of Records Notice (SORN) been published?
Yes No
Published in October 2007, the Applicable System of Records Notice is NASA 10EDUA, NASA
Education Program Evaluation System ‐ http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html.
APPLICABLE RECORDS:
4. Applicable System of Records Notice: SORN: NASA 10EDUA, NASA Education Program
Evaluation System ‐ http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html
5. Completed surveys will be retained in accordance with NASA Records Retention Schedule 1,
Item 68D. Records will be destroyed or deleted when ten years old, or no longer needed,
whichever is longer.
X. PARTICIPANT SELECTION APPROACH:
1. Does NASA Education have a respondent sampling plan? Yes No
If yes, please define the universe of potential respondents. If a sampling plan exists,
please describe. The universe of potential respondents includes a statistically
representative sample of students participating in the engineering design STEM Challenge
activity and the census of students participating in the scientific research STEM Challenge
activity.
If no, how will NASA Education identify the potential group of respondents and how will
they be selected? Not applicable.
XI. INSTRUMENT ADMINISTRATION STRATEGY
Describe the type of Consent: Active Passive
1. How will the information be collected:
Web‐based or other forms of Social Media (95%)
Telephone
In‐person
Mail
Other (5%)
If multiple approaches are used for a single instrument, state the projected percent of
responses per approach. The feedback forms will be administered via the web. Because it is
preferable that all baseline surveys be administered at the start of an activity, hard copy surveys
will be made available to collect survey responses in the event web access is temporarily
unavailable. In the past, no more than 5% of respondents were asked to complete hard copy
surveys due to internet or computer difficulties.
2. Will interviewers or facilitators be used? Yes No
Note: “Facilitators” refers to Educators who will read and explain student survey instructions.
XII. DOCUMENTS/INSTRUMENTS ACCOMPANYING THIS REQUEST:
Consent form
Instrument (attitude & behavior scales, and surveys)
Protocol script (Specify type: Script)
Instructions NOTE: Instructions are included in the instrument
Other (Specify ________________)
XIII. GIFTS OR PAYMENT: Yes No If you answer yes to this question, please describe and provide a
justification for the amount.
XIV. ANNUAL FEDERAL COST: The estimated annual cost to the Federal government is $294. The cost is
based on an annualized effort of 7 person‐hours at the evaluator’s rate of $42/hour for administering
the survey instruments, collecting and analyzing responses, and editing the survey instruments for
ultimate approval through the methodological testing generic clearance with OMB Control Number
2700‐0159, exp. 04/30/2018.
XV. CERTIFICATION STATEMENT:
I certify the following to be true:
1. The collection is voluntary.
2. The collection is low burden for respondents and low cost for the Federal Government.
3. The collection is non‐controversial and does not raise issues of concern to other federal
agencies.
4. The results will be made available to other federal agencies upon request, while maintaining
confidentiality of the respondents.
5. The collection is targeted to the solicitation of information from respondents who have
experience with the program or may have experience with the program in the future.
Name of Sponsor: Richard Gilmore
Title: Educational Programs Specialist/Evaluation Manager, NASA GRC Office of Education
Email address or Phone number: richard.l.gilmore@nasa.gov
Date: 08/22/16
Bibliography
Farland‐Smith, D. (2012). Personal and Social Interactions Between Young Girls and Scientists: Examining
Critical Aspects for Identity Construction. Journal of Science Teacher Education, 23(1), 1‐18.
Gasiewski, J. A., Eagan, M. K., Garcia, G. A., Hurtado, S., & Chang, M. J. (2012). From gatekeeping to
engagement: A multicontextual, mixed method study of student academic engagement in
introductory STEM courses. Research in Higher Education, 53(2), 229‐261.
Kim, C., Kim, D., Yuan, J., Hill, R. B., Doshi, P., & Thai, C. N. (2015). Robotics to promote elementary
education pre‐service teachers' STEM engagement, learning, and teaching. Computers &
Education, 91, 14‐31.
Kromrey, J. D., & Bacon, T. P. (1992). Item analysis of achievement tests based on small numbers of
examinees. Paper presented at the annual meeting of the American Educational Research
Association. San Francisco.
Leblebicioglu, G., Metin, D., Yardimci, E., & Cetin, P. S. (2011). The Effect of Informal and Formal
Interaction between Scientists and Children at a Science Camp on Their Images of Scientists.
Science Education International, 22(3), 158‐174.
Maltese, A. V., & Tai, R. H. (2011). Pipeline persistence: Examining the association of educational
experiences with earned degrees in STEM among US students. Science Education, 95(5), 877‐
907.
NASA. (2016, January). NASA Education Implementation Plan 2015‐2017. Retrieved from
http://www.nasa.gov/sites/default/files/atoms/files/nasa_education_implementation_plan_ve4_2015-2017.pdf
Reckase, M. D. (2000). The minimum sample size needed to calibrate items using the three‐parameter
logistic model. Paper presented at the annual meeting of the American Educational Research
Association. New Orleans.
Watson, J. (2001). How to Determine a Sample Size: Tipsheet #60. Retrieved from Penn State
Cooperative Extension: http://www.extension.psu.edu/evaluation/pdf/TS60.pdf
Appendix A
Baseline Survey Usability Testing (Web‐based Experiment)
Target middle school students for testing:
9 students in grades 5 through 8
o 5th graders: 2
o 6th graders: 2
o 7th graders: 3
o 8th graders: 2
3 African‐Americans; 3 Hispanic or Latino/Latina; 2 White/Caucasian; and 1 Asian
5 female and 4 male students
Survey Administration:
Parents (6)
Educator (1)
Administrator (1)
Provided responsible adult (parent and/or educator) baseline survey link and
instructions
o Thank you for agreeing to participate in the testing of a NASA Student Feedback
Pre‐Survey. The survey starts with two Educator Questions (pg. 1) which must
be filled in as follows before the student introduction (pg. 2) and
instructions/survey (pg. 3) begin.
Educator Instructions: Please complete the following two items first, prior
to obtaining student responses.
*1. What is the name of the site where the student participated in
the NASA Scientific Research Challenge (SRC) Activity? Enter
location (home, site or school name)
*2. What is the student’s identification number? Enter Grade
Level
o To launch the baseline surveys, please go to the survey links below and follow
the instructions.
EDC Baseline Survey – https://www.research.net/r/NASA_EDC_PRE
SRC Baseline Survey – https://www.research.net/r/NASA_SRC_PRE
Usability Questions/Functionality to Test:
• Did student(s) understand how to navigate through the entire web‐based survey?
• Did student(s) locate the progression button (NEXT and DONE) at the bottom of the
survey?
• Were there any questions about how to navigate through the entire web‐based
survey?
• Navigation buttons to progress through survey (NEXT and DONE) and filling out test
survey
• Submitting survey responses and testing responses (complete the test survey)
Usability Testing Results/Analysis:
• Did student(s) understand how to navigate through the entire web‐based survey?
o 100% of student respondents understood how to navigate through the entire
web‐based survey.
o Adult responses to Usability Questions
1. She did understand how to navigate through.
2. He navigated well but got a little frustrated because he didn't understand
why he was doing it. He didn't have any problems and got through it by
himself.
3. Yes
4. Yes, she understood how to navigate through the entire survey.
5. Yes, she understood how to navigate through the entire web‐based survey
with no difficulty.
6. I read through the instructions with both students and I made sure
they understood what was expected of them before taking the survey.
Neither one of the students had questions.
• Did student(s) locate the progression button (NEXT and DONE) at the bottom of the
survey?
o 100% of student respondents were able to locate the progression button (i.e.,
NEXT and DONE) at the bottom of the survey page online.
o Adult responses to Usability Questions
1. She was able to locate the next buttons without any issues.
2. Yes. He knew the next and done and had no problems.
3. Yes
4. Yes, she was able to locate next or done button at the bottom of the
page.
5. Yes, she was able to successfully locate the progression buttons at the
bottom of the survey.
6. Both students were able to locate the next and done buttons at the
bottom of the survey with no problems.
• Were there any questions about how to navigate through the entire web‐based
survey?
o None of the students had questions regarding navigation of the web‐based test
survey.
o Adult responses to Usability Questions
1. There were no questions. She took the computer to her room and
completed the survey. She came down finished with the 'thank you for
participating' screen showing. She said that it was 'really easy'.
2. He really didn't have an idea what engineering is.
3. NO: however, she did have trouble staying on the correct line at times
4. No, there were not any questions about how to navigate through the
survey.
5. She did not know how to respond to the final text box (instruction
recommendations). She did not know if this was optional.
6. No, neither student had any question about navigating through the entire
web‐based survey.
• Navigation buttons to progress through survey (NEXT and DONE) and filling out test
survey
o 100% of respondent students demonstrated that they were able to navigate
through the web‐based test survey.
• Submitting survey responses and testing responses (complete the test survey)
o 100% of respondent students were able to complete the test survey, submitting
survey and testing responses online.
• How many minutes did it take you to read the instructions and answer the questions?
o Recorded times ranged from 5 minutes to 18 minutes
o Average time to complete EDC baseline survey (6 minutes)
o Average time to complete SRC baseline survey (9 minutes)
• 100% of respondent students either Agreed or Strongly Agreed with the following
statements:
o The survey instructions were clear.
o The questions were easy to understand.
• As you went through the survey, did you think of any comments or feedback you can
give us about the instructions or questions?
o None
o No
o No.
o Nope
o No
o No
o Why the questions on this survey was all about science?
o None
o 1 blank response
Recommended revision to grade level question based on review of survey responses:
Original
1. What grade did you enter last fall 20XX? [To be autocompleted at administration]
5th 6th 7th 8th
Revised
1. What grade did you enter this school year?
5th 6th 7th 8th