REQUEST FOR APPROVAL under the Generic Clearance for NASA STEM Engagement Performance Measurement and Evaluation, OMB Control Number 2700-0159, expiration 09/30/2024
_____________________________________________________________________________________
TITLE OF INFORMATION COLLECTION:
NASA Intern Survey
TYPE OF COLLECTION:
Attitude/Behavior Scale
Baseline Survey
Cognitive Interview Protocol
Consent Form
Focus Group Protocol
Follow-up Survey
Instructions
Satisfaction Survey
Usability Protocol
GENERAL OVERVIEW: NASA Science, Technology, Engineering, and Mathematics (STEM) Engagement comprises a broad and diverse set of programs, projects, activities, and products developed and implemented by HQ functional Offices, Mission Directorates, and Centers. These investments are designed to attract, engage, and educate students, and to support educators and educational institutions. NASA’s Office of STEM Engagement (OSTEM) delivers participatory, experiential learning and STEM challenge activities for young Americans and educators to learn and succeed. NASA STEM Engagement seeks to:
Create unique opportunities for students and the public to contribute to NASA’s work in exploration and discovery.
Build a diverse future STEM workforce by engaging students in authentic learning experiences with NASA people, content, and facilities.
Strengthen public understanding by enabling powerful connections to NASA’s mission and work.
The NASA Internships Program leverages NASA’s unique missions and programs to enhance and increase the capability, diversity and size of the nation’s future science, technology, engineering and mathematics (STEM) workforce. Internships are available from high school to graduate level. Internships provide students with the opportunity to participate in either research or other experiential learning, under the guidance of a mentor at NASA.
The NASA Intern Survey for this information collection is specific to determining the immediate outcomes of participating in NASA Internships, assessing how and to what extent interns are contributing to NASA’s missions, and identifying sources of group differences and how NASA can address them to continue to broaden participation of historically underrepresented groups in STEM fields.
INTRODUCTION AND PURPOSE: This revised NASA Intern Survey was developed as an adapted version of the valid and reliable Apprentice Questionnaire for the Department of Defense (DoD) Army Education Outreach Program (AEOP) after piloting and validating its use with NASA interns. This revised NASA Intern Survey assesses student attitudes toward science, mathematics, engineering, and technology, 21st century skills, and student interest in STEM careers. In the NASA Internship Program, the focus is a design task in which students must meet certain criteria through a series of steps that engineers follow to arrive at a solution to a problem. This engineering problem is set within the context of NASA-unique content, with support from subject matter experts.
Our interest is to measure students’ immediate outcomes of participating in a NASA internship and assess how and to what extent interns are contributing to NASA’s missions. Additionally, we want to identify sources of group differences and address how NASA can continue to broaden participation of students from historically underrepresented groups in STEM fields. Thus, the purpose for testing is to revalidate the instrument and its reliability to explain the ways in which participants’ attitudes and behaviors are impacted by participation in the NASA Internship Program.
RESEARCH DESIGN OVERVIEW: Research has demonstrated that internships and work-based learning experiences are positively associated with student outcomes such as STEM concept knowledge and STEM persistence (National Academies of Sciences, Engineering, and Medicine, 2017). Thus, participation in such experiences has been viewed as an important evidence-based practice to addressing current STEM workforce needs. Although there is an extant literature documenting the outcomes of such experiences on students, there is much less research documenting the contributions of such experiences to the STEM field.
NASA Intern Survey items will be placed into Survey Monkey online software, and a survey link will be distributed through email to ~2500 NASA Intern participants. Quantitative and qualitative methods will be used to analyze survey data. Quantitative data will be summarized using descriptive statistics such as numbers of respondents, frequencies and proportions of responses, average response when response categories are assigned to a Likert scale (e.g., 1 = “Never Used” to 4 = “Used Every Day”), and standard deviations. Emergent coding will be used for the qualitative data to identify the most common themes in responses.
Descriptive survey item analysis. Descriptive statistics will be used to describe survey participant responses to individual items that do not form a construct or scale. Charts or tables with average participant responses and/or frequencies and percentages will be reported. Percentages of responses will be reported in charts unless the percentage for a category is less than 5%, in which case the percentage is not labeled.
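As a purely illustrative sketch (not part of the approved analysis plan), the descriptive summaries described above — numbers of respondents, category frequencies and proportions, Likert-coded means, and standard deviations — might be computed for a single item as follows; the response data shown are hypothetical:

```python
from collections import Counter
from statistics import mean, stdev

# Hypothetical Likert-coded responses to one survey item,
# coded 1 = "Never Used" ... 4 = "Used Every Day".
responses = [1, 2, 2, 3, 4, 4, 4, 3, 2, 4]

n = len(responses)                                    # number of respondents
freqs = Counter(responses)                            # frequency per category
props = {k: v / n for k, v in sorted(freqs.items())}  # proportion per category
avg = mean(responses)                                 # average Likert response
sd = stdev(responses)                                 # sample standard deviation

print(f"n = {n}")
for category, prop in props.items():
    print(f"category {category}: {freqs[category]} responses ({prop:.0%})")
print(f"mean = {avg:.2f}, SD = {sd:.2f}")
```

For the hypothetical data above this prints a mean of 2.90 and an SD of about 1.10; in practice these summaries would be produced for each non-construct item and reported in the charts or tables described.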
Construct survey item analysis. Rasch (1960/1980) measurement was previously employed to assess the construct sections of the NASA Intern Survey in the Spring of 2021 (Sondergeld & Johnson, 2021). Pilot results showed all construct sections functioned well and could be used to form respective scales, or composite measures. Thus, items in the different survey construct sections will first be descriptively analyzed and an average scale score computed for the purpose of looking for significant differences in each construct by intern demographic variables (gender, minority status, disability status, English language status, first generation college status, Pell Grant eligible, high school free/reduced lunch eligible, high school location, term completed internship) and overall U2 status. Because all grouping variables are dichotomous, independent samples t-tests will be conducted to investigate group differences on each STEM composite variable (scale). Additionally, a dependent samples t-test will be implemented to determine if interns reported engaging with STEM tasks more or less frequently while in school compared to their STEM internship program.
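As a hedged sketch of the two planned comparisons (an independent-samples t-test for dichotomous group differences on a composite scale, and a dependent-samples t-test for in-school versus internship task frequency), the statistics can be computed as below. All scores are hypothetical, and in practice a statistical package (e.g., SciPy, SPSS, or R) would be used rather than hand-rolled functions:

```python
from math import sqrt
from statistics import mean, stdev

def independent_t(group_a, group_b):
    """Pooled-variance independent-samples t statistic."""
    na, nb = len(group_a), len(group_b)
    pooled_var = (((na - 1) * stdev(group_a) ** 2
                   + (nb - 1) * stdev(group_b) ** 2)
                  / (na + nb - 2))
    return (mean(group_a) - mean(group_b)) / sqrt(pooled_var * (1 / na + 1 / nb))

def paired_t(before, after):
    """Dependent-samples (paired) t statistic on difference scores."""
    diffs = [a - b for a, b in zip(after, before)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Hypothetical composite scale scores for two dichotomous groups
scores_group1 = [3.2, 3.5, 2.9, 3.8, 3.1]
scores_group2 = [2.8, 3.0, 2.6, 3.1, 2.9]
print(f"independent t = {independent_t(scores_group1, scores_group2):.2f}")

# Hypothetical STEM-task frequency: while in school vs. during internship
in_school = [2.0, 2.5, 1.8, 2.2, 2.4]
internship = [3.1, 3.4, 2.9, 3.0, 3.6]
print(f"paired t = {paired_t(in_school, internship):.2f}")
```

The resulting t statistics would be compared against the appropriate critical value at alpha = 0.05 to test for significant group or within-person differences.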
Open-ended survey item and interview data analysis. Conceptual content analysis (Christie, 2007) will be implemented for all open-ended survey item responses and interview data in this evaluation study because this method of qualitative analysis allows for identifying concepts found in a text and determining their frequency. Multiple cycles of manual coding will be conducted to ultimately identify themes following Saldana’s (2013) process of moving from the particular (raw data) to the general (themes). Our analysis process will begin with initial coding by one researcher thoroughly reading participant responses and taking notes on commonalities. Next, the same researcher will perform line-by-line coding, with every segment or line being given a code. Categorization of codes will determine overarching categories within the data. The first researcher will share codes, categories, and analyses with a second researcher, who will check their work for clarity and consistency. The two researchers will collaboratively identify themes by collapsing categories due to overlap or redundancy, and broader themes will be named. Appropriate descriptive statistics (number of cases and percentages) will be computed and reported for each theme to demonstrate importance or weight (Pyrczak, 2008). Theme descriptions with aligned direct quotes from participants will be provided to support findings.
TIMELINE: Testing of the NASA Intern Survey will take place in May 2022 – April 2023 with internship program student participants, coordinated with the implementation sessions of the NASA STEM Engagement Internship Program.
SAMPLING STRATEGY: The universe of NASA intern participants is 2500 or below. NASA Intern Survey items will be placed into Survey Monkey online software, and a survey link will be distributed through email to ~2500 NASA Intern participants.
Table 1. Calculation chart to determine statistically relevant number of respondents
Data Collection Source | (N) Population Estimate | (A) Sampling Error +/- 5% (.05) | (Z) Confidence Level 95% / Alpha 0.05 | (P) *Variability (based on consistency of intervention administration) 50% | Base Sample Size | Response Rate | (n) Number of Respondents
NASA Intern Participants | 2500 | N/A | N/A | N/A | 2500 | N/A | 2500
TOTAL | | | | | | | 2500
BURDEN HOURS: Burden calculation is based on a respondent pool of individuals as follows:
Data Collection Source | Number of Respondents | Frequency of Response | Total Minutes per Response | Total Response Burden in Hours
NASA Intern Participants | 2500 | 1 | 20 | 833.33
TOTAL | | | | 833.33
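The total burden figure follows from simple arithmetic: 2,500 respondents × 1 response each × 20 minutes per response, converted to hours. A quick check of the calculation:

```python
respondents = 2500
responses_each = 1
minutes_per_response = 20

total_minutes = respondents * responses_each * minutes_per_response
burden_hours = total_minutes / 60  # convert minutes to hours

print(f"{burden_hours:.2f} hours")  # 833.33 hours
```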
DATA CONFIDENTIALITY MEASURES: Any information collected under the purview of this clearance will be maintained in accordance with the Privacy Act of 1974, the e-Government Act of 2002, the Federal Records Act, and as applicable, the Freedom of Information Act in order to protect respondents’ privacy and the confidentiality of the data collected.
PERSONALLY IDENTIFIABLE INFORMATION:
Is personally identifiable information (PII) collected? Yes No
– NOTE: First and Last Name are not collected but demographic information is collected (e.g., gender, ethnicity, race, grade level etc.)
If yes, will any information that is collected be included in records that are subject to the Privacy Act of 1974? Yes No
If yes, has an up-to-date System of Records Notice (SORN) been published?
Yes No
Published March 17, 2015, the Applicable System of Records Notice is NASA 10EDUA, NASA STEM Engagement Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html.
APPLICABLE RECORDS:
Applicable System of Records Notice: SORN: NASA 10EDUA, NASA STEM Engagement Program Evaluation System - http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html
Completed surveys will be retained in accordance with NASA Records Retention Schedule 1,
Item 68D. Records will be destroyed or deleted when ten years old, or no longer needed, whichever is longer.
PARTICIPANT SELECTION APPROACH:
Does NASA STEM Engagement have a respondent sampling plan? Yes No
If yes, please define the universe of potential respondents. If a sampling plan exists, please describe? The universe of NASA intern participants is 2500 or below. NASA Intern Survey items will be placed into Survey Monkey online software, and a survey link will be distributed through email to ~2500 NASA Intern participants.
If no, how will NASA STEM Engagement identify the potential group of respondents and how will they be selected? Not applicable.
INSTRUMENT ADMINISTRATION STRATEGY
Describe the type of Consent: Active Passive
How will the information be collected:
Web-based or other forms of Social Media
Telephone
In-person
Other
If multiple approaches are used for a single instrument, state the projected percent of responses per approach.
Will interviewers or facilitators be used? Yes No
DOCUMENTS/INSTRUMENTS ACCOMPANYING THIS REQUEST:
Consent form
Instrument (attitude & behavior scales, and surveys)
Protocol script (Specify type: Script)
Instructions NOTE: Instructions are included in the instrument
Other (Specify ________________)
GIFTS OR PAYMENT: Yes No If you answer yes to this question, please describe and provide a justification for amount.
ANNUAL FEDERAL COST: The estimated annual cost to the Federal government is $1,200. The cost is based on an annualized effort of 20 person-hours at the evaluator’s rate of $60/hour for administering the survey instrument, collecting and analyzing responses, and editing the survey instrument for ultimate approval through the methodological testing generic clearance with OMB Control Number 2700-0159, exp. 09/30/2024.
CERTIFICATION STATEMENT:
I certify the following to be true:
The collection is voluntary.
The collection is low burden for respondents and low cost for the Federal Government.
The collection is non-controversial and does not raise issues of concern to other federal agencies.
The results will be made available to other federal agencies upon request, while maintaining confidentiality of the respondents.
The collection is targeted to the solicitation of information from respondents who have experience with the program or may have experience with the program in the future.
Name of Sponsor: Richard Gilmore
Title: Performance Assessment and Evaluation Program Manager, NASA
Office of STEM Engagement (OSTEM)
Email address or Phone number: richard.l.gilmore@nasa.gov
Date:
References
Christie, C. (2007). Content analysis. In R. Baumeister & K. Vohs (Eds.), Encyclopedia of social psychology (p. 176). SAGE.
National Academies of Sciences, Engineering, and Medicine. (2017). Undergraduate research experiences for STEM students: Successes, challenges, and opportunities. National Academies Press.
Pyrczak, F. (2008). Evaluating research in academic journals: A practical guide to realistic evaluation (4th ed.). Pyrczak Publishing.
Rasch, G. (1960/1980). Probabilistic models for some intelligence and attainment tests (Copenhagen: Danish Institute for Educational Research), with foreword and afterword by B. D. Wright. The University of Chicago Press.
Saldana, J. (2013). The coding manual for qualitative researchers (2nd ed.). SAGE.
Sondergeld, T. A., & Johnson, C. C. (2021). NASA intern study: Quantitative field study of intern survey. 1–18.
NASA Office of STEM Engagement
Author: Teel, Frances C. (HQ-JF000)