Generic Clearance for the NASA Office of Education Performance Measurement and Evaluation (Testing)

OMB: 2700-0159

A. JUSTIFICATION .................................................................................................................. 1
1. Necessity For Information Collection:.................................................................................... 1
2. Uses of Information ................................................................................................................ 3
3. Considerations of Using Improved Technology ..................................................................... 5
4. Efforts to Identify Duplication ................................................................................................ 7
5. Efforts to Minimize Burden on Small Business ..................................................................... 8
6. Consequences of Less Frequent Data Collection.................................................................... 8
7. Special Circumstances ............................................................................................................ 8
8. Federal Register Announcement and Consultation Outside the Agency ................................ 9
9. Payment or Gifts to Respondents ............................................................................................ 9
10. Assurance of Confidentiality .............................................................................................. 10
11. Justification for Sensitive Questions ................................................................................... 10
12. Estimate of Respondent Burden.......................................................................................... 11
13. Cost Burden to Respondents ............................................................................................... 13
14. Cost Burden to Federal Government .................................................................................. 14
15. Reason for Change in Burden ............................................................................................. 14
16. Schedule for Information Collection and Publication ........................................................ 14
17. Display of OMB Expiration Date ....................................................................................... 15
18. Exception to the Certificate Statement ............................................................................... 15
References .................................................................................................................................... 17
APPENDIX A: NASA Education Mission and Goals .............................................................. 19
APPENDIX B: NASA Center Education Offices ..................................................................... 21
APPENDIX C: Data Instrument Collection Testing Participation Generic Consent Form 23
APPENDIX D: Descriptions of Methodological Testing Techniques .................................... 25
APPENDIX E: Privacy Policies and Procedures ..................................................................... 28
APPENDIX F: Overview: NASA Education Data Collection Instrument Development Process ......... 30

FROM OUTPUTS TO SCIENCE, TECHNOLOGY, ENGINEERING, AND MATHEMATICS (STEM) EDUCATION OUTCOMES MEASUREMENT: DATA COLLECTION INSTRUMENT DEVELOPMENT PROCESS ............................. 30
APPENDIX G: Explanatory Content for Information Collections for Testing Purposes ... 33
List of Tables ............................................................................................................................... 35


GENERIC CLEARANCE FOR THE NASA OFFICE OF EDUCATION
PERFORMANCE MEASUREMENT AND EVALUATION (TESTING)
SUPPORTING STATEMENT

A. JUSTIFICATION

1. NECESSITY FOR INFORMATION COLLECTION:
Explain the circumstances that make the collection of information necessary. Identify any legal
or administrative requirements that necessitate the collection. Attach a copy of the appropriate
section of each statute and regulation mandating or authorizing the collection of information.
One of NASA’s missions is to drive advances in science and technology. The NASA Office of
Education (NASA Education) supports that mission by deploying programs to advance the next
generation’s educational endeavors and expand partnerships with academic communities (see
Appendix A). The Office of Education Infrastructure Division (OEID) was formed to support
NASA’s approach to science, technology, engineering, and mathematics (STEM) education by
implementing the principles of transparency, participation, and collaboration throughout all of its
education activities executed through headquarters and across the ten Center Education Offices
(see Appendix B). The OEID’s goal is to provide support that improves education policy and
decision-making, provides better education services, increases accountability, and ensures more
effective administration. Since the OEID’s launch, team members have been leading and
supporting the NASA Education community in the areas of information technology,
dissemination and Web services, communications and operations support, and performance
assessment using a systematic approach.
The purpose of this request for clearance for methodological testing is to significantly enhance the quality of the Office of Education's data collection instruments and overall data management through interdisciplinary scientific research, utilizing best practices in educational, psychological, and statistical measurement. NASA Education is committed to producing the most accurate and complete data, within the highest quality assurance guidelines, for reporting by NASA Education leadership and under the authority of the Government Performance and Results Modernization Act (GPRMA) of 2010, which requires quarterly performance assessment of Government programs for purposes of assessing agency performance and improvement. It is with this mission in mind, then, that this clearance package is submitted.1
The OEID leads performance measurement and program evaluation activities within the Office of Education. Responsibilities include recommending and implementing agency-wide strategy for performance measurement and evaluation; ensuring the collection of high-quality data; documenting the processes of NASA Education projects; conducting formative and outcome evaluations; and providing training and technical assistance on performance measurement and evaluation.

1 The entire GPRMA of 2010 can be accessed at http://www.gpo.gov/fdsys/pkg/BILLS-111hr2142enr/pdf/BILLS-111hr2142enr.pdf.


Toward monitoring the performance of its education activities, NASA Education will use rigorously developed and tested instruments administered and accessed through the Office of Education Performance Measurement system.2 Each data collection form type presents unique challenges, which can relate to respondent characteristics, survey content, or form of administration. In the absence of meticulous methods, such issues impede the effectiveness of instruments and would decrease the value of the data gathered through them for both NASA Education and the Agency.
The central purpose of measurement is to provide a rational and consistent way to summarize the responses people make to express achievement, attitudes, or opinions through instruments such as achievement tests or questionnaires (Wilson, 2005, p. 5). In this particular instance, our interest lies in attitude and behavior scales, surveys, and psychological scales related to the goals of NASA STEM education activities. However, because NASA Education captures participant administrative data from activity application forms and program managers submit administrative data, OEID extends the definition of instruments to include electronic data collection screens, project activity survey instruments, and program application forms as well.3 Research-based quality control methods and techniques are integral to obtaining accurate, robust, high-quality data that can assist leaders in policy decisions.
The following research techniques and methods may be used in these studies:

• Usability testing: Pertinent are the aspects of the web user interface (UI) that impact the user's experience and the accuracy and reliability of the information users submit (Kota, n.d.; Jääskeläinen, 2010).

• Think-aloud protocols: This data elicitation method is also called 'concurrent verbalization', meaning subjects are asked to perform a task and to verbalize whatever comes to mind during task performance. The written transcripts of the verbalizations are referred to as think-aloud protocols (TAPs) (Jääskeläinen, 2010, p. 371) and constitute the data on the cognitive processes involved in a task (Ericsson & Simon, 1984/1993).

• Focus group discussion: With groups of nine or fewer per instrument, this qualitative approach to data collection forms the basis for brainstorming to creatively solve remaining problems identified after early usability testing of data collection screen and program application form instruments (Colton & Covert, 2007, p. 37).

• Comprehensibility testing: Comprehensibility testing of program activity survey instrumentation will determine whether items and instructions make sense, are unambiguous, and are understandable by those who will complete them (Colton & Covert, 2007, p. 129).

2 The Office of Education Performance Measurement System (OEPM) is the project-level data component of NASA Education's data collection suite. It is an automated system for collecting, managing, and securing data, and uses web-interfaced online data collection screens with a back-end database. As an automated information technology system serving as the centralized collection point for NASA Education performance measurement data, OEPM reduces respondent burden by: 1) bringing clarity to the exact nature of data required of program managers; 2) consolidating disparate NASA Education systems in use throughout the NASA Centers of Education; 3) providing a means to monitor project performance data for the purposes of determining education-related outputs and outcomes; 4) improving the quality of performance measurement data (i.e., a monitoring mechanism for missing data points); and 5) refining reporting consistency through automated reminder functionality.
3 If constituted as a form, once approved by OMB, forms will be submitted to NASA Forms Management according to NASA Policy Directive (NPD) 1420. Thus, forms used under this clearance will have both an OMB control number and an NPD 1420 control number that also restricts access to NASA internal users only. Instruments not constituted as forms will display an OMB control number only.

• Pilot testing: Testing with a random sample of at least 200 respondents to yield preliminary validity and reliability data (Haladyna, 2004; Komrey & Bacon, 1992; Reckase, 2000; Wilson, 2005).

• Large-scale statistical testing: Instrument testing conducted with a statistically representative sample of responses from a population of interest. In the case of developing scales, large-scale statistical testing provides sufficient data points for exploratory factor analysis, a "large-sample" procedure (Costello & Osborne, 2005, p. 5).

• Item response approach to constructing measures: A systematic and rigorous process will draw on foundations for multiple-choice testing that address the importance of item development for validity purposes, align item content with the cognitive processes of instrument respondents, and acknowledge guidelines for proper instrument development (DeMars, 2010).

• Split-half method: This method is an efficient alternative to parallel-forms or test/retest methods because it does not require developing alternate forms of a survey and it reduces burden on respondents, who complete a single test rather than two tests to yield sufficient data for reliability coefficients.
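To illustrate the split-half computation described above, the following minimal sketch (illustrative only; it is not NASA Office of Education production code, and the response matrix is simulated rather than actual participant data) computes a split-half reliability coefficient with the Spearman-Brown correction for a respondents-by-items score matrix:

# Illustrative sketch only: split-half reliability with the Spearman-Brown
# correction, assuming a numeric respondents-by-items score matrix.
import numpy as np

def split_half_reliability(scores: np.ndarray) -> float:
    """scores: rows = respondents, columns = items."""
    odd_half = scores[:, 0::2].sum(axis=1)      # items 1, 3, 5, ...
    even_half = scores[:, 1::2].sum(axis=1)     # items 2, 4, 6, ...
    r = np.corrcoef(odd_half, even_half)[0, 1]  # correlation between the half-tests
    return 2 * r / (1 + r)                      # Spearman-Brown step-up formula

rng = np.random.default_rng(0)
simulated = rng.integers(1, 6, size=(200, 20))  # simulated 1-5 Likert-style responses
print(round(split_half_reliability(simulated), 3))  # near zero for random data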

The OEID’s goal and purpose for data collection through methodological testing is to provide
support that improves education policy and decision-making, provides better education services,
increases accountability, and ensures more effective administration within the NASA Office of
Education. More in-depth descriptions of these techniques and methods can be found in Appendix D.

2. USES OF INFORMATION
Indicate how, by whom, and for what purpose the information is to be used. Except for a new
collection, indicate the actual use the agency has made of the information received from the
current collection.
The purpose of this data collection by the OEID is to ultimately improve our Federal data
collection processes through scientific research. Theories and methods of cognitive science, in
combination with qualitative and statistical analyses, provide essential tools for the development
of effective, valid, and reliable data collection instrumentation.
The OEID’s methodological testing is expected to 1) improve the data collection instruments
employed by NASA Office of Education, 2) increase the accuracy of the data produced by
execution of NASA Education project activities upon which policy decisions are based, 3)
increase the ease of administering data collection instruments for both respondents and those
responsible for administering or providing access to respondents, 4) increase response rates as a
result of reduced respondent burden, 5) increase the ease of use of the data collection screens
within the Office of Education Performance Measurement system, and 6) enhance NASA Education's confidence in and respect for the data collection instrumentation utilized by the NASA Education community.
The application of cognitive science, psychological theories, and statistical methods to data
collection is widespread and well established. Neglecting accepted research practices and relying
on trial and error negatively impact data quality and unfairly burden respondents and
administrators of data collection instruments. For example, without knowledge of what
respondents can be expected to remember about a past activity and how to ask questions that
effectively aid in the retrieval of the appropriate information, researchers cannot ensure that
respondents will not take shortcuts to avoid careful thought in answering the questions, or be
subject to undue burden. Similarly, without investigating potential respondents’ roles and
abilities in navigating electronic data collection screens, researchers cannot ensure that
respondents will read questions correctly with ease and fluency, navigate electronic data screens
properly or efficiently, or record requested information correctly and consistently. Hence,
consequences of failing to scientifically investigate the data collection process should and can be
avoided.
In light of the Administration’s call for increased sharing of federal STEM education resources
through interagency collaborations, NASA Education may make available results of
methodological testing to other federal STEM agencies in the form of peer-reviewed methods
reports or white papers describing best practices and lessons learned. For instance, from inception NASA has supported the Federal Coordination in STEM (FC-STEM) Graduate and Undergraduate STEM Education interagency working groups' efforts to determine cross-agency common metrics and to share effective program evaluations. Coordination Objective 2, "Build and use evidence-based approaches," calls for agencies to:
Conduct rigorous STEM education research and evaluation to build evidence
about promising practices and program effectiveness, use across agencies, and
share with the public to improve the impact of the Federal STEM education
investment. (National Science and Technology Council, 2013, p. 45)
The methods to be employed in developing and testing data collection instruments will be
methodologically sound, rigorously obtained, and will thus constitute evidence worthy of
dissemination through appropriate vehicles.
The first project in development comprises data collection instruments appropriate for participants in a postsecondary NASA Education research experience; the instruments are specific to the category of participant: undergraduate student, graduate student, or mentor. One survey instrument explores a participant's preparation for a research experience, while its complement explores a participant's attitudes and behaviors pre- and post-experience (undergraduate or graduate student) (Crede & Borrego, 2013). Two non-cognitive competency scales explore a participant's developmental levels of affect (grit, and mathematics self-identity and self-efficacy) as related to participation in a NASA Education research experience (Duckworth, Peterson, Matthews, & Kelly, 2007; National Center for Education Statistics, 2009). Lastly, the mentor survey explores a mentor's attitudes and behaviors associated with participation as a mentor of a NASA Education research experience (Crede & Borrego, 2013). Appendix G shows the explanatory content that will accompany each information collection for methodological testing purposes.

3. CONSIDERATIONS OF USING IMPROVED TECHNOLOGY
Describe whether, and to what extent, the collection of information involves the use of
automated, electronic, mechanical, or other technological collection techniques or other forms
of information technology, e.g., permitting electronic submission of responses, and the basis for
the decision for adopting this means of collection. Also describe any consideration of using
information technology to reduce burden.
The OEID will plan, conduct, and interpret field and laboratory research that contributes to the
design of electronic data collection screens, project activity survey instruments, and program
application forms used within the context of the NASA Education community spread across ten
Center Education Offices. These efforts are supported in two ways: by the use of information technology applications, and by strategic efforts to improve the overall information technology data collection systems used by NASA Education.
Use of Information Technology (IT) Applications
IT applications will be used to bridge the distance between the OEID team of researchers, mostly based at headquarters in Washington, DC, and study participants at the Centers; multiple modes of technology may be used to bring the laboratory environment to study participants at various Center locales. In addition, data management and analysis applications have been made available to study leads to optimize data collection and analyses.
Different laboratory methods may be used in different studies depending on the aspects of the data collection process being studied. Computer technology will be used when appropriate to aid the respondents and interviewers, and to minimize burden. For instance, the OEID team may use Adobe Connect, VidyoDesktop, or VidyoWeb to conduct focus groups and cognitive interviews if there is inadequate representation of participant populations at area NASA research centers.4,5 Adobe Connect and VidyoDesktop platforms are used throughout the NASA research centers and have the potential to facilitate instrument development by providing access to appropriate study participants. The OEID team has direct access to, and is also being trained in using, other IT applications to facilitate this work, as described below:
• Adobe Connect: Adobe Systems Incorporated describes Adobe Connect as "a web conferencing platform for web meetings, eLearning, and webinars [that] powers mission critical web conferencing solutions end-to-end, on virtually any device, and enables organizations […] to fundamentally improve productivity."

• VidyoDesktop: Key features include Ultra HD 4K support to display rich content and multiple full HD participants; multiple user-selectable layouts for continuous presence, active speaker, and shared content; support in Windows and Mac environments; in-conference public and private text chat, with the ability to switch between multiple streams of shared content; and far-end camera control of Vidyo. Benefits include conference hosting in your own virtual conference room with simple click-to-connect access for both administered users and guests, and operation on existing computers and laptops with no need for expensive dedicated appliances. The VidyoWeb browser plug-in provides guest participants a comparable in-conference experience to VidyoDesktop, but without user account or special software requirements.

• SurveyMonkey: This application may be used to collect non-sensitive, non-confidential qualitative responses to determine preliminary validity. This online survey software provides an electronic environment for distributing survey questionnaires.6 For the purposes of NASA Education, SurveyMonkey is a means by which feedback can be collected from a variety of participants, such as subject matter experts, in the early stages of instrument development, when operationalizing a construct is vital to the process. Operationalization provides a tangible means to measure a construct, since a construct cannot be observed directly (Colton & Covert, 2007, p. 66). The qualitative feedback of subject matter experts, in addition to the research literature, provides the factors or variables associated with constructs of interest. SurveyMonkey will facilitate the gathering of such information and will interface with NVivo 10 for Windows qualitative software for analyses and consensus towards developing valid items and instruments.

• NVivo 10 for Windows: This software is a platform for analyzing multiple forms of unstructured data. It provides powerful search, query, and visualization tools. A few features pertinent to instrument development include pattern-based auto-coding to code large volumes of text quickly, functionality to create and code transcripts from imported audio files, and the convenience of importing survey responses directly from SurveyMonkey.7

• STATA SE v14: This data analysis and statistical software features advanced statistical functionality with programming that accommodates analysis, testing, and modeling of large data sets with the following characteristics: maximum number of variables, 32,767; maximum number of right-hand variables, 10,998; and unlimited observations. These technical specifications allow for the statistical calculations needed to determine and monitor, over time, the item functioning and psychometric properties of NASA Office of Education data collection instrumentation.8

4 More information on Adobe applications is available at http://www.adobe.com/products/adobeconnect.html
5 More information on Vidyo applications is available at http://info.vidyo.com/schedule-live-vidyodemo.html?utm_source=bing&utm_medium=cpc&utm_campaign=Brand+-+Vidyo(US)&utm_adgroup=BrandVidyo&utm_term={keyword&_kk=vidyo}
6 More information on SurveyMonkey can be found at https://www.surveymonkey.com/mp/take-a-tour/?ut_source=header. This application has been approved by the OCIO for uses not requiring a high level of security. In that regard, NASA Office of Education has a license to this application.
7 More information is available at http://www.qsrinternational.com/products_nvivo.aspx
8 More information is available at http://www.stata.com/products/which-stata-is-right-for-me/#SE
Strategic Planning and Designing Improved Information Technology Data Collection Systems
NASA OEID has invested much time and effort in developing secure information technology applications that will be leveraged for instrument piloting and for routine deployment, enabling large-scale statistical testing of data collection instruments. New information technology applications, the Composite Survey Builder and Survey Launcher, are in development with the NASA Enterprise Applications Competency Center (NEACC). The Survey Launcher application will allow OEID to reach several hundred NASA project activity participants via email, whereas the Composite Survey Builder will allow OEID to administer data collection instruments approved by the Office of Management and Budget (OMB) Office of Information and Regulatory Affairs via emailed web survey links. OEID will leverage this same technology to maximize response rates for piloting and for routine deployment of data collection instruments.
Most recently, NASA Office of Education has acquired a full-time subject matter expert (SME) specifically tasked with strategizing approaches to make the Office's IT systems and applications more responsive to Federal mandates as well as to the needs of the Education community. This person's work is intended to lay the foundation for fiscally responsible IT development now and in the future.
Recall that participants in focus groups and cognitive interviews must mirror, in as many characteristics as possible, the sample of participants with which the instrument will eventually be tested and then administered. Using technology to employ qualitative and quantitative methods is a means to establish validity from the outset, prior to field testing, and to apply the quantitative measures that determine instrument reliability and validity, all while monitoring and minimizing burden on study participants. Having the proper IT foundations in place for this work is a NASA Office of Education priority.

4. EFFORTS TO IDENTIFY DUPLICATION
Describe efforts to identify duplication. Show specifically why any similar information already
available cannot be used or modified for use for the purposes described in Item 2 above.
Because developing new valid and reliable data collection instrumentation is a relatively new
procedure for NASA Education, many participants within our community have yet to participate
in this kind of procedure. Participation in instrument development or testing is not mandatory.
Further, to reduce burden, any participant within our community recruited to participate in
instrument development will only be solicited to contribute effort towards a single instrument,
unless he or she volunteers for other opportunities. The OEID team will attempt to reduce some of the testing burden by identifying appropriate valid and reliable instruments/scales through
Federal resources or the educational measurement research literature.

5. EFFORTS TO MINIMIZE BURDEN ON SMALL BUSINESS
If the collection of information impacts small businesses or other small entities (Item 5 of OMB
Form 83-I), describe any methods used to minimize burden.
Not applicable. NASA Office of Education does not collect information from any small business
or other small entities.

6. CONSEQUENCES OF LESS FREQUENT DATA COLLECTION
Describe the consequence to Federal program or policy activities if the collection is not
conducted or is conducted less frequently, as well as any technical or legal obstacles to reducing
burden.
This planned collection of data will allow OEID the opportunity to design appropriate valid and
reliable data collection instrumentation, and the prerogative to modify and alter instruments in an
on-going manner in response to changes in respondent demographics and the NASA Office of
Education portfolio of activities. Because this collection is expected to be an on-going effort, it
has the potential to have immediate impact on all data collection instrumentation within NASA
Education. Any delay would sacrifice potential gains in development of and modification to data
collection instrumentation as a whole.

7. SPECIAL CIRCUMSTANCES
Explain any special circumstances that would cause an information collection to be conducted in
a manner: requiring respondents to report information to the agency more often than quarterly;
requiring respondents to prepare a written response to a collection of information in fewer than
30 days after receipt of it; requiring respondents to submit more than an original and two copies
of any document; requiring respondents to retain records, other than health, medical,
government contract, grant-in-aid, or tax records, for more than three years; in connection with
a statistical survey, that is not designed to produce valid and reliable results that can be
generalized to the universe of study; requiring the use of a statistical data classification that has
not been reviewed and approved by OMB; that includes a pledge of confidentiality that is not
supported by authority established in statute or regulation, that is not supported by disclosure
and data security policies that are consistent with the pledge, or which unnecessarily impedes
sharing of data with other agencies for compatible confidential use; or requiring respondents to
submit proprietary trade secrets, or other confidential information unless the agency can
demonstrate that it has instituted procedures to protect the information's confidentiality to the
extent permitted by law.
Not applicable. This data collection does not involve any of the special circumstances listed.


8. FEDERAL REGISTER ANNOUNCEMENT AND CONSULTATION OUTSIDE THE AGENCY
If applicable, provide a copy and identify the date and page number of publication in the Federal
Register of the agency's notice, required by 5 CFR 1320.8(d), soliciting comments on the
information collection prior to submission to OMB. Summarize public comments received in
response to that notice and describe actions taken by the agency in response to these comments.
Specifically address comments received on cost and hour burden.

• The 60-day Federal Register Notice, Volume 78, Number 237 (pages 74169-74170), was published on Tuesday, December 10, 2013. No comments were received from the public.

• The 30-day Federal Register Notice, Volume 79, Number 235 (pages 72705-72706), was published on Monday, December 8, 2014. No comments were received from the public.

Describe efforts to consult with persons outside the agency to obtain their views on the
availability of data, frequency of collection, the clarity of instructions and recordkeeping,
disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or
reported.
Outside Consultation: Valador, Inc., has made available subject matter expert Dr. Lisa E. Wills as contracted support to NASA OEID in a full-time capacity. In her role as Education Research
Manager and co-lead of the Performance Assessment Team, Dr. Wills is specifically contracted
to provide technical expertise in the following areas pertinent to this information collection:
Quantitative, Qualitative, & Mixed Research Methods; Cognitive, Psychometric & Survey
Instrument Development; Inferential & Descriptive Statistics; Big Data Analytics; Multi-level
modeling; and Discourse, Narrative, & Case Study Analyses. Program analysts also support this
testing endeavor as subject matter experts in the information technology (IT) environments that
house NASA Education data collection instruments. Lastly, NASA Education has contracted the
services of the NASA Enterprise Applications Competency Center (NEACC) to collaborate on
development of an information technology infrastructure to support data collection. The NEACC
operates and maintains a broad spectrum of NASA's Enterprise Applications, with an emphasis
on fully integrating business process expertise with application and technical know-how. A small
team of civil servants and approximately 300 support contractors sustain operations, implement
new applications and capabilities, and provide business readiness support to the stakeholders and
end users.

9. PAYMENT OR GIFTS TO RESPONDENTS
Explain any decision to provide any payment or gift to respondents, other than remuneration of
contractors or grantees.
Not applicable. NASA Office of Education does not offer payment or gifts to respondents.

10. ASSURANCE OF CONFIDENTIALITY
Describe any assurance of confidentiality provided to respondents and the basis for the
assurance in statute, regulation, or agency policy.
NASA Education is committed to protecting the confidentiality of all individual respondents who participate in data collection instrumentation testing. Any information collected under the purview of this clearance will be maintained in accordance with the Privacy Act of 1974, the E-Government Act of 2002, the Federal Records Act, and, as applicable, the Freedom of Information Act, in order to protect respondents' privacy and the confidentiality of the data collected (see Appendix E).
The data collected from respondents will be tabulated and analyzed only for the purpose of
evaluating the research in question. Laboratory respondents will be asked to read and sign a
Consent form and will be provided a personal copy to retain. The Consent form explains the voluntary nature of the studies and the use of the information, describes the parameters of the interview (taped or observed), and provides assurance of confidentiality as described in NASA Procedural Requirements (NPR) 7100.1.9
The consent form administered will be edited as appropriate to reflect the specific testing
situation for which the participant is being recruited (See Appendix C). The confidentiality
statement, edited per data collection source, will be posted on all data collection screens and
instruments, and will be provided to participants in methodological testing activities per NPR
7100.1 (See Appendix E.)

11. JUSTIFICATION FOR SENSITIVE QUESTIONS
Provide additional justification for any questions of a sensitive nature, such as sexual behavior
and attitudes, religious beliefs, and other matters that are commonly considered private. This
justification should include the reasons why the agency considers the questions necessary, the
specific uses to be made of the information, the explanation to be given to persons from whom
the information is requested, and any steps to be taken to obtain their consent.
Assuring that students participating in NASA education projects are representative of the
diversity of the Nation requires NASA Education to capture the race, ethnicity, and disability
statuses of its participants. Therefore, to assure the reliability and validity of its data collection
instruments, OEID will need to ascertain that study participants are representative of students
participating in NASA education projects. Race and ethnicity information is collected according
to Office of Management and Budget (1997) guidelines in “Revisions to the Standards for the
Classification of Federal Data on Race and Ethnicity.”10 Although disclosure of race and
ethnicity is not required in order to be considered for opportunities at NASA, respondents are strongly

9 The entire NPR 7100.1, Protection of Human Research Subjects (revalidated 6/26/14), may be found at: http://nodis3.gsfc.nasa.gov/displayDir.cfm?Internal_ID=N_PR_7100_0001_&page_name=main
10 http://www.whitehouse.gov/omb/fedreg_1997standards


encouraged to submit this information. The explanation given to respondents for acquiring this
information is as follows:
In order to determine the degree to which members of each ethnic and racial group are reached by
this internship/fellowship program, NASA requests that the student select the appropriate
responses below. While providing this information is optional, you must select decline to answer
if you do not want to provide it. Mentors will not be able to view this information when
considering students for opportunities. For more information, please visit
http://www.nasa.gov/about/highlights/HP_Privacy.html.

Information regarding disabilities is collected according to guidelines reflected in the "Self-Identification of Disability" form SF-256, published by the Office of Personnel Management (revised July 2010), and is preceded by the following statement:
An individual with a disability: A person who (1) has a physical impairment or mental impairment
(psychiatric disability) that substantially limits one or more of such person's major life activities;
(2) has a record of such impairment; or (3) is regarded as having such an impairment. This
definition is provided by the Rehabilitation Act of 1973, as amended (29 U.S.C. 701 et seq.).11

The regulations safeguarding this information are provided to study participants on the informed consent form, as governed by NPR 7100.1.

12. ESTIMATE OF RESPONDENT BURDEN
Provide estimates of the hour burden of the collection of information. The statement should:
Indicate the number of respondents, frequency of response, annual hour burden, and an
explanation of how the burden was estimated. Unless directed to do so, agencies should not
conduct special surveys to obtain information on which to base hour burden estimates.
Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour
burden on respondents is expected to vary widely because of differences in activity, size, or
complexity, show the range of estimated hour burden, and explain the reasons for the variance.

11 http://www.opm.gov/forms/pdf_fill/sf256.pdf


The estimate of respondent burden for methodological testing is as follows (See Table 1):
Table 1: Estimate of Respondent Burden for Methodological Testing

Data Collection Sources | Respondent Category | Statistically Adjusted Number of Respondents | Frequency of Response | Total Minutes per Response | Total Response Burden in Hours
Office of Education Performance Measurement System | Undergraduate and graduate student profiles | 629 | 2 | 20 | 420
Office of Education Performance Measurement System | Educator participant surveys | 639 | 2 | 15 | 319
Office of Education Performance Measurement System | External program manager - Data collection screens | 264 | 2 | 60 | 528
One Stop Shopping Initiative | Pre-College surveys | 517 | 2 | 10 | 172
One Stop Shopping Initiative | Undergraduate surveys | 618 | 2 | 20 | 412
One Stop Shopping Initiative | Graduate surveys | 444 | 2 | 20 | 296
One Stop Shopping Initiative | Post-Graduate surveys | 247 | 2 | 20 | 165
Total Burden for Methodological Testing | | 3,358 | | | 2,312
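For readers verifying Table 1, each burden-hours entry appears to follow the formula: respondents × responses per respondent × minutes per response ÷ 60, rounded to whole hours (for example, 629 × 2 × 20 ÷ 60 is approximately 420). The short check below (illustrative only; the figures are copied from Table 1) reproduces the 3,358-respondent and 2,312-hour totals, although individual rows may differ from the table by an hour because of rounding:

# Illustrative check of Table 1 (not part of the submission): burden hours appear
# to equal respondents x responses x minutes per response / 60, rounded.
rows = [
    ("Undergraduate and graduate student profiles", 629, 2, 20),
    ("Educator participant surveys",                639, 2, 15),
    ("External program manager data screens",       264, 2, 60),
    ("Pre-College surveys",                         517, 2, 10),
    ("Undergraduate surveys",                       618, 2, 20),
    ("Graduate surveys",                            444, 2, 20),
    ("Post-Graduate surveys",                       247, 2, 20),
]
total_respondents = sum(n for _, n, _, _ in rows)                # 3,358
total_hours = sum(round(n * f * m / 60) for _, n, f, m in rows)  # 2,312
print(total_respondents, total_hours)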

Generally, estimates should not include burden hours for customary and usual business
practices. If this request for approval covers more than one form, provide separate hour burden
estimates for each form and aggregate the hour burdens in Item 13 of OMB Form 83-I.
Not applicable.
Provide estimates of annualized cost to respondents for the hour burdens for collections of
information, identifying and using appropriate wage rate categories. The cost of contracting out
or paying outside parties for information collection activities should not be included here.
Instead, this cost should be included in Item 13.
The estimate of annualized cost to respondents for methodological testing is as follows (See
Table 2). Annualized Cost to Respondents is calculated by multiplying Total Response Burden in
Hours by Wage specific to Respondent Category (Bureau of Labor Statistics, 2014).

Table 2: Estimate of Annualized Cost to Statistically Adjusted Number of Respondents Required for Methodological Testing

Data Collection Sources | Respondent Category | Total Response Burden in Hours | Hourly Wage ($) | Annualized Cost to Respondents
Office of Education Performance Measurement System | Undergraduate and graduate student profile | 420 | 7.25 | $3,042.52
Office of Education Performance Measurement System | Educator participant surveys | 319 | 25.09 | $8,015.32
Office of Education Performance Measurement System | External program manager - Data collection screens | 528 | 25.09 | $13,243.60
One Stop Shopping Initiative | Pre-College surveys | 172 | 7.25 | $1,249.98
One Stop Shopping Initiative | Undergraduate surveys | 412 | 7.25 | $2,985.26
One Stop Shopping Initiative | Graduate surveys | 296 | 7.25 | $2,146.71
One Stop Shopping Initiative | Post-Graduate surveys | 165 | 7.25 | $1,192.98
Total Burden for Methodological Testing | | 2,312 | | $31,876.37
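As stated above, each row's annualized cost is the product of burden hours and the category wage; the published row figures apparently use unrounded burden hours, so they differ slightly from the rounded hours column multiplied by the wage. The illustrative check below simply confirms that the published row costs sum to the stated total:

# Illustrative check of Table 2 (not part of the submission): the published
# row costs sum to the stated total annualized cost to respondents.
costs = [3042.52, 8015.32, 13243.60, 1249.98, 2985.26, 2146.71, 1192.98]
print(round(sum(costs), 2))  # 31876.37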

13. COST BURDEN TO RESPONDENTS
Provide an estimate for the total annual cost burden to respondents or recordkeepers resulting
from the collection of information. (Do not include the cost of any hour burden shown in Items
12 and 14). The cost estimate should be split into two components: (a) a total capital and startup cost component (annualized over its expected useful life) and (b) a total operation and
maintenance and purchase of services component. The estimates should take into account costs
associated with generating, maintaining, and disclosing or providing the information. Include
descriptions of methods used to estimate major cost factors including system and technology
acquisition, expected useful life of capital equipment, the discount rate(s), and the time period
over which costs will be incurred. Capital and start-up costs include, among other items,
preparations for collecting information such as purchasing computers and software; monitoring,
sampling, drilling and testing equipment; and record storage facilities. If cost estimates are
expected to vary widely, agencies should present ranges of cost burdens and explain the reasons
for the variance. The cost of purchasing or contracting out information collections services
should be a part of this cost burden estimate. In developing cost burden estimates, agencies may
consult with a sample of respondents (fewer than 10), utilize the 60-day pre-OMB submission public comment process, and use existing economic or regulatory impact analysis associated with the rulemaking containing the information collection, as appropriate. Generally, estimates should not include purchases of equipment or
services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory
compliance with requirements not associated with the information collection, (3) for reasons
other than to provide information or keep records for the government, or (4) as part of
customary and usual business or private practices.
Not applicable. Participation in testing does not require respondents to purchase equipment or software, or to contract out services. The instruments used will be available in electronic format only. NASA Office of Education's expectation is that all targeted respondents can access the NASA OEPM System, forms, and instruments electronically for the purposes of testing, as they have in the past when applying to NASA opportunities.

14. COST BURDEN TO FEDERAL GOVERNMENT
Provide estimates of annualized costs to the Federal government. Also, provide a description of
the method used to estimate cost, which should include quantification of hours, operational
expenses (such as equipment, overhead, printing, and support staff), and any other expense that
would not have been incurred without this collection of information. Agencies may also
aggregate cost estimates from Items 12, 13, and 14 in a single table.
The total annualized cost estimate for this information collection is $0.7 million based on
existing contract expenses that include contract staffing, staff training for data collection, data
cleaning, validation, and management, and reporting relating to contract staffing for the two
online systems (OEPM, OSSI) that compose the OEID data collection suite.

15. REASON FOR CHANGE IN BURDEN
Explain the reasons for any program changes or adjustments reported in Items 13 or 14 of the
OMB Form 83-I.
Not applicable. This is a new application for methodological testing of data collection
instrumentation within the NASA Office of Education by the OEID.

16. SCHEDULE FOR INFORMATION COLLECTION AND PUBLICATION
For collections of information whose results will be published, outline plans for tabulation and
publication. Address any complex analytical techniques that will be used. Provide the time
schedule for the entire project, including beginning and ending dates of the collection of
information, completion of report, publication dates, and other actions.
NASA Education may make available results of methodological testing to other federal STEM
agencies in the form of peer-reviewed methods reports or white papers describing best practices and
lessons learned on an as-appropriate basis determined by NASA Education leadership. Although
there is no intent to publish in academic journals, standards for drafting will reflect peer-reviewed,
publication-level standards of quality.


Since these methodological testing efforts will be conducted within the context of STEM education activities and programs, methods reports or white papers would be developed following the publication guidelines for briefs employed by Educational Researcher, a premier publication of the American Educational Research Association, as follows:
Briefs are brief analyses focusing on a specific topic or question using new data or
existing databases (e.g., available from the National Center for Education Statistics).
Briefs should include a brief introduction of the issue or question, a brief discussion
of the data, up to two figures or tables, and a maximum six references, with text
totaling no more than 1,000 words. Titles should be no more than eight words in
length. Authors should also submit an abstract of 100 words or less, which will
appear online only. Methods (quantitative and/or qualitative) should be included in
supporting online material. Manuscripts are peer-reviewed in the usual manner.
(Educational Researcher, 2015)
A schedule of efforts cannot be determined at this time due to the impact of several variables in the
process including, but not limited to, determination of the NASA Education portfolio of activities,
input from lines of business directors, and approval of this methodological testing generic clearance
application. The process of developing a data collection instrument prior to any possible
dissemination effort is extensive and complex with many steps of indeterminate time to completion.
In this regard, no time schedule is included. Appendix F articulates NASA Education’s data
collection instrument development process as we revise our processes to a focus on STEM education
outcomes measurement.

17. DISPLAY OF OMB EXPIRATION DATE
If seeking approval to not display the expiration date for OMB approval of the information
collection, explain the reasons that display would be inappropriate.
The OMB Expiration Date will be displayed on every data collection instrument, once approval
is obtained.

18. EXCEPTION TO THE CERTIFICATE STATEMENT
Explain each exception to the certification statement identified in Item 19, "Certification for
Paperwork Reduction Act Submissions," of OMB Form 83-I.
NASA does not take exception to the certification statements below:
The proposed collection of information –
(a) is necessary for the proper performance of the functions of NASA, including that the information to be
collected will have practical utility;
(b) is not unnecessarily duplicative of information that is reasonably accessible to the agency;
(c) reduces to the extent practicable and appropriate the burden on persons who shall provide information
to or for the agency, including with respect to small entities, as defined in the Regulatory Flexibility Act (5
U.S.C. 601(6)), the use of such techniques as:


(1) establishing differing compliance or reporting requirements or timelines that take into account
the resources available to those who are to respond;
(2) the clarification, consolidation, or simplification of compliance and reporting requirements; or
(3) an exemption from coverage of the collection of information, or any part thereof;
(d) is written using plain, coherent, and unambiguous terminology and is understandable to those who are
targeted to respond;
(e) indicates for each recordkeeping requirement the length of time persons are required to maintain the
records specified;
(f) has been developed by an office that has planned and allocated resources for the efficient and effective
management and use of the information to be collected, including the processing of the information in a
manner which shall enhance, where appropriate, the utility of the information to agencies and the public;
(g) when applicable, uses effective and efficient statistical survey methodology appropriate to the purpose
for which the information is to be collected; and
(h) to the maximum extent practicable, uses appropriate information technology to reduce burden and
improve data quality, agency efficiency and responsiveness to the public; and
(i) will display the required PRA statement with the active OMB control number, as validated on
www.reginfo.gov

Name, title, and organization of NASA Information Collection Sponsor certifying statements
above:
NAME: Patricia Moore Shaffer, Ph.D.
TITLE: Acting Director & Evaluation Manager
ORG: Office of Education Infrastructure Division


References
Bureau of Labor Statistics. (2014). Retrieved from http://www.bls.gov/home.htm.
Colton, D., & Covert, R. W. (2007). Designing and constructing instruments for social research and evaluation. San Francisco: John Wiley and Sons, Inc.
Costello, A. B., & Osborne, J. W. (2005). Best practices in exploratory factor analysis: Four
recommendations for getting the most from your analysis. Practical Assessment,
Research & Evaluation, 10(7), 1-9.
Crede, E., & Borrego, M. (2013). From ethnography to items: A mixed methods approach to
developing a survey to examine graduate engineering student retention. Journal of Mixed
Methods Research, 7(1), 62-80.
Murphy, K. R., & Davidshofer, C. O. (2005). Psychological testing: Principles and applications (6th ed.). Upper Saddle River, NJ: Pearson/Prentice Hall.
DeMars, C. (2010). Item response theory. New York: Oxford University Press.
Duckworth, A. L., Peterson, C., Matthews, M. D., & Kelly, D. R. (2007). Grit: Perseverance and passion for long-term goals. Journal of Personality and Social Psychology, 92(6), 1087-1101.
Educational Researcher. (2015, April 13). Retrieved from Sage Publications: http://www.sagepub.com/journalsProdDesc.nav?ct_p=manuscriptSubmission&prodId=Journal201856
Fabrigar, L. R., & Wegener, D. T. (2011). Exploratory factor analysis. New York, NY: Oxford
University Press.
Haladyna, T. M. (2004). Developing and validating multiple-choice test items (3rd ed.).
Mahwah, NJ: Lawrence Erlbaum Associates, Inc.
Jääskeläinen, R. (2010). Think-aloud protocol. In Y. Gambier & L. Van Doorslaer (Eds.), Handbook of translation studies (pp. 371-373). Philadelphia, PA: John Benjamins.
Komrey, J. D., & Bacon, T. P. (1992). Item analysis of achievement tests based on small numbers of examinees. Paper presented at the annual meeting of the American Educational Research Association, San Francisco.
Kota, K. (n.d.). Testing your web application: A quick 10-step guide. Retrieved from
http://www.adminstrack.com/articles/testing_web_apps.pdf.

National Center for Education Statistics. (2009). High School Longitudinal Study of 2009, first follow-up. OMB No. 1850-0852.
National Science and Technology Council. (2013). Federal science, technology, engineering,
and mathematics (STEM) education 5 year strategic plan. Retrieved from
http://www.whitehouse.gov/sites/default/files/microsites/ostp/stem_stratplan_2013.pdf.
Reckase, M. D. (2000). The minimum sample size needed to calibrate items using the three-parameter logistic model. Paper presented at the annual meeting of the American Educational Research Association, New Orleans.
Wilson, M. (2005). Constructing measures: An item response modeling approach. New York:
Psychology Press.


APPENDIX A: NASA Education Mission and Goals
Education is a fundamental part of NASA's work to execute its vision to reach for new heights
and reveal the unknown so that what we do and learn will benefit all humankind.
Inspired by the NASA Education vision to “advance high quality STEM education using
NASA’s unique capabilities,” NASA’s Offices, Mission Directorates, and Centers are
collaborating to implement a single Agency-wide approach to STEM education. This approach
provides unique NASA experiences to learners, educators, and institutions, and streamlined
access to our content, Web sites, people, resources, and facilities.
NASA is launching into the future with four key Lines of Business, which will enable the agency
to ensure its education investments are unique and non-duplicative of other Federal Agencies
also involved in STEM education:
• STEM Engagement addresses national needs in STEM education, while also providing unique opportunities to underrepresented and underserved communities. Activities utilize NASA-unique resources and include STEM Public Education Events, STEM Experiential Learning Opportunities, and STEM Challenges.

• Educator Professional Development at NASA increases educators' (in-service, pre-service, informal) confidence and enthusiasm in delivering STEM materials within their education environments through interaction with NASA-unique content, facilities, and personnel. NASA is strategic and efficient in managing its efforts to design and deliver stimulating professional development opportunities that will prepare and inspire educators to utilize NASA-related STEM content.

• Institutional Engagement builds the capacity of formal and informal education institutions to participate in NASA's mission. The agency supports colleges and universities by helping them gain access to cutting-edge engineering and science facilities. NASA also enables informal institutions, such as museums, planetariums, and science centers, to open the door to the universe through exhibits and displays which showcase NASA's dynamic content.

• NASA Internships, Fellowships, and Scholarships (NIFS) motivate students to pursue careers in STEM and improve the retention of students in STEM disciplines. The agency provides opportunities along the full spectrum of the pipeline and helps increase the pool of STEM graduates. NASA is also committed to providing significant, direct student awards in higher education to underserved and underrepresented communities of learners, educators, and researchers.
Education at NASA is guided by Strategic Objective 2.4 in the agency’s strategic plan, which
states that NASA will “advance the Nation’s STEM education and workforce pipeline by
working collaboratively with other agencies to engage students, teachers, and faculty in NASA’s
missions and unique assets.” Associated performance goals are:
2.4.1: Assure that students participating in NASA higher education projects are representative of
the diversity of the Nation.
2.4.2: Continue to support STEM educators through the delivery of NASA education content and
engagement in educator professional development opportunities.
2.4.3: Assure that the institutions NASA engages with represent the diversity of institution types
and category levels in the Nation as defined by the US Department of Education (FY2015 only).
2.4.4: Continue to provide opportunities for learners to engage in STEM education through
NASA unique content provided to informal education institutions designed to inspire and
educate the public.
2.4.5: Continue to provide opportunities for learners to engage in STEM education engagement
activities that capitalize on NASA unique assets and content.


APPENDIX B: NASA Center Education Offices
Strategic management of the NASA education portfolio requires the participation of the Office
of Education (headquarters), the four Mission Directorates and all ten NASA Centers. This
extensive participation provides broad education engagement with NASA content, people and
facilities. Close and effective consultation, coordination and cognizance among all entities are
critical to the optimal fulfillment of NASA's objectives relative to its education investment.
The Office of Education provides integration and evaluation support to the Education
Coordinating Committee (ECC). As such, OEID maintains a centralized database of all NASA
education activities and investments, and supports coordination of evaluation and assessment of
the Agency education portfolio. As compliance among the Center Education Offices improves, all Centers will submit data collection instruments for development and clearance through OEID first, and then for approval by the NASA OMB liaison, prior to submission to OMB. This process
will reduce burden on the Education community while optimizing data collection.
Center Education Offices are responsible for implementing NASA education programs, projects
and activities for the Mission Directorates and the Office of Education, as well as planning and
implementing education projects that are unique to and funded by their Centers. Centers are
responsible for execution of programs and projects and for institutional assets. The Center
Education Offices provide expertise in state standards and requirements in their area of
geographic responsibility for K-12 education, and provide valuable field-based input into
education program planning.
Locations of NASA Center Education Offices
Ames Research Center
Ames specializes in research geared towards creating new knowledge and new technologies that
span the spectrum of NASA interests.

Armstrong Flight Research Center
As the lead for flight research, Armstrong continues to innovate in aeronautics and space
technology. The newest, fastest, the highest -- all have made their debut in the vast, clear desert
skies over Armstrong.

Glenn Research Center
Glenn Research Center develops and transfers critical technologies that address national priorities
through research, technology development, and systems development for safe and reliable
aeronautics, aerospace, and space applications.

Goddard Space Flight Center
The mission of the Goddard Space Flight Center is to expand knowledge of the Earth and its
environment, the solar system, and the universe through observations from space.

Jet Propulsion Laboratory
The Jet Propulsion Laboratory, managed by the California Institute of Technology, is NASA's lead
center for robotic exploration of the Solar System.

Johnson Space Center
From the early Gemini, Apollo, and Skylab projects to today's Space Shuttle and International
Space Station programs, Johnson Space Center continues to lead NASA's effort in Human Space
Exploration.

Kennedy Space Center
Kennedy Space Center is America's Gateway to the Universe -- leading the world in preparing and
launching missions around the Earth and beyond.

Langley Research Center
Langley continues to forge new frontiers in aviation and space research for aerospace,
atmospheric sciences, and technology commercialization to improve the way the world lives.

Marshall Space Flight Center
Bringing people to space; bringing space to people. Marshall Space Flight Center is a world leader
in access to space and the use of space for research and development to benefit humanity.

Stennis Space Center
Stennis is responsible for NASA's rocket propulsion testing and for partnering with industry to
develop and implement remote sensing technology.

APPENDIX C: Data Instrument Collection Testing Participation Generic
Consent Form12
In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this
study is sponsored by the National Aeronautics and Space Administration (NASA) Office of Education
Infrastructure Division (OEID), under authority of the Government Performance and Results Modernization
Act (GPRMA) of 2010 that requires quarterly performance assessment of Government programs for purposes
of assessing agency performance and improvement. Your participation is important to the success of this
study. The information we collect will help us improve the nature of NASA education project activities and
the accuracy with which NASA Office of Education can report to the stakeholders about the project activities
offered. The NASA OEID will use the information provided for statistical purposes related to data collection
instrument development only and will hold the information in confidence to the full extent permitted by law.
Information will be secured and removed from this server and location upon guidelines set out by the NASA
Records Retention Schedule 1392, 68-69. Although the following efforts will be taken to ensure
confidentiality, there remains a remote risk of personal data becoming identifiable. A non-identifying code
number will be assigned to participants’ data records, which will be stored in accordance with federal
regulatory procedures and accessible only to the investigator. Any use of individual data to illustrate specific
assessment results will be labeled in a manner to preserve the participants’ anonymity. In no way does
refusing participation in this instrument development study preclude you from eligibility for NASA education
project activities now or in the future.

Introduction
This research seeks to support the mission of the NASA Office of Education by asking you to take
part in a (focus group/cognitive interview/instrument development testing) pertaining to our
interest in the ways in which NASA project activities impact outcomes for participants.13 The
information we collect will help us to improve the nature of the project activity and the accuracy
with which NASA Office of Education can report to the community about the project activities it
offers.
Purpose of the Study
To determine the degree to which this data collection instrument accurately captures the
participant outcomes it is intended to measure.
Description of Study Procedures
Participants will be asked to complete XXX.
There are no foreseeable risks to participants electing to participate in this study.
Estimation of Time Required
We estimate it will take you an average of [enter #] minutes to participate in this research (ranging from
[enter #] minutes to [enter #] minutes).
Securing Your Responses
Under no circumstances will the results of your surveys be shared with anyone without your
explicit permission. The results of this research may be presented at meetings or in publications;
however, your identity will not be disclosed. Presentations and manuscripts typically contain
participants’ quotes, but participants are never identified by name. Your involvement in the
development of this instrument is entirely voluntary and you have the right to discontinue
participation at any time.

12 Once approved by OMB, this form will be submitted to NASA Forms Management according to NASA Policy Directive (NPD) 1420. Thus, this form, and all others used under this clearance, will have both an OMB control number and an NPD 1420 control number that also restricts access to NASA internal users only.
13 This clearance package is to obtain permission to develop instruments to be used in testing that will be approved by OMB first for inclusion under this clearance prior to testing.
Contact Persons
If you have any additional questions concerning the research, this informed consent, or
confidentiality of responses, please contact Dr. Lisa E. Wills, Education Research Manager, at
lisa.e.wills@nasa.gov or call (202)258-6021.
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

I have read and understand the contents of this study information and informed consent form and
have been encouraged to ask questions. I have received answers to the questions I have asked. I
give my consent to participate freely in this research. I have signed and retained a copy of the
information and consent form for my records and future reference. I have signed and submitted
this information and consent form for the researcher’s records.
___________________________________          ___________________________
Participant's signature                        Date

___________________________________
Participant's printed name

___________________________________
Researcher's signature

OMB Control Number: XXXX-XXXX
Expiration Date: [enter expiration date]
HQ-Form-XXXX MM/YYYY

PREVIOUS EDITIONS ARE OBSOLETE

APPENDIX D: Descriptions of Methodological Testing Techniques
Usability testing: Pertinent are the aspects of the web user interface (UI) that affect the
User’s experience and the accuracy and reliability of the information Users submit. The
ease with which Users navigate the data collection screens and the ease with which they
access the actions and functionality available during data input are equally important.
User experience is also affected by the look and feel of the web UI and the consistency of
aesthetics from page to page, including font type, size, color scheme, and the ways in
which screen real estate is used (Kota, n.d.). The foundation for usability testing will be a
think-aloud protocol analysis, as described by Jääskeläinen (2010), that exposes
distractions to accurate data input, while a short Likert-scale survey with qualitative
questions will determine the extent and nature of the distractions that impede accurate
data input.
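For illustration only, the following sketch (Python, with a hypothetical file name and column names that are not part of any NASA system) shows one way short Likert-scale usability survey responses might be tallied to gauge the extent of reported distraction:

    import pandas as pd

    # Hypothetical export of Likert-scale usability survey responses
    # (1 = strongly disagree ... 5 = strongly agree); the file name and
    # column names are illustrative assumptions only.
    responses = pd.read_csv("usability_survey.csv")
    likert_items = ["navigation_was_clear", "screens_were_consistent", "data_entry_was_easy"]

    # Frequency of each rating per item plus a simple central-tendency summary.
    for item in likert_items:
        counts = responses[item].value_counts().sort_index()
        print(item, counts.to_dict(), "median =", responses[item].median())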
Think-aloud protocols (commonly referred to as cognitive interviewing): This data
elicitation method is also called ‘concurrent verbalization’, meaning subjects are asked to
perform a task and to verbalize whatever comes to mind during task performance. The
written transcripts of the verbalizations are referred to as think-aloud protocols (TAPs)
(Jääskeläinen, 2010, p 371) and constitute the data on the cognitive processes involved in
a task (Ericsson & Simon, 1984/1993). When elicited with proper care and instruction,
think-aloud does not alter the course or structure of thought processes, except with a
slight slowing down of the process. Although high cognitive load can hinder
verbalization by occupying all available cognitive resources, that property is of no
concern regarding the tasks under analysis that are restricted to information actively
processed in working memory (Jääskeläinen, 2010, p. 371). For the purposes of NASA
Education, think-aloud protocols will be especially useful for improving existing data
collection screens and developing new ones, which differ in purpose from online
applications. Whereas an online application is an electronic collection of
fields that one either scrolls through or submits, completed page by completed page, data
collection screens represent hierarchical layers of interconnected information for which
user training is required. Since user training is required for proper navigation, think-aloud
protocols capture the user experience to incorporate it into a more user-friendly design
and implementation of this kind of technology. Lastly, data from think-aloud protocols are
used to ensure that user experiences are reliable and consistent, supporting the collection of
robust data.
Focus group interviews: With groups of nine or fewer per instrument, this qualitative
approach to data collection uses brainstorming to creatively solve remaining problems
identified after early usability testing of data collection screen and program application
form instruments (Colton & Covert, 2007, p. 37). Data from this type of research will
include audiotapes obtained with participant consent, meeting minutes taken
by a subject matter expert in administrative assistance, and reflective comments
submitted by participants after conclusion of the focus group. Focus group interviews
may be used to refine items that failed initial reliability testing for the purposes of
retesting. Lastly, focus group interviews may be used with participants as a basis for a
grounded theory approach to instrument development or for refining an already existing
instrument to be appropriate to a specific audience.
Comprehensibility testing: Comprehensibility testing of program activity survey
instrumentation will determine whether items and instructions make sense, are
unambiguous, and are understandable by those who will complete them. For example,
comprehensibility testing will determine whether items are complex, wordy, or incorporate
discipline- or culturally-inappropriate language (Colton & Covert, 2007, p. 129).
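As an illustrative sketch only, automated readability screening can supplement the human review described above; the example below uses the third-party textstat package on hypothetical draft items (the package choice and item wording are assumptions, not named in this document):

    import textstat  # third-party readability package, used here for illustration

    # Hypothetical draft survey items; grade-level estimates can flag wording
    # that may be too complex or wordy before human review.
    items = [
        "My NASA internship increased my interest in pursuing a STEM degree.",
        "The project activity utilized discipline-specific terminology commensurate "
        "with collegiate-level coursework expectations.",
    ]

    for text in items:
        grade = textstat.flesch_kincaid_grade(text)
        flag = "review wording" if grade > 10 else "ok"
        print(f"grade level {grade:5.1f}  [{flag}]  {text[:60]}")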
Pilot testing: After program activity survey instruments have performed satisfactorily in
readability and comprehensibility testing, the next phase is pilot testing with a sample of
the target population that will yield statistically significant data, a random sample of at
least 200 respondents (Komrey and Bacon, 1992; Reckase, 2000). The goal of pilot
testing is to yield preliminary validity and reliability data to determine if items and the
instrument are functioning properly (Haladyna, 2004; Wilson, 2005). Data gleaned from
pilot testing will be used to fine-tune items and the instrument in preparation for more
complex statistical analysis upon large-scale statistical testing.
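A minimal, illustrative sketch (Python, assuming a hypothetical data file and 0/1 item scoring) of the kind of preliminary statistics pilot testing yields: classical item difficulty values and a Cronbach's alpha reliability estimate. This is not the analysis code used by NASA Education.

    import pandas as pd

    # Hypothetical pilot-test data: rows = respondents (at least 200, per the text),
    # columns = items scored 0/1; file name and scoring are illustrative only.
    pilot = pd.read_csv("pilot_responses.csv")

    # Classical item difficulty ("p-value"): the proportion answering each item
    # correctly/affirmatively. Items with extreme values are candidates for revision.
    item_p = pilot.mean()
    print(item_p.sort_values())

    # Cronbach's alpha as a preliminary internal-consistency (reliability) estimate.
    k = pilot.shape[1]
    item_variances = pilot.var(axis=0, ddof=1).sum()
    total_variance = pilot.sum(axis=1).var(ddof=1)
    alpha = (k / (k - 1)) * (1 - item_variances / total_variance)
    print("Cronbach's alpha:", round(alpha, 3))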
Large-scale statistical testing: Instrument testing conducted with a statistically
representative sample of responses from a population of interest. In the case of
developing scales, large-scale statistical testing provides sufficient data points for
exploratory factor analysis (EFA), a multivariate statistical method used to uncover the
underlying structure of a relatively large set of variables and is commonly used when
developing a scale, a collection of questions used to measure a particular research topic
(Fabrigar & Wegener, 2011). EFA is a “large-sample” procedure where generalizable
and/or replicable results are a desired outcome (Costello & Osborne, 2005, p. 5). This
technique is particularly relevant to examining relationships between participant traits
and the desired outcomes of NASA Education project activities.
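A minimal sketch of the EFA step is shown below for illustration only, assuming hypothetical large-scale response data and using scikit-learn's FactorAnalysis for brevity; dedicated EFA tooling with factor rotation would ordinarily be used in practice.

    import pandas as pd
    from sklearn.decomposition import FactorAnalysis

    # Hypothetical large-scale responses to a draft scale (rows = respondents,
    # columns = numeric Likert items); file and column names are assumptions.
    data = pd.read_csv("large_scale_responses.csv")

    # Fit an exploratory factor model and inspect item loadings to see which
    # items cluster on the same underlying construct.
    efa = FactorAnalysis(n_components=3, random_state=0)
    efa.fit(data)

    loadings = pd.DataFrame(
        efa.components_.T,
        index=data.columns,
        columns=[f"factor_{i + 1}" for i in range(3)],
    )
    print(loadings.round(2))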
Item response approach to constructing measures: Foundations for testing that address the
importance of item development for validity purposes, address item content to align with
cognitive processes of instrument respondents, and that acknowledge guidelines for
proper instrument development will be utilized in a systematic and rigorous process.
Validity will be determined as arising from item development, from statistical study of
item responses, and from exploring item response patterns via methods prescribed by
Haladyna (2004) and Wilson (2005).
Split-half method: This method for determining test reliability is an efficient alternative to
parallel-forms or test/retest methods. The split-half method does not require developing
alternate forms of a survey, and it places less burden on respondents than other methods,
requiring participation in a single test session rather than retesting at a later date. This
method involves administering a test to a group of
individuals, dividing the test in half along odd and even item numbers, and then
correlating scores on one half of the test with scores on the other half of the test
(Davidshofer & Murphy, 2005).

APPENDIX E: Privacy Policies and Procedures
Information collected under the purview of this clearance will be maintained in
accordance with the Privacy Act of 1974, the E-Government Act of 2002, the Federal
Records Act, NPR 7100.1, and as applicable, the Freedom of Information Act in order to
protect respondents’ privacy and the confidentiality of the data collected.14
Data is maintained on secure NASA servers and protected in accordance with NASA
regulations at 14 CFR 1212.605.
Approved security plans are in place for the Office of Education Performance
Measurement (OEPM) system in accordance with the Federal Information Security
Management Act of 2002 and Office of Management and Budget, Circular A-130,
Management of Federal Information Resources.
Only authorized personnel requiring information in the official discharge of their duties
are authorized access to records from workstations within the NASA Intranet or via a
secure Virtual Private Network (VPN) connection that requires two-factor hardware
token authentication.
OEPM resides in a certified NASA data center and has met strict requirements relating to
application security, network security, and backup/recovery of the NASA Office of the
Chief Information Officer’s security plan.
Data will be secured and removed from this server and location upon guidelines set out
by the NRRS/1392, 68-69. Specific guidelines relevant to the OEPM system include the
following:
o Project management records documenting basic information about projects and/or
opportunities, including basic project descriptions, funding amounts and sources,
project managers, and NASA Centers, will be destroyed when 10 years old or
when no longer needed, whichever is longer.
o Records of participants (in any format), maintained either as individual files
identified by individual name or number, or in aggregated files of multiple
participants identified by name or number, including but not limited to application
forms, personal information supplied by the individuals, will be destroyed 5 years
after the last activity with the file.
o Survey responses and other feedback (in any format) from project participants and
the general public concerning NASA educational programs, including interest
area preferences, participant feedback, and reports of experiences in projects, will
be destroyed when 10 years old or when no longer needed, whichever is longer.

14 http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html

The following confidentiality statement, edited per data collection source, will be posted on all
data collection screens and instruments, and will be provided to participants in methodological
testing activities per NPR 7100.1:
In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this
study is sponsored by the National Aeronautics and Space Administration (NASA) Office of Education
Infrastructure Division (OEID), under authority of the Government Performance and Results
Modernization Act (GPRMA) of 2010 that requires quarterly performance assessment of Government
programs for purposes of assessing agency performance and improvement. Your participation is important
to the success of this study. The information we collect will help us improve the nature of NASA education
project activities and the accuracy with which NASA Office of Education can report to the stakeholders
about the project activities offered. The NASA OEID will use the information provided for statistical
purposes related to data collection instrument development only and will hold the information in
confidence to the full extent permitted by law. Information will be secured and removed from this server
and location upon guidelines set out by the NASA Records Retention Schedule 1392, 68-69. Although the
following efforts will be taken to ensure confidentiality, there remains a remote risk of personal data
becoming identifiable. A non-identifying code number will be assigned to participants’ data records, which
will be stored in accordance with federal regulatory procedures and accessible only to the investigator. Any
use of individual data to illustrate specific assessment results will be labeled in a manner to preserve the
participants’ anonymity. Any photographs or video of participants involved in the study will not be
released without prior written consent. In no way does refusing participation in this instrument
development study preclude you from eligibility for NASA education project activities now or in the
future.

APPENDIX F: Overview: NASA Education Data Collection Instrument
Development Process

FROM OUTPUTS TO SCIENCE, TECHNOLOGY, ENGINEERING, AND MATHEMATICS
(STEM) EDUCATION OUTCOMES MEASUREMENT: DATA COLLECTION
INSTRUMENT DEVELOPMENT PROCESS
WORKING WITH THE LINES OF BUSINESS AND PROGRAM DIRECTORS
I. Develop a logic model
a. Information & training sessions to provide guidance
b. Facilitation of logic modeling upon request
c. Review and recommendations to ensure incorporation of evidence-based practice
II. Identify outputs and short-term outcomes from logic models for performance
indicators
a. Identify outputs and outcomes across lines of business and projects aligned with
CAP goals and FC-STEM investment priority areas
b. Convert outputs and outcomes into performance indicators and outcome measures,
identifying required data elements and data collection methods
UNDERSTANDING THE IMPACT OF STEM EDUCATION PROJECT ACTIVITIES ON PARTICIPANTS
III. Develop survey instruments based on line of business performance indicators and
outcome measures
a. Conduct a scholarly STEM education and measurement literature review (assures
that the evidence base is rigorous and current)
b. Connect outcomes from literature review with identified outcome measures, given
constraints of inputs and within the context of activities
c. Search the STEM education research and measurement literature for instrument
candidates for adaptation (previous literature review augments this step)15
d. Create a draft instrument targeting a specific project activity to explore specific
outcomes impacted by the quality of outputs (e.g., non-cognitive competencies
associated with STEM degree attainment in the NASA Internships, Fellowships,
and Scholarships line of business)16

i. Draft should be lengthy and exhaustive to allow editing down in the testing
process
ii. Draft should include many questions that ask about the same thing in different
ways, to allow editing down
iii. Draft should demonstrate multiple items per construct, as convergence is
important

15 Provides opportunity to add to the research literature while using an instrument already determined to be reliable and valid for a particular respondent population.
16 For example, reporting on STEM undergraduate attainment is much less meaningful without understanding what kinds of experiences contributed to degree attainment and the quality of their NASA experience.
e. Obtain stakeholder feedback & edit instrument draft
i. Editing question type
ii. Adding new constructs and items
f. Conduct cognitive interviews with a small number (less than 10) of appropriate
respondents & edit accordingly17
i. Editing question language
ii. Editing question type
DEVELOPING VALID AND RELIABLE DATA COLLECTION INSTRUMENTS
IV. Conduct field test of an instrument draft
a. Provide draft to OMB to approve for testing under the NASA OE methodological
testing generic clearance (no official timeline associated with this informal process)
b. Small scale field testing18
i. Statistical analysis of responses
ii. Remove items with low p-values
c. Large scale field testing
i. Determine population/universe size for respondent audience
ii. Implement steps to enhance response rate
iii. Remove items with low p-values
OBTAINING AND MAINTAINING OMB-APPROVED DATA COLLECTION INSTRUMENTS
V. Obtain clearance from OMB for tested data collection instruments
a. Update OMB-approved drafts according to results obtained from large scale field
testing
b. Submit tested data collection instrument for review by OMB, in accordance with
the terms of clearance set upon approval of the plan as stipulated in the generic
clearance.19

17 Involves qualitative research skills and analysis using software NASA Ed has provided for this purpose.
18 Involves statistical analysis skills and analysis using software NASA Ed has procured.
19 PRA_Gen_ICRs_5-28-2010.pdf. Accessed at https://www.whitehouse.gov/sites/default/files/omb/assets/inforeg/PRA_Gen_ICRs_5-28-2010.pdf

VI. Reevaluate instrument function
a. Maintain first universe of collected responses as baseline data
b. On an annual basis, pool recently collected instrument responses with current data
set and rerun statistical analyses
c. Take barely passing items back through the process, starting at step III.f.
d. Integrate refreshed items into instrument and forward draft to OMB for approval
under the NASA OE methodological testing generic clearance
VII. Reevaluate alignment of data collection instruments
a. Maintain alignment with portfolio as updated
b. Maintain alignment with line of business logic model as updated

APPENDIX G: Explanatory Content for Information Collections for
Testing Purposes
Every information collection for the purposes of methodological testing will be prefaced by a
version of the information categories, edited to be appropriate for that particular instrument
and audience. Below is a sample that demonstrates the type of information and content that
reflects the following: 1) Source of adaptation (if applicable); 2) Constructs of interest; 3)
Bibliographic sources that support the particular adaptation or instrument draft; 4) Privacy
statement; 5) Instrument introduction; 6) Purpose of the study; 7) Description of study
procedures; 8) Estimate of time to complete the instrument; 9) Assurance of confidentiality; 10)
Contact person’s information; 11) Office of Management and Budget control information; and
12) NASA headquarters form information.

List of Tables
Table 1: Estimate of Respondent Burden for Methodological Testing ........................................ 12
Table 2: Estimate of Annualized Cost to Statistically Adjusted Number of Respondents Required
for Methodological Testing........................................................................................................... 13
