Office of Education Performance Measurement (OEPM) Program-Level Data Collection Screens

Generic Clearance for the NASA Office of Education Performance Measurement and Evaluation (Testing)

OMB: 2700-0159
REQUEST FOR APPROVAL under the Generic Clearance for NASA Education Performance Measurement
and Evaluation (Testing), OMB Control Number 2700-0159, expiration 04/20/2018
_____________________________________________________________________________________
I. TITLE OF INFORMATION COLLECTION:
Office of Education Performance Assessment, Evaluation, and Information Management Data Collection Screens: Office of Education Performance Measurement (OEPM) Program-Level Data

II. TYPE OF COLLECTION:
☐ Attitude & Behavior Scale
☐ Baseline Survey
☐ Cognitive Interview Protocol
☒ Data Collection Screens
☐ Focus Group Protocol
☐ Follow-up Survey
☐ Satisfaction Survey
☐ Usability Protocol

III. GENERAL OVERVIEW: In compliance with the GPRA Modernization Act of 2010 (GPRAMA), NASA collects data on its educational activities to ensure that progress is being made toward Strategic Objective 2.4 and its associated performance goals, and to collect evidence of the impact of NASA educational programs (NASA Education, 2016, pp. 8-9). The information from this data collection, project activity data, will be used in accordance with the criteria established by NASA for monitoring research and education projects. This information collection is also necessary to provide NASA Education projects with the participant information needed to determine eligibility, select participants for activities, identify accommodations participants may require, and support effective activity implementation. Preparations are underway by the NASA Office of Education Performance Assessment, Evaluation, and Information Management (PAEIM) team to consolidate tested, redesigned, and improved project activity data collection screens under one unified, modernized IT infrastructure provided by the Office of Education Information Technology (OE IT) team. Rigorous testing of data collection screens under this clearance is the first step in PAEIM's plan for a redesigned and consolidated project activity data collection. Currently, project activity data is collected via the NASA Internship, Fellowship, and Scholarship (NIFS) One Stop Shopping Initiative (OSSI) and Office of Education Performance Measurement (OEPM) applications. This request pertains to OEPM data collection screens only, which constitute program-level data.

IV. INTRODUCTION AND PURPOSE: Project activity data is collected through data collection screens, a set of data fields strategically aligned to the NASA Education architecture of four lines of business and accompanying programs (NASA Education, 2016, pp. 38-47) to facilitate performance measurement, analysis, and accurate reporting of NASA's contributions to STEM education. The project activity data collected characterizes our recruitment pool, educational opportunities, participant pool, participant experiences, partners, outputs, and outcomes, and enables effective program administration, communication, and program and project monitoring and review. The data also allows NASA Education to assess portfolio performance by tracking activity outputs, which helps identify best practices and constitutes information vital to strategic planning and continuous process improvement. In so doing, internal users of this data collection use it to make data-driven decisions and to monitor and assess performance of the NASA education portfolio of projects administered by NASA field centers and facilities.
While the OE IT team is responsible for the IT infrastructure, the PAEIM team is responsible for the constitution, reliability, and validity of NASA Education data collection instruments, to include attitude and behavior scales, surveys, and project activity data collection screens. Testing is an integrated, two-pronged approach supported by subject matter expertise from the PAEIM and OE IT teams. OE IT will execute quality assurance and operability testing, and PAEIM will implement usability testing and determine reliability and validity characteristics. The teams' combined efforts will accomplish two objectives:
• Gain an understanding of why particular fields within the data collection screens are yielding inconsistent, unreliable data (that is, the degree to which users are misinterpreting specific questions), and determine what to change to improve the user experience and increase the quality of data collected in those fields of concern; and
• Obtain baseline quality assurance and operability (QA & O) measures in support of current use and of the efforts to modernize the information technology infrastructure supporting the data collection screens.

The purpose of testing this compendium of data collection screens is to garner information regarding the challenges and successes of the user experience in order to shape and improve the future design of the data collection screens. Specific fields within the data collection screens in question will undergo methodologically rigorous testing to identify issues with, for instance, the wording of questions and instructions and with navigation within and between data collection screens. Lastly, testing under this clearance will allow NASA Education to establish a procedure for testing annual updates to the data collection screens that arise in response to changes in project activities, the NASA Education portfolio, Congressional mandates, and budget requirements.
V. RESEARCH DESIGN OVERVIEW: OE IT QA & O measures consist of testing techniques to establish documented evidence that the IT system accomplishes its intended requirements, and to validate that the product being developed does what the user expects it to do. This is facilitated by validating that requirements are adequately defined, designs and functionality conform to requirements, data is treated correctly, and test results are accurate. Phases of dynamic testing techniques may include:
• Unit testing – Validates that individual units of product are working as designed
• Integration testing – Units of product are combined and tested as a group
• Function testing – Validates product functionality against defined requirements
• System testing – Tests hardware and software together on a completely integrated system
• User acceptance testing – End-user testing of product functionality
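To make the first of these concrete: a unit test exercises one field-level rule in isolation. The sketch below is illustrative only; the validate_zip helper is a hypothetical stand-in for a field rule, not actual OEPM code.

```python
import re
import unittest

def validate_zip(value: str) -> bool:
    """Hypothetical field-level rule: a U.S. ZIP code is five digits,
    optionally followed by a hyphen and four digits (ZIP+4)."""
    return bool(re.fullmatch(r"\d{5}(-\d{4})?", value))

class TestZipField(unittest.TestCase):
    def test_accepts_valid_codes(self):
        self.assertTrue(validate_zip("20546"))       # NASA HQ ZIP code
        self.assertTrue(validate_zip("20546-0001"))  # ZIP+4 form

    def test_rejects_invalid_codes(self):
        self.assertFalse(validate_zip("2054"))       # too short
        self.assertFalse(validate_zip("20546-"))     # dangling hyphen

if __name__ == "__main__":
    unittest.main()
```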
PAEIM's efforts are both qualitative and quantitative in nature, seeking to describe the impact on the user experience, to measure the impact of usability on burden, and to determine the reliability and validity of the data collection screens. PAEIM may first choose to establish baseline information by applying the following multi-method testing techniques, described briefly below:
• Focus group interviews: With groups of nine or fewer per instrument, this qualitative approach to data collection uses brainstorming to creatively solve remaining problems identified after early usability testing of data collection screen and program application form instruments (Colton & Covert, 2007, p. 37).
• Think-aloud protocols (commonly referred to as cognitive interviewing): This data elicitation method is also called "concurrent verbalization," meaning subjects are asked to perform a task and to verbalize whatever comes to mind during task performance (Jaaskelainen, 2010). Think-aloud protocols will be especially useful for improving the existing data collection screens, which differ in purpose from online applications. Whereas an online application is an electronic collection of fields that one either scrolls through or submits, completed page by completed page, data collection screens represent hierarchical layers of interconnected information for which user training is required. Since user training is required for proper navigation, think-aloud protocols capture the user experience so it can be incorporated into a more user-friendly design and implementation of this kind of technology.

Think-aloud protocols will take the form of a semi-structured qualitative data collection method in which there are consistencies across the ways in which each testing session is introduced and initiated and the ways in which the moderator is trained to intervene when a participant falls silent. The differences will be in the object of the test scenario, the moderator's use of prompts to maintain a steady flow of engagement from the participant, and the length of time (burden) it takes each participant to proceed through a test scenario.

A test scenario contains the task that a user needs to accomplish during a test session; when a participant completes the task, the session is over (UX Passion, 2016). A usability testing scenario will not include any information about how to accomplish the task. Hence, the usability test will show how the participant accomplishes the task and whether the interface, the OEPM data collection screens, facilitates completing the scenario. After the test scenario, a comparison of how it was anticipated that the user would complete the task with how they actually completed it will provide insight into the effectiveness of the OEPM architecture and navigation (U.S. Department of Health & Human Services, 2016).

• Comprehensibility testing: Comprehensibility testing of program activity survey instrumentation will determine whether items and instructions make sense, are unambiguous, and are understandable by those who will complete them. For example, comprehensibility testing will determine if items are complex, wordy, or incorporate discipline- or culturally inappropriate language (Colton & Covert, 2007, p. 129).

Given the user-centric nature of NASA Education data collections, users' requirements should drive initial usability testing for determining the completeness and accuracy with which users achieve their goals, the speed (with accuracy) with which the information solicited can be entered, how satisfying the OEPM user interface is to use, how well the data collection screens prevent errors and help users recover from errors, and how easy it is to learn to use the OEPM data collection screens (Quesenbery, 2011).
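As an illustrative sketch of how such measures might be tabulated once sessions are logged (the session records and field layout below are hypothetical, not an OEPM schema):

```python
from statistics import mean

# Hypothetical session records: (participant_id, completed_task, seconds, errors)
sessions = [
    ("P1", True, 420, 1),
    ("P2", True, 515, 0),
    ("P3", False, 600, 3),
]

completion_rate = sum(s[1] for s in sessions) / len(sessions)  # share of tasks completed
mean_time = mean(s[2] for s in sessions if s[1])               # time on task, successes only
mean_errors = mean(s[3] for s in sessions)                     # error count per session

print(f"Task completion: {completion_rate:.0%}")
print(f"Mean time on task (successful sessions): {mean_time:.0f} s")
print(f"Mean errors per session: {mean_errors:.1f}")
```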
This formative testing will utilize a small study environment in an iterative process, which includes:
• Identifying a specific user profile (or profiles) for the study;
• Creating scenarios that are task based and goal directed;
• Encouraging users to think out loud as they work;
• Making changes based on what the testing reveals; and
• Testing again to confirm that the changes worked for users (Barnum, 2010).

The time and effort reflected below in the burden estimate chart (Table 2) indicate time spent across all OEPM user experience levels and roles, solely to provide a perspective on the extent of this testing effort and the depth of information PAEIM is seeking in this testing endeavor to enhance the redesigned data collection screens in partnership with OE IT.
VI. TIMELINE: The research literature demonstrates that three to five participants per scenario yields approximately 85% of useful observations (Nielsen & Landauer, 1993, p. 212); in this instance, the number of scenarios depends upon the number of different roles and the number of different tasks each role must perform. Therefore, a timeline to completion will be devised in collaboration with the OE IT Manager and the information technology provider or vendor. In addition, usability testing of the OEPM data collection screens will take place over the same period during which the IT infrastructure modernization is underway.
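The 85% figure follows from Nielsen and Landauer's problem-discovery model, in which the expected share of problems found by n participants is 1 - (1 - λ)^n, with λ the average probability that a single participant uncovers a given problem (approximately 0.31 in their data). A quick check of the three-to-five participant claim:

```python
LAMBDA = 0.31  # average per-participant problem-discovery rate (Nielsen & Landauer, 1993)

def proportion_found(n: int, lam: float = LAMBDA) -> float:
    """Expected share of usability problems found by n test participants."""
    return 1 - (1 - lam) ** n

for n in (3, 4, 5):
    print(n, f"{proportion_found(n):.0%}")  # 3 -> 67%, 4 -> 77%, 5 -> 84% (~85% as cited)
```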

VII. SAMPLING STRATEGY: The maximum cost-benefit ratio, derived by weighing the costs of testing against the benefits gained, can be achieved with three to five participants per scenario. Participants will be randomly solicited from three user experience categories: Novice (less than 1 year of OEPM use), Moderate-experience User (between 1 and 3 years of OEPM use), and Super User (more than 3 years of OEPM use). Categories of users are further defined by roles, as indicated in Table 1:
Table 1. Categories of participants in usability testing

Data Collection Source (Roles)       Novice   Moderate-experience   Super User
Administrator (OEPM)                    2              2                 1
Center Education Director (OEPM)        2              2                 1
Evaluation Manager (OEPM)               1              1                 0
Program Manager (OEPM)                  2              2                 1
Project Manager (OEPM)                  2              2                 1
Total Participants                      9              9                 4
                                                     (22 participants in total)
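As a sketch of how the random solicitation might be implemented against these strata (the candidate pool below is a hypothetical placeholder; the quotas come from Table 1):

```python
import random

# Quotas from Table 1: role -> (novice, moderate-experience, super user)
quotas = {
    "Administrator": (2, 2, 1),
    "Center Education Director": (2, 2, 1),
    "Evaluation Manager": (1, 1, 0),
    "Program Manager": (2, 2, 1),
    "Project Manager": (2, 2, 1),
}

def draw(pool: list[str], k: int) -> list[str]:
    """Randomly select k participants from one stratum's candidate pool."""
    return random.sample(pool, k)

# Hypothetical candidate pool for a single stratum (novice Administrators)
novice_admins = ["user_a", "user_b", "user_c", "user_d"]
print(draw(novice_admins, quotas["Administrator"][0]))  # two randomly chosen candidates
```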

VIII. BURDEN HOURS: The scenarios will be determined by data collection source, so each scenario will have no more than five participants. Note that students are the only participants considered members of the public; therefore, burden applies only to their participation time in usability testing.
Table 2. Burden hours for usability testing
(n = testing participants per role at that experience level; Hrs. = testing hours per participant)

Data Collection Source        Novice       Moderate-exp.   Super User   Subtotal Hrs.   Scenarios     Total Response
                              (n / Hrs.)   (n / Hrs.)      (n / Hrs.)   per Role        for Testing   Burden (Hrs.)
Administrator (OEPM)          2 / 1.0      2 / 0.75        1 / 0.5        4.0                5            20
Center Education
Director (OEPM)               2 / 1.0      2 / 0.75        1 / 0.5        4.0               14            56
Evaluation Manager (OEPM)     1 / 1.0      1 / 0.75        0 / 0.0        1.75               2             3.5
Program Manager (OEPM)        2 / 1.0      2 / 0.75        1 / 0.5        4.0               14            56
Project Manager (OEPM)        2 / 1.0      2 / 0.75        1 / 0.5        4.0               14            56
Total                                                                                                   191.5

*Burden estimate for testing in OEPM is based entirely on roles that are filled by civil servant employees, for whom burden does not apply. Due to the extensive nature of this testing, however, and given that this testing will be focused within scenarios on data collection fields that traditionally yield inconsistent data, we propose to track this testing for burden accountability and dissemination purposes. We posit 196 hours as a maximum burden estimate.
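The table's arithmetic can be reproduced directly: each role's subtotal is the sum of participants times hours across user levels, and its total burden is that subtotal multiplied by the number of scenarios. A short check using the figures from Table 2:

```python
# Per role: (novice n, hrs), (moderate n, hrs), (super user n, hrs) -- from Table 2
rows = {
    "Administrator":             [(2, 1.0), (2, 0.75), (1, 0.5)],
    "Center Education Director": [(2, 1.0), (2, 0.75), (1, 0.5)],
    "Evaluation Manager":        [(1, 1.0), (1, 0.75), (0, 0.0)],
    "Program Manager":           [(2, 1.0), (2, 0.75), (1, 0.5)],
    "Project Manager":           [(2, 1.0), (2, 0.75), (1, 0.5)],
}
scenarios = {"Administrator": 5, "Center Education Director": 14,
             "Evaluation Manager": 2, "Program Manager": 14, "Project Manager": 14}

total = 0.0
for role, levels in rows.items():
    subtotal = sum(n * hrs for n, hrs in levels)  # e.g., Administrator: 4.0 hrs
    burden = subtotal * scenarios[role]           # e.g., 4.0 hrs x 5 scenarios = 20
    total += burden
    print(f"{role}: {burden} hrs")
print(f"Total response burden: {total} hrs")      # 191.5
```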
IX. DATA CONFIDENTIALITY MEASURES: Any information collected under the purview of this clearance will be maintained in accordance with the Privacy Act of 1974, the E-Government Act of 2002, the Federal Records Act, and, as applicable, the Freedom of Information Act, in order to protect respondents' privacy and the confidentiality of the data collected.

X. PERSONALLY IDENTIFIABLE INFORMATION:
1. Is personally identifiable information (PII) collected? ☒ Yes ☐ No
2. If yes, will any information that is collected be included in records that are subject to the Privacy Act of 1974? ☒ Yes ☐ No
3. If yes, has an up-to-date System of Records Notice (SORN) been published? ☒ Yes ☐ No
Published in October 2007, the applicable System of Records Notice is NASA 10EDUA, NASA Education Program Evaluation System: http://www.nasa.gov/privacy/nasa_sorn_10EDUA.html.

APPLICABLE RECORDS: Submitted data will be retained in accordance with NASA Records Retention Schedule 1, Item 68D. Records will be destroyed or deleted when ten years old or when no longer needed, whichever is later.
XI. PARTICIPANT SELECTION APPROACH:
Does NASA Education have a respondent sampling plan? ☒ Yes ☐ No
If yes, please define the universe of potential respondents and describe the sampling plan. The universe of potential usability testing participants includes OEPM Administrators, Center Education Directors, the Evaluation Manager, Program Managers, and Project Managers.
If no, how will NASA Education identify the potential group of respondents and how will they be selected? Not applicable.

XII. INSTRUMENT ADMINISTRATION STRATEGY:
Describe the type of consent: ☒ Active ☐ Passive
How will the information be collected?
☐ Web-based or other forms of social media
☐ Telephone
☒ In-person
☐ Mail
☐ Other
Will interviewers or facilitators be used? ☒ Yes ☐ No

XIII. DOCUMENTS/INSTRUMENTS ACCOMPANYING THIS REQUEST:
☒ Consent form (Appendix C: Confidentiality, Consent, & Recording Release – Adult)
☐ Instructions
☒ Instrument (Appendix A: OEPM Data Collection Screens)
☒ Protocol script (Appendix B: Sample Usability Testing Script)
☐ Other (specify: ________________)

XIV. GIFTS OR PAYMENT: ☐ Yes ☒ No

XV. ANNUAL FEDERAL COST: The estimated annual cost to the Federal government is $6,171. The cost is based on an annualized effort of 187 person-hours at the evaluator's rate of $33/hour for administering the usability testing protocols, collecting and analyzing responses, and drafting a report for the information technology vendor's use toward development of the modernized system of data collection screens, for ultimate approval under the methodological testing generic clearance with OMB Control Number 2700-0159, exp. 04/30/2018.
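The cost estimate is a straightforward product of the annualized hours and the hourly rate:

```python
hours, rate = 187, 33  # annualized person-hours and evaluator rate from Section XV
print(f"Estimated annual cost: ${hours * rate:,}")  # -> $6,171
```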
XVI. CERTIFICATION STATEMENT:
I certify the following to be true:
1. The collection is voluntary.
2. The collection is low burden for respondents and low cost for the Federal Government.
3. The collection is non-controversial and does not raise issues of concern to other federal agencies.
4. The results will be made available to other federal agencies upon request, while maintaining confidentiality of the respondents.
5. The collection is targeted to the solicitation of information from respondents who have experience with the program or may have experience with the program in the future.
Sponsor: Dr. Roosevelt Johnson
Title: Deputy Associate Administrator
Office of Education
Email address or Phone number: roosevelt.y.johnson@nasa.gov
Date: 12/5/16


References

Barnum, C. M. (2010). Usability testing essentials: Ready, set...test! Burlington, MA: Morgan Kaufmann.

Colton, D., & Covert, R. W. (2007). Designing and constructing instruments for social research and evaluation. San Francisco: John Wiley and Sons, Inc.

Jaaskelainen, R. (2010). Think-aloud protocol. In Y. Gambier & L. Van Doorslaer (Eds.), Handbook of translation studies (pp. 371-373). Philadelphia, PA: John Benjamins.

NASA Education. (2016, January). NASA Education implementation plan. Retrieved from http://www.nasa.gov/sites/default/files/atoms/files/nasa_education_implementation_plan_ve4_2015-2017.pdf

Nielsen, J., & Landauer, T. K. (1993). A mathematical model of the finding of usability problems. INTERCHI '93 Conference on Human Factors in Computing Systems (pp. 206-213). Amsterdam: ACM Press. Retrieved July 26, 2016, from http://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.544.5899&rep=rep1&type=pdf

Quesenbery, W. (2011, July 25). WQUsability. Retrieved from http://wqusability.com/handouts/righttechniqueuf2008.pdf

U.S. Department of Health & Human Services. (2016, July 28). Scenarios. Retrieved from Usability.gov: https://www.usability.gov/how-to-and-tools/methods/scenarios.html

UX Passion. (2016, July 28). Usability: What is a test scenario and how to design it. Retrieved from http://www.uxpassion.com/blog/usability-what-is-test-scenario/


Appendix B: Sample Usability Testing Script/Protocol (Barnum, 2010):
Welcome the participant. NASA Office of Education appreciates your participation in this study. Your being here today will help NASA Education improve the ways in which users are able to access, navigate, and successfully interact with the OEPM data collection screens.
State the purpose of the study. The purpose of this study is to learn from you what works well and what does not; that is the reason for conducting a usability test of the OEPM data collection screens. The research study team is completely open to anything you might want to share that would help improve the experience of these data collection screens for future users.
Provide forms required for participation, unless completed electronically in advance. I have on record that you have already completed your video consent form for the study. The video recording of this session will be used to capture, in a form we can analyze, what works well and what does not as you access and navigate through the OEPM data collection screens according to the scenario provided. This video will not be seen by anyone outside of the research group, which includes X, X, X, etc.
Describe the participant room. As described in the solicitation for participants, note that there is a video camera positioned over your shoulder to capture your efforts to navigate the scenario. Also note that there is an observer who will take notes on what she/he sees during the testing.
Explain the testing process. I will be asking you to perform some tasks with the OEPM data collection screens that are typical of your user role and level of experience. I will be sitting beside you while you move through the scenario, observing what you do and asking for clarification about anything you do as you think out loud during the testing.
Describe thinking out loud. I realize that it's not natural to think out loud while working, but doing so will help the team get insight into your experience when you share your thoughts this way. Some examples are: "I like the tabs at the top because they are clearly labeled." "Using the 'Description' tab takes me to information I find useful." Or, "I have no idea what this word means..." or "This is not what I expected to happen when I clicked on that link..." My role as moderator is to ask questions or remind you to share thoughts if you become quiet.
Ask participant to share any questions or concerns. Do you have any questions or concerns? Are you
comfortable? You can stop any time.
Thank the participant. Thanks again for participating. NASA Education values your time and efforts in this
research study. Are you ready to begin?
Start the study. Indicate to the videographer to start the camera. Ask the participant to read the scenario out loud to transition her/him into thinking out loud more easily.
[Testing situation proceeds]
Thank the participant. Indicate to the videographer and observer that the session is at an end. Thank the
participant for her/his time and effort. Reassure her/him of confidentiality measures. Thank the research
support staff for their participation.


Appendix C: Office of Education Performance Measurement (OEPM) Data Collection Screens Usability Testing
Confidentiality, Consent & Recording Release Form – Adult

Confidentiality Statement: In accordance with the Privacy Act of 1974, as amended (5 U.S.C. 552a), you are hereby notified that this study is sponsored by the National Aeronautics and Space Administration (NASA) Office of Education (Education), under authority of the GPRA Modernization Act of 2010 (GPRAMA), which requires quarterly performance assessment of Government programs for purposes of assessing agency performance and improvement. NASA Education may use the information provided for purposes related to improving the user interface of the Office of Education Performance Measurement (OEPM) data collection screens. NASA Education will hold the information in confidence to the full extent permitted by law. Although efforts will be taken to ensure confidentiality, there remains a remote risk of personal data becoming identifiable. A non-identifying code number will be assigned to participants' data records, which will be stored in accordance with federal regulatory procedures and accessible only to the investigator. While qualitative data is usually analyzed in aggregate, any use of individual data to illustrate specific results will be labeled in a manner that preserves the participants' anonymity. Information will be secured and removed from the NASA server and location in accordance with guidelines set out in NASA Records Retention Schedule 1392, Items 68-69.
Introduction: This research seeks to support the mission of the NASA Office of Education by asking you to take part in usability testing of the OEPM data collection screens that accept applicant information for a NASA internship, fellowship, or scholarship opportunity. The information we collect in this usability study will help us improve the data collection screen users' experience with regard to the wording and understanding of questions and instructions, and navigation through the data collection screens.
Purpose of the Study: To determine the degree to which users of the OEPM data collection screens are interpreting and answering questions consistently and reliably, and to determine how the user interface is impacting the user experience.
Description of Study Procedures: You will be asked to think aloud while accomplishing a task as described in a test scenario. A moderator will be present to remind you to think aloud if you fall silent. You will be audio- and video-recorded while completing the test scenario. When you complete the task as described in the scenario, the session is complete.
Estimation of Time Required: To be determined through the testing procedure.
Contact Persons: If you have any additional questions concerning the research, this consent or
confidentiality of responses, please contact Dr. Lisa E. Wills, Senior Education Research Associate, at
lisa.e.wills@nasa.gov or call (202)258-6021.
Study Consent: There are no foreseeable risks to participants electing to participate in this study. Your
participation is completely voluntary. You may cease participation at any time. In no way does refusing
participation in this study preclude you from eligibility for NASA education project activities now or in
the future.
---TO BE MAINTAINED BY PARTICIPANT---

I agree to participate in the study conducted and recorded by the National Aeronautics and Space Administration (NASA) Office of Education (Education).


I understand and consent to the use and release of the recording by [Agency/Organization]. I understand that the information and recording are for research purposes only and that my name and image will not be used for any other purpose. I relinquish any rights to the recording and understand that the recording may be copied and used by [Agency/Organization] without further permission.
I understand that participation in this usability study is voluntary and I agree to immediately raise any
concerns or areas of discomfort during the session with the study administrator.
Please sign below to indicate that you are 18 years of age or older, have read and understand the
information on this form, and that any questions you might have about the session have been answered.

Date:_________
Please print your name: ____________________________________________________
Please sign your name: ____________________________________________________

Thank you!
We appreciate your participation.
