TITLE OF INFORMATION COLLECTION: User Research & Usability Testing of SSA Electronic Projects
SSA SUB-NUMBER:
PURPOSE:
The mission of the User eXperience Group (UXG) is to provide the Social Security Administration (SSA) with well-designed, well-executed usability evaluations for all data collection and information dissemination methods SSA uses to interface with the public. Thus, we ensure these methods are customer-centered and effective.
Given the approaching retirement wave of both SSA employees and the general public, it is critical that SSA’s self-service forms, applications, and other service delivery vehicles provide viable alternatives to SSA’s in-office and telephone interview service channels. To this end, User-Centered Design Activities and Evaluations are a critical success factor. We will design all of SSA’s public-facing information dissemination vehicles, self-service forms, and applications to ensure user success and accurate data collection. SSA’s goal is to provide American citizens, businesses, and state and local governments with self-service applications that are extremely secure, highly rated, and easy to use.
SSA seeks clearance for an expanded list of usability activities through the end of calendar year 2020. Citizen-centered design activities, conducted early and iteratively during the design of a vehicle, would allow SSA to design more usable, effective, and accepted self-service vehicles. Conducting these citizen-centered design activities with more participants would allow SSA to satisfy the needs of a broader segment of the public. Conducting a series of tests iteratively, with design refinements between rounds, would allow us to confirm that design changes actually improve participant performance.
Some of the specific types of User-Centered Design Activities and Evaluations that this clearance pertains to are as follows:
User Interviews - Interviewing one or more users is an effective method to gather information about their tasks, issues, goals, and the environment in which they work. Interviewing a small group of users can elicit multiple perspectives about the work. Interviewing can also be done from a distance and may involve traditional or newer technologies ranging from the telephone to internet-based applications such as NetMeeting. Interviewing users at remote locations can provide access to more users and to users who might not otherwise be able to participate.
Contextual Inquiries - Contextual inquiry combines interviewing users with work observation and allows teams to watch users perform tasks. Teams listen to users explain what they are doing as they work, and can interject questions to elicit more about the user, the work, and the environment.
Card Sorting - Card sorting is a simple design technique that can help structure the content of an application. We give users a pile of cards, each with a description of a menu or content item written on it, and ask them to sort the cards into logical groups. Card sorting streamlines the process of structuring content, such as pages on a web site or menus. It also provides an excellent way of finding out which terminology is best for labels and links. We believe card sorting works best when used early in the design phase, before making decisions about the structure of the web site or the grouping of menu options. It is an effective tool for capturing feedback from target users before producing a paper mock-up.
Usability Evaluations – This is an iterative process that involves testing a design with users and then using the test results to improve the design. Tests can be informal or formal, and conducted in person or at a distance using a combination of telephone and remote technologies such as Skype. The purpose is always to see what is working well and what is not working well – with the goal of improving the design based on feedback from users.
The UXG team records information related to the effectiveness of the vehicle’s design in enabling participants to complete their tasks accurately, efficiently, and to their satisfaction, such as:
Task completion percentages and time;
Task completion accuracy;
Problems the participants encountered; and,
Comments the participants make regarding the vehicle’s ease-of-use.
The study participants provide feedback consisting of one or more of the following:
Completion of the System Usability Scale (SUS) questionnaire (scored as noted below) or other standardized usability rating surveys;
Specific suggestions for change; and,
Both their positive and negative reactions to the system including general comments.
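For reference, SUS responses convert to a single score from 0 to 100: each of the questionnaire’s 10 items contributes 0 to 4 points (for odd-numbered items, the response minus 1; for even-numbered items, 5 minus the response), and the summed total is multiplied by 2.5.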
We typically select study participants from SSA’s beneficiary rolls or by contacting organizations and businesses that would meet the user requirements for the application or website undergoing testing. For example, if the UXG were to test an application designed for disabled people, we may randomly select people from those receiving disability benefits. If the UXG were to test a form that attorneys would complete, we would contact law firms for test participants. If the UXG were to test an application designed for wage reporting, we would contact employers and payroll service providers.
SSA currently conducts usability evaluations within our design activities, which obtain the same types of data sets for each project. Because of our short timeframes and our inability to stop and obtain clearance for each usability session, we limit our usability groups to nine or fewer participants.
SSA uses eye-tracking technology for testing the usability of software and conducting user experience evaluations. This technology helps identify areas of confusion or interest without having to probe the user for specific information. In many cases, it helps the facilitators locate areas of frustration and discuss thoughts and alternative options with the testing participant in real time. The eye-tracker data will supplement the more commonly collected usability data, such as error rate, time on task, and mouse clicks, all of which help us improve the screen design and customer experience. This tool will also assist us in evaluating mobile software by providing concrete data to support our findings and recommendations from test results.
Data mining – Allows SSA to look at all data we receive by the following methods:
Recorded phone calls to support centers;
Text recorded by chat bots;
Web metrics collected by analytic tools (e.g., Google Analytics); and
Eye-tracking methods.
We will use the gathered data to identify common points of failure in the usability of our systems. This will supplement the data gathered by the other methods listed above so we can make data-based decisions for future IT investments. Data mining will help the UXG design better systems that help end users work more proficiently by removing points of failure and providing a more efficient workflow.
Team members of the UXG, which consists of SSA personnel and contractor staff, conduct these activities and evaluations. Study participants are members of the public or members of a group, such as state and local government agencies, employers, and advocates, who use the vehicle under evaluation in conducting their business task(s) with SSA. The vehicles may be paper, such as forms and pamphlets, or automated, such as computer systems, Internet applications, and voice-response telephone menus; we may also mine data collected for other purposes.
In this Generic Clearance Request, we are requesting clearance for all usability sessions through the end of calendar year 2020. Although the subject of the usability sessions may change, the questions and procedures remain constant. Therefore, instead of requesting individual clearance for each usability session, we have attached for your review a list of projected projects for which we will conduct usability testing next year, and the questions we will use for each of these projects.
DESCRIPTION OF RESPONDENTS:
Study participants are members of the general public or members of a group, such as state and local government agencies, employers, and advocates, who use the vehicle under evaluation in conducting their business task(s) with SSA.
TYPE OF COLLECTION: (Check one)
[X] Customer Comment Card/Complaint Form    [X] Customer Satisfaction Survey
[X] Usability Testing (e.g., Website or Software)    [X] Small Discussion Group
[X] Focus Group
[X] Other: Survey questionnaires related to the project process, an informed consent form signed by the participant, and answers to interview and observation questions
CERTIFICATION:
I certify the following to be true:
The collection is voluntary.
The collection is low-burden for respondents and low-cost for the Federal Government.
The collection is non-controversial and does not raise issues of concern to other federal agencies.
The results are not intended to be disseminated to the public.
Information gathered will not be used for the purpose of substantially informing influential policy decisions.
The collection is targeted to the solicitation of opinions from respondents who have experience with the program or may have experience with the program in the future.
Name: Naomi Sipple, Reports Clearance Team Leader, Social Security Administration
To assist review, please provide answers to the following questions:
Personally Identifiable Information:
Is personally identifiable information (PII) collected? [ X ] Yes [ ] No
If Yes, is the information that will be collected included in records that are subject to the Privacy Act of 1974? [ ] Yes [ X ] No
If Applicable, has a System or Records Notice been published? [ ] Yes [ X ] No
Gifts or Payments:
Is an incentive (e.g., money or reimbursement of expenses, token of appreciation) provided to participants? [ X ] Yes [ ] No
If a vendor assists in recruiting and hosting our evaluations for this generic clearance, we contract with them to provide the compensation for the participants. For example, recruiting physicians to participate in evaluating a Consultative Examination Scheduling System may require higher incentives. The vendor will provide an average payment of $75 unless another amount is required.
BURDEN HOURS
Category of Respondent | No. of Respondents | Participation Time (minutes) | Burden (hours)
Individuals | 1,800 | 120 | 3,600
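(1,800 respondents × 120 minutes = 216,000 minutes, or 3,600 burden hours.)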
FEDERAL COST: The estimated annual cost to the Federal government is $300,000.
If you are conducting a focus group, survey, or plan to employ statistical methods, please provide answers to the following questions:
The selection of your targeted respondents
Do you have a customer list or something similar that defines the universe of potential respondents and do you have a sampling plan for selecting from this universe? [ X] Yes [ ] No
If the answer is yes, please provide a description of both below (or attach the sampling plan). If the answer is no, please describe how you plan to identify your potential group of respondents and how you will select them.
We typically select study participants from SSA’s beneficiary rolls or by contacting organizations and businesses that would meet the user requirements for the application or website undergoing testing. For example, if the UXG were to test an application designed for disabled people, we may randomly select people from those receiving disability benefits. If the UXG were to test a form that attorneys would complete, we would contact law firms for test participants. If the UXG were to test an application designed for wage reporting, we would contact employers and payroll service providers.
Administration of the Instrument
How will you collect the information? (Check all that apply)
[X] Web-based or other forms of Social Media
[X] Telephone
[X] In-person
[X] Other, Explain: Emailed surveys
Will interviewers or facilitators be used? [ X ] Yes [ ] No
Please make sure that all instruments, instructions, and scripts are submitted with the request.