American FactFinder Pretesting Plan


Generic Clearance for Questionnaire Pretesting Research


OMB: 0607-0725


The Census Bureau plans to conduct a usability test of the new American FactFinder (AFF) Web site under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). The objective of this research is to identify issues that are problematic or frustrating for users and to compare the results with the original baseline usability testing conducted on the legacy AFF system in 2008. AFF is a free online tool that allows users to find, customize, and download Census information. It is available to the public, and a wide range of users search the site for many kinds of information.


Over the past few years, the developers of the AFF Web site have worked with stakeholders and the usability lab to address usability issues in updates of the application. An update that included user-centered design changes was released in January 2013. This study is intended as a “follow-up baseline” on the latest version of the site, to understand how it performs for users relative to the legacy version. It is the third follow-up baseline study: the initial baseline study was submitted to OMB for approval in a letter dated October 9, 2008, and follow-ups have been conducted with OMB approval (letters dated April 4, 2011, and May 15, 2012).


During June 2013, staff from the Census Bureau’s usability lab will interview 20 non-Census Bureau participants from the Washington, DC, metropolitan area. We will recruit two user groups: novice users of census data, who have a minimum of one year of Internet experience and use the Internet at least three times a week to search for information; and expert users of census data, who use census data for work purposes. Novice participants will be recruited from the Usability Lab database, which is composed of people from the metropolitan DC area who volunteered to participate after responding to a craigslist.com posting or an ad in a local newspaper. Expert users will be recruited both from the Usability Lab database and through recommendations from contacts made by the Census Regional Offices or census partners. Participants will come to the Usability Lab at the Census Bureau for the study.


Participants will initially be asked to complete a questionnaire about their demographic characteristics and a “background questionnaire” about their Internet experience. Then each participant will be given a set of tasks. The study will use tasks and a protocol similar to those used in the original baseline, so that the performance of the new site can be compared with that of the old site.1 Task order will be randomized so that no two participants receive the tasks in the same order. Participants will be asked to think aloud while they are working on the tasks and will be prompted to think aloud when they fall silent.


In this study, we will ask a post-task satisfaction question after each task. The first two studies (the original baseline in 2008 and the first follow-up in 2011) did not collect this information; the third study (the second follow-up in 2012) did. While we will not use this measure to compare back to the 2008 and 2011 studies, we will use it to compare to the 2012 study.


Finally, participants will be asked to complete a final satisfaction questionnaire designed to measure their satisfaction with the new version of the AFF site. Subjective satisfaction ratings will be collected for design elements such as page layout, ease of finding information, and use of Census jargon. Participants will also provide feedback about the Web site during a debriefing at the conclusion of the session.


Copies of the background questionnaire, the demographic questionnaire, the task set, the protocol, the post-task satisfaction questionnaire, the final satisfaction questionnaire, and the debriefing questionnaire are enclosed. We are also enclosing a draft usability report that compares the results of the 2008 baseline study with the 2011 and 2012 follow-up studies.


Respondents will be informed that their involvement is voluntary and that the information they provide is confidential and will be seen only by employees involved in the research project. Participants will be videotaped, and only employees involved in the research project will see the video recordings. Participants will be compensated $40 for their participation.


We estimate that participants will spend one hour on average completing the study, including time spent on the demographic questions, the Internet experience questions, the tasks, the satisfaction questions, and the debriefing. Thus, the total estimated respondent burden for this test is 20 hours (20 participants × 1 hour each).


The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Erica Olmsted-Hawala

Center for Survey Measurement

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-4893

Erica.L.Olmsted.Hawala@census.gov


1 A few of the tasks have been updated with minor changes, such as dates, to reflect a more recent time period. A task that is not currently available on the new site has been rewritten; results for this task will not be compared back to the original baseline but will be carried forward for comparison in future follow-up baseline studies. See the xls spreadsheet for a summary of the changes to the tasks. The protocol is the same, aside from a new research component consisting of a short memory task administered at the beginning of the study and research questions administered at the conclusion of the study.
