OMB Memo

Generic Information Collection Request for AIES Interactive Question Preview Tool 11.17.23.docx

Generic Clearance for Questionnaire Pretesting Research

OMB: 0607-0725


Generic Information Collection Request:
Usability Testing for the
Interactive Question Preview Tool for the Annual Integrated Economic Survey



Request: The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB Control Number 0607-0725). Researchers will conduct up to 30 usability testing interviews to assess whether respondents can effectively use the Interactive Question Preview Tool for the Annual Integrated Economic Survey (AIES).


Background and Previous Research: The Census Bureau’s Economic Directorate asked the National Academy of Sciences (NAS) to convene an expert panel to review its appropriated annual economic surveys and recommend improved methodologies for conducting and processing them. The panel began work in July 2015, and the final report, with recommendations, was released in May 2018. To address these recommendations, the Economic Directorate has been conducting research into harmonizing and simplifying the design and production process for these surveys and the Economic Census, briefly summarized below:1


  • Two-round Unit Harmonization Study consisting of interviews on record keeping practices and data accessibility;

  • Respondent and Non-respondent Debriefings to support contact consolidation;

  • AIES Pilot Phase I pilot test of 78 companies using the harmonized content;

  • AIES Pilot Phase II pilot test of about 800 companies that included additional instrument features and systems; and

  • 2022 AIES (Dress Rehearsal) test of about 8,000 companies using Census Bureau production infrastructure.


In all of this work, we asked respondents about the response process they use when reporting to Census Bureau economic surveys. Many respondents told us that part of the response process is to preview questions to get a ‘sense’ of the scope and breadth of the survey overall before actually providing a response. In early rounds of testing, we included tools such as PDF previews of the survey instrument and additional supplementary information to support response.


However, in the most recent collection – the 2022 AIES, colloquially known as the Dress Rehearsal and representing the third phase of the AIES Pilot overall (collected under OMB Control Number 0607-1024, which expires on 6/30/2026) – the complexity of the instrument design warranted a new approach to survey preview. The AIES is a cross-economy collection; its content is driven by all in-scope industry assignments of a given company, so determining the content for any given company is more complicated than could reasonably be displayed in a PDF preview.


Nonetheless, the Census Bureau provided a content summary document (available at https://www.census.gov/programs-surveys/aies/information/aies-content-summary.html) so that respondents could get a sense of the types of questions they would encounter. We noted, however, that this documentation was not dynamic, and that the length of the PDF would increase perceived response burden, that is, respondents’ perception of the level of effort necessary to complete the survey (Bottone et al. 2021: 813).


From there, we started to develop an interactive content selection tool that respondents could use to preview questions that they would encounter when reporting to the AIES. This tool is designed to display content based on user-identified industries defined by the North American Industry Classification System (NAICS).


Purpose: Researchers in the Economy-Wide Statistics Division (EWD) will conduct usability interviews to assess the functionality of the prototype interactive question preview tool. These interviews will examine whether respondents can successfully complete tasks designed to mimic those they would execute when using the interactive question preview tool to support response to the AIES. Researchers will investigate whether the tool is intuitive by assessing respondents’ ability to navigate through it efficiently to successful completion of each task.


The research will be guided by the following two research questions:

  • Do users understand the purpose of the tool?

    • Do they identify that the content is industry driven?

    • Do they understand collection unit differences?

    • Does this tool meet respondents’ need to preview the survey prior to reporting?

  • Can users generate content that matches their AIES content?

    • What features of the tool are intuitive? What features are not?

    • What additional features might support response?



Population of Interest: Respondents participating in the interactive question preview tool usability testing will have responded to the 2022 AIES (Dress Rehearsal). Note that these respondents have not participated in any respondent debriefing interviews (collected under OMB Control Number 0607-1024, which expires on 6/30/2026) or the AIES Pilot Phase III Response Analysis Survey (collected under OMB Control Number 0607-0971, which expires on 12/31/2025). The total recruitment pool is about 600 businesses.


Timeline: Testing will be conducted from December 2023 through January 2024.


Language: Testing will be conducted in English only.


Method: To test the content selection tool, we will engage in usability interviewing, a task-oriented, semi-structured interviewing methodology. Interviewers provide users with specific tasks designed to mimic the actions they would need to perform when interacting with the content selection tool outside of the testing environment. The success or failure of these tasks allows researchers to assess the functionality, effectiveness, and efficiency of the tool. For the purposes of this research, the usability tasks will focus on the participant’s ability to complete basic actions such as selecting all industries that apply to the business, clearing selections, accessing questions across unit-specific tabs (company, industry, and establishment), and exporting a question preview file.


Interviews: All interviews will be conducted using Microsoft Teams. The interviews will follow a semi-structured interview protocol (Attachment A) that includes a suite of tasks designed to assess the usability of the tool by having respondents perform the actions they will need to complete when using the question preview tool. Answers to any data entry tasks will be provided to the participant so that the focus remains on navigation and interaction with the tool. The interviews may be recorded (with consent) to facilitate summarization.


Note that these tasks are subject to change once the final tool design is in hand; the basic goal of each task (e.g., data entry, navigation) will remain the same. Attachment B contains mock-ups of the draft question preview tool that we will ask participants to evaluate. Note that the tool screens will change as we move closer to production and incorporate feedback from testing. We anticipate that this testing will be iterative.


Sample: We plan to conduct a maximum of 30 interviews with businesses of a variety of sizes and types (i.e., industries). We will recruit from a list of about 600 businesses that provided response to the 2022 AIES but were not invited to participate in the Phase III RAS or the 2022 AIES respondent debriefing interviews.


Recruitment: Participants will be recruited from the pool of about 600 businesses that provided response to the 2022 AIES but were not contacted to participate in the 2022 AIES respondent debriefing interviews or the Phase III RAS. Before beginning the interviews, we will provide participants with a consent form (Attachment D), informing them that their participation is voluntary and that the information they provide is confidential under Title 13.


Protocol: A copy of a draft interview protocol including the participant tasks is attached (Attachment A).


Use of incentive: Monetary incentives for participation will not be offered.


Length of interview: For the usability interviews, we expect that each interview will last no more than 60 minutes (30 cases x 60 minutes per case = 30 hours). Additionally, to recruit respondents we expect to reach out via email and to make up to 3 phone contacts per completed case. The recruiting emails and calls are expected to last on average 3 minutes per contact (3 contacts per completed case x 30 cases x 3 minutes per contact = 4.5 hours).

Thus, the estimated burden for this project is 34.5 hours (30 hours for interviews + 4.5 hours for recruiting).


Below is a list of supporting documents referenced herein:


  1. Attachment A: Protocol used to outline how the research study will be conducted

  2. Attachment B: Screenshots of the draft interactive question preview tool

  3. Attachment C: Recruitment e-mail draft

  4. Attachment D: Consent Form to obtain participant consent for participation and recording of the interview session


Works cited:

Bottone, Marco, Lucia Modugno, and Andrea Neri. 2021. “Response Burden and Data Quality in Business Surveys.” Journal of Official Statistics 37(4):811–36.



The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:

Krysten Mesner

Economy-Wide Statistics Division, Office of the Division Chief

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-9852

krysten.mesner@census.gov



cc:
Nick Orsini (ADEP) With enclosures

Lisa Donaldson (EWD) “ ”

Melissa Cidade (EWD) “ ”

Jenna Morse (EMD) “ ”

James Burton (EWD) “ ”

Blynda Metcalf (ADEP) “ ”

Heidi St.Onge (ADEP) “ ”

Michelle Karlsson (EMD) “ ”

Jasmine Luck (ADRM) “ ”


1 The full suite of respondent research approved through OMB Control Number 0607-0725 is listed below:

  • In-depth Exploratory Interviewing to Study Record-Keeping Practices, July 2019

  • Respondent Debriefings for the Coordinated Contact Pilot Experiment, March 2020

  • Cognitive Interviewing for the Content Harmonization and Collection Unit Determination Instrument, October 2020

  • Non-respondent Debriefings for the Coordinated Contact Pilot Experiment, August 2021

  • Cognitive Interviews for the Annual Integrated Economic Survey, Phase I, October 2021

  • Cognitive Interviews for the Annual Integrated Economic Survey, Phase II, October 2021

  • AIES Dress Rehearsal Usability Interviewing, March 2023


