
Generic Information Collection Request:
Usability Testing for the
Annual Integrated Economic Survey



Request: The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). Researchers will conduct usability testing with participants to learn whether respondents can navigate and understand the proposed structure of the new Annual Integrated Economic Survey (AIES), and how this structure is expected to affect the procedures respondents use to gather and enter the requested data.


Background and Previous Research: The Census Bureau’s Economic Directorate asked the National Academy of Sciences (NAS) to convene an expert panel to review the Bureau’s appropriated annual economic surveys and recommend improved methodologies for conducting and processing them. The panel began work in July 2015, and the final report, with recommendations, was released in May 2018. To address these recommendations, the Economic Directorate has been conducting research into harmonizing and simplifying the design and production process for these surveys and the Economic Census.


This research program began with a series of focus groups and interviews, starting in July 2019, to determine the record-keeping practices of medium-sized multi-unit companies. The findings from that research informed a further exploration, beginning in October 2020, of the accessibility of various data by topic (including revenue, expenditures, and others) and by unit of analysis (including company, establishment, state, and others).


At the same time, survey operations staff in the Economic Directorate began the process of consolidating contact information for the largest firms, leading to a multi-year pilot focused on coordinated contact for the standing annual surveys. The first two rounds of contact consolidation were supported by two rounds of debriefing interviews: the first, beginning in March 2020, examined the impact of the contact strategies used in the pilot, while the second, beginning in August 2021, focused on non-respondents and barriers to completion.


The results of all this research have been incorporated into a single harmonized survey instrument designed to be administered in one survey cycle across the economy, regardless of firm size, industry, or other characteristics.


In December 2021, we received approval to conduct the first phase of pilot research in support of the new instrument, and we plan additional requests for approval to conduct other research for this survey throughout 2023.


In total, six rounds of respondent research have been conducted to support the development and design of the AIES product.1


Purpose: Usability interviews will be conducted to assess the functionality of the prototype instrument by examining whether respondents can successfully complete tasks designed to mimic those they would need to complete when filling out the AIES. Researchers will assess whether the instrument is user friendly by examining respondents’ ability to navigate the instrument efficiently and to successfully provide data to the AIES.


Objectives for the evaluation of the online AIES instrument include the following:


  • Evaluate the instrument’s performance in terms of efficiency, accuracy, and user satisfaction.

  • Identify areas of the instrument that are problematic for users.

  • Identify instructions/features that are difficult for users to understand.

  • Evaluate the ability of respondents to complete the basic data collection steps.

  • Understand how respondents navigate and use the spreadsheet.

  • Identify if respondents demonstrate an understanding of establishment versus industry reporting.

  • Evaluate if respondents can access instrument support documents.

  • Evaluate how respondents resolve errors.

  • Evaluate if respondents understand how to submit their data.

  • Provide recommendations for improvements to the design of the instrument that will enhance its usability.



Staff from the Economic Statistical Methods Division’s (ESMD) Data Collection Methodology & Research Branch will conduct the interviews for this testing.


Population of Interest: Respondents representing companies selected in at least two of the following surveys: ARTS, AWTS, SAS, ASM, and/or ACES. Respondents participating in the usability testing will not have taken part in previous AIES pilot studies.


Timeline: Testing will be conducted over three rounds from April through December 2023.


Language: Testing will be conducted in English only.


Method: The method of research will be usability interviews. Usability testing evaluates an instrument by giving users specific tasks designed to mimic the actions they would need to perform when interacting with the instrument outside of the testing environment. The success or failure of these tasks allows researchers to assess the functionality, effectiveness, and efficiency of the instrument. For the purposes of this research, the usability tasks will focus on the participant’s ability to complete basic tasks such as data entry, site navigation, and accessing or interacting with new features built into the instrument (e.g., auto-summing; optional reporting features). Researchers will also assess the flow of the instrument and examine whether any difficulties arise with reporting at various units (company; industry; establishment).


Most interviews will be completed in person. Travel is required for these interviews.2 In-person interviews will take place at participants’ places of business or another location of their choosing. Any interviews not conducted in person will be conducted over the telephone or via a conference call line using Microsoft Teams. The interviews will follow a semi-structured interview protocol (Attachment A) that includes a suite of tasks designed to assess the usability of the instrument by having respondents complete actions they will need to complete during actual data collection. Answers to any data entry tasks will be provided to the participant so the focus of the tasks remains on navigation and interaction with the website rather than data retrieval. The reporting spreadsheet participants fill out will be customized to their company in that the locations listed (with associated NAICS code and address) will be prefilled and accurate. No other substantive information, such as employee count or any financial data, will be prefilled. The answer key is provided in Attachment B. The interviews may be recorded (with consent) to facilitate summarization.


Note that these tasks are subject to change when the final instrument design is in hand. The basic goals of each task (e.g., data entry; navigation) will remain the same. Attachment C contains mockups of the draft instrument that we will ask participants to evaluate. Note that the instrument screens will change over the course of the year as we move closer to production and incorporate feedback from testing.


Sample: We plan to conduct a maximum of 90 interviews with businesses of a variety of sizes and types (i.e., industries). These 90 interviews will be distributed across the three rounds as follows: Rounds 1 and 2 will include 30-45 interviews each, and the remaining interviews will constitute Round 3. Rounds 2 and 3 will include revisions to the instrument based on results from the preceding round. Usability testing will involve recruiting primarily larger establishments, which will be most affected by the new collection method, followed by a smaller sample of medium- and small-sized establishments, as determined by annual revenue. The sample size necessary for this test was determined by qualitative research experience. This sample will yield a suitable, broad representation of U.S. businesses for this usability testing, and it should be large enough to provide reactions to the questions and identify meaningful findings.


Recruitment: Participants will be recruited using the sample file from the coordinated collection sample frame, which includes respondents representing companies selected in at least two of the following surveys: ARTS, AWTS, SAS, ASM, and ACES. The recruitment sample will not include those who participated in the pilot testing. Before beginning the interviews, we will provide participants with a consent form (Attachment D) informing them that their response is voluntary and that the information they provide is confidential under Title 13.


Protocol: A copy of a draft interview protocol including the participant tasks is attached (Attachment A). The answer key for the tasks is in Attachment B.


Use of incentive: Monetary incentives for participation will not be offered.


Length of interview: For the usability interviews, we expect that each interview will last no more than 60 minutes (90 cases x 60 minutes per case = 90 hours). Additionally, to recruit respondents we expect to reach out via email and to make up to 3 phone contacts per completed case. The recruiting emails and calls are expected to last 3 minutes each on average (3 contact attempts per completed case x 90 cases x 3 minutes per attempt = 13.5 hours).

Thus, the estimated burden for this project is 103.5 hours (90 hours for interviews + 13.5 hours for recruiting).
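For ease of review, the burden estimate can be broken out as follows, using only the figures stated above:

  Interviews:  90 cases x 60 minutes per case                 = 5,400 minutes = 90 hours
  Recruiting:  90 cases x 3 attempts x 3 minutes per attempt  =   810 minutes = 13.5 hours
  Total:       90 hours + 13.5 hours                          = 103.5 hours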


Below is a list of supporting documents referenced herein:


  1. Attachment A: Protocol used to outline how the research study will be conducted

  2. Attachment B: Answer Key for respondent data entry tasks

  3. Attachment C: Mockup of the draft instrument and reporting spreadsheet

  4. Attachment D: Consent Form to obtain participant consent for participation and recording of the cognitive interview session


The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:

Rebecca Keegan

Data Collection Methodology and Research Branch

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-6003

rebecca.keegan@census.gov



cc:
Nick Orsini (ADEP)

Amy Anderson Riemer (ESMD)

Lisa Donaldson (EMD)

Jenna Morse (EMD)

James Burton (EMD)

Kimberly Moore (ESMD)

Blynda Metcalf (ADEP)

Melissa Cidade (EMD)

Jennifer Hunter Childs (ADRM)

Jasmine Luck (ADRM)



1 The full suite of respondent research approved through OMB Control Number 0607-0725 is listed below:

  • In-depth Exploratory Interviewing to Study Record-Keeping Practices, July 2019

  • Respondent Debriefings for the Coordinated Contact Pilot Experiment, March 2020

  • Cognitive Interviewing for the Content Harmonization and Collection Unit Determination Instrument, October 2020

  • Non-respondent Debriefings for the Coordinated Contact Pilot Experiment, August 2021

  • Cognitive Interviews for the Annual Integrated Economic Survey, Phase I, October 2021

  • Cognitive Interviews for the Annual Integrated Economic Survey, Phase II, October 2021

2 Testing locations to be determined.


