OMB Memo


Generic Clearance for Questionnaire Pretesting Research


OMB: 0607-0725


Generic Information Collection Request:
Cognitive Interviewing for the Business Trends and Outlook Survey (BTOS)



Request: The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We will conduct cognitive interviews to test a proposed supplement on the topic of Artificial Intelligence (AI) for the Business Trends and Outlook Survey (BTOS). Note that a previous AI supplement was fielded with BTOS from December 2023 to February 2024; this version includes new questions addressing Generative AI and AI use beyond the production of goods and services.


The Business Trends and Outlook Survey collects qualitative survey responses every two weeks from a representative sample of all non-farm businesses in the United States, Puerto Rico, and Washington, D.C. This high-frequency program produces timely, granular data products that provide measures of business conditions and outlook and capture the extent to which events such as economic downturns or adverse weather affect businesses. As economic and other conditions change, businesses face new challenges and opportunities; if successfully tested, the newly proposed content may help address data gaps for these emerging issues. For more information about the Business Trends and Outlook Survey, see: Business Trends and Outlook Survey (BTOS) (census.gov).


This revised AI supplement differs in several key ways from the first AI supplement fielded from December 2023 to February 2024. While the core questions are unchanged, the new supplement goes beyond the concept of AI use in “producing goods and services” and asks about AI use more generally, including use in other business functions (outside of production) and use by workers for several tasks.


Purpose:


The purpose of cognitively testing the newly proposed supplemental questions for the BTOS is to assess whether the questions measure the underlying constructs of interest and to better understand the accessibility of the requested data and the burden of compiling responses to the questions. Feedback from these interviews will be used to refine question wording and to decide whether to include these new questions on Artificial Intelligence as a supplement to the 2025 Business Trends and Outlook Survey.


A number of the core questions will be asked of participants to provide context for the Artificial Intelligence supplement, which will follow the core questions. We will do minimal probing on these core questions, though for some questions we would like to learn more about how participants interpret them, particularly to gather feedback on the proposed revised monthly time frame.


Results: The results of the cognitive testing will be documented in a report outlining the findings of the pretest and recommending improvements to the questions. This report will include:


  • Understanding how respondents comprehend the proposed Artificial Intelligence supplement questions

  • Identifying respondents’ use of records and/or estimation strategies for answering these questions

  • Assessing respondents’ ability to answer these questions

  • Identifying difficulties in completing the questionnaire (where applicable)

  • Recommending changes to questions and response options, or the addition of new questions, to be implemented in the BTOS Artificial Intelligence supplement


Population of Interest: Respondents from single-unit and multi-unit businesses across all sectors.


Timeline: Testing will be conducted during January–March 2025.


Language: Testing will be conducted in English only.


Method: The purpose of cognitively testing the proposed BTOS Artificial Intelligence supplement questions is to minimize measurement error and maximize the validity of these questions by assessing whether they accurately measure the underlying constructs of interest. Cognitive interviewing is a method of pretesting instruments that involves in-depth interviewing, paying particular attention to the mental processes respondents use to respond to questions.1 Cognitive interviewing evaluates questions against their outcome objectives, including whether they accurately elicit the underlying construct of interest and how accurately respondents can provide data in response.


Staff from the Data Collection Methodology and Research Branch plan to conduct up to 30 moderated interviews via Microsoft Teams or telephone, over two rounds of interviewing. The interviewers will follow a semi-structured interview protocol (Attachment B). Interviewers will send respondents a link to the online survey prior to calling so that respondents can work through the survey while on the phone. The survey will be hosted on the Qualtrics online survey platform. Interviews will be recorded if the respondent consents.


In the first round, to test the new content, we will target up to 15 respondents from single-unit and multi-unit businesses across all industries to complete a cognitive interview. The research staff will then review the resulting data and adjust the new supplemental questions and the interviewing protocol if necessary. We will then conduct a second round of moderated cognitive interviews targeting up to 15 single-unit and multi-unit business respondents, again across all industries.


Sample: We plan to conduct a maximum of 30 moderated cognitive interviews in total over two rounds of data collection. This number of interviews is targeted because it is manageable within the allotted time period, it should adequately cover the targeted businesses, and it should be large enough to elicit reactions to the questions that yield meaningful findings.


The sampling universe for the BTOS cognitive testing consists of single-unit and multi-unit firms in the 2024 Annual Business Survey universe for which the Census Bureau has email addresses. Companies that indicated they used artificial intelligence will be prioritized, though a small number of companies that did not indicate AI use will also be recruited to ensure that the supplement works well for those companies as well.


Recruitment: Participants will be recruited using a sample file from the 2024 Annual Business Survey universe. First, we will send an email to the contact with instructions for scheduling an interview date and time. We will verify the appointment time and respond by email to the respondent to confirm the scheduling; in that email, we will also verify the best number at which to reach the respondent. About 30 minutes before the appointment time, we will email the respondent again, reminding them of the upcoming appointment and including a link to the survey. The first screen of the online survey will be a Paperwork Reduction Act (PRA) and Privacy Act (PA) statement, informing participants that their response is voluntary and that the information they provide is confidential under Title 13, and asking for consent to be interviewed. Respondents will need to click a checkbox indicating that they understand these rights and agree to be interviewed. If email recruitment is not meeting our recruitment goals, we will move to telephone calls.

Protocols: A copy of a draft interview protocol for the moderated interviews is enclosed.


Use of incentive: Monetary incentives for participation will not be offered.


Below is a list of materials to be used in the current study:


Attachment A: Draft instrument for the BTOS including the Artificial Intelligence Supplement


Attachment B: Draft protocol outlining intended questions to guide the moderated debriefings for the BTOS Artificial Intelligence Supplement


Attachment C: Consent form example, including PRA/PA statements


Burden Estimate: For moderated cognitive interviews, we expect that each interview will last no more than 45 minutes (30 cases x 45 minutes per case = 1,350 minutes = 22.5 hours).


Additionally, to recruit respondents for testing, we expect to make up to 3 email contacts per completed case. The recruiting emails are expected to take on average 3 minutes to read [(3 attempted emails per completed case x 30 cases x 3 minutes per email) / 60 minutes per hour = 4.5 hours].


Thus, the estimated burden for the moderated cognitive interviews is 27 hours.


The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Kristin Stettler

Data Collection Methodology and Research Branch

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-7596

Kristin.j.stettler@census.gov


cc (with enclosures):
Nick Orsini (ADEP)

Kathryn Bonney (EMD)

Catherine Buffington (EID)

Lucia Foster (CES)

Katie Genadek (ESMD)

Temika Holland (ESMD)

Jasmine Luck (ADRM)

Jessica Holzberg (ADRM)

Danielle Norman (PCO)

Mary Lenaiyasa (PCO)


1 Campanelli, P. 2007. “Methods for Testing Survey Instruments.” Short Course, Joint Program in Survey Methodology (JPSM). Arlington, VA.
