Generic Clearance for Questionnaire Pretesting Research

OMB: 0607-0725


Generic Information Collection Request:

Cognitive Interviewing for the Content Harmonization and Collection Unit Determination Instrument



Request: The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We will conduct in-depth exploratory interviews to study the recordkeeping practices of medium-to-large multi-unit companies, with the goal of improving the annual survey programs in the Economic Directorate.



Purpose:

The Census Bureau’s Economic Directorate (ECON) asked the National Academy of Sciences (NAS) to convene an expert panel to review its appropriated annual economic surveys and recommend improved methodologies for conducting and processing them. The panel started work in July 2015, and the final report was released in May 2018 (Reengineering the Census Bureau's Annual Economic Surveys). Based on these recommendations, the Economic Directorate is conducting research toward the goal of harmonizing and simplifying the design and production process for these surveys and the Economic Census.

In previous research, the Census Bureau conducted a series of focus groups and interviews to determine the recordkeeping practices of medium-sized multi-unit companies. This research shed light on the sources of burden that respondents encounter when responding to Census Bureau annual economic surveys; this burden ranges from interpreting the meaning of questions, to allocating firm data to those questions, to rallying the appropriate company component to respond to such requests. One of the major findings of this work is that, from the respondents’ perspective, asking for data about a piece of the entire company can be confusing at best and overly burdensome at worst. This mismatch between what surveys request and how companies keep their records degrades the quality of the resulting data and erodes respondent goodwill toward data requests.

With that in mind, this project seeks to extend these previous record-keeping studies by better understanding the accessibility and burden associated with reporting data on various business attributes at various levels of operation. The proposed research will test the concept of a topic-driven instrument developed based on the Economic Directorate Data Management Framework (see Table 1) to determine the best collection unit for core data items. In consultation with the appropriate Census Bureau stakeholders, researchers in the Economic Statistical Methods Division (ESMD) have identified three key variables to focus on in this research: revenue, expenses, and assets.





Table 1: Economic Directorate Data Management Framework and Example Subtopics (Abridged)

Overall Topic | Examples
Assets | Accounting period, accounts receivable
Depreciable Assets | Capital expenditure, inventory, vehicles
Building Permits | Jurisdiction, units, residential permits
Business Characteristics | Ownership information, EIN, establishments
Business Classification | Kind of business, method of selling, industry specific
Collection Process | Burden, remarks, time period covered
Expenses | Operating, benefits, payroll
Manufacturing and Production | Process structure, production capability, production targets
Revenue | Products, contract, electronic sources
Structure | Project details, project type, structure details
Workforce | Employment, personnel, wages



In order to move forward with implementing the NAS panel recommendations for improving and streamlining the Economic Directorate’s survey processes, it is critical to learn more about the business units at which companies can accurately report data on these topics. During these in-depth interviews, companies will describe to researchers the accessibility of data at various levels of operation. To frame this research, we will extend the Business Information Accessibility Scale framework displayed in Table 2, which was put forth by Snijkers and Arentsen (2015). These researchers developed a four-point, color-coded scale as a reference for respondents when assessing the accessibility of their data at various increments, in terms of both time and organization. That work centered on combining two surveys collecting similar data from large non-financial firms, and then adjudicating the level of measurement between the two efforts. They hypothesize that “the more steps and the more sources involved in the response process, and the deeper within the business information has to be retrieved, the higher the risks of survey errors like measurement errors and item non-response.” This builds on Bavdaz’s (2010) work on accessibility as an underlying concept of both burden and measurement error.



Table 2: Snijkers and Arentsen's (2015) Business Information Accessibility Scale

Concept | Description
Easily accessible | The information is easily and readily available (at group accounts level)
Accessible with minor effort | The information is available at a central location, but not in the group accounts (treasury level), which requires more effort
Accessible with major effort | The information is available, but decentralized (general ledger level), which requires considerable effort to acquire
Inaccessible | The information is not available



Staff from the Data Collection Methodology and Research Branch (DCMRB) within ESMD will work with staff throughout the Economic Directorate to conduct these exploratory case study interviews. We will conduct telephone interviews with respondents from up to 40 companies throughout the United States. Interviews will last no longer than 90 minutes and use a series of cognitive interviewing prompts and probes. We will limit this research to firms that are in at least two in-scope annual surveys. We will also target six to ten participants from the prior Record Keeping Study or other related research where we have built rapport with respondents.



Method

Our research questions include the following:

  • “Topic” Approach: How does a drill-down approach (wherein respondents give an overall total and then the finer details of that total) hinder or enable respondents to answer harmonized questions?

    • Are there industries for which this approach does not work?

    • How comfortable or uncomfortable are businesses with a topic-driven instrument?

  • Collection Unit: What is the most detailed breakdown level at which the data are readily available?

    • Are there industries that are more readily able to provide these data at a granular level?

    • From the participants’ perspective, what is the best collection unit?

    • Are there questions that should be asked at varying levels of collection unit?

    • Can participants reconcile our collection unit choices (varying granularity of NAICS industries, establishment-level, state or other geographic level, etc.) with their own organizational setup?

  • Burden and Accessibility: Does asking for these data at the most easily available collection unit reduce the amount of time and effort it takes for respondents to complete the survey?

    • Does the new approach hinder or facilitate data reporting?

    • Using a green (easily accessible), yellow (accessible with minor effort), orange (accessible with major effort), and red (inaccessible) categorizing scheme, how easy or difficult is it to report the requested data at the lowest collection unit?

    • What is businesses’ preferred collection unit for each of these topics?

We plan to investigate these questions by conducting exploratory case study interviews with business survey respondents and with others responsible for financial records and reporting, to the degree possible. While our research usually involves a single interviewer and a single respondent, in this case, more than one Census Bureau researcher may engage more than one business representative in a single meeting to better ascertain how these data are compiled. These interviews will take place over the phone and will be augmented by a secure online research response aid to facilitate effective communication with respondents. If necessary, respondents can share their screens with researchers using Skype for Business.1 The interviews will follow an interview guide (Attachment A).



We will begin the interview by introducing the study and learning about the company’s activities and data reporters. Then, we will ask a few questions about the recordkeeping practices of the company. Most of the interview, however, will focus on the topic-based questions for a variety of collection units. These questions direct respondents first to a topic of interest (revenue, expenses, or assets, and related subtopics) and then to the levels of operation of interest, including:

  • Establishment: an economic unit, usually at a single location, where business is conducted or where services or industrial operations are performed.

  • Industry: as measured by the North American Industry Classification System (NAICS), industry is the production-oriented classification of establishments into industries according to similarity in the processes used to produce goods or services.

  • Business Segment or “Kind of Activity Unit (KAU)”: a line of business that can be specified and for which data can be isolated for a unit that represents the consolidation of all establishments within the same 2-, 4-, or 6-digit NAICS code within a company.

  • Product Line: as measured by the North American Product Classification System (NAPCS), product line is an aggregation of outputs and products of any given business; whereas a business is assigned one NAICS code (industry), businesses can have multiple NAPCS codes linked to any given unit of operation.

Using the interview guide in conjunction with scripted and unscripted probes, we will lead an open discussion with respondents to better understand the accessibility and burden of reporting these three critical topics and subtopics for each of these four business units of operation, working collaboratively toward shared meaning. Subject matter experts from the Economy-Wide Statistics Division (EWD) or other Economic divisions, as appropriate, will participate in the preparation for these interviews.

In order to address the research questions, we will operationalize the accessibility scale described above in Table 2 by administering a “card sort” methodology with participants. A card sort is a well-established methodology for understanding how “participants relate and categorize concepts” (Goodman et al. 2012: 202). In this case, we are asking respondents to categorize the accessibility of data at each of the different levels of measurement by assigning each level to a color representing accessibility. The respondent will use virtual “cards,” each containing a level of measurement, and will sort them into one of four color-coded categories rating the accessibility of the data. Because we are providing the respondent with the accessibility scale, this is considered a closed card sort (as opposed to an open card sort, wherein respondents would create the categories themselves).
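
To make the closed card sort concrete, the sketch below shows one way the sorted responses could be recorded. It is illustrative only; the structure and labels (e.g., CardSortResponse) are hypothetical and are not taken from the actual research response aid.

```python
# Illustrative sketch only: hypothetical structures for recording closed card
# sort responses; labels are not taken from the actual research response aid.

from dataclasses import dataclass, field

# The four color-coded accessibility categories from Table 2.
CATEGORIES = [
    "Easily accessible",             # green
    "Accessible with minor effort",  # yellow
    "Accessible with major effort",  # orange
    "Inaccessible",                  # red
]

# The "cards": the levels of operation (collection units) described in this memo.
CARDS = [
    "Establishment",
    "Industry (NAICS)",
    "Business segment (KAU)",
    "Product line (NAPCS)",
]

@dataclass
class CardSortResponse:
    """One respondent's closed card sort for a single topic (e.g., revenue)."""
    topic: str
    placements: dict = field(default_factory=dict)  # card -> category

    def place(self, card: str, category: str) -> None:
        # In a closed sort, every card must land in one of the fixed categories.
        if card not in CARDS:
            raise ValueError(f"Unknown card: {card}")
        if category not in CATEGORIES:
            raise ValueError(f"Unknown category: {category}")
        self.placements[card] = category

# Example: a respondent rates the accessibility of revenue data by collection unit.
response = CardSortResponse(topic="Revenue")
response.place("Establishment", "Accessible with major effort")
response.place("Industry (NAICS)", "Easily accessible")
print(response.placements)
```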

Interviews will be audio recorded and may include screen capture technology if appropriate to aid researchers in accurately reporting findings and recommendations. No audio or screen recording will be conducted without respondents’ consent. Respondents will be informed of their privacy rights through an informed consent form that researchers will send prior to the scheduled interview.



Survey Population

ESMD researchers will select firms for recruitment using specific criteria indicating that they may be most affected by proposed changes to the reporting structure. Selected companies should represent the “most typical” medium-to-large multi-unit organizational structures. We will use the following information to aid in identifying appropriate firms:

  • Business size (as measured by payroll and employment)

  • Number of establishments

  • Number of states or jurisdictions with establishments

  • Number of unique employer identification numbers (EINs) for a business

Additionally, targeted firms must be in-sample for at least two in-scope annual economic surveys. We will exclude companies currently enrolled in the Full Service Account Manager Program.2 We will focus on companies that operate in multiple industries within the manufacturing, retail, wholesale, and/or service sectors.
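
As a rough illustration, the screening criteria above could be applied to a frame of candidate firms along the lines of the sketch below. The field names and the size threshold are hypothetical, since this memo does not publish specific cutoffs for “medium-to-large” firms.

```python
# Illustrative sketch only: hypothetical field names and thresholds; the memo
# does not publish specific cutoffs for medium-to-large multi-unit firms.

def eligible(firm: dict) -> bool:
    """Apply the screening criteria described above to one candidate firm record."""
    return (
        firm["num_establishments"] > 1            # multi-unit company
        and firm["in_scope_annual_surveys"] >= 2  # in-sample for at least two in-scope annual surveys
        and not firm["has_account_manager"]       # exclude Full Service Account Manager companies
        and firm["employment"] >= 500             # hypothetical size threshold
    )

candidates = [
    {"name": "Firm A", "num_establishments": 12, "in_scope_annual_surveys": 3,
     "has_account_manager": False, "employment": 2500},
    {"name": "Firm B", "num_establishments": 1, "in_scope_annual_surveys": 2,
     "has_account_manager": False, "employment": 800},
]

recruits = [f["name"] for f in candidates if eligible(f)]
print(recruits)  # ['Firm A'] -- Firm B is single-unit and screened out
```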



Sample Selection:

We will contact potential study participants via phone or email to explain the nature of our research and extend an invitation to participate in the study. The sample of participants, then, will be those firms that agree to participate. Respondents will be informed that their response is voluntary and that the information they provide is confidential and will be seen only by Census Bureau employees involved with the research project. We will not provide incentives to participate. Once interviews are scheduled, researchers will send a confirmation email to respondents verifying the time, the date, and the most appropriate staff to attend the meeting. Shortly before the interview appointment, researchers will send participants from the associated firm an email with links to the research response aid (Attachment B) and a consent form informing them of their rights under the Paperwork Reduction Act (PRA) and Privacy Act (PA) (Attachment C), and requesting consent for audio recording and screen capture.



Timeline:

Researchers will begin recruiting for these interviews in October and November 2020. Interviewing and recruitment may take place concurrently throughout the four-month data collection period, concluding in February 2021.



Burden Hours:

We anticipate that each interview may have up to two participants per company and will last no longer than 90 minutes. For recruitment, we expect to make up to five phone contacts per completed interview, each lasting an average of five minutes, putting the total recruitment burden at 16.7 hours (5 attempted phone calls per completed interview x 40 cases x 5 minutes per call = 16.7 hours). At an estimate of 40 interviews, a maximum of 90 minutes apiece, and a maximum of 2 participants per interview, the total interviewing burden is 120 hours (40 interviews x 90 minutes x 2 participants = 120 hours). Finally, we will include an optional pre-interview meeting to troubleshoot any issues with technology, particularly with regard to Skype for Business; we anticipate these meetings will last no more than 10 minutes (10-minute meeting x 40 cases = about 6.7 hours). Combined, we estimate the total burden for this project to be 143.4 hours (16.7 hours for recruitment + 120 hours for interviewing + 6.7 hours for technical support = 143.4 hours).
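
For reference, the burden arithmetic above can be reproduced with a short Python sketch; the only assumption is that component estimates are rounded to one decimal place before summing, which matches the figures in this memo.

```python
# Reproduces the burden-hour arithmetic stated above; component estimates are
# rounded to one decimal place before summing, as in the memo.

cases = 40

recruit_hours = round(5 * cases * 5 / 60, 1)     # 5 call attempts x 40 cases x 5 minutes = 16.7
interview_hours = round(cases * 90 * 2 / 60, 1)  # 40 interviews x 90 minutes x 2 participants = 120.0
tech_check_hours = round(cases * 10 / 60, 1)     # optional 10-minute technology check per case = 6.7

total_hours = round(recruit_hours + interview_hours + tech_check_hours, 1)
print(recruit_hours, interview_hours, tech_check_hours, total_hours)  # 16.7 120.0 6.7 143.4
```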



References Cited within This Letter:

Bavdaz, Mojca. 2010. "Sources of Measurement Errors in Business Surveys." Journal of Official Statistics, 26(1): 25-42.

Goodman, E., M. Kuniavsky, and A. Moed. 2012. Observing the User Experience: A Practitioner's Guide to User Research. Waltham, MA: Morgan Kaufmann, an imprint of Elsevier.

Snijkers, Ger and Kenneth Arentsen.  2015.  "Collecting Financial Data from Large Non-Financial Enterprises:  A Feasibility Study."  Presentation at the 4th International Workshop on Business Data Collection Methodology, Bureau of Labor Statistics, Washington, DC.

Contact:

The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:

Diane K. Willimack

Methodology Director for Measurement & Response Improvement

Economic Statistical Methods Division

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-3538

diane.k.willimack@census.gov



Enclosures

  • Attachment A: Sample interview guide for exploratory interviews

  • Attachment B: Sample of research response aid for the card sort, organized by topic

  • Attachment C: Consent form informing participants of their rights under the Paperwork Reduction Act (PRA) and Privacy Act (PA), and requesting consent for audio taping and screen capture



cc:

Nick Orsini (ADEP) with enclosure

Carol Caldwell (ESMD) with enclosure

Diane K. Willimack (ESMD) with enclosure

Amy Anderson Riemer (ESMD) with enclosure

Melissa Cidade (ESMD) with enclosure

Laura Blaugh (EMD) with enclosure

Kathy Bonney (EMD) with enclosure

James Burton (EMD) with enclosure

Jennifer Hunter Childs (ADRM) with enclosure

Jasmine Luck (ADRM) with enclosure

Danielle Norman (PCO) with enclosure

Mary Lenaiyasa (PCO) with enclosure









1 Per the parameters outlined in the June 18, 2020 memo from the Policy Coordination Office to the Chair of the Data Stewardship Executive Committee on Using Skype for Business for Conducting Title 13 Qualitative Research Remotely.

2 A select number of large and complex companies have been assigned an account manager by the Census Bureau. These managers work with key contacts at companies to develop rapport and provide support in meeting the reporting requirements for economic surveys. These companies are excluded from this study because of the special relationship they already have with the Census Bureau.
