Nonresponse Debriefings

Attachment H - Nonresponse Debriefings.pptx

Generic Clearance for Census Bureau Field Tests and Evaluations

OMB: 0607-0971


Barriers to Response in Annual Economic Surveys

Melissa A. Cidade

Kelsey Drotning

Thank you for your time today!  I’m Melissa Cidade and I’m with the Economic Management Division at the US Census Bureau.  Today we are going to be talking about nonresponse on establishment surveys, specifically, the suite of annual economic surveys fielded by the Census Bureau.


Framework – Fisher et al. 2003

External

  • Environmental 

  • Business 

  • Respondent 

Internal

  • Mode 

  • Marketing 

  • Burden 

  • Request 

In 2003, Sylvia Fisher and her colleagues at the Bureau of Labor Statistics published a fascinating investigation of barriers to response for economic surveys.  They conducted a total of 32 interviews and developed a schema of explanations for nonresponse.  Whereas for demographic surveys the emphasis tends to be on salience, reciprocity, and other individual-level contexts impacting response, for establishment surveys the authors identified both external and internal drivers of nonresponse.  Externally, this includes environmental factors, like economic conditions, legal and regulatory requirements, and the wider survey landscape.  Business factors are things like gatekeeping, data availability, and third parties who prepare records on behalf of businesses.  And, for respondent factors, the most important are authority to complete, capacity to provide the data, and motivation to respond.  Internal factors include things like response mode, contact strategies and sponsor identification, actual and perceived burden, and the nature of the data being requested.  


Participant Overview

  • Interviewing period:  Sept. 6 through 17, 2021 

  • Total or partial non-response to the following annual surveys: 

    • Annual Survey of Manufactures (ASM) 

    • Service Annual Survey (SAS) 

    • Annual Wholesale Trade Survey (AWTS) 

    • Annual Retail Trade Survey (ARTS) 

  • 19 total interviews, lasting about 20 minutes each, all multi-unit firms 

We wondered, almost 20 years later, if the same barriers are still in place, or if we would see shifts in the factors influencing establishment survey response.  As part of a larger project to harmonize, standardize, and streamline several annual economic surveys at the Census Bureau, we conducted follow-up interviews with 19 partial or total non-responding firms.  In this case, non-responding refers to their status on the in-scope annual surveys for this work, listed on screen now.  We conducted these interviews by phone over a two-week period in September 2021, with each interview lasting no more than 20 minutes.  All interviews were with firms with more than one establishment, called “multi-unit” firms.


Research Questions

Research Question 1:  Are firms getting our communications?

Research Question 2:  For total non-responders:  What are the reasons why firms are not completing at all?

 

Research Question 3:  For partial non-responders:  What are the reasons why firms are not completing all requests?

What happened?!

 

External Factors

Things we can’t change….

Let’s start with the external drivers of nonresponse in establishment surveys.  Remember, these included environmental factors, business characteristics, and respondent-based drivers.


Environmental

  • Mail delays 

  • Remote work 

  • Changing economic conditions 

  • “Normally [survey requests] come to one person through the mail, but with COVID, we do go to the office only once a month to collect the mail.  Sometimes [the mail is] not on time - there's a lag.” 

When we think of the survey landscape, we cannot consider only the traditional barriers like economic conditions or the general social tides around survey response.  All of these influences are muted by the behemoth in the room:  the ongoing COVID-19 global pandemic.  Some participants noted that they are not reporting to their normal business locations, making access to records more complicated.  Others pointed out that colleagues within the company – at various establishments – are not currently physically reporting to work, making the process of farming out responses slower and more complex.  A few participants pointed to mail delays as a contributor to survey non-completion.  On the right of the screen is a participant quote noting that not only is there a lag in the mail, but he is also only sporadically getting work mail.  At the same time, we heard of structural changes to companies being driven by changing economic conditions that may be impacting response.


Business

  • Restructuring and layoffs 

  • Staff turnover 

“For the most part, [the delinquent surveys are] due to COVID and everything -- we've had departments let go and it's increased workload.  Things have fallen behind.”

“We had some layoffs and organizational changes -- they don't have the resources and they're busy, so it is so much harder to ask for the data.”

“We had a lot of restructuring over the last year, so some of the surveys we respond to, they're not even up to date given our new structure.  It's confusing for us to try to find the information.”

But what do I mean by changing economic conditions influencing response?  This phenomenon is often expressed as institutional or structural change at the business that impedes response.  

We found that a number of respondents pointed to structural change at their companies as keeping them from completing.  By structural change, we mean changes to the way the company does business that have negatively impacted its ability to complete the survey.  In these cases, we do not mean issues of time or interest, but rather systemic issues related to a reconfiguration that are impacting response.  Some of this goes hand-in-hand with COVID-19, as some changes are a result of the impact of the pandemic, but we also recognize that businesses are dynamic and prone to turnover and restructuring.

The first quote ties the business structural changes to the pandemic, as the structural changes are attributed “to COVID and everything.” Note here the respondent mentions not having time – we have a tendency to think this is an indicator of salience, but with the additional information, we can see that the lack of time is actually a lack of resources.  The second quote states a simple reality: with layoffs and organizational changes, surveys in particular fall farther and farther down the priority list for the remaining staff.  

The third quote, though, has less to do with a lack of resources – that is, staff time – and more to do with a change in the approach to completing the survey, mainly, that the surveys do not match the new structure.  Again, here, we might think that there is an impediment to the data because of lack of salience, but in actuality, it is a change in the organization leading to a mismatch with the survey instrument that is impeding response.


Data Dispersion


Internal Factors

Things we can change….

Now, let’s turn our attention to the internal influences of nonresponse.  Remember that this includes things like response mode, contact strategies and sponsor identification, actual and perceived burden, and the nature of the data being requested. These factors are typically related to survey design.


Instrument Misalignment

  • “For some reason there are a number of locations [listed on our survey] that have been closed for several years, so long that our systems don't have any info on them.  Of our [more than 150] locations listed, probably 50 to 60 [of them] are no longer active.”  

  • “If the questions are very particular, asking in a way we don't report on, I can't access those data easily, so I need someone from tech to build a report to give us that information.  We might not break our reporting down to the level that you are asking… and IT has a bunch going on [integrating] new companies.” 


Unit Misalignment

“When we received the notification [about the annual surveys], they mixed…the enterprise and incorporation in one survey.  I was in contact, and they fixed something in the system to split the two, but we received a notification this year that they were combined again.”

“One [business unit] has the same name as the wider company name, so we end up filling out [the survey] more than once.  We are getting duplicate requests, and we've called in to have it changed, but I have no idea if that is fixed.”

In addition to instrument misalignments, we noticed unit misalignments.  This gets to the broader issue of the ‘unit problem’ in establishment research – the space between how a company defines its disparate parts and how those parts translate into response and statistical units.  In the first quote, the respondent notes that two business units – what they call the ‘enterprise’ and the ‘incorporation’ – are treated by the Census Bureau as a single unit, while the business itself sees them as distinct.  In order to provide the data requested at the level of granularity requested, this business needs these two units separated.  In the second quote, we see the opposite issue: in this case, the Census Bureau sees two business units, but the company itself sees these two units as synonymous, and thus, the requests are deemed duplicative.  It is the same underlying issue, with the same negative impact on response.


Instrument Topic(s)

“Mixing the questions on the survey makes life a little harder.  When we get a mixed survey, there's not going to be a single person with access to the data, so I'm usually coordinating with multiple departments to get the information…I have some of the other data ready, but I can't submit [the survey] until I get these [requested] data [back].  It is more helpful to make the surveys about each type of information you're looking for, so we wouldn't have to wait for the rest of the data to come in.”

In addition to misalignments, however, an interesting finding came out of our interviewing:  mixing topics within an instrument can be problematic.  Many respondents noted that if the requests are “purely financial” – income and expenses – they can usually pull the data independently.  However, when ‘non-financial’ data are requested – including payroll, assets, inventory, and others – this is where they run into the dispersion issue.  Note that for these respondents, there are often consolidated figures available or they have limited access to disaggregated data, but only for particular topics; it is when a survey covers multiple topics that they run into issues.


Communication Challenges

    • “Yeah - I received the [survey].  I was under the assumption - it was three different ones, and I assumed that I submitted them fully.  The annual wholesale one - that's not me.  That's another department.” 

Finally, we have an example on screen now of communication challenges as a barrier to survey completion.  This quote highlights an instance where our survey invitations are sometimes being directed to the wrong person.  The quote outlines the space between the Retail and Wholesale reporting for one company.  We are sending requests to the right company, but the wrong person – in this case, the respondent is deputized to respond to surveys pertaining to the business’ retail activities, but not their wholesale activities.  


Fisher Revisited

External

  • Environmental 

  • Business 

  • Respondent 

Internal

  • Mode 

  • Marketing 

  • Burden 

  • Request 

So, what are we to make of all of this?  First, let’s revisit Fisher et al.  We note in our interviews the same or similar external factors impacting nonresponse.  The survey landscape is particularly complex right now with the lingering impact of COVID-19 and other issues.  Businesses are still as dynamic – or more so – as they’ve ever been.  And, respondent characteristics – particularly those big three: authority, capacity, and motivation – are still drivers of response.  We find this schema to hold up.  As for the internal factors, we agree with mode, marketing, burden, and the request, (click) but add misalignments in content, unit, and contact as additional drivers of nonresponse.

This work is exploratory – we are undergoing a major redesign of our current annual surveys, and identifying the barriers to response is a first step toward mitigation.  We will continue to refine our collection approach to address the issues identified here.  


Thanks
