2018 ACS Mail Materials Test Study Plan


American Community Survey Methods Panel Tests


OMB: 0607-0936



American Community Survey Research and Evaluation Program

June 14, 2018



ACS Research & Evaluation Analysis Plan (REAP)


2018 ACS Mail Materials Test


Work Request ID: RS18-6-0715




Dorothy Barth

DSSD

Author


Agnes Kee

ACSO

Project Manager


Jennifer Ortman, ACSO

Anthony Tersine, DSSD

Division Authority




[Census Bureau logo]

REAP Revision Log

Version | Date      | Description                                                                                                            | Author
--------|-----------|------------------------------------------------------------------------------------------------------------------------|--------------
1.0     | 3/20/18   | Initial Draft for Feedback                                                                                             | Dorothy Barth
2.0     | 4/09/18   | Second Draft for Approval                                                                                              | Dorothy Barth
OMB     | 4/26/2018 | Draft for OMB; includes updates from Approver comments and newly revised paper questionnaire                           | Dorothy Barth
OMB     | 6/14/2018 | New OMB draft; includes two new treatments that will test the difference in bi-fold vs. tri-fold pressure seal mailers | Dorothy Barth


TABLE OF CONTENTS

1. Introduction
2. Background
3. Literature Review
4. Research Questions and Methodology
   4.1 Experimental Design
       4.1.1 Research Questions
   4.2 Analysis Metrics
       4.2.1 Unit Response Analysis
       4.2.2 Item Response Analysis
       4.2.3 Cost Analysis
       4.2.4 Response Reliability Analysis
   4.3 Research Question Analysis
       4.3.1 Removal of Materials in the Mailings
       4.3.2 Redesign of the Questionnaire Cover
       4.3.3 Wording and Design Modifications
       4.3.4 Pressure Seal Mailer Design


  1. Introduction

The U.S. Census Bureau continually evaluates how the American Community Survey (ACS) mail materials and methodology might be further refined to increase survey participation and reduce survey costs, while also addressing concerns raised by stakeholders and potential respondents that the ACS is burdensome and intrusive. Some potential respondents and stakeholders view the messages about the mandatory nature of the ACS that appear in many of the mail materials (such as, "Your response is required by law.") as overbearing; those concerns led to previous tests aimed at reducing them.

Previous research on mandatory messaging demonstrated that removing mandatory messages from the outgoing envelopes causes a decline in self-response rates (Barth, 2016). Additionally, softening messages in the letters and postcards also causes a decline in self-response return rates and, in some experimental treatments, a decline in overall response rates after Computer-Assisted Personal Interviews (CAPI) (Oliver et al., 2016; Longsine et al., forthcoming). Implementing these changes, either the removal or softening of language about the mandatory nature of the survey, would lead to increases in data collection costs, reductions to the quality of the estimates produced from the survey, or both.



As part of the 2015 Summer Mandatory Messaging Test, in addition to experimental treatments aimed at softened mandatory messages, one experimental treatment was tested that strengthened mandatory messages (typically by highlighting them in bold text) along with other design changes.1 This treatment outperformed the control materials at the time, showing a statistically significant increase in self-response rates prior to Computer-Assisted Telephone Interviews (CATI) of 3.6 percentage points (Oliver et al., 2016).



The purpose of this test is to continue to explore changes to mandatory messaging in the mail materials that would help address respondent concerns while at least maintaining, if not improving, self-response rates. To that end, we will incorporate elements used in past tests, as well as aspects of data collection that have changed in production since those tests, into the design of this test.

  2. Background

This section presents information on the current ACS data collection strategy so readers can understand how this experiment uses and modifies the current approach.

To encourage self-response in the ACS, the Census Bureau sends up to five mailings to a sample address. The first mailing (Initial Mailing Package) is sent to all mailable addresses in the sample. It includes an invitation to participate in the ACS online and states that a paper questionnaire will be sent in a few weeks to those unable to respond online. The mailing includes a Frequently Asked Questions (FAQ) brochure, a multilingual brochure, and an instruction card. About seven days later, the same addresses are sent a second mailing (Reminder Letter), which repeats the instructions to either respond online, wait for a paper questionnaire, or call with questions.2

Responding addresses are removed from the address file after the second mailing to create a new mailing universe of nonresponders. The third mailing (Paper Questionnaire Package) includes a letter, a paper questionnaire, a business reply envelope, an FAQ brochure, and an instruction card. The enclosed letter repeats the instructions for responding online, provides the telephone questionnaire assistance number, and introduces a new response option: the paper questionnaire. About four days later, these addresses are sent a fourth mailing (Reminder Postcard), which encourages them to respond.


After the fourth mailing, responding addresses are again removed from the address file to create a new mailing universe of nonresponders. The remaining sample addresses are sent the Additional Reminder (fifth mailing) as a last attempt to collect a self-response.3 Two to three weeks later, responding addresses are removed to create the universe of addresses eligible for the CAPI nonresponse followup operation.4 Of this universe, a subsample is chosen to be included in the CAPI operation. Field representatives visit addresses chosen for this operation to conduct in-person interviews.5

Additional information about the ACS can be found in the ACS Design and Methodology Report (U.S. Census Bureau, 2014).

  3. Literature Review

In an effort to increase self-response, between October 2013 and November 2014 the U.S. Census Bureau collaborated with Reingold, Inc. to research and propose revisions to design elements and messages in the ACS mailing materials (U.S. Census Bureau, 2015).


The high-level recommendations from the report were:

  • Emphasize the Census brand in ACS mail materials.

  • Use visual design principles to draw attention to key messages and help respondents navigate through ACS materials with greater ease.

  • Use deadline-oriented messages to attract attention and create a sense of urgency.

  • Prioritize an official “governmental” appearance over a visually rich “marketing” approach.

  • Emphasize effective “mandatory” messaging.

  • Demonstrate benefits of ACS participation to local communities.

  • Draw a clearer connection between questions with sensitive topics and real-world applications and benefits of the information provided by respondents’ answers.

  • Streamline mail packages and individual materials.

Based on these and other recommendations, the Census Bureau conducted field tests to improve the ACS mail materials and messaging with the objectives of addressing respondent concerns about the perceived intrusiveness of the survey, improving self-response rates, and reducing survey costs. Some of the findings from the tests have been incorporated in the production mail materials while other features require further testing. The design of the 2018 Mail Materials Test will incorporate both the new features and the features that require more testing.

The field tests that contributed to the research leading up to this test are:

  • 2015 Replacement Mail Questionnaire Package Test: conducted using the March 2015 panel to examine ways to reduce the complexity of this package by omitting some of its contents (Clark, 2015a)

  • 2015 Mail Contact Strategy Modification Test: conducted using the April 2015 panel to examine ways to streamline the mail materials by eliminating a pre-notice and sending the initial mailing earlier, replacing the reminder postcard with a letter highlighting the internet user ID, and other modifications to the mailings (Clark, 2015b)6

  • 2015 Envelope Mandatory Messaging Test: conducted using the May 2015 panel to study the impact of removing mandatory messages from the envelopes (Barth, 2015)

  • 2015 Summer Mandatory Messaging Test: conducted using the September 2015 panel to study the impact of removing or modifying (both softening and strengthening) the mandatory messages in the mail materials and updating the visual design of the materials (Oliver et al., 2016)

  • “Why We Ask” Mail Package Insert Test: conducted using the November 2015 panel to study the impact of including a flyer in the paper questionnaire mailing package explaining why the ACS asks the questions that it does, in response to concerns about the intrusiveness of some of the ACS questions (Heimel et al., 2016)

  • 2017 Pressure Seal Mailing Materials Test: conducted using the May 2017 panel to study the impact of replacing reminder letters and postcards with pressure seal mailers (Risley et al., 2018)

  • 2017 Mail Design Test: conducted using the August 2017 panel to study the impact of new messaging and an updated look-and-feel to increase respondent engagement and self-response while softening the tone of the mandatory requirement of the survey (Longsine et al., forthcoming)


  4. Research Questions and Methodology

    4.1 Experimental Design

This test will be conducted using the September 2018 ACS production sample. The monthly ACS production sample consists of approximately 295,000 housing unit addresses and is divided into 24 nationally representative groups (referred to as methods panel groups) of approximately 12,000 addresses each. Each treatment in this test will use two randomly assigned methods panel groups (approximately 24,000 mailing addresses per treatment). The experimental design of this test includes seven treatments. The remaining ten methods panel groups not selected for this experiment will receive production ACS materials.7
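The group-level randomization described above can be sketched as follows: two of the 24 methods panel groups go to each of the seven treatments, and the remaining ten receive production materials. The function name, seed, and labels are hypothetical; the actual ACS assignment procedure is not specified in this plan.

```python
import random

def assign_methods_panel_groups(n_groups=24, n_treatments=7,
                                groups_per_treatment=2, seed=2018):
    """Randomly assign methods panel groups to treatments (illustrative only)."""
    rng = random.Random(seed)
    groups = list(range(1, n_groups + 1))
    rng.shuffle(groups)

    assignments = {}
    for t in range(1, n_treatments + 1):
        # Each treatment takes the next `groups_per_treatment` shuffled groups.
        start = (t - 1) * groups_per_treatment
        for g in groups[start:start + groups_per_treatment]:
            assignments[g] = f"Treatment {t}"
    # Remaining groups receive production ACS materials.
    for g in groups[n_treatments * groups_per_treatment:]:
        assignments[g] = "Production"
    return assignments
```

Because each methods panel group is nationally representative, any such random split yields treatment samples of roughly 24,000 addresses each.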


In the 2015 Summer Mandatory Messaging Test, one treatment, which contained stronger mandatory messaging and other revised design features, outperformed the other treatments. This test will use that treatment, called the Revised Design in the 2015 test, and make adjustments to reflect updates recently made in production and findings from further testing.


This test will also have treatments that soften the messaging regarding the mandatory nature of the survey. Prior tests that have used the softened language have not included the mandatory language on the outside of the mailing envelopes. In the 2015 Summer Mandatory Messaging Test, the treatments with softened language had lower response rates than the treatment that emphasized or strengthened the mandatory messaging language. It is difficult to distinguish whether the decline in response was because of the messages inside the mailings or because the envelopes were not opened due to a lack of mandatory messaging on the outside. This test will include treatments with softened language for the mandatory nature of the survey on the letters, with the mandatory language included on the outside of the envelopes.


The 2017 Pressure Seal Mailing Materials Test indicated that the ACS program would benefit from replacing the second and fifth mailings with pressure seal mailers. In that test, the second and fifth mailings were replaced with tri-fold pressure seal mailers. Due to efficiencies in processing and cost, the National Processing Center recommended using bi-fold pressure seal mailers instead. In this test, the production materials and all but one experimental treatment will use bi-fold pressure seal mailers. We are including a treatment that uses tri-fold mailers in the second and fifth mailings to test whether a change to the folding process, which affects the size of the mailer, will have an effect on self-response.


Additionally, the 2017 Mail Design Test showed that using a redesigned cover for the questionnaire may result in an increase in response. Some of the results of that test were confounded; thus, we are retesting two similar versions of a newly redesigned questionnaire cover in some of the treatments in this test. The 2017 Mail Design Test also replaced Department of Commerce logos on mailing materials with a Census Bureau logo. This test will include that logo in the redesigned materials. Finally, other wording changes tested in the 2017 Mail Design Test, and revisions made in production, will be implemented in this test.


Appendix B provides an overview of the experimental treatments. Below is a brief description of the experimental changes in each treatment.

Treatment 1 (Modified Production Materials)

This treatment will test the impact of removing some of the materials from the Initial Mailing package and Paper Questionnaire package, but will not modify the visual design or mandatory language in the materials. As the primary purpose of this test is to evaluate design changes and wording changes, we needed a treatment to control for the removal of the FAQ brochure, which is currently included in the production mailings but will be excluded from all of the experimental treatments. The removal of the FAQ brochure from the Initial Mailing package and the Paper Questionnaire package necessitated changes to the letters in those mailings. By adding the FAQ information to the back of each letter, we were also able to remove the cybersecurity language from the front of each letter and place the wording on the back with the FAQ information. The paper questionnaire will be the same as the production questionnaire. Unlike production, the internet instruction card will also be removed from the Paper Questionnaire package to declutter and streamline the materials in that mailing. The instruction card will be removed from all of the experimental treatments.


Treatment 2 (Emphasized Mandatory with Revised Questionnaire)

Treatment 2 is the treatment that places the strongest emphasis on the mandatory nature of the survey. A stronger emphasis on the mandatory language will be carried out by using bold-faced font or by using the “required by law” wording in more prominent places throughout the letters (such as at the beginning of a paragraph instead of in the middle). This is the only experimental treatment that will add “Your Response Is Required By Law” to the front (address) side of the Reminder Postcard and will use bold font on that postcard to emphasize that an interviewer may be contacting the recipient if we do not receive a response. This treatment, like some other treatments, will also emphasize the urgency of a response by adding phrases like “Open Immediately” and “Final Notice Respond Now” to the outside of the mailings. This treatment will also include visual design changes to the mail materials, and it is the treatment that is most similar to the Revised Design treatment from the 2015 Summer Mandatory Messaging Test.


This treatment will use a newly revised questionnaire. The revised design cover will include new wording and icons to draw attention to the response options. It will also include a new paragraph about the mandatory nature of the survey.


“The American Community Survey is conducted by the U.S. Census Bureau. This survey is one of only a few surveys for which all recipients are required by law to respond. The U.S. Census Bureau is required by law to protect your information.”


To make room for this new wording, the paragraph in Spanish (present in production) will be omitted from the new cover design. Only one sentence in Spanish will remain, referring to the toll-free number for Spanish speakers to call. To declutter the right side of the cover (where the first questions of the survey appear) and make it more visually appealing, the item about the date of response will be moved to the first page of the revised questionnaire.


As Treatment 2 is the treatment that most strongly emphasizes the mandatory nature of the survey, just above the new paragraph (seen above) there will be a heading in bold font that reads, “Your response is required by law.”


Treatment 3 (De-emphasized Mandatory with Revised Questionnaire)

Treatment 3 will present a “softened” or de-emphasized version of the mandatory nature of the survey within the text of the mailings, but will maintain the stronger language on the outside of the mailings and the same revised design elements in the letters as Treatment 2. The Reminder Letter will de-emphasize the mandatory language more than the production materials. In other mailings, the mandatory language will either be similar to production or will appear in regular font instead of bold-faced font. Except for the Reminder Postcard, Treatment 3 will have all of the same messaging on the outside of the mailings as Treatment 2. The paper questionnaire will be revised in the same way as the one in Treatment 2, except that it will NOT include the heading “Your response is required by law.”


Treatment 4 (“Softer”/Eliminated Mandatory with Revised Questionnaire)

Treatment 4 will “soften” the mandatory language in the first two mailings, similar to Treatment 3, but it will eliminate the mandatory language in the other mailings. The mandatory language will remain on the outside of the mailings; however, “Open Immediately” will not be used on the outside of the envelopes. While “Open Immediately” expresses urgency it is also commonly used on junk mail, so this treatment will omit the phrase. Although direct comparisons can only be made to test the effectiveness of omitting “Open Immediately” from the Initial Mailing (because of design changes made to Treatment 4 after that mailing), removing the wording remains an intentional part of the holistic “softening” of language in this treatment. Treatment 4 also will have the same revised design elements in the letters as Treatments 2 and 3. The paper questionnaire will be the same as the one used in Treatment 3.


Treatment 5 (De-emphasized Mandatory with Production Questionnaire)

Treatment 5 will be identical to Treatment 3 with one exception: the paper questionnaire. The questionnaire in Treatment 5 will have the current design used in production, while Treatment 3 will have a revised version of the questionnaire. This treatment was designed specifically to test the new revisions made to the front page of the paper questionnaire (except the mandatory bold-faced heading).


Treatment 6 (Production with tri-fold Pressure Seal Letters)

Treatment 6 will have materials identical to production except that the 2nd and 5th mailings will be tri-fold pressure seal letters instead of bi-fold pressure seal letters. This treatment is designed to test using a tri-fold pressure seal mailer instead of a bi-fold.


Treatment 7 (Production, Sorted Separately)

Treatment 7 will have materials identical to production materials, including bi-fold pressure seal letters for the 2nd and 5th mailings. All mailings will be sorted separately from production to ensure a sample size similar to Treatment 6 for comparison purposes. Previous ACS testing has found that mailings that go to more addresses arrive more quickly due to U.S. Postal Service efficiencies in processing.


      4.1.1 Research Questions

    1. What is the impact on self-response return rates of removing materials from the Initial Mailing (FAQ Brochure) and the Paper Questionnaire Mailing (FAQ Brochure and the Instruction Card) and modifying the letter in the mailing?

    2. What is the impact on self-response return rates of using a redesigned front cover of the questionnaire? What is the impact on item nonresponse rates for the questions on the front cover of the questionnaire and the question that was moved to the first page of the questionnaire?

    3. What is the impact on self-response return rates of modifying the design and wording of the mail materials? What is the impact of not including the phrase “Open Immediately” on the Initial Mailing envelope?

    4. What is the impact on self-response return rates of using bi-fold pressure seal mailers instead of tri-fold mailers for the 2nd and 5th mailings?

    5. What would be the cost impact, relative to current production, of implementing each experimental treatment into a full ACS production year? What would be the impact on the reliability of the ACS estimates?

    4.2 Analysis Metrics

All self-response analyses, except for the cost analysis, will be weighted using the ACS base sampling weight (the inverse of the probability of selection). The CAPI response analysis will include a CAPI subsampling factor that will be multiplied by the base weight. The sample size will be able to detect differences of approximately 1.25 percentage points between the self-response return rates of the experimental treatments (with 80 percent power and α = 0.1). Detectable differences for the analysis of item-level data (such as item nonresponse rates) vary depending on the item, with housing-level items having minimum detectable differences up to 1.6 percentage points. We will use a significance level of α = 0.1 when determining significant differences between treatments. For analyses that involve multiple comparisons, we will control the familywise Type I error rate using the Hochberg method (Hochberg, 1988).
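The Hochberg step-up procedure cited above can be sketched as follows: sort the family's p-values, then reject the k smallest for the largest k such that p_(k) ≤ α/(m − k + 1). The function name and inputs are hypothetical; this illustrates the published method (Hochberg, 1988), not the production analysis code.

```python
def hochberg_reject(p_values, alpha=0.1):
    """Hochberg (1988) step-up procedure controlling the familywise error rate.

    Returns the indices (into p_values) of the rejected hypotheses.
    """
    m = len(p_values)
    # Sort p-values ascending, remembering their original positions.
    order = sorted(range(m), key=lambda i: p_values[i])
    # Step up from the largest p-value: find the largest k (1-based) with
    # p_(k) <= alpha / (m - k + 1), then reject the k smallest p-values.
    for k in range(m, 0, -1):
        if p_values[order[k - 1]] <= alpha / (m - k + 1):
            return sorted(order[:k])
    return []
```

For example, with α = 0.1 the family (0.01, 0.02, 0.08, 0.09) is rejected in full, since the largest p-value already satisfies 0.09 ≤ 0.1/1.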


      4.2.1 Unit Response Analysis

To assess the effect of the experimental changes on self-response, we will calculate the self-response return rates at selected points in time in the data collection cycle. The selected points in time reflect the dates of additional mailings or the end of the self-response data collection period. An increase in self-response presents a cost savings for each subsequent phase of the mailing process by decreasing the number of mailing pieces that need to be sent out. A significant increase in self-response before CAPI decreases the number of costly interviews that need to be conducted. Calculating the return rates at different points in the data collection cycle gives us an idea of how the experimental treatments would affect operational and mailing costs if they were implemented into a full ACS production year.


If there is a significant decrease in response (both self-response and CAPI) by the end of the data collection period, then there may be a negative effect on the reliability of the estimates produced by the data collected. To assess whether the experimental changes affected response in this manner we will calculate final response rates and how each response mode contributes to the total final response.


        4.2.1.1 Self-Response Return Rates

To evaluate the effectiveness of the experimental treatments, we will calculate self-response return rates. The rates will be calculated for total self-response and separately for internet and mail response. For the comparisons of return rates by mode, the small number of returns obtained from Telephone Questionnaire Assistance (TQA) will be classified as mail returns.

The return rates will be calculated using the following formula:


Self-Response Return Rate =

    Number of mailable and deliverable sample addresses that either
    provided a non-blank8 return by mail or TQA, or a complete or
    sufficient partial9 response by internet
    -----------------------------------------------------------------  × 100
    Total number of mailable and deliverable sample addresses10

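A minimal sketch of computing this weighted return rate, assuming a simplified address record with hypothetical field names (actual ACS processing uses production data files and edit rules):

```python
def weighted_return_rate(addresses):
    """Weighted self-response return rate, per the formula above.

    Each address is a dict with hypothetical fields:
      'base_weight'   -- ACS base sampling weight (inverse selection probability)
      'deliverable'   -- True if the address is mailable and deliverable
      'self_response' -- True for a non-blank mail/TQA return or a complete
                         or sufficient partial internet response
    """
    denominator = sum(a['base_weight'] for a in addresses if a['deliverable'])
    numerator = sum(a['base_weight'] for a in addresses
                    if a['deliverable'] and a['self_response'])
    return 100.0 * numerator / denominator
```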

        4.2.1.2 Final Response Rates

To evaluate the effect of the experimental treatments on overall response to the survey, we will calculate final overall response rates and how each mode contributed to the overall final response rate.

The final response rates will be calculated using the following formula:

Final Response Rate =

    Number of eligible sample addresses that either provided a
    non-blank9 return by mail or TQA, a complete or sufficient
    partial10 response by internet, or a complete CAPI interview
    -----------------------------------------------------------------  × 100
    Total number of sample addresses eligible to reply to the survey
    and not sampled out of CAPI


Self-responses will be weighted with the initial base weights, and CAPI responses and nonresponses will be weighted by multiplying a subsampling factor by the initial base weights.
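The weighting just described can be sketched as follows, with hypothetical field names; self-response cases carry a CAPI subsampling factor of 1.0 so that only CAPI cases receive the extra factor. This is a sketch under stated assumptions, not production code.

```python
def final_response_rate(cases):
    """Weighted final response rate, per the formula above.

    Each case is a dict with hypothetical fields:
      'base_weight' -- ACS base sampling weight
      'capi_factor' -- CAPI subsampling factor (1.0 for self-response cases)
      'eligible'    -- eligible to reply and not sampled out of CAPI
      'responded'   -- non-blank mail/TQA return, sufficient internet
                       response, or complete CAPI interview
    """
    def weight(c):
        # CAPI responses and nonresponses get base weight times the
        # subsampling factor; self-responses keep the base weight.
        return c['base_weight'] * c.get('capi_factor', 1.0)

    denominator = sum(weight(c) for c in cases if c['eligible'])
    numerator = sum(weight(c) for c in cases
                    if c['eligible'] and c['responded'])
    return 100.0 * numerator / denominator
```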

      4.2.2 Item Response Analysis

We will also calculate item nonresponse rates to assess the impact of the revised questionnaire design on response to the survey questions. The formula for the item nonresponse rate is:

Item Nonresponse Rate =

    Number of nonresponses to item of interest
    ------------------------------------------  × 100
    Universe for item of interest


We will calculate item nonresponse for the items on the front of the questionnaire to assess whether the new design affects the response to those items: Last Name, First Name, Middle Initial, Phone Number, and the number of people living or staying at the address. We will also assess item nonresponse for the date of response question. This question is on the front cover of the questionnaire in production but is on the first page of the redesigned questionnaire.


      4.2.3 Cost Analysis

To assess the impact on cost of implementing any of the experimental treatments into a full production year, we will calculate unweighted workloads and check-in rates, along with their associated standard errors, as inputs to the cost analysis. We will calculate and compare self-response for each mailing phase. Significant increases or decreases in self-response will affect the workloads of the subsequent mailing phases. We will compare each experimental treatment to Treatment 7 (production materials) before the file creation for the second mailout phase (Paper Questionnaire Package and Reminder Postcard) and before the file creation of the third mailout phase (the Additional Reminder Postcard or Pressure Seal).

In addition to changes in workloads for each mailing, the cost analysis will take into account any differences associated with the mailing materials including printing, assembly, and postage costs. Any cost differences associated with CAPI will account for any significant changes in the CAPI workload due to a significant increase or decrease in self-response before CAPI for each experimental treatment compared to production.

      4.2.4 Response Reliability Analysis

Significant differences in total response, as well as the distribution of mode of response, have the potential to impact the reliability of the ACS estimates. We will calculate final response rates and how each mode of response contributes to the final response (as described in Section 4.2.1.2). We will also take into account self-response return rates before CAPI, as the amount of subsampling done in CAPI will also affect data reliability. We will calculate and compare rates between Treatment 7 (Production, Sorted Separately) and each experimental treatment using two-tailed hypothesis tests.


To assess the potential impact of each treatment on the reliability of the estimates produced from the response data, when there are significant differences between treatments we will calculate a reliability-of-the-estimates metric. The metric, a ratio of the sum of the squared weights for the interviews in an experimental treatment compared to the control, will estimate the overall impact on the reliability of the estimates rather than the impact on specific characteristics. The weights will then be adjusted to account for significant increases or decreases in nonresponse, as well as the shift in mode distribution due to significant differences in self-response. Additionally, to assess impacts on costs, we will explore alternative sampling and subsampling approaches that take into account significant differences in self-response.
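A simplified sketch of the sum-of-squared-weights ratio described above; the function and inputs are hypothetical, and the actual metric also reflects the weight adjustments for nonresponse and mode-distribution shifts.

```python
def reliability_ratio(treatment_weights, control_weights):
    """Ratio of sums of squared interview weights: treatment vs. control.

    Values above 1.0 suggest reduced reliability (higher variance) for the
    experimental treatment relative to the control, all else being equal.
    """
    sum_sq = lambda weights: sum(w * w for w in weights)
    return sum_sq(treatment_weights) / sum_sq(control_weights)
```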


    4.3 Research Question Analysis

      4.3.1 Removal of Materials in the Mailings

What is the impact on self-response return rates of removing materials from the Initial Mailing (FAQ brochure) and the paper questionnaire mailing (FAQ brochure and the Instruction Card) and modifying the letter in the mailing?

This analysis will assess the impact on self-response of removing the FAQ brochure from the Initial Mailing and removing the FAQ brochure and the Instruction Card from the Paper Questionnaire package. It will also assess changes to the letters such as adding some FAQ information to the back of the letters and moving the language about cybersecurity from the front of the letters to the back. We will calculate and compare self-response return rates of the Initial Mailing universe for Treatment 7 (T7) and Treatment 1 (T1). Since an increase in self-response will decrease the cost of subsequent phases of the data collection cycle targeting nonresponders, we will compare self-response return rates just before the Paper Questionnaire package mailing, before the fifth mailing, and before the start of CAPI. We will compare return rates by response mode and overall (modes combined). We will make each comparison using a two-tailed hypothesis test. Each null hypothesis will be H0: T1 = T7 and each alternative hypothesis will be HA: T1 ≠ T7.
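The two-tailed comparisons used throughout this section can be sketched generically. Given two weighted return rates and their standard errors (which in ACS practice would come from a variance estimator appropriate to the weighted design; the values here are assumed inputs), a normal-approximation test statistic and p-value are:

```python
import math

def two_tailed_z_test(rate1, se1, rate2, se2):
    """Two-tailed normal-approximation test of H0: rate1 == rate2.

    Returns the z statistic and two-sided p-value, to be compared
    against the significance level (alpha = 0.1 in this plan).
    """
    z = (rate1 - rate2) / math.sqrt(se1 ** 2 + se2 ** 2)
    # Standard normal CDF expressed via the error function.
    cdf = 0.5 * (1.0 + math.erf(abs(z) / math.sqrt(2.0)))
    p_value = 2.0 * (1.0 - cdf)
    return z, p_value
```

For example, rates of 52.0 and 50.0 percent with standard errors of 1.0 each give z ≈ 1.41 and p ≈ 0.157, which would not be significant at α = 0.1.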


      4.3.2 Redesign of the Questionnaire Cover

What is the impact on self-response return rates of using a redesigned front cover of the questionnaire?

To assess the impact on self-response of redesigning the front cover of the questionnaire, we will calculate and compare the self-response return rates of the addresses that are mailed the Paper Questionnaire Package for Treatment 3 (T3) and Treatment 5 (T5).11 We will compare self-response return rates before the fifth mailing and before the start of CAPI. We will compare return rates by response mode (internet and mail separately) and overall (modes combined). We will make each comparison using a two-tailed hypothesis test. Each null hypothesis will be H0: T3 = T5 and each alternative hypothesis will be HA: T3 ≠ T5.


What is the impact on item nonresponse rates for the questions on the front cover of the questionnaire and the question that was moved to the first page of the questionnaire?

To assess the impact on item response of redesigning the front cover of the questionnaire, we will calculate and compare item nonresponse rates for each item on the front cover. We will also calculate and compare section completion rates (all items on the front cover combined) to determine whether respondents are skipping the entire front cover of the questionnaire because of the redesigned cover. We will also calculate item nonresponse rates for the date of response question. This question is on the front cover of the production questionnaire; the redesigned questionnaire moves it to the first fill-in item on the following page. We want to see whether moving the respondent-filled response date has any impact on response to that item.


The items we will use for analysis are the respondent’s first name (RFN), the respondent’s last name (RLN), the respondent’s telephone number (RTEL), the number of persons living at the address (RPER), and the date of response (RDATE). This analysis will be done on all mail responses collected until the end of the monthly panel closeout. We will compare these rates between Treatments 3 and 5 using two-tailed hypothesis tests at the α = 0.1 level, with H0: T3 = T5 and HA: T3 ≠ T5.


Note: The expectation for the inclusion of the header “Your response is required by law.” on Treatment 2’s questionnaire is that it will increase response rates, based on past testing of “mandatory” language on the ACS mail materials (Barth, 2015; Oliver et al., 2016). A decision was made to not add another treatment just to test this design change feature. The comparison between T3 and T5 will be able to test the effect of all other elements that will be changed on the questionnaire cover.


      4.3.3 Wording and Design Modifications

What is the impact on self-response return rates of modifying the design and wording of the mail materials?

To assess the impact on self-response (unit response) of wording and design modifications of the mail materials, we will calculate self-response return rates of the Initial Mailing universe for Treatments 1, 2, 3, and 4. We will compare Treatment 1 to the other treatments. Comparing the modified production treatment (Treatment 1) to each experimental treatment will allow us to best see how the design elements in each experimental treatment (other than omitting the FAQ brochure and modifying the Reminder Letter) affect self-response as compared to current production materials. We will compare self-response return rates before the Paper Questionnaire Package mailing, before the fifth mailing, and before the start of CAPI. We will compare return rates by response mode (internet and mail separately) and overall (modes combined). We will also calculate return rates for High Response Areas and Low Response Areas (as defined on the current National Planning Database) to see if the experimental treatments affect these areas differently. We will make each comparison using a two-tailed hypothesis test.


To assess how the design elements of each experimental treatment compare with the modified production treatment, the hypotheses will be H0: T1 = T2 and HA: T1 ≠ T2; H0: T1 = T3 and HA: T1 ≠ T3; and H0: T1 = T4 and HA: T1 ≠ T4.
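These three comparisons against Treatment 1 form a family of tests, and the References include Hochberg (1988). The plan does not state here whether a multiple-comparison adjustment will be applied, but as a sketch, Hochberg's step-up procedure is one way the α = 0.1 level could be adjusted across the family; the p-values below are hypothetical.

```python
def hochberg(p_values, alpha=0.10):
    """Hochberg (1988) step-up procedure for a family of tests.

    Compare the largest p-value to alpha, the next largest to alpha/2,
    and so on; once any test rejects, all tests with smaller p-values
    also reject. Returns one reject/accept flag per input p-value."""
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i], reverse=True)
    reject = [False] * m
    for rank, i in enumerate(order):  # rank 0 = largest p-value
        if p_values[i] <= alpha / (rank + 1):
            for j in order[rank:]:    # reject this and all smaller p-values
                reject[j] = True
            break
    return reject

# Hypothetical p-values for T1 vs T2, T1 vs T3, T1 vs T4
flags = hochberg([0.20, 0.04, 0.03])  # -> [False, True, True]
```

Here 0.04 ≤ 0.10/2, so both of the two smallest p-values reject even though 0.20 does not.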


To determine the effects of softening and eliminating the mandatory language, we will compare Treatment 3 (de-emphasized mandatory) and Treatment 4 (softer/eliminated mandatory), with the hypotheses H0: T3 = T4 and HA: T3 ≠ T4.


To determine the effects of strengthening or emphasizing the mandatory language as compared to softening it, we will compare Treatment 2 with the “winner” of T3 vs. T4 (if the difference is only nominal, we will use the treatment with the higher return rates). The hypotheses will be H0: T2 = (T3 or T4) and HA: T2 ≠ (T3 or T4).


What is the impact of not including the phrase “Open Immediately” on the Initial Mailing envelope?


To assess the impact on self-response of not including the phrase “Open Immediately” on the Initial Mailing envelope, we will calculate and compare self-response return rates of the Initial Mailing universe for Treatment 4 against Treatments 3 and 5 combined. We will compare self-response return rates just before the Paper Questionnaire Package mailing. We will only be able to calculate internet and Telephone Questionnaire Assistance (TQA) return rates, since mail response will not yet be an option at that point. We will make the comparison using a two-tailed hypothesis test. The hypotheses will be H0: (T3 + T5) = T4 and HA: (T3 + T5) ≠ T4.


      1. Pressure Seal Mailer Design

What is the impact on self-response return rates of using bi-fold pressure seal mailers instead of tri-fold mailers for the 2nd and 5th mailings?

To assess the impact on self-response of using bi-fold pressure seal mailers instead of tri-fold mailers for the 2nd and 5th mailings, we will calculate and compare self-response return rates of the Initial Mailing universe for Treatment 6 and Treatment 7. We will compare self-response return rates before the Paper Questionnaire Package mailing, before the fifth mailing, and before the start of CAPI. We will compare return rates by response mode (internet and mail separately) and overall (modes combined). We will make the comparison using a two-tailed hypothesis test. The hypotheses will be H0: T6 = T7 and HA: T6 ≠ T7.


      1. Cost and Reliability of Estimates

What would be the cost impact, relative to current production, of implementing each experimental treatment into a full ACS production year?



To assess impacts on costs, we will calculate the annual expected cost of implementing each experimental treatment into a full ACS production year and compare it to production costs.

A confidence interval for the cost that accounts for sampling error in the workload estimates will also be calculated. Cost differences will be calculated as described in Section 4.3.3.



What would be the impact on the reliability of the ACS estimates?

If an experimental treatment shows significant differences in response (overall and in how each mode of response contributes to final response), we will assess the impact on the reliability of the estimates for the treatment compared to production (as described in Section 4.3.4).

This analysis will assess the impact of each experimental treatment on the reliability of the estimates and cost under three different scenarios:

  1. Maintaining the current size of the initial sample

  2. Maintaining the current level of reliability of estimates

  3. Maintaining the current level of survey costs


      1. Exploratory Analysis

This section contains analyses that we consider worth exploring but not decisive in determining whether a treatment should be implemented in production. We will report significant findings in the final report.12 This analysis will not include Treatment 2, which has the bold header “Your response is required by law.” just above the paragraph mentioning the mandatory nature of the survey, because no other treatment in the experimental design allows a clean comparison to evaluate that design element.

Does the redesigned front cover of the questionnaire have any impact on form completeness?

One of the changes to the cover of the paper questionnaire is the inclusion of an explanation of the mandatory requirement to respond to the survey. We will examine whether this change, in combination with the other design changes, affects form completeness (the number of items completed on paper questionnaire responses). While this analysis will not be a deciding factor in determining the effectiveness of any treatment, it is worth investigating: if the redesigned questionnaire cover significantly increases or decreases mail response, it would be beneficial to know whether the questionnaire design also affected form completeness.


We will calculate the form completeness rate for mail mode responses using the following formula:

Overall Form Completeness Rate = (Number of questions answered / Number of questions that should have been answered) × 100


We will calculate form completeness rates for Treatments T3 and T5. The comparisons will use the following hypotheses: H0: T3 = T5 and HA: T3 ≠ T5.13
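As an illustration, the rate could be tallied across mail returns as follows. The per-form counts are hypothetical, and we assume the rate aggregates answered and expected items across all returned forms rather than averaging per-form percentages; the plan does not specify which here.

```python
def form_completeness_rate(forms):
    """Overall form completeness rate for a set of mail returns.

    Each form is a (answered, expected) pair: questions answered vs.
    questions that should have been answered given household composition.
    Rate = total answered / total expected * 100."""
    answered = sum(a for a, _ in forms)
    expected = sum(e for _, e in forms)
    return 100.0 * answered / expected

# Hypothetical tallies for a handful of T3 and T5 mail returns
t3 = [(48, 50), (30, 40), (55, 55)]
t5 = [(40, 50), (35, 40), (50, 55)]
rate_t3 = form_completeness_rate(t3)  # -> about 91.7
rate_t5 = form_completeness_rate(t5)
```

The T3 vs. T5 hypothesis test above would then compare these two rates.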

  1. Assumptions and Limitations

    1. Assumptions

  1. A single ACS monthly sample is representative of an entire year (twelve panels) and the entire frame sample, with respect to both response rates and cost, as designed.

  2. A single methods panel group (1/24 of the full monthly sample) is representative of the full monthly sample, as designed.

  3. We assume that there is no difference in mail delivery timing or subsequent response time across samples of similar size using the same postal sort and mailout procedures, because we chose the sample sizes of the experimental treatments with postal procedures in mind.

    1. Limitations

  1. Group quarters and sample housing unit addresses from remote Alaska and Puerto Rico are not included in the sample for the test.

  2. The cost analysis uses estimates to make cost projections. These estimates do not account for monthly variability in production costs such as changes in staffing, production rates, or printing price adjustments.

  3. While the paper questionnaire itself is written in English (a respondent must specifically request a Spanish-language questionnaire), the current front page includes a lengthy Spanish-language paragraph that (paraphrased) says, “Do you need help? If you speak Spanish and you need help completing this questionnaire, call toll-free 1-877-833-5625. You can also complete your interview by phone with an interviewer who speaks Spanish. Or you can respond by internet: https://respond.census.gov/acs”. The Spanish text on the redesigned cover (paraphrased) says, “Do you need help? Call toll-free 1-877-833-5625”. Because the new cover design adds wording not included in the current questionnaire, something from the original design had to be omitted. The majority of Spanish responses to the ACS are received through personal interviews, so we expect that the impact on response of reducing the Spanish-language paragraph to one sentence will be minimal.


  1. Table Shells

Below are samples of tables that will be used in the final report to show results from this test.

Table 1. Sample Table for Self-Response Return Rates

Point in Data Collection Cycle    Treatment X    Treatment Y    Difference    P-Value
Before the Third Mailing             ####           ####           ####         ####
Before the Fifth Mailing             ####           ####           ####         ####
Before CAPI                          ####           ####           ####         ####

Source: U.S. Census Bureau, American Community Survey, 2018 Mail Materials Test

Note: Minor additive discrepancies are due to rounding. Standard errors are in parentheses. An asterisk (*) indicates a statistically significant result. Significance was tested based on a two-tailed t-test (Treatment X ≠ Treatment Y) at the α = 0.1 level.


Table 2. Final Response Rates and Response Distributions by Mode


Mode                Treatment X    Production    Difference    P-Value
Overall Response        ####           ####          ####        ####
Internet                ####           ####          ####        ####
Mail                    ####           ####          ####        ####
CAPI                    ####           ####          ####        ####

Source: U.S. Census Bureau, American Community Survey, 2018 Mail Materials Test

Note: Minor additive discrepancies are due to rounding. Standard errors are in parentheses. An asterisk (*) indicates a statistically significant result. Significance was tested based on a two-tailed t-test (Treatment X ≠ Production) at the α = 0.1 level.


Table 3. Item and Section Nonresponse Rates, Mail Responses –

Treatment X vs Treatment Y

Item                              Treatment X    Treatment Y    Difference    P-Value
Section Nonresponse                   ####           ####           ####        ####
Name                                  ####           ####           ####        ####
Telephone Number                      ####           ####           ####        ####
Number of Persons in Household        ####           ####           ####        ####

Source: U.S. Census Bureau, American Community Survey, 2018 Mail Materials Test

Note: Minor additive discrepancies are due to rounding. Standard errors are in parentheses. An asterisk (*) indicates a statistically significant result. Significance was tested based on a two-tailed t-test (Treatment X ≠ Treatment Y) at the α = 0.1 level.


  1. Potential Changes to ACS

Based on the results of this research, the Census Bureau may consider revising the ACS production mail materials. If a change is made, the decision will be informed by the cost analysis, response metrics, and the reliability of estimates analysis (if applicable) of each experimental treatment.

  1. References

Barth, D. (2015). “2015 Envelope Mandatory Messaging Test,” 2015 American Community Survey Research and Evaluation Report Memorandum Series #ACS16-RER-04. Retrieved on March 12, 2018 from http://www.census.gov/content/dam/Census/library/working-papers/2016/acs/2016_Barth_01.pdf

Clark, S. (2015a). “2015 Replacement Mail Questionnaire Package Test,” 2015 American Community Survey Research and Evaluation Report Memorandum Series #ACS15-RER-18. Retrieved on March 12, 2018 from https://www.census.gov/content/dam/Census/library/working-papers/2015/acs/2015_Clark_02.pdf

Clark, S. (2015b). “2015 Mail Contact Strategy Modification Test,” 2015 American Community Survey Research and Evaluation Report Memorandum Series #ACS15-RER-19. Retrieved on March 12, 2018 from https://www.census.gov/content/dam/Census/library/working-papers/2015/acs/2015_Clark_03.pdf

Heimel, S., Barth, D., & Rabe, M. (2016). “‘Why We Ask’ Mail Package Insert Test,” 2015 American Community Survey Research and Evaluation Report Memorandum Series #ACS16-RER-10. Retrieved on March 12, 2018 from http://www.census.gov/content/dam/Census/library/working-papers/2016/acs/2016_Heimel_01.pdf

Hochberg, Y. (1988). “A Sharper Bonferroni Procedure for Multiple Tests of Significance,” Biometrika, 75(4), 800-802. Retrieved on January 17, 2017 from http://www.jstor.org/stable/2336325

Longsine, L., & Oliver, B. (Forthcoming). “2018 ACS Mail Design Test,” 2017 American Community Survey Research and Evaluation Report Memorandum Series #RS-4-0190.

Oliver, B., Risley, M., & Roberts, A. (2016). “2015 Summer Mandatory Messaging Test,” 2015 American Community Survey Research and Evaluation Report Memorandum Series #ACS16-RER-5-R1. Retrieved on March 12, 2018 from https://www.census.gov/content/dam/Census/library/working-papers/2016/acs/2016_Oliver_01.pdf

Risley, M., & Barth, D. (Forthcoming). “2017 Pressure Seal Mailing Materials Test,” 2017 American Community Survey Research and Evaluation Report Memorandum Series #ACS17-RER-20.

U.S. Census Bureau (2014). “American Community Survey Design and Methodology.” Retrieved on March 12, 2018 from http://www2.census.gov/programs-surveys/acs/methodology/design_and_methodology/acs_design_methodology_ch12_2014.pdf

U.S. Census Bureau (2015). “2014 American Community Survey Messaging and Mail Package Assessment Research: Cumulative Findings Report.” Retrieved on February 10, 2017 from https://www.census.gov/content/dam/Census/library/working-papers/2014/acs/2014_Walker_02.pdf

  Appendix A. 2018 Mailing Descriptions and Schedule for the September 2018 Production Panel

Mailing: Initial Mailing Package (mailed 08/30/18)
Materials: A package containing the following: Introduction Letter, Frequently Asked Questions (FAQ) Brochure, Multi-Lingual Informational Brochure, and Internet Instruction Card. This mailing urges housing units to respond via the internet.

Mailing: Reminder Letter (mailed 09/07/18) [Pressure seal mailer]
Materials: A reminder letter sent to all addresses that were sent the Initial Mailing Package, reiterating the request to respond.

Mailing: Paper Questionnaire Package (mailed 09/20/18)
Materials: A package sent to addresses that have not responded, containing the following: Introduction Letter, Paper Questionnaire, Return Envelope, Internet Instruction Card, and FAQ Brochure.

Mailing: Reminder Postcard (mailed 09/24/18)
Materials: A reminder postcard sent to all addresses that were also sent the Paper Questionnaire Package, reiterating the request to respond.

Mailing: Additional Postcard (mailed 10/12/18) [Pressure seal mailer]
Materials: An additional reminder postcard sent to addresses that have not yet responded and are ineligible for telephone follow-up.





  Appendix B. Description of Experimental Treatments

Treatments compared:
  Current Production
  Treatment 1: Modified Production
  Treatment 2: Emphasized Mandatory with Revised Questionnaire
  Treatment 3: De-emphasized Mandatory with Revised Questionnaire
  Treatment 4: Softer/Eliminated Mandatory with Revised Questionnaire
  Treatment 5: De-emphasized Mandatory with Current Questionnaire
  Treatment 6: Production with Tri-fold Pressure Seal Mailers
  Treatment 7: Production, Sorted Separately

Initial Mailing: Outgoing Envelope
  Production: “Your Response is Required by Law”
  T1: “Your Response is Required by Law”
  T2: “Your Response is Required by Law” and “Open Immediately”
  T3: “Your Response is Required by Law” and “Open Immediately”
  T4: “Your Response is Required by Law”
  T5: “Your Response is Required by Law” and “Open Immediately”
  T6: “Your Response is Required by Law”
  T7: “Your Response is Required by Law”

Initial Mailing: FAQ Brochure / Letter Design / Letter Wording
  Production: FAQ Brochure YES; Current design, no callout box; Current wording
  T1: NO; Current design, no callout box; Current wording with cybersecurity removed from front and FAQ information added to back
  T2: NO; Updated design with callout box; Emphasized mandatory, FAQ information added to back
  T3: NO; Updated design with callout box; Mandatory wording similar to Treatment 1, FAQ information added to back
  T4: NO; Updated design with callout box; Mandatory wording similar to Treatment 1, FAQ information added to back
  T5: NO; Updated design with callout box; Mandatory wording similar to Treatment 1, FAQ information added to back
  T6: YES; Current design, no callout box; Current wording
  T7: YES; Current design, no callout box; Current wording

Reminder Letter: Outside of Mailer
  Production and all treatments: No message

Reminder Letter: Letter
  Production: Current wording; bi-fold printing
  T1: Current wording; bi-fold printing
  T2: Updated design, emphasized mandatory; bi-fold printing
  T3: Updated design, mandatory wording slightly softer than Treatment 1; bi-fold printing
  T4: Updated design, mandatory wording slightly softer than Treatment 1; bi-fold printing
  T5: Updated design, mandatory wording slightly softer than Treatment 1; bi-fold printing
  T6: Current wording; tri-fold printing
  T7: Current wording; bi-fold printing

Questionnaire Package: Outgoing Envelope
  Production: “Your Response is Required by Law”
  T1: “Your Response is Required by Law”
  T2: “Your Response is Required by Law” and “Open Immediately”
  T3: “Your Response is Required by Law” and “Open Immediately”
  T4: “Your Response is Required by Law”
  T5: “Your Response is Required by Law” and “Open Immediately”
  T6: “Your Response is Required by Law”
  T7: “Your Response is Required by Law”

Questionnaire Package: Questionnaire
  Production: Current design
  T1: Current design
  T2: Design changes; bold mandatory heading
  T3: Design changes
  T4: Design changes
  T5: Current design
  T6: Current design
  T7: Current design

Questionnaire Package: FAQ Brochure
  Production: YES; T1: NO; T2: NO; T3: NO; T4: NO; T5: NO; T6: YES; T7: YES

Questionnaire Package: Letter
  Production: Current wording
  T1: Cybersecurity removed from front, FAQ information added to back
  T2: Updated design, emphasized mandatory wording, FAQ information added to back
  T3: Updated design, mandatory similar to Treatment 1, FAQ information added to back
  T4: Updated design, NO mandatory, FAQ information added to back
  T5: Updated design, mandatory similar to Treatment 1, FAQ information added to back
  T6: Current wording
  T7: Current wording

Questionnaire Package: Instruction Card
  Production: YES; T1: NO; T2: NO; T3: NO; T4: NO; T5: NO; T6: YES; T7: YES

Reminder Postcard: Address Side
  “Your Response is Required by Law”

Reminder Postcard: Wording
  Production: Current wording
  T1: Current wording
  T2: Updated design, bold mandatory, bold interviewer contact note
  T3: Updated design, unbold mandatory, unbold interviewer contact note
  T4: Updated design, no mandatory, unbold interviewer contact note
  T5: Updated design, unbold mandatory, unbold interviewer contact note
  T6: Current wording
  T7: Current wording

Final Reminder: Outside of Mailer
  Production: No message
  T1: No message
  T2: “Final Notice Respond Now”
  T3: “Final Notice Respond Now”
  T4: “Final Notice Respond Now”
  T5: “Final Notice Respond Now”
  T6: No message
  T7: No message

Final Reminder: Wording
  Production: Current wording; bi-fold printing
  T1: Updated wording, added callout box; bi-fold printing
  T2: Updated design, bold mandatory; bi-fold printing
  T3: Updated design, unbold mandatory; bi-fold printing
  T4: Updated design, NO mandatory wording; bi-fold printing
  T5: Updated design, unbold mandatory; bi-fold printing
  T6: Current wording; tri-fold printing
  T7: Current wording; bi-fold printing

Note: The areas highlighted in yellow in this table indicate how the treatment differs from current production.

1 Some of the design changes included writing in a bulleted format instead of longer paragraphs for ease of reading and testing a new logo design that more closely connects the ACS to the Census Bureau.

2 The Reminder Letter is scheduled to become a pressure seal mailer this year.

3 The Additional Reminder Postcard is scheduled to become a pressure seal mailer this year and it will highlight internet user ID information in a manner similar to the Reminder Letter.

4 CAPI interviews start at the beginning of the month following the Additional Postcard Reminder mailing.

5 CAPI interviewers also attempt to conduct interviews by phone when possible.

6 Previously the letter was a Reminder Postcard that included the internet user ID in the address label.

7 See Appendix A for dates of the mailout schedule for the September 2018 panel.


8 A blank form is a form in which there are no persons with sufficient response data and there is no telephone number listed on the form.

9 A sufficient partial internet response is one in which the respondent reached the Pick Next Person screen for a household with two or more individuals on the roster or has gone through the place of birth question for a 1-person household.

10 We will remove addresses deemed to be Undeliverable as Addressed by the Postal Service if no response is received.

11 The second mailing universe

12 If there are no significant differences, we will report the findings in an internal memorandum.

13 This analysis will not include Treatment 2 that has the bold header “Your response is required by law.” just above the paragraph mentioning the mandatory nature of the survey, because there is no other treatment in the experimental design to make a clean comparison to evaluate that design element.

