2020 Census Experiment
Extending the Census Environment to the Mailing Materials Study Plan
Michael Shaw, Decennial Statistical Studies Division
Julia Coombs, Decennial Statistical Studies Division
Michael Bentley, Decennial Statistical Studies Division
Kelly Mathews, Decennial Statistical Studies Division
Sarah Konya, Decennial Statistical Studies Division
Gina Walejko, Decennial Statistical Studies Division
Austin Schwoegl, Decennial Statistical Studies Division
Casey Eggleston, Center for Behavioral Science Methods
January 31, 2020
Version 0.7
Table of Contents
I. Introduction
II. Background
III. Assumptions
IV. Research Questions
V. Methodology
VI. Data Requirements
VII. Risks
VIII. Limitations
IX. Issues that Need to be Resolved
X. Division Responsibilities
XI. Milestone Schedule
XII. Review/Approval Table
XIII. Document Revision and Version Control History
XIV. Glossary of Acronyms
XV. References
XVI. Appendix A
Table of Tables
Table 1: Promotional Insert Treatments
Table 2: Description of Mail Design Treatments
Table 3: Description of EDDM treatment
Table 4: Description of Panels
Table 5: Response Rate by Promotional Insert Treatment, National Sample
Table 6: Response Rate by Promotional Insert Treatment, Geographic Cluster Sample
Table 7: Response Rate by Promotional Insert Treatment after Each Mailing, National Sample
Table 8: Response Rate by Promotional Insert Treatment after Each Mailing, Geographic Cluster Sample
Table 9: Response Rate by Mail Design Treatment, National Sample
Table 10: Response Rate by Mail Design Treatment, Geographic Cluster Sample
Table 11: Response Rate by EDDM, Geographic Cluster Sample
I. Introduction
The decennial census of housing and population occurs in a unique survey environment.
Everyone living in the United States is included in the survey universe, and all levels of
government have a tangible interest in ensuring that the people living within their boundaries
respond at the highest levels. We propose tapping into the unique environment surrounding the
decennial census through the materials we use to contact households and request self-enumeration. Specifically, we want to test the effect of a wearable nonmonetary insert that
promotes the 2020 Census on self-response rates, mailing materials that incorporate messaging
developed by the 2020 Census communications campaign, and the addition of an every door
direct mailer (EDDM) sent on Census Day. The communication campaign and heightened public
awareness are unique to the 2020 Census and would be difficult to replicate in a census test. The
three treatments proposed all rely on these two elements that are unique to the census
environment. Convincing a respondent to wear a promotional item would be challenging for a
census test, and the communication campaign and emphasis on Census Day are not as salient in a
test. For these reasons, we believe this experiment needs to be conducted during the 2020
Census. All interventions are hypothesized to increase self-response rates by different
mechanisms, and all rely on the heightened public awareness that exists only during a decennial
census in order to effectively test their influence with responders.
Though incentives have generally been found to increase response rates for mail surveys (Singer
and Kulka, 2001), and promotional inserts have also been associated with some increases in
response rates, research on the effect of inserts on self-response rates in a census is limited. Inserts that
promote the 2020 Census brand and connect to the once-a-decade nature of this national
undertaking may emphasize the importance of responding. Even less understood are the gains
that may be seen from a wearable promotional insert. A wearable item such as a sticker may not
only encourage the recipient to respond but, if worn, may capitalize on pro-social factors to
remind and encourage encountered community members to respond.
Also of interest is measuring the effect on response rates of sending households mailing
materials that are integrated with the overall 2020 Census communications campaign. We
hypothesize that such redesigned letters, postcards, and envelopes may be more easily
recognizable to respondents as being connected to the 2020 Census and thus may lead to a higher
response rate, compared with materials designed iteratively throughout the decade.
This experiment is not intended to evaluate the effectiveness of specific messages developed for
the 2020 Census communications campaign. This research will not recommend the use of any
particular messages in the 2030 Census. Rather, this experiment aims to measure changes in
response rates when mailing materials could be viewed as an extension of, rather than distinct
from, the larger communications campaign. The messages included in the 2020 Census
campaign may not be the same as the messages included in the 2030 Census campaign.
II. Background
This experiment’s proposal to test a wearable item promoting the 2020 Census was inspired by
the “I Voted” stickers worn by voters during elections. Such stickers have been distributed by
local entities since at least the 1980s as a get-out-the-vote effort (Waxman, 2016). Though often
seen as a potent motivator for Election Day voting behaviors, research on the direct effect of the
ubiquitous “I Voted” sticker on voting turnout is difficult to find. Instead, the sticker’s effect is
often equated with other measured effects, such as a trigger for habitual behavior (Aldrich et al,
2010) or a way to publicize taking part in a socially valued behavior (Bolsen et al, 2010). Other
research about the effect of social pressure on voting supports theories about the sticker’s effect.
People who know they will be asked about their voting behavior (DellaVigna et al, 2016) or who
know their voting behavior will be reported to neighbors (Gerber et al, 2008) are more likely to
vote.
The Census Bureau has limited experience in sending incentives of any kind to sampled
households, especially in a decennial census. During the 2000 Census, an experiment was
conducted that included an incentive. In this experiment, 6,130 nonrespondents were invited to
respond to the census in a mode other than the standard mail questionnaire. Half of the housing
units in the sample received a calling card worth 30 minutes of long distance service, activated
after the response was obtained through the experimental mode. Results showed that while
response rates were higher in the experimental mode for housing units that received the
incentive, they were not higher overall (Guarino, 2001).
The 2014 Survey of Income and Program Participation (SIPP) incorporated the use of incentives
and found that a higher incentive value saves more money on average than a lower incentive
value. The SIPP is conducted across multiple years, and within each year, there are waves that
represent one round of interviewing over a four-month interval. In Wave 1 of the 2014 SIPP,
the survey offered $20 and $40 incentives conditional upon completion, and on average, the cost per
interview went down by $6.50 and $8.53, respectively (Westra, Sundukchi, and Mattingly,
2015).
In May 2018, the American Community Survey (ACS) tested sending a modified version of its
popular data wheel in the first or fourth mailing. The new data slide included a set of metrics
derived from ACS data at the national and state levels. The ACS experiment is based on research
that suggests that building trust is the most important aspect of survey messaging (Dillman,
Smyth, and Christian, 2014), and the hypothesis for this experiment is that a data slide will
communicate the legitimacy of the ACS and thus increase response rates (Barth and Heimel,
2018).
Numerous factors in an experiment's design can determine the effectiveness of an incentive on
response rates. Providing an incentive unconditionally at the beginning of a survey has performed
better than promising a conditional incentive after the completion of a survey (Church, 1993;
Mercer et al, 2015). Nonmonetary incentives could have other advantages that have yet to be
thoroughly tested. Dillman, Smyth, and Christian (2014) suggest that the irregular shape of a
package containing an incentive could lead to fewer mailings being left unopened or discarded,
which in turn could lead to higher response rates. Gendall and Healey (2008) as
well as Nederhof (1983) found that nonmonetary incentives may increase initial self-response
rates, thereby decreasing the total survey cost since reminders do not need to be mailed.
Nonmonetary incentives that are tailored to particular groups of the sample may be effective
(Gendall and Healey, 2008).
This proposal primarily supports the 2020 Census innovation area Optimizing Self-Response.
Specifically, this proposal aims to motivate people to respond by sending new notices that encourage
self-response. Redesigned letters that include elements from advertising can help motivate
respondents. In addition, a wearable insert could transform reliable respondents into temporary
outreach partners at a micro level for the duration that the insert is displayed, urging others to
respond at any location and at any time.
III. Assumptions
1. The project team will obtain adequate funding to implement the experiment as it is
designed in this study plan.
2. The Census Data Lake will contain 2020 Census response and operational data required
for analysis.
3. The Census Bureau will be able to obtain the services of a contractor to create the
nonmonetary promotional inserts that will be tested in this experiment.
4. The 2020 Census communications campaign will be adequately developed in time to
inform the treatments of this test.
5. The printing, assembly, addressing, and mailing of mailing materials that are different
from production materials can be supported by the National Processing Center (NPC) or
a print vendor to implement the experiment as it is designed in this study plan.
IV. Research Questions
1. Does the inclusion of a promotional insert featuring 2020 Census branding in the initial
mailing package increase self-response rates?
2. Does the inclusion of a promotional insert in the initial mailing package lead to earlier
responses?
3. Does the cost of producing and sending promotional inserts outweigh any observed
savings gained from increased or earlier self-response compared to a control group?
4. Do mailing materials designed to complement the 2020 Census communications
campaign increase the self-response rate?
5. Does sending an EDDM on Census Day lead to an increase in response rates?
V. Methodology
A. Experimental Design
This experiment will test the effect on the response rate of including a wearable insert in the
invitation to respond. It will also test the effect on response rates of redesigned mailing materials
that reflect the 2020 Census communications campaign and of the addition of an EDDM that
arrives on or near Census Day. Panels will be assigned randomly to housing units selected from
across the nation and from sampled geographic clusters.
1. Control and treatment panels
We propose treatments for two aspects of the mail packages: the experimental inclusion of a
nonmonetary promotional insert and the redesign of the envelopes, letters, and postcards to
mirror major elements developed by the communications campaign.
Table 1 describes the treatments for the nonmonetary promotional inserts. The first will contain
no insert, which is the current design for the 2020 Census. The treatment is the inclusion of
multiple copies of a wearable insert in the first mailing. This insert would be small, such as a
sticker, and will feature the 2020 Census logo or a phrase encouraging response, such as “I
count.” The design may also feature other languages, such as a multilingual word cloud for “I
count.”
Table 1: Promotional Insert Treatments

Nonmonetary Promotional Insert Treatments | Description
P0: No insert | Households will not receive a nonmonetary promotional insert, which matches the current production design.
P1: Wearable insert | Households will receive multiple copies of a wearable insert promoting the 2020 Census, such as a sticker that says, "I count."
Table 2 describes the treatments for the new mail designs. The first treatment consists of sending
standard 2020 Census mailing materials. These materials were designed iteratively throughout
the decade and describe why and how to respond to the 2020 Census. They will not, however,
feature anything that respondents may encounter in television, print, radio, or digital media
around Census Day, including slogans, images, colors, or points of emphasis other than the
census’s general benefit to communities. One treatment will therefore send envelopes, letters,
and postcards that have been redesigned to blend with the 2020 Census communications
campaign. Changes to content will be limited to what is needed to add these new elements.
Respondents who are exposed to 2020 Census advertising may more easily recognize and
connect with mailing materials that have been designed similarly.
Table 2: Description of Mail Design Treatments

Mail Design Treatments | Description
L0: Production design | Production mailing materials, which were designed iteratively through the mid decade tests.
L1: Communications campaign design | Mailing materials that reflect elements of the communications campaign, such as slogans, logos, or other messaging features.
Table 3 describes the treatments for other aspects of the mailing strategy. The first treatment
consists of sending standard 2020 Census mailing materials following the production strategy.
Households in the second treatment will receive on Census Day either an EDDM reminding them
to respond, printed in the language of materials they would otherwise receive, or an EDDM
promoting the ease of responding to the census online. An EDDM is an unaddressed mailer delivered to every housing
unit on a letter carrier’s route. Because of the nature of this treatment, it will not be included in
the national sample.
Table 3: Description of EDDM treatment

Other Strategy Treatments | Description
S0: Production design | Production strategy, which was designed iteratively through the mid decade tests.
S1: Census Day EDDM | An EDDM promoting the 2020 Census will arrive on Census Day.
Table 4 describes the five panels included in this proposed experiment, created by the two control
groups and the three treatments.
Table 4: Description of Panels

Panel | Treatment | Language of Materials | National Sample Size | Geographic Sample Size
1: P0, L0, S0 (Control #1) | No treatment | Production | 24,956 | 10,760
2: P0, L1, S0 | Communications campaign | Bilingual | 24,956 | 0
3: P1, L0, S0 | Wearable Insert | Bilingual | 24,956 | 25,824
4: P0, L0, S1 | EDDM delivered on Census Day | Production | 0 | 21,520
5: P0, L0, S2 (Control #2) | No Treatment | Bilingual | 24,956 | 25,824
Total | | | 99,824 | 83,928
Even though the sample will include housing units identified as eligible for either English-only
or English/Spanish bilingual mailing materials in the 2020 Census, all mailing materials,
except those for the panel receiving the EDDM, will be bilingual (English and Spanish) to reduce
operational complexity. Also, mailing materials will be developed for both production contact
strategies for the 2020 Census: Internet First and Internet Choice.
2. Sample Design
The Master Address File (MAF) extract will be divided into seven sub frames. Each sub frame
will be used to select the sample for different components of the 2020 Census experiments.
Control groups for other 2020 Census experiments will be sampled from the sub frames used in
the sampling procedure. See the study plan for Optimized Self-Response for more details. The
sample will consist of 99,824 housing units selected for a national sample and 83,928 housing
units, approximately 28 tracts and 54 United States Postal Service (USPS) routes, selected in
geographic clusters. This results in a total sample size of 183,752 housing units. Sampling for
this experiment will be coordinated with other sampling activities to ensure that housing units are
not sampled for more than one experiment or evaluation.
Other direct mailers outside of the five production mailings are in the process of being
developed. These mailers will be sent before or during self-response in order to promote
response in subpopulations of interest. Though the addresses identified to receive these
additional mailers will not be known until after sampling for this experiment is underway, it is
possible that the sample selected in this experiment may overlap with the additional mailers
being proposed. One of the proposed mailers may target up to 20 percent of all mailable
addresses. To account for this possible overlap in samples, all of the panels except the EDDM
panels will be sampled at 120 percent of the minimum sample size calculated in the appendix.
Geographic clusters
To measure any social norming behaviors that would result from a wearable promotional insert,
part of the sample for this experiment will be clustered geographically. This ensures that
households in a given region will have a chance to see and react to inserts worn at work, school,
or other public places. Specifically, about 44,800 housing units within about 32 tracts that pass
to-be-determined thresholds for size and population density will be eligible to receive one of the
three panels. These tracts will be randomly selected as described below.
1. Tracts that do not pass to-be-determined thresholds for size and population density will
be removed from the designated sampling sub frame.
2. Tracts will be sorted by 2020 Census contact strategy, language of materials, the number
of housing units eligible to receive mailout in the 2020 Census, and any demographic
measures of interest.
3. A sample of 32 tracts will then be systematically selected using an appropriate sampling
fraction.
4. Tracts will be assigned one of the two panels sequentially.
A sample of 32 tracts, which amounts to around 44,800 housing units, will allow us to detect a
difference of 3 percentage points in response rates, with α=0.1, β=0.2, and a design effect of 4.0.
The EDDM experiment will be sampled at the USPS carrier route level. Entire carrier routes of a
to-be-determined housing unit size will be selected, and all housing units in the carrier route will
receive the EDDM.
1. Mail routes will be sorted by zip code and route identifier within their state.
2. A sample of 54 carrier routes will then be systematically selected using an appropriate
sampling fraction.
3. Carrier routes that do not have a minimum of 200 housing units will not receive the
EDDM. (USPS does not allow EDDM deliveries on routes with fewer than 200 housing
units.)
4. Carrier routes will be assigned one of the two panels sequentially.
National sample
A national sample of 99,824 housing units will be selected to receive one of the three panels. The
national sample will allow us to understand the overall effect of receiving a promotional insert or
redesigned materials.
1. Housing units in the designated sub frame will be stratified by 2020 Census contact
strategy to form two strata: Internet First and Internet Choice.
2. Housing units within each stratum will be sorted by language, geographic identifiers, and
MAF identifier.
3. A sample of 99,824 housing units will be systematically selected from each of the two
strata using an appropriate sampling fraction.
4. Housing units will be assigned to one of the three panels sequentially.
A national sample of 99,824 housing units allows us to detect a difference of 3 percentage points
in response rates, with α=0.1, β=0.2, and a design effect of 1.0.
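The sorted systematic selection with sequential panel assignment described above, along with the carrier-route minimum for the EDDM, can be illustrated with a short sketch. The following Python example is a simplified illustration only; the field names, sort keys, and proportional allocation are assumptions for exposition and do not reflect the production CaRDS specifications.

```python
import random

def systematic_sample(frame, n_sample, seed=2020):
    """Systematically select n_sample units from a sorted frame using a random start and a fixed interval."""
    interval = len(frame) / n_sample
    start = random.Random(seed).uniform(0, interval)
    return [frame[int(start + i * interval)] for i in range(n_sample)]

def select_national_sample(frame, n_total, panels):
    """Stratify by contact strategy, sort within strata, select systematically, and assign panels in rotation."""
    sample = []
    for stratum in ("Internet First", "Internet Choice"):
        units = [hu for hu in frame if hu["contact_strategy"] == stratum]
        units.sort(key=lambda hu: (hu["language"], hu["geo_id"], hu["maf_id"]))
        n_stratum = round(n_total * len(units) / len(frame))  # proportional allocation (assumption)
        sample.extend(systematic_sample(units, n_stratum))
    for i, hu in enumerate(sample):                           # sequential (round-robin) panel assignment
        hu["panel"] = panels[i % len(panels)]
    return sample

def select_eddm_routes(routes, n_routes):
    """Sort carrier routes and select them systematically, honoring the USPS 200-housing-unit minimum."""
    # The study plan applies the 200-HU floor after selection; filtering first is a simplification.
    eligible = [r for r in routes if r["housing_units"] >= 200]
    eligible.sort(key=lambda r: (r["state"], r["zip_code"], r["route_id"]))
    return systematic_sample(eligible, n_routes)
```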
B. Answering the Research Questions
The primary measure of interest for this study is the self-response rate, which this study plan will
simply call the response rate. The response rate is a measure of respondent cooperation and
reflects the sampled housing units that respond to the census by one of the three self-response
modes: responding online to the internet instrument, providing information to a phone
interviewer, or completing and returning the mail questionnaire. In general, the response rate will
be calculated using the following formula:
Response rate = (Unduplicated¹ self-responses received / Total sample size) × 100
Response rates will be calculated for specific panels or other subsets of cases using the same
formula, but in each case, the denominator will be restricted to the appropriate set of eligible
cases. The total response rate of each treatment group will be compared to that of the control
group using t-tests, and distributions will be compared using chi-squared tests.
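To illustrate, the sketch below computes self-response rates by panel and performs the treatment-versus-control comparisons. It is a simplified, unweighted Python illustration only; the column names (panel, responded, response_mode) are assumptions for exposition, and the production analysis will use the replicate weights described in the next paragraph.

```python
import pandas as pd
from scipy import stats

def response_rate(sample: pd.DataFrame) -> float:
    """Self-response rate: unduplicated responding housing units / total sampled units x 100."""
    return 100 * sample["responded"].mean()          # 'responded' is a 0/1 flag per sampled housing unit

def compare_panels(sample: pd.DataFrame, treatment_panel: int, control_panel: int) -> float:
    """Unweighted treatment-versus-control comparison of total response rates; returns the p-value."""
    t = sample.loc[sample["panel"] == treatment_panel, "responded"]
    c = sample.loc[sample["panel"] == control_panel, "responded"]
    _, p_value = stats.ttest_ind(t, c, equal_var=False)   # t-test on the 0/1 response indicator
    return p_value

def compare_mode_distributions(sample: pd.DataFrame, panel_a: int, panel_b: int) -> float:
    """Chi-squared test comparing the distribution of response modes (internet, CQA, mail) across panels."""
    sub = sample[sample["panel"].isin([panel_a, panel_b])]
    counts = pd.crosstab(sub["panel"], sub["response_mode"])
    _, p_value, _, _ = stats.chi2_contingency(counts)
    return p_value

# Example: wearable-insert panel (Panel 3) versus Control #1 (Panel 1) in the national sample
# p = compare_panels(national_sample, treatment_panel=3, control_panel=1)
```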
For appropriate estimation, the mailing materials response data will be weighted to reflect the
complex sample designs and adjusted to reduce nonresponse bias. Replicate weights will be
created, and we will use a stratified jackknife replication estimation method. In this method,
housing units are sorted in the order they were selected and reassigned to a replicate group. To
¹ Households providing more than one self-response are counted in the response rate calculation only once.
help ensure the validity of statistical inference when making multiple statistical comparisons,
when applicable, multiple comparison corrections will be used to maintain the family-wise error
rate at α = 0.1. The Holm-Bonferroni procedure will be performed to adjust for the increased
possibility of erroneous conclusions when making multiple comparisons. Multiple comparison
corrections reduce the possibility of identifying false-positive differences and ensure that we do
not cloud our ability to form inferential conclusions. For this report, each table is considered a
family of comparisons.
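As an illustration of the step-down adjustment, the sketch below implements a generic Holm-Bonferroni procedure for one family of comparisons at a family-wise α of 0.1; the p-values shown are hypothetical and for illustration only.

```python
def holm_bonferroni(p_values, alpha=0.1):
    """Return a reject decision for each p-value while controlling the family-wise error rate.

    P-values are tested in ascending order against alpha / (m - rank); once a test fails,
    all larger p-values in the family also fail (step-down procedure).
    """
    m = len(p_values)
    order = sorted(range(m), key=lambda i: p_values[i])
    reject = [False] * m
    for rank, i in enumerate(order):
        if p_values[i] <= alpha / (m - rank):
            reject[i] = True
        else:
            break   # all remaining (larger) p-values fail as well
    return reject

# Hypothetical family of four comparisons (one table)
print(holm_bonferroni([0.002, 0.03, 0.06, 0.20]))   # -> [True, True, False, False]
```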
1. Does the inclusion of a nonmonetary promotional insert featuring 2020 Census branding
in the initial mailing package increase self-response rates?
To answer this question, response rates will be calculated by treatment for the national sample
and the geographic cluster sample separately. Response rates for panels receiving an insert will be
be compared to the control panel.
Table 5: Response Rate by Promotional Insert Treatment, National Sample

Promotional Insert Treatment | Internet Response Rate | CQA Response Rate | Mail Response Rate | Total Response Rate
No insert (Panel 1) | | | |
Wearable Insert (Panel 3) | | | |

Source: U.S. Census Bureau, 2020 Census Decennial Response File
Table 6: Response Rate by Promotional Insert Treatment, Geographic Cluster Sample

Promotional Insert Treatment | Internet Response Rate | CQA Response Rate | Mail Response Rate | Total Response Rate
No insert (Panel 1) | | | |
Wearable Insert (Panel 3) | | | |

Source: U.S. Census Bureau, 2020 Census Decennial Response File
Response rates will also be calculated by contact strategy and treatment. We will also compare
response rates in treatment tracts to those in neighboring tracts, and we will calculate non-ID
response rates for Internet Self-Response.
2. Does the inclusion of a nonmonetary promotional insert in the initial mailing package
lead to earlier responses?
To answer this question, response rates will be calculated after each contact and before the
Nonresponse Followup (NRFU) operation begins by treatment for the national sample and the
geographic cluster sample separately. Response rates for panels receiving an insert will be
compared to the control panel.
Table 7: Response Rate by Promotional Insert Treatment after Each Mailing, National Sample

Promotional Insert Treatment | Response Rate after First Mailing | Response Rate after Second Mailing | Response Rate after Third Mailing | Response Rate after Fourth Mailing | Response Rate before NRFU | Final Response Rate
No insert (Panel 1) | | | | | |
Wearable Insert (Panel 3) | | | | | |

Source: U.S. Census Bureau, 2020 Census Decennial Response File
Table 8: Response Rate by Promotional Insert Treatment after Each Mailing, Geographic Cluster Sample

Promotional Insert Treatment | Response Rate after First Mailing | Response Rate after Second Mailing | Response Rate after Third Mailing | Response Rate after Fourth Mailing | Response Rate before NRFU | Final Response Rate
No insert (Panel 1) | | | | | |
Wearable Insert (Panel 3) | | | | | |

Source: U.S. Census Bureau, 2020 Census Decennial Response File
Response rates will also be calculated by contact strategy and treatment, and we will calculate non-ID response rates for Internet Self-Response.
3. Does the cost of producing and sending promotional inserts outweigh any observed
savings gained from increased or earlier self-response compared to a control group?
To answer this question, the cost of a control case will be compared to the cost of a treatment
case. The cost of a control case will be calculated by dividing the product of the overall NRFU
cost per case and the number of nonresponders in the control panel by the total number of control
cases. The cost of a treatment case will be calculated similarly, but the cost of the insert will be
included in the numerator. These calculated costs ignore the cost of developing the data
collection instruments, sending the other self-response materials, and other common costs that
would be the same for control and treatment cases.
Cost per control case = (NRFU cost per case × Number of control nonresponders) / Number of control cases

Cost per treatment case = (NRFU cost per case × Number of treatment nonresponders + Insert cost) / Number of treatment cases
The cost of the insert will be calculated by summing the total costs unique to the insert.
Insert cost = Total insert procurement cost + Unique postage cost
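A minimal sketch of these cost calculations follows; the NRFU cost per case, nonresponder counts, and insert cost figures are illustrative placeholders, not budget estimates.

```python
def cost_per_control_case(nrfu_cost_per_case, control_nonresponders, control_cases):
    """NRFU follow-up cost attributable to the average control case."""
    return nrfu_cost_per_case * control_nonresponders / control_cases

def cost_per_treatment_case(nrfu_cost_per_case, treatment_nonresponders, treatment_cases, insert_cost):
    """NRFU follow-up cost plus insert cost attributable to the average treatment case."""
    return (nrfu_cost_per_case * treatment_nonresponders + insert_cost) / treatment_cases

# Placeholder values for illustration only
insert_cost = 10_000 + 2_500            # total insert procurement cost + unique postage cost
control = cost_per_control_case(30.0, 9_500, 24_956)
treatment = cost_per_treatment_case(30.0, 9_200, 24_956, insert_cost)
print(f"control: ${control:.2f} per case, treatment: ${treatment:.2f} per case")
```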
4. Do mailing materials designed to complement the 2020 Census communications
campaign increase the self-response rate?
To answer this question, response rates will be calculated by treatment for the national sample
and the geographic cluster sample separately. Response rates for panels receiving the
communications campaign materials will be compared to those receiving production materials.
Table 9: Response Rate by Mail Design Treatment, National Sample

Mail Design Treatment | Internet Response Rate | CQA Response Rate | Mail Response Rate | Total Response Rate
Production (Panel 1) | | | |
Communications campaign (Panel 2) | | | |

Source: U.S. Census Bureau, 2020 Census Decennial Response File
Table 10: Response Rate by Mail Design Treatment, Geographic Cluster Sample

Mail Design Treatment | Internet Response Rate | CQA Response Rate | Mail Response Rate | Total Response Rate
Production (Panel 1) | | | |
Communications campaign (Panel 2) | | | |

Source: U.S. Census Bureau, 2020 Census Decennial Response File
Response rates will also be calculated by language of materials and treatment as well as by
contact strategy and treatment. The response rates for geographic treatment panels will also be
compared to the results of the 2020 Census.
Table 11: Response Rate by EDDM, Geographic Cluster Sample

EDDM Treatment | Internet Response Rate | CQA Response Rate | Mail Response Rate | Total Response Rate
Control (Panel 5) | | | |
EDDM Treatment 1 | | | |
EDDM Treatment 2 | | | |

Source: U.S. Census Bureau, 2020 Census Decennial Response File
C. Interventions with the 2020 Census
Name of solution/system/process: Content and Forms Design IPT
Explicit intervention requested
• Develop experimental mailing materials and questionnaires.
• Assign form types to the newly developed materials.
Estimated impact: Minimal impact on the 2020 Census.
Name of solution/system/process: NPC
Explicit intervention requested
• Printing of mailing materials.
• Assembling mail packages.
• Addressing materials.
• Mailing packages to respondents.
• Receiving and storing materials provided by other print vendors.
Estimated impact: Minimal impact on the 2020 Census.
Name of solution/system/process: Forms Printing and Distribution, print vendor RR Donnelley,
any other to-be-determined print vendor
Explicit intervention requested
• Printing of production mailing materials.
• Bulk printing of alternative mailing materials that do not contain variable data.
• Assembling mail packages.
• Addressing mail packages that do not differ from production.
• Mailing packages that do not differ from production to respondents.
• Shipping bulk printed materials or assembled packages to NPC.
Estimated impact: Minimal impact on the 2020 Census.
Name of solution/system/process: CaRDS
Explicit intervention requested:
• Sample cases for this experiment as specified.
• Add necessary experiment variables to the sample delivery file.
Estimated impact: Moderate impact on the 2020 Census. The sampling specifications are
complicated, especially considering that the frame comes in three stages and by state. Errors in
sampling could affect other systems.
Name of solution/system/process: ECaSE-OCS
Explicit intervention requested
• Ingest the sample delivery file with experiment variables.
• Create workloads for each contact.
• Send created workloads to NPC.
Estimated impact: Minimal impact on the 2020 Census.
Name of solution/system/process: Paper Data Capture
Explicit intervention requested: Receive and process experimental questionnaires.
Estimated impact: Minimal impact on 2020 Census.
Name of solution/system/process: iCADE
Explicit intervention requested: Data capture of experimental questionnaires.
Estimated impact: Minimal impact on 2020 Census.
Name of solution/system/process: Response Processing Operation
Explicit intervention requested: Process responses from experimental questionnaires.
Estimated impact: Minimal impact on 2020 Census.
D. Implications for 2030 Census Design Decisions and Future Research and Testing
This experiment tests different mailing material treatments in the 2020 Census. Results from this
experiment can potentially be used to improve or enhance the strategies for encouraging and
motivating self-responses during the research and testing phase of the 2030 Census program and
in the 2030 Census. If one of the mailing treatments does increase the self-response rate at no or
low additional cost, the Census Bureau should focus the mid decade tests on refining the mailing
materials in this experiment.
VI. Data Requirements
Data File/Report | Source | Purpose | Expected Delivery Date
Decennial Response File | Census Data Lake, Response Processing Operation | This is the main file for analysis. It contains census responses and includes mode and time of response. | Fall 2020
CQA call records | Census Data Lake | This file will contain records from CQA to analyze any additional burden due to the EDDM. | Fall 2020
VII. Risks
1. The messaging developed for the 2020 Census communications campaign may not
resonate with respondents. If the 2020 Census campaign is ineffective, the experiment
may not find an effect on the response rates. Such a finding would imply that mailing
materials that reflect the communications campaign are not effective.
2. This experiment relies on having elements of the 2020 Census communications campaign
developed in time for the design and printing of mailing materials. If the communications
campaign is developed too late, this experiment cannot test materials that incorporate
elements from the campaign.
3. The wearable incentive must be compatible with NPC equipment for inserting into mail
packages. If a compatible wearable incentive cannot be developed, then this panel may
need to be removed.
4. This experiment relies on NPC to assemble and mail packages. If NPC has commitments
to other surveys during this experiment’s mailout time, then the mail packages may not
be sent at the same time as the production 2020 Census materials.
5. A mitigation for the previous risk is that the print vendor could mail packages that do not
differ from the production packages, which would mainly be the control cases. If the print
vendor mails the control packages, then any differences in actual mailout procedures or
timing may lead to observed differences between control and treatment panels that may
not be distinguishable from differences because of the experimental manipulation of
interest.
6. Sample sizes were calculated considering the number of housing units needed to detect a
meaningful difference. Some housing units are being selected by tracts or by carrier route
level; these geographic units vary in size. If the variance in the number of housing units
per tract or carrier route is not properly accounted for, the actual number of housing units
sampled may not be large enough to support the comparisons of interest.
7. The sampling frame will be available in three extracts, and sampling must occur from
each extract without knowing what will come in the next extract and without the ability
to sample more housing units from the previous extract. Sampling must therefore
properly account for the contents of all extracts so that enough housing units are sampled
and that all housing units have a chance to be sampled. If sampling ratios are improperly
estimated, then the actual number of housing units sampled may not be sufficient to
support the comparisons of interest or some housing units may not have a chance to be
sampled.
8. The EDDM is designed to arrive on Census Day. EDDMs are delivered to each door by
the mail carrier within a couple of days after the materials are given to the local post
office, so the timing of delivery depends on when the post office receives the materials
from the Census Bureau. If the EDDM is not delivered on Census Day, then it may
appear to be less effective than it is.
9. If the data are not available in the Census Data Lake, then the analysis cannot be
performed.
10. If sufficient funds are not granted for this experiment, then the scope of the experiment
may be reduced.
11. Other Census Bureau groups will be sending additional mailers. It is possible that these
mailings will overlap with the mailings in this experiment. The minimum sample size is
being increased by twenty percent to account for possible overlap. If the overlap is not
random and significant, then the ability to make adequate conclusions about the effect of
the planned treatments may be negatively affected.
VIII. Limitations
1. The 2020 Census communications campaign will include some local advertisements.
Elements unique to local advertising will not be included in the tested mailing materials;
the materials will be designed to reflect the national campaign.
IX. Issues that Need to be Resolved
1. The printing solution needs to be identified.
2. The nonmonetary promotional inserts need to be designed.
3. The number of copies of the wearable insert must be decided.
4. A contract needs to be established for the nonmonetary promotional inserts.
5. The 2020 Census communications campaign needs to be developed.
6. Appropriate sampling ratios need to be calculated.

X. Division Responsibilities
Division or Office | Responsibilities
Decennial Statistical Studies Division | Plan and manage the experiment; design the panels; select the sample; monitor the results; analyze the data; write and release the report
Center for Behavioral Science Methods | Qualitatively test developed materials
DCMD | Develop alternative materials; project management support
NPC | Receive workloads; print, assemble, address, and mail packages according to the workloads
XI. Milestone Schedule
Extending the Census Environment to the Mailing Materials Study Plan Milestones | Date
Design Mailing Materials | January 1, 2019 – November 30, 2019
Select Sample | June 28, 2019 – February 12, 2020
2020 Census Self-Response | March 12 – August 31, 2020
Receive, Verify, and Validate Data for Extending the Census Environment to the Mailing Materials | December 31, 2020
Distribute Initial Draft of the Extending the Census Environment to the Mailing Materials Report to the Decennial Research Objectives and Methods Working Group for Pre-Briefing Review | March 31, 2021
Decennial Census Communications Office Staff Formally Release the FINAL Extending the Census Environment to the Mailing Materials Report in the 2020 Memorandum Series | June 30, 2021
XII. Review/Approval Table

Role | Approval Date
Primary Author's Division Chief (or designee) | March 20, 2019
Decennial Census Management Division Assistant Division Chief for Nonresponse, Evaluations, and Experiments | March 20, 2019
Decennial Research Objectives and Methods Working Group | March 20, 2019
Decennial Census Communications Office | June 12, 2019
XIII. Document Revision and Version Control History

Version/Editor | Date | Revision Description
Version 0.1 | 8/15/18 | Initial draft for peer review
Version 0.2 | 9/20/18 | Draft for division chief review
Version 0.3 | 3/08/19 | Draft for DROM review
Version 0.4 | 3/25/19 | Revised from DROM and Quality Process Review
Version 0.5 | 6/12/19 | Revised from Decennial Census Communications Office review
Version 0.6 | 8/07/19 | Updated sample sizes and corresponding text, in addition to minor editorial and formatting changes
Version 0.7 | 1/31/20 | Made relevant changes to describe the two EDDM designs
XIV. Glossary of Acronyms

Acronym | Definition
ACS | American Community Survey
CQA | Census Questionnaire Assistance
EDDM | Every Door Direct Mailer
MAF | Master Address File
NPC | National Processing Center
NRFU | Nonresponse Followup
SIPP | Survey of Income and Program Participation

XV. References
Aldrich, J. H., Montgomery, J. M., Wood, W. (2010). “Turnout as a Habit.” Political Behavior,
Vol. 33, No. 4, pp. 535-563.
Barth, D., and Heimel, S. (2018). “ACS Research & Evaluation Analysis Plan: 2018 Data Slide
Test RS17-4-0220.” U.S. Census Bureau, January 2018.
Bolsen, T., Ferraro, P. J., and Miranda, J. J. (2010). "Are Voters More Likely to Contribute to Other
Public Goods? Evidence from a Large-Scale Randomized Policy Experiment." American
Journal of Political Science, Vol. 58, No. 1, pp. 17-30.
DellaVigna, S., List, J. A., Malmendier, U. (2016). "Voting to Tell Others." The Review of
Economic Studies, Vol. 48, No. 1, pp. 143-181.
Dillman, D., Smyth, J., and Christian, L. (2014). Internet, Phone, Mail, and Mixed-Mode
Surveys: The Tailored Design Method (4th ed.). Hoboken, NJ: Wiley.
Gerber, A.S., Green, D.P., and Larimer, C.W. (2008). “Social Pressure and Voter Turnout:
Evidence from a Large-Scale Field Experiment.” The American Political Science Review,
Vol. 102, No. 1, pp. 33-48.
Guarino, J. (2001). “Assessing the Impact of Differential Incentives and Alternative Data
Collection Modes on Census Response.” Census 2000 Testing, Experimentation, and
Evaluation Program, U.S. Census Bureau.
Mercer, A., Caporaso, A., Cantor, D., Townsend, R. (2015). “How Much Gets You How Much?
Monetary Incentives and Response Rates in Household Surveys.” Public Opinion
Quarterly, Vol. 79, No. 1, pp. 105-129.
Singer, E., and Kulka, R.A. (2001). “Paying Respondents for Survey Participation.” Studies of
Welfare Populations: Data Collection and Research Issues, National Research Council,
The National Academies Press, Washington DC, pp. 105-128.
Waxman, O. B. (November 7, 2016). “This is the Story Behind Your ‘I Voted’ Sticker.”
Retrieved from http://time.com/4541760/i-voted-sticker-history-origins/, August 8, 2018.
XVI. Appendix A
The formula used to calculate the minimum sample size necessary for the desired comparisons is

n = deff × (Zα*/2 + Zβ)² × [p1(1 − p1) + p2(1 − p2)] / δ²

where
n = minimum sample size
δ = minimum detectable difference
α* = alpha level adjusted for multiple comparisons (Bonferroni)
Zα*/2 = critical value for set alpha level assuming a two-sided test
Zβ = critical value for set beta level
p1 = proportion for group 1
p2 = proportion for group 2
deff = design effect due to unequal weighting

Wang, H. and Chow, S. (2007). "Sample Size Calculation for Comparing Proportions," Wiley Encyclopedia of Clinical Trials (eds R.B. D'Agostino, L. Sullivan, and J. Massaro).
National sample size
δ = 0.03
α* = 0.05
Zα*/2 = 1.959964
Zβ = 0.841621
p1 = 0.57
p2 = 0.54
deff = 1.75
n without deff = 5,941.0573
n with deff = 10,396.8503

Cluster sample size
δ = 0.03
α* = 0.053
Zα*/2 =
Zβ = 0.841621
p1 = 0.57
p2 = 0.54
deff = 2.50
n without deff = 4,304
n with deff = 10,760
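As a check on the figures above, the formula can be evaluated directly. The sketch below assumes the two-sample test-of-equality form of the Wang and Chow (2007) formula and, using the national Zα*/2 value of 1.959964 as an assumption for the cluster calculation, approximately reproduces the cluster figures (about 4,304 before the design effect and 10,760 after).

```python
from scipy.stats import norm

def min_sample_size(p1, p2, delta, alpha_star, beta, deff=1.0):
    """Minimum n per group to detect a difference delta between two proportions, inflated by deff."""
    z_alpha = norm.ppf(1 - alpha_star / 2)   # two-sided critical value for the adjusted alpha level
    z_beta = norm.ppf(1 - beta)              # critical value for the beta level (power = 1 - beta)
    n = (z_alpha + z_beta) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / delta ** 2
    return deff * n

# Cluster sample check (assumes alpha* = 0.05; beta = 0.2 corresponds to Z_beta = 0.841621)
print(min_sample_size(p1=0.57, p2=0.54, delta=0.03, alpha_star=0.05, beta=0.2, deff=2.50))  # ~10,760
```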
• The sample sizes for the national sample of the Extending the Census Environment to the Mailing Materials Experiment and the Optimization of Self-Response Experiment were calculated simultaneously. The samples share two national controls (one for production language materials and one for bilingual only mailing materials) and will be selected simultaneously.
• The value of p1 is the expected 2020 Census response rate after six weeks, which is between the minimum and average expected 2020 Census response rate.
• The estimated design effect for the national sample is 1.75. This is based on an evaluation of the variable analysis_response in 2015 National Content Test (NCT) data. The Optimizing Self-Response sample from the 2015 NCT, the most complex design this experiment is expected to resemble, has a deff of 1.2351, while the design effect of the entire 2015 NCT is 1.9513.
• The estimated design effect for the cluster sample is 2.50. This is based on the same evaluation of the variable analysis_response from 2015 NCT data, increased to take into consideration the clustered nature of the sample.
The values of n found here are the sample size needed in both group 1 and group 2 to detect a 3
percentage point difference. Note that most of the panels in this experiment have a national and a
geographic sampling component. Therefore, if only two groups were being compared
using the national sample, we would need 20,796 housing units with the given parameters. If
comparing two groups using the geographic sample, we would need 21,520 housing units.

The sample for the Extending the Census Environment to the Mailing Materials Experiment
includes four groups, three of which will be compared. Panels 1 and 3 are sampled at both the
national and geographic level. Panel 2 only has a national sample. Panel 4 is sampled at the
carrier route level. For the four groups, the total mailing sample size is 116,912. Note that the
selection of the national control portion of Panels 1 and 5 was outlined in the Optimization of
Self-Response Experiment Study Plan, where each panel has a sample size of 20,796 housing
units. Therefore, the grand total mailing sample size needed for this experiment is 158,564
housing units.
Minimum Sample Size | Number of Tracts for Minimum Sample Size¹ | Number of HUs Receiving Mail Materials¹
43,040 | 32 | 44,800
¹ Calculated assuming 1,400 HUs/tract

Minimum Sample Size | Number of Carrier Routes for Minimum Sample Size¹ | Number of HUs Receiving Mail Materials¹
32,280 | 54 | 21,600
¹ Calculated assuming 600 HUs/carrier route