
Generic Clearance for Internet Nonprobability Panel Pretesting

Revised Statement to OMB

OMB: 0607-0978


Generic Information Collection Request (ICR): 2020 CBAMS Survey


Request: The U.S. Census Bureau plans to conduct additional research under the Generic Clearance for Internet Nonprobability Panel Pretesting (OMB Control Number 0607-0978). As it did for the 2010 enumeration, the Census Bureau’s Communications Research and Analytics Team (CRAT) of the Integrated Partnership and Communications (IPC) program plans to conduct a quantitative survey as part of the 2020 Census Barriers, Attitudes, and Motivators Study (2020 CBAMS). This survey is designed to understand mindsets that relate to census participation across demographic subgroups. The 2020 CBAMS Survey covers a range of topics related to respondents’ knowledge of, attitudes towards, and barriers related to the 2020 Census. Results will drive the creative development and media planning for the 2020 Integrated Communications Campaign. In this submission, we seek approval to conduct this survey.


Purpose: The purpose of fielding the 2020 CBAMS Survey is twofold. First, the results will allow comparison of the barriers, attitudes, and motivators related to 2020 Census participation across demographic subgroups such as Asian, Black, Hispanic, and White respondents and, given sufficient sample size, across other subgroups defined by characteristics such as income level, sexual orientation, and education level. These comparisons will be used to develop targeted messaging for the 2020 Census campaign. Note that subgroups we are unable to target through the survey (including key demographic subgroups such as Middle Eastern or North African [MENA] respondents and others, to be detailed in a forthcoming ICR) will be reached through the qualitative component of the 2020 CBAMS, which is being conducted in parallel with the quantitative survey.


Second, CRAT will use 2020 CBAMS Survey data as the basis of a mindset component of a tract-level audience segmentation. This segmentation will cluster U.S. tracts with similar household demographics and other characteristics including response behavior and mindsets. Mindsets are constructs of a person’s knowledge of and attitudes towards the decennial census based on answers to the 2020 CBAMS Survey. Mindsets will be constructed using an exploratory data analysis process (such as an unsupervised machine learning algorithm) to cluster respondents with similar answers to attitudinal and knowledge questions.
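

As an illustration of this step, the sketch below clusters respondents on their attitudinal and knowledge items with k-means. It is a minimal sketch only: the input file name, item-name prefixes, choice of algorithm, and number of clusters are assumptions for illustration, not the selected analysis plan.

    # Illustrative sketch: cluster respondents into candidate "mindsets" based on
    # attitudinal and knowledge item responses. The file name, column prefixes, and
    # the use of k-means with six clusters are assumptions, not the final method.
    import pandas as pd
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    responses = pd.read_csv("cbams_responses.csv")  # hypothetical respondent-level file
    items = [c for c in responses.columns if c.startswith(("att_", "know_"))]

    # Standardize items so no single question dominates the distance metric.
    X = StandardScaler().fit_transform(responses[items])

    # Try several cluster counts; the analyst would choose a solution based on
    # interpretability and fit statistics.
    for k in range(3, 9):
        print(k, round(KMeans(n_clusters=k, n_init=10, random_state=0).fit(X).inertia_, 1))

    # Attach the chosen mindset assignment back to respondents (six clusters assumed here).
    responses["mindset"] = KMeans(n_clusters=6, n_init=10, random_state=0).fit_predict(X)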


Mindsets will then be used as one input into a tract-level audience segmentation, in addition to response rates in previous Census Bureau data collections and household demographics. The communications contract team led by Young and Rubicam (Team Y&R) will use the audience segmentation to create tailored messages and develop a media plan to guide and justify media buys. The segmentation will also inform the strategy of the IPC program as a whole.


Population of Interest: Results will inform 2020 Census planning; therefore, we are interested in data representing housing units across the United States.


Sample: We will select a sample of 50,000 addresses from the Census Bureau’s Master Address File (MAF), stratifying by concentration of minorities (Asian or Native Hawaiian or Pacific Islander [NHPI], Black, Hispanic, or other) and by high or low propensity to respond (based on American Community Survey response rates and information on internet access from the Federal Communications Commission). The design of the sample is shown in Table 1.


Table 1: Sample Design for 2020 CBAMS Survey


Strata                         Sample Size
Low*/Asian or NHPI tracts            1,000
Low/Black tracts                     7,000
Low/Hispanic tracts                  3,000
Low/Other tracts                     6,000
High**/Asian or NHPI tracts          8,000
High/Black tracts                    6,000
High/Hispanic tracts                 4,000
High/Other tracts                   15,000
Total                               50,000

*Low propensity to respond
**High propensity to respond


Given an anticipated response rate of 30 percent, we expect to produce estimates for subpopulations as small as 10 percent of the sample within a margin of error of less than 4 percent. The sample will exclude housing units that: have been selected for other recent Census Bureau tests or the American Community Survey (ACS); are in Puerto Rico; are in enumeration areas other than mailout/mailback; have incomplete addresses; are in group quarters; or are Congressional refusals.
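

A rough, illustrative check of this expectation, assuming simple random sampling, a worst-case proportion of 50 percent, and no design effect from stratification or weighting:

    # Back-of-the-envelope margin-of-error check for the smallest planned subgroup.
    # Assumes simple random sampling and p = 0.5; a design effect would widen this,
    # but there is room before reaching the 4 percent bound stated above.
    import math

    sample_size = 50_000
    expected_response_rate = 0.30
    subgroup_share = 0.10

    n_subgroup = sample_size * expected_response_rate * subgroup_share  # 1,500 completes
    moe = 1.96 * math.sqrt(0.5 * 0.5 / n_subgroup)                      # ~0.025
    print(f"{n_subgroup:.0f} completes, margin of error ~ {moe:.1%}")   # ~2.5%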


Timeline: The survey will be open between February 20 and April 9, 2018.


Language: The survey and mail materials are available in English and Spanish.

Method: Addresses will fall within one of two mail strategies, Internet First or Internet Choice, similar to the mail strategies planned for the 2020 Census. The Internet First strategy encourages people to respond to the 2020 CBAMS Survey via the web until the fourth mailing, when a paper questionnaire is sent; the Internet Choice strategy offers a paper questionnaire in the first mailing. The mail strategy for each address will be determined by its sampling stratum, with addresses in high-propensity strata receiving Internet First and addresses in low-propensity strata receiving Internet Choice. Mailings are described in Table 2.
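

A minimal sketch of this assignment rule, using the stratum labels from Table 1 (illustrative only; in production the strategy would be carried on the sample file rather than derived from label strings):

    # Assign a mail strategy from the sampling stratum: high-propensity strata get
    # Internet First, low-propensity strata get Internet Choice (labels as in Table 1).
    def mail_strategy(stratum: str) -> str:
        return "Internet First" if stratum.startswith("High") else "Internet Choice"

    assert mail_strategy("High/Hispanic tracts") == "Internet First"
    assert mail_strategy("Low/Black tracts") == "Internet Choice"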


Table 2: CBAMS Mailings


Mailing  Description                      Date              Universe
1        Letter Invite + Questionnaire*   Tuesday, Feb. 20  Entire sample
2        Reminder Letter                  Friday, Feb. 23   Entire sample
3        Sealed Reminder Postcard         Monday, Mar. 5    Non-responders as of Feb. 26
4        Reminder Letter + Questionnaire  Monday, Mar. 12   Non-responders as of Feb. 26
5        Sealed Reminder Postcard         Monday, Mar. 26   Non-responders as of Mar. 19

*Addresses in the Internet First strategy will not receive a paper questionnaire in this mailing.

Note: Dates are for the year 2018. Closeout will happen on Monday, April 9, 2018.


Addresses will receive either English or bilingual (i.e., English and Spanish) mailings depending on the concentration of households estimated to need Spanish language assistance in each census tract. Housing units that need Spanish assistance are defined as those in which at least one adult (i.e., age 15 or older) in the household speaks Spanish and does not speak English “very well” according to a five-year ACS estimate. The bilingual mailings will include letters and postcards printed in both English and Spanish. The bilingual version of the questionnaire will be “flip-style” with English on one side, Spanish on the other side, and a two-sided cover in both English and Spanish.
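

An illustrative sketch of this tract-level assignment follows. The household-level definition mirrors the text above, but the cutoff that determines when a tract receives bilingual materials is not specified in this document, so the threshold shown is a placeholder only.

    # Illustrative only: decide which mailing package a tract receives based on the
    # estimated share of households needing Spanish language assistance (at least one
    # adult age 15+ who speaks Spanish and speaks English less than "very well",
    # per 5-year ACS estimates). The 20 percent cutoff is a placeholder assumption.
    SPANISH_ASSISTANCE_THRESHOLD = 0.20

    def mailing_language(share_needing_assistance: float) -> str:
        if share_needing_assistance >= SPANISH_ASSISTANCE_THRESHOLD:
            return "bilingual (English/Spanish)"
        return "English only"

    print(mailing_language(0.35))  # bilingual (English/Spanish)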


The mail materials, paper survey cover page, and web survey introduction page instruct that the survey should be completed by the person who typically opens the mail for the household. It is hypothesized that whoever opens the mail is likely to be the same person who would fill out the household’s census form. The mail materials and questionnaire clearly indicate that the survey is for research purposes and is not the 2020 Census.


Respondents will be able to access the mobile-optimized web survey instrument by typing the URL from the mailing into a browser and entering a unique access code on the website. A single URL will be offered on all mailings. Once respondents enter the English web instrument, they will have the option to complete the survey in Spanish. Of those who respond, we anticipate the majority of surveys (58 percent) will be completed online, with the remaining portion (42 percent) completed on paper.


Incentives: The 2020 CBAMS Survey includes a prepaid, cash incentive in the first mailing to each sampled address, as a token of appreciation for participation. Research shows that incentives increase survey response rates, prepaid incentives are more effective than contingent ones, and cash is more effective than cash equivalents (Groves and Couper, 1998). Research has also shown that prepaid incentives are so effective at increasing response rates that they can reduce total cost per interview by increasing participation enough to offset their cost (Singer, 2012).


Incentives are particularly relevant for the 2020 CBAMS Survey because the survey will oversample hard-to-count populations, who by definition are unlikely to respond to surveys. These incentives are intended not only to increase overall response rates to the 2020 CBAMS Survey but also to help ensure a more representative sample (responses from a broad range of households, not just those most likely to respond), which will in turn reduce nonresponse bias.


Because the goals of this survey are to understand what would motivate people to self-respond to the 2020 Census (especially people who are otherwise unlikely to do so) and to inform the communications campaigns tailored to minority groups such as Hispanics, Blacks, Asians, and NHPI, the ability to reach populations less likely to self-respond to the 2020 CBAMS Survey is critical. Using incentives to increase response rates for these groups helps ensure that they are properly represented in the 2020 CBAMS Survey. This increased coverage would in turn improve the data quality of the 2020 CBAMS Survey by collecting enough responses from each group to make statistically valid inferences. Without a large enough and representative sample from these groups, the quality of the 2020 CBAMS Survey data would be compromised, which in turn would negatively impact the 2020 Census communications campaign.


The value of the 2020 CBAMS Survey incentive differs by stratum: incentive amounts are highest for the strata hardest to count (low propensity to respond and minority), intermediate for minority strata with higher response propensity, and lowest for the “other” strata at both low and high response propensity. The resulting assignments are inversely related to stratum-level estimates of ACS self-response rates for all but one stratum (see Table 3). This also serves as an experimental effort to equalize response rates across strata, minimizing the differential observed in ACS response rates. The incentive amounts are based on the finding that response rates increase as incentives increase, but only up to a point, after which there are diminishing marginal returns (Mercer et al., 2015).


Table 3: Proposed CBAMS Incentive Amounts


Strata                       Incentive Amount  Estimated ACS Self-Response
Low*/Asian or NHPI tracts    $10.00            43.63%
Low/Black tracts             $10.00            39.46%
Low/Hispanic tracts          $10.00            35.04%
Low/Other tracts             $1.00             61.68%
High**/Asian or NHPI tracts  $5.00             63.12%
High/Black tracts            $5.00             50.97%
High/Hispanic tracts         $5.00             44.57%
High/Other tracts            $1.00             68.59%

*Low propensity to respond
**High propensity to respond
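

As an illustration, the inverse relationship described above can be checked directly against the figures in Table 3 (a sketch only; values are copied from the table and stratum labels abbreviated):

    # Pair each stratum's incentive with its estimated ACS self-response rate (from
    # Table 3) and flag any stratum that breaks the expected inverse ordering, i.e.,
    # a lower self-response rate paired with a smaller incentive than the next stratum.
    strata = {
        "Low/Asian or NHPI":  (10.00, 43.63),
        "Low/Black":          (10.00, 39.46),
        "Low/Hispanic":       (10.00, 35.04),
        "Low/Other":          (1.00, 61.68),
        "High/Asian or NHPI": (5.00, 63.12),
        "High/Black":         (5.00, 50.97),
        "High/Hispanic":      (5.00, 44.57),
        "High/Other":         (1.00, 68.59),
    }

    ordered = sorted(strata.items(), key=lambda kv: kv[1][1])  # ascending self-response
    for (name, (amount, _)), (next_name, (next_amount, _)) in zip(ordered, ordered[1:]):
        if amount < next_amount:
            print(f"Exception: {name} has a lower self-response rate than {next_name} "
                  f"but a smaller incentive")  # prints only for the Low/Other stratum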


Enclosures: The first enclosure includes:

  • Initial letters/invites for the two mail strategies (Internet Choice contains a paper questionnaire and instructions to log into the web survey, whereas Internet First contains only the letter and no paper questionnaire)

  • Reminder letter for the two mail strategies (the wording differs slightly by mail strategy)

  • Sealed reminder postcard (this does not differ by mail strategy)

  • Reminder letter (this does not differ by mail strategy)

  • Final sealed reminder postcard (this does not differ by mail strategy)


The second enclosure includes the survey questionnaire.


Length of Interview: We estimate that respondents will spend 15 minutes, on average, completing the survey. Thus, at a 100 percent response rate, the total estimated respondent burden for this study is approximately 12,500 hours (50,000 addresses × 15 minutes each).


Point of Contact: The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Monica Vines

Researcher

Center for New Media and Promotion (CNMP)

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-8813

monica.j.vines@census.gov



References:


Groves, R. M., and Couper, M. P. (1998). “How Survey Design Features Affect Participation,” in Nonresponse in Household Interview Surveys, pp. 269-293. New York: Wiley.


Mercer, A., Caporaso, A., Cantor, D., and Townsend, R. (2015). “How Much Gets You How Much? Monetary Incentives and Response Rates in Household Surveys.” Public Opinion Quarterly, 79(1), pp. 105-129.


Singer, E. (2012). “The Use and Effects of Incentives in Surveys.” Presented to the National Science Foundation, Washington, DC.


