The Census Bureau plans to conduct new research under the Generic Clearance for Internet Nonprobability Panel Pretesting (OMB number 0607-0978). We plan to conduct testing using an online questionnaire to gather information about email notifications for survey participation. This “May” test, which uses a sample from the U.S. Census Bureau’s Contact Frame, is part of the iterative testing strategy that we plan to use to supplement the 2020 Census research program. Thus far, we have conducted six email notification tests using the Census Bureau’s nonprobability opt-in research panel and one study using a sample from the Census Bureau’s Contact Frame, which we refer to as the “2015 March study.” Unfortunately, there were technical difficulties with the survey during the latter test. Although we were able to collect email open rates and click-through rates, we were unable to collect address data. This second study, which we refer to as the “2015 May study,” repeats the 2015 March study with some changes to the sample selection.
The Census Bureau’s Contact Frame is a list of emails purchased from commercial vendors covering approximately 77 million U.S. housing units. These emails are available within the Census Bureau for research and production purposes. A sample of 10,000 household emails in the test sites was selected from this list for one panel of the 2014 Census Test, and a nationwide sample of 10,000 household emails was selected for the 2015 March study. The housing unit response rates to both the 2014 Census Test and the 2015 March study were very low (between 2 and 4 percent), even though the 2015 March study was voluntary.
In the 2015 March study, we learned that as the number of emails for a housing unit increased, so did the click-through rate to the survey. We also learned that emails in some domains were more likely to be working than emails in other domains, and that emails in some domains were more likely to be opened. Using these data, the Census Bureau will modify the ranking order of the emails associated with each housing unit and will send a maximum of four emails per sampled housing unit for the current 2015 May study.
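As a rough illustration of the ranking idea, the sketch below orders a housing unit's emails by the open rate of their domain and keeps at most four. It is only a sketch: the domain open rates, field names, and cutoff logic are placeholders, not the Census Bureau's actual ranking rules.

```python
# Illustrative only: rank a housing unit's emails by the open rate of their
# domain and keep at most four. The open rates are hypothetical values.

MAX_EMAILS_PER_HOUSING_UNIT = 4

# Assumed domain-level open rates (placeholder figures, not study results).
DOMAIN_OPEN_RATE = {"gmail.com": 0.25, "yahoo.com": 0.18, "aol.com": 0.12}

def rank_emails(emails):
    """Order one housing unit's emails by domain open rate, best first,
    and keep at most four, mirroring the ranking idea described above."""
    def score(address):
        domain = address.split("@")[-1].lower()
        return DOMAIN_OPEN_RATE.get(domain, 0.0)  # unknown domains rank last
    return sorted(emails, key=score, reverse=True)[:MAX_EMAILS_PER_HOUSING_UNIT]

print(rank_emails(["a@aol.com", "b@gmail.com", "c@example.net",
                   "d@yahoo.com", "e@gmail.com"]))
```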
In the 2015 May study, we will repeat the email tests to confirm that neither the email format nor the email content produces a significant difference in the response rate, as was the case for the click-through rate in the 2015 March study. In the March and May 2014 nonprobability panel testing, we found that the “text-based” email format consistently had higher click-through rates to the survey than the “graphical” email format. In the October 2014 nonprobability testing, we found that more content in the email message (including a salutation and a signature) generated a higher response than less content.
The 2015 May study will focus on retesting the email format and the content of the email message with a newly revised ranking of the emails in the Contact Frame sample. Based on the 2015 March study, we expect a response rate to the survey of slightly more than 3.9 percent.
There are four email panels in the current test (2 email formats by 2 email content variations). Each email panel consists of a sample of 2,500 housing unit addresses, for a total sample of 10,000 addresses. Each address can have up to four emails associated with it, and all emails for an address will be in the same panel.
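The sketch below illustrates this 2 x 2 design under stated assumptions: 10,000 sampled addresses are split evenly into four panels of 2,500, and every email for an address inherits that address's panel. The address identifiers and the assignment method are illustrative only.

```python
# Illustrative assignment of the sampled addresses to the four panels.
import random

FORMATS = ["text", "graphical"]
CONTENTS = ["short", "long"]
PANELS = [(fmt, content) for fmt in FORMATS for content in CONTENTS]  # 4 panels

addresses = [f"HU{i:05d}" for i in range(10_000)]  # placeholder housing-unit IDs
random.shuffle(addresses)

# After shuffling, round-robin assignment yields exactly 2,500 addresses per
# panel; every email for an address inherits that address's panel.
panel_of = {hu: PANELS[i % len(PANELS)] for i, hu in enumerate(addresses)}
```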
There are two email format panels. Half the sample will receive a “text-based” format; the other half will receive a “graphical” format, defined as using colors and graphics but containing the same content. With the sample size we have chosen, we will be able to detect a difference in the response rate of 2 percentage points or higher between the two email format panels using an alpha of .05 and a power of 0.80.
There are two email content panels. Half the sample will receive the email content used in other nonprobability panel tests, called the “short” email content. The other half will receive a modification of the 2014 Census Test email content, called the “long” email content. The “long” email has a salutation, a signature line, and a “green” statement focusing on the environmental benefits of going paperless. All emails will contain a statement indicating that survey participation is voluntary. With the sample size we have chosen, we will also be able to detect a difference in the response rate of 2 percentage points or higher between the two email content panels.
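The minimum detectable differences cited above can be checked with a standard two-proportion power calculation. The sketch below assumes a baseline response rate near the 3.9 percent figure cited earlier, a 2-percentage-point difference, a two-sided alpha of .05, and power of 0.80; the normal-approximation method (via Cohen's h) is an assumption, not necessarily the calculation the Census Bureau used.

```python
# Power check for detecting a 2-percentage-point difference in response rates
# with alpha = .05 and power = 0.80, assuming a baseline rate of 3.9 percent.
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

p_baseline = 0.039                 # expected response rate (from the text)
p_alternative = p_baseline + 0.02  # 2-percentage-point difference

effect = proportion_effectsize(p_baseline, p_alternative)
n_per_arm = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.80, alternative="two-sided")

# Compare the result against the roughly 5,000 addresses available on each
# side of a format (or content) comparison under the design described above.
print(f"addresses needed per arm: {n_per_arm:.0f}")
```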
The survey is identical to the one used in the 2015 March study. As a reminder, the opinion and demographic questions come first, followed by the address collection screen and the number-of-rooms question from the American Community Survey (ACS) online form.
All email panels will use the “10-minute U.S. Census Survey to Help your Community” subject line in the initial email; the “Reminder: Complete the U.S. Census Survey” subject line for the first reminder email; and the “Final reminder for the U.S. Census Survey” subject line for the final reminder. These subject lines performed well in prior nonprobability tests.
Each email address in the sample will receive a maximum of three notification emails (a sketch of this schedule follows the list):
an initial email on Monday, May 18,
a reminder email on Thursday, May 21 (if the recipient has not yet clicked on the link to the survey), and
a final reminder email on Thursday, May 28.
The survey will close on Friday, May 29 at midnight.
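A minimal sketch of this contact schedule follows. It assumes that reminders go only to addresses that have not yet clicked through to the survey and that click-through status is available from the mailing or survey paradata; the data structures are placeholders.

```python
# Placeholder sketch of the contact schedule; assumes reminders go only to
# addresses that have not yet clicked through to the survey.
from datetime import date

SCHEDULE = [
    ("initial", date(2015, 5, 18)),          # Monday, May 18
    ("first reminder", date(2015, 5, 21)),   # Thursday, May 21
    ("final reminder", date(2015, 5, 28)),   # Thursday, May 28
]
SURVEY_CLOSES = date(2015, 5, 29)            # Friday, May 29, at midnight

def recipients(mailing, sample_emails, clicked):
    """Return the addresses that should receive the given mailing."""
    if mailing == "initial":
        return list(sample_emails)
    return [addr for addr in sample_emails if addr not in clicked]
```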
Staff from the Center for Administrative Records Research and Applications will select a random sample of 10,000 housing units from the Contact Frame. If a housing unit has more than one email associated with it, the sample will include a maximum of four emails for that housing unit. The sample will not contain any addresses in the 2015 Census Test sites, any addresses selected for the American Community Survey in 2015, or any addresses from the 2015 March study.
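The sketch below illustrates this sample-selection step under assumed data structures: housing units flagged as being in a 2015 Census Test site, in the 2015 ACS sample, or in the 2015 March study are excluded before 10,000 units are drawn, and at most four emails are kept per selected unit. The frame layout and field names are invented for illustration.

```python
# Placeholder sketch of the sample-selection step; the frame layout and
# exclusion flags are invented for illustration.
import random

def select_sample(frame, n=10_000, max_emails=4, seed=2015):
    """Draw n housing units from the Contact Frame records in `frame`,
    excluding units in 2015 Census Test sites, the 2015 ACS sample, or the
    2015 March study, and keep at most four emails per selected unit."""
    eligible = [hu for hu in frame
                if not (hu["in_2015_test_site"]
                        or hu["in_2015_acs_sample"]
                        or hu["in_2015_march_study"])]
    rng = random.Random(seed)
    selected = rng.sample(eligible, n)
    return [{"hu_id": hu["hu_id"], "emails": hu["emails"][:max_emails]}
            for hu in selected]
```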
Staff from the Center for Survey Measurement will send the emails through GovDelivery. The survey will be hosted on our secure servers within the Application Services Division of the Census Bureau, which hosts all other secure online production surveys. The username needed to enter the survey will be the email address to which the email was sent (this is the same email used to sign up to participate in Census Bureau research studies). If a respondent starts the survey but does not complete it, that person will not be allowed to re-enter the site later.
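The access rule can be summarized with a small sketch. Only the rule itself (the username is the invited email address, and a started-but-incomplete case cannot re-enter) comes from the text; the status store and its values are placeholders.

```python
# Placeholder sketch of the access rule: the username is the email address
# that received the invitation, and a started-but-incomplete case cannot
# re-enter the instrument.
def can_enter_survey(email, case_status):
    """case_status maps each sampled email address to 'not started',
    'started', or 'complete'."""
    return case_status.get(email) == "not started"
```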
We estimate that respondents will spend an average of 8 minutes completing the survey and approximately 5 minutes reading the emails. Thus, the total estimated respondent burden for this study is approximately 2,167 hours, which assumes everyone reads the emails and answers the survey.
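For reference, the arithmetic behind that figure, assuming all 10,000 sampled addresses read the emails and complete the survey:

```python
# Arithmetic behind the burden estimate above.
addresses = 10_000
minutes_each = 5 + 8                       # reading emails + completing survey
total_hours = addresses * minutes_each / 60
print(round(total_hours))                  # about 2,167 hours
```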
The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:
Elizabeth Nichols
Center for Survey Measurement
U.S. Census Bureau
Washington, D.C. 20233
(301) 763-1724
Elizabeth.May.Nichols@census.gov