
Generic Clearance for Questionnaire Pretesting Research

Opt-In Panel Test 2 Plan

OMB: 0607-0725


The Census Bureau plans to conduct additional research under the generic clearance for questionnaire pretesting research (OMB number 0607-0725). We plan to conduct further testing using an online questionnaire to gather information about email subject lines and screen designs for collecting residential address information. This “March” test using a nonprobability panel is part of the iterative testing strategy we plan to use to supplement the 2020 Census research program. A nonprobability panel pilot test was successfully conducted in January 2014.


The goal of the iterative testing is to determine the best email subject lines to maximize self-response for the 2020 Census. Email subject lines that fail to generate significant click-through rates in the nonprobability panel are not likely to perform better in a probability panel. The purpose of the nonprobability testing we propose here is to eliminate poorly performing email subject lines before larger-scale testing. Another specific goal of this March test, building on the January test, is to determine whether the subject lines that respondents rated highly in the January opinion questions also perform well when actually used in a test.


We will send survey invitation emails to a sample of people who opted in to participate in Census Bureau research studies through the Census Bureau’s email subscription website, run by GovDelivery. There was no incentive to sign up, and there is no incentive (other than a copy of the research report) to participate. The panel currently contains over 8,000 email addresses. Because of its opt-in nature, this panel is considered a nonprobability panel. Based on the January pilot test, we expect approximately a 35 percent open rate for the emails and a 24 percent click-through rate to the survey. In this “March” test, we will again measure the open rates and click-through rates for emails with different subject lines.


There are six email panels (3 subject line variations by 2 formats) and two survey panels in this test. Each email panel consists of a sample of 250 email addresses, for a total sample of 1,500 email addresses. The two survey panels consist of a sample of 750 email addresses each. There are three initial emails that differ only in their subject lines. One subject line in the March test is a repeat from the January 2014 pilot test; the second received the highest rating in the opinion question in the January 2014 pilot test; and the third received the lowest rating in that question. With the sample size we have chosen, we will be able to detect a difference in the email open rate of 8 percentage points or higher between the different subject line panels.


The format of the emails also differs: within each subject line panel, half the sample will receive the “standard” non-mobile-friendly format used in the January 2014 pilot test and the other half will receive a mobile-friendly format. With the sample size we have chosen, we will be able to detect a difference in the email open rate of 7 percentage points or higher between the two email format panels.
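
For reference, these detectable-difference figures can be reproduced with a standard two-proportion power approximation. The sketch below is illustrative only, assuming a two-sided test at alpha = 0.05 with 80 percent power and the 35 percent baseline open rate from the January pilot; the plan itself does not state these parameters, so they are assumptions.

    from math import sqrt

    # Minimum detectable difference for a two-sided two-proportion
    # comparison, using a normal approximation.
    # Assumed parameters (not stated in the plan): alpha = 0.05,
    # power = 0.80, baseline open rate = 0.35 (January pilot).
    Z_ALPHA = 1.96  # two-sided alpha = 0.05
    Z_BETA = 0.84   # power = 0.80

    def min_detectable_diff(p_baseline, n_per_group):
        """Smallest open-rate difference detectable with
        n_per_group email addresses in each comparison group."""
        se = sqrt(2 * p_baseline * (1 - p_baseline) / n_per_group)
        return (Z_ALPHA + Z_BETA) * se

    # Subject line comparison: 500 per group (250 x 2 formats).
    print(min_detectable_diff(0.35, 500))  # ~0.084, about 8 points
    # Format comparison: 750 per group (250 x 3 subject lines).
    print(min_detectable_diff(0.35, 750))  # ~0.069, about 7 points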


There are two versions of the online survey, both nearly identical to the surveys used in the January test. The only difference between the versions is the screen that collects residential address information. Approximately half of the sample will be randomly assigned to each version of the address collection screen.


The sample will receive a maximum of three notification emails:

  • one of the initial emails on Monday, March 10,

  • a reminder email on Thursday, March 13 (if they have not yet clicked on the link to the survey), and

  • a final reminder email on Thursday, March 20.


The survey will be closed on Friday, March 21 at midnight.


The survey that the emails link to will collect data on address collection screen designs. The Census Bureau hypothesizes that many people will come to the online 2020 Census without their housing unit ID and thus will have to enter their address as part of the data collection. The address screen must collect accurate address data so that we can match responses to our master address file and geocode them. Without good addresses, we will not be able to count people at the correct location for the census. We continue to use the same address screens as in the January pilot test. In the final report, we plan to collapse the address data across the iterative tests.


After the address screens, we ask opinion questions as part of the “debriefing.” The objective of these questions is to gather qualitative data to guide future iterations of this test, to gain a sense of how respondents want to be contacted about the census and how they want to answer it, and to learn what these highly motivated individuals think the census collects. These data will be shared with 2020 Census staff and the communications area of the Census Bureau. The answers to the demographic questions and the questions on new technologies will allow us to examine the characteristics of respondents. These questions are virtually identical to the questions used in the January pilot test, and we will collapse the results across the iterative tests for the final report.


This test will be conducted from March 10 through March 21, 2014. Staff from the Center for Survey Measurement’s Human Factors and Usability Research Group will select the sample and send the emails through GovDelivery. The survey will be hosted on our secure servers within the Application Services Division of the Census Bureau, which hosts all other secure online production surveys. The username needed to enter the survey will be the email address to which the invitation was sent (the same address used to sign up to participate in Census Bureau research studies). If a respondent starts the survey but does not complete it, that person will not be allowed to re-enter the site later. The emails and questions were usability tested earlier under a separate generic clearance letter sent on June 12, 2013.


Future iterations of this survey will be conducted under a new generic clearance that is in preparation.


We estimate that users will spend 5 minutes on average completing the survey and approximately 5 minutes reading the emails. Thus, the total estimated respondent burden for this study is approximately 250 hours, which assumes that everyone reads the emails and answers the survey.
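
The burden figure follows directly from the sample size and the per-respondent time estimates; a minimal check of the arithmetic:

    # Respondent burden: 1,500 sampled email addresses, each assumed
    # to spend 5 minutes answering the survey plus 5 minutes reading
    # the emails.
    sample_size = 1500
    minutes_per_respondent = 5 + 5
    burden_hours = sample_size * minutes_per_respondent / 60
    print(burden_hours)  # 250.0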


The contact person for questions regarding data collection and statistical aspects of the design of this research is listed below:


Elizabeth Nichols

Center for Survey Measurement

U.S. Census Bureau

Washington, D.C. 20233

(301) 763-1724

Elizabeth.May.Nichols@census.gov

