
SUPPORTING STATEMENT

U.S. Department of Commerce

U.S. Census Bureau

2013 National Census Contact Test

OMB Control Number: 0607-0972


Part B. Collection of Information Employing Statistical Methods

Question 1. Universe and Respondent Selection

For the 2013 National Census Contact Test (NCCT), formerly known as the 2013 Alternative Contact Strategy Test, the initial sample will be selected from housing units in the 50 states and the District of Columbia where the Census Bureau delivered questionnaires to respondents via U.S. Postal Service mail. Group quarters and housing units in any other type of 2010 Census enumeration area will not be included in the sampling frame. Further, to reduce burden on respondents, the Census Bureau will exclude from the NCCT sampling frame any housing units selected for the first half of the 2013 American Community Survey (ACS) sample.

The sample of 40,000 housing units will be selected from a frame of addresses with supplemental contact information, most of which have been linked to the Census Master Address File (MAF). This frame was constructed using data from five commercial vendors. Two sampling strata will be used. The first stratum will include housing unit addresses for which the supplemental frame information is linked to an address record in the current MAF; this stratum addresses our research questions on validating the accuracy of phone numbers and email addresses from vendor-provided files. The second stratum will include units for which the supplemental frame information is not linked to a MAF record; this stratum focuses on the research questions relating to Non-ID processing of respondent-provided addresses. The sampling frame will be sorted by source of supplemental data, and the sample will be allocated proportionally to each of the two strata.
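
To make the allocation concrete, the sketch below illustrates a proportional allocation with a systematic selection within each stratum. The field names (maf_linked, vendor_source) and the sampling helper are illustrative assumptions, not the Census Bureau's production selection code.

```python
import random

def systematic_sample(units, n):
    """Select n units from an ordered list using a random-start systematic sample."""
    interval = len(units) / n
    start = random.uniform(0, interval)
    return [units[int(start + i * interval)] for i in range(n)]

def select_ncct_sample(frame, total_n=40_000):
    """Split the frame into MAF-linked and non-linked strata, sort each stratum by
    supplemental-data source, and allocate the sample proportionally to stratum size.
    The 'maf_linked' and 'vendor_source' fields are hypothetical."""
    strata = [
        [u for u in frame if u["maf_linked"]],       # stratum 1: linked to a MAF record
        [u for u in frame if not u["maf_linked"]],   # stratum 2: not linked (Non-ID research)
    ]
    sample = []
    for units in strata:
        units.sort(key=lambda u: u["vendor_source"])           # implicit stratification by source
        n_stratum = round(total_n * len(units) / len(frame))   # proportional allocation
        sample.extend(systematic_sample(units, n_stratum))
    return sample
```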

The 40,000 housing units in the sample will all be interviewed using the same webCATI (web-based Computer Assisted Telephone Interview) instrument. An experimental panel design is not applicable for this survey because the data collection methodology is not being tested; only the alternative contact frame information is.

Response rates for Pew Research polls typically range from 5 percent to 20 percent; these response rates are comparable to those for other major opinion polls. The response rate is the percentage of known or assumed residential households for which a completed interview was obtained. The response rate we report is computed using the American Association for Public Opinion Research's (AAPOR) Response Rate 3 (RR3) method. We expect slightly higher response rates than these because we use several methods to maximize response: an advance letter, the mandatory status of this survey, and the use of CATI methodology. These rates are adequate for our uses because this study is exploratory in nature. Our focus is not on producing population estimates; initial relative magnitudes will guide us in the next steps of our research.
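
For reference, AAPOR's Response Rate 3 counts complete interviews against all eligible cases plus an estimated share of the cases of unknown eligibility:

    RR3 = I / [ (I + P) + (R + NC + O) + e(UH + UO) ]

where I is complete interviews, P partial interviews, R refusals and break-offs, NC non-contacts, O other eligible non-interviews, UH and UO cases of unknown eligibility, and e the estimated proportion of unknown-eligibility cases that are eligible.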

Our research questions focus on: determining the quality of our contact frame in providing data related to alternate means of contact; the coverage of demographic and geographic characteristics; an evaluation of how to more effectively match and geocode addresses lacking a pre-assigned Census ID; and what we learned that could significantly reduce the Field Verification workload. To answer these questions, our methodology includes studies based on:

  1. A household-level analysis of our sampled cases that considers the instrument completion rate, refusal rate, and percentage of addresses returned undeliverable as addressed from the advance letter mailout (a minimal tabulation sketch follows this list).

  2. A household-level analysis of the interviewed cases that considers the number receiving an advance letter, the number who scheduled a call back, and the language in which the interview was conducted. We will also look at the proportion who confirmed that our address-to-phone-number association was correct, the household demographic and geographic characteristics, and the results of the GPS coordinates attitudinal question.

  3. A phone number/email and household-level analysis that considers contact information that was erroneous or missing from our frame.

  4. A person-level analysis that considers the demographic characteristics of respondents and their roster members.

  5. A vendor-level analysis that includes the matching results of the commercial data sources by provider.

  6. An analysis of key survey measures by stratum to determine whether the quality of our frame data for cases that successfully matched to the frame is comparable to that for cases that did not match.
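
Below is a minimal tabulation sketch for the household-level rates in item 1, broken out by stratum as in item 6. The case fields ('stratum', 'outcome') and the outcome codes are hypothetical placeholders for the actual survey paradata.

```python
from collections import Counter

def household_rates(cases):
    """Tabulate completion, refusal, and undeliverable-as-addressed (UAA) rates
    by stratum.  Each case is a dict with hypothetical 'stratum' and 'outcome' fields."""
    rates = {}
    for stratum in sorted({c["stratum"] for c in cases}):
        group = [c for c in cases if c["stratum"] == stratum]
        counts = Counter(c["outcome"] for c in group)
        rates[stratum] = {
            outcome: counts[outcome] / len(group)
            for outcome in ("complete", "refusal", "uaa")
        }
    return rates
```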

Data from this research will be included in research reports with the understanding that the data were produced for strategic and tactical decision-making and not for official estimates. Research results may be prepared for presentation at professional meetings or in publications in professional journals to promote discussion among the larger survey and statistical community, and encourage further research and refinement. Again, all presentations or publications will provide clear descriptions of the methodology and its limitations.





Question 2. Procedures for Collecting Information

The Census Bureau will conduct the 2013 National Census Contact Test with a national sample of 40,000 households, utilizing webCATI. Prior to the start of interviewing, all of the sample phone numbers will be validated using a vendor-provided service, following established confidentiality procedures. The vendor-provided service is a two-stage process. First, the telephone numbers are compared to the vendor's database of listed business and household telephone numbers, as well as data identifying cellular phone numbers. In the second stage, vendor staff manually dial each of the remaining (i.e., non-cellular, residential) telephone numbers, with trained staff monitoring the calls in case someone answers. The calling is always conducted from 9 a.m. to 5 p.m. in the time zone associated with the area code. The caller ID will display either “Unknown Caller” or “000-000-0000.” If someone answers, the vendor agent asks whether the number dialed is a business. Once residential or commercial status is established, the agent says either “Sorry, I have dialed the wrong number” for a residence, or “Thank you, we are conducting research” and “Thank you for your time” for a business, and ends the call. Use of this vendor service will make calling more efficient (i.e., less costly) and provide data to support the research objectives as well.

To collect survey information from respondents, interviewers will call households in the sample to collect alternative contact information such as addresses, cell phone numbers, and email addresses, as well as associated usage metrics, such as how often email is checked. The phone numbers and emails will be collected independently by the CATI operator and then compared against the frame data for each housing unit. This process allows us to evaluate the quality of the alternative contact frame information obtained from commercial vendors.
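
The frame-versus-response comparison could proceed along the lines of the sketch below, which normalizes phone numbers and e-mail addresses before matching them; the record layout and flag names are assumptions for illustration only.

```python
import re

def normalize_phone(number):
    """Keep only digits so formatting differences do not cause false mismatches."""
    return re.sub(r"\D", "", number or "")

def compare_contacts(frame_record, reported):
    """Flag whether respondent-reported phone numbers and e-mail addresses confirm,
    add to, or contradict the vendor frame.  The record layout is hypothetical."""
    frame_phones = {normalize_phone(p) for p in frame_record.get("phones", [])}
    reported_phones = {normalize_phone(p) for p in reported.get("phones", [])}
    frame_emails = {e.strip().lower() for e in frame_record.get("emails", [])}
    reported_emails = {e.strip().lower() for e in reported.get("emails", [])}
    return {
        "phone_confirmed": bool(frame_phones & reported_phones),
        "phone_missing_from_frame": bool(reported_phones - frame_phones),
        "phone_not_reported": bool(frame_phones - reported_phones),
        "email_confirmed": bool(frame_emails & reported_emails),
        "email_missing_from_frame": bool(reported_emails - frame_emails),
        "email_not_reported": bool(frame_emails - reported_emails),
    }
```

Aggregating flags of this kind over interviewed households would yield the proportions described in the key survey measures below.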

Key survey measures for this data collection are the following:

  • Proportion of telephone numbers and e-mail addresses which were not on the frame but were supplied during the survey interview. This may be an indicator of missing information in the frame.

  • Proportion of households with telephone numbers and e-mails which were contained on the files from the vendors but not provided by the respondent during the interview. This may indicate erroneous or out-of-date frame information.

  • Proportion of cases in the linked MAF ID stratum that contained a physical address different from the one listed in the vendor data, to estimate inconsistent frame information.

  • Number of MAF non-ID records successfully resolved through automated matching and geo-coding processes.

  • Analysis of the data provided by each vendor to determine which vendors have the most useful data for the U.S. Census Bureau to purchase.

Key survey measures will be used to determine whether the quality of the supplemental contact information is comparable for both sample groups (cases with a MAF ID match and those without a MAF ID match). This will establish a baseline measure for the mid-decade contact frame development research. Additionally, the non-MAF ID cases will provide an early opportunity to test enhancements to the automated Non-ID processing conducted by the Census Bureau.

The Census Bureau plans to conduct the 2013 National Census Contact Test in January 2013.

Question 3. Methods to Maximize Response

This survey is mandatory, which will be conveyed to respondents through the advance letter and during the telephone interview. An advance letter will be sent to all sampled households in order to maximize response and increase the respondent’s understanding of the legitimacy of the survey. Callbacks will be used to maximize response rates in the telephone interviews.

For respondents concerned about the validity of the survey, the Census Bureau will provide the OMB clearance number so that they may verify its legitimacy. In addition, this survey will send an alert memo to field offices, to which respondents’ questions will be directed by the Census Bureau’s “Am I in a survey?” webpage.

The Census Bureau will develop a set of Frequently Asked Questions (FAQ) and train the interviewers to use it in answering questions from respondents.

Question 4. Tests of Procedures or Methods

Cognitive testing was done to verify whether the questionnaire would be successful in eliciting phone numbers and e-mail addresses to validate against the vendor-provided data (Smirnova and Scanlon, forthcoming). Twelve interviews were conducted for this study throughout the Washington, D.C. Metropolitan area during August 2012. The sample was diverse in age, income and race. During these cognitive interviews, comprehension of and reactions to the questions were observed and analyzed. There were a few revisions made to wording of the questions in order to more accurately convey the intended meanings and to capture the appropriate responses, but the intent of each question was generally understood by respondents.

Overall, respondents seemed comfortable sharing telephone numbers and e-mail addresses for themselves. Some respondents were uncomfortable providing this information for themselves or for others in their households. Some respondents indicated that they would be more comfortable simply providing the domain of an e-mail than the specific e-mail addresses. As such, it was recommended that an additional question be asked regarding whether the respondent has e-mail addresses at a number of common e-mail domains. This will serve a similar goal of verifying information on the contact frame if the respondent declines to provide specific e-mail contact information. This suggestion was incorporated into the questionnaire.

The cognitive interview report also recommended that a length-of-stay question be asked of respondents, because contact data tended to be inaccurate for those who rented their apartment or house or had only lived there for a short time. In assessing the quality of our frame, this question will help contextualize the data. This recommendation also was incorporated into the questionnaire.

All associated contact materials, such as the Advance letter and webCATI script, have undergone an internal expert review. In selecting members of the expert review panel, we sought a diverse group of methodological and subject-matter experts; the panel included experts in survey methodology, questionnaire design, and research psychology.

Question 5. Contacts for Statistical Aspects and Data Collection

For questions on statistical methods or the data collection described above, please contact Dave Sheppard of the Decennial Statistical Studies Division at the Census Bureau (Phone: 301-763-9291 or email David.W.Sheppard@census.gov).


Attachments to the Supporting Statement


Attachment A DRAFT 2013 NCCT Advance Letter

Attachment B DRAFT 2013 NCCT WebCATI Questionnaire

Attachment C DRAFT 2013 NCCT Frequently Asked Questions (FAQs)

Attachment D DRAFT Test 15: Alternative Contact Test Cognitive Testing Report, Michelle Smirnova & Paul Scanlon



