SUPPORTING STATEMENT B
U.S. Department of Commerce
U.S. Census Bureau
American Community Survey Methods Panel Tests
OMB Control No. 0607-0936
B. Collections of Information Employing Statistical Methods
For each test or set of tests that uses similar methodology, we have outlined the general planned respondent universe, sampling method, sample size, and response rate. Once the details of specific tests are determined, this information will be refined as part of a non-substantive change request (and Federal Register Notice, as required).
Every effort is made to use the existing American Community Survey (ACS) sample for testing when the tests do not involve content changes. This approach reduces overall burden on the public, avoids additional burden for respondents already selected for the ACS, and reduces costs of the ACS program. Because the decision to use production sample is made on a case-by-case basis, the sample details listed below assume a sample independent of the production ACS is needed to conduct testing (except where noted). When an independent sample is used, addresses selected to participate in production ACS will be out-of-scope for the test.
Self-Response Mail Messaging and Contact Strategies Testing
Universe: The sample universe for the mail messaging tests consists of all mailable residential housing unit addresses in the United States from the Census Bureau’s Master Address File. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection and Sample Size: The sample design will be based on the ACS production multi-stage sample design. Once the number of treatments is determined, methods will be developed to sample addresses and randomly assign them to the various experimental treatments. We are also considering using production sample, when possible, to conduct these tests. The monthly ACS production sample of approximately 295,000 housing unit addresses is divided into 24 groups, called methods panel groups. Each methods panel group contains approximately 12,000 addresses and is a representative subsample of the entire monthly sample. Each monthly sample is representative of the entire yearly sample and the country. Any combination of methods panel groups can be used for a test.
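The group-based assignment described above can be sketched as follows. This is a minimal illustration only: the group labels, treatment names, and two-groups-per-treatment allocation are hypothetical assumptions, not the final design.

```python
import random

# 24 methods panel groups per monthly ACS sample (labels are illustrative).
panel_groups = list(range(1, 25))

# Hypothetical test: four experimental treatments plus a control, each
# assigned two randomly chosen methods panel groups (~24,000 addresses each).
treatments = ["control", "T1", "T2", "T3", "T4"]

random.seed(2021)  # fixed seed for a reproducible illustration
random.shuffle(panel_groups)

assignment = {t: panel_groups[2 * i:2 * i + 2] for i, t in enumerate(treatments)}

# Groups not assigned to a treatment remain in regular production.
production_only = panel_groups[2 * len(treatments):]
```

Because each methods panel group is a representative subsample of the monthly sample, any such random allocation of groups yields nationally representative treatment samples.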
Response Rate: The ACS mail contact strategy consists of sending up to five mailings to sampled housing unit addresses to solicit self-response including internet self-response. This strategy results in a self-response rate prior to Computer-Assisted Personal Interview (CAPI) of approximately 55.2 percent. The Self-Response Mail Messaging and Contact Strategies Tests are expected to have similar self-response rates to the production ACS.
Respondent Feedback Pilot
Universe: The sample universe for the Respondent Feedback Pilot consists of all mailable residential housing unit addresses in the United States from the Census Bureau’s Master Address File. No addresses can be in sample more than once in a 5-year period.
Sample Selection and Sample Size: This pilot will use production ACS sample. We have not yet determined which monthly ACS panel(s) will be used or for how long the pilot will run.
Response Rate: This pilot will collect feedback from ACS production internet respondents. Approximately 44 percent of sampled housing units respond online each month. It is unknown how many of those respondents will provide feedback.
Use of Administrative Data Test
Universe: The sample universe for the Administrative Data Use Test consists of all mailable residential addresses in the United States from the Census Bureau’s Master Address File as well as Group Quarters (GQ) facilities. Addresses and facilities selected to participate in production ACS will be out-of-scope for the test. Also, no housing unit addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection and Sample Size: The sample design will be based on the ACS production multi-stage sample design. Additional sampling methodology may be incorporated in the design based on the availability of administrative data in an area and other considerations. Once the number of treatments is determined, methods will be developed to sample addresses and randomly assign them to the experimental treatments. This test is estimated to need a national sample of 100,000 addresses and facilities, divided into treatments. The number of treatments will be determined as the details of the test are defined.
Response Rate: This test would utilize similar data collection strategies to production ACS; therefore, we would estimate similar overall response rates as the production ACS.
Group Quarters Testing
Universe: The sample universe for the Group Quarters Test consists of all non-institutional GQ facilities.
Sample Selection: The sample selection will include residents within non-institutional GQs. The quality of the GQ internet instrument will be monitored and evaluated against traditional data collection methods during the GQ internet pilot.
Sample Size: This test is estimated to need a national sample of 2,000 non-institutional GQ facilities (which includes interviewing 2,000 administrators and approximately 30,000 residents). Not all residents will complete the interview on the internet, but it will be offered. As this is a new mode of data collection for this population, we do not have a strong estimate of how many residents will choose to complete the survey using the internet instrument. Currently, about 20 percent of residents in non-institutional GQs complete the survey on paper; the remainder are interviewed in person or have their data completed using administrative records through the facility administrator.
Content Testing and Follow-up Interview
Universe: The sample universe for the Content Tests consists of all mailable residential addresses in the United States from the Census Bureau’s Master Address File. Addresses selected to participate in production ACS will be out-of-scope for the tests. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection: Content Testing is typically conducted with sample separate from the production ACS. The sample design will be based on the ACS production multi-stage sample design and sampling methods used in prior Content Tests. The sample size, estimated at 70,000 housing units, will be determined based on the topics being tested to ensure that differences between treatments are detectable to a sufficient degree. Past ACS Content Tests have excluded Puerto Rico, Alaska, and Hawaii for cost reasons. The Census Bureau will reconsider whether Hawaii and non-remote portions of Alaska should continue to be excluded. These tests will continue to exclude Remote Alaska and Puerto Rico.
Follow-up Interviews are conducted with respondents who completed the initial interview in order to measure response bias or variance on the question topics being tested. Currently no additional sampling is planned.
Response Rate: The ACS contact strategy consists of sending up to five mailings to sampled housing unit addresses to solicit self-response. Nonrespondents are then subsampled for in-person and telephone interviewing. This strategy results in an overall response rate of approximately 90.7 percent. The Content Tests employ the same contact strategy as production and are therefore expected to produce similar response rates.
Follow-up Interviews have historically been conducted by telephone. In the previous Content Test, conducted in 2016, the response rate for the follow-up was 45 percent. The Census Bureau is exploring options to increase this response rate for future tests, including changing the mode of follow-up.
Internet Instrument Testing
Universe: The sample universe for the internet instrument testing consists of all mailable residential housing unit addresses in the United States from the Census Bureau’s Master Address File. Addresses selected to participate in production ACS will be out-of-scope for the tests. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection: The sample design will be based on the ACS production multi-stage sample design. Once the number of treatments is determined, methods will be developed to sample addresses and randomly assign them to the various experimental treatments. We are also considering using production sample, when possible, to conduct these tests. The monthly ACS production sample of approximately 295,000 housing unit addresses is divided into 24 groups, called methods panel groups. Each methods panel group contains approximately 12,000 addresses and is a representative subsample of the entire monthly sample. Each monthly sample is representative of the entire yearly sample and the country. Any combination of methods panel groups can be used for a test.
Sample Size: Each internet instrument test will have a national sample of 60,000 addresses in the United States divided into treatments. The number of treatments will be determined as the details of the tests are defined.
Response Rate: The ACS mail contact strategy consists of sending up to five mailings to sampled housing unit addresses to solicit self-response including internet self-response. This strategy results in an internet self-response rate prior to CAPI of approximately 38.6 percent. The Internet Instrument Tests are expected to have similar internet self-response rates to the production ACS.
Respondent Help Testing
Universe: The sample universe for the respondent help testing consists of all mailable residential housing unit addresses in the United States from the Census Bureau’s Master Address File. Addresses selected to participate in production ACS will be out-of-scope for the tests. Also, no addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection: The sample design will be based on the ACS production multi-stage sample design. Once the number of treatments is determined, methods will be developed to sample addresses and randomly assign them to the various experimental treatments. We are also considering using production sample, when possible, to conduct these tests. The monthly ACS production sample of approximately 295,000 housing unit addresses is divided into 24 groups, called methods panel groups. Each methods panel group contains approximately 12,000 addresses and is a representative subsample of the entire monthly sample. Each monthly sample is representative of the entire yearly sample and the country. Any combination of methods panel groups can be used for a test.
Response Rate: This testing could include various forms of assistance for respondents who call Telephone Questionnaire Assistance (TQA) or who respond online. Approximately 20,000 calls are placed to TQA each month (though that estimate varies widely from month to month). The internet self-response rate prior to CAPI is approximately 38.6 percent, though it is unknown how many of those respondents would need or utilize online help while completing the ACS.
Nonresponse Followup Data Collection Testing
Universe: The sample universe for the Nonresponse Followup Data Collection Testing consists of all addresses in the United States from the Census Bureau’s Master Address File that did not respond to the ACS prior to the Computer-Assisted Personal Interviewing data collection. Also, no housing unit addresses can be in sample (production and methods panel tests) more than once in a 5-year period.
Sample Selection: The sample design will be based on the ACS production multi-stage sample design. Additional sampling methodology may be incorporated in the design based on the nonresponse distributions and other considerations.
Sample Size: This test is estimated to need a national sample of 100,000 addresses. The details of the test are yet to be defined.
Response Rate: The Nonresponse Followup Data Collection Testing focuses on in-person and telephone interviews conducted by Census Bureau Field Representatives (FRs). Since FRs also encourage response online, and respondents may mail back a paper questionnaire they received during the self-response phase of the ACS, response rates would depend on these factors as well as the specific data collection intervention, and thus are unknown at this time.
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The Methods Panel Tests typically follow the data collection procedures for the production ACS mailing strategy, with modifications based on the design and purpose of the test. Examples include changing the wording of a letter mailed to the sampled address, changing the content of a question, or sending an additional reminder to a sampled address. Section 4 below, on Test Procedures, outlines specific changes to the data collection methodology for the proposed tests. A summary of the data collection methodology for production ACS is below. More details about the data collection methodology for the ACS can be found in the ACS Design and Methodology report (U.S. Census Bureau, 2014).
The ACS employs a 3-month data collection process for each monthly sample: first through self-response, and later through in-person and telephone interviews.
To solicit self-response, the Census Bureau sends up to five mailings to potential respondents. The first two mailings are sent to all mailable addresses in the monthly sample. The first mailing is a package that includes a letter, a multilingual brochure, and a card with instructions on how to respond via the internet. The letter contains an invitation to participate in the ACS online and more information in a frequently asked questions format on the back of the letter. The multilingual brochure provides basic information about the survey in English, Spanish, Russian, Chinese, Vietnamese, and Korean, and provides a phone number to call for assistance in each language. A week later, the same addresses are sent a second mailing (reminder letter in a pressure seal mailer).
Responding addresses are removed from the address file after the second mailing to create a new mailing universe of nonrespondents; these addresses are sent the third and fourth mailings. The third mailing is a package that includes a letter, a paper questionnaire, and a business reply envelope. Four days later, these addresses are sent a fourth mailing (reminder postcard) which encourages them to respond.
After the fourth mailing, responding addresses are again removed from the address file to create a new mailing universe of nonrespondents. The remaining sample addresses are sent the fifth mailing (a more urgent final reminder letter with a due date in a pressure seal mailer).
The Census Bureau provides TQA for respondents who need assistance with completing the paper or internet questionnaires, who have questions about the survey or who would like to complete the ACS interview over the telephone instead of by other modes. Respondents may call the ACS toll-free TQA numbers listed on various ACS mail materials. The TQA staff answers respondent questions and/or completes the entire ACS interview using a computer-assisted telephone interview (CATI) instrument.
Two to three weeks after the fifth mailing is sent, responding addresses are removed and the unmailable and undeliverable addresses (from the initial sample) are added to create the universe of addresses eligible for the CAPI Nonresponse Followup operation. Of this universe, a subsample is chosen to be included in the CAPI operation, which starts at the beginning of the month following the fifth mailing. A pressure seal reminder letter is also sent to all mailable addresses sampled for CAPI at the start of the interviewing month. This letter lets respondents know that a Census Bureau field representative (FR) may call or visit them to complete the interview and encourages them to complete the survey online, if possible. Census Bureau FRs first attempt to interview those selected for CAPI by phone. If the FR is unable to complete a phone interview, they visit the address to conduct an in-person interview.
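The winnowing of the mail universe into the CAPI Nonresponse Followup universe described above can be sketched as follows. The address counts, response sets, and flat subsampling rate are illustrative assumptions; the actual ACS uses differential, design-based CAPI subsampling rates.

```python
import random

random.seed(7)  # fixed seed for a reproducible illustration

# Hypothetical initial sample: mailable and unmailable addresses.
mailable = {f"addr{i:05d}" for i in range(1000)}
unmailable = {f"um{i:03d}" for i in range(50)}

# Addresses that self-responded during the mail phase are removed;
# unmailable and undeliverable addresses are added to form the
# universe eligible for CAPI Nonresponse Followup.
responded = set(random.sample(sorted(mailable), 550))
capi_universe = (mailable - responded) | unmailable

# A subsample of the eligible universe is selected for CAPI interviewing.
capi_rate = 0.4  # hypothetical flat rate
capi_sample = set(
    random.sample(sorted(capi_universe), int(capi_rate * len(capi_universe)))
)
```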
In addition to the ACS data collection from housing units, the data are also collected from a sample of about 19,000 GQ facilities each year. A GQ is a place where people live or stay in a group living arrangement that is owned or managed by an entity or organization providing housing and/or services for the residents. At each sampled GQ facility, one GQ contact is interviewed to collect data about the GQ and to provide a list of residents in the GQ. This list is used to randomly select the sample of individuals to complete the ACS. An introductory letter is mailed to the sample GQ approximately two weeks prior to the period when an FR may begin making contact with the GQ. Resident-level personal interviews with sampled GQ residents are conducted using CAPI, but bilingual paper questionnaires can also be used for self-response. The GQ CAPI and paper questionnaires contain questions for one person. There are two categories of GQs: institutional and non-institutional. Institutional GQs include places such as correctional facilities and nursing homes. Non-institutional GQs include college housing, military barracks, and residential treatment centers. Most interviews conducted in GQs are interviewer-administered (94 percent of interviews in institutional GQs and 75 percent in non-institutional GQs), but some GQ respondents self-respond using a paper questionnaire.
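The two-stage GQ selection described above (sample facilities, then randomly select residents from each sampled facility's roster) can be sketched as follows. The facility names, roster sizes, and within-facility sample size are hypothetical.

```python
import random

random.seed(11)  # fixed seed for a reproducible illustration

# Hypothetical rosters provided by each sampled facility's GQ contact.
rosters = {
    "college_dorm_A": [f"resident_{i}" for i in range(120)],
    "barracks_B": [f"resident_{i}" for i in range(300)],
    "group_home_C": [f"resident_{i}" for i in range(15)],
}

residents_per_facility = 10  # hypothetical within-facility sample size

# Randomly select residents from each roster to complete the ACS.
gq_sample = {
    facility: random.sample(roster, min(residents_per_facility, len(roster)))
    for facility, roster in rosters.items()
}
```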
Similar to production ACS, the Methods Panel Tests include a well-researched mail contact strategy (as outlined in Section 2 above) to encourage self-response. TQA is available via a toll-free number in the mailings, which respondents may call to obtain help in completing the survey, to address questions regarding their participation in the ACS, or to complete the questionnaire over the phone. Similarly, the mailing materials and online survey will provide links to additional information about the ACS as well as Census Bureau’s policies on privacy, security, and accessibility.
Nonresponse Followup operations are conducted to ensure a final high weighted response rate. The Nonresponse Followup operations are conducted via computer-assisted interviewing for a sample of addresses for which we have not obtained response. We maintain high levels of data accuracy and response rates through interviewer instruction, training, and close monitoring of the data.
Data collection instruments are available to respondents and interviewers in English and Spanish. Respondents may also complete the survey via a phone interview in Russian, Chinese, Vietnamese, and Korean.
Additional methods for maximizing response are explored as part of several of the proposed tests. Once details of specific strategies are determined they will be provided as part of a non-substantive change request.
The Census Bureau continuously engages and responds to stakeholders to adapt the way we gather data, administer the survey, and conduct business. The ACS Methods Panel program allows the Census Bureau to respond to emerging trends and changes in our nation that spawn new data needs by building on our comprehensive research agenda. This work not only improves the ACS, but also allows the Census Bureau to innovate responsively across key aspects of our work. The ACS Methods Panel program also provides an opportunity to research and test elements of survey data collection that relate to the decennial census.
The series of tests proposed over the next three years allows the Census Bureau the opportunity to improve data quality, reduce data collection costs, improve questionnaire content and data collection materials, and react to emerging needs.
Self-Response Mail Messaging and Contact Strategies Testing are focused on studying methods to increase self-response. The proposed tests would evaluate changes to the mailings, such as using plain language to improve communication, changing the look and feel of the materials, updating messages to motivate response, and adding or removing materials included in the mailings. Changes to the contact method, the number of contacts, and the timing of the contacts may also be tested.
The first Self-Response Mail Messaging test, called the Strategic Framework Field Test, is planned for the fall of 2021.
In 2017, the Census Bureau began the Strategic Framework Project—a long-term, multi-phase project to update the messaging in the ACS mail materials. The goals of the project are to improve communication with potential respondents, increase self-response to the survey, reduce program costs, and reduce respondent burden. The project includes research of best practices in messaging to gain survey cooperation, development of new materials based on the research, and testing (qualitative and quantitative) of the new materials.
After conducting research on best practices in communications in a variety of disciplines, the Census Bureau’s Strategic Framework Project team made recommendations for messaging in ACS mail materials in two reports (Oliver, Heimel, and Schreiner, 2017; Schreiner, Oliver, and Poehler, 2020). Following the recommendations, new materials were designed holistically resulting in four sets of materials. The Census Bureau’s Center for Behavioral Science Methods (CBSM) tested the materials in three rounds of cognitive testing (report forthcoming). The materials are now ready to be field-tested.
The purpose of the 2021 Strategic Framework Mail Materials Test is to identify which set of new materials is most successful at increasing self-response. Effectiveness of individual messages or ideas will not be isolated in this test, but it will provide evidence of overall effectiveness of the materials.
The new materials were developed with the following in mind: limit the number of messages in each mailing, reduce repetitious messaging, use new appeals in each mailing, use messages justified by research, and make the connection to the well-known Census Bureau brand in a more prominent way. The new materials were also designed with a heavy focus on the use of plain language writing to lower the reading level of the letters and plain language design principles (such as white space, organization of letter text, graphics, and color) to make the letters easier to read.
While the designs of the materials are new, they adhere to the current ACS mail contact strategy (see Section 2.2), which includes the type of mailers (package, pressure seal, or postcard) and the number and timing of mailings that are sent. Each set of materials was designed holistically so that the messaging and look-and-feel within and across the five mailings are interconnected. The Census Bureau designed three sets of updated ACS mail materials and a team of researchers outside of the Census Bureau designed a fourth set.
Experimental Design and Sample Size: This test will include four treatments and a control. This test will use production ACS sample. The monthly ACS production sample consists of approximately 295,000 housing unit addresses and is divided into 24 nationally representative groups (referred to as methods panel groups) of approximately 12,000 addresses each. The sample for each of the four experimental treatments in this test will consist of two randomly assigned methods panel groups (approximately 24,000 mailing addresses per treatment). The sample for the control treatment will also consist of two randomly assigned methods panel groups. The control treatment will receive production ACS materials, but will be sorted and mailed separately from production. All remaining methods panel groups will receive production ACS materials and will not be part of the test. The test will exclude Remote Alaska since sampled addresses do not receive mail materials. Similarly, Puerto Rico is out of scope for this test because different mail methods are used.
Evaluation: The primary evaluation will compare the self-response rate of each treatment with that of the control. See Attachment A for more details about the treatment development, the analysis plan and methodology, and draft mail materials.
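A simple way to frame the treatment-versus-control comparison is a two-proportion z-test on self-response rates, sketched below with illustrative counts (roughly 24,000 addresses per arm, as in the design). The analysis plan in Attachment A governs the actual evaluation; a production comparison would use design-based weighted estimates and variances rather than this simple-random-sampling approximation.

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided z-test for a difference between two self-response rates.

    x1/n1: responding addresses and sample size in a treatment;
    x2/n2: responding addresses and sample size in the control.
    Assumes simple random sampling, which the ACS design only approximates.
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p1 - p2, z, p_value

# Illustrative counts only: a treatment at 56.25 percent versus a control
# near the production self-response rate of about 55.2 percent.
diff, z, p = two_proportion_z(13500, 24000, 13248, 24000)
```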
The Respondent Feedback Pilot is designed to collect respondent feedback at the time of the interview, with the pilot focusing on internet respondents. Respondents will not be required to answer the feedback question.
The Use of Administrative Data Test will examine whether administrative data can reduce the burden of existing questions by allowing the questions to be modified. A field test is proposed for questions that may need to remain on the questionnaire but could be modified in conjunction with the use of administrative records.
Group Quarters Testing will focus on evaluating an internet version of the ACS available to non-institutional GQ residents, especially in college dorms, military barracks, and group homes. We would evaluate the quality of the data received from the internet instrument compared with traditional data collection methods for GQs (paper questionnaires and interviewer-administered) as well as assess operational issues with offering the internet option, including feedback from interviewers.
Content Testing is conducted by the Census Bureau periodically to improve data quality. Working through the Office of Management and Budget Interagency Committee for the ACS, the Census Bureau solicited proposals from other Federal agencies to change existing questions or add new questions to the ACS. These proposals included changes to the following questions: household roster, educational attainment, health insurance, disability, means of transportation to work, income, weeks worked, Supplemental Nutrition Assistance Program (SNAP), condominium fees, and home heating fuel. Additionally, three new questions on solar panels, electric vehicles, and sewage disposal were proposed. The objective of content testing is to determine the impact of changing question wording and response categories, as well as redefining underlying constructs, on the quality of the data collected. The Census Bureau proposes evaluating changes to current questions by comparing the revised questions to the current ACS questions. For new questions, the Census Bureau proposes comparing the performance of two versions of any new questions and benchmark results with other well-known sources of such information. The questions would be tested using all modes of data collection. Response bias or variance may also be measured to evaluate the questions by conducting a followup interview with respondents. Multiple tests may be conducted. Additional content testing may include a shift in the content collection strategy for the fifth person in the household on the paper questionnaire. 
In order to reduce respondent burden for large households who self-respond using the paper questionnaire, as well as potentially increase self-response by reducing the size of the paper questionnaire, one testing proposal includes no longer collecting detailed data for Person 5 on the paper questionnaire (i.e., the same items collected for Person 1 through 4) and only collecting basic demographic information (as is currently done for Person 6 through Person 12). Detailed person information for households with five or more people would be collected through a telephone follow-up, similar to what is currently done for households with six or more people.
Internet Instrument Testing is proposed to test and evaluate revised features of the internet instrument. In 2013, the ACS incorporated the use of an internet instrument to collect survey responses. The design of the instrument reflected the research and standards of survey data collection at that time. With a growing population using the internet to respond to the ACS, as well as the increased use of smartphones and other electronic devices with smaller screens, an evaluation and redesign of the internet instrument is needed. Design elements will be developed and tested based on input from experts in survey methodology and web survey design. Testing may include revisions focused on improving login procedures and screen navigation, improving the user interface design, as well as methods to decrease respondent burden. Multiple tests may be conducted.
Respondent Help Testing will focus on methods to answer respondent questions about the survey and improve operational efficiency. If respondents need help completing the ACS or have questions, they can call the TQA toll-free hotline. When respondents call the TQA, they enter an Interactive Voice Recognition (IVR) system, which provides some basic information on the ACS and recorded answers to frequently asked questions. Callers can also request to speak directly to a Census Bureau employee. The Census Bureau is proposing potential testing of changes to the IVR system to improve content and efficiencies in the system. Other methods of offering help to respondents may also be explored and tested, such as the use of chatbots and live online chat assistance.
Nonresponse Followup Data Collection Testing is proposed to evaluate the use of adaptive survey design techniques for the ACS CAPI Nonresponse Followup operation. Models and rules would be developed to predict case outcomes and determine interventions for a case, such as assigning a case to a refusal specialist. The models and rules would also prioritize cases based on the likelihood of completing an interview. The adaptive approach would be evaluated by comparing results to traditional methods of case assignment and progress.
The Census Bureau will collect and process these data as needed for each test. Within the Census Bureau, please consult the following individuals for further information in their areas of expertise.
Statistical Aspects
Joan Hill, Assistant Division Chief for Experiments and Evaluations
Decennial Statistical Studies Division
Phone: (301) 763-4286
Overall Data Collection
Donna Daily, Division Chief, American Community Survey Office
Phone: (301) 763-5258
Oliver, B., Heimel, S., and Schreiner, J. (2017). “Strategic Framework for Messaging in the American Community Survey Mail Materials.” Washington, DC: U.S. Census Bureau. Retrieved on May 6, 2020 from https://www.census.gov/content/dam/Census/library/working-papers/2017/acs/2017_Oliver_01.pdf
Schreiner, J., Oliver, B., and Poehler, E. (2020). “Assessment of Messaging in the 2018 American Community Survey Mail Contact Materials.” Washington, DC: U.S. Census Bureau. Retrieved on May 6, 2020 from https://www.census.gov/content/dam/Census/library/working-papers/2020/acs/2020_Schreiner_01.pdf
U.S. Census Bureau (2014). American Community Survey Design and Methodology. Washington, DC: U.S. Census Bureau. Accessed February 1, 2018. https://www.census.gov/programs-surveys/acs/methodology/design-and-methodology.html
Attachment A: Research and Analysis Plan for the Strategic Framework Field Test
Author: Mary Reuling Lenaiyasa (CENSUS/PCO FED)
File Created: 2021-06-25