
Department of Commerce

United States Census Bureau

OMB Information Collection Request

American Community Survey Methods Panel Tests

OMB Control No. 0607-0936



Part A – Justification


Question 1. Necessity of the Information Collection


The U.S. Census Bureau requests authorization from the Office of Management and Budget (OMB) to conduct the American Community Survey (ACS) Methods Panel tests.


American Community Survey


The ACS collects detailed demographic, social, economic, and housing data from about 3.5 million addresses in the United States and 36,000 in Puerto Rico each year. The ACS also collects detailed socioeconomic data from about 195,000 residents living in Group Quarters (GQ) facilities. Resulting tabulations from this data collection are provided on a yearly basis. ACS data are critical to the missions of 23 other federal agencies and are the only comparable source of data for all of America’s communities. Federal and state government agencies use these data to evaluate and manage federal programs and to distribute funding for various programs, including food stamp benefits, transportation dollars, and housing grants. State, county, and community governments, nonprofit organizations, businesses, and the general public use information such as housing quality, income distribution, journey-to-work patterns, immigration data, and regional age distributions for decision-making and program evaluation.


The Census Bureau mails survey materials via the U.S. Postal Service to a sample of approximately 295,000 housing unit (HU) addresses each month. For addresses eligible to receive survey materials by mail, an initial mailing package is sent first, directing respondents to complete the survey online or to call our telephone questionnaire assistance (TQA) line; a reminder mailing follows. For addresses from which the Census Bureau has not received a response, a paper questionnaire package is sent, followed shortly after by a reminder postcard. Finally, if a response via the internet, mail, or TQA has not been received, a final reminder is sent.


For addresses that were mailed survey materials but did not respond by internet, mail, or TQA, the Census Bureau selects a subsample of housing units and assigns them to the Computer-Assisted Personal Interview (CAPI) nonresponse followup (NRFU) data collection mode. Unmailable housing unit addresses are also sampled and included in the CAPI data collection mode.


For sample housing units in Puerto Rico, a different mail strategy is employed. Based on the results of testing in 2011 and concerns with the resulting internet response rates from that testing, we deferred the introduction of an internet response option in order to further assess the best implementation approach. Therefore, we continue to use the existing mail strategy, with no references to an internet response option. Our first Puerto Rico mailing includes a pre-notice letter in Spanish and English. The second Puerto Rico mailing includes an introductory letter, a frequently asked questions (FAQ) brochure, a copy of the paper questionnaire, an instruction booklet, and a return envelope. The third Puerto Rico mailing is a reminder postcard. The fourth Puerto Rico mailing is a replacement package similar to the second mailing and is mailed only to nonrespondents. The fifth Puerto Rico mailing is a reminder postcard, also mailed only to nonrespondents.


In 2016, the HU sample yielded approximately 133,000 self-response interviews per month, or 63 percent of the household addresses in sample. The HU CAPI NRFU yielded an estimated response rate of approximately 88 percent in 2016. The 2016 final weighted response rate for the ACS was 95 percent.


In addition to selecting a sample of residential addresses, the Census Bureau selects a sample of GQs. The ACS samples about 195,000 residents living in 18,000 GQ facilities each year. A GQ is a place where people live or stay in a group living arrangement that is owned or managed by an entity or organization providing housing and/or services for the residents. Examples of GQs include college dorms, nursing homes, and prisons.


Data are collected in GQs by first sending an introductory letter to the facility administrator before a Field Representative (FR) makes contact. The FRs make initial telephone contact to schedule an appointment to conduct a personal visit at the sample GQ and also use a GQ listing sheet to generate the subsample of persons for ACS interviews. Resident-level personal interviews with sampled GQ residents are conducted using CAPI or paper questionnaires. The GQ CAPI and paper questionnaires contain questions for one person (not the entire GQ facility).


Methods Panel Tests

An ongoing data collection effort with an annual sample of the magnitude of the ACS requires that the Census Bureau continue research, testing, and evaluations aimed at improving data quality, achieving survey cost efficiencies, reducing respondent burden, and improving ACS questionnaire content and related data collection materials. The ACS Methods Panel program is a program of research and testing focused on enhancing the quality of the respondent experience, survey operations, and data.


From 2018 to 2021, the ACS Methods Panel program may include testing methods for increasing survey efficiencies, reducing survey cost, improving the respondent experience, increasing response rates, and improving data quality. At this time, plans are in place to propose several tests:

  • Mail Materials Test that will explore ways to address respondent concerns about mandatory language in mail materials while maintaining or increasing self-response rates.

  • Self-Response Mail Messaging Tests that research various aspects of the mail materials and contact strategy.

  • Multilingual Testing that would explore ways to engage respondents with limited English proficiency during the self-response portion of data collection.

  • Content Tests that will test new and/or revised content.

  • Respondent Burden Questions Test that would incorporate questions about respondents’ experience with the survey.

  • Respondent Comment/Feedback Test that would add a comment field to make it easy for respondents to give feedback about the ACS.

  • Administrative Data Use Test to assess the potential for supplementing or replacing survey content with administrative data.

  • Group Quarters Test to assess the feasibility of making an internet self-response option available to non-institutional GQ residents.

Since the ACS Methods Panel program is designed to address emerging issues, we may conduct additional testing as needed. Any additional testing would focus on methods for reducing data collection costs, improving data quality, revising content, or testing new questions that have an urgent need to be included on the ACS. The tests may be conducted on HUs, GQs, or both. Although burden hours associated with the tests described above are included in this package, any additional burden hours needed for other testing would be requested separately in the future, as needed.


The Census Bureau collects data for this survey under authority of Title 13, United States Code, Sections 141, 193, and 221. All data are afforded confidential treatment under Section 9 of that Title.


Detailed information is provided in this request for the Mail Materials Test. For the other tests described in this justification, when revisions to the content of the questionnaire are being tested or when revisions to other materials are likely to be of high interest to the public, we will provide an opportunity for public comment via a 30-day Federal Register notice, and additional details will be provided. When tests are not likely to be of high interest to the public and are of an operational nature, we will provide updated plans with additional details through a non-substantive change request.


Mail Materials Test

The Mail Materials Test is designed to continue research on ways to improve the respondent experience and address respondent concerns about the perceived intrusiveness of the ACS, balanced against gaining response to the survey. This test would combine successful experimental changes from several prior tests and determine the overall impact of those changes on response rates.


Previous research on mandatory messaging has demonstrated that removing mandatory messages from the outgoing envelopes causes a decline in self-response rates (Barth, Zelenak, Asiala, Castro, & Roberts, 2016). Additionally, softening messages in the letters and postcards also causes a decline in self-response return rates and, in one case, a decline in overall response rates after CAPI (Oliver, Risley, & Roberts, 2016; Longsine et al., forthcoming). Implementing these changes may increase data collection costs, reduce the quality of the estimates produced from the survey, or both.



Conversely, the Census Bureau has found that updating the look and feel of the mail materials, based on recommendations from the Messaging and Mail Package Assessment research, may have a positive impact on the respondent experience. The use of bulleted lists, fewer words, and callout boxes, for example, could reduce burden by making the materials easier to read. In previous tests, implementing these types of changes may have had a positive impact on self-response rates, and highlighting the mandatory nature of the survey improved self-response rates (Oliver, Risley, & Roberts, 2016; Reingold, 2014). Additionally, redesigning the front page of the questionnaire to provide clearer guidance on how to respond and how the survey is conducted, and using a letter instead of a postcard for the final mailing, may have a positive impact on self-response rates (Longsine, Oliver, Castro, & Rabe, forthcoming). Finally, using pressure seal mailers has no significant impact on response but reduces costs compared to a letter (Risley, Barth, Cheza, & Rabe, forthcoming).


The Census Bureau proposes to conduct a field test of seven treatments as part of the production ACS September 2018 panel. This requires a change in the mail materials for a subset of the sampled addresses, while adhering to the same data collection protocols as production ACS. No additional burden is expected.


The ACS has five mailing pieces: 1) an initial mailing package, 2) a first reminder, 3) a paper questionnaire package (if a response has not been received), 4) a reminder postcard (if a response has not been received), and 5) a final reminder (if a response has not been received). Changes would be made to each of these mailing pieces in the experimental treatments as described below.


Treatment 1 (Modified Production) isolates a few changes shared with the experimental treatments to allow cleaner comparisons. The materials in this treatment are primarily the same as the production ACS materials, with the following exceptions: the FAQ brochure is not included in the initial mailing and information from the brochure is instead added to the back of the letter, the FAQ brochure and Instruction Card are not included in the paper questionnaire mailing, and the wording in the final reminder is updated.


Treatment 2 (Emphasized Mandatory) is similar to the Revised Design treatment from the 2015 Summer Mandatory Messaging Test (Oliver, Risley, & Roberts, 2016). This treatment uses mail materials designed to better emphasize the benefits of survey participation. The changes include the use of different logos on the envelopes and letters, the use of bold lettering and callout boxes to highlight elements of the materials, and the addition of a box that says “Open Immediately” on some of the envelopes. The mandatory nature of the survey is highlighted by using bold text and isolating the sentences about the survey’s mandatory nature in the materials. Some materials, such as the FAQ brochure, are excluded from the mailings to simplify the materials and focus respondents’ attention on what they need to do. Content from the FAQ brochure is included on the back of the letters.


Treatment 3 (De-emphasized Mandatory with Modified Questionnaire) incorporates the same design features used in Treatment 2. The main difference between the treatments is that Treatment 3 does not highlight the mandatory nature of the survey. Instead, mandatory statements are either similar to the control materials or softened by changing their placement or unbolding the text. The front page of the questionnaire was also updated to provide clearer guidance on how to respond to the survey.


Treatment 4 (Softer/Eliminated Mandatory) again incorporates the same design features used in Treatment 2 but further softens mandatory statements by removing them from some of the mailings. Additionally, the “Open Immediately” box is not added to the initial mailing and paper questionnaire mailing.


Treatment 5 (De-emphasized Mandatory with Current Questionnaire) is designed to isolate the changes to the front of the paper questionnaire made in Treatment 3; it is identical to Treatment 3 except that it uses the current questionnaire. Note that Treatments 2 and 4 also use the updated questionnaire.


Treatment 6 (Production with Tri-fold Pressure Seal Letters) is designed to test using a tri-fold pressure seal letter. The materials will be identical to production except that the second and fifth mailings will be tri-fold pressure seal letters instead of bi-fold pressure seal letters.


Treatment 7 (Production, Sorted Separately) is identical to the production materials, including the use of bi-fold pressure seal letters. The mailings are sorted separately from production to ensure a sample size similar to Treatment 6 for comparison purposes. Previous ACS testing has found that mailings that go to more respondents arrive more quickly due to U.S. Postal Service efficiencies in processing.


See Table A1 in Attachment A for a comparison of the seven treatments. See Attachment B for the Treatment 1 materials, Attachment C for Treatment 2 materials, Attachment D for Treatment 3 materials, Attachment E for Treatment 4 materials, and Attachment F for Treatment 5 materials. Relevant production materials including the FAQ brochure and Instruction Card (which were removed from the above treatments) are provided for reference in Attachment G. Attachment G also contains the pressure seal letters. The content of the letters is the same for Production, Treatment 6, and Treatment 7. The way the pressure seal letter is folded is the only difference.


To analyze the impact of various features of the mail materials, the Census Bureau will compare self-response rates and final response rates across treatments, as sketched in the example following this list:

  • To understand the impact of removing the FAQ brochure and Instruction Card and of the changes to the final reminder, Treatment 1 (Modified Production) will be compared with Treatment 7 (Production, Sorted Separately).

  • To understand the impacts of design and wording changes, Treatment 1 (Modified Production) will be compared to Treatment 2 (Emphasized Mandatory), Treatment 3 (De-emphasized Mandatory with Modified Questionnaire), and Treatment 4 (Softer/Eliminated Mandatory).

  • To assess the impact of the questionnaire cover changes, Treatment 3 (De-emphasized Mandatory with Modified Questionnaire) will be compared to Treatment 5 (De-emphasized Mandatory with Current Questionnaire).

  • To assess the impact of using tri-fold rather than bi-fold pressure seal letters, Treatment 6 (Production with Tri-fold Pressure Seal Letters) will be compared to Treatment 7 (Production, Sorted Separately).

  • To understand the net impact on return rates as well as cost and the impact on the reliability of the ACS estimates, the Census Bureau will also compare each treatment to Treatment 7 (Production, Sorted Separately).
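
For illustration, each of these pairwise comparisons amounts to testing the difference between two response rates. The minimal Python sketch below uses a pooled two-proportion z-test with hypothetical counts; it is illustrative only, since a production analysis would use the ACS replicate weights to reflect the complex sample design.

    import math
    from statistics import NormalDist

    def two_proportion_z_test(x1, n1, x2, n2):
        # Pooled two-proportion z-test; returns (z, two-sided p-value).
        p1, p2 = x1 / n1, x2 / n2
        p_pool = (x1 + x2) / (n1 + n2)  # pooled response rate under the null
        se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
        z = (p1 - p2) / se
        p_value = 2 * (1 - NormalDist().cdf(abs(z)))
        return z, p_value

    # Hypothetical counts: Treatment 1 vs. Treatment 7 self-response returns,
    # with roughly 24,000 addresses per treatment (see the sample design below).
    z, p = two_proportion_z_test(12_240, 24_000, 12_000, 24_000)
    print(f"z = {z:.2f}, p = {p:.3f}")  # z = 2.19, p = 0.028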


Additionally, the change to the cover of the questionnaire will require the Census Bureau to move a date field from the cover page to the inside page of the questionnaire. The Census Bureau will assess the impact on data quality by examining item missing data rates for the date.

The monthly ACS production sample of approximately 295,000 addresses is divided into 24 groups, where each group contains approximately 12,000 addresses. Each group is a representative subsample of the entire monthly sample, and each monthly sample is representative of the entire yearly sample and the country. The Census Bureau will use two randomly selected groups for each treatment. Hence, each treatment will have a sample size of approximately 24,000 addresses. In total, approximately 168,000 addresses will be used for the seven treatments. This sample size is able to detect differences of approximately 1.25 percentage points between the self-response return rates of any two treatments, assuming 80 percent power, α = 0.1, and a 50 percent response rate.
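
As a rough check on the stated precision, the minimum detectable difference for a comparison of two treatments can be approximated with the standard normal-approximation formula for two proportions. The sketch below assumes a two-sided test at α = 0.1, 80 percent power, a 50 percent response rate, and 24,000 addresses per treatment; it ignores design effects from the complex ACS sample, which is why it yields a figure slightly below the 1.25 percentage points cited above.

    import math
    from statistics import NormalDist

    def minimum_detectable_difference(n_per_group, p=0.5, alpha=0.10, power=0.80):
        # Minimum detectable difference between two proportions
        # (normal approximation, equal group sizes, no design effect).
        z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided critical value
        z_beta = NormalDist().inv_cdf(power)           # quantile for target power
        se = math.sqrt(2 * p * (1 - p) / n_per_group)  # SE of the rate difference
        return (z_alpha + z_beta) * se

    print(minimum_detectable_difference(24_000))  # ~0.0114, i.e., ~1.1 points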

Self-Response Mail Messaging Tests

To address concerns with declining response rates and increasing costs, the Census Bureau plans to study methods to increase self-response, the least expensive form of data collection. The Census Bureau currently sends up to five mailings to a sampled address to inform the occupants that their address has been selected and to encourage them to respond to the ACS.

These self-response mail messaging tests would potentially include:

  • updating the mail materials based on the Strategic Framework research conducted in 2017.

  • testing changes to the mailing envelope(s).

  • using due dates and commitment interventions.

  • continuing research on adaptive design of mail materials that address different response preferences of respondents, building on testing conducted in 2017.

  • modifying the mail schedule to address no longer utilizing Computer-Assisted Telephone Interviews (CATI) in the second month of data collection, potentially including an additional mailing or contact via another method, such as an automated dialer.

The Census Bureau has conducted considerable research to improve its mail materials and messaging to address declining response rates and concerns from the public about the legitimacy of the survey and its legal mandate, purpose, and burden on respondents. In 2017, the Census Bureau conducted research to develop a strategic framework for messaging in the ACS mail to help in this endeavor (Oliver, Heimel, & Schreiner, 2017). This framework provides an overall mail messaging strategy by establishing the specific objective and audience for each mailing sent to respondents. In addition to evaluating the current mail materials in the context of this framework, the next step in this research is to develop revised mail materials and test them.


Changes to the mailing envelopes include updated logos and layout of the return address label based on recommendations from the American Community Survey Messaging and Mail Package Assessment Research (Reingold, Penn Schoen Berland, & Decision Partners, 2014). Additionally, the National Academy of Sciences suggested that respondents may be concerned about the legitimacy of the mailings because of a disconnect between the address of the National Processing Center in Jeffersonville, IN displayed on the envelopes and letterhead that indicates that the Census Bureau is located in Washington, D.C. (National Academies of Sciences, 2016).


The Reingold team, members of the National Academy of Sciences, and members of the Social and Behavioral Science Team (SBST) have suggested to the Census Bureau that providing a due date for a response would help respondents prioritize their commitments and serve as a reminder to respond in a timely manner. Now that CATI is no longer a nonresponse followup operation, all nonrespondents receive the fifth mailing. Operationally, this makes it easier to send out response due dates, making a test of this sort possible where previously there were barriers. An increase in self-response, and more timely self-responses, would decrease operational costs. The Census Bureau would like to test variations of the wording and placement of due dates or commitment dates in the mailings to determine the effectiveness, if any, of the idea for the ACS.


The use of due dates (or deadline messages) on mail materials has been previously tested in the decennial census (see Bouffard, Brady, & Stapleton, 2004; Martin, 2007; Stokes et al., 2011). In one of those studies, researchers did not see an increase in cooperation rates for the initial questionnaire mailing but did see an impact on the speed of response (Bouffard, Brady, & Stapleton, 2004). An experiment conducted during the 2010 Census showed a significant increase in return rates when a date by which the form should be mailed back (though not the term ‘deadline’) was included on the advance letter, initial mailing letter and envelope, and reminder postcard (Stokes, Reiser, Bentley, Hill, & Meier, 2011). Additionally, the SBST reviewed the ACS mail materials in 2015 and suggested that “sometimes people fail to respond due to being too busy, forgetful, or distracted.” They recommended, “Interventions help elicit commitments and follow through on them, as well as anchor people to quicker responses.” These interventions could include a commitment box in the initial letter, a calendar on which respondents could indicate a date when they would complete the survey and hang up as a reminder, or potentially asking respondents to set a goal for when they will complete the survey and sign next to the goal to confirm their commitment (Feygina, Foster, & Hopkins, 2015). ACS testing of the mail materials could involve including due dates on the mail materials as well as the use of commitment interventions.


In 2017, the Census Bureau conducted an adaptive design mail test to try to identify areas of the country where receiving a paper questionnaire in the initial mailing may improve self-response rates. Pending the results of that study, we may continue to refine those results to better identify areas that benefit from receiving paper questionnaires earlier in the mail process as well as exploring areas where sending sampled addresses directly to CAPI would be beneficial.


As part of efforts to continually evaluate ACS program operations, the Census Bureau decided to no longer collect ACS NRFU information from respondents through our CATI operation. The last month of ACS CATI NRFU was September 2017. The Census Bureau will continue to conduct the TQA and Failed Edit Follow Up operations for the ACS. The ACS data collection schedule runs for approximately three months for each monthly sample, with a majority of the mail contacts occurring in the first month and CAPI occurring in the third month (the former CATI contact occurred in the second month). With the decision to no longer contact respondents via CATI NRFU, we now propose to explore various options for the first two months of data collection, including changing the mailing schedule and/or adding a sixth contact, either via another mail contact or an automated telephone reminder.


Multilingual Testing

The ACS provides mail materials to respondents in English, with Spanish translations provided only on two instruction cards; a multilingual brochure included in the initial mailing provides basic information about the survey in English, Spanish, Chinese, Vietnamese, Russian, and Korean.1 Respondents can call TQA to speak to an interviewer in those languages to complete the interview over the phone or to request a paper questionnaire in Spanish. Field Representatives also conduct interviews in a variety of languages to meet the needs of respondents.


An overwhelming majority of self-response to the ACS is in English (fewer than 1,000 responses are received in Spanish in a year). It is unclear whether this is due to cultural differences among respondents who do not speak English well enough to use the English questionnaires, a lack of mail materials in other languages, or some other reason. A field test would potentially include adding messages to the envelopes in other languages to get the envelopes opened and testing targeted bilingual mailings (following a model similar to the decennial census). The targeted mailings could follow a different mailing strategy than currently used or modify the current mailings to include more bilingual materials (letters, questionnaires, etc.).



Content Tests

In response to Federal agencies’ requests for new and revised ACS questions, the Census Bureau plans to conduct up to two Content Tests. The Census Bureau has laid out a formal process for making content changes to the ACS.

  • First, federal agencies evaluate their data needs and propose additions or changes to current questions through OMB.

  • The Census Bureau assesses the need for the proposed new content. Proposed new content is classified as mandatory (required by law to come from the ACS), required (required by law or a federal court, though not specifically from the American Community Survey), or programmatic (the content has no explicit mandate or requirement but is sought for program planning, implementation, or evaluation). In order to better achieve the balance between the need for information with the need to reduce the burden on respondents to provide that information, the Census Bureau has prioritized mandatory and required content. Programmatic content—which may be useful but, by definition, is not required—faces a higher threshold to be deemed sufficiently essential to outweigh the additional burden placed on respondents by a longer survey and the associated likelihood for lower response rates and reduced data quality.

  • Next, an interagency review panel – headed by OMB and the Census Bureau – evaluates the proposals.

  • Final proposed questions result from extensive cognitive and field testing to ensure they result in the proper data, with an integrity that meets the Census Bureau’s high standards.

  • This process includes several opportunities for public comment.


The objective of the ACS Content Tests, for both new and existing questions, is to determine the impact of changing question wording, response categories, and redefinition of underlying constructs on the quality of the data collected.


The Census Bureau proposes to evaluate changes to the questions by comparing the revised questions to the current ACS questions, or for new questions, to compare the performance of question versions to each other as well as to other well-known sources of such information.


The Census Bureau is exploring conducting two content tests: one in 2019 and one in 2021. The content test in 2019 would be limited to items that do not require extensive cognitive testing (i.e., the proposed wording has been previously cognitively tested or is in use on another survey). The test in 2019 would be implemented using a small portion of the production ACS sample, allowing the Census Bureau to be more agile in reacting to changes in society and legislation. The content test in 2021 would include new items, if proposed, as well as revised content following cognitive testing, and would involve all modes of data collection with a sample separate from production ACS.


Respondent Burden Questions Test

The ACS has been a “target of criticism for…excessive burden” and “generates a small but continuous stream of complaints to members of Congress” (National Academies of Sciences, 2016). In response to these concerns, the Census Bureau has studied several possible methods of reducing burden, including, but not limited to: softening the mandatory response messaging, reducing the number of CAPI contact attempts, and removing questions from the survey. However, it is unclear if or how these changes have affected how burdened respondents feel about participation in the ACS.


There are two approaches to measuring response burden. The first, and most common, is to measure the length of the interview (i.e., how long the interview or survey takes to complete). The second approach attempts to measure the “multidimensional nature of burden,” which includes four factors: the length of the interview, the amount of effort required of the respondent, the amount of stress experienced by the respondent, and the frequency with which the respondent is interviewed (Bradburn, 1978; National Academies of Sciences, 2016). The subjective aspects of respondent burden could be measured through self-reports of burden by respondents, as is being done by the Bureau of Labor Statistics with participants in the Consumer Expenditure Survey (Fricker, Gonzalez, & Tan, 2011; Fricker, Kreisler, & Tan, 2012).


Feedback received at the National Academies of Sciences, Engineering, and Medicine Committee on National Statistics workshop suggested that the Census Bureau conduct research and methods testing toward an improved definition of burden, including measuring respondents’ perceptions of burden. Another recommendation was to establish a baseline measurement so that progress toward reducing burden over time can be measured (National Academies of Sciences, Engineering, and Medicine, 2016).


In 2017, the Census Bureau, in consultation with the Bureau of Labor Statistics, began development work on measuring respondents’ perceptions of burden. Focus groups were conducted first to explore how prior ACS respondents felt about their experience responding to the survey. This qualitative research was designed to learn which features of the ACS contribute to or affect respondents’ level of perceived burden and how much each of these features contributes to it. As a result of the focus groups, questions were developed and cognitively tested, with the first round finishing in 2017 and a second round planned in 2018. Cognitive testing is meant to refine and establish a set of questions to use in a field test to measure respondent burden. The field test will evaluate how often the questions are answered and begin to build a picture of respondent burden, in conjunction with other related metrics. The initial testing will likely include only self-response modes of data collection.


Respondent Comment/Feedback Test

The Census Bureau is interested in understanding respondents’ perceptions of participation in the survey and, therefore, has paid attention to feedback received from respondents. However, there have been concerns about the number of telephone calls the Census Bureau receives regarding the ACS. The respondent advocate answers an estimated 300 telephone calls about concerns with the ACS each year, and additional concerns are directed to other Census Bureau staff, both at headquarters and in the field, as well as to members of Congress.

In response, the Census Bureau is exploring the idea of adding an open comment/feedback field to provide respondents with an immediate opportunity to submit feedback to the ACS about their experience answering the survey. The Census Bureau is currently conducting cognitive testing on various placements and wording to accompany the comment/feedback field. As a next step, the Respondent Comment/Feedback Test will field test the wording that will accompany the comment/feedback field and assess both the volume and context of comments that would be received if implemented.

Administrative Data Use Test

The Census Bureau has made significant progress exploring the use of administrative data in surveys and censuses.2 Administrative data refer to data collected by government agencies for the purposes of administering programs or providing services.


The Census Bureau plans to evaluate the quality, coverage, and feasibility of using administrative records in lieu of interviews at institutional GQs, which includes nursing homes and correctional facilities (U.S. Census Bureau, 2017). The first step of this process was to identify available administrative data for institutional GQs (Flanagan-Doyle, forthcoming).


For housing units, the Census Bureau evaluated the availability and suitability of several different data sources for use in the ACS, researched these data sources, and released several reports that summarized findings about their suitability (Ruggles, 2015). Those reports focused on telephone service (Moore, 2015a), year (a residence is) built (Moore, 2015b), condominium status (Flanagan-Doyle, 2015), property taxes (Seeskin, 2016), income (O'Hara, Bee, & Mitchell, 2016), residence one year ago (O'Hara, Field, Koerber, & Flanagan-Doyle, 2016), self-employment income (Majercik, 2016), and the sale of agricultural products (Majercik, 2017). The Census Bureau has continued to explore other topics for which administrative records could substitute for questions asked of respondents. Some of those topics include property values, acreage, number of rooms and bedrooms, and fuel type used to heat a home.


In addition to replacing questions on the survey, administrative data may be used to reduce burden of existing questions by allowing for modification of the questions. For example, the ACS currently asks respondents to provide their total income for the past 12 months as well as income received from various sources (for example, wages, interest, retirement income). The Census Bureau recently conducted cognitive testing on modifications to the income questions, including changing the reference year from the past 12 months to the previous calendar year, as well as only asking respondents if income was received from various sources rather than asking the exact amount for each source (Steiger, Robins, & Stapleton, 2017).


As a continuation of this research, the Census Bureau proposes a possible field test of revised content, for housing items as well as other topics, both for the housing unit questionnaire as well as the GQ questionnaire. Some questions may be modified while others would be removed.


Group Quarters Test

There are two categories of GQs: institutional and non-institutional. Institutional GQs include places such as correctional facilities and nursing homes. Non-institutional GQs include college housing, military barracks, and residential treatment centers. Most interviews conducted in GQs are interviewer-administered (over 90 percent of interviews in institutional GQs and just under 75 percent in non-institutional GQs), but some GQ respondents self-respond using a paper questionnaire. The Census Scientific Advisory Committee Working Group on Group Quarters in the ACS recommended that the Census Bureau consider making an “internet version of the ACS available to non-institutional GQ residents, especially in college dorms, military barracks, and group homes” (National Academies of Sciences, 2016). Additional support for this was identified in a workshop held in 2016 with the National Academies of Sciences Committee on National Statistics. The Census Bureau proposes a field test of an internet self-response GQ form for residents in non-institutional GQs.


In this test, a sample of GQ respondents will be given the option of completing the survey via self-response using an internet instrument. We would evaluate the quality of the data received from the internet compared to traditional data collection methods for GQs (paper questionnaires and interviewer-administered) as well as assess operational issues with offering the internet option, including feedback from interviewers.


Question 2. Needs and Uses


The primary necessity for continued full implementation of the ACS is to provide comparable data at small geographies, including metropolitan and micropolitan areas, as well as the census tract and block group level. The 2014 ACS Content Review collected information about how ACS estimates are being used to meet current federal data needs; the following are examples of these uses:



Federal agencies frequently use ACS data as an input for a funding allocation formula. The Department of Housing and Urban Development (HUD) uses state, county, and metropolitan area level ACS median income estimates to allocate Section 8 Housing funds and to set Fair Market Rents for metropolitan areas.3 Both these calculations use a yearly update factor based on ACS data and earlier data (currently from the Census 2000 Long Form, though HUD is in the process of phasing this out).4



Federal agencies also fund state and local programs through block grants that are administered and evaluated at the state and local level. The data collected via the ACS are useful not only to federal agencies in determining program requirements but also to state, local, and tribal governments in planning, administering, and evaluating programs. For example, within the Department of Health and Human Services (HHS), the Community Services Block Grant program uses ACS data at the county level to determine the allocation of funds from states to eligible entities, to determine guidelines used for participant eligibility, and to assess the need for assistance among low-income households, including the elderly.5 Additionally, the USDA’s Food and Nutrition Service (FNS) provides states and school districts data based on ACS poverty estimates in order to evaluate their Supplemental Nutrition Assistance Program operations.6

Federal agencies find value in using ACS estimates to understand characteristics of population groups in order to make program decisions. The Federal Communications Commission uses computer and internet use estimates to assist in evaluating the extent of access to, and adoption of, broadband.7 Additionally, HHS uses disability, health insurance, and other estimates to measure, report, and evaluate health disparities and improvements in health equity.8


Some federal agencies use ACS data to estimate future needs; the ACS provides more timely data for use in estimation models that provide estimates of various concepts for small geographic areas. The Department of Transportation’s Federal Highway Administration (FHWA) uses ACS Journey to Work estimates (including means of transportation, time a worker leaves the house to go to work, travel time, and work location) to create traffic flow models.9 These flow patterns are used by both the FHWA and state transportation agencies to plan and fund new road and other travel infrastructure projects. Additionally, the Department of Energy uses ACS estimates to project residential energy demand over the next 30 years, as detailed in the Energy Information Administration’s (EIA) Annual Energy Outlook (AEO), the premier source for assessing the energy needs of the U.S. economy in a domestic and international context.


Information quality is an integral part of the pre-dissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau's Information Quality Guidelines10). Information quality is also integral to the information collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act of 1995.


The Methods Panel tests proposed in this package allow the Census Bureau to continue to examine operational issues, research data quality, collect cost information, and make recommendations for improvements to this annual data collection. The tests are designed to examine changes to data collection for the ACS and are used to make decisions about future data collection methods. The Methods Panel tests are needed to increase survey efficiencies, reduce survey cost, improve the respondent experience, increase response rates, and improve data quality.


Question 3. Use of Information Technology


We use internet and computer-assisted interview (CAI) technologies for collecting data from housing units for the ACS. These technologies allow respondents and interviewers to skip questions that may be inappropriate for a person or household, which, in turn, keeps respondent burden to a minimum. We use CAI technologies for collecting information from GQ facilities to accurately classify the GQs by type and to generate a sample of residents at the GQs. CAI is also used to conduct personal interviews with GQ residents. We use CAI technologies for both the HU and GQ Reinterview operations. Additionally, by continuing to offer an internet response option in the ACS, the Census Bureau is taking further steps to comply with the e-gov initiative. Based on early implementation of an internet response option, this method also slightly improves self-response rates and creates cost savings by reducing printing and data capture costs and workloads for more costly follow-up operations.


The proposed tests will continue to utilize these technologies. In some tests, we hope to expand the use of the internet by encouraging more response via the internet as well as improving the use of the internet for response among Spanish-speaking respondents.


Question 4. Efforts to Identify Duplication


The ACS is the instrument used to collect long-form data that have traditionally been collected only during the decennial census. The content of the ACS reflects topics that are required directly or indirectly by the Congress and that the Census Bureau determines are not duplicative of another agency’s data collection. A number of questions in the ACS appear in other demographic surveys, but the comprehensive set of questions, coupled with the tabulation and dissemination of data for small geographic areas, does not duplicate any other single information collection. Moreover, many smaller Federal and non-Federal studies use a small subset of the same measures in order to benchmark those results to the ACS, which is often the most authoritative source for local area demographic data.


In addition, the OMB Interagency Committee for the ACS, co-chaired by OMB and the Census Bureau, includes more than 30 participating agencies and meets periodically to examine and review ACS content. This committee provides an extra safeguard to ensure that other agencies are aware of the ACS content and do not duplicate its collection and content with other surveys.


The ACS Methods Panel program is the only testing vehicle for the ACS. There is no other program designed to improve the ACS. Testing for the ACS builds on other research conducted by other surveys and other statistical agencies. Specifically, lessons learned from ongoing decennial cognitive and field testing of mail materials and multilingual approaches inform the design of ACS testing. Proposals for content changes in the ACS also frequently build on research conducted for other federal surveys. Additionally, the research on respondent burden questions has included collaboration and consultation with staff at the Bureau of Labor Statistics.


Question 5. Minimizing Burden


Previous research and data from survey administrators indicate that the ACS HU questionnaire takes an estimated 39 minutes to complete; CAPI data collection takes an estimated 27 minutes; and response via internet takes an estimated 39 minutes. The GQ facility questionnaire takes an estimated 15 minutes to complete, and the ACS GQ person questionnaire takes an estimated 25 minutes to complete.


Several of the tests outlined here are proposed to reduce respondent burden through the use of administrative records, by encouraging self-response (which is seen as less burdensome than in-person data collection efforts), or by modifying the questions via content tests that may clarify confusion and therefore reduce the time it takes to answer the survey.


Other tests may minimally increase the time it takes for a respondent to complete the survey by adding an optional comment field or adding optional questions to the survey to assess the respondents’ survey experience. One test also proposes an additional contact, via either an additional mail contact or an automated telephone message. This additional contact will be weighed against the potential to avoid personal visit interviews for some respondents as well as any insight that can be gained about respondents’ perceived burden related to our contact methods.


Also, while we are requesting to conduct these studies under the assumption that we will sample HUs or GQ facilities that are not already part of the current ACS sample, the ACS will, to the degree possible, use production sample to conduct the tests.


Question 6. Consequences of Less Frequent Collection


The Methods Panel tests represent one-time, special tests with a defined period for data collection. Less frequent testing would not allow the ACS program to be responsive to the changing needs of our stakeholders and the public.



Question 7. Special Circumstances


The Census Bureau will collect these data in a manner consistent with the OMB guidelines.


Question 8. Consultations outside the Agency


In August 2012, the OMB in conjunction with the Census Bureau established a Subcommittee of the Interagency Council on Statistical Policy (ICSP) on the ACS. The ICSP Subcommittee on the ACS exists to advise the Chief Statistician at OMB and the Director of the Census Bureau on how the ACS can best fulfill its role in the portfolio of Federal household surveys and provide the most useful information with the least amount of burden. It may also advise Census Bureau technical staff on issues they request the subcommittee to examine or that otherwise arise in discussions.


The ICSP Subcommittee on the ACS was involved in the 2016 Content Test by reviewing proposals and approving question versions to move forward for field testing, and it reviewed the proposed 2019 ACS content changes and recommended their approval to OMB and the Census Bureau. It is expected that the ICSP Subcommittee on the ACS will continue to be involved in this manner for future content tests.


Additionally, federal agencies will continue to be consulted and included as part of interdisciplinary teams as has been done for the 2016 Content Test as well as ongoing cognitive testing studies the Census Bureau is currently conducting.


The proposed tests reflect consultation with outside organizations, including members of the National Academy of Sciences and the Bureau of Labor Statistics. In addition, staff regularly review survey methodology literature and attend conferences that present state-of-the-art methods and procedures.


The Census Bureau published a notice, Federal Register Document 2017-24943, on November 17, 2017 (82 FR 54317-54320), inviting the public and other federal agencies to comment on our plans to submit this request. We received six comments in response to that notice from the following:


  • National State Data Center Steering Committee

  • The Williams Institute, UCLA School of Law

  • The Leadership Conference on Civil and Human Rights and other undersigned organizations

  • National LGBTQ Task Force, Asian Americans Advancing Justice, and the National Association of Latino Elected and Appointed Officials Educational Fund

  • EveryoneOn

  • Economic Policy Institute

The comments received provided general support for the research and testing proposed by the Census Bureau’s Methods Panel program and in some cases provided specific feedback on the proposed tests as well as proposals for additional testing. The National State Data Center Steering Committee expressed its “full support [for the] Census Bureau’s continuous improvement of ACS research design and resulting data products….[and] the proposed messaging test and content tests…” The Williams Institute said, “We applaud the Census Bureau for its continued research, testing, and evaluations aimed at improving the ACS.” The National LGBTQ Task Force, Asian Americans Advancing Justice, and the National Association of Latino Elected and Appointed Officials Educational Fund indicated that they “strongly support the Census Bureau’s continuous efforts to improve the ACS.”


Self-Response Mail Messaging Tests

For the self-response mail messaging tests commenters noted and recommended the following:


“In light of [the] impending challenges to securing adequate levels of self-response” that fears about security and confidentiality pose, “the Bureau must test messaging that aims to convince respondents that their personal information will not be stolen and cannot be disclosed for any non-statistical purpose…”

“The most important findings from message testing to improve self-response will be those regarding the hardest-to-count communities that have been and are prospectively the most likely not to self-respond.”


“To ensure that its testing produces reliable and detailed information about its ultimate target audience of undercounted non-self-responders, the Bureau must oversample in underrepresented communities.”


“Oversampling is the best method for securing the necessary volume of information to produce detailed disaggregated data on respondents of greatest interest as the Bureau confronts the challenge of increased non-response and false response.”


“Our organizations urge the Bureau to devote its highest degree of attention and effort to including an oversampling of MENA and non-U.S. citizen respondents in message testing through the ACS.”


The Census Bureau will consider messaging that addresses the most common concerns raised by respondents about the survey, including security and confidentiality of their response. We will also consider hard-to-count populations in the design of the materials as well as sampling procedures. Analysis of sub-groups will also be considered for the tests.


Multilingual Tests

For the multilingual tests commenters noted and recommended the following:


“…devote resources to communicating in-language with as much as possible of the resident population not yet fully fluent in English.”


“To obtain useful, robust information about messages that work and fail with as many members as possible of the communities that are chronically undercounted in Census surveys, it is imperative that messaging research be conducted in the maximum possible number and range of languages. Linguistic accessibility is particularly important to the success of self-response-related message testing because of the confluence between rapidly-growing segments of the populations, their elevated levels of limited English proficiency, and heightened likelihood of refusing engagements with government agents.”

“…electronically publish not only explanations but also survey instruments themselves in as many as possible of the languages spoken by residents who are not yet fully fluent in English. The Bureau should also supplement electronic in-language questionnaires and related materials with interviews, focus groups, or other research methodologies that offer the opportunity to survey speakers of less-common languages, particularly in areas where households are disproportionally likely to lack high-speed internet access.”


“We encourage the Bureau to use ACS tests if and as possible to refine aspects of its plan for critical 2020 language assistance efforts.”


The Census Bureau is committed to meeting the language needs of our respondents. We will evaluate current mail materials to identify ways to improve our communications with non-English speaking respondents and make it easier for them to respond to the survey.


Content Tests

For the content tests commenters noted and recommended the following:


Four of the responses urged the Census Bureau to conduct additional research and testing to accomplish the goal of including sexual orientation and gender identity measures on the ACS. Additionally, EveryoneOn proposed additional content related to computer and internet use that would obtain detailed and specific data on reasons behind non-adoption, measure digital literacy, and identify price points that would catalyze the closure of the digital divide. The Economic Policy Institute asked the Census Bureau to consider adding questions on retirement plan participation, hourly wages, contingent work, union membership, employer size, and prior incarceration or felony convictions.


The Census Bureau reviews requests for new content through a formal process.

  • First, federal agencies evaluate their data needs and propose additions or changes to current questions through OMB.

  • The Census Bureau assesses the need for the proposed new content. Proposed new content is classified as mandatory (required by law to come from the ACS), required (required by law or a federal court, though not specifically from the American Community Survey), or programmatic (the content has no explicit mandate or requirement but is sought for program planning, implementation, or evaluation). In order to better achieve the balance between the need for information with the need to reduce the burden on respondents to provide that information, the Census Bureau has prioritized mandatory and required content. Programmatic content—which may be useful but by definition is not required—faces a higher threshold to be deemed sufficiently essential to outweigh the additional burden placed on respondents by a longer survey and the associated likelihood for lower response rates and reduced data quality.

  • Next, an interagency review panel – headed by OMB and the Census Bureau – evaluates the proposal.

  • Final proposed questions result from extensive cognitive and field testing to ensure they result in the proper data, with an integrity that meets the Census Bureau’s high standards.

  • This process includes several opportunities for public comment.


The Census Bureau received four federal agency requests to add sexual orientation and gender identity to the American Community Survey for various needs. Three requests were determined to be programmatic, as there was no statutory or other legal requirement for those agencies to obtain the information. The fourth agency withdrew its request. New requests will be considered as outlined in the formal process above. No federal agency has recently proposed adding the other topics identified by the FRN comments.


The Economic Policy Institute also proposed research aimed at ways to improve the quality of data on retirement income and coverage. They specifically recommended that the Census Bureau address underreporting of retirement income by “rephrasing, reordering, or streamlining questions” and address the shift from traditional pensions to employer-sponsored savings plans and IRAs. They also raised concerns about how lump sum distributions are handled in some measures of retirement income and recommended that the Census Bureau “eliminate arbitrary distinctions between lump sum and regular distributions.” Additionally, they recommended that the Census Bureau “ask survey respondents about retirement plan contributions in order to allow researchers, if they wish, to calculate earnings net of these contributions and avoid counting the same income at two stages of the lifecycle. Tracking employee contributions (and, ideally, employer contributions) would also be useful for assessing the use of tax-qualified plans.”


The Census Bureau recently completed the 2016 Content Test with the goals of improving income reporting, reducing item missing data rates, reducing reporting errors, and updating questions on retirement income and the income generated from retirement accounts and all other assets in order to better measure retirement income. An updated question was successfully tested and will be implemented in the 2019 production ACS. The question was expanded to ask about “retirement income, pensions, survivor or disability income” and includes an instruction noting that income from “a previous employer or union, or any regular withdrawals or distributions from IRA, Roth IRA, 401(k), 403(b) or other accounts specifically designed for retirement” should be included (Posey & Roberts, 2017). This change should address the concerns raised by the Economic Policy Institute. Other changes to the retirement income question would need to be considered as part of a new content test and would be subject to the formal process described above.


Respondent Burden Questions Test

Comments received on the Respondent Burden Questions Test included the following:


“…solutions that focus on building respondent support for the ACS are the most effective way to reduce response burden and should be the focus of the Bureau’s efforts.”


We “support the Bureau’s efforts to better understand respondents’ perception of the burden of completing the ACS. We recommend that the Bureau field questions that ask respondents how to improve their perceptions of the ACS, rather than questions that focus on how to reduce the time spent on completing the survey.”


“Investing more resources in reworking the ‘Why We Ask’ materials and undertaking additional efforts to improve public perception of the ACS is the most effective and efficient solution to reducing respondent burden. Therefore, [we] recommend that questions tested on the ACS Methods Panel Tests should reflect an intention to address respondent burden through improved public perception.”


The Census Bureau will work to better communicate the purpose and importance of the ACS through its mail contacts, engaging respondents and conveying how their responses provide comparable data for all of America’s communities.


The Census Bureau is testing a series of feedback questions as well as an open-ended comment box to provide respondents the opportunity to tell us about their experience and offer insights as to how it could be improved. The feedback questions do not focus on how to reduce the time spent on completing the survey.


The Census Bureau has tested the “Why We Ask” insert in two mail message tests, one where it was included in the Paper Questionnaire Package and one where it was included in the Initial Mailing. Neither test supported including the insert. There were no significant differences in self-response return rates when included in the paper questionnaire package, but there were additional costs to the ACS program (Heimel, Barth, & Rabe, 2016). Including it in the initial mailing reduced self-response return rates (Longsine, Oliver, Castro, & Rabe, forthcoming). Given the results of these two tests, the Census Bureau is not planning to undertake additional testing of the “Why We Ask” insert in the mail materials. It will continue to be available to FRs to use when interacting with respondents.


Respondent Comment/Feedback Test

There was no specific feedback received on this proposed test.


Administrative Data Use Test

Some respondents to the FRN expressed concern about the use of administrative data, while others supported it. The comments expressing concern included the following:


“…we are concerned that enumeration that solely relies on administrative records may significantly reduce the quality and quantity of information included in Census data releases about the nation’s hardest-to-reach populations.”


“…the use of administrative data would be a particularly problematic solution.”


“Administrative records are unlikely to adequately capture the rich demographic data that result from self- and proxy reports to ACS surveys. For example, recent testing by the Bureau found that administrative records frequently lack accurate information about the race or ethnicity of a respondent.”


“We urge the Bureau to employ a critical eye as it conducts this testing, and to decline to adopt widely any use of administrative data that tends to reduce the reliability of resulting data.”


Commenters also raised concerns about using administrative data for institutional GQs.


Incorporating administrative records into Census Bureau data gathering and analysis efforts will have a palpable impact on respondents by reducing the amount of information we request from them. Administrative records may also increase data reliability and provide cost savings by reducing the need for follow-up visits. While there is great potential for administrative records in the future of data collection and processing, there are also great challenges to using these data (e.g., issues with matching rates, geographic coverage, and leveraging data designed for different uses). The Census Bureau will continue to pay careful attention to ensure that any proposed implementation of these data in ACS production meets the Census Bureau’s high quality standards.


Comments received in support of exploring the use of administrative records included the following:


The Economic Policy Institute recommended exploring the potential to supplement survey income data with tax and Social Security records.


The National State Data Center Steering Committee “agrees with the proposition that administrative data can productively substitute for, or provide validation for, data that is currently collected through ACS household surveys and group quarters surveys. [They] also agree that some administrative data resources will be as reliable as (or possibly more reliable than) data collected through surveys…The Bureau may want to prioritize administrative data substitution specifically for topics where the potential for respondent error is highest, and where the substituted administrative records are consistently complete.” They also recommended “comparing administrative records-derived data with comparable data from household surveys” as well as “focus inquiry on the workflow necessary to standardize data from a diverse sample of data-providing agencies.” They mention that it will take the Bureau “time to identify data steward agencies, request and obtain data from these, and extract, transform, load, and further process the administrative records data.”


Much of the research suggested in these comments is already underway at the Census Bureau, including assessing the level of effort it would take to incorporate administrative data into the current ACS workflow and systems, as well as simulation studies examining the impact on data products. Research into the use of administrative records data for the income questions is also ongoing. The testing proposed in Question 1 of this document would further the study of the use of administrative records.


Group Quarters Test

Two comments received on the Group Quarters Test supported testing the option of using the internet for data collection in non-institutional GQs.


Question 9. Paying Respondents


We do not pay respondents or provide them with gifts.


Question 10. Assurance of Confidentiality


The Census Bureau collects data for this survey under Title 13, United States Code, Sections 141, 193, and 221. All data are afforded confidential treatment under Section 9 of that Title.


In accordance with Title 13, each household, GQ administrator, and each person within a GQ participating in the ACS is assured of the confidentiality of their answers. A brochure is sent to sample housing units with the initial mail package and contains this assurance. Housing units responding using the internet questionnaire are presented with additional assurances of their confidentiality and security of their online responses. The brochure mailed to sample GQs with the GQ introductory letter contains assurances of confidentiality. It is also provided to sample GQ residents at the time of interview.


Household members, GQ administrators, or GQ residents may ask for additional information at the time of interview. A Question and Answer Guide and a Confidentiality Notice are provided to respondents, as appropriate. These materials explain Census Bureau confidentiality regulations and standards.


At the beginning of follow-up interviews, the interviewer explains the confidentiality of data collected and that participation is required by law. For all CAPI interviews, the interviewer gives the household respondent, GQ administrator, or GQ resident a copy of a letter from the Census Bureau Director explaining the confidentiality of all information provided.


Question 11. Justification for Sensitive Questions


Some of the data we collect, such as race and sources of income and assets, may be considered to be of a sensitive nature. The Census Bureau takes the position that collecting these types of data is necessary for the analysis of important policy and program issues and has structured the questions to lessen their sensitivity. We have provided guidance to the CAPI interviewers on how to ask these types of questions during the interview. The Census Bureau has materials that demonstrate how we use the data from sensitive questions and how we keep those data confidential. Respondents who use the internet to complete the survey have access to links on the survey screens that provide information to help address their questions or concerns with sensitive topics.


Question 12. Estimate of Hour Burden


The Mail Materials Test will use production ACS sample; therefore, no additional burden beyond what is already covered by the production ACS is requested. The 2019 Content Test will also use production ACS sample, with an additional followup needed. Burden hours for the additional followup are reflected in Table 1.


Burden hours expected to be needed for the other methods panel tests are also included in Table 1. These burden hours are in addition to the hours already covered by the production ACS package. However, when possible, we will use production ACS sample to conduct the tests to reduce burden on the public. The burden hours are provided along with the estimated number of respondents and time per response for each of the proposed tests. We are currently estimating that, on average, we will conduct up to four self-response mail messaging tests per year, so estimates for that proposed project are broken out by year in the table below. In the event of a change to the scope of a test or the inclusion of another test, the burden hours will be noted when submitting the FRN or non-substantive change request for approval.


Table 1. Total Burden Hours for the Proposed Tests

Test | Estimated number of respondents | Estimated time per response (in minutes) | Total burden hours
2019 Self-Response Mail Messaging Tests | Tests A–D: 60,000 each | Tests A–D: 40 each | Tests A–D: 40,000 each
2020 Self-Response Mail Messaging Tests | Tests A–D: 60,000 each | Tests A–D: 40 each | Tests A–D: 40,000 each
2021 Self-Response Mail Messaging Tests | Tests A–D: 60,000 each | Tests A–D: 40 each | Tests A–D: 40,000 each
Respondent Burden Field Test | 100,000 | 45 (40 minutes for the production ACS interview and 5 minutes for the optional followup burden questions) | 75,000
Respondent Comment/Feedback Test | 100,000 | 41 (one additional minute for the optional comment field) | 68,334
Testing the Use of Administrative Data in HUs and GQs | 100,000 | 40 | 66,667
Group Quarters Test | 500 | 40 (includes the facility interview) | 334
2019 Content Test# | Follow-up reinterview: 41,500 | Follow-up reinterview: 20 | Follow-up reinterview: 13,834
2021 Content Test | Initial interview: 100,000; Follow-up reinterview: 83,000 | Initial interview: 40; Follow-up reinterview: 20 | Initial interview: 66,667; Follow-up reinterview: 27,667
Total (over 3 years) | 1,366,500 | | 798,503

# The 2019 Content Test will be conducted using production sample for the initial interview. An additional follow-up reinterview that is not normally conducted in the production ACS will be conducted; the additional burden hours for the reinterview are reflected here.


The annual estimates of the number of respondents and the reporting burden are calculated by dividing the totals above by three.


The estimated total annual number of respondents is 455,500; the estimated total annual burden is 266,168 hours.
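To illustrate how the figures in Table 1 combine, the short sketch below recomputes the total and annual burden hours. It assumes that each row’s burden hours equal the estimated number of respondents multiplied by the estimated minutes per response, divided by 60 and rounded up to the next whole hour; that assumption reproduces the rounded values shown in Table 1. The script and its row labels are illustrative only and are not part of this collection.

    import math

    # (respondents, minutes per response) for each Table 1 row; the twelve
    # self-response mail messaging tests are 4 tests x 3 years at 60,000 each.
    rows = (
        [(60_000, 40)] * 12              # 2019-2021 Mail Messaging Tests A-D
        + [(100_000, 45)]                # Respondent Burden Field Test
        + [(100_000, 41)]                # Respondent Comment/Feedback Test
        + [(100_000, 40)]                # Administrative Data in HUs and GQs
        + [(500, 40)]                    # Group Quarters Test
        + [(41_500, 20)]                 # 2019 Content Test reinterview
        + [(100_000, 40), (83_000, 20)]  # 2021 Content Test initial + reinterview
    )

    # Burden hours per row: respondents x minutes / 60, rounded up.
    total_hours = sum(math.ceil(n * minutes / 60) for n, minutes in rows)
    print(f"{total_hours:,}")                 # 798,503 hours over three years
    print(f"{math.ceil(total_hours / 3):,}")  # 266,168 hours per year

Computed this way, the three-year total matches the 798,503 burden hours in Table 1, and dividing by three reproduces the annual estimate of 266,168 hours.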


Question 13. Estimate of Cost Burden


There are no costs to the respondent other than his/her time to respond to the survey.


Question 14. Cost to Federal Government


The estimated cost of the ACS Methods Panel program in FY 2018 is approximately $7.7 million. The Census Bureau will pay all costs of the Methods Panel tests.


Question 15. Reason for Change in Burden


This collection is being submitted as a revision because it is an ongoing activity. The burden hours are increasing because the Methods Panel program is implementing different tests to improve the ACS. The experimental designs and objectives for these tests require additional sample to measure the impact of the changes.


Question 16. Project Schedule


The preliminary schedule for the Mail Materials Test is in Table 2. Schedules for all other testing will be determined when the details of those tests are developed and will be included in future submissions along with the other details for each test.


Table 2. Mail Materials Test Preliminary Schedule

Activity | Time Frame
Determine experimental design and develop mail material wording for testing | January – February 2018
Develop testing procedures and analysis plan | March – August 2018
Field test data collection | August – December 2018
Analyze results and document findings | December 2018 – August 2019



Question 17. Request to not Display Expiration Date


The Methods Panel tests will display the OMB expiration date on materials in line with the methods used for the production ACS; however, we request approval not to display the expiration date on the paper questionnaire. The ACS is an ongoing and continuous survey that is mandatory. If an expiration date appeared on the questionnaire, respondents might infer that the survey ends on that date, which is not the case.


Question 18. Exception to the Certification


There are no exceptions to the Certification for Paperwork Reduction Act submission.




References

Barth, D., Zelenak, M., Asiala, M. E., Castro, E., & Roberts, A. (2016). 2015 Envelope Mandatory Messaging Test. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2016/acs/2016_Barth_01.html

Bouffard, J. A., Brady, S. E., & Stapleton, C. N. (2004). 2003 National Census Test: Contact Strategy Analysis. Washington, D.C.: U.S. Census Bureau.


Bradburn, N. (1978). Respondent Burden. Proceedings of the Survey Research Methods Section of the American Statistical Association, (pp. 35-40).

Feygina, I., Foster, L., & Hopkins, D. (2015). Re: Potential Pilot Interventions to Increase Response Rates to Census Surveys. Email received by Ted Johnson, September 10, 2015.

Flanagan-Doyle, D. (2015). Potential Data Sources to Replace or Enhance the Question on Condominium Status on the American Community Survey. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2015/acs/2015_Flanagan_Doyle_01.html

Flanagan-Doyle, D. (forthcoming). Use of Administrative Records for Institutional Group Quarters. Washington, D.C.: U.S. Census Bureau.

Fricker, S., Gonzalez, J., & Tan, L. (2011). Are You Burdened? Let's Find Out. Paper presented at the Annual Conference of the American Association for Public Opinion Research, Phoenix, AZ. Retrieved February 7, 2018, from http://www.aapor.org/AAPOR_Main/media/AnnualMeetingProceedings/2011/05-14-11_3B_Fricker.pdf

Fricker, S., Kreisler, C., & Tan, L. (2012). An Exploration of the Application of the PLS Path Modeling Approach to Creating a Summary Index of Respondent Burden. Paper presented at the Joint Statistical Meeting, San Diego, CA. Retrieved February 7, 2018, from https://www.bls.gov/osmr/pdf/st120050.pdf

Heimel, S., Barth, D., & Rabe, M. (2016). 'Why We Ask' Mail Package Insert Test. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2016/acs/2016_Heimel_01.html

Longsine, L., Oliver, B., Castro, E., & Rabe, M. (forthcoming). 2017 Mail Design Test. Washington, D.C.: U.S. Census Bureau.

Majercik, J. (2016). Preliminary Research for Replacing or Supplementing Self-employment Income on the American Community Survey with Administrative Records. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2016/acs/2016_Majercik_01.html

Majercik, J. (2017). Preliminary Research for Replacing or Supplementing the Sale of Agricultural Products Question on the American Community Survey with Administrative Records. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2017/acs/2017_Majercik_01.html

Martin, E. A. (2007). Final Report of an Experiment: Effects of a Revised Instruction, Deadline, and Final Question Series in the Decennial Mail Short Form. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/srd/papers/pdf/rsm2007-25.pdf

Moore, B. (2015a). Preliminary Research for Replacing or Supplementing the Phone Service Question on the American Community Survey with Administrative Records. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2015/acs/2015_Moore_01.html

Moore, B. (2015b). Preliminary Research for Replacing or Supplementing the Year Built Question on the American Community Survey with Administrative Records. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2015/acs/2015_Moore_02.html

National Academies of Sciences, Engineering, and Medicine. (2016). Reducing Response Burden in the American Community Survey: Proceedings of a Workshop. Washington, D.C.: The National Academies Press. doi:10.17226/23639

O'Hara, A., Bee, A., & Mitchell, J. (2016). Preliminary Research for Replacing or Supplementing the Income Question on the American Community Survey with Administrative Records. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2016/acs/2016_Ohara_01.html

O'Hara, A., Field, A., Koerber, W., & Flanagan-Doyle, D. (2016). Preliminary Research for Replacing or Supplementing the Residence One Year Ago Question on the American Community Survey with Administrative Records. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2016/acs/2016_Ohara_02.html

Oliver, B., Heimel, S., & Schreiner, J. (2017). Strategic Framework for Messaging in the ACS Mail Materials. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2017/acs/2017_Oliver_01.html

Oliver, B., Risley, M., & Roberts, A. (2016). 2015 Summer Mandatory Messaging Test. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2016/acs/2016_Oliver_01.html

Posey, K., & Roberts, A. (2017). 2016 American Community Survey Content Test Evaluation Report: Retirement, Survivor, and Disability Income. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2017/acs/2017_Posey_01.html

Reingold, Penn Schoen Berland, & Decision Partners. (2014). American Community Survey Messaging and Mail Package Assessment Research: Cumulative Findings. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2014/acs/2014_Walker_02.html

Risley, M., Barth, D., Cheza, K., & Rabe, M. (forthcoming). 2017 Pressure Seal Mailing Materials Test. Washington, D.C.: U.S. Census Bureau.

Ruggles, P. (2015). Review of Administrative Data Sources Relevant to the American Community Survey. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2015/acs/2015_Ruggles_01.html

Seeskin, Z. H. (2016). Evaluating the Use of Commercial Data to Improve Survey Estimates of Property Taxes. Washington, D.C.: U.S. Census Bureau. Retrieved February 7, 2018, from https://www.census.gov/content/dam/Census/library/working-papers/2016/adrm/carra-wp-2016-06.pdf

Steiger, D., Robins, C., & Stapleton, M. (2017). Cognitive Testing of the American Community Survey Respondent Burden: Weeks Worked and Income. Washington, D.C.: U.S. Census Bureau and Westat. Retrieved February 1, 2018, from https://www.census.gov/library/working-papers/2017/acs/2017_Westat_01.html

Stokes, S., Reiser, C., Bentley, M., Hill, J., & Meier, A. (2011). 2010 Census Deadline Messaging and Compressed Mailing Schedule Experiment. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/2010census/pdf/2010_Census_DM_CS.pdf

U.S. Census Bureau. (2014). American Community Survey Design and Methodology. Washington, D.C.: U.S. Census Bureau. Retrieved February 1, 2018, from https://www.census.gov/programs-surveys/acs/methodology/design-and-methodology.html

U.S. Census Bureau. (2017). Agility in Action 2.0: A Snapshot of Enhancements to the American Community Survey. Washington, D.C.: U.S. Census Bureau. Retrieved August 8, 2017, from https://www.census.gov/programs-surveys/acs/operations-and-administration/agility-in-action/agility-in-action-2.html


1 The Puerto Rico Community Survey (PRCS) provides all mail materials in Spanish.

2 Administrative records refer to data collected by government agencies for the purposes of administering programs or providing services.

3 See 42 U.S.C. 1437b and 1437f. HUD’s funding formulas are available at: http://www.huduser.org/portal/datasets/fmr/fmrover_071707R2.doc and http://www.huduser.org/portal/datasets/il/il10/IncomeLimitsBriefingMaterial_FY10.pdf. The results of these formulas are announced yearly in the Federal Register.

4 See United States Housing Act of 1937, Public Law 93-383, as amended, and 42 U.S.C. § 1437f(c)(1); 24 CFR 888.113, 24 CFR 982.401.

5 See Community Services Block Grant Act, Pub. L. No. 105-285, Sections 673 (2), 674, and 681A, and 42 U.S.C. § 9902 (2), 9903, and 9908 (b)(1)(A), (b)(11) & (c)(1)(A)(i).

6 See 7 U.S.C. 2025(d)(2) and 7 CFR 275.24(b)(3). The FNS calculates a Program Access Index that allows them to provide additional award funds to states that have the highest levels of SNAP access, or show the greatest annual improvement in SNAP access. For the PAI formula, see: http://www.fns.usda.gov/ora/menu/Published/snap/FILES/Other/pai2008.pdf and 7 CFR 275.24.

7 See Broadband Data Improvement Act of 2008, Pub. L. No. 110-385; 47 U.S.C. § 1303(d).

8 See Patient Protection and Affordable Care Act, Pub. L. No. 111-148, §10334 and 42 U.S.C. 300kk.

9 See 23 U.S.C. 134 and 23 U.S.C. 135. See also 23 U.S.C. 303 and 23 CFR 450.316-322. See also P.L. 109-59.

10 Refer to the Census Bureau's Information Quality Guidelines at this link: https://www.census.gov/about/policies/quality/guidelines.html
