The U.S. Federal Statistical System (FSS) comprises numerous federal agencies that collect and produce data about the country’s people, businesses, and systems. The FSS must maintain public trust in order to maximize participation in its ongoing surveys; lapses in trust could increase operational costs, reduce data quality, and hinder research and policy efforts as a result of reduced respondent cooperation and declining response rates. To improve our understanding of public trust in the FSS, the Census Bureau will collect data on public attitudes, beliefs, and concerns regarding federal statistics and the agencies that collect them. These data will help the Census Bureau and the FSS develop strategies for communicating with the public about the importance and purposes of federal statistical data, and plan data collection efforts that reflect an understanding of public perceptions and concerns.
I. Abstract
From December 2009 through April 2010, the Census Bureau contracted with the Gallup Organization to conduct a nightly poll of the public’s opinion toward the 2010 Census, public awareness of Census promotional efforts, and intent to mail back Census forms. This nationally representative, probability-based sample of 200 respondents per night was drawn from random digit dialing (RDD) and cell phone frames. The estimates, which were based on aggregating these data over week-long time periods, provided nearly immediate feedback on public reaction to national events that could influence response to the 2010 Census.
The Census Bureau used this feedback to make communication campaign decisions during data collection that contributed to achieving a mail-back participation rate of 74%, despite increased vacancy rates due to the economic downturn, increased public skepticism about the role of the Federal Government, and a general decline in survey response rates during the decade that crossed both public and private sector surveys.
From February 2012 through March 2014, the Gallup Organization incorporated the Federal Statistical System (FSS) public opinion questions into its ongoing Gallup Daily Tracking Survey under a contract with the U.S. Census Bureau. The mission-critical objective was to track public opinion toward statistics produced by the Federal Government. During this time, we saw a relatively stable level of trust in Federal statistics until several events became headlines in the news, including scandals involving the IRS and NSA and the Government shutdown of 2013. As these events progressed, we saw a downturn in trust in Federal statistics, which also correlated with a decrease in response rates to several Census Bureau surveys. It would be useful to collect additional data to further explore these relationships. To date, the data have been gathered nightly from small (n=200) independent cross-sectional samples of individuals participating in a general multi-topic random digit dial (RDD) telephone survey. We collected 200 cases per night, yielding 1,400 cases per week and 6,000 cases per month. The nightly sample data were aggregated over weeks or months to examine trends in attitudes toward the FSS. While the cross-sectional design offered the opportunity to examine large marginal shifts in attitudes on a daily basis, it precluded examination of small daily marginal changes in attitudes, as well as any change at the individual level. Because we did not specifically ask about potentially influential events, the design also limited our ability to relate very specific events in the news, such as the IRS and NSA stories, to shifts in opinion toward Federal statistics.
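The aggregation scheme described above can be sketched in a few lines. This is a minimal illustration with simulated data; the 1-to-5 "trust" rating is a hypothetical stand-in for the survey's actual items, not the real questionnaire.

```python
import random

# Simulate 28 nights of small (n=200) independent cross-sectional samples.
# Each simulated respondent gives a hypothetical 1-5 trust rating.
random.seed(42)
nights = [[random.randint(1, 5) for _ in range(200)] for _ in range(28)]

# Pool 7 nights at a time: 7 x 200 = 1,400 cases per week-long estimate.
weekly_means = []
for start in range(0, len(nights), 7):
    week_cases = [r for night in nights[start:start + 7] for r in night]
    weekly_means.append(sum(week_cases) / len(week_cases))

print(len(weekly_means))  # -> 4 (four weekly estimates from 28 nights)
```

Pooling trades daily granularity for smaller standard errors, which is why the design can detect large marginal shifts over weeks or months but not small day-to-day changes.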
The objective of the planned study is to conduct a nationally representative sample survey of public opinion, primarily on attitudes toward the FSS and the use of Federal statistics. The collected data will be used to track changes in attitudes toward the FSS and in data use. The data will also enable the Census Bureau to assess how news events related to the statistical system or government, and public perceptions of these events, affect usage of and attitudes toward Federal statistics. The methodology for the planned survey is very similar to that of the recently conducted FSS Public Opinion Survey (described above), but with a smaller weekly sample and additional questions that will allow us to examine possible predictors of change over time. The smaller sample size reduces the cost of data collection, making it possible to continue the survey for a longer period of time.
The legal authority under which this information is being collected is Title 13 U.S.C. Chapter 5 Sections 141 and 193.
Information quality is an integral part of the pre-dissemination review of the information disseminated by the Census Bureau (fully described in the Census Bureau's Information Quality Guidelines). Information quality is also integral to the information collections conducted by the Census Bureau and is incorporated into the clearance process required by the Paperwork Reduction Act.
These public opinion data will enable the Census Bureau to better understand public perceptions of federal statistical agencies and their products, which will provide guidance for communicating with the public and for future planning of data collection that reflects a good understanding of public perceptions and concerns. Because all federal statistical agencies are also facing these issues of declining response rates and increasing costs in a time of constrained budgets, the Census Bureau will share the results of these surveys with other federal statistical agencies, to maximize the utility of this information collection and ultimately, the quality and efficiency of federal statistics. Specifically, other federal statistical agencies have expressed interest in continuing this data collection for use in communications strategies within their own agencies.
The Census Bureau plans to add 7 questions for a sample of cases in the Gallup Daily Tracking Survey, an ongoing daily survey asking U.S. adults about various political, economic, and well-being topics. The initial 7 questions will allow us to continue the time series begun under the previous study and to add open-ended questions that will allow us to measure change in the basis of attitudes. Additional questions will allow us to investigate other issues that could be related to trust and other perceptions of the FSS. These questions are included in Attachment A.
The survey methodology for the planned collection is the same as that of the past collection. It includes sample coverage of the entire United States, including Alaska and Hawaii, and relies on a three-call design to reach respondents not contacted on the initial attempt. The survey methods for the Gallup Daily Tracking rely on live interviews, dual-frame sampling (which includes listed landline interviewing as well as cell phone sampling to reach those in cell phone-only households, cell phone-mostly households, and unlisted landline-only households), and a random selection method for choosing respondents within the household. The Census Bureau will ask questions of 850 respondents a week who participate in the Gallup Daily Tracking from March 1, 2015 through October 31, 2019 via a contract that has a base year and four option years.
Up to 20 additional pulse questions can be added to the nightly survey for a total of 100 days per year. These “pulse” questions will be used for several distinct purposes:
First, additional questions can be added to and removed from the initial set of 7 questions in a series of question “rotations.” Rotations will be planned to explore public opinion of different aspects of statistical uses of administrative records. Topics for the additional questions will include knowledge about administrative records use, public perception of the quality of such records, public perception of privacy and confidentiality implications of such use, and differentiation between types of administrative records and types of statistical uses. These rotations might include introducing or framing the questions differently, varying the types of records mentioned and the methods of use in the question, willingness-to-pay/stated-preference questions, and so on. These types of questions would add up to 5 questions to the nightly interview and would be fielded for a limited amount of time. These questions will be submitted to OMB by way of an update to this submission (specified in more detail below). Some illustrative examples are provided in Attachment B.
Second, rotating questions will be used to explore awareness of other statistics or other statistical agencies not mentioned in the core questions. For example, we may ask additional questions to explore awareness of specific types of statistics, like health statistics or agricultural statistics. These types of questions would add up to 3 questions to the nightly interview and would be fielded for a limited amount of time. These questions will also be submitted to OMB by way of an update to this submission (specified in more detail below).
Third, rotating questions will be used to explore opinions toward initiatives, like Bring Your Own Device, that the Census Bureau and other federal statistical agencies are considering adopting. These types of questions would add up to 3 questions to the nightly interview and would be fielded for a limited amount of time. These questions will also be submitted to OMB by way of an update to this submission (specified in more detail below).
Fourth, rotating questions will be used for communications, public relations, and similar message testing. Examples of such messages would be different ways of describing confidentiality or privacy protection, or different ways of encouraging response to a survey. These types of questions would add up to 5 questions to the nightly interview and would be fielded for a limited amount of time. These questions will also be submitted to OMB by way of an update to this submission (specified in more detail below).
Finally, we may wish to add rotating questions very quickly after an unanticipated event to gauge awareness of those events and opinions about the relationship (if any) between those events and the federal statistical system. These could be events like a data breach (public or private sector), a political scandal, or any other unanticipated news event that may alter public perceptions. Gallup can add questions with as little as 48 hours’ notice. Up to 3 additional questions could be fielded in the nightly interview for a limited amount of time surrounding the particular event. These questions would be submitted to OMB for quick-turnaround approval and would be very limited in scope to address the particular unanticipated event.
OMB and the Census Bureau have agreed that these rotating questions constitute nonsubstantive changes to this submission. OMB will be informed approximately monthly of the intent to make these changes through a single tracking document. This document will contain a complete history of all questions asked and the months that each question was asked. See Attachment C for an example.
Although the Gallup Daily Tracking Survey is portrayed by Gallup as being nationally representative, it does not meet Census Bureau quality standards for dissemination and is not intended for use as precise national estimates or for distribution as a Census Bureau data product. The Census Bureau will use the results from this survey to monitor awareness and attitudes, as an indicator of the impact of potential negative events, and as an indicator of potential changes in awareness activities. Although the response rate to the survey is not high enough for the results to be used for point estimation, the results are expected to provide useful information for describing general trends and for modeling opinions. Data from the research will be included in research reports with clear statements about the limitations and noting that the data were produced for strategic and tactical decision-making and exploratory research, not for official estimates. Research results may be prepared for presentation at professional meetings or for publication in professional journals to promote discussion among the larger survey and statistical community and to encourage further research and refinement. Again, all presentations or publications will provide clear descriptions of the methodology and its limitations.
Theoretical Framework
Prior to the original data collection in this iteration, the Federal Statistical System (FSS) Team focused on definitions of trust in statistical products and trust in statistical institutions that are derived from work by Ivan Fellegi (1996, 2004), a model of which is shown in Figure 1 below. In addition to considering questions developed and tested by the OECD working group, the FSS working group considered questions used by the Office for National Statistics and the National Centre for Social Research in the United Kingdom and by the Eurobarometer. Based on the U.S. cognitive laboratory results from NCHS work on the OECD survey, we started from the premise that we needed to measure awareness of statistics and statistical institutions first, and then assess level of knowledge/data use, before proceeding to questions addressing trust. We consulted additional previous research that examined the U.S. public’s knowledge of statistics (Curtin, 2007) and sought to create a questionnaire that would be comprehensible by the general population. More detail on questionnaire development is available in Childs et al. (2012).
Figure 1. Fellegi’s model of Trust in Official Statistics
Reproduced from “Report of the electronic working group on measuring trust in Official Statistics,” STD/CSTAT/BUR(2010)2, January 20, 2010, OECD 2010
The analytic goals of the FSS Public Opinion Survey (POS) are to:
Continue to monitor the relationship between awareness of and trust in the federal statistical system and federal statistics in the United States.
Further explore the relationship between trust in the statistical system and attitudes towards the statistical uses of administrative records.
Observe how current events impact public perception towards the federal statistical system.
Continue the time series of trust in the statistical system.
Inform a variety of internal management decisions, such as how to provide better information sharing with the public to address any misperceptions or concerns raised by the survey results (see Miller and Walejko, 2010 for an example of how awareness activities could be targeted).
The FSS POS will consist of survey questions added onto the Gallup Daily Tracking Survey for one year of data collection, with four additional option years.
Gallup Daily Tracking Survey Documentation
Gallup Daily Tracking is a daily survey asking 1,000 U.S. adults about various political, economic, and well-being topics. Gallup also routinely incorporates additional questions into the Gallup Daily Tracking Survey on a short-term basis. These questions cover topical issues, including election voting intentions and views of events in the news. On any given evening, approximately 250 Gallup interviewers conduct computer-assisted telephone interviewing with randomly sampled respondents 18 years of age and older, including cell phone users and Spanish-speaking respondents from all fifty states and the District of Columbia. The survey questions include many of the standard demographics, including race, income, education, employment status, and occupation. Location data, such as ZIP Codes, allow researchers to map the responses to particular parts of the country and accumulate data for local-level comparison and interpretation.
Of the 1,000 nightly Gallup Daily Tracking Survey respondents, 200 will be sampled for the 25 Census Bureau questions. These questions will focus on issues of awareness and trust in Federal statistics and in the statistical use of administrative records as described above.
All interviews will be conducted using computer-assisted telephone interviewing (CATI). Telephone interviews will be conducted with respondents on cell phones and landline telephones.
This research adds to information collected by the Census Bureau between 2010 and 2014, but does not duplicate any other research currently being done by the Census Bureau or other Federal agencies. In the private sector, some organizations periodically measure trust in government, but none measure trust specifically in the statistical sector. This research fills a void identified by the Interagency Committee on Statistical Policy (ICSP).
The Organization for Economic Co-operation and Development (OECD) electronic working group on measuring trust in official statistics developed a survey for measuring trust in official statistics that was cognitively tested in six member countries, including the United States (Brackfield 2011). The goal of this development was to produce a model survey questionnaire that could be made available internationally and used comparably in different countries. Unfortunately, a 2010 National Center for Health Statistics (NCHS) cognitive study revealed that these questions are inadequately understood by U.S. respondents (Willson et al. 2010) and therefore would be unable to sufficiently measure trust in the FSS in the United States. As such, the FSS Working Group sought to build upon the theoretical constructs and previous research on this subject (Fellegi 2004; OECD Working Group 2011; Willson et al. 2010) in designing and administering a version of this poll that might adequately measure U.S. public opinion of the FSS.
The data collection does not impact small entities.
A key analytic goal is to measure how current events influence trust in the federal statistical system. While some current events, like a presidential election, may be known in advance, other current events are unforeseen, like a data breach or policy change in another part of the government. In order to measure public opinion immediately before and after any given event – predicted or not – we must maintain a consistent, daily data collection for a period of time.
There are no special circumstances. Data collection is conducted in accordance with the Office of Management and Budget (OMB) guidelines.
Within the Federal Government, consultants include the Statistical and Science Policy Office, Office of Management and Budget; the National Agricultural Statistics Service; the National Center for Health Statistics; the Economic Research Service; the Bureau of Labor Statistics; and the Statistics of Income Division, IRS. This group represented the interests of the ICSP. In addition, the Census Bureau also received feedback from the Federal Committee on Statistical Methodology’s Subcommittee on Statistical Uses of Administrative Records.
On August 8, 2014, we published a notice in the Federal Register (Document Citation: 79 FR 51302; Pages: 51302-51303; Document Number: 2014-20418) seeking public comment on the necessity, content, and scope of the data collection. We received one comment, dated September 3, 2014, which expressed concern over respondent fatigue in a questionnaire that could have more than 25 questions. We agree with this concern and do not plan to ask more than 15 questions of any given respondent during any given survey. The baseline will be 7 questions, but occasionally more questions will be asked, per details in this submission. On a regular basis, burden should be 5 minutes or less.
Respondents will not be offered any gift or payment.
Because the Census Bureau is adding questions to an ongoing Gallup Survey, Gallup will introduce a statement indicating that participation in the survey is voluntary and that Gallup will not make the respondent’s information available in any way that would personally identify him or her. The Census Bureau will not receive any information to identify survey respondents directly. We address this during interview consent with the following statement: “Your responses will not be shared with anyone in a way that could personally identify you.”
The survey does not include questions of a sensitive nature.
The annual respondent burden for conducting 44,200 interviews is estimated at 7,367 hours. The maximum length for the Census Bureau’s portion of each interview is estimated to be 10 minutes. The average length is estimated to be 5 minutes. Respondents will not be contacted prior to or following the phone interview (that is, there are no advance letters or reminders accompanying this survey), so the entire hour burden is based on estimated interview time. There will be 850 interviews per week each year. Data collection will span up to 5 years.
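As a consistency check, the burden figures above follow from simple arithmetic. This sketch assumes a full 52-week year and that the annual hour burden is computed from the 10-minute maximum interview length; both are our reading of the figures, not statements from the submission.

```python
# Burden arithmetic for the estimates stated above.
interviews_per_week = 850
weeks_per_year = 52           # assumption: full 52-week year
max_minutes_per_interview = 10  # assumption: burden uses the maximum length

annual_interviews = interviews_per_week * weeks_per_year
annual_burden_hours = round(annual_interviews * max_minutes_per_interview / 60)

print(annual_interviews)    # -> 44200
print(annual_burden_hours)  # -> 7367
```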
There are no costs to respondents other than that of their time to respond.
The annual cost of this data collection is estimated to be $750,000 per year and is funded by the Census Bureau. Data collection will span up to 5 years.
To reduce costs, the number of nightly interviews was changed from 200 to 121. This change resulted in a corresponding reduction in the estimated hour burden from 11,667 to 7,367.
The timeline below is based on receiving OMB approval on 6/30/2015.
Task                | Start         | Finish
Data collection     | July 1, 2015  | October 31, 2019
Rotations – monthly | July 1, 2015  | October 31, 2019
Data analysis       | July 31, 2015 | December 30, 2019
Although the Gallup Daily Tracking Survey is portrayed as being nationally representative, it does not meet Census Bureau quality standards for dissemination and is not intended for use as precise national estimates or for distribution as a Census Bureau data product. The Census Bureau will use the results from this survey to monitor awareness and attitudes, as an indicator of the impact of potential negative events, and as an indicator of potential changes in awareness activities. Other federal statistical agencies may use these data for similar purposes. Data from the research will be included in research reports with clear statements about the limitations and noting that the data were produced for strategic and tactical decision-making and exploratory research, not for official estimates. Research results may be prepared for presentation at professional meetings or for publication in professional journals to promote discussion among the larger survey and statistical community and to encourage further research and refinement. Again, all presentations or publications will provide clear descriptions of the methodology and its limitations.
We are requesting an exemption to not display the expiration date because these data will be collected in the middle of a series of questions already collected by Gallup, most of which are not collected for the government and not covered under this clearance. See item 19 for further explanation of this justification.
In addition to the exemption covered above, we are requesting an exemption to item (g), sections (i), (ii), (iii), and (vi) of the certification. Because this data collection occurs as part of an ongoing data collection by Gallup, most of whose questions are not related to this effort or to this clearance, and because the OECD recommends that these types of data be collected by an independent third party (OECD Working Group, 2011), we are requesting an exception to informing all participants of (i) why the information is being collected; (ii) the use of the information; (iii) the burden estimate; and (vi) the need to display a currently valid OMB control number. This information will be available to Gallup interviewers in FAQs, should a respondent request it. However, to maintain the flow of the survey questions from those collected by Gallup to those added by the Census Bureau, and to gather data free of bias, we request a waiver from presenting these pieces of information to all participants.