
National Center for Health Statistics (NCHS) Rapid Surveys System (RSS)

OMB: 0920-1408



Supporting Statement A for Request for Clearance:

NATIONAL CENTER FOR HEALTH STATISTICS

RAPID SURVEYS SYSTEM


OMB No. 0920-XXXX

Expiration Date: XX/XX/XXXX


Contact Information:


Stephen Blumberg, Ph.D.

Director

Division of Health Interview Statistics

National Center for Health Statistics/CDC

3311 Toledo Road

Hyattsville, MD 20782

301-458-4107

swb5@cdc.gov




May 24, 2023


  • Goal of the study: The Rapid Surveys System has three major goals: 1) to provide CDC and other partners with time-sensitive data of known quality about emerging and priority health concerns, 2) to use these data collections to continue NCHS’ learning and evaluation of the quality of public health estimates generated from commercial online panels, and 3) to improve methods to appropriately communicate the fitness for use of public health estimates generated from commercial online panels.

  • Intended use of the resulting data: Provide time-sensitive or other estimates that are appropriate for policy or programmatic needs.

  • Methods to be used to collect: The platform is designed to collect self-reported health data using commercially available, statistically sampled national probability-based online panels.

  • The subpopulation to be studied: The data are intended to be nationally representative of the U.S. household population. Depending on the sample size, topic, and analytic goals, estimates for specific subpopulations defined by demographic or socioeconomic characteristics can be generated.

  • How data will be analyzed: Data will be weighted and will include appropriate variance estimation variables. A set of tables and infographics will be produced, and the data will be released via a dashboard, public-use files, and restricted-use files via the RDC.



The major NCHS data collections referenced in this package:

  • National Health Interview Survey – (NHIS – OMB No. 0920-0214, Expiration Date 12/31/23)

  • National Health and Nutrition Examination Survey – (NHANES – OMB No. 0920-0950, Expiration Date 4/30/25)

  • National Survey of Family Growth – (NSFG – OMB No. 0920-0314, Expiration Date 12/31/24)

  • Research and Development Survey – (RANDS – OMB No. 0920-0222, Expiration Date 1/31/26)








Table of Contents


A. JUSTIFICATION 3

1. Circumstances Making the Collection of Information Necessary 3

2. Purpose and Use of Information Collection 4

Background for Probability-Sampled Commercial Survey Panels 4

Rapid Surveys Goals 6

Implementation of Quarterly Surveys 7

Research Areas 8

Cognitive Testing 11

3. Use of Improved Information Technology and Burden Reduction 12

4. Efforts to Identify Duplication and Use of Similar Information 12

5. Impact on Small Businesses and Other Small Entities 14

6. Consequences of Collecting the Information Less Frequently 14

7. Special Circumstances Relating to Guidelines of 5 CFR 1320.5 14

8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agencies 14

9. Explanation of Any Payment or Gift to Respondents 15

10. Protection of the Privacy and Confidentiality of Information Provided by Respondents 16

Rapid Surveys and Personally Identifiable Information (PII) 17

General Privacy and Confidentiality Protection Procedures 17

11. Institutional Review Board (IRB) and Justification for Sensitive Questions 19

12. Estimates of Annualized Burden Hours and Costs 19

Estimated Annualized Burden Table 20

Estimated Annualized Burden Costs to Respondents 20

13. Estimates of Other Total Annual Cost Burden to Respondents and Record keepers 21

14. Annualized Costs to the Federal Government 21

15. Explanation for Program Changes or Adjustments 21

16. Plans for Tabulation and Publication and Project Time Schedule 21

17. Reason(s) Display of OMB Expiration Date is Inappropriate 22

18. Exceptions to Certification for Paperwork Reduction Act Submissions 22



LIST OF ATTACHMENTS


Attachment A – Public Health Service Act

Attachment B – Published 60-Day FRN

Attachment B1 – Comments

Attachment C – Guidance for sponsors

Attachment D – RSS Round 1 (2023) Burden Estimate and Questionnaire

Attachment E – RSS Round 1 (2023) Content Justification from Sponsors

Attachment F – 30-day Federal Register Notice for the RSS and RSS Round 1 (2023)

Attachment G – NCHS Ethics Review Board Determination Notice

Attachment H – Publications on RANDS Estimation and Calibration Research



Supporting Statement A

NCHS Rapid Surveys System


A. JUSTIFICATION


1. Circumstances Making the Collection of Information Necessary


The National Center for Health Statistics (NCHS), Centers for Disease Control and Prevention (CDC), requests OMB approval for a new data collection, the NCHS Rapid Surveys System. The system is intended for contexts in which decision makers’ need for time-sensitive data of known quality about emerging and priority health concerns is a higher priority than their need for statistically unbiased estimates.


Traditionally, NCHS generates robust nationally representative statistics using methods that maximize relevance, accuracy, and reliability. While NCHS’ gold standard sampling, interviewing, and post-processing strategies are pivotal for examining yearly trends in disease and behavioral risk factors across the nation as well as differences across demographic and geographic groups, they are not as useful for responding to short-term challenges to public health. CDC uses other data sources to identify and track emerging public health threats, such as those associated with disease outbreaks. During the COVID-19 pandemic, the implications of the unknown quality of data from some public health surveillance approaches became clearer. In response, CDC is working to better understand the limits of its public health surveillance systems and to develop a mechanism that facilitates collection of time-sensitive survey data with known quality. This Information Collection Request (ICR) is part of the CDC Director’s commitment to developing and institutionalizing new systems, processes, and policies that will enable more time-sensitive sharing of data and scientific findings suitable for decision making. This ICR requests approval for a survey data collection system that would allow CDC to generate time-sensitive estimates of known representativeness and accuracy.


For the first year, the Rapid Surveys System will be designed around a quarterly data collection cadence that can be increased or reduced, thereby avoiding the expense and time associated with setting up and contracting new data collections for each public health need. NCHS will seek approval from OMB under the Paperwork Reduction Act (PRA) for each individual survey through a non-substantive change request, accompanied by a 30-day Federal Register Notice that announces the content, weighting, and development work associated with that particular survey. The cadence may be longer or shorter during any given year, depending upon demand from CDC and HHS partners and collaborators.


The Rapid Surveys System will use a platform designed to approximate national representation in ways that many data collection approaches cannot. Specifically, the platform is designed to collect self-reported health data using commercially available, statistically sampled national probability-based online panels. Since the resulting estimates will only approximate nationally representative estimates, the System includes a number of mechanisms to carefully evaluate the resulting survey data for its appropriateness for use in public health surveillance and research (e.g., hypothesis generating). As such, an important aspect of the System will be a data dissemination strategy that communicates the strengths and limitations of data collected through the System’s online panel approach, depending on the topic and sample, as compared to more robust data collection methods. The System also includes mechanisms to facilitate continuous quality improvement by supplementing these panels with intensive efforts to understand how well the estimates reflect populations at most risk.


Consistent with NCHS’s commitment to quality and professional standards of practice, the Rapid Surveys System will carefully evaluate the quality of the data and clearly communicate what is known about its validity, accuracy, and the appropriateness of the data for a given purpose. This quality assessment will include analytic evaluations of alternative methods for obtaining estimates after adjusting, calibrating, and modeling data from commercial survey panels with other data sources. It will benefit from statistical work with commercial survey panels completed over the past few years by the NCHS Research and Development Survey (RANDS) and from the ability to coordinate content with NCHS’s established surveys. These gold-standard surveys will be used to help calibrate the panel survey data. These gold-standard collections and other sources will also provide benchmarks to help judge the objectivity and utility of the Rapid Surveys estimates.


Data collection for this project is authorized under 42 U.S.C. 242k (Section 306 of the Public Health Service Act). A copy of the legislation is provided in Attachment A. This clearance request covers three years, 2023-2025, with four quarterly data collections conducted annually. The questionnaire content will vary from quarter to quarter. Additional data collections might be scheduled between quarterly data collections.


2. Purpose and Use of Information Collection


The ability to collect data faster than more robust methods will be achieved, in part, by conducting question evaluation concurrent with (rather than in advance of) data collection, by using sampling frames (commercial online survey panels) that may be subject to errors of representativeness, by using a survey mode that is faster than fielding interviewer-administered surveys, and by simplifying the post-processing procedures. The overall goal is to provide publicly available estimates within six months of when the survey design process starts.


Background for Probability-Sampled Commercial Survey Panels


Online survey panels have become increasingly popular for conducting research on the U.S. population (Callegaro et al., 2014); however, there is no broad awareness of their fitness for guiding public health decisions. Panels that are not generated using probability sampling, and those that are not transparent about their recruiting methods or response rates, are impossible to use for generating national estimates and thus can be misleading for generating national policy responses. On the other hand, probability-sampled commercial survey panels generally consist of persons who are recruited using statistical sampling and agree to participate in multiple surveys, typically in exchange for payment or prizes. These survey panels are designed to take advantage of the efficiencies in using online surveys, though other modes such as telephone can be used to improve data accuracy. Such probability-based methods to create panels, with appropriate post-analysis evaluation and weighting, can be used to approximate nationally representative estimates. When their methods are transparently described and coverage and nonresponse bias assessments are possible, they can be an important tool in the public health toolbox.


The KnowledgePanel operated by Ipsos Public Affairs, LLC, and the AmeriSpeak Panel operated by NORC at the University of Chicago are two of the leading panels in existence today. NCHS has contracted with Ipsos and NORC for the Rapid Surveys System to collect data from both the KnowledgePanel and AmeriSpeak panels on a periodic basis. These two panels recruit panel members using traditional probability-based sampling methods such as Random Digit Dialing (RDD) or Address Based Sampling (ABS). The use of statistically sampled, probability-based, online survey panels provides a statistical foundation for making weighting adjustments, survey estimation, and computing standard errors. In addition, estimates from probability-based panels have been shown to be closer to benchmark estimates from traditional gold standard surveys like the National Health Interview Survey (NHIS) than estimates from nonprobability-based panels (Yeager et al., 2011).


Such online probability-based panel surveys have several strengths for data collection. First, survey instruments can be fielded more quickly and at a lower cost than traditional surveys since online surveys are generally self-administered and the panel members have already been recruited. Second, some information is known about the participants prior to fielding the survey so a sample for a particular survey can be selected based on target demographic characteristics. Third, panel surveys offer the opportunity for both cross-sectional and longitudinal data that can be used to track trends over time. Finally, using commercial online panels reduces the cost to the government since the panels are built and maintained by private companies and used by multiple government and nongovernment clients. The cost to NCHS of building and maintaining its own survey panel would be substantial.


While probability-based online panel surveys begin from rigorous sampling methodology, they have been criticized for a lack of representation of the U.S. population for two main reasons. First, although a very high percentage of people in the U.S. now have access to the internet, those without internet access differ from those who do have access on key demographics such as age, race, education, and household income (Martin, 2021; Pew Research Center, 2022). There are also differences in access to computer devices, mobile devices, and broadband internet by demographics (Martin, 2021; Pew Research Center, 2022). Commercial probability-sampled survey panels improve upon these deficiencies by using telephone- or address-based recruitment of panel participants and by offering Internet access or an alternative response mode to those who need it. Still, those who participate in survey panels likely differ from those who do not, potentially reducing accuracy. It is also possible that some people are less willing to participate in online panel surveys for other reasons, such as privacy concerns about providing information over the Internet or difficulty using technology.


Second, while the percentage of panel members who complete any given survey request to an online panel might be quite high, the response rates from the initial stage of recruitment to online panel surveys can be low and panels also experience attrition over time. These accumulating elements of nonresponse may further reduce accuracy.


In the past decade, NCHS began using commercial online survey panels primarily for methodological research. Namely, as part of NCHS’s RANDS program, NCHS used commercial survey panels to augment NCHS’ question evaluation and research program with quantitative methodologies for measuring error, and for other statistical research purposes. In response to the COVID-19 pandemic, NCHS expanded its use of commercial panels and collected survey data in order to release timely, experimental estimates on a set of COVID-19- and pandemic-related topics. With the Rapid Surveys System, NCHS will continue exploring the use of commercial survey panels, not only for methodological research, but also for producing estimates on key health outcomes and emerging topics that NCHS will release publicly.


A key feature of Rapid Surveys will be the development of a methodology for mitigating the risk of errors of representativeness. NCHS is in a unique position to address this risk due to the existence of gold standard surveys like the NHIS that deploy extensive effort to address survey error. For example, the sampling frame of the NHIS consists of addresses obtained through a commercial vendor, and then the frame is supplemented with costly listing operations in areas where the address-based sampling frame does not have adequate coverage. The number of completed NHIS interviews is large (roughly five times the expected sample size of a quarterly Rapid Survey), providing precise estimates of many subgroups in the U.S. population on an annual basis. Moreover, NHIS interviews are conducted via personal visits with telephone follow-up to achieve high response rates and minimize the risk of nonresponse bias.


NCHS will utilize weighting calibration methods that leverage estimates from gold standard surveys like the NHIS in the weighting process to reduce the bias in estimates from Rapid Surveys. The process involves raking Rapid Survey weights to weighted distributions of variables from the NHIS. Effective variables used for calibration should be differentially associated with adult participation in probability-based online survey panels; that is, adults with the associated characteristic should be over- or under-represented in online survey panels. In addition, the calibration variables should be associated with health outcomes of interest that are measured in Rapid Surveys. The NHIS already includes a substantial number of health variables that could be used for calibration. Questions can also be added to the NHIS to create additional variables to improve the weighting process and produce more robust estimates from Rapid Surveys. More details on these methods are provided in Supporting Statement B.
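The raking step described above can be illustrated with a minimal sketch of iterative proportional fitting. The variables, categories, and control totals below are purely hypothetical; production weighting would use survey software and NHIS-derived control totals rather than this toy implementation.

```python
# Minimal sketch of raking (iterative proportional fitting): base
# weights are repeatedly rescaled so that weighted margins for each
# calibration variable match target totals (e.g., from the NHIS).
from collections import defaultdict

def rake(records, weights, targets, max_iter=50, tol=1e-8):
    """records: list of dicts mapping variable name -> category
    weights: list of starting (base) weights
    targets: {variable: {category: target_total}}"""
    w = list(weights)
    for _ in range(max_iter):
        max_change = 0.0
        for var, cats in targets.items():
            # current weighted totals for this variable's categories
            totals = defaultdict(float)
            for rec, wi in zip(records, w):
                totals[rec[var]] += wi
            # rescale each weight by target/current for its category
            for i, rec in enumerate(records):
                factor = cats[rec[var]] / totals[rec[var]]
                w[i] *= factor
                max_change = max(max_change, abs(factor - 1.0))
        if max_change < tol:
            break
    return w

# Toy example: two hypothetical calibration variables with known totals
records = [
    {"sex": "F", "age": "18-44"},
    {"sex": "F", "age": "45+"},
    {"sex": "M", "age": "18-44"},
    {"sex": "M", "age": "45+"},
]
base = [1.0, 1.0, 1.0, 1.0]
targets = {
    "sex": {"F": 52.0, "M": 48.0},
    "age": {"18-44": 55.0, "45+": 45.0},
}
w = rake(records, base, targets)
# After convergence, the weighted margins reproduce the targets.
```

In practice the raked weights would also be trimmed and evaluated for variance inflation, but the core mechanism of forcing panel margins to match gold standard totals is as shown.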



Rapid Surveys Goals


The Rapid Surveys System has three major goals: 1) to provide CDC and other partners with time-sensitive data of known quality about emerging and priority health concerns,

2) to use these data collections to continue NCHS’ learning and evaluation of the quality of public health estimates generated from commercial online panels, and 3) to improve methods to appropriately communicate the fitness for use of public health estimates generated from commercial online panels. The Rapid Surveys System is intended as a complement to the current household survey systems, including the NHIS. The Rapid Surveys are intended to be quicker-turnaround surveys that require less accuracy and precision than CDC’s more rigorous population-representative surveys, due to differing methodology (sample design, response rates) as well as smaller sample sizes.


For the first year (four quarterly data collections), NCHS will obtain 2,000 survey responses from the KnowledgePanel and 4,000 survey responses from the AmeriSpeak Panel for each of the four quarterly surveys. For subsequent data collections, NCHS plans at least 2,000 completed responses from each of the two vendors, with larger sample sizes if funds are available.


An approximate expedited timeline for each quarterly survey in Rapid Surveys will be as follows:


  • Survey design (1 month)

  • OMB clearance (45 days)

  • Instrument programming and pretesting (1 month)

  • Data collection (no more than 21 days)

  • Post-processing and public release of estimates (2 months)


Implementation of Quarterly Surveys


The survey design phase will involve NCHS planning content and working with partners from CDC and other agencies who have new or emerging data needs. For each individual survey, NCHS will meet with OMB to discuss the content being proposed as well as the proposed parallel cognitive testing and panel evaluation in advance of posting the required 30-Day Federal Register notice announcing the content to the public. Everything related to the specific collection will be in both the briefing and the written justification for the non-substantive change. OMB commits to a 30-day review. Once each round’s questionnaire has been finalized, each data collection contractor will develop computerized self-administered questionnaire (CSAQ) specifications and program their respective survey instruments. The data collection contractors will also develop materials for alternative response modes – computer-assisted telephone interviewing (CATI) or mailed questionnaires – if needed for panel participants without internet access or for those who request them.


Each round’s questionnaire will consist of four main components: (1) basic demographic information on respondents to be used as covariates in analyses, (2) new, emerging, or supplemental content proposed by NCHS, other CDC Centers, Institutes, and Offices (CIOs), and other HHS agencies, (3) questions used for calibrating the survey weights, and (4) additional content selected by NCHS to evaluate against relevant benchmarks. Questions that are part of Components 1 and 2 will be used to achieve the first goal of providing relevant, timely data on new, emerging, and priority health topics that are fit-for-use for decision making. Attachment C includes guidance to partners for proposing content for the Rapid Surveys program. Attachments D and E include the questionnaire, burden estimate, and content justification from contributors for the first round of RSS in 2023. Attachment F requests public comment on the Rapid Surveys System overall and the first round of data collection in 2023. Attachments D, E, and F will be updated for each future round of data collection.


All content from NHIS, NSFG, NHANES, the American Community Survey (ACS), or the Current Population Survey (CPS) is available for calibration purposes. In addition, new content may be added to both NHIS and a Rapid Surveys questionnaire to evaluate its utility for calibration. NCHS will not disseminate statistical estimates for the variables added to a given Rapid Surveys questionnaire solely for the purposes of calibrating survey weights (Component 3); rather, the weighted sums of the calibration variables will be forced to equal better-estimated population totals from a source like the NHIS. However, as long as the calibration variables are of sufficient data quality and there are no confidentiality restrictions, the survey responses will be included in the publicly available data files for both the NHIS and Rapid Surveys.


Questions included specifically for benchmarking, i.e., comparing estimates from the Rapid Surveys to estimates from other sources, will be used to carefully evaluate and communicate the fitness for use of health estimates from that particular survey. Over time, the benchmarks may be helpful in evaluating estimates from available commercial probability-based online panels in general. NCHS will not disseminate statistical estimates for the variables generated from the benchmarking questions. Rather, the resulting estimates for the benchmark variables will be compared as part of methodological work to the “gold-standard” estimates from other sources like the NHIS to understand how Rapid Surveys estimates differ from the other sources. Any content from NHIS, NSFG, or NHANES could be included in a Rapid Surveys questionnaire for benchmarking purposes. Benchmarking content will vary across survey rounds and generally include questions on chronic conditions, functioning and disability, health care access and use, and health-related behaviors. Over time, this approach will permit the Rapid Surveys System to evaluate whether estimates for some health topics have greater fitness for use than other topic areas.


The inclusion of benchmarking questions serves one additional goal of the Rapid Surveys System: To develop and carefully evaluate estimates for key health outcomes (that is, topics that are currently included on NCHS surveys) using statistical models that make joint use of data from recent quarterly panel surveys with older data from NHIS. These modeled estimates may provide predictions or “nowcasted” estimates for key NHIS health outcomes or increase the reliability of disaggregated estimates for demographic or socioeconomic subgroups. In this manner, the Rapid Surveys System could improve the timeliness and/or granularity of NCHS surveys. This effort is consistent with NCHS’s Strategic Plan, which calls for “forging new methods to produce model-based estimates to address geographic, temporal, and demographic gaps in data.”

The two data collection contractors will separately administer the same questionnaire and then provide a data file with sampling weights and variance estimation variables to NCHS. NCHS, with the assistance of a support contractor (RTI International), will perform data quality checks, including evaluating each round’s data against relevant benchmarks, merging the files, and conducting final weighting and variance estimation. Quality evaluation analyses will include an assessment of the similarities and differences in estimates from the two commercial panels and a determination of whether combining data from the two panels yields greater accuracy than can be achieved by any one panel alone.
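One simple way to merge the two contractors' weighted files, sketched below under illustrative assumptions, is composite weighting: rescale each panel's weights by a compositing factor so the combined file still sums to the same population total. The equal 50/50 split here is hypothetical; NCHS's actual combination method will follow from the evaluation work described above.

```python
# Hypothetical sketch of combining two panel data files with composite
# weights. Both input weight sets are assumed to target the same
# population total N; the compositing share is illustrative only.
def composite_weights(weights_a, weights_b, share_a=0.5):
    """Rescale two weight sets, each summing to the population total N,
    so the concatenated file also sums to N."""
    n_a, n_b = sum(weights_a), sum(weights_b)
    assert abs(n_a - n_b) < 1e-6 * n_a, "both files should target the same total"
    wa = [w * share_a for w in weights_a]
    wb = [w * (1.0 - share_a) for w in weights_b]
    return wa + wb

panel_a = [10.0, 20.0, 30.0]   # toy weights summing to 60
panel_b = [15.0, 15.0, 30.0]   # toy weights summing to 60
combined = composite_weights(panel_a, panel_b)
# sum(combined) == 60.0, preserving the population total
```

More refined compositing would set the share per panel based on effective sample size or estimated accuracy, which is exactly the kind of question the planned quality evaluation analyses would inform.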


NCHS will make statistical estimates and the underlying microdata publicly available in as timely a fashion as feasible after each round. Dissemination modes may include web-tables with basic descriptive statistics, an online interactive dashboard where users can select pre-tabulated estimates (including standard errors/confidence intervals), and restricted-use and public-use files. All of these analytic products will include highly visible and transparent information regarding any limitations and data quality. In particular, the documentation will indicate that Rapid Surveys differ in quality from NCHS’ official statistics, and thus those higher-quality established data collections should be used as the basis for examining yearly trends in established indicators. The documentation will highlight key methodological concerns that may increase the risk of bias in Rapid Surveys estimates. A set of simple infographics that can be readily disseminated via social media is also planned; these will include a link to the data use disclaimers.


Following each round, the data collection contractors will produce a methodology report that describes the composition and representativeness of the sample. This information will be used to inform sample selection procedures in subsequent rounds to improve sample coverage as needed. All work will be done in conjunction with NCHS and under NCHS’s oversight and direction. The overall evaluation work will be done by the data collection contractors. Some of the weighting will be done by the data collection contractors and further methodological work on how to combine data files/weights/variance estimators will be supported by RTI. RTI will also be responsible for benchmarking analyses.


Research Areas


The development of this system was guided by methodological findings from NCHS’s Board of Scientific Counselors. The Board of Scientific Counselors was asked to 1) identify ideal conditions for the use of probability-based online panels for emerging or supplemental topics where “gold standard” survey data may or may not be available and 2) identify additional research and evaluation needed to increase confidence in the fitness-for-use of estimates from online panel surveys for these purposes. The Board formed a working group which sought expert input from both working group members and outside experts as identified in Section 8 below. The working group’s findings and the Board’s recommendations (dated June 6, 2022) can be found online at: https://www.cdc.gov/nchs/about/bsc/bsc_letters.htm


Based on their findings, the BSC’s major recommendations for using online panel surveys include:

  • Avoid using online panels for estimates that can be produced by NCHS’ current cross-sectional household surveys, as the former will have a greater risk of bias

  • Avoid using online panels for estimates related to the data collection methods (e.g., technology use and internet use) or willingness to participate (e.g., volunteering, voting), as these measures are known to be related to the likelihood of participation in online probability samples

  • Estimates for some subgroups can be particularly susceptible to bias in online panels, so the design of the panel needs to ensure that it is “representative” not just demographically, but within the groups of interest

  • Online panel estimates may be well-suited for generating estimates of change over time when panel designs are stable over time

  • Online panel estimates are well-suited for piloting new or altered survey questions prior to implementation on large-scale household studies.


The BSC provided recommendations for additional research in six areas, and the Rapid Surveys System will include research activities related to each of these six areas. Those recommendations and research plans are identified below:


  1. Research augmenting panels with other samples or administrative data to address undercoverage and nonresponse


Both Ipsos and NORC have submitted plans for improving the representativeness of their respective panels. These plans vary due to the specifics of the panels maintained by each contractor. The Rapid Surveys System may include any or all of the following strategies when implementing each survey:

  • Underrepresented groups can be oversampled using sampling strata based on demographics such as age, race/Hispanic ethnicity, education, household income, and gender.

  • Telephone interviews may be included for panelists who prefer phone interviews.

  • Incentives may be increased for underrepresented groups.

  • Recruitment of panelists from underrepresented groups may be enhanced by mailing advance letters and refusal conversion materials.

  • Self-administered mail questionnaires may be mailed to subgroups of interest to improve response from underrepresented groups.

  • Additional respondents from underrepresented groups may be recruited from opt-in nonprobability samples or from mailings to an address-based sample.


The Rapid Surveys System is particularly interested in evaluating the quality of estimates from commercial survey panels for subpopulations that, during a public health crisis, are susceptible to increased morbidity, disease transmission, or impacts on well-being. Estimates from these panels may be sufficiently accurate for decision making regarding emerging and priority health concerns nationally, but may be insufficient for subgroups defined by income, education, or social status, or who experience lack of access to health care or racial discrimination. Underrepresentation of these subpopulations in commercial survey panels may limit the utility of estimates from such panels.


The bulleted list in the previous paragraph identified strategies for improving the overall representativeness of the survey panels. Similar strategies may also be used as part of developmental activities that increase sample sizes and enhance the precision of estimates for specific subgroups and generate sufficient data for evaluations of the quality of estimates for these subgroups. That is, the Rapid Surveys System may target individual surveys to collect data only from specific subgroups within existing survey panels, may enhance recruitment efforts with such groups, and may supplement data collection for such groups with additional respondents from other probability or nonprobability samples.


  2. Conduct an evaluation study to inform the suitability of online panels for types of estimates, limitations of the panel approach, needed changes to the design, and improvements to weighting and estimation


Both data collection contractors will be conducting separate evaluations that incorporate data from their own panel and the other contractor’s panel. The purpose of having both data collection contractors conduct independent evaluations is to improve the objectivity of the final conclusions. NCHS will combine and synthesize the two reports, reconcile any differences in conclusions, and prepare a single cohesive evaluation report.


The evaluations from each contractor will include literature reviews on the use of probability-based commercial panel data for national estimates and methods of estimation for combining survey data from commercial panels with national probability surveys. The literature reviews will provide the background and motivation for each contractor to propose survey estimation methods for weighting and calibrating the estimates from the panels.


  2. Evaluate different postsurvey weighting adjustments for nonresponse bias.


As part of their independent evaluations, each contractor will evaluate alternative methods for producing weighted estimates from commercial survey panels. This will include weighting methods that calibrate the estimates from the panel surveys to one or more NCHS population surveys like the NHIS. Analytic evaluations will be conducted to understand the extent to which the alternative weighting methods reduce the amount of bias in estimates from Rapid Surveys. For example, the estimates from Rapid Surveys using the alternative weighting methods can be compared to benchmark estimates from a gold standard survey like the NHIS. Differences in the variance of estimates produced by alternative weighting methods will also be evaluated to understand the impact on the total error of survey estimates. Based on these analytic evaluations, the contractors will provide recommendations for alternative weighting approaches and their future implementation to achieve NCHS objectives for Rapid Surveys. The contractors will also identify directions for additional research and methodological development.
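Calibration weighting of the kind described above is commonly implemented by raking (iterative proportional fitting), in which panel weights are adjusted until weighted totals match benchmark margins from a reference survey. The sketch below illustrates the core loop; the respondent records and margin totals are invented for illustration and are not NHIS or panel figures.

```python
# Minimal sketch of raking (iterative proportional fitting), one way
# panel weights can be calibrated to benchmark margins from a reference
# survey such as the NHIS. All data below are invented for illustration.

def rake(respondents, margins, iterations=50):
    """Adjust weights until weighted totals match each margin in turn."""
    weights = {rid: 1.0 for rid in respondents}
    for _ in range(iterations):
        for variable, targets in margins.items():
            # Weighted total currently in each category of this variable
            totals = {category: 0.0 for category in targets}
            for rid, attrs in respondents.items():
                totals[attrs[variable]] += weights[rid]
            # Scale each weight so category totals match the targets
            for rid, attrs in respondents.items():
                category = attrs[variable]
                weights[rid] *= targets[category] / totals[category]
    return weights

# Hypothetical panel of four respondents and population margin totals
respondents = {
    1: {"sex": "F", "age": "18-44"},
    2: {"sex": "F", "age": "45+"},
    3: {"sex": "M", "age": "18-44"},
    4: {"sex": "M", "age": "45+"},
}
margins = {
    "sex": {"F": 510, "M": 490},
    "age": {"18-44": 430, "45+": 570},
}

weights = rake(respondents, margins)
# After convergence, weighted margins reproduce the benchmark totals
print(round(weights[1] + weights[2]))  # female total -> 510
```

In practice, calibration for a production survey would involve many more dimensions, trimming of extreme weights, and appropriate variance estimation, but the adjust-to-margins loop above is the core idea.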


  3. Design auxiliary variables to aid weighting adjustments, rather than relying only on measures that are currently available.


An important feature of Rapid Surveys is the use of the NHIS to calibrate the weights from the online survey panels. This enables NCHS to leverage gold standard surveys to evaluate and improve the estimates from online panel surveys. The NHIS already includes many health variables that can be used for the purpose of weight calibration. It is also possible to add questions to the NHIS to use as calibration variables in the weighting process. For example, in quarter 3 of 2022, nine questions covering three topic areas were added to the NHIS to study weighting calibration methods in the NCHS Research and Development Survey (RANDS). The Rapid Surveys System will further evaluate the utility of these new topic areas for calibration. The evaluations of alternative weighting methods as part of Rapid Surveys may also identify new variables worthy of investigation as weighting calibration variables.


  4. Periodically evaluate online panel methodology, including benchmarking to other sources.


The literature reviews and analytic evaluations of alternative weighting methods will be updated annually by the contractors. Recommendations based on the evaluations will be used to make continuous methodological improvements to the Rapid Survey System and enhance the quality of estimates.


  5. Communicate data quality of web panel estimates relative to core survey data.


Another key aspect of Rapid Surveys is the development of standards and guidelines to communicate the quality of the data that are collected by Rapid Surveys and how that quality differs from gold standard surveys at NCHS like the NHIS. The evaluations of the panels in Rapid Surveys will inform this communication effort by providing scientific evidence about the circumstances under which Rapid Surveys can be calibrated to gold standard surveys. An important part of this communication is describing to data users and the public how the data and findings from Rapid Surveys are fit-for-use (Dever et al., 2020). For example, Rapid Surveys is not a substitute for NHIS or other NCHS population surveys given its different methodology, smaller sample sizes, lower precision (especially for subgroups), and greater potential for coverage and nonresponse biases. Rather, the program is intended for quicker-turnaround surveys that fill data gaps that NHIS and other NCHS surveys cannot address.


NCHS will evaluate in several ways whether the data support the needs of decision makers as expected. For each round, NCHS will generate statistics that help decision makers understand the strengths and limitations of the data they will receive. Each sample will be evaluated for its distributional properties prior to weighting, and NCHS will assess whether combining the samples to produce aggregated estimates would be appropriate. The data will be assessed for missing data rates, internal consistency, and consistency with external benchmarks. The results of these data quality assessments will be made available at the same time as the estimates. It is expected that tables, infographics, dashboards based on aggregate estimates, and restricted-use data files (accessible in the NCHS Research Data Center (RDC)) will be made available first. A public-use data file is expected to be available once NCHS Disclosure Review Board (DRB) approval is received. NCHS will also query its CDC and HHS partners after they receive the estimates and the data quality assessments, to better understand whether the estimates met their needs and whether the quality tradeoffs were worth the increased speed.


Cognitive Testing


Cognitive testing of survey questions will be an important part of Rapid Surveys to understand how respondents answer the survey questions. Due to the expedited schedule of Rapid Surveys, it typically will not be possible to cognitively test survey questions before they are fielded with a panel. Hence, given the short conception-to-production period, cognitive test results will not usually be used to identify problems and revise question wording before fielding. Instead, cognitive test results are more likely to be used by data users when interpreting survey results. The results from cognitive testing will be made publicly available in Q-Bank.


RTI, under contract with NCHS and with design input from CCQDER, will conduct an average of 80 cognitive interviews per year. Interviews may be conducted either in person at RTI offices or virtually and will last approximately one hour. The relative number of in-person and virtual interviews is flexible and may vary by round depending on the public health situation at the time, the survey topics, and the population being surveyed.


Currently, RTI is using Zoom for virtual interviews because it allows screensharing between the interviewer and participant, so that an interviewer may more easily observe interactions between the participant and the survey program and provide timely concurrent probing. Using Zoom also gives NCHS staff the opportunity to observe interviews remotely if requested. Zoom's native recording function will be used for screen and audio recording, and a digital backup recorder may be used by interviewers to capture the audio portion of the interview. As technology and government procurement decisions change, the approach to remote interviewing may change.


Participants will be able to refuse consent for audio and/or video recording. In these cases, the interview will proceed without those elements, and the interviewer will take notes instead. The recorded files will be identified by a unique participant ID number. Recordings and notes will be destroyed within four weeks of when the final memorandum with the interview findings is completed and approved by NCHS.


Participants who complete the interview will be given a $50 cash incentive.


The sample size for these interviews and the cognitive interview procedures do not allow for statistical inference to be conducted; therefore, the analyses will be entirely qualitative. Debriefings with the interviewers will be conducted to learn from their experiences regarding participants' reactions and responses to the survey questions and interviewer probes. The results will be summarized in a memorandum, provided to data users so that they may better understand how to interpret results based on those questions, and used to make recommendations for questions to be revised, included, or not included in subsequent NCHS surveys.


3. Use of Improved Information Technology and Burden Reduction


The Rapid Surveys will use information technology to reduce burden. The vast majority of surveys will be completed using self-response online questionnaires. Some surveys (generally no more than 10%) may be completed by computer-assisted telephone interviewing (CATI) or self-administered questionnaires distributed by mail. The online and CATI questionnaires will use skip patterns so that respondents do not have to answer questions that are not applicable to them.

4. Efforts to Identify Duplication and Use of Similar Information


The Rapid Surveys System is not intended to duplicate data collected in other NCHS surveys. Instead, it is intended to fill gaps in data collections that require a shorter turn-around, or to supplement or replace data collections currently being conducted by other parts of CDC that would prefer to use this platform due to quality, timeliness, or cost considerations.


The Rapid Surveys System is similar to the RANDS program described earlier, but there are important differences. The primary purpose of RANDS was for methodological research on weighting calibration methods and evaluation of survey questions. Rapid Surveys is intended to produce timely estimates for high priority and emerging health topics to meet the needs of NCHS and partners within the CDC and the Department of Health and Human Services. The timing, frequency, sample size, and funding varied for RANDS based on program needs and availability of resources. Rapid Surveys is intended to collect data and release estimates on a quarterly basis with a stable sample size. RANDS typically relied on one data collector, whereas Rapid Surveys will use two data collectors for the evaluation purposes mentioned in this supporting statement.


RANDS and Rapid Surveys also differ in terms of how content is determined, the output produced, and goals for survey estimation. The content for RANDS depended largely on statistical or question evaluation research needs. Content for Rapid Surveys is determined by NCHS and CDC priorities for health information. RANDS released findings primarily through research reports, journal manuscripts, and presentations. The primary focus of Rapid Surveys is the release of quarterly estimates through an interactive dashboard (data query tool) on the NCHS website. The goal of survey estimation and the creation of weights varied in the RANDS program depending on the research purpose for a particular round of data collection. In contrast, Rapid Surveys will use model-based calibrated weights to align estimates with other gold standard surveys from NCHS like the NHIS.


NCHS has also partnered with the Census Bureau on the Household Pulse Survey (HPS) to collect urgent, relevant data during the pandemic. NCHS published weekly dashboards based on HPS health data on the NCHS website. The HPS dashboards have been widely used, indicating the interest in timely, relevant data. The HPS is an omnibus survey with many different topics such as economics, labor, and health, but space is limited. These space limitations mean that CDC needs for timely health data cannot be entirely met through HPS. Concerns have also been raised about significant nonresponse bias due to low response rates in HPS.


There will be overlap between the Rapid Surveys and established NCHS surveys for two components of the questionnaire. First, variables that will be used for weight calibration will be included in both the Rapid Surveys and a gold-standard data collection (e.g., NHIS). Calibration variables will be used in the weighting procedure to adjust for nonresponse and other forms of bias. Second, the NHIS and Rapid Surveys will include a small set of variables that will be used for benchmarking to assess the quality of the Rapid Surveys data. Calibration and benchmarking variables may also come from other high-quality population surveys from CDC or other federal agencies.


5. Impact on Small Businesses and Other Small Entities


Information collection for Rapid Surveys does not involve small businesses or other small entities.


6. Consequences of Collecting the Information Less Frequently


Less frequent data collection would hinder the ability of NCHS to achieve the CDC Director’s goal to generate more time-sensitive data of known quality for rapid response, thereby providing timely data on new, emerging, and priority health topics that are fit-for-use for decision making.


7. Special Circumstances Relating to Guidelines of 5 CFR 1320.5


There are no special circumstances.


8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside Agencies


A 60-day Federal Register Notice was published for the Rapid Surveys on February 17, 2023 (Vol. 88, No. 33, pp. 10337-8). NCHS received two non-substantive comments and no further changes were made to the data collection plan (Attachment B1).


The following individuals have been consulted about the Rapid Surveys System:


NCHS Board of Scientific Counselors Workgroup


  • Andy Peytchev, PhD, Workgroup Chair, RTI International

  • Mollyann Brodie, PhD, Kaiser Family Foundation

  • Kennon Copeland, PhD, NORC, University of Chicago

  • Scott Holan, PhD, University of Missouri


Outside Experts


  • Andrew Mercer, PhD, Senior Research Methodologist, Pew Research Center

  • Jenny Marlar, PhD, Director, Survey Research, GALLUP

  • Michael Dennis, PhD, Executive Director, AmeriSpeak, NORC

  • Frances Barlas, PhD, Senior Vice President, Research Methods, Ipsos Public Affairs

  • Ashley Kirzinger, PhD, Director of Survey Methodology, Kaiser Family Foundation

  • Eran Ben-Porath, PhD, Executive Vice President, SSRS

  • Cameron McPhee, MS, Chief Methodologist, SSRS

  • Jason Fields, PhD, MPH, Senior Researcher for Demographic Programs and the SIPP, Social, Economic, and Housing Statistics Division, US Census Bureau

  • Jennifer Hunter Childs, Assistant Center Chief, Emerging Methods and Applications, Center for Behavioral Science Methods, U.S. Census Bureau


In addition, the NCHS Director has had ongoing conversations with CDC senior leaders about the Rapid Surveys System to ensure that the goals of Rapid Surveys align with the needs of CDC.

9. Explanation of Any Payment or Gift to Respondents


The Ipsos KnowledgePanel uses a points-based system and respondents receive a standard incentive. Standard survey incentives depend on the length of the survey. At a minimum, panelists receive their baseline level of points for each survey. Baseline points differ by demographic group as follows:


Current baseline incentive details:


  1. All panel members get 1,000 points unless they are in #2-#6 below. Each 1,000 points is approximately the cash-equivalent of $1.

  2. Randomly selected HHs that were part of a test get 1,500 points, unless they are in #3, #4, or #6 below.

  3. All Black/African Americans get 2,000 points, unless they are in #6 below.

  4. All Less than High School members get 2,000 points, unless they are in #6 below.

  5. Members (recruited through March 2022) with Black/African American members and/or with Less than High School members in their HH get the same 2,000 points given to their fellow HH member, unless they are in #6 below.

  6. A subset of randomly selected 18-29 year olds have been randomly assigned 2,000 to 5,000 points.

For longer surveys, additional incentives are required. Clients can also choose to offer additional points to further improve response. Incentives by survey length are as follows:


Survey Length | Incentive | Bonus Points
< 16 minutes | Baseline points | Optional
16-25 minutes | Baseline points and sweepstakes | Optional
26-35 minutes | Bonus points required | $5
36-49 minutes | Bonus points required | $10
50-59 minutes | Bonus points required | $15
60 minutes | Bonus points required | $20



For the Rapid Surveys, Ipsos will follow its current standard incentive procedures described above. As such, incentives will vary based on survey length and panel member demographics.


The NORC AmeriSpeak panel also uses a points-based system for completing surveys. One thousand AmeriPoints has a cash equivalent of $1. Respondents for this project will receive standard incentives equivalent to $5 for surveys completed by web and $10 for interviews completed by phone. NORC may increase incentives to improve the response among underrepresented groups.


Participants who complete a cognitive interview will receive a $50 cash incentive.


NCHS will evaluate the performance of these incentives in helping to reach the target number of completed surveys and cognitive interviews. Based on the findings, experiments may be conducted to assess whether nominal increases in the incentive amounts help achieve these targets.


10. Protection of the Privacy and Confidentiality of Information Provided by Respondents

The NCHS Privacy Act Coordinator has reviewed this request and has determined that the Privacy Act is applicable. The related System of Records Notice is 09-20-0164 Health and Demographic Surveys Conducted in Probability Samples of the U.S. Population.


A Privacy Impact Assessment was submitted to the CDC Office of the Chief Information Security Officer. For the Rapid Surveys System, the process of informing respondents of the procedures used to keep information confidential begins with language on the initial screen explaining the voluntary nature of the survey and providing the legal basis and confidentiality assurance (shown in the questionnaire in Attachment D). Panel participants will be asked to review this information before beginning the survey on the next screen. This information includes all elements of informed consent, including the purpose of the data collection, the voluntary nature of the study, and the effect upon the respondent of terminating the interview at any time.


In the activity requested in this ICR, confidentiality provided to respondents is assured by adherence to Section 308(d) of the Public Health Service Act (42 U.S.C. 242m) which states:


"No information, if an establishment or person supplying the information or described in it is identifiable, obtained in the course of activities undertaken or supported under section...306 (NCHS legislation),...may be used for any purpose other than the purpose for which it was supplied unless such establishment or person has consented (as determined under regulations of the Secretary) to its use for such other purpose and (1) in the case of information obtained in the course of health statistical or epidemiological activities under section...306, such information may not be published or released in other form if the particular establishment or person supplying the information or described in it is identifiable unless such establishment or person has consented (as determined under regulations of the Secretary) to its publication or release in other form,..."


In addition, legislation covering confidentiality is provided according to the Confidential Information Protection and Statistical Efficiency Act or CIPSEA (44 U.S.C. 3561-3583), which states:


“Whoever, being an officer, employee, or agent of an agency acquiring information for exclusively statistical purposes, having taken and subscribed the oath of office, or having sworn to observe the limitations imposed by this section, comes into possession of such information by reason of his or her being an officer, employee, or agent and, knowing that the disclosure of the specific information is prohibited under the provisions of this subchapter, willfully discloses the information in any manner to a person or agency not entitled to receive it, shall be guilty of a class E felony and imprisoned for not more than 5 years, or fined not more than $250,000, or both.”


Rapid Surveys and Personally Identifiable Information (PII)


Ipsos and NORC have access to PII for their respective panel’s membership, including information such as name, date of birth, mailing address, and phone numbers. They collect these direct identifiers in order to maintain their proprietary panel, independent of this project. No additional direct identifiers are associated with the Rapid Surveys data collection, and no direct identifiers are transmitted to NCHS or the support contractor, RTI, from Ipsos or NORC’s servers. However, NCHS and its contractors will have access to potential indirect identifiers such as age, sex, race-ethnicity, employment status and urbanicity. Indirect identifiers are those that could be used, in combination with other available information, to identify an individual who is a subject of the information.


Ipsos and NORC have extensive cyber and physical security in place, including a CIPSEA Information Protection Plan approved by the NCHS Confidentiality Officer and the NCHS Information Systems Security Officer, in order to protect both the security of the front-end survey interface and the back-end storage of the survey’s data. Additionally, as contractors to NCHS, all Ipsos, NORC, and RTI employees working on the Rapid Surveys will complete NCHS confidentiality training, sign the NCHS affidavit of nondisclosure, and will be NCHS designated agents via a Designated Agent Agreement with NCHS.


General Privacy and Confidentiality Protection Procedures for the Rapid Surveys System


The collection of information in identifiable form during activities encompassed by this ICR requires strong measures to ensure that private information is not disclosed in a breach of confidentiality. Only NCHS employees, specially designated agents (including contractor staff), and research partners designated as CIPSEA agents who must use the restricted-use data for approved statistical purposes may use such data.


As noted above, all NCHS employees as well as all contract staff, receive appropriate training and sign a “Nondisclosure Statement.” Staff from collaborating agencies are also required to sign this statement, and members of outside agencies are required to enter into a more formal agreement with NCHS. Everyone else who uses Rapid Surveys data can do so only after all identifiable information is removed (as described below). In addition, the Cybersecurity Act of 2015 permits monitoring information systems for the purpose of protecting a network from hacking, denial of service attacks and other security vulnerabilities3. Monitoring under the Cybersecurity Act may be done by a system owner or another entity the system owner allows to monitor its network and operate defensive measures on its behalf. The software used for monitoring may scan information that is transiting, stored on, or processed by the system. If the information triggers a cyber threat indicator, the information may be intercepted and reviewed for cyber threats. The cyber threat indicator or defensive measure taken to remove the threat may be shared with others only after any information not directly related to a cybersecurity threat has been removed. In addition, sharing of information can occur only after removal of personal information of a specific individual or information that identifies a specific individual.


It is NCHS policy to make Rapid Surveys data available via public use data files to the scientific community. Publicly released data sets will be available indefinitely on the NCHS website. A concerted effort is made to avoid any disclosures that may allow a researcher to go back and find individuals in the general population. To this end, prior to their release, the Rapid Surveys data files will be reviewed by the NCHS Disclosure Review Board to evaluate where disclosure risks might arise and how to minimize them. Several techniques are used to minimize these risks, including collapsing categories, top and bottom coding, adding noise to variables, and removing detailed geographic information, along with other statistically sound means. Researchers wishing to conduct analysis on variables not available in the public use data files may submit a research proposal to use the NCHS Research Data Center (RDC)4. The NCHS RDC website will include a note explaining the purposes of these data collections that links to the limitations statement in the Rapid Surveys technical notes.
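Two of the techniques named above, top coding and collapsing categories, can be illustrated in a few lines. The sketch below is purely illustrative; the thresholds and data are invented and do not reflect actual NCHS Disclosure Review Board parameters.

```python
# Illustrative sketch of two disclosure-limitation steps mentioned above:
# top-coding a continuous variable and collapsing sparse categories.
# Thresholds and data are invented, not actual NCHS DRB parameters.

def top_code(values, ceiling):
    """Report any value above the ceiling as the ceiling itself."""
    return [min(v, ceiling) for v in values]

def collapse_rare(categories, min_count, other_label="Other"):
    """Merge categories with fewer than min_count cases into one group."""
    counts = {}
    for c in categories:
        counts[c] = counts.get(c, 0) + 1
    return [c if counts[c] >= min_count else other_label for c in categories]

ages = [34, 51, 92, 87, 40]
print(top_code(ages, 80))        # [34, 51, 80, 80, 40]

states = ["CA", "CA", "CA", "WY", "CA", "CA"]
print(collapse_rare(states, 3))  # ['CA', 'CA', 'CA', 'Other', 'CA', 'CA']
```

Both steps reduce the chance that a rare combination of values (a very old respondent, a sparsely represented state) can be matched against outside information to identify an individual.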


The CIPSEA legislation authorizes the designation of agents (“designated agents” or “agents”) to perform statistical activities on behalf of an agency. These agents function under the supervision of the agency’s employees and are subject to the same provisions of law with regard to confidentiality as an agency’s employees. A Designated Agent Agreement between the agency and the designated agents (e.g., contractors) must be executed before the agents can acquire information for the agency for exclusively statistical purposes under a pledge of confidentiality. This requirement is outlined in an OMB Notice, published in the Federal Register on June 15, 2007, entitled “Implementation Guidance for Title V of the E-Government Act, Confidential Information Protection and Statistical Efficiency Act of 2002 (CIPSEA).” Additionally, the agents (contractors) will be required to complete the NCHS Confidentiality Training (https://www.cdc.gov/nchs/training/confidentiality/training/), submit a certificate of completion, and sign a pledge to maintain confidentiality prior to completing work. If the contractor hires subcontractors to complete work, the subcontractors must adhere to the same confidentiality and security requirements as NCHS staff and contractors, including annual training and signing of the affidavit.


11. Institutional Review Board (IRB) and Justification for Sensitive Questions


A determination was made by the NCHS Human Subjects Contact that Rapid Surveys is excluded from the Common Rule (Attachment G). It does not meet the definition of research as noted in 45 CFR 46.102. It is classified as surveillance, and the NCHS Practices and Procedures for the Protection of Human Subjects apply. The protocol is undergoing review by the NCHS Research Ethics Review Board.


Because Rapid Surveys are designed to survey the public about different health topics, some surveys may include questions that are potentially sensitive for some respondents. These may include questions about injury, chronic pain, or mental health topics such as anxiety or depression.


In the informed consent procedure, respondents are advised of the voluntary nature of their participation in the survey or any of its components. Sample persons are informed that they can choose not to answer any questions they do not wish to answer and that they may stop the interview at any time. In addition, most respondents will be completing surveys via a self-administered mode of administration to help ensure privacy when answering questions.


12. Estimates of Annualized Burden Hours and Costs


CDC is requesting 84,080 responses and 28,079 burden hours per year for three years. The average burden per response for a survey is estimated to be 20 minutes. The average burden per response for a cognitive interview is estimated to be 60 minutes.


Table 1: Estimated Annualized Burden Hours

Type of Respondents | Form name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours
Adults 18+ | RSS Surveys and Cognitive Interviews | 84,080 | 1 | 20/60 – 1 | 28,079



Per guidance from OMB, CDC is providing a narrative discussion and a supplementary table (below) that describe projected utilization of the RSS and justify the capacity requested for this three-year clearance.


The Rapid Surveys are designed to have four rounds of data collection each year with two contractors. As part of the base (minimum sample size), Ipsos will be completing 2,000 surveys per quarter with KnowledgePanel respondents and NORC will be completing 2,000 surveys with AmeriSpeak respondents. Additionally, for the first year, NORC has also been contracted to do an additional 2,000 surveys each quarter. This leads to as many as 6,000 completed surveys per quarter or 24,000 surveys per year. The average time per response is expected to be 20 minutes.


The Rapid Surveys System can be expanded by increasing the number of completed surveys per round and/or the number of rounds per year as needed up to a maximum of 28,000 surveys per year per contractor or 56,000 total surveys per year. The burden table below reflects the additional 20,000 possible surveys per contractor (of which NORC will be conducting 8,000 surveys for the first year).


As noted earlier, Ipsos and NORC have submitted plans for improving the representativeness of the panels. This may include adding oversamples using sampling strata based on demographics such as age, race/Hispanic ethnicity, education, household income, and gender, or recruiting additional respondents from opt-in nonprobability samples or from mailings to address-based samples. It may also include adding additional panelists to the sample and experimenting with increased incentives or enhanced recruitment efforts. Each contractor may complete up to 2,000 additional surveys per quarter (8,000 for the year) for that task. This increases the maximum burden by up to 16,000 surveys per year. These surveys are also reflected on a separate line in the table.


As noted earlier, the Rapid Surveys System may also target individual surveys to collect data only from specific subgroups within existing survey panels and may supplement data collection for such groups with additional respondents from other probability or nonprobability samples. An additional 12,000 surveys per year is included as a separate line in the table to reflect the possibility of these developmental activities.


Each respondent is expected to provide a single response. There may be situations where a panel member is sampled into the Rapid Surveys more than once; however, respondents will not deliberately be sampled into the Rapid Surveys multiple times.


Cognitive interviews will also be conducted with 20 subjects per quarter for a total of 80 interviews per year. The average time per response is 60 minutes for a burden of 80 hours per year.


Table 2 provides a summary of these components per OMB’s request.


Table 2: Breakdown of Data Collection Hours Comprising the Annualized Burden Estimate

Type of Respondents | Form name | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours
Adults 18+ | Base Surveys | 16,000 | 1 | 20/60 | 5,333
Adults 18+ | Potential Sample Expansion | No more than 40,000 | 1 | 20/60 | 13,333
Adults 18+ | Additional Surveys to Increase Representativeness | No more than 16,000 | 1 | 20/60 | 5,333
Adults 18+ | Developmental: Additional Surveys for Specific Subgroups | No more than 12,000 | 1 | 20/60 | 4,000
Adults 18+ | Cognitive interviews | 80 | 1 | 1 | 80
Total | | 84,080 | | | 28,079



As part of the initial approval of the Rapid Surveys System, CDC is requesting OMB approval of the Round 1 (2023) data collection (see Attachments D, E, and F). CDC will use the Change Request mechanism to update these attachments for future Rapid Surveys and/or cognitive interviews.


Estimated Annualized Burden Costs to Respondents.


At an average wage rate of $28.01 per hour, the estimated annualized cost for the 28,079 burden hours is $786,493. This estimated cost does not represent an out-of-pocket expense but represents a monetary value attributed to the time spent doing the survey. The hourly wage estimate is based on the Bureau of Labor Statistics May 2021 National Occupational Employment and Wage Estimates (http://www.bls.gov/oes/current/oes_nat.htm). There is no cost to respondents other than their time to participate.


Table 3: Estimated Annualized Burden Cost

| Type of Survey | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs |
|---|---|---|---|
| Web- or telephone-mode survey | 27,999 | $28.01 | $784,252 |
| Cognitive interview | 80 | $28.01 | $2,241 |
| **Total** | 28,079 | | $786,493 |
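The figures in Tables 2 and 3 follow directly from multiplying respondents by responses and burden per response, then applying the hourly wage. The following is an informal sketch (not part of the official submission) that recomputes those totals from the table values, rounding each row to the nearest whole hour or dollar:

```python
# Recompute the burden-hour totals from Table 2 and the cost figures
# from Table 3. Row values are taken directly from the tables above.
rows = [
    # (form name, respondents, responses per respondent, hours per response)
    ("Base Surveys", 16_000, 1, 20 / 60),
    ("Potential Sample Expansion", 40_000, 1, 20 / 60),
    ("Additional Surveys to Increase Representativeness", 16_000, 1, 20 / 60),
    ("Developmental: Additional Surveys for Specific Subgroups", 12_000, 1, 20 / 60),
    ("Cognitive interviews", 80, 1, 1.0),
]

# Total burden hours: respondents x responses x hours per response,
# rounded per row (20/60 hours = 20 minutes); matches 28,079 in Table 2.
total_hours = sum(round(n * resp * hrs) for _, n, resp, hrs in rows)

# Annualized cost at the BLS May 2021 mean hourly wage of $28.01,
# split into survey hours and cognitive-interview hours as in Table 3.
wage = 28.01
survey_hours = total_hours - 80            # 27,999
survey_cost = round(survey_hours * wage)   # $784,252
interview_cost = round(80 * wage)          # $2,241
total_cost = survey_cost + interview_cost  # $786,493

print(total_hours, survey_cost, interview_cost, total_cost)
```

Note that rounding the two cost components separately before summing reproduces the $786,493 total shown in Table 3.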


13. Estimates of Other Total Annual Cost Burden to Respondents and Record keepers


None.


14. Annualized Costs to the Federal Government


The estimated annualized contractual cost to the federal government for the activities outlined in this information collection request is $4,203,990.37.


15. Explanation for Program Changes or Adjustments


This is a new clearance request to support this new data collection system.


16. Plans for Tabulation and Publication and Project Time Schedule


The following are key activities and projected completion dates for round 1 of the Rapid Surveys Project:


| Activity | Projected Completion Date |
|---|---|
| Year 1 Quarter 1 (Y1Q1) Rapid Surveys Data Collection | 4-6 weeks after OMB approval |
| Release of estimates from first rapid survey data collection | 2-3 months after OMB approval |
| Rapid Surveys Y1Q1 restricted-use data file available via NCHS RDC | 2-3 months after OMB approval |


17. Reason(s) Display of OMB Expiration Date is Inappropriate


The expiration date will be displayed.


18. Exceptions to Certification for Paperwork Reduction Act Submissions


Certifications are included in this submission.



1 NCHS surveys such as the National Survey of Family Growth (NSFG), the National Health and Nutrition Examination Survey (NHANES), and the National Health Interview Survey (NHIS) emphasize accuracy, reliability, and the ability to describe health outcomes by demographic and socioeconomic characteristics across time. For example, the NHIS consists of a large annual sample of approximately 30,000 sample adults and 9,000 sample children that facilitates the production of reliable sub-national estimates for many subgroups. Data are collected via a personal visit with telephone follow-up to achieve high response rates. Extensive post-processing of the survey data includes data cleaning, data editing, imputation, weighting, calibration, and disclosure review procedures. As a result, most estimates from the NHIS are made available to the public and sponsors six to nine months after the end of data collection at the end of each calendar year. There are two exceptions to this timeline. First, the NHIS Early Release Program makes a limited set of key preliminary estimates publicly available approximately 4 months after data collection on a quarterly cadence. Second, NHANES estimates are produced on a longer cycle, as data for two years are combined prior to the release of estimates.

2 In April of 2022, CDC launched its Moving Forward initiative, an effort to modernize CDC to improve accountability, collaboration, communication, and timeliness.

3 To “monitor” means “to acquire, identify, or scan, or to possess, information that is stored on, processed by, or transiting an information system”; “information system” means “a discrete set of information resources organized for the collection, processing, maintenance, use, sharing, dissemination or disposition of information;” “cyber threat indicator” means information that is necessary to describe or identify security vulnerabilities of an information system, enable the exploitation of a security vulnerability, or unauthorized remote access or use of an information system.

4 Procedures for submitting the proposal and other important information can be found at http://www.cdc.gov/rdc/.


