Section B Supporting Statement for Request for Clearance
State and Local Area Integrated Telephone Survey
OMB # 0920-0406
Three-year generic clearance 04/30/2011
Contact Information:
Kathleen S. O’Connor, MPH
Survey Statistician
CDC/NCHS
3311 Toledo Road, Room 2119
Hyattsville, MD 20782-2003
301.458.4181 (voice)
301.458.4035 (fax)
Section B.
The target population for the State and Local Area Integrated Telephone Survey (SLAITS) reflects the particular module being conducted but is typically households with children under age 18 years. The universe from which a sample is drawn is all households. SLAITS uses the National Immunization Survey (NIS) sampling frame, whose large sample size provides an economical opportunity for SLAITS projects to survey other populations in addition to the population that eventually screens into the NIS itself.
For more information on the NIS or NIS sampling methods, please refer to the website of the NIS sponsor, CDC’s National Center for Immunization and Respiratory Diseases (NCIRD) www.cdc.gov/vaccines, or NCIRD’s webpage entitled ‘Statistics and Surveillance: Immunization Coverage in the US’ at http://www.cdc.gov/vaccines/stats-surv/imz-coverage.htm#nis.
Sample design. A sample of telephone numbers (presumed to belong to households) will be randomly generated and drawn for the NIS. The sample of households selected for SLAITS will be a subsample of the large NIS sample. At the beginning of each quarter, landline and cellular random-digit-dial (RDD) replicates (batches of numbers that have been identified, to the extent possible, as working residential numbers) are released for the NIS. Replicates are released parsimoniously to ensure that no more are sent to the telephone center than necessary.
Unless a module is developed that requires a very rare population subgroup, or the NIS is used for multiple immunization surveys simultaneously, there is no need to develop an independent sampling process. Although SLAITS makes efficient use of the vast NIS sample, it is occasionally insufficient to achieve SLAITS targets in some geographic areas. In those cases, a small independent sample of telephone numbers, called an "augmentation sample," may be developed for SLAITS to meet target sample sizes. The augmentation sample is developed in the same fashion as the regular sample; the only differences are that it is not screened for the NIS prior to administration of the relevant SLAITS module and does not receive the NIS advance letter (Attachment 3).
The number of telephone numbers sampled for the NIS varies by state and depends largely on the working residential number rate, the percentage of households with age-eligible children, and the cooperation rate. NIS sample release is a dynamic process that is evaluated and adjusted at least weekly. In some areas, such as those with a high working residential number rate, a high proportion of households with young children, and high cooperation rates (where relatively few telephone numbers need to be released to meet NIS targets), and in states where the NIS has sub-state targets, it is possible that there will not be enough NIS sample to meet the SLAITS sample targets. In these areas, augmentation sample is created using procedures identical to those used for the NIS. The augmentation sample households receive a SLAITS-only advance letter that does not mention the NIS (Attachment 4), and no NIS interview is conducted. Every attempt is made to minimize the amount of augmentation sample used because these cases cost more than those drawn from the NIS frame.
Oversampling subpopulations. Depending on a module’s design, oversampling may be required to identify adequate numbers of households with particular characteristics such as low income, children of a specific age, race or ethnicity, or a chronic condition. The technique employed by SLAITS has been to screen for the characteristic of interest at the beginning of the interview. No modules under consideration at this time require the use of this technique. We expect design effects to be comparable to those of previous SLAITS studies.
Estimation procedures. All data are weighted to produce population-based national and state estimates of totals, means, and proportions. A sampling weight is calculated for each respondent or individual, and the final weight assigned to each sampled person adjusts for nonresponse and unequal selection probabilities. The final weight is also poststratified to a set of known population totals; these calculations are made at the person level through marginal adjustments that compensate for any imbalance in the age, sex, and race/ethnicity groupings in the sample. State-level population estimates by age, sex, and race/ethnicity published by the U.S. Bureau of the Census are almost always used as the population control totals for this adjustment. Data from NCHS's National Health Interview Survey (NHIS, OMB No. 0920-0214) are also regularly assessed to determine whether additional adjustments are necessary to account for households without fixed telephone lines. Standard errors for key estimates are calculated using a Taylor series linearization approach as implemented in SUDAAN, although other software packages that accommodate complex sample designs, such as Stata or WesVar, can also be used to accurately calculate standard errors.
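To illustrate the poststratification step described above, the following is a minimal sketch, in Python, of a raking (iterative proportional fitting) routine that adjusts base sampling weights so that weighted totals match known population margins such as age and sex. The variable names and control totals are hypothetical placeholders and do not represent the production weighting specifications for any SLAITS module.

# Minimal raking sketch (illustrative only; hypothetical categories and totals).
def rake(records, base_weights, margins, max_iter=50, tol=1e-6):
    """Adjust base weights so weighted totals match each margin's control totals.
    records      : list of dicts, e.g., {"age": "0-5", "sex": "F"}
    base_weights : list of floats (e.g., inverse selection probabilities)
    margins      : dict mapping a variable name to {category: population total}
    """
    weights = list(base_weights)
    for _ in range(max_iter):
        max_change = 0.0
        for var, controls in margins.items():
            # Current weighted total within each category of this variable.
            totals = {cat: 0.0 for cat in controls}
            for rec, w in zip(records, weights):
                totals[rec[var]] += w
            # Scale weights so category totals hit the control totals.
            for i, rec in enumerate(records):
                factor = controls[rec[var]] / totals[rec[var]]
                weights[i] *= factor
                max_change = max(max_change, abs(factor - 1.0))
        if max_change < tol:
            break
    return weights

# Hypothetical example: four sampled children, two poststratification margins.
records = [{"age": "0-5", "sex": "F"}, {"age": "0-5", "sex": "M"},
           {"age": "6-17", "sex": "F"}, {"age": "6-17", "sex": "M"}]
base_weights = [100.0, 120.0, 90.0, 110.0]
margins = {"age": {"0-5": 250.0, "6-17": 250.0},
           "sex": {"F": 240.0, "M": 260.0}}
final_weights = rake(records, base_weights, margins)

In practice, this adjustment is performed in standard survey software rather than custom code; the sketch is intended only to make the marginal-adjustment logic concrete.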
Additional technical details concerning sample design and survey execution can be found in the design and operation reports for past modules. This documentation, listed by module, is available at www.cdc.gov/nchs/slaits.htm under ‘Publications and selected presentations using SLAITS data’.
Degree of Accuracy. For each module, the primary analytic variable determines the sample size. To determine the sample size necessary for reasonable levels of precision, the baseline prevalence of a key statistic is estimated. This will be discussed in each Information Collection Request (ICR).
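As a hedged illustration of how an estimated baseline prevalence translates into a required number of completed interviews, the following Python sketch applies the standard sample-size formula for a proportion, inflated by an assumed design effect. The prevalence, margin of error, and design effect shown are hypothetical placeholders, not values from any ICR.

import math

def required_sample_size(p, margin, deff=1.0, z=1.96):
    """Completed interviews needed to estimate prevalence p within +/- margin
    at the 95% confidence level, inflated by the design effect deff."""
    n_srs = (z ** 2) * p * (1 - p) / margin ** 2  # simple random sampling size
    return math.ceil(n_srs * deff)

# Hypothetical example: a 10% baseline prevalence estimated within
# +/- 2 percentage points, assuming a design effect of 1.5.
print(required_sample_size(p=0.10, margin=0.02, deff=1.5))  # 1297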
Unusual Problems Requiring Specialized Sampling Procedure. No special sampling procedures other than the procedure to oversample subpopulations (when indicated) are required.
In consultation with NCHS, the data collection contractor draws the sample, designs and conducts interviewer and supervisor trainings, plans the interview operations, and implements and monitors the survey interview procedures. The contractor also develops the data files, draft documentation, and preliminary and final sampling weights.
SLAITS staff members provide specifications for sample design, specific questionnaire content, detailed interview instructions, and procedures to measure quality control; monitor interviews through direct observation; and maintain continuous communication with the data collection contractor.
The SLAITS questionnaire is designed to immediately follow the NIS interview or screener for eligible children. In all modules, the respondent will be a parent or guardian who lives in the household and is knowledgeable about the health and health care of the child/children, unless noted in the module-specific ICR.
Data collection, entry, and file preparation.
Developmental work is typically (although not always) conducted to test the module instrument and procedures. The NIS advance letter (Attachment 3), which briefly mentions SLAITS modules (although not by name), will be sent prior to the telephone call to sampled households whose randomly generated telephone numbers can be matched to addresses.
The questionnaire will be programmed to integrate the NIS and SLAITS module instruments seamlessly. It will make full use of the computer-assisted telephone interviewing (CATI) system’s ability to check whether a response was within a legitimate range, follow skip patterns, fill state-specific information in questions as applicable (for example, names of state Medicaid programs), and employ pick lists for response categories. The CATI system will undergo rigorous testing to ensure it functions properly.
Certain household and demographic questions are identical in the NIS and SLAITS portions of the interview. To reduce respondent burden, the system is programmed so these questions are not repeated in both surveys.
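The following Python sketch illustrates, in highly simplified form, the kind of range-check and skip-pattern logic described above. The item names, valid range, and routing rule are hypothetical and are not taken from the NIS or any SLAITS instrument.

def validate_child_age(response):
    """Range check: reject out-of-range entries before the interview continues."""
    age = int(response)
    if not 0 <= age <= 17:
        raise ValueError("Age must be between 0 and 17; please re-enter.")
    return age

def next_item(current_item, answers):
    """Skip pattern: route the interview based on prior answers (hypothetical rule)."""
    if current_item == "Q1_CHILD_AGE":
        # Skip a school-related item for children under 5 years of age.
        return "Q3_HEALTH_STATUS" if answers["Q1_CHILD_AGE"] < 5 else "Q2_SCHOOL"
    return "END"

answers = {"Q1_CHILD_AGE": validate_child_age("3")}
print(next_item("Q1_CHILD_AGE", answers))  # Q3_HEALTH_STATUS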
Data files using NCHS conventions will be prepared after data cleaning is completed. Data release may contain a series of linked household-level, child screening-level, and child interview-level data files with all information necessary for analysis. The data files contain demographic information on the focal child, respondent, and household, as well as substantive health and health-related data. A final sampling weight will be assigned to each child-level (or other analytic unit) observation to permit users to generate national and state estimates for variables of interest. The methodology will be described thoroughly in a report that will either accompany or follow the public use data file (PUF) release.
Response rates have long been considered a measure of data quality. The gradual decline in response rates, especially for telephone surveys, is cause for concern. Nonetheless, the telephone remains one of the most useful and economical modes for obtaining population-based data. Successful conduct of a SLAITS module depends on a combination of techniques to maximize response rates and to understand the impact of nonresponse on data quality. Standard, proven survey procedures have been refined through deliberate testing and experience over time. Among the techniques routinely implemented are:
use of a carefully constructed advance letter for those households where a name and address are available (approximately 56% of the sample1),
clear and unambiguous introduction,
instrument designed to maximize semantic and cognitive clarity, and minimize respondent burden,
effective interviewer recruitment and training,
thorough review of confidentiality, privacy, and security requirements,
maintenance of a toll-free number and website to facilitate participation,
flexible scheduling of interviewing to maximize convenience to the respondent,
carefully scripted answering machine and voice mail messages,
high-quality instrument translation for other language interviews,
quality control and interviewer monitoring,
judicious use of incentives, and
refusal aversion/conversion training with experienced interviewers.
Even these measures do not assure high response rates. NCHS evaluates the extent to which nonsampling error affects data quality for each SLAITS module during and after data collection. These evaluations might compare estimates with other surveys and related data, examine expected versus actual distributions for demographic variables, assess the rate and location of interview break-offs, and apply other qualitative and quantitative measures. These measures are reviewed and assessed continually during data collection.
Generally, nonresponse bias can be thought of as the degree to which nonrespondents differ from respondents in key survey variables. This quantity is generally unknown, and nonresponse analyses attempt to measure this difference using one or more approaches: response rate comparison across subgroups, use of rich sampling frame data or supplemental matched data, comparison to similar estimates from other sources, and studying variations within the existing survey. To the extent possible, some type of nonresponse analysis will be conducted for key estimates for each module.
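As a simple illustration of the quantity described above, the following sketch applies the textbook decomposition of nonresponse bias in an unadjusted respondent mean: the bias is approximately the nonresponse rate multiplied by the difference between respondent and nonrespondent means. The figures are hypothetical.

def nonresponse_bias(mean_respondents, mean_nonrespondents, response_rate):
    """Approximate bias of the unadjusted respondent mean for a key variable."""
    return (1 - response_rate) * (mean_respondents - mean_nonrespondents)

# Hypothetical example: 60% respond; 25% of respondents but 35% of
# nonrespondents have the characteristic of interest.
print(round(nonresponse_bias(0.25, 0.35, 0.60), 3))  # -0.04, i.e., 4 percentage points too low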
Response rates will be calculated using the Council of American Survey Research Organizations (CASRO) formulae in accordance with the American Association for Public Opinion Research's Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys (Response Rate #4).2 Although we aim to achieve a weighted CASRO response rate of at least 80% (defined as the product of the resolution rate, which determines household status, the age-screener completion rate, and the interview completion rate), this may not occur. This does not reflect lax SLAITS procedures but rather a societal trend that adversely affects many surveys regardless of mode. New telephony equipment and technologies are continually being introduced that make it easier for persons living in a household to identify and reject 'uninvited' or 'unexpected' telephone calls. As an example, Table 4 below presents the weighted CASRO response rates achieved at the conclusion of the 2007 National Survey of Children's Health (NSCH) for the national sample3:
Table 4. 2007 NSCH weighted response rates, national sample.
|                           | Resolution rate (%) | Age-screener completion rate (%) | Interview completion rate (%) | Overall response rate (%) |
| Typical response rate     | 81.9 | 86.4 | 66.0 | 46.7 |
| Alternative response rate | 89.9 | 86.4 | 66.0 | 51.2 |
In the "typical response rate" row of Table 4, the response rate calculation recognizes that some cases of unknown eligibility (e.g., telephone lines that rang with no answer, or households in which the person answering the phone refused to say whether the household included children) were in fact eligible.3 In accordance with CASRO guidelines, the proportion of eligible cases among those with unknown eligibility was assumed to be the same as the proportion of eligible cases among those with known eligibility. The response rate was calculated as the product of the component completion rates. The "alternative response rate" row was calculated in the same way, except that cases that were never contacted were assumed not to be households.3 It is important to remember that a survey response rate provides one crude measure of the potential for nonresponse bias and should not be used as the sole indicator of survey and data quality.
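The component structure of the rates in Table 4 can be verified directly, since the overall rate is the product of the three component rates. The short Python calculation below reproduces the published figures; the small discrepancy in the alternative rate reflects rounding of the published component rates.

def overall_response_rate(resolution, screener, interview):
    """Overall weighted CASRO rate as the product of the component rates."""
    return resolution * screener * interview

print(round(overall_response_rate(0.819, 0.864, 0.660) * 100, 1))  # 46.7 (typical)
print(round(overall_response_rate(0.899, 0.864, 0.660) * 100, 1))  # 51.3 (published: 51.2, from unrounded components)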
Other Languages. In almost all modules, the questionnaire will be translated into Spanish as described below, and procedures used in previous modules (respondent help screens, question-specific probes) will be employed to accommodate Spanish-speaking households. An interviewer will be able to switch immediately from English to Spanish (or vice versa) by using one of the keyboard's function keys. Other Spanish-language job aids will be developed, including question-and-answer materials and revised function key guides. All persons used to interview Spanish-speaking households are bilingual in English and Spanish. Decisions about interviewing in other languages are made on a module-by-module basis depending on the survey topic, coverage, and sponsor.
Translation of English-language questionnaires into Spanish is coordinated by a translation specialist. This specialist has demonstrated skills to translate survey instruments suitable for the full range of ethnic origins found among Spanish-speakers living in the US. A collaborative translation approach is used to ensure concepts and words are translated accurately. A small group of translators each evaluate and translate the questionnaire. Once they have completed their translations, a series of meetings are held in which the translators and coordinator review the translations item-by-item to refine the instrument. Based on these discussions, the coordinator produces a Spanish-language version of the questionnaire for additional testing. The questionnaire is then reviewed by a team of experienced Spanish-language telephone interviewers and supervisors, who evaluate it for accuracy and cultural appropriateness. Issues raised during the final review are resolved in consultation with the translation coordinator, and the instrument is finalized. After the survey has been in the field for a few weeks, a debriefing is held to identify problematic questions. A bilingual SLAITS staff member oversees and participates in this entire process.
Pending adequate funding, the survey sponsor may choose to translate the instrument into other languages as well, such as the most prevalent Asian languages. SLAITS module instruments have been translated into as many as ten different languages in the past. If a sponsor chooses to interview in languages other than English and Spanish, the contractor will work with a subcontractor to translate the instrument and administer the interviews. Only professional bilingual native-speaking translators will administer the interviews. The NIS will be used to identify the non-English/non-Spanish language spoken in the household.
In the case of the 2007 NSCH, the questionnaire was translated into Spanish and four Asian languages: Cantonese, Mandarin, Vietnamese, and Korean. While the Spanish questionnaire was administered via the CATI system, a different procedure was used to screen and interview households speaking the other languages, given their expected low incidence. The NIS uses specially trained interviewers to determine the language spoken in the household, with assistance from a firm called Language Line Services, which supports over 170 languages. If the household was determined to include age-eligible children but no one spoke English, Spanish, or one of the four Asian languages, the case was finalized. If Spanish or one of the four Asian languages was spoken in a household that contained children, age screening occurred. Eligible cases were then assigned to the appropriate language queue to be called later by a specially trained NORC interviewer who spoke that language.
Most SLAITS modules have pretests and/or dress rehearsals conducted to test procedures and methods. New questions may be tested in the NCHS Questionnaire Design Research Laboratory (OMB No. 0920-0222), by observing the first 100 interviews, or as a separate test following OMB clearance.
In addition to these pretests, the SLAITS team wishes to field limited inquiries targeted to our listserv subscribers and data users, a procedure that OMB has approved in the past. We are often asked who uses our data, how the data are used, and how they have affected policy, health, and programs. Such information is helpful for our internal continuous improvement program and for others (such as the NCHS Board of Scientific Counselors) evaluating the utility of NCHS services. At present, we have no way to collect this information systematically other than polling listserv subscribers and website visitors; however, we cannot perform these tasks without OMB approval because we would ask identical questions of a large group of people. We have been able to glean some information by performing SLAITS-specific literature searches and through personal contacts. However, we recognize that this method may not capture or describe the true utility and impact of SLAITS data. Therefore, we have included an appropriate level of burden in this submission under "pilot testing" to account for questions sent to data users via the listserv.
The following person was consulted on the statistical aspects of the design and data collection for SLAITS:
Stephen Blumberg, Ph.D.
Senior Research Scientist
National Center for Health Statistics
301-458-4107
swb5@cdc.gov
The following person is responsible for the collection and analysis of SLAITS data:
Marcie Cynamon, M.A.
Director, State and Local Area Integrated Telephone Survey
Chief, Survey Planning and Special Surveys Branch
National Center for Health Statistics
301-458-4174
mcynamon@cdc.gov
ATTACHMENTS TO THE SUPPORTING STATEMENT
1. National Center for Health Statistics Authorization (42 USC 242k); excerpts of applicable laws and regulations
2. Federal Register Notice announcing the 60-day public comment period
3. Advance letter to respondents (National Immunization Survey advance letter sent to NIS/SLAITS sample)
4. Advance letter to respondents (advance letter sent to SLAITS-only augmentation sample)
1 Blumberg SJ, Welch EM, Chowdhury SR, Upchurch HL, Parker EK, Skalland BJ. Design and operation of the National Survey of Children with Special Health Care Needs, 2005–2006. National Center for Health Statistics. Vital Health Stat 1(45). 2008.
2 The American Association for Public Opinion Research. 2006. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys. 4th edition. Lenexa, Kansas: AAPOR.
3 Blumberg SJ, Foster EB, Frasier AM, et al. Design and Operation of the National Survey of Children's Health, 2007. National Center for Health Statistics. Vital Health Stat 1. Print version forthcoming. Available from: ftp://ftp.cdc.gov/pub/Health_Statistics/NCHS/slaits/nsch07/2_Methodology_Report/NSCH_Design_and_Operations_052109.pdf