Supporting Statement B


HIFA Evaluation Enrollee Survey (CMS-10262)

OMB: 0938-1055



HIFA Enrollee Survey Supporting Statement



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. The universe of interest is HIFA enrollees. The budget allows for a survey of enrollees in two states with HIFA initiatives: New Mexico and Oregon. The study team has consulted with CMS in the selection of the two study states, based on criteria including the size of the HIFA enrollment in each state, the presence of a premium assistance component as a substantial part of the initiative, and other factors. These two states have agreed to cooperate, assuming OMB approval. They have been asked to provide a list-based sample of HIFA enrollees. A random sample of 2,000 enrollees (de-duplicated within households) will be drawn from administrative records of HIFA enrollees in the two states. In addition to enrollee contact information (name, address, telephone number), we will request any information available regarding length of enrollment in the state’s HIFA program, use of services, and any available demographic information. We will not request Social Security Numbers.


The study universe is restricted to adult HIFA program enrollees. Sampled adults will be interviewed directly by telephone.


In each study state, we will complete telephone interviews with 400 respondents, for a total of 800 completed surveys for the project. A 5:1 ratio of sample elements to completed surveys should be adequate, given expectations about bad contact information and an estimated response rate of 40 to 50 percent.


Number of HIFA study states (strata): 2

Enrollee sample frame per state: 2,000

Completed enrollee surveys per state: 400

Total completed enrollee surveys: 800
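The 5:1 ratio of sample frame to completed surveys can be checked with a simple calculation. This is a sketch only: the response-rate range (40 to 50 percent) comes from the text, but the share of records with workable contact information is an illustrative assumption, not a figure stated in this document.

```python
# Sanity check of the 5:1 sample-to-completes ratio per state.
sample_frame = 2000      # enrollees drawn per state from administrative records
usable_contact = 0.50    # ASSUMED share of records with workable contact info
response_rate = 0.40     # lower bound of the text's 40-50 percent estimate

expected_completes = sample_frame * usable_contact * response_rate
print(expected_completes)  # 400.0, matching the 400-per-state target
```

Under these assumptions the 2,000-record frame yields exactly the 400 completed surveys targeted per state; a higher response rate or better contact information simply provides headroom.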

Trained interviewers will collect information from HIFA enrollees using computer-assisted telephone interviewing (CATI) techniques, which reduce respondent and interviewer burden and thereby improve response rates.



2. This is a descriptive, one-time survey, exploring: 1) self-reported health insurance coverage among known HIFA enrollees, 2) enrollee demographics, and 3) expectations of insurance status in the absence of a state HIFA program. A total of 400 completed enrollee surveys per sample stratum (state) should allow adequate statistical power for these descriptive analyses.
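As a rough indication of the precision that 400 completes per stratum affords, the 95 percent margin of error for an estimated proportion can be computed under the worst-case assumption p = 0.5. The document itself does not state a target margin of error; this is an illustrative sketch.

```python
import math

n = 400   # completed surveys per state (stratum)
p = 0.5   # worst-case proportion, which maximizes the variance
z = 1.96  # critical value for a 95 percent confidence interval

moe = z * math.sqrt(p * (1 - p) / n)
print(round(moe, 3))  # 0.049, i.e. roughly +/- 5 percentage points
```

A margin of error of about plus or minus 5 percentage points per state is generally adequate for the descriptive estimates of coverage and demographics this survey targets.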


Samples will be drawn from state administrative records of HIFA enrollees. The sampled enrollees will be contacted by telephone to participate in the survey. A potential difficulty with this sample procedure is the unknown quality of available enrollee contact information.1 Forward and reverse phone directories will be used to update or append contact information.


The University of Minnesota has had mixed experiences with the quality of administrative records over the years in drawing samples of Medicaid and other public program enrollees in California, Florida, Minnesota and Pennsylvania. Typically, UMN has found that the address information is better (i.e., more up to date) than the telephone fields. For this HIFA enrollee survey, New Mexico and Oregon will be asked to provide relevant identifying and contact information (all address and telephone fields). These data are typically available in an electronic database format (e.g., Excel), and we expect that this will be the case with the New Mexico and Oregon data (we are in discussions with state officials to confirm the form of the records we will be provided).


As appropriate, potential survey respondents will be tracked using white pages, directory assistance, reverse directories, and internet searches. Efforts will be made to track each respondent before that person’s case is deemed to be a “non-response, unable to locate”. In keeping with the standard response rate definitions of the American Association for Public Opinion Research,2 these cases will be counted in the denominator for the final response rate calculations. Information such as full name, date of birth, and other identifying information provided in the sample will be used to verify that respondents who are identified through tracking are indeed the correct individuals. Further, all respondents will be tracked if the contact information provided in the original sample list is incorrect.
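The response-rate convention described above, in which "non-response, unable to locate" cases remain in the denominator, can be sketched as a simple calculation. The disposition counts below are illustrative placeholders, not actual study data, and this is a simplified version of the AAPOR formulas (no eligibility-rate adjustment for cases of unknown eligibility).

```python
# Simplified AAPOR-style response rate in which unlocatable cases
# stay in the denominator. All counts are ILLUSTRATIVE, not study data.
completes = 400
refusals = 250
non_contacts = 200
unable_to_locate = 150  # still counted as eligible, per the text

response_rate = completes / (completes + refusals + non_contacts + unable_to_locate)
print(round(response_rate, 2))  # 0.4
```

Keeping unlocatable cases in the denominator makes the reported rate conservative: dropping them would overstate the survey's coverage of the sampled frame.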


Cell phones present an interesting and growing problem for telephone surveys. Most surveys and survey centers screen cell phone prefixes out of the sample given ethical considerations (e.g., respondents pay for the cost of the interview, concerns for safety if the respondents answer the call while driving, etc.). We plan to follow this standard practice for the HIFA enrollee survey. If a respondent is reached via cell phone, a landline telephone number will be requested and the interview will be rescheduled.


Another potential problem will be obtaining enrollee cooperation, as the interviewer will have to reveal how respondents were selected. We are guardedly optimistic, however, based on our experience. For example, a past mixed mode (mail and telephone) survey of Minnesota Medicaid and MinnesotaCare enrollees drawn from administrative files yielded a 54 percent response rate.3



3. The University of Minnesota survey team has extensive experience interviewing participants in state programs and obtaining high response rates. The organization will adhere closely to industry standards in fielding the survey. Specifically, a random sample of current HIFA enrollees will be drawn from state administrative records. Using available contact information (or once a telephone number is located), experienced survey interviewers will contact all respondents by phone. Each respondent will be called eight or more times (eight times if all calls are “no answers” and up to five additional times once contact has been made with the respondent) to attempt to complete the survey before the case is considered a final non-response. This final non-response will be counted in the denominator for the response rate. If an answering machine is reached, messages will be left including the interviewer’s name and affiliation, the purpose of the call, and the day and time when he or she would call back. Leaving such messages has been shown to increase the rate of reaching a household by as much as 15 percent. Overall response rates have been shown to increase similarly.


Eligible enrollees who initially refuse to complete the survey will be re-contacted by a data collection supervisor or experienced survey interviewer specially trained in refusal conversion. In these calls, enrollees will be given the opportunity to reconsider participation in the study (except in the case of very hard refusals or irate respondents).


We will require that data collection supervisors monitor 5 to 10 percent of all interviews to ensure quality administration of the survey.


Using the procedures described above, the UMN survey team obtained a response rate of 54 percent in a 2003 survey of public program enrollees in the state of Minnesota (see section B2 above). The median response rate for the BRFSS survey in 2006 was 51 percent, ranging from 33 to 66 percent, as noted earlier. Given national trends of falling response rates generally,4 and geographic variations in response rates, we anticipate the response rate for the HIFA enrollee survey in New Mexico and Oregon could be somewhat lower. Fortunately, recent research indicates that lower response rates are not necessarily associated with greater response bias because surveys with high and low response rates demonstrate similar levels of absolute bias.5 Work with other surveys, including those measuring health insurance coverage, has demonstrated little absolute impact of the response rate on the estimates produced from the survey.6 


In addition to working to ensure high response rates and minimize bias, UMN will evaluate non-response bias insofar as is possible. Using data available from the administrative records (e.g., demographic data, length of enrollment in state HIFA program, etc.), UMN will contrast survey respondents with non-respondents for any evidence of response bias.


In addition, UMN will ask state collaborators for assistance to obtain other information available in the enrollment file that will be helpful in examining non-response bias. For example, length of enrollment in the state HIFA program, other demographic information, utilization data, and similar measures will be used that will allow the contractor to compare characteristics of:


(1) those who are invited and complete a survey,

(2) those who are invited and appear to be eligible to complete the survey but do not (callbacks, no answer, etc), and

(3) those who refuse the invitation to complete the survey.


To the extent data are available in the enrollment files, we will contrast these three groups for evidence of response bias (significant differences with t-tests) between those for whom we do and do not have completed survey data.
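The planned respondent/non-respondent comparison on administrative variables can be sketched as a two-sample t-test. The enrollment-length values below are hypothetical illustrative data, and Welch's unequal-variance form is used here as one reasonable choice; the document does not specify which t-test variant will be applied.

```python
import statistics
import math

# Hypothetical months-of-enrollment values from administrative records
# for survey respondents vs. non-respondents (ILLUSTRATIVE data only).
respondents = [12, 18, 24, 9, 30, 15, 21, 27]
nonrespondents = [6, 10, 14, 8, 12, 9, 11, 7]

def welch_t(a, b):
    """Welch's two-sample t statistic (does not assume equal variances)."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_a - mean_b) / math.sqrt(var_a / len(a) + var_b / len(b))

t_stat = welch_t(respondents, nonrespondents)
print(round(t_stat, 2))  # a large |t| would flag possible response bias on this measure
```

In practice the same comparison would be run for each available administrative variable across the three disposition groups defined above, with a significant difference signaling potential non-response bias on that dimension.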



4. The enrollee survey will be pre-tested on up to 10 respondents to ensure that the questions are clear and yield valid responses.


5. Several skilled methodologists have been consulted on the statistical aspects of the design of this enrollee survey:


1. Adam Atherly, PhD, Rollins School of Public Health, Emory University

Phone: 404-727-1175

Email: aatherl@sph.emory.edu


2. Gestur Davidson, PhD, State Health Access Data Assistance Center, University of Minnesota

Phone: 612-625-2339

Email: david064@umn.edu


3. Bryan Dowd, PhD, Division of Health Services Research and Policy, University of Minnesota

Phone: 612-624-5468

Email: dowdx001@umn.edu


The actual collection of the information for the agency will be done by the University of Minnesota or another qualified vendor under UMN’s direction. The analysis of the information will be done by the University of Minnesota (Professors Call, Davidson, and Dowd) and Emory University (Professors Adam Atherly and Kathleen Adams).




1 Gallagher PM, Fowler FJ, Stringfellow VL. Notes from the field: Experiments influencing response rates from Medicaid enrollees. Paper presented at the 55th Annual Conference of the American Association for Public Opinion Research, Portland, OR, 2000; and Hall JW. Sampling Medicaid and uninsured populations. Conference Proceedings: The Health Survey Research Conference, NCHS, PHS-96-1013, 1996.

2 The American Association for Public Opinion Research. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys, 4th Edition. Lenexa, Kansas: AAPOR, 2006.

3 Beebe T, Britt H, Call KT, Cha V, Kreider C, Lundblad J, McAlpine D, McRae J, Moore B, Osman S, Suarez W. Disparities and Barriers to Utilization Among Minnesota Health Care Program Enrollees, final report for the Minnesota Department of Human Services, Minneapolis, MN: SHADAC, Division of Health Services Research and Policy, University of Minnesota, December 2003.

4 Curtin R, Presser S, Singer E. Changes in Telephone Survey Nonresponse over the Past Quarter Century. Public Opinion Quarterly, 69(1): 87-98 (2005).

5 Groves, R. Nonresponse rates and nonresponse bias in household surveys. Public Opinion Quarterly, 70:5 (2006): 646-675.

6 E.g., Blumberg SJ, Bramlett MD. Comparing states on outcomes for children with special health care needs. Maternal and Child Health Journal, 9s (2005): s121-s128; and Keeter S, et al. Gauging the Impact of Growing Nonresponse on Estimates from a National RDD Telephone Survey. Public Opinion Quarterly, 70:5 (2006): 759-779.


MRAD/TOC HHSM-500-2005-00027I, T.O. 2
