B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
The target population for this survey is as follows:
Survey of Satisfaction with the VBA Call Center Experience: gathers satisfaction data from veterans and active-duty personnel (customers) who recently contacted a VBA national call center.
This section describes the universe of respondents for the survey and the sampling plan that will be employed to achieve a representative sample.
For each call center, respondents will be randomly selected from all veterans who contacted that call center within the previous 2 to 7 days. Three hundred surveys will be completed for each of the nine call centers. A maximum of 5 attempts will be made on every number until the requisite 300 surveys per call center have been completed.
Table 5 displays the number of customers to be surveyed, the expected response rate, and the expected yield of completed Call Center Satisfaction surveys. VA anticipates a response rate of 50%.
TABLE 5: CALL CENTER SATISFACTION SURVEY, EXPECTED RESPONSE RATE AND SURVEY YIELD

Number of Customers | Expected Response Rate | Completed Surveys Expected
5,400               | 50%                    | 2,700
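The yield arithmetic behind Table 5 can be sketched as follows (a minimal illustration using the figures stated in the text; the function name is ours, not from the source):

```python
def required_outreach(completes_per_center, centers, response_rate):
    """Return (total customers to contact, total completed surveys expected)
    given a per-center completion target and an expected response rate."""
    total_completes = completes_per_center * centers
    total_contacted = int(total_completes / response_rate)
    return total_contacted, total_completes

# 300 completes per center, 9 centers, 50% expected response rate
contacted, completes = required_outreach(300, 9, 0.50)
print(contacted, completes)  # 5400 2700
```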
2. Describe the procedures for the collection of information, including: statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose described in the justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The Call Center Satisfaction Survey will entail simple random sampling of 600 customer interactions from each national call center. VBA is using a 95% confidence interval for categorical variables for the survey. There are no unusual procedures that will be required to draw a representative sample meeting these criteria.
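For context, the precision these sample sizes imply under simple random sampling can be sketched as below. The document states only the 95% confidence level, not a target margin of error, so the worst-case proportion p = 0.5 and the function name are our assumptions:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """95% margin of error for a proportion estimated from a simple
    random sample of size n (z = 1.96 for 95% confidence)."""
    return z * math.sqrt(p * (1 - p) / n)

# Per call center (300 completes) and overall (2,700 completes), worst case p = 0.5
print(round(margin_of_error(300) * 100, 1))   # 5.7 (percentage points)
print(round(margin_of_error(2700) * 100, 1))  # 1.9 (percentage points)
```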
VBA will obtain the services of a contractor to develop, administer, and analyze this survey.
Strategies to Maximize Response Rates
VBA will employ a variety of methods to minimize respondent burden and to maximize survey response rates. This section identifies the strategies to be employed to reach these objectives. Each strategy is outlined below.
Strategy # 1 to Maximize Response Rates: Using a combination of Phone Interviews and Automated Phone Technologies for Ease of Response
The survey methodology involves two data collection techniques via the phone: a live phone interviewer and an IVR-hosted (automated) technology. Specifically, a live-phone interviewer will introduce the purpose of the survey and answer any questions veterans have before beginning the survey. Once a veteran agrees to take the survey, they will automatically be connected to the IVR-hosted survey. If the respondent is unable to complete the survey via IVR, the live interviewer will stay on the line and conduct the survey.
We are using these combined techniques in order to maximize response rates, reduce interviewer bias (i.e., scores for interviewer-assisted surveys are notably higher than scores for non-assisted methods), and lower data collection costs.
Additionally, the IVR-hosted survey has been developed with the end user in mind and has been extensively tested and used by numerous clients of J.D. Power and Associates.
Strategy # 2 to Maximize Response Rates: Conduct Cognitive Labs/Pre-testing of Surveys
J.D. Power and Associates will conduct cognitive tests of the survey instrument prior to fielding. Detail on the tests can be found in response to question 4.
Strategy # 3 to Maximize Response Rates: Maintaining a Toll-Free Survey Hotline
During the period that the surveys are in the field, the contractor will provide and maintain a toll-free telephone line to answer any questions respondents and regional office points of contact may have about the survey (e.g., how to interpret questions and response items, the purpose of the survey, how to get another survey if their copy has been lost/damaged). Project staff will be available to answer telephone calls during regular business hours (8:30 a.m.-6 p.m. ET). A voice messaging system will be available to receive messages after regular business hours so after-hours calls can be responded to within 24 hours.
Strategy # 4 to Maximize Response Rates: Excluding Questions of a “Sensitive” Nature
None of the questions included in the survey are sensitive or private in nature, which will encourage compliance.
Strategy # 5 to Maximize Response Rates: Assuring and Maintaining Confidentiality
Survey respondents will be assured that their personal anonymity will be maintained. Upon completion of the field period, the contractor will destroy any customer information in its possession, in order to ensure that all customer information is held strictly confidential.
Strategy # 6 to Maximize Response Rates: Secure Networks and Systems
The contractor will have a secure network infrastructure that will protect the integrity of the databases, the survey application, and all associated server resources. The servers must be protected by a strong firewall system and the operations center must be in a secure temperature-controlled environment with video surveillance, where network services are continually monitored by automated real-time programs to ensure the integrity and availability of all critical components. All key servers will be supported by a backup power supply that can continue to run the systems in the event of a power outage. Additionally, the contractors must be immediately alerted if critical monitor thresholds are exceeded, so that they can proactively respond before outages occur.
Approach to Examine Non-Response Bias
Two types of non-response can affect the interpretation and generalization of survey data: item non-response and unit non-response. Item non-response occurs when one or more survey items are left blank in an otherwise completed, returned questionnaire. In most satisfaction surveys, however, missing-data rates on satisfaction items need to be at or above 50% before item non-response negatively impacts the results. This is because satisfaction items (e.g., ratings of knowledge, courtesy, and responsiveness) tend to be highly correlated (r ≥ .70) and are therefore collinear. Unit non-response is non-participation by an individual who was included in the survey sample but failed to respond to the survey. Unit non-response – the failure to return a questionnaire – is what is generally recognized as survey non-response bias.
Non-response bias refers to the error expected in estimating a population characteristic when certain groups are not included, or are under-represented, in the resulting sample of survey data. Stated more technically, non-response bias is the difference between a survey estimate and the actual population value. Two factors determine the degree to which non-response bias will affect the accuracy and reliability of the resulting survey estimate. The first factor is the overall amount of non-response (i.e., the overall response rate). Typically, response rates of 25% or better provide survey estimates that are within an acceptable margin of error for generalizing to the overall population. However, this is not always the case if the second factor is also present: where there are meaningful differences between those who respond and those who do not on the key outcome measures, or on other measures that predict or explain those outcomes, then even high response rates may not adequately minimize non-response bias.
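The two factors described above combine in the standard deterministic expression for the bias of a respondent mean; a minimal sketch (the response rate matches the survey's expectation, but the satisfaction values are hypothetical illustrations, not survey results):

```python
def nonresponse_bias(response_rate, mean_respondents, mean_nonrespondents):
    """Bias of the respondent mean relative to the full-population mean:
    bias = (1 - response rate) * (respondent mean - non-respondent mean).
    The bias vanishes when everyone responds, or when respondents and
    non-respondents do not differ on the measure."""
    return (1 - response_rate) * (mean_respondents - mean_nonrespondents)

# Hypothetical: 50% response rate; respondents average 8.0 on a satisfaction
# item vs. 7.0 for non-respondents -> the survey overstates the mean by 0.5.
print(nonresponse_bias(0.50, 8.0, 7.0))  # 0.5
```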
There are two approaches to tackling the effects of non-response. One is to minimize the chances of non-response at the data collection stage, for example by introducing measures that aim to maximize the response rate. The other is to make statistical adjustments at the follow-up stage, after all the data are collected. Both approaches are described in the paragraphs below.
Since it is not always possible to measure the actual bias due to unit non-response, there are strategies for reducing non-response bias by maximizing response rates across all types of respondents. In the face of a long-standing trend of declining response rates in survey research (Steeh, 1981; Smith, 1995; Bradburn, 1992; De Leeuw & Heer, 2002; Curtin & Presser, 2005), these strategies include:
The contractor will attempt to contact respondents a maximum of 5 times over a 3-day period – varying the call time in the respondent's time zone to increase the likelihood of completing the survey.
Use of an extended survey field-period to afford opportunities to respond for subgroups having a propensity to respond late (e.g., males, young, full-time employed).
Use of well-designed questionnaires and the promise of confidentiality.
Providing a contact name and telephone number for inquiries.
Employing these strategies in the administration of this survey will be crucial for achieving high response rates across all respondent types (see the section on maximizing response rates above).
Non-response follow-up analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. The approach to examining the presence of non-response bias will be conducted as follows:
Compare the Demographics of Respondents to the VBA Call Center Satisfaction Survey to the Demographics of Non-Respondents. To examine the presence of non-response bias, VA will compare the demographics of responders (i.e., those who responded to the survey) to those of non-responders (i.e., those who did not).
The comparison between responders and non-responders will be made on the following variables for this survey:
Region – it is possible that participants from a certain part of the country (i.e., region) may respond to the survey at a higher rate than those who are from another part of the country.
Reason for or outcome of the call center contact – the reason for contacting VBA, or the resulting outcome of the contact (e.g., claim denied), may make some veterans more or less likely to complete the survey.
Gender – it is possible that participants of one gender (e.g., males) may respond at a higher rate than the other.
Degree and depth of interaction with call centers – it is possible that respondents and non-respondents may differ with respect to the degree and depth of their interaction with VBA call centers.
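One way the responder/non-responder comparisons above could be carried out is a chi-square test of independence on each variable. The sketch below uses a hand-rolled Pearson statistic on a 2 x 2 table; the source does not specify the test, and all counts are hypothetical:

```python
def chi_square_statistic(table):
    """Pearson chi-square statistic for a contingency table given as a list
    of rows; compares observed counts to the counts expected under
    independence of rows (response status) and columns (category)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand_total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand_total
            stat += (observed - expected) ** 2 / expected
    return stat

# Hypothetical gender counts: rows = responders, non-responders; cols = male, female
stat = chi_square_statistic([[2300, 400], [2100, 600]])
# df = (2-1)*(2-1) = 1; the critical value at alpha = .05 is 3.841, so a
# statistic above 3.841 would flag a significant responder/non-responder gap.
print(stat > 3.841)  # True
```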
Based on the steps discussed above, VBA will identify issues with respect to non-response bias for the survey.
4. Describe any tests of procedures or methods to be undertaken.
The contractor will conduct cognitive labs with three or more test users for the survey, to determine whether respondents understand the survey questions and answer choices as intended. Working closely with VBA, the contractor will draw a small pool of names from potential participants in the survey for inclusion in the cognitive labs. Cognitive lab participants will be drawn from the same population that will be used for the main study. The contractor will submit the list of potential participants to VBA for review and approval. Once identified, the contractor will contact potential participants by telephone and ask them to participate. Cognitive lab sessions will take place in the metropolitan Washington, DC area.
Once the participants have been selected, VA will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.
5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
For this survey, VBA contracted the services of J.D. Power and Associates to administer the survey. The following is a list of the persons involved in the survey.
Mrs. Gina Pingitore, PhD, J.D. Power and Associates, 805-418-8043
Mrs. Melissa Sauter, MBA, J.D. Power and Associates, 248-312-4174
Mr. Greg Truex, MPP, J.D. Power and Associates, 202-383-3511
File Type: application/msword
Author: Hee-Seung L. Seu
File Modified: 2009-10-20
File Created: 2009-10-20