B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
The current survey pool stratifications represent mutually exclusive and exhaustive categories for each line of business, based upon the level or degree of experience a respondent has with a particular benefit. This stratification method was chosen because the level or degree of experience with a benefit or line of business can influence a respondent's satisfaction with their experience. The number and types of questions answered will vary slightly based on the experiences of the targeted population (e.g., whether or not the respondent has had contact with VA regarding their claim). The tables for the lines of business describe the universe of respondents for each survey and the sampling plan that will be employed to achieve a representative sample. Each table displays the sampling frame, the expected response rate, and the expected yield.
1) Compensation Enrollment
The targeted population for the Compensation Enrollment questionnaire benchmark study will include individuals who have received a decision on a compensation benefit claim within 30 days prior to the fielding period (includes those who were found eligible on a new or subsequent claim and those who have been denied and are not appealing the decision).
TABLE 1: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Claimants | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
2) Compensation Servicing
The targeted population for the Compensation Servicing questionnaire benchmark study will include individuals who received a decision or were receiving benefit payments between 6-18 months prior to the fielding period.
TABLE 2: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Claimants | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
3) Pension Enrollment
The targeted population for the Pension Enrollment questionnaire benchmark study will include individuals who have received a decision on a pension benefit claim within 30 days prior to the fielding period (includes those who were found eligible on a new or subsequent claim and those who have been denied and are not appealing the decision).
TABLE 3: EXPECTED RESPONSE RATE AND SURVEY YIELD (individuals who have received a decision on a pension benefit claim)
| Number of Veterans | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
4) Pension Servicing
The targeted population for the Pension Servicing questionnaire benchmark study will include individuals who have been receiving benefits for at least 6 months or who received a decision 12 months prior to the fielding period.
TABLE 4: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Claimants | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
5) Education Enrollment
The targeted population for the Education Enrollment questionnaire benchmark study will include individuals who have received a decision on their education benefit application within the 90 days prior to the fielding period (i.e., the original end product was cleared within the past 90 days). The sample will be stratified as follows: (1) accepted and enrolled, and (2) accepted but not enrolled.
TABLE 5: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Claimants | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
6) Education Servicing
The targeted population for the Education Servicing questionnaire benchmark study will include beneficiaries who have been enrolled and receiving education benefit payments for at least 2 consecutive school terms prior to the fielding period.
TABLE 6: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Beneficiaries | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
7) Loan Guaranty Enrollment
The targeted population for the Loan Guaranty Enrollment questionnaire benchmark study will include individuals who closed a VA home loan in the 90 days prior to the fielding period. The sample will be stratified as follows: (1) those who closed on purchase loans, (2) those who received loans for interest rate reductions, and (3) those who obtained cash out or other refinancing.
TABLE 7: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Beneficiaries | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
8) Specially Adapted Housing Servicing
The targeted population for the Specially Adapted Housing Servicing questionnaire benchmark study will include individuals who are eligible for a specially adapted housing grant and in the past 12 months have: (1) received an approval on their grant and are currently in the post-approval process, (2) have had all of their funds disbursed and final accounting is not yet complete, or (3) have had all of their funds disbursed and final accounting is complete.
TABLE 8: EXPECTED RESPONSE RATE AND SURVEY YIELD (individuals who were eligible for a SAH grant in FY 2012)
| Number of Beneficiaries | Expected Response Rate | Completed Surveys Expected |
| 5,000 | 30% | 1,500 |
9) Vocational Rehabilitation and Employment Enrollment
The targeted population for the Vocational Rehabilitation and Employment (VR&E) Enrollment questionnaire benchmark study will include individuals who had an initial meeting with their VR&E counselor and received a decision regarding their entitlement within the 60 days prior to the fielding period. The sample will be stratified as follows: (1) those who applied but did not show up for the initial appointment and never received an entitlement decision, (2) those who applied, showed up for the initial appointment, were entitled to the program, and pursued it, (3) those who applied, showed up for the initial appointment, were entitled to the program, and did not pursue it, and (4) those who applied, showed up for the initial appointment, and were not entitled to the program.
TABLE 9: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Claimants | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
10) Vocational Rehabilitation and Employment Servicing
The targeted population for the VR&E Servicing questionnaire benchmark study will include individuals who have entered and been enrolled in one of the five tracks for at least 60 days prior to the fielding period. The sample will be stratified as follows: (1) Veterans who have been rehabilitated, (2) Veterans who did not fully complete the program (negative closures), and (3) Veterans who have reached maximum rehabilitation gain and could not proceed in the program.
TABLE 10: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Beneficiaries | Expected Response Rate | Completed Surveys Expected |
| 10,000 | 30% | 3,000 |
11) Vocational Rehabilitation and Employment Escaped Beneficiary
The targeted population for the VR&E Escaped Beneficiary questionnaire benchmark study will include individuals who dropped out of the program prior to completing a rehabilitation plan. Included in this survey pool are individuals who applied for the benefit but did not appear for their initial meeting.
The sample will be stratified as follows: (1) applicants who never attended the initial meeting with a counselor, (2) applicants who were determined to be entitled and did not complete a rehabilitation plan, and (3) applicants who started, but did not complete rehabilitation (i.e., negative closures).
TABLE 11: EXPECTED RESPONSE RATE AND SURVEY YIELD
| Number of Escaped Beneficiaries | Expected Response Rate | Completed Surveys Expected |
| 5,000 | 30% | 1,500 |
2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose in the proposed justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The Voice of the Veteran Compensation and Pension surveys will entail stratified random sampling of 10,000 Veterans for each of the four population segments outlined in Section 1. VA is using a 95% confidence interval for categorical variables for all surveys. There are no unusual procedures that will be required to draw a representative sample meeting these criteria.
The Voice of the Veteran Education surveys will entail stratified random sampling of 10,000 Veterans for each of the population segments outlined in Section 1. VA is using a 95% confidence interval for categorical variables for all surveys. There are no unusual procedures that will be required to draw a representative sample meeting these criteria.
The Voice of the Veteran Loan Guaranty Enrollment survey will entail stratified random sampling of 10,000 Veterans and the Specially Adapted Housing Servicing survey will entail stratified random sampling of 5,000 Veterans for the population segment outlined in Section 1. VA is using a 95% confidence interval for categorical variables for all surveys. There are no unusual procedures that will be required to draw a representative sample meeting these criteria.
The Voice of the Veteran Vocational Rehabilitation and Employment Enrollment and Servicing surveys will entail stratified random sampling of 10,000 Veterans for the population segment outlined in Section 1. The Voice of the Veteran Vocational Rehabilitation and Employment Escaped Beneficiary survey will entail stratified random sampling of 5,000 Veterans for the population segment outlined in Section 1. VA is using a 95% confidence interval for categorical variables for all surveys. There are no unusual procedures that will be required to draw a representative sample meeting these criteria.
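As a rough check on the precision these sample sizes imply, the half-width of a 95% confidence interval for a categorical (proportion) estimate can be computed from the expected number of completed surveys. The sketch below is illustrative only: it uses the conservative p = 0.5 and ignores the finite-population correction and any design effects from stratification.

```python
import math

def margin_of_error(n_completes: int, p: float = 0.5, z: float = 1.96) -> float:
    """Half-width of a 95% confidence interval for a proportion,
    using the conservative p = 0.5 (simple random sampling assumed)."""
    return z * math.sqrt(p * (1 - p) / n_completes)

# Expected yields from the tables above: 3,000 completes for most surveys,
# 1,500 for the SAH Servicing and VR&E Escaped Beneficiary surveys.
for n in (3000, 1500):
    print(f"n = {n}: ±{margin_of_error(n):.2%}")
```

Under these assumptions, 3,000 completes give a margin of error of roughly ±1.8 percentage points and 1,500 completes roughly ±2.5 points, both comfortably within typical benchmark-survey precision targets.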
VBA has obtained the services of a contractor to develop, administer, and analyze this set of surveys.
Strategies to Maximize Response Rates
VBA will employ methods to minimize respondent burden and to maximize survey response rates. This section identifies the strategies to be employed to reach these objectives. Each strategy is outlined below.
Strategy # 1 to Maximize Response Rates: Using Web Technologies for Ease of Response
Veterans will have the option to complete the Voice of the Veteran surveys on paper or via a web-based form. Respondents will be provided access to participate in only one survey during the benchmark study program. The sample will be de-duplicated, and Veterans who appear in the sample list multiple times and/or across multiple lines of business will be directed to the survey with the lowest number of possible respondents.
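The de-duplication rule described above (routing a Veteran who appears on multiple sample lists to the survey with the smallest pool of possible respondents) could be sketched as follows. The survey names, identifiers, and data layout are hypothetical illustrations, not VBA's actual schema.

```python
def deduplicate(sample, survey_counts):
    """Route each Veteran who appears on multiple sample lists to the
    survey with the smallest pool of possible respondents.

    sample:        veteran_id -> set of survey names the Veteran is eligible for
    survey_counts: survey name -> size of that survey's sample list
    """
    assignments = {}
    for veteran_id, surveys in sample.items():
        # Direct the Veteran to the survey with the fewest possible respondents.
        assignments[veteran_id] = min(surveys, key=lambda s: survey_counts[s])
    return assignments

# Hypothetical pools drawn from the tables above.
counts = {"Compensation Enrollment": 10_000, "SAH Servicing": 5_000}
sample = {"V1": {"Compensation Enrollment", "SAH Servicing"},
          "V2": {"Compensation Enrollment"}}
print(deduplicate(sample, counts))
# → {'V1': 'SAH Servicing', 'V2': 'Compensation Enrollment'}
```

Routing duplicates to the smallest pool helps protect the yield of the surveys (such as SAH Servicing) whose frames are hardest to fill.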
The web address where the surveys will be posted will be included in all of the mailed notifications, as indicated below. The initial notification will include a cover letter, a URL, and a password.
Both the paper and web-based surveys will be developed with the end user in mind; a user-friendly form will help maximize response rates.
The online survey technology will incorporate several features to maximize response rates and respondent usability. These include a password system, which prevents any one person from completing more than one survey and allows respondents to begin the survey and return later to finish it.
Strategy # 2 to Maximize Response Rates: Using Advance and Follow-Up Mailings to Publicize the Surveys and Encourage Response
VBA will use a 4-step survey and follow-up process to administer the surveys (see Table 13 below). An increase in the overall response rate is the major advantage of using this process. The use of a letter as a follow-up tends to increase the response rate by between 5 and 8 percentage points.
Table 13: Mailing Material
| Mailing | Material |
| #1 | Notification/cover letter w/ URL & password and paper survey |
| #2 | Reminder notification/cover letter w/ URL & password and paper survey |
| #3 | Second notification/cover letter w/ URL & password and paper survey |
| #4 | Second reminder notification/cover letter w/ URL & password and paper survey |
Strategy # 3 to Maximize Response Rates: Conduct Cognitive Labs/Pre-testing of Surveys
The contractor will conduct cognitive labs with three test users for each survey to determine whether respondents understand the survey questions and answer choices as intended. VBA will provide the contractor with lists of potential test users. The contractor shall be responsible for securing the participation of test users from this list. Prior to user testing, the contractor shall provide VBA staff with a list of the selected test users.
VBA will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one or group sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.
The contractor has completed cognitive testing as part of questionnaire development for the pilot effort. A summary report of the cognitive testing sessions was prepared for the Voice of the Veteran surveys. The results of the cognitive labs were taken into account when revising and finalizing the survey questionnaires for the benchmark study efforts.
Strategy # 4 to Maximize Response Rates: Maintaining a Toll-Free Survey Hotline
During the period that the survey is in the field, the contractor will provide and maintain a toll-free telephone line and dedicated e-mail address to answer any questions respondents may have about the survey (e.g., how to interpret questions and response items, the purpose of the survey, how to get another survey if their copy has been lost/damaged). Project staff will be available to answer telephone calls or emails during regular business hours (8:30 a.m.-6 p.m. ET). A voice messaging system will be available to receive phone messages after regular business hours so after-hours calls can be responded to within 24 hours. Respondents who utilize the e-mail option will be sent an initial automatically generated response email informing them that their e-mail was received and that they will receive a response within 24 hours.
Strategy # 5 to Maximize Response Rates: Excluding Questions of a “Sensitive” Nature
None of the questions included in the surveys are sensitive or private in nature, which will encourage compliance.
Strategy # 6 to Maximize Response Rates: Assuring and Maintaining Confidentiality
Survey respondents for all surveys will be assured that their personal anonymity will be maintained. All hard-copy questionnaires will be scannable and will consist of approximately eight pages, printed back to back, with a numeric Litho-Code on the front and back cover. Veterans will be provided unique passwords that will allow the contractor to identify when a respondent has completed the survey and exclude them from further reminder letters or postcards.
Strategy # 7 to Maximize Response Rates: Secure Networks and Systems
The contractor has a secure network infrastructure that will protect the integrity of the databases, the survey application, and all associated server resources. The servers are protected by a strong firewall system, and the operations center is in a secure, temperature-controlled environment with video surveillance, where network services are continually monitored by automated real-time programs to ensure the integrity and availability of all critical components. All key servers are supported by a backup power supply that can continue to run the systems in the event of a power outage. Additionally, the contractor will be alerted immediately if critical monitoring thresholds are exceeded, so staff can respond proactively before outages occur.
Approach to Examine Non-Response Bias
Non-response bias refers to the error expected in estimating a population characteristic based on a sample of survey data that under-represents certain types of respondents. Stated more technically, non-response bias is the difference between a survey estimate and the actual population value. Non-response bias associated with an estimate consists of two components – the amount of non-response and the difference in the estimate between the respondents and non-respondents. While high response rates are always desirable in surveys, they do not guarantee low response bias in cases where the respondents and non-respondents are very different. Two types of non-response can affect the interpretation and generalization of survey data: item non-response and unit non-response. Item non-response occurs when one or more survey items are left blank in an otherwise completed, returned questionnaire. Unit non-response is non-participation by an individual that was intended to be included in the survey sample. Unit non-response – the failure to return a questionnaire – is what is generally recognized as survey non-response bias.
There are two approaches to tackling the effects of non-response. One is to minimize the chances of non-response at the data collection stage, which may involve introducing measures that aim to maximize the response rate. The other is to make statistical adjustments at the survey follow-up stage, after all the data are collected. Both approaches are described in the paragraphs that follow.
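A common form of the post-collection statistical adjustment mentioned above is a weighting-class adjustment, in which each respondent's base weight is inflated by the inverse of the observed response rate in their stratum. The sketch below is illustrative only; the strata and counts are hypothetical and this is not the contractor's actual estimation procedure.

```python
def nonresponse_weights(strata):
    """Weighting-class adjustment: compute, for each stratum, the factor
    by which a respondent's base weight is inflated (1 / response rate).

    strata: stratum name -> (number sampled, number responded)
    """
    weights = {}
    for stratum, (sampled, responded) in strata.items():
        rate = responded / sampled
        weights[stratum] = 1.0 / rate  # adjustment factor per respondent
    return weights

# Hypothetical Loan Guaranty strata: (sampled, responded)
strata = {"purchase": (4000, 1400),
          "rate_reduction": (3000, 800),
          "cash_out": (3000, 800)}
print(nonresponse_weights(strata))
```

Each respondent then stands in for themselves plus the non-respondents in their stratum, so stratum-level totals are preserved even when response rates differ across strata.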
Since it is not always possible to measure the actual bias due to unit non-response, there are strategies for reducing non-response bias by maximizing response rates across all types of respondents. In the face of a long-standing trend of declining response rates in survey research (Steeh, 1981; Smith, 1995; Bradburn, 1992; De Leeuw & Heer, 2002; Curtin & Presser, 2005), these strategies include:
Use of notification letters, duplicate survey mailings, reminder letters and postcards.
Use of novelty in correspondence such as reminder postcards designed in eye-catching colors.
Use of an extended survey field-period to afford opportunities to respond for subgroups having a propensity to respond late (e.g., males, young, full-time employed).
Use of well-designed questionnaires and the promise of confidentiality.
Providing a contact name and telephone number for inquiries.
Applying these strategies to the administration of these surveys will be crucial for achieving high response rates across all respondent types (see the section on maximizing response rates above).
Non-response follow-up analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. The approach to examining the presence of non-response bias will be conducted as follows:
Compare the Demographics of Respondents from the Voice of the Veteran surveys to the Demographics of Non-Respondents from the Voice of the Veteran surveys. To examine the presence of non-response bias, VA will compare the demographics of responders (i.e., those who responded to each Voice of the Veteran Survey) to the non-responders (i.e., those who did not respond to each Voice of the Veteran Survey).
The comparison between responders and non-responders will be made on the following variables for this survey:
Region – it is possible that participants from a certain part of the country (i.e., region) may respond to the survey at a higher rate than those who are from another part of the country.
Age – it is possible that participants from a certain age group (i.e., over 30 years old) may respond at a higher rate than those who are members of another age group.
Gender – it is possible that participants from a certain gender (i.e., male) may respond at a higher rate than their counterpart.
Disability rating – it is possible that participants from a certain range of disability ratings may respond at a higher rate than another set of participants from another disability range.
Separation date – it is possible that participants who separated from the military more recently may respond at a higher rate than participants who separated earlier.
Length of service – it is possible that participants who served in the military for longer periods of time may respond at a higher rate than participants who did not serve for as long a period.
Based on the steps discussed above, VBA will identify issues with respect to non-response bias for the survey.
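One standard way to carry out the respondent/non-respondent comparison described above is a Pearson chi-square test on the counts for each demographic variable. The sketch below uses hypothetical gender counts; it is illustrative, not the contractor's actual analysis plan.

```python
def chi_square_2xk(respondents, nonrespondents):
    """Pearson chi-square statistic for a 2 x k table comparing the
    distribution of a demographic variable (e.g., gender or region)
    between respondents and non-respondents."""
    k = len(respondents)
    row_totals = [sum(respondents), sum(nonrespondents)]
    col_totals = [respondents[i] + nonrespondents[i] for i in range(k)]
    grand = sum(row_totals)
    chi2 = 0.0
    for r, row in enumerate((respondents, nonrespondents)):
        for i in range(k):
            expected = row_totals[r] * col_totals[i] / grand
            chi2 += (row[i] - expected) ** 2 / expected
    return chi2

# Hypothetical counts per category [male, female]:
# 3,000 respondents vs. 7,000 non-respondents.
stat = chi_square_2xk([2100, 900], [5600, 1400])
print(f"chi-square = {stat:.2f}")  # → chi-square = 118.58
# With df = (2-1)*(2-1) = 1, values above 3.84 are significant at alpha = .05,
# flagging a potential source of non-response bias on this variable.
```

A significant statistic on any of the variables listed above (region, age, gender, disability rating, separation date, length of service) would indicate that weighting adjustments or targeted follow-up may be needed.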
4. Describe any tests of procedures or methods to be undertaken.
The contractor will conduct cognitive labs with three or more test users for the survey to determine whether respondents understand the survey questions and answer choices as intended. Working closely with VBA, the contractor will draw a small pool of names from potential participants in each of the surveys for inclusion in the cognitive labs. Cognitive lab participants will be drawn from the same population that will be used for the main study. The contractor will submit the list of potential participants to VBA for review and approval. Once identified, the contractor will contact potential participants by telephone and ask them to participate. Cognitive lab sessions will take place in the metropolitan Washington, DC area or via conference call, depending upon participant availability.
Once the participants have been selected, VBA will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one or group sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.
The cognitive labs have been completed and changes to the questionnaires have been made based on those findings.
5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
The VBA Benefits Assistance Service VOV Program contact persons are:
Bernita Upshaw, Bernita.Upshaw@va.gov
Pamela Liverman, Pamela.Liverman@va.gov
The VBA Compensation Service contact persons are:
Eric Robinson, Eric.Robinson3@va.gov
Edward Walsh, Edward.Walsh@va.gov
Pamela Burd, Pamela.Burd@va.gov
The VBA Pension and Fiduciary Service contact persons are:
Marc Williams, Marc.Williams2@va.gov
Laurine Carson, Laurine.Carson@va.gov
The VBA EDU contact persons are:
Bob Macomber, Bob.Macomber@va.gov
Bill Spruce, Bill.Spruce@va.gov
The VBA LGY contact persons are:
Carleton Sea, Carleton.Sea@va.gov
Elysium Drumm, Elysium.Drum@va.gov
The VBA VR&E contact persons are:
Trisha Bartlett, Trisha.Bartlett@va.gov
Rod Scott Ward, Rod.Ward@va.gov
VBA has contracted the services of JD Power & Associates (JDPA) to administer the survey. JDPA contacts are as follows:
Greg Truex, 202-383-3511
Jennifer Benkarski, 202-383-3707