Supporting statement- VBA LGY OMB-Part B


Veterans Benefits Administration (VBA) Loan Guaranty Service (LGY) Customer Satisfaction Surveys

OMB: 2900-0711



B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

1. Provide a numerical estimate of the potential respondent universe and describe any sampling or other respondent selection method to be used. Data on the number of entities (e.g., households or persons) in the universe and the corresponding sample are to be provided in tabular format for the universe as a whole and for each stratum. Indicate expected response rates. If this has been conducted previously include actual response rates achieved.


As noted in Part A, this information collection comprises a suite of customer satisfaction surveys for the VBA Loan Guaranty Service (LGY). The respective target populations for these surveys are as follows:


  • Survey #1 – Survey of Lender Satisfaction with the VA Home Loan Guaranty Process (i.e., Lender Survey): gathers satisfaction data from lending institutions that participated in the VBA LGY program during the past fiscal year.


  • Survey #2 – Survey of Veteran Satisfaction with the VA Home Loan Guaranty Process (i.e., Veteran Survey): gathers satisfaction data from veterans who used the VA Home Loan Guaranty program to obtain a home loan in the past fiscal year.


  • Survey #3 – VA Specially Adapted Housing Survey (i.e., SAH Survey): gathers satisfaction data from veterans who received an SAH grant in the past fiscal year.


  • Survey #4 – VA Specially Adapted Housing Eligible Non-Grantee Survey (i.e., Non-SAH Survey): gathers data from veterans who are eligible for SAH but have not yet obtained the grant.


This section describes the universe of respondents for each survey, and the sampling plan that will be employed to achieve a representative sample for each survey.


Lender Survey


For the Lender survey, useful customer satisfaction data can only be obtained from lenders who are thoroughly familiar with the VA Home Loan program. Surveying the population of all participating LGY lenders would likely include a large number of inexperienced lenders and would therefore not serve the purpose of the survey. To ensure useful data, the survey population will be limited to lenders who have processed 12 or more VA loans in the prior 12 months. All such lenders can be assumed to have the familiarity with the VA loan program required to provide useful data. Using this threshold also facilitates comparison of findings with previous iterations of the Lender survey, since the same selection criterion was used previously.


The first stage is therefore to identify the lenders who meet the 12-loan criterion from the population of all participating lenders. Based on VA administrative data, the size of this universe is approximately 2,000. Project resources allow VA to survey the full census of lenders meeting the 12-loan threshold. Surveying this census, rather than drawing a sample, eliminates the potential for sampling error with respect to this particular survey, making it an attractive methodological choice.


Table 5 displays the universe of qualifying lenders, the expected response rate, and the expected yield of completed Lender surveys. We anticipate a response rate of 50% in 2008, similar to the response rate achieved in the FY02 Lender Survey.


TABLE 5:
LENDER SURVEY UNIVERSE, EXPECTED RESPONSE RATE AND SURVEY YIELD

No. of Lenders making 12+ VA Loans in FY07 | Expected Response Rate | Completed Surveys Expected
2,000                                      | 50%                    | 1,000


Veteran Survey


For the Veteran survey, the size of the eligible population is determined by LGY national workload data for the fiscal year. LGY's estimated national workload for FY06 was 230,000 veterans with loans guaranteed. This total includes veterans who obtained an original loan through the LGY program.


For comparative purposes, reliable statistical estimates of veterans' satisfaction with the LGY program are required for customers of each of the nine Regional Loan Centers (RLCs). A statistical power analysis indicates that a stratified random sample yielding approximately 385 completed surveys per RLC will support customer satisfaction estimates at a 95% confidence level, provide sufficient statistical power to test differences in satisfaction outcomes between veterans in different RLCs, and allow trend analyses with data collected from previous iterations of the Veteran survey.
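The 385-completion target is consistent with the standard sample-size formula for estimating a proportion. The sketch below assumes a 95% confidence level (z = 1.96), a ±5% margin of error, and the maximally conservative p = 0.5; the margin-of-error value is an assumption, since it is not stated explicitly above:

```python
import math

def sample_size(z=1.96, margin=0.05, p=0.5):
    """Minimum completed surveys needed to estimate a proportion.

    z=1.96 corresponds to a 95% confidence level; p=0.5 is the
    most conservative (largest-variance) assumption.
    """
    return math.ceil(z**2 * p * (1 - p) / margin**2)

print(sample_size())  # 385
```

With these inputs the formula gives 384.16, which rounds up to the 385 completed surveys cited above.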


Although we assume a 50% response rate, power calculations were performed using a conservative approach to allow for a lower response rate if necessary. We propose to draw a random sample of 13,500 veteran records from the eligible population. The sample will be stratified by RLC: 1,400 randomly selected records (i.e., names and contact information) of eligible veterans will be drawn for each of the nine RLCs, and 900 will be drawn for Hawaii (the population of loans in Hawaii is much smaller, so a smaller number will be drawn). Table 6 displays the sampling frame by RLC, the expected response rates, and the expected yield.


Table 6: Veteran Survey
Respondent Sample Sizes Needed for 95% Confidence Interval by Regional Loan Center (RLC)

Regional Loan Center (RLC) | No. of Veteran records to be randomly sampled from VBA database | Expected response rate | Completed surveys expected
Atlanta, GA        | 1,400  | 50% | 700
Cleveland, OH      | 1,400  | 50% | 700
Denver, CO         | 1,400  | 50% | 700
Houston, TX        | 1,400  | 50% | 700
Manchester, NH     | 1,400  | 50% | 700
Hawaii, HI         | 900    | 50% | 450
Phoenix, AZ        | 1,400  | 50% | 700
Roanoke, VA        | 1,400  | 50% | 700
St. Paul, MN       | 1,400  | 50% | 700
St. Petersburg, FL | 1,400  | 50% | 700
Total              | 13,500 | 50% | 6,750
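The allocation and yield arithmetic in Table 6 can be reproduced in a few lines; the figures are taken directly from the table, and nothing new is assumed:

```python
# Stratum allocations for the Veteran survey, as listed in Table 6.
allocations = {
    "Atlanta, GA": 1400, "Cleveland, OH": 1400, "Denver, CO": 1400,
    "Houston, TX": 1400, "Manchester, NH": 1400, "Hawaii, HI": 900,
    "Phoenix, AZ": 1400, "Roanoke, VA": 1400, "St. Paul, MN": 1400,
    "St. Petersburg, FL": 1400,
}
response_rate = 0.50

total_sampled = sum(allocations.values())
expected_completes = {rlc: int(n * response_rate) for rlc, n in allocations.items()}

print(total_sampled)                      # 13500 records sampled
print(sum(expected_completes.values()))   # 6750 completed surveys expected
```

Note that each stratum's expected yield (700, or 450 for Hawaii) comfortably exceeds the 385-completion target per RLC.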


Specially Adapted Housing Program: Grantee Survey


For the SAH grantee survey, the primary population to be surveyed includes all SAH grant recipients in the current fiscal year. VBA estimates the size of this universe at 600 individuals. Because the universe is small, a census survey of all recipients is warranted to ensure accurate, representative customer satisfaction data at the national level. Past iterations of the SAH Survey have generated high response rates, and the same is expected for the 2008 iteration (we estimate a 67% response rate based on past surveys). Table 7 displays the universe of current SAH grant recipients, the expected response rate, and the expected yield of completed surveys from this census.


TABLE 7:
SAH GRANTEE SURVEY UNIVERSE, EXPECTED RESPONSE RATE AND SURVEY YIELD

No. of current SAH grant recipients (census) | Expected Response Rate | Completed Surveys Expected
600                                          | 67%                    | 400


Specially Adapted Housing Program: Eligible Non-Grantee Survey


The target population for the eligible non-grantee survey comprises individuals who are eligible to receive an SAH grant but have not yet taken the necessary actions to become grant recipients. Using VA administrative data, VBA can identify the records of individuals who are eligible for but have not yet used the SAH grant. To represent the population nationally, power calculations indicate that completed surveys from 385 veterans are needed. Although we expect a high response rate (70%), power calculations were performed using a conservative approach to allow for a lower response rate if necessary. We propose to send a total of 1,000 surveys to eligible non-grantees (i.e., selecting 200 veterans per fiscal year and removing any duplicate cases). Table 8 displays the proposed sample, the expected response rate, and the expected yield of completed surveys.
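The figures above can be cross-checked with a few lines of arithmetic. The sketch below uses the 385-completion target and the 70% expected response rate from the text, and also shows the break-even response rate implied by the conservative approach:

```python
# Yield check for the SAH eligible non-grantee survey (figures from the text).
sample_sent = 1000
needed_completes = 385      # completed surveys required for national estimates
expected_rate = 0.70

expected_completes = int(sample_sent * expected_rate)
breakeven_rate = needed_completes / sample_sent  # lowest rate that still meets the target

print(expected_completes)   # 700 completed surveys expected
print(breakeven_rate)       # 0.385
```

In other words, the 385-completion target is met even if the actual response rate falls to roughly 39%, well below the 70% expected.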


TABLE 8:
SAH-ELIGIBLE SURVEY UNIVERSE, EXPECTED RESPONSE RATE AND SURVEY YIELD

Sample of Eligible Non-Grantee Recipients | Expected Response Rate | Completed Surveys Expected
1,000                                     | 70%                    | 700



2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; the estimation procedure; the degree of accuracy needed for the purpose in the proposed justification; any unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.


The previous section presents the stratification and allocation of the sample for the Veteran survey. To recap, the Veteran survey will entail stratified random sampling of 13,500 veteran records from a universe of approximately 230,000 VA home loan recipients. The sample will be stratified by region of loan origination, such that approximately 1,400 records will be randomly selected from each of the nine RLCs (an additional 900 will be selected for Hawaii). As stated in the previous section, we are using a 95% confidence interval for categorical variables for all surveys. No unusual procedures will be required to draw a representative sample meeting these criteria.


The Lender and SAH surveys will be sent to the universe of eligible respondents because these populations are small and 95% confidence intervals are desired for the statistical estimates of customer satisfaction produced. Finally, the Non-SAH survey will be sent to 1,000 individuals (see the section above).


3. Describe methods used to maximize the response rate and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


LGY will obtain the services of a contractor to develop, administer, and analyze this set of surveys. The development of all surveys will reflect the comments, suggestions, and results from previous iterations of each survey.


Strategies to Maximize Response Rates


LGY will employ methods to minimize respondent burden and maximize survey response rates. This section identifies the strategies to be employed to reach these objectives; each strategy is outlined below.


  • Strategy # 1 to Maximize Response Rates: Using Web Technologies for Ease of Response


The Lender survey will be web-based and administered online. The web address at which the survey is posted will also be included in the mailings described below; lenders will connect to the appropriate web page and complete the survey. Because participating lending institutions can be expected to have computers and internet connections, administering the Lender survey online will maximize the timeliness, efficiency, and response rate of data collection.


The Veteran survey will be administered both online and via self-administered mail surveys, giving respondents a choice of mode, which will boost response and minimize perceived burden. As with the Lender survey, the web address at which the survey is posted will be included in the mailings described below. Offering both modes will maximize the timeliness, efficiency, and response rate of data collection. The web-based Lender and Veteran surveys will be developed with the end user in mind, with the goal of providing a user-friendly website on which to complete the survey.


The online survey technology will incorporate several features to maximize response rates and respondent usability. These include a password system, which prevents any one person from completing more than one survey and allows respondents to begin the survey and return later to finish it (particularly useful for long surveys). Other features include user-friendly drop-down boxes, internal links to the directions throughout the survey, and internal links to key terms and definitions.


  • Strategy # 2 to Maximize Response Rates: Allowing Disabled Veterans to Complete a Mailed Questionnaire


The SAH survey will be administered via self-administered mail questionnaires (see below for details regarding this process). The mail survey method was chosen with the severely disabled population that this survey assesses in mind: previous iterations of the SAH survey demonstrate that a mailed questionnaire elicits a greater response from this population.


  • Strategy # 3 to Maximize Response Rates: Using Advance and Follow-Up Mailings to Publicize the Surveys and Encourage Response


LGY will use a 5-step survey and follow-up process to administer the surveys (see Table 9). An increase in the overall response rate is the major advantage of this process. The use of a reminder postcard as a follow-up tends to increase the response rate by 5 to 8 percentage points, and the use of both reminder postcards and a second survey mailing can nearly double the response rate.



Table 9: Design and Distribution of Mailing Materials by Survey and Each of Five Mailings

Mailing | Veteran Survey | Lender Survey | SAH Surveys
#1 | Pre-notification/cover letter | Pre-notification/cover letter | Pre-notification/cover letter
#2 | Notification/cover letter w/ URL & password; paper survey; business reply envelope | Notification/cover letter w/ URL & password | Notification/cover letter; paper survey; business reply envelope
#3 | 1st reminder card w/ URL & password | 1st reminder card w/ URL & password | 1st reminder card
#4 | Second notification/cover letter w/ URL & password; paper survey; business reply envelope | Second notification/cover letter w/ URL & password | Second notification/cover letter; paper survey; business reply envelope
#5 | 2nd reminder card w/ URL & password | 2nd reminder card w/ URL & password | 2nd reminder card

  • Strategy # 4 to Maximize Response Rates: Conduct Cognitive Labs/Pre-testing of Surveys


The contractor will conduct cognitive labs with three or more test users for each survey (Lender, Veteran, and SAH) and for each medium (two methods: paper and web) to determine whether respondents understand the survey questions and answer choices as intended. In conjunction with LGY, the contractor will draw a small pool of names of potential participants in each of the surveys for inclusion in the cognitive labs. Cognitive lab participants will be drawn from the same population that will be used for the main study. The contractor will submit the list of potential participants to VBA for review and approval. Once participants are identified, the contractor will contact them by telephone and ask them to participate. Cognitive lab sessions will take place in the metropolitan Washington, DC area.


Once the participants have been selected, we will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.


The contractor will prepare a summary report of the cognitive testing session for paper and Web versions of the customer satisfaction surveys. The results of the cognitive labs will be taken into account when revising and finalizing the survey questionnaires.


  • Strategy # 5 to Maximize Response Rates: Maintaining a Toll-Free Survey Hotline


During the period that the surveys are in the field, the contractor will provide and maintain a toll-free telephone line to answer any questions respondents and regional office points of contact may have about the survey (e.g., how to interpret questions and response items, the purpose of the survey, how to get another survey if their copy has been lost/damaged). Project staff will be available to answer telephone calls during regular business hours (8:30 a.m.-6 p.m. ET). A voice messaging system will be available to receive messages after regular business hours so after-hours calls can be responded to within 24 hours.


  • Strategy # 6 to Maximize Response Rates: Excluding Questions of a “Sensitive” Nature


None of the questions included in the surveys are sensitive or private in nature, which will encourage response.

  • Strategy # 7 to Maximize Response Rates: Assuring and Maintaining Confidentiality

Survey respondents for all surveys will be assured that their personal anonymity will be maintained. All hard-copy questionnaires (i.e., Veteran and SAH surveys) will be scannable and will consist of approximately eight pages printed back to back, with a numeric litho-code on the front and back covers. Individual veterans will be provided unique passwords, which will allow the contractor to identify when a respondent has completed the survey and exclude them from further reminder letters or postcards. For the Lender survey, each response will be identified by its corresponding ‘Lender ID number’; thus, each response will be attributed to a specific lending agency, not an individual. Respondents will be informed of this in the initial pre-notification letter and subsequent survey notification letters.


  • Strategy # 8 to Maximize Response Rates: Secure Networks and Systems


The contractor will maintain a secure network infrastructure that protects the integrity of the databases, the survey application, and all associated server resources. The servers must be protected by a strong firewall system, and the operations center must be in a secure, temperature-controlled environment with video surveillance, where network services are continually monitored by automated real-time programs to ensure the integrity and availability of all critical components. All key servers will be supported by a backup power supply that can continue to run the systems in the event of a power outage. Additionally, the contractor must be immediately alerted if critical monitoring thresholds are exceeded, so that staff can respond proactively before outages occur.


Approach to Examine Non-Response Bias


Non-response bias refers to the error expected in estimating a population characteristic based on a sample of survey data that under-represents certain types of respondents. Stated more technically, non-response bias is the difference between a survey estimate and the actual population value. The non-response bias associated with an estimate has two components: the amount of non-response and the difference in the estimate between respondents and non-respondents. While high response rates are always desirable in surveys, they do not guarantee low non-response bias in cases where respondents and non-respondents are very different. Two types of non-response can affect the interpretation and generalizability of survey data: item non-response and unit non-response. Item non-response occurs when one or more survey items are left blank in an otherwise completed, returned questionnaire. Unit non-response is non-participation by an individual who was intended to be included in the survey sample. Unit non-response, the failure to return a questionnaire, is what is generally recognized as the source of survey non-response bias.


There are two approaches to tackling the effects of non-response. The first is to minimize non-response at the data collection stage, by introducing measures that aim to maximize the response rate. The second is to make statistical adjustments at the follow-up stage, once all the data are collected. Both approaches are described in the paragraphs that follow.


Because it is not always possible to measure the actual bias due to unit non-response, we will employ strategies for reducing non-response bias by maximizing response rates across all types of respondents. In the face of a long-standing trend of declining response rates in survey research (Steeh, 1981; Bradburn, 1992; Smith, 1995; De Leeuw & Heer, 2002; Curtin & Presser, 2005), these strategies include:


  • Use of notification letters, duplicate survey mailings, reminder letters and postcards.

  • Use of novelty in correspondence such as reminder postcards designed in eye-catching colors.

  • Use of an extended survey field-period to afford opportunities to respond for subgroups having a propensity to respond late (e.g., males, young, full-time employed).

  • Use of well-designed questionnaires and the promise of confidentiality.

  • Providing a contact name and telephone number for inquiries.

Applying these strategies to the administration of the VA LGY Customer Satisfaction Surveys will be crucial for achieving high response rates across all respondent types (see the section on maximizing response rates above).


Non-response follow-up analyses can help identify potential sources of bias and can help reassure data users, as well as the agency collecting and releasing the data, of the quality of the data collected. The approach to examining the presence of non-response bias will be conducted in two steps:


  • Step 1 – For the Veteran Survey, Compare the Demographics of Respondents to Those of the 2001 National Survey of Veterans (NSV) Respondents. One initial way to examine whether non-response bias is present in the VA LGY Veteran Survey is to compare the demographics of Veteran Survey respondents to those of respondents to the 2001 National Survey of Veterans (NSV). Although the profile of veterans securing a home loan may differ somewhat from the larger veteran population (e.g., they may be younger), this comparison is an initial step in assessing non-response bias. For this analysis, we will draw comparisons on demographics including, but not limited to, age, gender, marital status, and education. This first step will provide an indication of where potential non-response bias may exist (if at all).

  • Step 1 – For the Two SAH Surveys, Compare the Demographics of Respondents to Those of Respondents from Other Representative Surveys of the Disabled Veteran Population. A parallel initial check for the two VA LGY SAH Surveys is to compare the demographics of SAH Survey respondents to those of respondents from other representative samples of this population (e.g., the Vocational Rehabilitation and Employment surveys). Although the profile of veterans securing an SAH grant may differ somewhat from the larger disabled veteran population, this comparison is an initial step in assessing non-response bias. For this analysis, we will draw comparisons on demographics including, but not limited to, age, disability rating, gender, marital status, and education. This first step will provide an indication of where potential non-response bias may exist (if at all).

  • Step 2 – For Each of the Surveys, Compare the Demographics of Respondents to Those of Non-Respondents. To further examine the presence of non-response bias, we will compare the demographics of responders (i.e., those who responded to the VA LGY Surveys) to those of non-responders (i.e., those who did not). The comparison between these two groups will be made on the following variables for the surveys of veterans:

    • Age – it is possible that respondents may be older or younger than non-respondents.

    • Region – it is possible that participants from a certain part of the country (i.e., region) may respond to the survey at a higher rate than those who are from another part of the country.

    • Gender – it is possible that participants of one gender (e.g., male) may respond at a higher rate than their counterparts.

    • Loan Amount – it is possible that respondents and non-respondents may differ with respect to the loan amount.

    • Gross Income – it is possible that respondents and non-respondents may differ with respect to gross income.

The comparison between these two groups will be made on the following variables for the survey of lenders:

    • Size of Lender – it is possible that respondents and non-respondents may differ with respect to the size of the lender.

    • Length of Time in Industry – it is possible that respondents and non-respondents may differ with respect to the length of time in the industry.

Based on the two steps discussed above, we will identify issues with respect to non-response bias.
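As an illustration of the Step 2 comparison, a Pearson chi-squared test can flag demographic categories on which responders and non-responders differ. The age-band counts below are purely hypothetical, and the statistic is computed from scratch so no statistical package is assumed:

```python
def chi_square_stat(responders, nonresponders):
    """Pearson chi-squared statistic for a 2 x k table of category counts."""
    assert len(responders) == len(nonresponders)
    col_totals = [r + n for r, n in zip(responders, nonresponders)]
    grand = sum(col_totals)
    stat = 0.0
    for row in (responders, nonresponders):
        row_sum = sum(row)
        for obs, col in zip(row, col_totals):
            expected = row_sum * col / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Hypothetical age-band counts (under 40 / 40-59 / 60+), for illustration only.
responders    = [120, 300, 280]
nonresponders = [200, 310, 190]
stat = chi_square_stat(responders, nonresponders)
# Compare to the chi-squared critical value with k-1 = 2 degrees of
# freedom (5.99 at alpha = 0.05); a larger statistic suggests the two
# groups differ on this variable.
print(round(stat, 1))
```

In practice the test would be run separately for each variable listed above (age, region, gender, loan amount, gross income for veterans; lender size and length of time in the industry for lenders), with administrative records supplying the non-respondent counts.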


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions of 10 or more individuals.


The contractor will conduct cognitive labs with three or more test users for each survey (Lender, Veteran, and SAH) and for each medium (two methods: paper and web) to determine whether respondents understand the survey questions and answer choices as intended. Working closely with VBA, the contractor will draw a small pool of names of potential participants in each of the surveys for inclusion in the cognitive labs. Cognitive lab participants will be drawn from the same population that will be used for the main study. The contractor will submit the list of potential participants to VBA for review and approval. Once participants are identified, the contractor will contact them by telephone and ask them to participate. Cognitive lab sessions will take place in the metropolitan Washington, DC area.


Once the participants have been selected, we will conduct cognitive lab sessions aimed at identifying needed additions or refinements to the questionnaire. Cognitive labs are one-on-one sessions with potential survey participants, in which respondents are asked to complete the questionnaire while thinking aloud. The primary purpose of these sessions is to gather feedback on survey questions and answer choices to ensure they are easily understood and correctly interpreted. Outcomes of cognitive labs include, but are not limited to: addition or omission of specific questions, changes to wording of questions, clarification of question response options, addition of response options, and changes to ordering of questions.


  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Our LGY contact person is Katherine Faliski, 202-461-9527.


For our last set of surveys, LGY contracted with ICF International to administer the surveys. The following persons were involved:

  • Dr. Christopher Spera, ICF International 703-934-3446

  • Mr. John Kunz, ICF International, 703-934-3627

LGY plans to contract with a vendor for this year's surveys.



Appendix A: Copy of All Data Collection Instruments

File Type: application/msword
File Title: Supporting Statement for VBA Generic Customer Survey Clearance
Author: VBA
File Modified: 2007-11-29
File Created: 2007-11-29
