A. Justification
The purpose of this submission is to request OMB authorization of the ODA Disaster Assistance Customer Service Center, Customer Satisfaction Survey.
1. Circumstances that make the collection of information necessary. The Small Business Administration is authorized to make loans to victims of declared disasters for the purpose of restoring their damaged property to, as near as possible, pre-disaster conditions. This authority is found in 15 U.S.C. 636, as amended, and is implemented by 13 CFR Part 123 (copies attached). SBA's Office of Disaster Assistance provides customer service to individual and business loan applicants by telephone and email through its Disaster Assistance Customer Service Center (DACSC) and in person through its Field Operations Centers (FOCs).
The DACSC is the national contact center for SBA's Office of Disaster Assistance (ODA). Operating from Buffalo, New York, the DACSC provides customer support to disaster victims throughout the United States and U.S. Territories. Handling an average of several hundred thousand calls annually, Customer Service Representatives (CSRs) at the DACSC respond to a variety of inquiries concerning SBA's disaster loan program. ODA also operates two Field Operations Centers: the FOC-East in Atlanta, Georgia, and the FOC-West in Sacramento, California. The FOCs deploy customer service representatives to staff temporary disaster recovery centers and SBA disaster loan outreach centers in disaster-affected locales. During a typical year, the FOCs deploy hundreds of CSRs to the field to aid tens of thousands of disaster victims.
The DACSC and FOCs use certain ‘output’ metrics to assess effectiveness. Key Performance Indicators (KPIs) for the call center, including wait times, abandonment rates, and average call handling times, are tracked and compared with industry benchmarks. Similarly, the FOCs track productivity data including customer contacts, applications accepted and summary declines processed. However, these output measures are ineffective indicators of ‘customer satisfaction.’ A customer satisfaction survey is more ‘outcome-based’ and a much better indicator of the overall effectiveness of the program.
2. How, by whom, and for what purpose information will be used. A team of Quality Assurance professionals at the DACSC will conduct a brief telephone survey of a representative sample of customers to measure their satisfaction with the service provided by the DACSC and FOCs (the Centers). The results are used to evaluate internal performance and provide timely feedback on areas of possible concern. Additionally, the results of the survey have been incorporated into the office “scorecard” as an independent measurement of the Centers’ performance through the eyes of their customers. Customer satisfaction surveys are a hallmark of successful organizations in both the private and public sectors, and SBA’s continued use of this tool reflects its commitment to ensuring quality customer service for the nation’s taxpayers.
3. Technological collection techniques. The survey is administered telephonically by a trained DACSC employee. The survey administrator captures the survey data using an electronic form saved to a network database for analysis and reporting purposes.
4. Efforts to identify duplication. ODA contracts with the University of Michigan to conduct an annual customer satisfaction survey of the disaster loan program; however, that survey, although more comprehensive, is administered only once a year, making it of limited value in assessing real-time service levels in the various ODA centers. The infrequent delivery of the existing survey renders it ineffective for measuring ongoing customer satisfaction rates and identifying areas of concern in a timely manner. Also, because the existing survey is geared toward assessing customer satisfaction on a broad level, it is often difficult to relate its results to specific work units. For example, when surveying a borrower about “customer service” (after months and perhaps dozens of interactions), it is uncertain whether the borrower is referring to an interaction with the DACSC, the Processing and Disbursement Center, or possibly even an FOC. A survey designed to elicit timely feedback regarding a specific interaction and work unit (e.g., the DACSC or an FOC) is a better indicator of ongoing customer satisfaction and would provide managers with relevant information to address problems as they occur rather than months afterward. We do not believe the existing ODA-wide survey provides the type of specific, targeted, timely, and ongoing feedback that will be provided by the DACSC process.
5. Impact on small businesses or other small entities. This survey will not have a significant economic impact on small businesses or other small entities.
6. Consequence if collection is conducted less frequently or is not conducted. Failure to implement the proposed methodology will affect ODA’s ability to accurately assess customer satisfaction levels and, therefore, management’s ability to take appropriate measures to improve the delivery of critical financial assistance to disaster victims.
7. Existence of special circumstances. There are no special circumstances.
8. Solicitation of public comment. ODA solicited comments in the Federal Register on July 24, 2012, at 77 FR 43410 (copy attached). The comment period closed August 25, 2012, and no comments were received.
9. Payments or gifts to respondents. There are no payments or gifts to respondents.
10. Assurance of confidentiality. The data captured through this survey will be maintained in an encrypted database accessible by a small number of authorized users. Management reports are not specifically linked to any person or entity, but rather depict the aggregate results of surveys administered over a specified period. Therefore, there is no need for assurances of confidentiality. The information is, however, subject to the Freedom of Information Act (5 U.S.C. § 552).
11. Questions of a sensitive nature. No sensitive questions are asked.
12. Estimates of the hourly burden. The survey process involves a random sample (90% confidence level with a 10% margin of error) of callers to the DACSC. The survey consists of 8 short questions: 1 requiring a “Yes/No” response, 5 requiring a rating on a 1-5 scale, and 2 open-ended questions. Historically, the survey takes less than 5 minutes per customer to administer. A similar sample of FOC customers is planned to be included in this survey, potentially doubling the agency’s burden hours for this activity. (See attached survey examples.) Based on average activity levels for the DACSC during 2010 and 2011, we expect to conduct no more than 100 surveys per month. Based on customer visits to field locations, we estimate that surveying a similar number of field customers on behalf of the FOCs will achieve statistically significant results. The survey is optional, and the cost to the customer in terms of time is negligible.
Customer Service Center Customer Survey
Total surveys = 100 surveys per month
100 surveys/month x 12 months = 1,200 annual responses
1,200 responses x .083 hours (5 minutes) = 99.6 burden hours

Field Operations Customer Survey
Total surveys = 100 surveys per month
100 surveys/month x 12 months = 1,200 annual responses
1,200 responses x .083 hours (5 minutes) = 99.6 burden hours

Total respondents = 1,200 + 1,200 = 2,400
Total burden hours = 99.6 + 99.6 = 199.2
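The following is an illustrative cross-check of the arithmetic above, not part of the official burden estimate. It sketches, in Python, the approximate number of completed surveys implied by a 90% confidence level and a 10% margin of error, together with the respondent burden-hour totals; the assumed monthly caller population of 5,000 and the worst-case proportion of 0.5 are illustrative assumptions rather than figures from this submission.

# Illustrative sketch only: cross-checks the Item 12 burden arithmetic above.
# The monthly caller population (N_CALLERS) and worst-case proportion (P)
# are assumptions for illustration, not figures from this submission.

Z_90 = 1.645      # z-score for a 90% confidence level
MARGIN = 0.10     # 10% margin of error
P = 0.5           # worst-case proportion (maximizes the required sample)

# Completed surveys needed for an effectively unlimited population
n_infinite = (Z_90 ** 2) * P * (1 - P) / MARGIN ** 2          # about 68

# Finite-population correction for an assumed 5,000 callers per month
N_CALLERS = 5000                                              # assumption
n_adjusted = n_infinite / (1 + (n_infinite - 1) / N_CALLERS)  # about 67

# Respondent burden, using the 5-minute (.083-hour) estimate above
annual_responses_per_center = 100 * 12                        # 1,200
burden_per_center = annual_responses_per_center * 0.083       # 99.6 hours
total_burden = 2 * burden_per_center                          # 199.2 hours

print(round(n_infinite), round(n_adjusted), burden_per_center, total_burden)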
13. Estimate of total annual cost burden. There are no additional costs beyond those identified in Item 12 above.
14. Estimated annualized cost to the Federal Government. Based on actual experience, we estimate that it takes approximately 10 minutes (including unsuccessful call attempts and minimal administrative duties) to conduct each survey. Agency burden hours are calculated below:
2,400 surveys x .167 hours (10 minutes) per survey ≈ 400 Agency burden hours
The annual cost estimate for the Agency is based on the hourly salary of a GS-9, Step 1 employee ($23.30 per hour), which is representative of the employees performing these surveys. The cost is calculated as follows:
400 hours x $23.30 per hour = $9,320 Annual cost to the Government
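As with Item 12, the following is an illustrative cross-check of the agency-side arithmetic, offered as a sketch rather than part of the official estimate; it uses only figures stated above.

# Illustrative sketch only: cross-checks the Item 14 arithmetic above.
surveys_per_year = 2400
hours_per_survey = 0.167                             # 10 minutes, as rounded above
agency_hours = surveys_per_year * hours_per_survey   # 400.8, shown above as 400
gs9_hourly_rate = 23.30                              # GS-9, Step 1 hourly salary
annual_cost = 400 * gs9_hourly_rate                  # $9,320, matching the figure above
print(agency_hours, annual_cost)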
15. Explanation of program changes or adjustments in Items 13 and 14 on OMB Form 83-I. All burden hours and costs are decreased compared to the last submission. The initial submission calculated burden hours and costs based on the assumption that 67 surveys would be conducted per day (1,340 surveys per month). Soon after implementation, it was determined that this level of surveying was excessive and yielded no better results than would be derived from approximately 100 surveys per month. The scaled-back methodology produced the same statistically significant results described in this submission.
16. Collection of information whose results will be published. No publication is anticipated.
17. Expiration date for collection of information. SBA will display the OMB expiration date.
18. Exceptions to certification statement in Block 19 on OMB Form 83-I. There are no exceptions to the certification statement.