Supporting Statement for

Paperwork Reduction Act Submissions


OMB Control Number: 1660-NEW


Title: Federal Emergency Management Agency Individual Assistance Program Effectiveness & Recovery Survey


Form Number(s):


Program Effectiveness & Recovery Survey, FEMA Form 007-0-20 (formerly FEMA Form 90-149)


B. Collections of Information Employing Statistical Methods.



When Item 17 on the Form OMB 83-I is checked “Yes”, the following documentation should be included in the Supporting Statement to the extent it applies to the methods proposed:



1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection has been conducted previously, include the actual response rate achieved during the last collection.


The survey proposed here is time-limited.

The target population of this information collection is disaster survivors living in the United States and its territories who have registered for federal assistance for each declared disaster (catastrophic disasters may be excluded during weeks when registration activity exceeds 75,000). The sampling frames consist of the names of all disaster survivors who contact FEMA for disaster assistance and are determined to be eligible for assistance. Misclassification and eligibility confusion cannot occur in the sampling frames because the frames are generated strictly from the definition of the target population stated above. No elements are excluded, and no alternative sampling frame is used. No post-stratification procedure is included in this IC, but the collected data are aggregated to estimate the customer satisfaction level for the entire primary unit, the target population.

A systematic random sampling method is used to select the group of people to be surveyed from the target population stored in electronic data files, usually in the National Emergency Management Information System (NEMIS). The sampling unit is the individual for all samples, and no stratification is involved. More detailed sampling methods and timelines are provided below.

The disaster assistance process covers a span of time, and the goal is to measure and report on those services and processes over that span. Participation in the survey is available to all eligible applicants within the target population, based on random selection. To achieve these goals without overburdening disaster survivors, a target number of completed surveys is established that will yield statistically valid results.

Program Effectiveness & Recovery Survey: The Program Effectiveness & Recovery Survey is conducted approximately 90 days after the first registration within the declared disaster. The one-time random sample is based on the volume of recipients of assistance by disaster.


Example: Based on resources, the aim is to complete a statistically valid number of surveys, approximately 400 per disaster. This provides, in effect, a minimum 95% confidence level and a confidence interval of plus or minus 5% at a 50% response distribution.
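As an illustration of the arithmetic behind the 400-survey target, the following is a minimal sketch in Python using the standard sample-size formula for a proportion, n = z^2 * p * (1 - p) / e^2 (the function name is ours):

    import math

    def required_sample_size(z=1.96, p=0.5, margin=0.05):
        # Sample size needed to estimate a proportion within the given
        # margin of error, at the most conservative distribution p = 0.5.
        return math.ceil(z ** 2 * p * (1 - p) / margin ** 2)

    print(required_sample_size())  # 385; the 400-survey target adds a cushion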


Response Rate and Decline Rate – See #3 below:


This is a new information collection request; however, the Program Effectiveness & Recovery Survey (PE&R) was administered under a prior collection, OMB Control Number 1660-0036. For the next request in three years, detailed descriptions of the response rates from this collection will be provided. In the event response rates fall below 80%, a non-response analysis will be performed on the group(s) in question.


For the PE&R Survey administered under the prior collection, non-response bias was studied based on an age demographic. Across the age groups 0-30, 31-60, and 61 and older, there was a 5-10% demographic difference for some groups. However, analysis of early versus late responders' (assumed to be similar to non-responders) overall customer satisfaction shows only a slight variation of 1-3% in their ratings. Therefore, the responders can be assumed to be similar to the non-responders in their ratings for this type of primarily customer-service-based survey. Further analysis can be provided in the future if needed.
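The early-versus-late comparison can be sketched as follows, assuming each completed survey is stored with the day it was completed and whether the rating fell in the top three response categories (the record layout and cutoff are illustrative, not the production analysis):

    from statistics import mean

    # (day survey completed, 1 if rating was in the top three categories)
    responses = [(1, 1), (2, 1), (3, 0), (12, 1), (13, 0), (14, 1)]

    cutoff = 7  # assumed boundary between "early" and "late" responders
    early = [sat for day, sat in responses if day <= cutoff]
    late = [sat for day, sat in responses if day > cutoff]

    # Late responders serve as a proxy for non-responders; a small gap
    # suggests limited non-response bias in the satisfaction estimate.
    print(f"early: {mean(early):.2f}  late: {mean(late):.2f}")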


The actual decline rate for the Program Effectiveness & Recovery Survey in FY 2010 was 2.74% of all attempts and completes. The ratio of completes to attempts was 36.51%, reflecting applicants' unavailability to complete the survey, bad or wrong phone numbers, busy signals, declines, no answers, voice mail, privacy managers, and applicants who did not remember contacting FEMA or were not familiar with the case. The interviewer's phone number is blocked from the respondent, which has a negative effect on the response rate. An attempt is made to reach the respondent each time the case systematically returns to the call queue; up to four attempts are made to obtain a survey response if necessary to achieve a valid number of completed surveys. If an applicant is not immediately available, an attempt is made to set up another time within the survey period that is more convenient for the respondent, and the interviewer explains how important his or her feedback is.


The tables below show the size of the universe covered by the collection and the corresponding samples for the universe as a whole. In addition, Table 6A shows the target number of completed surveys and the confidence level and confidence interval pursued at the worst case of response distribution, p = 0.5.

Table 6. Annual estimates of universe and sample sizes. The sampling unit is the individual for all surveys.

Phone Survey: Program Effectiveness & Recovery Survey (reported by disaster)
  Total annual universe, 2-yr average, all DRs*1: 316,870
  2-yr average number of DRs, 2008-2009*2: 30
  Universe, 2-yr average per DR: 10,562
  Sample per disaster (prorated by DR based on an average response rate of 37%): 400
  FEMA confidence level*3: 95% [+/- 5%]


Notes:

*1: Universe size is estimated based on the average number of disaster survivors in FY 2008-2009 and the percentage of registrants who use the system.

*2: Number of disasters is based on the average number of disasters that occurred during FY 2008-2009.

*3: Confidence level and confidence interval are at a 50% response distribution (p = proportion = 0.5) for the samples actually given to the interviewers to make survey calls.
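A back-of-the-envelope check of the Table 6 figures follows; the assumption that the drawn sample is inflated by the expected response rate to yield roughly 400 completes is ours:

    total_universe = 316_870   # annual universe, 2-yr average, all DRs
    num_disasters = 30         # 2-yr average number of DRs
    target_completes = 400     # completed surveys per disaster
    response_rate = 0.37       # average response rate

    per_dr_universe = total_universe // num_disasters
    records_to_draw = round(target_completes / response_rate)

    print(per_dr_universe)     # 10562, matching the table
    print(records_to_draw)     # ~1081 records drawn to expect 400 completes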




Table 6A. Number of completed surveys (FY 2009 and proposed), and the confidence level and margin of error pursued at response distribution p = 0.5.

Phone Survey: Program Effectiveness & Recovery Survey (reported by disaster)
  Total number of completed surveys in FY 2009: 9,067
  Number of completed surveys per disaster during the period: 400
  Annual number of completed surveys for 30 disasters per year: 12,000
  Annual pursued confidence level (30 DRs): 95% [+/- 5%]


2. Describe the procedures for the collection of information including:


Once a sample set is obtained, the interviewer calls the individuals in the sample to conduct interviews until statistical validity is reached. For the proposed information collection, sending a pre-notification letter for the survey is not desirable because of the time constraint (see B. #1). In addition, the measures listed in question 3 below are applied to the information collection instruments to ensure the best balance between maximizing data quality and minimizing respondent burden; several of those measures are also relevant to improving response rates.


-Statistical methodology for stratification and sample selection:

A systematic random sample is generated in the entire target population from the electronic database in the National Emergency Management Information System (NEMIS) or a program similar to it that contains the names, phone numbers, addresses, and disaster related information of all such applicants.

Each registered and eligible applicant from any given disaster has the same chance of being chosen by the systematic random sampling method. The phone survey sample is imported into the survey database and is randomly populated onto a computer screen from the pool of names, using database software or a similar program, for the interviewer.
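A minimal sketch of systematic random sampling from an ordered frame, assuming the frame is exported as an ordered list of applicant records (names and sizes are illustrative):

    import random

    def systematic_sample(frame, n):
        # Take every k-th record after a random start, so every record
        # in the ordered frame has the same chance of selection.
        k = len(frame) / n              # sampling interval
        start = random.uniform(0, k)    # random start within the first interval
        return [frame[int(start + i * k)] for i in range(n)]

    # Illustrative use: 400 applicants drawn from a 10,562-record frame.
    frame = [f"applicant_{i}" for i in range(10562)]
    sample = systematic_sample(frame, 400)
    print(len(sample))  # 400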

As an example: The Program Effectiveness & Recovery Survey sample is imported after the close of the application period, allowing respondents time for recovery.


-Estimation procedure: To estimate the satisfaction level of the service provided, the proportion of responses falling in the top three positive categories (for example, excellent, good, and satisfactory) is computed from the completed surveys representing the universe of disaster applicants.
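A minimal sketch of that top-three-category estimate, with illustrative ratings (the category labels follow the example above):

    from collections import Counter

    ratings = ["excellent", "good", "satisfactory", "poor", "good", "excellent"]
    TOP_THREE = {"excellent", "good", "satisfactory"}

    counts = Counter(ratings)
    satisfaction = sum(counts[c] for c in TOP_THREE) / len(ratings)
    print(f"estimated satisfaction level: {satisfaction:.0%}")  # 83%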


-Degree of accuracy needed for the purpose described in the justification: Although highly precise statistical inference is not necessary for this information collection, the goal is to estimate customer satisfaction at a 95% confidence level for all surveys.

-Unusual problems requiring specialized sampling procedures:

There are no unusual problems requiring specialized sampling procedures.


-Any use of periodic (less frequent than annual) data collection cycles to reduce burden: A periodic data collection cycle is not applicable to this type of information collection, since disaster occurrences are not predictable enough to schedule a collection cycle in advance.


3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.


  • The opening statement briefly explains the purpose of the study and its voluntary nature, and asks for the applicant's help in improving FEMA's response to future disasters.

  • The questions are short and require little time to complete.

  • The questions are very straightforward and easy to answer.

  • An explanation is given that responses will not affect the outcome of the applicant's case with FEMA.

  • Information gathered from focus groups will be used to ensure that the survey items included are of interest to Individual Assistance applicants, making respondents more likely to see the survey as relevant.

  • Revisions will be made to the survey with attention to correcting items with low response rates.

  • Callbacks are made to applicants who state they will be available at a later time when feasible and resources allow.

  • When limited sample is available, additional attempts (up to four) are made to contact the applicant.

  • Contacts are made at varying times of the day during the survey period.

  • Training is provided and more experienced interviewers are retained.

  • Interpreters are used to obtain responses from applicants who speak other languages.


These surveys are performed by calling disaster assistance applicants. The average decline rate is 2.74% of total attempts and completes. The FY 2010 response rate for the Program Effectiveness & Recovery Survey under the current OMB inventory is 36.51%, using a response-rate formula recognized by the American Association for Public Opinion Research (AAPOR), as follows:


RR = I / [(I + P) + (R + NC + O) + U], where:


RR = Response rate

I = Complete interview

P = Partial interview

R = Refusal and break-off

NC = Non-contact

O = Other (bad/wrong numbers, technical phone problem, etc.)

U = Unknown eligibility (= 0 in this case, see B. #1.)
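The calculation can be expressed as a short Python function; the disposition counts below are hypothetical, not the FY 2010 figures:

    def aapor_response_rate(I, P, R, NC, O, U=0):
        # RR = I / ((I + P) + (R + NC + O) + U). U is 0 here because
        # eligibility is known for every sampled applicant (see B. #1).
        return I / ((I + P) + (R + NC + O) + U)

    print(f"{aapor_response_rate(I=365, P=10, R=27, NC=500, O=98):.2%}")  # 36.50%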


This relatively low response rate can be compared with the distribution of response rates presented in McCarty et al. (2006), a paper concerning phone survey response rates. Figure 1 is a histogram of the response rates, using the same RR formula, for 205 telephone surveys conducted at the University of Florida Survey Research Center at the Bureau of Economic and Business Research between January 2000 and July 2004; it shows a modal response rate of 25% and a mean of about 41.5%. Thus, a response rate of about 37% is not in fact as low as it may look without any reference for comparison.



Figure 1. Histogram of the response rates for 205 telephone surveys conducted at the University of Florida Survey Research Center at the Bureau of Economic and Business Research between January 2000 and July 2004 (McCarty et al., 2006, Effort in Phone Survey Response Rates: The Effects of Vendor and Client-Controlled Factors, Field Methods, Vol. 18 No. 2, 172-188).


The target population is disaster survivors. Because this information collection is time-constrained, survivors must be interviewed following a disaster, while they are still experiencing disaster trauma. In most cases the survivors may be at the worst stage of disaster trauma when they are called for the surveys. Disaster trauma symptoms may include [http://www.citizencorps.gov/cert/downloads/training/PM-CERT-Unit7Rev3.doc]:


  • Irritability or anger

  • Self-blame or the blaming of others

  • Isolation and withdrawal

  • Fear of recurrence

  • Feeling stunned, numb, or overwhelmed

  • Feeling helpless

  • Mood swings

  • Sadness, depression, and grief

  • Denial

  • Concentration and memory problems

  • Relationship conflicts/marital discord

  • Loss of appetite

  • Headaches or chest pain

  • Diarrhea, stomach pain, or nausea

  • Hyperactivity

  • Increase in alcohol or drug consumption

  • Nightmares

  • The inability to sleep

  • Fatigue or low energy


In addition to disaster trauma, survivors are expected to relocate frequently after a disaster, which contributes to the non-contact portion of non-response. Often survivors do not have telephone service in their community because of the disaster. Considering that even during normal stages of everyday life "time-limited polls often yield very low response rates" (McCarty et al., 2006), we believe that we have achieved a very good response rate, if not the best possible, for this particular type of target population. Nonetheless, we follow the steps described in B. #2 to maintain the current level of success in the response rate, even though our respondents may still be experiencing disaster trauma during the survey periods.


4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.


Many of the questions in the survey have been in use for six to ten years and were initially based on comments from past focus groups as well as contractor opinion. FEMA personnel also reviewed questionnaire content and wording to improve readability and clarity. Tests with fewer than 10 applicants may be performed by FEMA's customer satisfaction analysis staff when updates are desirable, and all updates to questionnaires will be submitted to OMB for approval.


5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.


Maggie Billing

Program Analyst

Customer Satisfaction Analysis Section

Texas National Processing Service Center

940 891-8709 or 940 891-8500


Nicole Bouchet

Statistician

FEMA

202-646-2814


