Annual Survey of Refugees
OMB Information Collection Request
0970-0033
Supporting Statement
Part B
December 2018
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer:
Wanda Hall
Office of Refugee Resettlement
B1. Respondent Universe and Sampling Methods
The Annual Survey of Refugees (ASR) is a cross-sectional study measuring participant characteristics and outcomes among refugees entering the United States in the previous five fiscal years. Each year, a sample of refugee Principal Applicants (PAs) will be drawn from ACF Office of Refugee Resettlement's (ORR's) Refugee Arrivals Data System (RADS). A PA is the individual whose case formed the basis for the household's administrative claim to refugee status. This individual is typically also the head of the household. The PA responds on behalf of all eligible adults in their household. The methodology for drawing the sample is as follows:
Contractors, in consultation with ORR, will analyze administrative data on refugee arrivals to determine which language groups to include in the survey administration. The goal is to maximize coverage of the population while minimizing the logistical challenges of serving small language groups. In 2016 and 2017, the ASR was offered in 17 languages (including English), covering over 75% of eligible refugee entrants from the focal fiscal years. We do not anticipate that the languages offered will change for 2019-2021.
Contractors will draw a stratified sample from the universe of Principal Applicants in eligible language groups. The principal stratum is a three-category "arrival cohort" (arrived in the prior fiscal year; arrived two or three years ago; arrived four or five years ago). New arrivals are over-sampled to ensure statistical power to detect meaningful differences between the three cohorts. Within arrival cohort, the sample is further stratified by geographic sending region, language, age group, gender, and household size. These are proportionate strata, ensuring the resultant sample is representative of the refugee population.
A replicated sample design will be used for sample management. Within each arrival cohort, the stratified sample is randomly partitioned into 15 smaller "snapshots." This strategy allows the contractor to monitor sample release and response production closely, maximizing the response rate within arrival cohorts while securing the targeted number of completed surveys per cohort in the 12-week fielding period. (A minimal illustration of the stratification and replicate partition appears below.)
The most recent contact information contained in ORR administrative data is collected by the U.S. State Department 90 days after the Principal Applicant's arrival in the United States. Sampled individuals will therefore undergo location tracing in an attempt to update their contact information in preparation for survey administration.
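To make the stratification and replicate design concrete, the following is a minimal Python sketch (pandas/NumPy). The frame, column names, cohort allocations, and regions are illustrative assumptions for exposition only; they are not the RADS schema or the production sampling program.

import numpy as np
import pandas as pd

rng = np.random.default_rng(20190101)

# Hypothetical sampling frame of Principal Applicants; columns and
# values are illustrative, not the actual RADS schema.
frame = pd.DataFrame({
    "pa_id": np.arange(30_000),
    "cohort": rng.choice(["prior_fy", "2_3_yrs", "4_5_yrs"],
                         size=30_000, p=[0.20, 0.40, 0.40]),
    "region": rng.choice(["Africa", "East Asia", "NESA", "Europe"],
                         size=30_000),
})

# Assumed per-cohort targets; the newest cohort is over-sampled
# relative to its 20% population share.
allocation = {"prior_fy": 1_500, "2_3_yrs": 1_200, "4_5_yrs": 1_200}
N_REPLICATES = 15

pieces = []
for cohort, n_target in allocation.items():
    pool = frame[frame["cohort"] == cohort]
    # Proportionate stratification within cohort: each region stratum
    # contributes in proportion to its population share (rounding may
    # shift totals by a case or two).
    for region, share in pool["region"].value_counts(normalize=True).items():
        stratum = pool[pool["region"] == region]
        n_stratum = int(round(n_target * share))
        draw = stratum.sample(n=n_stratum, random_state=rng).copy()
        draw["p_select"] = n_stratum / len(stratum)  # kept for base weights (see B3)
        pieces.append(draw)

sample = pd.concat(pieces, ignore_index=True)

# Replicated design: within each cohort, randomly partition the sample
# into 15 "snapshots" released to the field as needed.
sample["replicate"] = sample.groupby("cohort")["pa_id"].transform(
    lambda s: rng.permutation(np.arange(len(s)) % N_REPLICATES) + 1)

print(sample.groupby(["cohort", "replicate"]).size())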
B2. Procedures for Collection of Information
The information collection procedure is as follows:
Translation of Survey Instrument
If necessary, the contractor will obtain translations of the survey instrument described in this information collection into additional foreign languages spoken by refugee populations.
Tracing of Respondents
The contractor will seek updated contact information for sampled respondents using the National Change of Address system and TransUnion Batch Lookup. Based on past experience with this survey's administration, these are the most comprehensive sources of updated information, as travel loans from the U.S. State Department for refugees' arrival in the United States are reported to the TransUnion credit bureau. Given the lack of U.S. credit history for recently arrived individuals, other attempted batch lookup processes have proven ineffective and will not be continued.
Advance Mail-out
The contractor will prepare and send an introductory letter (see Appendix B) to each of the potential respondents. Provided in English and an additional language (drawn from administrative data), this letter introduces the survey and provides a means to contact the research team with updated telephone information via pre-paid postal mail, email, or telephone. If a potential respondent does not update their information upon receipt of the letter, the contractor relies on the most up-to-date telephone number available from tracing efforts.
Interviewing
The contractor will use culturally sensitive interview methods, including matching interviewer and interviewee by gender, and avoiding major religious days and holidays.

Telephone interviews will be conducted in the Principal Applicant's preferred language, using a computer-assisted telephone interviewing (CATI) protocol to ensure accurate and complete data collection. Interviewers will attempt to contact each selected refugee up to 10 times before final disposition as "unable to contact."
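As a small sketch of the attempt-limit rule only (the disposition codes and function below are hypothetical, not actual CATI disposition codes):

MAX_ATTEMPTS = 10  # per the protocol above

def final_disposition(attempt_outcomes: list[str]) -> str:
    """Assign a final disposition from a log of per-attempt outcomes.
    Outcome codes are hypothetical, not actual CATI codes."""
    if "complete" in attempt_outcomes:
        return "complete"
    if "refusal" in attempt_outcomes:
        return "refusal"
    if len(attempt_outcomes) >= MAX_ATTEMPTS:
        return "unable_to_contact"
    return "active"  # case remains in the call queue

print(final_disposition(["no_answer"] * 10))  # -> unable_to_contact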
Data Collection Quality Control
The contractor will:
Prepare a questionnaire reference book, in consultation with the ACF project officer, for use by the interviewers. The contractor will train the interviewers in the conduct of these interviews to reduce interviewer error prior to interviewing. Interviewers will receive a thorough explanation of each survey question and identify logical and acceptable responses; be briefed on their commitment to privacy; familiarize themselves with the interview flow and the CATI application; and then be evaluated to ensure an acceptable command of all concepts and technical aspects involved in the interview process;
Provide ongoing monitoring of interview quality, including live listening to a sample of calls. Stronger interviewers will be assigned more difficult cases to maximize data quality;
Download and review data tables from the CATI system, including frequency tests to identify anomalous or erroneous values;
Compare respondent-provided household roster data to administrative data from ORR to ensure that only eligible refugees (arriving during the specified time period) are included in tabulations about refugee adults in the household. (A minimal illustration of these checks appears below.)
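As a minimal illustration of the frequency and eligibility checks above (all column names, codes, and thresholds are hypothetical, chosen only for exposition):

import pandas as pd

# Hypothetical extract of CATI interview data.
cati = pd.DataFrame({
    "case_id": [1, 2, 3, 4],
    "employment_status": ["employed", "unemployed", "employed", "99"],
    "hours_worked": [40, 0, 168, 35],
})

# Frequency test: tabulate responses and flag out-of-range values
# (112 hours/week is an illustrative plausibility bound).
print(cati["employment_status"].value_counts(dropna=False))
print(cati[(cati["hours_worked"] < 0) | (cati["hours_worked"] > 112)])

# Eligibility check: keep only roster members who match an ORR
# administrative record with an arrival date in the focal window.
roster = pd.DataFrame({"case_id": [1, 1, 2],
                       "member_id": ["A1", "A2", "A3"]})
rads = pd.DataFrame({"member_id": ["A1", "A3"],
                     "arrival_fy": [2017, 2014]})
eligible = roster.merge(rads, on="member_id", how="inner")
eligible = eligible[eligible["arrival_fy"].between(2014, 2018)]
print(eligible)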
B3. Methods to Maximize Response Rates and Deal with Nonresponse
Expected Response Rates
Survey administrations in 2016 and 2017 provide the best estimates of the expected response rate for future ASR administrations.
In 2016, the overall response rate to the ASR was 24%. This rate was driven by an inability to update contact information for 36% of sampled individuals or to establish contact with a further 32%. Conditional on contact, 76% of sampled individuals completed a survey, and this cooperation rate did not vary substantially by arrival cohort. In 2017, the overall response rate to the ASR was 25%; contractors were unable to update contact information for 33% of sampled individuals or to make contact with a further 32%. Conditional on contact, 74% of sampled individuals completed a survey in 2017.
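These figures compose multiplicatively: a case must be traced, contacted, and then cooperate. Because the published components are rounded, the products below only approximately reproduce the reported overall rates:

# Implied overall response rate from the rounded components above.
for year, untraced, uncontacted, cooperation in [(2016, 0.36, 0.32, 0.76),
                                                 (2017, 0.33, 0.32, 0.74)]:
    contacted_share = 1.0 - untraced - uncontacted
    print(year, f"implied overall rate = {contacted_share * cooperation:.1%}")
# 2016: 24.3% (reported 24%); 2017: 25.9% (reported 25%)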
Dealing with Nonresponse
Beginning with the 2016 ASR, contractors have monitored responses in real time by key demographic groups hypothesized to be related to outcomes of interest in this survey, in order to proactively address any emerging non-response bias. The 2016 and 2017 ASR survey administrations demonstrate that, with the inclusion of a $25 incentive, there is no substantial non-response bias across demographics of key interest: year of arrival in the United States, sending region, age, family size, gender, or native language.
ASR respondents are sampled from ORR administrative records that include contact information collected 90 days after their arrival in the United States. Experience has demonstrated that the main source of non-response in the ASR is an inability to obtain updated, valid contact information for sampled households. With a $25 incentive, the overall "cooperation rate" conditional on successful contact was 75% in 2016 and 74% in 2017. This did not vary significantly by arrival cohort, the primary sampling stratum of interest.
At the completion of data collection, the contractor calculates analytic weights to enable nationally-representative point estimates and the calculation of statistical uncertainty that accounts for clustering of individuals within households. These weights include a base (sampling) weight reflecting the refugee household selection probability. Because sample allocations of each cohort are managed separately, selection probabilities vary by the size of the arrival cohort population and amount of the sample released into the field. Weights also include a post-stratification adjustment to correct for differential non-response across cohort and demographic subgroups, aligning the data to known population distributions taken from ORR administrative data.
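As a minimal sketch of this weighting logic (respondent records, cell definitions, and population totals below are hypothetical; the production weights use more detailed cells drawn from ORR administrative data):

import pandas as pd

# Hypothetical respondent file: the base weight is the inverse of the
# selection probability retained at sampling time.
respondents = pd.DataFrame({
    "cohort": ["prior_fy", "prior_fy", "4_5_yrs"],
    "region": ["Africa", "NESA", "Africa"],
    "p_select": [0.25, 0.25, 0.05],
})
respondents["base_weight"] = 1.0 / respondents["p_select"]

# Known population totals per cohort-by-region cell (from ORR data).
pop_totals = pd.DataFrame({
    "cohort": ["prior_fy", "prior_fy", "4_5_yrs"],
    "region": ["Africa", "NESA", "Africa"],
    "pop_total": [2_000, 1_800, 9_000],
})

# Post-stratification: scale weights within each cell so they sum to
# the known population total, correcting differential non-response.
cell_sums = (respondents.groupby(["cohort", "region"], as_index=False)
             ["base_weight"].sum()
             .rename(columns={"base_weight": "cell_sum"}))
adj = cell_sums.merge(pop_totals, on=["cohort", "region"])
adj["ps_factor"] = adj["pop_total"] / adj["cell_sum"]

respondents = respondents.merge(adj[["cohort", "region", "ps_factor"]],
                                on=["cohort", "region"])
respondents["final_weight"] = respondents["base_weight"] * respondents["ps_factor"]
print(respondents[["cohort", "region", "base_weight", "final_weight"]])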
In 2016 and 2017, the nonresponse/post-stratification adjustment was developed by first conducting a Chi-square Automatic Interaction Detector (CHAID) analysis to identify the factors most associated with survey response. The factors that emerged from this analysis (which are also available in RADS administrative data) were then used for the nonresponse weighting. We do not anticipate changes to the weighting procedure for 2019-2021.
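CHAID itself is not available in standard Python libraries; as a stand-in for the same screening task, the sketch below uses a shallow CART decision tree (scikit-learn) to rank administrative predictors of response. All data here are synthetic and for exposition only.

import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(7)
n = 5_000
# Synthetic sampled cases with RADS-style covariates (illustrative only).
sampled = pd.DataFrame({
    "cohort": rng.choice(["prior_fy", "2_3_yrs", "4_5_yrs"], size=n),
    "region": rng.choice(["Africa", "East Asia", "NESA"], size=n),
    "age_group": rng.choice(["18-29", "30-44", "45+"], size=n),
})
# Synthetic response indicator: older cohorts respond less often,
# mimicking the traceability pattern described above.
p_respond = np.where(sampled["cohort"] == "prior_fy", 0.35, 0.22)
sampled["responded"] = rng.random(n) < p_respond

# Shallow tree as a CHAID stand-in: surface the covariates most
# associated with response, for use in non-response weighting cells.
X = pd.get_dummies(sampled[["cohort", "region", "age_group"]])
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=200)
tree.fit(X, sampled["responded"])
for imp, name in sorted(zip(tree.feature_importances_, X.columns),
                        reverse=True)[:3]:
    print(f"{name}: importance = {imp:.2f}")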
Maximizing Response Rates
As discussed above, the majority of non-response to the Annual Survey of Refugees stems from the difficulty of locating and contacting the ASR's highly mobile, newly arrived focal population. This challenge is magnified because ORR's administrative data contain contact information collected only 90 days after refugees' arrival in the United States.
The primary strategy for maximizing response rates while securing the target number of completed surveys will be the replicated sampling strategy outlined in B1 above. This allows the contractor to closely follow the response rate by arrival cohort and to release further sample into the field if production within a cohort is lower than expected. In both 2016 and 2017, the oldest cohorts (4 or 5 years since arrival) required additional replicate releases to meet production goals, reflecting that non-traceability is higher among refugees who have been in the United States longer. Post-participation incentives will also be used to increase participation (see Supporting Statement A, section A9, for additional information).
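A simple sketch of the mid-field release decision (the threshold rule and numbers are illustrative, not the contractor's actual production rules):

def should_release_next_replicate(completes: int, weeks_elapsed: int,
                                  weeks_total: int, target: int) -> bool:
    """Release another replicate if a straight-line projection of
    completes falls short of the cohort's target."""
    projected = completes / weeks_elapsed * weeks_total
    return projected < target

# e.g., an older cohort lagging at week 6 of the 12-week field period
print(should_release_next_replicate(completes=180, weeks_elapsed=6,
                                    weeks_total=12, target=400))  # True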
As part of a separate effort to redesign the ASR instrument, contractors conducted substantial outreach to refugee-serving organizations and community groups to attempt to identify ways to improve respondent cooperation and tracing. This data collection was approved under the OPRE generic clearance for pretesting (OMB No. 0970-0355; Pretest of the Annual Survey of Refugees; approved September 13, 2017). We will continue to explore a sub-study of hard-to-trace sample members, to determine whether those who are harder to locate vary systematically on outcomes of interest in the survey (economic outcomes, language ability, benefits use, etc.). If necessary, we will submit a request for a non-substantive change to this ICR for this survey improvement effort.
B4. Tests of Procedures or Methods to be Undertaken
The contractor will conduct sample validation exercises to ensure that the stratified random sampling and replicate partitioning procedures performed as intended and that the resulting sample is representative of the intended inferential population. These analyses will be shared with and approved by the ACF project officer prior to beginning subject tracing.
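One way such a validation check might look (counts and proportions are hypothetical): a chi-square goodness-of-fit test comparing a cohort's realized sample distribution across region strata to the frame's proportions.

import numpy as np
from scipy.stats import chisquare

frame_share = np.array([0.40, 0.35, 0.25])   # region shares in the frame
sample_counts = np.array([612, 521, 367])    # realized sample counts
expected = frame_share * sample_counts.sum()
stat, p = chisquare(f_obs=sample_counts, f_exp=expected)
# A large p-value is consistent with proportionate selection.
print(f"chi-square = {stat:.2f}, p = {p:.3f}")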
During the fielding process, the replicate sample release procedure allows for close monitoring, adaptation, and continuous learning. Throughout the field period, the contractor will produce weekly summaries of survey progress by key demographic groups, in order to monitor non-response bias and redouble efforts to secure participation from underrepresented populations as necessary.
Interviewer training materials were updated to improve the comparability and quality of data across many languages and cultural groups, based on experiences during the 2016 and 2017 survey administrations.
B5. Individual(s) Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
Dr. Nicole Deterding
Contract Social Science Research Analyst
Business Strategy Consultants
ACF Office of Planning, Research, and Evaluation
Mary E. Switzer Building, 4th Floor
330 C Street, SW, Washington, DC 20201
(202) 205-0742
nicole.deterding@acf.hhs.gov
Gary Frost
Program Manager
Civilian and Homeland Security Solutions Division
General Dynamics Information Technology
3211 Germantown Road
Fairfax, VA 22030
(703) 995-3700 direct
gary.frost@gdit.com
www.gdit.com
Robert Santos
Vice President and Chief Methodologist
The Urban Institute
2100 M Street NW
Washington, DC 20037
(202) 261-5904
RSantos@urban.org