Peace Corps
Office of Strategic Information, Research and Planning
OMB Control Number XXXX-XXXX
Supporting Statement for
Returned Peace Corps Volunteer Evacuation Survey (Part B)
B. Collections of Information Employing Statistical Methods
B1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The potential respondent universe for the Returned Peace Corps Volunteer Evacuation Survey consists of Returned Peace Corps Volunteers (RPCVs) who were evacuated in February and March 2020 due to COVID-19. This population consists of 6,892 individuals, including those who were serving as Peace Corps Volunteers, Peace Corps Trainees, or Peace Corps Response Volunteers at the time of the evacuation.
Email addresses are collected during the initial Peace Corps application process; thus, email address coverage for this population is expected to be high.
Expected Response Rate
There has not previously been a survey of Returned Peace Corps Volunteers in the context of an evacuation. The most recent systematic and comprehensive collection of survey data on the Returned Peace Corps Volunteer experience was a survey conducted by the Peace Corps in 1996, which was not focused on a single event. That survey achieved a response rate of 54%, and we hope to achieve a similar rate.
We anticipate that response rates will be higher because additional information is available for this population, which also reduces the risk of nonresponse error (i.e., systematic differences in estimates of interest between respondents and non-respondents, leading to misestimates). Although increases in response rates do not guarantee a reduction in nonresponse error (cf. Lin and Schaeffer 1995), the potential bias from systematic differences between respondents and non-respondents generally decreases as response rates increase.
We aim to achieve a response rate of at least 54%. For planning purposes, however, we have adopted a more conservative estimate based on the Corporation for National and Community Service (CNCS) 2015 AmeriCorps Alumni Outcomes Survey, which achieved a 39% completion rate for online responses. We therefore use 39% as our minimum response rate estimate.
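As a rough sketch of the planning arithmetic implied by these figures (the population count and the two response rates are taken from this section; the expected completed-response counts are simply their products):

```python
# Rough planning arithmetic using the figures cited in Section B1.
population = 6892        # evacuated RPCVs in the respondent universe
minimum_rate = 0.39      # proxy rate from the 2015 AmeriCorps Alumni Outcomes Survey
target_rate = 0.54       # rate achieved by the 1996 RPCV survey

print(round(population * minimum_rate))  # 2688 completes at the minimum assumed rate
print(round(population * target_rate))   # 3722 completes at the target rate
```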
Actual Response Rate in Previous Data Collection
This is a new data collection.
B2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; Estimation procedure; Degree of accuracy needed for the purpose described in the justification; Unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
Statistical Methodology for Stratification and Sample Selection
There is no sample selection; the survey will be distributed to the entire population of Returned Peace Corps Volunteers who were evacuated due to COVID-19.
Estimation Procedures
To obtain valid survey estimates, estimation will be performed using appropriately weighted survey data. The weight applied to each respondent is a function of the appropriate nonresponse and post-stratification ratio adjustments. Thus, a respondent’s weight may incorporate the following adjustments:
Nonresponse – Results will be adjusted for differences in participation levels within cohorts due to demographics and other characteristics. Logistic regression models will be used to estimate propensity scores for survey response based on respondent demographics and other characteristics available for analysis. These scores will be used to create nonresponse weighting classes for statistical adjustment.
Depending on the number and distribution of nonresponses, and the bias they could introduce into our data, we will consider randomly sampling non-respondents and following up with another opportunity to take the survey.
As a final step, a raking algorithm such as the Random Iterative Method (RIM) will be employed to adjust the calculated weights to ensure that weighted totals match those of the population with respect to demographic makeup and other characteristics. An illustrative sketch of the weighting-class and raking steps follows.
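The following is a minimal sketch, under stated assumptions, of how the nonresponse weighting classes and the subsequent raking (RIM) step described above could be carried out. Column names, category labels, the quintile class definition, and the population margins shown here are illustrative placeholders rather than the actual frame fields or figures, and the production analysis may instead rely on an established tool such as the R "survey" package.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Hypothetical frame: one row per evacuated Volunteer, with a 0/1 "responded"
# flag and characteristics known for respondents and non-respondents alike.
# File name, columns, and categories are illustrative placeholders.
frame = pd.read_csv("evacuee_frame.csv")
predictors = pd.get_dummies(frame[["cohort", "age_group", "gender"]], drop_first=True)

# Step 1: propensity scores for survey response from a logistic regression,
# grouped into weighting classes (here, quintiles); each respondent receives
# the inverse of its class's observed response rate as a nonresponse weight.
model = LogisticRegression(max_iter=1000).fit(predictors, frame["responded"])
frame["propensity"] = model.predict_proba(predictors)[:, 1]
frame["prop_class"] = pd.qcut(frame["propensity"], q=5, labels=False)
class_rate = frame.groupby("prop_class")["responded"].transform("mean")
frame["nr_weight"] = 0.0
resp_mask = frame["responded"] == 1
frame.loc[resp_mask, "nr_weight"] = 1.0 / class_rate[resp_mask]

# Step 2: raking (RIM). Iteratively scale the nonresponse-adjusted weights so
# that weighted margins match known population shares on each dimension.
def rake(df, weights, targets, max_iter=50, tol=1e-6):
    w = np.asarray(weights, dtype=float).copy()
    for _ in range(max_iter):
        max_shift = 0.0
        for col, shares in targets.items():
            total = w.sum()
            for category, share in shares.items():
                mask = (df[col] == category).to_numpy()
                current = w[mask].sum()
                if current > 0:
                    factor = share * total / current
                    w[mask] *= factor
                    max_shift = max(max_shift, abs(factor - 1.0))
        if max_shift < tol:
            break
    return w

# Hypothetical population margins (not actual figures for this population):
targets = {"gender": {"F": 0.62, "M": 0.38},
           "age_group": {"under_30": 0.70, "30_plus": 0.30}}
respondents = frame[resp_mask].copy()
respondents["final_weight"] = rake(respondents, respondents["nr_weight"], targets)
```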
Degree of Accuracy
A minimum of 2,688 completed responses (39% of the 6,892-person population) is expected. Because information is being collected from the entire population rather than a sample, there is no sampling error; the margin of error attributable to sampling is zero. The accuracy and reliability of the information collected in this survey will be adequate for estimates covering the full population of evacuated Volunteers.
Standard errors will be computed using statistical software, such as the R “survey” package for Analysis of Complex Survey Samples.
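As context for the type of computation involved, the following is a minimal sketch of a design-based point estimate and approximate standard error for a single survey item, assuming final analysis weights are available. The function, item, and weight names are hypothetical; it uses the usual with-replacement linearization formula and is illustrative only, as the production analysis would rely on dedicated software such as the R "survey" package noted above.

```python
import numpy as np

def weighted_mean_se(y, w):
    """Weighted mean of a survey item and its approximate standard error.

    Uses the standard with-replacement linearization formula; dedicated
    survey software would also support finite population corrections.
    """
    y = np.asarray(y, dtype=float)
    w = np.asarray(w, dtype=float)
    mean = np.sum(w * y) / np.sum(w)
    var = np.sum((w * (y - mean)) ** 2) / np.sum(w) ** 2
    return mean, np.sqrt(var)

# Illustrative call on a hypothetical 0/1 survey item and final weights:
# est, se = weighted_mean_se(respondents["item_agree"], respondents["final_weight"])
```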
Unusual problems requiring specialized sampling procedures
Unusual problems requiring specialized sampling procedures are not anticipated at this time. All necessary steps to maximize response rates will be taken throughout the data collection period. Weighting procedures will also help address potential issues with nonresponse.
Use of periodic data collection cycles to reduce burden
A participant will respond to the survey only once, and the burden on any respondent will be low. Thus, a less frequent data collection cycle is not considered necessary.
B3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
The Returned Peace Corps Volunteer Evacuation Survey will employ a number of strategies to maximize response rates while maintaining cost control.
The survey will be administered online; participants will be able to complete it on any internet-connected device with a web browser, and respondents may complete the survey over more than one session. Survey communications will be by email for all participants.
Participants with known email addresses will be sent an invitation email on the survey launch date. The email will describe the intent of the survey and will include a survey link unique to that participant. The email will be sent from a peacecorps.gov address and will be signed by the Director of the Peace Corps to emphasize the legitimacy of the data collection request. A reminder email that also includes the participant’s unique link will be sent to non-respondents two weeks after the launch date; we may increase the frequency of reminders if warranted. The survey field period is currently scheduled to last one month.
To assess the impact of nonresponse bias in our study, we will conduct statistical analyses to identify any characteristics of respondents that are correlated with response. As discussed in Section B2, propensity scores from the logistic regression of survey response on explanatory variables will be used to create nonresponse weighting classes for statistical adjustment. These calculated weights will be further calibrated to ensure that weighted totals match those of the population with respect to demographic makeup and other characteristics.
B4. Describe any tests of procedures or methods to be undertaken.
Per Part A of this justification, the proposed survey instrument was designed to collect the required data efficiently. A fully functional prototype of the instrument was programmed on the online survey management software platform to be used in the actual data collection. This prototype underwent small-scale testing by agency employees who are also Returned Peace Corps Volunteers. Initial testing indicated a respondent burden of about 15 minutes, with no issues reported regarding the clarity of survey instructions or content.
B5. Provide the name and telephone number of individuals consulted on statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.
Peace Corps staff were responsible for the design of the Returned Peace Corps Volunteer Evacuation Survey and will be responsible for administering the data collection, analyzing the results, and producing deliverables.
The project lead at the Peace Corps for this data collection is:
Daisy Duarte
Management Analyst, Peace Corps Office of Strategic Information, Research and Planning (OSIRP)
+1 (202) 692-2220
Other members of the Peace Corps Office of Strategic Information, Research and Planning (OSIRP) were also involved in the methodological and statistical design of the survey. They are:
Jeff Kwiecinski
Acting Director, Peace Corps Office of Strategic Information, Research and Planning (OSIRP)
+1 (202) 692-1890
Mei Liang
Management Analyst, Peace Corps Office of Strategic Information, Research and Planning (OSIRP)
+1 (202) 692-1107
Stakeholders both internal and external to the Peace Corps were consulted on the content included in the data collection instrument. Contacts at other federal agencies were consulted on the proposed methodological and statistical design. However, the Peace Corps bears ultimate responsibility for this data collection’s design and execution.