Supporting Statement B - 2024 RPCV Survey - Final


Peace Corps Returned Volunteer Impact Survey

OMB: 0420-0569


Supporting Statement For

Peace Corps Returned Volunteer Impact Survey (Part B)


B. Collections of Information Employing Statistical Methods

B1. Describe (including numerical estimate) the potential respondent universe and any sampling or other respondent selection method to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.

The potential respondent universe for the 2024 Peace Corps Returned Volunteer Impact Survey consists of Returned Peace Corps Volunteers (RPCVs) whose initial term of service ended in calendar year 2003, 2013, 2018, or 2021, corresponding to 21, 11, 6, and 3 years before an assumed December 31, 2024 cutoff date for inclusion in the 2024 survey administration. This cohort design, focusing on Returned Volunteers who served in certain years rather than the full universe of Returned Volunteers, was employed in order to (1) minimize burden to the overall RPCV population while maximizing the ability to examine issues of career progression with respondents who are generally still of working age, and (2) increase the efficiency of contact information collection and validation by targeting resources against a smaller subset of potential respondents. Sample size calculations are based on the Volunteers aligned to the cohorts above who meet the following criteria:

    • Initial service was in a 2-year Peace Corps Volunteer (PCV) program;

    • Initial service length was greater than zero days, defined as having a recorded oath date prior to end of service;

    • Not currently in service in any Peace Corps program;

    • Not deceased.

For each cohort, sample eligibility was restricted to Volunteers undertaking their initial service. A small percentage of Peace Corps Volunteers serve more than once; as such, the decision to serve again may itself be considered an outcome of initial service. Thus, in order to investigate the link between Peace Corps Volunteer service and outcomes fairly, the decision was made to restrict cohort eligibility to those returning from their initial service. Respondents who are selected for a given cohort but have served in the Peace Corps in subsequent years will be identified as having multiple services for the purposes of analysis but will be asked to respond to the survey in the context of their initial service. In a similar vein, potential respondents were excluded if their initial service was in a Peace Corps program other than the 2-year PCV program. The vast majority of Volunteers’ initial service was in a 2-year PCV program, and most of Peace Corps’ resources are aligned to the selection, training, support, and servicing of those Volunteers and Returned Volunteers. Finally, potential respondents currently serving in a Peace Corps program were excluded in order to conform to the mandate that the survey consider “former” Volunteers.

The estimated universe of eligible participants is stratified by time since last service (21, 11, 6, and 3 years; corresponding to Volunteers who returned in 2003, 2013, 2018, or 2021 respectively). Exhibit B1 shows the size of the universe.

Exhibit B1. Estimated Universe Size for the Peace Corps Returned Volunteer Impact Survey

Cohort Year            Eligible Participants
2003 (21 years out)    2,936
2013 (11 years out)    3,542
2018 (6 years out)     3,357
2021 (3 years out)*    0
Total                  9,835


*The number of eligible participants for the 2021 (3 years out) cohort is 0 for the 2024 survey because Peace Corps field operations did not take place in 2021 due to the COVID-19 pandemic. The fourth wave of the survey in 2026, however, will include a 3-years-out cohort in order to maintain consistency in survey design and methodology across all four iterations of the RPCV Impact Survey.


The sample will be drawn from the universe as described in Exhibit B1 above. The current availability and quality of contact information for Returned Peace Corps Volunteers will be a major challenge to implementing this survey successfully. Among the eligible population, current email addresses are available for 86% of Returned Volunteers (56% of the 2003 cohort, 99% of the 2013 cohort, and 99% of the 2018 cohort). Email addresses are considered current if Returned Volunteers have chosen to disclose this information to the Peace Corps, typically when accessing Returned Volunteer Services. Email addresses are also collected during the initial Peace Corps application process. Mailing addresses are available for 98% of participants, though these records were collected during the original Volunteer application process.

One option would be to limit the sampling frame to potential participants with a known current email address. However, since the acquisition of an up-to-date email address currently requires that Returned Volunteers engage with the Peace Corps at some point post-service, limiting the sampling frame in the manner described may result in a sample prone to self-selection bias in favor of potential respondents who are inclined to stay connected with the Peace Corps. For this reason, we intend to select the sample independently of contact list availability. Section B3 describes the steps to be taken to improve the quality of contact information and increase the likelihood of successfully reaching a participant.

The desired sample sizes for the four cohort strata are shown in Exhibit B2, below.

Exhibit B2. Desired Sample Size for the Peace Corps Returned Volunteer Impact Survey

Cohort Year            Sample Size
2003 (21 years out)    288
2013 (11 years out)    347
2018 (6 years out)     329
2021 (3 years out)     -
Total (combined)       964

Note: Sample size refers to completed surveys.

Expected Response Rate

The 2020 and 2022 iterations of the RPCV Impact Survey provide the best basis for calculating the expected response rate for the 2024 survey. We use the combined response rates from those iterations to derive expected response rates by cohort, listed below in Exhibit B3, alongside the estimated number of survey invitations required to achieve the desired sample size.
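The invitation counts below follow from dividing each cohort's desired number of completed surveys (Exhibit B2) by its expected response rate and rounding up. A quick sketch of that arithmetic:

```python
import math

# Desired completed surveys (Exhibit B2) and combined response rates (Exhibit B4)
cohorts = {
    "2003 (21 years out)": (288, 0.16),
    "2013 (11 years out)": (347, 0.16),
    "2018 (6 years out)":  (329, 0.19),
}

total = 0
for cohort, (completes, rate) in cohorts.items():
    invitations = math.ceil(completes / rate)  # round up to whole invitations
    total += invitations
    print(f"{cohort}: {invitations:,} invitations")
print(f"Total: {total:,}")  # 5,701
```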

Exhibit B3. Estimated number of invitations required for the Peace Corps Returned Volunteer Impact Survey

Cohort Year            Desired Sample Size    Estimated Response Rate    Estimated Number of Invitations Required
2003 (21 years out)    288                    16%                        1,800
2013 (11 years out)    347                    16%                        2,169
2018 (6 years out)     329                    19%                        1,732
2021 (3 years out)     -                      -                          -
Total                  964                    17%                        5,701



Actual Response Rate in Previous Data Collection

Overall, the combined response rate for the 2020 and 2022 iterations of the RPCV survey was 17%, with some relatively minor variation by cohort. Exhibit B4 shows the actual response rate for each survey period, as well as the combined response rate by cohort used to create the estimated response rate for the 2024 iteration of the survey.

Exhibit B4. Prior RPCV Impact Surveys - Actual response rates by survey period

Cohort Year     2020 Survey    2022 Survey    Combined (2020 and 2022)
21 years out    21%            10%            16%
11 years out    17%            14%            16%
6 years out     19%            20%            19%
3 years out     22%            23%            22%
Total           20%            15%            18%



B2. Describe the procedures for the collection of information, including: Statistical methodology for stratification and sample selection; Estimation procedure; Degree of accuracy needed for the purpose described in the justification; Unusual problems requiring specialized sampling procedures; and any use of periodic (less frequent than annual) data collection cycles to reduce burden.

Statistical Methodology for Stratification and Sample Selection

The desired sample sizes shown in Exhibits B2 and B3 were calculated using proportional allocation. Overall sample size is determined by:

n = (N Z² P(1-P)) / (d² (N-1) + Z² P(1-P))

where

n = Total sample size (with finite population correction)

N = Total population size

Z = Z statistic (here, set for a level of confidence of 95%, or Z = 1.96)

P = Expected proportion (here, set to 0.5)

d = Precision (here, set to a margin of error of 3%, or d = 0.03)




Stratum sample sizes (n_h) are then allocated to the three strata formed from the three cohorts (2003, 2013, and 2018). For a given stratum, the sample size is determined by:

n_h = ( N_h / N ) * n

where

n_h = Sample size for stratum h

N_h = Population size for stratum h

N = Total population size

n = Total sample size
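As a worked check of the two formulas above, the sketch below computes the FPC-adjusted total sample size and allocates it proportionally across the three cohorts using the Exhibit B1 populations. Rounding each stratum up is our assumption; it reproduces the totals in Exhibit B2.

```python
import math

Z, P, d = 1.96, 0.5, 0.03   # 95% confidence, worst-case proportion, 3% precision
strata = {"2003": 2936, "2013": 3542, "2018": 3357}   # Exhibit B1 populations
N = sum(strata.values())    # 9,835 eligible participants

# Total sample size with the finite population correction
n = (N * Z**2 * P * (1 - P)) / (d**2 * (N - 1) + Z**2 * P * (1 - P))

# Proportional allocation, rounding each stratum up
allocation = {h: math.ceil(Nh / N * n) for h, Nh in strata.items()}
print(allocation, sum(allocation.values()))
# {'2003': 288, '2013': 347, '2018': 329} 964
```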


Within each cohort, the sample will be selected using a systematic random sample with implicit stratification. The sampling frame will be sorted by available demographic and programmatic data, with missing data included (e.g., sex would consist of female, male, and missing) and, at the lowest level, by a randomly generated number. This procedure reduces design effects compared to a simple random sample by ensuring proportional representation of groups within the stratum.
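The procedure just described can be sketched as follows: sort the frame on the stratification variables (missing values form their own category), break ties randomly, and take every k-th record from a random start. The field names in the example are illustrative, not the actual frame layout.

```python
import random

def systematic_sample(frame, n, sort_keys, seed=2024):
    """Systematic random sample with implicit stratification: sort the frame
    on the stratification variables (missing values form their own category),
    break ties randomly, then take every k-th record from a random start."""
    rng = random.Random(seed)
    ordered = sorted(
        frame,
        key=lambda r: tuple(str(r.get(k, "missing")) for k in sort_keys)
                      + (rng.random(),),
    )
    k = len(ordered) / n          # sampling interval
    start = rng.uniform(0, k)     # random start within the first interval
    return [ordered[int(start + i * k)] for i in range(n)]

# Illustrative frame; "sex" and "sector" are hypothetical field names.
frame = [{"id": i,
          "sex": ("F", "M", None)[i % 3],
          "sector": ("Education", "Health")[i % 2]}
         for i in range(3357)]
sample = systematic_sample(frame, 329, ["sex", "sector"])
print(len(sample))   # 329 distinct records, spread proportionally across groups
```

Because the frame is sorted before the systematic pass, each demographic group is hit roughly in proportion to its share of the stratum, which is what reduces the design effect relative to simple random sampling.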

Exhibit B5, below, summarizes the previous tables, showing the relationship among (a) the universe of eligible participants, (b) the required sample to be drawn in order to reach (c) the desired number of completed surveys.

Exhibit B5. Population and sample overview for the Peace Corps Returned Volunteer Impact Survey

Cohort Year            Eligible Participants    Estimated Number of Invitations Required    Desired Sample Size
2003 (21 years out)    2,936                    1,800                                       288
2013 (11 years out)    3,542                    2,169                                       347
2018 (6 years out)     3,357                    1,732                                       329
Total                  9,835                    5,701                                       964

Estimation Procedures

In order to obtain valid survey estimates, estimation will be done using properly weighted survey data. The weight to be applied to each respondent is a function of the overall probability of selection, and appropriate nonresponse and post-stratification ratio adjustments. Thus, a respondent’s weight may incorporate the following adjustments:

  • Sampling design – The planned sampling rate of each cohort differs due to differences in estimated response rate. Thus, results may have to be adjusted to account for actual survey participation relative to expected survey participation for a given cohort. Population size adjustment is implicit since initial cohort sample allocation utilized a proportional approach.

  • Nonresponse – Furthermore, results will be adjusted for differences in participation levels within cohorts, due to demographics and other characteristics. Logistic regression models will be used to calculate propensity scores of survey response driven by respondent demographics and other characteristics available for analysis. These scores will be used to create nonresponse weighting classes for statistical adjustment.

Nonresponse weights will be combined with sampling design weights to create the final weights for the analysis. As a final step, a raking algorithm such as Random Iterative Method (RIM) will be employed to adjust the calculated weights to ensure that weighted totals match those of the population on the basis of demographic makeup and other characteristics.
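The raking step can be illustrated with a minimal hand-rolled sketch; the margins, categories, and starting weights below are hypothetical, and production work would use a dedicated survey package rather than this loop.

```python
def rake(rows, weights, margins, iters=50, tol=1e-8):
    """Iteratively adjust weights so weighted category totals match
    population margins (iterative proportional fitting).
    rows: list of dicts of categorical variables; weights: starting weights;
    margins: {variable: {category: population_total}}."""
    w = list(weights)
    for _ in range(iters):
        max_shift = 0.0
        for var, targets in margins.items():
            # Current weighted total in each category of this variable
            totals = {c: 0.0 for c in targets}
            for r, wi in zip(rows, w):
                totals[r[var]] += wi
            # Scale weights so each category hits its population target
            for i, r in enumerate(rows):
                factor = targets[r[var]] / totals[r[var]]
                w[i] *= factor
                max_shift = max(max_shift, abs(factor - 1))
        if max_shift < tol:   # stop once all margins are matched
            break
    return w

# Hypothetical example: two binary margins over four respondents
rows = [{"sex": "F", "cohort": "2003"}, {"sex": "M", "cohort": "2003"},
        {"sex": "F", "cohort": "2013"}, {"sex": "M", "cohort": "2013"}]
margins = {"sex": {"F": 60.0, "M": 40.0},
           "cohort": {"2003": 45.0, "2013": 55.0}}
w = rake(rows, [1.0] * 4, margins)
```

After raking, the weighted totals for each sex and each cohort match the stated margins simultaneously.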

Degree of Accuracy

A minimum of 964 completed responses will be collected. Assuming no design effect and a proportion estimate of 50%, point estimates based on the total sample will have a margin of error of about 3.0% at a 95% level of confidence. The accuracy and reliability of the information collected in this survey will be adequate for point estimates based on the total Peace Corps survey sample. Comparisons involving point estimates of subsets of the total Peace Corps survey sample will be less precise but may still be acceptable depending on final response rates.

It may also be useful for the agency to compare results between cohorts. Assuming that the minimum number of completed responses is collected, cohort-level margins of error (for a 50% proportion estimate with no design effect) are listed in Exhibit B6:

Exhibit B6. Estimated 95% Confidence Intervals for the Peace Corps Returned Volunteer Impact Survey

Cohort Year    Margin of Error (assuming minimum sample size)
2003           5.5%
2013           5.0%
2018           5.1%
Total          3.0%

Margins of error for a statistic of 50% shown; margins of error will be lower for statistics above or below 50%.
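The cohort-level margins of error above can be reproduced with the standard formula for a proportion under simple random sampling, applying the finite population correction (consistent with the FPC-adjusted sample size formula in Section B2):

```python
import math

def margin_of_error(n, N, p=0.5, z=1.96):
    """95% margin of error for a proportion, with finite population correction."""
    fpc = math.sqrt((N - n) / (N - 1))
    return z * math.sqrt(p * (1 - p) / n) * fpc

# Cohort sample sizes and population sizes from the exhibits above
for label, n, N in [("2003", 288, 2936), ("2013", 347, 3542),
                    ("2018", 329, 3357), ("Total", 964, 9835)]:
    print(f"{label}: {margin_of_error(n, N):.1%}")
# 2003: 5.5%  2013: 5.0%  2018: 5.1%  Total: 3.0%
```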

The minimum detectable differences for each cohort comparison are listed in Exhibit B7:

Exhibit B7. Minimum Detectable Cohort Point Differences for the Peace Corps Returned Volunteer Impact Survey

Cohort Year Comparison    Minimum Detectable Percentage Point Difference (assuming minimum sample size)
2003 - 2013               10.5%
2003 - 2018               10.6%
2013 - 2018               10.1%


Assuming that only the minimum response targets are met, the survey, as designed, will be sensitive enough to detect large point-estimate differences between cohorts. This survey’s goals are focused on examining the incremental impact of the Peace Corps on Americans who chose to serve, regardless of cohort. Nevertheless, the ability to detect small differences between cohorts may be limited. That said, sensitivity will increase if survey participation exceeds expectations.
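The minimum detectable differences above are consistent with a two-sample comparison of proportions at a two-sided alpha of 0.05 with roughly 80% power and per-stratum finite population corrections; the power level is our assumption, not stated in the text, so the sketch below reproduces the exhibit values only approximately (to within about 0.1 percentage points):

```python
import math

def min_detectable_diff(n1, N1, n2, N2, p=0.5, alpha_z=1.96, power_z=0.8416):
    """Approximate minimum detectable difference between two cohort proportions.
    Assumes a two-sided test at alpha = 0.05 with 80% power (an assumption;
    the supporting statement does not state the power level)."""
    var1 = p * (1 - p) / n1 * (N1 - n1) / (N1 - 1)   # FPC-adjusted variance
    var2 = p * (1 - p) / n2 * (N2 - n2) / (N2 - 1)
    return (alpha_z + power_z) * math.sqrt(var1 + var2)

cohorts = {"2003": (288, 2936), "2013": (347, 3542), "2018": (329, 3357)}
for a, b in [("2003", "2013"), ("2003", "2018"), ("2013", "2018")]:
    print(f"{a} vs {b}: {min_detectable_diff(*cohorts[a], *cohorts[b]):.1%}")
```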

Standard errors will be computed using statistical software, such as the R “survey” package for Analysis of Complex Survey Samples, which accounts for complex survey designs.

Unusual problems requiring specialized sampling procedures

Unusual problems requiring specialized sampling procedures are not anticipated at this time. All necessary steps to maximize response rates will be taken throughout the data collection period. Weighting procedures will also help address potential issues with nonresponse. However, a low response rate may necessitate an additional sub-sample of completed surveys from non-respondents in order to assess whether nonresponse bias is present in the sample.

Use of periodic data collection cycles to reduce burden

A participant will respond to the survey only once, and the burden on respondents will be low (estimated at 15 minutes per respondent). Thus, a less frequent data collection cycle is not considered necessary.

B3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.

The Peace Corps Returned Volunteer Impact Survey will employ a number of strategies to maximize response rates while maintaining cost control. As described in Section B1, the availability and quality of contact information for Returned Peace Corps Volunteers may be a major driver of nonresponse. Furthermore, it is likely that restricting the sampling frame to participants with a known current email address would result in bias due to the non-random nature of email address coverage of Returned Peace Corps Volunteers. The Peace Corps will select the sample independently of contact list availability and then attempt to identify any missing email and postal addresses for the selected participants. If contact information is still unavailable for a selected participant, a replacement may be selected, if appropriate, by utilizing matching analysis (either propensity score matching or proximity matching) to identify an unselected eligible candidate that most closely resembles the selection to be replaced.
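One simple way to implement the replacement step described above is a nearest-neighbor match on frame characteristics. The sketch below uses a count of mismatched categorical variables as the distance; the field names are hypothetical, and an actual implementation might match on estimated propensity scores instead.

```python
def nearest_replacement(target, candidates, features):
    """Pick the unselected candidate closest to the unreachable selection,
    using the number of mismatched categorical features as the distance.
    `features` lists the frame variables to match on (hypothetical names)."""
    def distance(c):
        return sum(c.get(f) != target.get(f) for f in features)
    return min(candidates, key=distance)

# Hypothetical frame records
target = {"id": 17, "sex": "F", "region": "Africa", "sector": "Education"}
pool = [
    {"id": 41, "sex": "F", "region": "Africa", "sector": "Health"},
    {"id": 52, "sex": "M", "region": "Africa", "sector": "Education"},
    {"id": 63, "sex": "F", "region": "Africa", "sector": "Education"},
]
best = nearest_replacement(target, pool, ["sex", "region", "sector"])
print(best["id"])  # 63 -- exact match on all three features
```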

The survey will be administered online, with participants being able to complete the survey on any internet-connected electronic device with a web browser. Respondents may also take more than one session to complete the survey. Survey communications will be by email for selected participants with a current email address available, whereas survey communications will be via printed mail for selected participants with only a current postal address.

Sampled participants with known current email addresses will be sent an invitation email at the survey launch date. The email will describe the intent of the survey and will include URL and QR code links to the survey unique to that participant. The email will utilize a peacecorps.gov sending address and will be signed by the Director of the Peace Corps in order to emphasize the legitimacy of the data collection request. An email reminder that also includes the participant’s unique URL link and QR code will be sent to non-respondents two weeks after the launch date. One reminder is currently planned in order to match the planned reminder frequency for participants invited via postal mailing. However, we may increase the frequency of reminders if warranted.

An invitation letter will be mailed to sampled participants for whom a current email address is not available. The invitation letter will be mailed on Peace Corps stationery and will be signed by the Director of the Peace Corps. The content of the letter will match that of the emailed invitation, also containing the participant’s unique URL for the survey and including a QR code that can be used by respondents to go to their unique URL without having to manually enter the address. One reminder letter, mailed on Peace Corps stationery, matching the content of the email reminder and also containing the unique survey URL and QR code links will be sent to non-respondents two weeks following the invitation letter. Due to postage costs, one mailed reminder is currently planned. However, we may increase the frequency of reminders if warranted.

The survey field period is currently scheduled to last six weeks. This period may be extended in order to increase participation, if warranted.

To enhance response rates further, the Peace Corps will work with external stakeholder groups such as the National Peace Corps Association and Returned Peace Corps Volunteer affinity groups to publicize the survey prior to launch and encourage selected participants to respond.

To assess the impact of nonresponse bias in our study, we will conduct statistical analysis to identify any characteristics of respondents that are correlated with response. As discussed in Section B2, propensity scores from the logistic regression of survey response on explanatory variables will be used to create nonresponse weighting classes for statistical adjustment. These nonresponse weights will be combined with sampling weights based on our stratification plan to create the final weights for our analysis. These calculated weights will be further calibrated to ensure that weighted totals match those of the population on the basis of demographic makeup and other characteristics.

B4. Describe any tests of procedures or methods to be undertaken.

Per Part A of this justification, the proposed survey instrument was designed to collect required data efficiently, with content heavily drawn from other surveys acknowledged to be of high quality, with conceptualizations, methodology, and content already developed and rigorously assessed for validity. Original question wording and construction were retained to the greatest extent possible. A fully functional prototype of the proposed survey instrument was programmed on the online survey management software platform to be used in the actual data collection. This prototype underwent small-scale testing by agency employees who were also Returned Peace Corps Volunteers. Initial testing indicated a respondent burden of about 15 minutes, with no issues reported with regards to clarity of survey instructions or content.

B5. Provide the name and telephone number of individuals consulted on statistical aspects of the design, and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.

Peace Corps staff were responsible for the design of the Peace Corps Returned Volunteer Impact Survey, and will be responsible for the data collection’s administration, analysis, and deliverables production.





The project lead at the Peace Corps for this data collection is:

David Holt

Management Analyst, Peace Corps Office of Strategic Information, Research and Planning (OSIRP)

+1 (202) 692-2225

dholt@peacecorps.gov


Other members of the Peace Corps were also involved in the methodological and statistical design of the survey. They are:


Luke Douglas

Former Acting Director, Peace Corps Office of Strategic Information, Research and Planning (OSIRP)

+1 (202) 692-1806 (No longer active)


Ryan Cristal

Information Technology Specialist, Peace Corps Office of the Chief Information Officer

+1 (202) 692-2693

rcristal@peacecorps.gov


Stakeholders both internal and external to the Peace Corps were consulted on the content included in the data collection instrument. Contacts at other federal agencies were consulted on the proposed methodological and statistical design. However, the Peace Corps bears ultimate responsibility for this data collection’s design and execution.


Appendix B1.1. Survey Invitation Draft (Email)

Peace Corps Returned Volunteer Impact Survey – Survey Invitation Draft (Email)


To: Survey invitees with known email addresses

From: “Peace Corps Returned Volunteer Impact Survey” (RPCVsurvey@peacecorps.gov)

When: Survey launch

Subject: Peace Corps Returned Volunteer Impact Survey





Dear [$First Name],


Thank you for your service.

I am writing to ask for your participation in the 2024 Returned Peace Corps Volunteer Impact Survey. Through the survey, we hope to learn how you and other Returned Volunteers continue to benefit from your Peace Corps experience. We want to better understand how your service influences your well-being and the well-being of the people you connect with, whether it is through your career path, community engagement, or commitment to public service.


The information gathered will be used to improve the Peace Corps experience not just for those who have returned from service, but also for current and future Volunteers. Your responses will also help the Peace Corps, in its position as a federal agency, continue to tell its story to Congress and the American people.


You were selected because you returned in [$year] from your [$add “first” if more than one service] Peace Corps Volunteer service in [$post]. The survey will take approximately 15 minutes to complete. If you have any questions about the survey or need assistance, please email [$mailto: “RPCVsurvey@peacecorps.gov”].


To take the online survey, please click the link below, or copy and paste it to your internet browser:

[$URL]


Or you may scan this QR code with your smartphone:

[$QR]


Again, thank you for your service as a Returned Peace Corps Volunteer. We look forward to hearing from you!

Sincerely,


<signature>


Carol Spahn

Director




To opt out of receiving future emails about this survey, please click [$URL:”here”].







Appendix B1.2. Survey Invitation Draft (Print Mailing)

Peace Corps Returned Volunteer Impact Survey – Survey Invitation Draft (Print Mailing)


To: Survey invitees without a known email address, but with a known current mailing address

When: Survey launch




October xx, 2024


[$Full Name]

[$Mailing Address]


Dear [$First Name],


Thank you for your service.

I am writing to ask for your participation in the 2024 Returned Peace Corps Volunteer Impact Survey. Through the survey, we hope to learn how you and other Returned Volunteers continue to benefit from your Peace Corps experience. We want to better understand how your service influences your well-being and the well-being of the people you connect with, whether it is through your career path, community engagement, or commitment to public service.


The information gathered will be used to improve the Peace Corps experience not just for those who have returned from service, but also for current and future Volunteers. Your responses will also help the Peace Corps, in its position as a federal agency, continue to tell its story to Congress and the American people.


You were selected because you returned in [$year] from your [$add “first” if more than one service] Peace Corps Volunteer service in [$post]. The survey will take approximately 15 minutes to complete. If you have any questions about the survey or need assistance, please email [$mailto: “RPCVsurvey@peacecorps.gov”].


To take the online survey, please enter the following link on your internet browser:

[$URL]


Or you may scan this QR code with your smartphone:

[$QR]


Again, thank you for your service as a Returned Peace Corps Volunteer. We look forward to hearing from you!

Sincerely,


<signature>


Carol Spahn

Director





Appendix B2.1. Survey Reminder Draft (Email)

Peace Corps Returned Volunteer Impact Survey – Survey Reminder Draft (Email)


To: Survey non-respondents with known email addresses

From: “Peace Corps Returned Volunteer Impact Survey” (RPCVsurvey@peacecorps.gov)

When: Two weeks after survey launch

Subject: Reminder: Peace Corps Returned Volunteer Impact Survey





Dear [$First Name],


We recently invited you to take part in the 2024 Returned Peace Corps Volunteer Impact Survey. We really value your experiences and opinions and hope that you will be able to take part in this important survey.


To take the online survey, please click the link below, or copy and paste it to your internet browser:

[$URL]


Or you may scan this QR code with your smartphone:

[$QR]


The survey asks about how you and other Returned Volunteers continue to benefit from your Peace Corps service and how that experience influences your well-being, career, community engagement, and public service. The information gathered will be used to improve the Peace Corps experience for current, returned, and future Peace Corps Volunteers; and to aid the Peace Corps, in its position as a federal agency, in telling its story to Congress and the American people.


You were selected because you returned in [$year] from your [$add “first” if more than one service] Peace Corps Volunteer service in [$post]. The survey will take approximately 15 minutes to complete. If you have any questions about the survey or need assistance, please email [$mailto: “RPCVsurvey@peacecorps.gov”].


Thank you for your service as a Returned Peace Corps Volunteer. We look forward to hearing from you!

Sincerely,


<signature>


Carol Spahn

Director






To opt out of receiving future emails about this survey, please click [$URL:”here”].



Appendix B2.2. Survey Reminder Draft (Print Mailing)

Peace Corps Returned Volunteer Impact Survey – Survey Reminder Draft (Print Mailing)


To: Survey non-respondents without a known email address, but with a known current mailing address

When: Two weeks after survey launch





November xx, 2024


[$Full Name]

[$Mailing Address]


Dear [$First Name],


We recently invited you to take part in the 2024 Returned Peace Corps Volunteer Impact Survey. We really value your experiences and opinions and hope that you will be able to take part in this important survey.


To take the online survey, please enter the following link on your internet browser:

[$URL]


Or you may scan this QR code with your smartphone:

[$QR]


The survey asks about how you and other Returned Volunteers continue to benefit from your Peace Corps service and how that experience influences your well-being, career, community engagement, and public service. The information gathered will be used to improve the Peace Corps experience for current, returned, and future Peace Corps Volunteers; and to aid the Peace Corps, in its position as a federal agency, in telling its story to Congress and the American people.


You were selected because you returned in [$year] from your [$add “first” if more than one service] Peace Corps Volunteer service in [$post]. The survey will take approximately 15 minutes to complete. If you have any questions about the survey or need assistance, please email RPCVsurvey@peacecorps.gov.


Thank you for your service as a Returned Peace Corps Volunteer. We look forward to hearing from you!

Sincerely,


<signature>


Carol Spahn

Director





File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Cristal, Ryan
File Created: 2024-07-20
