AMERICORPS ALUMNI OUTCOMES SURVEY
SUPPORTING STATEMENT FOR PAPERWORK REDUCTION ACT SUBMISSIONS
B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS
There is no current plan for sampling, as this project is currently being designed. The potential project will replicate the process and analysis that were employed for the 2015 project. Sample estimates and burden are based on highly stable rates and figures from the previous study. The following methodology and description from the previous study will be replicated; however, the time frames for the new cohorts and strata will be 2008-2010, 2013-2015, and 2016-2018:
The potential respondent universe for the AmeriCorps Alumni Outcomes Survey consists of alumni of AmeriCorps programs who meet the following criteria. Sample size calculations are based on members whose last term of service ended in 2005, 2010, or 2013, corresponding to 10, 5, and 2 years before the original survey administration;
For whom a corporate code indicating the program in which they participated during their last term of service is available. These codes match the following AmeriCorps programs: AmeriCorps State and National (ASN), including America’s Promise, Education Award Program (EAP), Homeland Security, and Tribes and Territories; National Civilian Community Corps (NCCC); and Volunteers in Service to America (VISTA);
For whom total length of service across all terms of service is greater than or equal to 84 days full time equivalency (FTE), which is equivalent to completing at least one reduced half time term of service;
For whom length of service in most recent term was greater than zero days;
For whom total length of service did not exceed ((exit year of last term - year of birth - x) × 365.25), where x = minimum age of service - 1. For individuals who served multiple terms or who served one term with ASN, minimum age of service of 17 was used. For individuals who served one term with NCCC or VISTA, minimum age of service of 18 was used. One was subtracted from minimum age of service to account for rounding error due to exit year only being available without month and date;
For whom length of service of most recent term did not exceed ((exit year of last term - year of birth - x) × 365.25), where x is as defined above and minimum age of service is 17 for people whose last term of service was in ASN and 18 for people whose last term of service was in NCCC or VISTA.
The restriction to participants with a total service time greater than or equal to 84 days FTE is to ensure that participants had sufficient exposure to the AmeriCorps program for it to have an impact on them.
The restriction to participants with a service time greater than zero days is to ensure that at least one term of service was with an eligible program. Information besides total length of service is only available regarding the most recent program; which programs the participant served in during earlier terms are not known.
The maximum-length-of-service restriction is designed to remove records with data entry errors.
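To make the eligibility screens above concrete, the following is a minimal sketch in Python. The record layout and field names are illustrative assumptions, not the actual frame variables.

```python
# Sketch of the universe restrictions described above; field names are
# hypothetical placeholders for the administrative frame variables.

ELIGIBLE_PROGRAMS = {"ASN", "NCCC", "VISTA"}
SINGLE_TERM_MIN_AGE = {"ASN": 17, "NCCC": 18, "VISTA": 18}

def is_eligible(rec):
    """Apply the eligibility screens to one alumni record (a dict)."""
    if rec["last_program"] not in ELIGIBLE_PROGRAMS:
        return False
    if rec["total_fte_days"] < 84:           # at least one reduced half-time term
        return False
    if rec["last_term_days"] <= 0:
        return False
    # Maximum plausible service: multi-term or ASN uses minimum age 17,
    # single-term NCCC/VISTA uses 18, minus 1 year for exit-year rounding.
    min_age = 17 if rec["n_terms"] > 1 else SINGLE_TERM_MIN_AGE[rec["last_program"]]
    cap = (rec["exit_year"] - rec["birth_year"] - (min_age - 1)) * 365.25
    if rec["total_fte_days"] > cap or rec["last_term_days"] > cap:
        return False
    return True

# Example record that passes all screens
print(is_eligible({"last_program": "VISTA", "total_fte_days": 365,
                   "last_term_days": 365, "n_terms": 1,
                   "exit_year": 2013, "birth_year": 1990}))
```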
The estimated universe is stratified by time since last service (10, 5, and 2 years; i.e., the 2008-10, 2013-15, and 2016-18 alumni cohorts) and program (ASN, NCCC, and VISTA). Exhibit B1 shows the size of the universe.
Exhibit B1. Estimated Universe Size for AmeriCorps Alumni Outcomes Survey Based on Previous Study

| Cohort | ASN | NCCC | VISTA | Total |
|--------|--------|-------|--------|--------|
| 2005 | 17,628 | 883 | 4,493 | 23,004 |
| 2010 | 24,954 | 730 | 6,294 | 31,978 |
| 2013 | 18,483 | 1,307 | 4,574 | 24,364 |
| Total | 61,065 | 2,920 | 15,361 | 79,346 |
The sample drawn will be limited to participants for whom an email address and either a mailing address or a phone number is available. The estimated size of the population from which the sample will be drawn is shown in Exhibit B2 below. These estimates are based on the population and sample from the previous study. The rationale for restricting the sampling frame to participants with contact information on file, which involves a trade-off between coverage error on the one hand and nonresponse and sampling error on the other, is discussed following Exhibit B2.
Exhibit B2. Estimated Sampling Frame Size for AmeriCorps Alumni Survey Based on Previous Study
| Cohort | ASN | NCCC | VISTA | Total |
|--------|--------|-------|--------|--------|
| 2005 | 7,566 | 527 | 2,137 | 10,230 |
| 2010 | 24,805 | 729 | 6,277 | 31,811 |
| 2013 | 18,479 | 1,304 | 4,573 | 24,356 |
| Total | 50,850 | 2,560 | 12,987 | 66,397 |
The focus on participants for whom contact information is available represents a trade-off between coverage error on the one hand and nonresponse and sampling error on the other. Coverage error is discussed first below, followed by nonresponse and sampling error.
Due to the exclusion of participants without sufficient contact information, the population from which the sample will be drawn is smaller than the population as a whole, creating potential for coverage error (i.e., systematic differences in estimands between the population covered and the population not covered, leading to misestimates).
The risk of coverage error is, however, offset by a reduced risk of nonresponse error and a reduction in sampling error, which we discuss in turn. We anticipate that response rates will be higher for a sample for which additional contact information is available, reducing the risk of nonresponse error (i.e., systematic differences in estimates of interest between respondents and nonrespondents to the survey, leading to misestimates). Although increases in response rates do not guarantee a reduction in nonresponse error (cf. Lin and Schaeffer 1995), it is nevertheless true that as response rates increase, the potential bias from systematic differences between respondents and nonrespondents decreases. With respect to sampling error (i.e., nonsystematic differences in estimates of interest between the specific realization of the sampling scheme and the underlying population), including individuals with less complete contact information in the sampling frame would reduce the achievable sample size for a given budget, increasing sampling error.
We recognize that restricting the sampling frame to participants with an email address and either a phone number or mailing address is not a complete solution to the problem of contact information. Much of the contact information in the sampling frame was collected when the participant applied to AmeriCorps and has not been updated since. As we describe in section B3, we have included multiple steps to improve the quality of contact information and increase the likelihood of successfully reaching a participant.
The desired sample sizes for the nine program by cohort strata are shown in Exhibit B3, below.
Exhibit B3. Desired Sample Size for AmeriCorps Alumni Outcomes Survey Based on Previous Study

| Cohort | ASN | NCCC | VISTA | Total |
|--------|-------|------|------|-------|
| 2005 | 491 | 153 | 182 | 826 |
| 2010 | 613 | 176 | 356 | 1,145 |
| 2013 | 551 | 357 | 271 | 1,179 |
| Total | 1,655 | 686 | 809 | 3,150 |

Note: Completed surveys.
We aim to achieve a response rate of approximately 50%. A higher response rate is anticipated for this administration than for the previous study based on the following factors:
Sample will be released in a single replicate.
The survey makes additional use of telephone interviews. Telephone interviewers will be instructed to complete the survey over the phone where possible; sending an email containing a link to the survey, or reading the survey link over the phone, will be used only where the respondent is unable or unwilling to complete the interview over the phone. The protocol encourages completion of the survey over the phone because only 39% of people emailed the link by an interviewer completed the survey online in 2015.
A 20% response rate was achieved in the 2015 AmeriCorps Alumni Outcomes Survey (CNCS 2015). In 2015, sample was released in three replicates, termed “phases” by CNCS (2015). The final two replicates did not have sufficient time to be worked fully. This is reflected in response rates by replicate. The first replicate achieved a 26% response rate; the second achieved a 19% response rate; the third achieved a 9% response rate.
Statistical Methodology for Stratification and Sample Selection
The desired sample sizes shown in Exhibit B3 were calculated using optimal allocation procedures. Sampling fractions ($f_h$) are allocated to the nine strata formed from the three cohorts (2005, 2010, 2013) and three programs (ASN, NCCC, and VISTA), together with the subsampling rate for cases with bad email addresses, so as to minimize the importance-weighted sum of the minimum detectable differences for all pairwise comparisons of cohorts and of programs, plus a penalty on inequality among those minimum detectable differences, subject to constraints on the total number of completed interviews and on the feasible sampling fraction within each stratum, where:

$n_h$ is the sample size of completed interviews and $N_h$ the population size for the $h$th stratum;

$c$ indexes the pairs of cohorts (2005-2010, 2005-2013, 2010-2013) and $p$ indexes the pairs of programs (ASN-NCCC, ASN-VISTA, and NCCC-VISTA);

$d$ is the minimum detectable difference of two proportions, calculated around a proportion of $\pi = 0.5$, where the variance $\pi(1-\pi)$ is at a maximum:

$d_{ij} = \left(t_{1-\alpha/2} + t_{1-\beta}\right)\sqrt{\pi(1-\pi)\left(\frac{\mathrm{fpc}_i}{\tilde{n}_i} + \frac{\mathrm{fpc}_j}{\tilde{n}_j}\right)},$

and is defined similarly for the subsamples formed by each pair of cohorts within a program and each pair of programs within a cohort;

$\mathrm{fpc}_i$ is the finite population correction for the $i$th subsample;

$\tilde{n}_i$ is the effective sample size of completed interviews in the $i$th subsample, calculated using the approximation $\tilde{n}_i = n_i/(1 + \mathrm{cv}_i^2)$, where $\mathrm{cv}_i$ is the coefficient of variation of the weights in the $i$th subsample;

$t_q$ is the $q$th quantile of the $t$ distribution for a sample of size $\tilde{n}_i$; the $t$ distribution is used in preference to the normal distribution due to the small size of some subsamples;

$\alpha$ is the probability of Type I error, set at 0.05;

$\beta$ is the probability of Type II error, set at 0.20;

$w_c$ and $w_p$ are importance weights for cohorts and programs used to weight the relative importance of precision for each; and

the penalty term is a function of the deviation of each minimum detectable difference from $\bar{d}$, the mean of $d$ within program or cohort, as appropriate, scaled by a constant $\lambda$. Where $\lambda$ is higher, inequality between minimum detectable differences will more heavily penalize the objective function.
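As an illustration of the precision calculation that this allocation optimizes over, the sketch below computes the minimum detectable difference between two subsamples under the definitions above ($\pi = 0.5$, $\alpha = 0.05$, $\beta = 0.20$, Kish effective sample sizes, finite population corrections, and $t$ quantiles). It is a sketch only: the coefficients of variation of the weights and the degrees-of-freedom rule are assumptions, and the allocation's exact objective and constraints are not reproduced here.

```python
# Illustrative minimum detectable difference (MDD) for two subsamples.
from scipy.stats import t

def effective_n(n_complete, cv_weights):
    """Kish approximation: n_eff = n / (1 + cv^2)."""
    return n_complete / (1.0 + cv_weights ** 2)

def mdd(n1, N1, cv1, n2, N2, cv2, p=0.5, alpha=0.05, beta=0.20):
    """Minimum detectable difference of two proportions between subsamples 1 and 2."""
    ne1, ne2 = effective_n(n1, cv1), effective_n(n2, cv2)
    fpc1, fpc2 = 1.0 - n1 / N1, 1.0 - n2 / N2   # finite population corrections
    df = min(ne1, ne2) - 1                      # conservative degrees of freedom
    crit = t.ppf(1 - alpha / 2, df) + t.ppf(1 - beta, df)
    var = p * (1 - p) * (fpc1 / ne1 + fpc2 / ne2)
    return crit * var ** 0.5

# Example: roughly the ASN vs. NCCC comparison from Exhibits B3/B4;
# the weight CVs (0.5 and 0.3) are assumed values for illustration.
print(round(mdd(1655, 50850, 0.5, 686, 2560, 0.3), 3))  # about 0.061
```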
Within strata, sample will be selected using a systematic random sample with implicit stratification. The sampling frame will be sorted by available demographic and programmatic data, with missing data included as a category (e.g., sex would consist of female, male, and missing) and, at the lowest level, by a randomly generated number. The demographic and programmatic fields to be used for implicit stratification are CRPP code (a more detailed measure of the specific program the participant served in for ASN participants), age upon start of last service term, number of terms served, sex, length of last term of service, term type (e.g., full-time, half-time, etc.), whether the education award was used, education, and whether the most recent term was completed. Starting from a random point, 1 in every $k$ cases will be selected, where $k$ is the stratum frame size divided by the number of cases to be drawn from the stratum. This procedure reduces design effects compared to a simple random sample by ensuring proportional representation of groups within the stratum.
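A minimal sketch of this within-stratum systematic selection with implicit stratification is shown below, assuming the stratum frame is held in a pandas DataFrame; the sort-field names are illustrative rather than the actual frame variables.

```python
# Systematic selection with implicit stratification within one stratum.
import numpy as np
import pandas as pd

def systematic_sample(frame: pd.DataFrame, n: int, sort_fields, seed=2019):
    rng = np.random.default_rng(seed)
    work = frame.copy()
    work["_rand"] = rng.random(len(work))          # lowest-level random sort key
    # Missing values sort as their own group, mirroring "missing" as a category
    work = work.sort_values(sort_fields + ["_rand"], na_position="last")
    k = len(work) / n                              # sampling interval k = frame size / n
    start = rng.uniform(0, k)                      # random starting point
    picks = np.floor(start + k * np.arange(n)).astype(int)
    return work.iloc[picks].drop(columns="_rand")

# Example with a toy frame of 120 cases, drawing 12
toy = pd.DataFrame({"sex": ["F", "M", None] * 40, "age_start": range(120)})
print(len(systematic_sample(toy, 12, ["sex", "age_start"])))
```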
Exhibit B4, below, shows the proposed sample drawn and expected number of completed surveys.
Exhibit B4. Estimated Universe and Sample Sizes for AmeriCorps Alumni Outcomes Survey Based on Previous Study

| Cohort | ASN | NCCC | VISTA | Total |
|--------|--------|-------|--------|--------|
| Population | | | | |
| 2005 | 17,628 | 883 | 4,493 | 23,004 |
| 2010 | 24,954 | 730 | 6,294 | 31,978 |
| 2013 | 18,483 | 1,307 | 4,574 | 24,364 |
| Total | 61,065 | 2,920 | 15,361 | 79,346 |
| Sampling Frame | | | | |
| 2005 | 7,566 | 527 | 2,137 | 10,230 |
| 2010 | 24,805 | 729 | 6,277 | 31,811 |
| 2013 | 18,479 | 1,304 | 4,573 | 24,356 |
| Total | 50,850 | 2,560 | 12,987 | 66,397 |
| Completed Surveys Expected | | | | |
| 2005 | 491 | 153 | 182 | 826 |
| 2010 | 613 | 176 | 356 | 1,145 |
| 2013 | 551 | 357 | 271 | 1,179 |
| Total | 1,655 | 686 | 809 | 3,150 |
Estimation Procedures
In order to obtain valid survey estimates, estimation will be done using properly weighted survey data. The weight to be applied to each respondent is a function of the overall probability of selection, and appropriate nonresponse and post-stratification ratio adjustments.
Base weights ($w^{B}$) are calculated as the inverse of the selection probability based on the sample design:

$w^{B}_{hi} = N_h / n_h,$

where $n_h$ is the number of cases selected from stratum $h$ and $N_h$ is the size of the sampling frame in stratum $h$.

Subsampling weights ($w^{S}$) for participants with email addresses are calculated as:

$w^{S}_{hi} = 1$ for participants with good email addresses and

$w^{S}_{hi} = 1/r$ for participants with bad email addresses, where $r$ is the subsampling rate for cases with bad email addresses.

Note, however, that the optimal allocation procedures generated a subsampling rate of 1 (i.e., there will be no subsampling of bad email addresses, and $w^{S}_{hi} = 1$ for participants with bad email addresses).

There will inevitably be nonrespondents to the survey, and weighting adjustments will be used to compensate for them. The nonresponse-adjusted weight ($w^{NR}$) for the $c$th weighting class will be computed as:

$w^{NR}_{ci} = w^{B}_{hi} w^{S}_{hi} \cdot \frac{\sum_{j \in A_c} w^{B}_{hj} w^{S}_{hj}}{\sum_{j \in S_c} w^{B}_{hj} w^{S}_{hj}},$

where $S_c$ is the set of completed surveys in the $c$th weighting class, $A_c$ is the set of all sampled cases in that class, and the weighting classes will be based on a propensity score model created with the goal of minimizing the bias due to nonresponse. The propensity score model will estimate the probability of response using logistic regression, with response as the dependent variable and predictors drawn from frame data. The propensity scores will be grouped into quintiles. Within each quintile class, we will ratio adjust the respondents to reflect the nonrespondents, as described above.
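The sketch below illustrates the weighting steps just described: base weights from the stratum counts, a logistic response-propensity model fit on frame data, quintile weighting classes, and a within-class ratio adjustment. The column names and model specification are assumptions for illustration, not the production weighting code.

```python
# Sketch of nonresponse adjustment via propensity-score quintile classes.
import numpy as np
import pandas as pd
import statsmodels.api as sm

def nonresponse_adjust(sample: pd.DataFrame, frame_covariates):
    df = sample.copy()
    # Base weight: inverse of the stratum selection probability (N_h / n_h)
    df["w_base"] = df["stratum_N"] / df["stratum_n"]
    # Response propensity via logistic regression on frame covariates
    X = sm.add_constant(df[frame_covariates].astype(float))
    phat = sm.Logit(df["responded"], X).fit(disp=0).predict(X)
    # Quintile weighting classes of the propensity score
    df["nr_class"] = pd.qcut(phat, 5, labels=False, duplicates="drop")
    # Ratio adjust: respondents in each class carry the class's full base weight
    total_w = df.groupby("nr_class")["w_base"].sum()
    resp_w = df[df["responded"] == 1].groupby("nr_class")["w_base"].sum()
    df["w_nr"] = np.where(df["responded"] == 1,
                          df["w_base"] * df["nr_class"].map(total_w / resp_w),
                          0.0)
    return df
```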
To help reduce possible under-coverage errors in the sampling frame and reduce possible nonresponse bias, the final estimation weights will also be calibrated to the population sizes of the strata, as well as to other frame data. Candidate variables for use in calibration include sex, education, age, CRPP code, term type, term length, whether the education award was used, and number of terms served. Race/ethnicity is not available in the sampling frame and cannot be used for calibration.
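Calibration to known population totals can be implemented by raking (iterative proportional fitting), as in the hedged sketch below. The margin names and totals would come from frame counts such as those in Exhibit B1; the function shown is illustrative rather than the calibration software that will actually be used.

```python
# Sketch of raking nonresponse-adjusted weights to known marginal totals.
import pandas as pd

def rake(df: pd.DataFrame, weight_col, margins, max_iter=50, tol=1e-6):
    """margins: dict mapping column name -> {category: population total}."""
    w = df[weight_col].astype(float).copy()
    for _ in range(max_iter):
        max_change = 0.0
        for col, totals in margins.items():
            current = w.groupby(df[col]).sum()      # weighted count per category
            for cat, target in totals.items():
                if current.get(cat, 0) > 0:
                    factor = target / current[cat]
                    w[df[col] == cat] *= factor     # scale weights toward the target
                    max_change = max(max_change, abs(factor - 1))
        if max_change < tol:                        # stop once all margins match
            break
    return w
```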
Degree of Accuracy
Analyses of interest to CNCS are comparisons between cohorts and comparisons between programs. Expected design effects and margins of error for a proportion of 0.5 are shown in Exhibit B5. Minimum detectable differences for pairs of cohorts and programs are shown in Exhibit B6 for $\alpha = 0.05$ and statistical power of 0.80 (i.e., $\beta = 0.20$). Design effects are calculated using the approximation $\mathrm{deff} \approx 1 + \mathrm{cv}^2$, where $\mathrm{cv}$ is the coefficient of variation of the weights, and have been increased by a constant factor to account for the effect of calibration.
Exhibit B5. Estimated Expected Design Effects and 95% Confidence Intervals
| Estimand | Design Effect | Margin of Error |
|----------|---------------|-----------------|
| ASN | 1.34 | 2.7% |
| NCCC | 1.13 | 4.0% |
| VISTA | 1.15 | 3.7% |
| 2005 | 1.27 | 3.7% |
| 2010 | 1.65 | 3.4% |
| 2013 | 1.63 | 3.5% |
| Total | 1.83 | 2.1% |
Note: Margins of error for a statistic of 50% shown; margins of error will be lower for statistics above or below 50%.
Exhibit B6. Estimated Expected Minimum Detectable Differences for Comparisons of Interest
| Comparison | Minimum Detectable Difference |
|------------|-------------------------------|
| 2010-2013 | 6.9% |
| 2005-2013 | 7.2% |
| 2005-2010 | 7.2% |
| ASN-NCCC | 6.3% |
| ASN-VISTA | 6.2% |
| NCCC-VISTA | 7.0% |
Notes: For a statistic of 50%; minimum detectable differences will be smaller for statistics above or below 50%. $\alpha = 0.05$; $\beta = 0.20$.
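For reference, the sketch below shows how a margin of error of the kind reported in Exhibit B5 can be approximated from a design effect and the number of completed surveys. It omits the finite population correction and the calibration adjustment, so it will not exactly reproduce the exhibit's figures.

```python
# Approximate margin of error for a proportion under a given design effect.
from scipy.stats import t

def margin_of_error(n_complete, deff, p=0.5, alpha=0.05):
    n_eff = n_complete / deff                 # effective sample size
    se = (p * (1 - p) / n_eff) ** 0.5
    return t.ppf(1 - alpha / 2, n_eff - 1) * se

# Example: total sample of 3,150 completes with a design effect of 1.83
print(round(margin_of_error(3150, 1.83), 3))  # about 0.024 under these simplifications
```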
Standard errors will be computed using statistical software that accounts for the complex survey design, such as Stata’s svy commands and the SAS SURVEY procedures.
B3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield “reliable” data that can be generalized to the universe studied.
The AmeriCorps Alumni Survey employs a number of strategies to maximize response rates while maintaining cost control, which are detailed below.
The availability and quality of contact information for AmeriCorps alumni is a major driver of nonresponse. Among the eligible population, email addresses are available for 84% of participants (44% of 2005 participants, 99% of 2010 participants, and 100% of 2013 participants), mailing addresses are available for 99% of participants (97% of 2005 participants and 100% of 2010 and 2013 participants), and phone numbers are available for 56% of participants (93% of 2005 participants, 43% of 2010 participants, and 40% of 2013 participants). As is described in section B1, an initial step in maximizing response rates is to restrict the sampling frame to cases with an email address and either a telephone number or a mailing address on file. However, much of the contact information available will be out of date. Additional steps are therefore taken to update contact information.
The contact information available for the sample will be updated in order to maximize the probability of successfully reaching sample members. All cases will be updated using LexisNexis’s Accurint service. The availability of Social Security Numbers for all participants improves match rates compared to not having this information available.
Prior to data collection, efforts will be made to verify contact information. This outreach will occur before the survey is fielded, and the results will be used to select the sample. These procedures are shown in Exhibit B7.
[Exhibit B7 (flowchart of contact information verification procedures). The flowchart shows the Accurint batch process update, the sample draw, and the verification tracks; cases can be on multiple tracks where different types of contact information are on file. Email address on file: send study description email; undeliverable email identified. Mailing address on file: send study description postcard; undeliverable addresses identified and updated from forwarding notices. Phone number on file: LNPA lookup to identify cell phones; Cell WINS check of whether cell numbers are active; landlines dialed on autodialer; bad, inactive, and unknown numbers identified.]
Participants will be sent an email describing the study; as described above, only participants with email addresses are eligible for selection. The purpose of these emails is primarily to identify undeliverable (“bounced”) email addresses. A contact line will be provided for finding out more about the study, as well as a link to the CNCS website describing the study. See Appendix B for the text of all planned participant communications.
Alumni with mailing addresses will be sent a postcard describing the study, with return service requested. A contact line will be provided for finding out more about the study, as well as a link to the CNCS website describing the study. Address forwarding notifications will be updated in the study records and undeliverable addresses will be recorded.
Phone numbers on file will be validated. Numbers will be classified as landline or cell phones based on telephone exchange and the Local Number Portability database in order to comply with the recent Federal Communications Commission ruling regulating dialing cell phones. For numbers classified as belonging to cell phones, numbers will be checked using Marketing Systems Group’s Cell WINS database to identify whether the number is active. Landline numbers will be dialed by autodialer to determine whether the number is working or not. No interviews will take place; the autodialer will only determine whether the number is working and immediately terminate the call.
Contact procedures for the survey itself are shown in Exhibit B8. These are separate from the procedures for verifying contact information shown in Exhibit B7.
[Exhibit B8 (flowchart of survey contact procedures). Send advance postcard if address available. Invitation email if a working email address is available; invitation letter with URL and QR code if a working email address is not available. Reminder email x 3. Send reminder postcard with URL if no phone; send reminder letter if address available. Phone calls if number available. Send final email if email available; send final reminder letter if email not available and no phone.]
Alumni with working email addresses will be sent an invitation email from CNCS’s GovDelivery email system approximately one week after the advance postcard is mailed. The use of a CNCS.gov email address is intended to emphasize the legitimacy of the data collection request. Three email reminders will be sent to nonrespondents at approximately one week intervals thereafter. These procedures are similar to those used in 2015, except that the length of time between reminders is increased from four days to a week and, in 2015, the invitations were sent from the contractor (see CNCS 2015 for further details; an advance email was sent via GovDelivery last year, but not a survey invitation). The longer period between email contacts is intended to reduce potential frustration on the part of alumni due to the frequency of email contacts.
An invitation letter will be sent to alumni for whom no working email address is available. The invitation letter will be mailed on AmeriCorps stationery and will be signed by AmeriCorps directors. The letter will contain a unique URL for the survey and will include a QR code that can be used by respondents to go to their unique URL without having to manually enter the address. The use of a mailed letter in 2015 was limited to alumni who could not be reached via email or phone and did not include a QR code (CNCS 2015). The letter is to be mailed in advance of calls in order to maximize cost-efficiency: surveys completed via web are less expensive than surveys completed via phone. A reminder postcard also containing the survey link will be sent to nonrespondents following the invitation letter.
To enhance response rates, a robust telephone effort will be made for alumni for whom a telephone number is available. Interviewers will place calls and will be trained to attempt to complete the survey on the telephone except where the alumnus or alumna cannot or will not do so. Although the interviewer will be able to send an email containing the alumnus or alumna’s unique URL to an email address provided by an informant, and will also be able to read out the URL upon request, completing the survey by phone is emphasized due to the relatively low response rate of 39% in 2015 for alumni who were reached by phone and requested to be emailed the study link. Interviewers will be able to update the telephone number on file if they receive new information (e.g., if the phone number is that of the alumnus or alumna’s parents).
The final contacts made for nonrespondents will be a final email reminder for alumni with working email addresses and a final reminder letter for alumni without working email addresses. The letter will be mailed on AmeriCorps stationery.
As an experiment, 50% of the sample will be offered a $2 incentive paid to the National Park Foundation on their behalf as a thank you for completing the survey. We recommend using the National Park Foundation as a recognizable, federally affiliated, politically neutral recipient of the charitable donation. The National Park Foundation, in partnership with the National Park Service, enriches America’s national parks and programs through private support, safeguarding our heritage and inspiring generations of national park enthusiasts. The National Park Foundation has a well-developed framework for applying for permission to use their name and was able to supply permission quickly.
The prior alumni survey achieved only a 20% response rate, so this experiment is proposed as both an attempt to increase the response rate to this year’s alumni survey and to advance the body of knowledge on the effectiveness of a charitable donation incentive. Although the experimental evidence for the effectiveness of charitable donations has been predominantly negative, AmeriCorps alumni may be an exception due to altruistic orientation.1 AmeriCorps alumni scored higher on sense of connection to the community, a sense of civic obligations, being active in community affairs, and volunteering than did a control group, and were more likely to have donated to Hurricane Katrina relief than were controls (Corporation for National and Community Service, 2008a). Given these findings, it is not unreasonable to expect that the offer of a charitable donation may be more effective for AmeriCorps alumni than for the population at large. After the survey data collection period is complete, the response rates will be compared for the groups that were offered and not offered the charitable donation incentive, and any statistically significant effect will be reported in the final survey technical report.
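When the incentive experiment is evaluated, the comparison of response rates between the two arms can be carried out with a standard two-proportion test, as in the sketch below; the counts shown are made-up placeholders, not projections.

```python
# Two-proportion z-test comparing response rates in the incentive experiment.
from statsmodels.stats.proportion import proportions_ztest

completes = [430, 395]   # completed surveys in the incentive / no-incentive arms (placeholders)
invited = [1575, 1575]   # half of the 3,150-case sample assigned to each arm
stat, pvalue = proportions_ztest(completes, invited)
print(f"z = {stat:.2f}, p = {pvalue:.3f}")
```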
Due to the limited field time available for the survey, the entire sample will be released at once rather than by replicates in order to avoid the difficulties experienced in 2015 when the second and third replicates could not be worked fully due to time limitations.
To assess the impact of nonresponse bias in our study, we will conduct statistical analysis to identify any characteristics of respondents that are correlated with response. Propensity scores from the logistic regression of survey response on explanatory variables will be used to create nonresponse weighting classes for statistical adjustment. These nonresponse weights will be combined with sampling weights based on our stratification plan to create the final weights for our analysis.
The survey instrument included in this justification is based upon the survey used in 2015. Cognitive interviews were conducted in addition to a pilot survey with 14 responses. In addition, it was reported that “Responses showed good variability on almost all items, and no unexpected answer patterns that would indicate a question was not working as intended” (CNCS 2015:4). In turn, the 2015 instrument was based largely on the AmeriCorps exit survey, with wording changes kept to a minimum.
To test the instrument to be used in 2016, a sample of 150 alumni was drawn from the sampling frame (see Exhibit B9 below). Due to expectations that all or almost all NCCC alumni would be sampled for the survey, the pilot survey sample was drawn solely from ASN and VISTA. More recent cohorts were oversampled in an effort to obtain more responses, and the sample was further restricted to alumni with an email address modification date in 2015 in order to further increase the number of responses. In addition, only alumni with full contact information on file were included in the pilot sample frame, as the 2015 survey found that alumni with both pieces of contact information were more likely to respond to the survey even before telephone reminders.
Exhibit B9. Pilot Sample for AmeriCorps Alumni Outcomes Survey
| Cohort | ASN | NCCC | VISTA | Total |
|--------|-----|------|-------|-------|
| 2005 | 6 | 0 | 6 | 12 |
| 2010 | 30 | 0 | 30 | 60 |
| 2013 | 39 | 0 | 39 | 78 |
| Total | 75 | 0 | 75 | 150 |
A total of 15 complete responses were obtained. The median length of response to the survey was 27.5 minutes, excluding the pilot-only questions. Several pilot respondents had very long completion times, suggesting that they started the survey, put it aside, and completed it much later. Therefore, the pilot response burden estimate of 25 minutes seems reasonable.
All pilot respondents indicated that the survey instructions were clear and easy to follow and that the order of the questions made sense. Pilot respondents suggested changes to several items, and while some items were clarified based on this feedback it was determined that some changes should not be made in order to maintain direct comparability with the AmeriCorps Exit Survey. Eight of the fifteen pilot respondents felt that the survey was too long. In response, 12 questions were dropped from the survey and one question was added, reducing the estimated response burden by three minutes, to 22 minutes.
CONTRACT HAS NOT BEEN AWARDED AND WILL NOT BE COMPLETED UNTIL FISCAL YEAR 2019
The project officer at CNCS for this project is:
NOT ASSIGNED
Bertoni, Nick, Andrew Burkey, Molly Caldaro, Scott Keeter, Charles DiSogra, and Kyley McGeeney. 2015. “Advance Postcard Mailing Improves Web Panel Survey Participation.” Paper presented at the annual conference of the American Association for Public Opinion Research, Hollywood, FL.
Corporation for National and Community Service. 2015. “AmeriCorps Alumni Outcomes: Technical Memo.” Corporation for National and Community Service, Washington, D.C.
De Leeuw, Edith, Mario Callegaro, Joop Hox, Ellen Korendijk, and Gerty Lensvelt-Mulders. 2007. “The Influence of Advance Letters on Response in Telephone Surveys: A Meta-Analysis.” Public Opinion Quarterly 71:413-43.
Goldstein, Kenneth M. and M. Kent Jennings. 2002. “The Effect of Advance Letters on Cooperation in a List Sample Telephone Survey.” Public Opinion Quarterly 66:608-17.
Kaplowitz, M.D., T.D. Hadlock, and R. Levine. 2004. “A Comparison of Web and Mail Survey Response Rates.” Public Opinion Quarterly 68:94-101.
Lin, I-Fen and Nora-Cate Schaeffer. 1995. “Using Survey Participants to Estimate the Effect of Nonparticipation.” Public Opinion Quarterly 59:236-58.
Link, Michael W. and Ali Mokdad. 2005. “Advance Letters as a Means of Improving Respondent Cooperation in Random Digit Dial Studies: A Multistate Experiment.” Public Opinion Quarterly 69:572-87.
Appendix B
Study Description Postcard Sent for Contact Information Verification
In the coming months, you will receive a request to complete a survey of AmeriCorps alumni. You were selected because you served in [program]. The Corporation for National and Community Service, the federal agency that administers AmeriCorps programs, is seeking your input to help us better understand your service experience and how your participation affected your life and career. The survey will be conducted by [NAME OF CONTRACTOR], a survey research firm. Please contact them at [CONTACT INFORMATION] and mention ID [qkey] to update your contact information to make sure you receive this important survey or if you have any questions about this study. For more information about the study, please visit www.nationalservice.gov/impact-our-nation/evidence-exchange/americorps-alumni-survey.
[signature]
[name and title]
Study Description Email Sent for Email Address Verification
Dear [First Name],
In the coming months, you will receive a request to complete a survey of AmeriCorps alumni. You were selected because you served in [program]. The Corporation for National and Community Service, the federal agency that administers AmeriCorps programs, is seeking your input to help us better understand your service experience and how your participation affected your life and career. The survey will be conducted by [CONTRACTOR NAME], a survey research firm. Please contact us at [CONTACT INFORMATION] and mention ID [qkey] to update your contact information, to make sure you receive this important survey, or if you have any questions about this study. For more information about the study, please visit the [DEDICATED WEBSITE LINK].
Sincerely,
NAME:
PROJECT DIRECTOR
Advance Postcard (sent to all)
A few days from now, you will receive [email: an email / no email: a mailed] request to complete a survey of AmeriCorps alumni. You were selected because you served in [program]. The Corporation for National and Community Service, the federal agency that administers AmeriCorps programs, is seeking your input to help us better understand your service experience and how your participation affected your life and career. The results of the study will be shared on the CNCS website. Please contact [CONTACT INFORMATION] if you have any questions. For more information about the study, please visit www.nationalservice.gov/impact-our-nation/evidence-exchange/americorps-alumni-survey.
[SIGNATURE]
[NAME AND TITLE]
Invitation Email Sent from CNCS
Subject: Survey of AmeriCorps Alumni
Dear [firstname],
I am writing to ask you to complete a survey of AmeriCorps alumni. Through the survey, we hope to learn about your AmeriCorps experience and how it has affected you in the years since you served. The information we gather will be used to improve the quality of the service experience for all AmeriCorps members and ensure that the programs enable AmeriCorps members to excel, both in service and beyond. You were selected because you served in [program].
To take the survey, please click the link below:
[URL]
The survey is on behalf of the Corporation for National and Community Service, the federal agency that administers AmeriCorps programs. The findings will be used to help improve AmeriCorps programs and the support AmeriCorps can offer to its members. It will take approximately 20 minutes to complete.
For more information about the study, please visit the [DEDICATED WEBSITE LINK]. If you encounter any technical problems with the survey, please contact NAME at CONTRACTOR CONTACT INFO and mention your study ID, [qkey]. If you have questions about this project, please contact CNCS PROJECT STAFF NAME at CONTACT INFORMATION.
Thank you very much for your time.
Sincerely,
[CNCS signatory]
To opt out of receiving future email about this survey, please click here. [where “here” is opt out link]
Invitation Letter for Bad Email Cases
Dear [firstname],
I am writing to ask you to complete a survey of AmeriCorps alumni. Through the survey, we hope to learn about your AmeriCorps experience and how it has affected you in the years since you served. The information we gather will be used to improve the quality of the service experience for all AmeriCorps members and ensure that the programs enable AmeriCorps members to excel, both in service and beyond. You were selected because you served in [program].
To take the survey, please go to the following link:
[URL]
Or you may scan this QR code with your smartphone:
[QR]
The survey is on behalf of the Corporation for National and Community Service, the federal agency that administers AmeriCorps programs. The findings will be used to help improve AmeriCorps programs and the support AmeriCorps can offer to its members. It will take approximately 20 minutes to complete.
For more information about the study, please visit www.nationalservice.gov/impact-our-nation/evidence-exchange/americorps-alumni-survey. If you encounter any technical problems with the survey, please contact [NAME] at [CONTRACTOR CONTACT INFORMATION] and mention your study ID, [qkey]. If you have questions about this project, please contact [CNCS PROJECT STAFF LEAD NAME AND CONTACT INFORMATION].
Thank you very much for your time.
Sincerely,
[CNCS signatory]
Email Reminder 1 Sent from [CONTRACTOR NAME] to Nonrespondents with Email Addresses
Subject: Reminder: Survey of AmeriCorps Alumni
Dear [First Name],
You were recently invited to take part in an online survey of AmeriCorps alumni by the Corporation for National and Community Service. We really value your experiences and opinions and hope that you will be able to take part in this important survey.
To take the survey, please click the link below:
[URL]
The survey asks about your AmeriCorps experience and how it has affected you in the years since you served. The information gathered will be used to improve the quality of the service experience for AmeriCorps members and ensure that the programs enable AmeriCorps members to excel, both in service and beyond. You were selected because you served in [program].
The survey is on behalf of the Corporation for National and Community Service, the federal agency that administers AmeriCorps programs. The findings will be used to help improve AmeriCorps programs and the support AmeriCorps can offer to its members. It will take approximately 20 minutes to complete.
If you encounter any technical problems with the survey, please contact [NAME] at [CONTRACTOR CONTACT INFORMATION] and mention your study ID, [qkey]. If you have questions about this project, please contact [CNCS PROJECT STAFF LEAD NAME AND CONTACT INFORMATION].
Thank you very much for your time.
Sincerely,
NAME.
CONTRACTOR NAME
POSITION
To opt out of receiving future email about this survey, please click here. [where “here” is opt out link]
Email Reminder 2 Sent from [CONTRACTOR NAME] to Nonrespondents with Email Addresses
Subject: Please complete the AmeriCorps Alumni Survey
Dear [First Name],
I am writing again to ask for your participation in the Survey of AmeriCorps Alumni. We recently sent you an email about the survey. If you have already completed the survey, thank you very much. If you have not completed the survey yet, please take the time now to do so by clicking the link below:
[URL]
As a reminder, the results of this research will provide critical information that will be used to improve the quality of the service experience for all AmeriCorps members. You were selected because you served in [program].
The survey is on behalf of the Corporation for National and Community Service, the federal agency that administers AmeriCorps programs. The findings will be used to help improve AmeriCorps programs and the support AmeriCorps can offer to its members. It will take approximately 20 minutes to complete.
If you encounter any technical problems with the survey, please contact [NAME] at [CONTRACTOR CONTACT INFORMATION] and mention your study ID, [qkey]. If you have questions about this project, please contact [CNCS PROJECT STAFF LEAD NAME AND CONTACT INFORMATION]. Thank you very much for your time.
Sincerely,
NAME.
CONTRACTOR NAME
POSITION
To opt out of receiving future email about this survey, please click here. [where “here” is opt out link]
Email Reminder 3 Sent from [CONTRACTOR NAME] to Nonrespondents with Email Addresses
Subject: It’s not too late to complete the Survey of AmeriCorps Alumni
Dear [First Name],
It’s not too late for you to complete the Survey of AmeriCorps alumni. [IF INCENTIVE: As a thank you for your time, a donation of $2 will be made to the National Park Foundation for every survey completed.] We will [IF INCENTIVE: also] share the results of the survey on the CNCS website. According to my records, I have not yet received your completed survey. To take the survey, please click the link below:
[URL]
The survey is on behalf of the Corporation for National and Community Service, the federal agency that administers AmeriCorps programs. The findings will be used to help improve AmeriCorps programs and the support AmeriCorps can offer to its members. It will take approximately 20 minutes to complete.
For more information about the study, please visit the [DEDICATED WEBSITE LINK]. If you encounter any technical problems with the survey, please contact [NAME] at [CONTRACTOR CONTACT INFORMATION] and mention your study ID, [qkey]. If you have questions about this project, please contact [CNCS PROJECT STAFF LEAD NAME AND CONTACT INFORMATION].
Thank you very much for your time and consideration.
Sincerely,
NAME.
CONTRACTOR NAME
POSITION
To opt out of receiving future email about this survey, please click here. [where “here” is opt out link]
Reminder Postcard for Bad Email or Phone Nonrespondents
I am writing to ask for your participation in the Survey of AmeriCorps Alumni. I recently sent you a letter about the survey and, according to my records, I have not yet received your completed survey. We really value your experiences and opinions and hope that you will be able to take part in this important survey. You were selected because you served in [program]. The Corporation for National and Community Service, the federal agency that administers AmeriCorps programs, is seeking your input to help us better understand your service experience and how your participation affected your life and career. This online survey will take about 20 minutes. We will share the results on the CNCS website. Contact AmeriCorps@srbi.com if you have any questions and mention study ID [qkey]. Please complete the survey now by going to:
[URL]
[signature]
[name and title]
Final Reminder Letter for All Nonrespondents with Good Mailing Address
Dear [firstname],
During the past two months, we contacted you several times about an important study of AmeriCorps alumni. We really value your experiences and opinions and hope that you will be able to take part in this important survey, which will take approximately 20 minutes to complete.
Its purpose is to learn about your AmeriCorps experience and how it has affected you in the years since you served. Information from the survey will be used to improve the quality of the service experience for all AmeriCorps members and ensure that the programs enable AmeriCorps members to excel, both in service and beyond. You were selected because you served in [program].
[IF NO EMAIL: The study is drawing to a close.] To take the survey, please go to the following link:
[URL]
Alternatively, you may scan this QR code with your smartphone:
[QR]
The survey is on behalf of the Corporation for National and Community Service, the federal agency that administers AmeriCorps programs. The findings will be used to help improve AmeriCorps programs and the support AmeriCorps can offer to its members.
If you encounter any technical problems with the survey, please contact [NAME] at [CONTRACTOR CONTACT INFORMATION] and mention your study ID, [qkey]. If you have questions about this project, please contact [CNCS PROJECT STAFF LEAD NAME AND CONTACT INFORMATION].
Thank you very much for your willingness to consider our request as we conclude this effort to better understand the lives of AmeriCorps alumni.
Sincerely,
[CNCS signatory]
Telephone Interview Script
INTRO1 Hello, can I please speak to [firstname] [lastname]? [GO TO TIPRESP]
Named Respondent speaking [GO TO INTRO2]
Wrong number (haven’t heard of named respondent) [GO TO SORRY1] (FINAL OUTCOME) TRY ALTERNATIVE NUMBER IF AVAILABLE
Named respondent is temporarily unavailable [IDENTIFY BEST TIME TO CALL BACK, MAKE SOFT APPOINTMENT]
Named respondent cannot be reached by phone at any time/uncontactable by phone [GO TO SORRY1] (FINAL OUTCOME) TRY ALTERNATIVE NUMBER IF AVAILABLE
Named respondent is unavailable during fieldwork [GO TO SMS RECORD OUTCOME] (FINAL OUTCOME) TRY ALTERNATIVE NUMBER IF AVAILABLE
Language difficulty [MAKE SOFT APPOINTMENT TO TRY AGAIN ANOTHER TIME]
Call back on another number [UPDATE NUMBER AND ARRANGE TO CALL NEW NUMBER]
Number unobtainable/disconnected/out of order TRY ALTERNATIVE NUMBER IF AVAILABLE
Computer/modem/fax line TRY ALTERNATIVE NUMBER IF AVAILABLE
Answer machine/voicemail IF CONFIRMED, LEAVE VOICEMAIL AND DO NOT TRY AGAIN
No answer TRY AGAIN/TRY ALTERNATIVE NUMBER IF AVAILABLE
Line or extension busy TRY AGAIN/TRY ALTERNATIVE NUMBER IF AVAILABLE
Refusal by gatekeeper (PA/assistant etc), unable to be connected to named respondent [GO TO SORRY 1] (FINAL OUTCOME – HARD REFUSAL)
Refused – organization policy [GO TO SORRY 1] (FINAL OUTCOME – HARD REFUSAL)
Gatekeeper requested information to pass on to respondent [GO TO GATEKEEPER1]
(+Dialer outcomes not visible to interviewer) TRY AGAIN IF APPROPRIATE OR TRY ALTERNATIVE NUMBER IF AVAILABLE
ASK IF CODE 1 AT INTRO1:
INTRO2 Hello, my name is _______ and I am calling from [CONTRACTOR NAME], a national survey research firm, on behalf of the Corporation for National and Community Service, a U.S. government agency. The purpose of the call is to learn about your AmeriCorps service experience. This is a nationwide survey, developed by AmeriCorps, and your valuable input will help us make AmeriCorps more effective for its members. The survey will take approximately 20 minutes. Will you be able to take the survey? (IF NECESSARY: Would you prefer to complete it online?)
Yes – online – GO TO INTRO3
Yes – phone interview – GO TO INTRO7
Not heard about survey before/not received letter or emails – GO TO INTRO5
Already completed interview – GO TO SORRY1 (FINAL OUTCOME - COMPLETE)
Language difficulty, cannot continue GO TO SORRY2 (FINAL OUTCOME- SCREENOUT)
Call back on another number NOTE NUMBER AND ARRANGE TO CALL NEW NUMBER – RESUME AT INTRO2
Hard callback on same number to go through reminder script GO TO SMS AND RECORD DETAILS OF APPOINTMENT – RESUME AT INTRO2
Refused – GO TO WHYREFUSED
Refused – remove from further research GO TO SORRY 1 (FINAL OUTCOME– HARD REFUSAL)
Refused – organization policy GO TO SORRY 1 (FINAL OUTCOME– HARD REFUSAL)
Has question for project staff before will complete - NOTE QUESTION IN OPEN-END FOLLOW-UP AND GO TO INTRO5
ASK IF CODE 1 AT INTRO2:
INTRO3 Would you like me to send an email containing the link to the survey to you?
Yes – IF EMAIL ADDRESS ON FILE ASK INTRO4; IF NO EMAIL ADDRESS ON FILE SKIP TO INTRO6
No - DO NOT SEND THE EMAIL. GO TO THANKS1 (FINAL OUTCOME - COMPLETE)
ASK IF CODE 1 AT INTRO3 & EMAIL ADDRESS ON FILE:
INTRO4 Can I check the e-mail address we have is correct? {SHOW EMAIL ADDRESS FROM SAMPLE FILE HERE} READ OUT EMAIL ADDRESS
Yes correct – AUTOMATICALLY SEND THE EMAIL. THEN GO TO THANKS1 (FINAL OUTCOME - COMPLETE)
No not correct – ASK INTRO6
ASK IF CODE 3 OR 11 AT INTRO2:
INTRO5 [IF INTRO2=3: If you haven’t received the letter or the e-mails, it’s probably best if we send you some information so you can see what the survey is about. / IF INTRO2=11: I will request that someone from the project team contact you to answer your question.] Can I check the email address we have is correct, and then we’ll email you a letter with some background information and a link to the survey? {SHOW EMAIL ADDRESS FROM SAMPLE FILE HERE} READ OUT EMAIL ADDRESS. Is this correct?
Yes correct – AUTOMATICALLY SEND THE EMAIL. THEN GO TO THANKS1 (FINAL OUTCOME - COMPLETE)
No not correct – ASK INTRO6
No, doesn’t want to provide email address/doesn’t want to take part - GO TO WHYREFUSED
ASK IF CODE 2 AT INTRO4 OR CODE 2 AT INTRO5:
INTRO6 Can I take down the [IF NOT MISSING EMAIL ADDRESS: correct] email address? ENTER EMAIL ADDRESS AND SEND AUTOMATICALLY. ALLOW REFUSED. THEN GO TO THANKS1 (FINAL OUTCOME)
ASK IF CODE 2 AT INTRO2:
INTRO7 Is now a good time to take the survey? It will take about 20 minutes.
Yes – GO TO PHONE_INTRO
No – ARRANGE APPOINTMENT (RESUME AT INTRO 7)
ASK IF CODE 1 AT INTRO7:
PHONE_INTRO Thank you. Please allow me a moment to pull up your survey.
Thank you for your commitment to service. To support current and future members and improve AmeriCorps programming, we are asking you to answer some brief questions so that we can better understand your AmeriCorps service experience. This survey is being conducted on behalf of the Corporation for National and Community Service (CNCS) by [CONTRACTOR NAME], a survey research firm, and is entirely voluntary. Your responses will remain private to the extent permitted by law, as is provided for in the Privacy Act of 1974. Your survey responses will be summarized in reports along with the responses of thousands of other AmeriCorps alumni in aggregate form only. CNCS will receive your responses with your name attached. However, all identifiable information will be removed from the data before they are made available to other researchers. The OMB control number for this collection is #3045-0163, which expires on 10/31/17. For questions/concerns about your rights as a study participant, please call the [CONTRACTOR] Institutional Review Board at [PHONE NUMBER].
INTERVIEWER: GO TO opinionport.com/CNCS_CATI AND ENTER [USERID] TO BEGIN SURVEY. STAY ON THIS SCREEN UNTIL DONE WITH PHONE SURVEY.
WHEN DONE WITH PHONE INTERVIEW PRESS 1 TO CONTINUE TO PHONE_END.
PHONE_END INTERVIEWER RECORD STATUS OF PHONE SURVEY:
COMPLETE [END]
PARTIAL - CALLBACK
PARTIAL - REFUSAL [GO TO WHYREFUSED]
ASK IF CODE 3 AT INTRO2 OR CODE 3 AT INTRO5 OR CODE 3 AT PHONE_END:
WHYREFUSED While participation in this survey is completely voluntary, may I ask why you don’t want to take this survey? Your response to this question may help the Corporation for National and Community Service to better understand people’s different reasons for not taking the survey. DO NOT READ OUT PRECODES. PROBE SENSITIVELY IF REQUIRED.
Too busy/survey takes too long [GOTO DECLINE] (FINAL OUTCOME – HARD REFUSAL)
Don’t want to answer personal questions [GOTO CONFIDENTIALITY] (FINAL OUTCOME – HARD REFUSAL)
Concerned about identification/confidentiality [GOTO CONFIDENTIALITY] (FINAL OUTCOME – HARD REFUSAL)
Don’t view study as important or relevant [GOTO DECLINE] (FINAL OUTCOME – HARD REFUSAL)
Don’t respond to unsolicited requests [GOTO RANDOMLY SELECTED] (FINAL OUTCOME – HARD REFUSAL)
Don’t want to give a reason/refused to answer [GOTO SORRY1] (FINAL OUTCOME – HARD REFUSAL)
Some other reason (PLEASE SPECIFY) [GOTO DECLINE] (FINAL OUTCOME – HARD REFUSAL)
ASK IF CODE 1 OR CODE 4 OR CODE 10 AT WHYREFUSED:
DECLINE I hope that you will reconsider. Your perspectives are important to us and your participation is essential to understanding the lives of AmeriCorps participants. If you change your mind, the link to the survey will be active until [date TBA]. Thank you very much for your time.
ASK IF CODE 2 OR 3 AT WHYREFUSED:
CONFIDENTIALITY I would like to reassure you that we have extensive protocols in place to protect your privacy and confidentiality. Your survey responses will be summarized in reports along with the responses of thousands of other AmeriCorps alumni in aggregate form only. While CNCS will receive your responses with your name attached, all identifiable information will be removed from the data before they are made available to other researchers. I hope you will reconsider as your opinions are very important to the study. If you change your mind, the link to the survey will be active until [date TBA]. Thank you very much for your time.
ASK IF CODE 5 AT WHYREFUSED:
RANDOMLY SELECTED The research team identified AmeriCorps participants and randomly selected you to participate. I hope you will reconsider as your opinions are very important to the study. If you change your mind, the link to the survey will be active until [DATE TBA]. Thank you very much for your time.
ASK IF CODE 17 AT INTRO1:
GATEKEEPER1 This message is for [firstname] [lastname]. I am calling from [CONTRACTOR NAME] on behalf of the Corporation for National and Community Service about an important survey of AmeriCorps participants. The Corporation for National and Community Service is the federal agency that administers AmeriCorps programs. [firstname] [lastname] was randomly selected from AmeriCorps participants. We would like to interview [firstname]. Please have [firstname] call us toll-free at [NUMBER] and reference study number 30370 and ID number [ID], or email [DEDICATED EMAIL ADDRESS] and reference ID number [ID]. Thank you.
THEN GO TO GATEKEEPER 2
GATEKEEPER2 Can I also just check we have the right email address for {NAME OF RESPONDENT}? [SCRIPTWRITER PLEASE SHOW EMAIL ADDRESS HERE AND “CORRECT” YES/NO OPTIONS. ALSO ALLOW REFUSED HERE. REFUSALS SHOULD GO TO THANKS2] (FINAL OUTCOME – COMPLETE)
INTERVIEWER PLEASE READ OUT E-MAIL ADDRESS/ES TO GATEKEEPER AND CODE WHETHER CORRECT OR NOT. IF NOT CORRECT, PLEASE OBTAIN CORRECT DETAILS FROM GATEKEEPER.
ASK IF CODE 7 OR 8 OR 9 AT WHYREFUSED:
SORRY1 In that case I am sorry to have disturbed you. Thank you very much for your time.
ASK IF CODE 2 AT INTRO3 OR CODE 1 AT INTRO4 OR CODE 1 AT INTRO5 OR AFTER INTRO6:
THANKS1 OK then, thanks very much for speaking to me, and I hope you will be able to complete the survey soon.
ASK IF CODE 5 AT INTRO 2:
SORRY2 I am sorry it is too difficult for us to speak on the phone. I hope you will be able to take part in the survey online. Thank you very much for your time.
ASK IF CODE 17 AT INTRO1:
THANKS2 Thank you very much for your help today.
ASK IF CODE 6 AT WHYREFUSED:
ELIGIBLE You are still eligible to participate and we hope that you will be able to take part in the survey online. Thank you very much for your time.
IF VOICEMAIL (LEAVE ON 2nd and 4th ATTEMPTS):
VM Hello, I am calling from [CONTRACTOR NAME] on behalf of the Corporation for National and Community Service, a U.S. government agency, for [firstname] [lastname]. This is a follow-up to an invitation to participate in a survey of AmeriCorps alumni. If you have received the invitation I encourage you to complete the survey at your earliest convenience. Participation is voluntary.
If you have not received an invitation or would like it resent, please call us toll-free at 1-XXX or email [DEDICATED EMAIL ADDRESS] (INTERVIEWER: SPELL OUT EMAIL ADDRESS AFTER READING IT) and mention study 30370 and ID number [ID]. Thank you.
Final Email Reminder
SUBJECT: Last chance to take the Survey of AmeriCorps Alumni
Dear [First Name],
I am extending a final invitation for you to complete the Survey of AmeriCorps alumni. The findings will be used to help improve AmeriCorps programs and the support AmeriCorps can offer to its members.
To take the survey, please click the link below:
[URL]
The survey is on behalf of the Corporation for National and Community Service, the federal agency that administers AmeriCorps programs. It will take approximately 20 minutes to complete.
For more information about the study, please visit the [DEDICATED WEBSITE LINK]. If you encounter any technical problems with the survey, please contact [NAME] at [CONTRACTOR CONTACT INFORMATION] and mention your study ID, [qkey]. If you have questions about this project, please contact [CNCS PROJECT STAFF LEAD NAME AND CONTACT INFORMATION].
Thank you very much for your time.
Sincerely,
[CNCS signatory]
To opt out of receiving future email about this survey, please click here. [where “here” is opt out link]
1 Brennan, Seymour, and Gendall (1993) and Robertson and Bellenger (1978) found positive effects for promised donations to charity, although the Brennan et al. result held only for the first of three waves of the survey. Other studies found no effect for a charitable donation (Boyle et al. 2012, both for a meta-analysis of existing literature and an experiment; Furse and Stewart 1982; Hubbard and Little 1988; Olson, Schneiderman, and Armstrong 1993; Pedersen and Nielsen 2014; Skinner, Ferrell, and Pride 1984; Warriner et al. 1996).