Social Innovation Fund Pay for Success Process Evaluation

OMB: 3045-0177


SUPPORTING STATEMENT

For

Paperwork Reduction Act Submission



Process Evaluation of the Social Innovation Fund (SIF) Pay for Success (PFS) Program















Part B. Collection of Information Employing Statistical Methods

















Submitted by:

Corporation for National and Community Service

1201 New York Avenue, NW
Washington, DC 20525








February 5, 2016





Supplemental Information



Appendix B.1 Survey Email Invitations

Appendix B.2 Study Endorsement Letter from CNCS

Appendix B.3 Study Fact Sheet

Appendix B.4 Survey Email Reminders

Appendix B.5 Survey Email Thank You Messages



Supporting Statement

Part B. Collection of Information Employing Statistical Methods

B1. Respondent Universe and Sampling Methods

The potential respondent universe for the SIF PFS Program Process Evaluation surveys consists of all SIF PFS grantees, subrecipients, and service recipients. In October 2014, CNCS funded eight SIF PFS grantees for up to three years. Each of these grantees released requests for proposals (RFPs), and collectively the grantees selected a group of 55 subrecipients (to receive assistance with feasibility assessments and technical assistance for capacity building) and service recipients (to receive transaction structuring assistance and/or pass-through funding) with the goal of exploring and developing PFS projects.

The SIF PFS program anticipates selecting approximately four additional SIF PFS grantees in spring 2016 and another six in spring 2017. Each of these ten additional grantees is expected to fund and work with approximately seven subrecipients/service recipients. Exhibit B.1 below provides the expected number of grantees and subrecipients/service recipients that will participate in each survey period during the proposed three annual rounds of survey data collection. Please note that the number of subrecipients/service recipients may change from year to year.



Exhibit B.1. Respondents for Each Survey Data Collection Period

| Respondent Type | Funding Period | Number of Grantees and Subrecipients/Service Recipients | Spring 2016 Respondents | Spring 2017 Respondents | Spring 2018 Respondents |
| --- | --- | --- | --- | --- | --- |
| 2014 Grantees | Fall 2014 to Fall 2017 | 8 | 8 | 8 | -- |
| RFP 1 Subrecipients/Service Recipients | Spring 2015 to Fall 2017 | 55 | 55 | 55 | -- |
| RFP 2 Subrecipients/Service Recipients | Spring 2016 to Fall 2017 | 55 | 55 | 55 | -- |
| 2016 Grantees | Spring 2016 to Spring 2019 | 4 | 4 | 4 | 4 |
| RFP 1 Subrecipients/Service Recipients | Fall 2016 to Fall 2018 | 28 | -- | 28 | 28 |
| RFP 2 Subrecipients/Service Recipients | Fall 2017 to Spring 2019 | 28 | -- | -- | 28 |
| 2017 Grantees | Spring 2017 to Spring 2020 | 6 | -- | 6 | 6 |
| RFP 1 Subrecipients/Service Recipients | Fall 2017 to Fall 2019 | 42 | -- | -- | 42 |
| RFP 2 Subrecipients/Service Recipients^a | Fall 2018 to Spring 2020 | -- | -- | -- | -- |
| Total Number of Respondents | | 226 | 122 | 156 | 108 |

^a The 2017 RFP 2 subrecipients/service recipients are expected to be selected in fall 2018, after the last round of data collection has been completed, so they will not be included in the survey.



As indicated in Exhibit B.1, the SIF PFS Program Process Evaluation will survey the entire universe of grantees and subrecipients/service recipients actively participating in the SIF PFS program during each data collection period. Each grantee and subrecipient/service recipient will be surveyed between one and three times (depending on the cohort in which it was selected) to allow for the collection of longitudinal data on project activities and program development. Because the SIF PFS program utilizes a new funding strategy (i.e., Pay for Success) and grantees and subrecipients/service recipients have different project goals and levels of experience, their activities and operational trajectories are likely to vary widely. Surveying the entire universe is therefore necessary to collect information on the full range of experiences and outcomes of organizations in the SIF PFS program.

We currently estimate a 100 percent response rate for the grantee survey and a minimum 80 percent response rate for the subrecipient/service recipient surveys. These estimates are based on our confidence in the proposed strategies to maximize response rates (described below) and on the cooperation and support shown to date by the 2014 grantees and their subrecipients/service recipients when asked to engage in less formal activities (e.g., site visits and telephone discussions).

B2. Procedures for the Collection of Information

Since the entire universe of SIF PFS grantees and subrecipients/service recipients will be surveyed, no stratification or sample selection methodologies will be employed. In addition, sampling weights will not be necessary for survey estimates.

Based on the size and nature of the organizations studied, it is anticipated that all or nearly all the organizations surveyed will have access to the Internet; therefore, communications can be sent by email and respondents are expected to complete the surveys online. The surveys will be administered in an online format using FluidSurveys software. FluidSurveys uses SSL to encrypt survey responses as they are entered and McAfee security scans and firewalls to protect stored data from unauthorized access. All data exported from FluidSurveys will be kept in a secure computing environment with restricted Internet access.

Hard-copy versions of the surveys will be made available to organizations that do not have Internet access, as well as to those that prefer to respond using a hard-copy version of the survey.

The survey communications will proceed in several steps. The evaluation contractor will initially send the grantee contact person an email two weeks prior to the start of the survey data collection period. The email will remind grantees of the purpose of, and plans for, the grantee and subrecipient/service recipient surveys and notify them of the date on which survey data collection will begin. At this time, we will ask the grantees to communicate with their subrecipients/service recipients by email or telephone to notify them of the upcoming survey and ask for their support and cooperation.

At the time of the survey data collection, the evaluation contractor will send an email (Appendix B.1) to all grantees and subrecipients/service recipients to inform them that the survey data collection is beginning, provide information about the survey’s purpose, and ask for their participation. The email will include a link to the survey and a study email address that respondents can use to communicate with the evaluation team if they have questions or want more information about the survey. A letter of endorsement for the evaluation from CNCS (Appendix B.2) will be attached to the email, along with a study fact sheet (Appendix B.3) that provides additional information about the evaluation.

Previous communications with grantees and subrecipients/service recipients have indicated that the appropriate point of contact is the SIF PFS program contact person for the grantee or subrecipient/service recipient organization. All survey-related correspondence will be directed to that point of contact.

For respondents requesting a hard-copy survey (either because they do not have email and Internet access or because they prefer to respond via hard copy), the instrument will be mailed along with a self-addressed stamped envelope for easy return. Completed hard-copy surveys will be manually entered into the programmed instrument upon receipt.

Survey receipt will be closely monitored by evaluation staff, including tracking response rates and checking the surveys for completeness and consistency. Follow-up email reminders (Appendix B.4) will be sent as needed to obtain completed surveys or to identify an alternate point of contact/respondent. Undeliverable messages will also be monitored, and those organizations will be contacted by telephone or email.

Respondents will be given four weeks to complete the survey. Response rates will be monitored weekly to determine whether that period is sufficient or whether additional time or follow-up would enhance the response rate. Respondents will be sent a “thank you” email message (Appendix B.5) for their participation in the survey.

A similar survey administration procedure will be followed for the second and third rounds of surveys in 2017 and 2018. To facilitate responses and minimize burden for respondents who completed the survey in a previous year, the round 2 and round 3 surveys will be customized to include the information those respondents provided in the previous round, with a request to report any updates or changes since their earlier response.

B3. Methods to Maximize Response Rates and Deal with Issues of Non-Response

The survey administration focuses on establishing trust, communicating the benefits of survey participation, and limiting its costs. The messages to respondents will build trust by communicating the legitimacy and importance of the data collection and of the information respondents provide. The benefits of participation will be emphasized by noting that the information collected is central to the development, implementation, and oversight of PFS activities and can ultimately benefit all PFS practitioners and policy makers. The costs of participation will be minimized by making the survey readily available in multiple formats (e.g., online and hard copy) and easy to complete (short, well designed, and programmed for online submission).

The SIF PFS Program Process Evaluation will employ a number of strategies to maximize response rates while controlling costs. Survey data collection will be conducted primarily online, which will reduce data collection costs and minimize respondent burden. Once the appropriate respondents have been identified, each respondent will be emailed an individualized survey link and will have up to four weeks to complete the survey at their convenience. We assume that all respondents will have work email addresses and access to the Internet. Follow-up reminders, tailored to nonrespondents, will be sent in the form of a telephone call or personalized email outreach from the study team. The surveys are expected to take an average of 20 minutes or less per respondent.

Survey nonresponse will be handled in accordance with OMB’s Standards and Guidelines for Statistical Surveys, particularly sections 1.3 and 3.2. Unit nonresponse adjustments will be used to ensure that weighted totals match population totals for both the grantee and subrecipient/service recipient groups. To assess the potential for nonresponse bias in this study, we will conduct statistical analysis to identify any characteristics of respondents that are correlated with response. If, as anticipated, the level of nonresponse is low, the nonresponse adjustment will be a simple inverse probability weight so that the sum of the weights attached to responding records matches the total number of records in the population. Specifically, the nonresponse weight would be w = N/n, where N is the total population size for the group (grantees or subrecipients/service recipients) and n is the number of responding organizations in that group. Using a logistic regression model, we will create inverse probability weights for each respondent to adjust the results for nonrespondents. We will also conduct a sensitivity analysis to determine the level of variation between the unadjusted results and the weighted results.
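To make the two adjustments concrete, the sketch below illustrates, under stated assumptions, how the simple w = N/n weight and a logistic-regression-based inverse probability weight could be computed. This is a minimal illustration, not the evaluation contractor’s actual code; the file name and the covariates (grant_type, cohort) are hypothetical stand-ins for whatever frame characteristics are available.

```python
# Minimal sketch of the two nonresponse adjustments described above.
# Assumptions: a frame file with one row per organization, a 0/1
# 'responded' indicator, and hypothetical covariates 'grant_type'
# and 'cohort'. None of these names come from the study itself.
import pandas as pd
from sklearn.linear_model import LogisticRegression

frame = pd.read_csv("sample_frame.csv")  # hypothetical frame file

# Simple adjustment when nonresponse is low: each respondent in a
# group receives weight w = N / n.
N = len(frame)                        # total organizations in the group
n = int(frame["responded"].sum())     # responding organizations
frame.loc[frame["responded"] == 1, "simple_weight"] = N / n

# Model-based alternative: estimate each organization's response
# propensity from observed characteristics, then weight respondents
# by the inverse of the estimated propensity.
X = pd.get_dummies(frame[["grant_type", "cohort"]], drop_first=True)
model = LogisticRegression(max_iter=1000).fit(X, frame["responded"])
propensity = model.predict_proba(X)[:, 1]
responded = frame["responded"] == 1
frame.loc[responded, "ipw_weight"] = 1.0 / propensity[responded.to_numpy()]
```

The planned sensitivity analysis would then compare estimates computed with no weights, with simple_weight, and with ipw_weight to gauge how much the adjustment moves the results.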

In addition to handling unit nonresponse, item nonresponse will also be investigated. The first step will be to identify items with high proportions of missing responses; data drawn from such items may be excluded from the analysis. Alternatively, for key survey items with high missing rates, the analysis will explore whether there are any predictors, in terms of other survey responses or auxiliary/frame data, that could indicate differences in who chooses to respond to these questions.
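As a small illustration of that first step, the snippet below computes the share of missing values for each survey item so that high-missingness items can be flagged for review. The file name, the DataFrame name, and the 20 percent flagging threshold are assumptions for the sketch, not study rules.

```python
# Share of missing responses per survey item, highest first.
# 'survey_responses.csv' and the 0.2 threshold are illustrative only.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # hypothetical export
missing_rates = responses.isna().mean().sort_values(ascending=False)
high_missing_items = missing_rates[missing_rates > 0.2]
print(high_missing_items)
```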

B4. Tests of Procedures or Methods to Be Used

A pilot test of the grantee survey was conducted in late January 2016 with two grantees, one of each grant type funded by CNCS (feasibility assessment/capacity building and transaction structuring). At the same time, a pilot test of the subrecipient/service recipient survey was conducted with four subrecipients/service recipients. Pilot test respondents were asked to comment on the content and clarity of the survey questions and to identify any problems or issues with the wording or ordering of questions. Respondents were also asked individualized follow-up questions about any missing, unclear, or inconsistent responses in their surveys and about the estimated time it took them to complete the surveys.

Overall, respondents indicated that the survey format was easy to follow and the questions were appropriate for their organizations. One general suggestion was to provide, in the survey instructions, a list of items (e.g., staffing list, timeline) needed to complete specific questions in the surveys. Respondents reported that knowing beforehand what information was needed for these questions would have reduced the time they spent gathering that information or obtaining it from colleagues in their organizations. As a result of pilot testing, a list of items needed to answer the survey questions was added to the survey instructions.

Respondents also had specific comments on questions, requests to clarify certain terms, and suggestions to add response categories or simplify questions. For example, one of the government subrecipients was not sure how to define its “organization” for the purposes of the survey. As a result, “hover-over” text boxes were added to clarify or define specific terms in both surveys. Additionally, many of the questions asking for the month and year of project activities were left blank or were reported by respondents to be difficult to answer due to organizational turnover or incomplete records. As a result, whenever possible, questions asking for specific month and year responses were deleted or simplified to ask whether an event had occurred and, if so, whether it occurred before or after involvement in the SIF PFS program.

Respondents said specific questions about the organization’s budget (both the overall budget and the share allocated to PFS activities) were the most difficult and time-consuming to answer. These budget questions were intended to indicate organization size and the commitment of resources to PFS. However, they were removed from the surveys because the questions about staffing provide information about both organization size and commitment to PFS and could be answered more accurately and easily by respondents.

Across all respondents, the average time needed to complete the surveys was 27 minutes. The survey completion times were longer than anticipated, primarily due to the time respondents reported spending searching for information and waiting for responses from others in their organizations on budget questions.

It is expected that the changes made to the survey (adding instructions to gather relevant documents before starting, adding definitions, removing the budget questions, and removing the month/year responses) will reduce response time by an average of 5 to 10 minutes, bringing the estimated time for completion to approximately 20 minutes.


B5. Individuals Consulted on Statistical Aspects of the Design and Organizations/Persons Collecting and Analyzing the Data

Abt Associates has been contracted to administer the first round of the surveys and analyze the data. The key staff assigned to this project are:

Marjorie Levin

Project Director

Abt Associates Inc.

Marjorie_Levin@abtassoc.com

617-349-2819


Allan Porowski

Principal Investigator

Abt Associates Inc.

Allan_Porowski@abtassoc.com

301-347-5050


Cristina Booker

Data Analysis Task Lead

Abt Associates Inc.

Cristina_Booker@abtassoc.com

617-349-2681



CNCS has collaborated with the Abt team in all stages of the evaluation. The individual at CNCS assigned to this project is:

Lily Zandniapour

Project Officer/Contract Officer Representative

Corporation for National and Community Service

LZandniapour@cns.gov

202-606-6939






