PART B: Statistical Methods for Pathway Home Grant Program Evaluation
OMB NO. XXXX-0NEW
JULY 2022
The Chief Evaluation Office (CEO) in the U.S. Department of Labor (DOL) is undertaking the Pathway Home Grant Program Evaluation. The overall aim of the evaluation is to determine whether the Pathway Home grant programs improve employment outcomes and workforce readiness for adults by expanding the availability of services to individuals in the justice system, both before and after release, thereby improving their chances of finding meaningful employment and avoiding repeat involvement in the criminal justice system. CEO contracted with Mathematica and its subcontractors, Social Policy Research Associates and the Council of State Governments Justice Center, to conduct this evaluation. The evaluation will include an impact study and an implementation study. This package requests clearance for six data collection instruments (included as supporting documents):
1. Baseline survey of study participants
2. Survey of grant administrators
3. Survey of correctional facility administrators
4. Interview guide for program and partner administrators and staff
5. Focus group guide for pre-release program participants
6. Interview guide for post-release program participants
These data collection instruments have been submitted to the Health Media Lab Institutional Review Board for approval.1
The universe of potential sites for the data collection efforts for which we request Office of Management and Budget (OMB) clearance includes all of the Pathway Home grantees and subgrantees that received grants in 2021. The implementation study will administer surveys to all grantee, subgrantee, and participating correctional facility sites. The implementation study will also conduct site visits and phone calls to interview staff and participants in the subset of sites purposively selected for the impact study. For the impact study, a randomized controlled trial or a quasi-experimental design with comparison groups is planned to estimate program impacts in the subset of selected sites. If the design changes, we will submit a change request package to OMB for review.
Selection of sites
Implementation study. Five of the six instruments noted above will be used in the implementation study. We will use two approaches to select sites. For the grant administrator and correctional facility administrator surveys, we will include all grantees, subgrantees, and partnering correctional facilities in the survey samples. This includes approximately 70 grantees and subgrantees in the survey of grant administrators and approximately 160 correctional facilities in the survey of facility administrators. Approximately 16 sites will participate in the implementation study site visits, where the interview and focus group guides will be used to collect information from grant administrators, partner administrators, frontline staff, and program participants. All impact study sites will participate in the implementation study site visits, and additional grantees not included in the impact study might be selected for visits if they are (1) implementing activities similar to those of the impact sites or (2) implementing a unique program model. For example, sites implementing a unique program might focus on target groups of interest (e.g., people in rural areas, women) or on particular topics of interest (e.g., partnerships that allow for the delivery of specific services, differences in experiences between community-based organizations (CBOs) and intermediaries). The interview guide for post-release program participants will be used to collect information from program participants who received services from one of the 16 sites included in the site visits. Below, we discuss our approach to selecting sites for the impact study.
Impact study. One instrument, the baseline survey of study participants, will be used in the impact study. The impact study will include approximately 2,500 study participants from a subset of approximately six sites from among the Pathway Home grantees and subgrantees.2 The impact study will include purposively chosen direct grantees and intermediary grantee organizations and their subgrantees. The grantees will be purposively selected based on the strength of their program models, consistency of implementation across subgrantees, locations in states with data availability, large sample sizes, tenure of their programs, and the feasibility of conducting an evaluation of their program using either random assignment or quasi-experimental methods.
Selection of respondents
See Table B.1 for estimates of universe and sample sizes for each data collection instrument.
Table B.1. Sampling and response rate assumptions, by data collection instrument and respondent type
Respondent type | Sampling method | Universe of potential respondents | Estimated selected sample | Average responses (per site) | Estimated responses per respondent | Estimated response rate | Estimated responses (across sites)
Baseline survey of study participants
Study participants^a | Census | 2,500 | 2,500 | 417 | 1 | 100% | 2,500
Survey of grant administrators
Grant administrators^b | Census | 70 | 70 | 1 | 1 | 95% | 67
Survey of correctional facility administrators
Correctional facility partner administrators^c | Census | 160 | 160 | 2 | 1 | 80% | 128
Interview guide for program and partner administrators and staff
Grant administrators^d | Purposive | 130 | 130 | 8 | 1 | 90% | 117
Partner staff administrators | Purposive | 160 | 160 | 10 | 1 | 90% | 144
Frontline staff | Purposive | 240 | 240 | 15 | 1 | 90% | 216
Focus group guide for pre-release program participants
Study participants^e | Purposive | 1,250 | 128 | 8 | 1 | 90% | 115
Interview guide for post-release program participants
Study participants^f | Purposive | 1,250 | 256 | 16 | 1 | 50% | 128
a The baseline survey will be completed by a census of impact study participants. It is possible that sites will begin enrolling program participants before enrollment for the impact study begins. Therefore, it is possible that the census of program participants will be larger than the census of impact study participants who will complete the baseline survey. Study resources suggest that a maximum of 2,500 study participants may be included in the impact study. Study design considerations, as well as resources, will influence the number of sites included in the impact study. Given that the estimated number of sites that will be included is six, the average number of respondents per site is shown to be 417. However, if fewer sites are included in the study, then the number of study participants per site might be larger. Based on previous experience administering baseline surveys, the evaluation team anticipates that 100 percent of people who participate in the study will complete a baseline survey.
b The survey of grant administrators will be administered to 70 grantees and intermediary subgrantees. We anticipate a 95 percent response rate.
c The survey of correctional facility administrators will be administered to 160 correctional facility administrators. We anticipate an 80 percent response rate. This assumption is a conservative estimate based on the literature.
d The grant administrators row includes both Pathway Home direct grant administrators and intermediary grant administrator staff. It assumes an average of eight administrators per site. We anticipate a 90 percent response rate.
e Based on grantee target enrollment numbers; we anticipate inviting an average of eight participants per focus group. We expect a 90 percent response rate, meaning approximately seven participants will attend each focus group. This assumption is a conservative estimate based on similar data collection efforts for the Linking to Employment Activities Pre-release (LEAP) evaluation.
f Based on grantee target enrollment numbers; we anticipate inviting an average of 16 participants per site. We expect a 50 percent response rate, meaning approximately eight participants will participate in interviews. This assumption is a conservative estimate based on similar data collection efforts.
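The estimated responses in Table B.1 follow directly from the selected samples and assumed response rates. The short sketch below (illustrative only, not part of any study system) reproduces that arithmetic; Table B.1 rounds the products to whole respondents.

    # Illustrative check of Table B.1: expected responses = selected sample
    # x assumed response rate. The table reports whole respondents (e.g.,
    # 70 x 0.95 = 66.5 is reported as 67; 128 x 0.90 = 115.2 as 115).
    assumptions = [
        ("Baseline survey: study participants", 2500, 1.00),
        ("Grant administrator survey", 70, 0.95),
        ("Facility administrator survey", 160, 0.80),
        ("Interviews: grant administrators", 130, 0.90),
        ("Interviews: partner staff administrators", 160, 0.90),
        ("Interviews: frontline staff", 240, 0.90),
        ("Pre-release focus groups", 128, 0.90),
        ("Post-release interviews", 256, 0.50),
    ]
    for name, sample, rate in assumptions:
        print(f"{name}: {sample * rate:.1f} expected responses")
    # Per-site baseline average (footnote a): 2,500 participants / 6 sites.
    print(f"Baseline participants per site: {2500 / 6:.0f}")  # about 417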
Baseline survey of study participants. The sampling unit for this survey includes participants at impact study sites who meet the program eligibility requirements and consent to be part of the study. If random assignment is used, all participants who meet the program eligibility requirements and consent to be part of the study will be subject to random assignment. If a quasi-experimental design is chosen, the sample will also include individuals who meet the criteria for inclusion in the comparison group, which means that they may not have to meet all of the program eligibility requirements. We do not expect to enroll more than 2,500 sample members in the impact study. If the most suitable sites for inclusion in the impact study are expected to serve a relatively large number of participants (as opposed to a relatively small number of participants), then fewer sites will be in the impact study.
In sites selected for the impact study, all applicants for program services (after OMB approval is received and study procedures commence at the site) will complete a baseline survey. If random assignment is used, this will happen before individuals are randomly assigned to either a program group that will be able to receive Pathway Home services or a control group that will not. If random assignment is not used, we will work with facility and program staff to enroll individuals from a comparison group in the study. The comparison group will be individuals who apply to Pathway Home or who are eligible for the program but do not enroll, either in facilities that offer Pathway Home or in other similar correctional facilities. We will work with program and facility staff to administer the baseline survey to these individuals during the study enrollment period. The universe of Pathway Home study participants (which will include program and comparison group members) across approximately six impact study sites is estimated to be 2,500. The evaluation team anticipates collecting information on study participants’ characteristics for 100 percent of the Pathway Home study participants. The baseline survey will be administered before enrollment into the study, allowing for a 100 percent response rate.
Survey of grant administrators. The goal of this survey is to gather common information about organizational settings and intervention characteristics for Pathway Home grantees as part of the implementation study. The survey will be distributed to an executive director from the census of 22 grantees and a director from all of their subgrantees (where relevant) that received Pathway Home grants in 2021, which is estimated to be approximately 70 sites. We will not use any statistical methods to select respondents for this survey because all grantees and subgrantees that received grant awards in 2021 will be included in the survey fielding effort. The evaluation team anticipates the survey will take an average of 30 minutes to complete. Participation in the evaluation is a condition of grant award from DOL, and the evaluation team anticipates a 95 percent response rate.
Survey of correctional facility administrators. The goal of this survey is to gather, for the implementation study, common information about corrections partners’ perspectives on the services provided through Pathway Home and to learn about the successes and challenges in coordinating services both within and outside of correctional system facilities. The survey will be distributed to administrators in all of the partnering correctional facilities where Pathway Home participants are enrolled and where pre-release services are delivered, which for 22 grantees is estimated to be approximately 160 distinct facilities. We will not use any statistical methods to select respondents for this survey because we will collaborate with grantees to survey all relevant partners. The evaluation team anticipates that the survey will take an average of 20 minutes to complete. The evaluation team anticipates an 80 percent response rate for the survey based on previous research surveys of correctional facilities.3
Interview guide for program and partner administrators and staff. As part of the implementation study, the evaluation team will conduct interviews with program and partner administrators and staff during site visits to the subset of grantees, which is estimated to be approximately 16 sites. We will not use any statistical methods to select the grantees to visit. If there are subgrantees, the evaluation team will select one of the subgrantee locations to visit. For each grantee or subgrantee visited, the evaluation team will visit one partnering correctional facility to conduct interviews. We will base those selections on the type of facility, the populations in custody, and distance from the grantee site.
Administrator and staff interview respondents will be purposively selected based on their engagement with Pathway Home program activities. Respondents will include grant administrators (112 direct grantee and 18 intermediary grant administrators), partner administrators (160), and frontline staff (240) to understand how the program has been developed, managed, and delivered. Participation in the evaluation is a condition of grant award from DOL, and the evaluation team anticipates a 90 percent response rate.
Focus group guide for pre-release program participants. As part of the implementation study’s site visits, the evaluation team will conduct a focus group at each of the sites to gather information from participants about their experiences receiving pre-release program services. The evaluation team will coordinate with staff from the grantees selected for site visits to identify engaged participants who can speak about their experiences with the program. The evaluation team anticipates an average of eight participants at each of the focus group discussions. The data collected from focus group participants will not be generalized to the broader universe of Pathway Home program participants. These focus groups will provide insights to help answer research questions about the types and combinations of pre-release services that focus group participants received and participants’ experiences accessing those services. The evaluation team anticipates a 90 percent response rate for the pre-release focus groups based on prior experience conducting pre-release focus groups on the LEAP evaluation.4
Interview guide for post-release program participants. The goal of the post-release participant interviews is to get information about the Pathway Home participants’ experiences transitioning back to their communities and participating in post-release employment services. We will purposively select the interview participants in coordination with the grantees, which will help identify engaged participants who can speak about their experiences with the program. The evaluation team anticipates an average of 16 participants from each of the impact study grantee sites will be invited to participate in the interviews. The evaluation team expects a 50 percent response rate for the post-release interviews; this response rate is a conservative estimate based on literature regarding similar data collection efforts.5
Understanding the effectiveness of the Pathway Home grants requires collecting data from multiple sources. For the implementation study, data collection will include a survey of grant administrators, a survey of correctional facility administrators, site visits to a subset of grant sites for interviews with administrators and staff, focus groups and interviews with participants, and individual-level program data. For the impact study, the evaluation team will collect baseline information on study participants at program application to understand the baseline characteristics of the sample. Outcome data will come from a follow-up survey of a sample of study participants, as well as administrative earnings data and criminal justice system data for all impact study participants. The data collection instruments this clearance covers include the baseline survey, survey of grant administrators, survey of correctional facility administrators, interview guide for program and partner administrators and staff, focus groups for pre-release participants, and interview guide for post-release participants.
Baseline survey of study participants. Program or facility staff will administer the baseline survey at intake to all eligible individuals at the sites selected for the impact study and enter those individuals into a web-based system as they go through an intake process. In addition to sample member contact information, the baseline survey will collect information on basic demographic characteristics, employment history, and criminal justice history. Whenever possible, baseline surveys will be collected electronically through the study’s web-based enrollment system. The evaluation team is also prepared to offer the baseline survey on paper if an Internet connection is not available within the correctional facilities.
Program staff will obtain consent to participate in the research study from all impact study participants. To fully ensure informed consent, the evaluation team will train grantee staff to collect written consent from all participants in the Pathway Home Evaluation. Written consent forms will describe the purpose of the study; outline the information that will be collected; explain the risks, benefits, and voluntary nature of participation; and collect participants’ consent to be included in the evaluation. If, however, due to COVID-19 or other restrictions, staff are not able to obtain written consent from a participant, the staff will request permission to sign on the participant’s behalf and will initial the signature to indicate that they signed the consent form with the person’s permission. The participant consent form for the impact study is included in the Supplementary Documents.
Survey of grant administrators. As part of the implementation study, we will field an electronic survey to approximately 70 grantees and subgrantees to obtain information on approaches to project management, recruitment and outreach, and service delivery. This survey will include a set of common questions to lead to insights about variations across grantees and to provide context for the impact study and other data collection activities.
Survey of correctional facility administrators. As part of the implementation study, we will field an electronic survey to approximately 160 correctional facility partners. The survey will include a common set of questions to gather information about corrections partners’ perspectives on the services provided through Pathway Home and learn about the successes and challenges in coordinating services both within and outside of correctional system facilities.
Interview guide for program and partner administrators and staff. As part of the implementation study, the evaluation team will conduct site visits to approximately 16 sites (including all impact study sites) and to one correctional facility per grantee. During each visit, two members of the evaluation team will conduct semistructured interviews with program administrators, partner administrators, and frontline staff from each site. The interviews will focus on services offered, staff roles and responsibilities, participant experiences, partnerships, plans for sustainability, and challenges associated with establishing a program inside a correctional facility. Participation in interviews is voluntary. The evaluation team will collect verbal assent from all respondents who participate in the interviews. The evaluation team might need to conduct interviews by telephone if for any reason it cannot complete all of the interviews on-site (for example, due to issues that might arise in advance of the visit, or if a respondent is out of the office unexpectedly while the evaluation team is on-site). The interview guide for program and partner administrators and staff will be used to create individual discussion guides based on the respondent type in preparation for the site visits.
Focus group guide for pre-release program participants. As part of the implementation study’s site visits, the evaluation team will conduct a focus group with program participants at each of the correctional facilities visited. The in-person focus groups will allow the evaluation team to understand the perspectives of pre-release program participants on the pre-release services offered through Pathway Home. For each of these focus groups, the evaluation team will work with the grantee to identify and invite approximately eight participants who represent diverse backgrounds and experiences in the Pathway Home program and whom the facility permits to attend (for a total of approximately 128 participants across sites). Participation in focus group discussions is voluntary. The evaluation team will collect consent from any program participants who take part in the focus groups.
Interview guide for post-release program participants. As part of the implementation study, the evaluation team will conduct interviews with program participants who have received post-release services in each site where a visit was conducted. The interviews will occur either in person during site visits or by telephone, based on the availability of the respondents. The evaluation team will work with the grantee to identify and invite approximately 16 participants per site who represent diverse backgrounds and experiences in the Pathway Home program and who have participated in post-release Pathway Home services. Participation in interviews is voluntary. The evaluation team will collect written consent from any program participants who take part in the interviews.
Baseline survey of study participants. The data gathered through the baseline survey will be tabulated using descriptive methods (including simple frequencies, cross-tabulations, and means, when appropriate) to provide contextual information about the characteristics of participants in the impact study. If a random assignment design is used and conducted properly, there should be no systematic observable or unobservable differences between the research groups, other than the services offered after random assignment, across key indicators including sex, race/ethnicity, education level, and prior justice system involvement. Therefore, the baseline survey data will be used to ensure randomization was conducted properly. If a quasi-experimental design is used, the baseline survey will be used to evaluate whether there are differences in observable characteristics between the treatment and comparison groups. We will use statistical tests, such as t-tests or chi-squared tests, to assess whether there are statistically significant differences between the program and comparison groups on available baseline measures. If differences are found, we will use the baseline survey to develop matched comparison designs or study weights that account for them.
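As a minimal sketch of these baseline equivalence checks, assuming a rectangular analysis file and hypothetical variable names (group, age, education_level), the tests could be run as follows:

    # Sketch of the baseline equivalence tests described above; the file
    # name and column names are hypothetical placeholders.
    import pandas as pd
    from scipy import stats

    df = pd.read_csv("baseline_survey.csv")
    program = df[df["group"] == "program"]
    comparison = df[df["group"] == "comparison"]

    # Continuous baseline measure (e.g., age at enrollment): two-sample t-test.
    t_stat, p_value = stats.ttest_ind(
        program["age"], comparison["age"], nan_policy="omit"
    )
    print(f"age: t = {t_stat:.2f}, p = {p_value:.3f}")

    # Categorical baseline measure (e.g., education level): chi-squared test
    # on the group-by-category contingency table.
    contingency = pd.crosstab(df["group"], df["education_level"])
    chi2, p_value, dof, expected = stats.chi2_contingency(contingency)
    print(f"education level: chi2 = {chi2:.2f}, p = {p_value:.3f}")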
Surveys of grant and correctional facility administrators. The main type of data collected from the surveys of grant and correctional facility administrators will be contextual information about the program, services offered, the facility environment, and any unique approaches used or populations served. No complex statistical methodology (such as sample stratification) or estimation will be necessary to analyze data from the grant administrator survey. We will analyze the data using simple descriptive measures to generate aggregated counts of responses. Responses to open-ended questions will be coded to identify key themes across sites and enable the evaluation team to describe program and facility characteristics and experiences.
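For illustration, assuming a simple survey extract with hypothetical file and item names, the descriptive analysis reduces to frequencies and cross-tabulations:

    # Sketch of the descriptive tabulations for the administrator surveys;
    # the file and item names are hypothetical placeholders.
    import pandas as pd

    svy = pd.read_csv("grant_admin_survey.csv")

    # Simple frequency of a closed-ended item.
    print(svy["service_model"].value_counts(dropna=False))

    # Cross-tabulation of grantee type by a yes/no item.
    print(pd.crosstab(svy["grantee_type"], svy["offers_prerelease_services"]))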
Interviews with program and partner administrators and staff, focus groups with pre-release participants, and interviews with post-release participants. The main type of data collected from the interviews and focus groups will be qualitative information about staff’s experiences and insights implementing the Pathway Home grants or, in the case of participants, their motivations for participating in Pathway Home and their experiences while doing so. Thus, no statistical methodology (such as sample stratification) or estimation will be necessary to analyze the interview or focus group data. We will not make any declarative statements about the efficacy of strategies or practices implemented by programs. We will qualitatively describe these programs to inform DOL and the broader field about pre- and post-release employment-focused programs.
We will use NVivo, qualitative data analysis software, to analyze the qualitative data collected through interviews and focus groups, with thematic analysis informed by our conceptual framework. To extract data on key themes and topics, the evaluation team will develop a coding scheme, which will be organized according to key research questions and topics and guided by the conceptual framework, as well as more general constructs from the Consolidated Framework for Implementation Research and Community Coalition Action Theory framework on factors that affect implementation.6,7 To ensure reliability across team staff, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies. We will also analyze data across grantees for each theme and determine trends in the data that suggest differences between types of grantees, program models, facility types, or other important aspects of the program. Because the implementation study is examining grant implementation, study findings will apply only to the Pathway Home program grantees and will not be more broadly generalizable.
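NVivo provides a built-in coding comparison for this reliability check; purely as an illustration of the underlying statistic, agreement between two coders on the same text segments could be summarized with Cohen’s kappa (the codes below are hypothetical):

    # Illustration only: Cohen's kappa on two coders' codes for the same
    # segments; low values flag code definitions that need discussion.
    from sklearn.metrics import cohen_kappa_score

    coder_a = ["partnerships", "services", "services", "staffing", "services"]
    coder_b = ["partnerships", "services", "staffing", "staffing", "services"]
    print(f"Cohen's kappa = {cohen_kappa_score(coder_a, coder_b):.2f}")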
Specialized sampling procedures are not required for administering the baseline or grant administrator surveys. The evaluation team will attempt to collect baseline survey data from every impact study sample member who is from the sites that will be part of the follow-up survey data collection efforts. Additionally, the evaluation team will attempt to collect survey data from all grantees awarded grants in 2021 and their respective correctional facility partners.
As mentioned previously, the semistructured interview and focus group data will be used to describe the Pathway Home grants, including the perspectives of frontline staff, partners, and participants. The evaluation team plans to use purposive sampling methods. For the administrator and staff interviews, the evaluation team will ask grantee staff in advance of the visits to identify administrators, staff, and partners who have been closely involved with the Pathway Home grant (for example, by providing input on the development of the grant plans or by providing services). Although the evaluation team will aim to include all administrators and staff who were closely involved with the program in these discussions, the interview discussions might not be representative of all administrators and staff. For the participant focus groups and interviews, the evaluation team will ask grantee staff to recommend and help invite participants who were engaged in program services and can provide a range of perspectives; however, the final sample of invitees might not represent the participant diversity present at each program site.
The data collection instruments for the impact and implementation studies will be administered only once for any individual respondent. To further minimize burden on respondents, the evaluation team will review pertinent data available from Pathway Home grantee applications, grantee staffing and implementation plans, and any other reporting information to reduce the burden on site respondents whenever possible. Thus, the evaluation team can focus the discussions with respondents on topics for which information is not otherwise available.
As their grant agreements indicate, Pathway Home grantees and subgrantees included in the evaluation are expected to participate in the data collection activities as a condition of their grant awards.
Baseline survey of study participants. Program staff will administer the baseline survey to all eligible individuals at the sites selected for the impact study and then enter those individuals into a web-based system as they go through an intake process. Completion of the survey will be a condition of program participation; therefore, participants who do not complete the survey will not be able to access program services. The methods to maximize response for the intake forms will be based on approaches used successfully in many other random assignment studies (for example, the YouthBuild Evaluation, for which nearly 4,000 people were randomly assigned) to ensure the study is clearly explained to study participants and staff and that the forms are easy to understand and complete. Staff will be thoroughly trained on how to address study participants’ questions about the forms. Grantee staff will also receive a site-specific operational procedures training prepared by the research team, contact information for members of the research team, and detailed information about the study. A paper version of the survey will also be available, should program staff not be able to administer the survey online. To encourage participation in the study, the evaluation team will offer a $15 incentive for completing the baseline survey (pending the facility’s approval to pay people who are in custody). Based on the facility’s policies and the participant’s preferences, the incentive will be deposited into the participant’s commissary account, provided to the facility to hold until the participant’s release, or sent to a family member in the community. Although this is a nominal amount that is not large enough to be coercive for participants, the payment will facilitate participation and indicate to participants that we value their time and participation.
Surveys of grant and correctional facility administrators. The evaluation team expects to achieve a response rate of 95 percent for the survey of grant administrators and a response rate of 80 percent for the survey of correctional facility administrators. The surveys will be designed to be as brief as possible, with clear, easy-to-answer questions (mostly closed-ended questions, with a few open-ended questions). An official advance letter with log-in information will be emailed to grantee sample members to legitimize the study and encourage participation.
The evaluation team will use certain survey methods and best practices to encourage high response rates while minimizing burden and nonresponse. These methods include:
Web administration. We anticipate most respondents will prefer to complete the survey online. This mode allows respondents to complete the survey on their own schedule and at their own pace, including over multiple sessions. The web survey system the data collection team uses also supports mobile devices, such as tablets and cellular phones.
Technology to reduce burden. To reduce burden, the surveys will employ drop-down response categories so respondents can quickly select from a list; dynamic questions and automated skip patterns so respondents see only the questions that apply to them (including those based on answers provided previously in the survey); and logical rules for responses so respondents’ answers are restricted to those intended by the question (see the sketch after this list). These features should minimize data entry burden among participants and facilitate high-quality responses.
Use of tested questionnaires. The survey instruments have been tailored to the specific circumstances of this evaluation yet are based closely on prior surveys, including those for the America’s Promise Job Driven Grant Program Evaluation (OMB 201801-1290-001) and the Reentry Employment Opportunities (REO) Evaluations (OMB 1290-0NEW). Those instruments were extensively tested under each of these evaluations using cognitive interviews and debrief sessions with populations similar to this study’s, including active participants in employment and training services and individuals returning from incarceration.
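The skip patterns and logical response rules described in the second item above can be expressed as simple predicates. A minimal sketch, with hypothetical item names and a hypothetical valid range:

    # Hypothetical illustration of an automated skip pattern and a logical
    # response rule; the item names and valid range are placeholders.
    def show_item_b2(answers: dict) -> bool:
        # Skip pattern: item B2 appears only if B1 was answered "yes".
        return answers.get("B1") == "yes"

    def valid_item_b2(value: int) -> bool:
        # Logical rule: count of partner facilities must fall in 0-50.
        return 0 <= value <= 50

    answers = {"B1": "yes"}
    if show_item_b2(answers):
        print(valid_item_b2(12))   # True: accepted at entry
        print(valid_item_b2(999))  # False: rejected at entry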
Interview guide for program and partner administrators and staff. The evaluation team expects to achieve a response rate of 90 percent for interviews with program and partner administrators and staff. To encourage full cooperation, the evaluation team will be flexible in scheduling interviews and activities to accommodate the particular needs of respondents. Furthermore, data collectors will meet with in-person interview respondents in a central location that is well known and accessible, such as the correctional facility or grantee location where Pathway Home services are provided. Although the evaluation team will try to arrange interviews that accommodate respondents’ scheduling needs, there might be instances when a respondent is unable to meet while the team is on-site; when this happens, a member of the evaluation team will request to meet with the respondent’s designee or schedule a follow-up call at a more convenient time. These approaches achieved a response rate of 100 percent on similar qualitative data collection efforts, such as those for the Workforce Investment Act Adult and Dislocated Worker Programs Gold Standard Evaluation, the Evaluation of the Summer Youth Employment Initiative under the Recovery Act, and the Impact Evaluation of the Trade Adjustment Assistance Program. However, due to the uncertainty around the COVID-19 pandemic, we anticipate our response rates will be slightly lower than in those studies.
Focus groups with pre-release participants and interviews with post-release participants. To encourage participation in the participant focus groups and interviews, the evaluation team will use methods that have been successful for numerous other Mathematica studies, including enlisting program staff in outreach to participants, providing easy-to-understand outreach materials, strategically scheduling focus groups and interviews at convenient times and locations, and offering incentives to encourage participants to respond.
Outreach materials will be designed to help sites recruit participants for the focus groups and interviews. These materials will (1) describe the study, its purpose, and how the data collected will be used; (2) highlight DOL as the study sponsor; (3) explain the voluntary nature of participation in focus groups and interviews; and (4) provide a phone number and email address for questions that respondents might have. Outreach materials will be clear and succinct and convey the importance of the data collection.
Pending facility policies, pre-release focus group participants will be offered a $20 incentive as a token of appreciation for their participation. Post-release interview participants will be offered a $50 gift card incentive because, unlike the Pathway Home staff and partners who are invested in the grant and evaluation, and unlike the pre-release participants who are easily located in a facility, post-release participants will be much more difficult to locate and will have competing demands on their time, making it more challenging for them to participate. This $50 amount has been successful in encouraging participation in other Mathematica studies, such as the National Evaluation of the Trade Adjustment Assistance (TAA) Program.8 Although these are nominal amounts that are not large enough to be coercive for participants, the payments will facilitate participation and indicate to participants that we value their time and participation.
Methods to ensure data reliability. We will use several well-proven strategies to ensure the reliability of data collected from surveys, interviews, and focus groups.
Baseline, grant administrator, and correctional facility surveys. We will use the same surveys across all study sites to ensure consistency in the collected data. The evaluation team reviewed the forms extensively, and the forms were thoroughly pre-tested, as described below. To ensure we capture complete and accurate data, the web-based platform will flag missing data or data outside a valid range. Furthermore, to ensure we collect the most critical pieces of information from all respondents, the evaluation team will program the web-based system to not allow missing answers for critical items. At the analysis stage, the evaluation team will create binary variable flags for all noncritical items with missing values and include these flags in the analyses. In addition, staff will be trained on project security, including safeguards to protect personally identifiable information while collecting and storing information on sample members. For the baseline survey, staff will be trained on each survey item to ensure that they understand it and will accurately record the information that program applicants provide. In addition, each participating site will have access to the web-based system for entering the information from the baseline survey.
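As a sketch of the missing-value flags described above, assuming a rectangular analysis file and hypothetical noncritical item names, each item receives a companion 0/1 indicator:

    # Sketch of binary missing-value flags for noncritical items; the file
    # and item names are hypothetical placeholders.
    import pandas as pd

    df = pd.read_csv("baseline_survey.csv")
    noncritical_items = ["highest_grade", "months_employed", "num_children"]
    for item in noncritical_items:
        # 1 if the item was left blank, 0 otherwise; the flag enters the
        # analysis alongside the item itself.
        df[f"{item}_missing"] = df[item].isna().astype(int)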
Interview and focus group guides. We will use several well-proven strategies to ensure the reliability of the data from interviews and focus groups collected during the site visits and post-release participant interviews. First, qualitative data collectors, all of whom already have extensive experience with this data collection method, will be thoroughly trained in aspects particular to this study, including how to probe for additional details to help interpret responses to interview and focus group questions. Second, this training and the use of the guides will ensure that data collection is standardized across sites. Finally, all interview and focus group respondents will be assured that their responses will remain anonymous; reports will never identify respondents by name, and any quotes will be devoid of identifying information, including site name.
Baseline survey of study participants. The baseline survey was tested with seven individuals enrolled in Pathway Home grants awarded in 2020, using pre-test participants who most closely approximated the future study sample. The pre-test sample was diverse in terms of age, prior work, and incarceration experiences. The baseline survey pre-tests were conducted by phone or by Zoom with participants from three different grantees. To mirror as closely as possible the various ways the survey is likely to be administered to the future study sample, we tested both self-administered and interviewer-administered modes.9 Four participants had recently been released and were working with the grantee; for these four, we first explained the purpose of the pre-test and then administered the survey by asking them the questions and recording their responses. The other three participants, who were pre-release participants in a Kansas facility, completed a hardcopy version of the survey themselves. In both scenarios, after respondents completed the survey, a member of the evaluation team debriefed with them to get their overall comments and to further assess their understanding of key questions and terms. The evaluation team was also able to confirm that the average length of time to complete the survey (in both interviewer- and self-administered modes) was within the 15-minute target. Feedback from the baseline pre-tests was used to clarify the question text and improve the overall flow of the survey. The updated baseline survey is included as a supporting document, as is a pre-test memo with full details on methodology and findings.
Surveys of grant and correctional facility administrators. During formal pre-testing, the evaluation team tested the grant and facility administrator surveys with nine respondents from nonparticipating sites. The evaluation team sent respondents the survey via email and asked them to complete it, scan it, and return it to the evaluation team by email. Respondents were instructed to keep a copy of their completed survey to reference during the respondent debrief. The debriefs were conducted by telephone using a semistructured interview guide. Feedback from the pre-tests was used to clarify the wording of the instructions and questions and to eliminate questions that respondents found overly burdensome. The updated grant and facility administrator surveys are included as supporting documents.
Interviews with program and partner administrators and staff, focus groups with pre-release participants, and interviews with post-release participants. To ensure the interview and focus group guides are used effectively and yield comprehensive and comparable data across the study sites, senior research team members will conduct the first site visit, checking that the guides include appropriate probes and all relevant topics of inquiry. Furthermore, during this first visit, the senior research staff leading this visit will assess the site visit agenda—including how data collection activities should be structured during each site visit—and ensure it is practical, accounting for the amount of data to be collected and the amount of time allotted for each data collection activity. The interview and focus group guides will be pilot tested, with adjustments made as needed to clarify questions that were unclear to pilot respondents. Based on its experience with the first site visit, the evaluation team will train all site visitors on the data collection instruments to ensure a common understanding of the key objectives and concepts as well as fidelity to the guides. The training session will cover topics such as the study purposes and research questions, data collection guides, procedures for scheduling visits and conducting on-site activities (including a review of techniques for facilitating interviews and focus groups, and procedures for protecting human subjects), and post-visit files and summaries. The interview and focus group guides are included as supporting documents.
The evaluation team has convened a technical working group (TWG) to provide substantive feedback throughout the project period, particularly on the impact evaluation design. The TWG members will have expertise in research methodology as well as on programs and populations similar to those being served in the Pathway Home grant programs. The evaluation team is also convening an advisory group of people with lived experience in the justice system to ensure the research design, instruments, and findings are grounded in the experiences of people with direct experience in the justice system. Table B.2 lists the people who will oversee data collection and analysis for the Pathway Home Evaluation.
Table B.2. People who will oversee data collection and analysis for the Pathway Home Evaluation
Organization | Individuals
Mathematica | Ms. Samina Sattar; Dr. Jillian Berk; Dr. Jillian Stein; Ms. Jeanne Bellotti; Ms. Betsy Santos
Social Policy Research Associates | Dr. Andrew Wiegand; Mr. Christian Geckeler
Council of State Governments Justice Center | Dr. Nicole Jarrett
1 We expect to receive institutional review board approval by the time this package goes to OMB.
2 A subset of the Pathway Home grants was awarded to intermediary organizations that selected subgrantees. The evaluation will include direct grantees and subgrantees where appropriate.
3 Senior, J., K. Forsyth, E. Walsh, K. O’Hara, C. Stevenson, A. Hayes, V. Short, R. Webb, D. Challis, S. Fazel, A. Burns, and J. Shaw. “Health and Social Care Services for Older Male Adults in Prison: The Identification of Current Service Provision and Piloting of an Assessment and Care Planning Model.” Health Services and Delivery Research, vol. 1, no. 5, 2013.
4 Bellotti, Jeanne, Samina Sattar, Alix Gould-Werth, Jillian Berk, Ivette Gutierrez, Jillian Stein, Hannah Betesh, Lindsay Ochoa, and Andrew Wiegand. Developing American Job Centers in Jails: Implementation of the Linking to Employment Activities Pre-Release (LEAP) Grants. Mathematica Policy Research, 2018.
5 Levin, Kerry, Jennifer Anderson, and Jocelyn Newsome. “Comparing Recruitment for Focus Groups and Friendship Groups: Which Methodology Makes Recruitment Easier?” Poster presented by Westat at the annual American Association for Public Opinion Research conference, 2016. Available at http://ewriteonline.com/wp-content/uploads/2016/03/Jen-Anderson_2016-AAPOR-Poster-Content_Hmwk1_no-comments.pdf
6 Keith, Rosalind E., Jesse C. Crosson, Ann S. O’Malley, DeAnn Cromp, and Erin Fries Taylor. “Using the Consolidated Framework for Implementation Research (CFIR) to Produce Actionable Findings: A Rapid-Cycle Evaluation Approach To Improving Implementation.” Implementation Science, vol. 12, no. 1, 2017, pp. 1–12.
7 Butterfoss, F. D., and M. C. Kegler. “Toward a Comprehensive Understanding of Community Coalitions: Moving from Practice to Theory.” In Emerging Theories in Health Promotion Practice and Research, edited by R. DiClemente, L. Crosby, and M. C. Kegler (pp. 157–193). San Francisco, CA: Jossey-Bass, 2002.
8 In 2008 the TAA Evaluation conducted an incentive experiment to test whether offering $25, $50, or $75 significantly improved response rates among dislocated workers. Based on the findings from the experiment, the TAA study team moved forward with $50 incentive payments because that amount led to a significant increase in response rates compared to the $25 payment. The experiment did not find a difference between offering $50 and $75 incentive payments (Schochet, Berk, and Nemeth 2008).
9 Due to COVID-19 restrictions, we were unable to administer the survey in person at any facility.