PART B: JUSTIFICATION FOR EVALUATION OF AMERICA’S PROMISE JOB-DRIVEN GRANT PROGRAM
OMB No. 1290-0NEW
OCTOBER 2019
The Chief Evaluation Office of the U.S. Department of Labor (DOL) has commissioned an evaluation of the America’s Promise Job-Driven Grant program (America’s Promise). This program aims to create or expand regional partnerships that will identify the needs of specific industry sectors relying on the H-1B visa program to hire skilled foreign workers, and to prepare the domestic workforce for middle- and high-skilled, high-growth jobs in those sectors. The America’s Promise evaluation offers a unique opportunity to build knowledge about the implementation and effectiveness of these regional partnerships. Mathematica Policy Research and its subcontractor Social Policy Research Associates have been contracted to conduct an implementation and impact evaluation. A request to collect information for data collection activities associated with the implementation evaluation, as required by the Paperwork Reduction Act (PRA), was approved by the Office of Management and Budget (OMB) (OMB Control Number 1290-0020) on February 2, 2019. This package requests clearance for five additional data collection activities as part of the implementation evaluation:
Program stakeholder interview protocol (in-person)
Employer interview protocol
Participant focus group protocol
Participant focus group information form
Program stakeholder interview protocol (telephone)
The universe of sites for this evaluation includes the 23 grantees awarded America’s Promise grants. The implementation evaluation includes interviews with program stakeholders, employers, and participants across all 23 grantees. In-depth interviews and focus groups will be conducted in person during site visits with 12 purposefully selected grantees. For the 11 grantees that do not receive site visits, telephone interviews with program stakeholders will be conducted.
Table B.1. Sampling and response rate assumptions, by respondent type
| Type of respondent | Sampling method | Number of sites | Estimated universe across all sites | Expected sample (per site) | Estimated response rate (percent) | Estimated responses (per site) | Estimated responses (across sites) |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Sites Selected for Site Visits | | | | | | | |
| Grantee and partner staff | Purposeful | 12 | 180 | 10 | 100 | 10 | 120 |
| Employers | Purposeful | 12 | 240 | 2 | 100 | 2 | 24 |
| Program participants^a | Purposeful | 12 | 10,800 | 10 | 50 | 5 | 60 |
| Sites Not Selected for Site Visits | | | | | | | |
| Grantee and partner staff | Purposeful | 11 | 165 | 4 | 100 | 4 | 44 |
^a Based on previous experience conducting focus groups, the study team anticipates 100 percent of people who participate in the focus groups will complete a participant information form.
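The "estimated responses" columns in Table B.1 follow directly from the expected sample, response rate, and site count in each row. A minimal sketch of that arithmetic, using the values taken from the table above:

```python
# Derivation of Table B.1 "estimated responses" columns:
#   responses per site    = expected sample per site x response rate
#   responses across sites = responses per site x number of sites
rows = [
    # (respondent type, number of sites, expected sample per site, response rate)
    ("Grantee and partner staff (site visits)", 12, 10, 1.00),
    ("Employers (site visits)", 12, 2, 1.00),
    ("Program participants (site visits)", 12, 10, 0.50),
    ("Grantee and partner staff (telephone)", 11, 4, 1.00),
]
for name, sites, sample, rate in rows:
    per_site = sample * rate
    across_sites = per_site * sites
    print(f"{name}: {per_site:g} per site, {across_sites:g} across sites")
```

Running this reproduces the per-site and across-site response counts shown in the table (for example, 10 invited participants per site at a 50 percent response rate yields 5 per site and 60 across the 12 site-visit grantees).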
A subset of 12 grantees will be selected for in-depth site visits, during which in-person interviews with staff and employers, as well as participant focus groups, will be conducted. Grantees are required to participate in the evaluation. For the remaining 11 grantees, telephone interviews with program stakeholders will be conducted. Grantees will be selected for site visits if they (1) are implementing promising or innovative strategies and (2) are diverse along several dimensions. Examples of promising or innovative strategies include active engagement of strong employer partners, new or unique curricula to meet local industry needs, clear career pathways models with stackable credentials for entry and advancement in an H-1B field, or weaving of funding sources for tuition-free education. The study team will also consider grantees that are diverse in structure and maturity of partnerships, number and strength of employer partnerships, type of sector, population served, types of training, urbanicity, region, and other priorities. The previously cleared grantee survey will provide findings representing all 23 grantees, and the site visits will provide perspectives from the field in a subset of grantees to more fully tell the story of America’s Promise implementation at the grantee level.
No statistical methods will be used in selecting interviewees and focus group participants. Participants will be purposefully selected based on their engagement with the America’s Promise activities occurring at each grantee, described in further detail below.
Program stakeholders. Across the 12 grantees selected for site visits, the grant manager and another key staff member will be selected to participate in an in-person semi-structured interview. Although the number of non-employer partners may vary by site, we anticipate interviewing eight non-employer partner staff at each grantee. Across the remaining 11 grantees, we again anticipate interviewing the grant manager and another key staff member at each grantee, along with two partner staff.
Employers. For the 12 sites selected for site visits, two employer partners of the grantee will be selected to participate in either an in-person or telephone interview (depending on availability during the site visit window). Discussions with employers will supplement systematic data collection. The evaluation team will work with grantees to identify two employer partners to participate in site visit interviews; these employers will be purposively selected. Grantee staff will be asked to identify employers that have had the most frequent contact with and demonstrated involvement in the America’s Promise program, to ensure that the evaluation team can capture interviewed employers’ perspectives on and involvement in America’s Promise. Focusing on these employer partners will allow the study team to address research questions about the factors that influenced their high levels of involvement in the partnerships. Interviewed employers will provide insights on research questions including how their regional partnerships were developed and maintained and the regional and community context of the America’s Promise grants they participated in.
Focus group participants. Focus groups will be arranged at each of the 12 grantees selected for site visits. To select focus group participants, the study team will use a convenience sample, relying on staff at each selected grantee to identify and invite 10 participants who have received a high number of America’s Promise grant services. Specifically, we will request participants who have entered education and training, actively participated in case management, and participated in job placement services. Recruitment will target 120 individuals across all sites, with the expectation that up to 5 per site are likely to attend, for a total of 60 participants across sites (a response rate of 50 percent). Focus group participants will be purposively selected in coordination with the grantees. Focus groups will center on learning about selected participants’ perspectives on America’s Promise services. Focusing on participants who received a high number of services will allow the study team to address research questions about how participants engaged with the types and combinations of services provided through the grants. Importantly, these qualitative questions are meant to complement the impact study with descriptions of how participants engaged with services; they will not provide data related to possible impacts from the services. The data collected from focus group participants will not be generalized to the broader universe of America’s Promise program participants. Insights from focus group participants will describe the types and combinations of services provided to them, their labor market outcomes, and their characteristics.
Understanding the effectiveness of the America’s Promise program requires data collection from multiple sources. To collect these data, the study team will interview program stakeholders, employers, and participants between fall 2019 and winter 2020, when partnerships will have achieved a steady state of operations. In-person site visits will be made to 12 grantees, including the 6 grantees that participated in the partner network survey. Two-person site visit teams will conduct interviews with program stakeholders and employers, and conduct participant focus groups, over two and a half days. Semi-structured protocols for each interview type will ensure consistent, high-quality data collection across sites. Telephone interviews will be conducted with program stakeholders of the 11 grantees that do not receive site visits, also between fall 2019 and winter 2020. Interviews conducted in person will be in-depth and open-ended; telephone interviews will be shorter and tailored to the circumstances of each grantee.
During each site visit to the selected 12 grantees, the study team will conduct one focus group. The focus groups will gather information about the services and training participants have received through the grant, their assessment of these services and trainings, and their overall experiences with the program. At the beginning of each focus group, the study team will hand out a paper information form to each participant. The information form will collect details on participant demographics, education and employment history, and America’s Promise program participation. The form is expected to take approximately five minutes to complete. Based on previous experience conducting focus groups, the study team anticipates that 100 percent of focus group participants will complete an information form.
The goal of the implementation evaluation is to obtain a comprehensive picture of how the America’s Promise grants unfolded. This will be done by learning how the regional workforce and partnerships were developed and maintained; the types and combinations of services the partnerships provided; the characteristics of the target population; and the community contexts of the grantees. To do this, we will conduct in-person interviews at 12 grantee sites and telephone interviews with program stakeholders for 11 grantees.
The main type of data collected from the interview and focus group respondents will be qualitative information about staff’s experiences and insights implementing the America’s Promise grant or, in the case of participants, their motivations for participating in America’s Promise and their experiences while doing so. Thus, no statistical methodology (such as sample stratification) or estimation will be needed in the analysis of the interview or focus group data. The Consolidated Framework for Implementation Research will be used to guide the analysis of implementation data gathered from all 23 grantees, including identification of facilitators and barriers.1 This framework was developed to facilitate systematic assessment of the implementation context to reveal respondents’ perspectives on common implementation challenges and promising strategies.
Analysis of interview and focus group data will involve coding and triangulating across data sources. The evaluation team will begin by writing up detailed field notes from in-person and telephone interviews and focus groups in a structured format. To code the qualitative data for key themes and topics, a coding scheme will be developed and organized according to key research questions and topics and guided by the conceptual framework as well as constructs from the Consolidated Framework for Implementation Research on factors that affect implementation. Each segment of coded data will be assigned a negative or positive flag to identify barriers to and facilitators of implementation. This process will reduce the data into a manageable number of topics and themes for analysis (Ritchie and Spencer 2002).2 The evaluation team will then code the data using qualitative analysis software, such as NVivo or ATLAS.ti. To ensure reliability across team staff, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies. These data will be used to describe the nuances of how partnerships developed as they did, and to explore implementation challenges and promising practices. Because the implementation study is examining grant implementation, study findings will apply only to the America’s Promise grantees and will not be more broadly generalizable.
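The intercoder reliability step described above could, for instance, be summarized with a simple percent-agreement calculation. The sketch below is hypothetical: the codes and segments are illustrative and do not reflect the study’s actual coding scheme.

```python
# Hypothetical intercoder reliability check on an initial set of coded
# segments; codes here are illustrative, not the study's actual scheme.
coder_a = ["barrier", "facilitator", "barrier", "context", "facilitator"]
coder_b = ["barrier", "facilitator", "context", "context", "facilitator"]

# Count segments where both coders assigned the same code
agreements = sum(a == b for a, b in zip(coder_a, coder_b))
agreement_rate = agreements / len(coder_a)
print(f"Percent agreement: {agreement_rate:.0%}")
```

Segments where the coders disagree (here, the third segment) would be discussed and resolved before the team proceeds to code the full set of field notes.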
The data gathered through the interview respondent information forms and participant focus group information forms will be tabulated using descriptive methods (including simple frequencies, cross-tabulations, and means, when appropriate) to provide contextual information about the characteristics of participants who provide the qualitative interview and focus group data. We will not use these data to make inferences about the broader set of staff who provide grant services, or to participants who are engaged in those services.
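As a minimal illustration of the descriptive methods described above (simple frequencies, cross-tabulations, and means), a tabulation of information-form data might look like the following. The records and field names are invented for the example and do not reflect the actual forms.

```python
import pandas as pd

# Hypothetical participant information-form records; field names are
# invented for illustration and do not reflect the actual form.
forms = pd.DataFrame({
    "site": ["A", "A", "B", "B", "B"],
    "age": [24, 31, 45, 29, 38],
    "employed_at_entry": [True, False, False, True, False],
})

# Simple frequencies, cross-tabulations, and means, as described above
print(forms["employed_at_entry"].value_counts())               # frequencies
print(pd.crosstab(forms["site"], forms["employed_at_entry"]))  # cross-tabulation
print(round(forms["age"].mean(), 1))                           # mean age
```

Consistent with the text above, such tabulations would be used only to describe the respondents who provided qualitative data, not to support inference about the broader staff or participant populations.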
As mentioned previously, the semi-structured interview and focus group data will be used to describe the America’s Promise grants, including the perspectives of program staff, partners, and participants. We will use a purposefully selected set of respondents for interviews and a convenience sample for the focus groups. The selection will aim to capture the diversity of grantees’ experiences and the perspectives of multiple respondents in each grantee location. Without talking to all the key administrators and frontline staff, the study team might miss important information regarding the implementation of the America’s Promise grant.
As the study team schedules site visits, they will explain the nature of the visits and share a suggested schedule, so that staff know what the team expects of them when they agree to participate. Participating grantees will in turn suggest potential program and partner staff interviewees, along with focus group participants, based on the study team’s parameters. Site visitors will work with grantee staff to ensure that the timing of the visit is convenient for all staff involved. Site visitors will also work with staff to ensure that selected grantees are aware of the purpose of the study and available to participate. Grantee staff will be included on any communication with the local employers to encourage participation.
Semi-structured in-person and telephone interviews with program and partner staff. To ensure full cooperation from America’s Promise core and partner program staff, the study team will be flexible in scheduling site visit activities and telephone interviews to accommodate the particular needs of respondents. Furthermore, data collectors will meet with in-person interview respondents at the grantee site or another central location that is well-known and accessible.
Although the study team will try to arrange interviews that accommodate respondents’ scheduling needs, there might be instances when a respondent is unable to meet while the team is on site; when this happens, a member of the study team will request to meet with the respondent’s designee or schedule a follow-up call at a more convenient time. With these approaches, the study team anticipates a 100 percent response rate for staff interviews, as has been achieved on similar qualitative data collection efforts, such as those for the Workforce Investment Act Adult and Dislocated Worker Programs Gold Standard Evaluation, the Evaluation of the Linking to Employment Activities Pre-Release grants, the Evaluation of the Summer Youth Employment Initiative, and the Impact Evaluation of the Trade Adjustment Assistance Program.
Focus groups and participant information forms. To encourage participation in the focus groups, the study team will use methods that have been successful for numerous other Mathematica studies, including providing easy-to-understand outreach materials, strategically scheduling focus groups at convenient times and locations, and offering incentives to encourage participants to respond.
Outreach materials will be used to help sites recruit participants for the focus groups. These materials will (1) describe the study, its purpose, and how the data collected will be used; (2) highlight DOL as the study sponsor; (3) explain the voluntary nature of participation in focus groups; and (4) provide a phone number and email address for questions that focus group members might have. Outreach materials for participants will be clear and succinct and convey the importance of the focus group data collection.
The study team will consider respondents’ schedules and availability when scheduling the focus groups to maximize response. In addition, data collectors will meet with respondents in locations that are convenient.
In addition to these strategies, we also plan to offer incentives for the focus group participants. Unlike the America’s Promise staff and partners who are invested in the grant and evaluation, America’s Promise participants will be more difficult to locate and invite to participate. Therefore, to encourage participation in the focus groups, the study team will use methods that have been successful for numerous other Mathematica studies, such as the Evaluation of Linking Employment Activities Pre-Release and the Evaluation of Youth Career Connect, including offering the focus group participants a $25 gift card incentive. Although this is a nominal amount that is not large enough to be coercive for participants, the payment will serve two purposes: (1) to facilitate recruitment by increasing the likelihood that an America’s Promise participant will agree to participate in the focus group, and (2) to acknowledge that participants’ time is valuable and that participants may incur some costs (e.g., transportation and/or child care) to attend.
We will hand out the participant information forms at the end of each focus group after the discussion. The information form is a brief questionnaire written in clear and straightforward language. The average time required for respondents to complete the information form is estimated to be approximately 5 minutes for participants. Previous experience collecting such demographic and contextual information from focus group respondents suggests that 100 percent of focus group attendees will complete the information form.
Methods to ensure data reliability. We will use several well-proven strategies to ensure the reliability of the interview data collected during the site visits and telephone interviews. First, site visitors, all of whom already have extensive experience with this data collection method in a workforce setting, will be thoroughly trained in this study, its research questions, and goals. They will also be trained on how to probe for additional details to help interpret responses to interview questions. Second, this training and the use of the protocols (included in package) will ensure that the data are collected in a standardized way across sites. Finally, all interview and focus group respondents will be assured that their responses will remain private; reports will never identify respondents by name, and any quotes will be devoid of identifying information, including site name. No monetary or nonmonetary incentives will be provided to interview respondents.
All procedures and protocols to be used in the America’s Promise Implementation Study have been reviewed by content and methodological experts to ensure clarity and optimal ordering of the questions. To ensure that the America’s Promise Implementation Study interview protocols can be used effectively as a field guide to yield comprehensive and comparable data across grantee sites, senior research staff will use the telephone interview guides and the site visit protocols during the first call or visit. The interview guides and protocols will serve as discussion guides with general purpose areas and overarching objectives, meaning the data collected will be qualitative and the questions asked will not be identical across all sites. While early visits may lead to very minor refinements to the question ordering or terms used, these are the final guides and protocols. Data from all site visits, regardless of timing, will be analyzed using the same set of procedures. As described above, the study team will code site visit data by topic and theme rather than by individual interview questions. This approach will allow the study team to apply a consistent approach to analyzing interview data despite variation in protocol questions. In advance of the implementation site visits and phone interviews, the study team will train all site visitors to ensure a common understanding of the key objectives and concepts as well as fidelity to the protocols. The training session will cover topics such as the study purposes and research questions, data collection protocols, procedures for scheduling visits and conducting on-site activities (including a review of interview facilitation techniques and procedures for protecting the privacy of respondents), and post-visit files and summaries.
Consultations on the statistical methods used in this study will ensure the technical soundness of the study. The following individuals are being consulted on statistical aspects of the design:
Peter Mueser, PhD
Professor, Department of Economics and Truman School of Public Affairs
University of Missouri, Columbia, MO 65211

Mary Alice McCarthy
Director, Center on Education and Skills
New America, 740 15th Street NW, Suite 900, Washington, DC 20005

Margaret Hargreaves
Principal Associate
Community Science, 438 N. Frederick Avenue, Suite 315, Gaithersburg, MD 20877
The following individuals consulted on statistical aspects of the design and will also be primarily responsible for actually collecting and analyzing the data for the agency:
Mathematica Policy Research
Ms. Jeanne Bellotti (609) 275-2243
Dr. Jillian Berk (202) 264-3449
Dr. Robert Santillano (510) 285-4653
Ms. Diane Paulsell (609) 275-2297
Consultant
Dr. Kevin Hollenbeck (269) 343-5541
1 Damschroder, L.A., D.C. Aron, R.E. Keith, S.R. Kirsh, J.A. Alexander, and J.C. Lowery. “Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science.” Implementation Science, vol. 4, no. 7, August 7, 2009.
2 Ritchie, J., and L. Spencer. “Qualitative Data Analysis for Applied Policy Research.” In The Qualitative Researcher’s Companion, edited by M. Huberman and B. Miles. London: Sage, 2002.