PART A: JUSTIFICATION for Evaluation of America’s Promise Job-Driven Grant Program
OMB No. 1290-0020
APRIL 2020
The Chief Evaluation Office of the U.S. Department of Labor (DOL) has commissioned an evaluation of the America’s Promise Job-Driven Grant program (America’s Promise). This program aims to create or expand regional partnerships that identify the needs of specific industry sectors relying on the H-1B visa program to hire skilled foreign workers and that prepare the domestic workforce for middle- and high-skilled, high-growth jobs in those sectors. The America’s Promise evaluation offers a unique opportunity to build knowledge about the implementation and effectiveness of these regional partnerships. Additionally, because the grants are in a mature state of operation, the evaluation is well positioned to learn how grantees’ established programs, employer partnerships, and service delivery approaches may be changing as a result of the COVID-19 pandemic. Mathematica Policy Research and its subcontractor Social Policy Research Associates have been contracted to conduct an implementation and impact evaluation. A request to collect information for data collection activities associated with the implementation evaluation, as required by the Paperwork Reduction Act (PRA), was approved by the Office of Management and Budget (OMB) (OMB Control Number 1290-0020) on February 2, 2019. This package requests clearance for five additional data collection activities as part of the implementation evaluation:
Program stakeholder interview protocol (in-person, if feasible, or else virtual)
Employer interview protocol
Participant focus group protocol
Participant focus group information form
Program stakeholder interview protocol (telephone)
A persistent skills gap separates the qualifications of American workers from the needs of many American businesses. U.S. firms annually sponsor hundreds of thousands of nonimmigrant H-1B visas to fill skilled positions.1 To reclaim some of these jobs for the American workforce, in January 2017 DOL awarded more than $110 million to 23 grantees for America’s Promise. The purpose of these four-year grants is to support local partnerships among workforce agencies, employers, industry representatives, training providers, community-based organizations, and economic development agencies that identify the needs of specific industry sectors relying on the H-1B visa program for workers and implement career pathway programs that build the skills of the domestic workforce for middle- and high-skilled jobs in those sectors.
Citation of sections of laws that justify this information collection: The America’s Promise grant program and subsequent evaluation are authorized by the American Competitiveness and Workforce Improvement Act, as codified in Title 29 of the U.S. Code, which states that “the Secretary of Labor shall . . . award grants to eligible entities to provide job training and related activities for workers to assist them in obtaining or upgrading employment in industries and economic sectors . . . projected to experience significant growth and ensure that job training and related activities funded by such grants are coordinated with the public workforce investment system” (29 U.S.C. 3224a).
A request to collect information for data collection activities associated with the implementation evaluation was approved by OMB (1290-0020) on February 2, 2019. This package requests clearance for five additional data collection activities, which need to begin in May 2020, as part of the implementation evaluation. Because the America’s Promise grants end in December 2020, a timely start to the information collection is critical for providing DOL near real-time information about how the grants were implemented and about whether and how they were adapted during the COVID-19 pandemic.
The data collected through the activities summarized in this request will be used by DOL to comprehensively describe implementation of the America’s Promise grant program, including its partnerships, training and support services, target population, and common implementation successes and challenges, including how grantees adapted during the COVID-19 pandemic. Although not addressed through instruments included in this request, the evaluation will also assess the impacts of America’s Promise on participant outcomes, examining time periods both before and after the pandemic began. This analysis will rely on existing administrative data sets, the use of which does not require OMB approval. These data and the evaluation team’s descriptive and impact analyses will provide DOL and other policymakers with important information to guide management decisions, support future planning for such grant programs, including those that will be funded during the pandemic, and share evidence of the effectiveness of training approaches for middle- and high-skill occupations.
The evaluation of America’s Promise includes two components: (1) an implementation evaluation to understand program implementation and partnership development and (2) an impact evaluation to measure the effects of America’s Promise on participant outcomes. Both components will take place over five years (2017 to 2022) and will address the following research questions:
1. How were regional partnerships developed and maintained? What factors did site visit respondents report as influencing partnership development and employer engagement? (implementation evaluation)
2. What types and combinations of services and approaches were provided? How were they implemented? (implementation evaluation)
3. What were the characteristics of enrolled participants? (implementation evaluation)
4. What was the regional and community context of the America’s Promise grantees? (implementation evaluation)
5. What changes did America’s Promise grantees make to their programs as a result of the COVID-19 pandemic? (implementation evaluation)
6. What impact did America’s Promise have on participants’ labor market outcomes? (impact evaluation)
7. How did the impact of America’s Promise vary by participant characteristics or program components? (impact evaluation)
The implementation evaluation component will answer research questions 1 through 5. This component includes a grantee survey involving all 23 grantees; a review of grant documents from all 23 grantees; a partner network survey involving approximately six grantees; interviews with program stakeholders and employers, as well as participant focus groups, during site visits to 12 grantees (planned to be in person, if feasible); and telephone interviews with program stakeholders from the remaining 11 grantees. The PRA request approved by OMB on February 2, 2019 (OMB Control Number 1290-0020) includes the grantee survey and the partner network survey. This PRA clearance request includes the protocols that will be used during the on-site and telephone interviews and focus groups. The impact evaluation component will use administrative data to address research questions 6 and 7.
The 12 sites selected for in-person site visits will include the six sites selected for the partner network survey (see 83 FR 51984), as well as an additional six sites, identified through data collected in the grantee survey (see, again, 83 FR 51984), that appear to meet key criteria of interest to DOL. These criteria include the structure and maturity of partnerships, the number and strength of employer partnerships, type of sector, population served, type of training, urbanicity, and region. The 11 sites not selected for in-person site visits will participate in telephone interviews.
Understanding the implementation and effectiveness of America’s Promise requires data collection from multiple sources. The implementation evaluation instruments in this clearance request are the protocols that will be used, beginning in May 2020, to conduct in-person (if feasible) interviews and focus groups during site visits to approximately 12 of the 23 grantees and telephone interviews with the 11 remaining grantees. Interviews conducted in person will be in-depth, using a semi-structured master protocol with open-ended question prompts; telephone interviews will use a subset of questions from the same semi-structured protocols. Telephone interviews will prioritize topics of interest to DOL and the analysis: community context, organization, and administrative structure; recruitment, enrollment, and participant characteristics; America’s Promise services; and alternative services, outcomes, and sustainability. This package seeks clearance for interview protocols for three types of respondents: program stakeholders, employers, and small groups of current and former program participants.
Program stakeholder interview protocol (in-person). This protocol will be used to conduct in-person interviews with grantee managers, staff, and key members of the regional partnership (again, if feasible; if not, these interviews will be conducted virtually). The protocol will cover program structure, community context, recruitment, a service overview, alternative services available, what changes were made to the program as a result of the COVID-19 pandemic (including whether and how employer partnerships were affected), participant characteristics and outcomes, and sustainability. The in-person interviews are expected to take between 60 and 105 minutes, depending on respondent type. If an in-person interview cannot be conducted during the site visit, the interview will be conducted via telephone using the in-person interview protocol to ensure that similar topics are discussed with all respondents.
Employer interview protocol. This protocol will be used in semi-structured interviews to collect information on employers’ roles in service design and implementation (including whether COVID-19 changed their training needs or role in the program), their perception of the quality and effectiveness of program services, whether they hire or advance participants, and whether participants acquire the skills required to be successful. It will also provide important insight into the local economic context, including how COVID-19 has affected employers’ industries locally. These interviews will be conducted in person during the site visits and are expected to take approximately 60 minutes to complete. If we are unable to schedule interviews while on site, they will be conducted via telephone at a later date.
Participant focus group protocol. This protocol will be used to conduct focus groups with a small number of participants at each visited site. It will gather data on participants’ backgrounds, reasons for seeking program services, experiences with America’s Promise, and outcomes after participating. To ensure fully informed consent, the study team will collect written consent from all participants at the start of each focus group. The written consent forms will describe the purpose of the study; outline the information that will be collected; explain the risks, benefits, and voluntary nature of participation; and record participants’ consent to participate. The focus groups will be conducted in person and are expected to take approximately 90 minutes to complete.
Participant focus group information form. This form will be distributed to focus group participants for completion at the beginning of each focus group. It will collect details on participant demographics, education and employment history, and America’s Promise program participation. The form is expected to take approximately five minutes to complete.
Program stakeholder interview protocol (telephone). This protocol will be used to conduct telephone interviews with grantee managers, staff, and key members of the regional partnership. This protocol will cover community context, organization, and administrative structure; recruitment, enrollment, and participant characteristics; America’s Promise services; and alternative services, outcomes, and sustainability. The telephone interviews are expected to take approximately 120 minutes to complete.
Proposed uses for each data collection activity are described in Table A.1.
Table A.1. How data will be used, by data collection activity
Data collection activity | How the data will be used
1. Program stakeholder interviews (in-person) | We will conduct in-person interviews (if feasible) with America’s Promise grantee and partner staff to describe program structure, community context, recruitment and participant characteristics, service overview, alternative services available, changes the program made as a result of COVID-19, outcomes, and sustainability.
2. Employer interviews | We will conduct in-person interviews (if feasible) with employers to describe their role in service design and implementation, their perception of the quality and effectiveness of services, whether they hire or advance participants, whether participants acquire the skills required to be successful, and whether their training needs or role in the program changed as a result of COVID-19.
3. Participant focus groups | We will conduct in-person focus groups with a subset of participants to describe participant characteristics, reasons for seeking services, experiences with America’s Promise, and outcomes after participating.
4. Participant focus group information form | We will administer the information form to describe characteristics of the population participating in the focus groups.
5. Program stakeholder interviews (telephone) | We will conduct telephone interviews with America’s Promise grantee and partner staff to describe community context, organization, and administrative structure; recruitment, enrollment, and participant characteristics; America’s Promise services; and alternative services, outcomes, and sustainability.
The evaluation team will primarily use email to facilitate the logistics and scheduling of the site visits and interviews, reducing the burden on participants. Site visitors for the evaluation of America’s Promise will use electronic audio recorders to record the semi-structured interviews, allowing them to conduct interviews in the shortest time possible because they will not need to use interview time to take detailed notes. Site visitors will use no other information technology.
The evaluation of America’s Promise will not collect information that is available through other sources. For example, the evaluation will use available information from grantee applications and existing administrative data sets to ensure that data collected through interviews and focus groups are not available elsewhere.
Interviews could be conducted with employers or program stakeholders from small businesses or other small entities. We will request only the information required for the intended use and will minimize burden by restricting interviews to the minimum required length.
If the in-person and telephone interviews are not conducted, DOL and other stakeholders will not have the information necessary to answer the evaluation’s key research questions, and a comprehensive implementation analysis of America’s Promise could not occur. Policymakers would lack information about the context in which the partnerships and programs operated (both before and during the pandemic), the operational challenges faced by grantees and partners, how the partnerships and services evolved over time, how grantees adapted during the pandemic, whether the approaches were effective, implications for interpreting results (particularly because the evaluation will measure participant outcomes in time periods covering both before and during the pandemic), and implications for improving similar programs, particularly those being funded in the next several months.
Explain any special circumstances that would cause an information collection to be conducted in a manner:
* Requiring respondents to report information to the agency more often than quarterly;
* Requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
* Requiring respondents to submit more than an original and two copies of any document;
* Requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;
* In connection with a statistical survey, that is not designed to produce valid and reliable results that can be generalized to the universe of study;
* Requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
* That includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or which unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
* Requiring respondents to submit proprietary trade secrets, or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
No special circumstances apply to this data collection.
A 60-day notice to solicit public comments was published in the Federal Register at 83 FR 54943 on November 1, 2018 (a title correction was published at 83 FR 55561 on November 6, 2018). One comment was received. It suggested that the program was ineffective and costly, but it provided no evidence to support that statement. DOL acknowledged receipt of the comment; indeed, the purpose of this evaluation is to provide evidence of the program’s effectiveness and to identify strategies to support ongoing program improvements.
Describe efforts to consult with persons outside the agency to obtain their views on the availability of data, frequency of collection, the clarity of instructions and recordkeeping, disclosure, or reporting format (if any), and on the data elements to be recorded, disclosed, or reported.
Consultation with representatives of those from whom information is to be obtained or those who must compile records should occur at least once every 3 years - even if the collection of information activity is the same as in prior periods. There may be circumstances that may preclude consultation in a specific situation. These circumstances should be explained.
Consultation on the research design and data needs is being coordinated by the evaluation team and involves discussions with outside experts and site-level program staff. The purpose of consulting outside experts is to ensure the technical soundness of the evaluation and the relevance of its findings, and to verify the importance, relevance, and accessibility of the information sought. The experts participating in the evaluation technical working group are listed in Table A.2. The purpose of consulting program staff was to better understand the feasibility of the research design within the regional context of grantees.
Table A.2. Individuals providing consultation on America’s Promise evaluation design
Peter Mueser, PhD, Professor, Department of Economics and Truman School of Public Affairs, University of Missouri, Columbia, MO 65211
Mary Alice McCarthy, Director, Center on Education and Skills, New America, 740 15th Street NW, Suite 900, Washington, DC 20005
Margaret Hargreaves, Principal Associate, Community Science, 438 N. Frederick Avenue, Suite 315, Gaithersburg, MD 20877
9. Explain any decision to provide any payment or gift to respondents, other than remuneration of contractors or grantees.
There are no payments or gifts to program and partner staff: their participation is expected to occur in the course of their employment, and no compensation will be provided beyond their normal pay. Participants in the focus groups will receive a $25 gift card.
Information collected will be kept private to the extent permitted by law. The evaluation team complies with DOL data security requirements by implementing security controls for processes that it routinely uses in projects that involve sensitive data. Further, the evaluation is being conducted in accordance with all relevant regulations and requirements.
12. Provide estimates of the hour burden of the collection of information.
* Indicate the number of respondents, frequency of response, annual hour burden, and an explanation of how the burden was estimated. Unless directed to do so, agencies should not conduct special surveys to obtain information on which to base hour burden estimates. Consultation with a sample (fewer than 10) of potential respondents is desirable. If the hour burden on respondents is expected to vary widely because of differences in activity, size, or complexity, show the range of estimated hour burden, and explain the reasons for the variance. Generally, estimates should not include burden hours for customary and usual business practices.
* If this request for approval covers more than one form, provide separate hour burden estimates for each form and aggregate the hour burdens.
* Provide estimates of annualized cost to respondents for the hour burdens for collections of information, identifying and using appropriate wage rate categories. The cost of contracting out or paying outside parties for information collection activities should not be included here. Instead, this cost should be included under “Annual Cost to Federal Government.”
Table A.3 provides annual burden estimates for each of the data collection activities for which this package requests clearance. All of the activities covered by this request will take place over a three-year period. To calculate the estimated cost burden for respondents, average hourly wages from the U.S. Bureau of Labor Statistics, National, State, Metropolitan, and Nonmetropolitan Area Occupational Employment and Wage Estimates for May 2018 were multiplied by the number of hours per respondent type. The following summarizes the annual burden estimates for each of the five data collection activities:
Program stakeholder interviews (in-person). Grantee and partner staff interviews will be conducted in person at 12 grantee sites. On average, in-person interviews with grant managers and other stakeholders will take 75 minutes to complete. Two grantee staff (the grant manager and another key staff member) and eight non-employer partner staff are expected to be interviewed during each site visit, for a total of 10 stakeholders per grantee. The total burden for site visit interviews is 150 hours (10 stakeholders × 12 grantees × 75/60 hours); the annualized burden is 50 hours.
Employer interviews. Employer interviews will be conducted in person during the course of the 12 grantee site visits with a total of 24 respondents (2 employers × 12 grantees). If the evaluation team is unable to schedule these interviews during the site visit window, they may be conducted via telephone after the visit has occurred. These interviews will take 60 minutes to complete. Total burden for the employer interviews is 24 hours (24 respondents × 60/60 hours); the annualized burden is 8 hours.
Participant focus groups. Focus groups with a subset of participants will take place during the in-person site visits. Each focus group will take 90 minutes to complete. Five participants are expected at each of the 12 sites visited, for a total of 60 respondents (5 participants × 12 grantees). The total burden is 90 hours (60 respondents × 90/60 hours); the annualized burden is 30 hours.
Participant focus group information form. Forms will be administered to participants at the start of each focus group. Each form will take 5 minutes to complete. Five participants are expected at each of the 12 sites visited, for a total of 60 respondents (5 participants × 12 grantees). The total burden is 5 hours (60 respondents × 5/60 hours); the annualized burden is 2 hours.
Program stakeholder interviews (telephone). Grantee and partner staff interviews will be conducted via telephone for the remaining 11 grantees. The telephone interviews will take 120 minutes to complete. Two grantee staff and two partner staff are expected on each call, for a total of four stakeholders per grantee. The total burden for telephone interviews is 88 hours (4 stakeholders × 11 grantees × 120/60 hours); the annualized burden is approximately 30 hours.
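The burden arithmetic above can be checked mechanically. The following Python sketch is illustrative only: the respondent counts, interview lengths, and wage rates come from this section, while the three-year annualization and the convention of rounding annualized respondent counts up before computing hours are assumptions inferred from Table A.3.

```python
import math

# Illustrative sketch of the burden arithmetic in this section.
# Figures come from the narrative above; the 3-year annualization and the
# rounding conventions are assumptions inferred from Table A.3.
YEARS = 3

# activity: (total respondents over the clearance period,
#            minutes per response, average hourly wage)
activities = {
    "Stakeholder interviews (in-person)": (120, 75, 45.36),   # 10 x 12 grantees
    "Employer interviews": (24, 60, 45.36),                   # 2 x 12 grantees
    "Participant focus groups": (60, 90, 18.58),              # 5 x 12 grantees
    "Focus group information form": (60, 5, 18.58),           # same 60 participants
    "Stakeholder interviews (telephone)": (44, 120, 45.36),   # 4 x 11 grantees
}

total_hours = total_cost = 0
for name, (respondents, minutes, wage) in activities.items():
    annual_respondents = math.ceil(respondents / YEARS)
    annual_hours = round(annual_respondents * minutes / 60)
    annual_cost = round(annual_hours * wage)
    total_hours += annual_hours
    total_cost += annual_cost
    print(f"{name}: {annual_respondents} respondents/year, "
          f"{annual_hours} hours/year, ${annual_cost:,}")

print(f"Totals: {total_hours} annual burden hours, ${total_cost:,}")
# Expected: 120 annual burden hours, $4,586 (matching Table A.3)
```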
Table A.3. Estimated Annualized Respondent Hour and Cost Burden
Data collection activity | Number of respondents | Number of responses per respondent | Total number of responses | Average burden per response (in hours) | Annual burden hours | Average hourly wage a | Annual monetized burden
Semi-structured program stakeholder interviews (in-person) | 40 | 1 | 40 | 75/60 | 50 | $45.36 | $2,268
Employer interviews | 8 | 1 | 8 | 60/60 | 8 | $45.36 | $363
Participant focus groups | 20 | 1 | 20 | 90/60 | 30 | $18.58 | $557
Participant focus group information form | 20 | 1 | 20 | 5/60 | 2 | $18.58 | $37
Semi-structured program stakeholder interviews (telephone) | 15 | 1 | 15 | 120/60 | 30 | $45.36 | $1,361
Unduplicated Total | 103 | -- | 103 | -- | 120 | -- | $4,586
a The hourly wage of $45.36 is the May 2018 median wage for Education Administrators, Postsecondary (see http://www.bls.gov/oes/current/oes_nat.htm); $18.58 is the May 2018 median wage across all occupations in the United States.
* Generally, estimates should not include purchases of equipment or services, or portions thereof, made: (1) prior to October 1, 1995, (2) to achieve regulatory compliance with requirements not associated with the information collection, (3) for reasons other than to provide information or keep records for the government, or (4) as part of customary and usual business or private practices.
The total cost to the Federal government over three years is $427,486, and annualized cost to the federal government is $142,495. Costs result from the following categories:
The estimated cost to the federal government for the contractor to carry out the site visit interviews and the telephone interviews is $368,188.2 Annualized, this comes to $122,729.
The annual cost borne by DOL for federal technical staff to oversee the contract is estimated to be $19,766. We expect the annual level of effort to perform these duties will require 200 hours for one federal GS-14, step 4, employee based in Washington, DC, earning $61.77 per hour (see the Office of Personnel Management 2019 Hourly Salary Table at https://www.opm.gov/policy-data-oversight/pay-leave/salaries-wages/salary-tables/pdf/2019/DCB_h.pdf). To account for fringe benefits and other overhead costs, the agency has applied a multiplication factor of 1.6:
200 hours × $61.77 × 1.6 = $19,766.
Thus the total annualized federal cost is $122,729 + $19,766 = $142,495.
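The federal cost figures above can be restated in the same way; in the sketch below, the contractor cost, staff hours, hourly rate, 1.6 overhead multiplier, and three-year annualization are all taken from the text.

```python
# Illustrative restatement of the federal cost arithmetic above.
YEARS = 3
CONTRACTOR_COST = 368_188        # total contractor cost (includes gift cards)
STAFF_HOURS = 200                # annual federal oversight hours
HOURLY_RATE = 61.77              # GS-14, step 4, Washington, DC (2019)
OVERHEAD = 1.6                   # fringe benefits and overhead multiplier

contractor_annual = round(CONTRACTOR_COST / YEARS)          # $122,729
staff_annual = round(STAFF_HOURS * HOURLY_RATE * OVERHEAD)  # $19,766
print(f"Total annualized federal cost: ${contractor_annual + staff_annual:,}")
# Expected: $142,495
```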
The Consolidated Framework for Implementation Research will be used to guide the analysis of implementation data gathered from all 23 grantees, including identification of facilitators and barriers.3 This framework was developed to facilitate systematic assessment of the implementation context to reveal the factors that influence implementation, common implementation challenges, and promising strategies for replication.
Analysis of interview data will involve coding and triangulating across data sources. The evaluation team will begin by writing up detailed field notes from in-person and telephone interviews in a structured format. To code the qualitative data for key themes and topics, the team will develop a coding scheme organized around the key research questions and topics and guided by the conceptual framework, as well as by constructs from the Consolidated Framework for Implementation Research on factors that affect implementation. Each segment of coded data will be assigned a negative or positive flag to identify barriers to and facilitators of implementation. This process will reduce the data to a manageable number of topics and themes for analysis (Ritchie and Spencer 2002).4 The evaluation team will then code the data using qualitative analysis software. To ensure reliability across team staff, all coders will code an initial set of documents and compare codes to identify and resolve discrepancies. These data will be used to describe the nuances of how and why partnerships developed as they did, and to explore implementation challenges and promising practices.
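To make the coding approach concrete, the sketch below shows one way a coded segment with a barrier/facilitator flag might be represented, along with a simple percent-agreement check between two coders. It is illustrative only: the field names, example values, and agreement measure are assumptions, and the evaluation team will use dedicated qualitative analysis software rather than custom code.

```python
from dataclasses import dataclass

# Illustrative only: a minimal representation of a coded data segment with the
# barrier/facilitator flag described above. Field names are hypothetical.
@dataclass
class CodedSegment:
    grantee: str
    topic: str      # keyed to a research question (e.g., "partnerships")
    code: str       # theme from the coding scheme
    flag: str       # "facilitator" (positive) or "barrier" (negative)
    excerpt: str    # supporting text from the field notes

segments = [
    CodedSegment("Grantee A", "partnerships", "employer_engagement",
                 "facilitator", "Employers helped co-design the curriculum..."),
    CodedSegment("Grantee A", "services", "covid_adaptation",
                 "barrier", "Hands-on training was suspended in March..."),
]

def percent_agreement(codes_a, codes_b):
    """Share of segments on which two coders applied the same code,
    used to surface discrepancies for discussion and resolution."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return matches / len(codes_a)

print(percent_agreement(["barrier", "facilitator", "barrier"],
                        ["barrier", "facilitator", "facilitator"]))  # ~0.67
```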
In early 2022, we will produce a report on the implementation and impact evaluations, as well as other dissemination products such as fact sheets and issue briefs on topics of interest to DOL, in as timely a manner as possible to support DOL’s decision-making during the pandemic.
18. Explain each exception to the topics of the certification statement identified in “Certification for Paperwork Reduction Act Submissions.”
No exceptions to the certification statement are requested for this information collection.
1 U.S. Department of State. “Report of the Visa Office 2016, Table XVI (B) Nonimmigrant Visas Issued by Classification (Including Border Crossing Cards) Fiscal Years 2012–2016.” Washington, DC: U.S. Department of State, 2017. Available at https://travel.state.gov/content/visas/en/law-and-policy/statistics/annual-reports/report-of-the-visa-office-2016.html
2 The total contractor cost includes the cost for $25 gift cards paid to focus group participants.
3 Damschroder, L.A., D.C. Aron, R.E. Keith, S.R. Kirsh, J.A. Alexander, and J.C. Lowery. “Fostering Implementation of Health Services Research Findings into Practice: A Consolidated Framework for Advancing Implementation Science.” Implementation Science, vol. 4, no. 50, August 7, 2009.
4 Ritchie, J., and L. Spencer. “Qualitative Data Analysis for Applied Policy Research.” In The Qualitative Researcher’s Companion, edited by M. Huberman and B. Miles. London: Sage, 2002.