
Job Corps Evidence Building Portfolio (JCEB)

OMB: 1290-0037










JOB CORPS EVIDENCE-BUILDING PORTFOLIO PROGRAM

SUPPORTING STATEMENT FOR OMB CLEARANCE REQUEST PART B

September 2021










CONTRACT NUMBER:

1605DC-18-A-0021


PREPARED FOR:

Jessica Lohmann

Gloria Salas-Kos

Chief Evaluation Office

U.S. Department of Labor


SUBMITTED BY:

MDRC




























Part B: Statistical Methods

B1. Respondent Universe and Sample

The Chief Evaluation Office (CEO) of the U.S. Department of Labor (DOL) seeks approval to collect information for the implementation, outcome, and impact feasibility studies of two Job Corps demonstration pilots. The Job Scholars pilot has 30 separate grantees. The Idaho Job Corps pilot has one grantee, which operates four locations. These studies, funded as part of the broader Job Corps Evidence-Building Portfolio Program (contract #1605DC-18-A-0021), aim to understand whom the pilots enroll, what services they provide, how those services are implemented, and how the pilots compare with traditional Job Corps centers. The evaluation will also assess outcomes of participants in the demonstration pilots and identify any best practices. The project also includes an impact feasibility assessment of each pilot to determine the potential for conducting an impact evaluation of that pilot's effectiveness or of similar future pilots. MDRC and its subcontractor Abt Associates have been contracted to conduct these studies. This supporting statement is the first and only OMB submission planned for the Job Corps Evidence-Building Portfolio Program. Part B of the Supporting Statement addresses the issues pertaining to Collections of Information Employing Statistical Methods.



This package requests clearance for four data collection activities:



  1. Program survey of Job Corps centers and demonstration pilot grantees.

  2. Interviews with program staff and staff from selected community partner organizations topic guide.

  3. Participant interviews or focus group topic guide.

  4. Impact feasibility assessment interviews with demonstration pilot staff topic guide.



In this Information Collection Request (ICR), clearance is sought for four instruments. The sampling approaches for these four instruments are:

  1. Program survey of demonstration pilot grantees. The project will field a program survey to all of the demonstration pilot grantees to gather information about program implementation, service offerings, and staffing. Key variables to be collected include: prior experience serving WIOA out-of-school youth, Job Corps participants, or youth with similar backgrounds; the formats of high school equivalency (HSE) instruction offered (self-paced online, drop-in, classroom-based) and of career training (such as online, classroom-based, work-based, blended); the availability of academic, personal, and career counseling/support and their staffing; how often students are required to meet with counselors and/or receive particular types of support services (such as academic support); the provision of free living arrangements; the types of staff training required (trauma, motivational interviewing, diversity, etc.); and staff caseloads.


The survey will be targeted at program directors. The survey will be fielded to all 30 pilot demonstration sites in winter 2021/2022. We expect a 100% response rate on the survey because participating in the evaluation is a condition of the pilots’ grants.


  2. Semi-structured interviews with program staff and staff from selected community partner organizations topic guide. Each of the pilot demonstration projects draws on a range of staff and partners to deliver services; thus, interviews may include pilot staff, partner staff, employers, and training and education providers. Sites and respondents will be selected purposively. The universe includes approximately 290 grantee and partner organization staff (10 staff at each of the 26 Job Scholars grantees, 30 staff total for Idaho Job Corps pilot sites). The team will interview up to 175 of these staff, chosen purposively to span the key variation in implementation. The team will conduct a site visit, in person or virtual, to Idaho Job Corps and will interview up to 30 staff, partners, and employers associated with the program across its four locations. Results from the grantee survey and a review of grantee materials will be used to identify Job Scholars sites for interviews. Up to five Job Scholars sites will be selected for site visits, which will also take place virtually or in person and will include up to 15 interviews with staff, partners, and employers associated with the program. Up to 15 Job Scholars sites will be selected for phone interviews, and up to five staff, partners, or employers at each of these sites will participate. Sites will be selected to ensure that the range of organizational characteristics and implementation strategies is represented. We will use organizational charts and information on each employee's role at the grantee organization and its partner organizations to identify specific staff to interview. Purposive selection is appropriate for staff selection because insights and information can only come from individuals with particular roles or knowledge. Participation in JCEB evaluation activities—including interviews—is a condition of the grant award.
We anticipate 95 percent participation from selected staff because the National Office of Job Corps will request their compliance, but a few staff may miss their scheduled interviews due to illness or staff turnover.1 Staff at partner organizations are not under the same mandate; however, we anticipate that 75 percent of selected partner organizations will participate because each site will leverage its relationship with the partner organization to encourage participation.2 The research team will identify one member to serve as the primary contact/site liaison between the research team and the sites. This member will work with the primary contact at each selected site to handle scheduling and logistics. To ensure staff are available during the site visit—whether the visit takes place in person or virtually—site visit dates will be confirmed at least one month in advance to allow enough time to schedule interviews. Interview dates and times will be finalized at least two weeks before the site visit. If a staff member is not available at their scheduled time, efforts will be made to reschedule the interview for an alternative time during the visit (if in person) or to take place virtually after the visit.


  3. Participant interviews or focus group topic guide. We will also interview demonstration pilot participants through one-on-one interviews or focus groups, which may be conducted in person, online, or over the phone. The universe is an estimated 2,830 participants (750 for Idaho Job Corps and 2,080 for Job Scholars). We will conduct interviews and focus groups with up to 25 participants at Idaho Job Corps. At each of the five Job Scholars sites selected for site visits, we will conduct interviews and focus groups with up to 25 participants. At the remaining sites, we will conduct up to 25 phone interviews with participants. Participants will be selected for interviews using a convenience sample. Because the research team does not have contact information for participants, we will rely on sites to recruit participants for interviews and focus groups. We will ask sites to recruit participants who reflect a diverse range of experiences and perspectives on the pilot programs. Though the views of this convenience sample are not intended to be representative of the larger universe of participants in the Job Corps pilots, the qualitative information will contextualize our understanding of participants' experiences. Because we are using a convenience sample and interviews will be scheduled in advance, we anticipate that 90 percent of participants scheduled for an interview or focus group will take part.3 We expect each site to work with participants to schedule interviews and focus groups at times that fit participants' other obligations (e.g., classes). If a participant is not available to attend their scheduled focus group, we will invite a participant from an alternate list. If a participant is not available during their scheduled interview time, every effort will be made to reschedule the interview within the week.



  4. Impact feasibility assessment interviews with demonstration pilot staff topic guide. In addition to the implementation and outcome study, the evaluation will gather information from select grantee staff about topics related to the feasibility of conducting an impact study of the demonstration pilots. The team will conduct phone, video, or in-person interviews with grantee staff who are involved in management, enrollment, and program services in winter 2022. The project will conduct 30 interviews across the 27 pilots. Idaho Job Corps will have up to 5 interviews, and the Job Scholars grantees will have up to 25 interviews; some Job Scholars grantees will have up to 3 interviews and some will have none. The results of the grantee survey and a review of grantee documents will be used to identify sites for impact feasibility interviews. Participation in JCEB evaluation activities—including interviews—is a condition of the grant award. We anticipate a 95 percent response rate from selected staff because the National Office of Job Corps will request their compliance.4 The research team will identify one member to serve as the primary contact/site liaison between the research team and the sites. This member will work with the primary contact at each selected site to handle scheduling and logistics. To ensure staff are available during the site visit—whether the visit takes place in person or virtually—site visit dates will be confirmed at least one month in advance to allow enough time to schedule interviews. Interview dates and times will be finalized at least two weeks before the site visit. If a staff member is not available at their scheduled time, efforts will be made to reschedule the interview during the visit (when in person) or to take place virtually after the visit.












Table B.1: Universe and Sample for Information Collection Instruments

Instrument 1: Pilot survey of program directors
  Estimated number of persons in universe: 30 pilot program directors
  Estimated number of persons in sample: 30 pilot program directors
  Sampling method: NA

Instrument 2: Staff and community partner interview topic guide
  Estimated number of persons in universe: 290 grantee and partner organization staff (10 staff at each of the 26 Job Scholars sites, 30 for the Idaho Job Corps demonstration)
  Estimated number of persons in sample: 175 (25 at the Idaho Job Corps demonstration, up to 15 each at up to 5 Job Scholars sites selected for site visits, 5 each at up to 15 sites selected for virtual interviews)
  Sampling method: Purposive

Instrument 3: Participant interview or focus group topic guide
  Estimated number of persons in universe: 2,830 participants (750 for Idaho Job Corps and 2,080 for Job Scholars, i.e., 80 enrollees at each of the 26 Job Scholars sites)
  Estimated number of persons in sample: 175 (up to 25 at Idaho Job Corps, up to 25 at each of the 5 Job Scholars sites selected for site visits, and up to 25 virtual interviews at the remaining sites)
  Sampling method: Convenience

Instrument 4: Impact feasibility assessment interviews with demonstration pilot staff topic guide
  Estimated number of persons in universe: 78 pilot staff (estimated 3 staff at each of the 26 Job Scholars sites who can inform the topic, such as the program director, enrollment staff, and data management staff)
  Estimated number of persons in sample: 30 (average of 1 interview at each demonstration pilot)
  Sampling method: Purposive

NA = Not applicable



B2. Statistical Methods for Sample Selection and Degree of Accuracy

Description of methodology for stratification, sample selection, clustering, and minimum detectable effects (MDEs)

B2.1. Statistical methodology for stratification and sample selection

There is no statistical sampling. No statistical methods will be used in the implementation analysis, and discussions of the results will be carefully phrased to make clear that no generalization is intended. The outcome data analysis will also not include sampling and will be conducted using program data on all participants.

B2.2. Description of Sample Selection Methodology

The sample selection is described in section B1.

B2.3. Estimation Methods

The survey and interviews are designed to provide in-depth qualitative information about the pilots. Frequencies of survey responses will be tabulated across all completed surveys. No estimation procedures will be used for the interview data; the analysis will be descriptive.
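As a purely illustrative sketch of the descriptive tabulation described above (the survey item, answer choices, and values below are hypothetical, not drawn from the actual instrument), frequencies could be computed like this:

```python
from collections import Counter

def response_frequencies(responses):
    """Tabulate the share of each answer choice across all completed surveys.

    `responses` holds the answers to a single survey item, one per grantee;
    None marks item nonresponse and is excluded from the denominator.
    """
    answered = [r for r in responses if r is not None]
    counts = Counter(answered)
    return {choice: count / len(answered) for choice, count in counts.items()}

# Hypothetical item: format of HSE instruction offered by each grantee.
item = ["classroom", "online", "classroom", None, "blended", "classroom"]
print(response_frequencies(item))
# "classroom" appears in 3 of the 5 answered responses -> 0.6
```

Because every grantee completes the survey, no weighting enters the calculation; frequencies are simple shares of non-missing responses.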

B2.4. Description of statistical tests

No statistical tests will be used.

B2.5. Minimum Detectable Effects

No effect sizes will be calculated.

B2.6. Specialized sampling procedures for unusual problems

No specialized sampling procedures will be utilized.

B2.7. Use of Periodic Data Collection Cycles to Reduce Burden

Data for each tool will only be collected once from each respondent. Data collection cycles will be spaced out so that respondents that may be asked to respond to more than one instrument (such as the grantee survey and an interview) have time in between data collection tasks. Additionally, duplication between instruments is minimal to reduce burden on respondents.





B3. Maximizing Response Rates and Addressing Nonresponse


Maximizing Response Rates



Grantee Survey

Participation in the evaluation was a condition of the pilot contracts. The National Office of Job Corps will request compliance from sites. Thus, it is anticipated that all program directors will respond to the pilot survey.

Site Visits and Interviews with Pilot and Partner Staff for Implementation Study and Impact Feasibility Assessments

Participation in the evaluation was a condition of the pilot contracts. The National Office of Job Corps will request compliance from sites. Thus, it is expected that sites and staff will cooperate with planning site visits and participating in interviews. Site visitors will work closely with sites to schedule site visits or virtual interviews. One member of the two-person site visit team will take responsibility for working with the primary contact person to handle scheduling and logistics (e.g., identifying appropriate interview respondents). Dates for in-person site visits will be set at least one month in advance to allow ample time to schedule interviews.

Participant Interviews and Focus Groups

Interview data are not intended to be representative in a statistical sense, in that they will not be used to make statements about the prevalence of experiences in the broader population of pilot participants. However, it is important to secure participants with a range of background characteristics in order to capture a variety of possible experiences with Job Corps demonstration pilot services. The team will work with site staff to recruit participants with diverse backgrounds and experiences. The team will also be flexible to ease barriers to participating in interviews, such as conducting interviews at times and places most convenient for potential interviewees.



Nonresponse

We do not anticipate nonresponse bias or item nonresponse issues for the pilot survey. We will nonetheless examine nonresponse to see whether its patterns have implications for the representativeness of our survey sample; if so, we will interpret results with more caution. We will not impute or weight data. We do not plan to delete cases with missing data (listwise deletion), but we may exclude a case from some measures if, for example, data are missing for a complete module. Otherwise, averages will be calculated over all non-missing cases.
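The missing-data rule above (no imputation, no weighting, no listwise deletion; averages over non-missing cases, with exclusion only when a whole module is missing) can be sketched as follows. The item names and values are hypothetical, used only to illustrate the rule:

```python
def item_mean(values):
    """Average a numeric survey item over non-missing cases only.

    No imputation, weighting, or listwise deletion is applied;
    None marks item nonresponse.
    """
    present = [v for v in values if v is not None]
    return sum(present) / len(present) if present else None

def missing_whole_module(record, module_items):
    """Flag a case for exclusion from a measure only when an entire
    module's items are all missing for that respondent."""
    return all(record.get(item) is None for item in module_items)

# Hypothetical caseload item; one respondent skipped it, so the mean
# is taken over the three non-missing responses.
caseloads = [40, 55, None, 45]
print(item_mean(caseloads))

# Hypothetical staffing module: this respondent skipped the whole module.
respondent = {"caseload": None, "staff_training": None}
print(missing_whole_module(respondent, ["caseload", "staff_training"]))
```

This mirrors the stated analysis plan: item-level gaps simply shrink the denominator, while module-level gaps drop the case from the affected measure only.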

 

B4. Test Procedures

The survey was piloted with fewer than 9 pilot grantees prior to submission. Since the topic guides will be used to develop pilot-specific interview guides, they will not be pretested.

B5. Contact Information

The individuals listed in Table B.2 below contributed to the design of the evaluation and, as noted, will be responsible for data collection and analysis.

Table B.2. Individuals Providing Consultation on Job Corps Evidence-Building Portfolio Program Design

Jean Grossman, MDRC
  Role in study: Co-Principal Investigator (design, analysis)
  Email: jgrossma@Princeton.edu

Jacob Klerman, Abt Associates Inc.
  Role in study: Co-Principal Investigator (design, analysis)
  Email: jacob_klerman@abtassoc.com

Hannah Betesh, Abt Associates Inc.
  Role in study: Project Manager (design, data collection, analysis)
  Email: hannah_betesh@abtassoc.com

Louisa Treskon, MDRC
  Role in study: Project Manager (design, data collection, analysis)
  Email: louisa.treskon@mdrc.org

Keith Olejniczak, MDRC
  Role in study: Content Expert (design, data collection, analysis)
  Email: Keith.olejniczak@mdrc.org

Affiong Ibok, MDRC
  Role in study: Content Expert (design, data collection)
  Email: Affiong.ibok@mdrc.org

Betsy Tessler, MDRC
  Role in study: Content Expert (design)
  Email: Betsy.Tessler@mdrc.org


1 This estimate is informed by experience on the Cascades College and Career Academy Evaluation project, also a Job Corps pilot project. There was 94% staff participation when the Cascades College and Career Academy Evaluation project team conducted in-person staff interviews in 2019 and 100% participation when the project team conducted virtual staff interviews in 2020.

2 This estimate is informed by experience on the Cascades College and Career Academy Evaluation project. The team obtained a 66 percent response rate with partners in 2020, though the interviews were conducted early in the COVID-19 pandemic, which affected response rates.

3 There was 100% participant participation when the Cascades College and Career Academy Evaluation project team conducted in-person focus groups in 2019 and an approximately 60% participation rate in the virtual focus groups conducted in 2020. We believe the low participation rate in 2020 was caused by the ongoing pandemic, as the majority of participants were not at centers at that time.


4 There was 94% staff participation when the Cascades College and Career Academy Evaluation project team conducted in-person staff interviews in 2019 and 100% participation when the project team conducted virtual staff interviews in 2020.


