
Job Corps Process Study

OMB: 1205-0501


PART B. SUBMISSION FOR COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS


1. Respondent Universe and Sampling Methods


The U.S. Department of Labor, Employment and Training Administration (ETA) is requesting clearance for an information collection to conduct site visits to Job Corps centers and a survey of center directors for a process study of the Job Corps program. The study seeks to explore and identify associations between centers’ practices and their performance on a range of relevant outcomes, including gains in foundational academic skills during students’ time on center; completion of relevant academic and career technical programs; attainment of credentials; and placement and earnings after students leave the center. ETA expects that the study’s results can be used for peer-to-peer learning, technical assistance and development of performance measurement systems. IMPAQ International and its subcontractors, Battelle Memorial Institute and Decision Information Resources (henceforth the IMPAQ team), are conducting the study.


In the fall of 2010, ETA contracted with the IMPAQ team to conduct this study to address the following broad questions:


  • What center practices appear to be associated with center performance or with particular dimensions of performance, and how?

  • How do interactions among center practices and characteristics mediate these associations? Put differently, do some strategies or practices work especially well (or especially badly) for certain kinds of centers?


ETA requests clearance for the IMPAQ team to conduct two principal research activities: 1) site visits to 16 Job Corps centers to conduct in-depth interviews with senior center management, instructors, social and residential staff, on- and off-center partners, operator executives, and regional staff, as well as focus groups with students; and 2) a Web-based survey of Job Corps center directors.



Candidate promising practices identified through the site visits and hypothesized to be associated with performance outcomes will form the foundation of the instrument for the survey, which will be administered to all Job Corps center directors. Survey responses will be used to determine systematic associations between practices or sets of practices and aspects of center performance.

The IMPAQ team will conduct interviews during site visits to 16 centers purposively selected based on the team’s analysis of center performance indicators. The goal of the center selection is to identify centers that vary on a number of characteristics, ensuring a strong likelihood of observing different programmatic and management practices and of formulating credible hypotheses about the associations between practices and performance indices based on Job Corps’ Outcome Measurement System (OMS). The analysis utilized all the measures in the OMS with the addition of two new measures:


  • Percentage of students employed in the first complete calendar quarter after program exit, based on Wage Record Interchange System (WRIS) data; and

  • Student satisfaction with Job Corps, measured through a survey of students after they leave the program.


A factor-analytic model with varimax rotation was used to identify factors that explain, with minimal “redundancy,” the greatest share of the overall variance across measures. The researchers calculated how well each measure correlates with, or “loads on,” each factor. Based on the factor loadings, the team used the measures retained in the factor model to generate a score for each center along each of the factor dimensions.1 These factor scores are used in the study to rank centers according to their success along each dimension.
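For concreteness, the following is a minimal sketch of this scoring-and-ranking step in Python, assuming scikit-learn 0.24 or later (which supports varimax rotation). The DataFrame name `adjusted` (one row per center, one column per adjusted measure), the four-factor choice (which follows the footnote), and the sign convention (a higher factor score is treated as better) are illustrative assumptions, not the study’s actual code.

```python
import pandas as pd
from sklearn.decomposition import FactorAnalysis

def score_and_rank(adjusted: pd.DataFrame, n_factors: int = 4) -> pd.DataFrame:
    """Fit a varimax-rotated factor model to the adjusted measures and
    rank centers (1 = best, assuming higher score = better) on each factor."""
    fa = FactorAnalysis(n_components=n_factors, rotation="varimax")
    scores = fa.fit_transform(adjusted.values)   # one row of factor scores per center
    loadings = pd.DataFrame(
        fa.components_.T,                        # how each measure "loads on" each factor
        index=adjusted.columns,
        columns=[f"factor_{k + 1}" for k in range(n_factors)],
    )
    print(loadings.round(2))                     # inspect loadings to label the factors
    score_df = pd.DataFrame(scores, index=adjusted.index, columns=loadings.columns)
    return score_df.rank(ascending=False, method="min").astype(int)
```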


Based on these rankings and on considerations of geographic and other diversity, ETA, in consultation with the contractor, will select 16 Job Corps centers to visit that vary in their performance on the different factors. Visiting both high- and low-performing centers should allow the IMPAQ team to distinguish between practices that are plausibly associated with high performance and those that are merely widespread or otherwise unrelated to the outcomes of interest, as well as to understand different implementation settings. Variety with respect to region, type of operator, operator corporation, and size will also be taken into account.


A Web link to the survey will be sent to all 125 Job Corps center directors, i.e., no sampling is involved. Because a Job Corps participant can only be enrolled in one center at a time, there is no overlap in the student populations served by the different centers, which eases interpretation of the findings.


We expect 100 percent response rates for both the survey and the site visits, which is a reasonable estimate based on the National Job Corps Study’s experience, the IMPAQ team’s experience on related efforts, and assurances from ETA staff.


2. Information Collection Procedures


In-depth interviews with selected Job Corps center personnel will be conducted during the site visits using the site visit protocol included in Appendix E (this is a condensed version of the protocols, which lists all questions and includes a key matching types of respondents to questions).2 We do not believe that there are any selection bias issues related to the specific Job Corps center personnel chosen for the interviews, as the interview topics are broad and informational and the IMPAQ team does not expect to analyze findings from them using statistical methods. Focus groups held with students during the site visits will be conducted using the protocol in Appendix F. Appendix G contains the informed consent form for students who participate in the focus groups. Appendix H includes a focus group participant information form. Finally, researchers will record their observations throughout the site visit using the observation protocol found in Appendix I.


The Job Corps center survey will be conducted via the World Wide Web, using the software tool KeySurvey (www.keysurvey.com). The survey instrument is included in this submission as Appendix B. The survey will include questions about the practices that emerge in the site visits as plausibly associated with performance. Unlike the site visit findings, survey results will be subject to statistical analyses. The survey program instruction and e-mail reminder are included as Appendices C and D, respectively.


2.1 Statistical Methodology for Stratification and Sample Selection


Centers for site visits will be selected to represent diverse points on the performance continuum, based on an analysis that rank-orders centers according to their performance outcomes (e.g., student program outcomes and post-program outcomes such as job placement). Performance is measured in this study through multiple metrics combined using factor-analytic methods. The discussion below provides supporting detail.


The IMPAQ team has developed center rankings using Job Corps’ current Outcome Measurement System, WRIS placement, and student satisfaction survey data from Program Years 2007, 2008, and 2009. Three years of data provide a broad base for classifying a center as “high-” or “low-” performing that is not unduly influenced by year-to-year fluctuations in center performance.


Current Job Corps practice includes the use of statistical models to adjust most, but not all, performance measures in the OMS. This adjustment removes the influence of factors beyond centers’ control that may affect center performance. Currently those control factors include age, initial TABE scores, pre-test barriers to GED attainment, highest grade completed, high school diploma or GED enrollment, and industry of training. The control variables are included in regression models that are estimated for each performance measure, with residuals serving as the adjusted measures. This approach has limitations in that some measures are not currently adjusted for factors beyond center control. Furthermore, for policy reasons, Job Corps’ current approach to adjusting OMS measures does not include several variables that may influence center performance (e.g., student gender, race, and ethnicity).


The analytic approach used for this effort builds on the approach that Job Corps has used to date in developing adjustment models, including the use of ordinary least squares regression models and the specific variables included in the models. It extends the adjustment process to apply to all performance measures and adds control variables to the adjustment: student gender, race/ethnicity, county unemployment rate, county average earnings, county poverty rate, and the minimum wage applicable to the state. The full set of control measures is used, as mentioned earlier, to adjust for factors that are beyond centers’ control. Otherwise, we would risk selecting centers to visit that rank high on post-program placement outcomes because they serve students who were better prepared at entry to Job Corps, or because they are located in areas with better job opportunities, rather than because of the training or placement services offered to the students.


The research team generated ratings adjusted for center differences in the types of students served and in local employment opportunities by estimating multivariate regression models with the original measures as the dependent variables and calculating the residuals. For a given measure, a large positive residual indicates that, after controlling for other factors, the center is performing considerably better than expected given its student mix and, as applicable, local economic conditions; similarly, a large negative residual shows that the center is underperforming. These residuals are then designated as adjusted performance measures and entered into factor-analytic models. Factor loadings were computed for each performance measure, and factor-specific rankings were generated based on the factor loadings. Factor models and rankings were generated separately for the years 2007, 2008, and 2009.
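As an illustration of this residual-adjustment step, the sketch below uses ordinary least squares via statsmodels. The control column names are hypothetical stand-ins for the student-mix and local-economy factors listed above, and `df` is assumed to hold one row per center.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical control columns standing in for the student-mix and
# local-economy factors named in the text.
CONTROLS = ["mean_age", "mean_tabe_score", "pct_female", "pct_minority",
            "county_unemployment", "county_avg_earnings",
            "county_poverty_rate", "state_min_wage"]

def adjust_measures(df: pd.DataFrame, measures: list[str]) -> pd.DataFrame:
    """Regress each performance measure on the controls and keep the
    residual as the adjusted measure (positive = better than expected)."""
    X = sm.add_constant(df[CONTROLS])
    adjusted = pd.DataFrame(index=df.index)
    for m in measures:
        fitted = sm.OLS(df[m], X, missing="drop").fit()
        adjusted[m] = df[m] - fitted.predict(X)   # residual = adjusted measure
    return adjusted
```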


The estimated factor models were consistent across the three years, and each year yielded four very similar factors. The rankings from the final factor models will be used to identify centers that are rated consistently high, or low, for all three years. The results will also be used to identify centers whose performance improved or declined over the period. By identifying consistent performers as well as those that are improving or declining, we hope to identify aspects of centers that are related to better performance and factors that appear to hinder it.
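The classification of consistent, improving, and declining performers could be implemented along the following lines. The rank columns, the quartile cutoff, and the movement threshold are illustrative assumptions only; the study’s actual classification rules are not specified here.

```python
import pandas as pd

def classify(ranks: pd.DataFrame, n_centers: int, cut: float = 0.25) -> pd.Series:
    """Label centers from three years of factor-specific rankings.
    `ranks` has columns "py2007", "py2008", "py2009"; rank 1 = best.
    The quartile cutoff and 20%-of-centers movement threshold are
    hypothetical choices for illustration."""
    top = n_centers * cut
    bottom = n_centers * (1 - cut)
    move = n_centers * 0.20
    labels = pd.Series("mixed", index=ranks.index)
    labels[(ranks <= top).all(axis=1)] = "consistently high"
    labels[(ranks >= bottom).all(axis=1)] = "consistently low"
    improving = ranks["py2009"] <= ranks["py2007"] - move
    declining = ranks["py2009"] >= ranks["py2007"] + move
    labels[improving & (labels == "mixed")] = "improving"
    labels[declining & (labels == "mixed")] = "declining"
    return labels
```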


No sampling, stratification, or estimation procedures will be utilized for the center director survey, as we are surveying the entire universe of center directors and, as indicated above, fully expect a 100 percent response rate.


2.2 Analytic Approach


Estimation Procedures


The analysis plan will be implemented in two stages. The first stage entails collecting in-depth interview information from center staff during in-person site visits. Data from these interviews will be largely qualitative in nature, and the analysis will be conducted by reviewing the interview notes/transcripts to identify and extract major themes concerning the factors and practices that may be related to center performance. The results of the center visits will not be generalizable to the entire population of centers, nor is that the goal of the center visits. Instead, the center visits are designed to help identify factors likely to be related to center performance that can be included in the survey of all centers. The sample of centers for site visits will be selected to maximize the likelihood of identifying such factors, by choosing centers that are consistently better or poorer performers and centers whose performance has improved or declined over the last three years.


The second stage of the analysis plan involves conducting a Web survey of all centers and then analyzing the survey data along with data from three other sources, all merged at the center level to form a center database for the analysis. The other data sources include: 1) Office of Job Corps (OJC) administrative data, 2) student satisfaction data from the follow-up survey, and 3) WRIS data. These four data sources will be merged to construct an analysis file in which the Job Corps center is the primary unit of analysis.
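A minimal sketch of this center-level merge follows. The file names and the `center_id` join key are hypothetical, and each source is assumed to have already been aggregated to one row per center.

```python
import pandas as pd
from functools import reduce

# Hypothetical file names for the four sources named in the text.
sources = [
    pd.read_csv("center_director_survey.csv"),    # Web survey of directors
    pd.read_csv("ojc_admin.csv"),                 # OJC administrative data
    pd.read_csv("student_satisfaction.csv"),      # follow-up satisfaction survey
    pd.read_csv("wris_placement.csv"),            # WRIS placement/earnings data
]
# Outer-join on the center identifier so no center is dropped silently.
analysis_file = reduce(
    lambda left, right: left.merge(right, on="center_id", how="outer"),
    sources,
)
```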


The Web survey will supply data describing center practices and factors that may affect performance results. These data will require only minimal “cleaning,” as the Web-based software allows us to program correct skip patterns between questions, appropriate answer ranges, a single response per question, and error notification for missing and out-of-range responses. Data from the survey will be reviewed to identify and correct any data cleaning issues and to determine data quality. Various relevant center characteristics can be found in the OJC administrative data, such as the number of students on center and center contract financial information. Center performance will be captured by adjusted measures developed from the Center Information System, student follow-up satisfaction survey data, and the WRIS data, as discussed. Based on our previous experience in working with these “linked” data sources, they are generally of high quality; nonetheless, individual variables will be evaluated through descriptive analysis to determine whether they are of sufficient quality for use in the study. The resulting combined dataset, linked at the center level, will be used to analyze relationships between center practices and performance, addressing the core research questions of the study.



3. Methods to Maximize Response Rates


ETA’s Office of Job Corps will stress to centers the importance of cooperating with the survey and site visits. In particular, the OJC will send a Job Corps Program Instruction to all center directors informing them of the survey and asking them to participate. Furthermore, the project will utilize a variety of techniques to maximize response rates to the center director survey, including: 1) liaising with the OJC to obtain the most recent contact information for each center director, 2) sending a survey invitation packet communicating DOL/ETA’s endorsement prior to the survey, 3) providing the online survey directly through a convenient e-mail link, and 4) tracking participation and sending periodic reminders to non-respondents. A similar process has been used annually for the past two decades to obtain survey data from Job Corps centers on state and local factors that affect high school diploma and GED attainment, and it has consistently resulted in 100 percent response rates.


For the site visits, 16 primary sites will be chosen, along with a few alternates should one of the primary sites be unable to participate. To maximize response to the site visit interviews, the research team will conduct teleconferences with the Job Corps center key personnel once site visit locations have been identified. The team will provide an overview of the project, discuss the center’s role in the evaluation, alleviate any concerns regarding participation, and identify potential center liaisons. Information will also be provided about the expectations for interview duration and other aspects of the visits. If appropriate, ETA’s Contracting Officer’s Technical Representative will participate in the call to indicate DOL/ETA’s support of the request and to provide further encouragement for the site’s participation. Additionally, the IMPAQ team will work with the selected centers to determine the best dates to visit to ensure maximum availability of respondents.


Given the steps listed above, and because both the survey and the site visits will be conducted with the endorsement and support of the OJC and DOL, a 100 percent response rate is expected.


If a 100 percent response rate is not achieved, there is no reason to believe that significant differences exist between the characteristics of responding and non-responding centers that would result in non-response bias. Where minor differences do occur, appropriate statistical methods will be employed, such as estimating a logistic regression model of the probability that a sample member responded to the survey and using the predicted probability of response to construct appropriate weights for each respondent.
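The weighting approach described above could be sketched as follows, again using statsmodels. The `responded` indicator and the covariate names are hypothetical placeholders for observable center characteristics.

```python
import pandas as pd
import statsmodels.api as sm

def nonresponse_weights(df: pd.DataFrame, covars: list[str]) -> pd.Series:
    """Inverse-probability-of-response weights for survey respondents.
    `responded` is a hypothetical 0/1 response indicator."""
    X = sm.add_constant(df[covars])
    logit = sm.Logit(df["responded"], X).fit(disp=0)
    p_response = logit.predict(X)                # predicted response probability
    weights = 1.0 / p_response                   # up-weight under-represented centers
    return weights.where(df["responded"] == 1)   # defined for respondents only
```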


4. Tests of Procedures


The first site visit will be conducted as a pilot visit. The pilot site will be selected randomly from the final list of selected sites. At the conclusion of each interview, respondents will be asked a series of questions about the interview protocol regarding clarity, flow, duplication, etc. This information will be used to make minor adjustments to the protocol.


Based on the site visit findings, the IMPAQ team may make changes to the survey instrument in preparation for requesting a non-substantive change to the OMB clearance sought here. Research staff and programmers will thoroughly test the computerized questionnaire. A testing protocol will be developed along with various testing scenarios to ensure that the instrument performs correctly for all types of respondents. Test scenarios will be used to evaluate whether question wording and response choices are accurate when translated from paper to Web-based administration, whether instructions are clear, and whether skip patterns function properly. Thorough testing will ensure that any errors are corrected prior to full survey administration.


Additionally, we will conduct a pilot test of the survey with a convenience sample of nine center directors. The pilot survey will be administered over the Internet: the nine respondents will be instructed to log in to a specific Web site and complete the survey. After each respondent has completed the survey, we will conduct a telephone interview with the respondent, using cognitive interviewing techniques. The goal of the cognitive interviews is to assess the degree to which (a) the survey instructions and wording of the questions are clear and understandable; (b) the respondents interpret the meaning of each question as intended; and (c) the response options are adequate. The pre-test will identify questions that are poorly understood, terms that are ambiguous in meaning, possibly superfluous questions, and difficult transitions between topics. If the changes to the instrument resulting from the pre-test are minor (e.g., changing the order of the questions), the nine center directors who participated in the pre-test will not take the final version of the survey. If the changes are more significant and additional information is required from these center directors, we will administer the added questions over the phone rather than having them complete the survey a second time.


5. Statistical Consultants


This information collection effort is primarily qualitative in nature; as such, we will not be using any statistical consultants for the project.


All data collection and analysis will be conducted by the following individuals:


Name                Organization   Phone Number   E-mail Address
Jacob Benus         IMPAQ          443.367.0379   jbenus@impaqint.com
Morgan Sacchetti    IMPAQ          443.718.4355   msacchetti@impaqint.com
Ted Shen            IMPAQ          443.539.1393   tshen@impaqint.com
Donald Nichols      IMPAQ          202.696.1004   dnicholas@impaqint.com
Terry Johnson       Battelle       206.528.3113   johnsont@battelle.org
Mary Kay Dugan      Battelle       206.528.3142   dugan@battelle.org
Dan Klepinger       Battelle       206.528.3124   kleping@battelle.org
Maria Gregoriou     Battelle       703.875.2941   gregorioum@battelle.org
Russell Jackson     DIR            832.485.3701   rjackson@dir-online.com
Jim Cooper          DIR            832.485.3713   jcooper@dir-online.com
Lenin Williams      DIR            832.485.3716   lwiliams@dir-online.com

1 Factor-analytic models are used as a data reduction method to replace a large set of measures that are assumed to be related with a smaller and more manageable set of conceptual variables. To determine the set of factors, the IMPAQ team considered a variety of approaches, including eigenvalues, use of scree plots, formal statistical tests, and rotating the axes. For all factor models, the team explored whether rotating the axes (applying a nonsingular linear transformation) would improve interpretability and maximize the extent to which a few variables have large loadings and the remaining variables have low loadings. In all cases, the orthogonal rotation (varimax) improved interpretability. Non-orthogonal rotation (allowing the factors to be correlated) did not appreciably improve interpretability and in some cases actually reduced it. All of this information was used to select a four-factor model.

2 The following center personnel will be interviewed: Center Director, Administration Staff, Academic Instruction Manager, CTT Manager, Human Resources Manager, Work-Based Learning Coordinator, Career Preparation Period Manager, Counseling Manager, Peer Leadership Coordinator, Social Development Manager, Center Safety Officer, Academic Instructors, CTT Instructors, Residential Advisors, Senior Administrative Staff, Business and Community Liaison, Outreach and Admissions staff, Career Transition Services staff, Employer Partner, Community Partner (other than employer), and Regional Office Project Manager.

