OMB Control Number: 1290-0030

Evaluation of Employer Performance Measurement Approaches

ICR REF 201901-1290-001

July 2019




Part B: Supporting Statement for Request for OMB Approval Under the Paperwork Reduction Act


Overview


The U.S. Department of Labor (DOL) Chief Evaluation Office (CEO) is seeking Office of Management and Budget (OMB) approval to collect information from State and local public workforce system employees and partners, and to gather feedback from a group of U.S. employers, to inform the Evaluation of Employer Performance Measurement Approaches study. The purpose of the study is to conduct a 36-month analysis of employer services measurement approaches and metrics, as well as their cross-State and cross-program applicability, with the goal of understanding and eventually implementing a final indicator of performance. The study will explore and establish an understanding of employer services measurement and supplement the start-up of reporting by the States1 on the National Pilot measures. Key objectives of the study include: (1) developing an understanding of how employer services are defined by the federal government, States, localities, and core Workforce Innovation and Opportunity Act (WIOA) programs, and exploring options for developing a uniform definition of employer services; (2) identifying what measures exist for understanding employer services, the key objectives of these measures, and the possibilities for uniform implementation at the federal level; and (3) developing options for an evaluation design to assess the validity, reliability, and feasibility of proposed measures and alternative measures of effectiveness in serving employers. This ICR seeks clearance for a brief online survey of all State-level core WIOA program directors.


B. Statistical Methods


B.1. Respondent Universe and Samples


This section outlines the respondent universe and sampling methods for the State Program Director Survey, the study's only quantitative, statistical data collection instrument.

State Program Director Survey. The respondent universe for the online survey of core WIOA program administrators includes all directors of Title I, II, III, and IV programs across the States. The number of respondents per State will vary depending on how many program directors oversee the seven programs of interest, which are the following:

  1. Title I Adult Program

  2. Title I Dislocated Worker Program

  3. Title I Youth Program

  4. Title II Adult Education Program

  5. Title III Wagner-Peyser Employment Service

  6. Title IV Vocational Rehabilitation Program (for the Blind only)

  7. Title IV Vocational Rehabilitation Program


Lists of primary (and secondary, if applicable) respondent names and email addresses were provided by DOL, the Department of Education's Office of Career, Technical, and Adult Education (OCTAE), and the Rehabilitation Services Administration (RSA). These contact lists were verified in coordination with the National Association of State Workforce Agencies (NASWA) and other association contacts representing state program directors of Title II and Title IV WIOA programs. The research team will supplement this contact verification by conducting a scan of information available on the Internet to gather as many names and email addresses as possible for each State's core WIOA program directors, in case any information received from the Departments is out of date or missing. When the survey is administered, the online survey directions will state that respondents have the option to ask members of their staff to complete questions they might not have the knowledge or experience to address, such as questions about the national pilot measures and other performance data-related inquiries. This will allow the research team to capture as much information as possible about employer services provided and experience with national pilot measures and performance measurement across all core WIOA programs in every State.


We estimate the respondent universe for the State Program Director Survey to be 215, based on state program director/administrator contact lists shared by OCTAE, RSA, and DOL. Some program directors oversee more than one program; for example, a Title I program director may oversee Title I Adult, Dislocated Worker, and Youth programming in their state. Individuals who oversee multiple programs will complete only one survey and are therefore counted only once in the respondent universe. We will seek a response rate of 80 percent for the State Program Director Survey, for a total of 172 state program director responses.
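
The response target follows directly from the universe estimate and the target rate:

215 potential respondents × 0.80 target response rate = 172 expected completed surveys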


Table 2. Respondent Universe, Sample Size, and Response Rates for Data Collection

Data Collection Activity: State Program Director Survey

Universe/Sampling Frame: Approximately 215 State Program Directors of core WIOA programs

Respondent Description: Program directors, staff who often work with employers, and staff working with pilot measures or other performance data

Sample Size and Response Rate: 215 surveys will be fielded with an expected 80 percent response rate, for a total of 172 responses

B.2. Statistical Methods for Sample Selection and Degree of Accuracy Needed


Survey sample selection for this study is not random. The State Program Director Survey is a survey of the entire population of core WIOA program directors and is intended to provide information across the States. We will focus our response rate efforts on ensuring good representation of the four core programs across a variety of States.


Data analysis. We will use implementation analysis techniques to describe our findings across levels of government (federal, state, local). Tables will be developed to organize state survey data. We will aim to describe the processes for supporting implementation of performance measures; the reasons, purposes, and incentives associated with the performance measures selected; and the intergovernmental dimensions of measure selection and implementation, including which activities and actions occur at the state level versus the local level and the extent of flexibility granted to local entities in their selection and implementation of measures.
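
For illustration only, and not a final table shell, the state survey data might be arrayed in a matrix such as the following, with cells completed from survey responses during analysis; the row and column labels here are hypothetical:

Core WIOA Program | Employer Services Measures in Use | Selected at State or Local Level | Flexibility Granted to Local Entities
Title I (Adult, Dislocated Worker, Youth) | ... | ... | ...
Title II Adult Education | ... | ... | ...
Title III Wagner-Peyser | ... | ... | ...
Title IV Vocational Rehabilitation | ... | ... | ...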


Statistical methodology for stratification and sample selection. No statistical methods will be used for stratification or sample selection; the State Program Director Survey will be fielded to the entire potential respondent population.


Estimation procedures. The survey is intended to provide an understanding of employer services measurement and supplement the start-up of reporting by the States on the National Pilot measures. No estimation procedures will be used. The data analysis will be descriptive.


Statistical techniques to ensure accuracy for the purposes described in this justification. No statistical techniques will be used to ensure accuracy.


Specialized sampling procedures to correct unusual problems. No specialized sampling procedures will be used.


Periodic data collection cycles to reduce burden. Periodic data collection cycles are not appropriate for this study, as they would not reduce burden for the respondents, and time and resource constraints for the study do not allow for multiple data collection cycles.


B.3. Maximizing Response Rates and Addressing Nonresponse


State Program Director Survey. As explained in section A.1, state program directors of Title I, II, III, and IV programs will be asked to respond to the survey. To maximize response rates, we will leverage NASWA's relationships with its members to encourage survey completion for Title I and III programs. For Title II and IV programs, we will leverage NASWA's relationships with two other relevant state associations, the Council of State Administrators of Vocational Rehabilitation (CSAVR) and the National Association of State Directors of Adult Education (NASDAE), to create buy-in for the survey. Specifically, we will request that these associations encourage their members to complete the survey in their membership communications.


NASWA will be asked to distribute the survey to identified NASWA members who administer Title I and III programs. The Urban Institute (Urban) will distribute the survey to program directors of Title II and IV programs, with reminders and outreach to members provided by CSAVR and NASDAE. Respondents will be given no more than three months to complete the survey. The study team will develop and employ a plan for multiple reminder emails, scheduled on varying dates and at varying times, an approach that has been shown to increase response rates. The research team will develop the plan for reminders and outreach in partnership with NASWA, including personalized outreach via email or phone as necessary to achieve an 80 percent response rate. Dedicated staff from the research team will be available to resolve technical issues and answer questions that arise; respondents will be provided with a toll-free number and an email address to easily contact the research team for survey support.


The involvement of NASWA as a partner in the study and in fielding the survey will promote a high response rate. NASWA is the primary national association representing state administrators of the publicly funded state workforce system, including WIOA, employment services, training programs, unemployment insurance, employment statistics, and labor market and workforce information. NASWA will promote the survey in communications with its members. We anticipate an 80 percent response rate based on NASWA's prior experience fielding other, similar surveys2, its relationships with Title II and IV statewide associations, and our plans for reminders, personal outreach, and respondent support. To address nonresponse, we will report "missing" as a category when possible, and will omit (delete) missing cases when it is not possible to include "missing" as a category in reporting.


B.4. Test Procedures


Between August and December of 2018, the research team engaged stakeholders in reviewing the data collection instruments. The team tested the state program director instrument to ensure that the instrument (and each question) was clearly written and understandable to respondents, fully covered appropriate topics, asked questions appropriately, and offered respondents a complete listing of response categories for each closed-ended question. Prior to programming the state program director survey, the study team engaged state program directors of core WIOA programs to review the survey language and ensure inclusion of employer services and measurement approaches relevant across core programs. Following survey programming, formal testing was completed in November and December 2018. Testing of the online state program director survey involved the study team, state program directors, association staff, and experts at Urban. Fewer than 10 testers were involved in the testing of each instrument.


B.5. Contact Information and Privacy

No uncompensated individuals were consulted on statistical aspects of the study design. No uncompensated persons or entities will collect or analyze information for the study.

The parties consulted on the statistical aspects of the design include:


Federal Workgroup


U.S. Department of Labor

Chief Evaluation Office

Person Responsible: Megan Lizik, Project Officer

Lizik.Megan@dol.gov


U.S. Department of Education

Office of Career, Technical, and Adult Education

Person Responsible: Cheryl Keenan, Director

cheryl.keenan@ed.gov


Rehabilitation Services Administration

Data Collection Analysis Unit

Person Responsible: Melinda Giancola, Chief

Melinda.Giancola@ed.gov


Subcontractors


George Washington University

Person Responsible: Dr. Burt Barnow, Co-Principal Investigator

barnow@gwu.edu


Capital Research Corporation

Person Responsible: John Trutko, Senior Advisor & Task Lead

jtrutko@aol.com


National Association of State Workforce Agencies

Center for Employment Security Education and Research

Person Responsible: Yvette Chocolaad, Senior Advisor

ychocolaad@naswa.org


Subject Matter Experts


Bruce Ferguson, President and CEO

CareerSource Northeast Florida

BFerguson@careersourcenefl.com


Harry Hatry, Distinguished Fellow

Urban Institute

HHatry@urban.org


Carolyn Heinrich, Patricia and Rodes Hart Professor of Public Policy, Education and Economics

Peabody College

Vanderbilt University

carolyn.j.heinrich@vanderbilt.edu


Jason Palmer, Director

Bureau of LMI & Strategic Initiatives

Department of Technology, Management & Budget

palmerj2@michigan.gov

All data collection and analysis will be conducted by:

The Urban Institute

Person Responsible: Shayne Spaulding, Project Director

SSpaulding@urban.org


Subcontractors


George Washington University

Person Responsible: Dr. Burt Barnow, Co-Principal Investigator

barnow@gwu.edu


Capital Research Corporation

Person Responsible: John Trutko, Senior Advisor & Task Lead

jtrutko@aol.com


National Association of State Workforce Agencies

Center for Employment Security Education and Research

Person Responsible: Yvette Chocolaad, Senior Advisor

ychocolaad@naswa.org



1 For purposes of this ICR Supporting Statement, “State” and “States” refer to the fifty States, the District of Columbia, Puerto Rico, and U.S. territories and outlying areas.

2 All state workforce agencies responded to an October 2016 NASWA State Supplemental Funding Survey of state workforce agency administrators and finance directors, for a 100 percent response rate. NASWA received an 88 percent response rate on a 2013 survey sent to state workforce administrators, Employment and Training Directors, Administration and Finance Directors, and Employment Services Directors. Sources: https://www.naswa.org/sites/naswa_main/files/document/fy_2016_supplemental_report.pdf (p. 2) and https://www.naswa.org/reports/state-role-public-workforce-development-system-evidence-survey-use-wagner-peyser-act

