


Supporting Statement for OMB Clearance Request


Part A


HPOG Impact Study 36-Month Follow Up and National Implementation Evaluation of the

Health Profession Opportunity Grants (HPOG) to Serve TANF Recipients and Other Low-Income Individuals


0970-0394






November 2014


Submitted by:

Office of Planning,
Research & Evaluation

Administration for Children & Families

U.S. Department of Health
and Human Services


Federal Project Officers

Hilary Forster and Mary Mueggenborg


Table of Contents

Part A: Justification

A.1 Necessity for the Data Collection

A.1.1 Study Background

A.1.2 Legal or Administrative Requirements that Necessitate the Collection

A.1.3 Study Designs

A.1.4 Research Questions

A.1.5 Universe of Data Collection Efforts

A.2 Purpose of the Survey and Data Collection Procedures

A.2.1 Overview of Purpose and Approach

A.2.2 Data Collection Process

A.2.3 Who Will Use the Information

A.2.4 Instrument Item-by-Item Justification

A.3 Improved Information Technology to Reduce Burden

A.4 Efforts to Identify Duplication

A.4.1 Surveys and Site Visits

A.4.2 Coordination and Streamlining of Study Efforts

A.5 Involvement of Small Organizations

A.6 Consequences of Less Frequent Data Collection

A.7 Special Circumstances

A.8 Federal Register Notice and Efforts to Consult Outside the Agency

A.8.1 Federal Register Notice and Comments

A.8.2 Consultation with Experts Outside of the Study

A.9 Incentives for Respondents

A.10 Privacy of Respondents

A.11 Sensitive Questions

A.12 Estimation of Information Collection Burden

A.12.1 Baseline Data Collection Already Approved

A.12.2 Current Information Collection Request

A.12.3 Total Burden Hour Request

A.13 Cost Burden to Respondents or Record Keepers

A.14 Estimate of Cost to the Federal Government

A.15 Change in Burden

A.16 Plan and Time Schedule for Information Collection, Tabulation and Publication

A.16.1 Analysis Plan

A.16.2 Time Schedule and Publications

A.17 Reasons not to Display OMB Expiration Date

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

References

Appendices

A. HPOG Logic Model

B. HPOG-Impact 36-month Follow-Up Survey – Treatment Group Version

C. HPOG-Impact 36-month Follow-up Survey – Control Group Version

D. HPOG-Impact 36-month Follow-up Survey – Table of Contents

E. HPOG-NIE Screening Questionnaire

F. HPOG-NIE Semi-Structured Discussion Guide

G. OMB 60-Day Notice

H. HPOG-Impact Contact Update Letter and Contact Update Form

I. HPOG-Impact Advance Letter




Part A: Justification

This section provides supporting statements for the collection of information for the Health Profession Opportunity Grants (HPOG) program funded by the U.S. Department of Health and Human Services (HHS), Administration for Children and Families (ACF). This information collection will provide data for the Impact Study of the Health Profession Opportunity Grants (HPOG-Impact) and the National Implementation Evaluation of the Health Profession Opportunity Grants to Serve TANF Recipients and Other Low-Income Individuals (HPOG-NIE). HPOG funds programs that provide Temporary Assistance for Needy Families (TANF) recipients, other low-income individuals, and members of Native American tribes with training and support needed to find and keep employment in healthcare occupations and fill the growing demand for skilled healthcare workers. Thirty-two grants were awarded in September 2010 to government agencies, community-based organizations, post-secondary educational institutions, and tribal-affiliated organizations to conduct these activities. Of the 32 HPOG grants, 27 were awarded to agencies serving TANF recipients and other low-income individuals and are relevant to this request.

The current submission is in support of the HPOG Impact Study (HPOG-Impact) and the HPOG National Implementation Evaluation (HPOG-NIE). All 27 HPOG grantees serving TANF recipients and other low-income individuals are participating in HPOG-NIE; 20 are participating in HPOG-Impact. Abt Associates and its partner, the Urban Institute, are conducting both evaluations. Three data collections related to this current submission have been approved by the Office of Management and Budget (OMB) under OMB clearance number 0970-0394; these include:

  1. The Implementation, Systems and Outcome Evaluation of the Health Profession Opportunity Grants to Serve TANF Recipients and Other Low-Income Individuals, for the Performance Reporting System (PRS) (clearance received September 2011).

  2. The HPOG Impact Study’s baseline data collection instruments (clearance received October 2012).

  3. The National Implementation Evaluation’s data collection through surveys and interviews of HPOG grantees, management/staff, stakeholders, and employers, and the HPOG Impact Study’s 15-month follow-up survey and related data collection instruments (clearance received August 2013).

HPOG-Impact and HPOG-NIE are two projects within the broader portfolio of research that OPRE (the Office of Planning, Research and Evaluation) is using to assess the success of career pathways programs and models. This strategy includes a multi-pronged research and evaluation approach for the HPOG program, to better understand and assess the activities conducted and their results, as well as the related Pathways for Advancing Careers and Education (PACE)1 project. To maximize learning across the portfolio, survey development for the HPOG and PACE baseline and follow-up surveys is being coordinated, and the majority of the data elements collected in these surveys are similar.

Two data collection efforts for PACE (three HPOG grantees participate in PACE) have been approved under OMB clearance number 0970-0397; these include:

  1. Baseline instruments and implementation interview guides (clearance received November 2011)

  2. The 15-month follow-up survey and related data collection instruments for PACE (clearance received August 2013)

A third (new) request is being submitted at the same time as this request under OMB #0970–0397, and the HPOG and PACE research teams coordinated on the development of these most recent surveys.

Other HPOG-related research and evaluation activities include a separate evaluation of the Tribal HPOG grants currently being conducted by NORC at the University of Chicago (OMB clearance number 0970-0395).

ACF and its contractors are engaged in many efforts to coordinate research activities so that each study capitalizes on related work conducted in other projects. By coordinating research efforts, burden is minimized for grantees and for study participants. In addition, comparable data from different, related studies may be combined to enhance the cumulative development of knowledge useful to government policy makers, program operators, and the public. HPOG-Impact and HPOG-NIE will use data from the PRS as well as adapted versions of instruments developed for PACE, allowing for future analysis of combined data sets. They also make use of data from the three PACE project sites that are HPOG grantees.

In this document, we request a revision to OMB clearance number 0970-0394 for the next phase of data collection for HPOG-Impact and for follow-on data collection for HPOG-NIE.

A.1 Necessity for the Data Collection

ACF seeks approval for 36-month follow up data collection for HPOG-Impact and for follow-on HPOG-NIE data collection activities.

A.1.1 Study Background

As part of the Affordable Care Act (ACA) of 2010, Congress authorized funds for the HPOG program “to conduct demonstration projects that provide eligible individuals with the opportunity to obtain education and training for occupations in the healthcare field that pay well” (Grant Announcement HHS-2010-ACF-OFA-FX-0126).2 These demonstration projects are intended to address two pervasive and growing problems: the increasing shortfall in the supply of qualified healthcare professionals in the face of expanding demand, and the increasing need for post-secondary education to secure a job that pays a living wage for families.

The HPOG evaluations will collect data to document and demonstrate how effectively grantees implement the HPOG program and to assess how variations in program services affect program outcomes and impacts. As such, the studies will fill a void in the sectoral training and career pathways literature, both about overall program effectiveness and about which types of programs or program components are most effective. Few large-scale impact studies of career pathways efforts exist, and none shows the impact of specific program components and models (Werner, Dun Rappaport, et al., 2011).3

A.1.2 Legal or Administrative Requirements that Necessitate the Collection

H.R. 3590, the ACA, requires an evaluation of the HPOG demonstration projects (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(B)). The Act further indicates that the evaluation will be used to inform the final report to Congress (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(C)). The Act calls for evaluation activities to assess the success of HPOG in “creating opportunities for developing and sustaining, particularly with respect to low-income individuals and other entry-level workers, a health professions workforce that has accessible entry points, that meets high standards for education, training, certification, and professional development, and that provides increased wages and affordable benefits, including healthcare coverage, that are responsive to the workforce’s needs” (H.R. 3590, Title V, Subtitle F, Sec. 5507, sec. 2008, (a)(3)(B)).

A.1.3 Study Designs

The HPOG studies are guided by the career pathways framework, as shown in the HPOG logic model (Appendix A). The framework puts into practice the assertion that “post-secondary training should be organized as a series of manageable and well-articulated steps accompanied by strong supports and connections to employment” (Fein et al., 2012). These articulated steps provide opportunities for students to advance through successively higher levels of education and training, exiting into employment at multiple possible points. The framework also incorporates customization, supports and employer connections.

Guided by the framework, the goal of HPOG-NIE is to describe and assess the implementation, systems change, and outcomes related to the 27 HPOG grantees focused on TANF recipients and other low-income individuals. The related goal of HPOG-Impact is to evaluate the effectiveness of approaches used by 20 of the HPOG grantees with regard to improving HPOG participants’ attainment of education, training, employment, and advancement within the healthcare field. HPOG-Impact also intends to assess whether HPOG programs affect the outcomes of participants’ families by including questions on the 36-month survey that focus on their children. Data from the three HPOG/PACE grantees will be used in some of the analysis.

HPOG-Impact also is intended to evaluate variation in participant impact that may be attributable to different HPOG program components and models. Twenty of the 27 HPOG grantees serving TANF and low-income individuals will be included in the HPOG impact analysis.4

The HPOG-Impact design includes randomizing program-eligible participants to treatment and control status in all sites. Ten grantees (19 sites total) are randomizing into two treatment arms (a basic and an enhanced version of the intervention) and a control group. This additional treatment arm creates planned variation in some sites that is consistent with the natural variation of these program components in other sites, thereby creating an opportunity to learn more both about the effects of program components and about the methods we use to estimate those effects.

Grantees offering the enhanced arm are implementing one of three planned HPOG enhancements, in which the basic HPOG program is augmented by an additional program component. In this subset of grantees, program applicants will be randomly assigned to (1) the “basic” HPOG program, (2) an “enhanced” HPOG program (i.e., the HPOG program plus an enhancement), or (3) a control group that is not offered the opportunity to enroll in HPOG.

Control group members have access to whatever other programs and services are available in the local community.

In a parallel activity, as mentioned above, the PACE project has submitted an OMB clearance request (revision of OMB # 0970-0343) for a 36-month follow-up survey for all of the PACE projects, including the three HPOG grantees that are participating in PACE. PACE is also conducting a randomized experiment that will assess similar program impacts. Data from the PACE baseline survey and follow-up interviews will be included in the HPOG impact analysis.

Additionally, the research team will match participant data collected through HPOG-Impact for both the treatment and control groups to long-term employment and earnings data from ACF’s National Directory of New Hires (NDNH). An agreement with the Office of Child Support Enforcement (OCSE) is in place.

A.1.4 Research Questions

Overall, HPOG-NIE will address the following research questions:

1. How are health profession training programs being implemented across the HPOG grantee sites?

2. What changes to the service delivery system are associated with HPOG program implementation?

3. What participant-level outputs (e.g., enrollment, retention, course completions, accreditation/certification) and outcomes (e.g., job entry, employment retention and advancement, earnings) occur?

4. What key components appear necessary or contribute to the success of these programs?

New questions specific to the data collection for which we are seeking approval under this package focus on how performance measures are used and how their use influences program implementation. They include:

  1. To what extent have grantees modified their initial performance goals and why?

  2. What kinds of changes in program structure or program focus have grantees made as a result of their use of performance data?

  3. To what extent have grantees changed or expanded the way they use performance measurement information over the course of the HPOG Program?

  4. What types of performance information were most commonly used and/or considered most useful for making program improvements?


Overall, HPOG-Impact will address the following research questions:

1. What impacts do HPOG programs, as a group, have on the outcomes of participants and their families?

2. To what extent do these impacts vary by subgroups of interest?

3. Which locally-adopted program components influence average impacts?

4. To what extent does participation in a particular HPOG component (or components) change the impact experienced by individual trainees?

New questions specific to the data collection for which we are seeking approval under this package focus on longer-term (36-month) participant impacts. They include:

  1. What are the longer-term effects of the HPOG program on its populations of interest?

  2. How do effects of career pathways programs vary over time, across outcomes or domains, by occupational sector, by program model and by participant characteristics?

  3. Do different HPOG models, strategies, or components (e.g., a particular curricular model such as I-BEST, particular recruitment strategies, or support services) lead to different impacts for participants?

  4. How can career pathways models be adjusted to promote longer-term outcomes for participants?

A.1.5 Universe of Data Collection Efforts

To address these research questions, the studies will use a number of data collection instruments. Instruments in the current clearance request include the following:

  1. The HPOG-Impact second follow-up survey (at 36 months post-random assignment) of both treatment and control group members.

  2. An HPOG-NIE screening questionnaire and a semi-structured discussion guide for use in interviews with grantees about their use of performance measurement information.

These data are not available through any current source.

Study instruments approved by OMB in prior information collection requests include the following:

  1. HPOG Performance Reporting System (PRS), a management information system for documenting program activities and accomplishments against program goals and to assist with program management (approved September 2011 under this OMB No. 0970-0394)

  2. Supplemental Baseline Questions to the PRS, to be used at participant intake for random assignment into the Impact Study and for analysis for both HPOG-NIE and HPOG-Impact (approved October 2012 under this OMB No. 0970-0394)

  3. PACE 15-month Follow-Up survey for the three HPOG grantees that are participating in the PACE project (approved August 2013 under OMB No. 0970-0397)

  4. The HPOG-NIE Sampling Questionnaire for the HPOG surveys (approved August 2013 under this OMB No. 0970-0394)

  5. The HPOG-NIE Follow-Up phone call protocol for the Stakeholder/Network survey (approved August 2013 under this OMB No. 0970-0394)

  6. The HPOG-NIE Grantee survey (approved August 2013 under this OMB No. 0970-0394)

  7. HPOG-Impact Implementation interview guide for partnering employers (approved August 2013 under this OMB No. 0970-0394)

  8. HPOG-Impact Implementation interview guide for instructors (approved August 2013 under this OMB No. 0970-0394)

  9. HPOG-Impact Implementation interview guide for HPOG program management (approved August 2013 under this OMB No. 0970-0394)

  10. HPOG-Impact Implementation interview guide for HPOG program staff (approved August 2013 under this OMB No. 0970-0394)

  11. The HPOG-NIE Management and Staff survey (approved August 2013 under this OMB No. 0970-0394)

  12. The HPOG-NIE Stakeholder/Network survey (approved August 2013 under this OMB No. 0970-0394)

  13. The HPOG-NIE Employer survey (approved August 2013 under OMB No. 0970-0394)

  14. The HPOG-Impact and HPOG-NIE 15-month Participant Follow-Up survey (approved August 2013 under this OMB No. 0970-0394)

  15. The HPOG-Impact 15-month Control Group Member Follow-Up survey (approved August 2013 under this OMB No. 0970-0394)

As part of the HPOG data collection, we anticipate submitting a future additional OMB clearance request for the following:

  1. A third follow-up survey for HPOG-Impact study participants approximately 60 months after study enrollment.

Other extant data sources will be used for the HPOG studies. These include the following:

  1. National Directory of New Hires (NDNH). These data will provide information about employment and earnings of HPOG participants.

  2. HPOG program management information, including initial applications and ongoing management reports, which we will use as supplemental information in tracking the grant programs over the course of the evaluation, as well as information on the local healthcare labor market and needs for occupational training.

  3. Government sources of labor market data from the U.S. Census Bureau and the Bureau of Labor Statistics (BLS), such as County Business Patterns, Local Area Unemployment Statistics (LAUS), and Quarterly Workforce Indicators (QWI), which will be used to provide a picture of the local labor market.

A.2 Purpose of the Survey and Data Collection Procedures

A.2.1 Overview of Purpose and Approach

The HPOG-Impact and HPOG-NIE studies, in conjunction with the related studies under the HPOG research umbrella, will increase the knowledge base about the effectiveness of HPOG programs in providing TANF recipients and other low-income individuals with opportunities for education and training that lead to employment and advancement in the healthcare workforce. We first describe the purpose and approach of the data to be collected primarily for HPOG-NIE, followed by a similar description of data to be collected for HPOG-Impact. However, it should be noted that, for the most part, data collected for one of the two studies will augment and enhance analysis for the other study.

        1. Overview of HPOG-NIE Approach

HPOG-NIE involves a set of complementary analyses regarding program implementation and the broader HPOG partner network and system. It includes a description of participant outcomes, an analysis of how participant outcomes relate to participant characteristics and program features, and an analysis of how, from the perspective of stakeholders and employers, local projects have affected the healthcare labor market. The collection of information through the surveys of various organizations and respondents for HPOG-NIE will feed into these various analyses. Most of the information collected from HPOG grantees, partners, stakeholders, employers, and respondents from other organizations is closed-ended in nature. The current data collection activities will augment data collected to date by providing information about how HPOG programs use performance measures and how the use of these data influences program implementation. The research team will use semi-structured interviews to collect data from select grantees.

        2. Overview of the HPOG-Impact Approach

For HPOG-Impact, baseline data are being collected through the HPOG PRS, including the supplemental baseline questions that were previously cleared under this OMB number. The purposes of these data are several. First, the contact information collected at baseline is necessary to enhance researchers’ ability to locate respondents for follow-up surveys that will measure intervention outcomes. A second purpose is to create a rich dataset for researchers to explore and test hypotheses, including those about the impact of HPOG programs and the relative effectiveness of various components and implementation features of those programs. Other analytic purposes of the baseline data include characterizing the HPOG Impact Study sample, adjusting for chance differences in observable characteristics and thereby increasing precision of impact estimates, identifying subgroups of interest (including program-related subgroups), checking the integrity of random assignment, and adjusting for non-random survey sample attrition. The purpose of the child roster questions is to create a sampling frame for follow-up surveys that collect data about child outcomes.

The 15-Month Participant Follow-Up survey, approved under a previous package and now underway, collects data on outcomes, including HPOG services received, participation in non-HPOG trainings or services, receipt of degrees or certifications, and employment and earnings outcomes. These data will be used to understand treatment and control differentials in participant experiences and outcomes. The 36-Month Participant Follow-Up survey, for which approval is currently sought, will allow for an understanding of treatment and control differentials in participant experiences and outcomes over the longer term. Using experimental impact analysis and these data, the research team will estimate the extent to which HPOG program designs lead to differential mean individual outcomes between the treatment and control groups. In assessing the relative impacts of specific program components, the team is combining prospective systematic variation of program models within selected HPOG grantees with natural variation in program models across many HPOG grantees.

A.2.2 Data Collection Process

        1. HPOG-Impact Data Collection – New Request

The follow-up survey data collection for HPOG participants and control group members will take place approximately 36 months following each participant’s random assignment, beginning in late spring of 2016. The interviews will be conducted primarily by telephone, with field follow-up for those respondents who cannot be reached by telephone (Appendix B: HPOG-Impact 36-month Follow-Up Survey – Treatment Group Version; and Appendix C: HPOG-Impact 36-month Follow-Up Survey – Control Group Version). Many of the questions to be asked at 36 months were approved for the 15-month survey, and most other items have been asked in other OMB-approved studies. A summary of the sources of survey items is provided in Appendix D.

        2. HPOG-NIE Data Collection – New Request

First, a short screening questionnaire (approximately 10 minutes) will be used to understand important changes to program practices, focus, or structure and whether performance measurement information played a role in making these changes. This screening questionnaire will be administered as a web-based survey (Appendix E). Following the screening questionnaire, semi-structured interviews will be conducted with a subset of grantees to understand the process by which key program change decisions were made and how performance measurement information was used in making these changes (Appendix F).


The screening questionnaire will be administered in winter 2014/2015. Semi-structured interviews will be conducted in winter and early spring of 2015.


A.2.3 Who Will Use the Information

The primary beneficiaries of this planned data collection effort will be ACF, other federal agencies, program operators, other policy makers and researchers, and the healthcare community. ACF will use the information to assess the effects of the HPOG programs on low-income individuals and on the healthcare community. These data will begin to answer ACF’s and other policy makers’ questions about the implementation and impacts of career pathways programs focused on training staff for the healthcare industry. They will help identify which program components and features appear to result in impacts related to education and credential achievement, employment and earnings, and income, and will provide information on the systems changes that occur as a result of these programs.

Secondary beneficiaries of this data collection will be those in the public policy and social science research community who are interested in further understanding initiatives to promote economic self-sufficiency of individuals and families through comprehensive career pathways programs, particularly as they relate to the healthcare industry. At the conclusion of the HPOG studies, the research team will provide ACF with a restricted-use data set containing individual-level data stripped of all personally identifying information. The restricted-use data will be made available to researchers for approved secondary uses. Ultimately, these data will benefit researchers, policy analysts, and policy makers in a wide range of program areas.

A.2.4 Instrument Item-by-Item Justification

The HPOG-Impact and HPOG-NIE studies involve a relatively large number of separate, but related analyses. These include:

  • HPOG-NIE: Descriptive Implementation Study;

  • HPOG-NIE: Systems Change Analysis;

  • HPOG-NIE: Outcome Study;

  • HPOG-NIE: Outcome Analysis;

  • HPOG-NIE: Special Study of Grantees’ Use of Performance Measurement Information;

  • HPOG-Impact: Impact Analysis (including the two- and three-arm experiments);

  • HPOG-Impact: Analysis of Natural Variation (including connecting it to the planned/experimental variation); and

  • HPOG-Impact: Qualitative Implementation Study.

Exhibit A-1 describes the target respondents, content, and reason for inclusion (i.e., which analyses will use the information) for each new data collection activity submitted with this current request. For more information about previously approved instruments, see the previous information collection request (0970-0394), approved August 2013.

Exhibit A-1 Item-by-Item Justification of New Data Collection Instruments

Data Collection Instrument(s)

Respondents, Content, and Reason for Inclusion

1. HPOG-Impact: 36-month Participant Follow-Up survey (Appendix B)




Respondents: HPOG-Impact study participants assigned to the treatment group (estimated to total 7,260 over the three-year period)


Content:

  • Employment experiences, income, and material well-being

  • Aspirations and expectations regarding education and future employment

  • Barriers to employment

  • Perceptions of and experiences in HPOG programming

  • Type and amount of training received since random assignment and credentials earned

  • Type and amount of social services received

  • Knowledge of/access to financial resources and career opportunities in healthcare

  • Parent report of children’s school performance and behavior, parent aspirations for children’s future schooling, and children’s use of time.



Used for:

  • HPOG-Impact: Impact Analysis

  • HPOG-Impact: Analysis of Natural Variation

2. HPOG-Impact: 36-month Control Group Member Follow-Up survey (Appendix C)


Respondents: HPOG-Impact study participants assigned to the control group (estimated to total 3,623 over the three-year period)


Content:

  • Employment experiences, income, and material well-being

  • Aspirations and expectations regarding education and future employment

  • Barriers to employment

  • Type and amount of training received since random assignment and credentials earned

  • Type and amount of social services received

  • Knowledge of/access to financial resources and career opportunities in healthcare

  • Parent report of children’s school performance and behavior, parent aspirations for children’s future schooling, and children’s use of time.



Used for:

  • HPOG-Impact: Impact Analysis

  • HPOG-Impact: Analysis of Natural Variation


  3. HPOG-NIE screening questionnaire (Appendix E)

Respondents: HPOG program managers (49)


Content:

  • Use of performance measurement information to make program changes


Used for:

  • HPOG-NIE: Special Study of Grantees’ Use of Performance Measurement Information


  4. HPOG-NIE semi-structured discussion guide for use in interviews with grantees about their use of performance measurement information (Appendix F)




Respondents: HPOG program managers (20)


Content:

  • Use of performance measurement information to make program changes


Used for:

  • HPOG-NIE: Special Study of Grantees’ Use of Performance Measurement Information


A.3 Improved Information Technology to Reduce Burden

The HPOG studies will generate a substantial amount of data and will use a combination of data collection methods. For each data collection activity, the study team has selected the form of technology that enables the collection of valid and reliable information in an efficient way while minimizing burden. This evaluation will use improved technology to facilitate the collection of the survey data in standardized and accurate ways that also ensure the protection of the data collected.

As with the previously approved 15-month Participant Follow-Up survey, the 36-month Participant Follow-Up survey described above will be administered using CATI (computer-assisted telephone interviewing) technology. When the individual cannot be located by telephone, field staff will attempt to find the sample member in person. If the sample member then agrees to take the survey, the field staff will administer the survey electronically in the field, using CAPI (computer-assisted personal interviewing). CATI and CAPI technology reduce respondent burden, as interviewers can proceed more quickly and accurately through the survey instruments, minimizing the interview length. Computerized questionnaires ensure that the skip patterns work properly, minimizing respondent burden by not asking inappropriate or non-applicable questions. For example, respondents who did not participate in post-secondary training will be routed past questions only relevant to those who did. Computer-assisted interviewing can build in checkpoints, which allow the interviewer or respondent to confirm responses, thereby minimizing data entry errors. Finally, automated survey administration can incorporate hard edits to check for allowable ranges for quantity and range value questions, minimizing out-of-range or unallowable values.
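To illustrate the two features described above, the short sketch below shows, in generic form, how a skip pattern and a hard range edit behave during computer-assisted interviewing. It is a minimal illustration only: the item names (B1, B2) and the 0–36 range are hypothetical and are not taken from the actual 36-month survey specification.

```python
# Illustrative sketch only: item names (B1, B2) and the 0-36 range are
# hypothetical, not drawn from the actual 36-month survey specification.

def ask_numeric(prompt, minimum, maximum):
    """Hard edit: re-prompt until the entered value falls within the allowed range."""
    while True:
        try:
            value = float(input(prompt + " "))
        except ValueError:
            print("Please enter a number.")
            continue
        if minimum <= value <= maximum:
            return value
        print(f"Value must be between {minimum} and {maximum}.")

def administer_training_items(answers):
    """Skip pattern: only respondents reporting any post-secondary training
    since random assignment are asked the detailed training items."""
    if answers.get("B1_any_training") != "yes":
        return {}  # route past items that apply only to trainees
    months = ask_numeric("B2. For how many months did you attend this training?", 0, 36)
    return {"B2_training_months": months}

# Example: a respondent who reported no training skips the detail items.
print(administer_training_items({"B1_any_training": "no"}))  # -> {}
```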

The agency considered using a planned missingness and multiple imputation method to reduce burden on respondents by establishing subpanels that receive different forms of the questionnaire.  This would enable the research team to report about a larger set of outcomes without causing the average interview duration to exceed 60 minutes. However, the research team was concerned about the large number of programs being evaluated and analytic difficulties that would be encountered in trying to impute the missing data in ways that fully reflected differences in effectiveness across programs.  The agency and research team instead opted to employ a variety of efficiencies to reduce the length of the survey overall to 60 minutes.  This included analyzing data from the 15-month survey to determine which 21st Century Skills scales worked best to answer the study questions and removing or consolidating questions that were similar to those asked at 15 months. 

The screening questionnaire will be hosted on the Internet via a live secure web-link. This approach is particularly well-suited to the needs of these surveys in that respondents can easily stop and start if they are interrupted, share the link with other respondents, and review and/or modify responses. Semi-structured interviews will be conducted by telephone.

A.4 Efforts to Identify Duplication

A.4.1 Surveys and Site Visits

The HPOG data collection efforts collect information that other sources do not currently provide. For the HPOG-NIE data collection regarding grantees’ use of performance measurement information, the research team will first review existing sources of information, including the PRS, grantee applications, and performance progress reports, to compile pertinent information. The interviews will not ask for this same information but will ask selected grantees to provide more detailed information.

The HPOG-Impact 36-month participant follow-up survey will collect information that is not available from any other sources. This includes information on control group members’ experiences post-random assignment, for which there is no other source, as well as information on the post-HPOG experiences of HPOG participants (treatment group members) in the 20 HPOG-Impact sites.

At 36 months, many study participants will have completed two-year degrees or transferred to four-year institutions; others will have gained employment after shorter credential programs, and their families and children may benefit from their additional education and earnings and improved overall well-being. Importantly, the 36-month survey will allow us to measure educational progress (defined at this point as receipt of a credential in a health profession or continued enrollment in healthcare training), which will be the subject of one of the two confirmatory hypotheses. The information on HPOG training/service receipt and completion available in the PRS for HPOG participants will be used in these studies. Although the team will request some similar information in the 36-month Participant Follow-Up survey, the purpose will be to verify and expand on PRS data. In addition, both studies will use administrative information on wages and employment from the NDNH linked to PRS data, which eliminates the need to gather as complete an employment history in the survey as might otherwise be necessary. However, these administrative data do not include information on hourly wages, benefits, or other aspects of the job that we will collect in the follow-up survey.

A.4.2 Coordination and Streamlining of Study Efforts

The HPOG and PACE research teams have worked, and will continue to work, closely to coordinate data collection across both studies. Areas of coordination include:

  • Previously approved data collection efforts have coordinated closely, with HPOG-Impact using HPOG-NIE Grantee and Management and Staff surveys to measure program features, and HPOG-Impact site visits collecting information for use in the HPOG-NIE Descriptive Implementation Study and Systems Change Analysis. PACE and HPOG teams have conducted site visits jointly to reduce the burden for site staff. PACE has shared data collected from the three HPOG sites in the PACE study with the HPOG research team for inclusion in both the NIE and Impact studies.

  • Questions and constructs for the 36-Month Participant Follow-Up survey included in this clearance request were based, to the extent feasible, on the 15-month follow-up evaluation instruments in order to ensure the alignment of a core group of questions.

A.5 Involvement of Small Organizations

The primary organizations involved in this study are community colleges, workforce development agencies, employers, and community-based organizations that operate occupational training programs and provide related services. The research team will minimize burden for these entities, including those that could be considered small organizations, by requesting only the information required to achieve the study’s objectives and offering them the use of online data collection tools so that they can respond to the information request at their convenience. In addition, at the time the grants were awarded, ACF informed all grantees of the congressionally mandated evaluation and the reporting requirements, and adequate resources have been provided to coordinate the data collection and reporting. There should be no adverse impact for any grantees participating in the study.

A.6 Consequences of Less Frequent Data Collection

The data collection effort described in this document is essential to the HPOG-Impact and the HPOG-NIE studies. Less frequent data collection would jeopardize ACF’s ability to conduct these Congressionally mandated studies in time to provide relevant and timely results to shape policy. Collecting data identified in the current request will allow measurement of program implementation essential for the HPOG Descriptive Implementation Study within the HPOG-NIE, as well as longer-term outcome measures for individuals for the HPOG-Impact study.

Data collected through the HPOG-Impact and HPOG-NIE studies are critical to ACF’s comprehensive strategy to evaluate the HPOG demonstration grants. HPOG is a significant policy initiative aimed at training and placing TANF recipients and other low-income individuals in stable healthcare industry occupations with a career path. Many of the HPOG grantees have adopted cutting-edge education and training technologies developed over the past decade to meet the needs of older, non-traditional students with little or no post-secondary educational experience. These relatively new approaches, which generally align with the Career Pathways framework, are largely untested by strong evaluation designs. Together, these studies will develop knowledge about the effectiveness of the new training modules and what works best for whom.

A.7 Special Circumstances

There are no special circumstances for the proposed data collection.

A.8 Federal Register Notice and Efforts to Consult Outside the Agency

A.8.1 Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13 and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995)), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on Monday, June 23, 2014, Volume 79, Number 120, page 35547, and provided a 60-day period for public comment. A copy of this notice is included as Appendix G. During the notice and comment period, the government received no requests for information or substantive comments.

To ensure that the length of the instruments is consistent with the burden estimate, we pretested and edited the instruments to keep burden to a minimum. During internal pretesting, all instruments were closely examined to eliminate unnecessary respondent burden, and questions deemed unnecessary were eliminated. External pretesting was conducted with four respondents, and the length of the instrument was found to be consistent with the burden estimate. Some instrument revisions were made to improve question clarity and response categories. Edits made as a result of the pretest have been incorporated in the instruments attached as Appendices B and C.

A.8.2 Consultation with Experts Outside of the Study

The HPOG-Impact and HPOG-NIE studies consulted outside experts to inform research designs and analysis plans. These include the following:

Outside Expert | Affiliation | Contact Information

Dr. Larry Hedges | Professor, Northwestern University | 2006 Sheridan Rd., Evanston, IL 60208; l-hedges@northwestern.edu; 847-491-8899

Dr. Carolyn Heinrich | Professor, University of Texas at Austin | Lyndon B. Johnson School of Public Affairs, P.O. Box Y, Austin, TX 78713; cheinrich@austin.utexas.edu; 512-471-3779

Dr. Philip Hong | Associate Professor, Loyola University | 820 N. Michigan Ave., Lewis Towers 1238, Chicago, IL 60611; phong@luc.edu; 312-915-7447

Dr. Katherine Magnuson | Associate Professor, University of Wisconsin | 206 School of Social Work, University of Wisconsin-Madison, Madison, WI 53706; kmagnuson@wisc.edu; 608-263-4812

Ms. Karin Martinson | Principal Associate, Abt Associates | 4550 Montgomery Ave., Bethesda, MD 20814; Karin_Martinson@abtassoc.com; 301-347-5726

Dr. Jeffrey Smith | Professor, University of Michigan | Department of Economics, University of Michigan, 238 Lorch Hall, 611 Tappan St., Ann Arbor, MI 48109; econjeff@umich.edu; 734-764-5359



A.9 Incentives for Respondents

For the HPOG 36-month Participant Follow-Up survey of HPOG participants and control group members in HPOG-Impact sites, we plan to offer participants $40 for completing the follow-up survey, as reflected in the advance letter and introductory text in the survey. This reflects an increase from the $30 offered for the 15-month Participant Follow-Up survey. The increased incentive is expected to reduce nonresponse bias at the 36-month follow-up for the difficult-to-follow sample in the HPOG study. (PACE sites are also planning to provide the same amount, as indicated in the separate OMB package.)

Additionally, we seek approval to offer $5 with the advance letter for the 36-month survey as a gesture of appreciation to respondents. This request is based on the literature documenting the effectiveness of prepayments in interviewer-mediated surveys. Singer, Van Hoewyk, and Maher (2000) report that prepaid incentives were effective in reducing nonresponse to the Survey of Consumer Attitudes in a series of experiments in which promised incentives were not. An earlier meta-analysis by these authors (Singer et al. 1999) found that prepaid incentives were effective in increasing response rates and, although there was no significant difference between prepaid and promised incentives, noted that “the direction of the differences is in accord with conventional wisdom – i.e., prepayment appears to be more effective than promised payments. When we look only at experiments that hold size of payment constant and compare prepayment and promised payment, [we] find in every case that prepayment yields higher response rates than the promised payment condition” (p. 223). Similarly, Cantor, O’Hare, and O’Connor’s (2008) review of incentive experiments in telephone surveys found consistently significant effects for prepaid incentives between $1 and $5, with response rate increases of 2.2 to 12.1 percentage points. The effectiveness of prepaid incentives for interviewer-mediated surveys builds upon the extensive literature documenting the effectiveness of prepaid incentives for mail surveys (cf. Church 1993; Edwards et al. 2002; Yammarino, Skinner, and Childers 1991). Given the research findings outlined above, we also seek approval to offer $2 for each of the three rounds of participant contact update requests (see Appendix H). The $2 will be sent to participants with the request for updated contact information.

Many surveys are designed to offer incentives of varying types with the goal of increasing survey response. Monetary incentives at one or more phases of data collection have become fairly common, including in some federally sponsored surveys. Examples include the National Survey on Drug Use and Health (NSDUH, Substance Abuse and Mental Health Services Administration), the National Survey of Family Growth (NSFG, National Center for Health Statistics), the National Health and Nutrition Examination Survey (NHANES, National Center for Health Statistics), the National Survey of Child and Adolescent Well-Being (NSCAW, Administration for Children and Families), and the Early Childhood Longitudinal Study-Birth Cohort (ECLS-B, U.S. Department of Education).

There has been extensive publication about the relative efficacy of different monetary incentives, but several federal agencies have determined $20–$30 to be effective. The U.S. Census Bureau has experimented with and begun offering monetary incentives for several of its longitudinal panel surveys, including the Survey of Income and Program Participation (SIPP) and the Survey of Program Dynamics (SPD). SIPP has conducted several multi-wave incentive studies, most recently with its 2008 panel, comparing results of $10, $20, and $40 incentive amounts to those of a $0 control group. SIPP examined response rate outcomes in various subgroups of interest (e.g., the poverty stratum), use of targeted incentives for non-interview cases, and the impact of base wave incentives on later participation. Overall, $20 incentives increased response rates and improved the conversion rate for non-interview cases (Creighton et al., 2007). The NSDUH conducted an experiment in which the cost per interview in the $20 incentive group was 5 percent lower than the control group, whereas the $40 incentive group cost was 4 percent lower than the control, due to reduced effort needed in gaining cooperation (Kennet et al., 2005). The NSDUH adopted an intermediate incentive of $30 because the greatest increase in response rate was found in the $20 incentive condition, and the $40 condition obtained a higher variation in per-interview costs. A similar incentive experiment conducted for the NSFG Cycle 5 Pretest examined $0, $20, and $40 incentive amounts. The additional incentive costs were more than offset by savings in interviewer labor and travel costs (Duffer et al., 1994).

A.10 Privacy of Respondents

Although the 36-month Participant Follow-Up survey itself does not involve collecting individual identification data, the HPOG-Impact and HPOG-NIE studies will include individual identification data collected through the existing PRS. This is consistent with previously approved information collection requests under 0970-0394. All HPOG-Impact and HPOG-NIE study participants will complete both the PRS and the Supplemental Baseline Questions added to it for the purpose of the Impact Study.

The information collected under this data collection will be kept private to the fullest extent provided by law. The information requested under this collection will be kept private in a manner consistent with 42 U.S.C. 1306, 20 CFR 402, and OMB Circular No. A-130. ACF recognizes that HPOG grantees serve vulnerable populations (per the authorizing legislation), and that grantees must protect those populations from any risks of harm from the research and evaluation activities. Accordingly, as is done when collecting participant data in the PRS, HPOG-Impact and HPOG-NIE obtained informed consent from all study participants who entered the HPOG study. This informed consent ensures that participants understand the nature of the research and evaluation activities being conducted. The introductory text in the 36-month Follow-Up Surveys (Appendices B and C) reiterates the voluntary nature and privacy of the survey.

As a part of informed consent for study participants, grantees provided the following rationale for data collection and privacy assurances to HPOG participants:

  • We are conducting this research to see how well various approaches to training for healthcare jobs work. This program and research are funded by the U.S. Department of Health and Human Services, and they may fund other research on this program in the future.

  • In this program, we will collect some personal information from you, such as your name, date of birth, Social Security number, and your involvement in other programs. The researchers studying the program for the government also need this information. We will keep all of the information about you collected for the program or for the research studies completely private to the extent allowed by law, and no one’s name will ever appear in any report or discussion of the evaluation results.

  • As part of the study, researchers may contact some of you in the future. You may refuse to answer any of their specific questions at any time.

  • Researchers and program staff using the information collected must take all necessary actions to protect your information and they will pledge their agreement to protect privacy. All Abt Associates employees must sign a data privacy pledge on accepting an offer of employment. Any individual allowed access to identifiable data for this project must sign an additional user agreement pledging privacy. Urban Institute employees must sign a similar pledge of privacy upon employment. Individuals accessing data through the PRS must sign an additional PRS User Agreement that indicates that they will keep those data secure.

The screening questionnaire and semi-structured interviews are purely voluntary. Respondents will be told that all of their responses during the interview will be kept private, that their names will not appear in any written reports, and that responding to the questions is entirely voluntary.

The ACF Office of Planning, Research and Evaluation is in the process of publishing a System of Records Notice (SORN) titled OPRE Research and Evaluation Project Records and a Privacy Impact Assessment (PIA) titled ACF Research and Evaluation Studies.

A.11 Sensitive Questions

With the exception of some of the 36-month Participant Follow-Up survey items, no questions of a sensitive nature will be asked. The follow-up surveys include items addressing respondents’ income, welfare receipt, presence of children in the household, and employment barriers such as illness or disability. Some respondents may consider these somewhat personal questions to be sensitive.

Including such items as income, welfare receipt, presence of children in the household, and barriers to employment is necessary to describe the study population and evaluate their moderating effects on program impacts. Furthermore, questions pertaining to personal preferences, motivations and self-efficacy will be especially useful for identifying the pathways that participants follow through multi-faceted programs. This will allow the study team to estimate the impacts of various program models and components, which is the central research question that HPOG-Impact considers. Interview staff will inform respondents that survey participation is voluntary and they may refuse to answer individual items. Study participants will also be reminded that their responses will be kept private, to encourage their candid responses.

A.12 Estimation of Information Collection Burden

A.12.1 Baseline Data Collection Already Approved

The total burden for the instruments already approved (the PRS, the Supplemental Baseline Questions, and the 15-month follow-up surveys) was estimated to be 3,380 hours annually. Estimated burden to continue use of these instruments is 10,140 hours total over three years, or 3,380 hours annually.

A.12.2 Current Information Collection Request

Exhibit A-2 presents the reporting burden for respondents completing the instruments included in this data collection request and the associated cost. Because some of the data collection instruments will be in the field for longer than one year, burden is annualized and reflected across a three-year period. Estimated burden related to the new data collection request is 8,736 hours total over three years, or 2,912 hours annually.

We calculated the average hourly wage for each respondent group based on information from the Bureau of Labor Statistics5 or the federal minimum wage. The average hourly rate6 for each respondent group was calculated using the following categories:

  • Study participant: the minimum hourly wage ($7.25) plus a 40 percent adjustment to account for benefits, or $10.15 per hour.

  • Community and Social Service Occupations (SOC 21-0000): wage rate of $21.07 plus a 40 percent adjustment for benefits, or $29.49.

  • Education, Training, and Library Occupations (SOC 25-0000): wage rate of $24.46, plus a 40 percent adjustment for benefits, or $34.24.

  • Social and Community Service Manager Occupations (SOC 11-9151): wage rate of $30.43, plus a 40 percent adjustment for benefits, or $42.60.

  • Medical and Health Services Managers (SOC 11-9111): wage rate of $46.17 plus a 40 percent adjustment for benefits, or $64.64.

  • HR Managers (SOC 11-3121): wage rate of $52.21 plus a 40 percent adjustment for benefits, or $73.09.

When members of a respondent group come from multiple job categories, we took an average across the relevant categories, as noted.
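The loaded rates above all follow a single formula: the base hourly wage multiplied by 1.40 to reflect the 40 percent benefits adjustment. The sketch below is purely illustrative and simply reproduces that arithmetic from the figures cited in the bullets above (one listed figure, $29.49, appears to truncate rather than round to the nearest cent).

```python
# Reproduces the loaded hourly rates listed above: base wage x 1.40 for benefits.
# Illustrative only; base wages are the figures cited in this section.
base_wages = {
    "Study participant (federal minimum wage)": 7.25,
    "Community and Social Service Occupations (SOC 21-0000)": 21.07,
    "Education, Training, and Library Occupations (SOC 25-0000)": 24.46,
    "Social and Community Service Manager Occupations (SOC 11-9151)": 30.43,
    "Medical and Health Services Managers (SOC 11-9111)": 46.17,
    "HR Managers (SOC 11-3121)": 52.21,
}

for group, wage in base_wages.items():
    loaded = wage * 1.40  # 40 percent adjustment for benefits
    print(f"{group}: ${loaded:.2f} per hour")
```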

Exhibit A-2: Annual Information Collection Activities and Cost

Instrument | Total Respondents | Annual Respondents | Responses per Respondent | Avg. Burden Hours per Response | Annual Burden Hours | Avg. Hourly Wage | Total Annual Cost

Previously Approved Instruments

PRS | 32 | 11 | 4 | 31.2 | 1,373 | – | –

HPOG-Impact 15-month Participant Follow-Up survey | 5,600 | 1,867 | 1 | 0.7 | 1,307 | – | –

HPOG-Impact 15-month Control Group Member Follow-Up survey | 2,800 | 933 | 1 | 0.6 | 560 | – | –

HPOG-NIE 15-month Participant Follow-Up survey | 600 | 200 | 1 | 0.7 | 140 | – | –

Current Request for Approval

HPOG-Impact 36-month Participant Follow-Up survey | 5,808 | 1,936 | 1 | 1 | 1,936 | $10.15 | $19,650

HPOG-Impact 36-month Control Group Member Follow-Up survey | 2,898 | 966 | 1 | 1 | 966 | $10.15 | $9,805

HPOG-NIE Performance Measurement Screening Questionnaire | 39 | 13 | 1 | 0.2 | 3 | $42.60 | $127.80

HPOG-NIE Performance Measurement Semi-Structured Discussion Guide | 20 | 7 | 1 | 1 | 7 | $42.60 | $298.20

Estimated Annual Response Burden Hours: 6,292

A.12.3 Total Burden Hour Request

Exhibit A-2 displays the total burden. The total burden to continue use of the already approved information collections, in addition to the new request, is 18,876 hours over three years, or 6,292 hours per year.
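As a purely illustrative check (not part of the official burden computation), the sketch below reproduces the Exhibit A-2 arithmetic for the newly requested instruments: annual burden hours equal annual respondents times responses per respondent times hours per response (rounded up where fractional, consistent with the exhibit), and annual cost equals burden hours times the hourly wage.

```python
# Illustrative reproduction of the Exhibit A-2 arithmetic for the new instruments.
import math

new_instruments = [
    # (instrument, annual respondents, responses per respondent, hours per response, hourly wage)
    ("36-month Participant Follow-Up survey", 1936, 1, 1.0, 10.15),
    ("36-month Control Group Member Follow-Up survey", 966, 1, 1.0, 10.15),
    ("Performance Measurement Screening Questionnaire", 13, 1, 0.2, 42.60),
    ("Performance Measurement Semi-Structured Discussion Guide", 7, 1, 1.0, 42.60),
]

new_annual_hours = 0
for name, respondents, responses, hours, wage in new_instruments:
    burden = math.ceil(respondents * responses * hours)  # annual burden hours
    new_annual_hours += burden
    print(f"{name}: {burden} hours, ${burden * wage:,.2f}")

previously_approved_annual_hours = 3380  # PRS and 15-month surveys (Section A.12.1)
total_annual = new_annual_hours + previously_approved_annual_hours
print(f"Total annual burden hours: {total_annual}")   # 6,292
print(f"Total over three years: {total_annual * 3}")  # 18,876
```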



A.13 Cost Burden to Respondents or Record Keepers

This data collection effort involves no recordkeeping or reporting costs for respondents other than those described in Exhibit A-2 above.

A.14 Estimate of Cost to the Federal Government

The total cost for the new data collection activities will be approximately $6,221,800. Annual costs to the Federal government will be approximately $2,073,933 for the proposed data collection. The total cost of the information collection, including previously approved information collection, is $13,892,621. This includes the cost of developing and pretesting data collection instruments and tools, administering the surveys (including the 15-month and 36-month Participant Follow-Up surveys), and collecting information on use of performance measurement data. The annual cost to continue use of previously approved instruments and begin use of the new information collections is $4,630,736.

A.15 Change in Burden

This evaluation involves new and continuing data collection that increases the public reporting burden under this OMB number.

A.16 Plan and Time Schedule for Information Collection, Tabulation and Publication

A.16.1 Analysis Plan

HPOG-Impact and HPOG-NIE have complementary analysis plans. HPOG-NIE reports will focus on the structure of the programs designed and implemented by the grantees as well as on immediate outputs of the programs in simple terms, such as numbers of graduates. HPOG-Impact reports will focus on evaluating the overall effectiveness of the grant program, as well as evaluating the relative efficacy of different program designs and studying linkages between specific program features and personal participation on the one hand and student outcomes on the other.

While additional reports may be produced, current plans include the following reports:

  1. An interim NIE report on baseline descriptive outcomes. This report, expected by August 2014, will profile participants in terms of data collected in the Performance Reporting System (PRS).

  2. An NIE report on descriptive implementation and outcomes. This report, expected by May 2015, will contain two major sections. The first section will be a comprehensive description of the HPOG program as designed and implemented across 27 grantees and their sites. The section will cover, for example, program context (including local labor market characteristics), program operations, resources, and costs, and individual-level outcomes of HPOG participants. The report will also include updates to the interim report on baseline descriptive outcomes using additional available data from the PRS.

  3. An NIE systems change and network analysis report. This report, expected by May 2015, will discuss changes to the service delivery system associated with program implementation. In addition, the Systems Change Analysis will describe and analyze the institutional and stakeholder network in which the HPOG program operates.

  4. An NIE final report. This report, expected by September 2017, will use the 15-month Participant Follow-Up survey data from participants to give a more complete understanding of the conditions of employment. Some of the outcomes that can be studied with participant survey data (but could not be studied with data from the PRS or NDNH) include: post-program employment and earnings in a health job, wages, benefits, further career training and career advancement.

  5. Impact evaluation final report on short-term outcomes. This report, expected by June 2016, will focus on how average outputs (including education and training experiences) and outcomes (including credential/certificate/degree attainment, employment, earnings/wages, job benefits, and other characteristics) differ between the randomized groups; because random assignment ensures no other systematic differences between the groups, statistically significant differences are attributable to the HPOG program. The data will be examined in multiple ways to address the multiple research questions, including:

  • All treatment group members compared to all control group members;

  • Comparison of treatment and control group members for individuals in specific demographic categories such as women, high school dropouts, and non-native English speakers;

  • Comparison of treatment and control group members whose HPOG programs provide certain intervention components; and

  • Comparisons of treatment and control group members with equivalent baseline characteristics that, in the treatment group, are associated with participation in a particular intervention component.

A key feature of the Impact Study is the exploitation of the substantial cross-site variation in program design and implementation, through both planned and natural variation, to address the program component-focused research questions. This report will focus on program experiences and the treatment-control contrast, early program impacts on outcomes such as credential attainment, and impacts on intermediate outcomes such as employment and earnings as well as job quality. (A minimal illustrative sketch of the basic treatment-control comparison appears after this list.)

  6. Impact evaluation final report on intermediate outcomes. This report, expected in March 2019, will include similar analyses of employment, education, and other outcomes for treatment and control group members at roughly 36 months after random assignment.

  7. An NIE report on grantees’ use of performance measurement information. This report, expected in May 2015, will provide information on grantees’ use of performance measurement information to track their individual program’s performance and to modify programs and practices.
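The core analytic idea behind the impact reports is the comparison of mean outcomes between the randomized treatment and control groups, as described in report 5 above. The sketch below is a minimal, purely hypothetical illustration of such a difference-in-means comparison on simulated data; it is not the study's estimation code, and all names, sample sizes, and figures are invented for illustration.

```python
# Hypothetical illustration only: a simple treatment-control difference in
# means on simulated outcome data. The actual HPOG-Impact analyses use the
# study's own data, outcomes, and statistical models.
import random
import statistics

random.seed(0)

# Simulated outcome (e.g., quarterly earnings in dollars) for randomly
# assigned treatment and control group members; sizes are arbitrary.
treatment = [random.gauss(4200, 1500) for _ in range(2000)]
control = [random.gauss(4000, 1500) for _ in range(1000)]

# Estimated impact: the difference in mean outcomes between the groups.
impact = statistics.mean(treatment) - statistics.mean(control)

# Standard error of the difference in means for independent samples.
se = (statistics.variance(treatment) / len(treatment)
      + statistics.variance(control) / len(control)) ** 0.5

print(f"Estimated impact: {impact:,.0f}")
print(f"Approximate 95% confidence interval: "
      f"({impact - 1.96 * se:,.0f}, {impact + 1.96 * se:,.0f})")

# Subgroup comparisons (e.g., women, high school dropouts) follow the same
# pattern, restricted to the relevant subsample in both groups.
```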

Upon completion, each of the final reports will go through ACF’s thorough review process. As part of the review process, ACF will ensure that each report is Section 508 compliant for dissemination on its website.

A.16.2 Time Schedule and Publications

Exhibit A-3 presents an overview of the project schedule for information collection. It also identifies publications associated with each major data collection activity.

Exhibit A-3: Overview of Project Data Collection Schedule

| Data Collection Activity | Timing | Associated Publications |
| --- | --- | --- |
| Previously approved data collection efforts7 | | |
| Baseline data collection for HPOG-Impact | March 2013–November 2014 | Baseline Data Collection Report (February 2015) |
| Survey sample frames | October–December 2013 | NA |
| Surveys of HPOG grantees, management/staff, stakeholders, and employers | November 2013–May 2014 | Descriptive Implementation and Outcome Report (May 2015); Systems Change and Network Analysis Report (May 2015) |
| 15-month Participant Follow-Up survey | March 2014–November 2015 | Impact Evaluation Final Report on Short-term Outcomes (June 2016); Final NIE Report (September 2017) |
| New data collection requests | | |
| 36-month Participant Follow-Up survey | March 2016–December 2017 | Impact Evaluation Report on Intermediate Outcomes (March 2019) |
| Screening questionnaire | Winter 2014/2015 | HPOG Grantees’ Use of Performance Measurement Information (May 2015) |
| Semi-structured Interviews about Use of Performance Measures | Winter/Spring 2015 | HPOG Grantees’ Use of Performance Measurement Information (May 2015) |

A.17 Reasons not to Display OMB Expiration Date

All instruments created for HPOG-NIE and HPOG-Impact will display the OMB approval number and the expiration date for OMB approval.

A.18 Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this information collection.


References

Cantor, D., O'Hare, B. C., and O'Connor, K. S. (2008). The Use of Monetary Incentives to Reduce Nonresponse in Random Digit Dial Telephone Surveys. Advances in Telephone Survey Methodology, 471-498. John Wiley and Sons.

Church, A. H. (1993). Estimating the effect of Incentives on Mail Survey Response Rates: A Meta-Analysis. Public Opinion Quarterly, 57(1), 62-79.

Creighton, K., King, K., and Martin, E. (2007). The Use of Monetary Incentives in Census Bureau Longitudinal Surveys. Research Report Series, Survey Methodology, #2007-2, January. Washington, DC: U.S. Census Bureau.

Duffer, A., Lessler, J., Weeks, M., and Mosher, M. (1994). Effects of Incentive Payments on Response Rates and Field Costs in a Pretest of a National CAPI Survey. Paper presented at the Annual Meeting of the American Statistical Association, Toronto.

Edwards, P., Roberts, I., Clarke, M., DiGuiseppi, C., Pratap, S., Wentz, R., and Kwan, I. (2002). Increasing Response Rates to Postal Questionnaires: Systematic Review. British Medical Journal, 324, 1883-1885.

Fein, D., Judkins, D., Martinson, K., Rolston, H., Gardiner, K., Bell, S., and Barnow, B. (2012). Innovative Strategies for Increasing Self-Sufficiency Evaluation Design Report. Unpublished draft manuscript. Bethesda, MD: Abt Associates Inc.

Hendra, R., Dillman, K., Hamilton, G., Lundquist, E., Martinson, K., and Wavelet, M. (2010). The Employment Retention and Advancement Project: How Effective Are Different Approaches Aiming to Increase Employment Retention and Advancement? Final Impacts for Twelve Models. New York: MDRC.

Kennet, J. and Gfroerer, J. (Eds.) (2005). Evaluating and improving methods used in the National Survey on Drug Use and Health (DHHS Publication No. SMA 05-4044, Methodology Series M-5). Rockville, MD: Substance Abuse and Mental Health Services Administration, Office of Applied Studies.

Maguire, S., Freely, J., Clymer, C., Conway, M., and Schwartz, D. (2010, July). Tuning In to Local Labor Markets: Findings from the Sectoral Employment Impact Study. Philadelphia: Public/Private Ventures.

Roder, A., and Elliott, M. (2011, April). A promising start: Year Up’s initial impacts on low-income young adults’ careers. New York, NY: Economic Mobility Corporation.

Singer, E., Groves, R., and Corning, A. (1999). Differential Incentives: Beliefs about Practices, Perceptions of Equity, and Effects on Survey Participation. Public Opinion Quarterly, 63, 251-260.

Singer, E., Van Hoewyk, J., and Maher, M. P. (2000). Experiments with Incentives in Telephone Surveys. Public Opinion Quarterly, 64(2), 171-188.

Werner, A., Dun Rappaport, C., and Lewis, J. (2011). Implementation, Systems and Outcome Evaluation of the Health Profession Opportunity Grants to Serve TANF Recipients and Other Low-Income Individuals, Draft literature review: Career Pathway Programs. Cambridge, MA: Abt Associates Inc.

Yammarino, F. J., Skinner, S. J., and Childers, T. L. (1991). Understanding Mail Survey Response Behavior: A Meta-Analysis. Public Opinion Quarterly, 55(4), 613-639.



1 From the project’s inception in 2007 through October 2014, the project was called Innovative Strategies for Increasing Self-Sufficiency.

2 Authority for the HPOG demonstrations is included in the Patient Protection and Affordable Care Act (ACA), Public Law 111-148, 124 Stat. 119, March 23, 2010, sect. 5507(a), “Demonstration Projects to Provide Low-Income Individuals with Opportunities for Education, Training, and Career Advancement to Address Health Professions Workforce Needs,” adding sect. 2008(a) to the Social Security Act, 42 U.S.C. 1397g(a).

3 Public/Private Ventures’ Sectoral Employment Impact Study (Maguire et al., 2010) is an impact evaluation of sectoral employment programs; A Promising Start: Year Up’s Initial Impacts on Low-Income Young Adults’ Careers (Roder and Elliott, 2011) is a small-scale random assignment impact study of a sectoral employment effort that does not target healthcare. The impact evaluation of the national Employment Retention and Advancement (ERA) Project (Hendra et al., 2010) is another recent impact study of a workforce development program, but it is not specifically focused on career pathways or healthcare.

4 Three HPOG grantees being evaluated under ISIS and four HPOG grantees engaged in independent research projects with a university partner are not included in the HPOG Impact Study (although data collection is being coordinated, and data from the HPOG/ISIS grantees may be used in the HPOG Impact Study).

5 http://www.bls.gov/oes/current/oes_nat.htm

6 Assuming 2,080 FTE hours worked per year.

7 Currently operating under previous OMB clearance (approved September 2011 and October 2012).

