
Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



National and Tribal Evaluation of the 2nd Generation of the Health Profession Opportunity Grants


OMB Information Collection Request

0970-0462





Supporting Statement

Part B



June 2022







Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Lisa Zingman

Nicole Constance




Part B

B1. Objectives

Study Objectives

As described further in Supporting Statement A, the U.S. Department of Health and Human Services (HHS), Administration for Children and Families (ACF) awarded grants to 32 organizations to administer the second generation of the Health Profession Opportunity Grants (HPOG 2.0) Program. The Program, which ended enrollment in September 2021, provided healthcare occupational training for Temporary Assistance for Needy Families (TANF) recipients and other low-income people. OMB has approved various data collection activities in support of the HPOG 2.0 National and Tribal Evaluations under OMB #0970-0462 (see Supporting Statement A, Study Design, for a summary of the previous approvals).

As noted in Supporting Statement A, ACF is preparing to conduct the HPOG 2.0 Long-Term Survey (LTS)—the focus of this information request. The goal of conducting the LTS (approximately five and a half years after random assignment) with the same cohort of participants selected for the Intermediate-Term Follow-up Survey (ITS) is to measure the outcomes of interest over the longer term. The procedures for collecting contact updates from the participants selected for the LTS have already been approved under OMB #0970-0462 (Instrument 5b/Instrument 5b-Phone Version, in July 2021). This request seeks approval to continue use of the contact updates and to begin use of the LTS (Instruments 21 and 21a) and supporting materials. We are also finishing data collection using the previously approved COVID-19 Cohort Short-Term Follow-up Survey. All other previously approved data collection is complete.

Generalizability of Results

This randomized study is intended to produce internally valid estimates of the intervention’s causal impact, not to support statistical generalization to other sites or service populations.

Appropriateness of Study Design and Methods for Planned Uses

The HPOG 2.0 LTS will provide insights into the longer-term impacts of HPOG 2.0 for outcomes that are not captured in administrative records, such as details about educational experiences; characteristics of employment (including employment in and earnings from healthcare, a crucial outcome for this study); self-employment and earnings from jobs not covered in administrative data; receipt of public assistance; physical and mental well-being; and child outcomes. Through this survey, OPRE can address important unanswered questions for policymakers and practitioners about the effects of the HPOG 2.0 Program on participants’ outcomes about five and a half years after enrollment. The results are not expected to be representative of the general population. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.


B2. Methods and Design

Target Population

Grantees began enrolling participants in 2016 and continued through the end of the grant period in September 2021. Overall, the 32 grantees enrolled more than 56,000 study participants. The 27 non-tribal grantees, the focus of this information collection, randomly assigned more than 52,000 study participants and enrolled an additional 3,000 participants who were not subject to random assignment.

The target population for the LTS is the universe of applicants to the non-Tribal HPOG 2.0 programs who were randomly assigned between September 2017 and January 2018. This is the same sample as the ITS and represents a subset of the Short-Term Follow-Up Survey (STS) sample.

Sampling

The evaluation team selected 13,118 study participants—all of the participants enrolled between March 2017 and February 2018—for inclusion in the STS sample (previously approved under this OMB control number in June 2018). A subset of those participants—the 4,946 enrolled between September 2017 and January 2018—was selected for the ITS sample. The ITS information collection request describing the sample selection and the survey instrument was previously approved in July 2019, with revisions approved in June 2020. The LTS, the third follow-up survey effort with the ITS sample, will provide data to measure outcomes and impacts at a third point in time after enrollment. Prior to data collection, study members identified as deceased through attempts to update their contact information or prior survey efforts will be excluded from the LTS survey sample. We will also exclude those who asked not to be contacted for future surveys. After removing those study members, the beginning survey sample for the LTS is expected to be about 4,938. (Please see Attachment AF, which includes Section B1 from Supporting Statement B, previously approved in July 2019, with updates approved in June 2020, for more detail on sample selection.)

Sampling Plan for Study of Impacts on Child Outcomes

The HPOG 2.0 LTS will include a new child module to assess program impacts on participants’ children. The survey will ask each parent to answer questions about one child (even if there is more than one eligible child in the household). That child should be someone who was included in the household roster at enrollment and will be between 6 and 24 years of age as of the start of the LTS data collection period. Below, we discuss how we will select the focal child when there is more than one such child.

Using the dates of birth collected for each child in the household at baseline, we will assign children to one of three age categories based on age at the start of the LTS data collection period: (1) children in kindergarten through grade 5; (2) children in grades 6 through 12; and (3) children who have aged into young adulthood (18-24 years of age). Based on baseline household rosters, Exhibit B-1 shows counts of children by age group in the LTS sample.

Exhibit B-1: Child Sample Size by Age Group

                  Age group 1:           Age group 2:        Age group 3:
                  Kindergarten           Grades 6            Aged into Adulthood
                  through Grade 5        through 12          (18-24 years of age)      Total
N                 2,550                  2,115               1,033                     5,698
Expected (n)      1,398                  929                 705                       5,698
Design Effect     1.43                   1.32                1.34
Effective (n)     975                    705                 449                       2,129


The row labelled “N” provides the total number of children reported by study members. The row labelled “Expected (n)” provides the number that we project will be selected by the procedure described immediately below. The third row shows the design effect we expect to incur due to the variation in weights. (For a given sample size, the design effect is a measure of the loss of precision due to not using simple random sampling.) Finally, the row labelled “Effective (n)” provides the sample size that would yield the same precision if the study had used simple random sampling.
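To illustrate the relationship among these rows: a common approximation for the design effect due to unequal weights is Kish’s formula (one plus the squared coefficient of variation of the weights), and the effective sample size is the actual sample size deflated by the design effect. The following minimal sketch uses hypothetical weights, not the study’s; the design effects in Exhibit B-1 reflect the study’s own weight distributions.

```python
# Minimal sketch: Kish design effect and effective sample size.
# The weight vector is hypothetical; the study's weights come from the
# two-phase selection probabilities described below.
import numpy as np

weights = np.array([1.0, 1.5, 2.0, 1.0, 3.0])    # hypothetical sampling weights
n = len(weights)

deff = 1 + weights.var() / weights.mean() ** 2   # Kish approximation: 1 + CV^2
effective_n = n / deff                           # precision-equivalent SRS sample size

print(f"DEFF = {deff:.2f}, effective n = {effective_n:.1f}")
```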

The sampling procedure will involve 60 strata, defined by the number of children in each of the three age ranges of interest. Selection will proceed in two phases. In the first phase, one age group will be selected. In the second phase, one of the children in the selected age group will be randomly selected. The first-phase sampling rates will be optimized to yield the largest effective sample size for the smallest age group (adult children) subject to the constraints that every child has a positive probability of selection and that at least one child be selected from every household with at least one eligible child. (A positive probability of selection for each child is required to create weights such that the sample represents all children.) As shown in Exhibit B-1, this procedure yields large, but not equal, samples in each group: 1,398 in the youngest group, 929 in the middle group, and 705 in the oldest group. Although we would have preferred equal sample sizes for the three age groups, this was not possible given the distribution of children across study members.

Sampling weights will be used to account for the differential sampling ratios for some child age categories in some household configurations. By applying the sampling weights, the sample for estimating program impacts on children will represent the distribution of child ages across study households.
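To make the two-phase procedure concrete, the sketch below draws a focal child for one hypothetical household and computes the corresponding sampling weight as the inverse of the child’s overall selection probability. The household roster and the first-phase probabilities are illustrative placeholders; the study’s first-phase rates come from the optimization described above.

```python
# Minimal sketch of the two-phase focal-child selection for one household.
# Phase 1 selects an age group; phase 2 selects a child uniformly within it.
# The phase-1 probabilities below are hypothetical, not the optimized rates.
import random

household = {                                    # hypothetical baseline roster
    "group1_K_to_grade5": ["child_A", "child_B"],
    "group2_grades6_to_12": ["child_C"],
}

# Phase-1 probabilities over age groups with at least one eligible child;
# they must be positive and sum to 1 so that every child can be selected.
phase1_prob = {"group1_K_to_grade5": 0.4, "group2_grades6_to_12": 0.6}

group = random.choices(list(phase1_prob), weights=list(phase1_prob.values()))[0]
child = random.choice(household[group])

# Overall selection probability and the inverse-probability sampling weight
p_select = phase1_prob[group] / len(household[group])
weight = 1 / p_select
print(child, round(weight, 2))
```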

Minimum Detectable Effects

The sampling plan determines the expected sample size, which informs calculations of statistical power and minimum detectable effects (MDEs). The MDE is the smallest true impact that the study would have an 80 percent probability of detecting, using a test of the null hypothesis of no impact that has just a five percent chance of finding an impact when the true impact is zero. The research team has estimated the MDEs under the proposed sampling plan.
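Expressed as a formula, and consistent with the Notes to Exhibit B-2 (80 percent power, one-sided test at the 5 percent significance level), the MDE is a fixed multiple of the standard error (SE) of the impact estimate:

```latex
\mathrm{MDE} = \left(z_{1-\alpha} + z_{\mathrm{power}}\right)\mathrm{SE}
             = \left(z_{0.95} + z_{0.80}\right)\mathrm{SE}
             \approx (1.645 + 0.842)\,\mathrm{SE}
             \approx 2.49\,\mathrm{SE}.
```

For example, an impact estimate with a standard error of 0.02 on a proportion would imply an MDE of about 5 percentage points.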

Exhibit B-2 reports MDEs for a number of outcomes that will be measured with the LTS. Although the confirmatory and secondary outcomes for the Long-Term Impact Report have not been defined yet, the outcomes shown in Exhibit B-2 include all confirmatory or secondary outcome measures from the Intermediate-Term Impact Report except for earnings (which use administrative data) and training duration (which cannot be measured with the LTS instrument). To estimate MDEs for these outcomes, the team used estimated standard errors on impacts estimated for the same outcomes from the ITS. These estimates are appropriate under the assumption that standard errors for these outcomes as measured in the LTS will not differ substantially (e.g., due to a different sample size or difference in the variance of the outcome).1

Exhibit B-2 also includes one new measure not included in the Intermediate-Term Impact Report: “current healthcare employment in a job that pays $20 or more per hour as of survey interview.” To estimate the MDE for this outcome, the team assumed the same size sample of respondents to the LTS as for the ITS, other parameters as observed in the analysis of ITS outcomes,2 and a control group mean of 0.25.3
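As a rough illustration of how such an MDE comes together for a binary outcome, the sketch below combines the 80-percent-power, one-sided-test multiplier with the parameters cited in the text (an R-squared of 0.08, an ICC of 0.01, and a control group mean of 0.25). The respondent counts and average cluster size are hypothetical placeholders, not study values, so the result only approximates the 4.2 percentage points shown in Exhibit B-2.

```python
# Illustrative MDE for a difference in proportions. R-squared, ICC, and the
# control group mean come from the text; the respondent counts and average
# cluster size (m) are hypothetical placeholders.
from scipy.stats import norm

def mde_binary(p_control, n_treat, n_control, r2=0.08, icc=0.01, m=40,
               alpha=0.05, power=0.80):
    """Approximate MDE, in percentage points, for a binary outcome."""
    multiplier = norm.ppf(1 - alpha) + norm.ppf(power)  # one-sided test
    variance = p_control * (1 - p_control)              # Bernoulli variance
    deff = 1 + (m - 1) * icc                            # clustering design effect
    se = (variance * (1 - r2) * deff * (1 / n_treat + 1 / n_control)) ** 0.5
    return 100 * multiplier * se

# Hypothetical 2:1 treatment-control split among roughly 3,600 respondents
print(round(mde_binary(0.25, n_treat=2400, n_control=1200), 1))  # ~4.3 p.p.
```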

These MDEs are sufficient to detect likely impacts of policy relevance in some domains. For instance, impacts on outcomes in the educational progress domain at the time of the STS were substantially larger than those reported in Exhibit B-2. However, as was the case with the ITS, these MDEs are considerably larger than those used for the analysis of short-term impacts. (This is because the sample sizes for the ITS and LTS are much smaller than those for the STS.)



Exhibit B-2: Minimum Detectable Effects for Key Outcomes using the Long-Term Survey

Outcomes are grouped by domain; the associated MDE, in percentage points (p.p.), follows each measure.

Educational Progress
  • Completed any training within 66 months of randomization (program completion is defined as having earned any new credential since randomization): 5.5
  • Completed training by earning a credential or still in training 66 months after randomization: 5.2
  • Earned an exam-based professional, state, or industry certification or license as of 36 months after randomization: 4.2

Labor Market
  • Employed at a job that offers health insurance as of survey interview: 4.7

Healthcare Labor Supply
  • Current healthcare employment as of survey interview (“Is this occupation in the field of healthcare?”): 4.0
  • Current healthcare employment in a job that pays $20 or more per hour as of survey interview: 4.2

Career Progress
  • “Career connected,” meaning either employed full-time, attending school full-time, or a mixture of part-time work and school at the same time at 66 months after randomization: 5.7
  • Subjective perception of progress towards long-range education and career goals: 0.1

Well-being
  • Trouble making ends meet: 3.7
  • Receipt of TANF, SNAP, or Medicaid by anyone in household at survey interview: 3.5

Notes: “Minimum detectable” defined as 80 percent power given a one-sided test at the 5 percent significance level. Calculations use preliminary estimates of actual standard errors on impacts estimated for the same outcomes from the ITS. They do not adjust for potential changes to these estimates in the LTS, which might come from different sample sizes, variance reductions due to calibration of the LTS to the ITS and/or STS, or loss of information associated with people who responded to the critical items instrument instead of the main survey instrument. Since the effects of these complexities on variances operate in opposite directions, the net effect on MDEs should be modest.



B3. Design of Data Collection Instruments

Development of Data Collection Instrument

In developing the HPOG 2.0 LTS (Instruments 21 and 21a), the evaluation team included items used successfully in other surveys for similar interventions. The most important source was earlier HPOG 2.0 surveys previously approved under this OMB control number. Using similar items will allow exploration of how outcomes vary with time since random assignment. Other important sources were the long-term surveys from other Career Pathways studies (the first round of Health Profession Opportunity Grants (HPOG 1.0), OMB #0970-0394, and Pathways for Advancing Careers and Education (PACE), OMB #0970-0397). These surveys provide insights into how questions need to change for a longer follow-up period and which questions to add. Drawing from these instruments ensures that many of the survey questions have been thoroughly tested on large groups of people with similar characteristics.

The evaluation team added three new modules to this survey: one on barriers to pursuing longer healthcare training, one on informal employment, and the third on child outcomes.

  1. Earlier career pathways evaluations have conjectured a crucial role for longer trainings in achieving larger impacts on earnings (Juras and Buron, 2021).4 The module on barriers to pursuing longer healthcare trainings will allow us to understand some of the barriers that may prevent or delay successful completion of the longer credential trainings such as radiology technician, Licensed Practical Nurse (LPN), Registered Nurse (RN), etc.—the types of trainings that would lead to higher earnings in healthcare jobs. This battery of questions was newly developed for this module. The barriers included in the module were chosen based on feedback received from grantees during the preview of the Short-Term Impact Report findings and the annual grantee meeting in 2019. The evaluation team also drew on feedback provided by participants as part of the in-depth participant interviews (Instrument #17, previously approved under this OMB #0970-0462).

  2. The National Directory of New Hires (NDNH) is the primary source of earnings data for the evaluation. NDNH data, however, do not capture informal employment and self-employment. Understanding levels and patterns of such employment is therefore crucial for interpreting NDNH-based results. The module on informal employment will capture additional information on self-employment and informal job earnings. In the U.S. economy as a whole, Nightingale and Wandner (2011)5 estimate that informal employment represents five to 10 percent of GDP. Bracha and Burke (2017)6 estimate that about 20 percent of nonretired adults engage in informal employment. Abraham and Houseman (2019)7 estimate a higher percentage: 28 percent of the adult population engages in informal work in any given month. They also estimate that this work is an important source of household income for 11 percent of the adult population. Such informal employment may be particularly relevant for the study’s target population. For example, informal employment might be an important source of earnings for home health aides—a common occupation for HPOG program participants. It could also be an important form of employment for those in the HPOG control group. The first seven questions in this module are about work not reported in the employment history. They are based on the Joint Program in Survey Methodology module by Abraham and Amaya.8,9 Questions 8 and 9 are intended to collect employee perceptions of how compliant their employers are with requirements that employers pay employment taxes and withhold employee taxes (income tax withholding, Social Security payroll tax, and Medicare payroll tax). The study team did not find any good models for measuring employee perceptions of employer compliance, and therefore developed these questions anew.

  3. At five and a half years, enough time may have passed for benefits to flow down from the adult study participant to their children. The child module covers academic progress, educational attainment, time out of the home/supervision, family routines, educational goals, and, if applicable, transition to adulthood for one selected focal child. The child module was drawn from the long-term follow-up surveys for the previously approved PACE and HPOG 1.0 studies (OMB #0970-0397 and #0970-0394, respectively).

To add the new modules without increasing the expected survey time relative to the ITS, the LTS drops the full education, training, and employment history (Section A of the HPOG 2.0 ITS) and school details (Section B of the ITS). The LTS also drops the literacy and numeracy skills assessment (Section J of the ITS) and the COVID-19 module (Section H of the ITS).

This submission also seeks approval for a critical items only version of the LTS, the HPOG 2.0 Long-Term Survey Critical Items Instrument (LTS-CII), which is shorter in length (20 minutes as opposed to 60 minutes). This shorter version (Instrument 21a) is intended to serve as a tool to maximize response rates by offering participants who would otherwise likely become final refusals the opportunity to complete a shorter version of the instrument. As noted in Supporting Statement A and in Section B4, the use of the critical items only version of the HPOG 2.0 ITS helped improve the overall response rate for the key outcomes of interest and minimized the response rate differential between treatment and control group members. See Supporting Statement A for more detail on the research questions and instrument content, and Attachment AC, HPOG 2.0 Long-Term Follow-Up Survey Sources, for a listing of sources used for each question in the HPOG 2.0 LTS.


B4. Collection of Data and Quality Control

The procedures for this survey build on those developed for the STS and ITS efforts (Instrument 12, previously approved in June 2018, and Instruments 18 and 18a, previously approved in July 2019, with updates in June 2020). The survey will be programmed and administered using Confirmit Computer-Assisted Personal Interviewing (CAPI) technology. Interviewers will use tablets equipped with the Confirmit software to conduct the surveys. The evaluation contractor will send an advance letter to all participants selected for the LTS data collection effort. Trained interviewers will attempt to locate and interview respondents first by telephone and then in person.10 See the section on “Interviewing” below for more detail on potential COVID-19 implications for in-person follow-up. The remainder of this section describes the planned data collection procedures in more detail. We first describe procedures for maintaining contact information for all survey sample members and then describe the procedures for data collection.

Participant Contact Update Request Procedures

The participant contact update form (Instrument 5b, previously approved under this OMB control number) is self-administered.11 The form is mailed to sample members quarterly, beginning three months after random assignment. Participants are encouraged to update their information by returning the form by mail, through a secure online portal, or by calling the evaluation contractor’s toll-free number. Participants can indicate that the information is correct, or they can make any necessary changes to their contact information. The contact update requests improve the quality of the contact information in our records by allowing participants to update address information, add apartment numbers, correct zip codes, and update phone numbers—all of which helps to improve the accuracy of outreach efforts.

As approved in July 2021, the evaluation contractor will substitute one round of mailed contact update requests with contact updates conducted by phone. Local interviewers working under the evaluation contractor will make outbound calls to conduct a short check-in with study participants eight months prior to the start of the LTS. This call allows the interviewers to build upon the rapport with participants established during the short-term and intermediate-term follow-up data collection efforts. Interviewers will inform study participants about the next survey data collection effort and address any questions about the study. Interviewers will conclude the call by collecting updated contact data. All new contact information will be appended to the study database prior to the start of the LTS. (See previously approved Instrument 5b, HPOG 2.0 Contact Update Phone Version, which includes the script the evaluation contractor will use to make the outbound calls to collect the updated contact information.)

HPOG 2.0 Participant Newsletter. As approved under this OMB control number in July 2021, the evaluation contractor will also send participants in the LTS sample a participant newsletter. The HPOG 2.0 Participant Newsletter (Attachment AA) will remind participants that they are part of this study. This newsletter will thank participants for their continued cooperation and remind them of the importance of their participation—even if they were assigned to the control group and did not participate in the program. It will also include a summary of key HPOG 2.0 impact evaluation accomplishments since grants were awarded in 2015. Finally, it will explain the remaining data collection activities and offer participants an opportunity to update their contact information via the online system or on paper. This one-time newsletter will be sent 12 months prior to the release of the LTS, along with the standard contact update request form that would go out at that time.

LTS Procedures

Interviewer Staffing: An experienced, trained staff of interviewers will conduct the LTS with participants. To the extent possible, the evaluation contractor will recruit interviewers who worked successfully on the HPOG 2.0 STS, ITS, and COVID-19 Cohort STS data collection efforts. These interviewers are familiar with the HPOG 2.0 program, the career pathways model, and the study’s goals, and they have valuable experience working with this study population. All interviewers will participate in a training that includes didactic presentations, numerous hands-on practice exercises, and role-play interviews. The evaluation contractor’s training materials will build on those prepared for the prior HPOG 2.0 surveys.

Advance Letter: The evaluation team will mail an advance letter to all participants in the randomization cohort selected for inclusion in the LTS. The advance letter (Attachment AB) will be mailed to study participants selected to participate in the survey approximately one and a half weeks before interviewers begin data collection. The advance letter helps alert participants to the upcoming survey effort, so they are more prepared for the interviewer’s call. The letter provides each selected study participant with a toll-free number that they can call to set up an interview. See the previously approved Supporting Statement A from June 2018 for more information on the use of the advance letter.

Email Reminder: Interviewers will attempt to contact participants by telephone first. If initial phone attempts are unsuccessful, interviewers can use their project-specific email accounts to introduce themselves as the local data collection staff, explain the study, and attempt to set up an interview. Interviewers send this email, along with the advance letter, about halfway through the period during which they are working their cases (see Attachment AE, HPOG 2.0 Long-Term Follow-Up Survey Email Reminder Text).

Interviewing: The LTS data collection will be mixed mode: phone with in-person follow-up, if it is determined appropriate to resume face-to-face interviewing. If COVID-19 pandemic restrictions are in place at the start of data collection in May 2023, in-person follow-up efforts will be contingent upon local conditions and adherence to CDC guidelines. If CDC guidance indicates that in-person data collection is too risky, interviewers will continue to work cases only by telephone, as they did for the ITS. Data collection begins when interviewers attempt to reach the selected study participants by telephone, using the full contact information history for the respondent and any alternate contacts (such as family or friends whom study participants identified as people who will know how to reach them). After the interviewers exhaust all phone efforts, they will work non-completed cases in person (if feasible). Interviewers may leave specially designed project flyers with family members or friends (see Attachment AD, HPOG 2.0 Long-Term Follow-Up Survey Trying to Reach You Flyer).

Sample Release Schedule: Pending OMB approval, we will release the full sample in May 2023, and data collection will continue through December 2023. Although prior waves of survey data collection released sample for interviewing on a monthly basis, we propose a single release with a short six-month data collection period for the LTS. This single release schedule will minimize costs to the federal government. We used the higher-cost staged release for the STS and the ITS because of concern that variable length of follow-up could confound treatment impacts. At five and a half years, we anticipate that little variation in outcomes will result from a few months’ difference in the length of follow-up, so our previous concern about the potential for this variance to confound treatment impacts is sharply reduced. As shown in Exhibit B-3, under this release schedule, participants will be interviewed as early as 64 months or as late as 74 months after randomization.


Exhibit B-3: Time Elapsed between Random Assignment and Survey Completion

Enrollment Cohort    Months after Random Assignment    Months Elapsed Between Random Assignment
                     at Time of Release                and Date of Interview
                                                       Minimum          Maximum
September 2017       68                                68               74
October 2017         67                                67               73
November 2017        66                                66               72
December 2017        65                                65               71
January 2018         64                                64               70



B5. Response Rates and Potential Nonresponse Bias

The LTS is expected to start in May 2023. At that point, the hope is that the COVID-19 pandemic will have stabilized and the social distancing guidelines will be lifted, allowing for a full mixed-mode phone with in-person follow-up interviewing effort. As such, we expect the data collection approach and the methods used to maximize response rates for the LTS will be nearly identical to those approved for use in the HPOG 2.0 STS (Instrument 12) and the ITS (phone with in-person follow-up).12 Specifically, the evaluation team will use the following methods to maximize response to the LTS:

  • Participant contact updates and locating (as described above in Section B4);

  • Tokens of Appreciation (as described in Supporting Statement A, Section A9); and

  • Sample control during the data collection period (as described in Attachment AG).

The ITS introduced another data collection strategy to maximize the response rate: the shorter, critical items only version of the instrument. The LTS will also incorporate the critical items instrument (CII) approach to enhance overall response rates and reduce potential nonresponse for the most critical items of interest.


For the HPOG 2.0 STS, after excluding those participants who withdrew from the study or were ineligible to participate, the evaluation team completed an interview with 9,620 of the 12,923 participants eligible for the data collection:13 a 74.4 percent response rate. It is common for the response rate to decrease between follow-up survey waves. With a response rate of 65.8 percent for the HPOG 2.0 ITS, the decrease in response rates between the STS and ITS was 8.6 percentage points. The comparable career pathways studies PACE and HPOG 1.0 had 3.7 and 3.1 percentage point decreases, respectively, in response rates between their STS and ITS data collection efforts. The larger HPOG 2.0 decrease between waves was likely due in large part to the inability to conduct data collection in-person for the ITS due to COVID-19 pandemic social distancing guidelines.


Experience with the recently completed ITS data collection also showed that the (much shorter) critical items instrument (Instrument 18a) increased the overall response rate (by 9.5 percentage points) and reduced the response rate differential between treatment and control group members (by 2.4 percentage points). This information collection request therefore includes an analogous critical items version of the full LTS (Instrument 21): the LTS-CII (Instrument 21a), which is significantly shorter and captures only the most critical outcomes. While the full survey requires 60 minutes, the shorter LTS-CII can be completed in just 20 minutes. Balancing the time elapsed between the ITS and the LTS and the expected reduction in response rates between waves against the expected increase in response rates if in-person follow-up activities are allowed, we estimate that a 73 percent response rate is feasible for the LTS. The projection is that the full LTS will yield a 62 percent response rate and the remaining 11 percentage points will come from the LTS-CII.


We plan to use the same nonresponse analyses and adjustments as we used for the Intermediate-Term Impact Report. For purposes of nonresponse adjustment weighting, the evaluation team will consider respondents to the LTS-CII as if they had responded to the full LTS. Since the LTS-CII respondents will provide much less detailed information than respondents to the full LTS, analysis will require imputation of most of the LTS items for those who complete the LTS-CII. We developed procedures for this imputation as part of the work preparing the Intermediate-Term Impact Report. Sensitivity analyses conducted in support of the Intermediate-Term Impact Report (documented in forthcoming appendices to that report) indicate acceptable performance. In addition to the items in the CII, these procedures draw on administrative data to improve the quality of the imputations. We use National Student Clearinghouse (NSC) data to inform the imputation of education outcomes and NDNH data to inform the imputation of labor market and well-being outcomes. We will apply the same procedures for the Long-Term Impact Report.
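The study’s actual specification was developed for the Intermediate-Term Impact Report; as a loose illustration of the general approach (fit a model on full-LTS respondents, then draw imputed values for LTS-CII respondents using CII items plus administrative covariates), consider the sketch below. All variable names are hypothetical.

```python
# Loose illustration only, not the study's actual specification: impute a
# binary full-survey outcome for LTS-CII respondents from a model fit on
# full-LTS respondents. Predictors combine CII items with administrative
# covariates (e.g., an NSC enrollment flag, NDNH earnings); all names are
# hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

PREDICTORS = ["cii_employed", "cii_in_training", "nsc_enrolled", "ndnh_earnings"]

def impute_outcome(full_lts: pd.DataFrame, cii_only: pd.DataFrame,
                   outcome: str = "healthcare_job", seed: int = 0) -> pd.Series:
    """Fit on full-LTS respondents; stochastically impute for CII respondents."""
    model = LogisticRegression(max_iter=1000)
    model.fit(full_lts[PREDICTORS], full_lts[outcome])
    p = model.predict_proba(cii_only[PREDICTORS])[:, 1]
    # Draw from the predicted probabilities rather than assigning the modal
    # class, so the imputations preserve the variance of the outcome.
    rng = np.random.default_rng(seed)
    return pd.Series(rng.binomial(1, p), index=cii_only.index, name=outcome)
```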


B6. Production of Estimates and Projections

The evaluation contractor will estimate impacts on outcomes measured for the LTS sample using methods similar to those used for the HPOG 2.0 STS and ITS samples. Results of those analyses will be disseminated using approaches similar to those planned for the Intermediate-Term Impact Report. Those approaches include a report (with an Overview and an Executive Summary), a technical appendix volume, and several briefs. These primary products will be referenced on OPRE’s website, in regular emails on new findings, and through Twitter and other new media.

ACF anticipates that a wide range of policymakers and policy analysts will use these reports in deciding whether to continue funding programs such as HPOG and what guidance to include for grantees in funding opportunity announcements. See the list of reports in the Data Use section of B7 below for more information on the evaluation design and analysis plans.

ACF will work with the evaluation contractor to prepare data for archiving, with thorough documentation of data structures to maximize the chances that secondary analysts will be able to prepare high-quality reports and papers. Except for some highly sensitive information (e.g., name, date of birth, contact information), all of the data and codebook documentation will be made available to secondary researchers through the Child and Family Data Archive, maintained by the Inter-university Consortium for Political and Social Research (ICPSR), at https://www.childandfamilydataarchive.org/cfda/pages/cfda/index.html. The work of secondary analysts will also be made easier by the very detailed technical methods appendices that have been prepared for the Short-Term Impact Report and those to be prepared for the Intermediate-Term Impact Report and Long-Term Impact Report.


B7. Data Handling and Analysis

The LTS will be an additional component under the HPOG 2.0 National Evaluation impact evaluation. Please refer to prior revisions of OMB #0970-0462 for additional details on the data handling and analysis for the previously approved HPOG 2.0 National Evaluation descriptive evaluation, cost-benefit analysis study, and earlier impact evaluation survey efforts. Prior revisions also cover the HPOG 2.0 Tribal Evaluation.


Data Handling

To ensure data security and enhance data quality, the trained interviewers administering the LTS will collect data on tablets using the Confirmit CAPI system. The Confirmit CAPI interviewer console requires a secure, login-based interface, and all case-level data files are stored and encrypted by the CAPI software installed on the devices. This includes survey data as well as PII associated with respondents and their secondary contacts. All survey data will be transmitted to Abt servers via a secure internet connection, and the CAPI system itself encrypts data flowing between interviewer consoles and the centralized server. Datasets generated by the CAPI system will be stored in a restricted-access folder on Abt Associates’ secure Analytical Computing Environment (ACE3), a FISMA-moderate server, where most analyses will be conducted. Once Abt programmers are satisfied with the extraction of survey data from the CAPI software into SAS, the SAS files will be transferred to Abt analysts. Data transfer between the survey system and the analysis system will be conducted through Abt’s secure online file transfer platform, which utilizes FIPS 140-2 validated cryptographic modules. Only those project research staff with security approval will be granted access to the data. All analyses will take place within the secure ACE3 computing environment at Abt Associates.


Data Analysis

As mentioned in Section B6, a detailed analysis plan was prepared for the Short-Term Impact Report (Judkins, Klerman, and Locke 2020) and the Intermediate-Term Impact Report (Judkins, Prenovitz, Klerman, Durham, and Locke 2021).14,15 Those earlier plans will serve as a starting point for the Long-Term Impact Report Analysis Plan. The analysis plan for the Long-Term Impact Report will include the technical details on estimation, weighting, and covariate selection, and will describe how the information collected will be used or interpreted in conjunction with other sources of information. The Long-Term Impact Report Analysis Plan will guide the analysis and presentation of results.


Publicly Posted Data

The HPOG 2.0 National Evaluation Impact Study, the Short-Term Impact Analysis Plan, and the Intermediate-Term Impact Analysis Plan have been registered at the Registry of Efficacy and Effectiveness Studies (REES), registration ID 1948.1v1.16 They are also registered with the Open Science Framework (OSF).17 Once the Long-Term Impact Report Analysis Plan is developed and finalized, it will also be registered and publicly posted with REES and OSF before analysis for the Long-Term Impact Report begins.



Data Use

With ACF oversight, Abt and its partners MEF Associates, the Urban Institute, Insight Policy Research, and NORC are responsible for conducting the HPOG 2.0 National and Tribal Evaluation. This team has published several evaluation design and analysis plans detailing how the data collection approved under this OMB control number will be used. These include:

  • https://www.acf.hhs.gov/opre/report/national-and-tribal-evaluation-2nd-generation-health-profession-opportunity-grants-0; and

  • the Intermediate-Term Impact Study Analysis Plan: https://www.acf.hhs.gov/opre/report/analysis-plan-hpog-20-national-evaluation-intermediate-Term-impact-report.


See Supporting Statement A for more detail on the timeline for publication—planned or actual—of the Short-Term Impact Report, the Intermediate-Term Impact Report, and the Long-Term Impact Report. The primary output for LTS analyses will be the Long-Term Impact Report. The appendices prepared for the Short-Term Impact Report and Intermediate-Term Impact Report are very thorough, and we plan a similar effort for the Long-Term Impact Report, including clear statements of the limitations of the data and explorations of data quality.

As noted in Section B6, ACF and the evaluation contractor will archive data and develop guidelines on how to use and interpret the data to support secondary analysis.

B8. Contact Person(s)

The individuals listed in Exhibit B-4 below contributed to this information collection request.

Exhibit B-4: Contributors

Name              Role in HPOG 2.0 National and Tribal Evaluation      Organization/Affiliation
Larry Buron       National Evaluation Project Director                 Abt Associates
Jacob Klerman     National Evaluation Principal Investigator           Abt Associates
Jill Hamadyk      National Evaluation Deputy Project Director          Abt Associates
Larry Buron       National Evaluation Project Quality Advisor          Abt Associates
David Judkins     National Evaluation Director of Impact Analysis      Abt Associates
Debi McInnis      National Evaluation Site Coordinator                 Abt Associates



Inquiries regarding the statistical aspects of the HPOG 2.0 National Evaluation design should be directed to:

Larry Buron, Project Director

Abt Associates

6130 Executive Boulevard

Rockville, MD 20852

(301) 634-1735



The ACF project officers, Nicole Constance and Lisa Zingman, have overseen the design process and can be contacted at:



Nicole Constance

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services

330 C Street S.W., 4th Floor, Washington, D.C. 20201

(202) 401-7260


Lisa Zingman

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services

330 C Street S.W., 4th Floor, Washington, D.C. 20201

(202) 260-0323






Instrument and Attachments-Current Request

New Instruments:

  • Instrument 21: HPOG 2.0 Long-Term Survey (LTS)

  • Instrument 21a: HPOG 2.0 Long-Term Survey Critical Items Instrument (LTS-CII)

Previously Approved Instruments Still in Use:

  • Instrument 5b: HPOG 2.0 National Evaluation participant contact update letter and form

  • Instrument 5b: HPOG 2.0 Contact Update Form Phone Version

  • Instrument 12a: COVID-19 Cohort Short-Term Follow-up Survey

  • Instrument 12b: COVID-19 Cohort Short-Term Follow-up Survey _Critical Items Only


New Attachments

  • Attachment AB: HPOG 2.0 Long-Term Follow-Up Survey Advance Letter

  • Attachment AC: HPOG 2.0 Long-Term Follow-Up Survey Sources

  • Attachment AD: HPOG 2.0 Long-Term Follow-Up Survey Trying to Reach You Flyer

  • Attachment AE: HPOG 2.0 Long-Term Follow-Up Survey Email Reminder Text

  • Attachment AF: HPOG 2.0 Previously Approved Sample Selection for the STS and ITS

  • Attachment AG: HPOG 2.0 Previously Approved Methods to Maximize Response Rates

Previously Approved Attachments Still in Use

  • Attachment H: HPOG Logic Model

  • Attachment K-Revised: COVID-19 Cohort Short-term Survey Advance Letter

  • Attachment L-Revised: COVID-19 Short-Term Survey Sources

  • Attachment M-Revised: COVID-19 Cohort Short-term Survey Trying to Reach You Flyer

  • Attachment N-Revised: COVID-19 Cohort Short-Term Survey Email Reminder Text

  • Attachment AA: HPOG 2.0 Participant Newsletter


Instruments and Attachments – Previously Approved, No Longer in Use

Previously Approved Instruments No Longer in Use:

  • Instrument 1: PAGES Grantee- and Participant-Level Data Items List

  • Instrument 2: HPOG 2.0 National Evaluation Screening Interview

  • Instrument 3: HPOG 2.0 National Evaluation first-round telephone interview protocol

  • Instrument 4: HPOG 2.0 National Evaluation in-person implementation interviews

    • Instrument 4A HPOG 2.0 National Evaluation In-Person Implementation Interview

    • Instrument 4B HPOG 2.0 National Evaluation In-Person Implementation Interviews Basic Skills Training

    • Instrument 4C HPOG 2.0 National Evaluation In-Person Implementation Interviews Career Pathways

    • Instrument 4D HPOG 2.0 National Evaluation In-Person Implementation Interviews Work-Readiness

    • Instrument 4E HPOG 2.0 National Evaluation In-Person Implementation Interviews Sustainability

  • Instrument 5: HPOG 2.0 National Evaluation welcome packet and participant contact update forms

  • Instrument 5a: HPOG 2.0 National Evaluation welcome packet and contact update form_REV

  • Instrument 6: HPOG 2.0 Tribal Evaluation grantee and partner administrative staff interviews

  • Instrument 7: HPOG 2.0 Tribal Evaluation program implementation staff interviews

  • Instrument 8: HPOG 2.0 Tribal Evaluation employer interviews

  • Instrument 9: HPOG 2.0 Tribal Evaluation program participant focus groups

  • Instrument 10: HPOG 2.0 Tribal Evaluation program participant completer interviews

  • Instrument 11: HPOG 2.0 Tribal Evaluation program participant non-completer interviews

  • Instrument 12: HPOG 2.0 National Evaluation Short-term Follow-up Survey

  • Instrument 13: HPOG 2.0 Screening Interview Second Round

  • Instrument 14: HPOG 2.0 Second Round Telephone Interview Guide

  • Instrument 15: HPOG 2.0 Program Operator Interview Guide for Systems Study

  • Instrument 16: HPOG 2.0 Partner Interview Guide for Systems Study

  • Instrument 17: HPOG 2.0 Participant In-depth Interview Guide

  • Instrument 18: HPOG 2.0 Intermediate Follow-up Survey_ REV_June2020

    • Instrument 18a: HPOG 2.0 Intermediate Follow-up Survey_Critical Items Only

  • Instrument 19: HPOG 2.0 Phone-based Skills Assessment Pilot Study Instrument

  • Instrument 20: HPOG 2.0 Program Cost Survey

Previously Approved Attachments No Longer in Use

  • Attachment A: References

  • Attachment B: Previously Approved Informed Consent Forms

    • Attachment B: National Evaluation informed consent form A (Lottery Required)

    • Attachment B: National Evaluation informed consent form B (Lottery Not Required)

    • Attachment B: National Evaluation Informed Consent Form C (Lottery Required) _Verbal

    • Attachment B: National Evaluation Informed Consent Form D (Lottery Not Required) _Verbal

  • Attachment B: Informed Consent Forms, Updated Time Period

    • Attachment B: National Evaluation Informed Consent Form A (Lottery Required) _REV

    • Attachment B: National Evaluation Informed Consent Form C (Lottery Required) _Verbal_REV

    • Attachment B2: Tribal Evaluation informed consent form A (SSNs)

    • Attachment B3: Tribal Evaluation informed consent form B (Unique identifiers)

    • Attachment B2: Tribal Evaluation Informed Consent Form C (SSNs)_Verbal

    • Attachment B3: Tribal Evaluation Informed Consent Form D (Unique identifiers) _Verbal

  • Attachment C: 60-Day Federal Register Notice

  • Attachment D: Previously Approved Sources and Justification for PAGES Grantee- and Participant-Level Data Items

  • Attachment E: Previously Approved Final Updated Attachment E PPR Data List and Mockup

  • Attachment F: First Round of HPOG Grantees Research Portfolio

  • Attachment G: Previously Approved Participant Contact Information Update Letter and Form (Obsolete, replaced by Instrument 5a and 5b)

  • Attachment I: Previously Approved Focus Group Participant Consent Form

  • Attachment I: New Focus Group Participant Consent Form_Remote

  • Attachment J: Previously Approved Interview Verbal Informed Consent Form

  • Attachment J: New Interview Verbal Informed Consent Form_Remote

  • Attachment K: HPOG 2.0 National Evaluation Short-term Follow-up Survey Advance Letter

  • Attachment L: HPOG 2.0 National Evaluation Short-term Follow-up Survey Sources

  • Attachment M: HPOG 2.0 National Evaluation Short-term Follow-up Survey Trying to Reach You Flyer

  • Attachment N: HPOG 2.0 National Evaluation Short-term Follow-up Survey Email Reminder

  • Attachment O: Research Questions for Previously Approved Data Collection Efforts (National Evaluation and Tribal Evaluation)

  • Attachment P: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Advance Letter

  • Attachment P: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Advance Letter_REV

  • Attachment Q: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Sources

  • Attachment Q: Intermediate Follow-up Survey Sources_REV

  • Attachment R: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Trying to Reach You Flyer

  • Attachment R: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Trying to Reach You Flyer_REV

  • Attachment S: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Email Reminder

  • Attachment S: HPOG 2.0 National Evaluation Intermediate Follow-up Survey Email Reminder_REV

  • Attachment T: HPOG 2.0 National Evaluation phone-based Skills Assessment Pilot flyer

  • Attachment U: HPOG 2.0 National Evaluation phone-based Skills Assessment Pilot grantee letter

  • Attachment V: HPOG 2.0 National Evaluation phone-based Skills Assessment Pilot participant letter

  • Attachment W: HPOG 2.0 National Evaluation phone-based Skills Assessment Pilot recruitment script

  • Attachment X: Complete list of previously approved data collection instruments

  • Attachment Y: 60-day Federal Register Notice

  • Attachment Z: Participant Interview Recruitment Materials


1 Longer-term follow-up typically yields smaller sample sizes than intermediate-term follow-up. However, in this case, we might actually get larger sample sizes at long-term follow-up since we expect that we will be able to use in-person interviews (which was not possible with the ITS due to COVID-19).

2 Specifically, an R-squared of 0.08 and an ICC of 0.01.

3 This mean corresponds to the control group mean of a similar outcome measured in the ITS.

4 Juras, Randall, and Larry Buron. 2021. Summary and Insights from the Ten PACE and HPOG 1.0 Job Training Evaluations: Three-Year Cross-Site Report. OPRE Report 2021-155. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

5 Nightingale, Demetra Smith, and Stephen A. Wandner. 2011. “Informal and Nonstandard Employment in the United States: Implications for Low-Income Working Families.” Washington, DC: The Urban Institute.

6 Bracha, Anat, and Mary A. Burke. 2016. “Who Counts as Employed? Informal Work, Employment Status, and Labor Market Slack.” FRB of Boston Working Paper No. 16-29. Available at SSRN: https://ssrn.com/abstract=2935535.

7 Abraham, Katharine G., and Susan N. Houseman. 2019. “Making Ends Meet: The Role of Informal Work in Supplementing Americans’ Income.” RSF: The Russell Sage Foundation Journal of the Social Sciences 5(5): 110–31. DOI: 10.7758/RSF.2019.5.5.06.

8 There are three different versions of Q1. Which version is asked will vary with the respondent’s employment history (as reported earlier in the survey).

9 Abraham, Katharine G., and Ashley Amaya. 2018. “Probing for Informal Work Activity.” NBER Working Paper No. 24880, revised 2019. Cambridge, Mass.: National Bureau of Economic Research.


10 During the ITS data collection period, the COVID-19 pandemic and CDC social distancing guidelines prevented in-person data collection efforts, so all interviews were done by telephone.

11 As of this request for clearance, contact updates are sent to two groups of participants: participants enrolled between September 2017 and January 2018 (those who are in the LTS sample), and participants enrolled between May 2020 and September 2021 (those who are in the ongoing COVID-19 Cohort Short-Term Follow-Up Survey sample, previously approved in October 2021).

12 While the ITS data collection used the same procedures for telephone interviewing as the STS, the COVID-19 pandemic prevented the use of in-person data collection. Thus, the ITS was done by telephone only.

13 13,907 people were randomized during the sample window. Of these, 43 withdrew consent to have their data used in the study for any purpose, including nonresponse analysis. Another 42 became ineligible for the following reasons: death, incarceration, or inability to respond because of a permanent health/disability issue.

14 Judkins, David Ross, Jacob Alex Klerman, and Gretchen Locke. 2020. Analysis Plan for the HPOG 2.0 National Evaluation Short-Term Impact Report. OPRE Report 2020-07. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.

15 Judkins, David Ross, Sarah Prenovitz, Gabriel Durham, Jacob Alex Klerman, and Gretchen Locke. 2021. Analysis Plan for the HPOG 2.0 National Evaluation Intermediate-Term Impact Report. OPRE Report 2021-176. Washington, DC: Office of Planning, Research, and Evaluation, Administration for Children and Families, U.S. Department of Health and Human Services.


