Pell Grant Experiments Study

OMB: 1850-0892




Evaluation of the Pell Grant Experiments Under the Experimental Sites Initiative

OMB Supporting Statement:

Part A

July 3, 2012


















TABLES

A.1 PGE Data Collection Plan

A.2 Use of the Data to Answer Study Research Questions

A.3 Hour Burden Estimates for PGE Schools

A.4 Hour Burden Estimates for Survey Data Collection

A.5 Total Hour Burden Estimates for Experiments 1 and 2

A.6 Cost Burden Estimates for PGE Schools

A.7 Cost Burden Estimates for Survey Data Collection

A.8 Total Cost Burden Estimates for Experiments 1 and 2

A.9 Preliminary Project Schedule of Activities





FIGURES

A.1 Stylized Model of the Recruitment, Enrollment, and Random Assignment Process for PGE When There Is Need-Blind Admissions

A.2 Time Line for the Pell Grant Experiments Study



PART A: SUPPORTING STATEMENT FOR PAPERWORK
REDUCTION ACT SUBMISSION

The Institute of Education Sciences (IES) at the U.S. Department of Education (ED) requests approval to conduct an evaluation of the effects of two Pell Grant Experiments (PGE) demonstrations under the Experimental Sites Initiative (ESI). The ESI, authorized by section 487A(b) of the Higher Education Act of 1965 (HEA), allows the Secretary to grant waivers from specific Title IV HEA statutory or regulatory requirements to enable institutions to test alternative methods for administering those federal student aid programs. The two demonstrations are targeted to income-eligible postsecondary students who are interested in vocational training but could not otherwise receive a Pell grant because: (1) they currently have a bachelor’s degree, or (2) they seek to enroll in a vocational program that is shorter than the current minimum duration and clock-hour requirements. Because of the potentially high costs – and benefits – of expanding Pell grant eligibility in these two ways, ED has decided to rigorously assess the demonstration programs using a random assignment design. The study will examine the impacts of each experiment on employment and earnings, participation in education and training and job support activities, and student debt and financial aid receipt.

OVERVIEW OF THE DEMONSTRATIONS AND STUDY APPROACH

Under the ESI, Title IV institutions choose to participate in demonstrations or “experiments” in response to a notice from ED’s Office of Federal Student Aid (FSA). FSA published such a notice in October 2011, inviting postsecondary schools to participate in any of eight different experiments,1 two of which expanded Pell grant eligibility for students seeking job training. That notice also specified the institutions’ obligations to provide data and to ensure that a control or comparison group could be formed so that the effects of participating in the experiments could be evaluated. In subsequent webinars, FSA has provided additional detail to interested institutions about the demonstrations and the evaluation.

1. The Two Pell Grant Experiments (PGE)

Under the current ESI, postsecondary schools will receive waivers to enable them to provide Pell Grants to students who would not otherwise qualify under current Pell Grant rules. The PGE evaluation will include two substudies, each of which relaxes one eligibility criterion for receipt of a Pell Grant:

  1. Experiment 1. Students who already hold a bachelor’s degree and who document that they are unemployed or underemployed will be able to receive Pell Grant award support. This support can be for up to a one-year program of vocational education intended to help them obtain employment, to be used over no more than two award years. Current rules do not allow individuals with a bachelor’s degree to receive Pell support unless it is to be used for teacher certification or licensure.

  2. Experiment 2. Students will be able to receive a prorated amount of Pell Grant financial support for short-term vocational training that lasts for at least 150 clock hours over a period of at least 8 weeks. Current rules require that a student’s academic program be at least 600 clock hours (or an equivalent in semester, trimester, or quarter hours) over at least 15 weeks to qualify for Pell support.

2. Selecting Schools

The study school sample consists of schools that volunteered to implement Experiment 1 and/or Experiment 2, were in good standing in administering Title IV programs (e.g., with respect to compliance and default rates), and agreed to meet the requirements of the evaluation. ED expects the sample to include a maximum of 28 schools for Experiment 1 and 40 for Experiment 2, with approximately 17 intending to participate in both experiments. Thus, although only 51 distinct schools will participate, each experiment will be studied separately, so there will be a total of 68 school experiments underway. Each school will identify the set of vocational or job training programs to which the experiments will apply.

3. Identifying Eligible Students

Recruitment, enrollment, and random assignment of sample members into the PGE study will be the same for both substudies and will involve several steps (Figure A.1). Participating schools will recruit applicants and encourage them to submit both the Free Application for Federal Student Aid (FAFSA) (typically completed online) and an application to the PGE-eligible program in which the student wants to enroll. Simultaneously or sequentially, FSA will process the FAFSA and the school will determine whether the student can be admitted to the vocational program. Students will receive a Student Aid Report (SAR) and schools an Institutional Student Information Record (ISIR), which provides an assessment of the applicant’s expected family contribution (EFC) towards his or her educational expenses.

Because the potential participants in the study would not ordinarily be eligible for Pell grants, by virtue of their educational characteristics or their program, the PGE schools will need to determine a way to identify candidates for the experiments rather than processing their aid packages in the usual manner. Most likely, the institutions will ensure that financial aid office staff flag students who apply to the PGE eligible programs and review their ISIRs separately.

4. Random Assignment

Once candidates for the experiments are identified by the institutions, school staff will send these eligible individuals information about the study, provided by the evaluation contractor, that also requests students’ consent to participate. School staff will then enter the names and Social Security numbers of eligible admitted applicants who have given consent, along with a very limited amount of other information about the individual and the PGE program, into a web-accessible, study-specific random assignment system so that random assignment can be conducted.2 In real time (with little delay), the school will be notified of the research group status of each study participant. The proportion assigned to the treatment group versus the control group will depend on the number of eligible candidates the institutions expect to identify.
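To illustrate the mechanics of this step, the brief sketch below shows the kind of assignment logic the web-based system would apply to each consenting applicant. It is a simplified illustration only: the function name, the duplicate check, and the 50/50 example split are assumptions for this sketch, and the actual system will randomize within programs (see footnote 2) using ratios set for each school.

```python
import random

# Simplified sketch of the random assignment step; not the actual study system.
def assign_participant(ssn, program, treatment_share, assigned, rng=None):
    """Assign one consenting, eligible applicant to the treatment or control group.

    treatment_share is the school-specific probability of assignment to treatment,
    chosen according to how many eligible candidates the school expects to identify.
    The `assigned` dict prevents anyone from going through random assignment twice.
    """
    rng = rng or random.Random()
    if ssn in assigned:
        return assigned[ssn]
    group = "treatment" if rng.random() < treatment_share else "control"
    assigned[ssn] = (program, group)
    return assigned[ssn]

# Example: a school expecting a large pool of candidates might use a 50/50 split.
assignments = {}
print(assign_participant("000-00-0000", "HVAC certificate", 0.5, assignments))
```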

Figure A.1. Stylized Model of the Recruitment, Enrollment, and Random Assignment Process for PGE When There Is Need-Blind Admissions


Control group members will have access to the normal financial support that they are eligible for (i.e., excluding a Pell Grant). Study participants assigned to the treatment group will be offered a Pell grant, and the school will take this into account in determining any other aid for which the student is eligible. The financial aid packages will then be provided to the study participants. Regardless of whether the participant is assigned to the treatment or control group, he or she can choose to enroll at the PGE school, enroll at another school to which he or she has been admitted, or pursue some other type of activity.3

It is expected that schools in Experiment 1 will enroll 100 participants, on average, while schools in Experiment 2 will enroll 200 participants into the study, for a total of 2,800 sample members in Experiment 1 and 8,000 in Experiment 2. Thus, total sample enrollment for the study will be 10,800. The study participants will consist of individuals who have been determined to be eligible for the study under either experiment and who have consented to be in the study.
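As a quick check, these totals follow directly from the expected school counts and average per-school enrollment (a sketch of the arithmetic only):

```python
# Expected sample sizes implied by the enrollment targets above.
exp1_total = 28 * 100            # Experiment 1: 2,800 sample members
exp2_total = 40 * 200            # Experiment 2: 8,000 sample members
print(exp1_total, exp2_total, exp1_total + exp2_total)   # 2800 8000 10800
```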

5. Collecting Data

Both substudies of PGE will have the same data collection plans. These plans include new burden imposed by two types of data collection efforts: (1) PGE school data for all study participants and (2) survey data from an expected 2,000 respondents out of a subsample of 2,500 participants randomly selected for the survey. The plans also include the use of two other types of data, FSA data and annual earnings data maintained by the Social Security Administration (SSA),4 that do not generate data collection burden on participating schools or students. These data are described in detail in Section A.2. Together, these data will provide a rich set of information from which we can estimate the impacts of expanded Pell grant eligibility on study participants’ employment and earnings, educational experiences, and student debt, as well as describe the characteristics of participants and their vocational programs.

6. Reporting

The schedules for sample enrollment and program participation, as well as when post-program outcomes can be observed, drive the project’s reporting schedule. The study is expected to last five years, from spring 2012 to June 2017 (Figure A.2).5 Enrollment of school applicants into the study is expected to start in summer 2012. Although each of the 68 experiments in the study might take a slightly different amount of time to complete its enrollment of study participants, enrollment for the study is expected to continue through spring 2014.

Most of the study participants who enroll in Experiment 2 are expected to complete their participation in education or training in a fairly short time (two to four months), while participants who enroll in Experiment 1 are expected to take 9 to 14 months, or up to two years if attending less than full time. It is expected that all sample members who participate in a PGE program will complete their training program by late 2014. The first full post-program calendar year for all study participants will be 2015, although many of the participants who entered the study early in the sample enrollment period are expected to have had a full year of post-program experiences before then. SSA data covering calendar year 2015 are expected to be available for analysis in preliminary form in spring/summer 2016, making it possible to draft a report and have it go through IES’ statutorily required review process for publication in late spring 2017.

A1. Circumstances Necessitating the Data Collection

Federal Pell grants are considered the foundation of higher education financial aid for low-income students. Although Pell grants are available to students with family incomes up to $60,000, most awards go to students with family incomes below $30,000. In 2009-2010, 8 million students received Pell grants to attend some kind of postsecondary education or training, at a total cost of $30 billion in government expenditures (U.S. Department of Education 2010).


However, not all income-eligible students obtain a Pell grant. For example, among postsecondary students from households with income below $10,000, 58 percent received a Pell grant while the remaining 42 percent did not (Baum and Payea 2011). Some may be unaware that such aid is available or may be unable to complete the process of obtaining the grants. But with unemployment above 8.5 percent in 2011, and some reports of unfilled openings for skilled jobs in certain occupations,6 higher education institutions have called for expanding Pell grants to help fill the skill training gap for low-income workers. The Pell grant experiments under the ESI provide an opportunity to test the use of federal funds in this way.


Under the ESI statute, the Secretary is required to review and evaluate the experiences of institutions that participate as experimental sites and, on a biennial basis, submit a report based on the review and evaluation to the authorizing committees (section 487A(b)(2)). If the study’s proposed data collection were not undertaken, ED would not be able to fulfill this mandate, and the demonstrations (and the funds set aside for them) would not yield rigorous evidence on whether expanding eligibility to unemployed and underemployed individuals with bachelor’s degrees (Experiment 1) or to individuals applying to short-term career training programs (Experiment 2) can raise their employment and earnings. The collaboration between IES and FSA on this PGE demonstration and random assignment evaluation will provide credible and reliable information to help guide future policy decisions in this area of federal financial aid.


In particular, the data collection for and evaluation of the two Pell Grant experiments will address the following research questions:


1. Characteristics of Participating Schools’ Programs and Applicants

  • What are the characteristics of the education or training programs identified by participating institutions as most suitable for the experiments (e.g., field of study, intensity, duration, and cost)? How are study applicants distributed across them?

  • What are the personal and family background characteristics of individuals who qualify for each of the experiments?

  • To what extent does the availability of Pell grants for short-term training under the experiments shift enrollments from longer-term programs traditionally eligible under Title IV to shorter ones?

2. Education-Related Outcomes

  • To what extent does the offer of a Pell grant affect enrollment in the participating institutions’ education and training programs? In what kinds of programs do study participants enroll?

  • Do students not offered a Pell grant go elsewhere for skills training?

  • What is the impact on education and training outcomes overall (e.g., credits earned, completion rates, time to complete, and credentials attained) and on access to support services such as job search assistance?

  • Does the offer of a Pell grant affect the amount of student debt incurred?

  • How are education and training costs distributed across different types of financing strategies, such as grants, loans, and personal or family savings?

  • Is there an impact on barriers that can hinder students from successfully completing their education and training plans?

3. Employment-Related Outcomes

  • What is the impact of being offered a Pell grant on short-term employment rates and earnings levels?

  • What are the characteristics of jobs attained (e.g., relatedness to PGE program, duration, availability of fringe benefits)?

Figure A.2. Time Line for the Pell Grant Experiments Study



A2. How, by Whom, and for What Purpose Is the Information to Be Used

Information for the PGE evaluation will be collected and analyzed by a contractor selected through a competitive procurement. Data from four sources will be used for the study: (1) PGE school data, (2) survey data, (3) FSA data, and (4) SSA data. Data from the first two sources generate new burden, while the other two do not. This section describes the four types of data in detail and then explains how each type of data will be used to answer the study’s research questions.

PGE school data. PGE schools will provide three types of data. First, on a weekly basis, school staff (most likely in the financial aid office) will send the evaluation contractor a list of the candidates identified as eligible for the experiments during that week who have been sent consent materials (see Table A.1). The list will include each candidate’s name, program, and the date consent material was sent to him or her. This will allow the contractor to track the consent process and ensure that only students with consent participate in the demonstrations.

The second type of PGE school data is that needed for random assignment. School staff will need to enter a limited amount of identifying information – name, SSN, date of birth, and program – about all consenting study participants into a web-accessible system. Entering these data will ensure that: (1) the integrity of the random assignment process is maintained; (2) no individual goes through random assignment more than once; and (3) the treatment and control groups are balanced on important characteristics. The data will be collected on a rolling basis as individuals enter the study.

The third type of data PGE schools will provide pertains to characteristics of students who have enrolled at the school, regardless of whether the study participants were assigned to the treatment or control group. These data will provide information about the rates of enrollment at PGE schools, the types of programs in which the students enrolled, and the students’ progress and completion rates. The data also will contain information about the financial aid offers given to study participants, regardless of whether or not they enrolled at the school. The first extract of these data, to be provided around February 2013, will be used to test the data collection process and ensure that schools correctly record data items. The second extract, to be provided around February 2014, will contain most of the study sample that enrolls at PGE schools. The third extract, to be provided around February 2015, will include the full expected history of participants’ PGE-related program experiences. This extract is timed so that it can include the program experiences of participants who enter the study near the end of the sample enrollment period and who participate in relatively long programs. However, collecting these school data extracts on an annual basis will be important to ensure that the data are being properly collected and retained for the study’s analysis.

Survey data. The survey of study participants is expected to take about 15 minutes to administer, and will collect information about participation in education and training programs; the methods used to pay for their education and training, such as the types of grants and loans received and the amount of student debt incurred; receipt of job search assistance from schools and other organizations; and employment and earnings outcomes. The survey data will cover experiences at both PGE and non-PGE schools. This is critical because it is expected that some study participants—from either the treatment group or the control group—will attend schools other than the PGE schools. (In contrast, the PGE school administrative data will cover enrollment experiences at PGE schools only.) Furthermore, the survey data will provide additional detail about sample members’ employment experiences compared to what will be available through the SSA data described later.

Given the funding available for the study, the survey will be fielded with a random subsample of 2,500 study participants, from which 2,000 completed surveys are expected. The survey sample size will limit the precision of estimated impacts and thus these analyses will be considered exploratory.

ED is exploring how best to balance the cost efficiencies of a relatively short survey fielding period with possible analysis and recall issues from surveying study participants at different points after their enrollment. One approach under consideration is to select survey participants from an early subset of the time window during which sample members enrolled in the study. This approach could allow for a somewhat compressed survey fielding effort, such as 12 months, which would be shorter than the full study enrollment period. Doing so would allow the survey to collect data on a uniform follow-up period for each survey participant, but the survey results might not generalize as well to all study participants. Another approach would be to randomly select survey participants from the full enrollment period; if this approach is taken, then the lengths of follow-up could vary across survey respondents. Ultimately, however, the approach taken is not expected to affect the estimates of survey burden reported in this package.

FSA data. The main type of data to be provided by FSA will come from the FAFSA form, which contains identifying, contact, and background information used to process individuals’ requests for federal student aid. For the PGE study, the identifying and locating information in the FAFSA will enable evaluation researchers to find sample members and request their participation in the survey. The background information will be used to describe the characteristics of study participants. In addition, it will enhance the estimation of program impacts by allowing the influence of these characteristics to be netted out of the estimation process. These data will be collected three times. The first extract, in November 2012, will serve as a test of the data extraction process early in the study enrollment period. The second, in November 2013, will provide the contact and locating information needed to field the survey. The third, in November 2014, will provide additional information about sample members’ access to and use of financial aid (items that are not available through the FAFSA), as well as FAFSA information for any study participants who enroll after the previous extracts were provided.

Table A.1. PGE Data Collection Plan

Data Type

New Data Collection Burden?

Frequency and Timing of Data Collection

Data Items

PGE School Data

Yes

Rolling basis during sample intake from summer 2012 to early 2014

Each study participant’s name, the PGE program to which he or she has applied, and the date consent materials were sent, for the purpose of tracking consent

Yes

Rolling basis during sample intake from summer 2012 to early 2014

Each study participant’s name, SSN, and date of birth, as well as the PGE program to which he or she has applied, for the purpose of random assignment

Yes

3 times (February 2013, 2014, and 2015)

Start/stop dates for enrollment at school, characteristics of the education/training participated in, credits earned, credentials attained, financial aid offered and received, placement and support services provided

Survey Data

Yes

July 2014 through March 2015

Participation in and characteristics of education/training, financial aid and student debts, employment-related and other assistance received, employment and earnings outcomes

FSA Data

No

3 times (November of 2012, 2013, and 2014)

Identifying information; locating information; background information on demographics, educational attainment, income, and assets, financial aid obtained

SSA Data

No

Once (fall 2016)

Annual earnings from Social-Security-covered employment, for calendar years 2011 through 2015


SSA data. Each year, employers are required to report to the Internal Revenue Service (IRS) the total earnings subject to Social Security for the calendar year. SSA receives these data from IRS to determine eligibility for Social Security benefits. The Master Earnings File contains earnings records for each worker with a SSN who has worked in covered employment.7 To protect individuals’ privacy, SSA does not release individual-level earnings data; instead, it provides summary earnings statistics for groups of people. Accordingly, the study team will request that SSA run computer programs to estimate earnings impacts (that is, differences in mean earnings between treatment and control group members) and their associated levels of statistical significance. The process will involve the study team sending SSA a data file containing identifying information (such as SSN, name, and date of birth) and other data needed to run the computer programs, as well as the code for the computer programs. SSA will match the study sample to the Master Earnings File, run the computer programs, verify that the output complies with regulations related to the privacy of the data, and send the output from these computer runs to the study team.
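The brief sketch below illustrates the kind of aggregate calculation such programs would perform: a difference in mean annual earnings between the treatment and control groups, with its statistical significance from a two-sample t-test. The earnings values are invented for illustration, and the actual programs, file layouts, and disclosure checks will be specified with SSA; only group-level results of this kind would leave SSA.

```python
from scipy import stats

# Hypothetical earnings values for illustration only; real records never leave SSA.
treatment_earnings = [21500, 0, 34200, 18750, 27300]   # annual earnings, treatment group
control_earnings = [19800, 0, 25400, 16900, 22100]     # annual earnings, control group

# Earnings impact estimate: difference in group means.
impact = (sum(treatment_earnings) / len(treatment_earnings)
          - sum(control_earnings) / len(control_earnings))

# Statistical significance via Welch's two-sample t-test.
t_stat, p_value = stats.ttest_ind(treatment_earnings, control_earnings, equal_var=False)

print(f"Estimated earnings impact: ${impact:,.0f} (p = {p_value:.3f})")
```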

The Master Earnings data are updated annually, with more than 90 percent of the records updated by August of the following calendar year. These data are complete by the following February. Hence, the lag in obtaining earnings information is about 8 to 14 months. For example, data pertaining to calendar year 2015 will become available in preliminary form in August 2016. The analysis will use SSA data from calendar years 2011 through 2015. Data covering calendar year 2011 will provide a clean measure of baseline earnings for all sample members. Data covering calendar year 2015 will provide a clean measure of post-program earnings for all sample members, including those who are among the last to enroll in the study. Data covering calendar years 2012 through 2014 will reflect a mix of earnings before, during, and after education and training, depending on when the study participants begin and end their programs.8


Table A.2. Use of the Data to Answer Study Research Questions

Research Question

Main Impacts, Exploratory Impacts, or Descriptive Question

PGE Schools Data

Survey Data

FSA Data

SSA Data

Characteristics of the School Applicants and their Desired Education or Training Programs

  • What are the characteristics of the education or training programs identified by participating institutions as most suitable for the experiments (e.g., field of study, intensity, duration, and cost)? How are study applicants distributed across the programs?

Descriptive

X


X

X

  • What are the personal and family background characteristics of individuals who qualify for each of the experiments?

Descriptive

X

X



Education-Related Outcomes

  • What is the impact of access to Pell Grant funding on participation in PGE education and training programs among individuals who qualify for Experiment 1 or Experiment 2?

Main impacts


X


X


  • For each substudy, what are the characteristics of the education and training programs in which study participants enroll, such as the field of study, intensity, duration, and cost?

Descriptive

X

X



  • For each substudy, what is the impact of access to Pell Grants on study participants’ education and training outcomes, such as their credits earned, completion rates, and credentials attained?

Exploratory impacts


X

X


  • For each substudy, what is the impact of access to Pell Grants on the amount of student debt incurred by study participants?

Exploratory impacts


X

X


  • For each substudy, how are the costs of study participants’ education and training distributed across different types of financing strategies, such as grants, loans, and personal or family savings?

Descriptive

X

X



  • What is the impact of access to Pell Grants on barriers that keep study participants from successfully completing their education and training plans?

Exploratory impacts


X

X


Employment-Related Outcomes

  • For each substudy, what is the impact of access to Pell Grant funding on study participants’ employment rates and earnings levels?

Main impacts



X

X

  • For each substudy, what are the characteristics of jobs that study participants attain?

Descriptive

X

X





A3. Uses of Improved Technology to Reduce Burden

The study will primarily rely on electronic technology to collect data. For each data collection activity, IES has selected the form of technology that will provide reliable information while minimizing respondent burden. This section describes the four activities and their use of technology to reduce burden.

  1. Collection of PGE school data. Two types of data will be collected from participating PGE schools: (1) a very limited amount of identifying information so that random assignment can be conducted; and (2) data extracts on the types of training programs that participants enroll in, their educational outcomes, and financial aid information. For the first type of data, the PGE study will use a web-based random assignment database that will facilitate ease of data entry by PGE school staff. The second type of data will be acquired in electronic format whenever possible to reduce the burden on PGE schools in providing the data. The contractor will set up an FTP site so that participating schools can upload data extracts easily and securely.

  2. Collection of survey data. The survey instrument for this evaluation has been designed to be mode-neutral, which means that it can be easily adapted to be completed through any of three modes: online, telephone, or mail. Allowing respondents to choose the survey mode they are most comfortable with can decrease response burden and signal flexibility to respondents. Nonetheless, among the modes a respondent may choose between, a web survey is preferred because it will lead to the highest response rate and lowest respondent burden.9 In addition, it provides valuable cost efficiencies in the administration of the survey. Although an online survey will likely provide most of the responses, telephone and paper modes will be offered to sample members who do not respond to the online survey. The current expectation is that approximately 50 percent of respondents will complete the survey online, with telephone and mail accounting for approximately 30 percent and 20 percent of completions, respectively.

  3. Collection of FSA data about study participants. Data available from the FSA office within ED will include the FAFSA for each study participant, as well as other information about their continued eligibility and receipt of federal financial aid. The FSA data extraction efforts will proceed using the established technology standards and protocols for secure transmission of the data.

  4. Use of Social Security earnings data for study participants. The data maintained by SSA will not be directly available to the evaluation researchers; rather, the evaluation team will provide electronic computer programs to SSA staff who will run the programs on the data and provide output to the evaluation team.

A4. Efforts to Identify Duplication

In designing the PGE evaluation, the study team has examined existing literature and data sources to ensure that this effort does not duplicate existing or available data. To the extent possible, we are relying on extant data and will merely obtain it from the appropriate agencies. Data on the program experiences and labor market outcomes of the target populations can be obtained only through the proposed data collection design.

The PGE school and survey data elements, for which OMB approval for data collection is requested in this package, do not currently exist for this study’s target populations (that is, students currently ineligible for Pell Grants) and can be collected only through the proposed data collection plan. Neither of these two types of data will contain information identical to that available through the FSA, SSA earnings records, or other sources. For example, although both the survey and SSA records will contain information about sample members’ employment and earnings, the employment and earnings measures from each data source will differ. The SSA data will contain information about earnings from jobs covered by Social Security and aggregated to a single total for a calendar year. In contrast, the survey data will provide information about the sample member’s hours worked, wage rate, occupation, and satisfaction with the current job. Including both the survey and SSA data in the study is important because the survey data will not be available for all sample members.

In addition, although both the survey and the PGE schools administrative data will provide information about sample members’ participation in education or training at PGE schools, the measures that are available from each source will differ. Thus, it is important for the study to collect data from both sources. Because administrative records data are not subject to recall bias, it is expected that they will provide more accurate information for some topics, such as the dates that study participants start and stop their enrollment in an education or training program. In addition, the administrative data will be available for all sample members. However, the administrative data will not contain any information about sample members’ participation in education and training programs not offered at PGE schools. In contrast, the survey data will collect information about education and training program experiences at all schools. Furthermore, the survey data will provide information about some topics that are unavailable through administrative records, such as why participants discontinued involvement in a program.

A5. Methods to Minimize Burden on Small Business Entities

This data collection effort does not involve small business or other small entities.

A6. Consequences of Not Collecting the Data

The data collected through this study will enable the evaluation to generate unbiased estimates of the impacts of possible changes to Pell Grant eligibility rules on participants’ education, training, employment and earnings. Given the cost implications of making these eligibility changes on a larger scale, evidence on impact is crucial. Without collecting the planned data for Experiments 1 and 2, only limited information will be available to justify the use of current funds OMB has set aside for the demonstrations.

Most of the data for this evaluation will be obtained from administrative data housed by PGE schools, the FSA, and the SSA. The survey component of the data collection effort, however, is also critical. The information collected through the survey instrument is the only direct input the study team will obtain from actual program participants. Data collected from the survey will support an in-depth understanding of the participation patterns and perceptions of the group of students the program intends to benefit. Moreover, because the survey will measure respondents’ assessments of the quality of their educational and training experiences, it is the primary means by which the evaluation can explore how specific aspects of training affect labor market outcomes. In this regard, the detailed data collected through the survey instrument allow for an assessment of the causal mechanisms that relate specific training experiences and skills to labor market outcomes.

A7. Special Data Collection Circumstances

This data collection does not involve any special circumstances.

A8. Federal Register Notice

As required by 5 CFR 1320.8 (d), a 60-day Federal Register notice was published (May 2, 2012) and a 30-day notice will be published. The Federal Register announcements provide the public an opportunity to review and comment on the planned data collection and evaluation. Consultations on the research design, random assignment of participants to treatment and control groups, survey sample design, questionnaire instrument revisions, and data collection will be part of the evaluation’s design phase. These consultations will further ensure the technical soundness of the study and the relevance of the findings.

A9. Respondent Payments

The survey component of the evaluation aims to provide detailed information on study participants’ engagement in and satisfaction with the training programs in which they participate (offered by both PGE and non-PGE schools), as well as their self-reported labor market status and experiences, their participation in services to help them attain employment, and their experiences financing their education. A survey will be conducted for subsamples of members of both the treatment and control groups in Experiments 1 and 2 in order to learn about their experiences under both the new Pell Grant eligibility rules and the current eligibility rules. Given this goal, it is critical to maximize cooperation among treatment and control group members and achieve high survey response rates, thereby ensuring the representativeness of the survey sample and providing data that are complete, valid, reliable, and unbiased. To this end, providing a modest payment to study participants who complete the survey is expected to significantly increase response rates and help ensure that the data collected from the sample are representative. Respondent payments can help achieve high response rates by increasing sample members’ propensity to respond (Singer et al. 2000). Studies offering respondent payments typically demonstrate decreased refusal rates and increased contact and cooperation rates. Among sample members who initially refuse to be interviewed, the payments increase refusal-conversion rates. Expenditures on respondent payments can also help contain the costs of administering the survey by increasing cooperation, which reduces the costs associated with recruiting additional survey participants. The use of respondent payments has become common practice for survey studies in recent years (Curtin et al. 2005).

IES plans to provide a two-tiered incentive structure to encourage high survey response rates. In all cases, a $5 up-front payment will be mailed to each survey sample member, along with a letter informing them about the survey and encouraging them to complete it. Under the two-tiered approach, the study team will offer an additional $10 payment to respondents who complete the survey online; the resulting $15 total is consistent with ED and OMB guidelines and is a gesture to thank individuals for the time they spend completing the survey. The second tier offers an additional $5 payment to respondents who complete the survey by telephone or mail. The lower amount reflects the higher administration costs associated with the telephone and mail survey modes and is intended to encourage respondents to select the less expensive online option.

Such a sign of appreciation motivates sample members to participate in a survey. The current study plan involves an initial email invitation to sample members that will invite them to follow a web link connecting them directly to the survey instrument. For those sample members without email addresses, the study team will attempt to make contact by telephone and provide an option to take the survey online, over the telephone, or by mail. Prospective respondents contacted in this fashion will be offered the $15 total payment during this recruitment phase for taking the survey online. As explained above, a $10 total incentive will be offered for completion of the survey by telephone or mail.

A10. Confidentiality

Policies and procedures related to confidentiality, physical and technical safeguards, and approaches to the treatment of personally identifiable information (PII) will be followed to ensure the confidentiality and protection of all data collected about study participants.

a. Confidentiality Policy

ED, its evaluation contractor (which has yet to be selected), and all other parties involved in the evaluation of the PGE will follow the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183. The act requires “All collection, maintenance, use, and wide dissemination of data by the Institute” to “conform with the requirements of section 552 of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act (20 U.S.C. 1232g, 1232h).” These citations refer to the Privacy Act, the Family Educational Rights and Privacy Act, and the Protection of Pupil Rights Amendment. In addition, for student information, “The Director shall ensure that all individually identifiable information about students, their academic achievements, their families, and information with respect to individual schools, shall remain confidential in accordance with section 552a of title 5, United States Code, the confidentiality standards of subsection (c) of this section, and sections 444 and 445 of the General Education Provision Act.” Subsection (c) of section 183 referenced above requires the director of the Institute of Education Sciences to “develop and enforce standards designed to protect the confidentiality of persons in the collection, reporting, and publication of data.” Subsection (d) of section 183 prohibits disclosure of individually identifiable information as well as making the publishing or communicating of individually identifiable information by employees or staff a felony.

b. Confidentiality Safeguards

ED, the evaluation contractor, and all other parties involved in the evaluation of the PGE will protect the confidentiality of all information collected for the study and will use it for research purposes only. No information that identifies any study participant will be released. Information from participating institutions and respondents will be presented at aggregate levels in reports. Information on respondents will be linked to their institution but not to any individually identifiable information. No individually identifiable information will be maintained by the study team after the information is no longer necessary for the purpose of conducting the study and fulfilling contractual requirements. ED will require the evaluation contractor to keep hard copies of documents in securely locked file cabinets, to encrypt electronic data files, and to strictly limit access to study files to study staff who have been identified by the project director as having a need to view those files. Respondents will be given written assurance in all advance materials about the PGE study data collection (and verbal reminders during administration of the survey by telephone) that the information they provide will be kept private and will not be disclosed to anyone but the researchers authorized to conduct the study, except as otherwise required by law. Furthermore, ED will require all institution-level identifiable information to be kept in secured locations and identifiers to be destroyed as soon as they are no longer required.

In addition, the following language will appear on all letters, brochures, and/or other study materials:

Per the policies and procedures required by the Education Sciences Reform Act of 2002, Title I, Part E, Section 183, responses to this data collection will be used only for statistical purposes. The reports prepared for this study will summarize findings across the sample and will not associate responses with a specific district or individual. We will not provide information that identifies you or your district to anyone outside the study team, except as required by law. Any willful disclosure of such information for nonstatistical purposes, without the informed consent of the respondent, is a class E felony.

A11. Questions of a Sensitive Nature

None of the survey questions is considered sensitive. The survey instrument (Appendix A) has, whenever possible, been developed based on questions from existing education and labor market surveys. Moreover, the beginning of the survey instrument explains that respondents may decline to answer a question if they would like; thus, if a respondent prefers not to answer a question that he or she deems sensitive, he or she will be able to continue to respond to other questions.

A12. Hour Burden of Collection of Information

The planned study includes two separate hour burden estimates: (1) the hour burden imposed on PGE schools to provide data for the study and (2) the hour burden for respondents who are contacted and complete the survey. Both burden estimates are presented next.

Estimated hour burden for PGE school data. The planned evaluation includes three types of data collection efforts from PGE schools (Table A.3). The first is the effort to provide the evaluation contractor with lists of eligible candidates identified for the two experiments each week and the dates that consent materials were mailed. There are 28 PGE schools participating in Experiment 1 and 40 schools in Experiment 2. While some schools will participate in both, for the purposes of estimating burden we consider there to be a total of 68 PGE school experiments. Schools participating in Experiment 1 are expected to have an average of about 100 study participants while schools in Experiment 2 are expected to have an average of about 200 study participants. It is expected that the effort required to record and send the data to the evaluator on a weekly basis is 10 minutes per participant. For Experiment 1, this time equals 466.67 hours (28 school experiments x 100 participants per school x 10 minutes per participant divided by 60 minutes per hour). For Experiment 2, the comparable estimate of burden is 1,333.33 hours (40 schools x 200 participants per school x 10 minutes per participant divided by 60 minutes per hour).

The second data collection effort involves the recording of a limited amount of information about individuals in the study-specific random assignment database. It is expected that the processing time and effort required for data entry of sample members into the database is similar to that of recording candidates and consent dates, or 10 minutes per participant. Given the number of schools in each experiment and the expected number of participants at each school, as described above, for Experiment 1 the estimated time required equals 466.67 hours (28 school experiments x 100 participants per school x 10 minutes per participant divided by 60 minutes per hour). For Experiment 2, the comparable estimate of burden is 1,333.33 hours (40 schools x 200 participants per school x 10 minutes per participant divided by 60 minutes per hour).

The final type of data collection effort from PGE schools involves three waves of administrative data extraction. It is assumed that each extraction, regardless of the wave or the type of experiment in which a school participates, will take 8 hours. Hence, the total time for PGE schools in Experiment 1 to extract and process the administrative data records is 672 hours (28 schools x 3 extracts per school x 8 hours per extract). Similarly, the total time for PGE schools in Experiment 2 is 960 hours (40 schools x 3 extracts per school x 8 hours per extract).

Summing the total hours from the three different types of data collection processes for each experiment leads to burden estimates of 1,605.34 hours for Experiment 1 and 3,626.66 hours for Experiment 2. The total burden for PGE school data collection, across both experiments, equals 5,232.00 hours.
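As a rough check on the figures above and in Table A.3, the burden arithmetic can be reproduced as follows; small differences in the final decimals reflect rounding of the component subtotals.

```python
# Reproduces the PGE school burden arithmetic described above.
MINUTES_PER_PARTICIPANT = 10   # applies to the consent list and to random assignment entry
EXTRACTS_PER_SCHOOL = 3
HOURS_PER_EXTRACT = 8

def school_burden_hours(n_school_experiments, participants_per_school):
    per_activity = n_school_experiments * participants_per_school * MINUTES_PER_PARTICIPANT / 60
    consent_list = per_activity        # weekly candidate/consent-date lists
    random_assignment = per_activity   # data entry into the random assignment database
    extracts = n_school_experiments * EXTRACTS_PER_SCHOOL * HOURS_PER_EXTRACT
    return consent_list + random_assignment + extracts

exp1 = school_burden_hours(28, 100)   # 466.67 + 466.67 + 672.00 -> about 1,605.3 hours
exp2 = school_burden_hours(40, 200)   # 1,333.33 + 1,333.33 + 960.00 -> about 3,626.7 hours
print(round(exp1, 2), round(exp2, 2), round(exp1 + exp2, 2))   # total 5,232.00 hours
```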






Table A.3. Hour Burden Estimates for PGE Schools

Experiment

Number of PGE School Experiments

Number of Participants per PGE School

Frequency of Collection

Average Processing Time

Burden (Hours)







Experiment 1






List of Candidates and Date Consent Materials Mailed

28

100

Once per study participant

10 minutes per study participant

466.67

Data Entry into the Random Assignment Database

28

100

Once per study participant

10 minutes per study participant

466.67

Extraction of School Enrollment Records

28

n.a.

3 extracts per school

8 hours per extract

672.00

Subtotal

28

n.a.

n.a.

n.a.

1,605.34







Experiment 2






List of Candidates and Date Consent Materials Mailed

40

200

Once per study participant

10 minutes per study participant

1,333.33

Data Entry into the Random Assignment Database

40

200

Once per study participant

10 minutes per study participant

1,333.33

Extraction of School Enrollment Records

40

n.a.

3 extracts per school

8 hours per extract

960.00

Subtotal

40

n.a.

n.a.

n.a.

3,626.66







Total for Both Experiments

68

100-200

n.a.

n.a.

5,232.00

n.a. = not applicable.



Estimated hour burden for survey respondents. The planned evaluation includes one survey of study participants in Experiments 1 and 2. The same survey instrument will be used for all respondents. The survey of Experiment 1 participants will include a total of 1,000 respondents equally stratified by treatment and control groups (approximately 500 respondents per group). The survey of Experiment 2 participants will also contain 1,000 respondents with equal representation across treatment and control groups. With a targeted response rate of 80 percent, the total sample size needed to obtain 1,000 completed surveys for Experiment 1 is approximately 1,250 participants (500 x 2/0.8). The same survey design will be used to collect data for Experiment 2 respondents; a total sample of 1,250 participants is expected in order to collect 1,000 completed surveys for Experiment 2.

For both Experiments 1 and 2, the time to complete the survey is estimated to be 15 minutes. Table A.4 provides the total hour burden estimates for survey respondents. The total hour burden for survey respondents in Experiment 1, therefore, is estimated to be 15,000 minutes (1,000 respondents x 15 minutes per respondent), which when divided by 60 equals 250 hours. This same formula is used to determine the hour burden for respondents in Experiment 2, which also produces a burden of 250 hours. Together, Experiments 1 and 2 yield a total respondent burden of 500 hours.
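The survey sample and burden figures above follow the same arithmetic for each experiment, as this brief check illustrates.

```python
# Survey sample size and hour burden per experiment.
completes_per_group = 500        # treatment and control groups
response_rate = 0.80
minutes_per_survey = 15

fielded_sample = completes_per_group * 2 / response_rate     # 1,250 participants released
completed_surveys = completes_per_group * 2                  # 1,000 expected completes
burden_hours = completed_surveys * minutes_per_survey / 60   # 250 hours per experiment

print(fielded_sample, burden_hours, 2 * burden_hours)        # 1250.0 250.0 500.0
```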


Table A.4. Hour Burden Estimates for Survey Data Collection

Respondents

Number of Respondents

Frequency of Collection

Average Time per Sample Member

Burden (Hours)

Experiment 1





Respondents

1,000

Once

15 minutes

250

Subtotal

1,000

Once

n.a.

250






Experiment 2





Respondents

1,000

Once

15 minutes

250

Subtotal

1,000

Once

n.a.

250






Total for Both Experiments

2,000

Once

n.a.

500

n.a. = not applicable.


Total estimated hour burden for all data collection. Summing the hour burden estimates from the PGE school data collection and the survey data collection yields the total burden expected for the PGE study. As shown in Table A.5, the total expected burden equals 1,855.34 hours for Experiment 1 and 3,876.66 hours for Experiment 2. The total burden for all respondents, across both experiments, equals 5,732.00 hours.

Table A.5. Total Hour Burden Estimates for Experiments 1 and 2

Experiment

Burden (Hours)



Experiment 1


PGE Schools Data Collection

1,605.34

Survey Data Collection

250.00

Subtotal

1,855.34



Experiment 2


PGE Schools Data Collection

3,626.66

Survey Data Collection

250.00

Subtotal

3,876.66



Both Experiments


PGE Schools Data Collection

5,232.00

Survey Data Collection

500.00

Total for Both Experiments

5,732.00

PGE = Pell Grant Experiments.


There are two cost burden estimates included in this request for clearance: (1) the cost burden imposed on PGE schools for providing study data and (2) the costs imposed on survey respondents for taking the survey. Estimates for both data collection activities are provided next.

Estimated cost burden for PGE school data. As shown in Section A.12, the estimated total hour burden for PGE school data collection is 5,232 (1,605.34 for Experiment 1 and 3,626.66 for Experiment 2). The average hourly wage, based on the Bureau of Labor Statistics (BLS) average hourly earnings of production and nonsupervisory employees on private, nonfarm payrolls, is $19.42 (May 2011 Employment Situation Table B-8, Current Employment Statistics, BLS, U.S. Department of Labor). Multiplying the total number of hours by the average hourly wage yields a total cost estimate of $101,606 (5,232 hours x $19.42 per hour). Table A.6 provides separate cost burden estimates for Experiments 1 and 2 as well as the total combined cost estimate.

Table A.6. Cost Burden Estimates for PGE Schools

Data Collection Activity

Total Burden Hours

Average Hourly Wage

Total Cost





Experiment 1




List of Candidates and Date Consent Materials Mailed

466.67

$19.42

$9,063

Data Entry into the Random Assignment Database

466.67

$19.42

$9,063

Extraction of School Enrollment Records

672.00

$19.42

$13,050

Subtotal

1,605.34

$19.42

$31,176





Experiment 2




List of Candidates and Date Consent Materials Mailed

1,333.33

$19.42

$25,893

Data Entry into the Random Assignment Database

1,333.33

$19.42

$25,893

Extraction of School Enrollment Records

960.00

$19.42

$18,643

Subtotal

3,626.66

$19.42

$70,430





Total for Both Experiments

5,232.00

$19.42

$101,606


Estimated cost burden for survey respondents. The estimated total hour burden for survey respondents is 500 hours (250 x 2), which is the sum of the burden from Experiments 1 and 2. The cost burden per hour for the survey is calculated based on the same hourly wage, $19.42, as the cost burden per hour for the PGE schools data collection effort. Multiplying the total number of hours by the average hourly wage yields a total cost estimate of $4,855 per experiment, or $9,710 for both experiments combined. Table A.7 provides the total hour burden estimates for the survey data collection in Experiments 1 and 2 as well as the total cost estimate.

Total estimated cost burden for all data collection. Summing the cost burden estimates from the PGE school data collection and the survey data collection yields the total burden expected for the PGE study. As shown in Table A.8, the total expected burden equals 1,855.34 hours for Experiment 1 and 3,876.66 hours for Experiment 2; the total burden for both experiments equals 5,732 hours. Multiplying these hour burden estimates by the average wage of $19.42 yields cost burden estimates of $36,031 for Experiment 1, $75,285 for Experiment 2, and $111,316 for both experiments combined.
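As with the hour burden, the cost figures reported in Tables A.6 through A.8 can be reproduced by multiplying each hour burden by the $19.42 average wage, as sketched below; the small difference in the grand total reflects rounding of the subtotals.

```python
# Cost burden arithmetic: hour burdens times the BLS average hourly wage.
WAGE = 19.42
school_hours = {"Experiment 1": 1605.34, "Experiment 2": 3626.66}
survey_hours = {"Experiment 1": 250.00, "Experiment 2": 250.00}

for exp in school_hours:
    hours = school_hours[exp] + survey_hours[exp]
    print(exp, round(hours, 2), "hours ->", round(hours * WAGE))
    # Experiment 1: 1,855.34 hours -> 36031; Experiment 2: 3,876.66 hours -> 75285

total_hours = sum(school_hours.values()) + sum(survey_hours.values())
print("Both experiments:", round(total_hours, 2), "hours ->", round(total_hours * WAGE))
# Prints 111315; Table A.8 reports $111,316 because it sums the rounded subtotals.
```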

Table A.7. Cost Burden Estimates for Survey Data Collection

Data Collection Activity

Total Burden Hours

Average Hourly Wage

Total Cost

Experiment 1




Respondents

250

$19.42

$4,855

Subtotal

250

$19.42

$4,855





Experiment 2




Respondents

250

$19.42

$4,855

Subtotal

250

$19.42

$4,855





Total for Both Experiments

500

$19.42

$9,710


Table A.8. Total Cost Burden Estimates for Experiments 1 and 2

Experiment

Total Burden Hours

Average Hourly Wage

Cost Burden





Experiment 1




PGE Schools Data Collection

1,605.34

$19.42

$31,176

Survey Data Collection

250.00

$19.42

$4,855

Subtotal

1,855.34

$19.42

$36,031





Experiment 2




PGE Schools Data Collection

3,626.66

$19.42

$70,430

Survey Data Collection

250.00

$19.42

$4,855

Subtotal

3,876.66

$19.42

$75,285





Both Experiments




PGE Schools Data Collection

5,232.00

$19.42

$101,606

Survey Data Collection

500.00

$19.42

$9,710

Total for Both Experiments

5,732.00

$19.42

$111,316

PGE = Pell Grant Experiments.


A13. Estimated Total Cost Burden for Collection of Information

There are no start-up costs for this collection.

A14. Estimated Annualized Cost to the Federal Government

The annualized cost to the Federal government has only been estimated, not confirmed, and cannot be included in public documents because a competition for the evaluation contract is underway. In addition to the costs associated with collecting and analyzing the evaluation data, FSA will bear some costs because it will have to pay its contractor to write and execute queries to the National Student Loan Data System to allow for extraction of appropriate data to manage the two demonstration programs.

A15. Changes in Burden

This is a new information collection request.

A16. Publication Plans and Project Schedule

The final publication plans have not yet been determined. ED plans to work with an independent evaluation contractor to finalize a publication and project schedule. Table A.9 provides a preliminary project schedule for the specific data collection activities for which clearance is requested, as well as for those that impose no new burden. The table also includes expected dates for the interim and final reports for the PGE study. The time lines are considered preliminary.



Table A.9. Preliminary Project Schedule of Activities

Activity

Time Frame

Study Participant Enrollment

July 2012 – January 2014

Expected Program Completion Period

Fall 2012 – Fall 2014

FSA Data Extraction

November 2012, November 2013, and November 2014

Data Extracts from PGE Schools

February 2013, February 2014, and February 2015

Survey Fielding Period

July 2014 - March 2015

SSA Data Analysis

July 2016 through December 2016

Final Report

June 2017

FSA = Federal Student Aid office; PGE = Pell Grant Experiments; SSA = Social Security Administration.


A17. Reasons for Not Displaying Expiration Date of OMB Approval

The expiration date for OMB approval will be displayed on all forms associated with this data collection.

A18. Exception to the Certification Statement

Exception to the certification statement is not requested for the data collection.

REFERENCES

Baum, Sandy and Kathleen Payea. Trends in Student Aid 2011. New York, NY: The College Board, 2011.

Bettinger, Eric. “How Financial Aid Affects Persistence.” In College Choices: The Economics of Where to Go, When to Go, and How to Pay For It, edited by Caroline Minter Hoxby. Cambridge, MA: National Bureau of Economic Research, 2004.

Curtin, R., S. Presser, and Eleanor Singer. “Changes in Telephone Survey Nonresponse Over the Past Quarter Century.” Public Opinion Quarterly, vol. 69, no. 1, spring 2005, pp. 87–98.

Federal Register. “Postsecondary Educational Institutions Invited To Participate in Experiments Under the Experimental Sites Initiative.” vol. 76, no. 208, October 27, 2011, pp. 66698–66707. Available at [http://www.gpo.gov/fdsys/pkg/FR-2011-10-27/html/2011-27880.htm]. Accessed March 15, 2012.

Seftor, Neil S. and Sarah E. Turner. “Back to School: Federal Student Aid Policy and Adult College Enrollment.” The Journal of Human Resources, vol. 37, no. 2, spring 2002, pp. 336-352.

Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly, vol. 64, no. 2, summer 2000, pp. 171–188.

Social Security Administration. “Social Security Bulletin: Annual Statistical Supplement 2001.” Washington, DC: Social Security Administration Office of Research, Evaluation, and Statistics, 2001.

U.S. Department of Education. “Federal Pell Grant Program.” Washington, DC: U.S. Department of Education, 2011. Available at [http://www2.ed.gov/programs/fpg/index.html]. Accessed March 15, 2012.

U.S. Department of Education. “2009-2010 Federal Pell Grant Program End-of-Year Report.” Washington, DC: U.S. Department of Education, 2010. Available at [http://www2.ed.gov/finaid/prof/resources/data/pell-2009-10/pell-eoy-09-10.pdf]. Accessed March 23, 2012.


















2 Randomly assigning within programs will promote treatment-control group balance on this important dimension. This might allow the evaluation to calculate impacts separately by occupational area.

3 The particular methods that schools use to recruit potential sample members and any screening that is conducted to assess applicants’ interest levels in the PGE program before random assignment is conducted will have an influence on the rate at which study participants enroll in the PGE program.

4 There is also some possibility of obtaining quarterly wage data from the National Directory of New Hires (NDNH) maintained by the U.S. Department of Health and Human Services. There is pending legislation to expand access to the database for federal research purposes. If this access becomes available during the evaluation period, we would consider substituting NDNH data for the SSA annual earnings data.

5 While ED would prefer to have an additional, longer-term follow-up on earnings, study resources do not currently allow for that.

6 See, for example, interviews with manufacturing executives: http://www.cnbc.com/id/44838614/Need_Work_US_Has_3_2_Million_Unfilled_Job_Openings; http://www.ft.com/cms/s/0/6d586922-21f0-11e1-8b93-00144feabdc0.html#axzz1queCR3E0

7 About 96 percent of all workers in employment or self-employment are covered under the Old-Age, Survivors, and Disability Insurance (OASDI) program (Social Security Administration 2001). Workers who are not covered include (1) civilian federal employees hired before 1984, (2) railroad workers, (3) some employees of state and local governments who are covered under their employers’ retirement systems, (4) people with net annual earnings from self-employment below $400, and (5) domestic and farm workers with low earnings.

8 It would be preferable to obtain employment and earnings data from the Unemployment Insurance system maintained by state and/or local workforce agencies. Such data are collected quarterly and contain additional information (e.g., occupational area of employment) that would be useful in either adding to or verifying information obtained through the survey. However, based on the Department of Labor’s experience, pursuing this approach would likely take too long and be too costly given the limitations of the Pell Grant Experiments evaluation.

9 In addition to capturing time-stamp information on participants, web administration allows instrument programming to route respondents through the questionnaire without relying on them to ensure that skip patterns are followed. Web administration also allows respondents to take the survey at their own pace at a time that is convenient for them (saving their responses, exiting the survey, and returning to the survey later). Moreover, the web pages that display the survey questionnaire can be designed to work well with smart phones as well as laptop and desktop computers. The ability to complete the survey using a smart phone can increase online response rates, particularly among lower income populations that are more likely to have smart phones than desktop or laptop computers.

