Recent Graduates Employment and Earnings Survey (RGEES) Standards and Survey Form

ED Response Summary to Public Comments (60 day)

OMB: 1845-0138

Department of Education Responses to Public Comments on Recent Graduates Employment and Earnings Survey - 60 day comment period ending 8-31-2015

Marc Jerome (ED-2015-ICCD-0085-0004):


Comment 1--

The 60% Threshold for Responses is Not Reasonable

If colleges had 100% correct contact information for graduates years after graduation, a 60% response rate would be difficult.  Given that this is not the case and the percentage of correct addresses may very well be below 60%, the prescribed threshold is unreasonable. The College tested a sample set of our graduates. We chose one program with 165 graduates in the 2011-2013 cohort. We reached out to this group based on the contact information that we had (which is now 2-4 years old and would become older at the time of the actual survey). We were able to contact approximately 25% of the population, with only half of those contacted willing to complete our requested survey. Requiring a 60% response rate is unrealistic based on the information we currently have.


In addition, the time frame given does not allow adequate time to obtain updated contact information. The administration of a survey will begin 3-5 years after the student has graduated from the institution. Students rarely provide updated contact information to their college after they graduate. This will further diminish the ability of the institution to receive an adequate sample size.


Response: Please note that the threshold identified in the RGEES Standards for the response rate is 50 percent, not 60 percent as indicated by this commenter. The 50 percent response rate required in the RGEES Standards mirrors the requirement in the Gainful Employment Regulations for the other alternative earnings source allowable under an appeal. That approach requires that the institution obtain earnings records from the State employment database for 50 percent of the programs’ graduates (based on the list of program graduates submitted to SSA for earnings data). Although it would be preferable to have response rates higher than 50 percent, the Department has determined that 50 percent should be the minimum acceptable level, thus matching the requirement for the other alternative source for earnings data to be used in an appeal.


With regard to outdated student contact information, we note that each institution has the ability to access the most recent contact information available for each student through the National Student Loan Data System (NSLDS), and thus does not have to rely on its own records which may be out of date.


Comment 2--

Requesting use of alternate earnings data

We propose that institutions be allowed to use BLS earnings data, segmented by region, to obtain appropriate earnings. A second source of data could be an independent third party that is already collecting this information. Companies like Equifax (the credit monitoring company) are building a database of actual W-2 earnings from employers. Information from this source should satisfy the alternative earnings survey process.


Response: As explained more fully in the preamble to the proposed and final Gainful Employment regulations, BLS data do not reflect the earnings of the graduates of the program for which the D/E rates are calculated. For the same reason, BLS data cannot be used as alternative earnings. To the extent that other dependable sources are developed that contain individual earnings data that can be associated with a program’s graduates, the Department will examine the suitability of those sources when they become available.


Comment 3--

Burdensome to prepare and respond

The response indicated that a student would complete the survey within five minutes. While it may take 5 minutes to fill out the survey, significantly more time is involved in completing it accurately. In order to ensure accurate completion, the institution will be required to reach out to all respondents to ensure their understanding of the questions. This process will take longer than 5 minutes for the respondent and is quite burdensome to the student and institution. It is estimated that it will take a student approximately 90 minutes to complete the survey. The breakdown is as follows:


5 minutes to review the survey when received and decide if he/she will complete based on the incentive


20 minutes to review the questionnaire with the institution


45 minutes to retrieve the salary information for the time frame indicated


10 minutes to decide if unreported income will be disclosed on the survey


10 minutes to deliver the envelope to the post office for mailing


This translates to 183,600 burden hours using the same assumptions used for the RGEES.


Response: The Department appreciates the concern regarding the burden estimate. We note for clarification that the burden assessment applied here is only for the individual who receives the survey. The burden to institutions was previously assessed in the final Gainful Employment regulations that were published as a final rule on October 31, 2014. The institutional survey burden is contained in the information collection package OMB Control Number 1845-0122, with a total burden of 23,860 hours for institutions. That information collection was open for public comment during the Notice of Proposed Rulemaking comment period.


This new information clearance package is for the individual graduate who will receive the survey and mirrors the burden estimate for the individual assigned in the pilot Recent Graduates Employment and Earnings Survey, which has just concluded its information collection clearance. In reviewing this and other comments regarding the individual graduate burden estimates, the Department is persuaded that 5 minutes is not sufficient time for a survey recipient to review the survey questions and search for any documentation to allow for the most complete and accurate response.


However, the Department does not agree with this commenter that the individual who receives the survey would require the proposed 90 minutes to complete it. There is no basis to assume that all of the institutions that send an earnings survey to their graduates will need to spend time explaining the survey to the recipients. Nor is there a basis for each individual to require 50 minutes to locate earnings documentation and determine what earnings information will be supplied on the survey. The Department anticipates that the majority of these surveys will be completed electronically using the free RGEES data collection and processing platform, which meets all of the Standards' requirements, employs built-in edit checks, and eliminates the need for 10 minutes to post an envelope.


The Department is revising its individual burden estimate upward from 5 minutes to 20 minutes for an individual to review the survey, locate income documentation, record the data on the survey, and submit it to the institution. Therefore, the revised burden is 7,374 hours (22,123 x 20/60), an increase of 5,604 hours over the original estimate of 1,770 hours.
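
For readers who wish to verify the revised estimate, the arithmetic can be reproduced directly. The following is a minimal sketch (in Python, for illustration only) using the figures cited above:

    # Reproduce the revised individual burden estimate from the figures above.
    respondents = 22_123           # graduates projected to receive the survey
    minutes_per_response = 20      # revised per-respondent estimate
    original_hours = 1_770         # original estimate, based on 5 minutes

    revised_hours = round(respondents * minutes_per_response / 60)
    increase = revised_hours - original_hours

    print(revised_hours)  # 7374
    print(increase)       # 5604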


Comment 4--

Only 10% of programs in the Zone or that Fail the metric would appeal the data with the alternative survey.


The assumption that only 10% of programs would appeal seems arbitrary. This number is extremely low. We believe that over 75% of the programs in the Zone or that Fail the metric would attempt to complete and submit the alternative earnings survey. All of these schools would want to protect their program and supply accurate data to be able to do so.


Based on the calculation of burden in the supporting documentation, we believe the calculation will look closer to this, rather than the calculation presented.


Graduates: 221,227 x 75% = 165,620 graduates sent a survey.


These are new burden hours.

              Respondents    Responses              Burden Hours
Individuals   165,620        165,620 x .08 hours    13,250


Based on our assumptions in #2 above, we believe the actual burden hours to be the following:


              Respondents    Responses              Burden Hours
Individuals   165,620        165,620 x 1.5 hours    248,430


Response: As discussed more fully in the final Gainful Employment regulations, an institution may submit an appeal only if the earnings data obtained from the survey categorically improve a program’s debt-to-earnings (D/E) rate, e.g., from a failing rate to a passing or zone rate, or from a zone rate to a passing rate. Institutional burden is calculated based on an estimate of the number of appeals that would be submitted to the Department, not the number of attempted surveys. The estimate that survey appeals would be submitted for 10% of the programs reflects our belief that, except perhaps for tips or unreported cash earnings, SSA earnings data are accurate and reliable.




Satkartar Kinney (ED-2015-ICCD-0085-0005):


Comment 1--

The interspersed notes that indicate whether each requirement is handled through the platform or must be addressed separately are very distracting. Please consider separating the requirements for different types of users or perhaps labeling requirements that are for everyone with an asterisk or some other mark.

Response: The Department understands your concern over the distraction of the labeling that indicates whether each standard is addressed within the RGEES Platform. Since it is important to present all of the standards in one comprehensive document that follows the flow of the various stages of a data collection cycle, the Department has adopted your second option. In the final RGEES Standards, starred requirements are met with information produced using the RGEES Platform, so programs using the Platform need only address the unstarred requirements. For those electing not to use the Platform, the survey administrator must adhere to each of the requirements.


Comment 2--

Please review and clarify those instances where the reference population is graduates, not students.


Response: Each standard was reviewed, and although there are instances where the correct reference is to students (e.g., the graduates’ student records), clarifying changes were made in Standards 6 and 8.


Comment 3--

Page 6, 2.3.1. Can you clarify what is meant by 'reasons' for nonresponse? Does the Platform provide interim response rates during data collection?

Response: Clarifying text was added indicating that examples of reasons for nonresponse come from survey paradata, such as the identification of refusals and hard-to-locate cases. Clarifying text was also added indicating that the RGEES Platform has a participation rate reporting function that supports the monitoring of response rates throughout data collection.


Comment 4--

Page 7, 3.3. In this day of multiple modes of publication, please clarify what is meant by the term 'published'.

Response: “Published” was replaced with “released for external use.”


Comment 5--

Page 8, 5.1. The text states that "A completed survey must include sufficient responses to support reporting the respondent's total earnings (including 0 earnings)." This would make more sense if it read: "A completed survey must include sufficient responses to determine whether the respondent has earnings and to support reporting the respondent's total earnings (including 0 earnings)."

Response: This clarification was made. Standard 5.1 now reads “A completed survey must include sufficient responses to determine whether the respondent has earnings and to support reporting the respondent’s total earnings (including 0 earnings). The RGEES survey will be considered “complete” if the respondent fills out at least one of the earnings items.”
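
For illustration only, the completeness rule in Standard 5.1 amounts to a simple check. The sketch below is not part of the Standards, and the field representation is hypothetical:

    # Hypothetical sketch of the Standard 5.1 completeness rule: a survey is
    # "complete" if the respondent fills out at least one earnings item.
    def is_complete(earnings_items):
        """earnings_items: responses to the earnings questions, with None
        marking an unanswered item (0 is a valid answer meaning no earnings)."""
        return any(item is not None for item in earnings_items)

    assert is_complete([None, 0, None])          # zero earnings still counts
    assert not is_complete([None, None, None])   # no items answered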

Comment 6--

Page 8, 5.5.1 Shouldn't the number of newly added cases be added to the number of cases in the cohort (C) and to the number of completed surveys (S)? Please help the reader understand what the excluded cases refers to.

Response: The text was clarified to indicate that excluded cases refers to those cohort members who were excluded from the agreed upon cohort list that the U.S. Department of Education submitted to the Social Security Administration to obtain the cohort earnings data. The text was also modified to clarify that the count of added cohort members should be added to the number of cases in the cohort (C) and also to the number of completed surveys (S).


Comment 7--

Page 8, 5.2.2, Please consider limiting the discussion of nonresponse bias analysis to Standard 6 on that topic.

Response: Thank you. This is a good point. The discussion of the nonresponse bias analysis was removed from Standard 5 and is addressed in Standard 6.


Comment 8--

Page 9, 6. Note that rather than assessing the potential magnitude of nonresponse bias, a nonresponse bias analysis is more correctly described as indicating the potential impact of nonresponse bias.


Response: This technical clarification was made. The sentence now reads “The nonresponse bias analysis can indicate the potential impact of nonresponse bias.”


Comment 9--

Page 10, 6.1. Clarify whether the nonresponse bias is to be measured across each of the three variables called out in standard 6 part 2.

Response: The relevant text in Standard 6.1 was modified to read “The relative bias must be computed for the percent of graduates who received Pell Grants while enrolled, the percent with zero expected family contributions, and the percent female and the average relative bias averaged over these three attributes.”
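
For illustration, and assuming the conventional definition of relative bias as the difference between the respondent-based percentage and the full-cohort percentage, divided by the full-cohort percentage (the Standards remain the authoritative specification), the computation could be sketched as follows; all percentages shown are hypothetical:

    # Sketch of the Standard 6.1 computation. Assumes relative bias =
    # (respondent % - cohort %) / cohort %, and that the average is taken
    # over absolute values; consult the Standards for the exact formula.
    def relative_bias(cohort_pct, respondent_pct):
        return (respondent_pct - cohort_pct) / cohort_pct

    # Hypothetical cohort and respondent percentages for the three attributes.
    attributes = {
        "percent Pell":   (62.0, 55.0),   # (cohort %, respondent %)
        "percent ZEFC":   (40.0, 36.0),
        "percent female": (70.0, 74.0),
    }
    biases = [relative_bias(c, r) for c, r in attributes.values()]
    average_relative_bias = sum(abs(b) for b in biases) / len(biases)
    print(round(average_relative_bias, 3))   # 0.09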


Comment 10--

Page 10, 6.1. Please help the reader understand what the excluded cases refers to.

Response: The text was modified to read “If excluded cohort members that are not part of the agreed-upon cohort list that the U.S. Department of Education submitted to the Social Security Administration to obtain the cohort earnings data are added to the cohort, they must be added to the data collection database before the final nonresponse bias analysis is conducted (see Introduction for a description of excluded cases).”


Comment 11--

Page 10, 6.1. Does "both before and after the excluded cases have been included" mean "both with and without the excluded cases"?


Response: Yes, the text was modified to read “The nonresponse bias analysis must be conducted both with and without the excluded cases.”



Ellen C. Meyer (ED-2015-ICCD-0085-0006):


Comment 1--

Regency is highly concerned that the RGEES, as designed, will fail to serve its intended purpose. In the RGEES, a graduate will be asked for specific earnings information. We expect that our graduates (like anyone) will view detailed income information as highly personal and confidential and will be resistant to providing this information voluntarily in response to a solicitation by email or mail. Indeed, they would be well advised to ignore unverified inquiries made in this manner. As the Federal Trade Commission website advises the public, “Don’t email personal or financial information.” See http://www.consumer.ftc.gov/articles/0003-phishing. Moreover, there is no logical basis to expect that an individual who has failed to truthfully disclose his or her income to the federal government for one purpose (taxes) will answer truthfully when asked for another purpose (RGEES). Notwithstanding the statement on the RGEES that individual information will not be provided to the government, a reasonable actor would either report consistent information or, as would seem more likely, decline to complete the voluntary survey.


Response: An expectation that some graduates will not respond to the survey, either because they do not wish to disclose earnings information or for other reasons, does not invalidate the purpose of the survey or render it infeasible. We point to the American Community Survey (ACS), the Current Population Survey (CPS), and the National Longitudinal Survey of Youth (NLSY) as examples of surveys that successfully capture earnings data provided by individuals. With respect to phishing, we note that graduates will receive the survey directly from the institution, with the institution’s name and contact information, including the institution’s phone number, so that graduates can verify its authenticity.


Comment 2--

Additionally, Regency believes that a 50% response rate is unnecessary and unrealistic.


A standard primary research confidence level is 95%, and a response rate needed to achieve that would be 35% for a +/-3% margin of error or a 16% response rate for a +/- 5% margin of error. Moreover, according to basic principles of statistics, the size of the pool of survey recipients has an inverse relationship to the percentage of responses needed to reach the desired confidence level. As such, a school with ten thousand graduates will need to reach a smaller percentage of graduates than a school with one thousand graduates in order to achieve the same confidence level in its results. In short, the numeric standards do not represent common standards in data collection and are not grounded in appropriate statistical methods.


Regency regularly solicits employment information from graduates. Based on our experience and efforts to gather information about whether and where our graduates are employed – information that most individuals would not view as confidential or sensitive in nature – we believe that the 50% response rate required for the RGEES is not reasonably achievable.


Response: The Department is not aware of any specific direct relationship between response rates for sample surveys and achievable confidence intervals. More fundamentally, confidence intervals are conceptually central to surveys that collect data from a sample drawn from a full population, not to surveys that collect data from all members of a population. For the appeals process, the data collection is based on the full population of completers for a given program, so confidence intervals are not applicable. Furthermore, the Department has determined that there is a significant likelihood of unacceptable bias resulting from response rates below 50 percent.
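
To illustrate the statistical point with hypothetical figures (this sketch is not part of the Standards): the margin of error for an estimate from a simple random sample depends on the number of completed responses and the population size, not on the response rate as such, and it quantifies sampling error only; it says nothing about nonresponse bias, which is the Department's concern here.

    # Margin of error for a proportion from a simple random sample, with the
    # finite population correction. Figures below are hypothetical.
    import math

    def margin_of_error(n, N, p=0.5, z=1.96):
        """95% margin of error for a proportion p estimated from a simple
        random sample of n drawn from a population of N."""
        fpc = math.sqrt((N - n) / (N - 1))   # finite population correction
        return z * math.sqrt(p * (1 - p) / n) * fpc

    # 350 completes from 1,000 graduates vs. 1,600 completes from 10,000:
    print(margin_of_error(350, 1_000))     # ~0.042
    print(margin_of_error(1_600, 10_000))  # ~0.022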


Comment 3--

The Department appears to acknowledge the barriers to the RGEES, as it is offering each individual in the survey pool a $25.00 incentive to complete the income survey. It is not clear whether schools will be permitted to offer monetary incentives in this manner. Even assuming we are allowed to do so, the cost will be substantial. Regency has submitted data for more than 8,600 graduates. Following the Department’s methods, Regency will incur costs in excess of $107,000 to achieve the minimum 50% response rate. There will also be substantial new costs to the organization to staff call centers for necessary follow-up telephone and email outreach to graduates. These costs will be prohibitive to many in the cosmetology education field given current enrollment trends. We are also concerned that the use of this incentive payment to elicit responses will tend to skew the pool of respondents towards those with lower earnings.


Response: There is no statutory or regulatory prohibition against offering respondent encouragements, including incentives. Indeed, respondent encouragements are suggested in the Best Practices Guide posted to the FSA website on August 31, 2015 (http://ifap.ed.gov/GainfulEmploymentInfo/attachments/BestPracticesGuide.pdf). Such encouragements are not required by the regulation, but an institution may choose to avail itself of this option to help increase response rates.


The Department is revising the information collection to include possible costs for offering respondents encouragements to complete the survey. Based upon the Best Practices Guide monetary encouragement of $5.00 for each of the 22,123 students projected to be in failing or zone GE programs, the total potential cost is projected to be $110,615.00 (22,123 x $5 = $110,615). This estimate may be high, considering that some institutions administering the RGEES will either not provide cash encouragements or will provide them only to a subset of the total projected number of recent graduates.



Rebecca Campoverde (ED-2015-ICCD-0085-0007):


Comment 1--

The form requests precise data for multiple income sources (employer, self-employment, freelancing), but it will be difficult to ensure the accuracy of these sources.


Response: The RGEES asks about earnings using a series of items designed to elicit total earnings by asking separately about 1) wages, salary, tips, overtime pay, bonuses or commission, 2) earnings from self-employment, and 3) earnings from other work including freelancing, consulting, moonlighting, or other casual jobs. Based on standard practice for earnings/income items in federal surveys, the multiple item approach is designed to help the respondent remember all of their sources of earnings. The RGEES is designed to facilitate recall of total earnings, rather than to collect precise estimates of each earnings component. By guiding the graduate through the various types of earnings, the survey helps the graduate consider all possible sources of earnings and so come up with an accurate estimate of their total earnings for the reference calendar year.


These items were derived from existing person-level surveys conducted regularly by the United States Census Bureau (Census) and the Bureau of Labor Statistics (BLS):

Question 2. This leading “No/Yes” item summarizes several pages of employment questions that precede the earnings questions in the March CPS. This is the same approach as used in the National Longitudinal Survey of Youth income questions (please see https://www.nlsinfo.org/content/cohorts/nlsy97/other-documentation/questionnaires, under round 16 income, item YINC-1400).


Question 2a asks about earnings from the job held the longest during a calendar year and was derived from March CPS items Q47a (instruction to focus on the job held the longest), and Q48aa and Q48aad (request for all earnings from the longest job) (please see http://www2.census.gov/programs-surveys/cps/techdocs/cpsmar14.pdf, appendix D).


Question 2b asks about earnings from all other jobs and was derived from March CPS items Q49b1d and Q49B1A.


Questions 3 and 3a ask specifically about self-employment earnings. The items were derived from parallel items in the NLSY, particularly items YINC-2000 and YINC-2100. The approach to asking about earnings is also similar to item Q48b in the March CPS. Note that, in accordance with recommendations from expert reviewers, the concept of ‘net earnings’ was clarified for item 3a in the current survey.


Questions 4 and 4a ask about earnings from other jobs outside of regular jobs. They are based on items Q73A1 T and Q731 in the March CPS. Wording modifications were needed to fit the population of interest for the current earnings survey.
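
Taken together, the items above are designed to elicit total earnings as the sum of separately recalled components. As a rough sketch only (all component values below are hypothetical):

    # Rough sketch: the survey's design sums separately recalled components
    # into total earnings for the reference year (values hypothetical).
    wages_longest_job = 21_000     # Question 2a
    wages_other_jobs = 3_500       # Question 2b
    self_employment_net = 1_200    # Questions 3/3a (income minus expenses)
    other_work = 600               # Questions 4/4a (freelancing, etc.)

    total_earnings = (wages_longest_job + wages_other_jobs
                      + max(self_employment_net, 0) + other_work)
    print(total_earnings)   # 26300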


Comment 2--

There is inconsistency in income sources: The form requests gross income from employers and net income from self-employment sources.


Response: For graduates who are self-employed, earnings reported to the IRS are based on net profit or net loss. Net profit and net loss are calculated by subtracting business expenses from business income. If expenses are less than income, the difference is a net profit and is included as income on page 1 of Form 1040. If expenses are more than income, the difference is a net loss (see http://www.irs.gov/Individuals/Self-Employed). Individuals working for an employer report gross earnings before deductions as income on page 1 of Form 1040. Therefore, net earnings for graduates who are self-employed, defined as “income minus expenses” in Question 3, is the correct amount to report.


Comment 3--

The survey form does not specify net or gross income for "extra money earned outside of regular jobs, i.e. freelancing, consulting, moonlighting, or doing other casual jobs."


Response: Questions 4 and 4a ask about earnings from other jobs outside of regular jobs. They are based on items Q73A1 T and Q731 in the March CPS. During cognitive interviews, no respondents had difficulty understanding that the survey was collecting information on earnings prior to taxes and deductions.


Comment 4--

Top-coding very high income values and zeroing negative values is problematic because positive values are dropped, thus skewing the results.


Response: The editing rules proposed for the RGEES are the same as those used in the American Community Survey. The ACS allows a range for wages plus tips/bonuses/commissions of $1 - $999,999. Because the ACS is the premier federal survey of households and individuals in the United States and has been extensively studied and validated, the RGEES incorporated those same range edits. Negative values for self-employment income are set to 0 earnings because a negative value constitutes a net loss, and so there are no earnings for self-employed graduates for any year in which their business expenses exceed their business income.
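
A minimal sketch of how such range edits might be applied appears below. The bounds are the ACS wage range quoted above; the treatment of out-of-range values (capping rather than flagging for review) is an assumption made for illustration:

    # Illustrative range edits, assuming ACS-style bounds. Whether
    # out-of-range values are capped or flagged for review is an assumption;
    # the RGEES Standards govern the actual editing rules.
    WAGE_MIN, WAGE_MAX = 1, 999_999   # ACS range quoted in the response

    def edit_wages(value):
        """Top-code very high wage values; floor at the ACS minimum."""
        return min(max(value, WAGE_MIN), WAGE_MAX)

    def edit_self_employment(net_income):
        """A net loss (negative net income) counts as zero earnings."""
        return max(net_income, 0)

    print(edit_wages(1_250_000))         # 999999 (top-coded)
    print(edit_self_employment(-4_000))  # 0 (net loss)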


Comment 5--

Though it's a federal standard, a response rate of 50 percent per program is unrealistic. Kaplan Higher Education's program completers are a highly mobile group, and the lack of current contact information for alumni is an obstacle for all postsecondary institutions. In addition, individuals who have indicated they do not wish to be contacted (DNC) are another factor that impacts response rate.


Response: Fifty percent is not the Federal standard for a minimum survey response rate. However, acceptable levels of nonresponse are discussed in a document from the Office of Management and Budget, “Guidance on Agency Survey and Statistical Information Collections” (https://www.whitehouse.gov/sites/default/files/omb/assets/omb/inforeg/pmc_survey_guidance_2006.pdf). That guidance indicates that when submitting an Information Collection Request (ICR) to OMB, “surveys with expected response rates lower than 80 percent need complete descriptions of how the expected response rate was determined, a detailed description of steps that will be taken to maximize the response rate, and a description of plans to evaluate nonresponse bias. Agencies also need a clear justification as to why the expected response rate is adequate based on the purpose of the study and the type of information that will be collected (whether influential or not).” Note that in the OMB Information Quality Guidelines, “influential” refers to information (i.e., data) that has a clear and substantial impact on important public policies or important private sector decisions; as such, these data are held to a higher standard of reproducibility and transparency and should be designed to minimize all sources of survey error, including nonresponse bias. Furthermore, the OMB Guidance notes that information collection requests for surveys with lower expected response rates are often justified when information is planned for internal use only, is exploratory, or is not intended to be generalized to a target population. Examples for these kinds of collections may include some customer satisfaction and web site user surveys and other qualitative or anecdotal collections. Given the intended use of data collected using the RGEES, these data meet the criteria of influential data and exceed the limited internal uses of data allowable with lower expected response rates.


In addition, the 50 percent response rate required in the RGEES Standards mirrors the requirement in the Gainful Employment Regulations for the other alternative earnings source allowable under an appeal. That approach requires that the institution obtain earnings records from the State employment database for 50 percent of the programs’ graduates (based on the list of program graduates submitted to SSA for earnings data). Although it would be preferable to have response rates higher than 50 percent, the Department has determined that 50 percent should be the minimum acceptable level, thus matching the requirement for the other alternative source for earnings data to be used in an appeal.


The commenter further expressed concern that an institution’s student contact information may be out of date, thus making the response rate requirement impossible to meet. We note that each institution has the ability to access the most recent contact information available for each student through the National Student Loan Data System (NSLDS), and thus does not have to rely on its own records, which may be out of date.


The commenter further expressed concern that graduates on the Do Not Contact (DNC) list will increase nonresponse. The Department does not anticipate that the number of graduates who have taken the proactive step of placing themselves on the DNC list will be large.

Furthermore, the DNC list is actually the Do Not Call list and does not preclude the institution from contacting the graduate by email or mail. Graduates who refuse to respond to the survey and ask the institution not to contact them about the survey should be included in the denominator of the response rate calculation and counted as nonrespondents. The Department is only requiring a 50% response rate, so the effect of such hard refusals should be absorbed into the allowable nonresponse.
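
The treatment of refusals described above reduces to a simple disposition count. The following minimal sketch uses hypothetical disposition labels:

    # Sketch of the response rate calculation implied above: hard refusals
    # remain in the denominator and count as nonrespondents.
    def response_rate(dispositions):
        """dispositions: one outcome per cohort graduate, e.g. 'complete',
        'refusal', 'not_located', or 'no_response'."""
        completed = sum(1 for d in dispositions if d == "complete")
        return completed / len(dispositions)

    cohort = ["complete"] * 55 + ["refusal"] * 10 + ["no_response"] * 35
    print(response_rate(cohort))   # 0.55, which meets the 50 percent threshold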


Comment 6--

Question 2a: We recommend elimination because it will cause confusion, particularly for those who have held multiple jobs in one year.


Response: A panel of experts consulted in the development of the survey items encouraged the Department to anchor the respondents’ recollection of their earnings by asking first about earnings for the “longest job” and then about “all other jobs”. The second round of cognitive interviews confirmed that this approach elicited total earnings from all sources.


Comment 7--

Question 2b: Most individuals could not come up with a specific dollar amount for wages earned in a specific year, but most will have a general idea of the amount. It would be better to ask the respondent to indicate a salary range:

o Less than $10,000

o $10,000 to $19,999, plus additional ranges in increments of $10,000 to $99,999

o $100,000 to $149,999

o $150,000 or more

Note that the above income ranges should be used in all questions that ask for income.


Response: The RGEES asks about earnings using a series of items designed to elicit recall of total earnings by asking separately about 1) wages, salary, tips, overtime pay, bonuses or commission, 2) earnings from self-employment, and 3) earnings from other work including freelancing, consulting, moonlighting, or other casual jobs. Based on standard practice for earnings/income items in federal surveys, the multiple item approach is designed to help the respondent remember all of their sources of earnings. However, the only information that is relevant for an alternative earnings appeal is total earnings. The cognitive testing done on the RGEES showed that graduates had little difficulty coming up with a point estimate for their earnings for the reference year when asked about each component of earnings separately.


Comment 8--

Question 3: This can be eliminated if "self-employment" and "own business" are added to Question 4. The more that the requests for data can be consolidated and streamlined, the more reliable the responses are likely to be.


Response: Question 4 is designed to help graduates recall sources of earnings that are outside of their normal employment. Question 3, on the other hand, is specific to self-employment earnings. Because of the need to specify how self-employment earnings should be reported (business income minus business expenses), it is important to keep these two questions separate.



Anne Sullivan Polino (ED-2015-ICCD-0085-0008):


Comment 1--

However, the Department presents no basis whatsoever for the assumption that Pell grant recipient status, 0 EFC status, or being a female is correlated with earnings of graduates. (page 7) If ED has now concluded that low-income status and being female is highly correlated to low earnings status, then ED’s previous positions in the GE rulemaking must be re-addressed and the GE rulemaking should be re-opened so that the public may comment on this revised viewpoint. (page 11)


Response: The measure used in the regression analysis in the GE Regulations is the debt-to-earnings (D/E) rates measure. It evaluates the amount of debt students who completed a GE program incurred to attend that program in comparison to those same students’ discretionary and annual earnings after completing the program. In contrast, given the focus on measuring the potential for bias in the earnings data collected using the RGEES, the measure used in developing the RGEES standards is earnings. The relationship between the identified characteristics of the students in each program and the earnings of students in the program is different from the relationships observed between the same characteristics and the debt-to-earnings rates. For example, the zero-order correlations between earnings and percent ZEFC, percent Pell, and percent female are -.593, -.611, and -.267, respectively, compared to correlations of -.121, -.976, and .017 when the same three variables are correlated with the debt-to-earnings rates. The supporting analysis is posted with the final Standards for Conducting the Recent Graduates Employment and Earnings Survey.


Comment 2 (page 12)--

It simply makes no sense for the Department to require that institutions that are challenging SSA earnings data ensure that missing data from their own graduate surveys do not skew their results while not submitting the SSA missing data to such a process. Again, where ED has documented that 11.4% of its universe of earnings data are missing, the potential for bias is significant. See 79 Fed. Reg. at 64954.


Because ED has failed to require bias analysis for the SSA earnings data it will be relying upon for its initial GE earnings calculations, it should re-open the GE rulemaking to include such a process and permit the public comment on its proposed bias analysis process.


Response: While the potential for nonresponse bias is present with any amount of missing data, missing data for 11.4 percent of the data translates into a response rate of 88.6 percent, which is well above the 80 percent trigger used by the Office of Management and Budget and incorporated into the Standards for Conducting the Recent Graduates Employment and Earnings Survey. A Gainful Employment program using the RGEES to collect earnings data for use in an appeal would not be required to conduct a nonresponse bias analysis at this level of response. Reopening the GE rulemaking for this reason is not warranted.


Comment 3 (pages 12-13)--

The Cognitive Interviews following the administration of the RGEES pilot survey revealed problems with extracting accurate earnings information from the pilot respondents, and the changes made by ED to the RGEES do not correct these issues. See Results of Cognitive Testing, Appendix 4 (Docket # ED-2015-ICCD-0063). For example, students thought that the questions about their income related only to income related to their degree. Id. ED did not correct or modify the survey to clarify that the income information need not be related to the student’s degree or credential earned. Id. At least one student did not report tip income because she had not reported such income on her tax return, even though the pilot survey specifically asked about tip information. Id. Likewise, asked about commission income, at least one student did not report such income. Id. The pilot question regarding self-employment income generated confusion from a former cosmetology student who was presumably self-employed, renting a space and paying for her own supplies. Id. This cosmetology undergraduate was unsure how to answer the question, but no correction to the question was recommended. Id. Indeed, all of the questions dealing with self-employment income generated some level of confusion. Id. Thus, the survey generates confusion for precisely the same students who are most likely to have their income under-reported or missing from the SSA data: self-employed individuals and individuals receiving tip income.


Response: The version of the RGEES that was tested in the first round of cognitive interviews included the following as the second question:


According to our information, you completed the <program name> program at <institution name> between <beginning of cohort period> and <end of cohort period>. Is that correct?

No

Yes

Because this question asked about the program that the graduate completed, it did indeed cause graduates to think that the questions were about earnings only related to their degree. We removed this question after the first round of cognitive interviews. In the second round of cognitive interviews, graduates understood that the survey was asking about all earnings from all sources, and not just earnings related to their degree.


While we understand that some graduates may be reluctant to report tip income on a survey when they have not reported such income to the IRS, the RGEES provides a way for institutions to encourage graduates to report all such income. Individual data from respondents to the RGEES will never be reported to the federal government, only aggregated mean and median earnings. Therefore, institutions can assure their graduates that they can feel confident reporting all types of income in this survey.


Comment 4 (pages 13-14)--

Moreover, ED apparently carried out no pilot testing of the accuracy of the income responses it received from pilot respondents. Had it done so, given the confusion expressed about self-employment, tip, and unreported income, the Department would likely have confirmed the obvious potential for respondents to under-report their income. For these reasons, we advocate that the GE rulemaking be re-opened and that the RGEES be modified to provide more explanation regarding the scope of income information sought. The RGEES should expressly state that the income information sought is not limited to program-related income; it should include the definition of self-employment; and it should emphasize the need to report income even if it was not reported to the IRS. Then ED should re-execute its pilot survey to examine the quality of the income information generated from another pilot RGEES.


Response: A pilot test of the RGEES is now being conducted with a sample of graduates who received federal student aid from three program areas: cosmetology and related personal grooming services, somatic bodywork and related therapeutic services, and practical nursing, vocational nursing, and nursing assistants. The remaining respondents will be selected from all other program areas. The pilot will be used to compare median earnings collected through the appeals survey to median earnings for graduates from comparable programs based on a match to the Social Security Administration as part of the 2012 gainful employment informational rates. The results of the pilot will also be compared to earnings estimates in the Current Population Survey and the American Community Survey. RGEES question 3 indicates to self-employed graduates how they should calculate their earnings (income minus expenses). Institutions are free to emphasize to graduates in their communications materials that the information provided in the RGEES will not be shared with any federal agency. Examples of respondent communication materials are provided in the Best Practices Guide posted to the FSA website on August 31, 2015 (http://ifap.ed.gov/GainfulEmploymentInfo/attachments/BestPracticesGuide.pdf).


Comment 5—

ED has entirely failed to consider the burden that the draft RGEES Standards imposes on institutions and focuses exclusively on the burden on students responding to the RGEES. (page 14) Moreover, with respect to all of the Standards, ED must quantify the burden hours associated with complying with them. The failure to do so is in direct violation of the Paperwork Reduction Act. (page 17)


Response: The Department did consider the institutional burden associated with the use of surveys as allowed under the regulation. The burden was estimated in the Notice of Proposed Rulemaking and in the final regulation following the Gainful Employment program negotiated rulemaking (see the final GE program Federal Register notice dated 10/31/2014, FR Volume 79, Number 211). The institutional burden is calculated in the approved information collection OMB Control Number 1845-0122.


This new information collection is to identify the burden placed on the individual recipient of the survey.


Comment 6 (page 16)--

For example, Standard 1.1 requires a detailed discussion of the goals and objectives of the survey or survey system including the information needs that will be met, content areas included, the list of program completers to be surveyed, and analytic goals. While institutions using the RGEES Platform will have this produced by it, for institutions not using the Platform, this requirement seems needless, as all of these components are clear from the regulation itself.


Response: While it is not recommended, the Standards allow for the possibility of asking the RGEES questions verbatim at the beginning of another survey. Especially if this is done, it is important that the required elements be provided; otherwise, the survey administrator can easily copy this information from the regulations, as the commenter points out. Since the documentation will be used by an independent auditor who evaluates compliance with the Standards, this information will be important to the auditor. In addition, the list of program completers to be surveyed is an important component of the evaluation of the quality and usability of the resulting data and must also be provided to the independent auditor.


Comment 6a--

Standard 1.3 includes a requirement that institutions carry out training of survey collection staff and persons coding and editing the data. While virtually every new Title IV regulatory requirement obliges institutions to train their staff on complying with it, a specific training requirement in the Standards will lead to additional burdens of documenting training and, in some cases, carrying out training of individuals who already possess the skills to conduct survey requirements simply to create an audit trail of compliance.


Response: Even experienced data collection staff must be briefed on the procedures unique to a new data collection: for example, what modes are to be used during data collection, what triggers a decision to switch modes, what protocol a specific survey will use to encourage nonrespondents to respond, and what procedures will be used to trace hard-to-locate graduates. Similarly, the edit and coding specifications are unique to each survey and must be communicated to the data coding and editing staff.


Comment 7 (pages 16-17)--

Standard 1.5 requires a security plan to preserve data confidentiality and Standard 1.6 mandates a “disclosure analysis plan” if data is to be used for any purpose beyond appeal. In both cases, the goal is to preserve the confidentiality of graduate data and the two separate standards appear duplicative. The security plan can and should address how the institution will ensure data confidentiality and requiring both a security plan and a disclosure analysis plan seems unnecessary.


Response: The security plan required in Standard 1.5 calls for preserving the confidentiality of the data during collection, processing, and analysis. This plan relates to the electronic and physical safeguards that are applied to protect the data while they are in the institution’s control (e.g., the level of data encryption when the data are at rest, where the electronic and hard copies of the results are stored and where they may be used, who has access to the data, and a specification of limitations on the use of the data). In contrast, the disclosure analysis plan required in Standard 1.6 refers to the procedures applied to the data before they are shared with anyone for usage other than the appeal (e.g., removal of direct student identifiers such as name or social security number, cell size restrictions requiring a minimum cell size before the data can be released, top- or bottom-coding of extreme values, categorization of cases, and adding noise to the data).
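
For illustration only, one common disclosure-limitation step named above, the minimum cell size restriction, could be sketched as follows; the threshold and data shown are hypothetical, and the actual techniques are set by the institution's disclosure analysis plan:

    # Illustrative disclosure-limitation check, assuming a minimum cell size
    # rule; thresholds and techniques are governed by the institution's plan.
    MIN_CELL_SIZE = 10   # hypothetical threshold

    def releasable(cell_counts):
        """Suppress any cell whose count falls below the minimum cell size."""
        return {cell: (n if n >= MIN_CELL_SIZE else None)  # None = suppressed
                for cell, n in cell_counts.items()}

    print(releasable({"program A": 42, "program B": 7}))
    # {'program A': 42, 'program B': None}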


Comment 8 (page 17)--

The proposed Standards would require institutions to “[o]utline the quality assurance plan for each phase of the survey process that will permit monitoring and assessing the performance during implementation.” See Standard 1.7. While institutions are reasonably expected to ensure that the RGEES is carried out in accordance with the Standards, it is unclear what institutions are required to produce to meet this standard except to ensure that it follows the Standards. Without any additional instruction on what is expected here, this Standard seems to require the production of nothing more than a reiteration of the Standards.


Response: Since the documentation will be used by an independent auditor who evaluates compliance with the Standards, this information will be important to the auditor, even if it is just to confirm that the Standards were followed. Beyond the Standards, there are a number of activities identified in the Best Practices Guide for the RGEES that can contribute to a quality assurance plan. For example, the survey administrator may elect to monitor the response rates at various points throughout the data collection to guide decisions as to whether to send reminder messages to the survey respondents. The administrator may also want to monitor the number of respondents with incorrect contact information in order to know when to pursue alternative sources for this information.


Comment 9 (page 17)--

Likewise, Standard 1.9 is unclear in its expectations. It compels institutions to “identify and monitor key milestones of the survey and the time relationships among them.” If this means institutions should document when surveys are sent out and when graduates respond, the Standards should so state. If this means something else, the Standards should be revised to clarify what is expected.


Response: There are a variety of different milestones that are in part dependent on the level of effort the institution elects to commit to the RGEES. The Best Practices Guide includes suggestions for additional steps an institution may want to take when administering RGEES to improve response rates. For example, in addition to the initial date the surveys are sent out, a program may elect to monitor response rates and send out reminders or resend the survey to nonrespondents.


Comment 10 (pages 17-18)--

The Paperwork Reduction Act contains a specific requirement that federal agencies consider information collection burdens on small businesses and additional efforts to reduce the burden on business with fewer than 25 employees. 44 U.S.C. 3506(c)(4).


In its Supporting Statement for Paperwork Reduction Act Submission, the Department denies that the RGEES would affect small businesses at all. See Supporting Statement at 3 (“This information collection does not involve small businesses or other small entities”). ED makes this representation presumably because ED has considered only the burden on graduates responding to the RGEES. However, in the GE final regulations, ED identified 709 institutions to which the GE regulations apply and that it classified as “small entities.” 79 Fed. Reg. at 65,102. While ED did quantify the burden on small entities in preparing an alternative earnings appeal, such an estimate carries no validity as it was calculated before NCES even began to draft the RGEES Standards.


Clearly, these Standards cannot be effectuated until ED completes a burden analysis on all institutions and on small businesses in particular.


Response: This information collection is focused on the individual graduate who will be completing the survey--not institutional burden. The institutional burden for this survey process was part of an earlier Notice of Proposed Rulemaking whose comment period has closed. The citation noted by the commenter does not apply to this current information collection.


Comment 11--


The imposition of a 50% response rate will render RGEES unavailable to a significant proportion, if not a majority, of institutions wishing to challenge SSA earnings data. This is because our alumni’s living situations are particularly unstable due to a variety of factors including difficulties arising from having low-income backgrounds and, in many cases, dependents. (page 19)


Even if we could reach every single graduate in a universe RGEES survey, obtaining a 50% response rate would be extraordinarily difficult and potentially impossible. ED’s survey universe for its pilot survey was 3,400. ED Supporting Statement For Paperwork Reduction Act Submission Gainful Employment Recent Graduates Employment and Earnings Survey Pilot Test, Part A (Revised July 17, 2015).


Simply put, students are reluctant to respond to any institutional survey and particularly where sensitive income information is sought. For all of these reasons, we ask that the rulemaking be re-opened and that the response rate be reduced to a statistically valid sample. (page 21)


Response: Acceptable levels of nonresponse are discussed in a document from the Office of Management and Budget “Guidance on Agency Survey and Statistical Information Collections” (https://www.whitehouse.gov/sites/default/files/omb/assets/omb/inforeg/pmc_survey_guidance_2006.pdf). That guidance indicates that when submitting an Information Collection Request (ICR) to OMB “surveys with expected response rates lower than 80 percent need complete descriptions of how the expected response rate was determined, a detailed description of steps that will be taken to maximize the response rate, and a description of plans to evaluate nonresponse bias. Agencies also need a clear justification as to why the expected response rate is adequate based on the purpose of the study and the type of information that will be collected (whether influential or not).” Note that in the OMB Information Quality Guidelines, “influential” refers to information (i.e., data) that has a clear and substantial impact on important public policies or important private sector decisions; as such, these data are held to a higher standard of reproducibility and transparency and should be designed to minimize all sources of survey error, including nonresponse bias. Furthermore, the OMB Guidance notes that information collection requests for surveys with lower expected response rates are often justified when information is planned for internal use only, is exploratory, or is not intended to be generalized to a target population. Examples for these kinds of collections may include some customer satisfaction and web site user surveys and other qualitative or anecdotal collections. Given the intended use of data collected using the RGEES, these data meet the criteria of influential data and exceed the limited internal uses of data allowable with lower expected response rates.


In addition, the 50 percent response rate required in the RGEES Standards mirrors the requirement in the Gainful Employment Regulations for the other alternative earnings source allowable under an appeal. That approach requires that the institution obtain earnings records from the State employment database for 50 percent of the programs’ graduates (based on the list of program graduates submitted to SSA for earnings data). Although it would be preferable to have response rates higher than 50 percent, the Department has determined that 50 percent should be the minimum acceptable level, thus matching the requirement for the other alternative source for earnings data to be used in an appeal.


The commenter further expressed concern that an institution’s student contact information may be out of date, thus making the response rate requirement impossible to meet. We note that each institution has the ability to access the most recent contact information available for each student through the National Student Loan Data System (NSLDS), and thus does not have to rely on its own records, which may be out of date.


Comment 11a--

Although ED expected a response rate of 60%, Bryant & Stratton can find no public document reporting on the actual pilot survey response rate, and ED either has not calculated the response rate or has not disclosed the calculated response rate.


ED has disclosed, however, that only 42 survey respondents responded to its outreach to carry out its first round of cognitive testing. Results of Cognitive Testing, Appendix 4 (Docket # ED-2015-ICCD-0063). Only 41 individuals responded to the second round of cognitive testing. Id. Of those who responded to the first and second rounds of cognitive testing, only 15 individuals actually participated in each of the rounds. Id. If ED was able to obtain its expected 60% response rate (2,040 respondents), its rate of response to its request for cognitive testing was only 2% in each round (42 divided by 2,040) and only .07% actually participated in the cognitive testing in each round (15 divided by 2,040). (page 20)


Response: Cognitive interviews are conducted with opportunity samples and are not designed to garner high response rates. The RGEES cognitive interviews were conducted with graduates who were willing to respond to advertisements, to meet with data collection agents for an hour, and to engage in a thorough discussion about their responses to the survey items. Response rates are not calculated for cognitive interviews because such rates are not meaningful. The RGEES is now undergoing a large-scale pilot that will include nonresponse follow-up and other best practices designed to elicit responses. The Best Practices Guide posted to the FSA website on August 31, 2015 (http://ifap.ed.gov/GainfulEmploymentInfo/attachments/BestPracticesGuide.pdf) includes tips for institutions to help them achieve the highest possible response rate on their institutional survey.


Comment 12 (pages 21-22)--

ED was asked by commenters to the RGEES Pilot Test (Docket # ED-2015-ICCD-0063) whether institutions would be permitted to pay graduates to encourage them to respond to surveys. In its response, ED stated that it would recommend “institutions consider using various types of respondent encouragements to help obtain graduate cooperation.” ED paid graduate respondents $25.00 to respond to its pilot survey. For most institutions, such payments would add significant and burdensome costs to achieve the minimum 50% response rate required. However, ED did not consider this burden as required by the Paperwork Reduction Act.


ED has assumed a universe of 22,123 survey recipients in its Supporting Statement to its Proposed RGEES Standards. If institutions were to pay $25.00 to each of these recipients, the resulting cost would be $553,075 on top of the costs incurred to plan, prepare, execute, and document the RGEES. Even if the amount were paid only to half of this universe, the cost is significant and should be considered in this review.


Response: There is no statutory or regulatory requirement for institutions to offer respondent encouragements, including incentives. Respondent encouragements are suggested in the Best Practices Guide posted to the FSA website on August 31, 2015 (http://ifap.ed.gov/GainfulEmploymentInfo/attachments/BestPracticesGuide.pdf). Such encouragements are not required by the regulation but an institution may choose to avail itself of this option to help increase response rates.


Comment 13 (page 22)--

For these reasons, ED should add a mechanism permitting institutions to obtain extensions of time to submit the RGEES for good cause. Also, the RGEES does not address whether institutions may contract with third parties to administer the RGEES to their graduates; given the burden associated with the RGEES, the commenter asks that ED clarify that such contracting is permitted.


Response: Under the regulations, an institution has 60 days after the final debt-to-earnings rates are issued to submit an appeal based on survey earnings. However, the institution can begin to survey its program graduates at any time after the calendar year for which earnings data are used to calculate the D/E rates for a particular cohort. For example, for the cohort of students who completed a program during the two-year period from 2011-2013, the D/E rates will be calculated based on earnings for the 2015 calendar year, so the institution could begin to survey that cohort as soon as January 2016. For an institution that waits to conduct the survey until draft D/E rates are issued (at which point the institution will know that a program is failing or in the zone), we estimate that it will take about six months from that point to issue final D/E rates. In this case, the institution will have those months plus the 60-day period to conduct the survey and submit its appeal. In any case, we believe there is ample time for the institution to conduct the survey, and we do not intend to grant appeal extensions.
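
The example timeline above can be laid out concretely; the dates below are hypothetical apart from the earnings year and the 60-day appeal window stated in the response:

    # Hypothetical timeline sketch for the example above. Only the earnings
    # year (2015) and the 60-day appeal window come from the response; the
    # issuance dates and six-month interval are illustrative assumptions.
    from datetime import date, timedelta

    survey_can_begin = date(2016, 1, 1)          # any time after the earnings year
    draft_rates_issued = date(2016, 6, 1)        # hypothetical issuance date
    final_rates_issued = draft_rates_issued + timedelta(days=182)  # ~6 months
    appeal_deadline = final_rates_issued + timedelta(days=60)      # 60-day window

    print(survey_can_begin, appeal_deadline)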


With regard to contracting, we note that an institution may enter into a contract with a third-party servicer to perform any function or activity the institution would normally perform in administering the student aid programs (see 34 CFR 668.25), including conducting earnings surveys.


Comment 14 (page 23)—

In fact, ED suggested in its response to RGEES Pilot Comments that the time it would take for graduates to respond to a real (rather than pilot) RGEES could exceed 5 minutes. Responses to Public Comment on the RGEES Pilot Test (Docket # ED-2015-ICCD-0063) at 18 (“Because respondents to the RGEES pilot test will have the opportunity to consult their records to gather the data needed, the actual time to complete the survey may be longer”). It is unclear why the Department continues to estimate that respondents will take a mere 5 minutes to complete the RGEES.


A more realistic time burden estimate per respondent is at least one hour, which would include the time to read the survey, consult with the institution sending the survey, find income documentation to reference, and send the response back to the institution.


Response: The Department appreciates the concern regarding the burden estimate. We note for clarification that the burden assessment applied here is only for the individual who receives the survey. The burden to institutions was previously assessed in the Gainful Employment regulations published as a final rule on October 31, 2014. The institutional survey burden is contained in the information collection package OMB Control Number 1845-0122, with a total burden of 23,860 hours for institutions. That information collection was open for public comment during the Notice of Proposed Rulemaking comment period.


This new information clearance package is for the individual graduate who will receive the survey and mirrors the burden estimate assigned to the individual for the pilot Recent Graduates Employment and Earnings Survey, which has just concluded its information collection clearance. In reviewing this and other comments regarding the individual graduate burden estimates, the Department is persuaded that 5 minutes is not sufficient time for a survey recipient to review the survey questions and search for any documentation needed to provide the most complete and accurate response.


However, the Department does not agree with this commenter that the individual who receives the survey would require an hour to complete it. There is no basis to assume that all of the institutions that send an earnings survey to their graduates will need to spend time explaining the survey to the recipient. The Department anticipates that the majority of these surveys will be conducted electronically using the free RGEES data collection and processing platform. The Department is revising its individual burden estimate upward from 5 minutes to 20 minutes for an individual to review the survey, locate income documentation, record the data on the survey, and submit it to the institution.
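For context on the aggregate effect of this revision (a minimal sketch in Python; the 22,123-recipient universe is the Supporting Statement figure cited in Comment 12, and the totals here are illustrative rather than official burden figures):

    # Approximate aggregate respondent burden under the revised estimate.
    # The universe figure is from the Supporting Statement cited earlier;
    # the totals are illustrative, not official burden figures.
    recipients = 22_123
    minutes_per_response = 20        # revised upward from 5 minutes

    total_hours = recipients * minutes_per_response / 60
    print(f"Estimated respondent burden: {total_hours:,.0f} hours")  # ~7,374 hours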



Comment 15 (pages 24-25)—

The most critical omission in the proposed RGEES Standards is the Best Practices Guide (BPG), which to our knowledge the Department has still not released to the public. The BPG is part and parcel of the RGEES and is referenced repeatedly in the proposed Standards. According to the proposed RGEES Standards, the BPG will shed light on appropriate data collection procedures, acceptable strategies for achieving high response rates, data collection protocol for staff involved with data collection, procedures for identifying and correcting problems, wording of the required institutional confidentiality pledge, a template for the required institutional security plan, and data editing plan specifications. See Introduction to Survey and Standards.


Without the BPG, ED has failed to inform the public of the full scope of the burden and costs associated with the RGEES Standards.


Response: The Best Practices Guide was posted to the FSA website on August 31, 2015, and can be found at http://ifap.ed.gov/GainfulEmploymentInfo/attachments/BestPracticesGuide.pdf.


Comment 16 (page 25)—

Standard 5.1 requires that a survey be “completed” to be included as a response. However, in its Responses to Public Comment on the RGEES Pilot Test, ED stated that even incomplete surveys would be considered. How does ED reconcile these statements, and how will an institution be able to determine whether a survey response is sufficiently “complete” to be considered a response?


Response: Standard 5.1 now reads: “A completed survey must include sufficient responses to determine whether the respondent has earnings and to support reporting the respondent’s total earnings (including 0 earnings). The RGEES survey will be considered ‘complete’ if the respondent fills out at least one of the earnings items.”
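As a hypothetical illustration of how this completeness rule could be applied when processing responses (a minimal sketch in Python; the earnings field names are invented for illustration and are not taken from the RGEES instrument):

    # Hypothetical check of the revised Standard 5.1 rule: a survey is
    # "complete" if at least one earnings item is filled out. The field
    # names below are invented for illustration.
    EARNINGS_ITEMS = ("wage_salary_earnings", "self_employment_earnings")

    def is_complete(response: dict) -> bool:
        # A reported value of 0 still counts as a response, so only
        # missing (None) answers are treated as blank.
        return any(response.get(item) is not None for item in EARNINGS_ITEMS)

    print(is_complete({"wage_salary_earnings": 0}))   # True: 0 earnings counts
    print(is_complete({}))                            # False: nothing answered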


Comment 17 (page 25)—

Standard 1.2 states that institutions that choose to include additional survey questions must ensure that “[a]ny additional items [are] separate and not applicable to the Gainful Employment issue.” What kinds of survey questions would be applicable to GE issues and what kinds of additional survey questions would be acceptable?


Response: If an institution chooses, for example, to add the RGEES earnings questions to an existing graduate survey, it must place the RGEES items first. It is up to the auditor to confirm that the placement and content of any additional items did not affect the collection of earnings data in the RGEES.


Comment 18 (page 25)—

Bryant & Stratton asks that the BPG be released after ED has re-opened the GE rulemaking to address the issues discussed in these comments, so that the public is given the opportunity to respond in conformity with the Paperwork Reduction Act.


Response: The Best Practices Guide was made public in August 2015 and is viewable in the Resources section of the Gainful Employment page of the Information for Financial Aid Professionals website at http://ifap.ed.gov/GainfulEmploymentInfo/attachments/BestPracticesGuide.pdf.


Comment 19—

The Department Failed to Estimate the Added Burden of Attempting to Conduct the RGEES and Report the Result Within 60 Days. If the D/E rates are appealed, the GE regulation allows 60 days to conduct the survey and return it to the Department. NCES’s pilot study took 60 days after it had developed the survey and identified the participants. Institutions will have only 60 days to do the same, plus identify their universe of graduate survey recipients, track down contact information for the graduates, comply with the complex and detailed data collection planning, methodology, and confidentiality requirements in RGEES Standards 1-3, comply with the data editing requirements of Standard 4, calculate response rates per Standard 5, carry out the nonresponse bias analysis per Standard 6, calculate the mean and median earnings as set forth in Standard 7, and follow the survey documentation mandates of Standard 8.


Response: Institutions will know the identity of the graduates in their survey universe before the names are submitted to SSA, and about six months before the institution receives final D/E rates for its programs. If an institution expects to need an appeal survey (for example, based on prior-year results), it can begin assembling contact information for the graduates it will survey as soon as it receives the approved list from ED. This activity is outside the 60-day window. As to the commenter’s concerns over the burden imposed by the Standards, the Department is providing the RGEES Platform free of charge to reduce the burden on institutions. For institutions that employ the RGEES Platform, the requirements of Standards 4, 5, 6, 7, and 8, and parts of Standards 1 and 3, are met and documented within the platform and are included in the Platform documentation and reports.

