RESPONSE TO OMB QUESTIONS (4647) 1875-NEW “Program Performance Data Audits Project”
1. What is the relationship between the Audits Project and the Department's ongoing Data Quality Initiative?
The Program Performance Audits Project is one component of the Department's ongoing Data Quality Initiative (DQI). While it is not the Department's sole source of support related to data quality, the Audits Project addresses a cross-cutting set of categorical and discretionary programs, and its findings will inform recommendations and guidance on data quality for a range of program areas. Also included under the DQI is the Technical Assistance Project, which has provided support to Department programs focused on enhancing data collection and reporting since 2006.
2. What is the history of the Audits Project? Has the Department collected data before under this project? If so, how is this collection different?
The Audits Project commenced in 2009 to provide guidance to improve data quality and ensure that program decisions reflect sound information. This activity supports the Department’s efforts to comply with OMB Circular A-11, requiring federal agencies to establish procedures to ensure the accuracy of all performance measurement. The Department has not collected comparable data prior to this effort.
The Department awarded the first Data Audit contract to Decision Information Resources and Mathematica Policy Research. To date, the project has collected and reviewed documentation for the 11 programs of interest and interviewed staff from the ED program and budget offices and their contractors (as appropriate). These interviews yielded information about ED's processes and the challenges encountered in implementing elements of the performance reporting and evaluation processes. Through the grantee and State interviews, the subject of this clearance request, the Department will learn about grantees' reporting processes as well as their views on the usefulness of the Department's guidance, training, and technical assistance.
3. Program selection
a. How did the Department select the 11 programs to be included in this project?
The Department selected programs in two phases, based on input from Budget Service staff and on the interest and willingness of program offices. A key priority was including formula and discretionary grant programs with different data collection and reporting strategies, in order to illustrate a range of approaches and inform recommendations for improving data quality across the Agency. The first cohort of programs, selected in 2009, included the Title III National Professional Development program, the ELA Native American and Alaska Native Children in School program, ELA State Grants, Equity Assistance Centers, Voluntary Public School Choice, and Literacy through School Libraries. At the time, OMB staff suggested including the Rehabilitation Services Administration (RSA). However, the Department could not support data audit work related to RSA because funds for this contract come from the Fund for the Improvement of Education, which supports elementary and secondary education programs. Selection of the second cohort of programs in 2010 (GEAR UP, IDEA Parts B and C, and the Perkins Basic State and Tech Prep grants) followed a similar rationale.
b. What is the rationale for including two programs (Literacy through School Libraries and Voluntary Public School Choice) that were eliminated in 2012?
The Department selected the Literacy through School Libraries and Voluntary Public School Choice programs, as well as the Perkins Tech Prep program, based on the recommendation of budget analysts. Because these programs are no longer funded, the Department withdraws its proposal to interview their grantees but will share lessons learned from the work completed to date on these programs. Attachment A includes a revised burden estimate.
c. One of the two primary research questions in this project relates to local evaluations. What is the rationale for including formula grant programs that do not usually require these evaluations?
The project will not review grantee evaluations for formula grant programs.
4. One of the big problems with data quality in the IDEA programs, and other state grants that get passed on to LEAs, is poor quality local data, which the State then aggregates and passes on to ED. Is the Department concerned that if it is only looking at data submitted by the State grantees, then it will not go deep enough to identify potential problems? Will this project address to any extent the local level data quality in these State grant programs?
The Department is concerned about the quality of local grantee data aggregated to the State level, and addressed this concern with respect to IDEA in the FY 2013 budget justification. The Office of Special Education Programs (OSEP), within OSERS, is participating along with OESE in a data validation pilot project overseen by the Performance Information Management Service (PIMS); the pilot is designed to improve the Department's knowledge of the validity of State-reported assessment results at all levels. IDEA funds also support technical assistance to improve the capacity of States to collect and report valid and reliable IDEA data to the Department.
The project does not have sufficient resources to address the quality of local data reported to and aggregated at the state level for other programs.
5. We are very interested in learning how the Department comes up with performance measures and about the process of how the Department changes performance measures or adds new ones. Along the same vein, we are also interested to learn more about how the Department uses performance data and evaluations to inform programming. Based on the research and sub-research questions, it seems that these two topics will be covered by this project. If this is the case, can you provide the questions/interview protocol that will be used with Department staff to cover these two topics?
In recent years, program offices have worked with Budget Service and other relevant staff to develop and update performance measures. The DQI technical assistance contract has also contributed to this work. The Department is currently reviewing program performance measures and will propose new measures where needed. This project will contribute to the larger effort by assessing the quality of program performance data against a set of performance criteria developed under this contract. These criteria include sections on the development and stability of the performance measures and on the uses of the data. Interviews with program and Budget Service staff, as well as with grantees, include questions that address the development of measures and the use of performance data. Attachment B includes the Performance Measurement and Evaluation Interview Questions asked of Department staff.
6. Can you further break out the costs to the government by each major activity/task? In particular, we’re interested in understanding how the cost of the data collection compares to the total cost in 2012—calendar year or fiscal year.
The table below displays the total contract and data collection costs by each major activity/task for the entire project. The table also shows the anticipated total and data collection costs to the government in 2012, based on the current schedule of work. The data collection from interviews with grantees and local evaluators, which is the focus of this clearance request, reflects approximately 27 percent of the total expected costs in 2012. Other costs in 2012 will include those for data entry and aggregation checks, systematic reviews of local evaluations, and the analysis, briefings, and reports on the findings.
Cost Category | Total | % of Total | 2012 Costs | % of 2012 Costs
Communicate with ED (including weekly meetings and monthly reports) | $177,564 | 7.07 | $112,272 | 7.16
Collect and review data-related documents | $245,982 | 9.79 | $0 | 0.00
Develop criteria for assessing quality of program performance results | $105,038 | 4.18 | $0 | 0.00
Develop protocols and conduct interviews with Departmental staff and contractors | $352,608 | 14.03 | $0 | 0.00
Develop protocols for conducting grantee and local evaluator interviews | $59,052 | 2.35 | $0 | 0.00
Prepare IRB and OMB documents | $46,129 | 1.84 | $4,304 | 0.27
Draft sampling plans and conduct grantee interviews | $458,260 | 18.24 | $421,478 | 26.87
Conduct data entry checks | $189,179 | 7.53 | $189,179 | 12.06
Conduct data aggregation checks | $174,066 | 6.93 | $174,066 | 11.10
Assess quality of local evaluations | $266,061 | 10.59 | $228,440 | 14.56
Briefings and reports | $438,957 | 17.47 | $438,957 | 27.98
Total | $2,512,896 | 100.00 | $1,568,696 | 100.00
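As a check on the 27 percent figure cited above, using only the values in the table: the 2012 cost of drafting sampling plans and conducting grantee interviews is $421,478, and total 2012 costs are $1,568,696; $421,478 ÷ $1,568,696 ≈ 0.269, or approximately 27 percent (26.87 percent in the table).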
Attachment A
Revised Estimates of Grantee and Contractor Total Burden—8 programs
Grant Program | Number of Grantees or Contractors to Interviewᵃ | Number of Respondents Per Grantee or Contractor | Average Time Per Respondent (Hours) | Total Respondent Burden (Hours) | Estimated Hourly Wage (Dollars)ᵇ | Estimated Total Burden Across all Respondents (Dollars)
Title III National Professional Development Grants | 102 | 3 | 1 | 306 | $32.65 | $9,991
ELA State Grants | 52 | 3 | 1 | 156 | $32.65 | $5,093
Equity Assistance Centers | 10 | 3 | 1 | 30 | $32.65 | $980
ELA Native American Alaska Native Children in School Program | 9 | 3 | 1 | 27 | $32.65 | $882
Gaining Early Awareness and Readiness for Undergraduate Programs (GEAR UP) | 136 | 3 | 1 | 408 | $32.65 | $13,321
Perkins Title I – Basic State Grant | 53 | 3 | 1 | 159 | $32.65 | $5,191
OSERS Part B – State Grants | 59 | 3 | 1 | 177 | $32.65 | $5,779
OSERS Part C – Infants and Toddlers | 59 | 3 | 1 | 177 | $32.65 | $5,779
Program Office Contractors | 11 | 2 | 0.75 | 16.5 | $32.65 | $538
Revised Totals¹ | 491 | | | 1,456.5 | |
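For reference, each row of the burden table follows the same arithmetic: total respondent burden (hours) = number of grantees or contractors × respondents per grantee or contractor × average time per respondent, and total burden (dollars) = burden hours × estimated hourly wage. For example, for the Title III National Professional Development Grants: 102 × 3 × 1 = 306 hours, and 306 × $32.65 = $9,990.90, which rounds to $9,991.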
Attachment B
Performance Measurement and Evaluation Interview Questions Asked of Department Staff
1. For each performance measure that grantees reported during the 2008–2009 reporting cycle, what was the rationale for its development? Why was this particular measure chosen to track a specific program goal as opposed to some other measure? (PROBE: Ask the respondent to provide a rationale for all submitted 2008–2009 performance measures, including GPRA measures and any other non-GPRA measures that were submitted.)
2. Please describe the decision-making process used to select these 2008–2009 performance measures: the process used to identify potential measures, the criteria used to rank those measures, and how the performance measures submitted in 2008–2009 were then selected and finalized. (PROBE: Was this process the same for all performance measures, including non-GPRA measures? If not, what process was used for each performance measure? Also, determine if any logic models were used in the decision-making process. If so, ask the respondent to describe these models and how they were used.)
3. With regard to the decision-making process used to identify and select performance measures, what were some of the major issues discussed or considered? How were those issues addressed?
4. What activities were conducted to determine which data elements would be used to calculate the 2008–2009 performance measures?
5. When did ED start the process to identify and select performance measures for the 2008–2009 reporting cycle? (PROBE: Determine if this was the typical time when the performance measure derivation processes occur. If not, determine why this reporting year deviated from what was typical.)
6. Approximately how long did the process take to identify, select, and determine how to calculate all 2008–2009 performance measures? (PROBE: If possible, determine the total time for each component of the derivation process separately. Also, determine whether the derivation of all performance measures (GPRA and non-GPRA measures) occurred simultaneously. If not, ask the respondent to provide these time estimates for each type of performance measure.)
7. Were any problems encountered with regard to identifying, selecting, and determining how to calculate the performance measures? IF YES, what types of problems were encountered and what steps were taken to address these problems?
8. Since the selection of 2008–2009 performance measures, have there been any changes in the process used to identify, select, and determine how to calculate program performance measures for subsequent program years? IF YES, please describe the changes in the process. (PROBE: Determine why and when the changes occurred, how many times changes have occurred since the 2008–2009 reporting cycle, and, if changes occurred more than once, the specific changes for each occurrence.)
9. Are you aware of grantees developing stand-alone evaluation reports? IF YES, how did [the program office/the contractor] use these evaluation reports? (PROBE: Determine if any GPRA or non-GPRA performance data were used in grantee evaluation reports.)
¹ Revised totals exclude the burden associated with programs no longer included in grantee interviews: Literacy Through School Libraries, Voluntary Public School Choice, and Perkins Title II – Tech Prep.