November 14, 2012
ED’s Responses to OMB’s Comments on 201205-1875-002
Strategies for Preparing At-Risk Youth for Postsecondary Success
We appreciate the comments from OMB and the opportunity to clarify issues raised by the review. Below are our responses to each of the concerns that OMB provided to ED on November 9, 2012. We have numbered OMB's comments for easy reference.
OMB’s Overall Comment: ED proposes collection of up to 15 site visit reports describing programs targeted at decreasing high school dropout rates among at-risk youth and preparing them for postsecondary education or training, preceded by a phone-only screening interview to refine final site visit selections. This is a very well-articulated proposal that, commendably, states clearly, early, and often that the site visits are a purposive sample and so their "results" are not directly generalizable to any broader population. There are some questions/concerns, including a big one on the time allotted for the interviews, but no single one of them is a fatal flaw:
OMB Comment 1: There is one notable slip where the supporting statement strays from the otherwise exceptionally cautious statements about the limited generalizability of the findings from this work. After the research questions on p. 2 of Part A, the statement says, "The data collected on the site visits will be used to answer research questions 2 through 5," and "answer" alone is too strong a statement. They may partially answer the questions; they may provide information related to the questions; they may (vaguely) speak to the questions; but the site visit data cannot definitively answer them. This is especially true for Question 4, which is all about generalizing experiences to differing state, local, school, and community contexts. More nuanced wording should be used in place of "answer."
ED Response: As the OMB review notes, the purposive sample of 15 case studies can indeed only address, rather than fully answer, research questions 2 through 5. Accordingly, we changed the language in Part A to "The data collected on the site visits will be used to address research questions 2 through 5" (pp. 2-3).
OMB Comment 2: The public comment (from Dana Zorovich) and ED's response should be posted in ROCIS.
ED Response: For your reference, we have attached an email file with Dana Zorovich’s original comment along with ED’s direct response to her (in addition to posting this in ROCIS).
OMB Comment 3: One key detail that is missing from the discussion in Part B, to the extent that it can be pinned down, is even the roughest order-of-magnitude statement of how big the universe of postsecondary-focused dropout prevention programs might be. There are a couple of breadcrumbs: the literature review is said to have indicated that there are relatively few such programs (Section B.1) and that their number is relatively small (Section B.3), and Section B.1 says that the screening interview information will help prioritize the shortlist in the event that more than 15 sites meet the sampling criteria. What is missing is any idea of how likely ED thinks that event is. Is 15 effectively a census of these programs? Are there a few dozen programs? Fewer than 200? That ED is conducting this as a purposive sample, and representing it as such, is fine, but it would still be useful to have some rough calibration of how big the pool is.
ED Response: We agree that it would be very helpful to put the case study sample of 15 sites in appropriate context. Unfortunately, the universe of programs aimed at both preventing dropouts among at-risk students and preparing them for postsecondary success is unknown. To our knowledge, no census of such programs exists, likely because they represent a relatively new policy emphasis. Indeed, this new policy direction was an impetus behind the Department’s interest in funding this project.
Nonetheless, some related benchmarks are available. In 2010-11, a majority of public school districts offered a variety of supports at the high school level for students at risk of dropping out, including tutoring (84%), remediation classes (79%), and summer school (67%) (Carver and Lewis, 2011, p. 8), as well as credit recovery courses/programs (88%) and early graduation options (63%) (ibid., p. 9). Among the strategies for preparing at-risk students for postsecondary pursuits, public school districts also reported at-risk student participation in career/technical education and dual enrollment courses: 66% of public school districts reported that "some" and 26% reported that "most" at-risk students in their districts took career/technical courses at regular high schools; 58% reported that "some" and 3% reported that "most" at-risk students in their district took "dual enrollment courses with a career/technical focus"; and 34% reported that "some" and 1% reported that "most" at-risk students took "dual enrollment courses with an academic focus" (ibid., p. 10). These data on at-risk students' participation in dual enrollment with an academic focus, one of multiple potential strategies for preparing at-risk students for postsecondary success, suggest that only a very small proportion of public high schools are likely engaging the majority of their at-risk population in preparation for postsecondary education.
A brief explanation is now included at the end of Section B.1: "The universe of programs aimed at both preventing dropouts among at-risk students and preparing them for postsecondary success is unknown. To our knowledge, no census of such programs exists. However, 58% of public school districts reported that 'some' and 3% reported that 'most' at-risk students in their district took 'dual enrollment courses with a career/technical focus,' and 34% reported that 'some' and 1% reported that 'most' at-risk students took 'dual enrollment courses with an academic focus' (Carver and Lewis, 2011, p. 10). These data on at-risk students' participation in academic dual enrollment courses, one common potential strategy for preparing at-risk students for postsecondary success, indicate that likely only a very small proportion of public high schools are engaging the majority of their at-risk population in preparation for postsecondary education" (p. 7).
Reference: Carver, Priscilla Rouse and Laurie Lewis. 2011. Dropout Prevention Services and Programs in Public School Districts: 2010–11 (NCES 2011-037). U.S. Department of Education, National Center for Education Statistics. Washington, DC: Government Printing Office. Available at http://nces.ed.gov/pubs2011/2011037.pdf
OMB Comment 4: When the four program-type criteria are introduced on p. 6 of Part B, it would be helpful for them to be labeled as being in rough order of importance or in rough order of selection priority, to better set up the reference back to them (on the next page) as being used in refining the short list.
ED Response: The sentence introducing the four program types now reads: “In addition to meeting the initial three criteria, nominated programs must specifically represent one of the following types of programs to be included in the sample (in order of selection priority)” (p. 6 of Part B).
OMB Comment 5: On p. 7 of Part B, in the middle of the paragraph on refining the final site visit list, one sentence is perhaps overly vague: "In all program types, cases with the best evidence suggesting effectiveness will have the highest priority." Particularly since it's unclear whether the screening interview is at all positioned to inform an assessment of evidence suggesting effectiveness, this sentence should either be elaborated or dropped.
ED Response: The sentence has been dropped (p. 7 of Part B).
OMB Comment 6: Understood that the idea is to cap interview time during the site visits at 45 minutes per person, but the expected response time/burden estimates for some groups seem low (not anticipating any prep time for the interview). The protocol for principals/assistant principals (Attachment 8) suggests that 1-3 bolded questions should only be asked of administrators in charge of the specific program implementation; even assuming all of those questions are skipped, the (admittedly semi-scripted and, hence, somewhat flexible) protocol would seem to call for the completion of 49 open-ended questions in a 45-minute interview, which seems extremely optimistic. Similarly, the instructions for the counselor/social worker interview (Attachment 10) promise some bolded questions to be asked only of particularly key respondents, but no questions are actually bolded, leaving 38 open-ended questions to be answered in 45 minutes. Will site visit staff have some protocol questions flagged as essential/top priority to address in the interview? Or should the interview time be modified?
ED Response: The interview protocols focus on the key topics necessary to complete the case studies. Although we anticipate that site visitors will be able to streamline the questioning if certain items are not applicable, we appreciate OMB’s concern about representing the estimated response time more accurately. Therefore, we have raised the expected interview time to 60 minutes for all interviews other than that of the program manager. All time and cost burden estimates associated with those respondents have been changed correspondingly in Part A, Part B, the IC Data Burden Table, and the IC Data Form. As a result of that change, we also increased the expected interview time for the program manager to 75 minutes since that protocol is slightly longer; however, because we estimated a total of 5 hours per program manager to be split between preparation and interview time, the time and cost burden for that respondent type did not change. We will ensure that preparation time is reduced by a corresponding 15 minutes, as indicated on p. 12 of Part A.
The directions for the counselor/social worker protocol have been modified to read: "Tailor/select questions based on guidance counselor's description of responsibilities and degree to which he/she plays an integral part in the program's implementation activities."