April 10, 2013
Prepared for
Institute of Education Sciences
U.S. Department of Education
Prepared by
Decision Information Resources, Inc.
Abt Associates
Contract Number:
ED-IES-10-R-0015
B.1. Respondent Universe and Sampling Methods
B.2. Information Collection Procedures
B.3. Methods to Maximize Response Rates
B.4. Tests of Procedures
B.5. Individuals Consulted on Statistical Aspects of Design
This clearance request is submitted to the Office of Management and Budget (OMB) in support of data collection and analysis planned as part of A Study of Implementation and Outcomes of Upward Bound and Other TRIO Programs, sponsored by the Institute of Education Sciences (IES) at the U.S. Department of Education (ED). This request is a resubmission on behalf of this project, after ED and OMB determined that an earlier plan for the study was not feasible.1
Overview of the Upward Bound Program
The Upward Bound (UB) Program, the oldest and largest of the federal TRIO programs, was initiated under Title IV of the Higher Education Act of 1965 as a pre-college program designed to help economically disadvantaged high school students (low-income and/or first-generation college-going) prepare for, enter, and succeed in postsecondary education. With grant awards approximating $5,000 per student per year, Upward Bound provides an intensive set of services. In the most recent (FY 2012) grant competition, ED sought to make the provision of those services more productive and cost effective by incentivizing project applicants to propose new delivery methods and approaches that could reduce the cost per student without sacrificing implementation quality. Awards were made to 820 Upward Bound grantees totaling $268.2 million to serve more than 62,500 students.
The structure of Upward Bound projects is well established, defined largely by specific statutory language prescribing project and student eligibility and the services offered. Students usually enter the program while in the ninth or tenth grade. Although students may participate in Upward Bound through the summer following twelfth grade (for three to four years total), participants spend an average of 21 months in the program.2 Upward Bound projects are generally operated by two- or four-year colleges. Projects must offer academic instruction as well as college counseling, mentoring, and other support services. In addition to regularly scheduled meetings throughout the school year, projects also offer an intensive instructional program that meets daily for about six weeks during the summer. Upward Bound projects may provide additional services or supports to students, beyond the set of required core activities. Currently, very little is known about the intensity, duration, and mix of services, or how they are delivered, particularly for FY 2012 grantees that may have changed implementation strategies under the new grant requirements.
Overview of the Study and the Role of the Proposed Data Collection
The Study of Implementation and Outcomes of Upward Bound and Other TRIO Programs was awarded in August 2010 as a first step towards launching a congressionally mandated study. The 2008 reauthorization of the Higher Education Act (HEA) required ED to conduct a study that would identify particular institutional, community, and program or project practices that are effective in improving Upward Bound student outcomes (20 U.S.C. 1070a-18) (see Appendix A). However, the law also prohibited any study of a TRIO program that would require grantees to “recruit additional students beyond those that the program or project would normally recruit” or that would result “in the denial of services for an eligible student under the program or project.” These prohibitions were initially seen as eliminating the possibility of conducting a randomized controlled trial within the Upward Bound program.
The new contract was intended to explore possible designs for, and the feasibility of, conducting a study of the effectiveness of one or more promising practices in Upward Bound. The design work completed under the contract highlighted the weaknesses of popular, program-supported study approaches, such as matching grantees on certain project characteristics and then assessing variation in student outcomes that could be attributed to differences in project implementation.3 In consultation with OMB in summer 2011, ED moved forward to investigate options for testing certain “enhanced” practices or alternative strategies for implementing core components of Upward Bound using a randomized controlled trial.
The data collection for which we are now seeking OMB approval is part of that effort.4 The survey of FY 2012 Upward Bound grantee project directors has two objectives:
To help identify one or more strategies to test as part of a random assignment demonstration. One challenge in pursuing a rigorous study of promising practices in Upward Bound, given that it is a service-rich program, is understanding how (not just whether) projects implement core components, where gaps in services might exist, and the prevalence of both the implementation approaches and the gaps. In order not to violate the statutory prohibition against denying regular Upward Bound services to student participants, we must ensure that any promising strategy we evaluate is not already widely implemented, yet is sufficiently attractive to Upward Bound projects that we can recruit grantees not already implementing the strategy to participate in a research demonstration involving random assignment.
To provide new and up-to-date information about the services and strategies that Upward Bound grantees implement, particularly under the new grant productivity requirements. The last survey of program implementation was conducted in 1997,5 and the program office has an interest in obtaining information that is more current. The survey will attempt to capture the approaches to program services and supports that participants experience and that may improve their prospects of successfully completing high school and entering and/or completing college.
The survey is designed to address the following research questions:
How are the key components and service areas implemented within Upward Bound?
What is the prevalence of specific strategies or approaches for implementing service areas or program components across grantee projects?
Where are there gaps in services that might be linked to improving student outcomes?
What strategies or approaches may be amenable to testing through an experimental design to measure impact on student outcomes?
The project director survey will be administered to all 820 regular Upward Bound grantees in the summer of 2013, so that grantees can report on Upward Bound activities and services during their summer program as well as during the school year. Because of the need to collect information on how services are implemented, the instrument is detailed and designed to take approximately 40 minutes to complete. We solicited and have received input from the TRIO community and other college access experts to improve and revise the instrument during the public comment period.6 The results of the data collection will be tabulated and released in a summary report.
B.1. Respondent Universe and Sampling Methods

All 820 Upward Bound grantees that received grants in 2012 will be included in the survey. The list of funded grantees maintained by the U.S. Department of Education’s Office of Postsecondary Education (OPE) will be used to identify the universe of UB projects.
The decision to conduct surveys with a census of UB grantees, as opposed to a smaller statistical sample, is based on the following factors:
Need for subgroup estimates. One goal of the survey is to help identify an important and promising practice to assess rigorously. Key to achieving that goal is determining how commonly the practice is already implemented, both overall and across various types of grantees. Tabulating subgroup means and distributions with useful precision requires a large number of responding grantees within each subgroup of interest (see the illustrative calculation following this list).
Respondent cooperation. Because of concern among UB grantees and stakeholders about the use of data from a past UB evaluation in which sampling and weighting were used, we believe that greater acceptance of and cooperation with the survey effort will be garnered if all UB projects are invited to participate. Because of the interaction among UB grantees through their regional and national association activities, we anticipate higher response rates if all grantees are required to participate in the survey as part of their grant obligations under EDGAR than if only a sample of grantees were asked to participate.
Cost efficiency. As with all surveys, the fixed costs of conducting this survey will be substantial. However, the marginal cost of including additional sample members in the survey is small. In addition, selecting a sample and using sampling weights that account for the sampling design would add substantial cost to the government.
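To illustrate the subgroup-precision rationale above with a hypothetical calculation (the sample size of 200 and subgroup share of 25 percent used here are illustrative assumptions, not study parameters): if a simple random sample of 200 of the 820 grantees were drawn instead of a census, a subgroup comprising about 205 grantees would contribute roughly 50 respondents. The 95 percent margin of error for an estimated proportion near $p = 0.5$ within that subgroup, with a finite population correction, would be approximately

$$\text{MOE} = 1.96\,\sqrt{\frac{p(1-p)}{n}}\,\sqrt{\frac{N-n}{N-1}} = 1.96\,\sqrt{\frac{0.5 \times 0.5}{50}}\,\sqrt{\frac{205-50}{205-1}} \approx 0.12,$$

or about 12 percentage points in either direction. That is too imprecise to distinguish a rare practice from a common one within the subgroup; a census with near-complete response avoids this sampling error altogether.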
In summary, the survey will include a census of Upward Bound projects because of the higher anticipated response rate, the need for detailed data and a sample sufficiently large for subgroup analysis, and the small marginal cost to the government of including all projects rather than a smaller sample. In addition, given that it has been more than 15 years since the last national survey of Upward Bound projects, we believe that a survey including all Upward Bound projects is warranted.
B.2. Describe the procedures for the collection of information including statistical methodology for stratification and sample selection, estimation procedure, degree of accuracy needed for the purpose described in the justification, unusual problems requiring specialized sampling procedures, and any use of periodic (less frequent than annual) data collection cycles to reduce burden.
The survey will include the current universe of Upward Bound grantees so no sampling will be employed. No special statistical procedures will be used in the analysis, only tabulations of means and distributions overall and for key subgroups of grantees. Because we expect response from close to 100 percent of the grantees, tests of statistical significance for group differences are not warranted.7 This is a one-time data collection effort, with the most recent similar effort conducted more than 15 years ago.
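As a minimal, hypothetical sketch of the tabulations described above and of the fallback significance tests noted in footnote 7, the fragment below shows how subgroup means and, if the response rate fell below 95 percent, chi-square and t-tests might be computed. The file name, column names, and subgroup labels are illustrative placeholders, not items drawn from the study’s instrument or data dictionary.

```python
# Illustrative sketch only: tabulations of survey responses overall and
# by grantee subgroup, with the fallback tests described in footnote 7.
# The file and column names ("ub_survey_responses.csv", "institution_type",
# "offers_tutoring", "weekly_hours") are hypothetical placeholders.
import pandas as pd
from scipy import stats

df = pd.read_csv("ub_survey_responses.csv")

# Planned analysis: simple tabulations of means and distributions,
# overall and for key subgroups of grantees.
overall_rate = df["offers_tutoring"].mean()
subgroup_rates = df.groupby("institution_type")["offers_tutoring"].mean()

# Fallback if fewer than 95 percent of the 820 grantees respond:
# treat respondents as a sample and test subgroup differences.
if len(df) / 820 < 0.95:
    # Chi-square test of independence for a categorical item.
    table = pd.crosstab(df["institution_type"], df["offers_tutoring"])
    chi2, p_chi2, dof, expected = stats.chi2_contingency(table)

    # Welch t-test for a continuous item across two grantee types.
    two_year = df.loc[df["institution_type"] == "2-year", "weekly_hours"]
    four_year = df.loc[df["institution_type"] == "4-year", "weekly_hours"]
    t_stat, p_t = stats.ttest_ind(two_year, four_year, equal_var=False)

    print(f"chi-square p = {p_chi2:.3f}, t-test p = {p_t:.3f}")

print(f"Overall: {overall_rate:.2%}")
print(subgroup_rates)
```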
The proposed plan for deployment of the grantee survey and subsequent analysis is summarized in Table 1.
Table 1. Anticipated Schedule of Data-Collection and Reporting Activities
Weeks from Launch | Activity
Launch (estimated 6/1/2013) | Email and hard-copy invitations mailed
Weekly, weeks 2-6 | Email reminder messages
Weeks 3-6 | Begin reminder phone calls to non-responding project directors (estimated 60% of sample)
Weeks 6-7 | Send letter plus hard-copy survey to non-responders, with the option to complete the hard copy, return it by fax, complete the survey on the Web, or call in to DIR (estimated 30% of sample)
Weeks 7-8 | Troubleshoot final non-respondents with senior staff calls
Weeks 2-8 | Prepare data files for analysis
Week 10 | Internal memo summarizing initial findings
Week 44 | Final data file with documentation
B.3. Methods to Maximize Response Rates

The study will aim to achieve a response rate of at least 95 percent, with the expectation of a 100 percent response rate, since grantee participation in a national evaluation data collection is required under ED’s grant regulations (EDGAR). We expect policy officials at OPE to communicate this requirement to Upward Bound grantees. All data for this study will be collected through a web-based survey. Our approach to implementing web-based surveys will help ensure that we attain the high response rate needed to minimize non-response bias. The study team will employ a number of features in the design and administration of the survey that have been demonstrated to be effective in web-based surveys. These features include:
Clearly identifying the survey sponsorship and support by key stakeholders. The letter requesting completion of the survey will be sent by the Office of Postsecondary Education and will clearly identify the need for and goals of the survey. The study team will also seek the endorsement of the Council for Opportunity in Education, whose active support of the study could help improve the survey items and be instrumental in helping achieve high response rates.
Providing multiple modes for completion. Respondents will be able to complete the survey online, on hard copy, or by phone.
Making use of reminders. The study team will send weekly reminder emails, as well as reminder phone calls and mailed reminders as needed.8
B.4. Tests of Procedures

During the 60-day comment period, the instrument was refined through comments and input solicited from seven commenters in total: three project directors, the advocacy group representing UB and other TRIO programs, and three college access experts serving as a Technical Working Group. Their feedback has resulted in a substantial revision of the survey instrument that is still consistent with our burden estimates. The study contractor will pilot the web-based version of the revised survey with no more than eight respondents during the OMB review period to ensure that the survey is easy to navigate, functions well, all skips operate appropriately, and the resulting data file is accurate. Based on the nature of the comments received from current project directors and the TRIO community, and the revision of the instrument in response to those comments, we expect that any further changes to the instrument will be primarily operational. For more information on the individuals who provided feedback and the nature of their comments, please see Supporting Statement A, section A.8.
B.5. Individuals Consulted on Statistical Aspects of Design

The data-collection and analysis plans were developed by Decision Information Resources, Inc. (DIR) and Abt Associates.
The following individuals will be responsible for data collection and analysis:
Dr. Russell Jackson, Decision Information Resources, Inc., 713-650-1425
Dr. Sylvia Epps, Decision Information Resources, Inc., 713-650-1425
Dr. Carol Pistorino, Decision Information Resources, Inc., 713-650-1425
Dr. Rob Olsen, Abt Associates, 301-634-1716
Ms. Radha Roy, Abt Associates, 301-347-5722
Dr. Eleanor Harvill, Abt Associates, 301-347-5638
1 The original OMB submission (1850-NEW (04471)), for site visits to 20 Upward Bound grantees, was submitted on 4/20/11 and later withdrawn.
2 U.S. Department of Education, Office of the Under Secretary, Policy and Program Studies Service, The Impacts of Regular Upward Bound: Results from the Third Follow-Up Data Collection, Washington, D.C., 2004.
3 This type of quasi-experimental study would not yield reliable or conclusive results because of the inability to fully isolate the effects of any particular implementation strategies. Upward Bound projects differ in the students they serve, the relationship between the grantee “host institution” and its participating high schools, the level of external resources, and many other factors in addition to their approaches to implementing core components and services. To measure the effects of implementing a particular (potentially promising) approach to a UB component an evaluation would need to statistically “control for” everything else that might differ among projects and influence outcomes; there would inevitably be unknown or not easily measured factors that could have a powerful effect on outcomes.
4 IES has also awarded a separate contract with an option to conduct an RCT of a promising practice within Upward Bound, once such a practice or strategy can be identified under the current contract.
5 Moore, Mary T. “The National Evaluation of Upward Bound: A 1990's View of Upward Bound Programs Offered, Students Served, and Operational Issues.” Washington, D.C.: U.S. Department of Education, 1997.
6 During the 60-day public comment period, comments were received from four members of the TRIO community, including the advocacy group representing UB and other TRIO programs. Input was also solicited from three experts. In total, comments and input came from no more than eight individuals.
7 In the event that less than 95 percent of the universe completes the survey, we will assume that we have a sample of the universe and use appropriate tests of statistical significance (t-tests and chi-square tests) for comparing results across groups.
8 Millar, Morgan M., and Donald Dillman. “Improving Response to Web and Mixed-Mode Surveys.” Public Opinion Quarterly, Volume 75, Number 2, Summer 2011; Heerwegh, Dirk. “Effects of Personal Salutations in Email Invitations to Participate in a Web Survey.” Public Opinion Quarterly, Volume 69, Number 4, Winter 2005; http://survey.sesrc.wsu.edu/dillman/papers/websurveyppr.pdf, accessed 6 November 2002.