Section B. Collection of Information Employing Statistical Methods
As indicated in Part A of this OMB submission, this study will examine the relationship between the Department's investments in technical assistance support for grantees and the quality of grantees' evaluation and performance reporting. The study will initially focus on two grant programs within the Department's Office of Innovation and Improvement—the Charter Schools Program: State Educational Agencies (CSP SEA) program and the Voluntary Public School Choice (VPSC) program. The major research questions are listed below; the full set of questions and subquestions is included in Section A.
What technical assistance do Office of Innovation and Improvement grantees receive to help them improve the quality of their evaluations and performance reporting?
How do grantees examine project outcomes and effectiveness?
How are grantee evaluations and performance reporting used to inform project improvement and federal policymaking?
B.1. Respondent Universe and Sampling Methods
The data collection process will include telephone interviews with grantee project directors, data managers, and evaluators. It will also include interviews with federal program directors and staff, technical assistance contractors, and grant monitoring contractors; these latter interviews are not subject to OMB approval. The role of each of these groups is described in Section A.
The universe of grantees for the study consists of all 14 currently active VPSC grants funded in 2007 and all 33 currently active or recently completed CSP SEA grants (those initially funded between 2005 and 2011). Each of the 33 included CSP SEA grantees has received one or more renewal grants, in some cases dating back to 1995. The only CSP SEA grantees excluded from the study are seven states whose final grants ended in 2007 or earlier; because the study focuses on current technical assistance, those grants fall outside the study universe. Of the 14 VPSC grants funded in 2007, half were continuations of 2002 awards and the remaining seven were newly funded in 2007.
The document review will include the entire universe of grantees. The team will use the results of the document review and the interviews with federal staff and contractors to develop a strategy for selecting which grantees to interview. In consultation with the Policy and Program Studies Service (PPSS), the study team will purposively select approximately 15–20 grantees for interviews, stratifying on:
Program (CSP and VPSC);
Type of evaluation and performance measures (reporting on inputs or outputs only, descriptive design, one-group pre-post design, or quasi-experimental or experimental design); and
Timing of grant (current grant versus completed grant).
Grantees whose project directors have been involved since the beginning of the grant will also be targeted, because those project directors are more likely to be familiar with the application process, any early technical assistance, and any resulting changes. This selection process will result in a diverse sample of grantees. No comparison group will be used for this study.
The use of a purposive sample to select grantees for interviews will ensure balance across the two programs and on key grantee characteristics (such as type of evaluation and performance measures and timing of grant). The resulting sample, however, is not necessarily representative of all grantees, so findings generated from the grantee interviews are not generalizable to all CSP SEA and VPSC grantees. The selection process will nonetheless ensure that the sample includes a diverse set of grantees from both programs. Further, similar topics will be addressed across interviews with all types of respondents, including grantee project directors, grantee evaluators, technical assistance providers, grantee monitors, and federal program office staff. The findings from these interviews will be triangulated to provide a complete account of grantees' experiences with performance reporting and evaluation, as well as the technical assistance provided in these areas.
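To make the stratified purposive selection described above concrete, the following minimal sketch (in Python) shows one notional way to balance a sample across the three stratification dimensions while preferring grantees whose original project director is still involved. The grantee records, field names, target sample size, and tie-breaking rules shown here are hypothetical illustrations for exposition; they are not the study team's actual selection procedure, which will be developed in consultation with PPSS.

    # Illustrative sketch only: a notional stratified purposive selection.
    # Records, field names, and the target of 18 grantees are hypothetical.
    import random
    from collections import defaultdict

    # Hypothetical grantee records:
    # (grantee_id, program, evaluation_type, grant_timing, original_pd_still_involved)
    grantees = [
        ("G01", "CSP SEA", "quasi-experimental or experimental", "current", True),
        ("G02", "VPSC", "descriptive", "completed", False),
        ("G03", "CSP SEA", "one-group pre-post", "current", True),
        # ... one record per grantee in the 47-grantee universe ...
    ]

    def stratified_purposive_sample(records, target=18, seed=1):
        """Group grantees by program, evaluation type, and grant timing,
        then draw from the strata in rotation, preferring grantees whose
        original project director is still involved (see Section B.1)."""
        rng = random.Random(seed)
        strata = defaultdict(list)
        for rec in records:
            _, program, eval_type, timing, _ = rec
            strata[(program, eval_type, timing)].append(rec)

        # Within each stratum, list grantees with an original project
        # director first; break ties randomly so order of input is irrelevant.
        pools = []
        for pool in strata.values():
            pool.sort(key=lambda r: (not r[4], rng.random()))
            pools.append(pool)

        selected = []
        # Rotate across strata so the sample stays balanced on all dimensions.
        while len(selected) < target and any(pools):
            for pool in pools:
                if pool and len(selected) < target:
                    selected.append(pool.pop(0))
        return selected

    sample = stratified_purposive_sample(grantees, target=18)

Drawing in rotation across strata is only one way to operationalize "balance"; a purposive design could equally set explicit per-stratum quotas or apply additional judgment-based criteria.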
B.2. Procedures for the Collection of Information
The study team will collect two forms of data: 1) extant documents and materials from federal program offices, and 2) primary data in the form of semi-structured interviews.
B.2.1. Collection of Extant Documents
Although the document review is not part of this OMB request, the procedures for collecting those data are described below to show how the two data collection activities will be used in tandem to gather all necessary data efficiently. The majority of the extant documents will be requested from the CSP SEA and VPSC program offices. The study team has contacted the program offices to request document inventories for each of the 47 grantees in this study, consisting of grantee applications, annual and final performance reports, and grantee monitoring reports.
Although the CSP SEA and VPSC programs do not require evaluations, and thus do not require grantees to submit them even when completed, some grantees do submit local evaluation reports to their program office. The study team has requested all such evaluations; where there is reason to believe an evaluation has been completed but not submitted, the study team will request it from grantees and contractors during the interviews.
The study team will also collect the evaluation planning guidance documents that the technical assistance contractor provided to grantees. These documents will be requested from the program offices and the technical assistance contractor prior to the interviews with grantees.
B.2.2. Collection of Interview Data
Additionally, this study will collect primary data via interviews with grantee project directors, grantee data managers, and grantee evaluators. The grantee interviews are the subject of this OMB request. To help inform the study and prepare for the grantee interviews, interviews will also be conducted in advance with federal program directors and staff, technical assistance contractors, and grant monitoring contractors. Appendix C includes a crosswalk of key concepts covered across the interviews, and Appendix D includes topic guides for the interviews that do not require OMB approval.
The members of the study team will obtain lists of names and contact information for grantee project directors to be interviewed from the federal program offices. The study team will then send the grantee project directors a letter via email informing them about the study. This letter will include statements of support from the federal program directors and will inform grantees that a member of the study team will contact them to set up a time for an interview. During the subsequent telephone or email contact, the study team will identify a time and date for the interview with the project director and data manager, collect contact information for the evaluator, if applicable, and answer any questions the project director may have about the study. Because only a few interview topics pertain to data collection, the study team anticipates interviewing the project data manager along with the project director. Independent evaluators will be interviewed separately.
Semi-structured telephone interviews with the grantee project directors, data managers, and evaluators will be conducted in fall 2012, after the study obtains relevant OMB approvals. Each telephone interview is expected to last between 30 and 60 minutes. Careful notes will be kept throughout each interview, but to ensure that all respondents' answers are captured accurately, the interviews will be recorded. Interviewees will be informed that the interviews are recorded for data accuracy, that the recordings will be stored securely and available only to members of the study team, and that they may opt not to have their interviews recorded. Experienced interviewers will conduct the interviews, so the protocols will serve as guides rather than verbatim scripts. Prior to beginning the interviews, all study team members participating in the interviews will attend a group training to familiarize them with the protocols and to address any questions or concerns regarding the interview purpose and procedures.
Because grantees have likely had similar experiences with performance reporting, evaluation, and technical assistance, interviews will be conducted with the 15–20 grantees selected through the purposive sample described in Section B.1. While the purposive sample will yield a diverse set of grantees, any findings generated from the interviews will be limited to the grantees interviewed and will not be generalizable to all grantees.
As for the interviews with respondents that do not require OMB approval (federal staff and contractors), the study team has obtained the contact information for the federal program staff who will be interviewed; some preliminary interviews with federal employees were conducted in fall 2011, and the remaining interviews with federal employees and contractors will be conducted in spring and summer 2012. The technical assistance contractor is the same for both the CSP SEA and VPSC programs, but the study team requested specific contact information from each program office because technical assistance may have been provided to each program's grantees under different management within the single contractor. The study team likewise requested the contact information of the grant monitoring contractor—also the same for both programs—from each program office to ensure that the correct individual(s) are interviewed.
Interviewees will be assured that the names of individuals and grantees will be suppressed in the study's final report. All interview data will be securely maintained to protect the identities of individuals and grantees. The study team has conducted numerous projects involving sensitive information; consequently, the institutions and all project staff employ both electronic and physical safeguards to protect data from unauthorized access. Electronic project directories, files, and databases are accessible only to project staff and are protected by discretionary access control lists, group memberships, passwords, and locking workstations. Access to the data processing area and database servers is limited to authorized personnel, and building security staff are on site at all locations 24 hours a day, seven days a week. To protect against data loss, the study team's information technology protocols use automated, redundant backup procedures and file management techniques to ensure that files are not inadvertently lost or damaged.
B.3. Methods to Maximize Response Rates and Deal with Nonresponse
Since all of the grantees included in this study are current or recent grantees, the study team expects to be able to collect current contact information for nearly all grantee project directors. Missing contact information is unlikely to be a concern for current grantees, although it may be for recently completed grants. If contact information is not available through the federal program offices, the study team will use multiple methods to locate grantees, including telephone, email, and/or mail follow-up with the original grantee contact and Internet searches on grantees' names and institutions.
Given that the majority of grants are currently active, the study team anticipates that nearly all potential respondents will agree to participate. Grantees are required to participate in this study under the Education Department General Administrative Regulations (EDGAR) § 74.53. Further, the small sample size will allow the study team to follow up with nonrespondents via email and phone calls to encourage participation.
Prior to data collection, the study team will provide respondents with early notification regarding the study. Specifically, the notification will include an introductory letter, signed by the program director and the PPSS director, that explains the purpose and importance of the study, the topics that will be addressed during the interviews, and the expected use of study findings at the federal level.
After grantees receive notification of this study, the study team will contact them via email and/or telephone to briefly describe the purpose of the study, identify project staff who should be in attendance during the interview, request the grantee project director’s assistance in providing contact information for the evaluator, if applicable, and find a suitable time to conduct an interview. Evaluators will be notified of the study via email once contact information has been obtained. Those respondents will receive materials similar to those sent to the grantee project directors.
The initial contact from the study team prior to the interview will permit respondents to allocate adequate time for the interview, increase respondent “buy-in,” enable them to ask questions before participating in the study, and allow the study team to establish rapport with any reluctant respondents. Throughout the data collection cycle, the study team will provide a study team telephone number and email address to ensure that potential respondents can easily and quickly obtain answers to questions. In cases where it is not possible to interview grantee respondents, the study will rely on interviews with federal program directors and staff, technical assistance contractors, and grant monitoring contractors in combination with extant data to draw preliminary conclusions about nonrespondent grantees.
B.4. Tests of Procedures or Methods
The interview protocol for local grantee staff and evaluators was carefully designed to ensure clarity and minimize redundancy. The grantee and evaluator instrument was pilot tested in early summer 2012 with three individuals from two grantees (one from each of the two programs). At the conclusion of the pilot interviews, respondents were asked to describe any difficulties encountered during the interview; none reported difficulties, expressed concern about the structure or content of the interview, or recommended changes to the protocol, and their answers indicated a high level of question clarity. The interviews lasted approximately one hour each, as expected, and the protocol allowed interviewers to tailor the content and structure of the interview to respondents' knowledge and experiences with ease. Therefore, the protocol was not changed as a result of the pilot testing.
B.5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
The contractors for collection and analysis of data are SRI International and Abt Associates Inc. Staff have knowledge of statistical methods, experience in evaluation of research programs, and expertise in scientific research. No other outside experts were consulted in the design of the study.
The key personnel who will collect and analyze the study data are:
Abt Associates Inc.:
Ellen Bobronnikov, 617-349-2718
Tamara Linkow, 617-520-2978
Michelle Ciurea, 617-520-2785
Ricky Takai, 301-634-1765
Robert Olsen, 301-634-1716

SRI International:
Dan Aladjem, 703-247-8542
Marianne Bakia, 703-247-8571