Att_Revised TIF OMB Supporting Statement B response

Evaluation of the Teacher Incentive Fund (TIF) Program: Data Collection Instruments

OMB: 1875-0256


Request for Clearance of Data Collection Instruments for the

TEACHER INCENTIVE FUND EVALUATION






II. Supporting Statement for Paperwork Reduction Act Submission

B. Collections of Information Employing Statistical Methods

1. Respondent Universe and Sample Selection

The current request includes three phases of data collection: (1) telephone interviews with all 34 TIF grantees in 2009, (2) site visit interviews with a sample of 12 TIF grantees in 2010, and (3) site visit interviews with another sample of 12 TIF grantees in 2011 (which may include a subset of those visited in 2010). We also plan to develop a survey component for teachers and principals, which we anticipate will be submitted in late 2009 or early 2010 as a revision to this collection. Additionally, if the optional outcomes study is deemed feasible and exercised, we will submit the proposed study design as a revision to this collection in late 2010.

Respondents for all three data collections will represent the range of participants and stakeholders in each grantee (key informants). These key informants will include TIF project staff members (e.g., project directors, evaluators), educators (e.g., teachers, principals), administrative staff (e.g., district superintendents, State administrators), and other stakeholders (e.g., representatives of partner organizations, representatives of parent organizations, teacher association officials). The study team will work with the TIF project director at each grantee to identify key informants.

The telephone interviews to be conducted in fall 2009 will provide basic data on each of the 34 TIF projects and provide the foundation for more in-depth examination of implementation issues.

We will use the site visits to delve more deeply into the program components and system supports identified during the telephone interviews. Our proposed approach is designed to document and explain varying levels of implementation consistency with project plans and the authorizing legislation; long-term implementation issues and challenges; and the principal, teacher, and student outcomes of performance-pay programs across all grantees.

Site Visit 1

Based on the information we learn in the telephone interviews and a thorough review of extant grantee documents, we will produce detailed grantee profiles from which we will select a sample of projects for Site Visit 1 (2010). We are proposing a stratified random sample of active grantees. We will remove any grantee site in which we find the project was not implemented (based on conversations with the TIF program office) or in which the grantee's project ceases to operate over the course of this study. Although these non-implemented or discontinued projects would not be included in the case studies, we will capture valuable implementation data from the telephone interviews and will include extensive information on the reasons for any issues with implementation or continued operation.

From the population of active grantees that have implemented plans, we will randomly select four projects where 50 percent or more of the TIF grant is focused on performance pay for improved student learning, four projects implementing broader forms of differentiated compensation (e.g., to attract teachers to hard-to-staff schools), and four projects that have comprehensive systems that combine performance pay for improved student learning and other forms of differentiated compensation. This approach is designed to yield a sample that represents the range of implementation fidelity, but does not waste evaluation resources on studying sites where implementation never occurred or was not sustained through the grant period.

Exhibit B1. Stratified Random Sample for Site Visit 1

Performance Pay     Differentiated     Comprehensive
4 grantees          4 grantees         4 grantees



The actual sample will be developed in consultation with the Technical Working Group (TWG) and will depend in part on the distribution of different strategies across the population of eligible projects. If possible, we will also consider urbanicity and the strength of local bargaining units as additional sampling criteria.
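For illustration only, the stratified draw for Site Visit 1 could be sketched as follows. This is a hypothetical sketch, not part of the approved study procedures; the grantee identifiers, stratum sizes, and random seed are invented for the example.

```python
import random

# Illustrative (invented) grantee IDs grouped by the three program-type strata.
strata = {
    "performance_pay": [f"PP-{i:02d}" for i in range(1, 12)],
    "differentiated": [f"DC-{i:02d}" for i in range(1, 10)],
    "comprehensive": [f"CS-{i:02d}" for i in range(1, 14)],
}

def stratified_sample(strata, per_stratum=4, seed=2010):
    """Draw an equal-size simple random sample from each stratum.

    A fixed seed makes the draw reproducible, which supports
    auditability of the sampling step.
    """
    rng = random.Random(seed)
    return {name: rng.sample(members, per_stratum)
            for name, members in strata.items()}

sample = stratified_sample(strata)
# Four grantees per stratum, 12 in total, matching Exhibit B1.
```

Excluding non-implemented or discontinued projects amounts to filtering each stratum's list before the draw, as described above.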

We are aware that sampling plans that do not include discontinued projects may undermine the strength of inferences about factors associated with project success because the presence or absence of those factors is never investigated in this excluded subset. In fact, this shortcoming plagued the early literature on effective schools. However, gathering extensive information in telephone interviews will likely mitigate this potential weakness. Furthermore, by focusing on those grantees that have enacted their plans, the first round of case studies will be able to document the long-term implementation challenges—an area not well understood in the current research literature on performance pay.


Site Visit 2

The sampling approach we recommend for the second site visit (in 2011) would stratify by project outcome in addition to the strata described for Site Visit 1. This would require conducting the optional outcome analysis prior to the second round of case studies so that sampling can be based on grantees' observed outcomes. (Note: The outcome analysis is described in the introduction to Supporting Statement A.) This approach would ensure that the evaluation gathers in-depth data on the grantee plans, practices, and contextual factors that lead to a range of student and teacher outcomes. Thus, as part of our commitment to advancing the research base on performance pay, the second round of case studies will focus on explaining outcomes, another neglected area of the research literature. We propose to select a stratified random sample of three high-performing projects and one low-performing project (with performance based on findings from the outcomes study) from each of the three types of grantee programs. The following table illustrates one scenario under this approach.


Exhibit B2. Stratified Random Sample for Site Visit 2


                    Performance Pay     Differentiated     Comprehensive
High Performing     3 grantees          3 grantees         3 grantees
Low Performing      1 grantee           1 grantee          1 grantee


This strategy will provide comprehensive data on implementation consistency with authorizing legislation and project plans; long-term implementation challenges; and the principal, student, and teacher outcomes associated with performance pay.
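The Site Visit 2 draw adds a performance stratum on top of the program-type strata. A minimal sketch follows, again with invented grantee IDs and under the assumption that the outcomes study has already classified each grantee as high or low performing; none of these names or values come from the study materials.

```python
import random

QUOTAS = {"high": 3, "low": 1}  # per program type, as in Exhibit B2

def site_visit_2_sample(classified, seed=2011):
    """classified maps (program_type, performance) -> list of grantee IDs.

    Draws three high-performing and one low-performing grantee from
    each program type (12 grantees in total across the three types).
    """
    rng = random.Random(seed)
    return {key: rng.sample(ids, QUOTAS[key[1]])
            for key, ids in classified.items()}

# Invented example: each cell holds more candidates than its quota.
classified = {
    (ptype, perf): [f"{ptype[:2].upper()}-{perf[0].upper()}{i}" for i in range(1, 6)]
    for ptype in ("performance_pay", "differentiated", "comprehensive")
    for perf in ("high", "low")
}
sample = site_visit_2_sample(classified)
```

The quota table mirrors Exhibit B2: 3 + 1 grantees from each of the three program types yields 12 grantees in total.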

At this point, the total number of grantees to be included in the site visits is not yet known. The same 12 sites may be included in both Site Visit 1 and Site Visit 2, or there may be no overlap, yielding 24 different grantees across the two sets of site visits.


2. Data Collection

The sampling issues related to data collection activities are covered in the previous section; the data collection tasks and deliverables are described in Exhibit A2.



3. Methods to Maximize Response Rates

Grantees were made aware of this evaluation in their original applications, points were awarded for agreeing to participate, and all grantees proposed participating in the evaluation. In addition, members of local TIF grantee project teams were introduced to the study (and its leaders) at the annual grantee meeting hosted by CECR in the summer of 2009. The TIF program office will also provide support for the study and convey its importance to grantees through its regular communication with them. Initial communication for each of the three data collection phases will be made with the project director at each TIF grantee, as identified in documents provided by the grantee.


Given the support and endorsement of this evaluation by the TIF program office and the grantees' knowledge of and expected participation in this evaluation, we anticipate cooperation and participation by all grantees. As mentioned above, TIF project directors will be the first point of contact in all three phases of data collection. They will also be instrumental in helping to identify and contact key informants. Multiple attempts will be made to reach identified respondents, including phone calls, emails, and follow-up by project staff. We will work with the project directors to identify the most appropriate way to gain access to school employees and will provide scheduling templates and funding for substitute coverage when necessary. In the rare cases when identified informants cannot be reached or are unavailable during the period of the telephone interviews or site visits, we will work with the project directors to identify alternate informants.


As described above, key informants will be identified with help from project directors and will be contacted by experienced and well-trained interviewers who will introduce the study by providing relevant background and rationale. In similar studies we have found that interviews such as these provide a venue for respondents to share experiences and contribute to the body of knowledge, which motivates many respondents. In addition, we have taken the following steps to maximize participation and minimize respondent burden:

  • We have worded all data collection instruments as concisely as possible. To the extent possible, we will coordinate data collection activities within the evaluation team in order to ensure that these activities impose a manageable burden on respondents, while yielding data that collectively answer the evaluation questions of most interest to policy-makers and the field.

  • Prior to data collection, we will send letters of introduction to project directors informing them of the study and describing all relevant data collection activities. The letters will include: (1) contact information for evaluation team staff who can answer questions about the study, (2) information about OMB clearance, and (3) contact information for the TIF program office.



4. Protocol Development and Review

Prior to finalizing the attached protocols, we conducted pilot interviews and solicited feedback from our TWG. In collaboration with the TIF program office we identified three TIF sites from which to solicit feedback on the three draft interview protocols. We shared the protocols with members of the community in three TIF grantees: (1) Guilford County Schools (NC); (2) Harrison School District Two (CO); and (3) the South Carolina Department of Education (SC). We also sent the protocols to all members of the TWG (Drs. Goldring, King, Loeb, Rice, Schochet, Springer), and requested an extensive review from TWG members Dr. Suzanne Wilson (a qualitative researcher with expertise in teacher development) and Dr. Dan Goldhaber (a researcher with expertise on educator labor markets and performance pay who uses quantitative and qualitative methods in his work).


The results of these reviews suggest that the questions were clear. Reviewers reported that the strategy of using data available from extant program monitoring to customize the protocols reduced possible redundancy between program monitoring and our data collection strategy. Reviewers' greatest concern was that we find a way to facilitate project directors' gathering of quantitative data on the award structure and payouts. Reviewers were also concerned about the potential for grantees with pre-existing performance-pay programs to intertwine multiple performance-pay programs in response to questions. To address these issues, we revised the protocols to streamline the gathering of data on project design and implementation of payments (by creating the Award Structure and Payout Form and adding the option of sending us an electronic file with the necessary data instead of completing that form). We also modified our initial questions to make clear the distinction, especially when interviewing project directors and other leaders, between TIF and other pre- and co-existing performance-pay projects. The review process and protocol revision were completed prior to submitting the protocols to OMB.



We also found that the training we provided to the small number of researchers conducting pilot interviews (who represent the range in experience of those who will be conducting the actual interviews) was sufficient to enable them to gather necessary data.



5. Contact Information

The contact person at the U.S. Department of Education is Dr. Andrew Abrams. The primary contractor of this study is SRI International, based in Menlo Park, CA. Berkeley Policy Associates, based in Oakland, California, and the Urban Institute, based in Washington, DC, are the subcontractors. The principal investigator of the study is Dr. Daniel Humphrey, and the project director is Dr. H. Alix Gallagher. Data collection will be conducted by researchers from SRI International, Berkeley Policy Associates, and the Urban Institute under the direction of Dr. Humphrey. The contact information for these individuals is as follows:



Andrew Abrams, Ph.D.

U.S. Department of Education

Policy and Program Studies Service

Phone: 202-401-1232

Email: Andrew.Abrams@ed.gov



Daniel Humphrey, Ed.D.

SRI International

Phone: (650) 859-4014

E-Mail: Daniel.Humphrey@sri.com


H. Alix Gallagher, Ph.D.

SRI International

Phone: (650) 859-3504

E-Mail: Alix.Gallagher@sri.com








