APPENDIX D
Evaluation of Teen Pregnancy Prevention Replications
Program Name:
Program Location:
Sponsoring Organization:
Individual(s) Interviewed: (names and titles)
Contract Staff (as appropriate):
Date of Communication:
Instructions for Site Visitors
This visit has a set of very specific goals:
To ensure that we have a complete and up-to-date understanding of the roles and responsibilities of the grantee and partners and their staff;
To expand our understanding of aspects of readiness and preparation that would support strong replication of a program model (preliminary information has been abstracted from the grant proposal and other extant documents and is incorporated into the profile you received. Please make sure that you have read this and are thoroughly familiar with it so that you can probe for updated information and identify incorrect information);
To understand the plan for replication of the program model, the adaptations that were approved and made, and the extent to which the replication was implemented as planned (again, you will have been given a summary of the replication plan as contained in the proposal and updated in the request for continuation funding. Your task will be to use discussion and direct observation to determine how the program is actually implemented and to determine the extent to which the various aspects of the replication were implemented as planned);
To understand the local context in which the replication is being implemented (this includes the school or agency environment as well as the social structure, behavioral norms, and availability of resources and services in the local community);
To understand the ways in which the grantee changed or adapted aspects of the replication plan in response to local needs or pressures;
To understand the challenges encountered in replicating the program model and in other aspects of implementation, the extent to which staff are able to address those challenges and the strategies they employ to address them; and
To document the services provided to the members of the control group.
These goals will have been articulated in correspondence with the sites before the visit, but you should reiterate them at the beginning of any discussion with staff. You should hand the following statement to everyone you interview and an appropriate version of it to youth participants in focus groups.
Thank you for taking time to talk about (Name of Program). As you know, we are conducting the TPP Replication Evaluation. As an important part of that effort, we are visiting programs that are participating in the evaluation two or three times during their grant period, in an effort to understand and document the process of replicating an evidence-based model in real-world settings, the challenges that arise, and how staff on the ground respond to those challenges. The information we gather will serve two purposes: it will help future program operators and policymakers understand what is needed to replicate a program with fidelity, that is, to implement it as intended with the populations originally targeted; and it will be used to help us understand variations in program impacts, where they occur.
You may, of course, choose not to discuss any topic or end the discussion at any time. We will combine the information from this visit and subsequent visits, with information from your program documents and the performance and fidelity data you have collected, to create a narrative account of your program and the process of replicating the program model you selected. At the same time, we will combine the information about your program with information about other programs in the study to identify consistent themes that apply more generally across a range of program types and replication efforts. Neither your name nor the names of any individuals will be reported, and the notes we take about our discussions will not be shared with or provided to the federal government or anyone else except the members of the evaluation team.
A. Readiness/Preparation: Supervisor background and sponsoring agency readiness
A1. Education and experience of supervisor
Probes: Can you tell me about yourself – how long you have been with the agency, what you were doing before you came here? What aspects of your education and experience do you see as helpful for this job? Experience with youth programs? Sexual health services or interventions? Social services?
A2. Agency position in and penetration of community
Probes: What is your perception of how the agency is viewed in the community – in terms of its mission, the accessibility of its programs and services, its ability to reach and serve needy populations? Is it seen as a leader in the community or one of many? Is there opposition to agency programs such as this one? Who in the agency provides leadership and vision for the program and is the driving force behind it?
A4. Selection of program model for replication
Probes: Were you involved in the selection of the program model that you are replicating? What were the considerations in selecting (name of program model)? In what ways did it appear appropriate to the needs identified? Did you foresee any challenges in implementing this program model – if so, what were your concerns (agency policies, community opposition, school district concerns about aspects of the program)?
A5. External support for the program
Probes: Thinking about planning for implementing the program, what resources, if any, were there in the community to support the program (sexual health services, youth programs as sources for referral into the program or sources for additional services)? Were there organizations or individuals in the community you felt you could count on to support the program (school district or school staff, local government agencies, private agencies)? Have those expectations been realized? Did the individuals and organizations you counted on to provide resources or support in fact provide them?
B. Readiness/Preparation: Staffing
B1. Structure of program staffing
Probes: Can you help me understand how the project works, how the program is staffed, who you report to, who reports to you, the lines of supervision and accountability? In your view, are the staff configuration and numbers appropriate to mount a strong implementation of the program? If not, what additional staff do you think would make the program stronger – in numbers and types of staff?
B2. Recruitment and selection of staff for the program
Probes: How involved were you in planning the staffing of the project? If you were involved, what was your plan for staffing the program (supervisory vs. front-line staff)? Was your plan to use existing staff to implement the program or to recruit staff specifically for this program? What were the advantages and disadvantages of that decision? If the decision was to use existing staff, how did you select them, and what were the criteria for selection? If the decision was to hire new staff, how did you recruit them, and what qualifications and skills were you looking for?
B3. Staff training prior to implementation
Probes: What amount and kind of training did you feel was necessary for staff to have? What type and amount of training did they receive before the program began? Who provided the training? Did you participate in the training? Did you feel it was adequate? Were staff required to do any other type of preparation?
B4. Staff commitment
Probes: How committed are staff to this specific program? Do you think they believe in the program’s goals? Feel the activities and content are appropriate for the youth population they are working with? Did their feelings about the program change as a result of the pilot? In what ways?
C. Readiness/Preparation: Site-Specific Replication Plan
C1. Approved changes/adaptations to the program model (Use program profile and probe for changes or updates)
Probes: My understanding is that your proposed plan for replication was as follows (name target population, recruiting and retention strategies, staffing, program components, number and length of sessions, settings, delivery strategies). Have I missed anything? Did you make any subsequent changes to the plan with OAH approval? What were the reasons for the change(s)?
D. Implementation: Putting the Program in Place
D1. Settings for the program
Probes: Were you able to implement the program in the number and type of schools (other settings) that you planned? What obstacles did you encounter? Were you able to overcome them? How? If the obstacle remained, what changes did you make in your strategy for implementing the program? How has this affected the implementation of the program, your ability to recruit and retain youth, other aspects of the program?
D2. Staffing the program
Probes: Did you make any changes in program staffing as a result of the pilot year? What were they? What is the workload (case flow) for front-line staff? Is it more or less than you expected? What are the reasons for the difference? Have you lost any of your original staff? How many and over what period? What were their reasons for leaving the program?
D3. Target population
Probes: Are you serving the youth you planned to serve, in terms of numbers, characteristics, risk factors? If not, what barriers to your original plan did you encounter? What outreach strategies have you developed to recruit participants? How do you recruit youth for the program? Have you encountered problems with retention? What strategies have you developed to improve retention?
D4. Schedule for program activities
Probes: How is the program delivered? In how many sessions, of what length, and over what period of time? What challenges to scheduling the program did you encounter? How does scheduling affect retention? How does it affect your ability to deliver the program?
D6. Program components/activities
Probes: Have you been able to implement all the components/activities required by the program model (as adapted for the replication)? If not, which ones have you had to drop or modify? What were the reasons for the change?
D7. Gaps in/problems with program content
Probes: Are there activities or program content that are inappropriate for the population you are serving? That seem out of date? Are there gaps in content, information that your youth population needs that is not part of the program? How have you dealt with these issues?
D8. Satisfaction with program model
Probes: Overall, do you feel that the program model you are replicating is the correct choice for the youth population you are serving? If not, in what ways is it less than ideal? In retrospect, would you choose a different program model to replicate? Which one (or what characteristics would be important)?
D9. Response of participants
Probes: How engaged are youth in the activities/content of the program? What aspects of the program/activities/content are they most/least responsive to? Have you had any feedback from them about the program? What kinds of comments do they make about the program? Have you made any changes as a result of these comments? What kinds of changes did you make?
E. Implementation: Administrative and Supervisory Processes
E1. Working with partners
Probes: Were you able to work productively with the partners you originally proposed? What problems or barriers did you encounter? What roles did the partners play in implementing the program? Which partnerships were most effective?
E2. Decision-making and problem-solving processes and strategies
Probes: Who is involved in making decisions about the program, solving problems that arise? How do you bring front-line staff into the process?
E3. Maintaining school and community support
Probes: What have you done to maintain support for the program in schools (or community agencies)? What difficulties have you encountered?
E4. Rules and standards
Probes: In addition to the performance standards that OAH requires you to meet, are there other standards or rules that you have developed to ensure strong implementation of the program? What are they?
F. Support for Staff Performance
F1. In-service training for staff
Probes: Do you provide in-service training for your front-line staff? What type and amount do you provide? Who does the training? Do you get feedback from staff about the relevance and effectiveness of the training? What about new staff … how are they trained?
F2. Consultation and coaching
Probes: In addition to any in-service training, who can front-line staff go to for advice, consultation? Does this happen as a regularly scheduled activity, or as needed?
F3. Monitoring, evaluation and feedback
Probes: Who is responsible for monitoring staff performance, in particular monitoring fidelity to the program model and effectiveness of delivery? How is that information used, in addition to reporting it to OAH? Is it used to provide feedback to front-line staff? Who provides the feedback and on what schedule? What has been staff reaction to the monitoring tools and any feedback? Do they find it helpful? Do they believe that the monitoring tools assess performance accurately?
F4. Staff workload
Probes: What is the workload of your front-line staff (number of clients/number of sessions or groups per week)? Is that too much, too little, or just right, in your view? How could it be improved?
G. Community Context
G3. Community attitudes toward the problem of teen pregnancy
Probes: What are the prevailing attitudes towards adolescent sexual and other risk behaviors? What are the beliefs about teen pregnancy (e.g., a large problem, a manageable problem)? Are teen sexual behavior and pregnancy perceived as problems by members of the community?
G4. Visibility of the program and community response
Probes: Is this program (highly) visible in the community? What is the level of community support for and/or opposition to the program from schools/school supervisors/community leaders? What are the sources of support for and/or opposition to the program from schools/school supervisors/community leaders? Have you received any positive or negative messages about your program? Are there particular components of the program that are perceived positively or negatively by the community?
Abt Associates Inc. Appendix D: Discussion Guide for Use with Supervisory Staff