Instructions for Implementation Study Analysis Plan Template for HMRE Award Recipients Conducting Impact Studies
Month, Day, Year
Contents
The Implementation Study Analysis Plan for Impact Studies
Instructions for completing the implementation study analysis plan template
The Administration for Children and Families (ACF), Office of Family Assistance (OFA) is requiring that all Healthy Marriage and Relationship Education (HMRE) award recipients with local impact evaluations funded by OFA provide an implementation study analysis plan for their evaluations, in addition to an impact evaluation analysis plan (see additional template and instructions). In any rigorous impact evaluation, it is important to tell a clear, succinct story about program implementation. The implementation findings (1) describe the services offered to and received by people in the treatment group and people in the control or comparison group, (2) contextualize the impact findings, and (3) generate hypotheses about why the program did or did not have a positive impact. Developing a structured implementation study analysis plan before examining the implementation data will foster an efficient and effective approach for analyzing the data and reporting the findings.
Please complete this implementation study analysis plan in addition to an impact evaluation analysis plan. This document provides instructions for completing the implementation study analysis plan. Award recipients must provide information on all required sections. Please do not structure your analysis plan to match the formatting of these instructions. Instead, please use the provided template (HMRE Implementation Analysis Plan Template).
Please email your implementation study analysis plan (together with your impact evaluation analysis plan) to your Federal Program Specialist (FPS) and copy your Evaluation Technical Assistant Partner (ETAP) liaison by [Insert DATE]. For consistency, please use this common file naming convention when submitting your implementation study analysis plan:
[HMRE Award Recipient Name] Implementation Analysis Plan.docx
Your FPS and ETAP liaison will review the analysis plan, provide comments and suggest edits, and return it to you for revisions. Your analysis plans must be revised and approved by your FPS.
ACF expects that evaluators will complete the analysis plans, with program directors and program staff adding input as appropriate. Therefore, these instructions are mainly directed to evaluators and include a few technical terms. For many of the sections, evaluators can draw from the most recently approved evaluation plan.
For the implementation study analysis plan, please describe the research questions you aim to answer, the data you will use to answer those research questions, and the methods you will use to analyze the implementation data and describe the findings. To the extent possible, please use tables to briefly summarize the required implementation information.
The focus of this implementation study analysis plan is to measure the intervention services received by the intervention group and the alternative services and services similar to the intervention that were received by the control or comparison group during the evaluation period (from study enrollment through the final follow-up interview), based on the data you have collected through surveys and nFORM. The analysis plan should focus on the services evaluation participants received, which might be a subset of all services provided and populations served under the HMRE award. Please discuss any questions about the focus of the implementation study with your ETAP liaison and FPS.
The research questions articulate the main hypotheses of your implementation study. Research questions can refer to implementation elements, such as fidelity, dosage, quality of implementation, engagement, and context, as the following examples show:
Fidelity: Were the intervention services and the control or comparison services (and each of their components, if applicable) delivered as intended?
Dosage: How much of the programming (or how many components, if applicable) did members of the intervention group and members of the control or comparison group receive?
Quality: How well were the services implemented or delivered to members of the intervention group and members of the control or comparison group?
Engagement: Did adults, couples, or youth in the intervention group and those in the control or comparison group engage in the provided services, and if so, how engaged were they?
Context: What other types of services are available to members of the intervention group and members of the control or comparison group? What external events or unplanned adaptations might have affected implementation of the intervention services and the control or comparison services as intended?
Table 1 lists examples (in italics) of research questions for each implementation element.
Table 1. Examples of research questions for each implementation element and study group
| Implementation element | Research question |
| --- | --- |
| *Intervention group questions* | |
| Fidelity | |
| Dosage | |
| Quality | |
| Engagement | |
| Context | |
| *Control or comparison group questions* | |
| Fidelity | |
| Dosage | |
| Quality | |
| Engagement | |
| Context | |
Describe the data sources you will use to answer the research questions (for example, nFORM, fidelity protocols, attendance logs). Describe the timing and frequency of each data collection effort (for example, during all sessions, once a week, annually), and the party responsible for collecting the data. Use a table to link the information on data collection to the research questions. Table 2 presents an example of such a table (sample text appears in italics).
Table 2. Examples of data for addressing the research questions
| Implementation element | Research question | Data source | Timing and frequency of data collection | Party responsible for data collection |
| --- | --- | --- | --- | --- |
| Fidelity | Were all intended intervention components offered and for the expected duration? | Workshop sessions in nFORM | All sessions delivered | Intervention staff |
| Fidelity | What content did the clients receive? | Fidelity tracking log or protocol; attendance logs; session observations | Every session for fidelity tracking and attendance logs; two times a year for session observations | Intervention staff for fidelity tracking and attendance logs; study staff for session observations |
| Fidelity | Who delivered services to clients? | Staff applications; hiring records; training logs | One time X months after start of implementation; annually | Intervention staff |
| Fidelity | What were the unplanned adaptations to key intervention components? | Adaptation request; work plan; six-month progress report; annual progress report | Annually; ad hoc | Intervention staff; study staff |
| Dosage | How often did clients participate in the intervention on average? | Workshop sessions and individual service contacts in nFORM; attendance logs | All sessions delivered | Intervention staff |
| Quality | What was the quality of staff–participant interactions? | Observations of interaction quality, using protocol developed by study staff | X percentage of sessions selected at random for observation | Study staff |
| Engagement | How engaged were clients in the intervention? | Observations of engagement, possibly using an engagement assessment tool; ratings from facilitator fidelity logs; engagement ratings from participant satisfaction surveys | Y percentage of sessions selected at random for observation | Study staff |
| Context | What other HMRE programming was available to study participants? | Interviews with staff from partnering agencies in the community; survey items on baseline and follow-up assessments; websites of other agencies in the community providing HMRE programming | Once a year; ad hoc | Study staff |
| Context | What external events affected implementation? | Interviews with community or county representatives; list of site or school closures | Once a year; ad hoc | Study staff |
Note: We use the word “clients” in this table for simplicity. These research questions should be adapted by replacing the term “clients” and specifying “intervention group members” and “control or comparison group members” to address the research questions posed in Table 1, separately by group.
Describe the specific measures you will construct and your approaches to using those measures to answer the research questions. For example, describe whether you will calculate averages, percentages, and frequencies using the data you will collect for the implementation study. In addition, include information on your approaches to examine and summarize qualitative data from interviews, focus groups, and observations. Use a table to link the description of the measures to the research questions. Table 3 presents an example of such a table (sample text appears in italics).
Table 3. Examples of measures for addressing the research questions
| Implementation element | Research question | Measures |
| --- | --- | --- |
| Fidelity | Were all intended intervention components offered and for the expected duration? | |
| Fidelity | What content did the clients receive? | |
| Fidelity | Who delivered services to clients? | |
| Fidelity | What were the unplanned adaptations to key intervention components? | |
| Dosage | How often did clients participate in the intervention on average? | |
| Quality | What was the quality of staff–participant interactions? | |
| Engagement | How engaged were clients in the intervention? | |
| Context | What other HMRE programming was available to study participants? | |
| Context | What external events affected implementation? | |
Note: Please adapt the questions to measure the intervention services received by the intervention group and alternative services and services similar to the intervention received by the control or comparison group during the evaluation period (enrollment through the final follow-up interview), based on the data you have collected through surveys and nFORM.
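To illustrate the kinds of quantitative measures described above, the sketch below summarizes dosage from a list of per-participant session counts. This is a minimal, hypothetical example: the record format, the function name `dosage_summary`, and the full-dosage cutoff are assumptions for illustration, not requirements of the HMRE template, and evaluators may compute these measures in whatever software they use.

```python
# Hypothetical sketch: summarizing dosage from attendance records.
# The data format and the full-dosage cutoff are illustrative
# assumptions, not part of the HMRE template.

def dosage_summary(sessions_attended, sessions_offered, full_dosage_cutoff):
    """Return the average number of sessions attended, average attendance
    as a percentage of sessions offered, and the percentage of participants
    meeting an (assumed) full-dosage cutoff."""
    n = len(sessions_attended)
    avg = sum(sessions_attended) / n
    pct_offered = 100 * avg / sessions_offered
    pct_full = 100 * sum(1 for s in sessions_attended if s >= full_dosage_cutoff) / n
    return {
        "average_sessions": avg,
        "percent_of_offered": pct_offered,
        "percent_full_dosage": pct_full,
    }

# Example: five intervention-group participants, 10 sessions offered,
# with "full dosage" assumed to mean attending 8 or more sessions.
summary = dosage_summary([10, 8, 6, 9, 2], sessions_offered=10, full_dosage_cutoff=8)
```

Analogous percentages and frequencies could be computed separately for the intervention group and the control or comparison group, mirroring the group-specific research questions in Table 1.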
Mathematica® Inc.
File Title | Implementation |
Author | Julieta Lugo-Gil |
File Created | 2024-11-13 |