Form Approved
OMB No. 0920-xxxx
Exp. Date XX/XX/20XX
Each year, [Insert NOFO] Category A recipients are required to submit an evaluation report that describes findings from the previous year's evaluation of their [two or three] selected strategies across each of the core areas outlined in the Evaluation and Performance Measurement Plan Guidance. Developing the annual evaluation report gives recipients the opportunity to reflect on their program implementation, its facilitators and barriers, and their evaluation design and methodologies. The sections below provide examples of the specific information that should be included in the Category A Recipient-Led Annual Evaluation Report.
The annual evaluation report will require up to 8 hours to complete.
Note: Public reporting burden of this collection of information is estimated to average 8 hours per response, including the time for reviewing instructions, searching existing data sources, gathering and maintaining the data needed, and completing and reviewing the collection of information. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. Send comments regarding this burden estimate or any other aspect of this collection of information, including suggestions for reducing this burden, to CDC/ATSDR Reports Clearance Officer; 1600 Clifton Road NE, MS D-74, Atlanta, Georgia 30333; ATTN: PRA (0920-19BHC).
Strategies to Evaluate: Please check the strategies from your work plan you have selected to evaluate.
Evaluation Purpose: State the purpose of the overall evaluation and describe how the findings are expected to be used to inform programmatic decisions.
Evaluation Approach and Context: Describe the general approach you will use to evaluate the selected strategies. Provide information on relevant contextual factors for your program, such as how the program is situated in your state and how it connects to other programs or initiatives. Consider that this document may be viewed separately from your work plan; therefore, provide enough detail for CDC to understand the program and evaluation context.
Evaluation Stakeholders and Primary Intended Users of the Evaluation: Describe individuals or groups who have a stake in the evaluation and who will use the evaluation results. Include a brief description of how you have engaged (or plan to engage) these evaluation stakeholders.
Evaluation Questions: What did you want to know?
Indicator(s): A specific, observable, and measurable characteristic or change that shows progress toward achieving a specified objective or outcome.
Count/Percent: State the actual cumulative amount or percent achieved as of the end of the reporting period.
Data Source(s): Where did you collect the data (e.g., program records, surveys)? List a source for each indicator.
Data Collection Method: How did you collect the data (e.g., abstraction from a spreadsheet or database)?
Data Analysis: What type of analysis did you apply to the data (e.g., descriptive statistics, thematic analysis)?
Status of Data Collection: Please provide the status of data collection (complete, in progress, not started).
Status Details: Please provide a brief update on the status of data collection.
Barriers and Facilitators: Provide a description of the barriers and facilitators encountered in implementing the selected strategy (e.g., access to data, capacity, resources, partnerships/stakeholder engagement). How were the barriers overcome?
Findings and Conclusions: Provide a description of the findings based on evidence generated by the evaluation data collection and analysis methods. Conclusions should be drawn directly from findings and help summarize the "so what" of the findings. Several findings can lead to one or more conclusions. Recipients may include appendices with easy-to-read charts, tables, graphs, and maps to demonstrate evidence that supports conclusions and recommendations.
Communication/Dissemination: Describe your broad plans for communicating/sharing your findings and provide examples of products that you will develop. Describe how your evaluation reports will be published on a publicly available website.
Use of Evaluation Findings: Describe how your evaluation findings will be used to ensure continuous quality improvement, and any programmatic changes that will be made to the implementation of the selected strategy because of the current year's evaluation findings.
Recipient: Click or tap here to enter the recipient's organization name.
Report Due Date: Enter the date (MM/DD/YY) the report is due.
Last Date Updated: Enter the date (MM/DD/YY) the report was last updated.
Instructions: Please refer to your evaluation plan and complete the section below. Update this section to reflect any changes made during the implementation of the evaluation.
Evaluation Overview

| Strategies to Evaluate: | Please check the strategies from your work plan you have selected to evaluate. |
| Evaluation Purpose: | State the purpose of the overall evaluation and describe how the findings are expected to be used to inform programmatic decisions. |
| Evaluation Approach and Context: | Describe the general approach you will use to evaluate the selected strategies. Provide information on relevant contextual factors for your program, such as how the program is situated in your state and how it connects to other programs or initiatives. Consider that this document may be viewed separately from your work plan; therefore, provide enough detail for CDC to understand the program and evaluation context. |
| Evaluation Stakeholders and Primary Intended Users of the Evaluation: | Describe individuals or groups who have a stake in the evaluation and who will use the evaluation results. Include a brief description of how you have engaged (or plan to engage) these evaluation stakeholders. |
Instructions: Please use the section below to provide an update on the evaluation of the strategies you have selected to evaluate. Repeat the table for each of the strategies selected for evaluation.
[Insert Strategy #]: [Insert Strategy Name]

| Evaluation Questions (EQ) | Indicator(s) | Count/Percent | Data Source(s) | Data Collection Method | Data Analysis | Status of Data Collection | Status Details |
| What did you want to know? | A specific, observable, and measurable characteristic or change that shows progress toward achieving a specified objective or outcome. | State the actual cumulative amount or percent achieved as of the end of the reporting period. | Where did you collect the data (e.g., program records, surveys)? List a source for each indicator. | How did you collect the data (e.g., abstraction from a spreadsheet or database)? | What type of analysis did you apply to the data (e.g., descriptive statistics, thematic analysis)? | Please provide the status of data collection (complete, in progress, not started). | Please provide a brief update on the status of data collection. |
| [Insert NOFO-specific Evaluation Core Area: Approach, Effectiveness, Efficiency, Sustainability, Impact] | | | | | | | |
| EQ 1: | | | | | | | |
| EQ 2: | | | | | | | |
| [Insert NOFO-specific Evaluation Core Area: Approach, Effectiveness, Efficiency, Sustainability, Impact] | | | | | | | |
| EQ 1: | | | | | | | |
| EQ 2: | | | | | | | |
| [Insert Evaluation Core Area: Approach, Effectiveness, Efficiency, Sustainability, Impact] | | | | | | | |
| EQ 1: | | | | | | | |
| EQ 2: | | | | | | | |
| [Insert Evaluation Core Area: Approach, Effectiveness, Efficiency, Sustainability, Impact] | | | | | | | |
| EQ 1: | | | | | | | |
| EQ 2: | | | | | | | |
| [Insert Evaluation Core Area: Approach, Effectiveness, Efficiency, Sustainability, Impact] | | | | | | | |
| EQ 1: | | | | | | | |
| EQ 2: | | | | | | | |
Instructions: Use the section below to summarize the evaluation findings from the last period of performance for each strategy selected to evaluate. For each strategy you have selected to evaluate, insert an additional table to report findings. Note: This evaluation report is designed to provide a high-level summary of implementation of the evaluation plan and early findings. Recipients may include appendices with supplemental evaluation findings and data collection tools.
[Insert Strategy #]: [Insert Strategy Name]

| Finding | Description |
| Barriers and Facilitators | Provide a description of the barriers and facilitators encountered in implementing the selected strategy (e.g., access to data, capacity, resources, partnerships/stakeholder engagement). How were the barriers overcome? |
| Findings and Conclusions | Provide a description of the findings based on evidence generated by the evaluation data collection and analysis methods. Conclusions should be drawn directly from findings and help summarize the "so what" of the findings. Several findings can lead to one or more conclusions. Recipients may include appendices with easy-to-read charts, tables, graphs, and maps to demonstrate evidence that supports conclusions and recommendations. |
| Communication/Dissemination | Describe your broad plans for communicating/sharing your findings and provide examples of products (e.g., evaluation briefs, infographics) that you will develop. Describe how your evaluation reports will be published on a publicly available website. |
| Use of Evaluation Findings | Describe how your evaluation findings will be used to ensure continuous quality improvement, and any programmatic changes that will be made to the implementation of the selected strategy because of the current year's evaluation findings. |
File Type: application/vnd.openxmlformats-officedocument.wordprocessingml.document
Author: Vaughan, Marla C. (CDC/ONDIEH/NCCDPHP)
File Created: 2021-01-14