
OMB Control Number: 0920-1311





National Evaluation of the DP18-1815 Cooperative Agreement Program:

CATEGORY B: CARDIOVASCULAR DISEASE PREVENTION AND MANAGEMENT



PART B: STATISTICAL METHODS



April 22, 2020

Revised and resubmitted to OMB: 12/1/2020

Contact: Joanna Elmi

Telephone: 770-488-5979

E-mail: zft6@cdc.gov

National Center for Chronic Disease

Prevention and Health Promotion

Centers for Disease Control and Prevention

Atlanta, Georgia




TABLE OF CONTENTS

1. Respondent Universe and Sampling Methods

2. Procedures for the Collection of Information

3. Methods to Maximize Response Rates and Deal with Nonresponse

4. Test of Procedures or Methods to be Undertaken

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data





ATTACHMENTS

Attachment 1.

  • Centers for Disease Control and Prevention. National Center for Chronic Disease Prevention and Health Promotion. Notice of Funding Opportunity: Improving the Health of Americans through Prevention and Management of Diabetes and Heart Disease and Stroke – Financed in part by the 2018 Prevention and Public Health Funds. CDC-RFA-DP18-1815PPHF18



Attachment 2.

  • Authorizing Legislation: Section 301(a) of the Public Health Service Act [42 U.S.C. 242k]



Attachment 3. 1815 Awardees and Information Collection Plan

  1. 1815 List of Health Department Awardees

  2. 1815 Strategies for Preventing and Controlling Diabetes and Heart Disease and Stroke

  3. 1815 Logic Model

  4. 1815 Crosswalk of Evaluation Components and Data Collection Tools

  5. 1815 Summary of Annualized Respondents

  6. 1815 Evaluation Gantt Chart


Attachment 4. Category B Case Studies

  1. CQM Health Department Interview Guide

  2. CQM Group Discussion Guide

  3. CQM Partner Site-Level Interview Guide

  4. TBC Health Department Interview Guide

  5. MTM Health Department Interview Guide

  6. TBC Group Discussion Guide – TBC/MTM

  7. TBC Partner Site-Level Interview Guide: Strategy B3

  8. MTM Partner Site-Level Interview Guide: Strategy B4

  9. MTM Partner Site-Level Interview Guide: Strategy B4 – Pharmacy Managers/Pharmacists

  10. CCL Health Department Interview Guide

  11. CCL Group Discussion Guide

  12. CCL Partner Site-Level Informant Interview Guide

Attachment 5. Category B Cost Study

  1. Cost Study Resource Use and Cost Inventory Tool (Category B) – HD Level

  2. Cost Study Resource Use and Cost Inventory Tool (Category B) – Partner Site Level



Attachment 6. Category B Recipient-led Evaluations

  1. Category B Recipient-Led Evaluation Annual Report Templates – Year 1 Implementation Brief

  2. Category B Recipient-Led Evaluation Annual Report Templates – Year 2 Efficiency Strategy Map Report

  3. Category B Recipient-Led Evaluation Annual Report Templates – Year 3 Effectiveness Brief

  4. Category B Recipient-Led Evaluation Annual Report Templates – Year 4 Sustainability Action Report

  5. Category B Recipient-Led Evaluation Annual Report Templates – Year 5 Health Impact Statement

Attachment 7. 60-Day Federal Register Notice



Attachment 8.

  1. Institutional Review Board Approval Notification or Exemption Determination Part A

  2. Institutional Review Board Approval Notification or Exemption Determination Part B



Attachment 9. Introductory/Follow-Up Letters

  1. Category B Case Study HD Invitation E-mail

  2. Category B Case Study Partner Site-Level Invitation E-mail

  3. Category B Case Study Confirmation E-mail

  4. Category B Case Study Reminder E-mail

  5. Category B Case Study Follow-Up E-mail







List of Acronyms


CCL – Community-Clinical Linkages

CDC – Centers for Disease Control and Prevention

CHWs – Community Health Workers

CQM – Clinical Quality Measures

CVD – Cardiovascular Disease

DDT – Division of Diabetes Translation

DHDSP – Division for Heart Disease and Stroke Prevention

DOL – Department of Labor

EHR – Electronic Health Record

EPET – Evaluation and Program Effectiveness Team

EPMP – Evaluation and Performance Measurement Plan

HBC – High Blood Cholesterol

HBP – High Blood Pressure

HD – Health Department

HIT – Health Information Technology

ICR – Information Collection Request

LCP – Lifestyle Change Program

MTM – Medication Therapy Management

NCCDPHP – National Center for Chronic Disease Prevention and Health Promotion

OMB – Office of Management and Budget

TBC – Team-Based Care











B. Collections of Information Employing Statistical Methods

1. Respondent Universe and Sampling Methods

There are two primary respondent categories for the DP18-1815 National Evaluation data collection efforts: (1) program directors, program staff, and evaluators from the 51 Health Departments (HD recipients) (Attachment 3a) funded through the Improving the Health of Americans Through Prevention and Management of Diabetes and Heart Disease and Stroke cooperative agreement program (CDC-RFA-DP18-1815; hereafter referred to as 1815), and (2) affiliate staff members from 30-45 health systems or sites (partner sites) working with or otherwise collaborating with the 1815-funded HD recipients. For Category B of 1815, the recipients that participate in the National Evaluation data collection efforts will be selected based on: (1) the strategies they have selected for implementation under 1815; (2) a mix of geographic locations and contexts; (3) the demographic characteristics of the populations served under 1815; (4) their varying levels of experience implementing the strategies; and (5) the availability and willingness of HDs and partner sites. Respondent sampling methods for the various data collection efforts included in this National Evaluation are detailed below; note that all participants will be sampled with replacement. Participation in all National Evaluation activities is voluntary; however, completion of state-level evaluation activities by recipients is a requirement of the 1815 cooperative agreement (i.e., the Category B Recipient-led Evaluation Deliverables). These documents are included in this package to gain approval for their use because the data could be triangulated with other sources to inform the national evaluation.

CDC has contracted with Deloitte Consulting to design and implement the 1815 National Evaluation. Deloitte Consulting, together with the Division for Heart Disease and Stroke Prevention (DHDSP) Evaluation and Program Effectiveness Team (EPET), will be responsible for data collection and analysis activities. Deloitte and EPET are referred to collectively in this document as the National Evaluation Team.

Category B Evaluation Component 1: Case Studies

HD-Level Recipient Interviews (Att. 4a, 4d, 4e, 4j): The National Evaluation Team will use a purposive sampling approach to identify 5 HD recipients to participate in each of the Clinical Quality Measures (CQM), Team-Based Care/Medication Therapy Management (TBC/MTM), and Community-Clinical Linkages (CCL) case studies, for a total of 15 HD recipients across the case studies. HD recipients will be selected for participation in a case study based on the 1815 strategies they have selected for implementation and evaluation; their history of implementing similar strategies and interventions; and state-level context and performance. HD recipients will be purposively selected to ensure variety and comparability across the cases. Each HD recipient will participate in only one of the three case studies.

For each participating HD recipient, interviews will be conducted with 3 to 5 purposively selected staff members working on the respective strategies under each case study, for a total of no more than 75 interviewees across all 15 participating HD recipients. The National Evaluation Team will work with HD recipient program directors/team leads in identifying program staff who work closely on managing or implementing the strategies covered under each case study to participate in the interviews.

HD-Level Group Discussions (Att. 4b, 4f, 4k): Group discussions will be conducted with 6 to 8 staff members per HD recipient, for a total of 90 to 120 staff members across all 15 case study HD recipients. Group discussion participants will include individuals who participated in the interviews as well as additional staff members or consultants engaged in implementing 1815 strategies. This will allow the evaluation to capture additional perspectives that may emerge in a group conversational setting.

Partner Site-Level Interviews (Att. 4c, 4g, 4h, 4i, 4l): After HD recipients are selected for the case studies, the National Evaluation Team will work with recipients to identify partner sites that are supporting the implementation of 1815 strategies. The National Evaluation Team will use a purposive sampling process to identify 2-3 partner sites for each HD recipient participating in the case study (for a total of no more than 45 sites). Site selection criteria will include the maturity of strategy-specific interventions within the partner site, the type of site (e.g., large hospital, rural health center), and the site's willingness to participate.

Within each participating partner site, 2-3 staff members working on the strategies covered under each case study will be invited to participate in an interview (for a total of no more than 135 interviewees). Site-level interviewees may include physicians, health care organization administrators, health information technology professionals, health educators, pharmacists, nurses, community health workers, or other health professionals. Partner site interviewees will be selected in conjunction with the respective HD recipient and the partner site managers overseeing implementation of 1815-funded and HD recipient-supported interventions.
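
The respondent ceilings cited above follow directly from the sampling design. The short Python sketch below is illustrative only (it is not part of the evaluation protocol); it reproduces the arithmetic using the upper bounds stated in this section.

    # Illustrative arithmetic only; values are the upper bounds stated in the
    # case study sampling plan above, not observed counts.
    hd_per_case_study = 5       # HD recipients selected per case study
    case_studies = 3            # CQM, TBC/MTM, and CCL
    max_staff_per_hd = 5        # 3-5 HD staff interviewed per recipient (upper bound)
    max_sites_per_hd = 3        # 2-3 partner sites per recipient (upper bound)
    max_staff_per_site = 3      # 2-3 staff interviewed per partner site (upper bound)

    total_hds = hd_per_case_study * case_studies                     # 15 HD recipients
    max_hd_interviewees = total_hds * max_staff_per_hd               # 75 HD-level interviewees
    max_partner_sites = total_hds * max_sites_per_hd                 # 45 partner sites
    max_site_interviewees = max_partner_sites * max_staff_per_site   # 135 site-level interviewees

    print(total_hds, max_hd_interviewees, max_partner_sites, max_site_interviewees)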

Category B Evaluation Component 2: Cost Study

HD-Level Recipient Respondents (Att. 5a): The National Evaluation Team will use a stratified, purposeful sampling approach to select 20-25 HD recipients for participation in the cost study. HD recipients will first be stratified by the ten geographic regions used by CDC’s National Center for Chronic Disease Prevention and Health Promotion (NCCDPHP)1 to account for any geographic differences in implementation costs. Based on the number of states within each region, 2 or 3 states will be selected per region using the following criteria:

  • Participation in Category B Case Studies detailed above

  • 1815 strategies selected for implementation and evaluation

  • Willingness of the HD recipient to participate

Preference will be given to HD recipients that have been selected for participation in the Category B Case Studies in order to streamline data collection efforts. In addition, HD recipients will be selected to ensure that all Category B strategies are equally represented in the analysis. Sampling with replacement will be used to replace any recipients that are not willing to participate.
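
To illustrate the stratified structure of this selection, a minimal Python sketch is shown below. It is not the actual procedure, which is purposive and judgment-based on the criteria listed above; the function name select_cost_study_hds and the inputs regions and case_study_hds are hypothetical placeholders.

    # Illustrative sketch of the stratified selection structure described above.
    # The actual selection is purposive (based on the listed criteria), not random.
    # `regions` and `case_study_hds` are hypothetical inputs supplied by the caller.
    import random

    def select_cost_study_hds(regions, case_study_hds, per_region=(2, 3), seed=1815):
        """regions: dict mapping NCCDPHP region name -> list of HD recipients."""
        rng = random.Random(seed)
        selected = []
        for region, hds in regions.items():
            k = min(rng.choice(per_region), len(hds))    # 2 or 3 recipients per region
            preferred = [hd for hd in hds if hd in case_study_hds]      # case study HDs first
            others = [hd for hd in hds if hd not in case_study_hds]
            picks = preferred[:k]
            if len(picks) < k:
                picks += rng.sample(others, k - len(picks))
            selected.extend(picks)
        return selected  # target: roughly 20-25 HD recipients across the ten regions

The preference step mirrors the stated intent to reuse case study participants before drawing additional HDs from a region.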

Partner Site-Level Respondents (Att. 5b): The National Evaluation Team will work with HD recipients participating in the cost study to construct a sampling frame of all the sites they work with on CQM, TBC/MTM, and CCL strategies. Based on prior CDC studies with these types of organizations2, we expect a participation rate of about 30% among partner sites. To maximize the sample size of participating sites, we will invite all identified sites to participate in the cost study, but cap participation at 50 partner sites.
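
A back-of-the-envelope sketch of the expected partner-site yield under the ~30% participation assumption and the 50-site cap is shown below; the sampling-frame sizes in the example are hypothetical.

    # Expected number of participating partner sites, given the ~30% participation
    # rate cited above and the 50-site cap. Frame sizes here are hypothetical.
    def expected_partner_sites(frame_size, participation_rate=0.30, cap=50):
        """All identified sites are invited; participation is capped at `cap`."""
        return min(round(frame_size * participation_rate), cap)

    print(expected_partner_sites(100))  # a frame of 100 sites -> about 30 participants
    print(expected_partner_sites(200))  # larger frames are capped at 50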



Category B Evaluation Component 3: Recipient-Led Evaluations

Category B Recipient-Led Evaluation Deliverables (Att. 6a, 6b, 6c, 6d, 6e)

Each year, all 51 HD recipients are required to submit specific evaluation reporting deliverables for their Category B strategies, based on the findings from the previous year’s evaluation (Table B.1-A).

Table B.1-A. Category B Recipient-Led Evaluation Report Deliverables

Year | Recipient-Led Evaluation Report Deliverable
1 | Implementation Brief
2 | Efficiency/Strategy Mapping
3 | Effectiveness Brief or Manuscript
4 | Sustainability and Action Report
5 | Health Impact Statement (per strategy evaluated)


Attachment 3e indicates the annualized number of entities covered by each proposed data collection effort.

Table B.1-B. Overview of the Data Collection Plan

This table provides an overview of the data collection plan, forms, respondents (by roles), and the schedule. OMB approval is requested for 3 years. Information collection will occur in years 3, 4, and 5 of the 5-year cooperative agreement (Years 1, 2, and 3 of the 3-year period of OMB approval).

Evaluation Component | Type of Respondent | Form Name | Data Collection Period (cooperative agreement YR3-YR5) | Total No. of Collections (YR3-YR5) | No. of Respondents per Collection | Respondents per Collection (detail)
Case Study Site-Level Interviews | Partner site staff (1) | Att. 4c: CQM Partner Site-Level Interview Guide | YR 4 | 1 | 45 | 5 SHDs x 3 sites/SHD x 3 staff/site
Case Study Site-Level Interviews | Partner site staff (2) | Att. 4g, 4h: TBC Partner Site-Level Interview Guide | YR 4 | 1 | 24 | 5 SHDs x 2 sites/SHD x 2.5 staff/site
Case Study Site-Level Interviews | Partner site staff (3) | Att. 4i: MTM Partner Site-Level Interview Guide | YR 4 | 1 | 21 | 5 SHDs x 2 sites/SHD x 2.1 staff/site
Case Study Site-Level Interviews | Partner site staff (4) | Att. 4l: CCL Partner Site-Level Interview Guide | YR 4 | 1 | 45 | 5 SHDs x 3 sites/SHD x 3 staff/site
Case Study SHD-Level Interviews | HD recipient staff (5) | Att. 4a: CQM HD Recipient Interview Guide | YR 3, 5 | 2 | 25 | 5 SHDs x 5 staff/SHD
Case Study SHD-Level Interviews | HD recipient staff (5) | Att. 4d: TBC HD Recipient Interview Guide | YR 3, 5 | 2 | 13 | 5 SHDs x 2.6 staff/SHD
Case Study SHD-Level Interviews | HD recipient staff (5) | Att. 4e: MTM HD Recipient Interview Guide | YR 3, 5 | 2 | 12 | 5 SHDs x 2.4 staff/SHD
Case Study SHD-Level Interviews | HD recipient staff (5) | Att. 4j: CCL HD Recipient Interview Guide | YR 3, 5 | 2 | 25 | 5 SHDs x 5 staff/SHD
Case Study SHD-Level Group Discussions | HD recipient staff (5) | Att. 4b: CQM SHD Group Discussion Guide | YR 3, 5 | 2 | 30 | 5 SHDs x 6 staff/SHD
Case Study SHD-Level Group Discussions | HD recipient staff (5) | Att. 4f: TBC/MTM SHD Group Discussion Guide | YR 3, 5 | 2 | 40 | 5 SHDs x 8 staff/SHD
Case Study SHD-Level Group Discussions | HD recipient staff (5) | Att. 4k: CCL SHD Group Discussion Guide | YR 3, 5 | 2 | 40 | 5 SHDs x 8 staff/SHD
Cost Study | HD recipient staff (6) | Att. 5a: HD Cost Study Resource Use and Cost Inventory Tool | YR 3, 5 | 1 | 25 | 25 SHDs x 1 staff/SHD
Cost Study | Site staff (7) | Att. 5b: Site-Level Cost Study Resource Use and Cost Inventory Tool | YR 3, 5 | 1 | 50 | 25 SHDs x 2 sites/SHD
Recipient-Led Evaluation | HD recipient staff (8) | Att. 6a-6e: DHDSP Recipient-Led Evaluation Deliverable Template(s) | YR 2, 3, 4 | 3 | 51 | 50 SHDs + Washington, DC x 1 staff/HD

Respondent types referenced in parentheses above (SHD = state health department):

  1. Clinical and Administrative Staff (Providers, Pharmacists, Nurses, and Administrative Staff)

  2. Clinicians (Providers, Pharmacists, Nurses) and health care extenders (CHWs)

  3. Clinicians (Providers, Pharmacists)

  4. Clinicians (Providers, Nurses) and health care extenders (CHWs)

  5. Program Director; Team Lead/Manager; Evaluator; Health Scientist

  6. Team Lead/Manager; Evaluator

  7. Program Manager

  8. Evaluator



2. Procedures for the Collection of Information

Information will be collected from HD recipients no more than once per year (Attachment 3f). Data collection procedures vary slightly for each component of the evaluation and are described below.



Category B Evaluation Component 1: Case Studies

HD Recipient and Partner Site-level Data Collection

For all three case studies, interviews and group discussions with HD recipient staff members will take place during Years 3, 4, and 5 of the cooperative agreement. In Year 3, the interviews will be held virtually due to the coronavirus pandemic. In Years 4 and 5, interviews will be conducted in person, and all participating individuals will be asked to conform to CDC and local COVID-19 public health recommendations. Telephone interviews with partner site staff members will take place in Years 3 and 5 of the cooperative agreement.

For both HD recipient and partner site data collection efforts, the National Evaluation Team will send an invitation email (Att. 9a, 9b) to identified potential participants. The invitation email will explain the purpose of the case studies and how insights gained from the interviews/group discussions will be used; specify that participation is voluntary; describe how individual-level responses will be safeguarded; clarify the expected time that the interviews and group discussions will take to complete; and provide contact information for the evaluation team. Once they accept the invitation to participate, HD recipient and partner site staff members will receive a confirmation email (Att. 9c) with a copy of the data collection instrument and details to begin scheduling the interviews and group discussion. Five business days in advance of the interview, participants will receive a reminder e-mail (Att. 9d) indicating the upcoming time and date of the interviews/group discussions.

Interviews with HD recipient staff members will be conducted in person during site visits (Years 4-5) and will last no more than 2 hours. In Year 3, interviews will be held by video conference due to the coronavirus pandemic. Group discussions with HD recipient staff members will last no more than 2.5 hours. In-person interviews and discussions will follow local and CDC-recommended COVID-19 protective measures to ensure the safety and health of all participating individuals. Interviews with partner site staff will be conducted by phone and will last no more than 1.5 hours. All interviews and group discussions will be led by a primary interviewer and supported by a note-taker, both from the Deloitte team. With participants’ consent, all interviews and group discussions will be digitally recorded for transcription purposes; verbal permission will be obtained before the start of each interview or group discussion. In Year 3, participants will be provided a meeting password to access the video conference calls for the interviews and discussions, and the interviewer will monitor and control who joins the video conference to protect the security and confidentiality of participants.

As interviews are completed, participants will receive a follow up email (Att. 9e) thanking them for their participation, sharing the anticipated timeline for data analysis and results, and letting them know whom to contact with further questions.

Category B Evaluation Component 2: Cost Study

Both HD recipient and partner site cost study participants will input their cost data into the web-based Resource Use and Cost Inventory Tool (Att. 5a, 5b). An invitation to participate will be sent to the HD recipients and partner sites that have agreed to take part in the cost study. The National Evaluation Team will host a webinar to orient HD recipients and their partner sites to the Inventory Tool, develop clear guidance for using the Tool, and set clear expectations regarding the type of information that will be requested, the reporting timeline, and the time needed for completion. The National Evaluation Team will work with each participating HD recipient to appoint a Cost Study Liaison responsible for cost data gathering and reporting, and will support the Liaison by answering questions and providing technical assistance.

Cost data will be collected from HD recipients and partner sites in Years 3 and 5. Participants will self-administer the tool and submit the completed Resource Use and Cost Inventory Tool using an online platform.

Category B Evaluation Component 3: Recipient-Led Evaluation Reporting Deliverables

Per the 1815 cooperative agreement, all HD recipients are required to complete an annual evaluation reporting deliverable to update CDC on the progress of their recipient-led evaluations of Category B strategies.

Category B Annual Evaluation Reporting Deliverables (Att. 6a, 6b, 6c, 6d, 6e): Each year, HD recipients will submit specific evaluation reporting deliverables for their Category B strategies, based on the findings from the previous year’s evaluation, culminating with a Health Impact Statement in Year 5.

DHDSP will provide HD recipients with technical assistance and guidance to support completion of the reporting deliverables. The deliverables will be due 90 days after the end of each program year.



3. Methods to Maximize Response Rates and Deal with Nonresponse

While participation in all data collection for national evaluation activities is voluntary, the National Evaluation Team will make every effort to maximize the rate of participation. The HD recipient and partner site-level interview guides are tailored specifically to each stakeholder and are designed to gather the most relevant information within the designated length of time per instrument. The National Evaluation Team will also engage with select HD recipients to gather their input on the interview tools and process, thereby building buy-in for the evaluation process and encouraging full participation.

For potential interview and survey participants, the National Evaluation Team will first send invitation e-mails describing the purpose and length of interviews and surveys, what types of questions will be asked, and how findings will be used. For interviews, once individuals agree to participate, the National Evaluation Team will follow up by sending confirmation and reminder e-mails in advance of the interviews to encourage participation. For virtual interviews and discussions in Year 3, the Team will also provide detailed log-in instructions and the computer and internet capabilities required for a stable connection. For surveys, the National Evaluation Team will follow up with non-respondents by sending two follow-up e-mails during the survey period to encourage their participation.

The National Evaluation Team will work with CDC project officers and evaluators, as well as select HD recipient key informants, to determine appropriate incentive strategies to maximize participation for the Category B case study partner site data collection and cost study. Non-monetary incentives such as tailored partner site reports of findings and joint publications or presentations may be offered to potential participants.

Completion of the Category B Recipient-led Evaluation Reporting Deliverables (Att. 6a, 6b, 6c, 6d, 6e) is a requirement of the 1815 cooperative agreement. These documents are included in this package to gain approval for their use because the data could be triangulated with other sources to inform the national evaluation. The DHDSP team will provide ongoing technical assistance to HD recipients, including hosting topic-specific webinars, to support preparation and submission of each deliverable.

4. Test of Procedures or Methods to be Undertaken

DHDSP and the National Evaluation Team are convening voluntary external advisory groups, or Evaluation Planning Groups (EPGs), composed of HD recipients who will provide input and feedback on data collection tools. Each data collection tool will be reviewed by no more than 9 individuals.

Category B Evaluation Component 1: Case Studies: Data collection instruments for all three case studies were pre-tested and cognitively tested by phone, each with at least one HD recipient representative (a different representative for each tool, and no more than nine representatives per tool). EPET and the selected HD recipients were also asked to provide feedback on the data collection tools and protocols during facilitated discussions. Feedback from both groups was used to refine questions as needed, avoid duplicative areas, clarify question wording, ensure accurate programming and skip patterns, and establish the estimated time required to complete the data collection instruments.

Category B Evaluation Component 2: Cost Study: The Resource Use and Cost Inventory Tool was pre-tested with a selected HD. EPET and the HD recipients provided feedback on the tool and its instructions during facilitated discussions. Feedback from both groups was used to refine the Inventory Tool, avoid duplicative areas, clarify cost categories, items, and instructions, and establish the estimated time required to complete the tool.


Category B Evaluation Component 3: Evaluation Report Templates: No more than 9 HD recipients provided feedback to refine the annual evaluation report deliverables. EPET will provide ongoing technical assistance and guidance to HD recipients with respect to their annual evaluation report deliverable.

5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

The individuals responsible for design and management of the Category B 1815 National Evaluation data collection tools and processes include:

Marla Vaughan

mhv1@cdc.gov

CDC, Division for Heart Disease and Stroke Prevention, Evaluation and Program Effectiveness Team

Rachel Davis

bkf4@cdc.gov

CDC, Division for Heart Disease and Stroke Prevention, Evaluation and Program Effectiveness Team

Aisha Tucker-Brown

htj1@cdc.gov


CDC, Division for Heart Disease and Stroke Prevention, Evaluation and Program Effectiveness Team

Joanna Elmi

zft6@cdc.gov

CDC, Division for Heart Disease and Stroke Prevention, Evaluation and Program Effectiveness Team

Jenica Reed

jhreed@deloitte.com

Deloitte Consulting, 1815/1817 National Evaluation Team

Meklit Hailemeskal

mhailemeskal@deloitte.com

Deloitte Consulting, 1815/1817 National Evaluation Team

Gizelle Gopez

ggopez@deloitte.com


Deloitte Consulting, 1815/1817 National Evaluation Team

Hannah Eisen

heisen@deloitte.com

Deloitte Consulting, 1815/1817 National Evaluation Team

















2 DHDSP. Paul Coverdell National Acute Stroke Program National Evaluation – Partner Cost Study. 2017.

