
Evaluation of an Online Prostate Cancer Decision Aid


[OMB No. 0920-xxxx] [OMB expiration date]


Supporting Statement B







Program Official/Contact

David Siegel, MD, MPH

Medical Officer

National Center for Chronic Disease Prevention and Health Promotion

Centers for Disease Control and Prevention

P: [Contact phone]

F: [Contact fax]

irn3@cdc.gov




6/17/2024



TABLE OF CONTENTS


ATTACHMENTS

1. Authorizing Legislation

2. Study Design Graphic

3. Evaluation Questions and Indicators

4a. Provider Survey: Introductory Email to All Primary Care Providers, English

4b. Provider Survey: Consent Statement, English

4c. Provider Survey: Paper, English

4d. Provider Survey: Online, English

4e. Provider Survey: Reminder Email, English

4f. Provider Survey: Thank You Email, English

5a. Patient Recruitment: Introductory Email to Providers of Patient Sample, English

5b. Patient Recruitment: Introductory Email to Patients, English

5c. Patient Recruitment: Introductory Email to Patients, Spanish

5d. Patient Recruitment: Email with Link to Eligibility Screener, English

5e. Patient Recruitment: Email with Link to Eligibility Screener, Spanish

5f. Patient Recruitment: Reminder Email for Eligibility Screener, English

5g. Patient Recruitment: Reminder Email for Eligibility Screener, Spanish

6a. Pre-Exposure Survey: Eligibility Screener, English

6b. Pre-Exposure Survey: Eligibility Screener, Spanish

6c. Pre-Exposure Survey: Consent Statement, English

6d. Pre-Exposure Survey: Consent Statement, Spanish

6e. Pre-Exposure Survey: Paper, English

6f. Pre-Exposure Survey: Online, English

6g. Pre-Exposure Survey: Online, Spanish

6h. Pre-Exposure Survey: Reminder Email, English

6i. Pre-Exposure Survey: Reminder Email, Spanish

7a. Post-Exposure Survey: Assigned Materials and Survey Link, English

7b. Post-Exposure Survey: Assigned Materials and Survey Link, Spanish

7c. Post-Exposure Survey: Consent Statement, English

7d. Post-Exposure Survey: Consent Statement, Spanish

7e. Post-Exposure Survey: Paper, English

7f. Post-Exposure Survey: Online, English

7g. Post-Exposure Survey: Online, Spanish

7h. Post-Exposure Survey: Reminder Email, English

7i. Post-Exposure Survey: Reminder Email, Spanish

8a. Usability Survey: Pre-Notification Email, English

8b. Usability Survey: Pre-Notification Email, Spanish

8c. Usability Survey: Email with Survey Link, English

8d. Usability Survey: Email with Survey Link, Spanish

8e. Usability Survey: Consent Statement, English

8f. Usability Survey: Consent Statement, Spanish

8g. Usability Survey: Paper, English

8h. Usability Survey: Online, English

8i. Usability Survey: Online, Spanish

8j. Usability Survey: Reminder Email, English

8k. Usability Survey: Reminder Email, Spanish

9a. User Experience Interviews: Recruitment Email, English

9b. User Experience Interviews: Recruitment Email, Spanish

9c. User Experience Interviews: Consent Statement, English

9d. User Experience Interviews: Consent Statement, Spanish

9e. User Experience Interviews: Interview Guide, English

9f. User Experience Interviews: Interview Guide, Spanish

10a. Post-Clinic Visit Survey: Pre-Notification Email, English

10b. Post-Clinic Visit Survey: Pre-Notification Email, Spanish

10c. Post-Clinic Visit Survey: Email with Survey Link, English

10d. Post-Clinic Visit Survey: Email with Survey Link, Spanish

10e. Post-Clinic Visit Survey: Consent Statement, English

10f. Post-Clinic Visit Survey: Consent Statement, Spanish

10g. Post-Clinic Visit Survey: Paper, English

10h. Post-Clinic Visit Survey: Online, English

10i. Post-Clinic Visit Survey: Online, Spanish

10j. Post-Clinic Visit Survey: Reminder Email, English

10k. Post-Clinic Visit Survey: Reminder Email, Spanish

10l. Post-Clinic Visit Survey: Thank You Email, English

10m. Post-Clinic Visit Survey: Thank You Email, Spanish

11a. Clinic Coordinator Interviews: Consent Statement, English

11b. Clinic Coordinator Interviews: Interview Guide, English

12a. Published 60-Day Federal Register Notice (FRN)

12b. 60-Day FRN Public Comments

12c. CDC Response to Public Comments

13. Approved Privacy Narrative

14. Institutional Review Board Approval

B. COLLECTIONS OF INFORMATION EMPLOYING STATISTICAL METHODS

B1. Respondent Universe and Sampling Methods


The target universe for the study is male patients aged 55-69 years from four primary care clinics who are scheduled for an upcoming health exam. Each of the four clinics will provide a list of all primary care providers within the clinic and will assign a study coordinator who will help facilitate the study within the clinic. The criteria for selecting the four primary care clinics are described in section A2 of Supporting Statement A as well as in Table B1A below.


Table B1A: Criteria for Clinic Selection

Population Served

  • Men aged 55-69 years

  • Large population of men at high risk:

    • Black or African American men

    • Men with a family history of prostate cancer (at least one first-degree relative [father, son, or brother] who had prostate cancer, or two close relatives on the same side of the family who had prostate cancer)

  • Sufficient population of Hispanic or Latino, American Indian or Alaska Native, and Asian men

Data Sharing

  • Willing to share EHR data with ICF

Perspective on Shared Decision-Making

  • Not opposed to shared decision-making for prostate cancer screening

Resources

  • Capacity to dedicate staff as a study coordinator

  • Capacity to engage in subcontracting or other type of agreement with ICF


The universe of patients from which the sampling frame is created and the sample is drawn is the list of patients provided by the four primary care clinics. Male patients with a prior history of prostate cancer, urinary tract symptoms, prostate biopsy, cognitive deficits, or terminal illness will be excluded. A sample of 900 eligible patients will be selected from the final sampling frame.


We will focus on recruiting high-risk participants (Black or African American men and men with a family history of prostate cancer). We will use a permuted block design to randomly assign participants to one of the three arms of the RCT (Intervention Arm: Nathan; Control Arm 1: MDPH decision aid; Control Arm 2: NCI PDQ). No formal probability sampling or subsampling will occur. Recruitment and random assignment for the RCT will begin after receipt of OMB approval for the data collection activities. All enrolled individuals will be asked to complete the pre-exposure, post-exposure, and post-clinic visit surveys. If a clinic site enrolls more than the 225 individuals projected for that site, those recruited after the 225-participant target is met will not be included in the study.
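
For illustration, the following is a minimal sketch in Python of how permuted block assignment to the three arms could be implemented; the block size of 9, the seed, and the function name are illustrative assumptions and are not part of the study protocol.

```python
# Minimal sketch of permuted block randomization to the three study arms.
# The block size (9), seed, and function name are illustrative assumptions.
import random

ARMS = ["Nathan", "MDPH decision aid", "NCI PDQ"]

def permuted_block_assignments(n_participants, block_size=9, seed=2024):
    """Return arm assignments balanced within each permuted block."""
    if block_size % len(ARMS) != 0:
        raise ValueError("block size must be a multiple of the number of arms")
    rng = random.Random(seed)
    assignments = []
    while len(assignments) < n_participants:
        block = ARMS * (block_size // len(ARMS))  # equal counts per arm in each block
        rng.shuffle(block)                        # random order within the block
        assignments.extend(block)
    return assignments[:n_participants]

# Example: allocate the 225 participants expected at one clinic.
clinic_assignments = permuted_block_assignments(225)
print({arm: clinic_assignments.count(arm) for arm in ARMS})
# Counts are exactly 75 per arm because 225 is a multiple of the block size.
```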


Table B1B shows expected sample sizes for each of the three arms. Given the study objective of comparing outcomes among the study arms, the RCT design emphasizes internal validity over generalizability of findings to the larger population. Therefore, we will use a convenience sampling approach among the clinic patients who meet the inclusion criteria for the selection of participants.


Table B1B. Expected sample sizes for the three study arms.

Clinic   | Nathan Decision Aid | MDPH Decision Aid | NCI PDQ | Total
Clinic 1 | 75                  | 75                | 75      | 225
Clinic 2 | 75                  | 75                | 75      | 225
Clinic 3 | 75                  | 75                | 75      | 225
Clinic 4 | 75                  | 75                | 75      | 225
Total    | 300                 | 300               | 300     | 900


The universe from which the sampling frame is created for the provider survey and clinic coordinator interviews consists of all primary care providers and study coordinators within each of the four clinics. We anticipate that each clinic will provide a list of approximately 10 providers, resulting in a sample of 40 providers total for the provider survey. We anticipate that there will be one study coordinator from each of the four clinics, resulting in a total of four study coordinators for the clinic coordinator interviews.


B2. Procedures for the Collection of Information


Our information collections are informed by the evaluation questions and indicators included in Attachment 3. Eight forms of information collection will be implemented to answer our evaluation questions. These include a provider survey; a patient eligibility screener; patient pre-exposure, post-exposure, and post-clinic visit surveys; a patient usability survey; patient user experience interviews; and clinic coordinator interviews. Each instrument will be administered once per respondent throughout the course of the study.


Provider Survey

Each of the four clinics will provide a list of all primary care providers within the clinic. Prior to executing the three-arm study, we will administer a web-based survey to these providers (Attachment 4d; a Microsoft Word version of this survey is provided in Attachment 4c for ease of review). The provider survey gathers information on health care providers' prostate cancer screening practices and their attitudes towards prostate cancer screening. The provider survey will be administered in English. An introductory email (Attachment 4a) will be sent to all providers at the four primary care clinics informing them of the planned information collection, announcing the dates the survey will remain open, and providing the relevant web link to the survey instrument. No personal information on the provider will be collected through the survey. Providers will have a period of 2 weeks to complete the survey. Based on a small pre-test, we estimate the provider survey will take approximately 10 minutes to complete in its entirety. A reminder email that notes the deadline for responding will be sent to providers who have not responded to the survey 5 days before information collection ends (Attachment 4e). At the close of the study, providers will receive a thank you email for their participation with a link to CDC’s Explore Talking to Patients about Prostate Cancer, a version of Nathan for providers (Attachment 4f).


All data collected through the provider survey will be stored in an electronic database, and results will be reported in aggregate across the providers. The analysis will summarize providers' prostate cancer screening practices and attitudes towards screening, as well as how these vary by demographic characteristics.


Patient Eligibility Screener

Each of the four clinics will provide a sample of men who meet the criteria for the study based on the clinics’ review of electronic health records (EHRs). Primary care providers of men in the sample will receive an email informing them of the study and its purpose and letting them know that some of their patients may be enrolled in the study (Attachment 5a). Men in the sample will receive an introductory email informing them of the study and its purpose and letting them know that someone from ICF will be contacting them (Attachments 5b and 5c). ICF will then send a formal invitation by email or text message containing a unique web link to access the screener (Attachments 5d and 5e). A reminder email that notes the deadline for responding will be sent to participants who have not completed the screener 10 days before information collection ends (Attachments 5f and 5g). Based on a small pre-test, we estimate the screener will take approximately 8 minutes to complete in its entirety (Attachments 6a and 6b).


Pre-Exposure, Post-Exposure and Post-Clinic Visit Surveys

If determined eligible via completion of the eligibility screener, participants will be immediately prompted to complete the pre-exposure survey (Attachments 6f and 6g; a Microsoft Word version of this survey is provided in Attachment 6e for ease of review). The web-based pre-exposure survey will be administered to all 900 study participants before exposure to their assigned material (Nathan, MDPH decision aid, or NCI PDQ). This survey will measure the primary outcome (decisional conflict) and secondary outcomes, including prostate cancer knowledge and autonomous decision-making. The pre-exposure survey will also collect participants’ demographics, digital literacy, health literacy, previous exposure to informational materials about prostate cancer screening, and prostate cancer experience. At the close of the pre-exposure survey, participants will be asked for their preferred mode of communication moving forward: email or text messaging. A reminder email that notes the deadline for responding will be sent to participants who have not responded to the survey 10 days before information collection ends (Attachments 6h and 6i). Based on a small pre-test, we estimate the pre-exposure survey will take approximately 20 minutes to complete in its entirety.


After completing the pre-exposure survey, all 900 participants will be randomized into one of three arms (Intervention Arm: Nathan, Control Arm 1: MDPH decision aid, and Control Arm 2: NCI PDQ). They will receive an email or text/SMS message with the associated materials for their assigned arm (Nathan, MDPH decision aid, or NCI PDQ) as well as a unique web link to access the post-exposure survey (Attachments 7a and 7b). The web-based post-exposure survey (Attachments 7f and 7g; a Microsoft Word version of this survey is provided in Attachment 7e for ease of review) will collect information on exposure to the assigned material, decisional conflict, autonomous decision making, decision self-efficacy, preparation for decision-making, prostate cancer knowledge, help needed to review assigned materials, and contamination from other informational materials about prostate cancer screening. A reminder email or text/SMS message that notes the deadline for responding will be sent to participants who have not responded to the survey 10 days before information collection ends (Attachments 7h and 7i). Based on a small pre-test, we estimate the post-exposure survey will take approximately 20 minutes to complete in its entirety.


The web-based post-clinic visit survey (Attachments 10h and 10i; a Microsoft Word version of this survey is provided in Attachment 10g for ease of review) will be administered to all 900 study participants immediately after their clinic encounter with their provider. The study participants will be sent a pre-notification email or text/SMS message one week before the survey launch inviting them to complete the online survey (Attachments 10a and 10b). Upon survey launch, the team will send a formal invitation by email or text/SMS message containing a unique web link to access the survey (Attachments 10c and 10d). This survey will measure decisional conflict, autonomous decision-making, prostate cancer knowledge, screening behavioral intent, screening behavior (also confirmed by electronic health record [EHR] review), shared decision-making, time spent with the provider discussing the PSA test, and informational materials used in making a screening decision. A reminder email or text/SMS message that notes the deadline for responding will be sent to participants who have not responded to the survey 10 days before information collection ends (Attachments 10j and 10k). Based on a small pre-test, we estimate the post-clinic visit survey will take approximately 20 minutes to complete in its entirety.


For each of the three surveys (pre-exposure, post-exposure, and post-clinic visit), study participants will have 2 weeks to complete the survey from the time the link for that survey is sent. If a participant does not respond after 2 weeks, the ICF team will call the study participant to complete the respective survey via telephone. All telephone interviews with non-responders will be conducted using the Voxco CATI data collection system. The survey will be programmed to lead the interviewers through the survey’s skip and branching patterns. The data from the web and telephone surveys will be stored on the Voxco survey platform.
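
For illustration only, the sketch below shows the general form of the skip and branching logic the programmed survey will follow; the question IDs, wording, and routing rules are hypothetical and do not reflect the actual Voxco CATI programming.

```python
# Generic illustration of skip/branching logic of the kind the programmed
# survey will follow. Question IDs, wording, and routing rules are hypothetical
# and do not reflect the actual Voxco CATI programming.
QUESTIONS = {
    "Q1": {"text": "Have you heard of the PSA test? (y/n)",
           "route": lambda ans: "Q2" if ans == "y" else "Q3"},
    "Q2": {"text": "Have you ever had a PSA test? (y/n)",
           "route": lambda ans: "Q3"},
    "Q3": {"text": "Do you plan to discuss screening with your provider? (y/n)",
           "route": lambda ans: None},  # end of section
}

def administer(start="Q1"):
    """Walk through the questions, following the skip pattern for each answer."""
    responses = {}
    qid = start
    while qid is not None:
        question = QUESTIONS[qid]
        answer = input(question["text"] + " ").strip().lower()
        responses[qid] = answer
        qid = question["route"](answer)
    return responses

if __name__ == "__main__":
    print(administer())
```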


The information collected through the three surveys will be analyzed, and the results will be used to gauge progress toward the primary outcome of interest, a reduction in decisional conflict. The analysis will also focus on characterizing participants (e.g., by sociodemographic background) and exploring variation in the change in outcomes across the three arms and for different subpopulations.
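
As an illustration of one possible analytic approach, the Python sketch below compares post-exposure decisional conflict across the three arms while adjusting for the pre-exposure score; the file name and variable names (analytic_file.csv, post_dc, pre_dc, arm, race) are hypothetical placeholders rather than elements of the final analysis plan.

```python
# Sketch of comparing post-exposure decisional conflict across arms while
# adjusting for the pre-exposure score. The file and column names
# (analytic_file.csv, post_dc, pre_dc, arm, race) are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("analytic_file.csv")  # hypothetical de-identified analytic file

# ANCOVA-style model: arm effect on the post-exposure score, controlling for baseline.
model = smf.ols("post_dc ~ pre_dc + C(arm, Treatment(reference='NCI PDQ'))", data=df).fit()
print(model.summary())

# Exploratory subgroup analysis, e.g., among Black or African American participants.
subgroup = df[df["race"] == "Black or African American"]
print(smf.ols("post_dc ~ pre_dc + C(arm)", data=subgroup).fit().summary())
```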


Usability Survey

The web-based usability survey (Attachments 8h and 8i; a Microsoft Word version of this survey is provided in Attachment 8g for ease of review) will be administered to the 300 study participants in the Nathan arm of the study. The usability survey will be administered within 2 weeks of completing the post-exposure survey. The study participants will be sent a pre-notification email or text/SMS message one week before the survey launch inviting them to complete the online survey (Attachments 8a and 8b). Upon survey launch, the team will send a formal invitation by email or text message containing a unique web link to access the survey (Attachments 8c and 8d). The usability survey will focus on understanding the acceptability, perceived fit, and usability of the decision aid, as well as technology acceptance. It will also assess Nathan dosage (i.e., pathways completed and time spent on Nathan by the patient, also confirmed by use data gathered through the Nathan platform), help needed to review Nathan, and the impact of COVID-19 and telemedicine, and it will gather recommendations for improving Nathan’s content and functionality. A reminder email or text/SMS message that notes the deadline for responding will be sent to participants who have not responded to the survey 10 days before information collection ends (Attachments 8j and 8k). Based on a small pre-test, we estimate the usability survey will take approximately 18 minutes to complete in its entirety. Respondents will have a period of 2 weeks to complete the survey. As with the pre-exposure, post-exposure, and post-clinic visit surveys, if a participant does not respond after 2 weeks, the ICF team will call the study participant to complete the usability survey via telephone. All telephone interviews with non-responders will be conducted using the Voxco CATI data collection system. The data from the web and telephone surveys will be stored on the Voxco survey platform.


In addition to providing an understanding of the acceptability, perceived fit, and usability of the Nathan decision aid, as well as technology acceptance, the analysis of results will summarize the recommendations made by users for improving Nathan’s content and functionality.

Each participant responding to the web or telephone surveys described above will be assigned a randomly generated identification number. The identification number will be used to link participant information to survey responses for internal data tracking purposes. Separate databases will be used to house participants’ survey responses and their email addresses and telephone numbers; each will be stored in a separate secure file on a secure network server. Only ICF project staff will have access to these data. Only aggregate responses will be used in the report of study results. A de-identified data file will be created to share the data with CDC. In addition, all team members will be trained on the project’s specific security requirements and will sign an agreement to keep the data secure.
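
The sketch below illustrates, under assumed file names and fields, how randomly generated study IDs could be assigned and how contact information could be kept in a file separate from survey responses; it is not the actual ICF data management code.

```python
# Illustrative sketch of assigning randomly generated study IDs and storing
# contact information separately from survey responses. File names, fields,
# and the sample record are hypothetical placeholders.
import csv
import secrets

def new_study_id(existing_ids):
    """Generate a unique, non-identifying 8-digit study ID."""
    while True:
        sid = f"{secrets.randbelow(10**8):08d}"
        if sid not in existing_ids:
            existing_ids.add(sid)
            return sid

participants = [
    {"email": "participant@example.com", "phone": "555-0100"},  # placeholder record
]

issued_ids = set()
with open("linkage_file.csv", "w", newline="") as link_f, \
     open("survey_responses.csv", "w", newline="") as resp_f:
    link_writer = csv.writer(link_f)
    resp_writer = csv.writer(resp_f)
    link_writer.writerow(["study_id", "email", "phone"])  # contact info, stored separately
    resp_writer.writerow(["study_id"])                    # responses keyed by study ID only
    for person in participants:
        sid = new_study_id(issued_ids)
        link_writer.writerow([sid, person["email"], person["phone"]])
        resp_writer.writerow([sid])
```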


User Experience and Clinic Coordinator Interviews

A subset of participants in the intervention arm (n=30) who complete the usability survey will be invited to participate in a user experience interview (Attachments 9e and 9f). These interviews will gather a deeper understanding of Nathan’s acceptability, perceived fit, and usability; participants’ digital literacy; barriers and facilitators to use; and respondents’ recommendations for improving Nathan’s content and functionality. Based on a small pre-test, we estimate the user experience interview will take approximately 20 minutes to complete in its entirety.


For the clinic coordinator interviews, the team will conduct interviews with the clinic coordinators (n=4) from the four primary care clinics to get their perspective on barriers, facilitators, and best practices to incorporating Nathan into the clinic workflow (Attachment 11b). Based on a small pre-test, we estimate the clinic coordinator interview will take approximately 30 minutes to complete in its entirety.


Interviews with Nathan users and clinic coordinators will be conducted virtually (e.g., using Zoom). Interviewers will use the appropriate guide (user experience interview guide or clinic coordinator interview guide) to lead each discussion. Interviews will be audio recorded with permission from the participant. A second team member will be present for each interview and will take notes if the participant does not consent to being audio recorded. Audio recordings will be saved on ICF’s Microsoft Teams project site. Audio recordings will be transcribed, and any information that could identify a participant, clinic, or clinician will be removed. Textual data from interview transcripts will be entered into MAXQDA, a qualitative data analysis software program, for analysis.


The results of these interviews will be used to gain an in-depth understanding of the user experience with Nathan, satisfaction with that experience, behavioral intention, and recommendations for improving the aid, as well as a deeper understanding of health literacy barriers and the resolution of decisional conflict.


At the close of the study, men will receive a thank you email with the materials assigned to each of the three arms of the study (Attachments 10l and 10m).


B3. Methods to Maximize Response Rates and Deal with Nonresponse


To maximize response rates, participants will receive introductory emails as well as electronic (email or text/SMS message) pre-notifications and invitations that are addressed personally to each participant and include information shown to be critical for enhancing response (e.g., how the individual was selected, the usefulness of the information, and incentives for participation). These materials can be found in Attachments 4a, 5a, 5b, 5c, 5d, 5e, 7a, 7b, 8a, 8b, 8c, 8d, 9a, 9b, 10a, 10b, 10c, and 10d. Automatic survey reminders will be sent through Voxco to engage respondents who have not yet opened the email or responded to the survey (Attachments 4e, 5f, 5g, 6h, 6i, 7h, 7i, 8j, 8k, 10j, and 10k). Using Voxco’s platform, the study team can control the frequency of reminders and monitor responses. We expect that 50% of the study participants will complete the web-based surveys. For participants who do not respond to the web-based surveys, the ICF team will contact them by telephone and, with their consent, administer the survey by telephone.


We anticipate sufficient power to detect effects even with some sample attrition. Loss to follow-up will be analyzed to determine whether statistically significant differences exist in rates of attrition across the intervention and control arms. Differences in sample characteristics between those with and without missing values will also be described. For study participants who do not answer most of the survey questions, their responses will be included in the analyses for which data are available. Non-response to questions will be noted in the analysis.
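
As an illustration, the sketch below shows how differences in attrition across the three arms could be tested with a chi-square test; the counts are placeholders, not study data.

```python
# Sketch of testing whether loss to follow-up differs across the three arms.
# The counts below are placeholders, not study results.
import numpy as np
from scipy.stats import chi2_contingency

# Rows: arms (Nathan, MDPH decision aid, NCI PDQ); columns: completed, lost to follow-up.
attrition_table = np.array([
    [270, 30],
    [262, 38],
    [265, 35],
])

chi2, p_value, dof, expected = chi2_contingency(attrition_table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```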

For study participants who skip occasional questions, statistical analyses will be conducted to determine whether nonresponse to any question is correlated with specific responses or with nonresponse to other questions. This can indicate whether the data are missing at random, although the assessment may be inconclusive if nonresponse is minimal. Following the statistical analysis of missing data, the ICF team will make the final determination of whether the data can be treated as missing at random. Data judged to be missing at random will be imputed using multiple imputation methods: the data will be imputed multiple times, each imputed dataset will be analyzed separately, and the individual analyses will be pooled into a single result. The pooled imputed results and the complete-case results will be compared, and both will be reported.
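
The sketch below illustrates one way the multiple imputation and pooling steps could be carried out in Python using statsmodels' MICE implementation; the file name, variable names, and analysis model are hypothetical assumptions.

```python
# Sketch of multiple imputation with pooled results using statsmodels' MICE
# implementation, alongside a complete-case analysis for comparison.
# The file name, variable names, and analysis model are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.imputation.mice import MICE, MICEData

df = pd.read_csv("analytic_file.csv")  # hypothetical analytic file with missing values
analysis_vars = df[["post_dc", "pre_dc"]].copy()
analysis_vars["arm"] = df["arm"].astype("category").cat.codes  # numeric codes for MICEData

imp_data = MICEData(analysis_vars)                 # chained-equations imputer
mice_model = MICE("post_dc ~ pre_dc + C(arm)", sm.OLS, imp_data)
pooled = mice_model.fit(10, 20)                    # (burn-in cycles, number of imputations)
print(pooled.summary())                            # estimates pooled across imputations

# Complete-case analysis for comparison with the pooled imputed results.
complete = sm.OLS.from_formula("post_dc ~ pre_dc + C(arm)", data=analysis_vars.dropna()).fit()
print(complete.summary())
```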

B4. Tests of Procedures or Methods to be Undertaken


Web and CATI versions of the pre-exposure, post-exposure, usability, and post-clinic visit surveys were pre-tested with 6 participants. Average time for completion of the pre-exposure, post-exposure, and post-clinic visit surveys was 20 minutes; average time for completion of the usability survey was 18 minutes. The user experience interview guide was pretested with 3 participants using Zoom technology. The average time for completion was 20 minutes.


The participants who took part in pre-testing of the pre-exposure, post-exposure, and usability surveys and the user experience interview guide also participated in cognitive interviews to ensure the clarity of questions in those instruments. Very few problems were reported; those that were mainly concerned instructional language and question response options. Modifications, including clarifications to instructional language and additional or revised response options, were made to the instruments based on the feedback received from the pre-testing participants.


The clinic coordinator interview guide was pre-tested with 3 participants. The average time for completion was 30 minutes. The participants who took part in pre-testing of the clinic coordinator interview guide also participated in cognitive interviews to ensure the clarity of questions in that instrument. No problems were reported, and no modifications were made to the instrument based on the feedback received from the pre-testing participants.


B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


David Siegel, MD, MPH, and Thomas Richards, MD, of CDC are the CDC study leads and have overall responsibility for overseeing the design, conduct, and analysis of the study.


All data collection instruments, sampling and data collection procedures, and the analysis plan were designed in collaboration with the ICF team. ICF will conduct data collection and analysis in consultation with the CDC investigators mentioned above. Danielle Nielsen, MPH, has overall technical and financial responsibility for this study at ICF. She worked closely with several ICF staff, including Bhuvana Sukumar, PhD, MSW, Robert Stephens, PhD, Helen Coelho, MPH, Elizabeth Douglas, MPH, Sara Perrins, PhD, and Bryce McGowan, MPH, to design and implement this protocol. Dr. Sukumar led the design of the evaluation protocol, serves as the senior technical advisor for the project, and will also oversee report writing and dissemination. Dr. Stephens leads the evaluation team, overseeing evaluation design, implementation, and analysis. Ms. Coelho will oversee clinic and participant recruitment.


The ICF team assisting with data collection, analysis, and report writing for each of the data collection activities is described below.


For the provider and patient surveys, several highly trained ICF staff have extensive experience in quantitative data analysis, ranging from descriptive statistics to more complex methods such as linear, multiple, and logistic regression; hierarchical linear modeling; and ANOVA. Elizabeth Douglas, MPH, will oversee data collection and analysis and provide analytic support as needed. Robert Stephens, PhD, and Sara Perrins, PhD, will be primarily responsible for quantitative data analysis. Danielle Nielsen, MPH, Helen Coelho, MPH, and Bryce McGowan, MPH, will provide support, mainly related to data collection, entry, and cleaning. Robert Volk, MD, will provide scientific and technical guidance as needed. The individuals involved with data collection and analyses for survey activities are listed in Table B5A.


Table B5A. Individuals Responsible for Survey Data Collection and Analysis

Name                   | Contact Info              | Organization                                 | Role
Robert Stephens, PhD   | Bob.Stephens@icf.com      | ICF                                          | Team Lead, Statistician
Elizabeth Douglas, MPH | Elizabeth.Douglas@icf.com | ICF                                          | Evaluation and data collection design, data collection lead, data analyst
Sara Perrins, PhD      | Sara.Perrins@icf.com      | ICF                                          | Statistician
Danielle Nielsen, MPH  | Danielle.Nielsen@icf.com  | ICF                                          | Project Manager, Data collection and analysis support
Helen Coelho, MPH      | Helen.Coelho@icf.com      | ICF                                          | Recruitment lead, Data collection and analysis
Bryce McGowan, MPH     | Bryce.McGowan@icf.com     | ICF                                          | Project support, Qualitative data collection and analysis
Robert Volk, MPH       | BVolk@mdanderson.org      | University of Texas MD Anderson Cancer Center | Scientific and technical advisor


For the user experience and clinic coordinator interviews, no statistical sampling or estimation procedures are used in this data collection or analysis; therefore, no individuals were consulted on the statistical aspects of the design. The individuals involved with interview data collection and qualitative analysis are listed in Table B5B below.


Table B5B. Individuals Responsible for Interview Data Collection and Analysis

Name                   | Contact Info              | Organization | Role
Elizabeth Douglas, MPH | Elizabeth.Douglas@icf.com | ICF          | Evaluation and data collection design, data collection lead, data analyst
Danielle Nielsen, MPH  | Danielle.Nielsen@icf.com  | ICF          | Project Manager, Data collection and analysis support
Helen Coelho, MPH      | Helen.Coelho@icf.com      | ICF          | Recruitment lead, Data collection and analysis
Bryce McGowan, MPH     | Bryce.McGowan@icf.com     | ICF          | Project support, Qualitative data collection and analysis


