Outcomes Evaluation of the National Cancer Institute (NCI) Cancer Prevention Fellowship Program (CPFP)

OMB: 0925-0690





Supporting Statement B For:


Outcomes Evaluation of the NCI Cancer Prevention Fellowship Program (CPFP)



December 11, 2013





Jessica Faupel-Badger, PhD, MPH

Associate Director

Cancer Prevention Fellowship Program

National Cancer Institute


6120 Executive Boulevard

Suite 150E

Rockville, MD 20892-7105


Telephone: 301-402-8806

Fax: 301-480-2669

Email: badgerje@mail.nih.gov



Table of Contents




Attachments


1. A. Survey of Alumni (Screenshots)

B. Survey of Alumni (Word Document with all questions)

2. A. Survey of Applicants (Screenshots)

B. Survey of Applicants (Word Document with all questions)

3. A. Survey of F32 Awardees (Screenshots)

B. Survey of F32 Awardees (Word Document with all questions)

4. Privacy Impact Assessment

5. Privacy Act Memo

6. IRB Approval Letters

7. Email Survey Notice to CPFP Alumni, CPFP Applicants and F32 Awardees

8. Invitation Letter for Survey to CPFP Alumni, CPFP Applicants and F32 Awardees

9. Email Survey Reminder to CPFP Alumni, CPFP Applicants and F32 Awardees

10. Telephone Follow-up Script for CPFP Alumni, CPFP Applicants and F32 Awardees



B. STATISTICAL METHODS


The CPFP offers structured, formal training and is trans-disciplinary, with Fellows from different scientific backgrounds routinely working together over several years. It is also designed to prepare fellows for career opportunities in many settings beyond traditional academic pathways. An evaluation of the CPFP will provide insight to the NCI, NIH, and the broader postgraduate training community about the roles that structured training programs in general, and trans-disciplinary programs specifically, may have on career outcomes. Ultimately, we want to determine whether the CPFP is meeting its overarching goal of training leaders in the field of cancer prevention and control.

We propose to use a web survey to obtain quantitative outcome measures from CPFP alumni (Attachment 1) and from individuals in two comparison groups: CPFP applicants (Attachment 2) and F32 awardees (Attachment 3). To investigate the research questions, we selected two different comparison groups.


B.1 Respondent Universe and Sampling Methods

B.1.1 Respondent Universe


Eligible CPFP alumni were all alumni who entered the program as of August 31, 1987, were fellows for at least 12 months, and left the program no later than December 31, 2011. The review of the CPFP application database identified 200 alumni who met the inclusion criteria.

The comparison group of CPFP Applicants (Applicants) will consist of all persons who applied to the CPFP, were reviewed and interviewed by the entire CPFP Scientific Education Committee, but were not selected (unsuccessful applicants) between January 1, 1987 and December 31, 2011. The review of the CPFP application database identified 283 applicants who met the inclusion criteria.

The comparison group of F32 Awardees (F32 Awardees) will consist of early career scientists who were recipients of funding from NCI through the F32 mechanism who had relevant cancer prevention and control-oriented projects between January 1, 1987 and December 31, 2011. The review of the IMPACII database identified 367 recipients who met the inclusion criteria.

B.1.2. Sampling Methods


The survey will be conducted through a census of all CPFP alumni, all CPFP applicants who were interviewed by NCI, and all F32 awardees supported by NCI. Table B-1 shows the population count, sample size, expected response rate, and expected number of completes.


Table B-1. Target population, expected response rate, and expected number of completed surveys, by group

Population        Target population   Expected response rate   Expected # of completes
CPFP Alumni             200                   0.80                     160
CPFP Applicants         283                   0.60                     170
F32 Awardees            367                   0.60                     220
Total                   850                   0.65                     550
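The expected completes in Table B-1 follow directly from each group's population count and its assumed response rate. A minimal sketch of that arithmetic (the group names and rates are taken from the table; the variable names are illustrative):

```python
# Expected completes = population count x assumed response rate (Table B-1).
populations = {
    "CPFP Alumni": (200, 0.80),
    "CPFP Applicants": (283, 0.60),
    "F32 Awardees": (367, 0.60),
}

# Round each group's product to the nearest whole respondent.
expected = {group: round(n * rate) for group, (n, rate) in populations.items()}

total_pop = sum(n for n, _ in populations.values())  # 850
total_completes = sum(expected.values())             # 550
overall_rate = total_completes / total_pop           # ~0.65
```

This reproduces the table's bottom row: 550 expected completes out of 850, an overall rate of about 65 percent.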


While we expect an 80 percent response rate for CPFP alumni, we assume a 60 percent response rate for the CPFP Applicants and F32 Awardees. Although different factors may contribute to non-response, we anticipate two main causes: (1) failure to locate respondents and (2) refusal to participate. If we cannot locate a respondent, there is no possibility of obtaining a survey response.

The second group of non-respondents consists of those who are contacted and asked to participate but decline, whether by email, by telephone, or simply by not responding. Non-respondents in the CPFP Applicants group may decline because they were unsuccessful applicants to the CPFP; similarly, non-respondents in the F32 Awardees group may decline because they have no direct association with the CPFP.

Section B.1.4 addresses the plans to minimize the first type of nonresponse and Section B.3 addresses the plans to maximize the response rate.

B.1.3. Levels of Precision


As noted, the survey will be conducted as a census of all CPFP alumni, all interviewed CPFP applicants, and all F32 awardees supported by NCI. Because the survey operates at a 100 percent sampling rate, a power analysis is not necessary; instead, the precision of estimates depends on the response rate. Precision is usually measured by the standard deviation of the estimator, known as the standard error. For example, the sample mean is used to estimate the population mean, and the precision of this estimator improves as the response rate increases.

We conducted a precision analysis using a minimum expected response rate of 65 percent. The calculation focuses on categorical variables because they are the most important for addressing the key study questions.

Using the following equation, we determined that with a sample size of 550 (65% of the population of 850) and standard assumptions (alpha = .05, power = .80), sample estimates of proportions will have a standard error of approximately 0.012:

$$SE(\hat{p}) = \sqrt{\left(1 - \frac{n}{N}\right)\frac{\hat{p}(1-\hat{p})}{n}}$$

where $\hat{p}$ is the estimated proportion, $n$ is the sample size, and $N$ is the population size.

Also, we determined that with a sample size of 550 and alpha = .05, a 95 percent confidence interval for a yes/no question (assuming a sample proportion of .50) would extend from approximately 0.475 to 0.525.
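The precision figures above can be reproduced from the standard error formula with the finite population correction. A short sketch using the section's numbers (n = 550, N = 850, p = .50; the function name is illustrative):

```python
import math

def se_proportion(p, n, N):
    """Standard error of an estimated proportion with the finite
    population correction (1 - n/N)."""
    fpc = 1 - n / N
    return math.sqrt(fpc * p * (1 - p) / n)

se = se_proportion(p=0.50, n=550, N=850)       # ~0.0127
half_width = 1.96 * se                         # 95% margin, ~0.025
ci = (0.50 - half_width, 0.50 + half_width)    # ~(0.475, 0.525)
```

Without the finite population correction the standard error would be slightly larger (about 0.021); the census design is what tightens the interval.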

B.1.4. Updating Respondent Contact Information


We will need to update contact information for respondents in order to administer the survey. This information will include e-mail addresses, telephone numbers (work or home), and work or home addresses.

Although the CPFP application and IMPACII database are reliable sources to generate the contact list, some of those pieces of information may not be available or be outdated. Updating the contact information prior to the administration of the web survey is critical.

Westat will employ trained staff to trace contact information (email addresses, employer, home/work addresses, home/work phone numbers) using a number of Internet search tools (e.g., Google, LinkedIn, People Search) and other methods (e.g., use of key term combinations). Staff will include individuals with tracing experience trained in various tracing techniques by one of the Westat team members.


B.2 Procedures for the Collection of Information

Data collection procedures consist of the following:

  • Upon OMB approval, potential participants with a valid email address will be sent an introductory email notice about the survey, Email Survey Notice to CPFP Alumni, CPFP Applicants and F32 Awardees (Attachment 7).


  • Two weeks after the email notification, an email invitation letter will be sent to all potential participants to encourage participation in the study. The invitation letter requests the individual’s participation, introduces the purpose and content of the survey, includes instructions on how to complete the web version of the survey, and provides contact information in case of queries (Attachment 8).


  • Reminder emails and telephone follow-ups will be conducted if the survey is not completed (see B.3.1 for more details).


Data will be gathered using a self-administered online survey of approximately 160 CPFP Alumni (Attachment 1), 170 CPFP Applicants (Attachment 2) and 220 F32 Awardees (Attachment 3). The survey will collect information about the experiences, satisfaction, professional pursuits and professional outputs of respondents. The collection of the evaluation assessment data will provide NCI with a better understanding of how CPFP was implemented and how feedback from CPFP program fellows can enhance the quality of the program for future postdocs.

B.2.1. Quality Control


The contractor for this study will establish and maintain quality control procedures to ensure standardization, and high standards of data collection and data processing. The contractor will maintain a log of all decisions that affect sample enrollment and data collection. The contractor will monitor response rates and completeness of acquired data, and provide NCI with progress reports in agreed upon intervals.



B.3 Methods to Maximize Response Rates and Deal with Nonresponse

B.3.1. Follow-up


To improve response rates, follow-up efforts will be used to encourage survey completion. More specifically, the efforts to reduce the number of non-respondents consist of the following:

  • Reminder emails will be sent to all non-respondents beginning two weeks after initiation of the survey. A second reminder will be sent to all non-respondents three weeks after initiation, Email Survey Reminder to CPFP Alumni, CPFP Applicants and F32 Awardees (Attachment 9).


  • Telephone follow-up for non-response will begin four weeks after initiation of the survey. Experienced telephone interviewers, who are trained in non-response conversion, will make up to four attempts for the follow-up calls. The call is a prompting call only encouraging the potential participant to complete the survey. The telephone interviewers will use the script, Telephone Follow-up Script for CPFP Alumni, CPFP Applicants and F32 Awardees (Attachment 10), for the call.


B.3.2. Unit Nonresponse


As noted, to reduce nonresponse, experienced interviewers who are particularly skilled in nonresponse conversion will re-contact non-respondents. The major exception to this rule is hard refusals (i.e., individuals who have requested not to be contacted again). To address unit nonresponse due to hard refusals, weights will be used to adjust for nonresponse within cells defined by known key variables (e.g., individuals' application years).

B.3.3. Item Nonresponse


Although our procedures are designed to maximize item response rates, the analysis will need to confront the issue of missing data. Experience with previous surveys indicates that some respondents will omit responses to some specific items (e.g., sensitive items), although they may have provided most of the data required. By employing good survey data collection practices, we expect to minimize the amount of missing data on any single variable to a very low level. However, if item nonresponse is unexpectedly high for any of the key analytic variables, hot deck imputation techniques will be used to estimate missing-item values.
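In a random hot deck, a missing item value is replaced by a value drawn from a respondent ("donor") in the same imputation class. A minimal sketch under assumed inputs (the class variable, career sector, and the data values are illustrative, not from the study):

```python
import random

def hot_deck(records, class_key, item, rng=random):
    """Fill missing `item` values from donors sharing the same class."""
    donors = {}
    for rec in records:
        if rec[item] is not None:
            donors.setdefault(rec[class_key], []).append(rec[item])
    for rec in records:
        if rec[item] is None:
            # Draw a donor value at random from the same class.
            rec[item] = rng.choice(donors[rec[class_key]])
    return records

records = [
    {"sector": "academia", "salary_band": 3},
    {"sector": "academia", "salary_band": None},
    {"sector": "government", "salary_band": 2},
    {"sector": "government", "salary_band": None},
]
hot_deck(records, "sector", "salary_band")
```

Because imputed values are drawn from observed ones, hot deck preserves the realized distribution of the item within each class, which is why it suits the categorical variables emphasized in this study.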

For analyses involving just one or two variables that have not been subject to imputation, we will handle the problem by omitting the cases with missing data; or, in the case of categorical response variables, we will use an explicit “missing” or “unknown” category. When multivariate techniques involving several variables are used, analytic techniques for missing values will be used (such as using the variable mean or adding a dummy variable to reflect how the non-respondents differ from the other individuals).


B.4 Test of Procedures or Methods to be Undertaken

Westat conducted a pilot test to inform development of the survey and a subsequent pretest to refine the collection of information, to minimize burden and to improve utility. The survey instruments have been reviewed by Westat survey methodologists and NCI and NIH staff.

B.4.1. Pilot Test


Convenience sampling was used to select respondents for a pilot test to help inform development of the survey. The pilot testing was conducted between December 2012 and January 2013. CPFP alumni were notified by email of the CPFP evaluation and informed about qualitative alumni interviews that would help develop the survey. Stratified random sampling was used to select alumni for those interviews, with current career sector—government, academia, or private sector—as the stratifying variable. Twenty-five alumni1 participated in the pilot and were stratified by career sector into three groups. Alumni were asked general questions about their career and experience with the CPFP. The information collected during the pilot helped inform the development of the survey for the full scale evaluation.

B.4.2. Pretest of Alumni Survey


Westat conducted a pretest of the CPFP alumni survey during May 2013. Alumni who were not selected for the pilot interviews but had emailed with an offer of assistance were contacted for the pretest. Additional pretest participants, who had not participated in the pilot, were randomly selected from the alumni roster. Six CPFP alumni participated in the pretest, all of whom completed the survey and participated in a follow-up telephone interview.

Pretest participants were asked to complete and return the survey and to participate in a telephone debriefing interview. The interviews focused on capturing participants' experience with the survey, including the time needed to complete it, questions or instructions that were confusing, lists of response options that seemed incomplete, and information that was difficult to provide. Interviews generally lasted about 30 minutes, and all six were held on the same day the participant returned the survey. The interviews explored key survey questions, and probes specific to individual participants were also developed based on review of the completed surveys.

B.4.3. Key Findings and Survey Modifications


This section summarizes key findings from the pretest and modifications made to the surveys in response to the findings, where applicable.

  • Completion time: The average completion time reported by CPFP alumni was 25 minutes. Modifications made to the questionnaire following the pretest have not significantly altered its length, so the average time reported by pretest participants has been listed in the introduction as the estimated time needed to respond.


  • Overall feedback: Feedback from pretest participants most commonly identified survey items that required more clarity, either through instructions or minor modifications of the items themselves. The participants also reacted positively to the open-ended items.


  • Question 1: Participants were unclear whether question 1 about their postdoctoral experience should include CPFP. Instructions were added telling alumni to include CPFP.


  • Question 8: One participant recommended adding a question that asks respondents how long they have been at their current job because this provides important contextual information about their job-related responses. This question has been added.


  • Question 9: Participants indicated that question 9 about the discipline(s) of their current work was somewhat confusing because some categories covered a broad range of disciplines, while other categories were narrower “sub-areas” of broader disciplines. Slight modifications were made to this question by combining two related disciplines—epidemiology and public health, the disciplines identified as being the cause of the confusion.


  • Questions 12 – 14: Several participants indicated that open-ended questions asking for the percentage of time spent on a specific activity may be difficult for some respondents to answer. These questions were revised to provide respondents with a range of percentages to choose from. The word “approximately” was also bolded in the question stem to signal that estimates are appropriate when answering these questions.


  • Question 14: A question about the percentage of time spent teaching and advising students was added, per a participant’s recommendation.


  • Question 17: When answering this question about presentation activities, one participant indicated that she included presentations given by her students for which she helped them prepare. Instructions were added telling respondents to exclude presentation activities conducted by students.


  • Question 22: Several respondents indicated that most of the questions on career advancement were either/or questions and a five-point extent scale was not appropriate. The response categories have been revised to “Yes/No” options.


  • Questions 27 – 29: With respect to program benefits, some participants noted that the knowledge and skills they gained during CPFP were beneficial, but they may not use them in their current jobs. As a result, participants were rating the benefits they received from the program lower on a five-point scale because those benefits are not applicable in their current positions. This series of questions has been revised to distinguish between benefits gained during the program and benefits that have affected respondents’ careers.


B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


The following individuals were critical in developing the research plan, the conceptual framework, survey questions, and sampling strategies underlying Evaluation of the CPFP program. Many of the same individuals will be involved with analysis once the data are collected.



Sophia Tsakraklides

Westat Principal Investigator for CPFP

Westat

1600 Research Blvd.
Rockville, MD 20850

301-738-3580

SophiaTsakraklides@westat.com

Kimberley Raue

Westat Survey Manager for CPFP

Westat

1600 Research Blvd.
Rockville, MD 20850
301-294-38
kimberleyraue@westat.com

Atsushi Miyaoka

Senior Study Director

Westat

1600 Research Blvd.

Rockville, MD 20850

301-610-4948

atsushimiyaoka@westat.com


David Morganstein

Vice President

Director of Statistical Staff

Westat

1600 Research Blvd.

Rockville, MD 20850

Phone: 301-251-8215

DavidMorganstein@westat.com

Weijia Ren, Ph.D.

Statistician

Westat

1600 Research Blvd.

Rockville, MD 20850

Phone: 301-251-8297

weijiaren@westat.com

Maura Spiegelman

Research Analyst

Westat

1600 Research Blvd.

Rockville, MD 20850

301-315-5960

MauraSpiegelman@westat.com


1 This was found to be in violation of the Paperwork Reduction Act. NCI agrees that they should have sought PRA clearance prior to conducting the pilot and pre-testing activities. Both Westat and NCI have been instructed to contact either the NCI PRA Liaison or the NIH Project Clearance Branch should similar activities want to be conducted in the future so that a determination can be made prior to information being collected.

