Supporting Statement B for the Program Review of the Division of Acquired Immunodeficiency Syndrome (DAIDS) Policy Implementation Program
Dione Washington, MS
Division of AIDS
6700B Rockledge Drive
Bethesda MD 20892
Phone: 301 594 2764
Fax: 301 402 1506
National Institute of Allergy and Infectious Diseases
National Institutes of Health
U.S. Department of Health and Human Services
Table of Contents
B.1 Respondent Universe and Sampling Methods
B.2 Procedures for the Collection of Information
B.3 Methods to Maximize Response Rates and Deal with Nonresponse
B.4 Test of Procedures or Methods to be Undertaken
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
B.1 Respondent Universe and Sampling Methods

The collection of information for this program review does not necessitate statistical sampling methods because the participant population of extramural researchers is small and allows for a census approach. There are no unusual circumstances requiring specialized sampling procedures. A survey will be administered via the web, and in-person focus groups will be conducted, using the questions previously described (which are also found in Attachments 3 and 4). Data will be collected by contractor staff, all of whom are experienced in data collection procedures.
The respondent universe and sampling methods, by data collection method, are as follows:
Surveys – The target population is the extramural researchers in the DAIDS research programs, who are recipients of funding from DAIDS to conduct and review research. This target population is composed of Principal Investigators, Site Coordinators, Site Leaders of Clinical Research Sites (CRSs) and Research Networks, and Clinical Site Monitors of the CTUs and CRSs. The extramural researchers will be surveyed on their knowledge, attitudes, and perceptions of DPIP (see Attachment 3).
Web-based surveys will be administered annually to all persons identified in the target group (i.e., a census approach) in order to capture potential changes in respondents’ perceptions and attitudes regarding DPIP. In addition, this data collection schedule will be coordinated with, and allow timely collection of information on, new waves of policies as they become effective. A survey should take no more than approximately one hour to complete. Additionally, extramural researchers will receive up to two email reminders before the survey closes.
We will strive to achieve an 80% response rate on the survey. Based on the size of the target population of extramural researchers (N = 392), this means obtaining at least 314 responses (314/392 = 80%). We will pursue this target by applying best practices for survey design (e.g., a user-friendly survey) and survey communications (e.g., reminders). Note that we need to obtain a minimum of 194 responses to be able to state, with 95% confidence and a ±5% margin of error, that the results are representative of the population. This level of response (194/392 = 49%) not only allows us to conclude generalizability but is also consistent with guidance on average response rates (e.g., Babbie (1990) cites a response rate of 50% as sufficient for analysis and reporting; other sources cite 40% to 50% as average response rates for online surveys).
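For reference, the minimum of 194 responses corresponds to the standard finite-population sample size calculation at 95% confidence with a ±5% margin of error. The following is a minimal sketch of that arithmetic; the function name and defaults are ours for illustration:

```python
def required_sample_size(population, confidence_z=1.96, proportion=0.5, margin=0.05):
    """Cochran's sample size formula with a finite-population correction.

    population: size of the target population (N)
    confidence_z: z-score for the confidence level (1.96 ~ 95%)
    proportion: assumed population proportion (0.5 is most conservative)
    margin: acceptable margin of error (0.05 = +/-5%)
    """
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin ** 2
    return n0 / (1 + (n0 - 1) / population)  # finite-population correction

print(round(required_sample_size(392)))  # -> 194, i.e., 194/392 = 49% of the population
```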
Focus Groups – Focus groups will also be conducted with the defined target population. Focus group questions will be structured, and all focus groups will receive the same questions (see Attachment 4). Focus groups will have no more than 10 participants and will last approximately two hours. Focus group members will participate in a focus group only once, to minimize burden. Focus groups will be conducted in conjunction with DAIDS Policy Training events and Network Meetings to minimize the time, travel, and dollar expenditures borne by sites. Therefore, a convenience sample will be utilized for the focus groups.
We intend to conduct no more than 10 focus groups. Assuming a conservative average of eight participants at each of the 10 sessions, we anticipate collecting focus group data from 80 extramural researchers. Given that there are 392 extramural researchers overall, this represents a response rate of approximately 20% (80/392 = 20.4%).
Overall, these two methods will allow us to tap the perceptions of approximately 80% of the extramural researchers via the web-based survey and approximately 20% via the focus groups. Because there will likely be overlap (some extramural researchers will participate in both the survey and a focus group session), these numbers are not additive. At the program review’s conclusion, we will be able to assert that the survey response rate represents, at a minimum, the percentage of extramural researchers who participated in the program review in some capacity.
B.2 Procedures for the Collection of Information

The program review team will employ a web-based survey as the first data collection activity. We will use a census approach; the survey will be administered to all of the extramural researchers in the sampling frame. To draw attention to the survey and to enhance the response rate, the program review team will distribute a survey announcement prior to the survey’s administration and will also send up to two reminders during the survey administration period (see Attachment 1). To enhance the quality of the data received, the survey tool is programmed to require a response to each question, while allowing participants to select a “not applicable” or “don’t know” option for any item.
The program review team will conduct focus groups as the second data collection activity. A convenience sampling approach will be used, drawing from extramural researchers who are in attendance at an existing DPIP training event or DAIDS research meeting and who are willing to participate. We will provide advance notification of the focus groups to potential participants via the DPIP listserv, the HIV/AIDS Network Coordination (HANC) website, and the DAIDS-wide distribution list. Prior to a training event or research meeting, we will send an email to attendees informing them of the purpose of the session and soliciting their participation (see Attachment 2). Focus groups will have no more than 10 participants and will last approximately two hours. Participants will take part in a focus group only once, to minimize burden. Focus groups will be conducted in conjunction with DAIDS Policy Training events to minimize the time, travel, and dollar expenditures borne by sites. In advance of conducting the focus groups, the focus group facilitators (the Booz Allen team) will develop a focus group procedures document, which includes the focus group questions. The facilitators are experienced in conducting focus groups for various Government agencies, including the National Institutes of Health. As part of preparing for the DPIP focus group activity, the Booz Allen team will also conduct an internal mock focus group to refine its facilitation technique.
B.3 Methods to Maximize Response Rates and Deal with Nonresponse

As stated in B.1, we will strive for a response rate target of 80% and will also strive not to fall below a minimum response rate of 49% for the web-based survey. Given that there are 392 extramural researchers, this means we will seek to obtain at least 314 responses (314/392 = 80%). We will also ensure that we meet the minimum number of responses (n = 194) that will allow us to generalize, with 95% confidence, that the results are representative of the population from which they were drawn (194/392 = 49%). Also as described in B.1, we anticipate a response rate of 20.4% (80/392) for the focus group sessions.
In the event that the response rate target is not achieved, we will send an additional communication to encourage participation and will extend the survey administration period. After the survey has closed, we will perform a non-response bias analysis, which will allow us to evaluate whether the respondents and the non-respondents are reasonably similar. If few differences exist, we can increase our confidence that the survey results are reasonably representative of the population overall.
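The following is a minimal sketch of one form such a non-response bias analysis could take, assuming the sampling frame records a known characteristic (here, job function) for every extramural researcher; the file and column names are hypothetical:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# Hypothetical sampling frame: one row per extramural researcher, with a known
# attribute (job function) and a flag indicating whether the person responded.
frame = pd.read_csv("sampling_frame.csv")  # columns: job_function, responded

# Cross-tabulate job function by response status and test for independence.
table = pd.crosstab(frame["job_function"], frame["responded"])
chi2, p_value, dof, expected = chi2_contingency(table)

# A non-significant result suggests respondents and non-respondents are
# distributed similarly across job functions, supporting representativeness.
print(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```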
The use of a web-based survey allows respondents to key in their responses directly, which will help to maximize response rates for extramural researchers. In addition, response rates for the survey and focus groups will be maximized by providing potential respondents with targeted communications about the program review data collection activities before and during the data collection period, a well-documented technique for boosting response rates. These communications will clearly state the intent of the data collection effort, why participation is important, how the data will be used, and that participation is voluntary. Survey recipients will first receive notification of the program review via the DPIP listserv, a posting on the HANC website, and the DAIDS-wide distribution list.
For the survey, the potential respondents will then receive an email requesting their participation (see Attachment 1). The respondents will be identified from the HANC listserv and the DAIDS Enterprise System, which houses the list of extramural researchers. Once the web survey has opened, an email reminder will be sent after one week, and a second reminder several days prior to the survey’s closing, to maximize response rates.
For the focus groups, once a DPIP training or DAIDS research meeting has been identified, we will receive the list of attendees from DAIDS. Those attendees who are extramural researchers will receive an email informing them of the focus group and requesting their participation (see Attachment 2). We will also send a reminder email prior to the focus group date and time.
B.4 Test of Procedures or Methods to be Undertaken

No specific formal pretesting procedures are planned for this program review. As an effective means of refining the data collection and minimizing burden, the survey and focus group questions have been thoroughly reviewed by the Advisory Committee. In addition, we will ask a small group of individuals within DAIDS to access the survey instrument initially and provide feedback on the usability of the tool.
The program review consists of a mixed-methods approach – a web-based survey and focus groups – to address the research questions tied to the DPIP program goals of awareness, accessibility, understandability, and harmonization.
The DPIP has not conducted a program review before. Therefore, the results of the first data collection will serve as the baseline against which subsequent data collections will be compared.
For each of the DPIP program goals, research questions were developed. These research questions were then parsed into measures and metrics for development of the data collection instruments (a web survey and focus group questions). The data collection questions were designed to help determine the success of DPIP in reaching its program goals (see Attachments 3 and 4). The success of the program will be determined by the extent to which the program goals are being met: extramural researchers are aware of DPIP policies and procedures (P&Ps) and can access them; extramural researchers find the P&Ps clear and understand them; extramural researchers know which P&Ps apply to their research portfolio; and extramural researchers perceive that the P&Ps help harmonize DAIDS-funded research. Each data collection method (i.e., survey, focus group) will provide preliminary results by looking at an issue from a single-method perspective. We will then integrate our findings across methods to more fully examine how well the program is meeting its goals.
More specifically, the web-based survey tool will allow us to capture quantifiable data from participants on their perceptions of and experiences with the DPIP program. Because these data are quantifiable, we will be able to gain a broad understanding of progress toward goals. The focus group data will then allow us to delve deeper into participants’ perceptions on similar questions, given our ability, in a group setting, to probe responses. We will then integrate survey and focus group findings to create a fuller understanding of progress toward goals.
Survey Data: Once the web-based survey administration period has closed, the program review team will extract the data from the survey tool and will transfer the data into a statistical software package, such as SPSS (Statistical Package for the Social Sciences). We will perform an initial review of the data to determine the extent of missing data (if any) and whether any adjustments will be needed to address missing data.
We will then run basic descriptive statistics to characterize the data. We will run frequencies on the full data set, including the three demographic items (site location, job function, and network), the 19 items on a 5-point agreement scale, and the 12 items that have categorical or open-ended response options. We will run means, medians, and standard deviations on the items that are on an interval scale (i.e., the 19 items on an agreement scale). During this process, we will also examine the data for the presence of any outliers and will consider their impact on the results. Finally, we will tabulate the results of these quantitative analyses for data presentation. The descriptive statistics will be the primary analysis of the survey data and will allow us to answer the research questions.
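The following is a minimal sketch of these descriptive statistics, shown in Python with pandas purely as an illustration rather than in the SPSS package named above; the export file and item names are hypothetical:

```python
import pandas as pd

# Hypothetical survey export; the 19 agreement items are coded 1-5, with
# "N/A" and "Don't know" responses stored as missing values.
df = pd.read_csv("survey_export.csv")
agreement_items = [c for c in df.columns if c.startswith("q_agree_")]

# Frequencies for a demographic item (counts per category).
print(df["job_function"].value_counts(dropna=False))

# Means, medians, and standard deviations for the interval-scale items.
print(df[agreement_items].agg(["mean", "median", "std"]).T)

# Simple outlier screen: flag responses more than 3 SDs from an item's mean.
z = (df[agreement_items] - df[agreement_items].mean()) / df[agreement_items].std()
print((z.abs() > 3).sum())  # count of potential outliers per item
```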
For example, one of DPIP’s goals is applicability; meeting this goal assumes that extramural researchers are able to correctly identify which P&Ps apply to the projects in their portfolios. Research Question 5 for this program review asks: Does the target population understand which policies apply to the research projects in their portfolios? From this research question, the program review team developed a sub-question that states, “Target populations find DPIP activities and non-DPIP activities useful.” With this sub-question, we are attempting to gauge whether extramural researchers find that they can apply the policies to their clinical research activities and how exactly they do so. Therefore, we ask two survey questions built on a five-point scale of agreement:
I am able to apply DAIDS policies when designing clinical research activities. (Five-point scale with N/A option: Strongly Disagree --> Disagree --> Neither Agree nor Disagree --> Agree --> Strongly Agree, or N/A)

I am able to apply DAIDS policies when carrying out clinical research activities. (Five-point scale with N/A option: Strongly Disagree --> Disagree --> Neither Agree nor Disagree --> Agree --> Strongly Agree, or N/A)
The results from these survey questions provide some understanding of whether extramural researchers believe that they can (or cannot) apply the DAIDS policies. We will be able to document the number and percentage of respondents who agree with these statements. The focus group questions further elucidate this understanding by asking extramural researchers how they apply the policies:
How did you apply policies to design your research studies?
How can the policies be more useful when you design your research studies?
How did you apply policies to carry out your research studies?
How can the policies be more useful when you carry out your research studies?
Focus Group Data: The program review team will use thematic analysis techniques to analyze the focus group data. This will allow us to extract the most common responses across respondents for each question. Once the focus groups are completed, we will prepare the qualitative data for analysis by cleaning the raw notes. We will then, on a per-question basis, identify the key themes that emerged and quantify the frequency of key themes across respondents. We will also mark each set of notes by type of interviewee (e.g., job function) so that we can stratify the responses. Finally, we will tabulate the results of these qualitative analyses for data presentation.
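The following is a minimal sketch of the theme-frequency step, assuming each focus group response has already been hand-coded with a theme and tagged by question and job function; the records shown are illustrative only:

```python
from collections import Counter

# Illustrative coded focus group notes: one record per coded response.
coded_responses = [
    {"question": "Q1", "theme": "policies used to draft procedure manuals", "job_function": "Site Coordinator"},
    {"question": "Q1", "theme": "internal training sessions held", "job_function": "Principal Investigator"},
    # ... one entry per coded response from the cleaned notes
]

# Frequency of each theme, per question, most common first.
theme_counts = Counter((r["question"], r["theme"]) for r in coded_responses)
for (question, theme), count in theme_counts.most_common():
    print(f"{question}: {theme} ({count})")

# Stratify by job function to compare themes across interviewee types.
by_role = Counter((r["job_function"], r["theme"]) for r in coded_responses)
print(by_role)
```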
The focus group results will provide further evidence for the survey results because this data collection method allows respondents to provide richer, qualitative responses. For example, responses to the focus group questions:
How did you apply policies to design your research studies?
How can the policies be more useful when you design your research studies?
How did you apply policies to carry out your research studies?
How can the policies be more useful when you carry out your research studies?
could include details about using the policies to design procedure manuals for studies or conducting internal training sessions that help site staff in the conduct of their research. Such responses would also provide DPIP with ideas on how to improve the policies for applicability; for example, extramural researchers may suggest providing more examples of applicable use or checklists to accompany the policies.
The combination of survey and focus group questions for this research question provides data on whether extramural researchers feel that they are applying the policies, and then evidence of how they apply them. Together, this information yields a perspective on whether extramural researchers are applying the policies. Progress toward the applicability goal will be demonstrated if a majority of survey respondents Agree or Strongly Agree that they apply the policies; the evidence from the focus group results will then show how the policies are applied.
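The following is a minimal sketch of that survey-side tabulation; the item names are hypothetical, and codes 4 and 5 denote Agree and Strongly Agree:

```python
import pandas as pd

df = pd.read_csv("survey_export.csv")  # hypothetical export, items coded 1-5

for item in ["q_apply_design", "q_apply_carry_out"]:  # hypothetical item names
    answered = df[item].dropna()  # N/A responses excluded from the base
    pct = (answered >= 4).mean() * 100  # share answering Agree or Strongly Agree
    print(f"{item}: {pct:.1f}% agree or strongly agree (n = {len(answered)})")
```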
Below are nine tables presenting each of the research questions that were derived from the DPIP program goals. For each research question, associated sub-questions, performance measures, metrics, data sources, and analytic techniques were developed. The metrics present the tabulation expected for the data item – for example, the percentage of target populations that expressed satisfaction.
Table 1. Research Question 1: How effectively does OPCRO make the target populations aware of policies?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
1.1 What are the methods by which target populations know about policies? | Type of policy dissemination activity used | | |
1.2 Does DPIP disseminate policy information in a timely manner? | Time when policy is received | | |
Table 2. Research Question 2: Are policies readily accessible?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
2.1 Are target populations satisfied with access to policies? | Access to policies | | |
2.2 Are target populations satisfied with access to policies? | Provision of sufficient information on accessing new policies and procedures | | | Descriptive Statistics – Frequencies, Means, Medians, Standard Deviations
Table 3. Research Question 3: Are the policies written so that the content is clear?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
3.1 Do target populations understand the policies? | Policies are easy to understand | | |
Table 4. Research Question 4: Is there additional support to facilitate understanding of the new policies?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
4.1 Do target populations find training useful to understand the policies? | Effectiveness of policy training used | | |
4.2 Do target populations benefit from repeat training? | Renewed training helps support the understanding of new policies | | |
4.3 Does DPIP provide adequate clarification on policy questions? | Effectiveness of resources available to facilitate the understanding of new policies | | |
Table 5. Research Question 5: Does the target population understand which policies apply to the research projects in their portfolios?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
5.1 Target populations utilize different methods to determine how policies apply to their studies | Training effectiveness on the applicability of policies | | |
5.2 Target populations utilize different methods to determine how policies apply to their studies and find DPIP activities and non-DPIP activities useful | Communication mechanism effectiveness on the applicability of policies | | |
5.3 Target populations find DPIP activities and non-DPIP activities useful | Ability to apply policies to research projects in their portfolios | | |
Table 6. Research Question 6: Do policies effectively communicate staff roles and responsibilities in projects?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
6.1 Target populations understand whose study roles are affected by the applied policies | Ability of policies to inform distinct research roles and responsibilities | | |
Table 7. Research Question 7: Do policies simplify the implementation of DAIDS funded/sponsored research for Extramural Researchers?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
7.1 Target populations found that the policies provided a basis for carrying out clinical research | | | |
7.2 Target populations can identify barriers and facilitators to policy implementation | Facilitators and barriers to the implementation of policies | | |
Table 8. Research Question 8: Do policies apply broadly within networks and non-networks?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
8.1 Target populations can implement policies regardless of study location or type | Policies provide standards to consistently implement research | | |
Table 9. Research Question 9: Do standardized policies facilitate greater integration among DAIDS funded clinical research programs?
Sub-Question | Performance Measures | Metrics | Data Sources | Analytic Techniques
9.1 Target populations found that policies helped facilitate collaboration between sites | Policies promote collaboration among sites (both in- and non-network, cross sites, sites hosting multiple trials, etc.) | | |
B.5 Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data

Statistical Aspects Contact
Dione Washington, MS
Health Specialist/Project Officer
National Institute of Allergy and Infectious Diseases
(301) 594-2764
Data Collection/Analysis and Statistical Contact – Booz Allen Hamilton Contract Team
Rajni Samavedam, MPH
Project Director
(301) 838-3647
Elaine Brenner, PhD
Statistician / Methodology Lead
(703) 984-0063
Elizabeth Coppolecchia, MHSA
Lead Analyst
(240) 314-5933
Jenna Goldstein, MPH
Analyst
(301) 838-3600
Nadeeka Jayatilake, PhD (in progress)
Analyst
(703) 984-7568
Attachment 1: Email Communication to Extramural Researchers about Survey
Attachment 2: Email Communication to Extramural Researchers about Focus Groups
Attachment 3: NIAID DAIDS Survey Questions for Extramural Researchers
Attachment 4: NIAID DAIDS Focus Group Questions for Extramural Researchers