Improving Fetal Alcohol Spectrum Disorders Prevention and Practice through Practice and Implementation Centers and National Partnerships
OMB # 0920-xxxx
Supporting Statement Part A
New Request
Nancy Cheal, Ph.D., R.N.
Health Scientist
Centers for Disease Control and Prevention
Email: ncheal@cdc.gov
Phone: 404-498-6764
Fax: 404-498-3070
February 29, 2016
Goal of the study: The purpose of this information collection is to evaluate the collaboration between the Practice and Implementation Centers (PICs) and national partner organizations, and to evaluate the work of the PICs and partners.
Intended use of the resulting data: Data will be used to measure whether the efforts of the PICs and partners have resulted in practice changes among targeted healthcare providers; to evaluate and improve project trainings and FASD prevention messages; to gauge the success of collaboration efforts; and to provide recommendations for future practice change efforts.
Methods to be used to collect data: Data will be collected through online or paper-and-pencil pre/post/follow-up surveys of training efforts (as well as other surveys of PIC target disciplines and audiences), and through qualitative key informant interviews conducted in person or via telephone.
The subpopulation to be studied: The target population is healthcare practitioners and students in the following disciplines: medical assistants, nursing, obstetrics and gynecology, pediatrics, social work, and family physicians. This project will also collect data from grantee staff and representatives of health systems.
How data will be analyzed: Quantitative analyses planned by grantees and the cross-site evaluator include cross-tabulations, t-tests, bivariate regression analysis, chi-square and McNemar’s tests, repeated measures ANCOVA, and MANOVA/MANCOVA. Qualitative content analyses will also be conducted.
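To illustrate one of the pre/post comparisons named above, the following minimal sketch (in Python, using hypothetical data and hypothetical variable names; it is not part of any project instrument or analysis plan) shows a paired t-test on continuous knowledge scores and a McNemar’s test on a yes/no practice behavior measured before and after training.

# Minimal sketch with hypothetical data; assumes the scipy and statsmodels packages.
import numpy as np
from scipy.stats import ttest_rel
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical knowledge scores for the same respondents before and after training.
pre_scores = np.array([62, 70, 55, 80, 66, 74, 59, 68])
post_scores = np.array([75, 78, 64, 85, 72, 80, 70, 71])
t_stat, p_value = ttest_rel(post_scores, pre_scores)
print(f"Paired t-test: t = {t_stat:.2f}, p = {p_value:.3f}")

# Hypothetical 2x2 table for a yes/no practice behavior reported at pre-test (rows)
# and follow-up (columns): [[yes/yes, yes/no], [no/yes, no/no]].
behavior_table = np.array([[30, 5],
                           [25, 40]])
result = mcnemar(behavior_table, exact=True)
print(f"McNemar's test: statistic = {result.statistic}, p = {result.pvalue:.3f}")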
Table of Contents
A.1. Circumstances Making the Collection of Information Necessary
A.2. Purpose and Use of the Information Collection
A.3. Use of Improved Information Technology and Burden Reduction
A.4. Efforts to Identify Duplication and Use of Similar Information
A.5. Impact on Small Businesses or Other Small Entities
A.6. Consequences of Collecting the Information Less Frequently
A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.9. Explanation of Any Payment or Gift to Respondents
A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents
A.11. Institutional Review Board (IRB) and Justification of Sensitive Questions
A.12. Estimates of Annualized Burden Hours and Costs
A.13. Estimates of Other Total Annual Cost Burden to Respondents and Record Keepers
A.14. Annualized Cost to the Federal Government
A.15. Explanation for Program Changes or Adjustments
A.16. Plans for Tabulation and Publication and Project Time Schedule
A.17. Reason(s) Display of OMB Expiration Date is Inappropriate
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions
LIST OF ATTACHMENTS
Attachment A: Applicable Laws or Regulations
A1. Section 301 of the Public Health Service Act (42 U.S.C. 241)
Attachment B: Average Annualized Cost to the Federal Government
Attachments C1–C2: Federal Register Notice
Attachment C1: Published 60-Day Federal Register Notice
Attachment C2: Public Comments from 60-Day Federal Register Notice
Attachments D1 – D3: FASD Core Training Survey Instruments
D1. FASD Core Training Survey – Pre-Test
D2. FASD Core Training Survey – Post-Test
D3. FASD Core Training Survey – 6 Month Follow-Up
Attachments E1 – E3: Cross-Site Evaluation Instruments
E1. DSW Report
E2. High Impact Study: Discipline Specific Workgroup Discussion Guide for Project Staff
E3. High Impact Study: Key Informant Interview - Health Care System Staff
Attachments F1 – F7: Medical Assistants Discipline-Specific Workgroup Instruments
F1. Medical Assistants Pre-Test Survey
F2. Medical Assistants Pre-Test Survey (Academic)
F3. Medical Assistants Post-Test Survey
F4. Medical Assistants Post-Test Survey (Academic)
F5. Medical Assistants Follow Up Survey
F6. Medical Assistants Follow Up Survey (Academic)
F7. Medical Assistants Change in Practice Survey
Attachments G1 – G8: Nursing Discipline-Specific Workgroup Instruments
G1. Pre-Training Survey for Nursing
G2. Post-Training Survey for Nursing
G3. Six Month Follow Up Survey for Nursing
G4. Nursing DSW Polling Questions
G5. Key Informant Interviews with Champions
G6. Brief Questionnaire for Nursing Organization Memberships
G7. Friends and Members of the Network Survey
G8. Healthcare Organization Utilization Survey
Attachments H1 – H9: Obstetrics & Gynecology Discipline-Specific Workgroup Instruments
H1. OBGYN SBI Knowledge and Agency
H2. OBGYN BI-MI Proficiency Rating Scale – Provider Skills Training Baseline
H3. OBGYN BI-MI Proficiency Rating Scale – Standardized Patient Version
H4. OBGYN BI-MI Proficiency Rating Scale – Provider Follow Up (3m & 6m)
H5. OBGYN Avatar Training Satisfaction Survey
H6. OBGYN FASD SBI Training Event Evaluation
H7. OBGYN Qualitative Key Informant Interview – Pre-Training
H8. OBGYN Qualitative Key Informant Interview – Post-Training
H9. OBGYN Telecom Training Satisfaction Survey
Attachments I1 – I9: Pediatrics Discipline-Specific Workgroup Instruments
I1. AAP Three Month Follow Up Evaluation Instrument
I2. AAP Six Month Follow Up Evaluation Instrument
I3. Pediatric FASD Regional Liaison/Champion Training Session Evaluation
I4. Pediatric FASD Regional Education and Awareness Liaisons Work Plan
I5. FASD Toolkit User Survey
I6. FASD Toolkit Evaluation Focus Group/Guided Interview
I7. Survey of Pediatricians – Baseline and Follow Up
I8. AAP Pre-Training Evaluation Survey
I9. AAP Post-Training Evaluation Survey
Attachments J1 – J4: Social Work and Family Physicians Discipline-Specific Workgroup Instruments
J1. Social Work and Family Physicians Pre-Training Survey
J2. Social Work and Family Physicians Post-Training Survey
J3. Social Work and Family Physicians 6-Month Follow Up Survey
J4. Family Medicine Evaluation Questions Addendum for Practice or Individual Provider
Attachments K1 – K4: National Organization on Fetal Alcohol Syndrome (NOFAS) Instruments
K1. NOFAS Webinar Survey
K2. NOFAS Three Month Follow Up Webinar Questionnaire
K3. NOFAS Pre-Test Survey
K4. NOFAS Post-Test Survey
Attachments L1 – L3: Cross-Discipline Instruments
L1. Clinical Process Improvement Survey
L2. Organizational Readiness to Change
L3. TCU Organizational Readiness Survey
Attachment M: FASD Evaluation Working Group Members
Attachments N1 – N3: FASD Core Training Survey Instruments – Screenshots
N1. FASD Core Training Survey – Pre-Test – Screenshot
N2. FASD Core Training Survey – Post-Test – Screenshot
N3. FASD Core Training Survey – 6 Month Follow-Up – Screenshot
Attachment O: Cross-Site Evaluation Instruments – Screenshots
O1. DSW Report – Screenshot
Attachments P1 – P7: Medical Assistants Discipline-Specific Workgroup Instruments - Screenshots
P1. Medical Assistants Pre-Test Survey – Screenshot
P2. Medical Assistants Pre-Test Survey (Academic) – Screenshot
P3. Medical Assistants Post-Test Survey – Screenshot
P4. Medical Assistants Post-Test Survey (Academic) – Screenshot
P5. Medical Assistants Follow Up Survey – Screenshot
P6. Medical Assistants Follow Up Survey (Academic) – Screenshot
P7. Medical Assistants Change in Practice Survey – Screenshot
Attachments Q1 – Q7: Nursing Discipline-Specific Workgroup Instruments – Screenshots
Q1. Pre-Training Survey for Nursing – Screenshot
Q2. Post-Training Survey for Nursing – Screenshot
Q3. Six Month Follow Up Survey for Nursing – Screenshot
Q4. Nursing DSW Polling Questions – Screenshot
Q5. Brief Questionnaire for Nursing Organization Memberships – Screenshot
Q6. Friends and Members of the Network Survey – Screenshot
Q7. Healthcare Organization Utilization Survey – Screenshot
Attachments R1 – R9: Obstetrics & Gynecology Discipline-Specific Workgroup Instruments – Screenshots
R1. OBGYN SBI Knowledge and Agency – Screenshot
R2. OBGYN BI-MI Proficiency Rating Scale – Provider Skills Training Baseline – Screenshot
R3. OBGYN BI-MI Proficiency Rating Scale – Standardized Patient Version – Screenshot
R4. OBGYN BI-MI Proficiency Rating Scale – Provider Follow Up (3m & 6m) – Screenshot
R5. OBGYN Avatar Training Satisfaction Survey – Screenshot
R6. OBGYN FASD SBI Training Event Evaluation – Screenshot
R7. OBGYN Qualitative Key Informant Interview – Pre-Training – Screenshot
R8. OBGYN Qualitative Key Informant Interview – Post-Training – Screenshot
R9. OBGYN Telecom Training Satisfaction Survey – Screenshot
Attachments S1 – S6: Pediatrics Discipline-Specific Workgroup Instruments – Screenshots
S1. AAP Three Month Follow Up Evaluation Instrument – Screenshot
S2. AAP Six Month Follow Up Evaluation Instrument – Screenshot
S3. FASD Toolkit User Survey – Screenshot
S4. Survey of Pediatricians – Baseline and Follow Up – Screenshot
S5. AAP Pre-Training Evaluation Survey – Screenshot
S6. AAP Post-Training Evaluation Survey – Screenshot
Attachments T1 – T4: Social Work and Family Physicians Discipline-Specific Workgroup Instruments – Screenshots
T1. Social Work and Family Physicians Pre-Training Survey – Screenshot
T2. Social Work and Family Physicians Post-Training Survey – Screenshot
T3. Social Work and Family Physicians 6-Month Follow Up Survey – Screenshot
T4. Family Medicine Evaluation Questions Addendum for Practice or Individual Provider – Screenshot
Attachments U1 – U4: National Organization on Fetal Alcohol Syndrome (NOFAS) Instruments – Screenshots
U1. NOFAS Webinar Survey – Screenshot
U2. NOFAS Three Month Follow Up Webinar Questionnaire – Screenshot
U3. NOFAS Pre-Test Survey – Screenshot
U4. NOFAS Post-Test Survey – Screenshot
Attachments V1 – V3: Cross-Discipline Instruments – Screenshots
V1. Clinical Process Improvement Survey – Screenshot
V2. Organizational Readiness to Change – Screenshot
V3. TCU Organizational Readiness Survey – Screenshot
A. Justification
A.1. Circumstances Making the Collection of Information Necessary
This Information Collection Request is submitted as a new collection. Office of Management and Budget (OMB) approval is requested for a data collection period of 3 years. The National Center on Birth Defects and Developmental Disabilities (NCBDDD) at the Centers for Disease Control and Prevention (CDC) is making this request as authorized by Section 301 of the Public Health Service Act (42 U.S.C. 241) (Attachment A1).
Background
Prenatal alcohol use is the cause of a range of birth defects and developmental disabilities, collectively known as fetal alcohol spectrum disorders (FASDs). This term describes the spectrum of physical, mental, behavioral, and/or learning disabilities that can result from prenatal alcohol exposure. Although it is not known how many people have FASDs, recent estimates place the prevalence of fetal alcohol syndrome (FAS), the most severe condition in the spectrum, at 6 to 9 per 1,000 children, and the prevalence of FASDs at 24 to 48 per 1,000 children (May et al., 2014).
FASDs are completely preventable if a woman does not drink alcohol during pregnancy. However, data from a 2012 report indicate that one in 13 pregnant women reports alcohol use in the past month, and one in 17 pregnant women reports binge drinking in the past month. In addition, more than half of all women of childbearing age report some alcohol use, and one in seven reports binge drinking in the past month (CDC, 2012). Many of these women are at risk for an alcohol-exposed pregnancy, even if they are not intending to become pregnant. In 2005, the U.S. Surgeon General re-issued a 1981 advisory that women who are pregnant or considering becoming pregnant should abstain from using alcohol.
Healthcare professionals play a crucial role in identifying women at risk for an alcohol-exposed pregnancy and in identifying the effects of prenatal alcohol exposure in individuals. However, despite the data on alcohol consumption among women of childbearing age and the prevalence of FASDs, screening for alcohol use among female patients of childbearing age and diagnosis of conditions along the FASD continuum are not yet routine standards of care. New data collection is needed to evaluate both FASD training programs for healthcare professionals addressing the prevention, identification, and treatment of FASDs and collaborative efforts to effect practice change among healthcare professionals and systems change within healthcare systems.
Although most primary care providers ask patients about their alcohol use, research suggests they do not follow recommended methods of screening or delivering brief interventions to patients who drink too much. Vinson and colleagues (2013) found that most primary care physicians are not screening systematically, but instead rely on their own intuition about whether the patient is likely to be using too much alcohol. Findings from their study indicated that clinician judgment missed most patients with a potential problem. Similarly, a systematic review found that healthcare professionals require sufficient knowledge about alcohol guidelines and risks in order to implement screening and interventions (Johnson et al., 2011). Findings such as these affirm the need for systems to be in place that allow providers to systematically screen, identify risky drinkers, provide evidence-based interventions, and bill appropriately for these services.
CDC has funded Fetal Alcohol Spectrum Disorders Regional Training Centers (FASD RTCs) since 2002 to train healthcare professionals and students in the prevention, identification, and treatment of FASDs. The FASD RTCs were evaluated by an external peer review panel in July 2013. The panel reaffirmed the need for the RTCs but recommended several changes. It identified a need for more comprehensive national coverage, discipline-specific trainings, increased use of technology, greater collaboration with medical societies, and stronger linkages with national partner organizations to increase the reach of training opportunities. The panel also suggested that the training centers should focus on demonstrable practice change and sustainability and should emphasize primary prevention of FASDs.
In response to the panel’s recommendations, CDC undertook a program redesign. Each center, now called a Practice and Implementation Center (PIC), will be charged with developing discipline-specific trainings and other learning opportunities to be implemented, shared across centers, and disseminated by organizations such as medical societies, national professional organizations, and national partner organizations. While a major focus of the grantees’ work will be national, regional approaches will also be used to develop new content and to test the feasibility and acceptability of materials, especially among healthcare providers and medical societies. Evaluation efforts will be primarily dedicated to measuring practice change among targeted healthcare providers and systems change within healthcare systems.
A.2. Purpose and Use of Information Collection
The current FASD project focuses on collaboration between the PICs and national partners to support the enhancement of FASD training, provider education, and practice change. To determine whether the project’s trainings and other outreach efforts meet these goals, information must be collected on whether participants are satisfied with the trainings and whether their knowledge and behavior have changed. Information about core project trainings will be collected at multiple points in time: pre-training (Attachment D1), immediately post-training (Attachment D2), and at follow-up 6 months post-training (Attachment D3). The FASD Core Training Surveys contain a set of items assessing knowledge, practice behaviors, and comfort and self-efficacy to perform certain skills related to the prevention, identification, and treatment of FASDs, allowing certain aspects of the collective activities of the Discipline Specific Workgroups (DSWs, described in section A.10) to be evaluated with consistent measures. This will provide CDC’s FAS Prevention Team with information regarding the effectiveness of the project as a whole and will assist with future program planning. Without this information collection, it will not be possible to ascertain whether the DSWs are effective in improving knowledge, skills, and practice behaviors within their respective disciplines.
In addition to the core training surveys, the DSWs and NOFAS have also created tailored evaluation instruments (Attachments F-K) to gather survey or interview data from their specific target populations. The results of these supplemental data collection efforts will assist each DSW and NOFAS in understanding whether they have reached and met the needs of their specific target audiences.
The cross-site evaluator will use the DSW Report (Attachment E1) to assess the effectiveness of collaboration efforts among the PICs and partners within each DSW. The high impact studies (Attachments E2 and E3) and the DSW systems change projects (Attachments L1-L3) will be used to assess practice and systems changes that result from the training efforts undertaken by the DSWs.
A.3. Use of Improved Information Technology and Burden Reduction
As noted previously, the core project trainings, along with their associated data collections, are planned to be administered online. Instruments that will be administered online (or are planned to have the option of online administration) include the pre, post, and follow-up Core Training Survey Instruments (Attachment D); the cross-site evaluation DSW Report (Attachment E1); all instruments specific to the Medical Assistants DSW (Attachment F); seven of the eight instruments specific to the Nursing DSW (Attachments G1-G4 and G6-G8); all instruments specific to the OBGYN DSW (Attachment H); six of the nine instruments specific to the Pediatrics DSW (Attachments I1-I2, I5, and I7-I9); all instruments specific to the Social Work and Family Physicians DSW (Attachment J); all instruments specific to NOFAS (Attachment K); and all instruments for the DSW healthcare systems change projects (Attachment L). In other words, 88% of the data collection instruments (44 of 50) will be administered via advanced information technology. (See the burden table in section A.12.) This will reduce the burden on participants by allowing instant submission of responses and by not requiring responses to be returned via mail.
In some instances, trainings may be administered in person. It is not feasible to conduct the evaluations at the beginning and end of in-person trainings electronically, since internet access may not be available and response rates for surveys completed later from a different location (rather than immediately at the end of the training) would be significantly lower. Qualitative data collections, such as key informant interviews, generally cannot be conducted online. Doing so would place a high burden on respondents to type answers to open-ended questions and would likely lead to less complete data, because an interviewer would not be able to follow up on unclear information. When possible, however, qualitative data collection will be conducted via telephone to decrease burden on respondents. In addition, interview scheduling will be flexible in order to accommodate varying respondent work schedules.
A.4. Efforts to Identify Duplication and Use of Similar Information
There are no similar data. The trainings held by the DSWs are unique and not conducted by other organizations, so ongoing data collection to evaluate these trainings, and their resulting systems and practice changes, is needed.
A.5. Impact on Small Businesses or Other Small Entities
No small businesses will be involved in this data collection.
A.6. Consequences of Collecting the Information Less Frequently
For core project trainings, information will be collected from participants at three points in time: immediately prior to the training (Attachment D1), immediately following the training (Attachment D2), and at a 6-month follow-up (Attachment D3). It is important to assess the effectiveness of the trainings for all participants, and the follow-up survey is necessary to assess whether knowledge gained through the trainings is retained and whether the actual practice behavior of the medical professionals who attended has changed. Collecting information less frequently would not allow accurate evaluation of the trainings, and particularly of their impact on practice change.
The high impact studies (Attachments E2 and E3) and systems change studies (Attachments L1-L3) will also collect information at multiple time points. Collecting information at multiple time points is necessary to understand baseline conditions within targeted health systems and allow us to assess whether any practice or systems changes can be attributed to interventions implemented through this project. If information were collected less frequently, we would not be able to accurately assess whether project interventions are effective at leading to change.
A small number of other project data collections will occur at multiple time points (see Attachments E1, G5, G7, and H4). In each of these cases, the data collection is designed to occur at two time points during the year to assess changes in the measured items. Conducting the survey less frequently would not allow these instruments to detect change.
All other data collection efforts are planned to occur only once.
A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5
This request fully complies with the guidelines of 5 CFR 1320.5.
A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A copy of the agency’s 60-day Federal Register Notice, as required by 5 CFR 1320.8(d), was published on September 15, 2015 (Vol. 80, pp. 55362-55364) (Attachment C1).
Four public comments were received; they are summarized in Attachment C2.
From January to June 2015, representatives from several organizations outside of CDC were consulted and asked to review the data collection instruments for this study. Please see Attachment M for a detailed list of collaborators.
A.9. Explanation of Any Payment or Gift to Respondents
This collection of information does not involve any payment or gift to respondents.
A.10. Protection of the Privacy and Confidentiality of Information Provided by Respondents
Privacy Impact Assessment
This submission has been reviewed by the NCBDDD Privacy Officer, who determined that the Privacy Act does not apply. Activities do not involve the collection of individually identifiable information.
Overview of the Data Collection System
An overview of the organization of the current project is helpful to understand the proposed data collection system. Key agencies involved with the project include CDC, the PICs, national partners, the National Organization on Fetal Alcohol Syndrome (NOFAS), an evaluation consultant, a cross-site evaluator, and a web development contractor.
Each PIC is paired with a national partner organization to form Discipline Specific Workgroups (DSWs) representing specific specialty groups of practicing healthcare professionals. There are six DSWs: Medical Assistants, Nursing, Obstetrics & Gynecology (OBGYN), Pediatrics, Social Work, and Family Physicians. NOFAS serves as a national partner and a resource to all DSWs. The cross-site evaluator (Westat) will analyze cross-site data to assess the effectiveness of the project. The web development contractor (ICF International and its subcontractor ORAU) will design the website that will host core project trainings and their associated data collection surveys.
A core set of FASD trainings will be developed with input from members of all DSWs. These core trainings will be offered to healthcare professionals across all disciplines targeted by the project. To assess the effectiveness of these core trainings, a set of core evaluation instruments has been developed to gather data immediately prior to training (Attachment D1), immediately after training (Attachment D2), and at a six-month post-training follow-up (Attachment D3). These instruments will be administered online and will be programmed by the web development contractor. The core instruments contain items that assess knowledge, practice behaviors, comfort, and self-efficacy to perform certain skills related to the prevention, identification, and treatment of FASDs. To meet the needs of their specific target audiences, however, each DSW plans to create its own discipline-specific trainings. The discipline-specific trainings will be assessed through pre, post, and follow-up instruments tailored to each training (Attachments F1-F6, G1-G3, H2, H4, I1-I2, I8-I9, J1-J3, and K3-K4). These instruments will be administered online or via paper and pencil.
In addition to the core trainings and associated data collection, most DSWs (and NOFAS) have planned additional discipline- or audience-specific data collection efforts to meet the needs of their target audiences. These data collection efforts include online, telephone, and/or in-person qualitative key informant interviews and surveys. These DSWs, and NOFAS, will be responsible for their own data collection and analysis for these additional efforts. See Attachment F for data collection instruments from the Medical Assistants DSW; Attachment G for data collection instruments from the Nursing DSW; Attachment H for data collection instruments from the OBGYN DSW; Attachment I for data collection instruments from the Pediatrics DSW; Attachment J for data collection instruments from the Social Work and Family Physicians DSWs; and Attachment K for data collection instruments from NOFAS. Attachment L includes a set of instruments that will be used by all DSWs to assess healthcare systems change.
Finally, the cross-site evaluator, Westat, is also tasked with assessing practice and systems changes resulting from the project, as well as collaboration among DSW members. To assess practice and systems change, Westat will conduct several high impact studies consisting primarily of in-person and telephone qualitative interviews and observations within targeted health systems. To assess collaboration among DSW members, Westat will ask grantees to complete a DSW Report once every six months to provide feedback on collaboration within each DSW. The DSW Report is a Word document that DSW members will complete digitally and email to Westat. See Attachment E for all cross-site evaluation instruments.
Any electronic data will be stored on password-protected servers within each DSW, with the project’s web development contractor, and/or with the cross-site evaluator. Paper-and-pencil surveys, when conducted, will contain no personally identifiable information and will be stored in a locked file room at the DSWs’ respective offices, separate from all other project data. The data and subsequent analyses will be stored electronically for five years, at which time they will be destroyed. Access to raw data will be limited to project collaborators (as identified in Attachment M). CDC will receive only summarized, aggregate data in the form of evaluation reports, interim progress reports, and final project reports.
Screenshots of all instruments that will be administered online are attached to this submission. Attachment N includes screenshots of FASD Core Training instruments; Attachment O includes screenshots of cross-site evaluation instruments; Attachment P includes screenshots of Medical Assistants DSW instruments; Attachment Q includes screenshots of Nursing DSW instruments; Attachment R includes screenshots of OBGYN DSW instruments; Attachment S includes screenshots of Pediatrics DSW instruments; Attachment T includes screenshots of Social Work and Family Physicians DSW instruments; Attachment U includes screenshots of NOFAS DSW instruments; and Attachment V includes screenshots of instruments that will be used by all DSWs to assess healthcare systems change.
Items of Information to Be Collected
No personally identifiable information will be collected. The majority of data collection activities will be anonymous; the evaluation forms themselves will have no identifying information or any link to names or contact information.
Several surveys (which are designed to be administered at pre-test, post-test, and follow-up) will use a code to link to an individual respondent; this code, however, will not be stored by project staff. The code will be created by the respondent, and will comprise a series of letters and numbers that the respondent can remember and reproduce on each survey in order to link them.
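As an illustration of how such respondent-generated codes allow pre- and post-training records to be linked without identifying information, the following minimal sketch (in Python, with hypothetical codes, scores, and column names; it does not represent the project’s actual data management procedures) merges two survey waves on the self-generated code.

import pandas as pd

# Hypothetical pre- and post-test records keyed only by the respondent-generated code.
pre = pd.DataFrame({"link_code": ["AB12", "CD34", "EF56"], "pre_score": [60, 72, 55]})
post = pd.DataFrame({"link_code": ["AB12", "EF56", "GH78"], "post_score": [75, 66, 80]})

# An inner merge keeps only respondents whose code appears in both waves; no names or
# contact information are ever stored alongside the responses.
linked = pre.merge(post, on="link_code", how="inner")
print(linked)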
Data collection for the high impact studies is not anonymous due to the relatively small number of participants in each study, but all information collected will be kept secure. Participant names will not be linked to their responses in any reports resulting from interviews through this project.
Identification of Website(s) and Website Content Directed at Children Under 13 Years of Age
No website content directed at children under 13 years of age is involved in this information collection request.
A.11. Institutional Review Board (IRB) and Justification of Sensitive Questions
IRB Approval.
This is a non-research data collection. The NCBDDD Human Subjects Officer has reviewed this collection and determined that IRB approval is not required for this activity.
Sensitive Questions
No sensitive questions will be asked.
A.12. Estimates of Annualized Burden Hours and Costs
The information will be collected from the following types of respondents: FASD core training participants, project grantee staff, health system staff, nurses, healthcare organization representatives, pediatricians, obstetrician-gynecologists, family medicine physicians, students in allied health professions, residency directors, training coordinators, clinical directors, certified medical assistants, social workers, social work students, NOFAS webinar attendees, NOFAS training participants, and systems change project participants. Burden estimates are based on projections from each DSW, NOFAS, and the cross-site evaluator of how many participants they will reach annually and how long each evaluation instrument is estimated to take for a respondent to answer. As noted in the table of estimated annualized burden hours, each organization plans to use a variety of instruments to evaluate their own activities. See Attachments D-L for all proposed evaluation instruments.
It is estimated that data collection will include 29,573 participants each year, for a total of 88,719 over the three-year approval period. The total estimated annual burden is 3,790 hours. (See Table 1 for details.) There are no costs to respondents other than their time.
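As an illustration of how the estimates in Table 1 are constructed, each row’s total burden is the number of respondents multiplied by the number of responses per respondent and by the average burden per response (expressed as minutes over 60). A minimal sketch in Python, using the figures reported for the FASD Core Training Survey – Pre-Test row:

# Worked example using the Table 1 figures for the FASD Core Training Survey - Pre-Test.
respondents = 4013              # estimated annual respondents
responses_per_respondent = 1
minutes_per_response = 9        # average burden per response (9/60 of an hour)
burden_hours = respondents * responses_per_respondent * minutes_per_response / 60
print(round(burden_hours))      # 602 hours, matching the Table 1 entry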
Table 1. Estimated Annualized Burden Hours
Type of Respondents | DSW/Organization | Form Name | No. of Respondents | No. Responses per Respondent | Average Burden per Response (in hours) | Total Burden Hours
Project Grantee Staff | Westat (Cross-Site Evaluator) | DSW Report | 90 | 2 | 10/60 | 30
Project Grantee Staff | Westat (Cross-Site Evaluator) | High Impact Study: Discipline Specific Workgroup Discussion Guide for Project Staff | 10 | 2 | 60/60 | 20
Health Care System Staff | Westat (Cross-Site Evaluator) | High Impact Study: Key Informant Interview - Health Care System Staff | 10 | 2 | 60/60 | 20
FASD Core Training Participants | Westat (Cross-Site Evaluator) | FASD Core Training Survey – Pre-Test | 4013 | 1 | 9/60 | 602
FASD Core Training Participants | Westat (Cross-Site Evaluator) | FASD Core Training Survey – Post-Test | 4013 | 1 | 5/60 | 335
FASD Core Training Participants | Westat (Cross-Site Evaluator) | FASD Core Training Survey – 6 Month Follow-Up | 4013 | 1 | 6/60 | 402
Nurses | Nursing | Pre-Training Survey for Nursing | 667 | 1 | 9/60 | 101
Nurses | Nursing | Post-Training Survey for Nursing | 550 | 1 | 9/60 | 83
Nurses | Nursing | Six Month Follow-Up Training Survey for Nursing | 440 | 1 | 9/60 | 66
Nurses | Nursing | Nursing DSW Polling Questions | 417 | 1 | 5/60 | 35
Nurses | Nursing | Key Informant Interviews with Champions | 14 | 2 | 45/60 | 21
Nurses | Nursing | Brief Questionnaire for Nursing Organization Memberships | 2934 | 1 | 10/60 | 489
Nurses | Nursing | Friends & Members of the Network Survey | 34 | 2 | 10/60 | 12
Healthcare Organization Representatives | Nursing | Healthcare Organization Utilization Survey | 234 | 1 | 30/60 | 117
Obstetrician-Gynecologists and students in allied health professions | OBGYN | OBGYN SBI Knowledge & Agency | 600 | 1 | 2/60 | 20
Obstetrician-Gynecologists | OBGYN | OBGYN BI-MI Proficiency Rating Scale - Provider Skills Training Baseline | 600 | 1 | 3/60 | 30
Students in allied health professions | OBGYN | OBGYN BI-MI Proficiency Rating Scale - Standardized Patient Version | 600 | 1 | 3/60 | 30
Obstetrician-Gynecologists | OBGYN | OBGYN BI-MI Proficiency Rating Scale - Provider Follow Up (3m & 6m) | 600 | 2 | 3/60 | 60
Obstetrician-Gynecologists and students in allied health professions | OBGYN | OBGYN Telecom Training Satisfaction Survey | 480 | 1 | 5/60 | 40
Obstetrician-Gynecologists and students in allied health professions | OBGYN | OBGYN Avatar Training Satisfaction Survey | 120 | 1 | 5/60 | 10
Obstetrician-Gynecologists | OBGYN | OBGYN FASD-SBI Training Event Evaluation | 124 | 1 | 2/60 | 5
Residency Directors, Training Coordinators, Clinical Directors, Obstetrician-Gynecologists | OBGYN | OBGYN Qualitative Key Informant Interview - Pre-Training | 34 | 1 | 25/60 | 15
Residency Directors, Training Coordinators, Clinical Directors, Obstetrician-Gynecologists | OBGYN | OBGYN Qualitative Key Informant Interview - Post-Training | 34 | 1 | 25/60 | 15
Certified Medical Assistants and students | Medical Assistants | Medical Assistant – Pre-Test Survey | 334 | 1 | 10/60 | 56
Students in allied health professions | Medical Assistants | Medical Assistant – Pre-Test Survey (Academic) | 67 | 1 | 10/60 | 12
Certified Medical Assistants and students | Medical Assistants | Medical Assistant – Post-Test Survey | 334 | 1 | 10/60 | 56
Students in allied health professions | Medical Assistants | Medical Assistant – Post-Test Survey (Academic) | 67 | 1 | 10/60 | 12
Certified Medical Assistants and students | Medical Assistants | Medical Assistant Follow Up Survey | 200 | 1 | 10/60 | 34
Students in allied health professions | Medical Assistants | Medical Assistant Follow Up Survey (Academic) | 17 | 1 | 10/60 | 3
Certified Medical Assistants and students | Medical Assistants | Medical Assistants Change in Practice Survey | 250 | 1 | 15/60 | 63
Pediatricians | Pediatrics | Survey of Pediatricians - Baseline and Follow Up | 534 | 2 | 10/60 | 178
Pediatricians | Pediatrics | AAP Post-Training Evaluation Survey | 120 | 1 | 7/60 | 14
Pediatricians | Pediatrics | AAP Pre-Training Evaluation Survey | 120 | 1 | 7/60 | 14
Pediatricians | Pediatrics | AAP Three Month Follow Up Evaluation Survey | 120 | 1 | 2/60 | 4
Pediatricians | Pediatrics | AAP Six Month Follow Up Evaluation Survey | 120 | 1 | 5/60 | 10
Pediatricians | Pediatrics | FASD Toolkit User Survey | 50 | 1 | 15/60 | 13
Pediatricians | Pediatrics | FASD Toolkit Evaluation Focus Group/Guided Interview | 10 | 1 | 30/60 | 5
Pediatricians | Pediatrics | Pediatric FASD Regional Education and Awareness Liaisons Work Plan | 10 | 1 | 20/60 | 4
Pediatricians | Pediatrics | Pediatric FASD Regional Liaison/Champion Training Session Evaluation | 10 | 1 | 4/60 | 1
Family Medicine Physicians | Social Work and Family Medicine | Family Medicine Evaluation Questions Addendum for Practice or Individual Provider | 62 | 1 | 8/60 | 9
Family medicine physicians, social workers, social work students | Social Work and Family Medicine | Social Work and Family Physicians Pre-training Survey | 1167 | 1 | 8/60 | 156
Family medicine physicians, social workers, social work students | Social Work and Family Medicine | Social Work and Family Physicians Post-training Survey | 1167 | 1 | 5/60 | 98
Family medicine physicians, social workers, social work students | Social Work and Family Medicine | Social Work and Family Physicians 6-Month Follow Up Survey | 1167 | 1 | 8/60 | 156
NOFAS webinar attendees | NOFAS | NOFAS Webinar Survey | 601 | 1 | 2/60 | 20
NOFAS webinar attendees | NOFAS | NOFAS Three Month Follow-Up Webinar Questionnaire | 601 | 1 | 2/60 | 20
NOFAS training participants | NOFAS | NOFAS Pre-Test Survey | 551 | 1 | 3/60 | 28
NOFAS training participants | NOFAS | NOFAS Post-Test Survey | 551 | 1 | 3/60 | 28
Systems change project participants | Cross-DSW | Clinical Process Improvement Survey | 246 | 2 | 10/60 | 82
Systems change project participants | Cross-DSW | TCU Organizational Readiness Survey | 246 | 2 | 10/60 | 82
Systems change project participants | Cross-DSW | Organizational Readiness to Change Assessment | 220 | 2 | 10/60 | 74
TOTAL | | | 29,573 | | | 3,790
Estimates of the annualized cost to respondents for the burden hours were based on mean hourly wages from the U.S. Department of Labor’s “May 2014 National Occupational Employment and Wage Estimates” (http://www.bls.gov/oes/current/oes_nat.htm); see Table 2 for details. For rows containing multiple respondent types, or where the specific occupation of the respondent is unclear, wage rates were calculated as follows (a brief worked example of the cost calculation follows this list):
Project Grantee Staff: Average of rates for “Medical Scientists” and “Social Scientists and Related Workers.”
Health Systems Staff: Average of rates for “Medical and Health Services Managers” and “Health Diagnosing and Treating Practitioners.”
Students in Allied Health Professions: Rate for “All Occupations.” (This wage rate was selected because the jobs that students might have while in school are unknown.)
FASD Core Training Participants: Average of rates for “All Occupations” and “Health Diagnosing and Treating Practitioners.”
Healthcare Organization Representatives: Rate for “Medical and Health Services Managers.”
Obstetrician-Gynecologists and Students in Allied Health Professions: Average of “Obstetricians and Gynecologists” and “All Occupations.”
Residency Directors, Training Coordinators, Clinical Directors, Obstetrician-Gynecologists: Rate for “Physicians and Surgeons.”
Certified Medical Assistants and Students: Average of rate for “Medical Assistants” and “All Occupations.”
Family Medicine Physicians, Social Workers, Social Work Students: Average of rates for “Family and General Practitioners,” “Healthcare Social Workers,” and “All Occupations.”
NOFAS Webinar Attendees and NOFAS Training Participants: Average of rates for “Health Diagnosing and Treating Practitioners,” “Education, Training, and Library Occupations,” “Legal Occupations,” and “All Occupations.”
Systems Change Project Participants: Average of rates for “Medical and Health Services Managers” and “Health Diagnosing and Treating Practitioners.”
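For each row of Table 2, the total respondent cost is simply the row’s total burden hours from Table 1 multiplied by the applicable hourly wage rate. A minimal sketch in Python, using the DSW Report row as a worked example:

# Worked example using the DSW Report row of Tables 1 and 2.
total_burden_hours = 30         # from Table 1
hourly_wage_rate = 39.84        # blended Project Grantee Staff rate described above
total_respondent_cost = total_burden_hours * hourly_wage_rate
print(f"${total_respondent_cost:,.2f}")   # $1,195.20, matching the Table 2 entry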
Table 2. Estimated Annualized Burden Costs
Type of Respondents | DSW/Organization | Form Name | Total Burden Hours | Hourly Wage Rate | Total Respondent Costs
Project Grantee Staff | Westat (Cross-Site Evaluator) | DSW Report | 30 | $39.84 | $1,195.20
Project Grantee Staff | Westat (Cross-Site Evaluator) | High Impact Study: Discipline Specific Workgroup Discussion Guide for Project Staff | 20 | $39.84 | $796.80
Health Care System Staff | Westat (Cross-Site Evaluator) | High Impact Study: Key Informant Interview - Health Care System Staff | 20 | $47.73 | $954.60
FASD Core Training Participants | Westat (Cross-Site Evaluator) | FASD Core Training Survey – Pre-Test | 602 | $34.17 | $20,570.34
FASD Core Training Participants | Westat (Cross-Site Evaluator) | FASD Core Training Survey – Post-Test | 335 | $34.17 | $11,446.95
FASD Core Training Participants | Westat (Cross-Site Evaluator) | FASD Core Training Survey – 6 Month Follow-Up | 402 | $34.17 | $13,736.34
Nurses | Nursing | Pre-Training Survey for Nursing | 101 | $33.55 | $3,388.55
Nurses | Nursing | Post-Training Survey for Nursing | 83 | $33.55 | $2,784.65
Nurses | Nursing | Six Month Follow-Up Training Survey for Nursing | 66 | $33.55 | $2,214.30
Nurses | Nursing | Nursing DSW Polling Questions | 35 | $33.55 | $1,174.25
Nurses | Nursing | Key Informant Interviews with Champions | 21 | $33.55 | $704.55
Nurses | Nursing | Brief Questionnaire for Nursing Organization Memberships | 489 | $33.55 | $16,405.95
Nurses | Nursing | Friends & Members of the Network Survey | 12 | $33.55 | $402.60
Healthcare Organization Representatives | Nursing | Healthcare Organization Utilization Survey | 117 | $49.84 | $5,831.28
Obstetrician-Gynecologists and students in allied health professions | OBGYN | OBGYN SBI Knowledge & Agency | 20 | $62.98 | $1,259.60
Obstetrician-Gynecologists | OBGYN | OBGYN BI-MI Proficiency Rating Scale - Provider Skills Training Baseline | 30 | $103.25 | $3,097.50
Students in allied health professions | OBGYN | OBGYN BI-MI Proficiency Rating Scale - Standardized Patient Version | 30 | $22.71 | $681.30
Obstetrician-Gynecologists | OBGYN | OBGYN BI-MI Proficiency Rating Scale - Provider Follow Up (3m & 6m) | 60 | $103.25 | $6,195.00
Obstetrician-Gynecologists and students in allied health professions | OBGYN | OBGYN Telecom Training Satisfaction Survey | 40 | $62.98 | $2,519.20
Obstetrician-Gynecologists and students in allied health professions | OBGYN | OBGYN Avatar Training Satisfaction Survey | 10 | $62.98 | $629.80
Obstetrician-Gynecologists | OBGYN | OBGYN FASD-SBI Training Event Evaluation | 5 | $103.25 | $516.25
Residency Directors, Training Coordinators, Clinical Directors, Obstetrician-Gynecologists | OBGYN | OBGYN Qualitative Key Informant Interview - Pre-Training | 15 | $93.74 | $1,406.10
Residency Directors, Training Coordinators, Clinical Directors, Obstetrician-Gynecologists | OBGYN | OBGYN Qualitative Key Informant Interview - Post-Training | 15 | $93.74 | $1,406.10
Certified Medical Assistants and students | Medical Assistants | Medical Assistant – Pre-Test Survey | 56 | $18.86 | $1,056.16
Students in allied health professions | Medical Assistants | Medical Assistant – Pre-Test Survey (Academic) | 12 | $22.71 | $272.52
Certified Medical Assistants and students | Medical Assistants | Medical Assistant – Post-Test Survey | 56 | $18.86 | $1,056.16
Students in allied health professions | Medical Assistants | Medical Assistant – Post-Test Survey (Academic) | 12 | $22.71 | $272.52
Certified Medical Assistants and students | Medical Assistants | Medical Assistant Follow Up Survey | 34 | $18.86 | $641.24
Students in allied health professions | Medical Assistants | Medical Assistant Follow Up Survey (Academic) | 3 | $22.71 | $68.13
Certified Medical Assistants and students | Medical Assistants | Medical Assistants Change in Practice Survey | 63 | $18.86 | $1,188.18
Pediatricians | Pediatrics | Survey of Pediatricians - Baseline and Follow Up | 178 | $84.33 | $15,010.74
Pediatricians | Pediatrics | AAP Post-Training Evaluation Survey | 14 | $84.33 | $1,180.62
Pediatricians | Pediatrics | AAP Pre-Training Evaluation Survey | 14 | $84.33 | $1,180.62
Pediatricians | Pediatrics | AAP Three Month Follow Up Evaluation Survey | 4 | $84.33 | $337.32
Pediatricians | Pediatrics | AAP Six Month Follow Up Evaluation Survey | 10 | $84.33 | $843.30
Pediatricians | Pediatrics | FASD Toolkit User Survey | 13 | $84.33 | $1,096.29
Pediatricians | Pediatrics | FASD Toolkit Evaluation Focus Group/Guided Interview | 5 | $84.33 | $421.65
Pediatricians | Pediatrics | Pediatric FASD Regional Education and Awareness Liaisons Work Plan | 4 | $84.33 | $337.32
Pediatricians | Pediatrics | Pediatric FASD Regional Liaison/Champion Training Session Evaluation | 1 | $84.33 | $84.33
Family Medicine Physicians | Social Work and Family Medicine | Family Medicine Evaluation Questions Addendum for Practice or Individual Provider | 9 | $89.58 | $806.22
Family medicine physicians, social workers, social work students | Social Work and Family Medicine | Social Work and Family Physicians Pre-training Survey | 156 | $46.02 | $7,179.12
Family medicine physicians, social workers, social work students | Social Work and Family Medicine | Social Work and Family Physicians Post-training Survey | 98 | $46.02 | $4,509.96
Family medicine physicians, social workers, social work students | Social Work and Family Medicine | Social Work and Family Physicians 6-Month Follow Up Survey | 156 | $46.02 | $7,179.12
NOFAS webinar attendees | NOFAS | NOFAS Webinar Survey | 20 | $35.51 | $710.20
NOFAS webinar attendees | NOFAS | NOFAS Three Month Follow-Up Webinar Questionnaire | 20 | $35.51 | $710.20
NOFAS training participants | NOFAS | NOFAS Pre-Test Survey | 28 | $35.51 | $994.28
NOFAS training participants | NOFAS | NOFAS Post-Test Survey | 28 | $35.51 | $994.28
Systems change project participants | Cross-DSW | Clinical Process Improvement Survey | 82 | $47.73 | $3,913.86
Systems change project participants | Cross-DSW | TCU Organizational Readiness Survey | 82 | $47.73 | $3,913.86
Systems change project participants | Cross-DSW | Organizational Readiness to Change Assessment | 74 | $47.73 | $3,532.02
TOTAL | | | 3,790 | | $158,807.43
A.13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
There are no other annual cost burdens to respondents or record keepers.
A.14. Annualized Cost to the Federal Government
The average annualized cost to the Government to collect this information is $1,603,724 for the OMB approval period that is requested (Attachment B). It is anticipated that costs for the future years will be comparable to those shown, with appropriate adjustments for budget changes, inflation, and salary increases.
A.15. Explanation for Program Changes or Adjustments
This is a new data collection; therefore, program changes and adjustments do not apply.
A.16. Plans for Tabulation and Publication and Project Time Schedule
Project trainings will begin soon after OMB approval is received (late 2016). Grantees are working to develop the content of the trainings so that they will be ready for implementation upon receipt of OMB approval. High impact studies will commence soon after the trainings begin in order to allow for collection of baseline data. Grantees are required to provide CDC with progress reports each year. The cross-site evaluator will provide reports upon completion of each high impact study, and at the end of the project period.
Data will be summarized across respondents in all reports. For rating and categorical scales, the percentage of respondents selecting each answer choice, out of the total number of answers given, will be reported for each item. Open-ended questions will be reviewed and summarized by theme. When applicable, qualitative and quantitative data will be synthesized to provide a more complete picture of the findings.
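As an illustration of the planned per-item tabulation, the following minimal sketch (in Python, with a hypothetical item and hypothetical responses; it is not drawn from any project instrument) computes the percentage of answers falling into each response category:

import pandas as pd

# Hypothetical responses to a single categorical survey item.
responses = pd.Series(["Agree", "Strongly agree", "Agree", "Neutral",
                       "Disagree", "Agree", "Strongly agree", "Neutral"])

# Percentage of all answers given that fall into each category, reported per item.
percentages = responses.value_counts(normalize=True).mul(100).round(1)
print(percentages)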
Table A.16. Project Time Schedule
Activity | Task | Timeframe
Identify and invite participants to trainings | | Starts 1–2 months after OMB approval, ongoing
Conduct trainings | Deliver training | Starts 1–2 months after OMB approval, ongoing
Conduct trainings | Conduct pre and post surveys | Starts 1–2 months after OMB approval, ongoing
Conduct trainings | Conduct follow-up survey | 6 months after each training
Conduct high impact studies | Identify promising interventions/projects | 1–2 months after OMB approval
Conduct high impact studies | Recruit sites to participate | 1–2 months after OMB approval
Conduct high impact studies | Collect data | Starts 3 months after OMB approval, ongoing
Analyze and report data | Grantee progress reporting | Semi-annual, with progress report (mid-year) and annual report (end of year)
Analyze and report data | High impact study reports | At completion of each study
Analyze and report data | Final cross-site evaluation report | At end of budget period
A.17. Reason(s) Display of OMB Expiration Date Is Inappropriate
Expiration dates will be displayed; no exception is sought.
A.18. Exceptions to Certification for Paperwork Reduction Act Submissions
There are no exceptions to the certification.
References
Centers for Disease Control and Prevention. (2012, July 20). Alcohol use and binge drinking among women of childbearing age — United States, 2006–2010. MMWR, 61(28), 534-538.
Johnson, M., Jackson, R., Guillaume, L., Meier, P., & Goyder, E. (2011). Barriers and facilitators to implementing screening and brief intervention for alcohol misuse: A systematic review of qualitative evidence. Journal of Public Health, 33(3), 412-421.
May, P. A., Baete, A., Russo, J., Elliott, A. J., Blankenship, J., Kalberg, W. O., . . . Hoyme, H. E. (2014). Prevalence and characteristics of fetal alcohol spectrum disorders. Pediatrics, 134, 855-866.
Vinson, D. C., Turner, B. J., Manning, B. K., & Galliher, J. M. (2013). Clinician suspicion of an alcohol problem: An observational study from the AAFP National Research Network. Annals of Family Medicine, 11(1). Retrieved July 6, 2015, from www.annfammed.org.