OMB: 0920-0789


Division of Violence Prevention

Prevention Development and Evaluation Branch

Centers for Disease Control and Prevention (CDC)


Program Effectiveness Evaluation of a

Workplace Intervention for Intimate Partner Violence




Office of Management and Budget Information Collection Request Supporting Statement and Data Collection Instruments

Supporting Statement B



Phyllis Holditch Niolon (PNiolon@CDC.GOV)

National Center for Injury Prevention and Control

Division of Violence Prevention

4770 Buford Hwy, NE, MS F-63

Atlanta, GA 30341



September 16, 2009

B. COLLECTION OF INFORMATION EMPLOYING STATISTICAL METHODS


B1. Respondent Universe and Sampling Methods


Our sampling frame will consist of all managers in the corporate office of the participating company. The respondent universe includes all managers in the corporate office who have not yet completed the Domestic Violence Training for Managers. Although the recruitment e-mail will be sent to all managers, it will state that only managers who have not yet completed the Domestic Violence Training for Managers are eligible to participate in the survey. The participating company has estimated that at the anticipated start date, there will be approximately 500 managers in the corporate headquarters who have not yet received training. Over a 13-month follow-up period, the participating company will continue to offer the manager training and will document which managers received the training and when the training was implemented. Managers will be grouped into “treatment” and “comparison” groups based on whether they received training at any point over the 13-month follow-up period.


All employees whose managers completed the baseline survey will be recruited to participate in the baseline and follow-up employee surveys (estimated to range from 2 to 50 employees per manager). The employee baseline survey will need to be conducted as soon as possible after the manager baseline survey, in order to capture employee experiences prior to their managers’ being trained. The employee sample will be updated immediately prior to the follow-up survey, in order to accurately capture the employees who are supervised by the managers at the time of the follow-up survey. We estimate that an average of 3 employees per manager will respond to each wave, for a total of approximately 1,500 employee survey respondents per wave.
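For illustration, the expected employee sample yield per wave implied by these estimates can be computed directly (a minimal sketch; the figures are the estimates stated above and the variable names are illustrative only):

```python
# Expected employee survey yield per wave, using the estimates stated above.
untrained_managers = 500           # approximate number of untrained managers at the start date
avg_respondents_per_manager = 3    # assumed average number of employee respondents per manager

expected_respondents_per_wave = untrained_managers * avg_respondents_per_manager
print(expected_respondents_per_wave)  # 1500 employee survey respondents per wave
```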


The proposed study is an evaluation of the effectiveness of the manager training in influencing a variety of intermediate and ultimate outcomes, measured at both the employee and the manager level. The design involves the recruitment of all untrained managers to participate in the baseline survey and allows for natural (i.e., not manipulated) receipt of training among some members of the manager sample during the follow-up period. The manager and employee surveys will be conducted at several points in time in order to examine pre-/post-training differences between managers (and employees supervised by managers) who did and did not participate in the training, both immediately after the training and several months after the training (to determine the duration of any effect of training). Although the manager survey component will be longitudinal in nature (in that individuals will be tracked over time), the employee survey component will not. At the broadest level, the study design is best classified as a quasi-experimental design (Murray, 1998). Specifically, the study employs a pretest-posttest comparison group design (in that the sample will include employees of managers who never receive training and employees of managers who receive training at some point during the follow-up period). The employee component employs a nested cross-sectional design (given that employees are not tracked over time).
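To illustrate how the nested cross-sectional employee data might be analyzed while accounting for the clustering of employees within managers (a sketch only, not the final analysis plan; the file name and column names such as outcome, trained, wave, and manager_id are hypothetical placeholders), a mixed-effects model with a random intercept for manager could be specified as follows:

```python
import pandas as pd
import statsmodels.formula.api as smf

# df: one row per employee survey response, with hypothetical columns:
#   outcome    - an employee-level outcome measure
#   trained    - 1 if the employee's manager received the training, else 0
#   wave       - survey wave (0 = baseline, 1 = follow-up)
#   manager_id - identifier for the supervising manager
df = pd.read_csv("employee_survey.csv")

# A random intercept for manager accounts for employees being nested within managers;
# the trained-by-wave interaction captures the pre-/post-training difference between groups.
model = smf.mixedlm("outcome ~ trained * wave", data=df, groups=df["manager_id"])
result = model.fit()
print(result.summary())
```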


Because assignment to manager training will not take place on a random basis, the manager baseline survey will collect detailed information on potential factors likely to differentiate managers who request (and receive) the training from those who do not (e.g., awareness of IPV as a workplace issue, experience with employees as IPV victims), in order to adjust for these potential confounders analytically. In addition, we have identified several variables that could potentially be used as instrumental variables (i.e., variables that are associated with the likelihood of receiving the intervention but are related to the outcomes of interest only through the intervention). These constructs are noted in Appendix L, which outlines the potential use of every construct included in the surveys.
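As a rough illustration of the two analytic strategies described above (covariate adjustment for measured confounders and, where a defensible instrument is available, two-stage least squares), the sketch below uses hypothetical file and variable names (ipv_awareness, victim_experience, instrument); the actual covariates and candidate instruments are those noted in Appendix L.

```python
import pandas as pd
import statsmodels.formula.api as smf
from linearmodels.iv import IV2SLS

# df: one row per manager, with hypothetical columns:
#   outcome           - a manager-level outcome of interest
#   trained           - 1 if the manager received the training, else 0
#   ipv_awareness     - baseline awareness of IPV as a workplace issue
#   victim_experience - baseline experience with employees as IPV victims
#   instrument        - a candidate instrumental variable
df = pd.read_csv("manager_survey.csv")

# Strategy 1: regression adjustment for measured baseline confounders.
adjusted = smf.ols("outcome ~ trained + ipv_awareness + victim_experience", data=df).fit()
print(adjusted.summary())

# Strategy 2: two-stage least squares, treating training receipt as endogenous;
# the bracketed term marks the endogenous variable and its instrument.
iv = IV2SLS.from_formula(
    "outcome ~ 1 + ipv_awareness + victim_experience + [trained ~ instrument]",
    data=df,
).fit()
print(iv.summary)
```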


In order to maximize the number of managers who request training during the follow-up period (and thereby ensure a sufficient number of managers in the treatment group), we will ask the company to intensify its recruitment efforts during this time period. If more managers request training than can be served, we will be able to identify a subsample of “wait list comparisons” within our comparison group, in order to conduct additional analyses comparing this group to the treatment group. Because it is likely that some managers who “request” the training will not request it directly themselves but rather will have supervisors who requested it for them (or for their entire group), we will also attempt to document this factor in the baseline survey in order to create a subsample of “non self-initiated trained” managers within the treatment group. This will provide an opportunity to compare trained managers who did not request the training to comparison group members who have not yet requested the training. Finally, because managers in the treatment group could receive their training at any point during the 13-month follow-up period, we will be able to examine the impact of manager training at several time points relative to training (e.g., 1 month, 3 months, and 12 months post-training).
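The subgroup flags and time-relative-to-training categories described above could be constructed from the company’s training documentation roughly as follows (a sketch only; the file and column names, such as training_date, requested_self, and waitlisted, are hypothetical placeholders for information the company and the baseline survey would supply):

```python
import pandas as pd

# managers: one row per baseline manager respondent, with hypothetical columns:
#   training_date  - date the training was received (missing if never trained)
#   requested_self - True if the manager requested the training directly
#   waitlisted     - True if the manager requested training but could not be served
#   followup_date  - date of the relevant follow-up survey
managers = pd.read_csv("training_log.csv", parse_dates=["training_date", "followup_date"])

trained = managers["training_date"].notna()
managers["group"] = trained.map({True: "treatment", False: "comparison"})
managers["waitlist_comparison"] = ~trained & managers["waitlisted"]
managers["non_self_initiated_trained"] = trained & ~managers["requested_self"]

# Approximate months elapsed between training and the follow-up survey, binned into
# the analysis periods mentioned above (roughly 1, 3, and 12 months post-training).
months_post = (managers["followup_date"] - managers["training_date"]).dt.days / 30.44
managers["months_post_training"] = pd.cut(months_post, bins=[0, 2, 6, 13],
                                          labels=["~1 month", "~3 months", "~12 months"])
```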


B2. Procedures for the Collection of Information


Both managers and employees who are recruited for the study will be sent a lead letter from the company, followed by a “recruitment e-mail” to their company e-mail address. The lead letter will provide detailed information about the study and notify managers and employees that they will receive a recruitment e-mail for the survey starting that week. The recruitment e-mail (see Appendix D) will provide basic information about the study, a “study identification number,” and a link to the survey website.


Upon entering the survey website, potential respondents will see an “introductory screen”, which includes (in a bulleted format) much of the information included in the recruitment e-mail and additional details such as:

  • a link to a letter of endorsement from the CEO of the participating company

  • a link to local and national resources for IPV victims

  • (employee survey only) a description of the procedures that will preclude anyone, including RTI staff, from being able to link their identity to the data they provide

  • a recommendation to take the survey in a private setting

  • (employee survey only) a recommendation to take the survey when they can complete it in one 15-minute block (since the respondent will not be “logging in,” the website will not be designed to enable the respondent to get back into the survey if they exit)


From this introductory screen, potential respondents can click on a link to start the survey. Managers (but not employees) will be prompted to enter their study identification number and password prior to completing the survey questions.


The manager surveys will be designed to take an average of 30 minutes to complete. The employee surveys will be designed to take an average of 15 minutes to complete. Both surveys contain questions on background characteristics, employment characteristics, current work behaviors, health, experiences with intimate partner violence, and awareness of the company’s intimate partner violence activities. All questions will be closed-ended. Each page of the survey will contain a link to the local and national resources for IPV victims.


For the employee survey, respondents will be provided with a “survey completion code” after answering the last question of the survey. Further details about the incentive drawing and redemption process are provided in Section A9.



B3. Methods to Maximize Response Rates and Deal with Nonresponse


Several strategies will be employed to maximize response rates. First, prior to sending the initial recruitment e-mails, the company CEO will e-mail a letter of support for the study (Appendix D). This encouragement, coupled with the company’s long history of conducting web-based surveys among employees and its emphasis on addressing IPV, should help increase response rates.


In addition, the incentive amount and the large number of incentives to be given out are designed to increase participation. We believe that the amount and number of incentives will be large enough to encourage participation, and that offering many $100 gift codes is much more likely to do so than offering a very small incentive to every survey participant.


Finally, for both the manager and the employee surveys, we will be able to identify individuals who have not completed the survey. For the employee sample, these will be individuals whose study identification number has not been paired with a survey completion code in the “validation” website; the employee surveys will nonetheless retain elements of anonymity, because survey responses will not be linked with identification numbers or completion codes. We will conduct e-mail follow-up with all individuals who have not completed the survey over the two-week period in which each survey is open for participation.
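As an illustration of this follow-up step (a sketch only; the file and field names are hypothetical), nonrespondents for a given wave could be identified by comparing the sampled study identification numbers against those recorded in the validation website:

```python
import pandas as pd

# sampled: all study identification numbers invited for the current wave (hypothetical file).
# validated: identification numbers that have been paired with a survey completion code
#            in the validation website (hypothetical file).
sampled = pd.read_csv("sampled_ids.csv")
validated = pd.read_csv("validation_records.csv")

# Identification numbers with no completion code on file receive the e-mail follow-up.
nonrespondents = set(sampled["study_id"]) - set(validated["study_id"])
print(f"{len(nonrespondents)} follow-up e-mails to send")
```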


After each administration of the surveys, we will conduct a nonresponse bias analysis to determine whether individuals who completed the survey differ from those who did not on select indicators (such as unit/division and any other variables available from the participating company). However, we will likely not be able to weight the survey data based on the nonresponse bias analysis, due to the small number of variables available for all individuals in the sampling frame.
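A minimal sketch of such a nonresponse bias analysis, assuming that frame-level indicators such as unit/division are available for every individual (the file and column names are hypothetical), is a comparison of respondents and nonrespondents on those indicators, for example with a chi-square test:

```python
import pandas as pd
from scipy.stats import chi2_contingency

# frame: one row per individual in the sampling frame, with hypothetical columns:
#   responded - 1 if a completed survey was received, else 0
#   division  - unit/division indicator available from the participating company
frame = pd.read_csv("sampling_frame.csv")

# Cross-tabulate response status by division and test for differential response.
table = pd.crosstab(frame["division"], frame["responded"])
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```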


B4. Tests of Procedures or Methods to be Undertaken


Before initiating data collection, we will undertake pilot testing of the manager and employee survey instruments (see Appendix I). For each instrument, we will have five RTI employees take the surveys and participate in a brief follow-up interview about the survey. The instruments will be revised as needed based on information obtained through this process, with the goal of improving the quality of data collected and minimizing burden on respondents. We will also test the incentive redemption process.


Since most items included in the data collection instrument have been used successfully in previous studies with similar populations, we anticipate making only minor changes as a result of the pretest. Before data collection begins, the Office of Management and Budget will be informed of any changes to the study procedures or instruments.


B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data


a) Individuals who have participated in designing the data collection:


CDC staff


Daniel Whitaker, PhD dpw7@cdc.gov

Phyllis Holditch Niolon, PhD euh2@cdc.gov

Phaedra Corso pas7@cdc.gov

Xiangming Fang ddz6@cdc.gov


RTI International Staff


Monique Clinton-Sherrod, PhD mclinton@rti.org

Christine Lindquist, PhD lindquist@rti.org

Georgia Karuntzos, PhD gtk@rti.org

Jeremy Bray, PhD bray@rti.org

Derek Brown, PhD dsbrown@rti.org


b) Individuals who will participate in the collection of data (all from RTI International):

Monique Clinton-Sherrod, PhD mclinton@rti.org

Christine Lindquist, PhD lindquist@rti.org

Derek Brown, PhD dsbrown@rti.org

Todd Heinrich toddh@rti.org

Jennifer Hardison, MSW jhardison@rti.org

Tasseli McKay, MPH tmckay@rti.org


c) Individuals who will participate in data analysis:


CDC Staff


Daniel Whitaker, PhD dpw7@cdc.gov

Phyllis Holditch Niolon, PhD euh2@cdc.gov



RTI International Staff

Monique Clinton-Sherrod, PhD mclinton@rti.org

Christine Lindquist, PhD lindquist@rti.org

Jason Williams, PhD jawilliams@rti.org

Derek Brown, PhD dsbrown@rti.org

Jennifer Hardison, MSW jhardison@rti.org

Tasseli McKay, MPH tmckay@rti.org

