October 2011
Evaluation of the Administration for Children and Families Responsible Fatherhood, Marriage and Family Strengthening Grants for Incarcerated and Reentering Fathers and Their Partners
OMB Supporting Statement: Part B
Renewal for OMB No. 0990-0331
Prepared for
Office of the Assistant Secretary for Planning & Evaluation (ASPE)
United States Department of Health & Human Services (HHS)
Prepared by
RTI International
3040 Cornwallis Road
Research Triangle Park, NC 27709
RTI Project Number 0210412.000.014.001
Renewal Application Supporting Statement Part B
Evaluation of the Administration for Children and Families Responsible Fatherhood, Marriage and Family Strengthening Grants for Incarcerated and Reentering Fathers and Their Partners
Submitted by:
U.S. Department of Health and Human Services
Office of the Assistant Secretary for Planning and Evaluation
Contact Person:
Linda Mellgren
(202) 690-6806
linda.mellgren@hhs.gov
Contents
Background
A. Justification
A1. Circumstances Making the Collection of Information Necessary
A2. Purpose and Use of the Information Collection
A3. Use of Improved Information Technology and Burden Reduction
A4. Efforts to Identify Duplication and Use of Similar Information
A5. Impact on Small Businesses or Other Small Entities
A6. Consequences of Collecting the Information Less Frequently
A7. Special Circumstances Relating to the Guidelines of 5 CFR §1320.5
A8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency
A.8.1 Public Comment
A.8.2 Consultation with Experts
A9. Explanation of Any Payment or Gift to Respondents
A10. Assurance of Confidentiality Provided to Respondents
A11. Justification for Sensitive Questions
A12. Estimates of Annualized Burden Hours and Costs
A13. Estimates of Other Total Annual Cost Burden to Respondents or Record Keepers
A14. Annualized Costs to the Federal Government
A15. Explanation for Program Changes or Adjustments
A16. Plans for Tabulation and Publication and Project Time Schedule
A.16.1 Plans for Tabulation
A.16.2 Plans for Publication
A.16.3 Project Timeline
A17. Reason(s) Display of OMB Expiration is Inappropriate
A18. Exceptions to Certification for Paperwork Reduction Act Submissions
B. Collection of Information Employing Statistical Methods
B1. Respondent Universe and Sampling Methods
B.1.1 Site Selection
B.1.2 Site-Specific Study Designs
B.1.3 Selection of Respondents
B.1.4 Power Analysis
B2. Procedures for the Collection of Information
B3. Methods to Maximize Response Rates and Deal with Nonresponse
B4. Tests of Procedures or Methods to be Undertaken
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
References
Exhibits
For Part A
1 Annual Burden Estimates
2 Annualized Costs to the Federal Government
For Part B
3 Site Selection Criteria and Program Status
4 Site-Specific Study Approaches and Actual Sample Sizes
5 Power Estimates from Monte Carlo Simulations
6 MDEs Based on Expected Enrollment per Site
7 Site-Specific Implications of the Timing of Program Delivery and Participants’ Incarceration Terms on the Interview Schedule
Appendix A: Male Follow-up Survey Instruments (9-, 18-, and 34-month)
Appendix B: Female Follow-up Survey Instruments (9-, 18-, and 34-month)
For Part A
Appendix C: Informed Consent Forms
Appendix D: Statute Authorizing Information Collection
Appendix E: Institutional Review Board Approval
Appendix F: Office of Human Research Protections Certification
Appendix G: Individuals and Organizations Consulted
For Part B
Appendix H: Pilot Test Feedback Form
Appendix I: Overview of MFS-IP Programs
Appendix J: Effect Sizes of Marriage and Relationship Education Programs
The Office of the Assistant Secretary for Planning and Evaluation (ASPE) of the United States Department of Health and Human Services (HHS) is undertaking a study of the Responsible Fatherhood, Marriage and Family Strengthening Grants for Incarcerated and Reentering Fathers and Their Partners (MFS-IP), and is requesting continuing clearance for the following forms:
Male Follow-up Survey Instruments (Appendix A)
Female Follow-up Survey Instruments (Appendix B)
Male and Female Informed Consent Forms (Appendix C)
The purpose of the study is to evaluate grants designed to enhance partner and parenting relationships among incarcerated and reentering fathers, their partners and children. These grants were administered by the HHS Administration for Children and Families (ACF) Office of Family Assistance (OFA) under the authority of the Deficit Reduction Act of 2005 (DRA) (P.L. 109-171) from September 2006 to September 2011. The DRA amended Title IV, Section 403(a)(2) of the Social Security Act (42 U.S.C. 603(a)(2)) to authorize competitive grants for states; territories; Indian tribes; tribal organizations; and public and non-profit community entities, including faith-based organizations, to develop and implement projects that support any of the three authorized activity areas: Healthy Marriage, Responsible Parenting, and Economic Stability. ASPE has contracted with RTI International (RTI), and its subcontractors, to conduct the study.
This application is for renewal of an existing study that has largely remained the same in terms of methods, instrumentation, and timeline since the original submission. However, several changes to the study are being requested in this renewal application:
An additional wave of follow-up data collection at 34 months (see A2)
Increased incentives for certain groups of respondents (see A9)
Updated sample sizes and burden estimates for all waves of interviewing (see A12)
The target population for the surveys is couples who participated or were eligible to participate in the MFS-IP programs. Couples who enrolled in couple-based services funded by the MFS-IP initiative at five selected program sites were recruited to participate in the study and constituted the treatment group. A cohort of comparison or control couples who were not participating in MFS-IP services was also recruited at each of the five sites (see B.1.2 for additional details about the study design in each site). The couples consisted of men incarcerated at baseline and their partners (none of whom were incarcerated at baseline). For all interview waves, similar questionnaires are administered to the men and to their partners in separate interviews. Recruitment and baseline interviewing took place over a 32-month period from December 2008 through July 2011. During the follow-up period, some men may still be in prison, while others may have been released.
The service settings and specific target populations varied among the grantees, and the sites varied widely in terms of services delivered and service delivery approach (Exhibit 3). The broad set of program components delivered among the set of grantees included marriage education, marriage/family counseling, parenting education, enhanced visitation options, case management, education and employment services, support groups, financial literacy services, mentoring and coaching services, and domestic violence services. Brief program descriptions are included in Appendix I.
Study participants include treatment and control/comparison group members at a subset of five grantee sites: RIDGE Project (Defiance, OH), Minnesota Council on Crime and Justice (Minneapolis, MN), New Jersey Department of Corrections (Trenton, NJ), Osborne Association (Brooklyn, NY), and Indiana Department of Correction (Indianapolis, IN) (Exhibit 3). These five sites were selected based on six criteria.
Each program had to include a couple-based relationship focus because the impact evaluation must achieve its goal of determining whether couple-based family strengthening programming has a positive impact on relationship quality, child well-being, and other outcomes.
Program intensity was considered because a reasonable level of program exposure and contact is necessary for a program to achieve any desirable outcomes.
Program enrollment was a key factor, as it is in any evaluability assessment. Sufficient numbers of couples receiving the treatment (as well as a sufficient number of “untreated” couples to serve as the comparison/control group) are an important prerequisite for acceptable statistical power for detecting actual treatment effects.
Stage of implementation was a consideration because modifications to program design or delivery once the evaluation is underway are extremely undesirable. Therefore, it was necessary that programs had finalized plans for program delivery (and were ready to begin implementing their programs or had already begun implementation) at the time of evaluation site selection.
Study design considerations were key. Specifically, we assessed whether each program was willing to randomly assign eligible couples to receive MFS-IP programming or “treatment as usual,” and if not, whether other possibilities for the identification of a comparison group existed at the site.
Finally, the site’s willingness to participate and staff capacity for participation were essential considerations because the impact evaluation cannot be conducted successfully without the cooperation of the participating sites.
Exhibit 3. Site Selection Criteria and Program Status (at Time of Initial Selection)

NJ DOC
Target population: Men incarcerated at state correctional facilities who were either married or in a committed relationship with children, had six to nine months left to serve, and were max-out offenders. Men could only participate if their partners agreed to enroll.
Program components: Participants received case management pre- and post-release, including visitation coordination and discharge planning (with a particular focus on family relationships and substance abuse); a 12-week marriage education and parenting curriculum (Married and Loving It); a substance abuse treatment course (Living in Balance); and referrals to resources in the community.

MN CCJ
Target population: Fathers in state correctional facilities who were in a committed relationship, from and returning to the Twin Cities area, and had a sentence of six months to three years (the sentence length requirement was shortened to three to six months in Year 4 to ensure that post-release services could be delivered for one year before the end of grant funding). Men could only participate if their partners also enrolled.
Program components: All components were offered to both members of the couple and included case management; parenting classes; relationship classes; financial literacy training; and employment referrals, training, and placement.

RIDGE Project
Target population: Men in the fatherhood role who were incarcerated, in a committed relationship, recently released, on probation, or on parole within the past two years.
Program components: Participants were offered a series of courses focused on strengthening family relationships during and after incarceration: Couple Communication I classes (1.5 hours per week for six weeks), Keeping FAITH (Module 1) workshops (1.5 hours per week for 12 weeks), Couple Communication II classes (1.5 hours per week for six weeks), and Keeping FAITH (Module 2) workshops (1.5 hours per week for 12 weeks).

Osborne Association
Target population: Incarcerated fathers at New York DOCS facilities who were in committed relationships, and their partners.
Program components: Those who enrolled in the parenting class received 16 weeks of classroom-based fatherhood training. Healthy Relationships class participants received five or six weekly sessions. Those who took part in the Healthy Marriage Seminar for couples received an eight-hour class delivered during a single weekend.

IN DOC
Target population: Men incarcerated at Indiana Department of Correction prisons who were participants in character/faith-based living units (PLUS) or general-population graduates of a parenting class, and their partners.
Program components: Couples retreat participants received eight hours of marriage education (PREP) delivered jointly to both members of the couple over a single weekend.
Each of the selected sites had a strong couple-based relationship program, acceptable enrollment numbers, and a strong counterfactual strategy. Individually, the sites fulfilled all key selection criteria; collectively, they offer a diverse set of programs that allows the evaluation to address key questions about the effectiveness of couple-based relationship-strengthening services in general and about the specific program components associated with greater effectiveness.
Consultations with the five impact sites and external experts resulted in site-specific study designs with distinct counterfactual strategies. Approaches for each site are detailed in Exhibit 4. An essential component of the evaluation is understanding and documenting the counterfactual in each site, since it is unlikely that comparison and control group members will avoid services altogether. Service receipt by control and comparison group members was documented in the implementation evaluation (see B2, below), and through the inclusion of a battery of items regarding service receipt in the survey instruments (see Appendices A, B).
Selection of subjects for the longitudinal interview component of the evaluation was conducted with the assistance of each site participating in the impact evaluation. For “treatment” couples (i.e., those who were officially enrolled in the MFS-IP programs), program staff provided contact information to the study for the men and their partners. For “comparison” couples, program or agency research staff assisted in the identification of eligible subjects. In sites where a surplus of eligible study members was identified for either the treatment or comparison groups, we truncated the baseline enrollment period in that site and/or randomly sampled from the eligible respondents. Enrollment in the study is complete as of submission of this renewal application.
Exhibit 4. Site-Specific Study Approaches and Actual Sample Sizes

NJ DOC
Approach: The program enrolled incarcerated men at one of seven NJDOC facilities who had projected release dates of six to nine months (at the time of recruitment) and who were eligible based on several additional characteristics. A comparison group strategy was developed in this site because random assignment would have reduced the number of treatment couples from 270 over the three years to 135 each for the treatment and control groups. Therefore, an alternative strategy was developed in which men who met all program eligibility criteria but were housed at one of three NJDOC facilities that did not offer the program were screened using the NJDOC database and in-person screening sessions and were recruited into the comparison group if eligible. The committed partners of all treatment and comparison group men who enrolled in the study were then recruited in the community.
Actual sample size: 185 treatment and 128 comparison men and their partners over the three-year enrollment period.

MN CCJ
Approach: The program enrolled incarcerated men from the seven-county Twin Cities area (with several other eligibility criteria applied). Men who were screened as “paper eligible” for the program were randomly assigned to either the treatment or control group. RTI and MN CCJ conducted further, parallel screening procedures with the paper-eligible control and treatment men, respectively. Men who met final eligibility criteria, which for the treatment group included partner enrollment in the program, were enrolled in the study. The committed partners of all treatment and comparison group men who enrolled in the study were then recruited in the community.
Actual sample size: 48 treatment and 35 comparison men and their partners over the three-year enrollment period.

IN DOC
Approach: The program served men incarcerated in a specialized housing unit and those in the general population who completed a parenting class, across 13 state prisons. To be eligible for the PREP couples retreat (the component being evaluated), men had to be in a committed relationship (with a partner who was willing to attend the retreat). The study design for the evaluation of this treatment was a quasi-experimental design in which a matched comparison group was identified (using a brief screening tool and administrative data from the IDOC database) from the same populations of men (at the same IDOC facilities) who were eligible for the PREP couples retreats. The comparison men had to meet the eligibility criteria for the retreats (i.e., be from one of the subpopulations eligible for the retreat and be in a self-reported romantic relationship with a partner they thought would be willing to attend) and were matched to the actual retreat participants on a number of characteristics. The comparison men did not receive the retreat at any point during the 18-month follow-up period because they were either released prior to the next scheduled retreat (priority was given to these individuals in the selection process) or for some other (unknown) reason.
Actual sample size: 282 treatment and 418 comparison men and their committed partners over the three-year enrollment period.

Osborne
Approach: The PREP couples retreat (the component being evaluated) was available to men in a committed relationship (with a partner who was willing to attend the retreat) who were incarcerated at one of six NYSDOCS facilities and had completed prerequisite parenting and healthy relationship classes. Because there was not a surplus of couples eligible for the retreat (and the enrollment targets for this site were low), random assignment was rejected. Instead, a matched comparison group was recruited by screening participants in parenting classes at five comparable NYSDOCS facilities not served by the grant. Based on the screening, men in committed relationships who were interested in attending a couples retreat (if available) were identified for the comparison group.
Actual sample size: 139 treatment and 75 comparison men and their committed partners over the three-year enrollment period.

RIDGE Project (OH)
Approach: RIDGE offered a series of relationship courses (Couple Communication and Keeping FAITH) and other services to men incarcerated at one of ten OHDOC facilities. A comparison group was selected from the wait list for the program. All men on the wait list had expressed interest in receiving the program and met initial eligibility criteria. Men selected for the comparison group were those who did not end up enrolling in the program, either because they were transferred or released before the actual enrollment for Couple Communication I began at their facility or because they remained on the wait list for the entire duration of the follow-up period.
Actual sample size: 433 treatment and 258 comparison men and their partners over the three-year enrollment period.
Estimating power can be difficult for many of the multilevel analyses proposed; in several instances, closed-form power formulae for the proposed designs have not been developed. However, power can always be estimated using Monte Carlo simulations, in which multiple data sets are generated (using bootstrap resampling) from parameter estimates supplied by the user, and the simulated data sets are then analyzed to determine power (Thomas & Krebs, 1997); a simulation sketch follows the assumptions listed below. Because we have conducted related studies using some of the measures we plan to use here, effect size estimates are generally available. The power analyses presented here use pooled estimates of parameters and standard errors from three NIDA-funded longitudinal trials: (a) Behavioral Couples Therapy for Drug Abuse, (b) Abbreviated Couples Therapy for Drug Abuse, and (c) Group-Based BCT for Drug Abuse. The following assumptions are made in this analysis:
There is a 30% missing data rate at any given time point, with missing data addressed using Full Information Maximum Likelihood methods (and adjusting for the increased standard errors due to missing data)
The analysis uses a longitudinal design comparing couples therapy to an equally intensive Treatment-As-Usual (TAU) condition
The models assume couple Dyadic Adjustment Scale (DAS) scores nested within time; the models are very similar if only husband or wife scores are used, so there is little gain from alternative specifications
The models assume normally distributed data (DAS scores tend to be approximately normal)
The multilevel models will be estimated in a covariance structure framework (rather than a traditional multilevel framework as implemented in statistical packages such as HLM or MLwiN)
The analysis uses an alpha of .05.
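To make the simulation-based approach concrete, the sketch below estimates power for a simple two-group comparison of DAS-type scores by repeatedly generating data and counting how often the group difference is statistically significant. It is a minimal illustration under simplifying assumptions: a fixed standardized effect size, random deletion of cases in place of Full Information Maximum Likelihood handling of missing data, and an independent-samples t-test in place of the multilevel covariance-structure models described above. All parameter values are hypothetical rather than the pooled estimates from the prior trials, so the resulting numbers will not reproduce Exhibit 5 exactly.

```python
# Minimal sketch of Monte Carlo power estimation (hypothetical parameters).
import numpy as np
from scipy import stats

def simulate_power(n_per_group, effect_size=0.5, missing_rate=0.30,
                   alpha=0.05, n_sims=2000, seed=20111001):
    """Approximate power to detect a standardized mean difference between
    treatment and control couples, allowing for follow-up nonresponse."""
    rng = np.random.default_rng(seed)
    rejections = 0
    for _ in range(n_sims):
        control = rng.normal(0.0, 1.0, n_per_group)
        treatment = rng.normal(effect_size, 1.0, n_per_group)
        # Drop roughly 30% of cases at random to mimic missing follow-ups
        control = control[rng.random(control.size) > missing_rate]
        treatment = treatment[rng.random(treatment.size) > missing_rate]
        _, p_value = stats.ttest_ind(treatment, control)
        if p_value < alpha:
            rejections += 1
    return rejections / n_sims

if __name__ == "__main__":
    for n in (50, 100, 150, 200, 250, 300):
        print(f"{n} couples per condition: power ~ {simulate_power(n):.2f}")
```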
Power estimates in Exhibit 5 are from Monte Carlo simulations using procedures described in Satorra and Saris (1985) relating to the DAS and the Conflict Tactics Scale (CTS II). For the DAS and CTS (as well as most primary outcome measures on children’s adjustment, parenting behavior, substance use, employment, legal entanglements, and so forth), the effect sizes observed in Dr. Fals-Stewart’s trials are medium-sized using Cohen’s (1988) conventions. In these contexts, a medium effect size for the DAS translates roughly to scores about 15 to 20 points higher in the treatment group than in the control group. In categorical terms, this means a move from very distressed to distressed, distressed to mildly distressed, mildly distressed to normal, normal to happy, or happy to very happy. For the CTS, it means that about 20% more couples in the treatment group than in the control group will not commit any acts of partner violence in the post-intervention period. In both cases, the benchmark is an active TAU, and these effect sizes are viewed as clinically meaningful. This effect size would be detected with 80% power with samples of 200 or more couples in the treatment and control conditions.
Exhibit 5. Power Estimates from Monte Carlo Simulations
Sample Size (couples per condition; assumes 2 conditions) | DAS | CTS II¹
50 | 0.62 | 0.40
100 | 0.76 | 0.56
150 | 0.86 | 0.69
200 | 0.94 | 0.80
250 | 0.98 | 0.87
300 | 0.99 | 0.92
Recent research on relationship education interventions has found a range of effects, depending on the population and on how the intervention was implemented. These effect size estimates are provided in Appendix J. Given the expected enrollment in each site at each wave, we estimate the minimum detectable effect (MDE) based on Bloom (1995). These estimates are displayed in Exhibit 6. The MDEs are well within the range found in the literature, with MN being a possible outlier.
Exhibit 6. MDEs Based on Expected Enrollment per Site (Men)

Site | Program Sample Size | Total Sample Size | MDE
NJ | 150 | 300 | 0.27
MN | 40 | 80 | 0.52
IN | 350 | 700 | 0.18
NY | 125 | 250 | 0.30
OH | 315 | 700 | 0.18

Note: The MDE, or minimum detectable effect, is expressed as a fraction of the initial sample standard deviation. These values assume equal numbers assigned to the treatment and comparison groups, 80% power, an alpha of 0.05, a two-sided test, and estimation of program impact using a regression model with an R² of 0.2 when run without the treatment indicator.
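As a rough illustration of how MDEs of this kind can be approximated, the sketch below applies a standard minimum detectable effect size formula in the spirit of Bloom (1995): the MDE in standard deviation units is roughly M multiplied by the square root of (1 − R²) / (p(1 − p)n), where M is the sum of the critical values for a two-sided test at alpha = 0.05 and for 80% power (about 2.8), p is the share assigned to treatment, and n is the total sample size. The function and figures below are illustrative only; the values in Exhibit 6 may differ slightly because of degrees-of-freedom corrections and rounding.

```python
# Illustrative approximation of minimum detectable effect sizes (in SD units)
# under the assumptions stated in the exhibit note; not the study's own code.
from math import sqrt
from scipy.stats import norm

def approx_mdes(total_n, r_squared=0.2, treat_share=0.5, alpha=0.05, power=0.80):
    multiplier = norm.ppf(1 - alpha / 2) + norm.ppf(power)  # about 1.96 + 0.84
    return multiplier * sqrt((1 - r_squared) /
                             (treat_share * (1 - treat_share) * total_n))

for site, total_n in [("NJ", 300), ("MN", 80), ("IN", 700), ("NY", 250), ("OH", 700)]:
    print(f"{site}: approximate MDE ~ {approx_mdes(total_n):.2f}")
```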
The study is designed to assess the short- and long-term effects of participation in MFS-IP programming on key outcomes, including relationship quality and stability, intimate partner violence, parenting behaviors, child well-being, family income, and recidivism. Several relationship quality constructs are measured, including relationship satisfaction (using the DAS-8), fidelity, marriage plans, positive couple interactions, supportiveness, shared decision making, and conflict. The relationship stability domain includes measures of marital history, marital status, presence of a romantic relationship, cohabitation, and commitment. Several variables posited as mediators between MFS-IP programming and the outcomes above are measured, including substance abuse, employment, parenting skills and attitudes toward marriage.
Interview data are obtained from each member of participating couples. Baseline interview data were collected before initiation of any MFS-IP services. Follow-up data are collected at 9 and 18 months post-baseline in all sites, and also at 34 months post-baseline in the two highest-enrolling sites, IN and OH. The timing of the follow-up data collections from the point of enrollment is appropriate because the interventions vary widely in their length, dose, and timing relative to the term of incarceration. The follow-up surveys (Appendices A, B) are structured so that all respondents receive some common sections, while other sections are administered only to respondents who are still incarcerated or only to respondents who have been released. Detail on the timing of each interview wave relative to program delivery and study participants’ incarceration terms is provided in Exhibit 7.
Exhibit 7. Site-Specific Implications of the Timing of Program Delivery and Participants’ Incarceration Terms on the Interview Schedule (baseline interviews were completed during the prior OMB approval period)

The Osborne Association
Timing relative to program delivery: baseline interview, immediately upon enrollment in the couples seminar; 9-month interview, program likely completed for all; 18-month interview, at least 12 months after program completion; 34-month interview, at least 26 months after program completion.
Timing relative to incarceration term: variable at all interview waves (enrollment is not based on sentence characteristics and can take place at any time during incarceration).

IN DOC
Timing relative to program delivery: baseline interview, immediately after enrollment in the couples seminar; 9-month interview, program likely completed for all; 18-month interview, at least 12 months after program completion; 34-month interview, at least 26 months after program completion.
Timing relative to incarceration term: variable at all interview waves (“enrollment” into the couples retreat takes place for most men at the end of the 4th quarter [12 months] of PLUS participation; at the 18-month interview, most are likely coming up on release, given average sentence lengths and the fact that they have likely already served a year at program entry).

RIDGE Project (OH)
Timing relative to program delivery: baseline interview, immediately after enrollment; 9-month interview, program likely completed for all; 18-month interview, program ends for all by this time period; 34-month interview, program completed for all.
Timing relative to incarceration term: variable at all interview waves (enrollment is not based on sentence characteristics and can take place at any time during incarceration).

NJ DOC
Timing relative to program delivery: baseline interview, immediately after enrollment; 9-month interview, program still ongoing for all; 18-month interview, program ends for all at this time period; 34-month interview, program completed for all.
Timing relative to incarceration term: baseline interview, six to nine months from release; 9-month interview, within zero to three months after release; 18-month interview, 9 to 12 months after release; 34-month interview, 23 to 26 months after release.

MN CCJ
Timing relative to program delivery: baseline interview, immediately after enrollment; 9-month interview, program still ongoing for all; 18-month interview, program still ongoing for all; 34-month interview, program completed for all.
Timing relative to incarceration term: baseline interview, immediately after intake (one to three years from release); 9-month interview, all are still incarcerated; 18-month interview, from up to 18 months before release to six months after release; 34-month interview, from up to 4 months before release to 20 months after release.
Several waves of data collection overlap, and the overall data collection will take 66 months, including the 36-month renewal period covered by the current application.
The study utilizes field supervisors and field interviewers from RTI’s National Interviewer File, supplemented with new hires as necessary to provide full coverage of each site. Field supervisors attended a one-day in-person training session focused on project management responsibilities. All field supervisors and field interviewers then attended a five-day in-person training session covering procedures for contacting respondents, gaining cooperation, avoiding and converting refusals, administering the interview, and reporting. An interim training for replacement interviewers hired mid-study was also held, and refresher trainings were held prior to the 9- and 18-month interviews. An additional training will be held prior to the 34-month interview. Trainings involve a combination of lecture, demonstration, and hands-on skills practice. All field supervisors and field interviewers are required to pass a certification exam upon completion of training.
Eligible respondents who are incarcerated when they are due for their follow-up interview are brought individually to a private room in the correctional facility, where the study and all respondent rights are explained. During the facility access negotiations carried out by site liaisons, the facility contact is told ahead of time that facility staff should not convey any details about the study to participants. When approaching study participants about meeting with the interviewer, facility staff tell prospective respondents that a researcher would like to meet with them to talk about the possibility of doing an interview for a research study and that the researcher will tell them more about the study in the interview room. The field interviewer reviews the consent form and answers any questions the respondent has about the study. Prison officials are told that the length of the interview varies greatly (from ten minutes to two hours).
Eligible respondents who are not incarcerated (which at follow-up includes most female respondents as well as some male respondents) are mailed a letter introducing or reminding them about the study. The letters contain a toll-free number that the respondent may call to schedule the interview and increase his or her payment by $5. If the respondent does not call the interviewer within a week of the letter being sent, the interviewer attempts to contact the respondent (via telephone or a home visit) to schedule the interview. Copies of the lead letters and the Q&A brochures (which do not accompany the partner lead letters but rather are used by the field interviewer when meeting with the partner in person to discuss whether she is interested in participating in the study) are attached, in addition to the “Sorry I Missed You” cards and appointment cards that are used in the field for setting up interviews.
All interviews are conducted in a private setting. For incarcerated respondents, the interview takes place in a private room at the correctional facility where they reside, with only the respondent and the interviewer present. For partners, the interview is conducted at the respondent’s home or another private location.
For all interviews, after confirming that the correct respondent is present, participants are handed the consent form and read along as the interviewer reads the consent form text directly from the laptop. The consent form displays the interview consent text and a signature line for participants to indicate their consent to be interviewed (and for the interviewer to sign). The consent form also has additional text and a separate signature line for the respondents to indicate their consent for having random segments of the interview audio recorded for quality control purposes (i.e., Computer-Assisted Recorded Interview [CARI]). Participants sign one copy of the consent form for the project files and retain an unsigned copy for themselves.
The interviewer administers the interview in a prescribed and uniform manner. The interviews are conducted using Computer-Assisted Personal Interviewing (CAPI). For particularly sensitive sections of the interview, audio computer-assisted self-interviewing (ACASI) is used. Specifically, the ACASI section includes the questions on criminal history/behavior, criminal/drug involvement of the people with whom the respondent resides, substance use, relationship fidelity, and intimate partner violence. For the non-ACASI sections, the field interviewer (FI) reads the questions from the screen and enters the respondent’s answers into the laptop. If at any time the privacy of the interview setting is compromised, the interviewer pauses the interview until privacy can be reestablished, rescheduling as necessary. Each interview lasts approximately one-and-a-half hours and covers the following topics: basic demographic information, attitudes, programs and services, family structure, relationship quality, parenting, physical and mental health, substance use, criminal behavior, employment and income, expectations for release, and future contact information. The content of all instruments (male and female follow-up) is similar; however, the time periods about which the questions are asked differ, and there are separate skip and fill patterns depending on whether the respondent is incarcerated.
At the conclusion of community-based interviews, respondents are given their payment and asked to initial a receipt. At the conclusion of facility-based interviews, FIs follow the compensation procedures allowed by the facility (e.g., no compensation, payment to a community designee, or deposit of a money order into jail inmates’ accounts).
During the course of the study, FIs may observe respondent distress or child abuse or neglect. Critical incident protocols have been developed for the study (with separate versions for facility- and community-based interviews). These protocols specify steps the interviewer and other project staff charged with decision-making should follow. The interviewers have been trained extensively on these protocols.
Some portions of each interview are recorded by the laptop using CARI technology. The purpose of CARI is to detect interviewer falsification. CARI files for 5% of all interviewers’ cases are reviewed by the project quality control manager. The respondent’s permission to use CARI is requested during the informed consent process, and the respondent may still participate in the interview even if he or she declines CARI. If the respondent agrees to CARI, as a necessary condition of detecting interviewer falsification, neither the respondent nor the interviewer is aware of when the computer is making the recordings. At least three 30-second portions of the interview are recorded, as well as several responses to “other, specify” questions. CARI is not used during the portion of the interview that asks the respondent for future contact information, nor during any portion that asks particularly sensitive questions.
Field supervisors also conduct standard telephone verification of 5% of community-based interviews.
In any longitudinal evaluation, retaining the cooperation of the study sample for the entire study is challenging. Given that the study population in this case is incarcerated and formerly incarcerated men and their partners, retaining study sample cooperation through the transition back into the community is especially challenging. Our overall baseline response rates were 82% for men and 75% for women. To retain our sample as more of the men are released into the community, we have proposed increasing the incentive to $75 for those who are eligible to receive an incentive.
In addition, our treatment and control sample retention rates are well-matched for men, but the retention rates for female treatment group members are around 4% higher than the retention rates for female control group members. There are two reasons why the treatment groups may have higher response rates than the control groups:
The issues of marriage, family, and child well-being may be more salient in the treatment group since some of the interventions may enhance this awareness.
Individuals who have benefited from the interventions who might otherwise have refused to respond to the survey may decide to participate to reciprocate for perceived benefits; likewise those who have not benefited as much from the interventions will tend not to cooperate with the survey request.
There may be other reasons why the response rates differ, but these two represent the heart of the issue. Both have the potential to bias the estimates of the treatment effects. Increased salience might lead certain groups that typically have low response rates (such as those with low education and some minority groups) to respond at a higher rate if they are in the treatment group relative to the control group. Additionally, a better response rate among those who benefited more from the interventions could overstate the benefits, and the estimated treatment effect may be biased upward.
The impact study approach includes two strategies to address these concerns. First, a data collection methodology designed to minimize the non-response bias is used, particularly as it affects the comparison of estimates between control/comparison and treatment groups. Here it is important to note that having the same response rates in both treatment and control/comparison groups does not guarantee that bias in the treatment effect is minimized. This is because non-response bias is the product of two components: the non-response rate and the difference in the characteristics of respondents and non-respondents. Thus, the differential non-response bias in treatment versus control comparisons may still not be zero if the compositions of the non-responding populations are different with respect to the characteristics of interest. To guard against this eventuality, a commitment has been made to successfully contacting and screening sample members and achieving the highest possible response rates. Such methods include:
In-Person Interviewing. Prospective participants are expected, in general, to be of lower socioeconomic status than average, and some may not have regular telephone access. When surveying a hard-to-reach population of this kind, experience has shown that an interviewer-administered mode yields higher response rates than self-administered modes. In-person interviews using CAPI are the most efficient means of gathering interview data.
Respondent Convenience and Multiple Attempts. Interviewers schedule interviews at the respondent’s convenience and make multiple attempts to reach nonrespondents, including leaving “Sorry I Missed You” cards at empty households at the time of a scheduled visit. As the best times to reach respondents are ascertained, interviewer scheduling is adjusted accordingly.
Customized Lead Letters. Customized lead letters are sent in advance of fielding to promote respondent cooperation. The lead letter explains the study objectives, explains that the survey is voluntary, and assures confidentiality. Moreover, the letter provides several means for respondents to contact interviewers, including a toll-free telephone number and email address.
Financial Incentive. Wherever allowable, a cash incentive is offered to each respondent who makes a good faith effort to complete the survey. This is understood to increase perceived benefit so that respondents will make time for the interview. The incentive payment also helps emphasize the importance of participating in the study.
Comprehensive Interviewer Training. Interviewing staff participate in a multi-day, comprehensive training. Interviewers are trained on the study purpose and procedures, interview administration, and the protection of human subjects. The literature has shown that interviewer effects can be a source of survey bias; a thorough understanding of the study and the instrument, together with adherence to standard protocols and ethical commitments, reduces this bias and helps interviewers gain respondents’ trust.
Refusal Aversion and Conversion. Part of the interviewer training addresses in detail specific techniques to avert and convert a refusal from a respondent. Respondents who initially refuse to participate will be assigned to interviewers who have a proven record of turning refusals into completed interviews. Reasons for refusals and barriers to participation are continually evaluated in light of the experience gained in the data collection process.
Regular Debriefings with Data Collection Staff. The project management staff regularly meet with data collection staff to discuss issues related to data collection operations. Methods to enhance response rates are a standard agenda item at these meetings.
To the extent possible, response rates by pre-identified demographic and other variables have been monitored during data collection, and adjustments have been made as needed to ensure that both groups at a site have not only the same response rates but also the same patterns of nonresponse across demographic groups.
If a differential rate or bias is apparent in the estimates, statistical adjustments will be used to further minimize the risk of bias. Response propensity models will allow the data to be weighted so that the demographic composition of both groups in a site is statistically equivalent on the variables most highly correlated with, and most critical to, the analysis. Another approach involves sample selection models, both static and dynamic, in which the factors influencing selection into the sample are modeled in a first-stage equation and a selection variable is then included in the outcome equation.
Differences between groups on observable variables will be adjusted statistically using propensity score matching and other methods. Unobservable differences between treatment and control groups are more concerning, but the screening data collection offers a partial solution in that it documents comparable motivation to receive services across groups. Other impact estimation methods that attempt to address selection bias by using alternative comparisons and counterfactuals will also be used. Possible approaches include comparing the outcomes of the entire eligible population in the presence of the program with the outcomes of that population without the program, comparing different cohorts of participants (because participation can be lagged by as much as three years), and matching treatment and comparison groups on observable characteristics.
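As an illustration of the response-propensity weighting approach described above, the sketch below fits a logistic regression predicting follow-up response from baseline characteristics and converts the predicted propensities into inverse-probability weights for respondents. The data file and variable names are hypothetical placeholders; the study’s actual models would use the baseline covariates most strongly related to nonresponse and to the outcomes of interest.

```python
# Hypothetical sketch of response-propensity weighting for follow-up nonresponse.
# Column names (age, education, treatment, responded) are illustrative only.
import pandas as pd
import statsmodels.api as sm

def response_propensity_weights(df, covariates, response_col="responded"):
    """Fit a logistic model of follow-up response and return
    inverse-probability-of-response weights (zero for nonrespondents)."""
    X = sm.add_constant(df[covariates])
    model = sm.Logit(df[response_col], X).fit(disp=0)
    p_respond = model.predict(X)
    return (df[response_col] == 1) / p_respond

# Example usage with a hypothetical analysis file:
# df = pd.read_csv("baseline_with_followup_status.csv")
# df["weight"] = response_propensity_weights(df, ["age", "education", "treatment"])
# Weighted outcome comparisons would then be restricted to respondents
# (df["responded"] == 1), using df["weight"] as the analysis weight.
```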
In developing the instruments for this study, we conducted a pilot test of the male and female survey instruments. The purpose of the pilot was to test the overall flow and length of the survey, to elicit information on how respondents formed and reported responses, and to identify any questions that might be difficult for respondents to answer. The male instrument was piloted with a convenience sample of nine incarcerated men who were married or in a committed relationship and had minor children. The female instrument was piloted with the committed partners of those men who agreed to do the interview and agreed to provide their partners’ contact information. Male interviews were conducted in the correctional facility where respondents were incarcerated, and female interviews were conducted in respondents’ homes. Both versions were piloted using a paper-and-pencil version of the instrument. Respondents were paid $25 each for the interviews. The instruments were revised based on information obtained through this process, with the goal of improving the quality of the data collected and minimizing burden on respondents. Reference periods were simplified for some constructs to provide greater ease of recall for items that pilot study participants found difficult to answer. The incentive process was also tested, and the payment for female respondents was increased to $40. (For the pilot feedback form, see Appendix H.)
Most items included in the data collection instrument have been used successfully in previous studies with similar populations, including the Multi-Site Evaluation of the Serious and Violent Offender Reentry Initiative (SVORI), the Evaluation of the Community Healthy Marriage Initiative (CHMI), and the longitudinal Returning Home study of reentering prisoners. Investigators who were involved in each of these projects provided guidance and feedback on the survey instrument, and their experience collecting survey data from similar populations on the outcomes of interest was very helpful.
a) Individuals who have participated in designing the data collection:
ASPE staff
Linda Mellgren Linda.Mellgren@hhs.gov
Jennifer Burnszynski Jennifer.Burnszynski@hhs.gov
Nicole Gardner-Neblett, PhD Nicole.Gardner-Neblett@hhs.gov
Diana Merelman Diana.Merelman@hhs.gov
Erica Meade Erica.Meade@hhs.gov
Kimberly Clum Kimberly.Clum@hhs.gov
RTI International Staff
Anupa Bir, PhD abir@rti.org
Christine Lindquist, PhD lindquist@rti.org
Tasseli McKay tmckay@rti.org
b) Individuals who will participate in the collection of data (all from RTI International):
Kristine Fahrney fahrney@rti.org
Azot Derecho derecho@rti.org
Judy Myer mfsjmey1@mfsmail.rti.org
Barbara Davis mfsbdav1@mfsmail.rti.org
c) Individuals who will participate in data analysis:
ASPE Staff
Linda Mellgren Linda.Mellgren@hhs.gov
Erica Meade Erica.Meade@hhs.gov
Kimberly Clum Kimberly.Clum@hhs.gov
RTI International Staff
Anupa Bir, PhD abir@rti.org
Christine Lindquist, PhD lindquist@rti.org
Mindy Herman-Stahl, PhD mindy@rti.org
Kristine Fahrney fahrney@rti.org
Tasseli McKay tmckay@rti.org
Sarah Siegel, PhD ssiegel@rti.org
Danielle Steffey steffey@rti.org
Vince Keyes vkeyes@rti.org
Accordino, M.P., and B. Guerney. 1998. “An Evaluation of the Relationship Enhancement Program with Prisoners and Their Wives.” International Journal of Offender Therapy and Comparative Criminology 42(1):5-15.
Bauer, B., J. Hart, A. Hopewell, and N. Tein. 2007. “Research and Practice Symposium on Marriage and Incarceration: A Meeting Summary.” Prepared by Health Systems Research Inc. for the Office of the Assistant Secretary for Planning and Evaluation, U.S. Department of Health and Human Services, January 2007. <http://aspe.hhs.gov/hsp/07/marr-incar/index.htm>.
Bayse, D.J., S.M. Allgood, and P.H. Van Wyk. 1991. “Family Life Education: An Effective Tool for Prisoner Rehabilitation.” Family Relations: Journal of Applied Family & Child Studies 40(3):254-257.
Bloom, Howard S. 1995. “Minimum Detectable Effects: A Simple Way to Report the Statistical Power of Experimental Designs.” Evaluation Review 19(5): 547-56.
Bowling, T.K., C.M. Hill, and M. Jencius. 2005. “An Overview of Marriage Enrichment.” The Family Journal: Counseling and Therapy for Couples and Families 13(1):87-94.
Carlson, B.S., and N. Cervera. 1991. “Inmates and Their Families.” Criminal Justice and Behavior 18(3):318-339.
Center for Children of Incarcerated Parents (CCIP). 2001. CCIP Data Sheet 3a: How Many Children of Incarcerated Parents Are There? Eagle Rock, CA: Center for Children of Incarcerated Parents. <http://www.e-ccip.org/publication.html>. As obtained on June 29, 2006.
Cohen, J. 1988. Statistical Power Analysis for the Behavioral Sciences (2nd ed.). Hillsdale, NJ: Lawrence Erlbaum Associates.
Economic Policy Institute. 2008. Economic Snapshots, Snapshot for August 21, 2002: Bringing the Jobs Back Home to Prisons. Retrieved March 7, 2008, from <http://www.epi.org/content.cfm/webfeatures_snapshots_archive_08212002>.
Hairston, C.F. 1995. “Fathers in Prison.” In Children of Incarcerated Parents, D. Johnston and K. Gables, eds., pp. 31-40. Lexington, MA: Lexington Books.
Hairston, C.F. 2001. “Prisoners and Families: Parenting Issues during Incarceration.” Chicago, IL: Jane Addams College of Social Work, University of Illinois, Chicago. <http://aspe.hhs.gov/search/hsp/prison2home02/Hairston.htm>.
Hairston, C.F., and P.C. Locket. 1987. “Parents in Prison: New Directions for Social Services.” Social Work 32(2):162-164.
Harrison, K. 1997. “Parental Training for Incarcerated Fathers: Effects on Attitudes, Self-Esteem, and Children’s Self Perceptions.” Journal of Social Psychology 137:588-593.
Larson, J.H. 2004. “Innovations in Marriage Education: Introduction and Challenges.” Family Relations 53:421-424.
Laub, J.H., and R.J. Sampson. 2003. Shared Beginnings, Divergent Lives: Delinquent Boys to Age 70. Cambridge, MA: Harvard University Press.
Markman, H.J., G. Kline, J.G. Rea, S. Simms-Piper, and S.M. Stanley. 2005. “A Sampling of Theoretical, Methodological and Policy Issues in Marriage Education: Implications for Family Psychology.” In Family Psychology: The Art of the Science, W.M. Pinsof and J.L. Lebow, pp. 115-137. New York: Oxford University Press.
Martin, J.S. 2001. Inside Looking Out: Jailed Fathers’ Perceptions about Separation from Their Children. New York: LFB Scholarly Publishing LLC.
Peladeau, N., and Y. Lacouture. 1993. “SIMSTAT: Bootstrap Computer Simulation and Statistical Program for IBM Personal Computers.” Behavior Research Methods, Instruments & Computers 25(33): 410-413.
Sampson, R.J., J.H. Laub, and C. Wimer. 2006. “Does Marriage Reduce Crime? A Counterfactual Approach to Within-Individual Causal Effects?” Criminology 44(3):465-508.
Satorra, A., and W.E. Saris. 1985. “Power of the Likelihood Ratio Test in Covariance Structure Analysis.” Psychometrika 50:83-90.
Singer, E. 2002. “The use of Incentives to Reduce Non-Response in Household Surveys.” In Survey Nonresponse, R. Groves, D.A. Dillman, J.L. Eltinge, and R.J.A. Little, eds. New York: Wiley.
Snyder, Z.K., T.A. Carlo, and M.M. Coats Mullins. 2001. “Parenting from Prison: An Examination of a Children’s Visitation Program at a Women’s Correctional Facility.” Marriage & Family Review 32(3-4):33-61.
Thomas, L., and C.A. Krebs. 1997. “A Review of Statistical Power Software.” Bulletin of the Ecological Society of America 78:126-139.
Travis, J., E.C. McBride, and A.L. Solomon. 2005. The Hidden Costs of Incarceration and Reentry. Washington, DC: The Urban Institute, Justice Policy Center. <http://www.urban.org/publications/310882.html>. As obtained on June 29, 2006.
Visher, C., N.G. La Vigne, and J. Travis. 2004. Returning Home: Understanding the Challenges of Prisoner Reentry, Maryland Pilot Study: Findings from Baltimore. Washington, DC: The Urban Institute, Justice Policy Center. <http://www.urban.org/publications/410974.html>. As obtained on June 29, 2006.
Visher, C.A., and J. Travis. 2003. “Transitions from Prison to the Community: Understanding Individual Pathways.” Annual Review of Sociology 29:89-113.
Warriner, K., H. Goyder, H. Gjertsen, P. Hohner, and K. McSpurren. 1996. “Charities, No; Lotteries, No; Cash, Yes: Main Effects and Interactions in a Canadian Incentives Experiment.” Public Opinion Quarterly 60:542-562.
Western, B., L. Lopoo, and S. McLanahan. 2004. “Incarceration and the Bonds Between Parents in Fragile Families.” In Imprisoning America: The Social Effects of Mass Incarceration, M. Patillo, D. Weiman, and B. Western, eds., pp. 21-45. New York: Russell Sage Foundation.
1 The power estimates for the CTS II (a measure of partner violence) require the additional assumption of a negative binomial model to account for the highly skewed distributions that typically characterize violence data.