Regional Partnership Grants (RPG) National Cross-Site Evaluation and Evaluation Technical Assistance
Supporting Statement, Part A
For OMB Approval
August 2016
The Children’s Bureau (CB) within the Administration for Children and Families (ACF) of the U.S. Department of Health and Human Services seeks a renewal of clearance to collect information for the Regional Partnership Grants to Increase the Well-being of and to Improve Permanency Outcomes for Children Affected by Substance Abuse Cross-Site Evaluation and Evaluation-Related Technical Assistance (RPG2) and Evaluation-Related Technical Assistance and Data Collection Support for Regional Partnership Grant Program Round Three Sites (RPG3) collectively referred to as the RPG Cross-Site Evaluation. Under RPG, CB has issued 21 grants (17 grants under RPG2 and 4 grants under RPG3) to organizations such as child welfare or substance use disorder treatment providers or family court systems to develop interagency collaborations and provide services designed to increase well-being, improve permanency, and enhance the safety of children who are in an out-of-home placement or are at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance use disorder. CB required RPG grantees to use evidence-based or evidence-informed programs to deliver services to children, adults, and families.
The overall objective of the RPG Cross-Site Evaluation is to plan, develop, and implement a rigorous national cross-site evaluation of the RPG Grant Program, provide legislatively-mandated performance measurement, and furnish evaluation-related technical assistance to the grantees to improve the quality and rigor of their local evaluations. The project will document the programs and activities conducted through the RPG program and assess the extent to which the grants have been successful in addressing the needs of families with substance use disorders who come to the attention of the child welfare system.
As part of providing technical assistance, the evaluator is required to advise CB on the instruments grantees are to use to collect data from program participants for required local evaluations. Grantees have secured approval from their local IRBs for collecting these data. This information collection request (ICR) requests a renewal of clearance for obtaining from grantees participant data they collect for their local evaluations, and for directly collecting additional data from grantees and their partners and providers, for the cross-site evaluation.
This ICR requests a renewal of clearance for the OMB package #0970-0444 which was originally approved in March 2014. Four RPG3 grantees were awarded grants from CB in September 2014 and were added to the existing OMB package through a non-substantive change request approved by OMB in June 2015. Specifically, this ICR requests clearance for the following data collection activities: (1) RPG staff and partner semi-structured interviews during site visits, (2) a web-based staff survey, (3) semi-annual progress reports, (4) enrollment and service use data collection, (5) a web-based partner survey, and (6) data entry and uploads to a web-based data portal of child, adult, and family outcome data for families enrolled in RPG, and those in comparison groups for a subset of grantees. These data collection activities will be used in an implementation and partnership study, an outcomes study, and an impact analysis.
The evaluation is being undertaken by the U.S. Department of Health and Human Services, ACF, CB, and its contractor, Mathematica Policy Research. The evaluation is being implemented by Mathematica Policy Research and its subcontractors, WRMA, Inc., and Synergy Enterprises, Inc.
When mothers, fathers, or other caregivers struggle with substance use disorders, children can experience unresponsive, erratic, neglectful, or abusive care from those responsible for their nurture. This in turn can interfere with children’s physical, social, and emotional development and well-being. Substance use disorder limits parents’ ability to create a safe and stable environment for their children, and children of parents with substance use disorder have poorer physical, intellectual, social, and emotional health and are at greater risk of abusing drugs or alcohol themselves as adults (U.S. Department of Health and Human Services 1999; U.S. Department of Health and Human Services 2009; Osterling and Austin 2008; Niccols et al. 2012). Trauma resulting from parental neglect or abuse associated with substance use disorder can have a particularly detrimental effect on young children’s development.
Substance use disorder is a prominent cause of family involvement in the child welfare system: research indicates that between 50 and 80 percent of child welfare cases involve a parent with substance use disorder (Niccols et al. 2012; U.S. Department of Health and Human Services 1999). In 2009, the rate of substantiated child maltreatment reports was 10 per 1,000 children ages birth to 17; the rate was especially high for children under age 1, at 21 per 1,000 (Federal Interagency Forum on Child and Family Statistics 2012).
Most adult participants in substance use disorder treatment are parents. One study concluded that about 58 percent of participants in treatment had minor children—69 percent of women were mothers, and 52 percent of men were fathers (Young et al. 2007; Brady and Ashley 2005). Further, it was estimated that 27 percent of parents in treatment had lost custody of one or more children. Nonetheless, there has been limited targeting of treatment in a way that explicitly recognizes participants’ status as parents, especially parents who are engaged with the child welfare system. The targeted programs that do exist tend to focus on mothers rather than fathers (mothers more typically being the custodial parent), though research indicates that substance use disorder among fathers is also associated with less engaged and less responsible parenting (Conners et al. 2006; McMahon et al. 2007).
Many parents with children in the child welfare system have greater difficulty than others completing substance use disorder treatment programs. About one-fifth of parents whose child was involved with the child welfare system successfully completed substance use disorder treatment, compared to about half of those seeking treatment in the general population (Choi and Ryan 2006; Brady and Ashley 2005). These parents’ relative difficulty in addressing their addictions may be due to the inability of treatment programs to accommodate their complex circumstances and service needs. Research indicates that parents involved in substance use disorder treatment and the child welfare system may differ in important ways from those who are not involved in child welfare services. One California study, for example, found that these dual-system parents tend to be younger, have more children, experience greater economic instability, and have greater involvement in the criminal justice system, when compared with other parents in treatment (Grella et al. 2006). However, mothers who participated in treatment programs that provided a high level of family-related services or that focused on education or employment were about twice as likely to reunify with their children as those in programs with low levels of such services, which suggests strongly that addressing the full range of treatment-related needs of parents involved with child welfare is important (Grella et al. 2009; Brady and Ashley 2005). Mothers with substance-exposed infants can benefit from residential treatment in terms of both treatment progress and family reunification, but only when residential services are delivered in combination with transitional services (Huang and Ryan 2010).
Coordinating services between the child welfare system and substance use disorder treatment providers to address the needs of these families has been challenging for several reasons (U.S. Department of Health and Human Services 1999; Semidei et al. 2001). Each system has different perspectives about who the “client” is and about issues such as removal and reunification. The two systems are embedded in different federal and state legal and policy environments. In addition, many child welfare agencies operate in a culture of crisis (Golden 2009). There has been a chronic shortage of substance use disorder treatment programs, especially those appropriate for parents of young children. Confidentiality requirements can make cooperation and communication across systems challenging. In addition, ineffective screening by staff in both types of agencies can make early detection of problems difficult. One research review, for example, noted that child welfare agency staff in one study failed to identify substance use disorder in 61 percent of caregivers who in fact met the clinical criteria for alcohol or drug dependency (Young et al. 2007). Similarly, substance use disorder treatment workers must be trained to screen effectively for child neglect and abuse and make appropriate referrals.
Since 2006, Congress has authorized competitive grants to address these problems. The Child and Family Services Improvement Act of 2006 (Pub. L 109-288) provided funding over a five-year period to implement regional partnerships among child welfare, substance use disorder treatment providers, and related organizations to improve the well-being, permanency, and safety outcomes of children who were in, or at risk of, out-of-home placement as a result of a parent’s or caregiver’s methamphetamine or other substance use disorder. With this funding, the Children’s Bureau (CB) within the Administration on Children, Youth and Families, Administration for Children and Families at the U.S. Department of Health and Human Services (HHS) established the Regional Partnership Grant (RPG) program.
The Child and Family Services Improvement and Innovation Act of 2011 (Pub. L. 112-34) reauthorized the RPG program, extended funding, and authorized new demonstration projects through 2016. With the funding, CB offered new competitive grants up to $1 million per year for five years. In September 2012, CB awarded 17 RPG grants (Regional Partnership Grants to Increase the Well-Being of and to Improve the Permanency Outcomes for Children Affected by Substance Abuse HHS-2012-ACF-ACYF-CU-0321), and in September 2014, four additional five-year grants were awarded (Regional Partnership Grants to Increase the Well-Being of, and to Improve the Permanency Outcomes for, Children Affected by Substance Abuse HHS-2014-ACF-ACYF-CU-0809). In total CB has funded 21 RPG projects under this legislation.
The RPG Grant Program is unique in its emphasis on developing partnerships between child welfare and substance use disorder treatment systems to better meet the needs of children who are in an out-of-home placement or are at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance use disorder. The RPG cross-site evaluation will provide important information about the characteristics of these families and the services they receive through RPG, as well as characteristics of the partnerships and how child welfare and substance use disorder treatment providers work together. In addition, the study will provide important information about changes over time in child, adult, and family outcomes, and the effectiveness of RPG services for selected grantees, including the effectiveness of evidence-based programs (EBPs) being implemented with these target populations for the first time. The information gathered will be critical to informing decisions related to future federal and community investments in services that meet the needs of children and families involved in the child welfare and substance use disorder treatment systems, as well as information about developing strong partnerships between the two systems.
The RPG Cross-Site Evaluation is a comprehensive yet efficient study that includes an implementation and partnership study and an outcomes study. These studies will build knowledge about implementing programs and services for families involved in the child welfare and substance use disorder treatment systems and about developing more effective partnerships between the two systems to coordinate services for these families. They will describe the characteristics and outcomes of children, adults, and families involved in both systems and exposed to evidence-based program and practice models that may not have been tested before with these target populations. A pooled cross-site impact study will test the impact of these EBPs and other integrated services on child well-being, safety, and permanency, on adult recovery, and on family functioning and stability.
The implementation and partnership study will build knowledge about (1) effective implementation strategies across the 21 RPG projects, with a focus on factors shown in the research literature to be associated with quality implementation and (2) effective strategies for building and sustaining partnerships and integrated services between the child welfare and substance use disorder treatment providers. Key data collection activities include: (1) conducting semi-structured interviews with selected grantee and partner staff during site visits; (2) collecting semi-annual progress reports from grantees; (3) obtaining data from grantees on program enrollment, exit, and service use; (4) administering a web-based survey of service delivery staff; and (5) administering a web-based survey of lead partner staff.
The outcomes study will describe the characteristics of and changes over time in children, adults, and families who participate in the RPG programs. This descriptive study will report participant outcomes in five domains of high interest to CB: (1) child well-being, (2) family functioning/stability, (3) adult recovery, (4) child permanency, and (5) child safety. RPG grantees will be collecting data from or about participants in their RPG programs for local evaluations required under the terms of their RPG grants. They will provide some of these data to the cross-site evaluation contractor for use in the cross-site outcomes study.
The impact study will estimate the effectiveness of selected RPG interventions by comparing outcomes for individuals enrolled in RPG services to those in comparison groups. The impact study will pool outcome data on program and comparison group members from five grantees with appropriate local evaluation designs.
This ICR requests a renewal of clearance for seven data collection instruments. Five will be used to collect evidence for the implementation and partnership study, one will be used for the outcomes study, and one will be used for the impact study. This ICR focuses on the remaining data collection across all 21 RPG projects. Some of the data collection activities were completed with RPG2 grantees during the first OMB clearance period and have been removed from this ICR. Thus, some of the remaining data collection involves only the four RPG3 projects, while other data collection activities include all 21 grantees. These efforts are listed below and described in greater detail in section A.2.
Grantee and partner staff topic guide. A topic guide will be used to conduct semi-structured interviews with selected grantee and partner staff during site visits to each of the four RPG3 grantees; the visits will be conducted during the first year of the 3-year OMB clearance period being requested. The interview topic guide is included as attachment I.
Semi-annual progress reports. The implementation study will use information from federally required semi-annual progress reports to be submitted twice a year during the requested extension period, to obtain implementation information. The semi-annual progress reports for RPG2 and RPG3 are included as attachments IIA and IID in the appendix. The descriptions of evidence-based practices and other services and activities are included as attachments IIB and IIC for RPG2 and attachments IIE and IIF for RPG3.
Enrollment and service log. An enrollment and service log will be used to collect data from grantees on their enrollment of participants and provision of services to them. Grantee or provider staff will enter data as services occur. The enrollment and service log data dictionary is included as attachment III in the appendix.
Staff survey. The staff survey will be web-based and administered to frontline staff from each of the four RPG3 grantees who provide direct services to children, adults, and families through 10 focal EBPs (identified in Part B of this Supporting Statement). The survey will be administered in the first year of the OMB extension. The web-based staff survey instrument is included as attachment IV in the appendix.
Partner survey. The partner survey will be web-based and administered to representatives of the grantee organizations and their partner organizations from each of the four RPG3 grantees. The survey will be administered in the first year of the extension period. The web-based partner survey is included as attachment V in the appendix.
As part of providing technical assistance, the evaluator is required to advise CB on selecting a core set of instruments and administrative records that grantees are required to use to collect data from, or obtain administrative data on, program participants for their required local evaluations. Some grantees may collect additional data to meet their needs; however, to minimize the data collection burden on participating families, grantees will share data from the core instruments with the cross-site evaluation for use in the cross-site outcomes study (and the impact study, described next).
Outcomes study master instrument. The master instrument refers to the required standardized instruments and a list of required data elements to be drawn from administrative records. These are included as attachment VI in the appendix.
Impact study master instrument. In addition to sharing data on program participants for the outcomes study, grantees participating in the impact study will share a subset of core outcome data they collect on comparison group members. The “impact master instrument” refers to five of the 10 standardized instruments being used for the outcomes study and the list of data elements to be drawn from administrative records, which will be reported for comparison group members. These are included as attachment VI in the appendix.
The Promoting Safe and Stable Families Program (Section 437(f), Subpart 2, Title IV-B, of the Social Security Act) (42 U.S.C. 629g(f)), as amended by the Child and Family Services Improvement and Innovation Act (Pub. L. 112-34). The Child and Family Services Improvement and Innovation Act (Pub. L. 112-34) includes a targeted grants program (section 437(f) of the Social Security Act) that directs the Secretary of Health and Human Services (HHS) to reserve a specified portion of the appropriation for Regional Partnership Grants to improve the well-being of children affected by substance use disorder. This legislation also requires grantees to report performance indicators aligned with their proposed program strategies and activities. Under the terms of the RPG grant, CB requires grantees to participate in a national cross-site evaluation. The Child and Family Services Improvement and Innovation Act (Pub. L. 112-34) is included as attachment VII in the appendix.
The data collected through the instruments included in this ICR will be analyzed and reported on by the RPG Cross-Site Evaluation. The purpose of the evaluation is to meet the legislative requirement for evaluation; its findings may also assist Congress in setting future policy. It is also designed to contribute to the knowledge base about the implementation and effectiveness of strategies and evidence-based programs selected by RPG grantees for meeting the needs of children who are in an out-of-home placement or are at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance use disorder. Interim findings from the evaluation have been summarized in three reports to Congress (U.S. Department of Health and Human Services 2014, 2015, 2016). The findings from the RPG Cross-Site Evaluation will be used by policymakers and funders to consider what strategies and programs they should support to meet the needs of these families. The findings can be used by providers to select and implement strategies and program models suited to the specific families and communities they serve. Evaluation findings can fill research gaps, for example, by rigorously testing program models that have prior evidence of effectiveness with some target populations but not with these target populations, or that have not been tested when provided in combination with other services and programs. Congress will also use information provided through the evaluation to examine the performance of the grantees and the grant program. Details on the purpose and use of the information collected through each instrument in the implementation and partnership study, outcomes study, and impact study are provided below.
The purpose of the implementation and partnership study is to examine the processes and content of implementation and partnership development and management, with a focus on factors shown in the research literature to be associated with quality implementation and sustainable partnerships. The study will provide descriptions of RPG grantees’ target populations, selection of EBPs and their fit with the target population, inputs to implementation (such as staff selection and hiring, staff qualifications and attitudes toward implementing EBPs, staff training, supervision and feedback, organizational climate, leadership and decision making, administrative support, referral processes, and use of data systems), and actual services provided for selected EBPs (including dosage, duration, content, adherence to program models, and participant responsiveness). The study will also provide a description of the characteristics of RPG partners, their roles in RPG programs, their relationships and communication systems, the extent of coordination and collaboration among partners, and their potential to sustain the partnerships at the end of the grant funding.
Grantee and partner staff topic guide. The purpose of the topic guide is to collect detailed information from selected program and grantee staff and partners about plans and goals for their RPG program; their decisions about which EBPs to select; the organization and leadership of the RPG partnership; the community and state context; staff satisfaction with using the selected EBPs and their perceptions of the consistency and quality of service provision; and implementation experiences, facilitators, barriers, challenges, and lessons learned.
Semi-annual progress reports. The semi-annual progress reports will be used to obtain updated information from grantee project directors about their program operations and partnerships, including any changes from prior periods. To fully meet the intent of the Funding Opportunity Announcement, grantees must adopt and implement specific, well-defined program services and activities that are evidence-based and evidence-informed and trauma-informed. The CB has tailored the semi-annual progress reports to collect information on grantees’ evidence-based and evidence-informed programs and other services grantees implement, the target population for the RPG program, and grantees’ perceived successes and challenges to implementation.
Enrollment and service log. The purpose of this instrument is to describe the services that RPG clients actually receive. Grantees will record the enrollment date for each RPG family or household and demographic information on each family member, including date of birth, ethnicity, race, primary language spoken at home, type of current residence, income and sources (adults only), highest education level attained (adults only), and relationship to a focal child in each family on whom data will be collected. Grantees will also record the enrollment date for families or individual family members into specific EBPs, weekly service contact information for selected EBPs, and exit dates for EBPs and RPG.
Staff survey. Respondents for the staff survey will be all staff members who provide direct services to children, adults, and families through the 10 focal EBPs. (Each grantee is implementing one or more of the focal EBPs). The survey will collect information about their roles on RPG; their demographic characteristics, prior experience, and education; their attitudes toward implementing the EBP; any planned or unplanned adaptations made to the EBP; the supervision and support they receive; and the climate within their organization.
Partner survey. The partner survey will be administered to grantees and their partners. The purpose of the partner survey is to gather information on the characteristics of the partner organizations, how partners communicate and collaborate, goals of the partnership, and the types of organizations and roles within the partnership.
The goal of the outcomes study is to describe the characteristics of children and families who participate in the RPG programs and their outcomes in five domains: (1) child well-being; (2) family functioning and stability; (3) adult recovery; (4) child permanency; and (5) child safety.
Outcomes study master instrument. The purpose of the outcome master instrument is to provide instruments and specifications for administrative records in a convenient format, to help ensure consistency across grantees and to minimize duplication across instruments. The master instrument includes: (1) 10 standardized instruments used widely for family support, child development, and substance use disorder treatment research—including 7 copyrighted instruments; and (2) a list of data elements to be drawn from administrative records. Forms and information in the master instrument will be used by grantees to collect data from or on participants in the RPG programs, for use in evaluating outcomes or impacts in their local evaluations, and to share with the cross-site evaluation for describing participant characteristics and outcomes for the overall RPG grant program.
Ten standardized instruments will be included in the master instrument. The instruments will be administered by grantees at program entry and exit to obtain data on child well-being for a focal child identified in each RPG case, and for the family functioning/stability and recovery domains, as follows:
Child well-being
Trauma Symptoms Checklist for Young Children (Briere et al. 2001)
Behavior Rating Inventory of Executive Function (Gioia 2000) or the Behavior Rating Inventory of Executive Function-Preschool (Gioia 2000), depending on the age of the focal child
Child Behavior Checklist-Preschool Form (Achenbach and Rescorla 2000) or the Child Behavior Checklist-School-Age Form (Achenbach and Rescorla 2000), depending on the age of the focal child
Infant-Toddler Sensory Profile (Dunn 2002), if appropriate for the age of the focal child
Socialization Subscale, Vineland Adaptive Behavior Scales, Second Edition, Parent-Caregiver Rating Form (Sparrow, Cicchetti, and Balla 2005), if appropriate for the age of the focal child
Family functioning
Adult-Adolescent Parenting Inventory (Bavolek and Keene 1999)
Center for Epidemiologic Studies-Depression Scale, 12-Item Short Form (Radloff 1977)
Parenting Stress Index, Short Form (Abidin 1995)
Adult recovery
Addiction Severity Index, Self-Report Form (McLellan et al. 1992)
Trauma Symptoms Checklist-40 (Briere and Runtz 1989)
Grantees will also obtain data from administrative records maintained by local or state child welfare, foster care, and substance use disorder treatment providers for their local evaluations, and provide a core set of records to the cross-site evaluator. These records will be used to create measures of child safety and permanency and of adults’ receipt of substance use disorder treatment services and recovery. A list and specifications of the core set of records needed will be included in the master instrument.
The goal of the impact study is to provide pooled estimates of the effectiveness of RPG programs among selected RPG grantees with rigorous local evaluation designs. To help minimize the burden on grantees participating in this portion of the cross-site evaluation, the impact study will use a subset of outcome data to compare treatment and comparison groups.
Impact study master instrument. The purpose of the impact master instrument is to provide instruments and specifications for administrative records in a convenient format, and to help ensure consistency across the 5 grantees who will contribute data to the cross-site impact study.
This instrument includes specifications for administrative records and four standardized instruments—including three copyrighted instruments—that will be collected by grantees to capture outcomes in the child well-being, family functioning and recovery domains from comparison group members:
Child well-being
Child Behavior Checklist-Preschool Form or the Child Behavior Checklist-School-Age Form, depending on the age of the focal child
Socialization Subscale, Vineland Adaptive Behavior Scales, Second Edition, Parent-Caregiver Rating Form, if appropriate to the age of the child
Family functioning
Parenting Stress Index, Short Form
Recovery
Addiction Severity Index
Grantees will also obtain data from administrative records maintained by state child welfare and substance use disorder treatment providers for their local evaluations, and provide a core set of records to the cross-site evaluator. These records will be used to create measures of child safety and permanency and of adults’ receipt of substance use disorder treatment services and recovery. A list and specifications of the core set of records needed will be included in the outcomes and impact study master instruments.
The RPG Cross-Site Evaluation will make use of technology to collect study information. The only exceptions are for the semi-structured in-person interviews conducted during site visits, and the written semi-annual progress reports.
Web-based staff and partner surveys. The surveys of program staff and grantee partners will be administered via the web. Compared to other survey modes, web-based surveys offer ease and efficiency to respondents and help ensure data quality. The surveys will be programmed to automatically skip questions not relevant to the respondent, thus reducing cognitive and time burden. The instruments will also allow respondents to complete the surveys at a time (or times) convenient to them. If respondents are unable to complete the survey in one sitting they can save their place in the survey and return to the questionnaire at another time. Validation checks and data ranges will be built into appropriate items to ensure data quality.
Use of optimum technology applications to collect outcome and service data from grantees. The evaluation contractor and its subcontractors developed and maintain a seamless and transparent data reporting system for use by grantees, known as the RPG Data Portal. The RPG Data Portal is a user interface accessible from any computer, allowing for ease of entry, while all data are housed on secure servers behind the contractors’ firewall, thereby maintaining data security. The system has been designed with use by grantee staff in mind and is based on experience from prior studies with similar types of service providers and data. It is composed of two applications, designed to facilitate efficient reporting of (1) outcome data and (2) enrollment and service data, respectively.
Outcome data management system. Using the master outcome instrument, each grantee will report data from the standardized instruments and the data elements drawn from administrative records. Grantees have developed their own project or agency databases to store these data. This outcome data management system includes all data that the grantee collects from clients or on behalf of clients. The contractor has provided format specifications to the grantees for use in uploading outcome data through this application. These are in easy-to-use PDF and Excel formats. Grantees will upload these data twice a year. All 21 grantees have been trained on the system and have used the data management system under the previous OMB clearance period.
Enrollment and service log. Grantee staff will use the enrollment and service log to provide demographic information on each RPG case at enrollment, as well as enrollment and exit dates for the RPG project and each EBP in which case members enroll. It will also be used to provide data on families that enroll in the focal EBPs and by home visitors to document service delivery and to facilitate the tracking of all activities completed with the family, including assessments, referrals, education, and support. The design of the RPG enrollment and service log is based on web-based case management systems that Mathematica has developed and implemented successfully for multiple projects that involved collecting similar data from similar types of providers. For example, the enrollment and service log is flexible, easy to use, and includes navigational links to relevant fields for each type of entry to minimize burden on grantee staff and increase the quality and quantity of data collected. The log is designed to be used by multiple users at each organization and provides varying levels of access depending on users’ needs. For example, administrators or supervisors have the greatest rights within the system, having the ability to create new users, assign program participants to staff members, and review all activity from the organization. Staff providing direct services to study participants have the ability to record and review information about participants assigned to their caseload. The various levels of system access allow for streamlining of information. Limiting full system access to a small set of staff members promotes increased data security, reduces respondent confusion, and supports the collection of higher quality information. All 21 grantees have used the enrollment and service log system under the previous OMB clearance period.
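To make the structure of these records concrete, the following minimal Python sketch shows one possible way to represent an enrollment record, an EBP enrollment, and a weekly service contact. All class and field names are hypothetical illustrations and do not reflect the actual RPG Data Portal schema.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ServiceContact:
    """One weekly service log entry (about 3 minutes of staff time to record)."""
    contact_date: date
    activity: str              # e.g., assessment, referral, education, support
    minutes_of_service: int

@dataclass
class EbpEnrollment:
    """Enrollment of a case member in a specific evidence-based program."""
    ebp_name: str
    entry_date: date
    exit_date: Optional[date] = None
    weekly_contacts: List[ServiceContact] = field(default_factory=list)

@dataclass
class EnrollmentRecord:
    """One RPG case member at enrollment; field names are hypothetical."""
    case_id: str                              # content-free case identifier
    participant_id: str                       # content-free individual identifier
    rpg_enrollment_date: date
    date_of_birth: Optional[date] = None
    relationship_to_focal_child: Optional[str] = None
    rpg_exit_date: Optional[date] = None
    ebp_enrollments: List[EbpEnrollment] = field(default_factory=list)
```

A structure along these lines is meant only to convey the kinds of fields involved; grantee staff enter the information through the log’s web forms rather than through code.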
Use of evaluation data to construct performance indicators. The legislation that established the RPG Grant Program requires grantees to provide performance indicators to be included in annual reports to the Congress, which the cross-site evaluation contractor will produce. To minimize grantee burden, the cross-site evaluation contractor will use data obtained for the implementation and partnership study, the outcomes study, and the impact study to create the needed performance indicators. This avoids having grantees submit performance data in addition to data required for the evaluation. Data collected directly by Mathematica or provided by grantees for the cross-site evaluation will be used to describe to Congress the program strategies of each RPG grantee and their selected EBPs; the structure and membership of their collaborative partners across child welfare, substance use disorder treatment, judicial, and other systems; enrollment targets and the pace of enrollment, along with a description of program participants; and the services received by RPG clients. Combined with information on child, adult, and family outcomes, this information will give Congress a full picture of how grantees performed and the extent to which they met their RPG program goals.
The RPG Cross-Site Evaluation is specifically designed to minimize the duplication of efforts or data. First, to participate in the cross-site outcomes and impact evaluations, grantees will share some or all of the data they are collecting for their own required local evaluations. Second, data shared by grantees or provided through direct collection from grantees, staff members, and partners for the cross-site evaluation will also be used to describe grantee performance. That is, to reduce duplication of effort for grantees in complying with both CB’s local and cross-site evaluation requirements and the legislatively mandated performance indicators, the cross-site evaluation’s data needs completely overlap with the data needed for the performance indicators. Since the only existing reporting systems that collect the data required for reporting to Congress and for the cross-site evaluation were developed by the evaluation contractor and its subcontractors, this data collection plan does not duplicate any other efforts.
Furthermore, the design of the cross-site evaluation instruments ensures that there is no duplication of data collected through each instrument. For example, during the semi-structured interviews conducted during site visits, grantee and EBP staff members will not be asked any questions included in the staff survey. Information on program implementation, partners, and implementation challenges and successes provided in the semi-annual progress reports will reduce the level of detail needed from site visit interview participants, and also reduces the need to address RPG program operations in the staff and partner surveys. In creating the master outcome and impact instruments, the contractor has reviewed and cross-walked all items in order to identify duplication across instruments. Any duplicate items not needed for scoring the instruments were removed from the versions of the standardized instruments provided in the outcome and impact master instruments. This not only reduces burden on RPG participants in providing data for grantees’ local evaluations, but also reduces the burden on grantee staff for preparing and uploading outcome data to the cross-site evaluation.
The potential exists to affect small entities within a grantee site, depending on the local community partners and funders with which RPG grantees engage. RPG grantee partners and direct service providers will be included as part of site visit interviews and partner surveys. Additionally, RPG grantee agencies will enter data into the RPG data portal. Proposed data collection for these three efforts is designed to minimize the burden on all organizations involved, including small businesses and entities, consistent with meeting the aims of the legislation establishing RPG and CB’s need for valid, reliable, and rigorous evaluations of federally funded programs.
Not collecting information for the remainder of the RPG Cross-Site Evaluation would limit the government’s ability to document the performance of its grantees, as legislatively mandated, and to assess the extent to which these federal grants are successful in achieving their purpose. Furthermore, the RPG Cross-Site Evaluation provides a valuable opportunity for CB, practitioners, and researchers to gain empirical knowledge about the implementation and effectiveness of coordinated, evidence-based strategies for meeting the needs of families involved with the child welfare and substance use disorder treatment systems. The study will examine whether the government’s strategy of funding collaborative, cross-system partnerships is a productive one that is likely to be sustained after the grant period ends, and will examine the characteristics and roles of key partnering organizations, their coordination and communication mechanisms, and the quality of collaboration.
Grantee and partner staff topic guide. Without the information being collected through interviews with grantee and partner staff, the cross-site evaluation would have to rely entirely on implementation information reported by a single source: the RPG grantee project directors through the semi-annual progress reports. Thus, the study would lack the broader perspectives of other key participants and it would not be possible to conduct any in-depth analysis of critical program challenges and successes or implementation issues. The remaining site visits to each of the four RPG3 grantees are planned for the extension period. The visits will focus on understanding program design, rationale for selecting EBPs, implementation experiences, changes made to the program design, and rationale for the changes.
Semi-annual progress reports. Without continuing to obtain information from the semi-annual progress reports, the study will not have detailed information about grantee operations; changes to planned interventions, target population and eligibility criteria, or target outcomes; and planned or unplanned adaptations of EBPs that occur as the RPG grants are implemented. The progress reports will provide timely information about the infrastructure that grantees put in place to support implementation as well as features of the community context that have influenced grantees’ implementation plans. Collecting this information less often than twice a year would violate the federal requirements for grantee progress reporting, and would in fact place larger burdens on respondents to remember or store information about events, changes in direction, or challenges and successes over a longer period of time. Since aggregate information from the reports will be extracted and shared with grantees for program improvement and peer learning, less frequent reporting would also limit the ability of grantees to consider adjustments or reach out to one another. The data also provide information for designing evaluation-related and programmatic technical assistance in response to emerging issues and grantee needs.
Enrollment and service log. The enrollment and service log is important for describing actual service delivery to cases receiving selected EBPs and for tracking all activities completed with the family, including assessments, referrals, education, and support. Data will be collected when participants enroll, as they receive services, and at exit. Without continuing to obtain these data, the study would have incomplete information on the services recipients actually receive, including their duration and dosage. The evaluation would be unable to link participant outcomes to the levels or combinations of specific services or to understand whether and how participants participated in the selected EBPs. If data were collected less frequently, providers would have to store service data or try to recall it weeks or months after delivery. Regular collection will also enable us to check data quality and address missing data, errors, or other problems in a timely way.
Staff survey. Without this survey, information that would be difficult to obtain during semi-structured interviews, such as the quality of staff relationships and the supportiveness of program leadership, would not be collected. The staff survey will also enable the collection of data from a broader set of program staff than those who will be interviewed during the site visits, as well as the collection of more structured information.
Partner survey. Without the partner survey, information that would help in understanding the roles that partners play in RPG, the communications and working relationships among partners, the quality of their collaboration, and their goals for the RPG program in their region would not be collected. Since many federal initiatives require grantees to establish collaborations, and since the literature shows that collaboration and service integration between child welfare agencies, substance use disorder treatment providers, and other key systems such as the courts have been rare or difficult in the past, collecting these data will help fill important gaps in knowledge.
Outcomes study master outcome instrument. It is the mission of the CB to ensure child well-being, safety, and permanency for children who experience maltreatment. Continued use of the master outcome instrument will provide detailed information on these outcomes and the adults who receive services. Grantees will upload data from the master outcome instrument twice each year. Without this information, evaluators would be unable to describe the outcomes of RPG program participants or to analyze the extent to which grants have been successful in addressing the needs of families co-involved with substance use disorder treatment providers and child welfare systems. Further, it would be impossible to conduct an impact study (described next). During each upload, the outcome data management system performs automatic validation checks, enabling grantees to ensure quality and completeness of their data. Mathematica then reviews submissions to address any remaining data quality issues, and works with grantees to resolve problems. This ensures that data quality issues can be addressed early and resolved. If data were uploaded less often, it would be more cumbersome and difficult for grantees to search through older records to make corrections or provide missing data.
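To illustrate the kind of automatic validation checks described above, the sketch below applies a few generic checks (required fields, allowed values, numeric ranges, and dates) to a single uploaded record. The field names, instrument label, and ranges are hypothetical examples, not the portal’s actual specifications.

```python
from datetime import date

def validate_outcome_record(record: dict) -> list:
    """Return validation messages for one uploaded outcome record.

    Field names, allowed values, and ranges are hypothetical examples,
    not the RPG Data Portal's actual specifications.
    """
    errors = []

    # Required identifiers must be present and non-empty.
    for field_name in ("case_id", "participant_id", "instrument", "wave"):
        if not record.get(field_name):
            errors.append(f"missing required field: {field_name}")

    # The data collection wave must be an expected value.
    if record.get("wave") not in (None, "baseline", "program_exit"):
        errors.append(f"unexpected wave value: {record['wave']!r}")

    # Example range check on a hypothetical total score.
    score = record.get("total_score")
    if score is not None and not (0 <= score <= 100):
        errors.append(f"total_score out of range: {score}")

    # Administration dates should not fall in the future.
    administered = record.get("administration_date")
    if isinstance(administered, date) and administered > date.today():
        errors.append(f"administration_date is in the future: {administered}")

    return errors

# Example: a record with no wave recorded and an out-of-range score.
print(validate_outcome_record({
    "case_id": "C0001", "participant_id": "P0001",
    "instrument": "PSI-SF", "wave": None, "total_score": 250,
}))
```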
Impact study master outcome instrument. In addition to reporting data for the implementation and partnership study and the outcomes study, grantees participating in the impact study will also upload outcome data for participants in their comparison groups (those who do not receive RPG services or receive only a subset of RPG services). Without this information, it would not be possible to rigorously analyze the effectiveness of the interventions by comparing outcomes for individuals with access to RPG services to those in comparison groups, and attribute differences to RPG services. Uploading the data every six months provides the same benefits with respect to data quality described above.
There are no special circumstances requiring deviation from these guidelines.
The first Federal Register Notice was published on June 23, 2016 (Federal Register, Vol. 81, No. 122, Thursday, June 23, 2016, Notices, pp. 41310-12). The comment period ended August 22, 2016. No comments were received.
No payments or gifts will be provided to respondents as part of data collection.
This study is being conducted in accordance with all relevant regulations and requirements, including the Privacy Act of 1974 (5 U.S.C. 552a), the Privacy Act Regulations (34 CFR Part 5b), and the Freedom of Information Act (5 U.S.C. 552) and related regulations (41 CFR Part 1-1, 45 CFR Part 5b, and 40 CFR 44502). Several specific measures will be taken to protect respondent privacy.
Adopting strict security measures and web security best practices to protect data collected through the data portal. Data collected through the data portal (which includes outcome data as well as enrollment and service logs) will continue to be housed on secure servers that conform to the requirements of the HHS Information Security Program Policy. The data portal employs strict security measures and web security best practices to ensure the data are submitted, stored, maintained, and disseminated securely and safely. Strict security measures will continue to be employed to protect the confidentiality of participant information stored in the system, including data authentication, monitoring, auditing, and encryption. Specific security procedures include, but are not limited to, the following (a brief illustrative sketch of the account and role-scoping controls appears after this list):
All data will be encrypted in transit (using TLS protocol backward compatible with SSL)
Data will be encrypted at rest and reside behind firewalls
Access to the data portal will be restricted to approved staff members who will be assigned a password only with permission from the study director. Each user will have a unique user id/password combination
Database access will require special system accounts. Portal users will not be able to access the database directly
Portal users will be able to access the system only within the scope of their assigned roles and responsibilities
Security procedures will be integrated into the design, implementation, and day-to-day operations of the portal.
All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. To further ensure data security, project personnel are required to adhere to strict standards, receive periodic security training, and sign security agreements as a condition of employment
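The sketch below illustrates two of the listed procedures, unique salted credentials and role-scoped access, using only the Python standard library. The role names and permissions are hypothetical and are not the portal’s actual configuration.

```python
import hashlib
import hmac
import secrets
from typing import Optional, Tuple

# Hypothetical roles and permissions; the portal's actual configuration may differ.
ROLE_PERMISSIONS = {
    "administrator": {"create_user", "assign_cases", "view_all", "record_service"},
    "direct_service_staff": {"view_own_caseload", "record_service"},
}

def hash_password(password: str, salt: Optional[bytes] = None) -> Tuple[bytes, bytes]:
    """Return (salt, hash) using PBKDF2; each account gets a unique random salt."""
    salt = salt or secrets.token_bytes(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes) -> bool:
    """Compare stored and supplied hashes in constant time."""
    return hmac.compare_digest(hash_password(password, salt)[1], expected)

def authorize(user_role: str, action: str) -> bool:
    """Permit an action only if it falls within the user's assigned role."""
    return action in ROLE_PERMISSIONS.get(user_role, set())

# A direct-service user can record services but cannot create new accounts.
assert authorize("direct_service_staff", "record_service")
assert not authorize("direct_service_staff", "create_user")
```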
Training cross-site evaluation interviewers in confidentiality procedures. All site visit interviewers will be knowledgeable about privacy procedures and will be prepared to describe them in detail or to answer any related questions raised by respondents. During the introduction to each interview, site visit informants will be told that none of the information they provide will be used for monitoring or accountability purposes and that the results of the study will be presented in aggregate form only.
Using web-based staff and partner surveys. Administering the staff and partner surveys via web eliminates security risks related to shipping hard-copy forms containing personal identifying information (PII) to the evaluator.
Assignment of content-free case and participant identification numbers to replace personal identifying information associated with all participant outcome data provided by grantees to the cross-site evaluation. The cross-site evaluation has worked with grantees to implement standard procedures for assigning identification numbers to all participant-level data. Case- and individual-level numbers are content-free. For example, they do not include special codes to indicate enrollment dates, participant location, gender, age, or other characteristics.
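As a simple illustration of such content-free identifiers, the sketch below generates random identifiers that encode no participant characteristics; the actual assignment procedures agreed on with grantees may differ.

```python
import secrets

def new_content_free_id(prefix: str = "RPG") -> str:
    """Generate an identifier that encodes nothing about the participant.

    The random hexadecimal string carries no enrollment date, site, gender,
    age, or other characteristics. Any crosswalk between these identifiers
    and personal identifying information stays with the grantee and is never
    transmitted to the cross-site evaluation.
    """
    return f"{prefix}-{secrets.token_hex(6)}"   # e.g., "RPG-9f3a1c0b52de"

case_id = new_content_free_id()
participant_id = new_content_free_id()
```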
There are no sensitive questions in the instruments that the contractor will use to collect data.
(Some of the specified standardized instruments that grantees will use to collect data do include sensitive questions. For example, in the case of parents who are endangering their children as a result of their substance use disorder, it is essential for grantees to measure the parents’ pattern of substance use dependence as this is a critical indicator of recovery. In recognition of this need and to ensure confidentiality and other protections for their clients, as a condition of their RPG grant, all grantees were required to obtain, and have obtained, IRB clearance for their data collection.)
The estimated reporting burden and cost for the data collection instruments included in this ICR are presented in Table A.1. We are requesting renewal of clearance to collect data within a three-year period; all remaining data collection for the cross-site evaluation will occur during this three-year period.
We estimate the average hourly wage for program directors and managers to be the average hourly wage of “Social and Community Services Manager” ($30.99), that of grantee staff to be the average hourly wage of “Counselors, Social Workers, and Other Community and Social Service Specialists” ($21.26), that of data managers to be the average hourly wage of “Data Administrators” ($38.04), that of data entry specialists to be the average hourly wage of “Data Entry and Information Processing Workers” ($15.11) and that for partners to be the average hourly wage of “General and Operations Manager” ($55.22), taken from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2012. Table A.1 summarizes the proposed burden and cost estimates for the use of the instruments and products associated with the implementation and partnership study, the outcomes study, and the impact study.
The total estimated cost figures are computed from the total annual burden hours and average hourly wages for program directors and managers ($30.99), grantee staff ($21.26) and partners ($55.22), described above. For each burden estimate, annualized burden has been calculated by dividing the estimated total burden hours by the number of study years. Figures are estimated as follows:
Individual interview with program director. We expect to interview 4 RPG3 program directors (1 per grantee across 4 RPG3 grantees) once during the clearance period. These interviews will take two hours. Thus, the total burden for individual interview with program directors is 8 hours, and the total annualized burden is 2.67 hours (8 ÷ 3).
Group interview with program managers or supervisors. We expect to conduct semi-structured small-group interviews with 36 program managers (3 staff per EBP from 3 EBPs in each of the 4 RPG3 sites) once during the clearance period. These interviews will last 2 hours. Thus the total burden of participating in group interviews is 72 hours, and the total annualized burden is 24 hours (72 ÷ 3).
Individual interview with program manager or supervisor. We expect to conduct individual, semi-structured interviews with 24 program managers or supervisors (2 staff per EBP from 3 EBPs in each of the 4 RPG3 sites) once during the clearance period. These interviews will take 1 hour. Thus the total burden for individual interviews with program managers is 24 hours, and the total annualized burden is 8 hours (24 ÷ 3).
Individual interview with frontline staff. We expect to conduct individual, semi-structured interviews with 24 frontline staff who work directly with children and families (2 staff per EBP from 3 EBPs in each of the 4 RPG3 sites) once during the clearance period. These interviews will take 1 hour. Thus the total burden for individual interviews with frontline staff is 24 hours, and the total annualized burden is 8 hours (24 ÷ 3).
Semi-annual progress report. Grantees will submit two progress reports per year for each year of the clearance period. We assume that 21 project directors (1 per grantee) will submit the semi-annual progress reports 6 times during the clearance period. It will take 16.5 hours to submit each one. The total burden for submitting the semi-annual progress report is thus 2,079 hours, and the total annualized burden is 693 hours (2,079 ÷ 3).
Case enrollment. Based on grantee projections, we assume enrollment of 1,890 families per year (90 families per site across 21 sites). We assume that 3 staff per grantee will conduct enrollment, or 63 staff total, where each staff member conducts enrollment with approximately 30 families. It will take 15 minutes total to enroll each family. Thus, the total burden for enrolling families across all staff members for three years is 1,417.5 hours, and the total annualized burden is 472.5 hours (1,417.5 ÷ 3).
Service log entries. Based on the expected participation of families in specific RPG services and EBPs, we assume there will be one service log entry each week for 1,890 families per year (90 families per site across 21 sites). We assume that 6 staff per grantee will provide services and enter service data (126 staff total), with a caseload size of 15 families each. Each weekly entry will take 3 minutes. Thus, the total burden for completing service log entries is 14,742 hours, and the total annualized burden is 4,914 hours (14,742 ÷ 3); this figure and the progress report figure are recomputed in the illustrative sketch following these estimates.
Staff survey. We expect to administer the web-based survey once to 80 frontline staff (20 per site across 4 RPG3 sites). The survey will take 25 minutes to complete. Thus the total burden for the staff survey is 33.6 hours, and the total annualized burden is 11.2 hours (33.6 ÷ 3).
Partner survey. We expect to administer the web-based survey once to 80 grantee partners (20 per site across 4 sites). The survey will take 20 minutes to complete. Thus the total burden for the partner survey is 26.4 hours, and the total annualized burden is 8.8 hours (26.4 ÷ 3).
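As a check on the arithmetic above, the short sketch below recomputes the semi-annual progress report and weekly service log burden figures from the per-response estimates stated in the text. Variable names are illustrative only.

```python
# Recomputing two of the burden figures above from the estimates stated in the text.

STUDY_YEARS = 3

def annualized(total_hours: float) -> float:
    """Annualized burden = total burden hours divided by the number of study years."""
    return total_hours / STUDY_YEARS

# Semi-annual progress reports: 21 project directors x 6 reports x 16.5 hours each.
progress_report_total = 21 * 6 * 16.5                 # 2,079 hours
print(progress_report_total, annualized(progress_report_total))     # 2079.0 693.0

# Weekly service log entries: 1,890 families x 52 weeks x 3 minutes, each year.
service_log_annual = 1_890 * 52 * (3 / 60)            # 4,914 hours per year
service_log_total = service_log_annual * STUDY_YEARS  # 14,742 hours over 3 years
print(service_log_total, annualized(service_log_total))             # 14742.0 4914.0
```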
The total estimated cost is computed from the total annual burden hours and average hourly wages for data managers ($38.04) and data entry specialists ($15.11). Costs are estimated as follows:
Administrative Data
Obtain access to administrative data. Grantees will update administrative agreements with agencies that house the administrative records once in the first year of the extension and once in the second year of the extension. It will take 18 hours to update the agreements. Thus, the total burden is 756 hours. RPG grantees will use these data for their local evaluations as well; however to comply with the procedures for providing the data to the cross-site evaluation there may be additional steps needed. Therefore we have assumed that half of the burden of obtaining the administrative data (378 hours) should be allocated to the cross-site evaluation. The annualized burden is 126 hours (378 ÷ 3). We assume 1 data manager per grantee (or 21 data managers) will complete these processes.
Report administrative data. Grantees will upload administrative data they have obtained to the RPG data portal twice per year for the three-year clearance period. For each upload, it will take grantees 144 hours to prepare and upload their administrative data, including correcting any data validation problems. The total burden for reporting administrative data is thus 18,144 hours for all 21 grantees combined, and the total annualized burden is 6,048 hours (18,144 ÷ 3). We assume that 1 data entry operator per grantee (or 21 data entry operators) will upload the data.
Standardized Instruments
Enter data into local database. Over the course of the 3-year clearance period, grantees will enroll a total of 5,670 cases (1,890 cases enrolled each year). For every case, 10 standardized instruments will be administered at baseline and again at program completion, for a total of 113,400 administrations. Grantees will enter data from the completed instruments into their local databases; data entry will take 15 minutes (0.25 hours) per administration, for a total burden of 28,350 hours. RPG grantees will use these data for their local evaluations as well; however, additional steps may be needed to enter these data into their local databases in a way that complies with the procedures for providing the data to the cross-site evaluation. We have therefore assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus, the total burden for entering cross-site evaluation data is 14,175 hours, and the total annualized burden is 4,725 hours (14,175 ÷ 3). We assume that 21 data entry operators (1 operator in each site) will enter the data.
Review records and submit electronically. Grantees will review records to ensure that all data have been entered and will upload the data to the RPG portal twice per year for each year of the evaluation period. It will take 6 hours to review and submit data for each of the 10 instruments. Grantees will then validate and resubmit data when errors are identified; it will take 4 hours to validate data for each of the 10 instruments, including time for obtaining responses to validation questions and resubmitting the data. The total burden for reviewing and electronically uploading records is 7,560 hours, and the total burden for validating and resubmitting data is 5,040 hours. Thus, the total burden is 12,600 hours, and the annualized burden is 4,200 hours (12,600 ÷ 3). We assume that 21 data entry operators (1 operator in each site) will review and submit the data.
The total estimated cost is computed from the total annual burden hours and an average hourly wage for data entry specialists ($15.11 described above). Amounts are estimated as follows.
Standardized Instruments
Data entry for comparison study sites. Five grantees participating in the impact study will also enter data for comparison group members. Over the course of the study, the five grantees will enroll and collect data from 868 comparison group members. For every member, five standardized instruments will be administered at baseline and follow-up, for a total of 8,680 administrations. Grantees will enter data from the completed instruments into their local databases; each administration will take 0.25 hours to enter, for a total of 2,170 hours. RPG grantees will use these data for their local evaluations as well; however, additional steps may be needed to enter these data into their local databases in a way that complies with the procedures for providing the data to the cross-site evaluation. We have therefore assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus, the total burden for entering cross-site evaluation data is 1,085 hours, and the total annualized burden is 361.6 hours (1,085 ÷ 3). We assume that 5 data entry operators (1 operator in each of the 5 sites) will enter the data.
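As an illustration of the 50 percent allocation applied in this section, the minimal sketch below recomputes the halved and annualized burden for the items above. The figures are taken directly from the text; the helper function is purely illustrative and not part of any instrument.

```python
# Illustrative check of the burden arithmetic for items whose burden is split
# evenly between the local evaluations and the cross-site evaluation.

def allocated_burden(total_hours, share=0.5, years=3):
    """Return (cross-site burden, annualized cross-site burden) for a given share."""
    cross_site = total_hours * share
    return cross_site, cross_site / years

# Obtain access to administrative data: 21 grantees x 2 updates x 18 hours
print(allocated_burden(21 * 2 * 18))             # (378.0, 126.0)

# Enter data into local database: 5,670 cases x 10 instruments x 2 administrations x 0.25 hours
print(allocated_burden(5_670 * 10 * 2 * 0.25))   # (14175.0, 4725.0)

# Data entry for comparison study sites: 868 members x 5 instruments x 2 administrations x 0.25 hours
print(allocated_burden(868 * 5 * 2 * 0.25))      # (1085.0, ~361.7)
```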
Table A.1. Estimate of Burden and Cost for the RPG Evaluation – TOTAL Burden Request
| Data Collection Activity | Number of Respondents | Number of Responses | Average Burden per Response (hours) | Total Burden Hours | Average Hourly Wage | Total Annual Burden Hours |
| Implementation and Partnership Study | | | | | | |
| Program Director Individual Interview | 4 | 1 | 2 | 8 | $30.99 | 2.67 |
| Program Manager/Supervisor Group Interview | 36 | 1 | 2 | 72 | $30.99 | 24 |
| Program Manager/Supervisor Individual Interview | 24 | 1 | 1 | 24 | $30.99 | 8 |
| Frontline Staff Individual Interview | 24 | 1 | 1 | 24 | $21.26 | 8 |
| Semi-Annual Progress Report | 21 | 6 | 16.5 | 2,079 | $30.99 | 693 |
| Case Enrollment Data | 63 | 90 | 0.25 | 1,417.5 | $21.26 | 472.5 |
| Service Log Entries | 126 | 2,340 | 0.05 | 14,742 | $21.26 | 4,914 |
| Staff Survey | 80 | 1 | 0.42 | 33.6 | $21.26 | 11.2 |
| Partner Survey | 80 | 1 | 0.33 | 26.4 | $55.22 | 8.8 |
| Data Uploading for Outcomes Evaluation | | | | | | |
| Administrative Data | | | | | | |
| Obtain Access to Administrative Data | 21 | 1 | 18 | 378 | $38.04 | 126 |
| Report Administrative Data | 21 | 6 | 144 | 18,144 | $15.11 | 6,048 |
| Standardized Instruments | | | | | | |
| Enter Data into Local Database | 21 | 6 | 112.5 | 14,175 | $15.11 | 4,725 |
| Review Records and Submit Electronically | 21 | 6 | 100 | 12,600 | $15.11 | 4,200 |
| Additional Data Entry for Impact Evaluation | | | | | | |
| Data Entry for Comparison Study Sites | 5 | 868 | 0.25 | 1,085 | $15.11 | 361.6 |
| Total | | | | 64,808.5 | | 21,602.77 |
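As an illustrative cross-check of Table A.1 (not part of the information collection), the sketch below recomputes the total annual burden hours and the corresponding annualized burden cost from the table's annual hours and average hourly wages; the result matches the figures cited in this section.

```python
# Cross-check of Table A.1 rows: (total annual burden hours, average hourly wage).
rows = [
    (2.67, 30.99), (24, 30.99), (8, 30.99), (8, 21.26),   # site visit interviews
    (693, 30.99), (472.5, 21.26), (4_914, 21.26),           # progress reports, enrollment, service logs
    (11.2, 21.26), (8.8, 55.22),                             # staff and partner surveys
    (126, 38.04), (6_048, 15.11),                            # administrative data
    (4_725, 15.11), (4_200, 15.11),                          # standardized instruments
    (361.6, 15.11),                                          # comparison-site data entry
]

annual_hours = sum(hours for hours, _ in rows)
annual_cost = sum(hours * wage for hours, wage in rows)
print(f"Total annual burden hours: {annual_hours:,.2f}")   # ~21,602.77
print(f"Annualized burden cost:    ${annual_cost:,.0f}")   # ~$374,460
```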
These information collection activities do not place any additional costs on respondents or record keepers.
The estimated cost for completion of the RPG cross-site evaluation and technical assistance project is $5,204,361 over the five years of the evaluation. Of this total, $3,773,396 represents the costs of the cross-site evaluation. The total cost of the cross-site evaluation over the three years of the requested clearance is thus $2,231,564. The annualized cost to the federal government includes one third of that total ($743,855), plus the annualized burden cost of $374,460, for a total of $1,118,315 per year.
The annualized burden cost is slightly higher than during the original clearance period (approved in March 2014) because of the four additional grants awarded in September 2014 (Regional Partnership Grants to Increase the Well-Being of, and to Improve the Permanency Outcomes for, Children Affected by Substance Abuse, HHS-2014-ACF-ACYF-CU-0809). The increase reflects the four additional grantees' participation in the cross-site evaluation, including the implementation, partnership, outcomes, and impact studies.
However, the annualized burden is slightly lower than in the non-substantive change request that added the four RPG3 grantees, approved by OMB in June 2015, because some of the data collection activities have already been completed with the RPG2 and RPG3 grantees.
No changes have been made to the data collection processes. However, some of the data collection activities were completed with RPG grantees during the first OMB clearance period. The IC "Review and adopt reporting templates" was completed under the prior period and has been removed from this ICR.
The information from the RPG Cross-Site Evaluation, with its focus on program operations, implementation, and outcomes for families, will be useful to funders, practitioners, and other stakeholders interested in targeting resources to effective approaches for addressing the needs of families affected by substance use disorders. Identifying what has worked well allows subsequent efforts of program operators and funders to home in on evidence-based practices and strategies.
The instruments included in this OMB package for the implementation and partnership study, and the outcomes study, will continue to yield data that will be analyzed using qualitative and quantitative methods to describe the target populations’ characteristics and outcomes; program implementation and factors shown to be associated with high quality implementation; and the structure, quality and goals of partnerships. A descriptive analysis of participants will provide a snapshot of child, adult, and family characteristics and outcomes. Thorough documentation of program implementation and partnerships will expand understanding about the breadth of programs, practices, and services being offered by RPG to vulnerable families and will describe successes in achieving goals and barriers encountered. A greater understanding of how programs can be implemented with a network of partners may inform future efforts in this area.
Mathematica will continue to use standard qualitative procedures to analyze and summarize information from program staff interviews conducted using the semi-structured staff interview topic guide. These procedures include organization, coding, and theme identification. Standardized templates will be used to organize and document the information and then code interview data. Coded text will be searched to gauge consistency and consolidate data across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics/themes/categories (Yin 1994; Coffey et al. 1996) which can then be analyzed to address the study’s research questions.
Quantitative data will continue to be summarized using basic descriptive methods. For the outcomes study, data from the standardized instruments will be tabulated and used to create scales and scores appropriate for each instrument and using established norms when appropriate for the RPG target populations. Administrative records will be examined to determine whether incidents of maltreatment and removal from the home have occurred for children and whether adults have received substance use disorder treatment, and their frequency and resolution. These data will capture information at baseline and program exit for families who participate in services. For the implementation and partnership study, sources of quantitative data include the frontline staff and partner surveys. Data analysis from the surveys will follow a common set of steps involving data cleaning, variable construction, and computing descriptive statistics. To facilitate analysis of each data source we will create variables to address the study’s research questions. Construction of these analytic variables will vary depending on a variable’s purpose and the data source being used. Variables may combine several survey responses into a scale or a score, aggregate attendance data from a set time period, or compare responses to identify a level of agreement.
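The sketch below illustrates these survey analysis steps (data cleaning, construction of a scale score from several items, and descriptive statistics). It is a minimal sketch only; the file name and variable names (item_1 through item_5, site_id) are hypothetical placeholders, not the actual RPG instruments or analysis code.

```python
import pandas as pd

# Minimal sketch of the survey analysis steps described above, using
# hypothetical column names; not the actual RPG instruments or variables.
survey = pd.read_csv("frontline_staff_survey.csv")   # hypothetical file

# Data cleaning: drop records with no responses on any of the scale items.
items = [f"item_{i}" for i in range(1, 6)]
survey = survey.dropna(subset=items, how="all")

# Variable construction: combine several survey responses into a scale score.
survey["scale_score"] = survey[items].mean(axis=1)

# Descriptive statistics, overall and by site.
print(survey["scale_score"].describe())
print(survey.groupby("site_id")["scale_score"].agg(["mean", "std", "count"]))
```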
Service use data, entered by grantees into a web-based system, will also be used for the implementation study. The study will provide summary statistics for key program features, as illustrated in the sketch following this list:
Enrollment. For example, the average number of new cases each month.
Services provided by grantees. For example, the services in which clients typically participate (including any common combinations of services), distribution of location of services (such as home, treatment facility, or other site), the average number of selected services (such as workshops) offered each month, and common topics covered during services.
Participation. For example, the average length of time participants are served by the program, the average number of hours of services received by program participants, the average duration between enrollment and start of services.
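The following sketch illustrates how such summary statistics could be computed from a service log extract. The file name and columns (case_id, enroll_date, service_date, service_type, location, hours) are hypothetical placeholders rather than the RPG data portal's actual structure.

```python
import pandas as pd

# Minimal sketch of the service-use summaries listed above, assuming a
# hypothetical extract with one row per service contact.
logs = pd.read_csv("service_log.csv", parse_dates=["enroll_date", "service_date"])

# Enrollment: average number of new cases per month.
new_cases = logs.groupby("case_id")["enroll_date"].min()
print(new_cases.dt.to_period("M").value_counts().mean())

# Services provided: distribution of service types and service locations.
print(logs["service_type"].value_counts(normalize=True))
print(logs["location"].value_counts(normalize=True))

# Participation: average service hours per case and average days from
# enrollment to the first service contact.
print(logs.groupby("case_id")["hours"].sum().mean())
first_service = logs.groupby("case_id")["service_date"].min()
print((first_service - new_cases).dt.days.mean())
```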
We will analyze data from the web-based system for each grantee once a year to correspond to the annual reports identified in Table A.2. In each report, we will describe enrollment patterns, services provided, and participation patterns over the previous 12 months. Later analyses may describe how patterns changed over time, such as from the early to late implementation period.
The impact study will complement other components of the evaluation with an examination of program effectiveness in the areas of child well-being, safety, and permanency; recovery; and family functioning. A selected subset of five grantees that have proposed rigorous local evaluations, using either random assignment or a strong matched comparison group, will be included in the impact study. To be considered a strong matched comparison group design, the local evaluation must include baseline data on key characteristics, such as family functioning and parental substance use dependence, on which to establish equivalence with those receiving RPG services. As noted above, all grantees will provide data on the program groups as part of the outcomes study. Those involved in the impact study will also provide baseline and program exit data on comparison groups that do not receive RPG services.
The analysis of effects will have two components. First, we will pool the grantees that used randomized controlled trials (RCTs) in their local evaluations. RCTs have excellent internal validity (the ability to determine whether the program caused the outcomes) because the treatment and comparison groups are initially equivalent, on average, on all observed and unobserved characteristics. Any observed differences in outcomes between the program and control group families can therefore be attributed to the program with a known degree of precision. Second, we will pool grantees with RCTs and those with strong quasi-experimental designs (QEDs), in which program and comparison groups were matched on key factors, such as baseline history of substance use dependence and family functioning. The internal validity of QEDs is weaker than that of RCTs because differences on unobservable characteristics cannot be ruled out. However, a design with well-matched program participants and comparison group members provides useful information on program effects. Combining the QED results with the RCTs will increase the statistical power of the overall analysis, enabling the detection of smaller effects. Because of the serious consequences of child maltreatment, even relatively small effect sizes may be clinically meaningful.
Baseline data will be collected by grantees and their local evaluators and used in the impact and implementation analyses. First, baseline data will be used to describe the characteristics of RPG program participants. For each grantee, we will present tables of frequencies and means for key participant characteristics, including demographic and family information. We will also present aggregated results for the grantees with RCTs and the combined RCT-QED sample used for the impact study.
A key use of baseline data is to test for baseline equivalence for both the RCT and the RCT-QED samples. Although random assignment ensures that families participating in the program and those in comparison groups do not initially differ in any systematic way, there might still be chance differences between groups. Establishing baseline equivalence for the QEDs is critical for determining whether the comparison group represents a reasonable counterfactual, that is, what would have happened to the program group had they not received treatment. To confirm that there were no differences between the program and comparison groups at the study's onset, we will statistically compare key characteristics between the groups. In addition, because the standardized instruments will be administered twice (once at program entry and again at program exit), we will also compare outcome measures at program entry between the two groups. In particular, to establish baseline equivalence, we will conduct t-tests and F-tests for differences between the two groups, both overall and separately by grantee. In these comparisons, we will use the analytic sample, which is composed of respondents to both the baseline and follow-up instruments.
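The sketch below illustrates one way the baseline comparisons could be run. The analytic file, treatment indicator, and baseline measure names are hypothetical placeholders rather than the actual RPG variables, and the production analysis may use different software and additional tests.

```python
import pandas as pd
from scipy import stats

# Minimal sketch of baseline equivalence checks on hypothetical data.
analytic = pd.read_csv("impact_analytic_sample.csv")   # hypothetical analytic sample
baseline_measures = ["child_wellbeing_baseline", "family_functioning_baseline"]

for measure in baseline_measures:
    program = analytic.loc[analytic["treatment"] == 1, measure].dropna()
    comparison = analytic.loc[analytic["treatment"] == 0, measure].dropna()
    # Two-sample t-test for a difference in baseline means between groups.
    t_stat, p_value = stats.ttest_ind(program, comparison, equal_var=False)
    print(f"{measure}: t = {t_stat:.2f}, p = {p_value:.3f}")
```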
Baseline data will also be analyzed jointly with the follow-up data to estimate impacts. Using baseline data in the impact analysis will improve the statistical precision of impact estimates and control for any remaining differences between groups. The average impact estimate will be a weighted average of the site-specific impacts, where the weight for each site-specific impact is the inverse of the squared standard error of that impact. As such, sites with more precise impact estimates (for example, sites with larger sample sizes or baseline variables that are highly correlated with the outcomes) will receive greater weight in the average impact estimate. We will compare the results from the sites with RCT evaluations to those obtained with the RCT-QED sample, noting that the former is more rigorous, whereas the latter should be considered "suggestive" or "promising" evidence of effectiveness.
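The following sketch illustrates the inverse-variance weighting described above. The site-level estimates and standard errors are placeholders, not RPG results.

```python
# Minimal sketch of pooling site-specific impacts with inverse-variance weights
# (weight = 1 / squared standard error), as described above.
site_impacts = [0.12, 0.08, 0.20]       # hypothetical site-specific impact estimates
standard_errors = [0.05, 0.04, 0.10]    # hypothetical standard errors

weights = [1 / se**2 for se in standard_errors]
pooled_impact = sum(w * b for w, b in zip(weights, site_impacts)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5
print(f"Pooled impact: {pooled_impact:.3f} (SE {pooled_se:.3f})")
```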
To inform Congress on the performance and progress of the RPG sites, we will annually estimate and report on performance measures for the 21 sites. The reporting will include selected measures collected and calculated for the (1) implementation and partnership study, (2) outcomes study, and (3) impact study. To reduce the burden on grantees and local evaluators, we have designed the studies so that the performance measures overlap completely with those of the other evaluation components; no additional data are needed. Reporting will comprise results from the implementation analysis, such as information about program operations, enrollment, and participation; the partnership analysis, including partnership goals and collaboration; and detailed descriptions of the characteristics and outcomes associated with participating families.
This ICR is requesting clearance for data collection for the cross-site evaluation for three years, beginning April 2017. Once data collection is complete, reporting that utilizes data collected for the cross-site evaluation will continue through September 2020.
We will develop reports describing the results of the evaluation components in progress each year (Table A.2). For the overall performance component, we will produce annual reports to Congress beginning in September 2017, within this clearance period.
Table A.2. Schedule for the Continued RPG Cross-Site Evaluation
| Activity | Date |
| Data Collection | April 2017-March 2020a |
| Reports to Congress | Annually beginning September 2017b |
| Ad-hoc Reports or Research Briefs | As requested by Children's Bureaub |
a Data collection will continue as it did under the previous clearance period once the OMB renewal clearance is received.
b Reports prior to this OMB clearance renewal use data collected during the previous clearance period.
In addition to planned reports on the findings, RPG will provide opportunities for analyzing and disseminating additional information through special topics reports and research or issue briefs. Short research or policy briefs are an effective and efficient way of disseminating study information and findings. The cross-site evaluation team will produce up to two ad hoc reports or special topics briefs each, at the request of CB. Topics for these briefs will emerge as the evaluation progresses but could, for example, provide background on selected evidence-based practices; summarize key implementation, impact, or subgroup findings; or describe the study purpose and grantees.
Approval not to display the expiration date for OMB approval is not requested.
No exceptions are necessary for this data collection.