
Regional Partnership Grants National Cross-Site Evaluation and Evaluation Technical Assistance

OMB: 0970-0444






Regional Partnership Grants (RPG) National Cross-Site Evaluation and Evaluation Technical Assistance


Supporting Statement, Part A

For OMB Approval


December 4, 2013




A. JUSTIFICATION

A.1. Circumstances Making the Collection of Information Necessary

The Children’s Bureau (CB) within the Administration for Children and Families (ACF) of the U.S. Department of Health and Human Services seeks approval to collect information for the Regional Partnership Grants to Increase the Well-being of and Improve Permanency Outcomes for Children Affected by Substance Abuse (known as the Regional Partnership Grants Program or “RPG”) Cross-Site Evaluation and Evaluation-Related Technical Assistance project (the “RPG Cross-Site Evaluation”). Under RPG, CB has issued 17 grants to organizations such as child welfare or substance abuse treatment providers or family court systems to develop interagency collaborations and provide services designed to increase well-being, improve permanency, and enhance the safety of children who are in an out-of-home placement or are at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance abuse. CB required RPG grantees to use evidence-based or evidence-informed programs to deliver services to children, adults, and families.

The overall objective of the RPG Cross-Site Evaluation is to plan, develop, and implement a rigorous national cross-site evaluation of the RPG Grant Program, provide legislatively-mandated performance measurement, and furnish evaluation-related technical assistance to the grantees to improve the quality and rigor of their local evaluations. The project will document the programs and activities conducted through the RPG program and assess the extent to which the grants have been successful in addressing the needs of families with substance abuse problems who come to the attention of the child welfare system.

As part of providing technical assistance, the evaluator is required to advise CB on the instruments grantees are to use to collect data from program participants for required local evaluations. Grantees will secure approval from their local IRBs for collecting these data. This information collection request (ICR) requests clearance for obtaining from grantees participant data they collect for their local evaluations, and for directly collecting additional data from grantees and their partners and providers, for the cross-site evaluation.

Specifically, this ICR requests clearance for the following data collection activities: (1) RPG staff and partner semi-structured interviews during site visits, (2) a web-based staff survey, (3) semi-annual progress reports, (4) enrollment and service use data collection, (5) a web-based partner survey, and (6) data entry and uploads to a web-based data portal of child, adult, and family outcome data for families enrolled in RPG, and for those in comparison groups for a subset of grantees. These data collection activities will support an implementation and partnership study, an outcomes study, and an impact study.

The evaluation is being undertaken by the U.S. Department of Health and Human Services, ACF, CB, and its contractor, Mathematica Policy Research, which is implementing the evaluation with its subcontractors, Walter R. McDonald & Associates and Synergy Enterprises.

a. Background

When mothers, fathers, or other caregivers struggle with addiction, children can experience unresponsive, erratic, neglectful, or abusive care from those responsible for their nurture. This in turn can interfere with children’s physical, social, and emotional development and well-being. Substance abuse limits parents’ ability to create a safe and stable environment for their children, and children of substance-abusing parents have poorer physical, intellectual, social, and emotional health and are at greater risk of abusing drugs or alcohol themselves as adults (U.S. Department of Health and Human Services 1999; U.S. Department of Health and Human Services 2009; Osterling and Austin 2008; Niccols et al. 2012). Trauma resulting from parental neglect or abuse associated with substance abuse can have a particularly detrimental effect on young children’s development.

Substance abuse is a prominent cause of family involvement in the child welfare system: research indicates that between 50 and 80 percent of child welfare cases involve a substance-abusing parent (Niccols et al. 2012; U.S. Department of Health and Human Services 1999). In 2009, the rate of substantiated child maltreatment reports was 10 per 1,000 children ages birth to 17; the rate was especially high for children under age 1, at 21 per 1,000 (Federal Interagency Forum on Child and Family Statistics 2012).

Most adult participants in substance abuse treatment are parents. One study concluded that about 58 percent of participants in treatment had minor children—69 percent of women were mothers, and 52 percent of men were fathers (Young et al. 2007; Brady and Ashley 2005). Further, it was estimated that 27 percent of parents in treatment had lost custody of one or more children. Nonetheless, there has been limited targeting of treatment in a way that explicitly recognizes participants’ status as parents, especially parents who are engaged with the child welfare system. The targeted programs that do exist tend to focus on mothers rather than fathers (mothers more typically being the custodial parent), though research indicates that substance abuse among fathers is also associated with less engaged and less responsible parenting (Conners et al. 2006; McMahon et al. 2007).

Parents with children in the child welfare system often have greater difficulty than other parents in completing substance abuse treatment programs. Only about one-fifth of parents whose child was involved with the child welfare system successfully completed substance abuse treatment, compared to about half of those seeking treatment in the general population (Choi and Ryan 2006; Brady and Ashley 2005). These parents’ relative difficulty in addressing their addictions may be due to the inability of treatment programs to accommodate their complex circumstances and service needs. Research indicates that parents involved in both substance abuse treatment and the child welfare system may differ in important ways from those who are not involved in child welfare services. One California study, for example, found that these dual-system parents tend to be younger, have more children, experience greater economic instability, and have greater involvement in the criminal justice system than other parents in treatment (Grella et al. 2006). However, mothers who participated in treatment programs that provided a high level of family-related services or that focused on education or employment were about twice as likely to reunify with their children as those in programs with low levels of such services, which strongly suggests that addressing the full range of treatment-related needs of parents involved with child welfare is important (Grella et al. 2009; Brady and Ashley 2005). Mothers with substance-exposed infants can benefit from residential treatment in terms of both treatment progress and family reunification, but only when residential services are delivered in combination with transitional services (Huang and Ryan 2010).

Coordinating services across the child welfare and substance abuse treatment systems to address the needs of these families has been challenging for several reasons (U.S. Department of Health and Human Services 1999; Semidei et al. 2001). Each system has different perspectives about who the “client” is and about issues such as removal and reunification. The two systems are embedded in different federal and state legal and policy environments. In addition, many child welfare agencies operate in a culture of crisis (Golden 2009). There has been a chronic shortage of substance abuse treatment programs, especially those appropriate for parents of young children. Confidentiality requirements can make cooperation and communication across systems challenging. In addition, ineffective screening by staff in both types of agencies can make early detection of problems difficult. One research review, for example, noted that child welfare agency staff in one study failed to identify substance abuse problems in 61 percent of caregivers who in fact met the clinical criteria for alcohol or drug dependency (Young et al. 2007). Similarly, substance abuse treatment workers must be trained to screen effectively for child neglect and abuse and make appropriate referrals.

Since 2006, Congress has authorized competitive grants to address these problems. The Child and Family Services Improvement Act of 2006 (Pub. L 109-288) provided funding over a five-year period to implement regional partnerships among child welfare, substance abuse treatment, and related organizations to improve the well-being, permanency, and safety outcomes of children who were in, or at risk of, out-of-home placement as a result of a parent’s or caregiver’s methamphetamine or other substance abuse. With this funding, the Children’s Bureau (CB) within the Administration on Children, Youth and Families, Administration for Children and Families at the U.S. Department of Health and Human Services (HHS) established the Regional Partnership Grant (RPG) program.

The Child and Family Services Improvement and Innovation Act of 2011 (Pub. L. 112-34) reauthorized the RPG program and extended funding through 2016. With this funding, CB offered new competitive grants of up to $1 million per year for five years (Regional Partnership Grants to Increase the Well-Being of and to Improve the Permanency Outcomes for Children Affected by Substance Abuse, HHS-2012-ACF-ACYF-CU-0321). In keeping with the requirements of the legislation, CB awarded 17 grants to regional partnerships of child welfare, substance abuse treatment, and related organizations.

The RPG program is unique in its emphasis on developing partnerships between child welfare and substance abuse treatment systems to better meet the needs of children who are in an out-of-home placement or are at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance abuse. The RPG cross-site evaluation will provide important information about the characteristics of these families and the services they receive through RPG, as well as about the characteristics of the partnerships and how child welfare and substance abuse treatment providers work together. In addition, the study will provide important information about changes over time in child, adult, and family outcomes, and about the effectiveness of RPG services for selected grantees, including the effectiveness of evidence-based programs (EBPs) being implemented with these target populations for the first time. The information gathered will be critical to informing decisions about future federal and community investments in services that meet the needs of children and families involved in the child welfare and substance abuse treatment systems, and about how to develop strong partnerships between the two systems.

b. Overview of the Evaluation

The RPG Cross-Site Evaluation is a comprehensive yet efficient study that includes an implementation and partnership study and an outcomes study. These studies will build knowledge about implementing programs and services for families involved in the child welfare and substance abuse treatment systems, and about developing more effective partnerships between the two systems to coordinate services for these families. They will describe the characteristics and outcomes of children, adults, and families involved in both systems and exposed to evidence-based program and practice models that may not have been tested before with these target populations. A pooled cross-site impact study will test the impact of these EBPs and other integrated services on child well-being, safety, and permanency; on adult recovery; and on family functioning and stability.

The implementation and partnership study will build knowledge about (1) effective implementation strategies across the 17 RPG projects, with a focus on factors shown in the research literature to be associated with quality implementation and (2) effective strategies for building and sustaining partnerships and integrated services between the child welfare and substance abuse systems. Key data collection activities include: (1) conducting semi-structured interviews with selected grantee and partner staff during site visits; (2) collecting semi-annual progress reports from grantees; (3) obtaining data from grantees on program enrollment, exit, and service use; (4) administering a web-based survey of service delivery staff; and (5) administering a web-based survey of lead partner staff.

The outcomes study will describe the characteristics of and changes over time in children, adults, and families who participate in the RPG programs. This descriptive study will report participant outcomes in five domains of high interest to CB: (1) child well-being, (2) family functioning/stability, (3) adult recovery, (4) child permanency, and (5) child safety. RPG grantees will be collecting data from or about participants in their RPG programs for local evaluations required under the terms of their RPG grants. They will provide some of these data to the cross-site evaluation contractor for use in the cross-site outcomes study.

The impact study will estimate the effectiveness of selected RPG interventions by comparing outcomes for individuals enrolled in RPG services to those in comparison groups. The impact study will pool outcome data on program and comparison group members from seven grantees with appropriate local evaluation designs.

c. Data Collection Activities Requiring Clearance

This ICR requests clearance for seven data collection instruments. Five will be used to collect evidence for the implementation and partnership study, one will be used for the outcomes study, and one will be used for the impact study. These efforts are listed below, and described in greater detail in section A.2.

Implementation and Partnership Study

  1. Grantee and partner staff topic guide. A topic guide will be used to conduct semi-structured interviews with selected grantee and partner staff during site visits to each of the 17 grantees; the visits will be conducted during the second and fourth years of the five-year RPG grant program (the first and third years of the three-year OMB clearance period being requested). The interview topic guide is included as attachment IA, and the site visit informant form, which will be used to collect information from site visit participants, is included as attachment IB in the appendix.

  2. Semi-annual progress reports. The implementation study will use information from federally required semi-annual progress reports, to be submitted twice a year in years two through five. The semi-annual progress report is included as attachment IIA in the appendix. The descriptions of evidence-based practices, other services and activities, and partnerships, as well as the adherence form, are included as attachments IIB through IIE.

  3. Enrollment and service log. An enrollment and service log will be used to collect data from grantees on their enrollment of participants and provision of services to them. Grantee or provider staff will enter data as services occur during years two through five. The enrollment and service log data dictionary is included as attachment III in the appendix.

  4. Staff survey. The staff survey will be web-based and administered to frontline staff who provide direct services to children, adults, and families through 10 focal EBPs (identified in Part B of this Supporting Statement). The survey will be administered twice, once each during years two and four. The web-based staff survey instrument is included as attachment IVA in the appendix. Frequently asked questions (FAQs) related to the staff survey are provided as attachment IVB and associated electronic mail materials are included as attachments IVC to IVH.

  5. Partner survey. The partner survey will be web-based and administered to representatives of the grantee organizations and their partner organizations. The survey will be administered twice, once each during years two and four. The web-based partner survey is included as attachment VA in the appendix. FAQs related to the partner survey are included as attachment VB and electronic mail materials associated with the partner survey are included as attachments VC to VH.

Outcomes Study

As part of providing technical assistance, the evaluator is required to advise CB on selecting a core set of instruments and administrative records that grantees are required to use to collect data from, or obtain administrative data on, program participants for their required local evaluations. Some grantees may collect additional data to meet their own needs; however, to minimize the data collection burden on participating families, grantees will share data from the core instruments with the cross-site evaluation for use in the cross-site outcomes study (and the impact study, described next).

  • Outcomes study master instrument. The master instrument refers to the required standardized instruments, a required household roster, and a list of required data elements to be drawn from administrative records. These are included as attachments VIA, VIB and VIC in the appendix.

Impact Study

  • Impact study master instrument. In addition to sharing data on program participants for the outcomes study, grantees participating in the impact study will share a subset of core outcome data they collect on comparison group members. The “impact master instrument” refers to four of the 10 standardized instruments being used for the outcomes study, the household roster, and the list of data elements to be drawn from administrative records, which will be reported for comparison group members. These are included as attachments VIIA, VIIB, and VIIC in the appendix.

d. Legal or Administrative Requirements that Necessitate the Collection

The collection is authorized by the Promoting Safe and Stable Families Program (Section 437(f), Subpart 2, Title IV-B, of the Social Security Act) (42 U.S.C. 629g(f)), as amended by the Child and Family Services Improvement and Innovation Act (Pub. L. 112-34). The Act includes a targeted grants program (section 437(f) of the Social Security Act) that directs the Secretary of Health and Human Services (HHS) to reserve a specified portion of the appropriation for Regional Partnership Grants to improve the well-being of children affected by substance abuse. The legislation also requires grantees to report performance indicators aligned with their proposed program strategies and activities. Under the terms of the RPG grant, CB requires grantees to participate in a national cross-site evaluation. The Child and Family Services Improvement and Innovation Act (Pub. L. 112-34) is included as attachment VIII in the appendix.

A.2. Purpose and Use of the Information Collection

The data collected through the instruments included in this ICR will be analyzed and reported by the RPG Cross-Site Evaluation. The purpose of the evaluation is to meet the legislative requirement for evaluation, and it may assist Congress in setting future policy. It is also designed to contribute to the knowledge base about the implementation and effectiveness of strategies and evidence-based programs selected by RPG grantees for meeting the needs of children who are in an out-of-home placement or are at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance abuse. Policymakers and funders will use the findings from the RPG Cross-Site Evaluation to consider what strategies and programs they should support to meet the needs of these families. Providers can use the findings to select and implement strategies and program models suited to the specific families and communities they serve. Evaluation findings can also fill research gaps, for example by rigorously testing program models that have prior evidence of effectiveness with other target populations but not with these populations, or that have not been tested in combination with other services and programs. Congress will also use information provided through the evaluation to examine the performance of the grantees and the grant program. Details on the purpose and use of the information collected through each instrument in the implementation and partnership study, outcomes study, and impact study are provided below.

a. Implementation and Partnership Study

The purpose of the implementation and partnership study is to examine the processes and content of implementation and partnership development and management, with a focus on factors shown in the research literature to be associated with quality implementation and sustainable partnerships. The study will provide descriptions of RPG grantees’ target populations, selection of EBPs and their fit with the target population, inputs to implementation (such as staff selection and hiring, staff qualifications and attitudes toward implementing EBPs, staff training, supervision and feedback, organizational climate, leadership and decision making, administrative support, referral processes, and use of data systems), and actual services provided for selected EBPs (including dosage, duration, content, adherence to program models, and participant responsiveness). The study will also provide a description of the characteristics of RPG partners, their roles in RPG programs, their relationships and communication systems, the extent of coordination and collaboration among partners, and their potential to sustain the partnerships at the end of the grant funding.

  • Grantee and partner staff topic guide. The purpose of the topic guide is to collect detailed information from selected program and grantee staff and partners about plans and goals for their RPG program; their decisions about which EBPs to select; the organization and leadership of the RPG partnership; the community and state context; staff satisfaction with using the selected EBPs and their perceptions of the consistency and quality of service provision; and implementation experiences, facilitators, barriers, challenges, and lessons learned.

  • Semi-annual progress reports. The semi-annual progress reports will be used to obtain updated information from grantee project directors about their program operations and partnerships, including any changes from prior periods. To fully meet the intent of the Funding Opportunity Announcement, grantees must adopt and implement specific, well-defined program services and activities that are evidence-based or evidence-informed, and trauma-informed. CB has tailored the semi-annual progress reports to collect information on the evidence-based and evidence-informed programs and other services grantees implement, the target population for the RPG program, and grantees’ perceived successes and challenges to implementation. Grantees will also report on a series of indicators of adherence to program models for selected EBPs.

  • Enrollment and service log. The purpose of this instrument is to describe the services that RPG clients actually receive. Grantees will record the enrollment date for each RPG family or household and demographic information on each family member, including date of birth, ethnicity, race, primary language spoken at home, type of current residence, income and income sources (adults only), highest education level attained (adults only), and relationship to a focal child in each family on whom data will be collected. Grantees will also record the enrollment date for families or individual family members into specific EBPs, weekly service contact information for selected EBPs, and exit dates for EBPs and RPG.

  • Staff survey. Respondents for the staff survey will be all staff members who provide direct services to children, adults, and families through the 10 focal EBPs. (Each grantee is implementing one or more of the focal EBPs.) The survey will collect information about their roles on RPG; their demographic characteristics, prior experience, and education; their attitudes toward implementing the EBP; any planned or unplanned adaptations made to the EBP; the supervision and support they receive; and the climate within their organization.

  • Partner survey. The partner survey will be administered to grantees and their partners. The purpose of the partner survey is to gather information on the characteristics of the partner organizations, how partners communicate and collaborate, goals of the partnership, and the types of organizations and roles within the partnership.

b. Outcomes Study

The goal of the outcomes study is to describe the characteristics of children and families who participate in the RPG programs and their outcomes in five domains: (1) child well-being; (2) family functioning and stability; (3) adult recovery; (4) child permanency; and (5) child safety.

  • Outcomes study master instrument. The purpose of the outcomes master instrument is to provide instruments and specifications for administrative records in a convenient format, to help ensure consistency across grantees, and to minimize duplication across instruments. The master instrument includes: (1) 10 standardized instruments used widely in family support, child development, and substance abuse treatment research, including 7 copyrighted instruments; (2) a household roster developed by Mathematica Policy Research; and (3) a list of data elements to be drawn from administrative records. Grantees will use the forms and information in the master instrument to collect data from or on participants in the RPG programs, to evaluate outcomes or impacts in their local evaluations, and to share data with the cross-site evaluation for describing participant characteristics and outcomes for the overall RPG grant program.

Ten standardized instruments will be included in the master instrument. The instruments will be administered by grantees at program entry and exit to obtain data on child well-being for a focal child identified in each RPG case, and for the family functioning/stability and recovery domains, as follows:

  1. Child well-being

  • Trauma Symptoms Checklist for Young Children (Briere et al. 2001)

  • Behavior Rating Inventory of Executive Function (Gioia 2000) or the Behavior Rating Inventory of Executive Function-Preschool (Gioia 2000), depending on the age of the focal child

  • Child Behavior Checklist-Preschool Form (Achenbach and Rescorla 2000) or the Child Behavior Checklist-School-Age Form (Achenbach and Rescorla 2000), depending on the age of the focal child

  • Infant-Toddler Sensory Profile (Dunn 2002), if appropriate to the age of the focal child

  • Socialization Subscale, Vineland Adaptive Behavior Scales, Second Edition, Parent-Caregiver Rating Form (Sparrow, Cicchetti and Balla 2005), if appropriate to the age of the focal child

  2. Family functioning

  • Adult-Adolescent Parenting Inventory (Bavolek and Keene 1999)

  • Center for Epidemiologic Studies-Depression Scale, 12-Item Short Form (Radloff 1977)

  • Parenting Stress Index, Short Form (Abidin 1995)

  3. Adult recovery

  • Addiction Severity Index, Self-Report Form (McLellan et al. 1992)

  • Trauma Symptoms Checklist-40 (Briere and Runtz 1989)

In addition to the standardized instruments, a household roster will be used by grantees to collect data on the focal child’s household composition at RPG program entry and exit in order to assess family stability. This instrument collects information from the focal child’s primary caregiver about who has been living in the household with the child during the past month and each household member’s relationship to the focal child.
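As a purely illustrative sketch (the record fields below are hypothetical and are not drawn from the actual roster form), household-composition change between program entry and exit could be derived from roster records along these lines:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RosterEntry:
    """One household member, as reported by the focal child's primary caregiver."""
    member_id: str              # hypothetical identifier, not an actual roster field
    relationship_to_child: str  # e.g., "biological mother", "sibling"

def composition_change(entry_roster, exit_roster):
    """Compare entry and exit rosters to flag members who left or joined."""
    left = set(entry_roster) - set(exit_roster)
    joined = set(exit_roster) - set(entry_roster)
    return left, joined

# Example: one member leaves and one joins between program entry and exit.
entry = [RosterEntry("P1", "biological mother"), RosterEntry("P2", "sibling")]
exit_ = [RosterEntry("P1", "biological mother"), RosterEntry("P3", "grandmother")]
left, joined = composition_change(entry, exit_)
```

The evaluation team, not this sketch, would define the actual family-stability measures constructed from the roster data.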

Grantees will also obtain data from administrative records maintained by local or state child welfare, foster care, and substance abuse treatment agencies for their local evaluations, and provide a core set of records to the cross-site evaluator. These records will be used to create measures of child safety and permanency, adult receipt of substance abuse treatment services, and adult recovery. A list and specifications of the core set of records needed will be included in the master instrument.

c. Impact Study

The goal of the impact study is to provide pooled estimates of the effectiveness of RPG programs among selected RPG grantees with rigorous local evaluation designs. To help minimize the burden on grantees participating in this portion of the cross-site evaluation, the impact study will use a subset of outcome data to compare treatment and comparison groups.

  • Impact study master instrument. The purpose of the impact master instrument is to provide instruments and specifications for administrative records in a convenient format, and to help ensure consistency across the seven grantees that will contribute data to the cross-site impact study.

This instrument includes specifications for administrative records and four standardized instruments (including three copyrighted instruments) that grantees will use to capture outcomes in the child well-being, family functioning, and recovery domains for comparison group members:

  1. Child well-being

  • Child Behavior Checklist-Preschool Form or the Child Behavior Checklist-School-Age Form, depending on the age of the focal child

  • Socialization Subscale, Vineland Adaptive Behavior Scales, Second Edition, Parent-Caregiver Rating Form, if appropriate to the age of the focal child

  2. Family functioning

  • Parenting Stress Index, Short Form

  3. Recovery

  • Addiction Severity Index

In addition to the four standardized instruments, a household roster will be used by grantees to collect data on the focal child’s household composition. Grantees will also obtain data from administrative records maintained by local or state child welfare, foster care, and substance abuse treatment agencies for their local evaluations, and provide a core set of records to the cross-site evaluator. These records will be used to create measures of child safety and permanency, adult receipt of substance abuse treatment services, and adult recovery. The household roster and a list and specifications of the core set of records needed will be included in the outcomes and impact study master instruments.

A.3. Use of Improved Information Technology and Burden Reduction

The RPG Cross-Site Evaluation will use information technology for most data collection. The only exceptions are the semi-structured in-person interviews conducted during site visits and the written semi-annual progress reports.

  • Web-based staff and partner surveys. The surveys of program staff and grantee partners will be administered via the web. Compared to other survey modes, web-based surveys offer ease and efficiency to respondents and help ensure data quality. The surveys will be programmed to automatically skip questions not relevant to the respondent, thus reducing cognitive and time burden. The instruments will also allow respondents to complete the surveys at a time (or times) convenient to them. If respondents are unable to complete the survey in one sitting they can save their place in the survey and return to the questionnaire at another time. Validation checks and data ranges will be built into appropriate items to ensure data quality.

  • Use of optimum technology applications to collect outcome and service data from grantees. The evaluation contractor and its subcontractors will develop and operate a seamless and transparent data reporting system for use by grantees, known as the RPG Data Portal. The RPG Data Portal will be a user interface accessible from any computer, allowing for ease of entry, while all data will be housed on secure servers behind the contractors’ firewall, thereby maintaining data security. The system has been designed with use by grantee staff in mind and is based on experience from prior studies with similar types of service providers and data. It will be composed of two applications, designed to facilitate efficient reporting of (1) outcome data and (2) enrollment and service data.

  • Outcome data management system. Each grantee will report data from standardized instruments, a household roster, and a list of data elements to be drawn from administrative records using the master outcome instrument. Grantees will develop their own project or agency databases to store these data. This database will include all data that the grantee collects from clients or on behalf of clients. The contractor will provide format specifications to the grantees for use in uploading outcome data through this application. These are likely to be in easy-to-use PDF or Excel format. Grantees will upload these data twice a year. This application is modeled on the system that was used to obtain these types of data from RPG grantees during the first round of grants made in 2007 (“RPG1”). Ten of the 17 current RPG grantees also received RPG1 grants and reported data through the RPG1 system; they are thus well prepared to use this type of application. Importantly, the new outcome data management system will incorporate advances in technology and software, and improved programming approaches, to enhance the experience of providing outcome data for all current RPG grantees, including reducing the time needed to prepare and upload data to the system.

  • Enrollment and service log. Grantee staff will use the enrollment and service log to provide demographic information on each RPG case at enrollment, as well as enrollment and exit dates for the RPG project and each EBP in which case members enroll. Staff serving families enrolled in the focal EBPs, such as home visitors, will also use it to document service delivery and track all activities completed with the family, including assessments, referrals, education, and support. The design of the RPG enrollment and service log is based on web-based case management systems that Mathematica has developed and implemented successfully for multiple projects that involved collecting similar data from similar types of providers. The enrollment and service log will be flexible and easy to use, and will include navigational links to relevant fields for each type of entry to minimize burden on grantee staff and increase the quality and quantity of data collected. The log is designed to be used by multiple users at each organization and will provide varying levels of access depending on users’ needs. For example, administrators or supervisors will have the greatest rights within the system, with the ability to create new users, assign program participants to staff members, and review all activity from the organization. Staff providing direct services to study participants will have the ability to record and review information about participants assigned to their caseloads. Limiting full system access to a small set of staff members promotes data security, reduces respondent confusion, and supports the collection of higher-quality information.

  • Use of evaluation data to construct performance indicators. The legislation that established the RPG Grant Program requires grantees to provide performance indicators to be included in annual reports to the Congress, which the cross-site evaluation contractor will produce. To minimize grantee burden, the cross-site evaluation contractor will use data obtained for the implementation and partnership study, the outcomes study, and the impact study to create the needed performance indicators. This avoids having grantees submit performance data in addition to data required for the evaluation. Data collected directly by Mathematica or provided by grantees for the cross-site evaluation will be used to describe to Congress the program strategies of each RPG grantee and their selected EBPs; the structure and membership of their collaborative partners across child welfare, substance abuse treatment, judicial, and other systems; enrollment targets and the pace of enrollment, along with a description of program participants; and the services received by RPG clients. Combined with information on child, adult, and family outcomes, this information will give Congress a full picture of how grantees performed and the extent to which they met their RPG program goals.

A.4. Efforts to Identify Duplication and Use of Similar Information

The RPG Cross-Site Evaluation is specifically designed to minimize the duplication of efforts or data. First, to participate in the cross-site outcomes and impact evaluations, grantees will share some or all of the data they are collecting for their own required local evaluations. Second, data shared by grantees or provided through direct collection from grantees, staff members, and partners for the cross-site evaluation will also be used to describe grantee performance. That is, to reduce duplication of efforts for grantees to comply with both the CB’s local and cross-site evaluation requirements and legislatively mandated performance indicators, the cross-site evaluation data needs completely overlap with data needed for performance indicators. Since there are no existing reporting systems that collect the data required for reporting to Congress or for the cross-site evaluation, this data collection plan does not duplicate any current efforts.

Furthermore, the design of the cross-site evaluation instruments ensures that there is no duplication of data collected through each instrument. For example, during the semi-structured interviews conducted during site visits, grantee and EBP staff members will not be asked any questions included in the staff survey. Questions asked during the second round of site visits will reflect information collected during the earlier site visit; for instance, questions that are no longer relevant will be eliminated. Information on program implementation, partners, and implementation challenges and successes provided in the semi-annual progress reports will reduce the level of detail needed from site visit interview participants, and will also reduce the need to address RPG program operations in the staff and partner surveys. In creating the master outcome and impact instruments, the contractor will review and crosswalk all items in order to identify duplication across instruments. Any duplicate items not needed for scoring the instruments will be removed from the versions of the standardized instruments provided in the outcome and impact master instruments. This not only reduces burden on RPG participants in providing data for grantees’ local evaluations, but also reduces the burden on grantee staff for preparing and uploading outcome data to the cross-site evaluation.

A.5. Impact on Small Businesses or Other Small Entities

The potential exists to affect small entities within a grantee site, depending on the local community partners and funders with which RPG grantees engage. RPG grantee partners and direct service providers will be included as part of site visit interviews and partner surveys. Additionally, RPG grantee agencies will enter data into the RPG data portal. Proposed data collection for these three efforts is designed to minimize the burden on all organizations involved, including small businesses and other small entities, consistent with the aims of the legislation establishing RPG and CB’s need for valid, reliable, and rigorous evaluations of federally funded programs.

A.6. Consequences of Not Collecting Information or Collecting Information Less Frequently

Not collecting information for the RPG Cross-Site Evaluation would limit the government’s ability to document the performance of its grantees, as legislatively mandated, and to assess the extent to which these federal grants are successful in achieving their purpose. Furthermore, the RPG Cross-Site Evaluation provides a valuable opportunity for CB, practitioners, and researchers to gain empirical knowledge about the implementation and effectiveness of coordinated, evidence-based strategies for meeting the needs of families in the child welfare and substance abuse treatment systems. The study will examine whether the government’s strategy of funding collaborative, cross-system partnerships is a productive one that is likely to be sustained after the grant period ends, along with understanding the characteristics and roles of key partnering organizations, their coordination and communication mechanisms, and the quality of collaboration.

a. Implementation and Partner Study

  • Grantee and partner staff topic guide. Without the information being collected through interviews with grantee and partner staff, the cross-site evaluation would have to rely entirely on implementation information reported by a single source: the RPG grantee project directors through the semi-annual progress reports. Thus, the study would lack the broader perspectives of other key participants and it would not be possible to conduct any in-depth analysis of critical program challenges and successes or implementation issues. The site visits are planned during the early implementation phase (year 2 of the grant period) and after full implementation (year 4 of the grant period). The first visit will focus on understanding program design, rationale for selecting EBPs, and early implementation experiences. The second visit will collect updated information on implementation experiences, changes made to the program design and rationale for the changes. Without conducting two visits, the study will be unable to capture program changes made over time, or to obtain staff and partner feedback about the lessons they learned along the way.

  • Semi-annual progress reports. Without obtaining information from the semi-annual progress reports, the study will not have detailed information about grantee operations; changes to planned interventions, target population and eligibility criteria, or target outcomes; and planned or unplanned adaptations of EBPs that occur as the RPG grants are implemented. The progress reports will provide timely information about the infrastructure that grantees put in place to support implementation as well as features of the community context that have influenced grantees’ implementation plans. Data obtained from adherence forms will measure implementation quality and fidelity to EBPs. Without this information, it will not be possible to assess the extent to which grantees adhere to specific benchmarks based on program requirements. Collecting this information less often than twice a year would violate the federal requirements for grantee progress reporting, and would in fact place larger burdens on respondents to remember or store information about events, changes in direction, or challenges and successes over a longer period of time. Since aggregate information from the reports will be extracted and shared with grantees for program improvement and peer learning, less frequent reporting would also limit the ability of grantees to consider adjustments or reach out to one another. The data also provide information for designing evaluation-related and programmatic technical assistance in response to emerging issues and grantee needs.

  • Enrollment and service log. The enrollment and service log is important for describing actual service delivery to cases receiving selected EBPs and for tracking all activities completed with the family, including assessments, referrals, education, and support. Data will be collected when participants enroll, as they receive services, and at exit. Without these data, the study would have no information on the services recipients actually receive, including their duration and dosage. The evaluation would be unable to link participant outcomes to the levels or combinations of specific services or to understand whether and how participants participated in the selected EBPs. If data were collected less frequently, providers would have to store service data or try to recall it weeks or months after delivery. Regular collection will also enable us to check data quality and address missing data, errors, or other problems in a timely way.

  • Staff survey. Without this survey, information that would be difficult to obtain during semi-structured interviews, such as the quality of staff relationships and the supportiveness of program leadership, will not be collected. The staff survey will also enable the collection of data from a broader set of program staff than those who will be interviewed during the site visits and enable the collection of more structured information. The staff survey will be administered twice, once in year two and once in year four of the study. This will enable the cross-site evaluation to identify changes over time in staff composition, staff perceptions of the program, or other factors important to quality implementation of EBPs.

  • Partner survey. Without the partner survey, the study would lack information on the roles that partners play in RPG, the communications and working relationships among partners, the quality of their collaboration, and their goals for the RPG program in their region. Since many federal initiatives require grantees to establish collaborations, and since the literature shows that collaboration and service integration between child welfare agencies, substance abuse treatment providers, and other key systems such as the courts has been rare or difficult in the past, collecting these data will help fill important gaps in knowledge. Partner surveys will be conducted twice during the grant period: in year two of the RPG program, when implementation has begun, and in year four, after operations are more established. Without administering the surveys at two points, the evaluation will be unable to describe changes in the composition of partners, the extent of coordination and collaboration over time, and how partner roles evolved.

b. Outcomes Study

  • Outcomes study master outcome instrument. It is the mission of the CB to ensure child well-being, safety, and permanency for children who experience maltreatment. The master outcome instrument will provide detailed information on these outcomes and the adults who receive services. Grantees will upload data from the master outcome instrument twice each year. Without this information, evaluators will be unable to describe the outcomes of RPG program participants or to analyze the extent to which grants have been successful in addressing the needs of families co-involved with the substance abuse and child welfare systems. Further, it would be impossible to conduct an impact study (described next). During each upload, the outcome data management system will perform automatic validation checks, enabling grantees to ensure the quality and completeness of their data. Mathematica will then review submissions to address any remaining data quality issues and work with grantees to resolve problems. This ensures that data quality issues can be addressed early and resolved. If data were uploaded less often, it would be more cumbersome and difficult for grantees to search through older records to make corrections or provide missing data.

c. Impact Study

  • Impact study master outcome instrument. In addition to reporting data for the implementation and partnership study and the outcomes study, grantees participating in the impact study will also upload outcome data for participants in their comparison group (i.e. those who do not receive RPG services or receive only a subset of RPG services). Without this information, it would not be possible to rigorously analyze the effectiveness of the interventions by comparing outcomes for individuals with access to RPG services to those in comparison groups. Uploading the data every six months provides the same benefits with respect to data quality described above.

A.7. Special Circumstances Relating to the Guidelines of 5 CFR 1320.5

There are no special circumstances requiring deviation from these guidelines.

A.8. Comments in Response to the Federal Register Notice and Efforts to Consult Outside the Agency

The first Federal Register Notice was published on September 19, 2013 (Federal Register, Vol. 78, No. 182, Thursday, September 19, 2013, Notices, pp. 57641-43). The comment period ended November 18, 2013. No comments were received. The second Federal Register Notice was published on December 3, 2013 (Federal Register, Vol. 78, No. 232, Tuesday, December 3, 2013, Notices, pp. 72679-80).

The agency consulted with the RPG grantees during the development of the RPG cross-site evaluation design and data collection requirements. Feedback sessions were held during the initial grantee kick-off meeting on January 23-25, 2013, and the annual grantee meeting on April 23-24, 2013. To select the initial set of outcome measures and instruments for the outcomes and impact studies, the project team held three calls with consultants, Cheryl Smithgall, Ira Chasnoff, and Joe Ryan. Once an initial set was developed, the team held three workgroup calls and an in-person meeting (April 23-24, 2013) with grantees to solicit their feedback and comments.

We designed two surveys for the cross-site evaluation, one for grantee staff and the other for partner staff, working with consultants Allison Metz and Phyllis Panzano. We held a webinar and solicited feedback in a follow-up call with grantees on the data to be collected as part of the implementation study. Six front-line staff from three RPG grantees pretested the staff survey; five representatives from partner agencies affiliated with three RPG grantees pretested the partner survey. The team used the results to establish the burden estimates, and to finalize the surveys.

Changes were incorporated into the final evaluation plan and data collection plans in response to feedback from grantees and consultation with the experts mentioned previously.

A.9. Explanation of Any Payment or Gift to Respondents

No payments or gifts will be provided to respondents as part of data collection.

A.10. Assurance of Privacy Provided to Respondents

This study is being conducted in accordance with all relevant regulations and requirements, including the Privacy Act of 1974 (5 U.S.C. 552a), the Privacy Act Regulations (34 CFR Part 5b), and the Freedom of Information Act (5 U.S.C. 552) and related regulations (41 CFR Part 1-1, 45 CFR Part 5b, and 40 CFR 44502). Several specific measures will be taken to protect respondent privacy.

  • Adopting strict security measures and web security best practices to protect data collected through the data portal. Data collected through the data portal (which includes outcome data as well as enrollment and service logs) will be housed on secure servers that conform to the requirements of the HHS Information Security Program Policy. The data portal will employ strict security measures and web security best practices to ensure the data will be submitted, stored, maintained, and disseminated securely and safely. Strict security measures will be employed to protect the confidentiality of participant information stored in the system, including data authentication, monitoring, auditing, and encryption. Specific security procedures include, but are not limited to:

  • All data will be encrypted in transit (using TLS protocol backward compatible to SSL)

  • Data will be encrypted at rest and reside behind firewalls

  • Access to the data portal will be restricted to approved staff members who will be assigned a password only with permission from the study director. Each user will have a unique user id/password combination

  • Database access will require special system accounts. Portal users will not be able to access the database directly

  • Portal users will be able to access the system only within the scope of their assigned roles and responsibilities

  • Security procedures will be integrated into the design, implementation, and day-to-day operations of the portal.

  • All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. To further ensure data security, project personnel are required to adhere to strict standards, receive periodic security training, and sign security agreements as a condition of employment

  • Training cross-site evaluation interviewers in confidentiality procedures. All site visit interviewers will be knowledgeable about privacy procedures and will be prepared to describe them in detail or to answer any related questions raised by respondents. During the introduction to each interview, site visit informants will be told that none of the information they provide will be used for monitoring or accountability purposes and that the results of the study will be presented in aggregate form only.

  • Using web-based staff and partner surveys. Administering the staff and partner surveys via web eliminates security risks related to shipping hard-copy forms containing personal identifying information (PII) to the evaluator.

  • Assignment of content-free case and participant identification numbers to replace personal identifying information associated with all participant outcome data provided by grantees to the cross-site evaluation. The cross-site evaluation will develop and work with grantees to implement standard procedures for assigning identification numbers to all participant-level data. Case- and individual-level numbers will be content-free. For example, they will not include special codes to indicate enrollment dates, participant location, gender, age, or other characteristics.

A.11. Justification for Sensitive Questions

There are no sensitive questions in the instruments that the contractor will use to collect data.

(Some of the specified standardized instruments that grantees will use to collect data do include sensitive questions. For example, in the case of parents who are endangering their children as a result of their substance abuse, it is essential for grantees to measure the parents’ pattern of substance abuse as this is a critical indicator of recovery. In recognition of this need and to ensure confidentiality and other protections to their clients, as a condition of their RPG grant, all grantees are required to obtain IRB clearance for their data collection.)

A.12. Estimates of Annualized Hour and Cost Burden

The estimated reporting burden and cost for the data collection instruments included in this ICR are presented in Table A.1. The grant period is five years; we are requesting clearance for a three-year period, during which all data collection for the cross-site evaluation will occur.

We estimate the average hourly wage for program directors and managers to be the average hourly wage of “Social and Community Services Manager” ($30.99), that of grantee staff to be the average hourly wage of “Counselors, Social Workers, and Other Community and Social Service Specialists” ($21.26), that of data managers to be the average hourly wage of “Data Administrators” ($38.04), that of data entry specialists to be the average hourly wage of “Data Entry and Information Processing Workers” ($15.11) and that for partners to be the average hourly wage of “General and Operations Manager” ($55.22), taken from the U.S. Bureau of Labor Statistics, National Compensation Survey, 2012. Table A.1 summarizes the proposed burden estimates for the use of the instruments and products associated with the implementation and partnership study, the outcomes study, and the impact study.

Implementation and Partnership Study

The total estimated cost figures are computed from the total annual burden hours and average hourly wages for program directors and managers ($30.99), grantee staff ($21.26) and partners ($55.22), described above. For each burden estimate, annualized burden has been calculated by dividing the estimated total burden hours by the number of study years. Figures are estimated as follows:

  • Individual interview with program director. We expect to interview 17 RPG program directors (1 per grantee across 17 grantees) twice during the evaluation period. These interviews will take two hours. Thus, the total burden for individual interview with program directors is 68 hours, and the total annualized burden is 22.6 hours (68 ÷ 3).

  • Group interview with program managers/supervisors. We expect to conduct semi-structured small-group interviews with 153 program managers (3 staff per EBP from 3 EBPs in each of the 17 sites) twice during the evaluation period. These interviews will last 2 hours. Thus the total burden of participating in group interviews is 612 hours, and the total annualized burden is 204 hours (612 ÷ 3).

  • Individual interview with program manager or supervisor. We expect to conduct individual, semi-structured interviews with 102 program managers or supervisors (2 staff per EBP from 3 EBPs in each of the 17 sites) twice during the evaluation period. These interviews will take 1 hour. Thus the total burden for individual interviews with program managers is 204 hours, and the total annualized burden is 68 hours (204 ÷ 3).

  • Individual interview with frontline staff. We expect to conduct individual, semi-structured interviews with 102 frontline staff who work directly with children and families (2 staff per EBP from 3 EBPs in each of the 17 sites) twice during the evaluation period. These interviews will take 1 hour. Thus the total burden for individual interviews with frontline staff is 204 hours, and the total annualized burden is 68 hours (204 ÷ 3).

  • Semi-annual progress report. Grantees will submit two progress reports per year for each year of the evaluation period. We assume that 17 project directors (1 per grantee) will submit the semi-annual progress reports and adherence forms 6 times during the evaluation period. It will take 16.5 hours to submit each one. The total burden for submitting the semi-annual progress report and adherence form is thus 1,683 hours, and the total annualized burden is 561 hours (1,683 ÷ 3).

  • Case enrollment. Based on grantee projections, we assume enrollment of 1,530 families per year (90 families per site across 17 sites). We assume that 3 staff per grantee will conduct enrollment, or 51 staff total, where each staff member conducts enrollment with approximately 30 families. It will take 15 minutes total to enroll each family. Thus, the total burden for enrolling families across all staff members for three years is 1,147.5 hours, and the total annualized burden is 382.5 hours (1,147.5 ÷ 3).

  • Service log entries. Based on the expected participation of families in specific RPG services and EBPs, we assume there will be one service log entry each week for 1,530 families per year (90 families per site across 17 sites). We assume that 6 staff per grantee will provide services and enter service data (102 staff total), with a caseload size of 15 families each. Each weekly entry will take 3 minutes. Thus, the total burden for completing service log entries is 11,934 hours, and the total annualized burden is 3,978 hours (11,934 ÷ 3).

  • Staff survey. We expect to administer the web-based survey twice to 340 frontline staff (20 per site across 17 sites). The survey will take 25 minutes to complete. Thus the total burden for the staff survey is 283.3 hours, and the total annualized burden is 94.4 hours (283.3 ÷ 3).

  • Partner survey. We expect to administer the web-based survey twice to 340 grantee partners (20 per site across 17 sites). The survey will take 20 minutes to complete. Thus the total burden for the partner survey is 226.7 hours, and the total annualized burden is 75.5 hours (226.7 ÷ 3).
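Each burden estimate above applies the same arithmetic: total burden hours equal the number of respondents times responses per respondent times hours per response, and annualized burden divides the total by the three-year clearance period. The sketch below illustrates that calculation using the figures from the bullets above; it is an informal check, not part of the formal burden tables.

```python
# Illustrative check of the implementation and partnership study burden
# arithmetic: total = respondents x responses x hours per response;
# annualized = total / 3 (the three-year clearance period).
CLEARANCE_YEARS = 3

# (activity, respondents, responses per respondent, hours per response)
ACTIVITIES = [
    ("Program director individual interview", 17, 2, 2.0),
    ("Program manager/supervisor group interview", 153, 2, 2.0),
    ("Program manager/supervisor individual interview", 102, 2, 1.0),
    ("Frontline staff individual interview", 102, 2, 1.0),
    ("Semi-annual progress report", 17, 6, 16.5),
    ("Staff survey", 340, 2, 25 / 60),
    ("Partner survey", 340, 2, 20 / 60),
]

burden = {}
for activity, respondents, responses, hours in ACTIVITIES:
    total = respondents * responses * hours
    burden[activity] = (total, total / CLEARANCE_YEARS)

for activity, (total, annual) in burden.items():
    print(f"{activity}: {total:,.1f} total hours, {annual:,.1f} annual hours")
```

Small differences from the figures in the text (for example, 22.6 versus 22.7 annual hours for program director interviews) reflect rounding conventions in the narrative, not differences in the underlying calculation.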

Outcomes study

The total estimated cost is computed from the total annual burden hours and average hourly wages for data managers ($38.04) and data entry specialists ($15.11). Costs are estimated as follows:

Administrative Data

  • Obtain access to administrative data. During the second year of the study, grantees will review all data submission instructions and grantee agency personnel will develop a data management plan and the necessary administrative agreements (such as memoranda of understanding) with agencies that house the administrative records to obtain the requested records. They will implement data protocols, including mapping their data fields to the fields in the RPG data portal. Finally, they will pilot the data request and receipt process. It will take 220 hours to obtain initial access. Thus, the total burden for obtaining initial access is 3,740 hours. Grantees will then update administrative agreements with agencies that house the administrative records once in the third year and once in the fourth year of the study. It will take 18 hours to update the agreements. Thus, the total burden is 612 hours. The combined burden for obtaining initial and ongoing access to administrative data is thus 3,740 plus 612 hours, or 4,352 hours. RPG grantees will use these data for their local evaluations as well; however, complying with the procedures for providing the data to the cross-site evaluation may require additional steps. We have therefore assumed that half of the burden of obtaining the administrative data (2,176 hours) should be allocated to the cross-site evaluation. The annualized burden is 725.3 hours (2,176 ÷ 3). We assume 1 data manager per grantee (or 17 data managers) will complete these processes.

  • Report administrative data. Grantees will upload administrative data they have obtained to the RPG data portal twice per year for the three-year evaluation period. For each upload, it will take grantees 144 hours to prepare and upload their administrative data, including correcting any data validation problems. The total burden for reporting administrative data is thus 14,688 hours for all 17 grantees combined, and the total annualized burden is 4,896 hours (14,688 ÷ 3). We assume that 1 data entry operator per grantee (or 17 data entry operators) will upload the data.

Standardized Instruments

  • Review and adopt reporting templates. During the first year of the study, grantees will review and adopt our reporting templates for uploading standardized data in the second year of the study. (They will utilize the same templates for subsequent data reporting). We assume that it will take 8 hours for each of 17 data entry operators (1 in each of the 17 sites) to review the reporting templates. The total burden for reviewing and adopting the reporting templates is thus 136 hours, and the total annualized burden is 45.33 hours (136 ÷ 3).

  • Enter data into local database. Over the course of the 3-year study, grantees will enroll a total of 4,590 cases (1,530 cases enrolled each year). For every case, 10 standardized instruments will be administered at baseline and again at program completion, for a total of 91,800 administrations. Grantees will enter data from the completed instruments into their local databases, and data entry for each instrument will take 15 minutes (.25 hours) per administration. The total burden is thus 22,950 hours. RPG grantees will use these data for their local evaluations; however, complying with the procedures for providing the data to the cross-site evaluation may require additional steps to enter these data into their local databases. We have therefore assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus the total burden for entering cross-site evaluation data is 11,475 hours, and the total annualized burden is 3,825 hours (11,475 ÷ 3). We assume that 17 data entry operators (1 operator in each site) will enter the data.

  • Review records and submit electronically. Grantees will review records to ensure that all data has been entered and upload the data to the RPG portal twice per year for each year of the evaluation period. It will take 6 hours to review and submit data for each of the 10 instruments. Grantees will then validate and resubmit data when errors are identified. It will take 4 hours to validate data for each of the 10 instruments, including time for obtaining responses to validation questions and resubmitting the data. The total burden for reviewing and electronically uploading records is 6,120 hours, and the total burden for validating and resubmitting data is 4,080 hours. Thus the total burden is 10,200 hours, and the annualized burden is 3,400 hours (10,200 ÷ 3). We assume that 17 data entry operators (1 operator in each site) will review and submit the data.
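The standardized-instrument data-entry burden described above is a multiplicative calculation with a 50 percent allocation split between grantees' local evaluations and the cross-site evaluation. A minimal sketch of that arithmetic (the 50 percent share is the allocation assumption stated in the text):

```python
# Sketch of the outcomes study data-entry burden calculation.
CASES_PER_YEAR = 1530
STUDY_YEARS = 3
INSTRUMENTS = 10
ADMINISTRATIONS_PER_INSTRUMENT = 2   # baseline and program completion
HOURS_PER_ADMINISTRATION = 0.25      # 15 minutes of data entry per instrument
CROSS_SITE_SHARE = 0.5               # half the burden allocated to the cross-site evaluation

administrations = (CASES_PER_YEAR * STUDY_YEARS
                   * INSTRUMENTS * ADMINISTRATIONS_PER_INSTRUMENT)
total_hours = administrations * HOURS_PER_ADMINISTRATION
cross_site_hours = total_hours * CROSS_SITE_SHARE
annual_hours = cross_site_hours / STUDY_YEARS

print(administrations)     # 91800 instrument administrations
print(cross_site_hours)    # 11475.0 hours allocated to the cross-site evaluation
print(annual_hours)        # 3825.0 annualized hours
```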


Impact study

The total estimated cost is computed from the total annual burden hours and an average hourly wage for data entry specialists ($15.11, as described above). Amounts are estimated as follows.

Standardized Instruments

  • Data entry for comparison study sites. Seven grantees participating in the impact study will also enter data for comparison group members. Over the course of the study, the seven grantees will enroll and collect data from 1,215 comparison group members. For every member, five standardized instruments will be administered at baseline and follow-up, for a total of 12,150 administrations. Grantees will enter data from the completed instruments into their local databases. It will take .25 hours for each administration, for a total of 3,037.5 hours. RPG grantees will use these data for their local evaluations as well; however, complying with the procedures for providing the data to the cross-site evaluation may require additional steps when entering these data into their local databases. We have therefore assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus the total burden for entering cross-site evaluation data is 1,518.9 hours, and the total annualized burden is 506.3 hours (1,518.9 ÷ 3). We assume that 7 data entry operators (1 operator in each of the 7 sites) will enter the data.
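The same arithmetic check applies to the comparison-group data entry. Note that exact arithmetic yields 1,518.75 and 506.25 hours, which the text reports as the slightly rounded 1,518.9 and 506.3:

```python
# Arithmetic check for comparison-group data entry in the impact study.
MEMBERS = 1215                # comparison group members across 7 sites
INSTRUMENTS = 5               # standardized instruments per member
ADMIN_PER_INSTRUMENT = 2      # baseline and follow-up
HOURS_PER_ENTRY = 0.25        # 15 minutes per administration

administrations = MEMBERS * INSTRUMENTS * ADMIN_PER_INSTRUMENT  # 12,150
total_entry_hours = administrations * HOURS_PER_ENTRY           # 3,037.5
cross_site_hours = total_entry_hours / 2                        # 1,518.75
annualized_hours = cross_site_hours / 3                         # 506.25

print(administrations, total_entry_hours, cross_site_hours, annualized_hours)
```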

Table A.1. Estimate of Burden and Cost for the RPG Evaluation – TOTAL Burden Request

| Data Collection Activity | Number of Respondents | Number of Responses per Respondent | Average Burden per Response (hours) | Total Burden Hours | Estimated Total Annual Burden Hours |
| --- | --- | --- | --- | --- | --- |
| Implementation and Partnership Study | | | | | |
| Program Director Individual Interview | 17 | 2 | 2 | 68 | 22.6 |
| Program Manager/Supervisor Group Interview | 153 | 2 | 2 | 612 | 204 |
| Program Manager/Supervisor Individual Interview | 102 | 2 | 1 | 204 | 68 |
| Frontline Staff Individual Interview | 102 | 2 | 1 | 204 | 68 |
| Semi-Annual Progress Report | 17 | 6 | 16.5 | 1,683 | 561 |
| Case Enrollment Data | 51 | 90 | 0.25 | 1,147.5 | 382.5 |
| Service Log Entries | 102 | 2,340 | 0.05 | 11,934 | 3,978 |
| Staff Survey | 340 | 2 | 0.42 | 283.3 | 94.4 |
| Partner Survey | 340 | 2 | 0.33 | 226.7 | 75.6 |
| Data Uploading for Outcomes Evaluation: Administrative Data | | | | | |
| Obtain Access to Administrative Data | 17 | 3 | 42.7 | 2,175 | 725 |
| Report Administrative Data | 17 | 6 | 144 | 14,688 | 4,896 |
| Standardized Instruments | | | | | |
| Review and Adopt Reporting Templates | 17 | 1 | 8 | 136 | 45.3 |
| Enter Data into Local Database | 17 | 6 | 112.5 | 11,475 | 3,825 |
| Review Records and Submit Electronically | 17 | 6 | 100 | 10,200 | 3,400 |
| Additional Data Entry for Impact Evaluation | | | | | |
| Data Entry for Comparison Study Sites | 7 | 1 | 217 | 1,518.9 | 506.3 |
| Total | | | | | 18,852 |


Estimated total annual opportunity cost by data collection activity and respondent

The total estimated annual opportunity cost to the respondents is $345,910. This total was calculated by multiplying the average labor rates for the various respondent categories by the total annual burden hours for each activity (see previous sections for details on the specific labor rates used).

A.13. Estimates of other Total Annual Cost Burden to Respondents or Recordkeepers/Capital Costs

These information collection activities do not place any additional costs on respondents or record keepers.

A.14. Annualized Cost to Federal Government

The estimated cost for completion of the RPG cross-site evaluation and technical assistance project is $5,788,269, which represents the costs of the cross-site evaluation. The total cost over the three years of the requested clearance is $3,472,961. The annualized cost to the federal government is one-third of that total, or $1,157,654.

A.15. Explanations for Program Changes or Adjustments

None; this is a new collection.

A.16. Plans for Tabulation and Publication and Project Time Schedule

a. Plans for Tabulation

The information from the RPG Cross-Site Evaluation—with a focus on program operations, implementation, and outcomes for families—will be useful to funders, practitioners, and other stakeholders interested in targeting resources to effective approaches to address the needs of families affected by substance abuse. Identifying what has worked well allows subsequent efforts of program operators and funders to home in on evidence-based practices and strategies.

Implementation and Partnership Study, and Outcomes Study

The instruments included in this OMB package for the implementation and partnership study, and the outcomes study, will yield data that will be analyzed using qualitative and quantitative methods to describe the target populations’ characteristics and outcomes; program implementation and factors shown to be associated with high quality implementation; and the structure, quality and goals of partnerships. A descriptive analysis of participants will provide a snapshot of child, adult, and family characteristics and outcomes. Thorough documentation of program implementation and partnerships will expand understanding about the breadth of programs, practices, and services being offered by RPG to vulnerable families and will describe successes in achieving goals and barriers encountered. A greater understanding of how programs can be implemented with a network of partners may inform future efforts in this area.

Mathematica will use standard qualitative procedures to analyze and summarize information from program staff interviews conducted using the semi-structured staff interview topic guide. These procedures include organization, coding, and theme identification. Standardized templates will be used to organize and document the information and then code interview data. Coded text will be searched to gauge consistency and consolidate data across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics/themes/categories (Yin 1994; Coffey et al. 1996) which can then be analyzed to address the study’s research questions.

Quantitative data will be summarized using basic descriptive methods. For the outcomes study, data from the standardized instruments and household roster will be tabulated and used to create scales and scores appropriate for each instrument, using established norms when appropriate for the RPG target populations. Administrative records will be examined to determine whether incidents of maltreatment and removal from the home have occurred for children, whether adults have received substance abuse treatment, and the frequency and resolution of those events. These data will capture information at baseline and program exit for families who participate in services. For the implementation and partnership study, sources of quantitative data include the frontline staff and partner surveys and a form on adherence to specific program requirements (that is, staff hiring, training, and support, and service delivery). Data from each source will follow a common set of steps involving data cleaning, variable construction, and computing descriptive statistics. To facilitate analysis of each data source, we will create variables to address the study’s research questions. Construction of these analytic variables will vary depending on a variable’s purpose and the data source being used. Variables may combine several survey responses into a scale or a score, aggregate attendance data from a set time period, or compare responses to identify a level of agreement.

Service use data, entered by grantees into a web-based system, will also be used for the implementation study. The study will provide summary statistics for key program features:

  • Enrollment. For example, the average number of new cases each month.

  • Services provided by grantees. For example, the services in which clients typically participate (including any common combinations of services), distribution of location of services (such as home, treatment facility, or other site), the average number of selected services (such as workshops) offered each month, and common topics covered during services.

  • Participation. For example, the average length of time participants are served by the program, the average number of hours of services received by program participants, the average duration between enrollment and start of services.

We will analyze data from the web-based system for each grantee once a year to correspond to the annual reports identified in Table A.2. In each report, we will describe enrollment patterns, services provided, and participation patterns over the previous 12 months. Later analyses may describe how patterns changed over time, such as from the early to late implementation period.

Impact Study

The impact study will complement other components of the evaluation with an examination of program effectiveness in the areas of child well-being, safety, and permanency; recovery; and family functioning. A selected subset of seven grantees who have proposed rigorous local evaluations, either using random assignment or a strong matched comparison group, will be included in the impact study. To be considered a strong matched comparison group, the local evaluation must include baseline data on key characteristics, such as family functioning and parental substance abuse, on which to establish equivalence with those receiving RPG services. As noted above, all grantees will provide data on the program groups as part of the outcomes study. Those involved in the impact study will also collect data at baseline and program exit on comparison groups who do not receive RPG services.

The analysis of effects will have two components. First, we will pool the grantees that used randomized controlled trials (RCTs) in their local evaluations. RCTs have excellent internal validity—the ability to determine whether the program caused the outcomes—because the treatment and comparison groups are initially equivalent on all observed and unobserved characteristics, on average. Any observed differences in outcomes between the program and control groups of families can therefore be attributed to the program with a known degree of precision. Second, we will pool grantees with RCTs and those with strong quasi-experimental designs (QEDs), in which program and comparison groups were matched on key factors, such as baseline history of substance abuse and family functioning. The internal validity of QEDs is weaker than that of RCTs, because differences on unobservable characteristics cannot be determined. However, a design with well-matched program participants and comparison group members provides useful information on program effects. Combining the QED results with RCTs will increase the statistical power of the overall analysis, enabling the detection of smaller effects. Because of the serious consequences of child maltreatment, even relatively small effect sizes may be clinically meaningful.

Baseline data will be collected by grantees and their local evaluators and used in the impact and implementation analyses. First, baseline data will be used to describe the characteristics of RPG program participants. For each grantee, we will present tables of frequencies and means for key participant characteristics, including demographic and family information. We will also present aggregated results for the grantees with RCTs and the combined RCT-QED sample used for the impact study.

A key use of baseline data is to test for baseline equivalence for both the RCT and the RCT-QED samples. Though random assignment ensures that families participating in the program and those in comparison groups do not initially differ in any systematic way, there might still be chance differences between groups. Establishing baseline equivalence for the QEDs is critical for determining whether the comparison group represents a reasonable counterfactual, that is, what would have happened to the program group had they not received treatment. To confirm that there were no differences between the program and comparison groups at the study’s onset, we will statistically compare key characteristics between the groups. In addition, since the standardized instruments will be administered twice (once at program entry and again at program exit), we will also compare outcome measures at program entry between the two groups. In particular, to establish baseline equivalence, we will conduct t-tests and F-tests for differences between the two groups, both overall and separately by grantee. In these comparisons, we will use the analytic sample, which is composed of respondents to both the baseline and follow-up instruments.
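To illustrate the equivalence tests described above, here is a minimal sketch of a two-sample t test in Python. The samples are simulated for illustration only; in practice, the program and comparison groups’ baseline scores from the analytic sample would be used, and the F-tests and grantee-level comparisons would follow the same pattern.

```python
# Illustrative two-sample t test for baseline equivalence (simulated data).
import random
import statistics as st

def t_statistic(a, b):
    """Two-sample t statistic with a pooled variance estimate."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * st.variance(a) + (nb - 1) * st.variance(b)) / (na + nb - 2)
    return (st.mean(a) - st.mean(b)) / (sp2 * (1 / na + 1 / nb)) ** 0.5

random.seed(0)
# Hypothetical baseline scale scores (e.g., family functioning) for two groups
program = [random.gauss(50, 10) for _ in range(200)]
comparison = [random.gauss(50, 10) for _ in range(200)]

t = t_statistic(program, comparison)
# |t| below the critical value (about 1.96 at the 5% level) would be
# consistent with baseline equivalence on this measure.
print(f"t = {t:.2f}")
```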

Baseline data will also be analyzed jointly with the follow-up data to estimate impacts. Using baseline data in the impact analysis will improve the statistical precision of impact estimates and control for any remaining differences between groups. The average impact estimate will be a weighted average of the site-specific impacts, where the weight of each site-specific impact is the inverse of its squared standard error. As such, sites with more precise impact estimates (for example, sites with larger sample sizes or baseline variables that are highly correlated with the outcomes) will receive greater weight in the average impact estimate. We will compare the results using the sites with RCT evaluations to those obtained with the RCT-QED sample, noting that the former are most rigorous, whereas the latter should be considered “suggestive” or “promising” evidence of effectiveness.
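The weighting rule described above (inverse-variance weighting, standard in meta-analysis) can be sketched as follows; the site impacts and standard errors shown are hypothetical:

```python
# Sketch of the pooling rule: the cross-site impact estimate is an
# inverse-variance weighted average of site-specific impact estimates.
def pooled_impact(impacts, standard_errors):
    """Inverse-variance weighted average of site-specific impact estimates."""
    weights = [1.0 / se ** 2 for se in standard_errors]
    total_weight = sum(weights)
    estimate = sum(w * b for w, b in zip(weights, impacts)) / total_weight
    pooled_se = (1.0 / total_weight) ** 0.5  # SE of the pooled estimate
    return estimate, pooled_se

# Three hypothetical sites: the site with the smallest SE dominates the average.
estimate, se = pooled_impact([0.20, 0.05, 0.30], [0.05, 0.10, 0.20])
print(f"pooled impact = {estimate:.3f} (SE = {se:.3f})")  # ≈ 0.176 (SE ≈ 0.044)
```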

Overall Performance

To inform Congress on the performance and progress of the RPG sites, we will annually estimate and report on performance measures for the 17 sites. The reporting will include selected measures collected and calculated for the (1) Implementation and Partnership Study, (2) Outcomes Study, and (3) Impact Study. To reduce the burden for the grantees and local evaluators, we have designed the studies for complete overlap between the performance measures and those of the other evaluation components, so no additional data are needed. Reporting is likely to include results from the implementation analysis, such as information about program operations, enrollment, and participation; the partnership analysis, including partnership goals and collaboration; and detailed descriptions of the characteristics and outcomes associated with participating families. When the results from the impact analysis are ready, we will include them in the annual reporting.

b. Time Schedule and Publications

This ICR is requesting clearance for data collection for the cross-site evaluation for three years, beginning April 2014. Once data collection is complete, reporting will continue through August 2017.

We will develop reports describing the results of the evaluation components in progress each year (Table A.2). Annual reports, starting in November 2013, will be designed for accessibility by a broad audience of policymakers and practitioners. For the overall performance component, we will produce annual reports to Congress beginning in September 2013. A final evaluation report will provide a comprehensive synthesis of all aspects of the study over the entire contract, including integration and interpretation of both qualitative and quantitative data.

Table A.2. Schedule for the RPG Cross-Site Evaluation

| Activity | Date |
| --- | --- |
| Data Collection | April 2014-March 2017a |
| Reports to Congress | Annually beginning September 2013b |
| Annual Reports | Annually beginning November 2013b |
| Ad-hoc Reports or Research Briefs | As requested by Children’s Bureaub |
| Final Evaluation Report | August 2017 |

a Data collection will begin once OMB clearance is received; depending on this timing, the start date may be earlier or later than April 2014.

b Reports prior to OMB clearance will describe project activities or issues that do not include any of the data collection described in this clearance request.

In addition to planned reports on the findings, RPG will provide opportunities for analyzing and disseminating additional information through special topics reports and research or issue briefs. Short research or policy briefs are an effective and efficient way of disseminating study information and findings. The cross-site evaluation team will produce ad hoc reports or special topics briefs, up to two each, at the request of CB. Topics for these briefs will emerge as the evaluation progresses but could, for example, provide background on selected evidence-based practices; summarize key implementation, impact, or subgroup findings; or describe the study purpose and grantees.

A.17. Reason(s) Display of OMB Expiration Date is Inappropriate

Approval not to display the expiration date for OMB approval is not requested.

A.18. Exceptions to Certification for Paperwork Reduction Act Submissions

No exceptions are necessary for this data collection.
