RPG
National Cross-Site Evaluation and Evaluation Technical
Assistance
Supporting Statement, Part A
For OMB
Approval
February 26, 2020
The Children’s Bureau (CB) within the Administration for Children and Families of the U.S. Department of Health and Human Services (HHS) seeks approval to collect information for the Regional Partnership Grants to Increase the Well-being of and to Improve Permanency Outcomes for Children Affected by Substance Abuse (known as the Regional Partnership Grants Program or “RPG”) Cross-Site Evaluation and Evaluation-Related Technical Assistance project. The Child and Family Services Improvement and Innovation Act (Pub. L. 112-34) includes a targeted grants program (section 437(f) of the Social Security Act) that directs the Secretary of HHS to reserve a specified portion of the appropriation for these RPGs, to be used to improve the well-being of children affected by substance abuse. Under three prior rounds of RPG, CB issued 74 grants to organizations such as child welfare or substance abuse treatment providers or family court systems to develop interagency collaborations and integration of programs, activities, and services designed to increase well-being, improve permanency, and enhance the safety of children who are in an out-of-home placement or at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance abuse. In 2017, CB awarded grants to a fourth cohort of 17 grantees, including 2 grantees serving American Indian or Alaska Native (AI/AN) participants; in 2018, CB awarded 10 grants to a fifth cohort; and in 2019, CB awarded 8 grants to a sixth cohort. The current information collection request (ICR) covers data collection activities associated with these 35 grantees. The 74 grantees from the prior rounds were covered under previous ICRs (OMB Control Numbers 0970-0353 and 0970-0444).
The RPG cross-site evaluation will extend our understanding of the types of programs and services grantees provided to participants, how grantees leveraged their partnerships to coordinate services for children and families, and the outcomes for children and families enrolled in RPG programs. First, the cross-site evaluation will assess the coordination of partners’ service systems (e.g., shared participant data, joint staff training) to better understand how partners’ collaborative effort affected the services offered to families (partnerships analysis). The cross-site evaluation will also focus on the partnership between the child welfare and substance use disorder (SUD) treatment agencies, to add to the research base about how these agencies can collaborate to address the needs of children and families affected by SUD. Second, the evaluation will describe the characteristics of participants served by RPG programs, the types of services provided to families, the dosage of each type of service received by families, and the level of participant engagement with the services provided (enrollment and services analysis). Finally, the evaluation will assess the outcomes of children and adults served through the RPG program, such as child behavioral problems, adult depressive symptoms, or adult substance use issues (outcomes and impacts analysis).
The evaluation is being undertaken by CB and implemented by its contractor, Mathematica Policy Research, and Mathematica’s subcontractor, WRMA Inc. The evaluator is required to advise CB on the instruments grantees use to collect data from program participants for their required local evaluations. Grantees will secure approval from their local institutional review boards (IRBs) for collecting these data.
This ICR requests clearance for obtaining from grantees participant data they collect for their local evaluations, and for directly collecting additional data from grantees and their partners and providers, for the cross-site evaluation. Specifically, this ICR requests clearance for the following data collection activities: (1) site visits with grantees, (2) web-based survey of grantee partners, (3) semiannual progress reports, (4) enrollment and services data provided by grantees, and (5) outcomes and impacts data provided by grantees.
Parents’ substance abuse is linked to significant developmental delays for children and disruptions to healthy parenting routines. Children born to parents who abuse substances may suffer neonatal abstinence syndrome or fetal alcohol syndrome (Hudak and Tan 2012). Substance abuse may limit parents’ ability to meet their children’s emotional and physical needs; contribute to problematic parental behavior patterns, including secrecy, violence, and emotional neglect; and inhibit parents’ ability to form healthy attachments necessary for child well-being (Lander et al. 2013).
As a central segment of the U.S. safety net, child welfare agencies assist families facing crises resulting from parental substance abuse. Children of parents with substance abuse issues are more likely to be placed in out-of-home care and more likely to stay in care longer than other children (Barth et al. 2006; Neger and Prinz 2015). Data from HHS (2016) indicate the number of children in foster care is growing and parental substance abuse is increasingly a factor contributing to a child’s removal from the home.
Among tribal communities, the problem of parental substance abuse has taken an especially high toll. Research shows elevated rates of behavioral health problems among AI/AN adults, including alcohol and substance abuse and suicide (Boyd-Ball et al. 2006; Indian Health Service 2011). According to one estimate, substance abuse is linked to about 85 percent of all AI/AN child welfare cases (Crofoot 2005; Crofoot and Harris 2012).
Child welfare agencies and other social services organizations that become involved in cases of parental substance abuse encounter myriad systemic barriers to addressing this problem effectively. Parents might struggle to access and complete treatment for SUDs because too few facilities accept parents with their children. Logistical obstacles such as lack of child care or transportation might prevent parents from accessing treatment for SUDs. For parents who do access treatment, recovery from SUDs is a lengthy process that sometimes includes relapse. Extended treatment might be incompatible with timelines mandated by the Adoption and Safe Families Act regarding permanency or reunification (Ryan et al. 2017).
Recognizing the need to better support families affected by SUD and seeking ways to coordinate systems of care, in 2006, Congress authorized HHS to offer competitive RPGs. These grants aim to support partnerships among child welfare agencies, SUD treatment organizations, and other social service systems and thereby improve the well-being, permanency, and safety outcomes of children and families. In the prior rounds of grants (in fiscal year [FY] 2007, FY 2012, and FY 2014), CB awarded RPGs to 74 partnerships.
RPGs serve families in which adults have (1) a substance abuse problem and (2) one or more children in or at risk of out-of-home placement. Grantees must recruit and enroll families or individuals from their target groups, collaborate with partners, and align services and timelines across systems, including child welfare, SUD treatment, and the courts. They must design, deliver, and evaluate program strategies to meet grant requirements and improve outcomes at the individual, family, and system level. Each project operates in state and local contexts with potential implications for families, services, and partnerships (May et al. 2016).
The current RPG projects represent CB’s continued support of this important work. CB awarded 17 new five-year RPGs in FY 2017, including 2 awards to grantees serving AI/AN communities; made 10 three-year awards in FY 2018; and made 8 five-year awards in FY 2019. RPG continues to focus on achieving outcomes related to child well-being, family functioning and stability, adult recovery from SUDs, child permanency, and child safety. In addition to implementing programs, RPG projects must comply with the Child and Family Services Improvement and Innovation Act of 2011 (Pub. L. 112-34), which requires that HHS evaluate the services and activities funded through RPG. Thus, to address the legislation’s goals and to contribute knowledge to the fields of child welfare and substance abuse treatment programming, HHS requires RPG projects to design and conduct a local evaluation and participate in a national cross-site evaluation. All RPG projects funded in FY 2012 and FY 2014 conducted a local evaluation and participated in a national cross-site evaluation. Grantees funded in FY 2017, FY 2018, and FY 2019 will also participate in these evaluation activities. This OMB package covers only data collection activities that are part of the cross-site evaluation.
Through the RPG cross-site evaluation, CB seeks to learn about RPG programs and services and their potential effect on improving outcomes for children and families in the key areas of increased child well-being, family functioning and stability, adult recovery, improved permanency, and enhanced child safety. By analyzing data on RPG partnerships and services, CB seeks to understand how the proximal and distal outcomes are influenced by the partnership between the child welfare and SUD treatment agencies and the level of integration among the partners (partnerships analysis). In addition, CB seeks to understand how the partnership influences service delivery and how the services provided influence outcomes (enrollment and services analysis). An outcomes and impacts analysis will also be conducted. The inclusion of a rigorously designed impact study using a subset of grantees will also provide CB, Congress, grantees, providers, and researchers with information about the effectiveness of RPG programs.
Taking these goals into consideration, the cross-site evaluation aims to address the following research questions:
Partnerships analysis
What partners were involved in each RPG project and how did they work together?
How much progress did the RPG projects make toward interagency collaboration and service coordination?
How did the child welfare and substance use treatment agencies work together?
Enrollment and services analysis
What referral sources did projects use?
Who enrolled in RPG?
To what extent did RPG projects reach their target populations?
What core services were provided and to whom?
Were core services that families received different from the services that were proposed in the RPG project applications? If so, what led to the changes in planned services?
How engaged were participants with the services provided?
How did grantees and their partners collaborate to provide services?
Outcomes analysis
What are the well-being, family functioning, recovery, permanency, and safety outcomes of children and adults who received services from RPG projects?
Impact analysis
What are the impacts of RPG projects on children and adults who enrolled in RPG?
The RPG cross-site evaluation will include the following data collection activities to support the partnerships, enrollment and services, and outcomes and impacts analyses:
Site visits and key informant interviews. To understand the design and implementation of RPG projects, the cross-site evaluation team will visit up to 27 sites to better understand the partnership and coordination between the child welfare and SUD treatment agencies. The team will conduct telephone interviews with the remaining 6 sites to gather similar information about their design and implementation. The site visits will focus on the RPG planning process; how and why particular services were selected; the ability of the child welfare, SUD treatment, and other service systems to collaborate and support quality implementation of the RPG services; challenges experienced; and the potential for sustaining the collaborations and services after RPG funding ends.
Partner survey. To describe the interagency collaboration within RPG sites, grantees and their partners will participate in a paper survey one time during the grant period. One person from each organization knowledgeable about the RPG project will be invited to participate in the survey. The survey will collect information about communication and service coordination among partners. The survey will also collect information on characteristics of strong partnerships (e.g., data-sharing agreements, co-location of staff, referral procedures, and cross-staff training).
Semiannual progress reports. Grantee project directors will complete semiannual progress reports with updated information about their projects, including any changes from prior periods. CB has tailored the semiannual progress reports to collect information on grantees’ services, the target population for the RPG program, project operations, partnerships, and grantees’ perceived successes and challenges to implementation.
Enrollment and services data. To document participants’ characteristics and their enrollment in RPG services, all grantees will provide data on enrollment of and services provided to RPG families. These data include demographic information on family members, dates of entry into and exit from RPG services, and information on RPG service dosage. These data will be submitted regularly by staff at the grantee organizations into an information system developed by the cross-site evaluation team.
Outcome and impact data. To measure participant outcomes, all grantees will use self-administered standardized instruments to collect data from RPG adults. The standardized instruments used in RPG collect information on child well-being, adult and family functioning, and adult substance use. Grantees will also obtain administrative data on a common set of child welfare and SUD treatment data elements. Grantees will share the responses on these self-report instruments and the administrative data with the cross-site evaluation team through a system developed by the cross-site team.
In addition to conducting local evaluations and participating in the RPG cross-site evaluation, the grantees are legislatively required to report performance indicators aligned with their proposed program strategies and activities. A key strategy of the RPG cross-site evaluation is to minimize burden on the grantees by ensuring that the cross-site evaluation, which will include information from all grantees on implementation, partnerships, and participant characteristics and outcomes, fully meets the need for performance reporting. Thus, rather than collecting separate evaluation and performance indicator data, the grantees need only participate in the cross-site evaluation. In addition, using the standardized instruments that CB has specified will ensure that grantees have valid and reliable data on child and family outcomes for their local evaluations.
The Promoting Safe and Stable Families Program (Section 437(f), Subpart 2, Title IV-B, of the Social Security Act) (42 U.S.C. 629g(f)), as amended by the Child and Family Services Improvement and Innovation Act (Pub. L. 112-34), includes a targeted grants program (section 437(f) of the Social Security Act) that directs the Secretary of HHS to reserve a specified portion of the appropriation for RPGs to improve the well-being of children affected by substance abuse. This legislation also requires grantees to report performance indicators aligned with their proposed program strategies and activities. Under the terms of the RPG grant, CB requires grantees to participate in a national cross-site evaluation. The Child and Family Services Improvement and Innovation Act (Pub. L. 112-34) is included in appendix A.
The data collected through the instruments included in this ICR will be analyzed and reported on by the RPG cross-site evaluation. The evaluation fulfills the legislative requirement for evaluation, and its findings might help Congress set future policy. The evaluation also aims to contribute to the knowledge base about the implementation and effectiveness of strategies and programs selected by grantees for meeting the needs of children who are in an out-of-home placement or are at risk of being placed in out-of-home care as a result of a parent’s or caretaker’s substance abuse. The findings from the RPG cross-site evaluation will be used by policymakers and funders to consider what strategies and programs they should support to meet the needs of these families. Providers can use the findings to select and implement strategies and program models suited to the specific families and communities they serve. Evaluation findings can fill research gaps by rigorously testing program models that have prior evidence of effectiveness with some target populations but not the RPG target populations, or when provided in combination with other services and programs. Congress will also use information provided through the evaluation to examine the performance of the grantees and the grant program. Details on the purpose and use of the information collected through each instrument used to support the partnerships, enrollment and services, and outcomes and impacts analyses follow.
The partnerships analysis will assess the collaboration and coordination of services the RPG projects provided for families. The analysis will examine which partners are involved in each project, the roles they play, and the extent of collaboration among partners, ranging from sharing a vision and goals to integrating assessment and treatment. In addition, the analysis will explore the interagency collaboration and coordination of the child welfare and substance use treatment agencies, specifically examining topics such as competing priorities within each agency, conflicting timelines for recovery and permanency decisions, and limited or conflicting data sharing between agencies. Advancing the collaboration and coordination of these two agencies is critical to the success of the RPG partnerships because they aim to serve the same families and support their well-being. The analysis will primarily draw on two data sources:
Grantee and partner staff site visit topic guide (Appendix B). This topic guide will collect detailed information from selected project and grantee staff and partners about the RPG planning process, how and why particular partners were selected, how the partnerships developed, changes in partnerships and the rationale for those changes, the project director’s perceptions of partnership quality, partnership challenges, and lessons learned. In addition, site visitors will interview representatives from the child welfare provider and substance use treatment agency (if those differ from the grantee) to understand their role in RPG planning, their roles and responsibilities, views on the goals of RPG, agency goals and priorities, reconciling competing priorities, and any policy or process changes within the agencies resulting from collaboration on RPG.
Partner survey (Appendix C). The partner survey will be administered to grantees and their partners. This survey will gather information on the characteristics of the partner organizations, how partners communicate and collaborate, goals of the partnership, and the types of organizations and roles within the partnership.
The enrollment and services analysis will describe who was served in the RPG projects and how. The analysis will examine how grantees defined and refined their target populations over the course of their projects and why those changes occurred. It will provide an expanded picture of all core services provided to families enrolled in RPG. Core services are the services defined by the grantee that make up its main RPG project. These include, at a minimum, all services funded by the grant, and might include in-kind services provided by partners. The analysis also seeks to describe how engagement varied across participants and services, and how grantees and their partners collaborated to provide the services. The enrollment and services analysis will use the following data sources:
Semiannual progress reports (Appendix D). Grantee project directors will complete semiannual progress reports with updated information about their projects, including any changes from prior periods. CB has tailored the semiannual progress reports to collect information on grantees’ services, the target population for the RPG program, project operations, partnerships, and grantees’ perceived successes and challenges to implementation.
Enrollment and services data (Appendix E). These data will describe participants’ characteristics at enrollment and the services they receive. Grantees will record the enrollment date for each RPG family or household and demographic information on each family member including date of birth, ethnicity, race, primary language spoken at home, type of current residence, income (adults only), highest education level attained (adults only), and relationship to a focal child in each family on whom data will be collected. Grantees will also record the enrollment date for families or individual family members into RPG services, service contact information for core services, and exit dates for RPG.
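To illustrate the kind of record the enrollment and services data application will hold, the following minimal sketch defines a hypothetical participant record in Python with the fields described above. The field names, types, and structure are assumptions for exposition only; they are not the actual RPG-EDS schema.

from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

# Hypothetical illustration of an RPG enrollment record; field names and
# types are assumptions for exposition, not the actual RPG-EDS schema.
@dataclass
class ServiceContact:
    service_name: str              # core RPG service (e.g., a parenting program)
    contact_date: date
    duration_minutes: int

@dataclass
class FamilyMember:
    date_of_birth: date
    ethnicity: str
    race: str
    primary_language: str
    residence_type: str
    relationship_to_focal_child: str
    income: Optional[float] = None            # adults only
    highest_education: Optional[str] = None   # adults only

@dataclass
class EnrollmentRecord:
    case_id: str                              # content-free identifier
    enrollment_date: date
    members: List[FamilyMember] = field(default_factory=list)
    service_contacts: List[ServiceContact] = field(default_factory=list)
    exit_date: Optional[date] = None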
The outcomes analysis will describe the characteristics of participating families and their outcomes in the five domains: (1) child well-being, (2) family functioning and stability, (3) adult recovery, (4) child permanency, and (5) child safety.
Grantees will administer five instruments at project entry and exit to obtain data on child well-being for a focal child identified in each RPG case, and for the family functioning/stability and recovery domains, as follows (also in Appendix F):
Child well-being (one of the following age-appropriate instruments depending on the age of the focal child)
Child Behavior Checklist-Preschool Form (Achenbach and Rescorla 2000)
Child Behavior Checklist-School-Age Form (Achenbach and Rescorla 2001)
Infant-Toddler Sensory Profile (Dunn 2002)
Family functioning and stability (both)
Adult-Adolescent Parenting Inventory (Bavolek and Keene 1999)
Center for Epidemiologic Studies-Depression Scale, 12-Item Short Form (Radloff 1977)
Adult recovery (both)
Addiction Severity Index, Self-Report Form (drug and alcohol scales only) (McLellan et al. 1992)
Trauma Symptoms Checklist-40 (Briere and Runtz 1989)
Grantees will also obtain data from administrative records maintained by local or state child welfare, foster care, and substance abuse treatment agencies for their local evaluations, and provide a core set of records to the cross-site evaluator. These records will serve to create measures of child safety and permanency, and adult receipt of substance abuse treatment services and their recovery. Grantees will receive a list and specifications of the core set of records needed (see Appendix G).
The impacts analysis aims to provide pooled estimates of the effectiveness of RPG projects among grantees with rigorous local evaluation designs. All grantees who have a well-specified quasi-experimental or randomized controlled trial design and a non-administrative data comparison group will be part of the impacts analysis. Grantees in the impacts analysis will collect data using the same set of standardized instruments and obtain the same administrative data on the comparison group as described above for the outcomes analysis (see Appendix F and G).
The RPG cross-site evaluation will use technology to collect study information. The only exceptions are for the semi-structured in-person interviews conducted during site visits, the written semiannual progress reports, and the paper and pencil partner survey. The cross-site evaluation will use technology to improve the user experience and reduce burden in the following ways:
Data entry system to collect data from grantees. The evaluation contractor and its subcontractor operate a seamless and transparent web-based data reporting system, known as the RPG-Evaluation Data System (EDS). RPG-EDS has a user interface accessible from any computer, allowing for ease of entry, while all data are housed on secure servers behind the contractors’ firewalls, thereby maintaining data security. The system has been modeled after the data systems used with prior cohorts of RPG grantees. It includes two applications, each designed to facilitate efficient reporting of (1) enrollment and services data, and (2) outcomes data. The system can be used by multiple users at each organization and will provide varying levels of access depending on users’ needs. For example, administrators or supervisors will have the greatest rights within the system, being able to create new users, assign program participants to staff members, and review all activity from the organization. Staff providing direct services to study participants will be able to record and review information about participants assigned to their caseload. The various levels of system access allow for streamlining of information. Limiting full system access to a small set of staff members increases data security, reduces respondent confusion, and supports the collection of higher quality information.
Enrollment and services data. On a rolling basis, grantee staff will use the enrollment and services data application to provide demographic information on each RPG case at enrollment, as well as enrollment and exit dates for the RPG project and information on each service in which case members enroll. The design of the RPG-EDS enrollment and services data entry application is based on web-based case management systems that Mathematica has developed and implemented successfully for multiple projects, including evaluations of prior RPG cohorts that involved collecting similar data from similar types of providers. For example, the enrollment and services data entry application will be flexible and easy to use, and will include navigational links to relevant fields for each type of entry to minimize burden on grantee staff and increase the quality and quantity of data collected.
Outcomes data. Each grantee will report data from standardized instruments and a list of data elements they will draw from administrative records. Grantees will develop their own project or agency databases to store these data. The grantee database will include all data the grantee collects from clients or on behalf of clients. The contractor will provide format specifications to the grantees to use when uploading outcomes data to RPG-EDS. These are in easy-to-use PDF and Microsoft Excel formats. Twice a year, grantees will upload these data to RPG-EDS. This application in RPG-EDS is modeled on the system that was used to obtain these types of data from RPG recipients during previous rounds of grants. Six of the current grantees were also in prior RPG cohorts and reported data through the prior data systems; thus, they are well prepared to use this type of application. Importantly, the new application in RPG-EDS will incorporate advances in technology and software, and improved programming approaches. These improvements will enhance the experience of providing outcomes data for this RPG cohort, including reducing the time to prepare and upload data to the system.
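As a minimal sketch of the kind of automated check that accompanies such an upload, the following Python code validates a hypothetical outcomes file against a simple format specification. The column names, file format, and rules shown are assumptions for exposition; the actual format specifications and validation checks are defined by the evaluation contractor in RPG-EDS.

import csv

# Hypothetical required columns for an outcomes upload; illustrative only.
REQUIRED_COLUMNS = {"case_id", "instrument", "administration", "item", "response"}

def validate_upload(path):
    """Return a list of data quality problems found in an uploaded CSV file."""
    problems = []
    with open(path, newline="") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED_COLUMNS - set(reader.fieldnames or [])
        if missing:
            return [f"Missing columns: {sorted(missing)}"]
        for row_number, row in enumerate(reader, start=2):  # row 1 is the header
            if not row["case_id"]:
                problems.append(f"Row {row_number}: blank case_id")
            if row["administration"] not in {"baseline", "exit"}:
                problems.append(f"Row {row_number}: unexpected administration value")
    return problems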
In addition, the cross-site evaluation will reduce burden by using evaluation data to construct performance indicators. The legislation that established the RPG program requires grantees to provide performance indicators to be included in reports to Congress, which the cross-site evaluation contractor will produce. To minimize grantee burden, the cross-site evaluation contractor will use data obtained for the partnerships, enrollment and services, and outcomes and impacts analyses to create the needed performance indicators. This avoids grantees submitting performance data in addition to data required for the evaluation. Data collected directly by Mathematica or provided by grantees for the cross-site evaluation will serve to describe to Congress each grantee’s project strategies; the structure and membership of each grantee’s collaborative partners across child welfare, substance abuse treatment, judicial, and other systems; enrollment targets and the pace of enrollment, along with a description of program participants; and the services RPG participants receive. Combined with information on child, adult, and family outcomes, this information will give Congress a full picture of how grantees performed and the extent to which they met their RPG program goals.
The RPG cross-site evaluation is specifically designed to minimize the duplication of efforts or data. First, to participate in the cross-site outcomes and impacts evaluations, grantees will share some or all of the data they are collecting for their own required local evaluations. Second, data shared by grantees or provided through direct collection from grantees, staff members, and partners for the cross-site evaluation will also serve to describe grantee performance. That is, to reduce the duplication of effort required for grantees to comply with both CB’s local and cross-site evaluation requirements and the legislatively mandated performance indicators, the performance indicators will be constructed entirely from data already collected for the cross-site evaluation. Because no existing reporting systems collect the data required for reporting to Congress or for the cross-site evaluation, this data collection plan does not duplicate any current efforts.
Furthermore, the design of the cross-site evaluation instruments prevents duplication of data collected through each instrument. For example, during the semi-structured interviews conducted during site visits, partner representatives will not be asked any questions included in the partner survey. In creating the instruments for the outcomes and impacts analysis, the contractor reviewed and performed a crosswalk of all items to identify duplication across instruments. Any duplicate items not needed for scoring the instruments were removed from the versions of the standardized instruments provided in the outcomes and impacts instruments. This not only reduces burden on RPG participants providing data for grantees’ local evaluations, but also reduces the burden on grantee staff preparing and uploading outcomes data to the cross-site evaluation.
Data collection could affect small entities within a grantee site, depending on the local community partners and funders with which grantees engage. RPG partners will be included in the site visit interviews and partner surveys. Additionally, grantee agencies will enter data into RPG-EDS. The proposed data collection for these efforts aims to minimize the burden on all organizations involved, including small businesses and entities, and is consistent with the aims of the legislation establishing RPG and with CB’s need for valid, reliable, and rigorous evaluations of federally funded programs.
Not collecting information for the RPG cross-site evaluation would limit the government’s ability to document the performance of its grantees, as legislatively mandated, and to assess the extent to which these federal grants successfully achieve their purpose. Furthermore, the RPG cross-site evaluation is a valuable opportunity for CB, practitioners, and researchers to learn about the implementation and effectiveness of coordinated strategies and services for meeting the needs of families in the child welfare and substance abuse treatment systems. The study will examine whether the government’s strategy of funding collaborative, cross-system partnerships is productive and likely to be sustained after the grant period ends, and will assess how well partnerships are collaborating, the characteristics of the participants enrolling in RPG, the services provided to participants, and the outcomes and impacts on children and adults enrolled in RPG.
The information collection proposed is necessary for a successful cross-site evaluation. The consequences of not collecting this information or collecting the information less frequently are discussed as follows for each data collection element:
Grantee and partner staff topic guide. Without the information being collected through interviews with grantee and partner staff, the cross-site evaluation would have to rely entirely on information reported by a single source: the RPG project directors through the semiannual progress reports. Thus, the study would lack the broader perspectives of other key participants, and it would not be possible to conduct any in-depth analysis of critical program challenges and successes or implementation issues.
Partner survey. Without the partner survey, CB would not be able to collect information to understand the roles that partners play in RPG, the communications and working relationships among partners, the quality of their collaboration, and their goals for the RPG project in their region. Partnerships are a key element of the RPG program, but the literature shows that collaboration and service integration between child welfare agencies, substance abuse treatment providers, and other key systems such as the courts has been rare or challenging in the past. In addition, many federal initiatives require grantees to establish collaborations. Thus, collecting these data will help fill important gaps in knowledge for RPG and potentially other grant programs.
Semiannual progress reports. Without obtaining information from the semiannual progress reports, the study would not have detailed information about grantee operations; changes to planned interventions, target population and eligibility criteria, or target outcomes; nor planned or unplanned changes to services provided to participants. The progress reports will provide timely information about the infrastructure that grantees put in place to support implementation as well as features of the community context that have influenced grantees’ implementation plans. Collecting this information less often than twice a year would violate the federal requirements for grantee progress reporting, and would place larger burdens on respondents to remember or store information about events, changes in direction, or challenges and successes over a longer period. Because aggregate information from the reports will be extracted and shared with grantees for program improvement and peer learning, less frequent reporting would also limit grantees’ ability to consider adjustments or reach out to one another. The data also provide a critical supplement to other data being collected and provide information for designing evaluation-related and programmatic technical assistance in response to emerging issues and grantee needs.
Enrollment and services data. The enrollment and services data are important for describing actual service delivery to cases and for tracking all activities completed with the participants, including assessments, referrals, education, and support. Data will be collected when participants enroll, as they receive services, and at exit. Without these data, the study would have no information on the services recipients actually receive, including their duration and dosage. The evaluation would be unable to link participant outcomes to the levels or combinations of specific services or understand whether and how participants participated in the selected services. If data were collected less frequently, providers would have to store services data or try to recall them weeks or months after delivery. Regular collection also enables data quality checks to address missing data, errors, or other problems in a timely way.
Outcomes data. It is CB’s mission to ensure child well-being, safety, and permanency for children who experience maltreatment. The outcomes instruments will provide detailed information on these outcomes and the participants who receive services. Grantees will upload data from the outcomes instruments twice each year. Without this information, evaluators would be unable to describe the outcomes of RPG program participants or analyze the extent to which grants have affected the outcomes of or addressed the needs of families co-involved with substance abuse and child welfare systems. Further, it would be impossible to conduct an impact study (described next). During each upload, RPG-EDS will perform automatic validation checks of the quality and completeness of the data. Mathematica will then review submissions to address any remaining data quality issues, and work with grantees to resolve problems. If data were uploaded less often, it would be more cumbersome and challenging for grantees to search through older records to correct or provide missing data.
Impacts analysis. Grantees participating in the impacts analysis will also upload outcomes data for participants in their comparison group (that is, those who do not receive RPG services or receive only a subset of RPG services). Without this information, it would not be possible to rigorously analyze the effectiveness of the interventions by comparing outcomes for individuals with access to RPG services with those in comparison groups. Uploading the data every six months provides the same benefits with respect to data quality described above.
There are no special circumstances requiring deviation from these guidelines.
The first Federal Register Notice was published on October 10, 2018 (Federal Register, Vol. 83, No. 196, pp. 50936-50938). The comment period ended December 10, 2018. No comments were received.
No payments or gifts will be provided to respondents as part of data collection.
This study is being conducted in accordance with all relevant regulations and requirements, including the Privacy Act of 1974 (5 U.S.C. 552a), the Privacy Act Regulations (34 CFR Part 5b), and the Freedom of Information Act (5 U.S.C. 552) and related regulations (41 CFR Part 1-1, 45 CFR Part 5b, and 40 CFR 44502). Several specific measures will be taken to protect respondent privacy.
Adopting strict security measures and web security best practices to protect data collected through RPG-EDS. Data collected through RPG-EDS (which include outcomes data as well as enrollment and services data) will be housed on secure servers that conform to the requirements of the HHS Information Security Program Policy. The data portal will employ strict security measures and web security best practices, including data authentication, monitoring, auditing, and encryption, to ensure that data are submitted, stored, maintained, and disseminated securely and that the confidentiality of participant information stored in the system is protected. Specific security procedures include, but are not limited to, the following:
The system is currently undergoing the HHS security authorization process to obtain an Authority to Operate.
All data will be encrypted in transit and at rest and reside behind firewalls.
Access to RPG-EDS will be restricted to approved staff members who will be assigned a password only with permission from the study director. Each user will have a unique user ID/password combination and will be enrolled in the project’s multifactor authentication system.
Database access will require special system accounts. RPG-EDS users will not be able to access the database directly.
RPG-EDS users will be able to access the system only within the scope of their assigned roles and responsibilities.
Security procedures will be integrated into the design, implementation, and day-to-day operations of RPG-EDS.
All data files on multi-user systems will be under the control of a database manager, with access limited to project staff on a “need-to-know” basis only. To further ensure data security, project personnel must adhere to strict standards, receive periodic security training, and sign security agreements as a condition of employment.
Training cross-site evaluation interviewers in confidentiality procedures. All site visit interviewers will be trained on privacy procedures and will be prepared to describe them in detail or to answer any related questions raised by respondents. During the introduction to each interview, site visit informants will be told that none of the information they provide will be used for monitoring or accountability purposes and that the results of the study will be presented in aggregate form only.
Assignment of content-free case and participant identification numbers to replace PII associated with all participant outcomes data provided by grantees to the cross-site evaluation. The cross-site evaluation will develop and work with grantees to implement standard procedures for assigning identification numbers to all participant-level data. Case- and individual-level numbers will be content-free. For example, they will not include special codes to indicate enrollment dates, participant location, gender, age, or other characteristics.
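As an illustration of the principle only, the following sketch shows one way to generate identifiers that encode no participant information; the actual assignment procedures will be developed with grantees and are not specified here.

import secrets
import string

def new_content_free_id(existing_ids, length=8):
    """Generate a random identifier that encodes no participant information.

    Because the ID is drawn at random rather than built from enrollment dates,
    location, gender, age, or other characteristics, it cannot be decoded to
    reveal anything about the case or participant.
    """
    alphabet = string.ascii_uppercase + string.digits
    while True:
        candidate = "".join(secrets.choice(alphabet) for _ in range(length))
        if candidate not in existing_ids:   # ensure uniqueness within the study
            existing_ids.add(candidate)
            return candidate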
There are no sensitive questions in the instruments that the contractor will use to collect data.
Some of the specified standardized instruments that grantees will use to collect data do include sensitive questions. For example, in the case of parents who are endangering their children as a result of their substance use, grantees must measure the parents’ pattern of substance use as a critical indicator of recovery. In recognition of this need, and to ensure confidentiality and other protections to their clients, as a condition of their RPG, all grantees must obtain IRB clearance for their data collection.
A.12. Estimates of Annualized Hour and Cost Burden
The estimated reporting burden and cost for the data collection instruments included in this ICR is presented in Table A.1. The grant period is five years; we are requesting clearance to collect data within a three-year period.
We estimate the average hourly wage for program directors and managers to be the average hourly wage of “Social and Community Services Manager” ($33.91), that of grantee staff to be the average hourly wage of “Counselors, Social Workers, and Other Community and Social Service Specialists” ($23.11), that of data managers to be the average hourly wage of “Database Administrators” ($42.81), that of data entry specialists to be the average hourly wage of “Data Entry and Information Processing Workers” ($16.73), and that for partners to be the average hourly wage of “General and Operations Manager” ($59.35), taken from the U.S. Bureau of Labor Statistics, Occupational Employment Statistics survey, 2017. Table A.1 summarizes the proposed burden and cost estimates for the use of the instruments and products associated with the partnerships, enrollment and services, and outcomes and impacts analyses.
For each burden estimate, annualized burden has been calculated by dividing the estimated total burden hours by the three years covered by this submission. Figures are estimated as follows (a worked example of the arithmetic for one row appears after this list, just before Table A.1):
Individual interview with program director. We expect to interview 27 RPG program directors (1 per grantee across 27 grantees) once during the evaluation period. These interviews will take 2 hours. The total burden for individual interviews with program directors is 54 hours, and the total annualized burden is 18 hours.
Individual interview with program manager or supervisor. We expect to conduct individual, semi-structured interviews with 27 program managers or supervisors (1 staff per 27 sites) once during the evaluation period. These interviews will take 1 hour. The total burden for individual interviews with program managers is 27 hours, and the total annualized burden is 9 hours.
Individual interview with frontline staff. We expect to conduct individual, semi-structured interviews with 54 frontline staff (2 staff per 27 sites) once during the evaluation period. These interviews will take 1 hour. The total burden for individual interviews with frontline staff is 54 hours, and the total annualized burden is 18 hours.
Partner representative interviews. We expect to conduct individual, semi-structured interviews with 81 partner representatives (3 partners for 27 grantees), once during the evaluation. These interviews will take 1 hour. The total burden for the individual interviews with partner representatives is 81 hours, and the total annualized burden is 27 hours.
Project director/program manager phone interview. We expect to conduct individual, semi-structured phone interviews with 8 project directors and 8 program managers, once during the evaluation. These phone interviews will take 1 hour. The total burden for the project director and program manager phone interviews is 16 hours, and the total annualized burden is 5.3 hours.
Partner survey. We expect to administer the paper-based survey once to 175 grantee partners (5 per site across 35 sites). The survey will take 25 minutes to complete. The total burden for the partner survey is 73 hours, and the total annualized burden is 24.3 hours.
Semiannual progress report. Grantees will submit two progress reports per year for each year of the evaluation period. We assume that 35 project directors (1 per grantee) will submit the semiannual progress reports six times during the evaluation period. It will take 16.5 hours to submit each one. The total burden for submitting the semiannual progress report is 3,465 hours, and the total annualized burden is 1,155 hours.
Case enrollment. Based on grantee estimates, we assume enrollment of 130 families per year per grantee. We assume that 3 staff per grantee will conduct enrollment, or 105 staff total. Each staff person will enroll about 43 families per year. It will take 15 minutes to enroll each family using RPG-EDS. Thus, the total burden for enrolling families across all staff members for three years is 3,386.3 hours, and the total annualized burden is 1,128.8 hours.
Case closure. Based on grantee estimates, we assume 130 cases will close each year, per grantee. We assume that 3 staff per grantee will conduct case closures in RPG-EDS, or 105 staff total. Each staff will close 43 cases per year. It will take 1 minute to close a case. Thus, the total burden for case closure across all staff members for three years is 270.9 hours, and the total annualized burden is 90.3 hours.
Case closure – prenatal cases. We assume one-quarter of cases, or 33 families per grantee, per year will include pregnant women. We assume 3 staff per grantee will conduct case closures for prenatal cases, which will require additional time at closure. It will take 1 additional minute to close a case in RPG-EDS. Thus, the total burden for case closure prenatal cases across all staff members for three years is 207.9 hours, and the total annualized burden is 69.3 hours.
Service log entries. Based on the expected participation of families in specific RPG services, we assume there will be two service log entries each week for each family (104 entries per family per year) in RPG-EDS. We assume that 6 staff per grantee will enter services data (210 staff total), with a caseload size of 22 families each. Each weekly entry will take 2 minutes. Thus, the total burden for completing service log entries is 43,243.2 hours, and the total annualized burden is 14,414.4 hours.
Administrative data
Obtain access to administrative data. During the second year of the study, grantees will review all data submission instructions, and grantee agency personnel will develop a data management plan and the necessary administrative agreements (such as memoranda of understanding) with agencies that house the administrative records to obtain the requested records. They will implement data protocols, including mapping their data fields to the fields in RPG-EDS. Finally, they will pilot the data request and receipt process. It will take 220 hours to obtain initial access. Thus, the total burden for obtaining initial access across all 35 grantees is 7,700 hours. Grantees will then update administrative agreements with agencies that house the administrative records once in the third year and once in the fourth year of the study. It will take 18 hours to update each agreement. Thus, the total burden is 1,260 hours for 35 grantees to update agreements twice. The combined burden for obtaining initial and ongoing access to administrative data is 8,960 hours. Grantees will use these data for their local evaluations as well; however, to comply with the procedures for providing the data to the cross-site evaluation, additional steps might be necessary. Therefore, we have assumed that half of the burden of obtaining the administrative data (4,480 hours) should be allocated to the cross-site evaluation. The annualized burden is 1,493.3 hours. We assume 1 data manager per grantee (or 35 data managers) will complete these processes.
Report administrative data. Grantees will upload administrative data they have obtained to RPG-EDS twice per year for the three-year evaluation period. For each upload, each grantee will require 144 hours to prepare and upload their administrative data, including correcting any data validation problems. The total burden for reporting administrative data is thus 30,240 hours for all 35 grantees combined, and the total annualized burden is 10,080 hours. We assume that 1 data entry operator per grantee (or 35 data entry operators) will upload the data.
Standardized instruments
Review and adopt reporting templates. During the first year of the study, grantees will review and adopt our reporting templates so they can upload standardized data in the second year of the study. (They will use the same templates for subsequent data reporting). We assume that each of 35 data entry operators (1 in each of the 35 sites) will require 8 hours to review the reporting templates. The total burden for reviewing and adopting the reporting templates is thus 280 hours, and the total annualized burden is 93.3 hours.
Data entry for standardized instruments. Over the course of the three-year study grantees will enroll 390 cases (130 cases enrolled each year). For every case, five standardized instruments will be administered at baseline and again at program completion. Grantees will enter data from the completed instruments into their local databases, and data entry for each instrument will take 15 minutes (0.25 hours) per administration (1.25 hours total). RPG grantees will use these data for their local evaluations; however, to comply with the procedures for providing the data to the cross-site evaluation, additional steps to enter these data into their local databases might be necessary. Therefore, we have assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus, the total burden for entering cross-site evaluation data is 17,062.5 hours, and the total annualized burden is 5,687.5 hours. We assume that 35 data entry operators (1 operator in each site) will enter the data.
Review records and submit electronically. Grantees will review records to ensure that all data have been entered and upload the data to RPG-EDS twice per year for each year of the evaluation period. It will take 3 hours to review and submit data for each of the five instruments twice per year. Grantees will then validate and resubmit data when errors are identified. It will take 2 hours to validate data for each of the five instruments, including time for obtaining responses to validation questions and resubmitting the data. Thus, the total burden is 5,250 hours, and the annualized burden is 1,750 hours. We assume that 35 data entry operators (1 operator in each site) will review and submit the data.
Data entry for comparison study sites. Twenty-eight grantees participating in the impact study will also enter data for control group members. For every member, five standardized instruments will be administered at baseline and follow-up. Grantees will enter data from the completed instruments into their local databases. It will take 0.25 hours for each administration (1.25 hours total). RPG grantees will use these data for their local evaluations as well; however, to comply with the procedures for providing the data to the cross-site evaluation, additional steps to enter these data into their local databases might be necessary. Therefore, we have assumed that half of the burden of data entry should be allocated to the cross-site evaluation. Thus, the total burden for entering cross-site evaluation data is 13,650 hours, and the total annualized burden is 4,550 hours. We assume the same enrollment size as grantees (130 cases per year) and that 28 data entry operators (1 operator in each of the 28 sites) will enter the data.
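As a worked example of how these figures are built up, the following sketch reproduces the case enrollment row of Table A.1 from the assumptions stated above (3 enrolling staff in each of the 35 grantee organizations, 43 families enrolled per staff member per year, 15 minutes per enrollment, and the grantee staff wage of $23.11 per hour). It illustrates the arithmetic only; it is not an official computation tool.

# Worked example reproducing the case enrollment row of Table A.1.
staff = 35 * 3                       # 3 enrolling staff per grantee across 35 grantees
families_per_staff_per_year = 43     # families enrolled per staff member per year
hours_per_enrollment = 0.25          # 15 minutes per enrollment in RPG-EDS
hourly_wage = 23.11                  # grantee staff wage (BLS, 2017)

annual_burden_hours = staff * families_per_staff_per_year * hours_per_enrollment  # 1,128.75 hours per year
total_burden_hours = annual_burden_hours * 3                                      # 3,386.25 hours over three years
annualized_cost = annual_burden_hours * hourly_wage                               # about $26,085.41 per year

print(annual_burden_hours, total_burden_hours, annualized_cost)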
Table A.1. Estimate of burden and cost for the RPG evaluation and total burden request
Data collection activity | Total number of respondents | Number of responses per respondent (each year) | Average burden per response (hours) | Total burden hours | Average hourly wage | Total annual burden hours | Total annualized cost

Site visit and key informant data collection
Program director individual interview | 27 | 0.33 | 2 | 54.0 | $33.91 | 18.0 | $610.38
Program manager/supervisor individual interviews | 27 | 0.33 | 1 | 27.0 | $33.91 | 9.0 | $305.19
Partner representative interviews | 81 | 0.33 | 1 | 81.0 | $33.91 | 27.0 | $915.57
Frontline staff interviews | 54 | 0.33 | 1 | 54.0 | $23.11 | 18.0 | $415.98
PD/PM phone interview | 16 | 0.33 | 1 | 16.0 | $33.91 | 5.3 | $180.85
Partner survey | 175 | 0.33 | 0.42 | 73.0 | $59.35 | 24.3 | $1,443.69

Enrollment and services data
Semiannual progress reports | 35 | 2 | 16.5 | 3,465.0 | $33.91 | 1,155.0 | $39,166.05
Case enrollment data | 105 | 43 | 0.25 | 3,386.3 | $23.11 | 1,128.8 | $26,085.41
Case closure | 105 | 43 | 0.02 | 270.9 | $23.11 | 90.3 | $2,086.83
Case closure – prenatal | 105 | 33 | 0.02 | 207.9 | $23.11 | 69.3 | $1,601.52
Service log entries | 210 | 2,288 | 0.03 | 43,243.2 | $23.11 | 14,414.4 | $333,116.78

Outcomes and impacts data
Administrative data
Obtain access to administrative data | 35 | 1 | 42.67 | 4,480.0 | $42.81 | 1,493.3 | $63,929.60
Report administrative data | 35 | 2 | 144 | 30,240.0 | $16.73 | 10,080.0 | $168,638.40
Standardized instruments
Review and adopt reporting templates | 35 | 0.33 | 8 | 280.0 | $16.73 | 93.3 | $1,561.47
Data entry for standardized instruments (a) | 35 | 130 | 1.25 | 17,062.5 | $16.73 | 5,687.5 | $95,151.88
Review records and submit | 35 | 2 | 25 | 5,250.0 | $16.73 | 1,750.0 | $29,277.50
Data entry for comparison study sites (28 grantees) (a) | 28 | 130 | 1.25 | 13,650.0 | $16.73 | 4,550.0 | $76,121.50

Estimated total burden | | | | 121,840.7 | | 40,613.6 | $840,608.61

(a) Burden hour estimates assume that only half of this burden is part of the cross-site evaluation.
These information collection activities do not place any additional costs on respondents or record keepers.
The estimated cost for completing the RPG cross-site evaluation and technical assistance project is $11,742,807 over the five years of the evaluation. Of this total, $5,990,355 represents the costs of the cross-site evaluation. The total cost over the three years of the requested clearance is $3,594,213. The annualized cost to the federal government includes one-third of that total ($1,198,071), plus the annualized burden cost (from Table A.1) of $1,490,104.28, for a total of $2,688,175.28 per year.
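A minimal check of the arithmetic behind the annual figure stated above:

# Annualized cost to the federal government, as described above.
contract_cost_three_years = 3_594_213                       # evaluation cost over the 3-year clearance period
annualized_contract_cost = contract_cost_three_years / 3    # 1,198,071
annualized_burden_cost = 1_490_104.28                       # annualized burden cost, as stated above
total_annual_cost = annualized_contract_cost + annualized_burden_cost
print(total_annual_cost)                                    # 2,688,175.28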
None; this is a new collection.
The information from the RPG cross-site evaluation—with a focus on partnerships, services, and outcomes for families—will be useful to funders, practitioners, and other stakeholders interested in targeting resources to effective approaches to address the needs of families affected by substance use issues. Identifying what has worked well allows program operators and funders to focus subsequent efforts on evidence-based practices and strategies.
Partnerships, enrollment and services, and outcomes analyses
Data from the instruments included in this OMB package will be analyzed using qualitative and quantitative methods to describe the target populations’ characteristics and outcomes; program services, dosage, and participant engagement; and the structure, quality, and goals of partnerships. An enrollment and services analysis of participants will provide a snapshot of child, adult, and family characteristics and outcomes. Thoroughly documenting program services and partnerships will expand understanding of the breadth of programs, practices, and services being offered through RPG to vulnerable families and will describe successes in achieving goals and barriers encountered. A greater understanding of how programs can be implemented with a network of partners might inform future efforts in this area.
Mathematica will use standard qualitative procedures to analyze and summarize information from project staff and partner interviews conducted using the semi-structured staff interview topic guide. These procedures include organization, coding, and theme identification. Standardized templates will be used to organize and document the information and then code interview data. Coded text will be searched to gauge consistency and consolidate data across respondents and data sources. This process will reduce large volumes of qualitative data to a manageable number of topics, themes, or categories (Yin 1994; Coffey and Atkinson 1996), which can then be analyzed to address the study’s research questions.
Quantitative data will be summarized using basic descriptive methods. For the outcomes analysis, data from the standardized instruments will be tabulated and used to create the scales and scores appropriate for each instrument, applying established norms when they are appropriate for the RPG target populations. Administrative records will be examined to determine whether incidents of child maltreatment or removal of a child from the home occurred, and whether adults received substance abuse treatment, how frequently, and with what resolution. These data will capture information at baseline and program exit for families who participate in services. For both the partnerships and the enrollment and services analyses, sources of quantitative data include the partner survey and the enrollment and services log. Data from each source will undergo a common set of steps: cleaning the data, constructing variables, and computing descriptive statistics. To facilitate analysis of each data source, we will create variables that address the study’s research questions. How each analytic variable is constructed will depend on its purpose and the data source being used. Variables might combine several survey responses into a scale or score, aggregate attendance data over a set period, or compare responses to identify a level of agreement.
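As a simplified illustration of constructing a scale and summarizing it descriptively, the sketch below combines three hypothetical partner survey items into a collaboration scale; the item names and values are invented for illustration and are not actual survey content.

import pandas as pd

# Hypothetical partner survey items rated 1-5, one row per responding partner
survey = pd.DataFrame({
    "shares_data": [4, 5, 3, 2],
    "joint_training": [5, 4, 3, 3],
    "referral_coordination": [4, 4, 2, 3],
})

# Combine the items into a simple scale (mean of the items) and summarize it
survey["collaboration_scale"] = survey[["shares_data", "joint_training",
                                        "referral_coordination"]].mean(axis=1)
print(survey["collaboration_scale"].describe())  # mean, standard deviation, and quartiles across partners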
Enrollment and services data, which grantees enter into RPG-EDS, will also be used for the enrollment and services analysis. The study will provide summary statistics for key program features (a brief illustrative sketch follows this list):
Enrollment. For example, the average number of new cases each month
Services provided by grantees. For example, the services in which clients typically participate (including any common combinations of services); distribution of location of services (such as home, treatment facility, or other site); the average number of selected services (such as workshops) offered each month; and common topics covered during services
Participation. For example, the average length of time participants are served by the program, the average number of hours of services program participants receive, and the average duration between enrollment and start of services
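The sketch below illustrates one such summary statistic, the average number of new cases enrolled each month. The records shown are hypothetical and stand in for the actual RPG-EDS data.

import pandas as pd

# Hypothetical case enrollment records (the real data come from RPG-EDS)
cases = pd.DataFrame({
    "case_id": [1, 2, 3, 4, 5],
    "enrollment_date": pd.to_datetime(
        ["2019-04-02", "2019-04-20", "2019-05-05", "2019-05-28", "2019-06-10"]),
})

# Count new cases per calendar month, then average across months
new_cases_per_month = cases["enrollment_date"].dt.to_period("M").value_counts()
print(new_cases_per_month.mean())  # about 1.67 new cases per month in this toy example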
We will analyze data from RPG-EDS for each grantee for the reports to Congress and annual reports identified in Table A.2. The reports to Congress will include topics such as enrollment patterns, services provided, and participation patterns over the previous 12 months. Later analyses might describe how patterns changed over time, such as from the early to late implementation period.
Impacts analysis
The impacts analysis will complement other components of the evaluation by examining program effectiveness in the areas of child well-being, safety, and permanency; adult recovery; and family functioning. It will include the 28 grantees that have proposed rigorous local evaluations, using either random assignment or a strong matched comparison group. For a comparison group to be considered strongly matched, the local evaluation must include baseline data on key characteristics, such as family functioning and parental substance abuse, with which to establish equivalence with those enrolled in RPG programs. As noted above, all grantees will provide data on the program groups as part of the outcomes study. Grantees involved in the impacts study will also collect data, at baseline and program exit, on comparison group members who are not enrolled in RPG projects.
The impacts analyses will be conducted for three groups of studies. First, we will pool the grantees’ projects that used well-implemented randomized controlled trials (RCTs) in their local evaluations. RCTs have excellent internal validity (the ability to determine whether the program caused the outcomes) because the program and control groups are initially equivalent, on average, on all observed and unobserved characteristics. Any observed differences in outcomes between the program and control groups of families can therefore be attributed to the program with a known degree of precision. Second, we will pool grantees with RCTs that have some issues (such as high attrition) and those with strong quasi-experimental designs (QEDs), in which program and comparison groups were matched on key factors, such as baseline history of substance abuse and family functioning. The internal validity of QEDs is weaker than that of RCTs because differences on unobservable characteristics cannot be ruled out. However, a design with well-matched program participants and comparison group members provides useful information on program effects. Third, we will pool all of the studies in the first two groups: well-implemented RCTs, RCTs with issues, and QEDs. Combining the QED results with the RCTs will increase the statistical power of the overall analysis, enabling us to detect smaller effects. Because of the serious consequences of child maltreatment, even relatively small effect sizes might be clinically meaningful.
Grantees and their local evaluators will collect baseline data for use in the cross-site evaluation. First, baseline data will serve to describe the characteristics of RPG program participants. We will present tables of frequencies and means for key participant characteristics, including demographic and family information for the three groups of impacts analyses: the grantees with well-implemented RCTs, the combined group of RCTs with issues and QEDs, and the group that combines well-implemented RCTs, RCTs with issues, and QEDs.
A key use of baseline data is to test for baseline equivalence in both the RCT and the RCT-QED samples. Though random assignment ensures that families participating in the program and those in comparison groups do not initially differ in any systematic way, chance differences might exist between groups. Establishing baseline equivalence for the QEDs is critical for determining whether the comparison group serves as a reasonable counterfactual, representing what would have happened to the program group had it not received treatment. To confirm that there were no systematic differences between the program and comparison groups at the study’s outset, we will statistically compare key characteristics between the groups. In addition, because the standardized instruments will be administered twice (once at program entry and again at program exit), we will also compare baseline measures of the outcomes between the two groups. In particular, to establish baseline equivalence, we will conduct t-tests for differences between the two groups, both overall and separately by grantee. In these comparisons, we will use the analytic sample, which includes respondents to both the baseline and follow-up instruments.
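As an illustration only, one such equivalence test could be computed as shown below. The baseline scores are hypothetical; in practice, the tests will be run for each key characteristic, overall and by grantee.

import numpy as np
from scipy import stats

# Hypothetical baseline scores (for example, a family functioning scale) in the analytic sample
program_group = np.array([2.8, 3.1, 2.5, 3.0, 2.9, 3.3])
comparison_group = np.array([2.7, 3.2, 2.6, 2.8, 3.0, 3.1])

# Two-sample t-test for a difference in baseline means between the groups
t_stat, p_value = stats.ttest_ind(program_group, comparison_group)
print(t_stat, p_value)  # a large p-value is consistent with baseline equivalence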
A key use of follow-up data is to estimate program impacts. We will use baseline data to improve the statistical precision of impact estimates and control for any remaining differences between groups. The average impact estimate will be a weighted average of the site-specific impacts, where the weight for each site is the inverse of the squared standard error of its impact estimate. As such, sites with more precise impact estimates (for example, sites with larger sample sizes or baseline variables that are highly correlated with the outcomes) will receive greater weight in the average impact estimate. We will compare the results from the sites with well-implemented RCT evaluations with those obtained from the RCT with issues and QED sample, noting that the former is most rigorous, whereas the latter should be considered suggestive or promising evidence of effectiveness.
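The weighting described above corresponds to standard inverse-variance weighting. The sketch below shows the calculation with hypothetical site-level estimates; the numbers are illustrative only.

import numpy as np

# Hypothetical site-specific impact estimates and their standard errors
impacts = np.array([0.15, 0.05, 0.20])
std_errors = np.array([0.10, 0.04, 0.12])

# Each site's weight is the inverse of its squared standard error,
# so more precisely estimated sites contribute more to the average
weights = 1.0 / std_errors**2
pooled_impact = np.sum(weights * impacts) / np.sum(weights)
pooled_se = np.sqrt(1.0 / np.sum(weights))
print(pooled_impact, pooled_se)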
To inform Congress about the performance and progress of the RPG sites, we will produce two reports that estimate and report on performance measures for the 35 sites. The reporting will include selected measures collected and calculated for the (1) partnerships analysis, including partnership goals and collaboration; (2) enrollment and services analysis, including information about program operations, enrollment, and participation; (3) outcomes analysis, including detailed descriptions of the characteristics and outcomes associated with participating families; and (4) impacts analysis. To reduce the burden on grantees and local evaluators, we have designed the performance measures to overlap completely with the measures used in the other evaluation components, so no additional data are needed.
B. Time schedule and publications
This ICR requests clearance to collect data for the cross-site evaluation for three years, beginning in March 2019. Once data collection is complete, reporting will continue through September 2022.
Three types of reports will summarize the progress and findings of the cross-site evaluation: annual reports, reports to Congress, and a final evaluation report (Table A.2). Each year, we will develop reports describing cross-site evaluation progress. Annual reports, starting in October 2018, will be written to be accessible to a broad audience of policymakers and practitioners. For the overall performance component, we will produce three reports to Congress beginning in September 2018. A final evaluation report will provide a comprehensive synthesis of all aspects of the study over the entire contract, including integration and interpretation of both qualitative and quantitative data.
Table A.2. Schedule for the RPG cross-site evaluation
Activity | Date
Data collection | March 2019–March 2022a
Reports to Congress | Three reports (every other year) beginning September 2018b
Annual reports | Annually beginning October 2018b
Ad-hoc reports or research briefs | As requested by Children’s Bureaub
Final evaluation report | September 2022
a Data collection will begin once OMB clearance is received; depending on this timing, the start date might be earlier or later than March 2019.
b Reports scheduled before OMB clearance will describe project activities or issues that do not include any of the data collection activities described in this ICR.
In addition to planned reports on the findings, RPG will provide opportunities for analyzing and disseminating additional information through special topics reports and research or issue briefs. Short research or policy briefs are an effective and efficient way of disseminating study information and findings. The cross-site evaluation team will produce up to two ad hoc reports or special topics briefs each year at the request of CB. Topics for these briefs will emerge as the evaluation progresses but could, for example, provide background on selected services offered by grantees, summarize program activities in tribal grantee sites, discuss impact or subgroup findings, or describe the grantees.
We are not requesting approval to omit the display of the expiration date for OMB approval.
No exceptions are necessary for this data collection.
3 Core services are the services defined by the grantee that make up its main RPG project. These include, at a minimum, all services funded by the grant, and might include in-kind services provided by partners.