Supporting Statement A Information Collection for the American Rescue Plan National Evaluation
(OMB Control Number: 3090-0332)
Background
The Office of Evaluation Sciences (OES) in the Office of Government-wide Policy (OGP) at the
U.S. General Services Administration (GSA) is proposing new data collection activities conducted for an American Rescue Plan (ARP) National Evaluation. The objective of this project is to provide a systematic look at the contributions of selected ARP-funded programs toward achieving equitable outcomes to inform program design and delivery across the Federal Government. The project includes a series of in-depth, cross-cutting evaluations as well as data analysis of selected ARP programs, especially those with shared outcomes, common approaches, or overlapping recipient communities; and targeted, program-specific analyses to fill critical evidence gaps.
This information collection request is for three mixed or multi-method evaluations under the American Rescue Plan National Evaluation Generic Clearance (OMB #: 3090-0332, expires 05/31/2027):
Integration of Funding to Increase Equitable Access to Behavioral Health Crisis Services (Behavioral Health study)
Local Innovations and Practices in the Equitable Implementation of ARP Programs to Reduce Homelessness (Homelessness study)
State Coordination Strategies to Equitably Serve Children Through the American Rescue Plan (State Coordination Strategies study)
Justification
Circumstances Making the Collection of Information Necessary
Given the prevalence of US households facing potential income shocks during the pandemic and the inability of many Americans to financially weather an emergency,1 the ARP was crafted to safeguard the economic security and wellbeing of households. Upon passage of the ARP, President Biden stated his expectation that ARP programs would be implemented in ways that align with the goals of Executive Order 13985, Advancing Racial Equity and Support for Underserved Communities Through the Federal Government. The charge to the agencies was further communicated in OMB M-21-20 and OMB M-21-24, and in early direction from the White House ARP Implementation Coordinator. OES’s ARP National Evaluation includes a portfolio of studies that supports this goal by providing a systematic look at the contributions of selected ARP-funded programs toward achieving equitable outcomes to inform program design and delivery across the Federal government. The subset of programs of interest is the list of 32 programs covered in the May 2022 White House “Advancing Equity through the American Rescue Plan” report, which represent nearly $900 billion, or 60 percent, of American Rescue Plan funds, excluding Economic Impact Payments (i.e., stimulus checks).
1 Pokora, B. (2023, December 1). Survey: How Many Americans Are Living Paycheck To Paycheck? Forbes Advisor. https://www.forbes.com/advisor/credit-cards/survey-living-paycheck-to-paycheck/
The ARP National Evaluation aims to learn how lessons from examination of ARP programs and interventions with shared outcomes, common approaches, or overlapping recipient communities may inform equitable program design and delivery across the federal government.
Purposes and Uses of the Information Collection
The ICRs being proposed under the ARP Generic Clearance cover three evaluations that are part of OES’s ARP National Evaluation, which aims to provide an integrated account of whether, how, and to what extent the implementation of a subset of ARP programs served to achieve their intended outcomes, particularly with respect to advancing equity. The three evaluations are:
Integration of Funding to Increase Equitable Access to Behavioral Health Crisis Services (Phase 2)
Local Innovations and Practices in the Equitable Implementation of ARP Programs to Reduce Homelessness (Phase 1)
State Coordination Strategies to Equitably Serve Children Through the American Rescue Plan (Phase 1)
Each of the evaluations will address specific research questions related to the area or the specific program it examines. Each of these evaluations will include qualitative and/or quantitative data collection to answer descriptive questions about program design, staffing, service provision, coordination, and other details as described in the sections that follow.
The COVID-19 pandemic produced and exacerbated multiple stressors, including sickness, grief, isolation, and economic instability, which contributed to an increase in the prevalence of mental illness and substance use (referred to by the term “behavioral health crisis”) and in the incidence of behavioral health crises. During the pandemic, appropriate behavioral health crisis services became particularly important because health care settings were overwhelmed and the use of in-person services was discouraged to prevent the spread of COVID-19. Certain behavioral health crisis-related services are commonly paid for through a combination of funds, including Medicaid, block grants from the Substance Abuse and Mental Health Services Administration (SAMHSA), and state and local general funds. The ARP provided these sources with temporary additional funding, which states and localities then used to support behavioral health crisis services. Together, this temporary influx of ARP funds provided through these three funding sources allowed states and localities to expand and support different aspects of their behavioral health crisis response continuum at a time of increased need, and to make their services available to multiple populations, including those with greater need and limited access.
This implementation evaluation uses case studies and includes information from key informant interviews (KIIs) with a wide range of individuals from across the crisis care continuum. It will examine how states and localities used funding provided through the ARP to expand and support the availability of behavioral health services along the crisis response continuum. Specifically, through this study we will (1) describe the ARP funding streams that states and localities used to build or expand behavioral health crisis response infrastructure and (2) describe the decision-making process and factors that guided how states and localities identified and prioritized populations and behavioral health crisis response needs.
In the first phase of this study (National Review), we will recruit and consult with a Community Advisory Board. In the second phase (Case Studies) of this study, we will conduct in-depth KIIs with 104 respondents from across the continuum of services that address behavioral health crises, including public health, state, and local government administrators and behavioral health consumer advocates. This PRA package covers the activities included in phase two. Phase one activities were included in the generic clearance that preceded this PRA package.
Table 1 shows the timeline of the Behavioral Health study activities by phase.
Table 1. Timeline of Behavioral Health study activities by phase (June 2024 through December 2025).
Phase 1 (National Review) activities: CAB Recruitment and Onboarding; National Review and Case Study Selection.
Phase 2 (Case Studies) activities: Key Informant Recruitment and Data Collection; KII Qualitative Analysis; CAB Meeting; Draft Report for CAB; CAB Review of Case Studies and Summary Report; Draft Report Revisions; Draft Report to GSA; Final Report to GSA.
[The original table shades the months during which each activity occurs; the month-by-month shading is not recoverable in this version of the document.]
Phase 1 activities were included in the generic clearance and precede the activities in this PRA package. Phase 2 activities are included in this PRA package.
Several ARP programs aimed to reduce housing instability for low-income households hit hard by the pandemic by (1) providing rental assistance or (2) developing new affordable housing opportunities to prevent homelessness. While these programs focused on housing instability broadly and did not necessarily focus exclusively on homelessness, some programs included dedicated funds for people experiencing homelessness and others recommended prioritizing services for households currently experiencing homelessness or at risk of experiencing homelessness. This study focuses on four programs that had a housing focus: Emergency Housing Vouchers (EHV), Emergency Rental Assistance (ERA), Low-Income Home Energy Assistance Program (LIHEAP), and Low-Income Home Water Assistance Program (LIHWAP). We also identified additional ARP programs that served people at risk of or experiencing homelessness but did not explicitly have a housing focus or did not provide direct assistance.
These programs also incorporated new guidance and/or waivers not found in pre-existing
programs, with the explicit goal of providing equitable access to, and allocation of, ARP funds to people from underserved communities.2
This study examines how local agencies and homelessness service organizations (the Continuum of Care or CoC) coordinated in designing and implementing ARP programs to reduce and prevent homelessness, particularly among people who are at increased risk of homelessness. In the first phase of the study (Survey), we will administer a survey to CoC staff and select 8-10 communities for deeper analysis in the second phase of the study. The survey will ask about the degree to which CoCs were involved in ARP housing programs and the strategies used to serve populations with a disproportionate risk of experiencing homelessness. These data are fundamental to understanding how communities have benefited from ARP funding and the innovative practices they adopted. Further, this survey enables subsequent research by informing selection of sites for in-depth interviews. In the second phase, we will conduct site-level analysis in 8-10 communities by conducting interviews with local agency staff, analyzing national and local public datasets, and reviewing administrative documents for local ARP programs. This PRA package covers the activities included in phase one. The information collection request for phase two will be submitted in a future PRA package in 2024. Table 2 displays the timeline of the Homelessness study activities by phase.
Table 2. Timeline of Homelessness study activities by phase (June 2024 through December 2025).
Phase 1 (Survey) activities: Recruit and Convene National Expert Panel; Prepare Interview Guides; PRA Review & Approval - Survey; Conduct Survey and Analysis.
Phase 2 (Interviews) activities: PRA Review & Approval - Interview Guides; Site Selection for Interviews; Local Environmental Scan; Interviews and Local Data Analysis; Interview Coding and Findings; Draft Report; Final Report.
[The original table shades the months during which each activity occurs; the month-by-month shading is not recoverable in this version of the document.]
Phase 1 activities are included in this PRA package.
Phase 2 activities will be included in a future PRA package.
The COVID-19 pandemic exacerbated many challenges that children in underserved communities were already facing, simultaneously increasing both the need to obtain services and the barriers to accessing them, due to the closure of many venues that provided services. School and child care program closures impacted children’s academic growth, social-emotional development, and access to food, as well as family member caregivers’ ability to work. ARP-funded programs targeted a variety of outcomes integral to the welfare of children and families, including economic security; food and nutritional security; access to health services; affordable quality child care; and safe, effective, and equitable education. The multiple agencies implementing ARP programs to improve equitable outcomes for children in low-income families have both fragmented and overlapping areas of authority with respect to service provision. Without coordination among agencies, service recipients must navigate for themselves the multiple unique processes for eligibility determination, application, and obtaining services. In such cases, applying for and maintaining participation in multiple benefit programs can represent a significant burden for families with low incomes or create a barrier to their ability to participate in and benefit from all programs for which they are eligible.
2 The White House report, “Advancing Equity through the American Rescue Plan,” identifies underserved communities as communities where individuals have been denied “consistent and systematic fair, just, and impartial treatment.” https://www.whitehouse.gov/wp-content/uploads/2022/05/ADVANCING-EQUITY-THROUGH-THE-AMERICAN-RESCUE-PLAN.pdf
This study will describe and examine the approaches that states used to coordinate across ARP programs to equitably support children. The study will consist of two phases. In the first phase (Landscape Review), we will conduct consultations with national experts, key informant interviews with state leaders, and a limited document review. In the second phase (Case Studies), we will conduct case studies in selected states. These case studies will include key informant interviews (KIIs) and a social network analysis. This PRA package covers the activities included in phase one (Landscape Review). The information collection request for the case studies will be submitted in a future PRA package in 2024.
Table 3 displays the timeline of the evaluation activities by phase. Through this study we will describe: (1) the extent to which, and how, states coordinated implementation of ARP programs to best support children in families with low incomes, (2) state strategies to foster coordination with localities (and tribal governments when relevant), (3) considerations that influenced the coordination and collaboration strategies that were employed, (4) the existence of common characteristics, strategies, or other key factors found among states and (5) coordination approaches that are sustainable beyond ARP implementation.
Table 3. Timeline of State Coordination Strategies study activities by phase (June 2024 through December 2025).
Phase 1 (Landscape Review) activities: Instrument Development; PRA Review & Approval; Limited Document Review; State Leader Recruitment and Interviews; Data Analysis; Review Findings Brief Development; Draft Landscape Review Findings Brief; Final Landscape Review Findings Brief.
Phase 2 (Case Studies) activities: Instrument Development; PRA Review & Approval; Case Study KII Recruitment and Interviews; Detailed Document Review and Quantitative Data Analysis (Case Study); Social Network; Social Network Analysis; Qualitative Data Analysis; Report Development; Draft Report; Final Report.
[The original table shades the months during which each activity occurs; the month-by-month shading is not recoverable in this version of the document.]
Phase 1 activities are included in this PRA package.
Phase 2 activities will be included in a future PRA package.
Use of Information Technology
Wherever possible and appropriate, information technology will be used to capture information and reduce burden relative to alternative methods of data collection.
For the Behavioral Health study and State Coordination Strategies study, we will use information technology to make KIIs less burdensome, relying on video calls (e.g., Zoom) when helpful to minimize participant travel time and facilitate participation.
For the Homelessness study, the CoC survey will be programmed using Qualtrics or another FedRAMP-approved platform. To reduce participant burden, individualized survey links will be generated for each CoC with a pre-populated CoC name. Multiple people from the same organization can use the same link to complete the survey.
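As an illustration only, the sketch below shows one generic way individualized links with a pre-populated CoC name could be generated from a contact list. The base URL, parameter names, and example values are hypothetical placeholders and do not reflect the actual survey configuration or platform settings.

```python
from urllib.parse import urlencode

# Hypothetical base URL; the production survey would be hosted on Qualtrics
# or another FedRAMP-approved platform.
BASE_URL = "https://survey.example.gov/arp-homelessness"

def build_coc_link(coc_id: str, coc_name: str) -> str:
    """Return one shared, individualized link for a CoC with its name
    pre-populated, so multiple staff can contribute to a single response."""
    query = urlencode({"coc_id": coc_id, "coc_name": coc_name})
    return f"{BASE_URL}?{query}"

# Illustrative example with placeholder values.
print(build_coc_link("XX-500", "Example Continuum of Care"))
```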
Duplication of Efforts
This information collection does not duplicate any other Federal effort. The ARP National Evaluation team conducted a landscape analysis of different research studies and evaluation
efforts focused on a total of 32 ARP programs.
Impact on Small Businesses
These three information collections will not have any impact on small businesses as no information will be requested from small businesses.
Less Frequent Collection
The project team will work to streamline participant interaction with the study, and activities will be coordinated to minimize duplicative requests. Without this first-of-its-kind evaluation, OES will not be able to provide valuable insight into federal program design and delivery.
Special Circumstances
These surveys will be consistent with all the guidelines in 5 CFR 1320. There are no such special circumstances that would cause this information collection to be conducted in an unusual or intrusive manner. All participation will be voluntary. Should the Agency need to deviate from the requirements outlined in 5 CFR 1320, individual justification will be provided to OMB on a case-by-case basis.
Public Comments/Outside Consultation
As this ICR is being submitted under a generic clearance, there is only a limited 30-day public comment period for the public to submit written comments on the information collection requirements. The 30-day notice was published in the Federal Register at 89 FR 70650 on August 30, 2024.
For all three studies, the project team developed the evaluation study design and instruments in partnership with various subject matter and methodological experts (e.g., in consultation with federal agency staff, national organizations representing state and local government, and other national experts knowledgeable about the implementation of ARP programs). As covered in the generic clearance, the evaluation designs have been and will continue to be informed by expert consultations and advisory groups. The evaluation team also conducts annual convenings with national experts and may receive input from participants during these convenings.
Payments/Gifts to Respondents
For the State Coordination Strategies and Behavioral Health studies, eligible key informant interviewees (e.g., crisis care providers, behavioral health consumer advocates, parents/guardians) will be offered a $100 electronic gift card as a token of appreciation to ensure active participation and acknowledge individuals for their time and effort. Public employees such as state or local administrators will not be eligible to receive the gift cards. The amount of the gift card takes into consideration prevailing market rates and is in line with the expectations of potential participants. This will help us attract a diverse and representative group, thereby enhancing the quality and validity of our data collection efforts. We do not believe that this amount is coercive; instead, we recognize that parents and guardians, particularly those in underserved groups, may have lower incomes and fewer sources of material and social capital, and as such, taking time away from work and home may require more resources for them than for those with more means.
For the Homelessness study, no payments and/or gifts will be provided to respondents because most of the survey respondents and key informants are expected to either be public employees or organizations that receive federal funding and are not eligible to receive payments.
Privacy & Confidentiality
Respondents will be informed of all planned uses of the data, that their participation is voluntary, and that their information will be kept private to the extent permitted by law. No assurance of confidentiality will be provided to respondents. The study team will not disclose any individual-level survey or interview information to persons outside the study team. Information provided by or about participants throughout the course of the study may contain participant-level personally identifiable information (PII).
For the Behavioral Health study and State Coordination Strategies study, we will provide key informant interviewees with an informed consent process (see Appendix A. Behavioral Health Outreach and Consent Form; Appendix I. State Coordination Strategies Key Informant Interview Consent Form) by sharing a study description and disclosure during the recruitment outreach, and we will begin each interview with a verbal disclosure and consent process. The consent and disclosure process will also explain to key informants how we will secure their contact information and assure them that it will be used only by the study team. For the Homelessness study, the study team will include information explaining the privacy and consent particulars in the email that contains the survey link (see Appendix E. Homelessness Study Survey Consent). Upon clicking the survey link, respondents will first see a screen that displays information on the study and explains how the survey responses will be disclosed. Only participants who consent to participate in the survey will be able to view the survey.
Sensitive Questions
For the Behavioral Health study and State Coordination Strategies study, it will be necessary to ask questions about sensitive topics, including benefit receipt and demographic information, to evaluate the effectiveness of recovery programs aimed at vulnerable populations, particularly with respect to equity. All data collection for these studies is voluntary, and respondents will have the ability to skip any questions (during interviews or surveys) that they do not feel comfortable answering. Respondents will be informed prior to any information collection that their identities will be kept private to the extent permitted by law, that results will only be reported in the aggregate, that their responses will not affect any services or benefits they or their family members receive, and that they do not have to answer any questions that make them uncomfortable.
For the Homelessness study, there are no sensitive questions included in this information collection effort.
Burden Estimates (Hours & Costs)
Table 8. Summary of Annual Total Burden for American Rescue Plan National Evaluation.
Table Number: Name | Total Burden Hours | Total Burden Costs
Table 5. Annual Burden for Integration of Funding to Increase Equitable Access to Behavioral Health Crisis Services | 129.50 | $8,312.43
Table 6. Annual Burden for Local Innovations and Practices in the Equitable Implementation of ARP Programs to Reduce Homelessness Survey | 197.50 | $14,626.85
Table 7. Annual Burden for State Coordination Strategies to Equitably Serve Children Through the American Rescue Plan | 105 | $10,224.90
Total - Annual | 432 | $33,164.18
We used the Bureau of Labor Statistics (BLS) Occupational Employment and Wage Statistics, May 2023 (https://www.bls.gov/oes/current/oes_stru.htm) to estimate the burden cost (including 100 percent fringe benefits) for the information collections. Table 4 displays the median hourly wages used for the three information collections.
Occupational Title | Occupational Code | Median Hourly Wage ($/hour) | Fringe Benefits & Overhead (100%) ($/hour) | Adjusted Hourly Wage ($/hour)
Behavioral Health Study | | | |
General Public (Behavioral Health Consumer Advocates) | 00-0000 | $23.11 | $23.11 | $46.22
Medical & Health Services Managers (Public Health, State, and Local Government Administrators) | 11-9111 | $53.21 | $53.21 | $106.42
Homelessness Study | | | |
Social and Community Service Managers (Continuum of Care Staff) | 11-9151 | $37.03 | $37.03 | $74.06
State Coordination Strategies Study | | | |
General and Operations Managers (State and Local Government Administrators) | 11-1021 | $48.69 | $48.69 | $97.38
General Public (Parents/Guardians) | 00-0000 | $23.11 | $23.11 | $46.22
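As a worked check of the adjustment described above, the brief sketch below doubles each BLS May 2023 median hourly wage from Table 4 to add the 100 percent fringe benefits and overhead rate. It simply reproduces the table arithmetic and is not part of the data collection.

```python
# BLS May 2023 median hourly wages from Table 4, in dollars per hour.
median_wages = {
    "General Public (00-0000)": 23.11,
    "Medical & Health Services Managers (11-9111)": 53.21,
    "Social and Community Service Managers (11-9151)": 37.03,
    "General and Operations Managers (11-1021)": 48.69,
}

FRINGE_AND_OVERHEAD = 1.00  # 100 percent of the median hourly wage

for title, wage in median_wages.items():
    adjusted = wage * (1 + FRINGE_AND_OVERHEAD)  # adjusted hourly wage
    print(f"{title}: ${adjusted:.2f}/hour")
# Matches the Adjusted Hourly Wage column: $46.22, $106.42, $74.06, $97.38.
```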
The following sections of this document contain estimates of burden hours imposed by the associated information collection requirements.
Integration of Funding to Increase Equitable Access to Behavioral Health Crisis Services (Phase 2).
This study is an implementation evaluation that includes conducting key informant interviews (KIIs).
The first phase of this study (National Review) involves recruitment and consultation with a Community Advisory Board. For the second phase (Case Studies) of this study, we will conduct in-depth KIIs with 104 respondents from across the continuum of services that address behavioral health crises, including public health, state, and local government administrators and behavioral health consumer advocates (see Table 1).
Recruitment. We plan to identify most of our administrators through online research or through the limited document review we will conduct; however, recruiting behavioral health consumer advocates will require a different approach. Our first step will be to conduct online research to identify advocacy organizations at the national, state, and local levels that are focused on behavioral and mental health. We will share this list with Community Advisory Board (CAB) members to vet and edit. If CAB members have connections to any individuals in these organizations, we will request an introduction. Otherwise, we will conduct email outreach (Appendix A. Behavioral Health Key Informant Interview Recruitment Outreach) to individuals in the organization based on their profiles or areas of advocacy. Based on past experience, we know that in many cases advocates have lived experience pertaining to the area for which they advocate (as a consumer, caregiver, friend, or provider). Therefore, in some cases, these individuals will be able to speak to the issues as both an advocate and as someone who has interacted with or used behavioral health crisis services. At the conclusion of our interviews, we will ask advocates if they would be willing to let individuals who have used behavioral health crisis services know about our study or if there are particular organizations that serve this population that might be willing to advertise our study to recruit individuals for interviews.
Wherever possible, we will use referrals and warm handoffs—through CAB members, federal contacts, or key informants we have already interviewed—to identify and recruit additional individuals for KIIs. If a selected individual is not available to participate, we will ask for recommendations of others to invite based on their area of expertise. In the recruitment and interview materials, we will remind all participants that we are not evaluating them or their organization, nor are we conducting a compliance review; rather, we are seeking to understand their experiences related to the provision of behavioral health crisis care and their organizations’ use of ARP funds. We will be mindful when working with the CAB to recruit interviewees to intentionally select a wide range of voices and will stratify our recruitment so that we capture the experiences of those whose facilities or geographic areas received ARP funding. We will make sure we recruit individuals able to speak about: (1) services provided to more diverse populations in larger cities and urban settings and (2) services provided in rural areas or directed toward specific demographics, such as older adults or individuals living in institutional or group settings.
To help prospective key informants understand the aims of the study, we will share an information sheet in our outreach that includes an overview of the study, confidentiality and
consent procedures, what to expect in the interview, and any available tokens of appreciation (for nongovernmental key informants). We will clarify that interview findings will be shared with the CAB, GSA, and federal partners but that confidentiality will be preserved wherever possible. (For state and local administrators, it may not always be possible to keep their responses entirely confidential, as they provide specific information that cannot be obtained elsewhere.) We will make three attempts via email, at 1-week intervals, to contact a key informant before moving on. We will log outreach, contact, response, and scheduling of KIIs in a database to track our response rate and identify gaps in our data collection (e.g., lack of response by behavioral health providers). Researchers on each case study team will lead the process of outreach, recruitment, and interview scheduling.
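A minimal sketch of the kind of outreach and response-rate log described above is shown below. The field names and structure are illustrative assumptions, not the study's actual tracking database.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class OutreachRecord:
    """One row per prospective key informant, tracking up to three email
    attempts at one-week intervals and the final disposition."""
    informant_id: str
    role: str                                   # e.g., advocate or administrator
    email_attempts: List[date] = field(default_factory=list)
    responded: bool = False
    interview_date: Optional[date] = None

def response_rate(records: List[OutreachRecord]) -> float:
    """Share of contacted informants who responded, used to spot gaps in
    data collection (e.g., lack of response by behavioral health providers)."""
    contacted = [r for r in records if r.email_attempts]
    if not contacted:
        return 0.0
    return sum(r.responded for r in contacted) / len(contacted)
```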
Screening. We will screen key informants to ensure that they have been involved in behavioral health crisis response in the selected state area for part of the period since ARP funding became available (2021–2023). We aim to include a large number of interviews from a range of perspectives and roles. We will use a 10-minute web-based screening form to assess eligibility, see Appendix B. Behavioral Health Key Informant Interview Screener. We estimate screening 150 individuals to reach the goal of 104 KIIs. The total estimated burden for KII screening is 25.5 hours at a cost of $1,639.35. Table 5 displays the burden for KII screening.
Conduct Key Informant Interviews. Using the profiles we create for each study geographic area, we will tailor KII questions to the specific context of the geographic area. In the early CAB meetings, we will share these profiles with members and solicit feedback that will help us to refine each profile. The profiles will cite the source list of documents used, primary state and local contacts, and information on gaps or other specific issues that we want to gather in KIIs to answer the research questions.
We will conduct the interviews via video conferencing software to make it feasible to include individuals in multiple geographic areas and to allow for recording and transcription of each interview for ease of creating a summary, see Appendix C. Behavioral Health Key Informant Interview Instruments. We will provide a complete informed consent process by sharing a study description and disclosure during the recruitment outreach and will also begin each interview with a verbal disclosure and consent process, see Appendix A. Behavioral Health Outreach and Consent Form. The consent and disclosure process will also explain to key informants how we will secure their contact information and assure them that it will be used only by the study team.
We estimate conducting KIIs with about 73 behavioral health consumer advocates and 31 public health, state, and local government administrators. The total estimated burden per respondent for outreach emails, screening, and the interview is 1.17 hours, at an adjusted hourly rate of $46.22 for behavioral health consumer advocates and $106.42 for public health, state, and local government administrators. The total estimated burden per nonrespondent for screening is 0.17 hours. For all interview respondents and non-respondents, the total estimated burden is 129.50 hours at a cost of $8,312.43. Table 5 displays the burden of conducting the KIIs.
Labor Category | Number of Respondents | Hourly Labor Costs (Hourly rate + 100% Fringe Benefits) | Burden Hours | Total Burden Cost (per Respondent) | Total Burden Costs (All Respondents)
Key Informant Interview Outreach Emails and Screening | | | | |
Behavioral Health Consumer Advocates | 105 | $46.22 | 0.17 | $7.86 | $825.30
Public Health, State, and Local Government Administrators | 45 | $106.42 | 0.17 | $18.09 | $814.05
Key Informant Interview Scheduling and Interviews | | | | |
Behavioral Health Consumer Advocates | 73 | $46.22 | 1 | $46.22 | $3,374.06
Public Health, State, and Local Government Administrators | 31 | $106.42 | 1 | $106.42 | $3,299.02
Total - Annual | | | 129.50 | | $8,312.43
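As a worked check of the figures above, the sketch below recomputes the Table 5 totals from its row values (respondents, adjusted hourly rate, and burden hours per respondent), assuming per-respondent costs are rounded to the cent as shown in the table.

```python
# (label, respondents, adjusted hourly rate, burden hours per respondent)
rows = [
    ("Advocates, outreach and screening",        105, 46.22, 0.17),
    ("Administrators, outreach and screening",    45, 106.42, 0.17),
    ("Advocates, scheduling and interview",       73, 46.22, 1.0),
    ("Administrators, scheduling and interview",  31, 106.42, 1.0),
]

total_hours = sum(n * hours for _, n, _, hours in rows)
total_cost = sum(n * round(rate * hours, 2) for _, n, rate, hours in rows)

print(f"Total annual burden: {total_hours:.2f} hours, ${total_cost:,.2f}")
# Expected output: Total annual burden: 129.50 hours, $8,312.43
```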
Local Innovations and Practices in the Equitable Implementation of ARP Programs to Reduce Homelessness (Phase 1).
To respond to the research questions listed in section 2 and inform selection of 8-10 sites for follow-up interviews in phase two of the study (Interviews), the ARP Homelessness study will administer an online survey in the first phase of the study to an estimated 395 CoC Collaborative Applicants. The total number of CoCs in 2023, according to HUD, was 388. However, this number may change by the time the survey is administered. Since the survey aims to cover all CoCs, we are providing burden estimates for an estimated 395 CoCs to allow for any increase in the total number of CoCs by the time the survey is administered. The goal of this survey is to gauge the level of each CoC’s involvement in local ARP programs serving those experiencing or at risk of homelessness. It will also provide information about the degree to which CoCs were involved in ARP housing programs and the strategies used to serve populations with a disproportionate risk of experiencing homelessness.
Survey Outreach. We will collaborate with the U.S. Department of Housing and Urban Development’s (HUD) Office of Special Needs Assistance Programs (SNAPS) and a networking group of CoC leaders to publicize the survey. We will attend meetings of the CoC networking groups to discuss the upcoming survey and ask for their participation. HUD’s SNAPS program will send an introductory email to all CoC Collaborative Applicants to notify them of the upcoming survey. This email will describe the importance of answering this survey, the types of questions that will be asked on the survey, and the approximate time needed to complete it. This prior communication will help CoC staff prepare and identify the individuals who are best positioned to answer the survey, see Appendix D. Homelessness Study Survey Outreach Materials.
Web Survey. The study team will send an individualized survey link, an attached hardcopy of the survey instrument (see Appendix E. Homelessness Study Survey Instrument), and an information sheet to all CoCs. We will obtain contact information for the CoCs from HUD Exchange or directly from the SNAPS office. We will send an individualized link to each CoC that allows multiple staff to contribute toward completion of the survey. Only one record per CoC (containing the shared answers) will be available. To minimize burden, the survey will only ask for information that is not reported in other secondary resources and will primarily include close-ended responses.
We will send two follow-up emails to non-respondents at one-week intervals after the launch of the survey. To improve survey response rates, trained interviewers will conduct telephone follow-ups with CoCs that have not responded to or completed the survey within one week after the second follow-up email. Phone interviewers will offer to complete the survey over the phone or share a link to the web survey. We estimate that we will conduct telephone follow-ups with 30 percent of the sample. We will call each non-respondent no more than two times. The second phone call will be made one week following the first attempt.
One staff person at each of the Continuum of Care agencies will be requested to complete the web survey. Respondents may take anywhere from 10 to 45 minutes, depending on the extent of the CoC’s ARP involvement. This estimate includes time to review emails and attachments and to complete the survey. Respondents are not expected to furnish any data that requires additional time for searching or compiling information to complete the survey. The average estimated burden per survey respondent is 0.5 hours at a cost of $37.03 per CoC staff member. For all survey respondents, the total estimated burden is 197.50 hours at a cost of $14,626.85. Table 6 displays the burden of conducting the survey.
Labor Category | Number of Respondents | Hourly Labor Costs (Hourly rate + 100% Fringe Benefits) | Burden Hours | Total Burden Cost (per Respondent) | Total Burden Costs (All Respondents)
Continuum of Care Staff | 395 | $74.06 | 0.5 | $37.03 | $14,626.85
State Coordination Strategies to Equitably Serve Children Through the American Rescue Plan (Phase 1).
This study is a policy implementation evaluation that includes conducting key informant interviews (KIIs).
For phase one of the study (Landscape Review), we will conduct in-depth KIIs with 70 respondents from state government agencies that administer ARP-funded programs. The study is focused on seven specific ARP-funded programs. For each of these seven ARP-funded programs, we will conduct 10 KIIs with state informants (70 KIIs in total for phase one of the study). For the second phase of the study (Case Studies), we will conduct additional KIIs (see Table 3). The information collection request for phase two will be submitted in a future PRA package in 2024.
Recruitment. To identify and recruit state government administrators for the KIIs, we will solicit feedback from national experts and conduct an internet-based search of state governments to identify leaders of education, health, and social service agencies and compile this information in a spreadsheet, by state, see Appendix F. State Coordination Strategies Key Informant Interview Outreach. Wherever possible, we will use referrals and warm handoffs—through CAB members, federal contacts, or key informants we have already interviewed—to identify and recruit additional individuals for KIIs. If a selected individual is not available to participate, we will ask for recommendations of others to invite based on their area of expertise. In the recruitment and interview materials, we will remind all participants that we are not evaluating them or their state government agency, nor are we conducting a compliance review; rather, we are seeking to understand their experiences related to the coordination of ARP-funded programs.
Once we have identified potential respondents, we will email them with an invitation to participate in the study. To help prospective key informants understand the aims of the study, we will share an information sheet in our outreach that includes an overview of the study, confidentiality and consent procedures, what to expect in the interview, and any available incentives (for nongovernmental key informants). This time spent preparing for the KII is included in the burden estimate. We will clarify that interview findings will be shared with the CAB, GSA, and federal partners but that confidentiality will be preserved wherever possible. (For state and local government administrators, it may not always be possible to keep their responses entirely confidential, as they provide specific information that cannot be obtained elsewhere.) We will make three attempts via email, at 1-week intervals, to contact a key informant before moving on. We will log outreach, contact, response, and scheduling of KIIs in a database to track our response rate and identify gaps in our data collection (e.g., lack of response by state). Researchers will lead the process of outreach, recruitment, and interview scheduling.
Conduct Key Informant Interviews. We will conduct the KIIs via video conferencing software to make it feasible to include individuals in multiple geographic areas and to allow for recording and transcription of each interview for ease of creating a summary, see Appendix G. State Coordination Strategies Key Informant Interview Instrument. We will provide a complete informed consent process by sharing a study description and disclosure during the recruitment outreach and will also begin each interview with a verbal disclosure and consent process, see Appendix H. State Coordination Strategies Key Informant Interview Consent Form. The consent and disclosure process will also explain to key informants how we will secure their contact information and assure them that it will be used only by the study team.
We estimate conducting KIIs with 70 state government administrators. The total estimated burden per respondent is 1.5 hours, at a cost of $146.07 per state government administrator (based on an adjusted hourly rate of $97.38). For all interview respondents, the total estimated burden is 105 hours at a cost of $10,224.90. Table 7 displays the burden for conducting the KIIs.
Labor Category | Number of Respondents | Hourly Labor Costs (Hourly rate + 100% Fringe Benefits) | Burden Hours | Total Burden Cost (per Respondent) | Total Burden Costs (All Respondents)
State Government Administrators | 70 | $97.38 | 1.5 | $146.07 | $10,224.90
Capital Costs
There are no anticipated capital costs associated with these information collections.
Annualized Cost to Federal Government
As noted in the previously approved American Rescue Plan National Evaluation Generic Clearance ICR (OMB #: 3090-0332, expires 05/31/2027), the total cost to the federal government for the cross-cutting evaluations that the data collection activities under this ICR will support will be about $2,618,100 over two years. This estimate includes all work on research design, data collection, and analysis.
Changes to Burden
The burden for these three information collections is included in the previously approved American Rescue Plan National Evaluation Generic Clearance ICR (OMB #: 3090-0332, expires 05/31/2027).
Pending OMB approval, the anticipated schedule for the conduct of the data collection, analysis, and preparation of the reports for the three evaluations is shown in Table 9. Documentation of the findings of these three evaluations will be shared with relevant agencies for awareness and technical review prior to publication.
Data Collection, Analysis, and Reporting Activities | Date to Start | Date to Complete
Behavioral Health Study | |
KII recruitment and data collection | September 2024 | March 2025
Data analysis | November 2024 | June 2025
Report (Draft & Final) | October 2025 | December 2025
Homelessness Study | |
Field survey | October 2024 | December 2024
Survey analysis | November 2024 | January 2025
Report (Draft & Final) | September 2025 | December 2025
State Coordination Strategies Study | |
KII recruitment and data collection | September 2024 | January 2025
Data analysis | October 2024 | March 2025
Report (Draft & Final) | October 2025 | December 2025
Expiration Date
The expiration date and OMB control number will appear on the first page of the instruments (top-right corner).
The Office of Evaluation Sciences (OES) in the Office of Government-wide Policy (OGP) at the
U.S. General Services Administration (GSA) is proposing new data collection activities conducted for an American Rescue Plan (ARP) National Evaluation. The objective of this project is to provide a systematic look at the contributions of selected ARP-funded programs toward achieving equitable outcomes to inform program design and delivery across the Federal Government. The project includes a series of in-depth, cross-cutting evaluations as well as data analysis of selected ARP programs, especially those with shared outcomes, common approaches, or overlapping recipient communities; and targeted, program-specific analyses to fill critical evidence gaps.
The primary purpose of this information collection is to generate evidence from a systematic exploration of a selected subset of ARP programs, to provide an integrated account of whether, how, and to what extent their implementation served to achieve their intended outcomes, particularly with respect to advancing equity; public sharing of this data is limited to the context described in Supporting Statement A.
Of the three evaluations being submitted under the American Rescue Plan National Evaluation Generic Clearance (OMB #: 3090-0332, expires 05/31/2027), the only evaluation that includes statistical methods is the Local Innovations and Practices in the Equitable Implementation of ARP Programs to Reduce Homelessness evaluation. The evaluation involves conducting a web-based survey.
Respondent Universe and Sampling Methods
In 2023, there were 388 Continuum of Care (CoC) lead agencies, according to HUD. The respondent universe for this study of the level of involvement of each CoC in local ARP housing programs will consist of all 388 CoC agencies. This number may change by the time the survey is administered. Since the survey aims to cover all CoCs, we are providing burden estimates for an estimated 395 CoCs to allow for any increase in the total number of CoCs by the time the survey is administered.
The web survey will be distributed to all CoCs. No sampling methods will be used in the administration of the survey, as it will be distributed to all agencies that comprise the respondent universe of CoCs as of 2023 (currently 388). The survey will gather information about the involvement of each CoC in local ARP housing programs, including the degree to which CoCs were involved in ARP housing programs and the strategies used to serve populations with a disproportionate risk of experiencing homelessness. The survey will also be used to inform site selection. The study design aims to achieve a minimum response rate of 50 percent.
A sampling frame for the survey will not be utilized, as the survey will be distributed to all CoCs (currently 388).
B1.2 Sample Design and Sample Size
The survey will be distributed to the universe of CoCs and will not require the use of a sample.
Procedures for Collection of Information
We will include all CoC collaborative applicants in our sample (n=388). For each respondent, we will send an individualized survey link, an attached hardcopy of the survey instrument (see Appendix E. Homelessness Study Survey Consent and Instrument), and an information sheet.
We will get contact information (names, email addresses, phone numbers) for the CoCs from HUD Exchange or directly from the SNAPS office. We will send an individualized link to each CoC that can allow multiple staff to contribute toward completion of the survey. Only one record per CoC (containing the shared answers) will be available. To minimize burden, the survey will only ask for information that is not reported in other secondary resources and will primarily include close-ended responses.
We will send two follow-up emails to non-respondents at one-week intervals after the launch of the survey. To improve survey response rates, trained interviewers will conduct telephone follow-ups with CoCs that have not responded to or completed the survey within one week after the second follow-up email. Phone interviewers will offer to complete the survey over the phone or share a link to the web survey. We estimate that we will conduct telephone follow-ups with 30 percent of the sample. We will call each non-respondent no more than two times. The second phone call will be made one week following the first attempt.
We will program the survey using Qualtrics or another FedRAMP-approved platform. Surveys will take approximately 10 to 45 minutes to complete, including reviewing emails and attachments. The estimated average time for each respondent to complete the survey is expected to be no longer than 30 minutes, see Appendix E. Homelessness Study Survey Consent and Instrument.
Methods to Maximize Response Rates and Deal with Nonresponse
We will collaborate with the U.S. Department of Housing and Urban Development’s (HUD) Office of Special Needs Assistance Programs (SNAPS) and a networking group of CoC leaders to publicize the survey. We will attend meetings of the CoC networking groups to discuss the upcoming survey and ask for their participation.
HUD’s SNAPS program will send an introductory email to all CoC Collaborative Applicants to notify them of the upcoming survey. This email will describe the importance of answering this survey, the types of questions that will be asked on the survey, and the approximate time needed to complete it. This prior communication will help CoC staff prepare and identify appropriate individuals who are best positioned to answer the survey. We will send two follow-up emails to
all non-respondents at 7 and 14 days following the launch of the survey. One week after sending the second follow-up email, we will begin phone follow-ups with an estimated 30 percent of the sample. We will call each non-respondent no more than two times; the second phone call will be made one week following the first attempt. Phone interviewers will offer to complete the survey over the phone or share a link to the web survey. All of the aforementioned outreach materials can be found in Appendix D. Homelessness Study Survey Outreach Materials.
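As an illustration of the follow-up schedule just described, the sketch below lays out the contact dates relative to a hypothetical launch date; the actual dates will depend on when the survey fields.

```python
from datetime import date, timedelta

def follow_up_schedule(launch: date) -> dict:
    """Nonresponse follow-up schedule: email reminders at 7 and 14 days
    after launch, then up to two phone attempts one week apart, beginning
    one week after the second email."""
    return {
        "email follow-up 1": launch + timedelta(days=7),
        "email follow-up 2": launch + timedelta(days=14),
        "phone attempt 1": launch + timedelta(days=21),
        "phone attempt 2": launch + timedelta(days=28),
    }

# Example with a hypothetical launch date.
for step, when in follow_up_schedule(date(2024, 10, 1)).items():
    print(step, when.isoformat())
```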
Test Procedures for Methods to be Undertaken
Drafts of the survey were reviewed by staff from HUD, GSA, Abt Global, American Institutes for Research (AIR), and Decision Information Resources (DIR) to ensure that the instruments are clear, flow well, and are as concise as possible.
Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
The individuals listed below contributed to the design of the study. AIR will administer the survey in partnership with Abt Global, with Dr. Ron McCowan as the Survey Lead, in collaboration with the following individuals:
Individuals Consulted
Name | Role in Study | Telephone Number
Dr. Keely Stater, Abt Global | Quantitative Technical Lead | (301) 347-5167
Dr. Christina LiCalsi, AIR | Overall Project Director | (312) 288-7600
Dr. Larry Buron, Abt Global | Project Quality Reviewer | (301) 634-1735
Galen Savidge-Wilkins | Advisor |
Dr. Danielle Berman | Advisor |
Inquiries regarding the study’s planned analysis should be directed to:
Dr. Naganika Sanga | Abt Global, Project Director | (301) 347-5027
Lizzie Martin | GSA, Technical Lead and Program Manager |