Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Phase II Evaluation Activities for Implementing a Next Generation Evaluation Agenda for the Chafee Foster Care Program for Successful Transition to Adulthood: A Study of the Use of the Family Unification Program (FUP) for Youth who have Experienced Foster Care
OMB Information Collection Request
0970-0544
Supporting Statement
Part B
December 2019
Submitted By:
Office of Planning, Research, and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer: Maria Woolverton
The objective of this study is to provide ACF with descriptive information on the implementation of FUP for youth. This work will contribute to an evidence base for interventions with potential to help youth currently and formerly in foster care achieve self-sufficiency.
As noted in Supporting Statement A (SSA), our core study questions include:
How is the partnership between the public housing authority (PHA), the public child welfare agency (PCWA), and the Continuum of Care (CoC) structured?
Which youth are targeted by the PCWA and CoC for FUP?
How are the CoCs and PCWAs identifying eligible youth?
How are partners prioritizing youth for referrals?
How many youth are served with FUP vouchers?
How do communities determine how many youth and families to serve with FUP?
What share of youth who receive FUP vouchers sign a lease and maintain their housing?
What are the barriers and facilitators to a youth signing a lease and to maintaining their housing?
To what extent are parenting youth accessing FUP?
How do the needs and successes of parenting youth in the program differ from those of childless youth?
What types of services are provided along with the FUP housing subsidy?
Which agency provides these services?
What are the nature, frequency, and duration of the services?
To what extent do youth participate in the Family Self-Sufficiency (FSS) program?
What does the program offer youth?
What does it require of youth?
To what extent do activities outlined in the site’s FUP program model reflect actual program practice?
How does context shape the FUP program in each site?
How does extended foster care affect how FUP is used?
How does the local housing market affect FUP?
How does the local economy affect youths’ abilities to prepare for when their FUP voucher expires?
How does the local service environment complement or substitute for FUP?
Are there regulatory or statutory barriers to serving youth?
How do youth experience FUP?
What can we learn to inform a future evaluation of how FUP impacts youth outcomes delineated in the Chafee legislation (e.g., education, employment, well-being)?
The survey conducted as part of this study will cover all FUP programs receiving awards in FY2018 and FY2019 and is thus generalizable to that population; it will not be generalizable beyond that population. The qualitative portion of the study is intended to present an internally valid description of the implementation of the program in the chosen sites, not to support statistical generalization to other sites or service populations.
The study design will allow us to learn about the implementation of FUP for youth from both the agency and youth perspectives. It will also allow us to learn more about how programmatic changes to FUP and to foster care provision influence implementation. The survey of all PHAs and their partners (PCWAs and CoCs) will provide information on how the program is implemented, covering the following topics: historical use of FUP for youth, collaboration with partner agencies, identification and referral processes, housing search services, housing choice quality, lease-up and move-in processes, post-move services, and overall housing choice for this population. A web-based survey is the most appropriate method of collecting this information because it will allow us to survey all agencies that currently provide, or will provide, services to youth with the 2018 and 2019 FUP vouchers. In contrast, the site visits to eight sites will allow us to collect more detailed data on a smaller sample of agencies. The interviews conducted during the site visits will provide more detailed insight into the implementation of FUP for youth, including the topics covered in the survey and listed above. Lastly, the site visits will allow us to hold focus groups with youth who have participated in FUP. These focus groups are valuable for understanding the experiences and perspectives of these youth regarding the program's implementation. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.
The respondents to the survey will be the FUP liaisons at each of the partner agencies: the PHA, the PCWA, and the CoC lead agency, each named in the PHA's grant application to the U.S. Department of Housing and Urban Development (HUD). The respondent universe for the implementation study of FUP includes agency heads, program managers, and front-line practitioners from the PHAs, PCWAs, and CoCs; other service providers that are administering the program; and youth formerly in foster care who have leased up with FUP vouchers.
HUD will provide the research team with contact information for each PHA, PCWA, and CoC, based on the PHA applications.
PHAs with a current allotment of FUP vouchers may allocate them to FUP-eligible families with children, to FUP-eligible youth aging out of (or formerly in) foster care, or to both. Because the research team is uncertain which PHAs are using FUP vouchers for youth, the PHA survey sample will comprise the entire population of 61 PHAs that received voucher awards in 2018, as well as the entire population of approximately 40 PHAs expected to receive voucher awards in 2019. The PHA survey will therefore be a census of all PHAs receiving new FUP funding in 2018 and 2019, expected to total approximately 101 agencies.
While most PHAs are local-level entities, some are state or regional agencies that contract with local organizations to administer FUP; some local PHAs also may contract out the administration of FUP vouchers to other organizations. The previous study of this program found that 3.3 percent of PHAs contracted their FUP administration to other organizations. Given the potential for differences in the current sample, we plan for as many as 10 PHAs to contract with local agencies or other organizations. PHAs that contract with other agencies to administer FUP will be instructed to enlist the help of their most involved contracted partner to complete the survey.
Some PCWAs and CoCs partner with multiple PHAs to administer FUP. As a result, the populations of these entities are smaller than that of the PHAs. The PCWA survey will be a census of the entire population of 43 PCWAs that partnered to administer FUP vouchers awarded in 2018, and the CoC survey will be a census of the entire population of 5 CoCs that did so. The PCWA and CoC survey samples will also include the entire population of PCWAs and CoCs that partner with PHAs receiving voucher awards in 2019, expected to be approximately 22 and 36, respectively. Therefore, the total PCWA survey sample is expected to be a census of approximately 72 agencies, and the total CoC survey sample a census of approximately 99 agencies.
The survey results will help guide the selection of FUP sites for site visits. A FUP site consists of three agencies: the PHA receiving the FUP award and its partner PCWA and CoC. First, the research team will select sites that have provided 20 or more vouchers to youth. If the team determines that it cannot identify a group of eight sites that are ideal candidates for visits from among the 2018 awardees, it will conduct a portion of the site visits at 2019 awardee sites. From among these sites, the research team will consider the following criteria to ensure a diversity of sites:
Geographic diversity,
Rental vacancy rates,
Sites with and without extended foster care in their state, and
Sites with and without a history of using FUP vouchers to serve youth.
The research team will select eight sites to participate in site visits. The universe of respondents will include:
Agency administrators: approximately 24 agency administrators (roughly 1 administrator per agency at 3 agencies, across 8 sites)
Agency FUP liaisons: approximately 24 FUP liaisons (roughly 1 FUP liaison per agency at 3 agencies, across 8 sites)
PHA Family Self-Sufficiency (FSS) program managers: approximately 8 FSS managers (1 FSS manager at each of 8 sites)
PHA front-line staff: approximately 192 front-line staff (up to 12 staff members at 2 focus groups per site across 8 sites)
PCWA and partner front-line staff: approximately 312 front-line staff (up to 12 agency and service provider staff members at 3 focus groups per site across 8 sites, plus up to 6 staff members at referring agency focus groups across 4 sites)
Youth leased up with FUP: approximately 96 youth leased up with FUP (1 focus group with up to 12 participants at each of 8 sites)
Service provider FUP leads: approximately 7 (one interview with one FUP lead in 7 of 8 sites)
The research team developed the instruments using similar data collection instruments from other studies, in particular the prior study of FUP for youth (Dion et al. 2014; OMB #2528-0285). Based on the report from that study (Dion et al. 2014), the research team removed questions for which the data did not help answer the research questions or could be obtained from other sources (e.g., whether the agency's state offered extended foster care). The research team added questions to the PHA and PCWA survey instruments about their partnership with the CoC. These questions are the same as those asked about the PHA's partnership with the PCWA and vice versa. Similarly, the research team created the CoC survey based on the PHA and PCWA surveys, using similar questions but adapting the language for the CoC.
The research team also adapted discussion guides for interviews and focus groups with agency staff and youth from the prior study of FUP for youth (Dion et al. 2014). The team added supplemental questions to the interview and focus group guides from guides used for an ongoing OMB-approved study, the "Evaluation of the Family Unification Program" (OMB #0970-0514), that included questions on the CoCs and reflected updates to the FUP program and eligibility requirements. Although the in-depth focus groups with youth include some sensitive questions, similar questions have been asked successfully of similar respondents in other data collection efforts, such as the in-depth youth interviews conducted as part of the Multi-Site Evaluation of Foster Youth Programs (Courtney et al. 2008a; Courtney et al. 2008b; Courtney et al. 2011a; Courtney et al. 2011b).
Because all additional questions mirror those used in the prior study of FUP for youth, the research team does not plan to pilot the discussion guides or survey questionnaires. The survey instruments minimize measurement error by asking the same question of multiple agencies.
The research team chose elements for the administrative data list that address specific quantitative research questions. These elements capture application, voucher issuance, and lease signing, allowing the team to calculate the number of youth served with FUP vouchers, their housing outcomes, and the time required to complete the process. Child welfare placement histories will be used to explain differences in those outcomes accounted for by demographics and child welfare experiences.
See the table below for a description of how each data collection instrument aligns with the research questions.
Instrument | Research Question(s)
Instrument 1 -- Public Child Welfare Agency Survey | 1, 2, 5, 7, 8, 9
Instrument 2 -- Public Housing Agency Survey | 5, 7, 8, 9
Instrument 3 -- Continuum of Care Survey | 1, 2, 5, 7, 8, 9
Instrument 4 -- Focus Group Guide for Youth | 10
Instrument 5 -- Focus Group Guide for Public Housing Agency Intake Workers and Case Managers | 5, 7, 8, 9
Instrument 6 -- Focus Group Guide for Public Child Welfare Agency Caseworkers, Referring Partner and Service Provider Partners | 1, 2, 5, 7, 8, 9
Instrument 7 -- Interview Guide for Public Housing Agency Administrator and FUP Liaison | 5, 7, 8, 9
Instrument 8 -- Interview Guide for Public Child Welfare Agency Administrator and FUP Liaison | 1, 2, 5, 7, 8, 9
Instrument 9 -- Interview Guide for Continuum of Care Lead Organization Administrator and FUP Liaisons | 1, 2, 5, 7, 8, 9
Instrument 10 -- Interview Guide for Service Provider FUP Leads | 5, 7, 8, 9
Instrument 11 -- Interview Guide for Family Self-Sufficiency Manager | 6
Instrument 12 -- Administrative data list | 3, 4
Data collection will take place in the form of a web-based survey, eight site visits, program data collection, and administrative data collection. Tables A1 and A4 in Supporting Statement A summarize all the data collection that will be conducted for the evaluation.
Approximately one year after the FUP voucher awards in 2018 and 2019, respectively, we will conduct a web-based survey of three key informants in each community awarded vouchers. Key informants will include the FUP liaisons at the PHA, PCWA, and CoC. The survey will be open for two months, and we will remind key informants to complete it after 1 to 2 weeks and again after 3 to 4 weeks (Appendices A, B, and I). The introduction to the web-based survey will cover the following: the study's purpose and funder, the nature of the information that will be collected, how the information will be used, the potential benefits and risks of participating, and assurance that participation in the study is voluntary (Instruments 1-3). The web-based survey will also inform participants that they may choose to skip any questions or stop participating at any time.
The research team will send the survey via email to contacts at the CoC, PHA, and PCWA provided by HUD (Appendix A). If the individual receiving the survey is not the current FUP liaison, that individual will be asked during research team reminder calls to forward the survey to the correct person (Appendix I). We will perform initial data checks while the surveys are being fielded to ensure that skip patterns are formatted properly and that a reasonable distribution of responses is being collected. At the conclusion of survey administration, we will review one-way frequencies for inconsistencies and out-of-range responses, and we will contact agencies whose responses are unclear to clarify their answers.
Approximately 6 to 12 weeks after the survey of 2018 awardees closes, the research team will conduct eight site visits. If the team determines that it cannot identify a group of eight sites that are ideal candidates for visits from among the 2018 awardees, it will conduct a portion of the site visits at 2019 awardee sites, approximately 6 to 12 weeks after the survey of 2019 awardees closes. The team will reach out to agency administrators, managers, and front-line staff via email to request their participation in an interview (Appendices G-H). We will ask front-line staff who agree to participate in interviews to assist with the recruitment of youth and will provide them with recruitment materials to do so (Appendices C-F). All recruited staff and youth will be assured that their participation is voluntary and that they are free to decline without consequence.
All research team members who participate in site visits will be trained on consent and interview procedures before entering the field. All team members will sign the Urban Institute Confidentiality Forms prior to data collection and will store all data in locked cabinets or on a secure drive.
At least one senior researcher from the research team will attend each site visit. Due to COVID-19, site visits may be conducted in person or virtually. If in-person site visits are feasible, the visits will generally last two days. If site visits are conducted virtually, interviews and focus groups will be scheduled based on participants' availability to reduce burden and work disruption. The discussion guide questions for interviews and focus groups are designed to elicit nuanced responses, and the research team will probe when answers are vague or ambiguous or when more specific or in-depth information is needed. At the start of the interviews and focus groups with staff, the research team will ask respondents for verbal consent to participate and permission to record the conversation, using the informed consent form attached to each interview and focus group guide (Instruments 4-11). Verbal consent will also be requested during youth focus groups, and the team will ask youth to sign consent forms for focus groups conducted in person. The consent process will cover the following: the study's purpose and funder, the nature of the information that will be collected, how the information will be used, the potential benefits and risks of participating, and assurance that participation in the study is voluntary. The team will also inform study participants that they may choose to skip any questions or stop participating in the interview at any time. The team will also ask youth participants for consent to audio record the focus group; the recording will be used only to supplement notes taken during the focus group, and the team will not record if any youth does not consent to being recorded. If at any time a study participant becomes distressed, the team members conducting the interview will stop the interview.
The project team will rely on agency leaders and staff at each site to help recruit staff and youth for the interviews and focus groups. As directed by the research team, agency staff will recruit young adult participants aged 18 and older who have leased up with FUP. If focus groups are conducted virtually, program staff will share youths' contact information with the project team. This information will be kept on a secure server and used for scheduling and for tracking the distribution of gift cards to youth. Once receipt of the gift cards is confirmed, all correspondence containing youth contact information will be permanently deleted.
The elements to be collected from administrative records at each of the eight sites selected for visits are outlined in the administrative data list (Instrument 12). These will be collected from PHA and PCWA administrative data, as well as Homeless Management Information System (HMIS) data from the CoC and education data from the National Student Clearinghouse (NSC). While on site, the research team will have site-specific conversations with the staff most familiar with the data to understand what data are available and their structure and quality. During these conversations, we will establish a timeline and procedures for transferring the data to the research team.
Based on previous surveys of FUP grantees, we expect a 90 percent response rate for PHAs and an 89 percent response rate for PCWAs and CoCs. We expect a response rate for the CoCs similar to that of the PCWAs in previous studies because both are partner agencies to the PHA, which is the grantee. Because the subject matter and purpose of the survey are highly salient to the universe of PHAs, we expect a high response rate among this group.
Program Staff
In previous implementation studies of FUP (Cunningham et al. 2015) and of a supportive housing program for child-welfare-involved families (Cunningham et al. 2014), no potential participants refused to participate in interviews or focus groups. Based on this past experience, we do not anticipate refusals from staff. ACF anticipates that once an agency's management agrees to participate in the evaluation, it will encourage agency staff to participate in all study activities. The research team will schedule on-site interviews and focus groups with adequate notice to accommodate staff schedules.
Youth Focus Groups
We expect a 3 percent refusal rate for the in-depth youth focus groups based on past studies (Courtney et al. 2008a; Courtney et al. 2008b; Courtney et al. 2011a; Courtney et al. 2011b). The research team will select locations, dates, and times that are most convenient for youth respondents and proposes to offer a $25 gift card to offset incidental costs, minimize refusal rates, and avoid demographic imbalance between participants and the broader study pool. These tokens of appreciation will be provided via gift card for in-person meetings and via e-gift card for focus groups conducted virtually.
ACF expects a high response rate on the administrative data. All the administrative data we intend to collect already exist in a form required for submission to the federal government, including the Adoption and Foster Care Analysis and Reporting System (AFCARS), the HMIS, and required submissions to HUD. For this reason, we believe the data will generally be available and complete. Based on past studies (Pergamit et al. 2019; Cunningham et al. 2015), we expect most sites to have the necessary data. We will also collect data from the NSC, which has high rates of coverage for students enrolled in private and public post-secondary institutions.
At the individual level, we expect low rates of missing data. In similar prior evaluations, we have been able to obtain administrative data on about 99 percent of individuals in child welfare administrative data sets (Pergamit et al. 2019; Cunningham et al. 2015).
We do not expect that all sites will have complete HMIS data. Coverage of shelters in the HMIS can vary widely from site to site, so the HMIS data in some sites may not cover all shelter stays. In addition, CoCs will need consent from individuals to share data. Typically, a shelter collects a standard consent form from every person who enters that permits the sharing of their de-identified information for research purposes. If a person has had a shelter stay but has not consented to having their data shared, the stay will not appear in the administrative data. We will assess both the coverage of the HMIS data and the consent rates in our conversations with the CoC to determine the quality of the data. If some shelters do not report into the HMIS, or if a youth has not consented to sharing their information with researchers, we have no other way to acquire data on emergency shelter stays; we will therefore produce estimates for emergency shelter stays using data from the sites for which we have these data items.
At the site level, we expect a high response rate for the critical questions in the web-based survey. To address questions that arise during the survey, Urban will offer a phone number and email address dedicated to survey help. This contact information will be included in the initial letter and all subsequent emails and will also be provided during telephone reminder calls. The research team will also follow up with respondents by phone to obtain answers when critical questions are missing.
The implementation study data collection questions for program staff do not include sensitive topics and are designed to be appropriate for each respondent. We therefore expect a 100 percent response rate for critical implementation study questions. We do not expect significant item nonresponse for these interviews and focus groups.
At the site level, we expect a high item response rate for administrative data. We will request the specific data elements in Instrument 12 in a format that is convenient for the agency compiling the data. The critical questions for the implementation study concern youth demographics, placement histories, service referrals, and housing outcomes. These data will come from the PHAs, the PCWAs, and HMIS data (through the CoCs). These data already exist in a form required for submission to the federal government for AFCARS, the National Child Abuse and Neglect Data System (NCANDS), and the HMIS. For this reason, we believe the data should be available and complete. Past evaluations involving similar agencies (Pergamit et al. 2019; Cunningham et al. 2015) have been able to obtain administrative data for all sites included in those evaluations.
At the individual level, we expect low rates of missing data. In past studies, we have been able to obtain administrative data for about 99 percent of individuals in child welfare administrative data sets (Pergamit et al. 2019; Cunningham et al. 2015). Similarly, the 2016 Annual Homeless Assessment Report (AHAR) found that the missing data rates for the number of nights in a program for emergency shelters and transitional housing, the only critical items in the HMIS for our study, were very low: 0.2 percent missing for females and 0.3 percent missing for males (Buron, McCall, and Solari 2016). Missing data rates for PHA data vary widely by agency and have not been documented. However, we expect low missing data rates for our variables of interest because they are critical for PHA program management and reporting to HUD.
As noted earlier, we anticipate high response rates among all agencies surveyed. Nevertheless, we will analyze nonresponse patterns using relevant information known about both respondents and nonrespondents. We will compare characteristics, such as geographic location, agency size, and history of FUP administration, across respondents, nonrespondents, and the total attempted sample to evaluate the risk of nonresponse bias in estimates, since that bias cannot be measured directly. Unless response rates turn out to be much lower than expected, we do not expect to weight the survey data; however, we will note any nonresponse bias as a limitation.
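As a minimal sketch of this comparison, assuming hypothetical variable names (region, voucher_count, and years_administering_fup as frame characteristics known for all sampled agencies, and responded as a survey completion indicator), the checks could be run in Stata as follows; the actual variables will depend on what is known for both respondents and nonrespondents:

    * one row per sampled agency; responded = 1 if the agency completed the survey
    tabulate region responded, chi2               // geographic distribution of respondents vs. nonrespondents
    ttest voucher_count, by(responded)            // compare agency size
    ttest years_administering_fup, by(responded)  // compare history of FUP administration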
Program Staff
If staff are not available to participate in interviews and focus groups, we will work closely with program leaders either to identify other staff with similar knowledge or to schedule telephone interviews or follow-up conversations. Because the evaluation is voluntary, any member of the program may choose not to participate. This could lead to disproportionate agency representation in the results if those who do not respond differ systematically from those who do. If we experience substantial refusal rates among program staff, especially if agencies at a site do not make their staff available, we will report this as a study limitation. We do not anticipate this type of refusal. We note, however, that refusals from staff who are less familiar with the FUP program and its use for youth may be less critical than refusals from staff who are more familiar with it. We will track refusals among staff who are familiar with the FUP program and adjust recruitment if refusal rates are high.
Youth
The research team will ask PCWA, PHA, CoC, or service provider staff to recruit youth who have received a FUP voucher to participate in focus groups. Because the evaluation is voluntary, any youth may choose not to participate. This may lead to a demographically unrepresentative group of participants if youth who decline differ systematically from those who participate. If we experience substantial refusal rates among youth, we will report this as a study limitation. If refusal rates are high, we will work with staff to adjust our recruitment of youth participants. We will also report, based on administrative data, the extent to which participating youth reflect the broader population of youth served by the local FUP program.
Similarly, if administrative data have incomplete or missing information, the research team will note the missing data elements and work with program staff to understand what factors drive missingness and to develop procedures for collecting the information more consistently. For example, if a CoC is attempting to pull HMIS data on youth who receive FUP vouchers, the research team may suggest matching on different data elements. In the analysis, we will consider whether imputation techniques to account for missing data are useful and feasible, based on how extensive the missingness is and the availability of variables useful for imputation. If we decide to impute any missing variables, we will use Stata to run multiple imputation (Little and Rubin, 2002).
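As a minimal sketch of that imputation workflow, assuming hypothetical variable names (placement_count and age_at_entry as variables with missing values, leased_up as a binary outcome, and site_id as a site indicator), Stata's mi commands could be used as follows:

    * register variables with missing values and create 20 imputed data sets
    mi set mlong
    mi register imputed placement_count age_at_entry
    mi impute chained (regress) placement_count age_at_entry = i.site_id, add(20) rseed(20191201)
    * mi estimate fits the model in each imputed data set and pools the results
    mi estimate: logit leased_up placement_count age_at_entry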
We will produce descriptive statistics from the survey data, a combination of tabulations and cross-tabulations. We do not anticipate weighting the data. All nonresponses or missing data not resulting from a skip pattern will be included as their own category, regardless of the item nonresponse rate.
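For example, a minimal sketch of these tabulations in Stata, assuming hypothetical variables (referral_process as a survey item and agency_type as an agency characteristic); the missing option keeps item nonresponse as its own category:

    tabulate referral_process, missing                      // one-way tabulation
    tabulate referral_process agency_type, missing column   // cross-tabulation with column percentages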
The estimates produced are for official external release by the Office of Planning, Research, and Evaluation (OPRE) and the Urban Institute. The data will not be used to generate population estimates, either for internal use or dissemination.
As described above, we will perform initial data checks while the surveys are in the field and review one-way frequencies at the close of administration, contacting agencies whose responses are unclear. Survey data will be cleaned to represent "logical skips" as a missing response and to distinguish logical skips from item nonresponse. We will also back-code open-ended response options: if respondents write in their own answer but do not mark the "Other" option, we will mark the "Other" option for them to ensure consistency in the responses, and we will recode any open-ended responses that match one of the existing answer choices.
For qualitative data coding, we will ensure inter-rater reliability by having multiple coders code several transcripts once the coding scheme is established, running coding comparison queries in NVivo, and re-coding until a kappa coefficient above 0.80, considered a high level of agreement between coders, is achieved (McHugh, 2012). If the initial level of agreement is below 0.80, the coders will meet to discuss the definitions of each code before returning to recode the transcripts.
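For reference, the kappa coefficient compares observed agreement (p_o) with the agreement expected by chance (p_e): kappa = (p_o - p_e) / (1 - p_e). A minimal sketch of an equivalent check outside NVivo in Stata, assuming hypothetical variables coder1 and coder2 that hold each coder's code for the same set of transcript excerpts:

    * kap reports percent agreement, expected agreement, and the kappa coefficient
    kap coder1 coder2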
The study will use a combination of qualitative and quantitative data analysis. Qualitative data analysis will combine information from the various data sources. The semi-structured interview guides and focus group protocols developed to guide qualitative data collection include discussion topics and questions that reflect key implementation study research questions. The research team will fully transcribe audio recordings of interviews and focus groups and will clean notes in cases where participants did not consent to being recorded. We will develop a coding scheme to organize the data into themes or topic areas. Transcripts will be coded (tagged based on the theme or topic to which they are relevant) and analyzed using the qualitative analysis software package NVivo.
The research team will also produce descriptive statistics on FUP program context and services based on the PHA, PCWA, and CoC survey data. The team will also examine youths' progress through the leasing process using the quantitative data collected from the PHAs. These data will also be used to produce descriptive statistics on the reasons for voucher denial, voucher loss, and housing exit. If there is sufficient variation, we will run regression analyses to determine which factors are correlated with obtaining a voucher, leasing up, and exiting housing (illustrated in the sketch below). Additional descriptive analyses will examine service receipt and the characteristics of youth receiving vouchers (e.g., child welfare placement history, age at entry).
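A minimal sketch of one such model in Stata, assuming hypothetical variable names (leased_up as a binary outcome, age_at_entry and placement_count as youth characteristics, and site_id as a site indicator); the actual specification will depend on the variables available and the variation observed:

    * logistic regression of lease-up on youth characteristics and site indicators
    logit leased_up age_at_entry placement_count i.site_id
    margins, dydx(age_at_entry placement_count)   // average marginal effects for interpretation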
We will include a detailed study methodology in our final report that will help the public understand and properly interpret the information derived from the data collection. The methodology section will include, but not be limited to, survey instrument topics, survey administration and response rates, data preparation, handling of missing data, site selection, interview and focus group discussion topics, and qualitative data analysis techniques.
The information for this study is being collected by the Urban Institute and Chapin Hall at the University of Chicago on behalf of ACF. Principal Investigator Michael Pergamit led development of the study design plan and data collection protocols, and will oversee collection and analysis of data gathered through on-site interviews and telephone interviews.
The agency responsible for receiving and approving contract deliverables is:
The Office of Planning, Research, and Evaluation (OPRE),
Administration for Children and Families (ACF)
U.S. Department of Health and Human Services
The Federal project officer for this project is Maria Woolverton.
Instrument 1 -- Public Child Welfare Agency Survey
Instrument 2 -- Public Housing Agency Survey
Instrument 3 -- Continuum of Care Survey
Instrument 4 -- Focus Group Guide for Youth
Instrument 5 -- Focus Group Guide for Public Housing Agency Intake Workers and Case Managers
Instrument 6 -- Focus Group Guide for Public Child Welfare Agency Caseworkers, Referring Partner and Service Provider Partners
Instrument 7 -- Interview Guide for Public Housing Agency Administrator and FUP Liaison
Instrument 8 -- Interview Guide for Public Child Welfare Agency Administrator and FUP Liaison
Instrument 9 -- Interview Guide for Continuum of Care Lead Organization Administrator and FUP Liaisons
Instrument 10 -- Interview Guide for Service Provider FUP Leads
Instrument 11 -- Interview Guide for Family Self-Sufficiency Manager
Instrument 12 -- Administrative data list
Appendix A -- FUP Survey Outreach Email
Appendix B -- FUP Survey Reminder Email
Appendix C -- FUP Project Overview
Appendix D -- FUP Project Fact Sheet
Appendix E -- Youth Outreach Email
Appendix F -- Youth Outreach Phone Script
Appendix G -- Front-Line Staff Outreach Email
Appendix H -- Lead Staff Outreach Email
Appendix I -- FUP Survey Reminder Telephone Script
Buron, Larry, Tom McCall, and Claudia D. Solari. 2016. The 2016 Annual Homeless Assessment Report (AHAR) to Congress. Washington, DC: The U.S. Department of Housing and Urban Development.
Courtney, Mark E., Andrew Zinn, Erica H. Zielewski, Roseana Bess, Karin Malm, Matthew Stagner, and Mike Pergamit. 2008a. “Evaluation of the Early Start to Emancipation Preparation - Tutoring Program Los Angeles County, California: Final Report.” Washington, DC: Urban Institute.
Courtney, Mark E., Andrew Zinn, Erica H. Zielewski, Roseana Bess, Karin Malm, Matthew Stagner, and Mike Pergamit. 2008b. “Evaluation of the Life Skills Training Program Los Angeles County, California: Final Report.” Washington, DC: Urban Institute.
Courtney, Mark E., Andrew Zinn, Robin Koralek, and Roseana J. Bess. 2011a. "Evaluation of the Independent Living – Employment Services Program, Kern County, California: Final Report." Washington, DC: Urban Institute.
Courtney, Mark E., Andrew Zinn, Heidi Johnson, and Karin E. Malm. 2011b. "Evaluation of the Massachusetts Adolescent Outreach Program for Youths in Intensive Foster Care: Final Report." Washington, DC: Urban Institute.
Cunningham, Mary, Michael Pergamit, Abigail Baum, and Jessica Luna. 2015. “Helping Families Involved in the Child Welfare System Achieve Housing Stability.” Washington, DC: Urban Institute.
Cunningham, Mary, Michael Pergamit, Maeve Gearing, Simone Zhang, and Brent Howell. 2014. "Supportive Housing for High-Need Families in the Child Welfare System." Washington, DC: Urban Institute. https://www.urban.org/research/publication/supportive-housing-high-need-families-child-welfare-system
Dion, Robin, Amy Dworsky, Jackie Kauff, and Rebecca Kleinman. 2014. "Housing for Youth Aging Out of Foster Care." Washington, DC: U.S. Department of Housing and Urban Development, Office of Policy Development and Research.
Little, Roderick J.A., and Donald B. Rubin. 2002. "Bayes and Multiple Imputation." In Statistical Analysis with Missing Data, 2nd ed., 200-20. Hoboken, NJ: John Wiley & Sons.
McHugh, Mary L. 2012. "Interrater Reliability: The Kappa Statistic." Biochemia Medica 22 (3): 276–282.
Pergamit, Michael, Mary Cunningham, Devlin Hanson, and Alexandra Stanczyk. 2019. "Does Supportive Housing Keep Families Together?" Washington, DC: Urban Institute.