
Supporting Statement for Paperwork Reduction Act Submissions

Family Unification Program - Family Self-Sufficiency Demonstration Evaluation

OMB Control # 2528-0327


B. Collections of Information Employing Statistical Methods


  1. Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.


Web-based Survey

For the Family Unification Program - Family Self-Sufficiency (FUP-FSS) Demonstration Evaluation web-based survey, the universe of 51 public housing authorities (PHAs) and their public child welfare agency (PCWA) partners involved in the FUP-FSS demonstration will be invited to participate.

We expect one primary staff member respondent at each of the 51 PHAs and one respondent at each of their respective partner PCWAs. The intended survey respondents are staff with hands-on knowledge of FUP youth and the FUP-FSS demonstration. At the PHAs, this staff person is likely the Housing Choice Voucher (HCV) program manager, a case manager or FUP liaison, and/or the FSS grant manager or FSS coordinator. At larger PHAs with more robust FUP and FSS programs, or at state housing agencies, we expect multiple staff will need to contribute information to the primary respondent, covering both the voucher issuance and FUP-FSS program perspectives. At PCWAs, the most likely respondent is the FUP liaison.



Telephone Interviews

For the telephone interviews, we expect to interview staff from a sample of 10 demonstration sites. A demonstration site consists of a PHA and its partnering PCWA. Specifically, we expect to interview two staff respondents at each of 10 demonstration PHAs selected for phone interviews (up to 20 PHA staff in total) and one PCWA staff respondent at each of the 10 PCWA partners (10 PCWA staff in total), to gain a nuanced understanding of the FUP-FSS program implementation progress and early outcomes. The total number of respondents interviewed for each FUP-FSS partnership may vary based on the size and complexity of the participating PHAs and partner PCWAs. Key PHA staff to be interviewed are the staff with the most direct knowledge and hands-on experience implementing the FUP-FSS demonstration. Roles may vary by PHA size or staffing arrangements, but we expect relevant staff roles to include a combination of HCV program managers, HCV case managers, FUP grant managers or coordinators, or PHA resident or community services managers. We will identify the appropriate staff for interviews through preliminary conversations with PHA staff during the initial site recruitment and interview scheduling process. As with the web-based survey responses, we expect the PCWA staff interviews will be conducted with FUP liaisons, although case workers or independent living coordinators may be included for some sites.

Site Visits

During three site visits we will interview staff from three different PHA/PCWA partnerships (one partnership per site visit). At each site visit, in-person interviews will be conducted with five to seven PHA staff (up to 21 in total), one to two PCWA staff (up to six in total), and up to six FUP-FSS demonstration participants (up to 18 in total). As feasible, interviews may also be conducted with one to two local service partners at each site, including Continuums of Care (CoCs) if relevant (up to six in total). This amounts to up to 33 in-person interviews with program administrators and up to 18 youth interviews.

  • PHA in-person interviews. PHA interviews will include executive directors, resident services managers, HCV program managers, case managers who work with FUP-FSS demonstration participants and/or FUP youth, FSS grant managers, FSS coordinators, and any other staff identified by the PHAs as working directly with FUP-FSS eligible youth. We expect to conduct five to seven interviews with PHA staff at each site, over one day on-site at PHA offices.

  • PCWA in-person interviews. The PCWA FUP liaison for each partnership will be interviewed in person if feasible, and we will attempt to travel to county or statewide PCWA offices if they are not in the PHA’s local jurisdiction. We expect to interview one to two PCWA representatives during each site visit.

  • CoC partner in-person interviews. We will identify one local CoC administrator at each site that serves FUP-FSS youth through referrals from the PHAs.

  • Community service provider in-person interviews. We will identify one to two local service provider organizations at each site that serve FUP-FSS youth through referrals from the PHAs.

  • Youth in-person interviews. We plan to recruit approximately six youth at each site, for a total of 18 youth across the three sites. However, these goals may be revised based on the number of FUP-FSS participants enrolled at each site. Interviews will last about an hour. As noted, all youth will review and sign a consent form. We will also administer a five- to 10-minute iPad survey before the interviews to capture demographics and closed-ended questions such as the length of FUP and FSS participation and time since exit from foster care.

Administrative Data

The universe for administrative data analysis will include the 51 PHAs that are participating in the FUP-FSS demonstration, as well as the 191 other agencies that administer FUP vouchers but are not participating in the FUP-FSS demonstration (a total of 242 agencies). We will use administrative data already collected by HUD (Public and Indian Housing Information Center [PIC] and Voucher Management System [VMS] data), and will not request any administrative records directly from PHAs or their PCWA partners. To identify the proportion of eligible youth participating in FUP-FSS and assess any early outcomes for FUP-FSS participants, PIC data will include all individual-level records for the 242 PHAs participating in FUP and FUP-FSS (not just those of FUP-FSS participants). This will allow us to contrast short-term outcomes for FUP-FSS participants against other demographic groups, and assess any variations in FUP-FSS participants’ characteristics compared to other PHA-assisted households and to FUP youth who do not participate in FUP-FSS.
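As an illustration of the planned descriptive contrasts, the sketch below shows one way the group comparison could be implemented. It is a minimal sketch only: the input file and field names (e.g., fup_fss_participant, months_assisted) are hypothetical placeholders, not actual PIC (Form HUD-50058) variable names.

```python
"""Minimal sketch of the descriptive group comparison described above.
All file and column names are hypothetical placeholders; actual PIC
extracts follow HUD's Form 50058 layout."""
import pandas as pd

# Individual-level records for households at the 242 FUP agencies.
pic = pd.read_csv("pic_records.csv")

def assign_group(row):
    """Flag the three comparison groups discussed in the text."""
    if row["fup_fss_participant"] == 1:
        return "FUP-FSS participant"
    if row["fup_youth_voucher"] == 1:
        return "FUP youth, not in FUP-FSS"
    return "Other assisted household"

pic["group"] = pic.apply(assign_group, axis=1)

# Contrast characteristics and short-term outcomes across groups.
summary = pic.groupby("group")[
    ["head_age", "annual_income", "employed", "months_assisted"]
].mean()
print(summary)
```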


  2. Describe the procedures for the collection of information including:


  • Statistical methodology for stratification and sample selection,

  • Estimation procedure,

  • Degree of accuracy needed for the purpose described in the justification,

  • Unusual problems requiring specialized sampling procedures, and

  • Any use of periodic (less frequent than annual) data collection cycles to reduce burden.


Sampling plan

WEB-BASED SURVEY

The web-based survey of PHAs and PCWAs does not require a sampling plan because it is a census: the universe of demonstration PHAs and their partnering PCWAs will be invited to participate in the survey. The survey invitation will be sent to the PHA and PCWA leads for all participating agencies using contact information provided by HUD. If the individual receiving the survey at a PHA or PCWA is no longer the relevant staff person, the research team will ask that individual, during reminder calls to unresponsive agencies, to forward the survey to the correct person.

TELEPHONE INTERVIEWS

The research team will select 10 PHAs and their partner PCWAs for in-depth telephone interviews using the following process:

  • First, we will identify a preliminary, representative sample of 10 partnerships through cluster analysis of PIC data and publicly available data from the Census Bureau’s American Community Survey (ACS) and the Bureau of Labor Statistics (BLS), based on PHA size and type (Moving to Work [MTW] vs. traditional PHA), FUP youth voucher take-up, measures of geographic diversity and rental market context (e.g., region, urban vs. non-urban, vacancy rates, and median rents), and economic conditions (e.g., unemployment rates). While we will use publicly available data for geography and market conditions, we will use individual-level PIC data to identify take-up rates. To incorporate trends over time, we require data from 2016 through the most current year available. Cluster analysis, which groups objects based on shared characteristics, is appropriate here because our goal is to identify groups of PHAs that may not be readily identifiable from a less systematic grouping approach. A minimal sketch of this clustering step appears after this list.

  • Reviewing this initial sample, we will consider whether to purposefully include specific PHAs based on MTW status or innovative practices, as identified by the COR, HUD program staff, or document review (for example, MTW plans and FSS action plans).

  • We will then review the extent of overlap with the Urban Institute’s US Department of Health and Human Services (HHS)-funded FUP Youth project currently in progress, and opportunities to coordinate with that effort and/or limit our sample to unique sites.1

  • Finally, we will determine which of these sites are well suited to site visits (e.g., with the largest numbers of FUP-FSS participants) as opposed to phone interviews.
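As a concrete illustration of the first step above, the sketch below shows how a cluster analysis of the 51 demonstration sites might be run. This is illustrative only: the input file, column names, the use of k-means, and the number of clusters are assumptions for the sketch, not the study’s specified method, and the preliminary draw would be adjusted to exactly 10 sites by the research team.

```python
"""Minimal sketch of the cluster-analysis step for drawing a preliminary
telephone-interview sample. File and column names are hypothetical."""
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per demonstration partnership (51 rows), with PIC/ACS/BLS fields.
sites = pd.read_csv("fup_fss_sites.csv")

features = [
    "pha_size",           # vouchers administered (PIC)
    "is_mtw",             # Moving to Work indicator, 0/1
    "youth_takeup_rate",  # FUP youth voucher take-up, 2016-present (PIC)
    "vacancy_rate",       # rental vacancy rate (ACS)
    "median_rent",        # median gross rent (ACS)
    "unemployment_rate",  # local unemployment rate (BLS)
]

# Standardize features so no single variable dominates the distances.
X = StandardScaler().fit_transform(sites[features])

# Group similar sites; in practice k would be chosen with diagnostics
# such as silhouette scores rather than fixed at 5.
sites["cluster"] = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(X)

# Draw sites from each cluster roughly in proportion to cluster size,
# so the preliminary sample of about 10 reflects the full universe.
draws = []
for _, g in sites.groupby("cluster"):
    n = max(1, round(10 * len(g) / len(sites)))
    draws.append(g.sample(n=n, random_state=0))
sample = pd.concat(draws)
print(sample[["pha_id", "cluster"]])
```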

SITE VISITS

The research team will select three demonstration PHA/PCWA partnerships for site visits. We will aim to identify three sites that reflect the diversity of FUP-FSS program characteristics and local contexts, while also prioritizing opportunities to reach FUP youth and FUP-FSS demonstration participants for interviews. The four main selection criteria, illustrated in the shortlisting sketch after this list, are:

  • FUP youth issuances. We will prioritize demonstration sites that have enrolled at least 10 FUP youth according to PIC administrative data. As of 2018, 16 of the 51 demonstration sites had at least 10 FUP youth enrolled. If administrative data analysis or other data collection indicates that some demonstration sites tend to have low FUP-FSS enrollment, we will select one site visit location with a substantial number of FUP youth vouchers issued but low FUP-FSS demonstration enrollment.

  • Innovative practices. We will attempt to select at least one PHA with notable components in its FUP-FSS program, such as robust or unique supportive services, unique partnerships, or resources for supportive services (for example, strong relationships with employers or local community colleges). We will attempt to identify these PHAs through document review and discussions with HUD program staff and PHA contacts.

  • Geographic diversity. We propose to select PHAs in different regions of the country.

  • Size and local market characteristics. As feasible given the small number of site visits planned, we will select PHAs that reflect a diversity of sizes and local housing market characteristics. We expect to select one large urban PHA, one small suburban or exurban PHA, and one PHA serving a small city. We expect this diversity of local contexts will result in a combination of higher- and lower-cost areas and variations in the availability of supportive services for FUP-FSS youth.
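The shortlisting sketch below illustrates how the first, third, and fourth criteria could be applied to administrative data; the innovative-practices criterion is qualitative and handled outside the data. Column names (e.g., fup_youth_enrolled, census_region) are hypothetical placeholders.

```python
"""Minimal sketch of shortlisting candidate site-visit PHAs against the
criteria above. File and column names are hypothetical placeholders."""
import pandas as pd

sites = pd.read_csv("fup_fss_sites.csv")  # one row per demonstration site

# Criterion 1: at least 10 FUP youth enrolled per PIC data.
shortlist = sites[sites["fup_youth_enrolled"] >= 10]

# Criteria 3 and 4: keep the highest-enrollment candidate within each
# region/size profile so the final three sites vary in geography,
# PHA size, and housing market type.
shortlist = (
    shortlist.sort_values("fup_fss_enrolled", ascending=False)
             .drop_duplicates(subset=["census_region", "size_category"])
)

# Criterion 2 (innovative practices) is flagged through document review
# and HUD staff input rather than computed from these data.
print(shortlist[["pha_id", "census_region", "size_category"]])
```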

Staff for interviews will be identified through a combination of document review and direct contact with PHAs and PCWAs. We expect to identify and engage the community service providers, and subsequently contact staff for interviews, through FSS grant managers or coordinators during site visit planning. This may include the CoC if the CoC is a partner in the FUP youth or FUP-FSS program. For youth interviews, we will work with FUP and FSS staff to recruit participants.

ADMINISTRATIVE DATA

The administrative data analysis does not require a sampling plan, as data will include the full universe of participants and residents served by the 242 PHAs that had FUP voucher allocations as of 2018.

Justification of level of accuracy

The study will report on the results of the web-based survey using descriptive analytic techniques. Because this is a census and we anticipate a high response rate, the resulting estimates should have a high level of accuracy. However, if the response rate falls below 80 percent, we will conduct a nonresponse bias analysis.

The study will also report on the staff and participant telephone and in-person interviews using descriptive analytic techniques. The research team will not use the qualitative data collection to generalize findings to the larger population or to draw statistical inferences.

Unusual problems requiring specialized sampling procedures

There are no unusual problems associated with this data collection. The study seeks survey responses from the entire population of PHAs and PCWAs participating in the demonstration rather than sampling from that population. In addition, the study team will ask program administrators to help recruit staff and youth to participate in the qualitative data collection.

Any use of periodic (less frequent than annual) data collection cycles to reduce burden

Not applicable to this study.






  3. Describe methods to maximize response rates and to deal with issues of non-response. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.



Web-Based Survey

Based on response rates achieved in a previous HUD study of PHAs and PCWAs with FUP vouchers (Dion et al. 2014), we expect an 89.9 percent response rate for the web-based survey of PHAs and an 89.0 percent response rate among PCWAs. Because the subject matter and purpose of the surveys are highly salient to the program partners, we expect to achieve similarly high response rates.

At the site level, we expect a high response rate for the critical questions in the web-based survey. To address questions that arise during the survey, Urban will offer a dedicated survey-help phone number and email address. This contact information will be included in the initial letter and all subsequent emails, and will also be provided during telephone reminder calls. The research team will also follow up with respondents by phone to obtain answers when critical questions are missing.

The web-based survey will be open for five weeks, and we will remind key informants to complete the survey after one to two weeks and again after three to four weeks. The web-based survey will cover the following: the study’s purpose and funder, the nature of the information that will be collected, how the information will be used, the potential benefits and risks of participating, and assurance that participation in the study is voluntary. The web-based survey will also inform participants that they may choose to skip any questions or stop participating at any time.

The research team will send the survey to contacts at the PHAs and PCWAs provided by HUD. The initial request will be sent via email, and we will follow up with a reminder email and by phone as necessary (Attachments B.1-B.6). If the individual receiving the survey is not the current appropriate PHA or PCWA staff member, that individual will be asked to forward the survey to the correct individual during research team reminder calls. We will perform initial data checks while the surveys are being fielded to ensure that skip patterns are functioning properly and that a reasonable distribution of responses has been collected. At the conclusion of survey administration, we will review one-way frequencies for inconsistencies and out-of-range responses, and we will contact agencies with anomalous responses to clarify their answers.
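The following sketch illustrates the kinds of in-fielding checks described above: one-way frequencies, out-of-range values, and skip-pattern consistency. The variable names and valid ranges are hypothetical examples, not items from the actual instrument.

```python
"""Minimal sketch of in-fielding data checks on partial survey data.
Variable names and valid ranges are hypothetical."""
import pandas as pd

responses = pd.read_csv("survey_responses.csv")

# One-way frequencies for each item, including missing values.
for col in responses.columns:
    print(responses[col].value_counts(dropna=False), "\n")

# Out-of-range check: e.g., a count of enrolled youth should fall in 0-500.
out_of_range = responses[~responses["youth_enrolled"].between(0, 500)]

# Skip-pattern check: agencies reporting no FSS program should not have
# answered the FSS follow-up items.
skip_violations = responses[
    (responses["has_fss_program"] == 0) & responses["fss_start_year"].notna()
]
print(len(out_of_range), "out-of-range records;",
      len(skip_violations), "skip-pattern violations")
```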

As noted earlier, we anticipate high response rates among all agencies surveyed. However, we will analyze nonresponse patterns using relevant information that is known about both respondents and nonrespondents. We will compare characteristics such as geographic location, agency size, and history of FUP administration across respondents, nonrespondents, and the total attempted sample to evaluate the risk of nonresponse bias in the estimates, which cannot be measured directly.
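A minimal sketch of this nonresponse comparison follows, assuming a frame file with one row per invited agency and a response indicator. With only 51 PHAs, formal tests have limited power, so results would be interpreted descriptively; the column names are hypothetical.

```python
"""Minimal sketch of a nonresponse bias check on the survey frame.
Column names are hypothetical; n = 51 limits statistical power."""
import pandas as pd
from scipy import stats

frame = pd.read_csv("survey_frame.csv")  # all invited PHAs, responded = 0/1

# Continuous characteristic: compare agency size across groups.
resp = frame.loc[frame["responded"] == 1, "agency_size"]
nonresp = frame.loc[frame["responded"] == 0, "agency_size"]
t_stat, p_val = stats.ttest_ind(resp, nonresp, equal_var=False)
print(f"agency size: t = {t_stat:.2f}, p = {p_val:.3f}")

# Categorical characteristic: regional distribution by response status.
table = pd.crosstab(frame["census_region"], frame["responded"])
chi2, p_val, dof, _ = stats.chi2_contingency(table)
print(f"region x response: chi2 = {chi2:.2f}, p = {p_val:.3f}")
```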

Interviews

We expect a 100-percent response rate for the interview data to be collected from demonstration site staff. Past evaluations conducting implementation studies of housing programs for child welfare-involved families have had 100-percent response rates on data collection (Cunningham et al., 2014; Cunningham et al., 2015). For the youth interviews, studies conducted with older foster youth as part of the Multi-Site Evaluation of Foster Youth Programs achieved an average 97-percent response rate for in-depth interviews (Courtney et al., 2008a; Courtney et al., 2008b; Courtney et al., 2011a; Courtney et al., 2011b). As in those studies, we will offer an incentive and therefore assume a similar response rate for this study.

The interview questions for program staff do not include sensitive topics and are designed so that each respondent is asked only questions appropriate to their role. We therefore expect a 100-percent response rate for critical implementation study questions. Although the in-depth interviews with youth include some sensitive questions, similar questions have been asked successfully of similar respondents in other data collection efforts, such as the in-depth youth interviews conducted as part of the Multi-Site Evaluation of Foster Youth Programs (Courtney et al., 2008a; Courtney et al., 2008b; Courtney et al., 2011a; Courtney et al., 2011b). We do not expect significant item nonresponse for these interviews.

For both the telephone interviews and site visits, the research team will reach out to agency administrators and managers via email requesting their participation in an interview (Attachment A.5). For sites where we are conducting site visits, we will ask staff who agree to participate in interviews to assist with the recruitment of youth and will provide them with recruitment materials to do so, including a project overview and informational fact sheet that will describe the purpose of the study and address other logistical questions (Attachments A.1-A.2). All recruited staff and youth will be assured that their participation is voluntary and that they are free to choose not to participate without consequence.

If staff are not available to participate in interviews, we will work closely with program leaders to identify other staff with similar knowledge, or to schedule telephone interviews or follow-up conversations. Because the evaluation is voluntary, any member of the program may choose not to participate. This may lead to nonresponse bias in the results if those who do not respond differ systematically from those who do. Any substantial nonresponse, such as agencies at a site declining to make their staff available, will be reported as a study limitation. We do not anticipate this type of nonresponse. We also note that nonresponse from staff who are less familiar with FUP or FUP-FSS and its use for youth may be less critical than nonresponse from staff who work directly with the program.

The research team will ask PHA staff to recruit youth who participate in the FUP-FSS demonstration for in-depth interviews. If FUP-FSS participation is low among FUP youth at demonstration sites, some youth who receive a FUP voucher but choose not to participate in the demonstration may also be recruited for interviews. Because the interviews are voluntary, any youth may choose not to participate. This may lead to nonresponse bias in the results if youth who do not respond to interview invitations differ systematically from those who agree to interviews. Substantial nonresponse from youth will be reported as a study limitation. We will also report, to the extent possible based on administrative data, how youth who opt to participate in the interviews may differ from youth served by the local FUP-FSS program.


  4. Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.



HUD personnel and staff from the contractor, Urban Institute, reviewed draft versions of the web-based survey instrument and interview protocols. Their comments are reflected in the versions of the instruments included in this package.

During the OMB clearance review period, the study team plans to pretest the survey instrument with a small group of three PHAs and PCWAs.





  5. Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractor(s), grantee(s), or other person(s) who will actually collect and/or analyze the information for the agency.



The individuals to contact are as follows:

Mindy Ault – Government Technical Representative

Office of Policy Development and Research

U.S. Department of Housing and Urban Development

Email: Melinda.A.Ault@hud.gov

Phone: (202) 402-3116


Martha Galvez – Co-Principal Investigator

Urban Institute

Email: MGalvez@urban.org

Phone: (202) 261-5260


Mike Pergamit – Co-Principal Investigator

Urban Institute

Email: MPergamit@urban.org

Phone: (202) 261-5276


Amy Dworsky – Project Consultant

Chapin Hall

Email: adworsky@chapinhall.org

Phone: (773) 753-5900


Mark Treskon – Project Manager

Urban Institute

Email: MTreskon@urban.org

Phone: (202) 261-5348


Amelia Coffey – Research Analyst

Urban Institute

Email: acoffey@urban.org

Phone: (202) 261-5470


Laura Sullivan – Research Analyst

Urban Institute

Email: lsullivan@urban.org

Phone: (202) 261-5837


Hannah Daly – Research Assistant

Urban Institute

Email: hdaly@urban.org

Phone: (202) 261-5509

Mica O’Brien – Research Assistant

Urban Institute

Email: mobrien@urban.org

Phone: (202) 261-5571

1 For example, it may be useful, if feasible, to combine site visits for sites that are in both the HHS- and HUD-sponsored studies and to include common Urban Institute research team members.
