Supporting Statement for Paperwork Reduction Act Submission
Evaluation of the HUD-DOJ Pay for Success Re-Entry Permanent Supportive Housing Demonstration
The U.S. Department of Housing and Urban Development (HUD) and the U.S. Department of Justice (DOJ) entered into an innovative interagency collaboration that combines DOJ's mission to promote safer communities by focusing on the reentry population with HUD's mission to end chronic homelessness. This collaboration resulted in the Pay for Success Permanent Supportive Housing Demonstration, with $8.68 million awarded in June 2016 to seven communities to develop supportive housing, using Pay for Success (PFS) as a funding mechanism, for persons cycling through the jail or prison systems. The PFS Demonstration grant supports activities throughout the PFS lifecycle, including feasibility analysis, transaction structuring, and outcome evaluation and success payments, with each grantee receiving funds for different phases in the PFS lifecycle.
HUD and DOJ funded a national evaluation to assess whether PFS is a viable model for scaling supportive housing in order to improve outcomes for a re-entry population. The evaluation is funded through an interagency agreement and is managed by HUD’s Office of Policy Development and Research. The overarching goal of this formative evaluation is to learn how the PFS model is implemented in diverse settings with different structures, populations, and community contexts. The Urban Institute has designed a multi-disciplinary, multi-method process study to “learn as we do” and meet the key objectives of the formative evaluation.
This information collection request concerns two specific data collection activities that are part of the national evaluation: (A) a Partnership Survey about the development and functioning of partnerships and community-level collaborations that may benefit the target population, and (B) Time Use Interviews conducted as part of a study of the staff time used to develop each PFS project through the phases of the PFS lifecycle (feasibility analysis, transaction structuring, and project implementation).
The national evaluation also includes other activities not discussed in this package, including annual site visits involving semi-structured interviews with key stakeholders and observation of partnership meetings, monthly calls to discuss implementation progress and successes and challenges encountered, and review of key site documents.
At the center of PFS's theory of change is the idea that PFS acts as an instrument to bring cross-sector partners together to work strategically and collaboratively across silos toward better outcomes for a vulnerable population. There are many reasons a PFS project may not continue through the full lifecycle, but, in the case of an early termination, the PFS process itself may still yield real benefits, both for the partners in terms of collaboration and, thereby, for the target population.
To understand whether the PFS projects are realizing these types of benefits, the research team will collect data on shifts in community collaboration, data sharing, and service provision through a Partnership Survey. Specifically, the following questions will be answered:
Throughout the PFS lifecycle, how do PFS partner perceptions and interactions change and how does partners’ “business as usual” change in ways that benefit the target population? This includes questions regarding community support for interventions for the target population, collaboration among service providers and local systems of care, data sharing, resource sharing, and evaluating performance.
In implementation, how do PFS projects improve effective collaboration among partners working with the target population, particularly in terms of better sharing of information among service providers, attention to the needs of and improved access to services for the target population, better program performance, and improved sustainability planning?
One research objective for the evaluation is to document the time costs incurred by the demonstration grantees and their partners as they move through the Pay for Success (PFS) phases: feasibility assessment, transaction structuring, and implementation. (We distinguish the costs of developing a PFS project here from the costs of designing and implementing the permanent supportive housing (PSH) intervention, which include housing and service provision.) Most of the PFS-specific costs are staff time. Time is needed for intensive engagement with partners, educating partners about the project and the PFS process and bringing them to the table together from different sectors, collecting and combining data from different sources, conducting feasibility assessments, structuring the transaction, finding and securing end payors and investors, and maintaining engagement with all parties throughout the process.
During the ongoing process study, some grantees have reported that, in each PFS phase, the amount of time they are spending on this project exceeds the budgeted costs covered by their HUD grants. The feasibility analysis and transaction structuring phases require involvement from a broad set of partners, including individuals whose time is not billed to the project grant. During project implementation too, PFS requires the continued involvement of a set of partners in activities such as oversight and monitoring. Changes may also need to be made to the design of the intervention based on early implementation results; in a PFS project, this can necessitate the renegotiation of the contract formed in the transaction structuring phase. Many of these activities involve a time commitment that would not be expected for implementing PSH outside the PFS structure.
The goal of this data collection activity is to describe those time costs for the PFS process and across the PFS phases. Research in this area is limited: PFS has not previously been used in a consistent manner to support PSH on the scale of this initiative, so the Demonstration provides an important opportunity to assess the time costs of the PFS process.
Specifically, the goal is to answer the following questions:
How much time do partners spend on developing the PFS PSH project in each lifecycle phase? How does this vary by site?
During each PFS lifecycle phase, which partners are spending the most time on the development of the project? What level of staff is working on the project? How does this change over time?
The original plan for this research objective was to collect data on the time costs of PFS through a weekly SMS (Short Message Service) survey, as an efficient way to capture the time each key partner spends on the project. Each staff member identified by the site as spending time on the project would receive an SMS (text) message every Friday asking how many hours they had spent on the project in the past week.
However, pre-testing the text survey and interviewing key staff at several sites led the research team to conclude that it will not be possible to collect reliable and detailed time cost data through this method. The nine pre-testers represented a variety of roles: intermediary, government, service provider, investor, evaluator, and technical assistance provider. Most felt it was an intrusion to receive a work-related text message on their personal cell phone; many refused to provide their number or even to participate in the pre-test. Those who did opt in to the text message completed the survey regularly, but they also expressed that response rates would likely decrease over time. An e-mail survey was created as an alternative for those who preferred it to text messages; however, those who opted into the e-mail survey were much less likely to complete it. Two did not complete it even once over the two-week pre-test period, despite reminder e-mails, and three others completed it only once. In addition, testers expressed concern that starting this component so late in the PFS process would miss a great deal of data from the beginning of the projects, including all of the feasibility phase.
In response to the objections by pre-testers to the original plan, the research team, along with HUD and DOJ, developed an alternative method of collecting regular cost data from sites: supplementing administrative timesheet records with qualitative interviews.
The Demonstration awarded $8.68 million to seven grantees, to be distributed across PFS phases according to each project's budget. The grantees bill their time to HUD through the Disaster Recovery Grant Reporting (DRGR) system. The research team can therefore use DRGR administrative data as one measure of how much time grantees are spending on the project during the different PFS phases.
The newly proposed data collection method of quarterly qualitative interviews is intended to supplement this administrative data with information from staff members about time costs that are beyond those covered by their grants.
The research team will use the following data collection activities:
Quarterly interviews with selected supervisors and administrative staff to collect summaries of time spent on PFS tasks by staff and consultants.
Specifically, the following questions will be answered:
How many people whose time is not covered by the HUD grant are working on the PFS project? How much time do they spend on the PFS project?
For staff whose time is covered by the HUD grant, do they spend significantly more time than the grant covers? How much time do they spend on the project beyond what is covered by the grant?
Is other funding being secured or leveraged to support the PFS process (e.g., philanthropic support, pro bono time)? How much staff time does that support?
These interview data will then be combined with the DRGR records to create a complete picture of time use for the organizations across the phases of the PFS lifecycle. In the first interview with each organization, the research team will attempt to collect historical data on time spent during PFS lifecycle phases that are already complete.
As described above, this is a new initiative, and no information has previously been collected. This data collection is part of the national evaluation, managed by the Office of Policy Development and Research at HUD, which is designed to help HUD and DOJ assess whether PFS is a viable model for scaling supportive housing to improve outcomes for a re-entry population.
This data collection request is for two specific data collection activities: a Partnership Survey and Time Use Interviews. The Partnership Survey will be used to document partner perceptions and interactions and community-level changes that may benefit the target population, and the Time Use Interviews will be used to assess staff time spent on development of each PFS project throughout the different lifecycle phases. These questions are of particular importance to the broader implementation study because they provide context on the level of effort expended and the benefits accrued within a jurisdiction, which speak to a site's motivation to complete a project.
This is a new collection. Data will be collected through an online survey administered annually by the Urban Institute, for up to the duration of the evaluation, which is funded through 2021. (A renewal application will be submitted after three years, as required.) The survey will be administered to partners in a variety of roles within the grantee and partner organizations participating in the PFS Demonstration. The survey will only be available for completion online. The data will be used for analyses of partner perceptions and interactions and community-level changes that may benefit the target population in each Demonstration site.
The survey will ask respondents to answer questions that:
Assess collaboration among partners within the PFS project
Assess collaboration among partners on shared tasks outside the PFS project
Provide information on the use of outcome-based procurement
Provide information on the collection and use of data outside the requirements of the PFS project
Item-by-item justification for the Partnership Survey instrument (Appendix C) is provided in exhibit 4.
EXHIBIT 4. Item-by-Item Justification of Partnership Survey Instrument

I. Background (Appendix C, Partnership Survey, Questions 1-7)
These questions ask for the respondent's description of their organization, primary service focus, title or position within their organization, years in their position, number of PFS project meetings attended, and organizational commitment to the primary PFS outcome areas. Together, these questions provide a picture of survey respondents and their organizational focus.

II. Collaboration with Partners (Appendix C, Partnership Survey, Questions 8-17)
These questions capture respondents' perspectives on the nature, strength, and frequency of partnerships among organizations involved in the PFS project and the community's overall commitment and capacity to collaborate to serve the PFS project's target population. Respondents are asked to rate the clarity of different aspects of their PFS project's goals and roles; to assess their community's commitment to and experience serving a chronically homeless jail reentry population; to report how much collaboration occurs among the community partners that serve the target population and how that collaboration has changed; to rate a range of factors based on how problematic they are for collaboration among organizations serving the target population; and to report how often their organization engages with other partners that serve the target population.

III. Data Sharing and Focus on Outcomes (Appendix C, Partnership Survey, Questions 18-23)
These questions capture respondents' perspectives on how organizations serving the PFS project's target population share data for program design and management. Respondents are asked how their community shares data to identify and serve a chronically homeless jail reentry population and how it uses evidence to develop and manage supportive housing programs for that population; what types of client data their organizations share with other partners to improve service outcomes, how often, and with which types of organizations; and whether outcome-based procurement has become more common as a financing mechanism outside of the PFS project.

IV. Barriers to Service Provision (Appendix C, Partnership Survey, Questions 24-25)
These questions capture respondents' perspectives on the presence of common barriers to serving the target population in their community and how well the community has addressed those barriers. Respondents are asked to rate a range of potential organizational-level and population-level barriers, including resources, regulations, and technology, based on how problematic they are for the target population's access to services, and to indicate how well the community has addressed some of the most common system-level barriers to supportive housing for a chronically homeless jail reentry population.

V. Implementation and Sustainability (Appendix C, Partnership Survey, Questions 26-27)
These questions capture respondents' perspectives on the presence of infrastructure and supports for the project activities that would indicate the sustainability of programming for the target population. Respondents are asked to indicate how solidified the activities are in community leadership, partner organization programming, and long-term financial planning.
Information on time spent that is not recorded in the DRGR system will be collected through quarterly interviews with a key staff member at each organization identified in the contact information data collection phase. Interviews will be conducted over the phone by the Urban Institute. Each informant will be asked to report on (1) time spent by individuals who are covered by the grant and submitting hours to DRGR but who are spending more time than is covered, and (2) time spent by individuals who do not record time spent in DRGR. The interviews will be conducted following the protocols in the Time Use Interview Guide (Appendix D).
If the organization does not submit time into DRGR, the research team will probe for an estimate of all time spent on PFS tasks in the previous quarter, by role. If the organization reports time spent through DRGR, the research team will investigate whether there were activities not covered in the DRGR report. Some time spent by high-level staff may be captured through attendance records at work groups and steering committees. The research team will also explore with the representative at each organization whether there are other, simple ways to capture that time involvement.
This is not intended to be an audit of the grantees in this demonstration. The research team is responding to grantees’ acknowledgement that PFS can be a heavy lift. Through the process study, individuals have shared directly that they are spending time outside of what they can bill to the HUD grant. The PFS field is emerging; right now, it is only known anecdotally that PFS requires a significant time and monetary commitment. This cost study aims to illuminate what is unknown about the costs of each PFS phase. By being specific about the time and associated costs spent by different sites across the PFS phases, the evaluation will inform other communities’ understanding of the potential costs involved in using PFS to implement PSH for a re-entry population.
The survey will be administered online using Qualtrics survey software. Stakeholders will be contacted by email and invited to take the survey. An online survey was determined to be the most cost-effective method of collecting responses from the community of stakeholders. The survey is designed to be completed online and is accessible through multiple platforms, such as computers, tablets, and smartphones; a PDF version will be available for download for informational purposes only.
Information will be collected through telephone conversations. No use of other information technology is anticipated.
There have been no previous efforts to collect data from these organizations implementing PFS about the benefits to partnership and collaboration from the PFS process.
While HUD will require the PFS grantees to report annually on grant activities, accomplishments, financial summaries, and draw-downs, this reporting will not be sufficient to ascertain the actual time spent on PFS activities, by whom, and at what staff level and organization type.
This collection of information will affect all PFS grantees and their partners participating in the surveys, including local and state government agencies, non-profits, and financial organizations. We will seek to minimize burden on these entities by providing clear and concise information on the purpose of our data collection via email and by conducting the data collection online via Qualtrics, accessible through multiple platforms, such as computers, tablets, and smartphones.
The survey has been designed to minimize respondent burden; the Partnership Survey is expected to take 15 minutes to complete, as described below. Pre-testing of the survey in fall 2017 helped trouble-shoot technical issues that should reduce the time burden on respondents. Pretest results are described below in questions 9 and 13 in Part A and question 4 in Part B.
Time Use Interview
To reduce the organizational burden of the quarterly interviews, only one person from each organization identified in the contact collection phase will be interviewed. In addition, informant interviews will be conducted by phone, scheduled at the interviewee's convenience, and kept to a minimal amount of time.
HUD and DOJ entered into an innovative interagency collaboration that resulted in the Pay for Success Permanent Supportive Housing Demonstration, with $8.68 million awarded to seven grantees in June 2016. This interagency collaboration and demonstration is new. If the proposed data were not collected, or were collected less frequently, then HUD and DOJ would not be able to meet the evaluation objectives: (1) to learn how the PFS model is implemented in diverse settings with different structures, populations, and community contexts and (2) to assess whether PFS is a viable model for scaling supportive housing to improve outcomes for a re-entry population.
This survey will be administered annually, for up to five years or the duration of the evaluation. There is no other comprehensive source for this information. This component of the evaluation is necessary to understand the impact of collaboration among the participating organizations and how those partnerships change over time. Without the partnership and community change information obtained from this survey, HUD would not know the benefits of funding permanent supportive housing through a PFS framework. Administering this survey less frequently would affect the reliability of the data.
The only current time data on PFS implementation comes from submissions to HUD's DRGR system, for billing purposes. There is no current source of data concerning time use beyond the staff time billed to the HUD award. This component of the evaluation is necessary to understand the additional costs of PFS to these organizations, beyond what is covered by the HUD grant. Without this information, HUD will not know the additional costs of funding permanent supportive housing through a PFS framework. Conducting the interviews less frequently would reduce the reliability of the data.
The proposed data collection activities are consistent with the guidelines set forth in 5 CFR 1320 (Controlling Paperwork Burdens on the Public). There are no special circumstances that require deviation from these guidelines.
Under this ICR, HUD will not conduct any data collection requiring respondents to report information to the agency more often than quarterly;
Under this ICR, HUD will not conduct any data collection requiring respondents to prepare a written response to a collection of information in fewer than 30 days after receipt of it;
Under this ICR, HUD will not conduct any data collection requiring respondents to submit more than an original and two copies of any document;
Under this ICR, HUD will not conduct any data collection requiring respondents to retain records, other than health, medical, government contract, grant-in-aid, or tax records, for more than three years;
Under this ICR, HUD will not conduct any data collection in connection with a statistical survey that is not designed to produce valid and reliable results that can be generalized to the universe of study;
Under this ICR, HUD will not conduct any data collection requiring the use of a statistical data classification that has not been reviewed and approved by OMB;
Under this ICR, HUD will not conduct any data collection that includes a pledge of confidentiality that is not supported by authority established in statute or regulation, that is not supported by disclosure and data security policies that are consistent with the pledge, or that unnecessarily impedes sharing of data with other agencies for compatible confidential use; or
Under this ICR, HUD will not conduct any data collection requiring respondents to submit proprietary trade secrets or other confidential information unless the agency can demonstrate that it has instituted procedures to protect the information's confidentiality to the extent permitted by law.
The language of the 60-day notice is included in this package; the notice was published in the Federal Register on August 25, 2017, on pages 40586-40588. The comment period has ended, and no comments were received.
Grantees were consulted during the research design process to obtain feedback on the proposed data collection in order to reduce burden on grantees and increase clarity of instructions.
The Partnership Survey pre-test evaluated survey questions adapted from other survey efforts. The survey was tested with 8 respondents representing 4 different sites (Lane County, OR; Austin, TX; Alaska; Los Angeles, CA) and 4 different PFS roles (government, service provider, evaluator, TA provider) across sites to ensure clarity of the instructions and the data elements to be reported. Feedback on formatting and content was solicited from all testers via e-mail and by phone. Based on tester feedback, changes were made, including adding a progress bar, removing ambiguity about whether the survey is asking about the PFS project or the broader community, and adding questions about the benefits of data infrastructure and sustainability.
Pre-tests of the previously proposed text-message-based survey indicated that it would not achieve an acceptable response rate. In response, we propose to replace that data collection with quarterly interviews of key informants, combined with the use of administrative data.
No incentives, or other payments or gifts, will be offered to survey or interview participants.
Before beginning any survey or interview, stakeholders will be provided an explanation of the purpose of the evaluation and how their responses will be used. Respondents will be told that their individual responses will be anonymous and de-identified and will be reported publicly only in the aggregate. However, they will also be told that unique roles or responses could potentially be identifying, and therefore we are not promising confidentiality.
The survey research instruments and interview protocols will be reviewed and approved, prior to initiating any research, by the Urban Institute's Institutional Review Board, which operates according to the Common Rule on the Protection of Human Subjects found in Title 45 of the Code of Federal Regulations, Part 46 (45 CFR 46). The information requested under this collection is protected and held private in accordance with 42 U.S.C. 1306, 20 CFR 401 and 402, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130. A Privacy Impact Assessment was approved by the Department on 11/17/2017.
Authority to offer confidentiality is based on the following:
Section 3(b) of the Department of Housing and Urban Development Act, as amended, 42 U.S.C. 3532, authorizes the Secretary to “conduct continuing comprehensive studies, and make available findings, with respect to the problems of housing and urban development.”
Section 7(r)(1) of the Department of Housing and Urban Development Act, as amended, 42 U.S.C. 3535, provides that appropriated funds “shall be available to the Secretary for evaluating and monitoring of all such programs . . . and collecting and maintaining data for such purposes.” Subsection (r)(4)(a) of the act further provides that the Secretary “may provide for evaluation and monitoring under this subsection and collecting and maintaining data for such purposes directly or by grants, contracts, or interagency agreements.”
Section 502(g) of title V of the Housing and Urban Development Act of 1970, as amended, 12 USC 1701z-2 (g), authorizes the Secretary "to request and receive such information or data as he deems appropriate from private individuals and organizations, and from public agencies." It further provides that "[a]ny such information or data shall be used only for the purposes for which it is supplied, and no publication shall be made by the Secretary whereby the information or data furnished by any particular person or establishment can be identified, except with the consent of such person or establishment."
No questions of a sensitive nature will be included.
The Partnership Survey will be administered annually to an estimated 100 people in management roles in partner organizations participating in the PFS Demonstration.
The surveys were pilot-tested with 8 respondents to improve the survey instrument. We asked respondents to record their start and stop times to gauge the appropriateness of our burden estimates. Respondents reported that the web-based Partnership Survey took less than 15 minutes, and timing in Qualtrics showed the average was 13 minutes. Therefore, the burden estimate is based on 0.25 hours (15 minutes) to complete, and the estimated annual burden for the Partnership Survey is 25 hours.
Quarterly interviews will be conducted four times per year with an estimated 64 people in supervisory or administrative positions in partner organizations participating in the PFS Demonstration. The burden estimate is based on 1 hour to complete each interview, and the estimated annual burden for the interviews is 256 hours.
Based on the assumptions and tables below, we calculate the annual burden hours for these study activities to be 298 hours in aggregate and the annual cost to be $8,060.76, broken down in detail by respondent type, burden, and wages below.
Based on the expectation that the typical key project partner role is either a management or support role, we estimated their cost per response using the average of the most recent (May 2017) Bureau of Labor Statistics, Occupational Employment Statistics median hourly wages for the labor categories “social and community services manager” and “community and social service specialist, all other.”
Respondent | Occupation | SOC Code | Median Hourly Wage Rate | Average (Median) Hourly Wage Rate
Key Project Partners | Social and Community Services Manager | 11-9151 | $30.82 | $25.92
Key Informant for Time Use Interviews | First-Line Supervisors of Office and Administrative Support Workers | 43-1011 | $26.47 | $31.10
Source: Bureau of Labor Statistics, Occupational Employment Statistics (May 2017), https://www.bls.gov/oes/current/oes_stru.htm
All assumptions are reflected in the table below.
Information Collection | Number of Respondents | Frequency of Response | Responses Per Annum | Burden Hours Per Response | Annual Burden Hours | Hourly Cost Per Response | Annual Cost
Partnership Survey | 168 | 1 | 168 | 0.25 | 42 | $30.82 | $1,294.44
Time Use Interviews | 64 | 4 | 256 | 1 | 256 | $26.47 | $6,766.32
Total | 232 | | | | 298 | | $8,060.76
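For reference, each row's annual burden hours equal the responses per annum multiplied by the burden hours per response, and each row's annual cost equals the annual burden hours multiplied by the hourly cost per response. For example:

Partnership Survey: 168 responses x 0.25 hours per response = 42 annual burden hours; 42 hours x $30.82 per hour = $1,294.44
Time Use Interviews: 64 respondents x 4 interviews per year x 1 hour per interview = 256 annual burden hours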
There are no additional total annual cost burdens to respondents or record-keepers beyond the labor cost of burden-hours described in item 12 above.
The estimated cost to the federal government for these data collection efforts under the Evaluation of the Pay for Success Permanent Supportive Housing Demonstration totals $68,483.36 over a 12-month period. The data collection costs are based on the competitively bid and awarded contract for this study.
Task 6: Data Collection

Labor
IDIQ Labor Category | Estimated Task Hours | Hourly Rate | Total Cost
Senior Principal Associate/Scientist | 72 | $366.11 | $26,359.92
Senior Associate | 80 | $237.97 | $19,037.60
Senior Programmer/Analyst | 80 | $133.37 | $10,669.60
Programmer/Analyst | 64 | $95.01 | $6,080.64
Research Associate/Analyst | 80 | $58.52 | $4,681.60

Other Direct Costs
Computer Network Services | $1,500.00
Books/Periodicals/Library Services | $10.00
Reproduction @ $0.095/page | $40.00
Telephone Expenses | $40.00
Postage/Delivery | $30.00
Supplies and Miscellaneous | $10.00
Inflation Factor on ODCs (excl. Sub. Admin)* | $24.00

Total Expenses: $68,483.36
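For reference, the total expense equals the sum of the labor costs (estimated task hours multiplied by the hourly rate for each labor category) plus the other direct costs:

Labor: $26,359.92 + $19,037.60 + $10,669.60 + $6,080.64 + $4,681.60 = $66,829.36
Other direct costs: $1,500.00 + $10.00 + $40.00 + $40.00 + $30.00 + $10.00 + $24.00 = $1,654.00
Total: $66,829.36 + $1,654.00 = $68,483.36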
This is a new program.
The start of data collection will be preceded by an introductory Respondent Contact Letter to respondents from HUD to explain the importance of the evaluation, participation in data collection, and the role of the Urban Institute (Appendix A).
The Urban Institute will conduct the Partnership Survey on an annual basis, beginning January 2018, or as soon as OMB approval is received, and ending January 2020 or at the conclusion of the period of performance.
During the survey fielding period, progress on survey administration will be reported biweekly to HUD with production reports showing ongoing response rates. A summary will be provided upon survey completion with tables of frequencies for all survey questions.
Results from the Partnership Survey will be presented in the final report as descriptive statistics and correlations. Crosstabs of survey responses by organization type and respondent role will also be produced.
The first year of survey data will be submitted in February 2019 and analysis of the survey data will be submitted in March 2019. Findings will be discussed in annual reports and briefs and considered in relation to findings across other research objectives, beginning with the Year 2 annual report to be submitted in March 2019. A briefing for HUD will be conducted in July 2019.
The Urban Institute will conduct the quarterly interviews on an ongoing basis, beginning January 2018, or as soon as OMB approval is received, and ending January 2020 or, if an extension to the study is granted, at the conclusion of the period of performance.
The Urban Institute will analyze the time data annually and summarize the costs of each PFS phase as projects move through feasibility analysis, transaction structuring, and implementation. By combining actual grant spending information from DRGR with time estimates from staff informants and any information about additional funding, we will develop descriptions of the overall time spent. The research team will break this down by partner and by site, describing how many people at what level are involved in working on each PFS project and what portion of PFS project time is and is not covered by the HUD grant. A description of these time costs by PFS phase will also be included. Because different demonstration sites' grants cover different PFS phases, data from different sites will contribute to the estimates for the different phases.
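Conceptually, and as a summary of the approach described above rather than a formal estimation model, the combined time measure for a given organization and PFS lifecycle phase can be expressed as:

Total staff time (organization, phase) = hours billed to the HUD grant through DRGR + additional hours reported in interviews for grant-covered staff + hours reported in interviews for staff whose time is not covered by the grant (including time supported by other funding sources)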
The first year of time data will be submitted in February 2019 and analysis of the time data will be submitted in March 2019. Findings will be discussed in annual reports and briefs and considered in relation to findings across other research objectives, beginning with the Year 2 annual report to be submitted in March 2019. A briefing for HUD will be conducted in July 2019.
The expiration date for OMB approval will be displayed on any forms completed as part of the data collection.
No exceptions are necessary for this information collection.