Supporting Statement Part B for Paperwork Reduction Act Submission
Evaluation of the HUD-DOJ Pay for Success Re-Entry Permanent Supportive Housing Demonstration
OMB # 2528-XXXX
Part B. Collections of Information Employing Statistical Methods
Describe (including a numerical estimate) the potential respondent universe and any sampling or other respondent selection methods to be used. Data on the number of entities (e.g., establishments, State and local government units, households, or persons) in the universe covered by the collection and in the corresponding sample are to be provided in tabular form for the universe as a whole and for each of the strata in the proposed sample. Indicate expected response rates for the collection as a whole. If the collection had been conducted previously, include the actual response rate achieved during the last collection.
The respondent universe for each survey includes the key project partners engaged in Pay for Success (PFS)-related tasks in each demonstration site. We will survey the full universe of respondents, not a sample. To identify the full universe, the Urban Institute maintains a list of key partners in each site; it will confirm the completeness of this list through conversations with grantees before the survey period begins and will update the list regularly as needed. Exhibit 1 below outlines the types of organizations and categories of staff involved in each site. The relevant organizations and staff will differ across sites and over time within any given site. The universe of respondents will be updated as needed for each survey, if and when key project partners change at each demonstration site.
Exhibit 1. Description of Key Project Partners for the Survey Respondent Universe
Organization | Staff Category/Role
Grantees/Intermediaries | Financial intermediary/fiscal agent; knowledge/programmatic intermediary/project coordinator; analysis/reporting compliance
Government partners | Government lead/government representative in: budget/finance; programmatic; communications/elected officials; legal
Service providers | Executive/associate director; service provider program director; service provider outreach director; service provider financial director
Evaluation partner | Evaluator team leads and associates/analysts
Technical assistance advisors | Team leads and associates/analysts
Investors/funders | Investor leads and associates/analysts
Outside legal | Lead attorney and legal associates
Based on our knowledge of the sites, each of the seven demonstration sites has approximately five partner organizations involved in implementation, and each partner organization has between one and four staff regularly working on PFS-related tasks. This information informs our estimate of the total universe of respondents, outlined in exhibit 2 below. We will not select a specific number of respondents; rather, we estimate the total universe of respondents across all seven sites.
Exhibit 2. Estimate of Total Universe of Respondents
Site | Estimated Partner Organizations | Estimated Individual Staff
Site 1 | 5 | 15
Site 2 | 5 | 14
Site 3 | 5 | 10
Site 4 | 5 | 20
Site 5 | 5 | 15
Site 6 | 5 | 12
Site 7 | 5 | 14
Total Estimated Universe of Respondents | 35 | 100
The goal of the Weekly Text Time Survey is to collect data on time spent by every person regularly engaged in PFS-related tasks within the organizations and staff roles outlined in exhibit 1. The respondent universe is therefore the full universe, not a sample. From these organizations, an estimated 100 individuals will be identified (exhibit 2) and asked to participate in the Weekly Text Time Survey, based on their regular work on PFS-related tasks. We expect an overall weekly response rate of 70 percent.
The goal of the Monthly Web-Based Time Survey is to collect data on time spent on PFS-related tasks by staff who participate only on an ad hoc basis. This survey will be fielded to a supervisory staff member at each key project partner organization (exhibit 2), not to a sample of these organizations. These respondents will be aware of the time spent on PFS tasks by contractors and staff members with only occasional or intermittent involvement in PFS, whose time would not be captured by the Weekly Text Time Survey. It would be inefficient to ask staff and contractors with only occasional PFS tasks to respond to a weekly survey, but in combination these workers could account for significant time that should be counted. The Monthly Web-Based Time Survey is a compromise that we consider the best option for producing reasonable estimates of the time spent by this group. We expect a response rate of 70 percent.
Exhibit 3. Weekly Text Time Survey and Monthly Web-Based Time Survey Respondents and Response Rates
Respondent category | Universe | Number covered by collection | Number of responses expected | Response rate
Key Project Partners | 100 | 100 | 70 | 70%
Project Supervisors | 35 | 35 | 25 | 70%
Annual Web-Based Partnership Survey
The goal of the Annual Web-Based Partnership Survey is to collect data on partner perceptions and interactions and on community changes that may benefit the target population. The respondent universe for this survey overlaps with that of the Weekly Text Time Survey and the Monthly Web-Based Time Survey. The survey will be fielded to the full universe of respondents, not to a sample. From these organizations, an estimated 100 individuals will be identified (exhibit 2) and asked to participate in the data collection. We expect a response rate of 70 percent.
Exhibit 4. Annual Web-Based Partnership Survey Respondents and Response Rate
Respondent category | Universe | Number covered by collection | Number of responses expected | Response rate
Key Project Partners | 100 | 100 | 70 | 70%
Describe the procedures for the collection of information including:
Statistical methodology for stratification and sample selection,
Estimation procedure,
Degree of accuracy needed for the purpose described in the justification,
Unusual problems requiring specialized sampling procedures, and
Any use of periodic (less frequent than annual) data collection cycles to reduce burden.
At the start of the survey period (targeted for January 2018 for all surveys), HUD will send an introductory Respondent Contact Letter to respondents explaining the importance of the evaluation and of participation in data collection, as well as the role of the Urban Institute (Appendix A).
Weekly Text Time Survey and Monthly Web-Based Time Survey
Overview: The Urban Institute will field the Weekly Text Time Survey to the universe of participants described in exhibit 1 who regularly work on PFS-related tasks. The Monthly Web-Based Time Survey will be fielded to a supervisory staff member at each organization described in exhibit 1 whose other staff or contractors work on PFS-related tasks only on an ad hoc basis and thus are not covered by the Weekly Text Time Survey.
Consent: Launch of the Weekly Text Time Survey will be preceded by an email from the Urban Institute requesting consent to participate. Because the survey uses texting technology and respondents' cell phones and involves frequent data collection, active consent will be recorded through a Qualtrics link embedded in the invitation email. Only the Weekly Text Time Survey will request active consent.
Survey Administration: The Weekly Text Time Survey will be administered via Short Message Service (SMS). When the survey period begins, a weekly text message will be sent to participants asking for the number of hours they spent on PFS tasks in the previous week. A reminder text will be sent automatically a day later if they fail to respond. Respondents will also have the option of replying through a web survey, using Qualtrics, or of submitting weekly reports via paper or spreadsheet. The Monthly Web-Based Time Survey will be administered online using Qualtrics survey software. Stakeholders will be contacted by email and invited to take the survey, which is designed to be completed online and is accessible through multiple platforms, such as computers, tablets, and smartphones. The Weekly Text Time Survey is designed to take approximately 2 minutes to complete, and the Monthly Web-Based Time Survey approximately 10 minutes. The universe of respondents will be updated as needed for each survey, if and when key project partners change at each demonstration site.
Data Management and Storage: Access to survey data by Urban Institute staff will be password controlled and limited to staff who are involved in fielding the survey and who have signed the confidentiality agreement. All survey and other sensitive data will be saved to an encrypted network drive, with access limited to Urban Institute staff who need to work with the raw data and who have signed the confidentiality agreement. Access will be available only on site, through password-protected PCs.
Follow-up and Quality Control: Respondents to the Weekly Text Time Survey will receive a weekly text message asking for their hours spent and one text reminder if they do not respond. At the end of the month, we will email each participant a summary of their reported hours, highlighting any gaps, and ask for confirmation. If a participant failed to report for one or more weeks, we will ask for a monthly total that includes the missing week(s). If the email does not draw a response, we will make a reminder call. Surveying respondents each week, rather than asking them to provide retrospective time data on a less frequent basis, will allow us to produce a more accurate estimate of the time spent by the participants most involved with PFS tasks. During the survey fielding period, the survey manager at the Urban Institute will monitor incoming data weekly to inspect for any irregularities. The research team will share monthly response rates with the sites and the HUD Government Technical Representative (GTR) to encourage higher response rates.
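To make the monitoring step concrete, the sketch below shows one way the incoming weekly data could be tallied into per-site response rates. This is an illustrative Python/pandas example, not the evaluation's actual code; the column names (respondent_id, site, week) are assumptions.

```python
import pandas as pd

def weekly_response_rates(roster: pd.DataFrame, responses: pd.DataFrame) -> pd.DataFrame:
    """Per-site response rates for each survey week.

    roster: one row per consented participant ('respondent_id', 'site').
    responses: one row per submitted weekly report ('respondent_id', 'week').
    Column names are illustrative assumptions.
    """
    eligible = roster.groupby("site")["respondent_id"].nunique().rename("eligible")
    responded = (
        responses.merge(roster, on="respondent_id")
        .groupby(["week", "site"])["respondent_id"]
        .nunique()
        .rename("responded")
    )
    rates = responded.reset_index().merge(eligible.reset_index(), on="site")
    rates["response_rate"] = rates["responded"] / rates["eligible"]
    return rates
```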
Analysis: To analyze the time survey data, the Urban Institute will crosswalk respondent IDs to respondent labor costs and complete a simple calculation of PFS time costs: PFS time costs = hours x hourly labor cost. The labor cost estimates will be based on the most recent Bureau of Labor Statistics Occupational Employment Statistics (OES) median hourly wages. The Urban Institute will sum the time data by lifecycle stage and by PFS role to provide several ways to understand and interpret the findings and the time cost of implementing a PFS permanent supportive housing model. The analyses will acknowledge the uncertainties caused by missing data, explain the methods used to improve the data, and discuss the potential impacts on the cost estimates.
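As a concrete illustration of this calculation, the following sketch applies the formula PFS time costs = hours x hourly labor cost and sums the result by lifecycle stage and PFS role. It is a minimal Python/pandas example; the data layout and column names are assumptions, and the wage table is presumed to hold the BLS OES median hourly wage crosswalked to each respondent ID.

```python
import pandas as pd

def pfs_time_costs(time_data: pd.DataFrame, wages: pd.DataFrame) -> pd.DataFrame:
    """Compute PFS time costs = hours x hourly labor cost.

    time_data: 'respondent_id', 'lifecycle_stage', 'pfs_role', 'hours'.
    wages: 'respondent_id', 'hourly_wage' (BLS OES median wage for the role).
    Column names are illustrative assumptions.
    """
    merged = time_data.merge(wages, on="respondent_id")
    merged["time_cost"] = merged["hours"] * merged["hourly_wage"]
    # Sum by lifecycle stage and PFS role to support the analyses described above.
    return merged.groupby(["lifecycle_stage", "pfs_role"], as_index=False)["time_cost"].sum()
```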
Data Delivery: The annual data submission to HUD will consist of the cleaned survey data collected to date, de-identified, with sites identified by number (e.g., Site 1, Site 2) and respondents identified by role (e.g., investor, end payor). Metadata for each file will provide the file name, a complete description of the survey methodology, the dates fielded, the participant universe, and the response rate. The methodology description will include any measures taken to correct for nonresponse, if necessary. Files will be provided in both SAS and comma-separated values (CSV) formats. Submissions will also include a data dictionary, tables of frequencies, and a copy of the survey instrument for each file.
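The de-identification step could look roughly like the sketch below, which replaces site names with numbered labels and respondent IDs with role labels before export. This is an illustrative Python/pandas example; the mapping dictionaries, column names, and file name are hypothetical, and the SAS version of each file would be produced in a separate export step.

```python
import pandas as pd

def deidentify(df: pd.DataFrame, site_map: dict, role_map: dict) -> pd.DataFrame:
    """Replace identifying fields with generic labels before delivery.

    site_map: e.g., {"Some City": "Site 1"}; role_map: e.g., {"R017": "investor"}.
    Both mappings and the column names are hypothetical.
    """
    out = df.copy()
    out["site"] = out["site"].map(site_map)
    out["respondent_role"] = out["respondent_id"].map(role_map)
    return out.drop(columns=["respondent_id"])

# Hypothetical usage: write the de-identified CSV deliverable.
# deidentify(survey_df, site_map, role_map).to_csv("weekly_time_survey.csv", index=False)
```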
Other Notes: The Weekly Text Time Survey and the Monthly Web-Based Time Survey do not require sample selection or specialized sampling procedures because they will be fielded to the full universe of respondents. Administering these surveys less frequently would reduce the reliability of the data.
Statistical methodology for stratification and sample selection: N/A
Estimation procedure: N/A
Degree of accuracy needed for the purpose described in the justification: N/A
Unusual Problems requiring specialized sampling procedures: N/A
Any use of periodic (less frequent than annual) data collection cycles to reduce burden: N/A
Annual Web-Based Partnership Survey
Overview: The Urban Institute will field the Annual Web-Based Partnership Survey to the universe of participants described in exhibit 1. The research team developed a survey instrument that draws on tested questions from other surveys intended to measure the strength of partnerships and community- and system-level changes. The instrument includes questions on collaboration with partners, data sharing and outcomes, and barriers to service provision.
Survey Administration: The survey will be administered online using Qualtrics survey software. Stakeholders will be contacted by email and invited to take the survey, which is designed to be completed online and is accessible through multiple platforms, such as computers, tablets, and smartphones; a PDF version will be available for download for informational purposes only. The survey is designed to take approximately 15 minutes to complete. The universe of respondents will be updated as needed for each survey, if and when key project partners change at each demonstration site.
Data Management and Storage: While the survey is being fielded, completed and partially completed surveys will be stored on the Qualtrics secure site. Respondents will access the survey through a link and ID provided in the email invitation; once they have completed the survey, they will no longer have access. Access by Urban Institute staff will be password controlled and limited to staff who are involved in fielding the survey and who have signed the confidentiality agreement. All survey and other sensitive data will be saved to an encrypted network drive, with access limited to Urban Institute staff who need to work with the raw data and who have signed the confidentiality agreement. Access will be available only on site, through password-protected PCs.
Follow-up and Quality Control: During the survey period, reminders will be sent on different days to respondents who have not responded after two weeks and again after four weeks. Because we are surveying the entire population of interest, the population is not large, and we expect the pool of non-respondents to be small, we anticipate that efforts to increase response rates during the survey period will be more effective than post-survey adjustments for producing reliable data. Progress on survey administration will be reported biweekly to HUD through production reports showing ongoing response rates.
Analysis: Upon survey completion, a summary with tables of frequencies for all survey questions will be provided. Results from the Annual Web-Based Partnership Survey will be presented in the final report as descriptive statistics and correlations. Crosstabs of survey responses by organization type and respondent role will also be produced.
Data Delivery: The annual data submission to HUD will consist of the cleaned survey data collected to date, de-identified, with sites identified by number (e.g., Site 1, Site 2) and respondents identified by role (e.g., investor, end payor). Metadata for each file will provide the file name, a complete description of the survey methodology, the dates fielded, the participant universe, and the response rate. The methodology description will include any measures taken to correct for nonresponse, if necessary. Files will be provided in both SAS and CSV formats. Submissions will also include a data dictionary, tables of frequencies, and a copy of the survey instrument for each file.
Other Notes: The Annual Web-Based Partnership Survey does not require sample selection or specialized sampling procedures because it will be fielded to the full universe of respondents. Administering this survey less frequently would reduce the reliability of the data.
Statistical methodology for stratification and sample selection: N/A
Estimation procedure: N/A
Degree of accuracy needed for the purpose described in the justification: N/A
Unusual Problems requiring specialized sampling procedures: N/A
Any use of periodic (less frequent than annual) data collection cycles to reduce burden: N/A
Describe methods to maximize response rates and to deal with issues of nonresponse. The accuracy and reliability of information collected must be shown to be adequate for intended uses. For collections based on sampling, a special justification must be provided for any collection that will not yield "reliable" data that can be generalized to the universe studied.
All proposed surveys will be fielded to the full universe of respondents and will not be based on sampling. The accuracy and reliability of the survey data will therefore depend on adequate response rates and data quality. Strategies for maximizing response rates and addressing nonresponse are discussed below.
Weekly Text Time Survey and Monthly Web-Based Time Survey
Maximizing Response Rates. Together with HUD, the Urban Institute has conducted webinars and other informational calls with grantees to introduce the Weekly Text Time Survey and the Monthly Web-Based Time Survey and to solicit feedback. This engagement prior to the survey period is intended to build buy-in among respondents and improve the proposed survey processes. HUD will also send an introductory Respondent Contact Letter to all respondents at the start of the survey period to encourage participation (Appendix A).
For the Weekly Text Time Survey, we expect to achieve a response rate of 70 percent. To achieve this rate, respondents will be able to access the survey in the manner most convenient to them. The survey will primarily be fielded through a weekly SMS text message. As noted above, respondents will have the option of responding by text or by following a link to a web browser; if they do not feel comfortable responding via cell phone, they can submit weekly time reports via paper or a web-based log. We expect that most respondents will choose the SMS option. In addition, the SMS survey manager will monitor weekly response rates and send a text message reminder to those who have not responded within a certain timeframe: after the initial text message is sent on Friday afternoon, a text reminder will be sent Monday morning if there is no response, and email or phone outreach will be conducted for respondents who remain unresponsive. The survey manager on the research team will monitor incoming SMS data biweekly to inspect for any irregularities. The research team will share biweekly response rates with the grantees and the HUD GTR to encourage higher response rates.
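The reminder escalation described above reduces to a simple decision rule, sketched below. This is an illustrative Python function, not production survey logic; the state flags and action labels are assumptions.

```python
def next_outreach(responded: bool, text_reminder_sent: bool) -> str:
    """Next outreach step for a Weekly Text Time Survey participant.

    Mirrors the schedule above: initial text Friday afternoon, one text
    reminder Monday morning, then email or phone outreach for continued
    non-response. Action labels are illustrative.
    """
    if responded:
        return "none"
    if not text_reminder_sent:
        return "send_text_reminder"   # Monday morning, per the schedule above
    return "email_or_phone_outreach"  # for persistently unresponsive participants
```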
For the Monthly Web-Based Time Survey, we expect a response rate of 70 percent. Participants will be prompted by email to take the survey once per month, with one email reminder and phone outreach to respondents who remain unresponsive. We will share a summary of the time estimates with respondents quarterly to encourage responses and to confirm the quarter's time estimates. The survey will be fielded through Qualtrics, a web-based survey platform, and will be designed to be accessible through a variety of electronic devices, including personal computers, tablets, and cell phones.
Issues of Nonresponse. The Weekly Text Time Survey presents the greatest challenge to achieving the target response rate because of the frequency and duration of the proposed data collection and because we are requesting active consent to contact respondents by weekly text message.
We expect to see three different types of nonresponse:
People who did not provide active consent (see the consent procedures described above)
People who actively consented, but frequently do not respond
People who actively consented, but sometimes fail to respond
For people identified as respondents for the Weekly Text Time Survey who do not provide active consent to participate, we will discuss their decision to opt out and the importance of the survey, and we will attempt to find another source for their time spent, such as administrative data or a supervisor. If we are unable to collect any time information, we will try to match the refusal with a participating person in a similar role and use that person's data to impute time estimates for the refusal.
People who consent will receive a weekly text message asking for their hours spent and one text reminder if they do not respond. At the end of the month, we will email each participant a summary of their reported hours, highlighting any gaps, and ask for confirmation. If a participant failed to report for one or more weeks, we will ask for a monthly total that includes the missing week(s), which can be entered through a web survey. If the email does not draw a response, we will make one reminder call.
As a final measure to fill in missing data for survey participants, we will impute values based on the data we have received. For participants with many gaps, we will try to match the respondent with one or more persons in a similar role who have more complete data, and impute based on those estimates. Where a participant's data are more complete, we will base the imputation on patterns from that participant's previous responses.
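One plausible implementation of this two-tier imputation is sketched below: respondents with mostly complete data are imputed from their own response pattern, and sparse records fall back to the mean for similar roles. This is an illustrative Python/pandas example; the column names and the 50 percent completeness threshold are assumptions, and the actual matching rules would be set during analysis. The source flag in the sketch supports the record of each estimate's source noted below.

```python
import pandas as pd

def impute_missing_weeks(time_data: pd.DataFrame) -> pd.DataFrame:
    """Fill missing weekly hours, flagging each value's source.

    time_data: 'respondent_id', 'role', 'week', 'hours' (NaN = missing week).
    Column names and the 50% completeness threshold are illustrative assumptions.
    """
    out = time_data.copy()
    out["source"] = out["hours"].notna().map({True: "reported", False: "imputed"})

    own_mean = out.groupby("respondent_id")["hours"].transform("mean")  # own pattern
    role_mean = out.groupby("role")["hours"].transform("mean")          # similar-role fallback
    completeness = out.groupby("respondent_id")["hours"].transform(lambda s: s.notna().mean())

    # Use the respondent's own pattern when data are mostly complete; otherwise
    # fall back to the mean for persons in a similar role.
    out["hours"] = out["hours"].fillna(own_mean.where(completeness >= 0.5, role_mean))
    return out
```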
We will maintain a record of the source of each estimate.
Annual Web-Based Partnership Survey
Maximizing Response Rates. Together with HUD, the Urban Institute has conducted webinars and other informational calls with grantees to introduce the Annual Web-Based Partnership Survey and solicit their feedback. This engagement prior to the survey period is intended to build buy-in among respondents and improve the proposed survey process. HUD will also send an introductory Respondent Contact Letter to all respondents at the start of the survey period to encourage participation (Appendix A).
For the Annual Web-Based Partnership Survey, we expect a response rate of 70 percent. The study team will employ a variety of techniques to ensure the highest possible response rate, including a survey design that allows respondents to complete the survey on various types of electronic devices; effective communication before the survey to prepare respondents for participation; assurance that only de-identified, aggregated data will be shared; survey reminders throughout the fielding period; ongoing response tracking; and email follow-up with non-responders. Reminders will be sent to key personnel who have not responded after one week and again after three weeks. The research team will share response rates with the grantees and the HUD GTR to encourage higher response rates.
Issues of Nonresponse. The universe will be stratified by grantee type and stakeholder role; if particular strata have not achieved the target response rate after three weeks, we will make reminder phone calls to stakeholders within those strata. Completed surveys from the late-to-complete cohort will be compared with all other completed surveys to test for significant differences. Because of the overlap between the Weekly Text Time Survey and the Annual Web-Based Partnership Survey, we may be able to collect characteristics of non-respondents to one survey from their responses to the other. This additional information on differences between respondents and non-respondents could also be used in testing for bias.
To measure the extent of response bias for each survey, we will conduct two types of tests. First, we will stratify the universe by organizational characteristics and respondent role and compare the characteristics of respondents with those of non-respondents. If the characteristics of respondents differ significantly from those of non-respondents, the estimates may be adjusted through weighting, and the results of the testing will be reported. Second, we will use t-tests to compare respondents with non-respondents using information acquired from non-participants who have participated in one of the other surveys. Because we are surveying the entire population of interest, the population is not large, and we expect the pool of non-respondents to be small, we anticipate that efforts to increase overall response rates will be more effective than post-survey adjustments for producing reliable data.
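A minimal sketch of the t-test comparison is shown below, using scipy. The data frame layout (one row per person in the universe, with a boolean responded flag and a characteristic drawn from another survey) is a hypothetical assumption, and Welch's unequal-variance form of the test is a design choice rather than a stated requirement.

```python
import pandas as pd
from scipy import stats

def nonresponse_bias_test(universe: pd.DataFrame, characteristic: str):
    """Compare respondents with non-respondents on one characteristic.

    universe: one row per person, with a boolean 'responded' column and the
    named characteristic (e.g., hours reported in another survey).
    The layout is an illustrative assumption.
    """
    resp = universe.loc[universe["responded"], characteristic].dropna()
    nonresp = universe.loc[~universe["responded"], characteristic].dropna()
    # Welch's t-test: does not assume equal variances in the two groups.
    return stats.ttest_ind(resp, nonresp, equal_var=False)
```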
Describe any tests of procedures or methods to be undertaken. Testing is encouraged as an effective means of refining collections of information to minimize burden and improve utility. Tests must be approved if they call for answers to identical questions from 10 or more respondents. A proposed test or set of tests may be submitted for approval separately or in combination with the main collection of information.
The Urban Institute will pretest each survey in fall 2017 with up to 9 respondents across sites who are active in PFS-related tasks and who volunteer for the test. The objectives are to (a) test each survey for wording, flow, and meaning; (b) verify the estimated time to complete the survey; and (c) conduct post-survey interviews with respondents to assess their interpretation of the questions and the reasoning behind their answers. After administering the pretest survey, the Urban Institute will conduct interviews to learn about survey fatigue and question clarity and answerability. Based on these interviews, we will make wording and other changes as needed to improve clarity and reduce time burden. The pretest will also provide experience with survey procedures, such as the ability to contact respondents and, in the case of the Weekly Text Time Survey, the most effective days and times for data collection.
Provide the name and telephone number of individuals consulted on statistical aspects of the design and the name of the agency unit, contractors, grantees, or other person(s) who will actually collect or analyze the information for the agency.
The agency responsible for receiving and approving contract deliverables is:
Office of Policy Development and Research, Program Evaluation Division
U.S. Department of Housing and Urban Development
451 Seventh St. SW
Washington, DC 20410
Person Responsible: Marina Myhre, Social Science Analyst/GTR, HUD (202-402-5705)
The organization responsible for survey design, data collection, and data analysis is:
The Urban Institute
2100 M St. NW
Washington, DC 20037
Persons Responsible:
Mary Cunningham, Co-Principal Investigator, Urban Institute (202-261-5764)
Akiva Liberman, Co-Principal Investigator, Urban Institute (202-261-5704)
Christopher Hayes, Survey Lead, Urban Institute (202-261-5650)
Timothy Triplett, Senior Survey Advisor, Urban Institute (202-261-5579)