Evaluation of Veterans Homelessness Prevention Demonstration

OMB: 2528-0287

Veterans Homelessness Prevention Demonstration

Task Order Number

Contract Number C-CHI-01115T-0001


Paperwork Reduction Act Submission for Veterans Homelessness Prevention Demonstration Evaluation Focus Groups and Telephone Survey


Part B: Statistical Methods


Revised August 16, 2012


Prepared for

Elizabeth Rudd, PhD

Government Technical Representative

U.S. Department of Housing and Urban Development

Office of Policy Development and Research

451 Seventh St, SW, Room 8120

Washington, DC 20410


Prepared by

Bohne G. Silber, PhD

Silber & Associates

13067 Twelve Hills Rd, Suite B

Clarksville, MD 21029


Martha R. Burt, PhD

Mary K. Cunningham
Jennifer Biess

Urban Institute

2100 M St NW

Washington, DC 20037


PART B: STATISTICAL METHODS

The U.S. Department of Housing and Urban Development has contracted with Silber & Associates and its subcontractor, the Urban Institute, to conduct baseline and follow-up telephone surveys with participants in the Veterans Homelessness Prevention Demonstration (VHPD). The goal of this survey is to track housing outcomes and other measures of well-being for veterans who receive services through VHPD. We plan to survey 500 VHPD participants and to collect administrative data on two comparison groups, one of non-VHPD veterans and the other of Homelessness Prevention and Rapid Re-Housing Program (HPRP) participants. Each comparison group will comprise 500 people.


B1. Respondent Universe, Sample Selection, and Expected Response Rates


B1.1. Respondent Universe

To understand the impact of the program on VHPD participants, we will compare their outcomes with those of veterans who would otherwise qualify for the program but did not receive services, and with those of non-veterans who received prevention services similar to those offered by VHPD. To accomplish this, we will sample three groups:


  • Group 1: VHPD Participants

The respondent universe for Group 1 is all VHPD participants who receive services at the five demonstration sites.1 We expect that sites will enroll approximately 2,500 veterans, around 500 per site, during the program grant period.


  • Group 2: Veterans who contact VAMCs.

The respondent universe for Group 2 is veterans who are at risk of homelessness and who contact local VA medical centers. The universe for this group is unknown.


  • Group 3: HPRP Participants

The respondent universe for Group 3 is non-veterans who received services from the sites’ Homelessness Prevention and Rapid Re-Housing Program (HPRP); its size varies by site. We estimate that each site will enroll approximately 2,000 HPRP participants, and that around 75 percent of those enrolled will not be veterans.


As described in more detail below, the data from these groups will allow the research team to examine the efficacy of VHPD in preventing homelessness, including an impact analysis that examines differences in outcomes between VHPD clients and other veterans, and between veterans and non-veterans. We will report the findings from the process and outcome studies in interim and final reports that summarize results and draw policy implications.


B1.2. Sample Selection

One of the biggest challenges to understanding program effects in nonexperimental designs, such as the VHPD evaluation, is selection bias. To understand the true impact of VHPD on program participants, it is critical to create a counterfactual that addresses the question: all else equal, what would have happened in the absence of the VHPD intervention? This requires selecting samples of groups that did not receive VHPD, but that look similar to program participants who did receive services.


Evaluating demonstrations such as VHPD is particularly challenging because the goal (homelessness prevention) is relatively new, no standardized interventions exist, program sites vary considerably in their service configuration and local circumstances, and the interventions change in greater or lesser ways over time as programs gain experience with their clientele and with what seems to work. Under these circumstances the evaluation “gold standard” of random assignment is inadvisable; the investment in random assignment studies is best reserved for much more controlled situations. These considerations, coupled with cost constraints and the fact that the program had launched before the research got underway, made using random assignment impossible.


Nevertheless, one wants to make some comparisons if at all possible, to answer two questions:

  1. Among veterans at similar risk for housing loss/homelessness, does the intervention make a difference to housing outcomes? and

  2. Among households at similar risk for housing loss/homelessness who receive approximately the same intervention, does being a veteran make a difference to housing outcomes?


Comparing Group 1 to Group 2 on housing outcomes is meant to address question 1, while comparing Group 1 to Group 3 is meant to address question 2. Below, we explain the universe for each group, to the best of current knowledge; how VHPD participants will be selected for interviews to comprise Group 1; and how Group 2 and Group 3 members will be selected to resemble the makeup of Group 1 as closely as possible. Should serious differences remain after group selection, we will create propensity scores to use as weights in each group-to-group comparison to compensate for the remaining differences. Propensity scores will be estimated from available baseline characteristics associated with risk of homelessness, to produce unbiased estimates of program effects.
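For illustration only, the sketch below shows one way such propensity score weights could be computed. It uses Python with hypothetical file and variable names (the actual analysis may use different software and fields): each person is scored on the probability of being a VHPD participant given the baseline matching variables, and comparison group members are weighted by the odds of that probability so that, after weighting, their baseline profile resembles Group 1’s.

    import pandas as pd
    import statsmodels.api as sm

    # Hypothetical combined analysis file: one row per person, with an indicator
    # for VHPD membership (1 = Group 1, 0 = comparison) and baseline matching variables.
    df = pd.read_csv("group1_group2_analysis_file.csv")
    predictors = ["female", "family_household", "employed", "oef_oif_ond"]

    # Estimate each person's propensity of being a VHPD participant.
    X = sm.add_constant(df[predictors])
    pscore = sm.Logit(df["vhpd"], X).fit().predict(X)

    # Weight comparison-group members by the odds of VHPD membership; VHPD
    # participants keep a weight of 1 (weighting the comparison group to the treated).
    df["weight"] = 1.0
    comp = df["vhpd"] == 0
    df.loc[comp, "weight"] = pscore[comp] / (1 - pscore[comp])

    # Check balance: weighted means of each matching variable, by group.
    balance = df.groupby("vhpd").apply(
        lambda g: g[predictors].mul(g["weight"], axis=0).sum() / g["weight"].sum()
    )
    print(balance)

Means of each matching variable before and after weighting would be reported alongside each comparison, so readers can see both the level of imbalance and the effect of the adjustment.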


B1.2.a. Group 1: VHPD participants

Universe: All veterans with incomes below 50% of area median income who face a housing crisis that puts them at imminent risk of losing their housing or becoming homeless, or who are already homeless but for fewer than 90 days (i.e., this program is not meant for chronically homeless people).

Size of universe: unknown—there are no national or local estimates of the size of this population.

Recruiting approach: this group is not technically a sample. VHPD sites have been accepting clients since about April 2011; each site has served at least 100 people so far, out of the 300 each is expected to serve over the 3-year life of the program. Given our anticipated start date of September 3, 2012 and our need to interview 100 VHPD participants from each site to reach our target sample size, we will recruit and interview every veteran who enrolls and signs the study’s informed consent between September 3, 2012 and the point at which we reach 100 interviewees per VHPD site, probably around June 30, 2013.

Assessing Group 1 characteristics for identifying/matching Group 2 and Group 3 members: We will use three VHPD target population characteristics as the basis for matching for Groups 2 and 3. These are: (1) female/male veteran—% female; (2) family/individual—% family households, meaning at least one minor child lives in the household; and (3) employed/not employed at enrollment—% employed. We will use one additional matching variable for Group 2, OEF/OIF/OND—% OEF/OIF/OND, as everyone in Groups 1 and 2 will be veterans and hence have this information. We will use one additional matching variable for Group 3, prevention/rapid rehousing—% prevention, as everyone in Groups 1 and 3 will have received one or the other. We will of course also have the VHPD site itself as a geographical matching variable.


B1.2.b. Group 2: Non-VHPD Veteran Comparison Group

Universe: All veterans enrolling in VAMC services before June 1, 2011,2 with incomes below 50% of area median income and facing a housing crisis that puts them at imminent risk of losing their housing or becoming homeless, or who are already homeless but for fewer than 90 days.

Size of universe: unknown—there are no national or local estimates of the size of this population.

Selection approach: Through an MOU with the Urban Institute and S&A, the University of Pennsylvania’s National Center for Homelessness Among Veterans (NCHV) will use VA and VAMC administrative data to select group members that, collectively, match the proportions female, family households, employed, and OEF/OIF/OND that exist among Group 1 members. This matching will be done per VHPD site, as these proportions are known to vary by site. NCHV will select Group 2 members to match to each site’s Group 1 cohort by first filtering the national VA databases to choose only veterans in the VHPD program site catchment areas or very close to them, to hold geography as constant as possible. The search will work backward from May 31, 2011 for as many months as it takes to compose a Group 2 for each VHPD site.
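The sketch below illustrates the backward monthly search in simplified form, using Python with hypothetical file and field names and illustrative target values; in practice NCHV will perform this selection inside VA systems against the Group 1 proportions supplied for each site.

    import pandas as pd

    # Hypothetical extract of eligible VAMC intake records for one site's catchment
    # area (already screened for income and housing risk). Field names are illustrative.
    records = pd.read_csv("vamc_intakes_site.csv", parse_dates=["enroll_date"])

    # Target proportions computed from that site's Group 1 cohort (illustrative values).
    targets = {"female": 0.12, "family_household": 0.35,
               "employed": 0.40, "oef_oif_ond": 0.55}
    needed = 100  # Group 2 members sought for this site

    # Work backward from May 31, 2011, one month at a time, accumulating records
    # until the group is large enough (or the records run out).
    cutoff = pd.Timestamp("2011-05-31")
    selected = records.iloc[0:0]
    while len(selected) < needed and cutoff > records["enroll_date"].min():
        window = records[(records["enroll_date"] <= cutoff) &
                         (records["enroll_date"] > cutoff - pd.DateOffset(months=1))]
        selected = pd.concat([selected, window])
        cutoff -= pd.DateOffset(months=1)

    # Compare achieved proportions with the Group 1 targets; records would then be
    # added or trimmed (or propensity weights applied) to close remaining gaps.
    for var, target in targets.items():
        print(var, round(selected[var].mean(), 2), "vs. target", target)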


B1.2.c. Group 3: HPRP Participants Comparison Group

Universe: All non-veteran households enrolling in HPRP services starting June 1, 2011 (to make it simultaneous with VHPD enrollment) in the catchment areas covered by the five VHPD programs, with incomes below 50% of area median income and facing a housing crisis that puts them at imminent risk of losing their housing/becoming homeless, or who are already homeless but for fewer than 90 days.

Size of universe: HPRP grantees have been reporting this information to HUD, but the data are not public, so we cannot provide an estimate.

Selection approach: Through MOUs with administrators of HMISs covering the five VHPD catchment areas, we will access HMIS administrative data to select group members that, collectively, match the proportions female, family households, employed, and prevention/rapid rehousing that exist among Group 1 members. This matching will be done per VHPD site, as these proportions are known to vary by site. HMIS administrators will select Group 3 members to match to each site’s Group 1 cohort; doing these matches locally will hold geography as constant as possible.


B1.3. Expected Response Rates

We are collecting new data via baseline and follow-up surveys from Group 1. A high (80 percent) response rate is expected for the follow-up data collection effort; with 500 baseline interviews we expect about 400 follow-up interviews. The 80 percent projection is based on the following facts: (1) veterans will receive a monetary incentive as a token of appreciation for their participation, and (2) Silber & Associates will conduct an intensive follow-up campaign with non-respondents. For Groups 2 and 3, we rely on already existing administrative data, so we expect a 100 percent “response rate” for these groups.

Silber & Associates has achieved response rates of 80% in a number of situations where the respondents are difficult to reach or have little or no incentive to participate.  See, for instance, ten customer surveys we conducted in 2005 and 2010 for U.S. Department of Housing and Urban Development (Surveys of Partner Satisfaction with HUD Performance, 2005 and 2010, http://www.huduser.org/portal/pdredge/pdr_edge_research_021012.html).  


Reliability of Estimates

Our goal is to interview 500 VHPD participants in Group 1 (100 per site) and to obtain information on their use of shelters and other homeless services from HMIS and/or HOMES3. In addition, we will have HMIS and/or HOMES data on use of emergency shelters and other homeless services after VHPD rental assistance ends, if such use is reported in local HMISs, and on VA homeless services if they are reported in HOMES. With these data we can do:

  1. Pre-post comparisons of VHPD participants (follow-up vs. baseline) on any variable or scale included in both surveys, and

  2. Treatment (Group 1) to comparison (Groups 2 and 3) analyses of shelter/homeless service use at follow-up for Group 1 and after an equivalent time lapse for Groups 2 and 3.


We have sufficient sample size for pre-post analyses of Group 1/VHPD participants (500 baseline/~400 follow-up), after accounting for likely attrition—that is, sample members who do not complete the follow-up survey. We also have sufficient power to detect significant differences between Group 1 and either Group 2 or Group 3 (500/500). As the evaluation proceeds, it will be important to be aware of the consequences of possibly reduced sample sizes (for example, if OMB approval takes more time than expected or other delays are encountered). For this reason, we examine the implications of these possible changes through power calculations for alternative sample sizes.


In Exhibits 1 and 2, we show power calculations for two hypothetical measures that may be included in the outcomes examined for VHPD impacts: (1) a 5-point well-being scale, comparing the baseline survey with the follow-up survey for Group 1, and (2) the percentage of Group 1 respondents who remain housed at follow-up, compared with Group 2 or Group 3 members at an equivalent time post-program enrollment. The data for the latter comparison will come from administrative data: HOMES for Group 1 vs. Group 2 and HMIS for Group 1 vs. Group 3. A power of 80 percent to detect a difference at the 5% level is generally considered adequate for such comparisons. As the exhibits show, for the first measure the power is adequate to detect a difference of the size shown with samples of only 400, which we expect to obtain with a follow-up survey response rate of 80 percent. The power is also adequate to detect a difference as small as 6 percentage points in housing retention rates (an effect size that seems plausible for the intervention).


Thus the group sizes we propose should give us ample power to detect differences that matter for policy. Such power calculations vary from measure to measure, but these examples show that scale measures (often scales from 1 to 5) are likely to require larger samples to detect the impact of the intervention. In addition, even if impacts can be detected with the full sample sizes for each group, the planned group sizes may not allow for sub-group analyses for many measures.


Exhibit 1: Power to Detect a Difference for a Hypothetical Measure on a 5-Point Scale—Pre-post Comparisons for Respondents to the Baseline and Follow-up Surveys (VHPD Clients)

Measure: Well-Being Scale, 0.2-point difference (Baseline Mean 3.5; Follow-up Mean 3.7)

  Sample Size (Baseline / Follow-up)     Power to Detect the Difference at 5% Level
  500 / 400                              90.9%
  400 / 320                              84.7%

Assumptions: Standard Deviation = 1; one-tailed test.



Exhibit 2: Power to Detect a Difference for a Hypothetical Percentage Measure—Comparing the Treatment Group (VHPD Clients) to the Comparison Groups (Veterans Not in VHPD and Non-Veterans in HPRP)

Measure: Percent of respondents who remain housed

  Means Compared                    Sample Size (VHPD / Comp.)     Power to Detect the Difference at 5% Level
  Veterans 85%; Comparison 75%      500 / 500                      99%
  Veterans 85%; Comparison 79%      500 / 500                      80%

Assumption: One-tailed test.
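The power figures in Exhibits 1 and 2 follow from a standard normal-approximation calculation for one-tailed two-sample tests at the 5 percent level. The sketch below reproduces them; Python with SciPy is used here purely for illustration and is not necessarily the software the analysis will use.

    from math import sqrt
    from scipy.stats import norm

    ALPHA = 0.05                      # one-tailed test at the 5% level
    z_alpha = norm.ppf(1 - ALPHA)

    def power_two_means(diff, sd, n1, n2):
        """Power of a one-tailed two-sample z-test for a difference in means."""
        se = sd * sqrt(1 / n1 + 1 / n2)
        return norm.cdf(diff / se - z_alpha)

    def power_two_props(p1, p2, n1, n2):
        """Power of a one-tailed two-sample z-test for a difference in proportions."""
        se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
        return norm.cdf((p1 - p2) / se - z_alpha)

    # Exhibit 1: well-being scale, 0.2-point difference, SD = 1
    print(power_two_means(0.2, 1, 500, 400))      # about 0.909
    print(power_two_means(0.2, 1, 400, 320))      # about 0.847
    # Exhibit 2: percent remaining housed
    print(power_two_props(0.85, 0.75, 500, 500))  # about 0.99
    print(power_two_props(0.85, 0.79, 500, 500))  # about 0.80

The same functions can be re-run with reduced sample sizes to gauge the effect of attrition or delays on detectable effect sizes.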


B2. Procedures for the Collection of Information

For this study, we are collecting two types of quantitative data: (1) a baseline and follow-up survey of Group 1 VHPD respondents; and (2) administrative data for Groups 1, 2, and 3. Our data collection procedures are described below.

Baseline and Follow-Up Survey Interviews

S&A will conduct baseline and follow-up telephone interviews with VHPD participants in Group 1. We anticipate a response rate of 80 percent for each survey. Silber & Associates has achieved response rates of 80% in a number of situations where the respondents are difficult to reach or have little or no incentive to participate. See, for instance, ten customer surveys we conducted in 2005 and 2010 for the U.S. Department of Housing and Urban Development (Surveys of Partner Satisfaction with HUD Performance, 2005 and 2010, http://www.huduser.org/portal/pdredge/pdr_edge_research_021012.html). We attribute our success to extensive follow-up of non-respondents and to our phone interviewers’ excellent interviewing skills and training. For a telephone survey, we make repeated calls to non-respondents at varying times of the day and on varying days of the week, and we leave a toll-free callback number. During survey periods, our office is open extended hours. This is particularly important for the VHPD study, since many members of the target population are on the west coast. We use a multi-method approach to contacting non-respondents, which might involve, for instance, sending certified letters encouraging their participation in the telephone survey. We project an 80% or better response rate for the VHPD survey because the survey is incentivized ($30 payments to participants) and because we have budgeted for extensive follow-up of non-respondents.


The research team developed a survey instrument for the baseline and follow-up interviews that covers the major areas upon which VHPD interventions are expected to have an impact, as well as baseline characteristics that are important for understanding veterans’ situations at intake and how those situations might affect their experience with and outcomes from VHPD. Specifically:


  • Housing status—nature of current housing, security in that housing (e.g., leaseholder status, cost, trouble paying rent or utilities), some housing characteristics associated with homelessness risk (overcrowding, violence), brief homeless history;

  • Household composition and identity, and the whereabouts of any adults or minor children considered part of the household but not currently living within it;

  • Barriers to maintaining housing or obtaining new housing—for example, sudden loss of income, bad credit, bad rental history, criminal history, or dishonorable discharge;

  • Education and training—completed and any current/recent education, including training and certifications;

  • Income and employment—current employment and brief employment history, income level and sources, and non-cash benefits for the household;

  • Housing costs—rent, utilities;

  • Family health and well-being, including disabilities, children’s school status;

  • Veteran status/military experience—when and where served, deployments, respondent’s perception of impact of military experiences on current housing/household/employment situation; and

  • Demographics—race/ethnicity, gender, age, marital status.


Interviews are expected to last about 25-30 minutes; participating veterans will be sent $30 as a token of appreciation for each interview completed.


Survey Administration

Silber & Associates will administer all aspects of the survey, and will adopt several measures to ensure that data collection runs smoothly and that only high quality data are collected. Specifically:


  • Survey Automation. Silber & Associates' interviewers will conduct telephone interviews using computer-assisted telephone interviewing (CATI) technology. Silber & Associates' IT staff will convert the survey questionnaire to HTML and upload it to Silber & Associates' system, where it can be accessed by the interviewing team members at their computer stations. The CATI questionnaire has built-in logic that automatically customizes the interview based on the respondent's answers to previous questions, streamlining the interviewing process. Respondents' answers are captured instantly in a database, making separate data entry unnecessary.

  • Interviewer training and qualifications. Silber & Associates' telephone interviewers have extensive experience conducting telephone surveys. They have made thousands of calls this year alone and have fielded some difficult surveys, including ones in which the respondents have no vested interest in the study, and receive no incentive to participate. All interviewers have in-depth experience using Silber & Associates' CATI system and have undergone training that covers interviewing skills, techniques, etiquette, instilling confidence in the respondent, and converting refusals to respondents. Dr. Silber will conduct interviewer training for Group 1 surveys and will practice one-on-one with each interviewer to ensure that the highest standards are maintained.


  • Quality control methods. The phone interviews with veterans will be conducted at Silber & Associates' office, not a remote location. The interview supervisor shares space with the interviewers so she can monitor their calls and be available to answer questions. Dr. Silber's office is adjacent to the interviewing room, and she keeps close tabs on the interviewers' calls and progress. The survey data are downloaded daily, and Dr. Silber inspects them for any irregularities. Because the survey questionnaire is automated, old-fashioned coding errors are avoided.

  • Response rate and follow up. Veterans who participate in the telephone survey will receive a monetary honorarium for their time, which is likely to facilitate a high response rate. Silber & Associates has an excellent record of high response rates, due mainly to our persistent follow-up calls to non-respondents. Furthermore, Silber & Associates' interviewers place multiple calls at varying times of the day and days of the week to non-respondents. If we consistently reach voice mail when we call a number, we'll leave Silber & Associates' toll-free number for the veteran to call back.

  • Construction of dataset and transmission to UI. During data collection, the survey responses reside in a Microsoft Access database that is backed up nightly on two servers. At the end of the project, we will create an SPSS file from the Access data. This file will fully define the data fields, including variable names, value labels, and missing values. Silber & Associates will transmit the electronic SPSS file to the Urban Institute for analysis.


Administrative Data and Consent Procedures

For Group 1 (VHPD participants), we propose to obtain information on major outcomes directly from the veterans themselves, conducting telephone interviews with them six months after their VHPD rental assistance ends. The only outcome we will gather for the two comparison groups, because it is the only outcome we can consistently get for these two groups, will be HMIS or HOMES records of homeless service use (mostly emergency shelter for Group 3, and various VA homeless services for Groups 1 and 2). We will also gather HMIS and HOMES shelter and services utilization information for all VHPD participants so that we have at least one outcome consistently measured across all three groups.4 For this, we will need the cooperation of most or all CoCs/HMISs because, although the biggest CoC/HMIS in each VHPD catchment area will be recording and storing information known to VHPD programs for all VHPD participants, it will not have subsequent shelter or other homeless service use for any participants who reside in other CoCs.


Exhibit 3. Required Data and Process to Access Data

  • Create the VHPD survey sample (Group 1). Variables: names and contact information. Consent process: consent form for sharing of name and contact information, administered by VHPD grantees/subgrantees at enrollment. Source: VHPD grantees. Timing: July 2012 through June 2014. Access process: VHPD grantee provides list to UI.

  • Create a veterans comparison group (Group 2). Variables: demographic, socioeconomic, and homelessness history. Consent process: de-identified data; no consent needed. Source: NCHV. Timing: negotiations during 2012; extractions and analysis mid-2014. Access process: UI works with NCHV; NCHV creates the comparison group using propensity score matching and does the analysis.

  • Create an HPRP comparison group (Group 3). Variables: demographic, socioeconomic, and homelessness history. Consent process: de-identified data; no consent needed. Source: primary CoC. Timing: negotiations during 2012; extractions and analysis mid-2014. Access process: UI works with the CoC to obtain data; UI does propensity score matching and analysis.

  • Check HMIS for shelter entry (Groups 1 and 3). Variables: days in shelter. Consent process: written consent from Group 1; de-identified data from Group 3, no consent needed. Source: selection of CoCs. Timing: negotiations during 2012; extractions and analysis mid-2014. Access process: UI works with the CoCs to obtain data; UI does propensity score matching and analysis.

  • Check HOMES data for utilization of VA homeless services (Groups 1 and 2). Variables: days in homeless services. Consent process: written consent from Group 1; de-identified data from Group 2, no consent needed. Source: NCHV. Timing: negotiations during 2012; extractions and analysis mid-2014. Access process: UI works with NCHV; NCHV creates the comparison group using propensity score matching and does the analysis.

Consent Procedures

To ensure that we can include as many households as possible in each group, we will work with VHPD staff to design and implement procedures for obtaining the needed consents well before we ourselves will be able to interview these households. Specifically:


  • Survey Sample List. VHPD program staff will forward names and contact information for Group 1 members to the research team. For this to happen, VHPD participants will need to consent to having their names transmitted. We will obtain consent for accessing this information by administering a study informed consent form at the time of VHPD enrollment.

  • VHPD program data entered into HMIS for Group 1. We will want to access HMIS universal and program-specific data elements from each grantee and their subgrantees, and attach them to the data files created through our own interviewing. We will either obtain consent for accessing this information through existing data release forms, or create study-specific release forms that the site will need to administer during enrollment for Group 1.


  • Cross-check HMIS shelter entry data for Group 1. We will work with most or all of the 16 local CoCs to determine whether VHPD participants and HPRP comparison group members are entering shelter after receiving services. This will require that the research team provide a list that includes each VHPD participant’s name and other identifying characteristics (e.g., date of birth, race, age) to the local CoCs so that they can cross-check their HMIS data. We will either obtain consent for accessing this information through existing data release forms or create study-specific release forms that the site will need to administer during enrollment for Group 1.


  • Comparison group data for Group 2 in the HOMES database. To create a comparison group of veterans who are eligible for VHPD services but did not receive them, we will work with NCHV to develop analysis plans that will identify appropriate Group 2 members in the VAMC intake files and assess their homelessness by their use of VA homeless services as recorded in the HOMES database and its predecessor. We will provide specific parameters and the NCHV will conduct the analyses.


  • Aggregate HMIS data for Group 3 HPRP participants. To compare Group 1 and 3 outcomes with those for a non-veteran population experiencing risk of housing loss and homelessness (Group 3), we will work with the major HMISs in the VHPD catchment areas to provide the research team with data on homelessness outcomes for HPRP participants. We will ask for demographic comparisons to make sure we are comparing similar populations and for housing outcomes, specifically shelter reentry and use of any other homeless services that are reported in HMIS. These negotiations with HMISs may result in receipt of aggregate data fulfilling specific table shells, or in the HMISs extracting de-identified individual-level data from their systems and forwarding it to UI for analysis. The data will not include any personal data or identifiers, and we will therefore not need permission from program participants to receive or analyze these data.


B3. Methods to Maximize Response Rates and to Deal with Issues of Non-response


Response rate and follow up. Veterans who participate in the telephone survey will receive a monetary honorarium for their time, which is likely to facilitate a high response rate. Silber & Associates has an excellent record of high response rates, due mainly to our persistent follow-up calls to non-respondents. Furthermore, Silber & Associates' interviewers place multiple calls at varying times of the day and days of the week to non-respondents. If we consistently reach voice mail when we call a number, we'll leave Silber & Associates' toll-free number for the veteran to call back.


We anticipate a high response rate for this survey; however, some people will refuse to participate. Those refusals could bias our survey estimates if the characteristics of those who refuse differ from those of the people interviewed. The size of the potential bias depends on how much the non-participants differ and on the response rate. Using administrative data, we will compare the demographics of our respondents and non-respondents. If there are significant differences, we will adjust our estimates by applying a post-stratification weighting adjustment that gives our sample the same demographic make-up as the overall population, which will reduce the possibility of nonresponse bias.
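As a simplified illustration of such a post-stratification adjustment, the sketch below (Python, with hypothetical file and variable names) weights follow-up respondents so that their demographic cell shares match those of the full baseline sample, then computes a weighted outcome estimate.

    import pandas as pd

    # Hypothetical files: all baseline interviewees and the subset who completed
    # the follow-up survey, with the same demographic variables in both.
    baseline = pd.read_csv("baseline_sample.csv")
    respondents = pd.read_csv("followup_respondents.csv")
    cells = ["female", "family_household", "employed"]   # illustrative strata

    # Post-stratification weight = baseline share of the cell / respondent share.
    base_share = baseline.groupby(cells).size() / len(baseline)
    resp_share = respondents.groupby(cells).size() / len(respondents)
    weights = (base_share / resp_share).rename("ps_weight").reset_index()
    respondents = respondents.merge(weights, on=cells, how="left")

    # A weighted estimate of an outcome now reflects the baseline demographic mix.
    housed = respondents["remains_housed"]
    w = respondents["ps_weight"]
    print("Adjusted share remaining housed:", round((housed * w).sum() / w.sum(), 3))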


Statistical adjustments for non-response bias in pre-post analyses for Group 1, should it prove to be more than 5-10% on any available matching variable, will be handled in the same way we expect to handle propensity score matching when comparing Group 1 to Group 2 or Group 3.  That is, we use the available demographic variables to predict being in the baseline group vs. being in the follow-up group, then use the coefficients of each predictor variable to adjust the value of the predictor variables for the follow-up group.  The procedure generates a weight, or vector score, which combines all the adjustments and is applied to each person in the follow-up group.  Mean values of each predictor variable can be reported before and after the weighting/adjustment, so a reader can see both the level of bias and the effects of the adjustment. 

 

B4. Pre-testing of Procedures and Methods

Silber & Associates’ staff will pre-test the questionnaires in late May and early June 2012, conducting telephone interviews with no more than nine veterans drawn from those already enrolled in VHPD who volunteer for the test. The objectives are to: (a) test the questionnaire for wording, flow, and meaning; (b) determine the average time to complete the survey; and (c) conduct post-survey cognitive interviews with respondents to understand their interpretation of the questions and the reasoning behind their answers. After administering the pretest survey, Dr. Silber, a psychologist, will conduct cognitive interviews to learn about survey fatigue and question clarity, answerability, and sensitivity. Based on these interviews we will make wording and other changes as needed to improve cognitive clarity. The pretest will also provide experience with survey procedures. It will test our ability to contact respondents using the lists from which the sample will be drawn, reach appropriate respondents, and complete interviews with them. The experience will be used to modify the instrument as well as procedures related to contacting potential respondents, scheduling interview times, explaining the survey purpose, and encouraging participation.


B5. Individuals or Contractors Responsible for Statistical Aspects of the Design


  • The agency responsible for receiving and approving contract deliverables is:


Office of Policy Development and Research, Program Evaluation Division

U.S. Department of Housing and Urban Development

451 Seventh St. SW, Room 8120

Washington, DC 20410


Person Responsible: Elizabeth Rudd, HUD/GTR, (202) 402-7607, Elizabeth.C.Rudd@hud.gov


  • The organization responsible for administering the baseline and follow-up telephone surveys is:


Silber & Associates

13067 Twelve Hills Rd, Suite B

Clarksville, MD 21029-1144

Person Responsible: Dr. Bohne Silber, Principal Investigator, (410) 531-2121 ext. 11, bgsilber@silberandassociates.com


  • The organization responsible for statistical design of data to be collected is:


The Urban Institute

2100 M Street, NW

Washington, DC 20037


Persons Responsible:


Ms. Mary Cunningham, Team Leader, (202) 261-5764, mcunningham@urban.org

Dr. Martha Burt, Team Leader, (202) 261-5551, mburt@urban.org


  • The organization responsible for analyzing all data to be collected is:


The Urban Institute

2100 M Street, NW

Washington, DC 20037


Persons Responsible:


Ms. Mary Cunningham, Team Leader, (202) 261-5764, mcunningham@urban.org

Dr. Martha Burt, Team Leader, (202) 261-5551, mburt@urban.org



1 HUD selected five military bases and their surrounding communities to participate in VHPD: Camp Pendleton in San Diego, CA; Fort Hood in Killeen, Texas; Fort Drum in Watertown, NY; Joint Base Lewis-McChord in Tacoma, WA; and MacDill Air Force Base in Tampa, FL. HUD demonstration funds were allocated directly to the largest Continuum of Care (CoC) in the geographic area covered by the VHPD programs: the City of San Diego; Austin/Travis County; Utica/Rome/Oneida County; Tacoma/Lakewood/Pierce County; and Tampa/Hillsborough County.


2 Date selected after consultation with VA and HUD officials because neither VHPD nor Supportive Services for Veteran Families was available before that date, so any confounding with those services would be avoided.

3 HOMES (VA Homeless Operations Management Evaluation System) is a new database that tracks service utilization and outcomes for VA-funded homeless services; it became operational in April 2011. HOMES will include data management information from the VHA Service Support Center (VSSC) and HUD’s Homeless Management Information System (HMIS). HUD’s HMIS includes name, social security number, date of birth, race, ethnicity, gender, veteran status, disabling condition, residence prior to program entry, zip code of last permanent address, housing status, program entry date, program exit date, personal identification number and household identification number, income and sources, non-cash benefits, physical disability, developmental disability, chronic health condition, HIV/AIDS, mental health, substance abuse, domestic abuse, destination, date of contact, date of engagement, financial assistance provided, and housing relocation and stabilization services provided.

4 We will only be able to obtain this information for all VHPD participants at sites that have consent procedures in place to readily share information. Based on what we found in the program reconnaissance, we believe that this includes Tampa and possibly Austin and Utica.


