MDRC_FSS PRA Supporting Statement _A revised final 7-18-2013

Family Self-Sufficiency Program Evaluation

OMB: 2528-0296




THE FAMILY SELF-SUFFICIENCY PROGRAM

EVALUATION






SUPPORTING STATEMENT – PART A




OMB CLEARANCE PACKAGE




REVISED



06/07/2013


Submitted to:




U.S. Department of Housing and Urban Development


Contract No: GS-10F-0245N

Part A: Justification


This Supporting Statement provides information on the data collection activities associated with the evaluation of the Family Self-Sufficiency (FSS) program administered by the U.S. Department of Housing and Urban Development (HUD). This evaluation is being conducted by MDRC and its subcontractors, Branch Associates and M. Davis, on behalf of HUD.


A1. Circumstances Necessitating Data Collection


The primary goal of the project is to increase our knowledge about the efficacy of FSS, which is aimed at helping Housing Choice Voucher (HCV) holders secure and maintain employment and gain independence from public support programs.

This study was authorized under the Consolidated Appropriations Act, 2010, Pub. Law 111-117, 123 Stat. 3034, which was approved on Dec. 16, 2009.


Two information collection requests are anticipated for this project. This is the first one, which covers the informed consent, baseline information collection (intake survey/Baseline Information Form), and the burden on public housing agency staff in the random assignment process.


The second information collection request will cover interviews with site staff for the implementation data, interviews with site staff for the cost-benefit analysis, and the survey of both treatment and control families, planned to be conducted in approximately 2016.


A.1.1 Background and Policy Context


The linkage of housing assistance to economic opportunities has long been an important component of U.S. housing policy. For many years, policymakers and analysts have argued that low-income, working-age people who receive a rent subsidy from the government should strive for some degree of economic self-sufficiency, and that housing subsidy systems should play a more active role in supporting these efforts, such as through the Family Self-Sufficiency (FSS) program. However, housing assistance, and the FSS program itself, have had an uncertain impact on economic opportunities for subsidized tenants.

Housing assistance and work

Research findings differ on the question of whether housing assistance per se promotes or impedes tenants’ progress towards self-sufficiency. Some research has suggested that obtaining housing assistance may discourage tenants from entering the labor market. For example, Jacob and Ludwig (2008), making use of a randomized lottery in Chicago, found that receipt of a housing voucher among former residents of private sector housing had a modest negative effect on employment and earnings. Other (non-random assignment) studies have suggested modest positive impacts of housing assistance on labor market outcomes (Olsen, 2003; Newman and Harkness, 2002: 24). Still other research has suggested that the type of housing assistance (vouchers versus public housing) may make no difference either way for families’ employment or earnings, despite the expectation that vouchers could help low-income families move closer to where jobs are easier to find. This was the case with the randomized study of the Moving to Opportunity (MTO) demonstration.1

Whether or not housing assistance harms tenants’ work efforts, the weight of the evidence, including the strongest evidence, suggests that housing assistance by itself (i.e., without any explicit employment-focused intervention tied to it) does not substantially improve work outcomes for tenants (Riccio, 2008). At the same time, the evidence on what kinds of employment-focused strategies linked to housing assistance actually make a difference is scant.


FSS goals, features, and participation


The FSS program, created in 1990 by Section 554 of the National Affordable Housing Act,2 is administered by state and local public housing agencies. HUD provides funding for FSS program coordinators to manage the program and ensure that participants are linked with appropriate services, and it also funds the escrow accounts. HUD does not fund services; PHAs must rely on their own or other resources available in the community. In March 2011, HUD announced that a total of $54 million would be made available for FSS in the Housing Choice Voucher (HCV) program. The funds were allocated to roughly 600 PHAs for an average grant of approximately $90,000.


Tenants’ participation in all FSS programs is voluntary. While PHAs may screen for interest or motivation, they are barred from screening for education, job history, marital status, or credit rating. Tenants who elect to participate typically execute a 5-year self-sufficiency contract that specifies their obligations and the services that will be provided over the course of the contract (US Department of Housing and Urban Development, 2011).

Central to the FSS program model has been the inclusion of a rent escrow provision. Rent increases that, under HUD rent rules, would normally follow a rise in income are instead diverted into an interest-bearing, PHA-administered escrow savings account. Families that successfully complete their contracts receive the full value of their accumulated savings, without restriction on how the money can be used. Participants may also qualify for interim disbursements to cover expenses that can help them meet the terms of their contracts, such as paying for education or training courses. FSS introduced this feature both as an incentive for households to increase work and earnings and as a long-term savings vehicle to help families build their financial assets. These asset-building provisions appeared to play an important role in motivating families’ continued participation in FSS, according to early research on the program (Rohe and Kleit, 1999: 358).


The escrow feature is believed to be critical to the potential success of FSS because it helps to address the HUD rule requiring families to pay 30 percent of their adjusted income in rent. Some experts view this rent rule as imposing an implicit tax that may severely “dampen a parent’s enthusiasm for work” (Newman and Harkness, 2002: 24).
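As a simplified illustration of how the escrow credit offsets this implicit tax, the sketch below computes a monthly escrow credit as the rent increase (30 percent of the annual income gain) that would otherwise follow a rise in income. The function and dollar figures are illustrative assumptions only; the actual HUD escrow-credit formula involves additional adjustments not modeled here.

```python
def monthly_escrow_credit(baseline_annual_income, current_annual_income,
                          rent_share=0.30):
    """Simplified sketch: credit to escrow the monthly rent increase
    (rent_share of the annual income gain, spread over 12 months) that
    HUD rent rules would otherwise impose. Actual HUD escrow rules
    include further adjustments this sketch omits."""
    income_gain = max(0.0, current_annual_income - baseline_annual_income)
    return rent_share * income_gain / 12.0

# Illustrative case: a family whose annual income rises from $12,000 to
# $18,000 would see roughly $150 per month diverted to escrow rather
# than added to its rent.
credit = monthly_escrow_credit(12000, 18000)
```

Under this simplification, the working family keeps the full value of its income gain, since the rent increase accrues to its own escrow account rather than raising its housing costs.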


The second component of the FSS program is case management. HUD funds the PHAs to hire coordinators, who are expected to work with program participants to connect them to services meant to address their barriers to employment. Past research3 suggests that low-income individuals face difficulties in securing and advancing in employment, including the lack of a diploma or certification, poor mental and physical health, limited English fluency, drug addiction, criminal records, and a lack of “soft skills” such as interview protocol and effective communication with supervisors. MDRC’s experience in the New York City Work Rewards demonstration (discussed below) additionally suggests that a high proportion of single mothers (over 80 percent of the Work Rewards sample) enroll in FSS.4 This population faces additional barriers to employment, including a lack of reliable transportation and of access to high-quality childcare. FSS coordinators are expected to tailor referrals to the particular needs of each client. The assumptions that FSS program staff make about these barriers to employment, and their understanding of them, as well as the overall approach the PHA takes toward the self-sufficiency contract (e.g., a “work first” or a human capital development-oriented approach; see below), may be crucial factors in determining whether FSS is effective in helping participants advance and achieve self-sufficiency.


Currently roughly 77,000 households participate in FSS nationwide. The majority (over 80 percent according to HUD) are Housing Choice Voucher (HCV) recipients rather than public housing residents. FSS households make up a very small share of the families receiving various forms of HUD assistance.5 Recent HUD-funded studies of FSS participation reveal that about 5 percent of voucher program users were enrolled in FSS (de Silva et al., 2011: 13; Ficke and Piesse, 2004: xiii). Some analysts have noted declines in FSS program enrollment associated with limits on program funding (Corporation for Enterprise Development, 2011).


Despite FSS’s attractive features, a number of studies have highlighted substantial dropout rates among FSS participants. For example, Rohe and Kleit (1999) and de Silva et al. (2011) described the relatively large proportions of FSS enrollees who fail to complete their programs, either because they drop out or because they are “asked to leave” due to participation problems. Ficke and Piesse (2004) reported that over one-quarter of participants were asked to leave, raising questions about how FSS programs might be better designed, or escrow-like provisions redesigned, to increase the number of clients who remain enrolled through the entire period of their FSS obligations.


Evidence on the effectiveness of FSS


Evaluations of FSS have been handicapped by important methodological limitations, particularly the absence of credible, multi-site tests with control groups, and limited data to follow participants and controls beyond the end of their participation in the programs (Anthony, 2005; Rohe and Kleit, 1999; Riccio, 1997; Riccio 1999; Riccio 2008).


Some studies using non-experimental data suggest that FSS graduates had higher incomes than those who did not participate. In his Rockford, IL study, Anthony (2005: 79) finds that those who remained in the program longer achieved higher escrow savings account balances, with substantial increases in household incomes between program entry and exit. Unlike Rohe and Kleit (1999), he reports higher incomes at enrollment for program participants (Anthony, 2005: 84). However, Anthony (2005: 68) notes that the absence of data on the reasons enrollees complete, drop out of, or are forced out of the program limits the utility of such studies.


More recent studies of FSS have sought to provide a better understanding of how the program operates, the characteristics of those who elect to participate, and the major outcomes or benefits from the program. For example, a study by Ficke and Piesse (2004) was based on a retrospective analysis of program data from 1996 through 2000, and sought to answer the basic question of whether “FSS met basic program goals of increasing self-sufficiency for program participants.” It found that participants’ incomes increased at a faster rate than for other families in HUD’s database. Families had also achieved average escrow account savings of over $3,300. However, these findings may reflect the impact of self-selection into the program upon resulting incomes and savings.


HUD’s 2011 study by de Silva et al. describes FSS program operations and outcomes using a nationally representative sample of 100 FSS programs, and a deeper tracking sample of 181 participants in 14 of those programs from 2005 to 2009. This study found that 37 percent of participants left the program before completing and forfeited their escrow balances. Roughly 24 percent had completed their programs within that period, with another 39 percent still enrolled. The escrow savings of those who completed the program were more than double the balance of those who had already exited FSS. The study noted that most of the program graduates had higher incomes and had already been working at the time they enrolled in FSS, suggesting the same type of “creaming” described in Anthony’s (2005) analysis. However, the study did not involve a randomized control trial (RCT), so the program’s impacts on participants’ labor market outcomes and incomes are not known.


The only evaluation of FSS to date that uses a random assignment research design with a reliable control group is MDRC’s current study, known as Opportunity NYC-Work Rewards (hereafter referred to as Work Rewards). As part of Work Rewards, MDRC is conducting two related random assignment tests in New York City that involve HCV recipients. One study in this project involves a test of the effectiveness of the FSS program operated by the City’s Department of Housing Preservation and Development (HPD), which operates the fourth-largest voucher program in the country, against a control group as well as against another treatment group that was offered an extra financial incentive to work. The second study involves families that hold vouchers from the New York City Housing Authority (NYCHA) and tests the offer of financial incentives without an FSS program. The Work Rewards financial incentive offer was a $300 payment every two months (or $1,800 per year) for two years for sustained full-time employment. The reason for offering this incentive was to test whether the offer of an immediate reward to help “make work pay” could help counteract the potential disincentive effects of existing rent rules of housing vouchers. The evaluations of these treatments include implementation, impact, and cost-benefit analyses covering a five-year follow-up period.


For the study involving HPD voucher-holders, MDRC set up a three-group design in which more than 2,100 voucher-holders were randomly assigned to: 1) FSS-only, 2) FSS+incentives, or 3) a control group. Comparing the FSS-only group to the control group will show whether the FSS program increases work, earnings, and other outcomes beyond levels that participants would have achieved without FSS. Comparing the FSS program group to those offered FSS+incentives will show whether the cash rewards improve upon or “add value” to the effects that FSS produces on its own. As part of this evaluation, MDRC has been studying the process of recruiting voucher holders to become part of the random assignment study; the roles of PHA staff and the strategies used by FSS case managers in local community-based organizations (CBOs) that were contracted by HPD to operate the programs; efforts by the PHA and the CBO staff to market and help participants understand the escrow and (for those eligible) the added work incentive; and the two treatments’ early impacts on critical outcomes.


For the study using the NYCHA sample, MDRC randomly assigned voucher-holders to: 1) a group that received the offer of the full-time work incentive, or 2) a control group that was not offered the incentive. This test will show the effectiveness of just offering the financial incentive, without an FSS program. (NYCHA does not operate an FSS program.)


Early evaluation results from Work Rewards suggest no overall sustained impacts of FSS or FSS+incentives on employment or earnings after 30 months (Verma et al., 2012), but show noteworthy subgroup results. The FSS program, when combined with the reward payments (or incentives), produced large and statistically significant positive effects on employment and earnings for individuals who were not working at study entry. For example, it led to a 6.9 percentage point increase in employment rates, over a control group rate of 22.9 percent, and a 45 percent increase in average earnings. It did not produce positive effects for participants who were already working when they entered the program. The evaluation has also found so far that FSS alone produced a similar pattern of employment and earnings gains for the nonworking subgroup, although the differences were smaller and not statistically significant. The early impact findings for the Incentives-Only study are also largely a subgroup story. In this case, the subgroups that mattered most were defined in terms of participants’ receipt of food stamps at the time of random assignment. (The food stamp program is now called the Supplemental Nutrition Assistance Program, or SNAP.) The Incentives-Only intervention produced sizable and statistically significant increases in earnings for participants who were food stamp recipients, but not for non-recipients. Interestingly, the FSS+incentives and FSS-only interventions produced a similar pattern of results (statistically significant in the latter case). One interpretation of this general finding is that perhaps the extra financial rewards (special workforce incentives) and/or the extra prodding and support offered by the FSS program helped counteract the worries that some food stamp recipients may have had about potential reductions in their food stamp benefits if they earned more money.
This pattern of results has important implications for the recruitment strategies and the impact analysis used in the HUD FSS demonstration, as discussed below.


A1.2 Overview of the FSS Evaluation


MDRC will conduct a comprehensive study of the FSS program. The study will involve at least 2,000 voucher households in up to 20 Public Housing Authorities (PHAs) – the original target sample was 3,000, but the feasibility of enrolling this higher number will be assessed once PHAs have been selected for the evaluation.


The evaluation design is structured around three main research components: an impact analysis, an implementation and participation analysis, and a benefit-cost analysis. The overarching goal of the evaluation is to test the effectiveness of the FSS program, the main self-sufficiency intervention that program participants will be exposed to.


Using a random assignment design, the research will be positioned to speak to the following types of research questions:


  • Does the FSS program improve self-sufficiency outcomes for program participants?


  • Do the effects vary across types of people and places?


  • How does variation in the implementation of the interventions affect participants’ experiences and the interventions’ success?


  • Does the intervention produce positive benefit-cost results?


Impact study


The impact analysis is the component of the evaluation that is focused on examining the effectiveness of FSS in improving outcomes for individuals and the families enrolled in the demonstration. Put simply, it will determine whether the program group had better outcomes than it would have achieved without the program. These impacts will be determined using a two-group randomized control trial. Because random assignment, when properly implemented, helps eliminate systematic differences between the program and control groups prior to the start of the program, any subsequent differences in outcomes – for example, differences in employment, earnings, family income, and poverty – can be attributed to the program with confidence.


Under the current design, the study will randomly assign a total sample of at least 2,000 households6 to one of two groups (at least 1,000 per group):


  • FSS group. These individuals will have access to the core elements of the FSS program – case management services as well as rent escrow provisions.


  • Control group. These individuals will not be enrolled in FSS and will not have access to FSS case management or escrow for 3 years following random assignment (i.e., the embargo period).


The impact analysis will assess the overall and independent effects of the FSS program by comparing the key outcomes of this treatment group to the outcomes of the control group. The study will track both the program and the control groups for a number of years, using administrative and survey data to measure outcomes. As noted, a 3-year FSS embargo will apply to the control group.
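The treatment-control comparison described above can be sketched as a difference in mean outcomes between the two randomly assigned groups, with a two-sample standard error. This is only a stylized version with made-up outcome values; the actual impact analysis will rely on regression adjustment for baseline characteristics and administrative data, not this bare contrast.

```python
import statistics

def estimate_impact(treatment_outcomes, control_outcomes):
    """Stylized RCT contrast: impact = difference in group means,
    with the usual two-sample standard error. Under random
    assignment, this difference is an unbiased impact estimate."""
    diff = (statistics.mean(treatment_outcomes)
            - statistics.mean(control_outcomes))
    se = (statistics.variance(treatment_outcomes) / len(treatment_outcomes)
          + statistics.variance(control_outcomes) / len(control_outcomes)) ** 0.5
    return diff, se

# Hypothetical earnings outcomes (in $1,000s) for illustration only.
impact, std_err = estimate_impact([2.0, 4.0, 6.0], [1.0, 2.0, 3.0])
```

In the actual study, an impact estimate roughly twice its standard error would be statistically significant at conventional levels.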


The impact analysis will examine the program’s effects on a wide range of outcomes. Key clusters of outcomes under consideration are included below.


Education and Work: MDRC will use both Unemployment Insurance wage records and the survey to collect data on employment, earnings, job characteristics, and work search behaviors. Discussions with PHAs have revealed that some programs take a human capital development approach to self-sufficiency and thus emphasize degree, diploma, and certification achievement. Although educational attainment may be regarded as an intermediate outcome, MDRC will track it among study participants through survey data.


Income, assets, finances, and rent burden: If FSS affects participants’ disposable income, it may help them accumulate assets. With survey data, MDRC will assess the effects of the program on household finances and financial behaviors (such as savings, access to credit, and debt reduction). Data on income combined with housing authority and survey data on tenant rent and utilities payments would be used to construct measures of rent burden.


Health, material hardship, and family well-being: As in Work Rewards, Jobs-Plus, and other housing studies, the FSS evaluation will estimate the effects of FSS on residents’ overall health and specific health conditions, and their access to preventive health care. These outcomes could be affected, indirectly, by changes in residents’ income, and by potential changes in their housing and neighborhood contexts. These factors may also affect mental health outcomes, such as depression. Increases in disposable income may also produce reductions in material hardships, including housing-related hardships such as disconnection of phone and utilities, as well as food-related hardships. (MDRC observed such effects on poverty and hardship in its study of New York City’s conditional cash transfer program.)


In addition, MDRC will obtain basic information about all household members, including names, ages, employment status (if appropriate), and relationship to the head of household through the Baseline Information Form (BIF), survey, and the 50058 form.


Depending on resources and sample build-up period, the impact analysis will assess the program’s effects on families covering a period of at least 3 to 4 years after each family’s date of random assignment. MDRC and HUD are also considering the feasibility of conducting longer-term follow-up on families.


Subgroups


The impact analysis will also investigate whether the intervention worked especially well for particular subgroups of families. These subgroups will be identified before the start of the analysis (See Supporting Statement B for additional information on the impact approach).


MDRC is currently considering several subgroups of interest. Informed by the findings from Work Rewards, MDRC plans to examine impacts by work status and SNAP receipt status at program entry. As discussed above, the early impact findings for Work Rewards show that the interventions did have positive effects for certain subgroups that were specified before the impact estimates were calculated. Based on prior research on employment programs, it was expected that the effects of Work Rewards might vary for different types of individuals, defined, for example, in terms of their prior attachment to the labor market and prior receipt of public transfer benefits. The study found that FSS+Incentives produced large and statistically significant increases in average quarterly employment rates and average earnings for voucher holders who were not already working at the time they entered the study. Impact estimates for FSS-Only were also positive for that subgroup, but they were smaller and not statistically significant (meaning that the effects are less certain). Neither of the two interventions improved outcomes for participants who were already working when they enrolled in the study. This pattern is consistent with the study’s qualitative research, which found that the community organizations operating FSS had much more concrete assistance to offer to participants who needed to find jobs than they could offer to participants who were already working and hoping to increase their earnings.

The Incentives-Only intervention produced sizable and statistically significant increases in earnings for participants who were food stamp recipients, but had no effects for participants who were not. As noted above, FSS+Incentives and FSS-Only produced a similar pattern of results (statistically significant in the latter case), perhaps because the special workforce incentives and/or the extra prodding and support offered by the FSS program helped counteract some food stamp recipients’ worries about potential reductions in their benefits if they earned more money.


Finally, earlier reports on FSS, although non-experimental, suggest that education level and TANF status may be associated with success in the program.


Implementation and participation study


The FSS model is viewed as more of a broad intervention framework than a prescriptive model, leaving much discretion to local PHAs to define its actual shape and content. For this reason, it is important for the implementation analysis to describe clearly how that broad framework is interpreted and operated across PHAs. Given the number of PHAs likely to be involved in the study, it will be critical for the implementation analysis to examine variation across a few critical dimensions of FSS operations, so as not to lose the overall picture of the “forest” through an overly detailed description of the “trees.” Important dimensions of variation might include the local context (e.g., labor market, housing market, service environment) in which FSS is implemented; the types of participants enrolled in the program; the service delivery networks developed for the program; the scope and intensity of services offered through those networks; the degree to which participants take advantage of the services offered; the ratio of participants to case managers; and the priorities attached to the variety of goals that fall under the rubric of “self-sufficiency.” The implementation findings should be of interest in their own right, but they are also crucial for making sense of the evaluation’s impact and cost-benefit results. They define the actual treatment, in practice, that the impact and cost-benefit results assess.


While the scope of the implementation research will also be informed by ongoing site observations, it will give particular but not exclusive attention to the following types of topics:


What program approaches do the sites adopt, and are these understood by participants? A major goal of the FSS study is to understand the basic elements of program models in each of the sites included in the study, including case management approaches, recruitment, caseloads, number of meetings per year, goals, and number and type of partner agencies (and the services made available through them). Past studies on FSS have noted that sites also differ on family selection procedures, outreach efforts, FSS activities and supportive services, and terms for program termination. MDRC’s implementation research on FSS in the Work Rewards demonstration revealed that many participants did not fully understand the work focus of the program and enrolled in it more for the social services and supports it offered. Accordingly, our implementation research for the HUD evaluation will describe not only the basic FSS approach across sites, but also how the program is described to participants and how participants want it to help them. The research will draw on analysis of interviews with providers and participants.7


What are the most promising strategies that program coordinators use to assist participants in obtaining greater economic independence? Intermediate outcomes of interest exist between unemployment or underemployment and suitable, stable employment, such as moving from a first job to a higher-paying job, or obtaining additional education or training. Achieving these outcomes will not be easy. Further, programs may vary in their primary emphasis, with some focusing on human capital development and others taking a “work first” approach. As shown by MDRC’s Work Rewards research and other studies of FSS, challenges are likely to include low participation in services and multiple participant barriers to employment. The implementation research will examine how PHAs assist participants in reaching their goals, the assumptions that underlie their case management strategies and referral approaches, and how they address participation challenges.


How do PHAs and service providers market the escrow account? What do providers and participants see as the most effective components of escrow marketing strategies? The implementation research will examine how program staff inform participants in the FSS program about the escrow offer, what residents appear to understand about that offer, and how they say it influences their decisions about work. Work Rewards research revealed the substantial challenges that FSS case managers had in marketing the escrow effectively and in using the escrow as a motivational “hook” for engaging participants in other services. Also, the early reconnaissance suggests that many PHAs only lightly market the escrow account, if at all. The implementation research will try to understand the extent to which the focus on escrow is integrated into staff-participant interactions, and the variations in staff marketing practices across PHAs.


How much do participants engage with case managers, take part in various activities outlined in their participant plans, and achieve interim milestones outlined in those plans?

The implementation study will also conduct a quantitative analysis of participation in FSS activities and achievement of milestones, using information provided by the FSS programs and HUD. MDRC will compare participation patterns across PHAs to try to understand the variation in engagement in program services. This includes paying special attention to the experiences of subgroups of participants, including the unemployed and SNAP recipients, that were found to experience notable impacts in MDRC’s analysis of the Work Rewards program. To the extent possible, MDRC will look for notable patterns in service variation across various dimensions of program focus, size, and quality.


How much do participants accumulate in their escrow accounts? Who is most likely to graduate and accumulate more escrow savings vs. drop out of the program and forfeit their savings? Finally, the implementation study will examine the extent to which participants accumulate escrow savings, how much these savings grow over time, the extent to which savings are forfeited, and which types of participants (according to their background characteristics) accumulate more savings or are more likely to forfeit their escrow. MDRC will look for patterns in escrow accumulation across various dimensions of program focus, size, and quality.


How does HUD funding for the FSS program influence the stability of FSS practice and referral partnerships? One of HUD’s major research goals for the demonstration is to understand how changes in and uncertainty over the stability of funding for program coordinators affect the level of case management and referrals families receive, and whether smaller caseloads may result in better family outcomes. Accordingly, research will address how the FSS annual funding process influences the stability of external partnerships and Program Coordinating Committees, the strengths and challenges of the current funding process for FSS coordinators, and how management and oversight structures for case management and Coordinating Committee efforts influence the quality of services. It will draw primarily on analysis of interviews with PHA FSS program managers and community partners.


Benefit-cost study


The benefit-cost analysis will build upon the impact and implementation analyses and compare the costs of operating the FSS program with the economic benefits it produces. A comprehensive cost-benefit analysis examines the net present value (or net economic “gain” and “loss”) from several accounting perspectives, such as from the perspective of participants and their families, of taxpayers and government budgets, of the PHAs, and of society as a whole. It looks at benefits and costs that are directly observable during the period of data collection, as well as over a longer time horizon (e.g., 5 or 10 years), based on systematic projections using alternative assumptions about trends in costs and impacts. Sensitivity tests would be done to assess how much the general pattern of findings is dependent on the underlying assumptions used for those projections. A comprehensive cost-benefit analysis of this kind requires extensive data on program group members’ service receipt, control group service receipt, and impacts on a variety of economic outcomes.
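The net present value calculation described above can be sketched as discounting a projected stream of annual net benefits (benefits minus costs) back to the base year. The discount rate and the dollar figures below are illustrative assumptions only, not the study's actual parameters, and the sketch omits the sensitivity tests and per-perspective accounting the analysis will perform.

```python
def net_present_value(annual_net_benefits, discount_rate):
    """Discount a stream of annual net benefits (year 0 first) to
    present value: NPV = sum of b_t / (1 + r)^t. A positive NPV
    means projected benefits outweigh costs at rate r."""
    return sum(benefit / (1.0 + discount_rate) ** year
               for year, benefit in enumerate(annual_net_benefits))

# Illustrative stream: a $100 net benefit today plus $110 a year from
# now, discounted at a hypothetical 10 percent rate.
npv = net_present_value([100.0, 110.0], 0.10)
```

Sensitivity testing amounts to recomputing this figure under alternative discount rates and alternative projected benefit streams (e.g., assuming impacts decay, grow, or hold steady) and checking whether the sign and rough magnitude of the NPV hold up.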


A first step in the benefit-cost analysis of the FSS program will be to determine the costs incurred by the PHAs and HUD for this intervention, including the cash value of the escrow savings that are paid. MDRC would estimate these costs using PHA expenditure reports and MIS participation data. The cost of escrow payments will be estimated using PHA records.

It is important to recognize that the program costs incurred by PHAs and HUD will not capture the full government investment if the program causes an increase in the use of services provided by other agencies and institutions. For example, if FSS increases the proportion of voucher recipients who attend community colleges or other government-supported training programs, it would be appropriate to consider those costs as part of the overall resource investment attributable to FSS, even though they are not paid for by the PHAs or HUD. Including such costs requires data on the duration and intensity of participation in those other services, which the FSS programs may not reliably record, in part because participants may not fully report those activities to the program. That information can instead be obtained through interviews with sample members. MDRC will obtain information on unit costs for the types of services received from the appropriate institutions or published sources, and will then compute average costs per sample member. A survey will also make it possible to estimate net government expenditures on services for FSS participants, over and above the cost of relevant services that control group members receive without FSS.


In estimating the costs of the FSS program, our analysis will distinguish between the average costs for providing FSS services and for the escrow savings component. The “benefits” side of the benefit-cost framework builds on estimates produced by the impact analysis, with adjustments for the specified time horizon, an assumed discount rate, and projections of likely trends in impacts (i.e., decay, growth, or no change) after impacts are directly observed.


The overall estimated net present value (the sum of gains and losses) will be computed for each accounting perspective (e.g., program participants, government budget, PHAs, society as a whole). If the available data do not permit a full and comprehensive analysis of costs and benefits (e.g., if survey data are unavailable), the presentation of findings will highlight the limitations of the analysis, but also offer guidance on the ways in which the findings may still be informative for budgeting purposes and for judging the overall merits of FSS.
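To make the net-present-value logic concrete, the following sketch (with entirely hypothetical figures, not estimates from this study) discounts a projected stream of annual net benefits under alternative discount-rate and impact-decay assumptions, mirroring the sensitivity tests described above:

```python
# Illustrative sketch of the benefit-cost NPV and sensitivity logic.
# All dollar amounts and rates below are hypothetical, not study estimates.

def npv(annual_net_benefits, discount_rate):
    """Discount a stream of annual net benefits (years 1, 2, ...) to present value."""
    return sum(b / (1 + discount_rate) ** t
               for t, b in enumerate(annual_net_benefits, start=1))

def project(observed_impact, years, decay=0.0):
    """Extend an observed annual impact over a time horizon, decaying each year."""
    return [observed_impact * (1 - decay) ** t for t in range(years)]

# Hypothetical participant-perspective net gain per family per year.
observed_annual_gain = 1200.0
horizon_years = 10

for rate in (0.03, 0.05):          # alternative discount rates
    for decay in (0.0, 0.10):      # "no change" vs. 10% annual decay in impacts
        stream = project(observed_annual_gain, horizon_years, decay)
        print(f"rate={rate:.0%} decay={decay:.0%}  NPV=${npv(stream, rate):,.0f}")
```

The same two functions would be re-run for each accounting perspective (participants, government budget, PHAs, society), with the sign of each cost and benefit flipped as appropriate.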


A.2. How, by Whom, and for What Purpose Are Data to Be Used


How will the information be used?


The findings from the study will be used to inform the Federal government, Public Housing Authorities, and other stakeholders about the effectiveness of FSS in helping HCV holders secure and maintain employment and achieve self-sufficiency.


Who will collect the information?


MDRC, the contractor for the study, and its subcontractors Branch Associates and M.Davis will collect data through face-to-face interviews with key informants, access to state Unemployment Insurance records, site visits (including observations of program activities, case file reviews, and staff interviews), and a survey. Program data and other existing MIS data will be analyzed, where possible.


Baseline information


Informed consent and baseline information will be collected from all study participants following determination of program eligibility, but prior to random assignment. The baseline form will provide general information about participants that will ensure the comparability of the program and control groups, obtain data needed for subgroup analyses, and facilitate contact for subsequent follow-up surveys.


Informed Consent Form (Appendix A): The informed consent form will ensure that participants: 1) understand the FSS evaluation, as well as their role and rights within the study; and 2) provide their consent to participate. The form will also inform study participants that HUD will be releasing a Notice of Funding Availability (NOFA) to conduct supplemental studies on FSS and that data collected for the core evaluation could be shared with a broader set of researchers selected by HUD.


To ensure that all study participants receive a clear, consistent explanation of the project, the evaluation team will train FSS program staff on how to introduce and discuss the goals and design of the evaluation, the random assignment process, and data collection efforts. All staff will emphasize: 1) that participation in the study is voluntary; 2) that strict rules are in place to protect sample members’ privacy; 3) that data collected during the study may also be used for supplemental studies funded through the FSS small grants Notice of Funding Availability.


Baseline Information Form (Appendix B): The baseline information form will collect identifying information about sample members enrolling in the study, including Social Security number and date of birth. It is also designed to collect demographic information such as race and ethnicity, primary language spoken, and housing and marital status. The form also includes questions about participants’ education and employment history and any involvement with public assistance. Contact information will also be collected on the BIF: in addition to their own name, address, and phone numbers, participants will be asked to provide information for two additional individuals who are likely to be in future contact with them and can assist the research team in locating them for the follow-up survey.


These data will be used by the evaluation team to ensure the comparability of program and control groups, to facilitate contact for follow-up surveys, and to inform subgroup analyses in the impact study. Baseline forms will be administered to individual sample members by program staff members in one-on-one settings, with program staff members available in all cases to answer any questions. MDRC will design a special web-based module to capture the BIF data, and program staff will be trained to use this tool.


Impact analysis data sources


Table 1 provides a summary of the impact analysis outcome clusters described above and the associated data sources. As shown, these data will come from a few data sources, including UI wage records (for employment and earnings outcomes), HUD data, and a survey.



Table 1

FSS Evaluation: Research Topics and Data Sources

Data sources: (1) PHA staff interviews; (2) participant interviews;
(3) interviews with program partners; (4) survey; (5) HUD/MIS; (6) UI records.
An X indicates that a data source will be used for the topic.

Key Research Topics                                  (1)  (2)  (3)  (4)  (5)  (6)

Recruitment and enrollment                            X    X    X    X
  Marketing and outreach; adequacy of staff tools
  and training; program dissemination networks and
  community resources

Case management and participation                     X    X    X    X    X
  Characteristics of participants; knowledge of
  services available; services and supports
  utilized; contacts/interactions with program
  staff; strategies to promote participation;
  program completion status

Escrow                                                X    X         X    X
  Awareness, knowledge, and attitudes; strategies
  for informing and educating participants; escrow
  accumulation and uses

Other stakeholders' perspectives of FSS                         X

Counterfactual environment                            X    X    X    X
  Supports and services available for the control
  group; control group participation in similar
  activities

Education and labor force participation
  Education, participation in education
    and training                                                     X
  Employment stability and hours worked                                        X
  Participation in and completion of education
    and training                                                     X
  Earnings, wage rates, and job quality                                        X

Housing
  Housing status                                                     X
  Rent                                                                    X

Family well-being                                                    X
  Family income and well-being; savings and
  assets; material hardship; residential
  mobility/stability

Health                                                               X
  Perceived physical and mental health;
  depression; access to preventive care;
  health coverage


The formal survey is slated for fielding in late 2016. Per our agreement with the GTR, information on the survey will be included and discussed in a separate, later OMB submission. This timeline will allow MDRC’s survey subcontractor to complete the fielding effort in time for data analysis and inclusion of findings in the final report.


Through these various data sources, the analysis will address the types of topics listed above. The analysis will take a comparative approach, examining the similarities and differences across PHAs in how they structure and operate FSS, with the goal of understanding how the basic model is adapted to different local circumstances. The analysis will examine how families are recruited and screened into the program (and the evaluation); the roles and responsibilities of the PHAs’ key partners in serving those families; the priority levels that staff and families attach to different program outcomes (e.g., shorter-term versus longer-term employment, skills-building, and financial management); and the variety of ways in which the programs try to engage participants in self-sufficiency activities. The study will also measure families’ rates of participation in key program components and their achievement of particular milestones and escrow savings. It will give special attention to describing and assessing how the FSS program and escrow are marketed and communicated to families, and how well families are aware of and understand the escrow offer. The analysis will examine families’ overall perceptions of the program, their participation in case management services, and their amounts of escrow savings.


As shown in Table 1, the implementation and process study will also rely on FSS data reported by the PHAs to HUD, standing management reports on specified performance indicators, and field observations and interviews with staff and participants. Program data available through HUD are an important resource. However, it is our understanding that there may be quality issues with FSS data reported to HUD. Given that the evaluation will rely on multiple data sources (i.e., survey and administrative records), with some data elements available in more than one source, MDRC will be able to compare the quality of specific data elements across sources. Such comparison (or verification) may be best reserved for data that appear inconsistent with site observations or across data sources. Once the PHAs are selected for the study, and MDRC has a better understanding of the quality issues as they pertain to the selected PHAs, the team will begin to identify the data elements that might warrant such a comparison.

A.3. Use of Information Technology for Data Collection to Reduce Respondent Burden

Wherever possible, advanced technology will be used in data collection efforts to reduce burden on study participants and on site staff. The following methods will be used:


1) FSS program staff members will record Baseline Information Form data directly into MDRC’s online random assignment database. Accordingly, problems with missing BIF records for a study participant would occur infrequently, most likely when the random assignment process is interrupted (for example, when the provider loses the connection to MDRC’s random assignment website). MDRC will conduct random assignment training to test the data entry safeguards (required field designators, range limits for responses, and automated skip patterns) programmed into the system. We will make sure that these safeguards are working properly before starting random assignment. Finally, our Random Assignment Procedures Manual and random assignment training will include guidance to program staff members on recording BIF data items. All efforts will be made to reduce data collection burden on PHA staff.
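To illustrate the three safeguard types named above (required field designators, range limits, and automated skip patterns), the following is a simplified, hypothetical sketch; it is not MDRC’s actual web-based system, and all field names and limits are invented for illustration:

```python
# Hypothetical sketch of BIF data-entry safeguards (not the actual MDRC system;
# field names "ssn", "household_size", etc. and the 1-15 limit are invented).

def validate_bif(record):
    errors = []
    # Required field designators: these fields may not be left blank.
    for field in ("ssn", "date_of_birth", "race_ethnicity"):
        if not record.get(field):
            errors.append(f"missing required field: {field}")
    # Range limits for responses: reject implausible values.
    hh = record.get("household_size")
    if hh is not None and not (1 <= hh <= 15):
        errors.append("household_size out of range (1-15)")
    # Automated skip pattern: employment detail applies only to employed respondents.
    if record.get("currently_employed") == "no" and record.get("hours_per_week"):
        errors.append("hours_per_week should be skipped when not employed")
    return errors
```

In a web-based intake system of this kind, such checks run at entry time, so a record cannot be submitted to random assignment until the error list is empty.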


2) Computer-assisted survey interviewing (CATI/CAPI) will be used for the follow-up survey. This helps to reduce respondent burden: interviewers can proceed more quickly and accurately through the survey instrument, minimizing the interview length and the need for subsequent callbacks, and the software automatically routes respondents past inapplicable questions. CATI/CAPI also improves data quality through more uniform administration of survey questions, more accurate implementation of skip patterns, and immediate application of range checks, edit checks, and consistency checks of item-by-item responses.


3) A future submission will describe in detail the information technology used during survey fielding. Briefly, we will use survey tracking systems. The survey firms maintain databases that track the location of participants throughout the project. Database updates come from mailings to participants and from passive tracking through the U.S. Postal Service Change of Address database. In addition to being an inexpensive method of contacting respondents, this passive approach reduces the need for respondents to continually provide their most recent address information and reduces the likelihood that alternate contacts will have to be sought to locate them.


4) Integration of other data sources. Finally, when relevant person-level data have been identified as available through an accessible, centralized, and computerized source, that information has generally been excluded from the proposed data collection package. For example, UI data will be obtained through administrative records. While implementation data collection relies on evaluation staff efforts on-site, we have sought wherever possible to minimize overlap between the questions included in implementation questionnaires and protocols and those that will be asked through computer-assisted surveys.

A.4. Efforts to Identify Duplication


The information collection will not duplicate information that is already available. Where possible, the evaluation will use available data sources, such as UI wage records. We will utilize PHA MIS for service participation and engagement measures. The BIF and survey will collect data on various other outcomes not available routinely or systematically in program records.


Finally, a central topic of the evaluation involves the tracking of employment, service participation, and well-being of participants over time. Thus, it is critical to have a core set of consistently worded questions in order to track how the impacts of the program change over time on a common set of measures. However, whenever possible, measures that do not require tracking over time will be assigned to the single follow-up survey.


A.5 Burden on Small Business

We do not anticipate that this study will burden small businesses.

A.6 Consequence If Data Collection is Not Conducted


This evaluation represents an important opportunity for the Federal government to add to the body of knowledge about the impacts of a key employment-oriented program for HCV recipients, consistent with the Administration’s strong focus on evidence-based policymaking. With the exception of the New York City FSS study, there has been no rigorous evaluation of the FSS program. If this study is not conducted and the data are not collected, analyzed, reported, and disseminated, Federal program and policy decisions regarding future investments in self-sufficiency programs for voucher households will not be informed by high-quality evidence.

A.7 Special Data Collection Circumstances


The proposed data collection activities are consistent with the guidelines set forth in 5 CFR 1320 (Controlling Paperwork Burdens on the Public). There are no special circumstances that require deviation from these guidelines.


A.8. Federal Register Notice (5 CFR 1320.8(d)) and Consultations Prior to OMB Submission

  a. Federal Register Notice and Comments


Please see Appendix X for a copy of HUD’s notice in the Federal Register, required by 5 CFR 1320.8(d), soliciting comments on the information collection prior to submission to OMB.


The notice appeared in the Federal Register, Vol. 77, No. 237 (Monday, December 10, 2012), at page 73,493.


b. Consultations Outside of the Agency


All data collection instruments included in this package have gone through extensive review by expert consultants, HUD staff, and members of the research team. During study design, we have sought the input of nationally recognized experts on public housing and Section 8 housing support, including John Goering, Ingrid Gould-Ellen, and Jeff Lubell. We continue to consult with outside experts as we move toward project launch.

A.9 Justification for Respondent Payments

Over the course of the evaluation, two types of payments to respondents are planned. The first will be distributed upon study enrollment; the second will be distributed upon survey completion. Additionally, a site payment will be provided to the participating PHAs. Justification for each payment is provided below.

Payment to study participants upon study enrollment

This payment is intended as modest compensation for study participants’ time. Study participants must take time to complete the informed consent and baseline information forms as part of the study enrollment process, and those assigned to the control group will receive no program services as a result of their efforts. Accordingly, and consistent with the practice of other MDRC random assignment studies, we expect to provide a payment of $20 to $25 as an acknowledgment of, and appreciation for, the time spent on enrollment.

Payment to participants upon completion of the survey

As noted, the survey and its associated details, including respondent payment, will be discussed in a later OMB submission. To summarize, however, payment upon survey completion is intended as a token of appreciation. As documented in the literature, such a token is likely to improve response rates by decreasing the number of refusals, enhancing respondent retention, and providing a gesture of goodwill to acknowledge respondent burden. This technique is proposed in addition to the many techniques suggested by OMB to improve response rates that have been incorporated into our data collection effort (described in Section B3), because our experience has shown that small monetary amounts are useful when fielding data collection instruments with hard-to-employ populations as part of a complex study design. A token of appreciation is essential to maximizing the response rate, and it is particularly important with a challenging population and a demanding data collection strategy. In a seminal meta-analysis, Singer et al. (1999) found that incentives in face-to-face and telephone surveys were effective at increasing response rates, with a one-dollar increase in incentive resulting in approximately a one-third of a percentage point increase in response rate, on average. They also found some evidence that incentives were useful in boosting response rates among underrepresented demographic groups, such as low-income and non-white individuals, a significant consideration for this study. Another important consideration is the burden posed by the data collection itself: the follow-up survey will take 30 to 40 minutes of the participant’s time, on average. The survey firm will also provide a small token (e.g., a $2 bill) to compensate respondents for their time updating contact information, with the expectation that it will double the odds of a return (Edwards et al., 2002, p. 3). About a quarter to a third of respondents are expected to submit information for themselves and for their alternate contacts.

We are aiming to achieve an 80 percent survey response rate for the follow-up survey. Even with the best data collection practices, it would be very difficult, if not impossible, to obtain such a high completion rate without providing a token of appreciation to participants.


Site payments to participating PHAs


Financial compensation to PHAs for their participation is critical to the success of the study. Participation in the evaluation will require PHAs to undertake activities not normally required for their regular operation of the FSS program. For instance, the initial stage of the study assumes that participating PHAs will increase their recruitment and orientation activities to enroll an adequate sample in the study groups. Further, PHA staff will need to be trained on the baseline data collection tools and the use of the random assignment application, which will be customized for this evaluation. Also, over the course of the evaluation, PHA staff will have to maintain steady communication with the evaluation team so that progress can be monitored via conference calls, on-site interviews, and annual data reports. The proposed compensation would offset some of the costs the PHAs incur in devoting staff time and resources to these activities. Given the additional work that study participation entails, it is important that PHA leadership and HCV program managers incorporate the study requirements into their staff’s workload rather than allowing the study to become a source of conflict, which would substantially hurt the effort.


The FSS evaluation budget assumes about $40,000 in site payments to each participating PHA. MDRC recognizes that this amount is insufficient to fully compensate PHAs for the additional burden study participation entails; it is meant, rather, to help defray some of the costs associated with the activities described above. Based on prior experience, study burden will be highest during the first two years of participation because of the concentrated recruitment and enrollment activities during this period. As a result, we expect at least 50 percent of the $40,000 to be distributed to PHAs during this period. Subsequent payments would help defray some of the costs associated with the PHAs’ ongoing reporting responsibilities for the study, which are less intense in years 3-5. We therefore propose to issue payments to sites in six installments, with the first two totaling $10,000 each and each subsequent payment equal to $5,000. The exact distribution of payments will, however, be finalized with the PHAs during contract negotiations. MDRC would like to offer maximum flexibility to participating agencies in how the site payments are used, and the Memorandum of Understanding executed between MDRC and each PHA will clearly indicate the agency’s responsibilities associated with study participation.


MDRC’s experience with evaluations and demonstrations also suggests that the additional workload associated with the sample enrollment phase of research can disrupt services to the point where at least one additional staff person is needed to support site operations and participation in the research. In the case of the FSS evaluation, with its anticipated one-year sample build-up period, a part-time hire might be necessary to cover the intake process and to ensure that recruitment protocols are followed in a way that does not undermine the validity of the study. PHAs may use the initially larger site payments to partially fund this position (and it is our understanding that PHAs are unable to draw on their current FSS grants to support a hire for study purposes only). The ongoing site visits to PHAs that are potential candidates for the evaluation will help MDRC understand how the PHAs plan to allocate site payments to support research needs. While we may not be in a position to offer higher site payments, this will be important feedback for HUD and MDRC in the early planning phase.


A.10. Assurances of Privacy

Every effort will be made to maintain the privacy of respondents, to the extent permitted by law. Please see the informed consent form in Appendix A. All respondents included in the study will be informed that information they provide will be used only for the purpose of this research. Individuals will not be cited as sources of information in prepared reports. All research staff working on the project have been trained to protect private information and have signed a pledge stating that they will keep all information gathered private to the extent permissible by law. All papers that contain participant names or other identifying information will be kept in locked areas, and any computer documents containing identifying information will be protected with a password. The attached Data Collection and Analysis Plan (DCAP; see Appendix C) also provides additional information on how evaluation data will be protected.


A.11. Questions of a Sensitive Nature

Many of the questions in the Baseline Information Form are potentially sensitive for respondents, but they cover information generally collected at the time of enrollment in FSS or in follow-up discussions with case managers. Respondents are asked about potential barriers to employment (for example, disabilities, criminal record, and perceptions and attitudes toward employment), and these topics may be addressed directly by program case managers, in service referrals, and in participants’ FSS plans. It is thus crucial that MDRC collect data on these subjects. Respondents will be informed by program staff prior to the start of intake that their answers are confidential, that they may refuse to answer any question, that results will only be reported in the aggregate, and that their responses will not have any effect on any services or benefits they or their family members receive.


A.12. Estimates of the Hour Burden of Data Collection to Respondents

The hour burden estimates for data collection for FSS operators and participants are outlined in Table 2 below. We have assumed the maximum possible number of sites and study participants. The estimates below are based on experience with previous random assignment studies involving similar populations and data collection instruments.


Table 2

FSS Evaluation: Estimation of Hours and Burden

Informed Consent Form (ICF)
  Number of respondents: 3,000; responses per respondent: 1
  Average burden per response: up to 15 minutes (.25 hours) for FSS family
  respondents and up to 15 minutes (.25 hours) for PHA staff
  Total burden: 1,500 hours (750 hours x 2)

Baseline Information Form (includes completion of the Contact Sheet)
  Number of respondents: 3,000; responses per respondent: 1
  Average burden per response: 30 minutes (.50 hours), on average, and
  approximately 45 minutes (.75 hours) for larger households, for both FSS
  family respondents and PHA staff
  Total burden: 4,500 hours (2,250 hours x 2)

Tracking survey sample
  Number of respondents: 3,000; responses per respondent: 1
  Average burden per response: maximum of 1 hour over the tracking period,
  mainly to update contact information
  Total burden: 3,000 hours

Implementation research (round 1 projected to occur in Year 3)
  Meetings could include: the FSS coordinator; FSS case management staff; the
  lead manager to whom the FSS coordinator reports; representatives of four
  key partner agencies; and FSS participants
  Number of respondents: 20 per PHA (20 x 18 sites); responses per respondent: 1
  Average burden per response: field research visits will last two days, with
  meetings running 30 to 60 minutes, depending on the group of participants
  Total burden: 252 hours (14 hours, or 2 days, per site visit x 18 sites)

Benefit-cost analysis data collection
  Meetings with: the FSS MIS/data analyst; the FSS coordinator; FSS case
  management staff; and the lead manager to whom the FSS coordinator reports
  Number of respondents: 6 per PHA (6 x 18 sites); responses per respondent: 1
  Average burden per response: site visits will last one day, with meetings
  running 30 to 60 minutes, depending on the group of participants
  Total burden: 126 hours (7 hours, or 1 day, per site visit x 18 sites)

TOTAL: 9,378 hours
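The total in Table 2 can be verified with a brief arithmetic sketch using the per-instrument figures stated in the table:

```python
# Arithmetic check of the Table 2 burden totals, using the figures as stated.
entries = {
    "Informed Consent Form":     750 * 2,   # .25 hr x 3,000, family + PHA staff
    "Baseline Information Form": 2250 * 2,  # 2,250 hrs each for family and staff
    "Tracking survey sample":    3000 * 1,  # up to 1 hr per sample member
    "Implementation research":   14 * 18,   # 14 hrs per 2-day visit x 18 sites
    "Benefit-cost meetings":     7 * 18,    # 7 hrs per 1-day visit x 18 sites
}
total = sum(entries.values())
print(total)  # 9378, matching the table's stated total
```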

Cost burden to FSS families

Potential respondents for the data collection described above range widely in position and earnings. For study participants, we have estimated the hourly wage at the federal minimum wage of $7.25 per hour. Based on other housing-focused research, we expect about 50 percent of participants to be employed at the time of study entry. In addition, a recent report by the Center on Budget and Policy Priorities found that some 55 percent of non-elderly, non-disabled households receiving voucher assistance reported earned income in 2010. The typical (median) annual earnings for these families were $15,600, only slightly more than the pay from full-time, year-round minimum-wage work.

Based on these assumptions, the estimated total respondent cost is:

Study participants: 6,000 hours (750 ICF hours + 2,250 BIF hours + 3,000 tracking hours) x $7.25 = $43,500

Cost burden to PHA staff

As discussed in Supporting Statement B, participation in the evaluation will require PHAs to undertake activities not normally required for their regular operation of the FSS program. These activities will span the duration of the study, although some activities, especially those involving the FSS program staff, will be more intensive in the earlier years. Required tasks will fall across four areas: planning for random assignment launch, orientations and sample build-up, data extraction and transfer, and ongoing management and review.

For program staff, our estimate of burden costs uses the median hourly wages of selected occupations (classified by Standard Occupational Classification, or SOC, codes), drawn from the Occupational Employment Statistics published by the U.S. Department of Labor’s Bureau of Labor Statistics. Potentially relevant occupations and their median hourly wages include:

Occupation                                  SOC Code   Median Hourly Wage
Community and Social Service Specialist     21-1099    $19.74
Social/Community Service Manager            11-9151    $29.98

Source: Occupational Employment Statistics, May 2012, accessed online May 21,
2013, at http://www.bls.gov/oes/current/oes_stru.htm



To estimate the cost burden to program staff respondents, we use the average of the median wages for the occupations listed, $24.86 per hour, or approximately $198 per day assuming an 8-hour workday.

Assuming the study would collectively require the full-time equivalent of one staff member over a six-month period (keeping in mind that this time would be spread out over the duration of the study), the estimated cost burden is $198/day x 120 days (6 months) = $23,760.
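As a quick check, the staff cost arithmetic above can be reproduced in a short script (truncating $198.88 to the $198-per-day figure used in the text):

```python
# Check of the PHA staff cost-burden arithmetic, using the BLS OES medians cited.
median_wages = [19.74, 29.98]                        # cited median hourly wages
avg_hourly = sum(median_wages) / len(median_wages)   # average = $24.86/hr
daily = int(avg_hourly * 8)                          # $198.88 truncated to $198/day
staff_cost = daily * 120                             # 120 workdays ~ six months FTE
print(f"${avg_hourly:.2f}/hr, ${daily}/day, ${staff_cost:,} total")
# prints "$24.86/hr, $198/day, $23,760 total"
```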


Note that this estimate does not include fringe benefits or other overhead costs. It also pertains only to staff directly involved in administering study protocols and participating in data collection activities, and does not include the effort of other housing authority staff whose contributions will be needed to implement the study effectively at the site. Further, for the staff included in this estimate, the calculation does not cover any additional effort associated with participation in design and planning prior to the launch of random assignment (for example, training on random assignment protocols), nor the time required to check in regularly with MDRC and provide other status updates throughout the study. The remaining $16,240 of the site payment will go toward defraying the costs of other levels of staff supporting the multi-year evaluation. For example, executive management, including legal or other administrative staff, will be needed to provide key managerial oversight, and data managers or other technical staff will assist with required data extraction and incorporate any requested changes to the management information system (MIS).



A.13. Other Cost Burden to Respondents and Record Keepers

The proposed data collection will not require the respondents to purchase equipment or services. Therefore, there are no additional costs to respondents.


The proposed data collection will not require program staff to establish new data retrieval mechanisms or design a new MIS. It will, however, require that they participate in ongoing data collection efforts and use the MDRC-developed web-based system for collecting background information on study participants.

A.14. Annualized Cost to the Government

The total cost to the federal government for the study, including but not limited to the data collection activities discussed in this submission, is $12,468,002 over a 60-month period. This includes costs associated with background research, evaluation design, development of data collection instruments, data collection activities, and analysis and reporting.

A.15. Reasons for Any Program Changes or Adjustments

This submission is a new request for approval; there is no change in burden.



A.16. Tabulation, Analysis, and Publication Plans and Schedule


To determine the effectiveness of the targeted programs, MDRC will collect four categories of data: 1) baseline data, 2) implementation and process data (some of which will support the cost-effectiveness study), 3) surveys of study sample members, and 4) administrative records.


As shown in Table 3 below, the evaluation data will be analyzed from 2014 to 2018, and MDRC intends to produce four formal deliverables drawing on the various data sources.





Table 3

FSS Evaluation Analysis and Publication Schedule

Activity                    Schedule

Analysis                    04/01/2014 - 02/28/2018

1st Report
    Draft                   09/15/2014
    Final                   10/30/2014

2nd Report
    Draft                   09/15/2015
    Final                   10/30/2015

3rd Report
    Draft                   10/10/2016
    Final                   11/07/2016

Final Report                06/30/2018

A.17. Reasons for Not Displaying OMB Approval Expiration Date

The expiration date for OMB approval will be displayed on all forms completed as part of the data collection. Because data collection will occur for more than 3 years, we will submit a future ICR to update and extend this information collection.



A.18. Exceptions to Certification Statement


No exceptions are necessary for this information collection.



APPENDIXES



APPENDIX A (INFORMED CONSENT FORM)

APPENDIX B (BASELINE INFORMATION FORM)

APPENDIX C (DATA COLLECTION AND ANALYSIS PLAN)

1 Research to date on the HUD-funded, experimentally designed Moving to Opportunity for Fair Housing Demonstration has indicated no major impacts on labor market outcomes of offering public housing residents regular vouchers or special vouchers linked to low-poverty locations (see Sanbonmatsu, et al. 2011).

2 A brief description of HUD’s prior efforts at FSS-like programs is found in Rohe and Kleit (1999).

3 See, for example, MDRC’s Chicago Employment Retention and Advancement Project (http://www.mdrc.org/publications/441/full.pdf).

4 Verma, Tessler, Miller, Riccio, Rucks, and Yang (2012).

5 HUD data indicate that there are currently over 3.3 million units of public and assisted housing (Schwartz, 2010; US Department of Housing and Urban Development, 2008).

6 See section below for the determinants of the sample size.

7 The random assignment design provides the most credible evidence on FSS’ effectiveness, but only a fraction of eligible voucher holders will enroll in the program. We anticipate that marketing and outreach will be extensive enough to bring in a relatively wide range of families. Nonetheless, there is always the possibility that participating families will differ significantly from non-participants. In this case, although we would know, for example, that the program is effective for those who volunteered for it, we would be limited in our ability to predict its effectiveness for all eligible families. As part of the FSS NOFA, HUD is considering whether an analysis could be done to examine the external validity of the study and assess whether the results can be generalized to the full eligible voucher population, a critical issue should FSS programs be scaled up. Should the analysis reveal notable differences between the two groups, MDRC would seek to examine in more depth, through the implementation research, the ways in which participants differ from non-participants and the reasons for non-participation.

8 The table does not reflect the benefit-cost analysis. As described in the design paper, it will draw on data collected for the implementation and impact study components, including PHA expenditure reports, MIS participation data, and information on escrow payments from HUD records. Additional details on this study component can be found in the research design paper.


9 Another benefit of CATI/CAPI is that interviewers can focus on the respondent rather than on managing the survey instrument, creating a more pleasant experience for the respondent. The technology also ensures that scheduled appointments with respondents are honored. CATI/CAPI further ensures adherence to dialing protocols, maintaining the integrity of the study without unduly burdening sample members. It eliminates many human errors, such as accidental calling of resolved sample records. Finally, CATI provides translated scripts for consistent script delivery to non-English speakers.

10 Berlin, M., L. Mohadjer, and J. Waksberg (1992). “An experiment in monetary incentives.” Proceedings of the Survey Research Section of the American Statistical Association, 393-398; de Heer, W., and E. de Leeuw. “Trends in household survey non-response: A longitudinal and international comparison.” In Survey Non-response, edited by R. M. Groves, D. A. Dillman, J. L. Eltinge, and R. J. A. Little. New York: John Wiley, 2002, pp. 41-54; Singer, E., and Kulka, R. In Studies of Welfare Populations: Data Collection and Research Issues, Panel on Data and Methods for Measuring the Effects of Changes in Social Welfare Programs, Michele Ver Ploeg, Robert A. Moffitt, and Constance F. Citro, Editors. National Academies Press, Washington, DC, 2000, pp. 105-128.

11 We assume that the Informed Consent Form (ICF) will include language to enable us to collect administrative records from the designated state agency where the PHA is located. Therefore, the time to obtain this consent is included in the estimate to complete the ICF. Upon selection of sites, MDRC will contact the corresponding state agencies to initiate legal agreements to obtain these records, including the approved language to incorporate into the ICF.

12 In the NYC Work Rewards study, the median wage for working participants was $10 an hour. This finding is based on preliminary analysis of the Work Rewards 36-month survey.


