OPRE Study: Evaluation of LifeSet [Impact and Implementation Evaluation]

OMB: 0970-0577


Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes





Evaluation of LifeSet



OMB Information Collection Request

New Collection





Supporting Statement

Part A






June 2021








Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers: Kathleen Dwyer, Alysia Blandon





Part A




Executive Summary


  • Type of Request: This Information Collection Request is for a new information collection. We are requesting 3 years of approval.


  • Description of Request:

ACF is seeking OMB approval to collect data for an impact and implementation evaluation of LifeSet. LifeSet is a therapeutic case management program that provides youth and young adults leaving foster care, juvenile justice, and mental health systems with the intensive in-home support and guidance they need to move towards youth-defined goals in multiple domains of independent living, including education, housing, employment and financial security, health and safety, and social connections and support. The evaluation activities will be conducted in two phases. The current information collection covers Phase 1 activities: a baseline youth survey; interviews and focus groups with child welfare agency, LifeSet developer, and local LifeSet provider agency staff; and collection of administrative data. A future request will cover Phase 2 activities: two waves of follow-up youth surveys; interviews with youth who receive LifeSet; focus groups with youth receiving LifeSet and youth receiving services as usual; additional interviews and focus groups with child welfare agency, LifeSet developer, and local LifeSet provider agency staff; a survey of LifeSet frontline staff; and observations of LifeSet program activities. We do not intend for this information to be used as the principal basis for public policy decisions.


  • Time Sensitivity:

To meet the required sample size for minimum detectable effects, data collection must begin in September 2021.




A1. Necessity for Collection

Each year, around 20,000 young adults age out of foster care, meaning they leave foster care solely because of their age (Annie E. Casey Foundation, 2020). Many young adults leaving foster care experience poor outcomes such as homelessness, unemployment, lack of education, incarceration, and untreated mental health and substance use problems (Courtney et al., 2020; Dworsky et al., 2011). However, the evidence base for interventions that effectively meet the needs of young adults leaving foster care is extremely limited. This shortage of evidence prevents public and private agencies from implementing evidence-based practices and programs.


LifeSet is a therapeutic case management intervention that provides youth and young adults leaving foster care, juvenile justice, and mental health systems with the intensive in-home support and guidance they need to move towards youth-defined goals in multiple domains of independent living, including education, housing, employment and financial security, health and safety, and social connections and support. LifeSet has undergone one randomized controlled trial (RCT) evaluation in Tennessee (results presented in Courtney et al., 2019; Skemer & Valentine, 2016; Valentine et al., 2015; and Manno et al., 2014). The RCT evaluation assessed LifeSet impacts at 12 and 24 months after program entry, while some youth were still in the program, rather than after program completion. For this reason, LifeSet is currently rated as “promising” by the California Evidence-Based Clearinghouse for Child Welfare (CEBC) and the Blueprints for Healthy Youth Development (Blueprints) registries. However, the evidence for LifeSet could be strengthened if positive impacts were found in an RCT evaluation in a new location that is designed to assess sustained effects at 12 months after program completion.


The proposed study will build on the previous RCT through a rigorous impact study to assess sustained effects at 12 months following program completion. The proposed study also includes an implementation evaluation of LifeSet. Importantly, the study will further the Administration for Children and Families’ (ACF) goal of building the evidence base for programs for child welfare involved children, youth, and families. The ACF Office of Planning, Research, and Evaluation (OPRE) seeks approval to collect information necessary to conduct a rigorous evaluation of LifeSet. The proposed study is partially funded through the John H. Chafee Independence Program (Chafee, P.L. 106-169), which requires ACF to conduct rigorous evaluations of independent living programs that are “innovative or of potential national significance,” such as LifeSet.


ACF awarded a contract to the Urban Institute to conduct this information collection.



A2. Purpose

Purpose and Use

The purpose of this information collection is to conduct a rigorous evaluation of LifeSet. The study will have two main components: an impact study to assess the effects of program participation on outcomes of interest and an implementation study to describe and document how LifeSet is implemented in New Jersey (see Study Design below for additional information on site selection). The impact study will collect youth survey data and administrative and program data to determine whether youth randomized to receive LifeSet fare better in key independent living domains than youth randomized to receive services as usual. The implementation study will evaluate how LifeSet has been implemented and how it compares to other services in New Jersey for young adults transitioning out of foster care.


ACF will use the results from the study to contribute to the evidence base on program models that support young adults in the transition out of foster care. OPRE intends that key child welfare stakeholders, including administrators and program providers, will better understand how the LifeSet program model operates in one state from multiple perspectives including program staff, youth and young adults, child welfare case workers, and the program developer. The implementation study can inform policymakers and local agencies on the elements that contribute to program success and program challenges.


The information collected is meant to contribute to the body of knowledge on ACF programs. It is not intended to be used as the principal basis for a decision by a federal decision-maker and is not expected to meet the threshold of influential or highly influential scientific information.


Research Questions or Tests

Impact Study:

The primary research questions for the impact study address whether LifeSet improves outcomes of target population youth in the domains of:

  1. Connections to education and employment

  2. Social connections

  3. Housing stability

  4. Youth well-being (including resilience and social-emotional competence)


The LifeSet developer, Youth Villages, and the New Jersey child welfare agency, Department of Children and Families (DCF), identified these domains as the primary outcomes the program will target.


The LifeSet logic model identifies additional domains that may be impacted by the program. However, these domains are not the primary targets of the program in New Jersey and should not be used to determine programmatic effectiveness. The secondary research questions for the impact study address whether LifeSet improves outcomes of target population youth in the domains of:

  1. Mental health

  2. Contact with the criminal justice system

  3. Intimate partner violence

  4. Economic well-being


Implementation Study:

Core research questions for the implementation study include:

  1. How is the LifeSet program implemented in New Jersey?

  2. How do relevant aspects of the local demographic, political, economic, and service environment shape the LifeSet program in New Jersey?

  3. What services do youth transitioning out of foster care in New Jersey receive in the absence of LifeSet, and how does LifeSet differ from usual services?

  4. Is LifeSet being delivered as intended, or are there modifications made based on the New Jersey context?

  5. What infrastructure supports the implementation of LifeSet in New Jersey, and what are the key implementation facilitators and challenges?

  6. What is the level of youths’ engagement with LifeSet services and what are their perceptions of the program?



Study Design

The study design includes two components: an impact study and an implementation study. The two studies will be carried out simultaneously. The study will take place in New Jersey. LifeSet is currently implemented in 15 states. The project team engaged the developer of LifeSet, Youth Villages, in discussions to determine which locations may be suitable for a rigorous evaluation. New Jersey was ultimately selected for the proposed evaluation after discussions with the New Jersey Department of Children and Families indicated support and feasibility of a rigorous evaluation.


Impact Study:

The impact study will use an RCT to determine whether youth assigned to LifeSet achieve better outcomes on the domains noted above than youth assigned to receive services as usual. Randomization is expected to occur for two years and be conducted by age cohort, starting with youth who are at least 19 years old and moving to younger ages in successive monthly randomization rounds. Randomization may extend beyond two years if needed to obtain a minimum sample size. Data for the impact study will be collected through a combination of youth surveys and administrative data obtained from the child welfare agency, LifeSet providers, and other relevant entities listed in section B3. Youth will be surveyed three times: at baseline and at 12 and 24 months post-randomization. Panel outreach will be conducted at four and eight months after youth complete the baseline survey and again after they complete the first follow-up survey in order to reduce participant attrition. See Supporting Statement B, Section B4, Collection of Data and Quality Control, for additional information about panel outreach activities. The current request only includes the baseline youth survey (Instrument 1) and panel outreach materials between the baseline and first follow-up survey (Appendices G-I). Future requests will include the two follow-up surveys. Administrative data from the child welfare agency will be obtained by the project team two times: at the end of randomization and two years after the end of randomization. The project team will obtain administrative data from other sources one time, two years after the end of randomization. Data on outcomes will be analyzed using an intent-to-treat (ITT) analysis as detailed in B7. This randomized study is intended to produce internally valid estimates of the intervention’s causal impact, not to promote statistical generalization to other sites or service populations.
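
To illustrate the monthly, age-cohort randomization described above, the sketch below shows one simple way such an assignment could be carried out. It is a minimal illustration only: the 50/50 assignment ratio, seed handling, and youth identifiers are assumptions for this example, not the study’s documented randomization procedures.

```python
# Minimal sketch of monthly, age-cohort-based random assignment (illustrative only).
# The 50/50 assignment ratio, field names, and example IDs are assumptions for this
# example; they are not the study's documented procedures.
import random

def assign_cohort(youth_ids, seed, treatment_share=0.5):
    """Randomly assign one monthly cohort to LifeSet or services as usual."""
    rng = random.Random(seed)  # fixed seed keeps the assignment reproducible and auditable
    shuffled = list(youth_ids)
    rng.shuffle(shuffled)
    n_treatment = round(len(shuffled) * treatment_share)
    return {yid: ("LifeSet" if i < n_treatment else "services as usual")
            for i, yid in enumerate(shuffled)}

# Example: a hypothetical monthly cohort of youth who are at least 19 years old.
example_cohort = ["y001", "y002", "y003", "y004", "y005", "y006"]
for youth_id, arm in sorted(assign_cohort(example_cohort, seed=2021).items()):
    print(youth_id, arm)
```

Consistent with the intent-to-treat analysis noted above, outcomes would then be compared by assigned arm regardless of the services youth actually received.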


Implementation Study:

For the implementation study, data collection will mostly occur over the course of five site visits (which may be virtual, by phone, or in person depending on conditions at the time – e.g., COVID-19 safety and travel restrictions). Site visits will take place intermittently over a four-year period[1] to capture program implementation during the study period and allow for data collection with a range of respondents. Each respondent will only participate in one interview or focus group during a specific site visit. However, some respondents may be asked to participate in more than one site visit if they are the best informant for that visit’s topic areas.

  • The first site visit will focus on understanding the implementation of LifeSet as a new program in New Jersey and the context in which the program operates.

  • The second site visit will focus on developing a better understanding of the program model and goals and fidelity early in the program’s implementation.


The current request only includes information collection activities that will occur during the first two site visits. A future request will include information collection activities that will occur during the third through fifth site visits, which will focus on understanding differences between LifeSet and services as usual, the relationships among stakeholder agencies, supervision and assessment of program staff, and youths’ experiences with LifeSet and services as usual.


Data will be collected through interviews with staff and administrators at the New Jersey child welfare agency (DCF), the LifeSet developer (Youth Villages), and the local LifeSet provider agencies. Data will also be collected through focus groups with LifeSet case managers (Specialists) and their supervisors (Team Supervisors). Future requests will include data collection activities for the remaining three site visits which include interviews and focus groups with staff of the child welfare agency, LifeSet providers, and the LifeSet developer; interviews with young adults receiving LifeSet services; focus groups with young adults receiving LifeSet services and young adults receiving services as usual; and observations of program activities. Future requests will also include a brief online survey of LifeSet Specialists that will occur between the third and fourth site visits.


The project team will use information from the implementation study to provide important context for findings regarding outcomes of interest in the impact study. Data will be analyzed using qualitative analysis techniques and triangulation of data across data sources. During analysis of the data for the implementation study, the project team will map information from the implementation study data sources to the LifeSet logic model. The analytic focus will be on examining the program against the logic model, to examine whether the program achieves what the logic model says it should (Epstein and Klerman, 2012). This process will allow confirmation of the effective functioning of implementation supports as part of the necessary pathways toward expected outcomes. The implementation study is not designed to promote statistical generalization to other sites or service populations. See Supporting Statement B, Section B7. Data Handling and Analysis for additional information about data analysis.
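
To make the logic model crosswalk concrete, the illustrative sketch below shows one way implementation-study findings could be mapped from data sources to logic model components and checked for triangulation across sources. The component and source names here are placeholders for illustration; they do not reproduce the actual LifeSet logic model or the study’s coding scheme.

```python
# Illustrative crosswalk of implementation-study data sources to logic model
# components, with a simple triangulation check. Component and source names are
# placeholders; they do not reproduce the actual LifeSet logic model.
from collections import defaultdict

# Each coded finding is tagged with its data source and the logic model
# component(s) it informs.
findings = [
    {"source": "DCF administrator interviews", "components": ["program selection", "implementation supports"]},
    {"source": "Youth Villages administrator interviews", "components": ["program model", "implementation supports"]},
    {"source": "Specialist focus groups", "components": ["service delivery", "program model"]},
    {"source": "Team Supervisor focus groups", "components": ["service delivery", "implementation supports"]},
]

# Build the crosswalk: logic model component -> set of supporting data sources.
crosswalk = defaultdict(set)
for finding in findings:
    for component in finding["components"]:
        crosswalk[component].add(finding["source"])

# Triangulation check: flag components informed by fewer than two data sources.
for component, sources in sorted(crosswalk.items()):
    status = "triangulated" if len(sources) >= 2 else "single source"
    print(f"{component}: {len(sources)} source(s) ({status})")
```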


A crosswalk of the data to be collected and the research questions is presented in Table A1 below. The table also outlines the data collection group, timing, respondents, and type of data collection for each instrument.


Table A1.

Data Collection Activity

Instruments

Respondent, Content, Purpose of Collection

Mode and Duration

Baseline Data Collection – Impact Study

Instrument 1 – Baseline Youth Survey

Respondents: Youth randomized to LifeSet or services as usual

Content:

  • Demographics

  • Living arrangements

  • Social support

  • Fertility

  • Education

  • Employment and earnings

  • Economic hardships

  • Mental health services

  • Substance abuse

  • Criminal justice involvement

  • Spouse/partner violence

  • Youth resiliency

  • Social-emotional competence

  • LifeSet services

  • Information for follow up contact

Purpose: Pre-test measures on study outcomes

Mode: Computer assisted personal interview (CAPI) or computer assisted telephone interview (CATI)


Duration: 35 minutes

Baseline and Outcome Data Collection – Impact Study

Instrument 2 – Administrative Data List

Respondents: Public child welfare agency, LifeSet provider agencies, criminal justice agencies, public benefits agency, homeless management information system, National Student Clearinghouse, National Directory of New Hires, state wage records

Content:

  • Dates and types of study participant interactions with respondent entities

Purpose: Outcome data to assess program impacts

Mode: Secure file transfer protocol (SFTP)


Duration: 5 hours

Early Implementation Data Collection – Implementation Study

Instrument 3A – Site Visit 1 Interview Guide for Administrators: Child Welfare Agency Administrators

Respondents: DCF administrators

Content:

  • Background and role

  • Program selection and startup

  • Community context

Purpose: Child welfare administrator knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1 hour

Early Implementation Data Collection – Implementation Study

Instrument 3B – Site Visit 1 Interview Guide for Administrators: Licensed LifeSet Experts

Respondents: LifeSet licensed program experts

Content:

  • Background and role

  • Program model

  • Staff requirements and responsibilities

Purpose: Program clinical consultant knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1 hour

Early Implementation Data Collection – Implementation Study

Instrument 3C – Site Visit 1 Interview Guide for Administrators: LifeSet Developer Administrators

Respondents: Youth Villages administrators

Content:

  • Background and role

  • Program selection and startup

  • Staff requirements and responsibilities

Purpose: Program developer knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1 hour

Early Implementation Data Collection – Implementation Study

Instrument 3D – Site Visit 1 Interview Guide for Administrators: Provider Agency Administrators

Respondents: LifeSet provider agency administrators

Content:

  • Background and role

  • Program selection and startup

  • Community context

  • Staff requirements and responsibilities

Purpose: Program administrator knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1 hour

Early Implementation Data Collection – Implementation Study

Instrument 4A – Site Visit 2 Focus Group Guide for Staff: LifeSet Specialists


Instrument 4B – Site Visit 2 Focus Group Guide for Staff: LifeSet Team Supervisors

Respondents: Frontline LifeSet staff

Content:

  • Background and role

  • Program model

  • Staff requirements and responsibilities

  • Service delivery

  • Community context

Purpose: Frontline program staff knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1.5 hours

Early Implementation Data Collection – Implementation Study

Instrument 5A - Site Visit 2 Interview Guide for Administrators: Child Welfare Agency Administrators

Respondents: DCF administrators

Content:

  • Background and role

  • Implementation supports

  • Program model

  • Service delivery

  • Data systems and use

  • Program improvement (CQI)

  • Opinion of program effectiveness

Purpose: Child welfare administrator knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1 hour

Early Implementation Data Collection – Implementation Study

Instrument 5B – Site Visit 2 Interview Guide for Administrators: Licensed LifeSet Experts

Respondents: LifeSet licensed program experts

Content:

  • Background and role

  • Implementation supports

  • Program model fidelity

  • Opinion of program effectiveness

Purpose: Program clinical consultant knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1 hour

Early Implementation Data Collection – Implementation Study

Instrument 5C - Site Visit 2 Interview Guide for Administrators: LifeSet Developer Administrators

Respondents: Youth Villages administrators

Content:

  • Background and role

  • Implementation supports

  • Program model

  • Partners

Purpose: Program developer knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1 hour

Early Implementation Data Collection – Implementation Study

Instrument 5D – Site Visit 2 Interview Guide for Administrators: Provider Agency Administrators

Respondents: LifeSet provider agency administrators

Content:

  • Background and role

  • Implementation supports

  • Program model

  • Service delivery

  • Data systems and use

  • Opinion of program effectiveness

Purpose: Program administrator knowledge and perspective

Mode: in person or virtual (i.e., phone or video)


Duration: 1 hour


The list below outlines the instruments that the project team anticipates including in a future information collection request.


  • Impact Study

    • 12-months post-randomization youth survey

    • 24-months post-randomization youth survey

    • Administrative data (extension)

  • Implementation Study

    • Interview protocols

      • Child welfare agency administrators

      • Licensed LifeSet experts

      • LifeSet developer administrators

      • Provider agency administrators

      • Young adults randomized to LifeSet

    • Focus group protocols

      • LifeSet team supervisors

      • LifeSet specialists

      • Child welfare agency caseworkers

      • Young adults randomized to LifeSet

      • Young adults randomized to services as usual

    • Staff survey of LifeSet specialists

    • Observations of program activities


Limitations

Impact Study:

The results of the impact study are not designed to be representative of or generalizable to all youth transitioning out of foster care but are intended to provide internally valid estimates of the program’s impact in New Jersey. The design is limited in that findings may not hold in states where youth have different demographic characteristics or where the policy or service context differs from New Jersey’s. The study may also face challenges recruiting and retaining youth for impact study data collection, leading to high rates of attrition. To reduce attrition, the Urban Institute has subcontracted with an independent survey firm, RTI International (RTI), which has experience conducting data collection and panel maintenance with similar target populations (see section B4 for recruitment procedures). High attrition rates, if present, will be acknowledged as a limitation in reports of the study’s findings.


Implementation Study:

The results of the implementation study are not designed to be representative of or generalizable to all providers or youth leaving foster care in New Jersey, but are intended to reflect variation in stakeholders’ experiences. The design is limited in that it will not capture every potential stakeholder and each stakeholder’s participation is voluntary. Important information needed to answer the study’s research questions may not be collected if stakeholders decline to participate or if outreach misses key stakeholder groups. However, the purposive sampling design (discussed in section B4) identifies and recruits key stakeholders most knowledgeable about the program and services, maximizing what the project team will learn from the data collection. Limitations of the qualitative study design and lack of generalizability will be acknowledged in reports of the study’s findings.


The planned study design, combining a quantitative impact study with a qualitative implementation study, is the best approach for obtaining the information OPRE needs to better understand the impact of LifeSet, understand how the program is currently being delivered, and assess whether the program is being operated with fidelity.


Other Data Sources and Uses of Information

The information collection requests for this study are being submitted in multiple phases. The current information collection request is Phase 1. The Phase 2 information collection request will cover two waves of follow-up youth surveys, a LifeSet staff survey, and information collected as part of the remaining three site visits. Finally, a Phase 3 information collection request will seek a no-change extension, as the Phase 2 collection is planned to last for more than three years.


New Jersey DCF will send the project team the information needed to conduct random assignment and the youth surveys for the impact study. This information will include youths’ names, phone numbers, addresses, and email addresses (if known) and the youths’ caseworkers’ names, phone numbers, and email addresses. This data collection is not included in the information request because it will be collected and sent by only one person within New Jersey DCF.



A3. Use of Information Technology to Reduce Burden

Impact Study:

All youth surveys will be conducted by trained field interviewers using CAPI/CATI technology. Field interviewers will read items aloud to respondents and record their responses. This reduces burden on respondents who may have reading difficulties (e.g., comprehension issues or low literacy skills) and who may not have access to the technology required to complete an online survey. The project team will only request administrative data that agencies or organizations collect during their usual duties, that are relevant to the research questions, and that can be reasonably extracted from management information systems.



Implementation Study:

With respondents’ permission, the project team will audio record the interviews and focus groups to minimize time needed for potential follow-up to clarify notes.



A4. Use of Existing Data: Efforts to reduce duplication, minimize burden, and increase utility and government efficiency

Impact Study:

The data collected through the baseline youth survey do not duplicate current data collection efforts with this population.


The project team will use child welfare administrative data to describe participants’ foster care histories instead of asking participants for this information to reduce duplication and minimize participant burden.


There may be minimal duplication in the outcome data collected from administrative data sources and the follow-up youth surveys (to be included in a future request). This duplication is needed to reduce missingness in outcome data for participants who are not able to be matched in administrative data (i.e. false negatives) and for participants who do not respond to the follow-up survey requests.


Implementation Study:

The data collected through interview and focus group protocols do not duplicate any current data collection efforts with these populations. Each respondent will only respond once to each instrument, though some respondents may be asked to participate in interviews or focus groups during more than one site visit. To reduce duplication and reduce burden on respondents who may participate in data collection activities during multiple site visits, the project team has avoided asking questions that collect the same information more than once and will use subsequent interviews to probe on changes since the previous interview. The project team has designed the data collection instruments so that no two instruments collect the same information from the same respondent. However, different respondents may be asked the same questions in order to capture different knowledge and perspectives. This provides a more robust description of the program model and provides qualitative measures of model fidelity.



A5. Impact on Small Businesses

No small businesses will be involved with this information collection.



A6. Consequences of Less Frequent Collection

Impact Study:

The activities in the current request are a one-time data collection.


Implementation Study:

The project team will collect data during five site visits over the course of four years. A potential negative consequence of less frequent data collection would be inaccurate findings that are relevant only to a specific point in time. LifeSet is a newly implemented program in New Jersey; less frequent data collection would not capture changes in implementation as the program matures. The purposive sample study design allows the project team to strategically identify and interview respondents with various perspectives at various points during the implementation study (see A2 for more information on site visits). The study design calls for the minimum number of staff and participant hours necessary for successful and complete data collection. To reduce the time burden on agency administrators and staff in the implementation study, the project team will conduct the interviews and focus groups with as little disruption to participants as possible and will work with agency leaders and staff to determine the most appropriate respondents for each interview and focus group.



A7. Now subsumed under 2(b) above and 10 (below)



A8. Consultation

Federal Register Notice and Comments

In accordance with the Paperwork Reduction Act of 1995 (Pub. L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), ACF published a notice in the Federal Register announcing the agency’s intention to request an OMB review of this information collection activity. This notice was published on April 15, 2021, Volume 86, Number 71, pages 19892-19893, and provided a sixty-day period for public comment. During the notice and comment period, the agency did not receive any comments.


Consultation with Experts Outside of the Study

The LifeSet developer and New Jersey DCF were consulted on the study design and data collection protocols to ensure the study builds on lessons learned from the previous impact evaluation of the program in Tennessee, is feasible in the context in which it will take place, and measures the correct activities and outcomes so that the results will be of broad use to the field.



A9. Tokens of Appreciation

Impact Study:

Youth Surveys: The project team plans to offer youth a token of appreciation in the form of a gift card (or e-gift card if survey interviews are conducted virtually) for each survey wave in which they participate. The project team plans to offer a $25 token of appreciation to youth who participate in the baseline survey (Instrument 1), which is estimated to take 35 minutes, and a $50 token of appreciation for each of the follow-up surveys (future information request), which are estimated to take up to 75 minutes. Minimizing nonresponse is critically important for impact evaluations of services like LifeSet. In order for the study to produce internally valid estimates of the program’s impact, it is important to secure participation from as many randomized youth as possible, including those participating in the program and those in the control group. The tokens are intended to offset costs of participation in the study, such as childcare, technology costs if surveys are conducted virtually (i.e., phone minutes, data plan), or other expenses, and to help guard against nonresponse bias by supporting the participation of individuals who face more constraints on their ability to participate. The $25 and $50 tokens are intended to be high enough to support participation but not so high as to appear coercive to potential participants.


Panel Outreach: Maintaining high retention rates across survey waves is critical to producing reliable impact estimates. To aid in survey retention, the project team plans to enclose a $10 token of appreciation, in the form of a gift card, in the panel outreach mailings detailed in Supporting Statement B, Section B4, Collection of Data and Quality Control. This token encourages participants to maintain contact with the survey firm, RTI International, helping to ensure adequate response rates in the follow-up surveys so the study can produce internally valid estimates of program impact. This approach is supported by research conducted for the Panel Study of Income Dynamics (PSID) and the PSID Transition to Adulthood Study, in which families and young adults received a contact update mailing between rounds of the study to maintain engagement and to collect contact updates before the next round of data collection. Sample members receiving a $10 or $20 incentive for returning the contact update form had significantly higher return rates, particularly in the young adult sample, where return rates were 23-25 percentage points higher than for those receiving no incentive for completing the request (McGonagle et al., 2011).


Implementation Study:

Administrators and staff from the child welfare agency, LifeSet provider agencies, and the LifeSet developer will not receive a token of appreciation for participating in interviews or focus groups.


A10. Privacy: Procedures to protect privacy of information, while maximizing data sharing

Personally Identifiable Information

Impact Study:

The project team will obtain study participants’ personally identifiable information (PII), including names, email addresses, phone numbers, physical addresses, and birth dates, from the child welfare agency’s administrative data. Collection of this information is required for performing random assignment, conducting outreach to participants to complete surveys, and obtaining participants’ administrative data from other agencies. Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.


Implementation Study:

The project team will obtain names, email addresses, and phone numbers of administrators and staff within the child welfare agency, LifeSet provider agencies, and the LifeSet developer. Collection of this information is required in order to schedule interviews and focus groups with program and agency staff during the site visits. Information will not be maintained in a paper or electronic system from which data are actually or directly retrieved by an individual’s personal identifier.


Assurances of Privacy

Information collected will be kept private to the extent permitted by law. Respondents will be informed of all planned uses of data, that their participation is voluntary, and that their information will be kept private. The project team will comply with all Federal and Departmental regulations for private information.


Due to the sensitive nature of this research (see A.11 for more information), the evaluation will obtain a Certificate of Confidentiality. The study team has applied for this Certificate and will provide it to OMB once it is received. The Certificate of Confidentiality helps to assure participants that their information will be kept private to the fullest extent permitted by law.


The project team has also obtained preliminary Institutional Review Board (IRB) approval for all data collection under this contract. During project training, all researchers working with the data will read and sign an IRB-approved confidentiality pledge, developed by the Urban Institute, agreeing to adhere to the data security procedures laid out in the approved IRB submission.


Impact Study:

For the impact study, the project team will use the informed consent form designed for participants aged 18 or older (Appendix A) and the informed assent form designed for participants under age 18 (Appendix B). The informed consent form covers participation in the baseline youth survey, permission to obtain their administrative data, and permission to contact them for future surveys. The informed assent form only covers participation in the baseline youth survey and permission to contact them for future surveys. Participants under age 18 at study enrollment who provide assent will be asked to provide consent after they turn 18. These consent/assent statements detail the risks and benefits of participating in the study and the level of expected privacy for each participant. Study participants will be informed that they may choose not to answer any or all items during the baseline survey interview and that participation in the baseline survey does not obligate them to participate in follow-up surveys nor to the collection of their administrative data. Some sensitive questions will be asked in the baseline youth survey. Participants can choose to answer these questions using audio computer assisted self-interviewing (audio-CASI) in order to keep their responses private from the field interviewer. With participant permission, portions of the baseline survey interview will be recorded for data quality assurance purposes.


Implementation Study:

For interviews and focus groups with administrators and staff of the child welfare agency, LifeSet provider agencies, and the LifeSet developer, the project team will request verbal consent at the start of each discussion (Appendix L). Participants will be provided a physical copy of the consent form before the discussion if it is in person, or presented with the consent form via video or email if it is done virtually. This form details the risks and benefits of participating and the level of expected privacy for each participant. Administrators and staff are categories of respondents not designated as vulnerable populations, and the information the research team will collect is not highly sensitive. The project team will ask respondents for factual information about their programs and work (e.g., what the programs do, the number of people they serve, who is eligible, the outreach and referral process). Because some study participants will be agency or organization leaders, administrators, or staff members, and because the project team will name the site in its reports, individuals reading the reports may be able to attribute particular information or comments to a particular respondent. The project team will tell respondents about this potential risk. With respondents’ permission, the project team will audio record the interviews and focus groups to minimize time needed for potential follow-up to clarify notes.


Data Security and Monitoring

The contract with the Urban Institute explicitly requires a data security plan that outlines how the project will store, transfer, and destroy sensitive information as well as the precautions to be taken during each of those activities to ensure the security of those data. As specified in the contract, the Contractor shall protect respondent privacy to the extent permitted by law and will comply with all Federal and Departmental regulations for private information. The Contractor has developed a Data Security Plan that assesses all protections of respondents’ PII. The Contractor shall ensure that all of its employees, subcontractors (at all tiers), and employees of each subcontractor, who perform work under this contract/subcontract, are trained on data privacy issues and comply with the above requirements. 


The data security plan meets the requirements of U.S. federal government agencies and is continually reviewed in the light of new government requirements. Such security is based on (1) exacting company policy promulgated by the highest corporate officers in consultation with systems staff and outside consultants, (2) a secure systems infrastructure that is continually monitored and evaluated with respect to security risks, and (3) secure work practices of an informed staff that take all necessary precautions when dealing with private data.


The research team will archive the data at the National Data Archive on Child Abuse and Neglect (NDACAN) as required by its contract with ACF with the following provisions:

  • All personal identifying information will be stripped from the file.

  • To prevent secondary disclosure, the Urban Institute will conduct disclosure analysis and mask, suppress, or categorize any items that could lead to identification of individuals.


Impact study data to be archived include all three waves of survey data and administrative data sources. Implementation study data from program and agency administrators and staff will be combined and quantified based on patterns across respondents before archiving. No implementation study data collected from service recipients (to be included in future requests) will be archived. Archived data will only be available under password-protected secure access.



A11. Sensitive Information[2]

Impact Study:

The baseline youth survey (Instrument 1) includes sensitive questions and the administrative data list (Instrument 2) includes sensitive information about study participants. The collection of sensitive information is necessary to understand the program’s impact on participants, and subgroups of participants, in the domains listed in section A2. Sensitive information collected for the impact study is outlined in Table A2 below by instrument:


Table A2.

Type of Sensitive Information | Instrument 1 – Baseline Youth Survey | Instrument 2 – Administrative Data List
Sexual orientation and gender identity (SOGI) | X |
Mental or emotional problems | X |
Receipt of mental health services | X | X
Substance use, including illicit substances | X |
Criminal justice involvement | X | X
Receipt of economic assistance from the government | | X
Victim or perpetrator of spouse/partner violence | X |
Experiences of homelessness | X | X


Format of SOGI items: To collect study participants’ sexual orientation and gender identity on the baseline youth survey (Instrument 1, items sorient_w1 and gender_w1, respectively), the project team proposes to use a format different from the guidelines presented in the 2016 report of the Federal Interagency Working Group on Improving Measurement of Sexual Orientation and Gender Identity in Federal Surveys, “Evaluations of Sexual Orientation and Gender Identity Survey Measures: What Have We Learned?”. Table A3 shows the response options for both items as proposed in Instrument 1. For both items, the project team proposes offering respondents more identity options than are suggested in the Federal Interagency Working Group’s 2016 guidelines. For the sexual orientation item, the project team proposes a single question to be used for all respondents rather than separate questions based on respondents’ gender identity.


Table A3

Item

Question

Response Options

Gender identity

What gender do you identify as?

[INTERVIEWER: If needed, definitions noted in brackets]


1 GENDERQUEER/NONBINARY [do not identify as either a woman or a man]

2 TRANSGENDER MAN [assigned female at birth but identify as a man]

3 TRANSGENDER WOMAN [assigned male at birth but identify as a woman]

4 CISGENDER WOMAN [assigned female at birth and identify as a woman]

5 CISGENDER MAN [assigned male at birth and identify as a man]

6 I IDENTIFY AS, SPECIFY

D DON’T KNOW

R REFUSED

Sexual orientation

What is your sexual orientation?

[INTERVIEWER: If needed, definitions noted in brackets]


1 GAY [you identify as a man and are attracted to men]

2 LESBIAN [you identify as a woman and are attracted to women]

3 BISEXUAL [you are attracted to men and women]

4 PANSEXUAL [you are attracted to people regardless of their sex or gender identity]

5 ASEXUAL [you do not experience sexual attraction]

6 STRAIGHT [you identify as a woman and are attracted to men, or you identify as a man and are attracted to women, also called heterosexual]

7 I IDENTIFY AS, SPECIFY

D DON’T KNOW

R REFUSED


Surveys of middle and high school students have found that between 1.3 and 2.7 percent of youth self-identified as transgender or gender non-conforming (Goodman et al., 2019). A 2020 Gallup poll found that people under age 24 are more likely than other age groups to identify as LGBT (Jones, 2021). Analysis of the Human Rights Campaign’s LGBTQ National Teen Survey found that 26 percent of responding youth used non-traditional terms, such as pansexual or asexual, to describe their sexual orientation (Watson et al., 2020). Recent studies have demonstrated that LGBTQ youth are represented in foster care at nearly double their rate in the general population (Mountz and Capous-Desyllas, 2020). A survey of youth aged 12 to 21 in foster care in Los Angeles reported that 14 percent identified as lesbian, gay, bisexual, or questioning and 5.6 percent identified as transgender or gender non-conforming (Wilson et al., 2014). A survey of youth in foster care aged 12-20 in New York City found that 30 percent identified their sexuality as lesbian, gay, bi- or pansexual, or questioning and 13 percent identified as transgender (Sandfort, 2020). These studies demonstrate that various sexual orientations and non-binary genders are prevalent in the study’s target population, especially when compared to the general population, supporting the need for more inclusive response options.


Consent for secondary use of data: The informed consent and assent forms state that information collected from the surveys and administrative data will be de-identified and archived for secondary analysis by other researchers.


Parental permission: Some study participants may be minors when contacted for the baseline survey. The public child welfare agency will provide consent for minors who are in its guardianship or custody. For minors who are in the custody, but not the guardianship, of the public child welfare agency, passive parental permission for the youth’s participation in the baseline youth survey will be obtained by mailing a letter to the parent’s last known address (Appendix C). Parents may revoke permission by contacting the project’s co-PI at the Urban Institute.


All impact study information collection protocols have preliminary approval from the Urban Institute IRB and will receive full approval prior to any data collection (Appendix M).


Implementation Study:

No sensitive questions will be asked of program or agency staff.



A12. Burden

Explanation of Burden Estimates

Burden for each proposed impact study instrument was estimated based on the project team’s experience with the time required for similar data collections. To estimate the burden for each proposed implementation study instrument, the project team piloted each instrument internally and considered the amount of time allotted for each interview or focus group during any given site visit. Each instrument, and the data collection effort overall, was designed to maximize the efficiency of data collection activities and minimize burden on participants.


Estimated Annualized Cost to Respondents

The total annual respondent cost was estimated based on the Bureau of Labor Statistics’ wage data. The total annual cost burden to respondents is approximately $3,156.88. For administrators and managers of DCF, the LifeSet developer, and LifeSet provider agencies, the figure ($44.41/hr) is based on the mean wages for “Social and Community Service Managers,” job code 11-9151, as reported in the May 2020 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wages for New Jersey.[3] For front-line staff at LifeSet provider agencies, the figure ($34.19/hr) is based on the mean wages for “Child, Family, and School Social Workers,” job code 21-1021, as reported in the May 2020 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wages for New Jersey.[4] For youth and young adults, the $15 figure is based on the New Jersey minimum wage, which is set to incrementally increase to a ceiling of $15 an hour in 2024.[5] For the compilation and submission of administrative data files, the figure ($26.50/hr) is based on the mean wages for “Statistical Assistants,” job code 43-9111, as reported in the May 2020 U.S. Department of Labor, Bureau of Labor Statistics, Occupational Employment and Wages for New Jersey.[6]



Table A4.

Instrument | No. of Respondents (total over request period) | No. of Responses per Respondent (total over request period) | Avg. Burden per Response (in hours) | Total Burden (in hours) | Annual Burden (in hours) | Average Hourly Wage Rate | Total Annual Respondent Cost
Instrument 1 – Baseline Youth Survey | 600 | 1 | 0.6 | 360 | 120 | $15.00 | $1,800.00
Instrument 2 – Administrative Data File | 12 | 1 | 5 | 60 | 20 | $26.50 | $530.00
Instrument 3 – Site Visit 1 Interview Guide for Administrators | 22 | 1 | 1 | 22 | 7 | $44.41 | $310.87
Instrument 4 – Site Visit 2 Focus Group Guide for Staff | 12 | 1 | 1.5 | 18 | 6 | $34.19 | $205.14
Instrument 5 – Site Visit 2 Interview Guide for Administrators | 22 | 1 | 1 | 22 | 7 | $44.41 | $310.87
Total | | | | 482 | 160 | | $3,156.88
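
As a cross-check on the figures in Table A4, the sketch below reproduces the burden and cost arithmetic: total burden equals respondents × responses × average burden per response, annual burden divides that total by the three-year request period (rounded to whole hours, as in the table), and annual cost multiplies annual burden by the hourly wage. This is an illustrative calculation only, not part of the information collection.

```python
# Illustrative recomputation of the Table A4 burden and cost estimates.
# Inputs are taken directly from the table above; annualization assumes the
# requested three-year approval period, with annual burden rounded to whole
# hours before applying the hourly wage, matching the table's figures.
rows = [
    # (instrument, respondents, responses per respondent, avg burden hours, hourly wage)
    ("Instrument 1 - Baseline Youth Survey", 600, 1, 0.6, 15.00),
    ("Instrument 2 - Administrative Data File", 12, 1, 5.0, 26.50),
    ("Instrument 3 - Site Visit 1 Interview Guide", 22, 1, 1.0, 44.41),
    ("Instrument 4 - Site Visit 2 Focus Group Guide", 12, 1, 1.5, 34.19),
    ("Instrument 5 - Site Visit 2 Interview Guide", 22, 1, 1.0, 44.41),
]

total_burden, total_annual_burden, total_annual_cost = 0.0, 0, 0.0
for name, respondents, responses, hours, wage in rows:
    burden = respondents * responses * hours   # total burden hours over the request period
    annual_burden = round(burden / 3)          # annualized over the 3-year approval period
    annual_cost = annual_burden * wage         # total annual respondent cost
    total_burden += burden
    total_annual_burden += annual_burden
    total_annual_cost += annual_cost
    print(f"{name}: {burden:.0f} hrs total, {annual_burden} hrs/yr, ${annual_cost:,.2f}/yr")

# Expected totals: 482 hrs, 160 hrs/yr, $3,156.88/yr (as shown in Table A4).
print(f"Totals: {total_burden:.0f} hrs, {total_annual_burden} hrs/yr, ${total_annual_cost:,.2f}/yr")
```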



A13. Costs

There are no additional costs to respondents.



A14. Estimated Annualized Costs to the Federal Government

The total cost for the data collection activities under this current request is estimated to be $1,587,900 with an annualized cost of $529,300. Costs related to data collection under future requests will be included within those information collection requests.


Table A5.

Cost Category | Current Request Estimated Costs
Instrument Development and OMB Clearance | $82,600
Impact Study Survey Collection | $1,004,000
Implementation Study Field Work | $184,600
Analysis | $200,700
  Impact Study | $80,280
  Implementation Study | $120,420
Publications/Dissemination | $116,000
Total costs over the request period | $1,587,900
Annual costs | $529,300




A15. Reasons for changes in burden

This is a new information collection request.



A16. Timeline

Table A6 below provides a data collection schedule for both the current and future requests. The project team will prepare a final report for public dissemination following the completion of data collection. See Supporting Statement B, section B7 for additional information about plans for dissemination.


Table A6.

Task | Description | Timeframe (after OMB approval)

Current request:
Baseline youth survey | Collect baseline data from treatment and control group participants | Months 1-30
Site visits (including interviews and focus groups) | Interviews with program leaders; focus groups with staff | Months 1-12
Administrative data | Engage with agencies and programs regarding administrative data and data sharing agreements | Months 1-36

Future requests:
12-month follow-up survey | Collect outcome data from treatment and control group participants | Months 13-40
Site visits (including interviews, focus groups, and observations) | Interviews with program leaders; focus groups with staff; interviews with treatment group participants; focus groups with treatment and control group participants; observations of LifeSet program activities | Months 13-37
24-month follow-up survey | Collect outcome data from treatment and control group participants | Months 25-52
Administrative data | Receive administrative data | Months 48-51
Analysis | | Months 51-57
Reporting and disseminating findings | Individual formative evaluation reports | Months 57-63




A17. Exceptions

No exceptions are necessary for this information collection.


Attachments

Instrument 1: Baseline Youth Survey

Instrument 2: Administrative Data List

Instrument 3A: Site Visit 1 Interview Guide for Administrators: Child Welfare Agency Administrators

Instrument 3B: Site Visit 1 Interview Guide for Administrators: Licensed LifeSet Experts

Instrument 3C: Site Visit 1 Interview Guide for Administrators: LifeSet Developer Administrators

Instrument 3D: Site Visit 1 Interview Guide for Administrators: Provider Agency Administrators

Instrument 4A: Site Visit 2 Focus Group Guide for Staff: LifeSet Specialists

Instrument 4B: Site Visit 2 Focus Group Guide for Staff: LifeSet Team Supervisors

Instrument 5A: Site Visit 2 Interview Guide for Administrators: Child Welfare Agency Administrators

Instrument 5B: Site Visit 2 Interview Guide for Administrators: Licensed LifeSet Experts

Instrument 5C: Site Visit 2 Interview Guide for Administrators: LifeSet Developer Administrators

Instrument 5D: Site Visit 2 Interview Guide for Administrators: Provider Agency Administrators

Appendix A: Young Adult Study Consent Form

Appendix B: Youth Study Assent Form

Appendix C: Notification Letter to Parents of Minors at Baseline

Appendix D: Baseline Youth Survey Lead Letters

Appendix E: Baseline Youth Survey Fact Sheets

Appendix F: Baseline Youth Survey Refusal Letters

Appendix G: Panel Maintenance Tracking Scripts

Appendix H: Panel Maintenance Letter

Appendix I: Panel Maintenance Postcard

Appendix J: Outreach Email Staff Interviews and Focus Groups – Staff Connected to via Program/Agency

Appendix K: Outreach Email Staff Interviews and Focus Groups – Staff Not Connected to via Program/Agency

Appendix L: Implementation Study Informed Consent for Staff

Appendix M: IRB Approval Letter


References

Annie E. Casey Foundation. 2020. “KIDS COUNT Data Center.” https://datacenter.kidscount.org/data/tables/6277-children-exiting-foster-care-by-exit-reason?loc=1&loct=2#detailed/1/any/false/37,871,870,573,869,36,868,867,133,38/2631,2636,2632,2633,2630,2629,2635,2634/13050,13051



Berlin, M., L. Mohadjer, J. Waksberg, A. Kolstad, I. Kirsch, D. Rock, and K. Yamamoto. 1992. “An Experiment in Monetary Incentives.” In Proceedings of the Survey Research Methods Section, 393–398. Alexandria, VA: American Statistical Association.

Courtney, Mark E., N.J. Okpych, J. Harty, H. Feng, S. Park, J. Powers, M. Nadon, D.J. Ditto, and K. Park. 2020. “Findings from the California Youth Transitions to Adulthood Study (CalYOUTH): Conditions of Youth at Age 23.” Chicago, IL: Chapin Hall at the University of Chicago. https://www.chapinhall.org/research/calyouth-wave4-report/.


Courtney, Mark E., P. Charles, N.J. Okpych, L. Napolitanto, and K. Halsted. 2014. “Findings from the California Youth Transitions to Adulthood Study (CalYOUTH): Conditions of Foster Youth at Age 17.” Chicago, IL: Chapin Hall at the University of Chicago. https://www.chapinhall.org/research/study-of-youth-in-california-foster-care-at-age-17-reveals-need-for-ongoing-support/.


Dworsky, Amy, Mark E. Courtney, Jennifer Hook, Adam Brown, Colleen Cary, Kara Love, Vanessa Vorhies, et al. 2011. “Midwest Evaluation of the Adult Functioning of Former Foster Youth.” https://www.chapinhall.org/research/midwest-evaluation-of-the-adult-functioning-of-former-foster-youth/.


Epstein, D., & Klerman, J. A. 2012. When is a program ready for rigorous impact evaluation? The role of a falsifiable logic model. Evaluation Review, 36(5), 375-401.


Goodman, Michael, Noah Adams, Trevor Corneil, Baudewijntje Kreukels, Joz Motmans, and Eli Coleman. 2019. "Size and distribution of transgender and gender nonconforming populations: a narrative review." Endocrinology and Metabolism Clinics 48, no. 2: 303-321.


Jones, Jeffrey M. 2021. “LGBT Identification Rises to 5.6% in Latest U.S. Estimate.” https://news.gallup.com/poll/329708/lgbt-identification-rises-latest-estimate.aspx


McGonagle, Katherine A., Robert F. Schoeni, Mick P. Couper, and Mohammad Mushtaq. 2011. "An incentive experiment designed to increase response to a between-wave contact update mailing in two panel studies." Survey Practice 4 (3). http://surveypractice.wordpress.com/2011/06/.


Mountz, Sarah, and Moshoula Capous-Desyllas. 2020. "Exploring the families of origin of LGBTQ former foster youth and their trajectories throughout care." Children and Youth Services Review 109.


Pergamit, Michael, Mary Cunningham, Devlin Hanson, and Alexandra Stanczyk. 2019. “Does Supportive Housing Keep Families Together?” Supportive Housing for Child Welfare Families Research Partnership. Washington, D.C.: Urban Institute, May. https://www.urban.org/sites/default/files/publication/100289/does_supportive_housing_keep_families_together_1.pdf.


Sandfort, Theo GM. 2020. "Experiences and Well-Being of Sexual and Gender Diverse Youth in Foster Care in New York City: Disproportionality and Disparities." New York City Administration for Children's Services (ACS). https://www1.nyc.gov/assets/acs/pdf/about/2020/WellBeingStudyLGBTQ.pdf


Watson, Ryan J., Christopher W. Wheldon, and Rebecca M. Puhl. 2020. "Evidence of diverse identities in a large national sample of sexual and gender minority adolescents." Journal of Research on Adolescence 30: 431-442. https://doi.org/10.1111/jora.12488


Wilson, Bianca D.M., Khush Cooper, Angeliki Kastanis, and Sheila Nezhad. 2014. "Sexual and gender minority youth in foster care: Assessing disproportionality and disparities in Los Angeles." Los Angeles: The Williams Institute. https://williamsinstitute.law.ucla.edu/wp-content/uploads/SGM-Youth-in-Foster-Care-Aug-2014.pdf



[1] Additional requests to extend data collection beyond the initial three-year approval will be submitted to OMB.

[2] Examples of sensitive topics include (but are not limited to): social security number; sex behavior and attitudes; illegal, anti-social, self-incriminating, and demeaning behavior; critical appraisals of other individuals with whom respondents have close relationships, e.g., family, pupil-teacher, employee-supervisor; mental and psychological problems potentially embarrassing to respondents; religion and indicators of religion; community activities which indicate political affiliation and attitudes; legally recognized privileged and analogous relationships, such as those of lawyers, physicians, and ministers; records describing how an individual exercises rights guaranteed by the First Amendment; receipt of economic assistance from the government (e.g., unemployment or WIC or SNAP); immigration/citizenship status.

[3] “Occupational Employment Statistics: Occupational Employment and Wages, May 2020,” Bureau of Labor Statistics, accessed April 7, 2021, https://www.bls.gov/oes/current/oes119151.htm

[4] “Occupational Employment Statistics: Occupational Employment and Wages, May 2020,” Bureau of Labor Statistics, accessed April 7, 2021, https://www.bls.gov/oes/current/oes211021.htm

[5] “New Jersey’s Minimum Wage,” Department of Labor and Workforce Development, accessed May 13, 2021, https://www.nj.gov/labor/wageandhour/assets/PDFs/minimumwage_postcard.pdf

[6] “Occupational Employment Statistics: Occupational Employment and Wages, May 2020,” Bureau of Labor Statistics, accessed April 7, 2021, https://www.bls.gov/oes/current/oes439111.htm


