Attachment A. NextGen site assessment discussion guide_rev

Formative Data Collections for ACF Research

OMB: 0970-0356


Attachment A MATHEMATICA

Next Generation of Enhanced Employment Strategies Project

Site Assessment Discussion Guide

Introduction

Note: If we have already conducted information-gathering interviews with this respondent, we will skip the introduction script.

My name is [NAME] and I am from Mathematica Policy Research. On behalf of the Office of Planning, Research, and Evaluation in the federal Administration for Children and Families, we are conducting a study of innovative, promising employment interventions for low-income individuals facing complex challenges to employment. “Complex challenges to employment” is a broad concept that could encompass, for example, physical and mental health conditions, substance misuse, a criminal history, or limited work skills and experience. The study is called the Next Generation of Enhanced Employment Strategies Project (NextGen Project). It will help us learn more about how employment programs can help low-income people move toward economic independence.

If asked: The Office of Planning, Research, and Evaluation is also conducting a companion study, Building Evidence on Employment Strategies for Low-Income Families Project (BEES). The projects are both designed to test innovative employment interventions that are rooted in the best available research and aim to help low-income people move toward economic independence. The projects are coordinating closely. Depending on the types of interventions we find that warrant evaluation, the two projects may focus on different interventions or different populations.

If a program that might qualify for Section 1110 funds: Additionally, our project [and the BEES project if above noted] is working closely with the Social Security Administration (SSA) to incorporate a focus on employment-related early interventions for individuals with current or foreseeable disabilities who have limited work history and are potential applicants for Supplemental Security Income (SSI).

We plan to conduct a rigorous, random assignment study of about nine interventions. The study will examine the implementation of the intervention and estimate its impacts on employment and other outcomes. We will also assess the intervention’s costs. We have identified [PROGRAM NAME] as operating an intervention to consider for inclusion in the evaluation. Right now, we are collecting information on interventions that might be a good fit for the study; we will select programs to invite to participate in the study later. We would like to ask you some questions about your program and intervention to determine whether they are well suited to be included in the evaluation. An agency may not conduct or sponsor, and a person is not required to respond to, a collection of information unless it displays a currently valid OMB control number. This information collection has been approved under OMB information collection request 0970-0356, which expires on June 30, 2021.

Participating in an evaluation has benefits for you, for other programs like yours, and for the clients you all serve. Funds are available for us to work with programs to refine their operations so that interventions work as well as possible before the evaluation begins. All participating programs will receive funding to support their involvement in the rigorous evaluation.

If a program might qualify for Section 1110 funds: For interventions focused on people with current or foreseeable disabilities who are potential applicants for SSI, there may be additional funds that can be used to bolster the intervention—for example, by adding an employment service component to an existing intervention, by increasing the services already being provided, or by increasing the number of people who are recruited for or served by the program.

Programs that participate in random assignment studies can benefit by learning about the effectiveness of their interventions and how to improve them. They will build their capacity to collect and use data to inform their operations. They also gain an objective assessment of their intervention that policymakers and funders will trust. If the study shows the intervention is effective, it might be easier to pursue additional funding.

Your participation in this information-gathering interview is voluntary and, should you wish to participate, the information you provide will be kept private within the project team and the Office of Planning, Research, and Evaluation. This interview should take about an hour [for staff and supervisors] OR an hour and a half [for administrators/managers].

Before we begin, do you have any questions about the project?





  A. Respondent background

Note: Probe for any changes as a result of COVID-19.

  1. What is your job title?

  2. How long have you been working at [organization name]?

  3. What is your main role at [organization name]?

  4. What is your role in implementing [intervention name]? What are your main responsibilities?

B. Overview of the intervention and objectives

Note: In some cases we will have already collected this information through stage 2 information-gathering phone calls or through direct outreach to program administrators. If so, we will use the information already collected under that effort and will not ask respondents to repeat it. As applicable, probe for whether the intervention changed as a result of COVID-19.

This section is for the program administrator/manager.

  1. Tell me about [intervention name].

  a. What are the objectives of the intervention?

  b. What are the main components of the intervention?

  c. How is the intervention situated within the larger organization? That is, is it the main service provided, or a small part of the overall organization’s services?

  d. For programs that operate in multiple locations: Where is this intervention available? How similar are the intervention and target population across these locations? Who makes most decisions about how the program is implemented in these locations (local staff or centralized staff)?

  e. How long has the organization been implementing the intervention?

  f. What are the characteristics of participants? Do you have any data or reports we could see? Probe for age, education level, income, disability, health or substance use challenges, work history, and other employment challenges. Do you know about what proportion might be receiving Supplemental Security Income?

  g. How is the intervention funded?

  h. What written materials about the intervention or program could you share with me?

C. Intake and services

Note: In some cases we will have already collected this information through stage 2 information-gathering phone calls or through direct outreach to program administrators. If so, we will use the information already collected under that effort and will not ask respondents to repeat it. As applicable, probe for whether the intervention made changes as a result of COVID-19. Changes may include offering virtual services, implementing social distancing or other safety measures, adding or discontinuing services, and furloughing staff or changing roles.

  1. How do participants usually find out about the intervention? Probe for referral from human service agency, court, health treatment center, etc.

  a. Are any referrals for mandatory participation (e.g., from courts)?

  2. Walk me through what happens when someone first expresses interest in participating.

  a. Where are they determined eligible, and by whom? Are partners involved in intake?

  b. What are the eligibility criteria? To what extent do intake staff have discretion over the criteria?

  c. What do prospective participants need to do to be enrolled? For example, what documentation do they need to provide?

  d. How long does it take from first contact to enrollment? About what fraction of potential participants who begin enrollment end up completing enrollment?

  3. If an individual does not seem right for the intervention, what happens? Do staff refer them to [organization] or other providers for services?

  a. If referred to [organization], would they receive any of the services that [intervention] participants receive?

  4. What are the main services the intervention offers to participants? Probe for:

  • Assessments

  • Education

  • Occupational skills training

  • Life or soft skills training

  • Job search assistance

  • Paid or unpaid work experiences other than employment

  • Subsidized or competitive employment

  • Case management

  • Coaching

  • Benefits counseling

  • Service coordination

  • Support services

  • Referrals

  • Financial support or incentives (probe for how much)

  • Substance abuse treatment

  • Mental, emotional, or physical health treatment

  • Expungement or legal assistance


  a. Do all participants receive the same suite of services? If not, how do you decide which services are provided to each participant, and in what order?

  b. If all do not receive the same services, what services do most participants receive?

  c. What are some of the main drop-off points—that is, points at which participants tend to disengage? About what fraction drops off at these points?

  d. Of the services you just described, are these only for [intervention] participants, or can people enrolled in other programs access them? Please explain.

  5. Are any of the services provided by employers?

  a. How many employers are involved? What are their industry, size, etc.?

  b. What do they offer to participants?

  c. Are participants paid? Who pays them (that is, the program, employers, some combination)?

  6. Are any of the services provided by partners other than employers?

  a. How many partners are involved? What types of organizations are they (e.g., health treatment providers, educational institutions, other non-profits)?

  b. What services do they offer to participants?

  7. Do program staff encourage participants to apply for public assistance benefits, such as TANF, SNAP, or SSI? What kinds of support do staff provide to help them apply?

  8. [For administrators/managers] How many staff work directly with participants?

  a. What are their backgrounds and qualifications?

  b. How many participants does each staff member work with? Do staff carry a caseload, and if so, what is the average?

  c. How many staff would the typical participant interact with?

  d. How many supervisors are there? How many staff do they supervise?

  e. What is staff turnover like? About what fraction leave each year?

  9. [For administrators/managers] Do you track intervention costs? If so, how much does your program cost per participant? Do participants pay any out-of-pocket costs?

D. Intervention validity

Note: This section is only asked of administrators/program developers.

  1. How was the intervention developed? Who developed it? Were employers involved in its development?

  2. Has [intervention] ever been evaluated?

  a. If so, who conducted the evaluation? Is there a report available that describes the evaluation’s methods and findings?

  b. If not, did the development process involve an assessment of its effectiveness? What outcomes were measured and when? What were the findings?

E. Implementation integrity

Note: As applicable, probe for whether changes happened as a result of COVID-19, for example, changes to participant engagement, program completion expectations, and staff training.

  1. [For administrators/managers] Does the intervention have a well-defined, standardized model? If so: Has this program’s fidelity to that model been assessed by a third party? What was the result of that assessment?

  2. [For administrators/managers] About what percentage of the staff would you estimate is implementing the program as designed? How do you know? What strategies do you use to address any issues?

  3. [For administrators/managers] Are specific program components less likely than others to be implemented as designed? Which ones and why?

  4. On average, how frequently do participants receive intervention services? How does that compare to your expectations for how often they should be receiving services?

  5. How do you define program completion? About what fraction of participants complete the program? On average, how long (in calendar time) do participants participate in the program? What follow-up services are provided?

  6. For those who do not complete the program, about how often do they return in the same year (or other program-relevant period)? How does the program handle this?

  7. How engaged do participants seem to be? How well do they think it is working?

  8. How much training on the intervention did you/your staff receive, both upfront and ongoing?

  a. How is it delivered (for example, in person, webinar, on-the-job)? By whom?

  b. Is there a training manual? If so, could we have a copy?

  c. Is any follow-up or ongoing training provided?

  9. Was the training you/your staff received sufficient to implement the program as designed? If not, what additional training would be helpful?

  10. How comfortable are you/your staff in implementing the intervention?

  11. Does the intervention have standardized tools and resources that you/your staff are supposed to use with participants?

  a. [If so]: Describe the tools and what each is used for.

  i. Do the tools have a cost? If so, what is it?

  ii. Do you/your staff regularly use these tools with participants? If not, why not?

  iii. Do you/your staff use these tools as intended? If not, how do you/they use them?

  iv. Could you share copies of the tools with us?

  b. [If not]: Are staff using other tools and resources that are not standard to the program? If so, please describe.

  12. How often do supervisors meet with staff to discuss program implementation? Probes: this could include case consultations, observations of staff-client interactions, staff meetings, etc.

  13. What happens if a supervisor determines a staff person is not implementing the intervention as intended? Probe for whether additional training is provided, the staff person is given more supervision, and so on.

  14. [For supervisors:] About what percentage of staff need ongoing support to implement the intervention?

  15. [For managers/supervisors:] Do you have processes in place to monitor whether the program is being implemented as intended? If so, describe.

  16. How enthusiastic are you about the intervention? How well do you think it is working?

  17. How enthusiastic are your staff/your leadership about the intervention? How well do they think it is working?

F. Feasibility of evaluation

Note: This section is only to be asked of managers/administrators.

1. Participant recruitment, flow, and program size

  1. About how many new participants does your program serve in an average month? How many total participants does your program serve each month? How many participants did you serve in the last year? Probe for whether this changed during COVID-19.

  2. Do you maintain a waitlist? If you had the funds to serve more people, do you think you could find people eligible for the intervention who would want to participate? How would you find these people?

  3. What changes to staffing or other resource allocation would be needed to serve additional participants? What factors might limit growth of the program? Probe for: facilities, costs, staff, eligible population, competing programs.

2. Counterfactual condition

  1. Are there other places in your community where participants could receive services similar to those offered by the intervention?

  a. If so, what are some of those places? How are they similar to or different from your program?

  b. Would participants need to pay for those services out-of-pocket?

3. Program management and staff supportive of experimental evaluation

  1. If we were to conduct an evaluation, we would need to randomly assign clients to either receive the intervention services or not. [Suggest to the respondent how we would see this working. Note that if they recruit more eligible participants, then they would serve the same number of people.]

  a. Do you think the proposed process would work?

  b. Can I address any questions you have about that?

  2. Can staff commit to abiding by group assignment (i.e., not offering the intervention to someone who was randomly assigned to the control group)?

  3. Are leadership and staff interested in participating in research about the intervention? What would be the next steps for getting approval to participate in research about the intervention?

  4. How likely is it that the intervention will continue to be implemented without notable changes (e.g., staffing, budget, reorganization) in the near future?

4. Partnerships

Note: As applicable, probe for whether changes happened as a result of COVID-19.

  1. How do you coordinate with partner service providers or employers? Do you share a data system?

  2. For those partners that refer participants to you, what information about participants do they share?

  3. Do you think partners would agree to participate in a random assignment study? Would they have any particular concerns? Please explain.

  4. From which partners would we need buy-in to implement an evaluation?

G. Formative evaluation needs

Note: As applicable, probe for any changes made as a result of COVID-19 and how the program operated during the pandemic.

  1. Do you use data to inform program services or improve program operations? Explain.

  2. What is working well in how [intervention] is implemented? What are some examples of successes?

  3. What have been the main challenges? How were those challenges addressed?

  4. Is there anything you’re still working on addressing? What supports might you need?

H. COVID-19-specific questions

  1. How has your work been affected by COVID-19? Probes: working from home, changes to performance expectations, personal challenges faced, support received and still needed

  2. [For leadership only:] Describe how leadership responded to the pandemic. Probes: the planning process, implementation, communication, staff and participant considerations

  3. [For leadership only:] What, if any, changes made during the pandemic do you anticipate continuing after the pandemic?

  4. What lessons have you learned from this experience that can help inform future crisis response?

Next Steps

Thank you very much for your time today. I know I asked a lot of questions and probably raised some questions for you. What questions do you have? What concerns?

We are currently conducting similar visits to other programs to assess their suitability for the evaluation.

We may have some further questions for you before deciding on the set of interventions for the evaluation. Would that be ok?

If we reached mutual agreement to partner with you for this study, then it sounds like the next steps are [fill in based on responses to questions 44 and 48].





Author: Kristen Joyce
File created: 2023-10-30