The Social Security Administration and the Department of Labor are undertaking the Retaining Employment and Talent After Injury/Illness Network (RETAIN) demonstration, which will test promising early intervention approaches to improve the labor force participation and retention of individuals with recently acquired injuries and disabilities and to reduce their future need for Social Security disability benefits. The Department of Labor (DOL) is funding the intervention approaches and is funding the American Institutes for Research to provide programmatic technical assistance for the demonstration. The Social Security Administration (SSA) is funding Mathematica to provide evaluation support for the demonstration, including providing evaluation-related technical assistance and conducting a comprehensive evaluation.
The RETAIN demonstration consists of two phases. The first involves cooperative awards to eight states to conduct planning and start-up activities, including the launch of a small pilot. During Phase 1, SSA will provide evaluation-related technical assistance and planning support and will conduct evaluability assessments to determine which states’ projects would allow for a rigorous evaluation if continued beyond the pilot phase. DOL will select a subset of the states to continue to Phase 2, full implementation.1 During Phase 2, DOL will fund the operations and programmatic technical assistance activities for the selected states, and SSA will fund a full set of evaluation activities.
SSA is requesting clearance for the collection of data needed to evaluate RETAIN. The specific data collection efforts for which SSA is seeking OMB approval in this package include: (1) semi-structured interviews with program administrators and service providers, conducted during two rounds of site visits, that will focus on program implementation, perceptions of what worked well in each state’s program, and implementation challenges; (2) interviews with RETAIN service users to understand their experiences as they engage in program services; (3) staff activity logs to understand the costs of RETAIN services and inform the benefit-cost analysis; and (4) two rounds of RETAIN enrollee and provider surveys. The RETAIN enrollee surveys, planned for 2 and 12 months after enrollment, will focus on individual-level outcomes and perceptions of enrollees. The surveys will include up to 12,000 enrollees drawn from the states DOL selects for Phase 2, composed of individuals in the treatment and control groups. The provider survey, which will include a sample of up to 400 RETAIN providers, will be conducted 4 and 16 months after the launch of Phase 2. It will collect information to help explain how providers delivered services and will highlight any systems changes that might have occurred as a result of RETAIN.
The RETAIN Demonstration Projects are a collaborative effort led by the Department of Labor (DOL), in partnership with DOL’s Employment and Training Administration (ETA) and the Social Security Administration (SSA). RETAIN projects will test the impact of early intervention strategies that improve stay-at-work/return-to-work (SAW/RTW) outcomes of individuals who experience work disability while employed. “Work disability” is defined as an injury, illness, or medical condition that has the potential to inhibit or prevent continued employment or labor force participation.
SAW/RTW programs succeed by returning injured or ill workers to productive work as soon as medically possible during their recovery process and by providing interim part-time or light-duty work and accommodations, as necessary. The RETAIN Demonstration Projects are loosely modeled after promising programs operating in Washington State, including the Centers of Occupational Health and Education (COHE), the Early Return to Work (ERTW), and the Stay at Work programs. While these programs operate within the state’s workers’ compensation system and are available only to people experiencing work-related injuries or illnesses, the RETAIN Demonstration Projects provide opportunities to improve SAW/RTW outcomes for both occupational and non-occupational injuries and illnesses among people who are employed, or at a minimum in the labor force, when their injury or illness occurs.
Central to these projects is the early coordination of health care and employment-related supports and services to help injured or ill workers remain in the workforce. These supports and services include:
Training in occupational health best practices for participating health care providers;
Active involvement of a Return-to-Work Coordinator throughout the medical recovery period to facilitate continued employment;
Enhanced communication among workers, employers, and health care providers;
Accommodations and job modifications; and
Retraining and rehabilitation services.
To accomplish this, projects will provide services through an integrated network of partners that includes close collaboration between state and/or local workforce development entities, health care systems and/or health care provider networks, and other partners as appropriate.
The primary goals of the RETAIN Demonstration Projects are:
To increase employment retention and labor force participation of individuals who acquire, and/or are at risk of developing, work disabilities; and
To reduce long-term work disability among RETAIN service users, including the need for Social Security Disability Insurance and Supplemental Security Income.
The ultimate purpose of the demonstration is to validate and expand implementation of evidence-based strategies to accomplish these goals. DOL is funding the intervention approaches and programmatic technical assistance for the demonstration. SSA is funding evaluation support, including technical assistance and the full evaluation for the demonstration.
To inform the development of evidence-based strategies, the evaluation of Phase 2 implementation will include the following four components:
Participation analysis: Using RETAIN service user interviews and surveys, this analysis will provide insights into which eligible workers choose to participate in the program, in what ways they participate, and how services received vary with participant characteristics. It will also assess the characteristics of non-participants and, where possible, their reasons for not enrolling.
Process analysis: Using staff interviews and logs, this analysis will produce information about operational features that affect service provision; perceptions of the intervention design by service users, providers, administrators, and other stakeholders; the relationships among the partner organizations; each program’s fidelity to the research design; and lessons for future programs with similar objectives.
Impact analysis: This analysis will produce estimates of the effects of the interventions on primary outcomes, including employment and Social Security disability applications, and on secondary outcomes, such as health and service usage. SSA will identify an evaluation design for each state to generate impact estimates; the designs could be experimental or non-experimental.
Benefit-cost analysis: This analysis will assess whether the benefits of RETAIN justify its costs. We will conduct this assessment from a range of perspectives, including those of the participants, state and Federal governments, SSA, and society as a whole.
The purpose and proposed use of this information collection is to gather the qualitative and quantitative data needed to conduct the analyses described in Section 1. These activities, described in the text that follows, include (1) site visits conducted in Phase 2; (2) interviews with RETAIN service users; (3) staff activity logs; (4) surveys of RETAIN enrollees; and (5) surveys of RETAIN service providers.
Mathematica will conduct site visits, including in-person interviews with state administrators and program staff, and telephone interviews with RETAIN service users, in all of the Phase 2 RETAIN programs.
Information from these site visits will be used to examine research questions for the process evaluation to address three key objectives:
Document recruitment and enrollment activities. The process evaluation will document strategies RETAIN programs used to identify workers at risk of SSI/SSDI entry and recruit them into program services. It will also highlight the challenges RETAIN programs faced in recruiting and enrolling service providers and workers, and how they addressed the challenges.
Document the program’s model for service delivery. The process evaluation will document each RETAIN program’s logic model and sequence of services and assess how well states implemented services with fidelity to their program model. It will describe how intervention services differ from the usual services already available to ill or injured workers, and the relationships and partners that were necessary to deliver services effectively.
Document barriers and facilitators to program implementation. Most of the staff interviews will take place during the site visits, though some will be done by telephone if necessary. Examples of state program administrators we will interview include the RETAIN program director of state-level agencies participating in RETAIN, along with executive directors of health care systems and nongovernmental or community-based organizations that provide services to RETAIN service users. Examples of RETAIN program staff include return-to-work coordinators, health care providers, and other staff working on the front lines of organizations providing demonstration services.
The first site visit will occur five months after the beginning of Phase 2 enrollment, in February 2022. The evaluation team will use information collected during the first round of site visits to describe states’ early experiences with the demonstration. This information will support analyses related to seven key research questions for the process analysis:
What organizational partnerships have formed under the RETAIN programs to support service delivery?
To what extent are states doing the following: recruiting and enrolling enrollees with fidelity to the planned model? Delivering services in accordance with the service model? Maintaining fidelity to the evaluation design?
What strategies are RETAIN programs using to identify workers who have recently experienced the onset of a work-threatening injury, illness, or disability? How did the states design the programs for workers? How are RETAIN programs recruiting eligible individuals into the demonstration? What challenges are RETAIN programs facing in doing so, and how are they overcoming these challenges?
What are states doing to deliver RETAIN services? How are these services different from the services states were providing to injured or ill workers before RETAIN?
What have been important programmatic and environmental facilitators in implementing RETAIN services to date? What are the challenges states have faced during the implementation of RETAIN services to date? How have these challenges influenced implementation? How have states overcome those challenges?
How did states use programmatic and evaluation technical assistance to implement programs? Did they need to make any modifications to meet implementation goals (e.g., recruitment targets)?
How do states’ data collection procedures work? How are states using management information systems to support data collection?
To address these questions, the evaluation team will conduct semi-structured interviews with administrators and project staff. These interviews will last up to 60 minutes each, and we will conduct them one on one or, if requested, in small groups of two to three staff per session. We will use the findings from this data collection to assess program implementation and to give states feedback about potential program improvements and areas where they may need additional technical assistance or support for RETAIN. Attachment B lists the topics we will address during the semi-structured interviews.
If it is feasible and appropriate given the RETAIN program model, Mathematica will visit two service delivery providers that differ on some key factor (for example, different provider or health care systems, or different areas of the state) to get the perspectives of a range of stakeholders. The evaluation team will reach out to the RETAIN program director in each state to begin planning the site visits in the month before the visit. The researcher visiting the state will schedule an initial telephone call to discuss the purpose of the site visits, identify the two areas of interest, and get names and contact information for the staff interviews.
The second site visit will occur approximately 12 months after the launch of Phase 2 enrollment, in September of 2022. RETAIN evaluation staff will use the information collected during the second round of site visits to describe states’ experiences with the fully implemented programs. We will structure and organize the second round of visits in the same manner as the first site visits described above. Staff interviews will last up to 60 minutes each, and we will conduct them one on one or, if requested, in small groups of two to three staff per session. Mathematica will visit the same service delivery providers to understand changes in key aspects of the demonstration. Information collected during the second round of site visits will support analyses related to seven key research questions:
How have organizational partnerships under the RETAIN program changed to support service delivery?
How have states changed the delivery of RETAIN services in the prior year (since the last visit)?
What challenges and facilitators have influenced the implementation of RETAIN services, and how have states overcome those challenges?
To what extent are states delivering services with fidelity to the planned program model and evaluation design?
What are states’ plans for sustaining RETAIN services after the demonstration? What changes do states anticipate making to sustain RETAIN services after the demonstration?
What changes have been made to the counterfactual services (i.e., the environment without RETAIN services) available to control group members? What are the implications for the ability of the evaluation to detect and interpret impacts?
What are the key program cost components?
To answer these questions, Mathematica will conduct in-depth interviews with administrators and RETAIN program staff. We will use the information gathered to assess process findings related to program implementation. The findings will also provide contextual information for other aspects of the participation, impact, and benefit-cost analyses. For example, the findings could document any changes to the counterfactual service environment that may have implications for the evaluation’s ability to detect impacts and how to interpret those impacts. Finally, the evaluation team will use information collected during the second site visit to develop a template for collecting cost information from each state. Attachment B lists the topics to be addressed during these interviews.
Mathematica will conduct interviews with 60 RETAIN service users in August of 2022, 10 months after the launch of Phase 2 enrollment. These interviews will occur outside of the site visits described above. We will select enrollees for these interviews based on their use of services. Mathematica will use the information collected during these interviews to describe the experiences of service users involved in the demonstration and to supplement other data collected and used in conducting the process evaluation.
Information collected during the interviews will support analyses related to six key research questions for the process analysis, including:
What motivated enrollees to enroll in RETAIN?
What are enrollees’ employment goals and their attitudes about staying at work or returning to work?
What do enrollees like and dislike about the RETAIN services?
What other services are enrollees either aware of or receiving already?
How satisfied were enrollees with services and did satisfaction vary by state or service use level?
What factors were related to RETAIN service use?
Each interview will last up to 30 minutes. Attachment C lists the topics to be addressed during these telephone interviews. We will use the findings to assess enrollees’ engagement in, and satisfaction with, the RETAIN services; identify which aspects of the services may be more or less associated with participation outcomes; and give RETAIN programs feedback about potential improvements.
Recruitment for the RETAIN service user interviews will occur in August of 2022. We will use data from the states’ enrollment systems to recruit a purposefully selected sample of service users. If necessary, Mathematica may also reach out to state liaisons to help identify enrollees with particular characteristics, not identifiable in the enrollment data, or to identify individuals based on service engagement. The evaluation team will send recruitment letters to potential respondents and give them a toll-free number to call and schedule an interview.
The staff activity logs (Attachment I) provide data on aspects of service delivery that we cannot readily obtain from administrative data files and other sources. The logs will capture staff’s daily time spent on activities that are core components of the RETAIN model: recruitment and enrollment, case management, return-to-work services, care coordination, and communication with and training for health care providers and employers. The logs will also include categories related to program administration (evaluation, training, and other management), as well as travel, work leave, and other program activities outside the above categories. This information will be useful for the benefit-cost analysis, enabling us to allocate program costs across the various components. It will also help us understand the level of resources RETAIN programs allocate to each component, which could inform those interested in replicating a specific state’s program and help in interpreting program impacts.
Data from the staff activity logs will answer the following research questions:
How does a program allocate resources across RETAIN components?
How does actual program allocation align with the program’s model of service delivery?
What level of effort does a program allocate to program management versus program services?
How do specific types of staff differ in how they spend their time on program management and service delivery?
To answer these questions, we will collect staff activity logs from selected staff for two one‑week periods around the time of the second evaluation site visit (during fall 2022). The one-week periods will represent typical work weeks for staff, avoiding weeks with atypical training or conferences. We expect to ask approximately 13 staff members from each program to complete the logs, depending on the number of staff and the different staff categories involved in delivering substantive services. Individuals selected to complete the logs will include both administrative and direct service staff.
The survey of RETAIN enrollees will collect information on a variety of topics that are not easily accessible or available through administrative data. SSA plans to administer two rounds of the enrollee survey to measure changes in enrollees’ health, service use, and employment over time. Mathematica will conduct surveys with about 12,000 enrollees across the four RETAIN programs at 2 and 12 months after enrollment. The first survey will collect information about the disability onset event, service receipt, and immediate-term outcomes related to the return-to-work process. The second survey will obtain information on interim outcomes that could inform the evaluation, such as changes in employment status, earnings, benefit receipt, and enrollee health and well-being. The first round of the enrollee survey will begin in December of 2021 and end in May of 2024. The second round will begin in October of 2022 and end in April of 2025.
The surveys will use a sequential, mixed-mode design. Each round will be web-based, with mail and telephone follow-up, and will be administered in English and Spanish. We estimate that the interview will take about 12 minutes for the Round 1 (R1) survey and 18 minutes for Round 2 (R2). We will release sample cases on a rolling basis that mirrors the months of study enrollment. The surveys will have a 12-week field period, with the full data collection period spanning 25 months in total for each round. If DOL decides to include additional states in Phase 2, we anticipate adding a proportionate number of cases to the enrollee survey sample (3,000 per state).
Table A1-1 lists enrollee survey domains and measures, roughly in the order in which the items will be collected during the interviews. The enrollee survey instruments are provided in Attachment E.
Table A1-1. Enrollee survey topics, by domain and round

Domain/topic^a | Round 1 | Round 2
Current employment
Illness or injury that limits work | X | X
Employment status and duration of employment with main employer | X | X
Wage, hours, and benefits | X | X
Employer accommodations | X | X
Reasons for medical leave | X | X
Reasons for not working now | X | X
Job search | X | X
Return-to-work expectations | X | X
Participation in the gig economy | X | X
Benefits
Receipt of workers’ compensation and disability insurance |  | X
Income
Household income |  | X
Receipt of public assistance (e.g., SNAP, TANF, other) |  | X
Training and receipt of employment services
Use of employment services | X | X
Participation in training | X | X
Use of RTW coordinator and satisfaction with services | X | X
Health and functioning
Physical and mental health status | X | X
Health insurance | X | X
Work limitations and pain | X | X
Prescribed opioid pain relievers | X | X
Contextual factor
Marital status | X | X
^a A number of measures are not included in Table A1-1; they are collected during enrollment and captured on part one of the DOL enrollment form.
RTW = return to work; SNAP = Supplemental Nutrition Assistance Program; TANF = Temporary Assistance for Needy Families.
We will document findings from each round of the enrollee survey in the final impact report. SSA might also include findings from the first survey round in a special topic report on interim impacts (October 2024).
The purpose of the surveys with RETAIN service providers is to collect information on program operations, service delivery, and RETAIN-induced practice changes. The surveys will capture information not available from other sources about provider practices and the experiences of RETAIN service providers. SSA plans to administer two rounds of the provider survey to measure changes over time. Mathematica will conduct surveys with up to 400 providers from the four RETAIN programs at 3 and 15 months after the launch of Phase 2. Each round of the provider survey will have a 14-week field period. The first round will begin in January 2022. The second round will begin in January 2023.
Like the enrollee survey, the provider survey will use a sequential mixed-mode design, with respondents having the option to participate by web, paper, or telephone. We will administer it in English, with a Spanish translation provided upon request. Each round of the survey will take about 15 minutes to complete. If DOL decides to include additional states in Phase 2, we anticipate adding a proportionate number of cases to the provider survey (100 per state). Table A1-2 lists provider survey domains and measures, roughly in the order in which we will collect the items during the interviews. The survey instruments are provided in Attachment G.
Table A1-2. Provider survey topics, by round

Domain/topic | Round 1 | Round 2
Provision of health care services
Primary role | X | 
Years in practice | X | 
Percentage of patients using workers’ compensation benefits | X | X
Use of return-to-work best practices | X | X
Experience working with a service coordinator | X | X
Barriers to providing optimal patient care | X | X
Provider experience in RETAIN
Awareness of practice participation in RETAIN | X | X
Share of patients enrolled in RETAIN | X | X
Burden of RETAIN administrative requirements | X | X
Receipt of formal training for RETAIN | X | X
RETAIN training topics | X | X
Satisfaction with training and impact on interaction with all patients | X | X
Barriers for RETAIN success | X | X
Factors discouraging practice participation | X | X
Recommendation for RETAIN adoption by other practices | X | X
Placeholder for state-specific items | X | X
Provision of patient care at practice before RETAIN | X | 
We will document the findings from the provider survey data in the final impact report. SSA might also include findings from the provider surveys in a process analysis report (October 2023) and in a special topic report (October 2024) that focuses on early RETAIN impacts.
We will not make extensive use of technology for the qualitative components of this data collection, such as the site visits and interviews with service users. Trained and experienced professional researchers will conduct the interviews using semi-structured protocols. We will digitally record and transcribe interviews (with service user permission) to allow the interviewer to focus on the conversation. To the extent possible, we will send interview invitations and reminders based on interviewees’ preferences.
Mathematica will send the staff activity logs to program staff via email. We designed the logs to be completed in Microsoft Excel, although program staff can print a PDF version to complete the logs on paper if they prefer. Program staff will return the completed logs to Mathematica via email or fax.
We will use technology in the surveys of enrollees and providers to reduce respondent burden; standardize data collection; and store the evaluation data in a secure, consistent manner. The surveys will use the following:
Web-based questionnaires. Mathematica will field the enrollee and provider surveys by web, offering a low-burden way for respondents to self-report whenever it is most convenient for them. Mathematica will deploy the web survey using Confirmit® software.2 This multimode platform allows respondents to complete the interviews using a tablet, computer, or mobile device connecting to the web-based instrument, and to complete the interview by telephone with staff at Mathematica’s Survey Operations Centers. The software offers all the advantages of computer-based administration, including range and logic checks, preprogrammed skips based on item responses or preloaded variables, and dynamic text fills.
To launch the web survey, we will send sample members the survey link and a unique password in the advance letter (see Attachment F). We will send this information to providers in the advance letter, as well as in their invitation and reminder emails (see Attachment H).3 Provider email invitations will feature personalized hyperlinks that allow respondents to begin answering questions without having to input their login information, further minimizing burden.
Computer-assisted telephone interviewing (CATI). Mathematica will field the CATI versions of the enrollee and provider instruments using Confirmit® software. Mathematica’s professionally trained interviewers will use the software to manage nonresponse follow-up by telephone, ensuring that nonresponding sample members receive contact attempts across different days of the week and times of day, as well as ensuring that interviewers contact them during appropriate calling hours for their time zone. The system will enable interviewers to record notes after each contact attempt, minimizing sample member burden associated with repeating information to several different interviewers.
Both the web and CATI instruments will allow for breakoffs, should respondents need to pause the interview and resume at a later time, without having to re-populate responses they have already provided.
Computer-based sample management system. The sample management system will minimize respondent burden by ensuring that nonresponse follow-up efforts are directed only to applicable cases in each survey. Furthermore, it will ensure that we deliver survey mailings and telephone follow-up efforts in respondents’ preferred language. We will update the system in real time, as respondents complete interviews in any mode. This database will allow Mathematica to update respondent contact information over time, using information provided by the program states and other sources, and direct subsequent mailings to the most current location. Finally, Mathematica will use the sample management system to document why cases may have become ineligible for the surveys (for example, documenting deceased enrollees or providers’ departure from their practice organization).
Software for coding of open-ended responses. Some of the questions in the enrollee survey instrument contain an open-ended response format. Mathematica’s trained data coding team will review these responses and group them according to themes, applying codes that facilitate statistical analysis. Mathematica’s coding software, ASCRIBE®, will facilitate high-quality coding by offering proposed codes based on prior decisions, coding all instances of a given statement uniformly, and providing quality assurance checks for supervisors to test for intercoder reliability.
Toll-free telephone number, survey website, and email address. All survey mailings will include a toll-free number that sample members can use to contact the study team with questions or concerns. Professionally trained interviewers will respond to these calls throughout the field periods. In addition, SSA will host an information website, which sample members can visit to obtain information and relay concerns about the legitimacy of the surveys. Provider survey sample members will also have access to a survey email address, which they can use to contact Mathematica staff about the survey.
The evaluation of RETAIN will not require collection of information that is available through alternate sources.
The site visits and service user interviews will provide information that cannot be obtained through SSA’s administrative records, other readily available sources, or other planned survey efforts for the demonstration. We will use these data to describe how the RETAIN programs designed and delivered RETAIN services. For example, the first round of interviews with state administrators and program staff will include discussion of organizational partnerships, recruitment and enrollment, provider and service users’ participation, service take-up, fidelity to the service model, and data collection procedures. The second round of interviews with state staff will focus on the fully implemented programs’ service delivery experiences, changes to the model or stakeholder partnerships since implementation, and the feasibility of and plans for sustaining the model after the demonstration. These later interviews will also yield information about changes in the counterfactual service environment that have implications for the treatment contrast or the evaluation’s ability to detect impacts.
The staff activity logs will provide information that is not available through SSA’s administrative records, the programs’ management information systems, or the programs’ administrative cost data. The amount of time staff spend on services such as coordination with medical providers and on program administration will help us understand how the programs operate and the services that they emphasize.
The enrollee and provider surveys will provide additional information that is unavailable in SSA’s program records. For example, the enrollee surveys will collect information on the experiences and well-being of RETAIN enrollees, including their employment status, job skills development, health, health insurance coverage, employer accommodations, satisfaction with RETAIN services, expectations for the future, and household income and benefit receipt. These data are not available from any other source. The survey will not collect information that is available in SSA administrative data, including SSA disability applications and payments and calendar year earnings.
Similarly, the RETAIN service provider surveys will collect information that is not available through any other source, including data on provider awareness of participation in the demonstration, engagement in RETAIN training, and approach to delivering services.
Some of the service providers that Mathematica will interview for the process analysis may be staff of small entities. Understanding this, we will minimize burden on those and all organizations as best we can while still obtaining the necessary information. In particular, Mathematica will keep discussions to one hour or less, and whenever possible, we will obtain information from other sources (such as administrative data) to limit how much we ask of staff. Mathematica has kept the number of interviews to a minimum and will schedule them at times that are convenient to the respondents.
The survey of RETAIN providers will pose minimal burden to small entities that may be participating in the demonstration. These health care organizations have agreed to participate in the demonstration and recognize that provider participation in the surveys is a part of that effort. Mathematica will minimize the burden by directing nonresponse follow‑up efforts to the providers themselves, placing minimal burden on staff at the front desk of the organization, and by making the survey available for completion outside business hours, at a day, time, and format that is most convenient and least burdensome for the provider.
These site visits and interviews are valuable for observing program operations firsthand and understanding which aspects of the programs work well and why. Moreover, if we make fewer visits, SSA and DOL will not be able to assess how the programs evolve over time to address challenges and leverage successes. Conducting interviews in person will allow the evaluation team to capture as complete a picture as possible of what program implementation looked like in practice and enhance Mathematica’s ability to develop a narrative about service delivery that will give DOL and SSA a rich source of information on ways to improve programs.
These telephone interviews are necessary to help DOL and SSA assess whether service users have a favorable impression of the services; how their impressions translate into service use; and how participation in RETAIN affected enrollees’ employment decisions and quality of life. Not collecting this information would lead to missed opportunities for improving programs and assessing how well the quantitative analysis findings apply in different settings. Finally, speaking with both RETAIN enrollees and project staff will support a more balanced approach to understanding program implementation than we could gain from interviewing project staff alone.
Mathematica will collect the staff activity logs in two one-week periods around the time of the second round of site visits to each program. Two periods are necessary to provide a representative sample of staff’s time use and to account for potential seasonal differences in program activities. The data collected are necessary to conduct a credible evaluation and are not available from other sources. Failure to collect the data would result in a less-precise benefit-cost analysis.
Surveys with RETAIN enrollees
The enrollee surveys collect critical data to help measure program outcomes for which data are not available from other sources. Without the survey data enrollees provide, the evaluation will not be able to assess the impact of RETAIN on outcomes such as employment status, income and benefit receipt, workplace accommodations, health status, and satisfaction with the services RETAIN provides. Furthermore, we cannot collect these data less frequently, as the two surveys provide critical measures of the short- and longer-term impacts of receiving RETAIN services.
The survey of RETAIN service providers will collect critical data to help measure program outcomes for which data are not available from other sources. Without these data, the evaluation will not be able to assess the impact of RETAIN on outcomes such as greater utilization of return-to-work best practices in care delivery or the providers’ perceptions about barriers they face in providing optimal patient care with this population. These data, which are not available from other sources, play a critical role in evaluating whether the field as a whole should seek to emulate the practices developed by RETAIN.
There are no special circumstances that would cause this information collection to be conducted in a manner inconsistent with 5 CFR 1320.5.
SSA published the 60-day advance Federal Register Notice on January 6, 2021, at 86 FR 667, and we received no public comments. The 30-day FRN was published on March 12, 2021, at 86 FR 14170. If we receive any comments in response to this Notice, we will forward them to OMB.
As a first step in the RETAIN evaluation, SSA collaborated with its partner agency, DOL, on key issues relating to Phase 1 implementation and recruitment efforts across the state programs. SSA has also organized a technical working group (TWG) to provide input on key research questions, evaluability considerations, feasible experimental and nonexperimental methods, survey designs, analysis strategies, and the interpretation and presentation of results. The TWG consists of researchers and clinicians with expertise in the areas of disability, early intervention, and evaluation design. The TWG is scheduled to meet on a regular basis, with three meetings planned during Phase 1 of the demonstration (in February, May, and August 2019) and three meetings planned during Phase 2, timed around key evaluation reports (early assessment, process and early impacts, final impacts). These external experts are:
Thomas Wickizer, Ph.D., Ohio State University College of Public Health
Glenn Pransky, former director of the Center for Disability Research at the Liberty Mutual Research Institute
Carolyn Heinrich, Ph.D., Vanderbilt University
Jack Smalligan, M.A., Urban Institute
Frank Neuhauser, Ph.D., University of California at Berkeley’s Institute for the Study of Societal Issues
Douglas Martin, M.D., Medical Director, UnityPoint Health – St. Luke’s Occupational Medicine
Marianne Cloeren, M.D., M.P.H., University of Maryland School of Medicine
Benjamin Doornink, M.B.A., Kootenai Health
An interdisciplinary team of economists, disability policy researchers, and survey researchers on staff at the evaluation contractor (Mathematica and its subcontractor, Tree House Economics, LLC) is contributing to the design of the overall evaluation. These individuals include:
Jillian Berk, Ph.D., Mathematica
Rosalind Keith, Ph.D., Mathematica
Gina Livermore, Ph.D., Mathematica
Holly Matulewicz, M.A., Mathematica
David Wittenburg, Ph.D., Mathematica
David Stapleton, Ph.D., Tree House Economics, LLC
Kenneth Fortson, Ph.D., Mathematica
Interviews with RETAIN service users will provide firsthand feedback on experiences with RETAIN. Where applicable, we will use findings from the interviews we hold early on to refine procedures and discussion topics for interviews we conduct later. Because of the timing of the RETAIN service user interviews (August 2022), we will not use these findings to inform the design of the survey instruments for the 2- and 12-month questionnaires used in the RETAIN enrollee surveys.
Mathematica will not offer remuneration to program administrators or directors or to RETAIN program staff members for participating in the qualitative interviews or completing staff activity logs. Mathematica will give RETAIN service user interview respondents a $30 gift card to express the study team’s appreciation for their time.
Both rounds of the enrollee and provider surveys will offer incentives for participation. SSA plans the following respondent payments:
Each round of the enrollee survey will feature a total incentive of $30. Mathematica will include a $5 prepaid cash incentive in the survey advance letter. The $5 prepayment is designed to encourage participation and offset costs associated with nonresponse follow-up. Respondents who complete the survey, by any mode, will receive a $25 gift card. Although the demographic characteristics of RETAIN enrollees are not yet known, Mathematica anticipates that gift cards will maximize the use and value of the incentive amount among survey respondents, especially for those who lack access to banks and might incur check-cashing fees.
Research shows that incentives increase response rates without compromising data quality (Singer and Kulka 2000), and they help increase response rates among people with relatively low educational levels (Berlin et al. 1992), among low-income and non‑white populations (James and Bolstein 1990), and among unemployed workers (Jäckle and Lynn 2007). There is also evidence that incentives bolster participation among those with lower interest in the survey topic (Jäckle and Lynn 2007; Kay 2001; Schwartz, Goble, and English 2006), resulting in data that are more complete.
Each round of the provider survey will feature a total incentive of $50. Mathematica will include a $5 prepaid cash incentive in the survey advance letter. As with the enrollee survey, this prepayment is designed to encourage participation and offset costs associated with nonresponse follow-up. Respondents will receive a $45 check. The provider incentive design is drawn from industry-wide practices for motivating responses from health care professionals (Cho, Johnson, and VanGeest 2013; McLeod et al. 2013).
The information provided during the staff and RETAIN enrollee interviews is protected and held confidential in accordance with 42 U.S.C. 1306; 20 CFR 401 and 422; 5 U.S.C. 552 (Freedom of Information Act); 5 U.S.C. 552a (Privacy Act of 1974); and OMB Circular No. A-130. The data will be treated in a confidential manner unless otherwise compelled by law.
The study team takes seriously the ethical and legal obligations associated with the collection of confidential data. Secure handling of confidential data is accomplished via several mechanisms, including obtaining suitability determinations for designated staff, training staff to recognize and handle sensitive data, protecting computer systems from access by staff without favorable suitability determinations, limiting the use of personally identifiable information in data files, limiting access to secure data to staff with favorable suitability determinations on a “need to know” basis, and creating data extract files from which identifying information has been removed.
We will make clear the assurances and limits of confidentiality in all advance materials sent to recruit RETAIN service users, and we will restate these assurances at the beginning of each interview. Although Mathematica staff members may work with the state liaisons to schedule and coordinate the interviews with RETAIN staff, they will not give those staff direct feedback on findings from the interviews. Mathematica will aggregate all relevant findings from the staff interviews in the evaluation reports. For the RETAIN service user interviews, Mathematica staff will have access to the states’ enrollment data, which will contain contact information for each potential participant for the interviews. However, Mathematica will not release this information to anyone outside the evaluation team. Moreover, Mathematica will not reveal to the states, RETAIN programs, or any other entity the names of the service users who participated in these interviews.
The Paperwork Reduction and Privacy Act statements appear on the enrollee and provider survey advance letters (Appendices F and H). Mathematica will include text reiterating assurances about the purposes of the survey and how we will use the data provided in the advance notification letter and in the survey introductions across all modes (Appendices E, F, G, and H). After we collect and analyze the survey data, Mathematica will not attribute the information that survey respondents provide to specific individuals in any public documents. Finally, Mathematica will destroy all data collected during the interviews and surveys in a secure manner at the completion of the evaluation.
We will not ask RETAIN staff any questions that are sensitive in nature. We do not expect the interviews with RETAIN service users to touch on any sensitive topics related to their involvement with RETAIN and services they have received. However, the general process of discussing their return-to-work experiences might be sensitive for some individuals, depending on their lived experiences and perspectives on their medical condition(s). We anticipate that these individuals will decline the interview solicitation.
Some enrollee survey respondents might have similar sensitivities with respect to discussion of their health. Additional items might also be sensitive topics for respondents, including household income, participation in public benefit programs, and whether the respondent was prescribed opioid medications. Some enrollees might consider questions about household income and benefit receipt to be sensitive because they believe financial matters are private. Public concern about opioid addiction and abuse might cause some respondents to feel embarrassed or ashamed to report opioid use even when that use is appropriate. All modes of survey administration will permit respondents to refuse to answer questions they do not wish to answer or that make them feel uncomfortable.
The provider surveys do not collect any information that could be considered sensitive. Nonetheless, these respondents will have the same opportunity to decline responding to any questions they do not wish to answer.
As noted in section 2, DOL will make a final decision on the number of states to fund for Phase 2 upon completion of Phase 1. For the purposes of presenting information below, SSA assumes that at least four states will participate in Phase 2, and we use four states as the assumption for developing burden estimates. If necessary, we will update the burden estimates in this package at the end of Phase 1 (September 2021), based on the actual number of Phase 2 states.
Staff interviews. Over the course of the evaluation, we will conduct interviews with a total of 76 staff, including interviews with RETAIN administrators. Burden estimates per staff member for these interviews are 1.25 hours in total for each round, which includes time for setting up interview appointments by telephone or email (0.25 hours) and participating in the interview (1.0 hours). We have allocated an additional 0.5 hours for the RETAIN state administrators or directors (1.75 hours in total for each round) to participate in provider selection and overall planning for the first site visit and in gathering information that will inform the benefit-cost analysis in the second visit. The estimated total burden time for all respondents and nonrespondents is 194 hours.
Interviews with RETAIN service users. The estimated time per response for these interviews varies from 0.1 hours for nonrespondents (to review the invitation letter) to 0.6 hours for interviewees (to review the invitation letter, call in to schedule an appointment, and complete the telephone interview). The bulk of the annual burden time is spent in the interview itself, which will last up to 30 minutes. The estimated total burden time for all respondents and nonrespondents is 90 hours. This includes time spent fielding inquiries and scheduling interviews with up to 60 enrollees across the four RETAIN programs (15 per state).
Staff activity logs. The estimated time to complete the staff activity log is five minutes per day, and we are asking staff to complete the log each day for two one-week periods. This estimate includes time spent reviewing task instructions, recording their information, and returning the completed form. We anticipate that the data collection will include 1 RETAIN administrator (4 total) and 12 agency line staff per state (48 total). The total burden for this effort is 60 hours.
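For reference, the arithmetic behind these qualitative-collection burden totals can be reproduced as follows. The sketch below is illustrative only; the staff counts and per-response times come from the text above, and the figure of 540 service user nonrespondents comes from the 2022 burden table.

```python
# Illustrative check of the qualitative-collection burden totals reported above.
# Inputs come from the text (and the 2022 burden table for the 540 nonrespondents).

ROUNDS = 2  # two rounds of site visits; two one-week activity log periods

# Staff interviews: 76 staff at 1.25 hours per round, plus an additional
# 0.5 hours per round for the 4 state administrators or directors.
staff_interview_hours = 76 * 1.25 * ROUNDS + 4 * 0.5 * ROUNDS      # 194 hours

# Service user interviews: 60 interviewees at 0.6 hours each and
# 540 nonrespondents at 0.1 hours each (invitation letter review).
service_user_hours = 60 * 0.6 + 540 * 0.1                          # 90 hours

# Staff activity logs: 13 staff per state x 4 states, 5 minutes per day
# for two one-week periods (14 days), converted to hours.
activity_log_hours = 13 * 4 * 5 * 14 / 60                          # ~60.7, reported as 60

print(staff_interview_hours, service_user_hours, round(activity_log_hours))
```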
Enrollee surveys. The sample will include 12,000 individuals enrolled in RETAIN across the four programs (3,000 per program). Assuming a response rate of 80 percent at each round, we will conduct 19,200 interviews (9,600 each round). We anticipate the response burden for the R1 enrollee survey to be 15 minutes (0.25 hours). This includes time allocated for reviewing the advance mailing and potentially calling in to book an interview appointment (0.05 hours), as well as the time anticipated for completing the interview (0.2 hours). Across the 9,600 enrollee interviews in R1, the total burden is 2,400 hours for survey respondents. We expect the second round of the survey to have a slightly larger burden, as the R2 survey interview duration is longer. The R2 survey interview includes questions that we will not include in R1. We estimate the R2 response burden to be 21 minutes (0.35 hours), which includes time allocated for reviewing the advance mailing and potentially calling in to book an interview appointment (0.05 hours), as well as the time anticipated for completing the interview via any mode (0.30 hours). Across the 9,600 enrollee interviews in R2, the total burden is 3,360 hours. These estimates reflect a total expected burden of 5,907 hours for respondents and nonrespondents in the enrollee survey.
Provider surveys. The sample will include 400 providers delivering RETAIN services across the four programs (100 per program). Assuming a response rate of 80 percent at each round, we will conduct a total of 640 interviews (320 at each round). We estimate the response burden to be 17 minutes (0.28 hours), which includes time allocated for reviewing the advance mailing and potentially calling in to book an interview appointment (0.05 hours), as well as the time anticipated for completing the interview (0.23 hours). The provider survey burden will not change from R1 to R2. Thus, the R2 provider burden estimates are the same as for R1. Each round will have a total burden estimate of 89.6 hours. These estimates reflect a total expected burden estimate of 187 hours for provider survey respondents and nonrespondents.
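The survey burden figures follow the same arithmetic; the short sketch below reproduces them under the stated 80 percent response-rate assumption and the per-response times given above.

```python
# Illustrative check of the enrollee and provider survey burden estimates above.

RESPONSE_RATE = 0.80

# Enrollee survey: 12,000 sample members; 0.25 hours per Round 1 response and
# 0.35 hours per Round 2 response (advance mailing review plus interview).
enrollee_completes = 12_000 * RESPONSE_RATE        # 9,600 interviews per round
enrollee_r1_hours = enrollee_completes * 0.25      # 2,400 hours
enrollee_r2_hours = enrollee_completes * 0.35      # 3,360 hours

# Provider survey: 400 sample members; 0.28 hours per response in each round,
# plus 0.05 hours for each of the 80 nonrespondents per round.
provider_completes = 400 * RESPONSE_RATE           # 320 interviews per round
provider_round_hours = provider_completes * 0.28   # 89.6 hours per round
provider_total_hours = 2 * (provider_round_hours + 80 * 0.05)  # 187.2, reported as 187

print(enrollee_r1_hours, enrollee_r2_hours, provider_total_hours)
```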
Please see the burden charts below:
RETAIN 2021 Burden Figures:
Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Enrollee Survey Round 1 (Respondents) | 320 | 1 | 15 | 80 | $25.72* | 24** | $5,350***
Enrollee Survey Round 1 (Nonrespondents) | 80 | 1 | 3 | 4 | $25.72* | 24** | $926***
Totals | 400 |  |  | 84 |  |  | $6,276***
RETAIN 2022 Burden Figures:
Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Staff Interviews (state administrators/directors) | 4 | 1 | 105 | 7 | $45.23* | 24** | $407***
Staff Interviews (program line staff) | 72 | 1 | 75 | 90 | $32.58* | 24** | $3,870***
Service User Interviews (Respondents) | 60 | 1 | 36 | 36 | $25.72* | 24** | $1,543***
Service User Interviews (Nonrespondents) | 540 | 1 | 6 | 54 | $25.72* | 24** | $6,945***
Staff Activity Logs (state administrators/directors) | 4 | 1 | 70 | 5 | $45.23* | 24** | $298***
Staff Activity Logs (program line staff) | 48 | 1 | 70 | 56 | $32.58* | 24** | $2,450***
Enrollee Survey Round 1 (Respondents) | 3,840 | 1 | 15 | 960 | $25.72* | 24** | $64,197***
Enrollee Survey Round 1 (Nonrespondents) | 960 | 1 | 3 | 48 | $25.72* | 24** | $11,111***
Enrollee Survey Round 2 (Respondents) | 960 | 1 | 21 | 336 | $25.72* | 24** | $18,518***
Enrollee Survey Round 2 (Nonrespondents) | 240 | 1 | 3 | 12 | $25.72* | 24** | $2,778***
Provider Survey Round 1 (Respondents) | 320 | 1 | 17 | 91 | $32.58* | 24** | $7,135***
Provider Survey Round 1 (Nonrespondents) | 80 | 1 | 3 | 4 | $32.58* | 24** | $1,173***
Totals | 7,128 |  |  | 1,699 |  |  | $120,425***
RETAIN 2023 Burden Figures:
Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Enrollee Survey Round 1 (Respondents) | 3,840 | 1 | 15 | 960 | $25.72* | 24** | $64,197***
Enrollee Survey Round 1 (Nonrespondents) | 960 | 1 | 3 | 48 | $25.72* | 24** | $11,111***
Enrollee Survey Round 2 (Respondents) | 3,840 | 1 | 21 | 1,344 | $25.72* | 24** | $74,074***
Enrollee Survey Round 2 (Nonrespondents) | 960 | 1 | 3 | 48 | $25.72* | 24** | $11,111***
Provider Survey Round 2 (Respondents) | 320 | 1 | 17 | 91 | $32.58* | 24** | $7,135***
Provider Survey Round 2 (Nonrespondents) | 80 | 1 | 3 | 4 | $32.58* | 24** | $1,173***
Totals | 10,000 |  |  | 2,495 |  |  | $168,801***
RETAIN 2024 Burden Figures:
Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Enrollee Survey Round 1 (Respondents) | 1,600 | 1 | 15 | 400 | $25.72* | 24** | $26,749***
Enrollee Survey Round 1 (Nonrespondents) | 400 | 1 | 3 | 20 | $25.72* | 24** | $4,629***
Enrollee Survey Round 2 (Respondents) | 3,840 | 1 | 21 | 1,344 | $25.72* | 24** | $74,074***
Enrollee Survey Round 2 (Nonrespondents) | 960 | 1 | 3 | 48 | $25.72* | 24** | $11,111***
Totals | 6,800 |  |  | 1,812 |  |  | $116,563***
RETAIN 2025 Burden Figures:
Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Enrollee Survey Round 2 (Respondents) | 960 | 1 | 21 | 336 | $25.72* | 24** | $18,518***
Enrollee Survey Round 2 (Nonrespondents) | 240 | 1 | 3 | 12 | $25.72* | 24** | $2,778***
Totals | 1,200 |  |  | 348 |  |  | $21,296***
RETAIN Grand Total Burden Figures:
Modality of Completion | Number of Respondents | Frequency of Response | Average Burden per Response (minutes) | Estimated Total Annual Burden (hours) | Average Theoretical Hourly Cost Amount (dollars)* | Average Wait Time in state RETAIN facilities (minutes)** | Total Annual Opportunity Cost (dollars)***
Totals | 25,528 |  |  | 6,438 |  |  | $433,361***
* We based these figures on the average U.S. worker’s hourly wage, as reported in Bureau of Labor Statistics data (https://www.bls.gov/oes/current/oes_nat.htm), and on average local government management and staff hourly wages, as reported in Bureau of Labor Statistics data (https://www.bls.gov/oes/current/oes110000.htm) and (https://www.bls.gov/oes/current/oes131071.htm).
** We based this figure on the average FY 2020 wait times for field offices, based on SSA’s current management information data.
*** This figure does not represent actual costs that SSA is imposing on respondents to participate in this information collection; rather, these are theoretical opportunity costs for the additional time respondents will spend to participate. There is no actual charge to respondents.
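The opportunity-cost figures in the burden tables above appear to combine the per-response burden, the average wait time, and the theoretical hourly cost rate. The sketch below reproduces one row (the 2021 Round 1 enrollee survey respondents) under that assumption; it is illustrative only and is not SSA’s official calculation.

```python
# Illustrative reconstruction of one burden-table row (2021, Enrollee Survey
# Round 1, respondents), assuming opportunity cost = (burden + wait time) x rate.

respondents = 320
burden_minutes = 15      # average burden per response
wait_minutes = 24        # average wait time in state RETAIN facilities
hourly_cost = 25.72      # average theoretical hourly cost amount

total_hours = respondents * (burden_minutes + wait_minutes) / 60   # 208 hours
opportunity_cost = total_hours * hourly_cost

print(round(opportunity_cost))   # 5350, matching the $5,350 shown in the table
```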
There is no cost burden to respondents other than the value of their time to participate in the study. Costs for data collection, storage, processing, and other functions related to these data are borne solely by the evaluation contractor. The total cost to study participants for their time in this collection is shown in Table A3.
For use in estimating annual costs to study participants, we estimated hourly wage rates for the RETAIN program administrators and service provider staff based on data available on the Bureau of Labor Statistics website. We based the estimates on the 2019 national median hourly wages for social and community service managers ($30.63) (Bureau of Labor Statistics 2019a) and for RETAIN line staff ($20.79, based on a combined national average for service delivery staff in social service agencies, social workers in health care settings, and rehabilitation counselors) (Bureau of Labor Statistics 2019b).
For both the RETAIN service user interviews and the enrollee surveys, we assumed enrollees will have a range of occupations. Therefore, we calculated the annualized cost to enrollees using the 2019 median hourly wage for all workers across the eight states involved in Phase 1 of the RETAIN demonstration, which is $19.03 per hour (Bureau of Labor Statistics 2019c).
We assumed RETAIN service providers will represent a range of occupations. Therefore, we calculated the annualized cost to service providers using the 2019 median wages for family and general practitioners; physicians and surgeons; exercise physiologists; physical therapists; chiropractors; registered nurses; nurse practitioners; physician assistants; psychiatrists; and substance abuse, behavioral disorder, and mental health counselors across the eight states involved in Phase 1 of the RETAIN demonstration. The average of these 2019 median hourly wages is $62.70, based on data from the Bureau of Labor Statistics (2019c).
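The entries in Table A3 follow directly from these rates: each respondent cost equals the average burden per response (in hours) multiplied by the number of respondents, the frequency of response, and the applicable hourly wage. A minimal illustrative sketch (Python; the function name is ours):

    # Respondent cost = burden per response (hours) x respondents x frequency x hourly wage
    def respondent_cost(burden_hours, respondents, frequency, hourly_wage):
        return burden_hours * respondents * frequency * hourly_wage

    # 2022 enrollee survey R1 respondents: 0.25 x 3,840 x 1 x $19.03 = $18,268.80
    print(respondent_cost(0.25, 3840, 1, 19.03))
    # 2022 provider survey R1 respondents: 0.28 x 320 x 1 x $62.70 = $5,617.92
    print(respondent_cost(0.28, 320, 1, 62.70))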
Table A3. Annual Cost to Respondents
Respondent | Average burden per response (hours) | Number of respondents | Frequency of response | Median hourly wage rate | Respondent cost
2021 | | | | |
Enrollee survey R1 | | | | |
Respondents | 0.25 | 320 | 1 | $19.03 | $1,522.40
Nonrespondents | 0.05 | 80 | 1 | $19.03 | $76.12
Total | 0.3 | 400 | ** | ** | $1,598.52
2022 | | | | |
Staff interviews | | | | |
RETAIN administrators/directors | 1.75 | 4 | 1 | $30.63 | $214.41
RETAIN program line staff | 1.25 | 72 | 1 | $20.79 | $1,871.10
Service user interviews | | | | |
Respondents | 0.6 | 60 | 1 | $19.03 | $685.08
Nonrespondents | 0.1 | 540 | 1 | $19.03 | $1,027.62
RETAIN staff activity logs | | | | |
RETAIN state administrators/directors | 1.16 | 4 | 1 | $30.63 | $142.12
RETAIN program line staff | 1.16 | 48 | 1 | $20.79 | $1,157.59
Enrollee survey R1 | | | | |
Respondents | 0.25 | 3,840 | 1 | $19.03 | $18,268.80
Nonrespondents | 0.05 | 960 | 1 | $19.03 | $913.44
Enrollee survey R2 | | | | |
Respondents | 0.35 | 960 | 1 | $19.03 | $6,394.08
Nonrespondents | 0.05 | 240 | 1 | $19.03 | $228.36
Provider survey R1 | | | | |
Respondents | 0.28 | 320 | 1 | $62.70 | $5,617.92
Nonrespondents | 0.05 | 80 | 1 | $62.70 | $250.80
Total | 7.05 | 7,128 | ** | ** | $36,771.32
2023 | | | | |
Enrollee survey R1 | | | | |
Respondents | 0.25 | 3,840 | 1 | $19.03 | $18,268.80
Nonrespondents | 0.05 | 960 | 1 | $19.03 | $913.44
Enrollee survey R2 | | | | |
Respondents | 0.35 | 3,840 | 1 | $19.03 | $25,576.32
Nonrespondents | 0.05 | 960 | 1 | $19.03 | $913.44
Provider survey R2 | | | | |
Respondents | 0.28 | 320 | 1 | $62.70 | $5,617.92
Nonrespondents | 0.05 | 80 | 1 | $62.70 | $250.80
Total | 1.03 | 10,000 | ** | ** | $51,540.72
2024 | | | | |
Enrollee survey R1 | | | | |
Respondents | 0.25 | 1,600 | 1 | $19.03 | $7,612.00
Nonrespondents | 0.05 | 400 | 1 | $19.03 | $380.60
Enrollee survey R2 | | | | |
Respondents | 0.35 | 3,840 | 1 | $19.03 | $25,576.32
Nonrespondents | 0.05 | 960 | 1 | $19.03 | $913.44
Total | 0.7 | 6,800 | ** | ** | $34,482.36
2025 | | | | |
Enrollee survey R2 | | | | |
Respondents | 0.35 | 960 | 1 | $19.03 | $6,394.08
Nonrespondents | 0.05 | 240 | 1 | $19.03 | $228.36
Total | 0.4 | 1,200 | ** | ** | $6,622.44
Grand total | | | | |
Staff interviews | | | | |
Administrators/directors | 3.5 | 4 | 1 | $30.63 | $428.82
RETAIN program line staff | 2.5 | 72 | 1 | $20.79 | $3,742.20
RETAIN staff activity logs | | | | |
RETAIN state administrators/directors | 1.16 | 4 | 1 | $30.63 | $142.12
RETAIN program line staff | 1.16 | 48 | 1 | $20.79 | $1,157.59
Service user interviews | | | | |
Respondents | 0.6 | 60 | 1 | $19.03 | $685.08
Nonrespondents | 0.1 | 540 | 1 | $19.03 | $1,027.62
Enrollee surveys R1, R2 | | | | |
Respondents | 0.6 | 9,600 | 1 | $19.03 | $109,612.80
Nonrespondents | 0.1 | 2,400 | 1 | $19.03 | $4,567.20
Provider surveys R1, R2 | | | | |
Respondents | 0.56 | 320 | 1 | $62.70 | $11,235.84
Nonrespondents | 0.01 | 80 | 1 | $62.70 | $50.16
Total | 10.29 | 13,128 | ** | ** | $132,649.43
14. Annualized cost to the federal government
The total cost to SSA of conducting the RETAIN evaluation is $20,806,467.00. The cost by year is shown in Table A4. We budgeted labor costs by estimating the number of staff hours required at the various wage levels, multiplying by the applicable wage rates, and multiplying the resulting subtotals by factors to cover fringe benefits and burden expense. The basis for estimating other direct costs varies with the type of cost being estimated. We summed the total of labor costs and other direct costs, multiplied the sum by a factor to cover general and administrative expenses, and then added the fee.
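As a purely illustrative sketch of that build-up (Python; the fringe, general and administrative, and fee rates below are hypothetical placeholders rather than the rates used in the actual budget):

    # Hypothetical cost build-up; all rates are placeholders for illustration only.
    def budgeted_cost(labor_hours_and_rates, other_direct_costs,
                      fringe_and_burden=0.45, g_and_a=0.10, fee=0.05):
        labor = sum(hours * rate for hours, rate in labor_hours_and_rates)
        loaded_labor = labor * (1 + fringe_and_burden)   # fringe benefits and burden expense
        subtotal = loaded_labor + other_direct_costs     # labor plus other direct costs
        with_g_and_a = subtotal * (1 + g_and_a)          # general and administrative expenses
        return with_g_and_a * (1 + fee)                  # fee added last

    # Example: 10,000 hours at $60.00/hour plus $250,000 in other direct costs
    print(round(budgeted_cost([(10000, 60.00)], 250000)))  # 1293600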
Table A4. Annual Costs to the Federal Government
Fiscal year | Cost
2021 | $3,200,000
2022 | $4,200,000
2023 | $4,200,000
2024 | $2,944,519
2025 | $1,786,501
Total | $16,331,020
This is a new information collection that will increase the public reporting burden.
With the findings of the RETAIN evaluation, SSA and DOL will be able to advise federal policymakers and state administrators on supports, services, and policy and program changes that could improve labor force participation and retention of individuals experiencing the onset of an injury, illness, or condition that could threaten their ability to remain employed.
Mathematica will analyze the information collected in the interviews to prepare reports that present the findings and their program and policy implications. We will not use complex quantitative analytical techniques with these data.
Two major reports will present the findings from the site visits and interviews. Each report will include a stand-alone summary of the purpose, methods, key findings, and policy implications, as well as a short executive summary. Products resulting from information obtained in this data collection will provide DOL and SSA with information about the experiences of RETAIN program administrators, project staff, and enrollees. Mathematica will integrate the qualitative information collected for the process evaluation with information collected from the other components of the evaluation and will use it to draw comparisons between states.
We will include the enrollee and provider survey findings in the impact report. The impact report will include a stand-alone summary of the purpose of the demonstration and the evaluation, methods, key findings, and policy implications, as well as a short executive summary. Products resulting from information obtained in this data collection will provide DOL and SSA with information on the short- and intermediate-term impacts of participation in RETAIN for enrollees and providers. Mathematica will integrate the information obtained from the surveys with information collected from the other components of the evaluation. We will use these data to draw summary conclusions about RETAIN overall, and to identify and provide potential explanations for any differences in outcomes observed across the participating states. Table A5 shows the planned timeline for the data collection, along with the completion dates for the public reports that will present the findings.
Table A5. Data Collection and Reporting Schedule
Activity/report | Approximate dates
Data collection |
RETAIN program administrator, staff interviews | February 2022 and September 2022
RETAIN service user interviews | August 2022
Staff activity logs | April–June 2022
Reports |
Early assessment report | October 2022
Process analysis report | January 2024
Early impacts report | January 2025
Final impacts report | February 2026
SSA is not seeking an exemption from displaying the expiration date with this submission. We will display the OMB expiration date on all interview materials.
SSA is not requesting an exception to the certification requirements.
1 This submission assumes four program states will be selected for Phase 2, per the initial evaluation design. However, decisions about which state programs will continue, and the final number of programs selected for Phase 2, were still pending at the time of this submission. We will therefore update this submission when these decisions are finalized.
2 Confirmit® is the computer-assisted interviewing system and survey-processing tool Mathematica uses for survey data collection. The software was developed by Confirmit® for the Windows® operating system and web browsers.
3 SSA security requirements do not permit correspondence with enrollee survey sample members through email. However, provider survey sample members will receive electronic communications at the email address provided by their practice organizations.