BOND Implementation and Evaluation
Updated OMB Clearance Request
OMB No. 0960-0785
Table of Contents
A.1 Introduction/Authoring Laws and Regulations
A.2 Description of Collection
A.2.1 Overview of the Evaluation
A.2.2 Data Collection Purpose and Process
A.2.3 Who Will Use the Information
A.2.4 Item-by-Item Justification
A.3 Use of Information Technology to Collect the Information
A.3.1 Information Technology and Data Management
A.3.2 Information Technology and Sample Tracking
A.3.3 Information Technology and Administrative Data Collection
A.3.4 Information Technology and Survey Administration
A.4 Why We Cannot Use Duplicate Information
A.5 Minimizing Burden on Small Respondents
A.6 Consequence of Not Collecting Information or Collecting it Less Frequently
A.8 Solicitation of Public Comment and Other Consultations with the Public
A.9 Payment or Gifts to Respondents
A.10 Assurances of Confidentiality
A.10.2 Data Confidentiality Protections
A.10.3 Data Storage and Handling of Survey Data
A.11 Justification for Sensitive Questions
A.12 Estimates of Public Reporting Burden
A.13 Annual Cost to the Respondents (Other)
A.14 Annual Cost to Federal Government
A.15 Program Changes or Adjustments to the Information Collection Request
A.16 Plans for Publication Information Collection Results
A.16.1 Time Schedule for Analysis and Reporting
A.16.2 Analytic Techniques, Tabulations, and Reporting
A.17 Displaying the OMB Approval Expiration Date
A.18 Exceptions to Certification Statement
Part B: Collection of Information Employing Statistical Methods (see separate document)
B.1.1 Sample Recruitment and Random Assignment
B.1.2 Universe of Households and Survey Samples
B.2 Procedures for Collecting the Information
B.2.3 Degree of Accuracy Required
B.2.4 Procedures with Special Populations
B.3 Methods to Maximize Response Rates
Participant Tracking and Locating
B.3.1 Sample Control During the Data Collection Period
B.5 Statistical Agency Contact for Statistical Information
Appendix A. Site Selection Summary
Appendix B. BOND Stage 2 Baseline Survey Instrument
Appendix C. BOND Stage 2 Interim Survey Instrument
Appendix D. BOND Stage 1 36-Month Survey Instrument
Appendix E. BOND Stage 2 36-Month Survey Instrument
Appendix F. Master Protocol and Key Informant Table and Focus Group Moderator’s Guide
Appendix G. Item-by-Item Justification for the Stage 2 Baseline Instrument
Appendix H. Item-by-Item Justification for the Stage 2 Interim Instrument
Appendix I. Item-by-Item Justification for the Stage 1 Follow-up Instrument
Appendix J. Item-by-Item Justification for the Stage 2 Follow-up Instrument
Appendix K. BOND Participation Agreement (Informed Consent Form)
Appendix L. Federal Register Notice (Pending)
Appendix M. Enhanced Work Incentive Counseling Assessment Tool
Benefit Offset National Demonstration
OMB Control No. 0960-0785
A.1 Introduction/Authoring Laws and Regulations
As part of the Ticket to Work and Work Incentives Improvement Act of 1999, 42 USC 434(a)(1)(A) and (B), Congress asked that the Social Security Administration (SSA) conduct a demonstration project testing a program under which Title II disability benefits are offset by $1 for every $2 of earnings above a specified amount. We responded to this Congressional request by implementing the Benefit Offset National Demonstration (BOND), a field test and evaluation of policy changes and services in the Title II Social Security Disability Insurance (SSDI) program. BOND allows SSA to test the effectiveness of potential solutions to the historically very low rate of return to work among SSDI beneficiaries. OMB approved the BOND information collections in February 2011.
BOND uses an experimental design to test a gradual reduction of benefits, both alone and together with the provision of enhanced work incentives counseling. Under the enhanced counseling option, some treatment groups receive a more intensive version of currently available benefits counseling services, allowing more personalized support to help with the complications of returning to work or working more.
A.2 Description of Collection
A.2.1 Overview of the Evaluation
Currently, SSDI beneficiaries lose their entire SSDI benefit if they have earnings or work activity above the threshold of Substantial Gainful Activity (SGA). Often we refer to this feature of the benefit design as “the cliff.” The benefit-offset component of the demonstration allows earnings above this level, reducing benefits by $1 for each additional $2 earned, thereby eliminating the “cliff” currently in effect.
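To make the offset arithmetic concrete, the following sketch computes a benefit amount under a $1-for-$2 offset of the kind described above. It is a minimal illustration only: the function name, the use of a single-period rather than annualized threshold, and all dollar amounts are assumptions for exposition, not BOND parameters.

```python
def benefit_under_offset(full_benefit, earnings, sga_threshold):
    """Illustrative $1-for-$2 offset: reduce the benefit by $1 for every $2
    of earnings above the SGA threshold (all amounts are hypothetical)."""
    excess = max(0.0, earnings - sga_threshold)
    reduction = excess / 2.0          # $1 less benefit per $2 of excess earnings
    return max(0.0, full_benefit - reduction)

# Under current law, sustained earnings above SGA can end the entire benefit
# (the "cliff"); under the offset, the benefit phases out gradually instead.
print(benefit_under_offset(1000, 1200, 1100))   # 950.0
print(benefit_under_offset(1000, 2000, 1100))   # 550.0
print(benefit_under_offset(1000, 3200, 1100))   # 0.0 (benefit fully offset)
```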
BOND tests a benefit offset alone and in conjunction with enhanced work incentives counseling. The central research questions are:
What is the effect of the benefit offset alone on employment and other outcomes?
What is the effect of the benefit offset in combination with enhanced work incentives counseling on employment and other outcomes?
The evaluation uses an experimental design (with random assignment) to measure these effects. Random assignment takes place at two levels, Stage 1 and Stage 2. Each level creates a current law control group to compare to one or more treatment groups. Exhibit A1 shows the sample intake flow for both Stage 1 and Stage 2.
The exhibit illustrates how the three-way random assignment takes place at Stage 1, creating a Stage 1 treatment group that receives the offset (T1), a current law control group (C1), and a “solicitation pool” from which we recruit volunteers for Stage 2. Similarly, the exhibit shows how, at Stage 2, we randomly assign volunteers from the Stage 1 “solicitation pool” to three groups that receive different treatment under the SSDI program: those who receive the offset and enhanced work incentives counseling (T22), those who receive the offset only (T21), and a current law control group (C2).
Exhibit A1. Sample Intake Flow for Stage 1 and Stage 2
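A short sketch of the two-stage assignment flow may help fix the group labels (T1, C1, and the solicitation pool at Stage 1; T21, T22, and the Stage 2 control group, labeled C2 here, at Stage 2). The assignment probabilities, the volunteer step, and the use of Python's random module are illustrative assumptions and do not reflect the demonstration's actual assignment procedure.

```python
import random

def assign_stage1(beneficiary_ids, seed=12345):
    """Illustrative three-way Stage 1 assignment: offset treatment (T1),
    current-law control (C1), or the Stage 2 solicitation pool.
    The assignment probabilities are placeholders, not BOND's actual rates."""
    rng = random.Random(seed)
    groups = {"T1": [], "C1": [], "solicitation_pool": []}
    for bid in beneficiary_ids:
        label = rng.choices(["T1", "C1", "solicitation_pool"],
                            weights=[0.2, 0.6, 0.2])[0]
        groups[label].append(bid)
    return groups

def assign_stage2(volunteer_ids, seed=54321):
    """Illustrative Stage 2 assignment of solicitation-pool volunteers:
    offset only (T21), offset plus enhanced counseling (T22), or control (C2)."""
    rng = random.Random(seed)
    groups = {"T21": [], "T22": [], "C2": []}
    for vid in volunteer_ids:
        groups[rng.choice(["T21", "T22", "C2"])].append(vid)
    return groups

stage1 = assign_stage1(range(1000))
# Only solicitation-pool members who volunteer reach Stage 2; volunteering is
# simulated here simply by taking a subset of the pool.
volunteers = stage1["solicitation_pool"][: len(stage1["solicitation_pool"]) // 2]
stage2 = assign_stage2(volunteers)
```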
Exhibit A1 also shows how the three treatment groups created by the demonstration – one at Stage 1 and two at Stage 2 – receive one of the following interventions:
The $1 for $2 benefit offset, which helps a participant increase income through a combination of increased earnings and a gradual reduction in cash benefits. If beneficiaries complete their Trial Work Period and Grace Period and continue to have earnings above an annualized SGA threshold amount, they automatically shift to the offset schedule for the calculation of monthly benefits. In Stage 1, the SSDI-only and concurrent beneficiaries we randomly assign to the treatment group shift from the regular SSDI benefit schedule to a new schedule under which we apply the $1 for $2 benefit offset to earnings above an annualized SGA threshold. All beneficiaries in the Stage 1 treatment group receive notification of the offset’s availability. Based on historical levels of beneficiary employment, we project that most of the participants we assign to the offset will not use it. In addition, a random subset of SSDI-only beneficiaries forms a “solicitation pool,” and we invite them to volunteer to participate in Stage 2 of the demonstration. A random share of those who volunteer comprises an offset-only Stage 2 treatment group.
The $1 for $2 benefit offset with enhanced work incentives counseling, which, in addition to allowing beneficiaries to earn more without losing cash benefits via the offset, provides participants in this treatment group with enhanced work incentives counseling services that assist them in understanding how work and earnings affect their benefits, and that provide information about, and referrals to, other employment-related services and supports available through the SSDI program and other local programs. The counseling we provide represents a more intensive version of currently available benefits counseling services,1 allowing more personalized support to help with the complications of returning to work (or working more). We randomly assign to this Stage 2 treatment group a subset of the SSDI-only beneficiaries in the “solicitation pool” who volunteer to participate in Stage 2 of the demonstration.
BOND has several research components. The impact study compares outcomes between the treatment and control groups using data from administrative records and the surveys, and it compares impacts among the three treatment designs. A second research component, the participation analysis, examines the proportion of applicants and beneficiaries who take up the offer of the benefit offset or benefits counseling; how participation varies with the treatment design and with personal characteristics; and the timing and intensity of participation.
A third research component, the process study, examines all aspects of demonstration implementation and operations in order to assist SSA in understanding and interpreting project results and in identifying ways that we might modify the interventions to improve quantitative outcomes. The study documents implementation and operations, identifying important differences across sites and over time; clearly defines the interventions we test; and draws lessons to guide future replication of the demonstration in a national program. It also provides qualitative information that will be helpful in interpreting the results of the impact analysis.
The project’s benefit-cost analysis estimates the net budgetary effects of the offset and the enhanced work incentives counseling we provide in each treatment package. The analysis estimates budgetary effects separately for SSA, the federal government, and state and local governments. The cost analysis draws upon the impact and process analyses in estimating the net budgetary effects of the program treatments for the various levels of government. In addition, the cost analysis examines the net costs to participants and society.
Finally, in response to OMB’s suggestion, SSA developed a short survey to evaluate the effectiveness of the initial letter. The survey determines whether a subject: (1) received the letter, (2) read the letter, and (3) understood the letter. A contracted call center administered the survey by telephone to 500 Stage 1 treatment subjects we randomly selected from those to whom we sent the letter in an August 26, 2011 mailing. Beginning in May 2011, Abt, the implementation contractor for BOND, mailed initial contact letters approximately every two weeks; Abt mailed the last of the letters on August 26, 2011. We chose to target individuals from the final mailing because we believed the best chance for accurate responses came from those who had most recently received the letter. We completed the analysis for this survey in 2012 and 2013.
Ten sites across the United States continue to implement BOND from 2011 through 2017. Appendix A provides details of the site selection process.
A.2.2 Data Collection Purpose and Process
This request for renewal of OMB’s approval covers the data collection instruments for the impact and benefit-cost analyses and for the process study. This section first explains the purpose of each of the participant survey instruments we require for the impact analysis. It then explains the interview guides we require for the process study.
Impact Analysis Participant Surveys Purpose and Data Collection Activities
The data we obtain through beneficiary surveys are critical inputs to the BOND impact and benefit-cost studies. Administrative data analysis is valuable, but it includes only a limited set of variables and outcomes. The surveys provide information we need to measure baseline characteristics and outcomes that we cannot obtain from administrative records. Specifically, the survey instruments collect more detailed information on a wide range of topics such as employment, earnings, health and functional status, education and training, opinions about the demonstration, health insurance, and income.
BOND is a longitudinal study, with at least three points of contact for the Stage 2 sample. We conduct all of the survey efforts by telephone, with in-person follow-up as necessary, except for the Stage 2 baseline. Wherever possible, the Stage 2 interim and 36-month surveys use the same field interviewers who conducted the baseline survey data collection. This allows the field interviewers to build upon the rapport they established with the beneficiary at the time of enrollment. For the Stage 2 interim survey, we estimate we will complete 45 percent of the surveys by telephone and 35 percent through in-person follow-up, which together equal the 80 percent response rate target. Since the Stage 2 interim survey is the second interview contact for this sample, respondents likely have preconceived notions about the burden survey participation will impose. As a result, they may be less likely to participate in the survey when called on the phone, but more likely to complete it in person, especially when the interviewer is the person with whom they built rapport at baseline. Thus, we anticipate a larger proportion will require in-person administration. Given the longer interview, the longer follow-up period, and the fact that the Stage 2 36-month survey is the third point of contact with the sample members, we project completing a much larger proportion of the Stage 2 36-month interviews in person.
The situation is different for the Stage 1 36-month survey. Although the time elapsed since baseline could pose some locating challenges, we project completing a higher proportion of this sample by telephone. We will interview this sample just once during the study period, which will make it easier to solicit an initial survey response by telephone since these respondents have no preconceived notions of the interview burden. The interview is also significantly shorter than the Stage 2 36-month survey, which makes telephone administration more feasible.
Purpose of the Participant Surveys for the Impact Analysis. This survey information collection is essential for conducting a rigorous evaluation of the full range of policy-relevant effects of the BOND intervention and for conducting a comprehensive benefit-cost analysis of the intervention. The administrative data that researchers will also use to evaluate BOND do not cover the full range of topics that the evaluation requires. To ensure the availability of the data we require for analysis in the BOND impact and benefit-cost studies, we developed five survey instruments:
A Stage 2 baseline survey, which we administer to all Stage 2 volunteers prior to random assignment;
A Stage 2 interim survey, which we will administer 12 months after random assignment to all Stage 2 volunteers, with a response rate target of 80 percent;
A Stage 2 36-month survey, which we will administer 36 months after random assignment to all Stage 2 volunteers, with a response rate target of 80 percent;
A Stage 1 36-month survey, which we will administer 36 months after random assignment to a subsample of 10,000 Stage 1 treatment and control group members, with a response rate target of 80 percent; and
A Stage 1 first contact letter survey, which we administered as soon as possible after Abt mailed the last letters on August 26, 2011.
We designed the Stage 1 36-month survey specifically to measure employment outcomes—including wages, occupations, benefits, and hours worked—of beneficiaries who return to work because of participation in the project, as authorized by the Ticket to Work legislation.2
The Stage 1 First Contact Letter Survey is the primary source for evaluating the effectiveness of the BOND initial contact letter, and determining whether follow-up contact is necessary.
The Stage 2 baseline and Stage 2 36-month surveys (administered at enrollment and approximately 36 months later, respectively) are the primary sources for contextual variables and outcome measures on the Stage 2 treatment and control group members.
The Stage 2 interim survey is the primary source to capture information about experiences with the BOND program and receipt of demonstration services.
The survey data collection is the primary source for data to measure the effects of a more generous benefit offset and the provision of enhanced work incentives counseling on the work effort and earnings of SSDI beneficiaries. The experimental design generates data to draw rigorous inferences about the effects of the benefit offset and benefits counseling, independent of all other factors affecting the lives of demonstration participants. Random assignment serves to ensure that the treatment and control groups match well on both observed and unobserved characteristics at the time of their entry into the study. It thus establishes the strongest possible foundation for understanding whether the benefit offset, alone or in conjunction with enhanced work incentives counseling, can lead SSDI beneficiaries to greater economic self-sufficiency and other improvements in their lives.
The baseline survey provides data necessary to define the covariates we use in the later impact analyses and to define subgroups of the demonstration study sample for analysis. The interim survey collects information on service receipt in the first year after Stage 2 random assignment to establish the treatment/control differential in service receipt, and to measure, for the benefit-cost analysis, the extent to which Stage 2 demonstration participants used employment-related services. We expect service participation of Stage 2 volunteers to be much higher than that of DI-only beneficiaries as a whole. We measure and report both, and do not consider the former to be a bias. Rather, it is a measure of a different policy-relevant concept: how much beneficiaries who want the offset or EWIC use employment-related services, as opposed to simply responding to the offset’s work incentives alone.
The Stage 1 and Stage 2 36-month surveys, in conjunction with an analysis of outcomes derived from SSA administrative data on earnings and benefit receipt, capture the experiences of treatment and control group members over a period of three years. A follow-up interval of this length is important for measuring the impacts of BOND, as the effects of the demonstration on individual behavior and wellbeing may take time to emerge.
Data Collection Activities
Trained interviewers administer each of the participant surveys to the appropriate study participants. All of the Stage 1 participants are SSDI or concurrent beneficiaries randomly assigned to one of the three Stage 1 groups. All of the Stage 2 participants are SSDI-only beneficiaries randomly assigned to one of the three Stage 2 groups. Interviewers inform participants that their participation in the survey is voluntary, that they can choose not to answer any question at any time, and that they can stop the interview at any time. Interviewers administer each of the surveys either in person (Stage 2 baseline) or through a combination of telephone and in-person administration (Stage 2 interim and 36-month surveys; Stage 1 36-month survey). All survey administration uses computer-assisted technology: Web and computer-assisted personal interviewing (CAPI) for the Stage 2 baseline survey, and computer-assisted telephone interviewing and personal interviewing (CATI/CAPI) technology for the Stage 2 interim survey and the Stage 1 and Stage 2 36-month surveys.
Process Study Purpose and Data Collection Activities
To learn about the implementation of the demonstration, the process study team conducts site visits to each of the 10 demonstration sites. We scheduled seven rounds of site visits, beginning 4 months prior to and continuing through 54 months after the implementation of the BOND demonstration (see Exhibit A2). We conduct each of the key informant interviews using semi-structured interview guides; interviews typically last about 60 minutes. For the process study, we will also conduct participant focus groups. We summarize the purpose and timing of each round of site visits in Exhibit A2.
Exhibit A2. Overview of Site Visit Data Collection Activities
| Type of Site Visit | Purpose of the Visit | Estimated Date(s) | Focus Group Activities |
| --- | --- | --- | --- |
| Round 1: Pre-implementation | Document and describe the service area prior to demonstration implementation | 4 months before BOND implementation | None |
| Round 2: Initial implementation | Document and describe the demonstration structure and service delivery process, begin assessing the fidelity of the treatment | 6 months after BOND implementation | None |
| Round 3: Demonstration enrollment #1 | Document the enrollment process, describe how the demonstration has evolved over time, assess the fidelity of the treatment | 18 months after BOND implementation | Focus groups with treatment group participants |
| Round 4: Demonstration enrollment #2 | Demonstration updates, document any changes to the enrollment process, assess the fidelity of the treatment | 24 months after BOND implementation | None |
| Round 5: Post-enrollment #1 | Demonstration updates, assess the fidelity of the treatment, begin to interpret early impact findings, identify lessons learned | 30 months after BOND implementation | Focus groups with treatment group participants |
| Round 6: Post-enrollment #2 | Demonstration updates, assess the fidelity of the treatment, interpret early impact findings, identify lessons learned | 42 months after BOND implementation | None |
| Round 7: Post-enrollment #3 | Demonstration updates, assess the fidelity of the treatment, interpret impact findings | 54 months after BOND implementation | Focus groups with treatment group participants |
Purpose of Data Collection. The process study documents project operations and provides a basis for interpreting and generalizing the demonstration findings. Abt and its partners use the data they collect for the process study for four primary purposes:
Describe the Intervention as Implemented. We use the data to provide detailed documentation of the projects’ operations and service environments.
Assess the Demonstration Fidelity. We use the process study data to monitor the demonstration’s fidelity to its design. The process study identifies the key operational challenges that emerge as we implement the intervention, and examines how, and how well, we address those challenges.
Help Interpret Impact Results. The evaluation team uses the process study data to interpret and hypothesize about the cross-site, cross-time impact findings. From these data, the evaluation team assesses whether variation in implementation across sites is likely to affect demonstration participation and impacts.
Identify Lessons. We structured data collection to identify the primary lessons during each stage of the demonstration.
Data Collection Activities
The process study team, consisting of senior researchers, will conduct seven rounds of in-depth site visits to each of the BOND demonstration sites. During these site visits the team focuses on key topics such as the demonstration planning, agencies involved with the demonstration, process for recruiting and enrolling participants, provision of regular and enhanced work incentives counseling to treatment group participants, and the challenges and lessons learned in recruiting and serving participants.
We conduct each of the key informant interviews using semi-structured interview guides; interviews typically last about 60 minutes. Interview participation is voluntary. Abt and its partners compiled a range of topics and questions into a resource bank that we use to create the key respondent interview guides for each round of site visits (see Appendix F). The process study team uses only the topic areas and questions that are relevant to that particular round of site visits and key respondent (see Appendix F, Tables 1 and 2). For example, we used the questions describing the agencies involved with the demonstration planning and initial implementation only for the second round of site visits. This bank of topic areas and questions allows Abt and its partners to adapt the interview guides as the demonstration evolves over time.
At the beginning of each round of visits, the task leader creates the key respondent interview guides by drawing from the resource bank of topic areas and questions. All site visitors use those interview guides for that round of visits. We change the interview guides from one round to the next to capture the dynamic nature of the demonstration.
In three of the rounds of site visits (Rounds 3, 5, and 7), we will conduct focus groups with treatment group participants to learn about their experiences with work incentives counseling and how the benefit offset influences their decision to work. We will complete two focus groups within each of the sites. One group will include those receiving regular work incentives counseling, and the other group will consist of those receiving enhanced counseling. Participation in the focus groups is voluntary.
We will use a semi-structured interview guide for the treatment group participant focus groups (see Appendix F). Focus groups will last 90 minutes. To encourage broad participation from those with different types of disabilities, where possible, we will provide the needed accommodations, such as translation services for the hearing impaired, so that treatment group members may participate in the focus groups.
Frequency of Data Collection
Site visits began 4 months prior to and continue 54 months after the implementation of the BOND demonstration (see Exhibit A2).
Key Respondents
In each of the 10 demonstration sites, the process study team works with the BOND site offices to determine who is involved with administering and providing BOND activities, including the regular and enhanced work incentives counseling. While the team expects that the list of respondents will vary somewhat across sites (depending on the specific service providers involved in each of the sites), the categories of potential respondents include:
Demonstration site director and staff (e.g., outreach and recruitment specialist, BOND specialist, and mobile BOND specialist) who are responsible for demonstration planning as well as coordinating the initial and ongoing implementation of BOND. They are also responsible for all recruitment, enrollment, and research activities (e.g., surveys, random assignment, preventing contamination or crossovers).
Headquarters, Regional, and Area Office SSA personnel who work with the sites to recruit participants and perform other responsibilities associated with the demonstration (these staff also provide information on the experiences of control group members and Stage 2 treatment group participants concurrently enrolled in SSI and SSDI).
Staff and managers at work incentives counseling service providers who serve demonstration participants and who offer employment-related services more generally (these individuals also provide information on the experiences of control group members).
Representatives of state and local government agencies that serve people with disabilities, which could include State Vocational Rehabilitation Agencies (SVRA), employment agencies, and health policy and services agencies that serve both treatment and control group members.
Managers and Community Work Incentive Coordinators at the Work Incentives Planning and Assistance (WIPA) agencies funded by SSA to assist beneficiaries in return-to-work efforts.
State and local advocates for working people with disabilities.
Stage 2 BOND participants who receive WIC services and those who receive EWIC services.
A.2.3 Who Will Use the Information
SSA is the primary beneficiary of the planned survey data collection. SSA uses the information to assess the effects of the BOND demonstration for SSDI beneficiaries. These data will begin to answer SSA's questions about impacts of the benefit offset alone and in combination with enhanced work incentives counseling services in all study domains: employment and earnings; income; benefits; etc. SSA is also the primary beneficiary of information from the key informant interviews, although the demonstration operator also utilizes information from them. The information we obtain through the interviews helps identify best practices in implementing a program like BOND and potentially guides the decision to implement BOND nationwide.
Secondary beneficiaries of this data collection are members of the public policy and social science research community who are interested in developing policy initiatives to promote return to work among people with disabilities and to stem growth in the number of beneficiaries on the SSDI program and the length of time they remain on the rolls. Demonstration site staff working directly with individuals use some of the data as well.
Ultimately, these data will benefit researchers, policy analysts, policy makers and the United States Congress in a wide range of program areas. The effects of BOND on the wellbeing of SSDI beneficiaries could manifest themselves in many dimensions and could be relevant to an array of other public programs. This project offers the first opportunity to obtain reliable measures of these effects. The long-term indirect benefits of this research are therefore likely to be substantial.
A.2.4 Item-by-Item Justification
In developing the surveys, we attempted to balance the need to capture the desired data elements against placing undue burden on the respondents, excluding items that—while potentially interesting—are not critical to the measurement of outcomes and analysis of the impacts of BOND. Another goal was to keep the time we need for survey administration to a reasonable duration, thereby limiting respondent burden.
This section provides a brief overview of the content of each of the four participant surveys. Appendices B-E contain the actual survey instruments and appendices G-J contain the item-by-item justifications for the proposed survey instruments. For each included survey item, these appendices show the content, the reason for inclusion—or intended use, and the source of the survey question. Appendix F provides the instruments for the process study data collection. The Enhanced Work Incentives Counseling Assessment Tool is included in Appendix M. Enhanced work incentives counselors (EWICs) will administer this assessment one time to each beneficiary assigned to the T22 group.
Exhibit A3 summarizes the data by domain for each of the four main participant surveys (we did not include the Stage 1 First Contact Letter survey in this chart, as we used it only to evaluate the initial contact letter and determine whether we needed any follow-up contact).
Exhibit A3: Data Collected in Participant Surveys
| Domain | Stage 2 Baseline Survey | Stage 2 Interim Survey | Stage 2 36-Month Follow-up Survey | Stage 1 36-Month Follow-up Survey |
| --- | --- | --- | --- | --- |
| Employment and Earnings | | | | |
| Service Use | | | | |
| Income | | | | |
| Barriers to Employment | | | | |
| Disability Onset | | | | |
| Education and Training | | | | |
| Health and Functional Status | | | | |
| Awareness of BOND or SSA Work Incentives | | | | |
| Work History | | | | |
| Personal Characteristics | | | | |
We designed the Stage 1 36-month survey specifically to measure employment outcomes—including wages, occupations, benefits, and hours worked—of beneficiaries who return to work because of participation in the project, as authorized by the Ticket to Work legislation.3
The Stage 2 baseline and Stage 2 36-month surveys (administered at enrollment and approximately 36 months later, respectively) are the primary sources for contextual variables and outcome measures on the Stage 2 treatment and control group members.
The Stage 2 interim survey is the primary source to capture information about experiences with the BOND program and receipt of demonstration services.
The key informant interviews and focus groups that are part of the process study help the research team describe the demonstration start-up, implementation, ongoing operations, and the provision of services to participants over time. Key informant interviews we conduct during site visits focus on the following domains:
site background;
current services available to beneficiaries;
agencies involved in demonstration planning and initial implementation;
demonstration site recruitment, planning, and startup;
sample selection, recruitment, and enrollment;
development and structure of the demonstration;
demonstration service delivery;
participation patterns and experiences;
communication, coordination, and interagency relationships;
monitoring and tracking;
assessment of demonstration implementation; and
successes, challenges, and lessons learned.
Appendix M displays the Enhanced Work Incentives Counseling (EWIC) Assessment tool. Enhanced Work Incentives Counselors (EWICs) administer the assessment tool once to each of the beneficiaries assigned to the T22 group. The assessment tool is a guided interview that quickly equips the EWIC counselors with a profile of a beneficiary's needs, enabling rapid referrals to services targeted at resolving those needs and promoting a return to work. The responses to the assessment enable EWICs to document the needs of BOND participants that inhibit their employment and to provide information and recommendations that address issues quickly, in an integrated and effective manner. This enables EWICs to assist beneficiaries and their families in locating the community resources that enable a return to work.
The open-ended questions on the assessment tool give the beneficiary a chance to describe their aspirations and concerns regarding employment, the barriers they may face in finding and keeping work, and any support they have or may need. The guided interview also explores the beneficiary’s employment goal, past education or work experience, and their perceptions of the skills they bring to their employment. The questions cover the following domains:
Daily Activities. Counselors use responses to questions in the daily activities domain to ascertain how beneficiaries believe their health conditions affect their lives, and to develop service referral strategies. This domain includes questions about present living situation, support used for daily living activities, primary mode of communication, primary language, and use of email.
Health/Mental Health. Questions related to health capture critical issues that relate directly to physical and emotional health concerns. This domain includes questions about general health and physical, emotional, or secondary disabilities. Questions also focus on mobility/ambulation; motor skills; and use of, or need for, adaptive technologies. The EWIC’s goal during the assessment is to obtain a brief snapshot of the beneficiaries’ health, from their perspective, in the past, present, and future.
Work. For many SSDI beneficiaries, events at prior jobs can be instrumental in determining whether a beneficiary is likely to return to work. Persons with identical impairments have very different outcomes based on their perceptions of work or of a potential workplace, and beneficiaries base many of these perceptions on prior work experiences. Questions about work not only provide a snapshot of the beneficiary’s background, but also give the EWIC counselor a general picture of the beneficiary’s motivation to return to work, an assessment of their skills and capabilities, and their desire for work. The EWIC counselor uses this information to develop the service referral approach. Questions in this domain focus on employment goals; previous work experience; skills; preferred work environment; reasons for leaving previous jobs; educational attainment; and receipt of employment training.
Family. The family’s reaction to a loved one’s disability, and the type of support system the family can provide, can be critical in determining how much encouragement a beneficiary receives to return to work. Questions in this domain address support provided by the family and the beneficiary’s living situation.
Financial. The beneficiaries’ financial situations can have unexpected effects on how they react to disability. To promote effective referral planning, EWIC counselors gather information from beneficiaries about receipt of benefits and knowledge of how work interacts with benefits.
The respondents are SSDI beneficiaries who volunteer to participate in the BOND evaluation.
A.3 Use of Information Technology to Collect the Information
This evaluation will use information technology in four distinct ways:
to maintain all demonstration data in a consistent manner;
to assist the ongoing sample tracking and locating efforts;
to measure certain outcomes through data we abstract from administrative records data; and
to facilitate collection of the survey data in standardized and accurate ways that also accommodate the confidential collection of sensitive data.
A.3.1 Information Technology and Data Management
The Benefit Offset National Demonstration will generate a substantial amount of data over its five years in operation. BOND requires multiple information systems to support project data collection, operations, and evaluation. We will manage the four key aspects of the demonstration’s functionally distinct data needs separately, linking them only for specific purposes. The four main types of data we require to support the needs of the demonstration are:
Administrative data from U.S. Department of Education’s Rehabilitation Services Administration (RSA) Vocational Rehabilitation administrative records, SSA administrative records, and Centers for Medicare & Medicaid Services (CMS) administrative records;
Stage 1 follow-up survey data;
Stage 2 survey data; and
Operations data, including but not limited to those operations we require for the management of participant recruitment and intake, earnings data collection, and data exchange with the BOND Standalone System.
The organization within the Abt Associates BOND team that conducts each data collection activity deploys and administers the data systems needed to support that collection. The BOND Operations Data System (BODS) manages all data the benefit offset demonstration collects or generates. This ensures clear, consistent documentation of all data transactions, which is particularly important as analysts work with various data components and create their own analysis files. Documentation of all such transactions resides in the database environment to ensure the ability to replicate each one in the future.
The database environment is the source of all data extracts, including the request files necessary for collecting administrative data from agencies other than SSA and the sample files for the survey data collection (see A.3.4 below). Since the participants in the demonstration have disabilities, it is likely that some require special assistance—either proxies or some form of assistive technology. The sample file includes a disability flag to note whether participants require special assistance for the interview, and if so, which forms of assistance they need.
A.3.2 Information Technology and Sample Tracking
The study’s tracking design contains both passive methods (requiring no contact with the sample member) and active methods (requiring interaction with the sample member). Passive methods include collection of location information from sources such as postal address updates, directory assistance, reverse directories, credit bureau data, and public agency administrative data. Active tracking methods include direct contact with the participant, and can consist of annual brief telephone or in-person interviews, as well as contact mailings that ask the participants to confirm or update how best to reach them.
The BOND Beneficiary Tracking System (BTS) stores all tracking updates we collect, and links them to the participants.
A.3.3 Information Technology and Administrative Data Collection
Improved information technology also benefits the BOND evaluation through the collection of administrative data on certain outcomes. By accessing administrative information from SSA and other organizations, the Abt team was able to reduce the scope and burden of the survey instruments on evaluation participants. Exhibit A4 shows the uses to be made of administrative data.
Exhibit A4. Role of Administrative Data
| Research Domain | Outcomes/Purpose | Data Sources |
| --- | --- | --- |
| Employment and earnings | Annual employment and earnings | SSA earnings records |
| Income and benefits | Monthly cash assistance | SSA administrative records on beneficiary status and benefit receipt |
| Service delivery data | Assess service differentials between treatment and controls | Rehabilitation Services Administration (RSA) administrative records |
| Medicaid and Medicare claims data | Assess differences in participation patterns by control and treatment | Centers for Medicare & Medicaid Services (CMS) administrative records |
A.3.4 Information Technology and Survey Administration
In accordance with SSA’s Government Paperwork Elimination Act plan, information technology assists in the survey data collection in three ways:
Design and management of the sample;
Survey administration; and
Survey management.
We use the BODS (described above) as the primary source for identifying the sample for each survey. Data drawn from various sources in the BODS define the survey sample, which we then incorporate into the automated survey data collection software.
The BOND Stage 2 baseline administration uses Web and computer-assisted in-person interviewing (CAPI) technology. We will administer the remaining BOND surveys (the Stage 2 interim survey and both the Stage 1 and Stage 2 36-month surveys) using computer-assisted telephone interviewing and personal interviewing (CATI/CAPI) technology. The Web/CATI/CAPI technology helps to improve the quality of the survey data we collect in several ways. First, Web/CATI/CAPI technology controls the flow of the interview, which virtually eliminates any chance of missing data. Second, controlling the flow of the interview also ensures that the skip patterns work properly. Third, computer-assisted interviewing can build in checkpoints that allow the interviewer to confirm responses, thereby minimizing data entry errors. Fourth, automated survey administration can incorporate system checks for allowable ranges for quantity and range value questions, minimizing out-of-range or unallowable values. Web/CATI/CAPI technology also allows interviewers to easily record verbatim responses to open-ended questions, thus supporting efficient survey management. Finally, this technology allows us to collect data in accordance with the agency’s Government Paperwork Elimination Act plan, as all participant surveys will be done via automated means.
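As an illustration of the range checks and skip patterns described above, the fragment below sketches the kind of logic a CATI/CAPI instrument can enforce at interview time. The item names, the allowable range, and the skip rule are hypothetical and are not drawn from the BOND instruments.

```python
def validate_hours_worked(value):
    """Illustrative range check: reject out-of-range or unallowable values
    before they enter the data file (the 0-112 range is hypothetical)."""
    if not (0 <= value <= 112):
        raise ValueError("Hours worked must be between 0 and 112; please confirm.")
    return value

def next_question(current_item, answer):
    """Illustrative skip pattern: employment detail items are skipped for
    respondents who report not working (item names are hypothetical)."""
    if current_item == "worked_last_week" and answer == "no":
        return "job_search_activities"   # skip the wage and hours items
    return "hours_worked_last_week"
```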
Although the CATI/CAPI software enhances the quality of the survey data we collect and minimizes data entry errors, post-data-collection cleaning using rigorous protocols is still necessary. First, researchers review frequency distributions to ensure that there are no outlying values and that related data items are consistent. Analysts review the open-ended or verbatim responses, group like answers together, and assign a numeric value. The most important open-ended responses that require coding are the industry and occupation questions and those that capture data about the respondent’s knowledge of SSA rules and the demonstration. The automated nature of the survey data collection should result in little to no missing data. In the rare instances where missing data exist, we recode the variables in question with standard default codes to indicate missing data. Once we complete the data cleaning protocols, we construct the weights required for analysis.4
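The following is a minimal sketch of the post-collection steps described above, assuming the survey responses arrive as a pandas DataFrame. The column names, the missing-data code, and the simple inverse-response-rate weight are assumptions chosen for illustration; they stand in for, and do not reproduce, the project's documented cleaning and weighting protocols.

```python
import pandas as pd

MISSING_CODE = -9  # hypothetical standard default code for missing items


def clean_and_weight(df):
    """Illustrative post-collection cleaning and weighting. Column names
    ('site', 'responded', 'hours_worked', 'monthly_earnings') are assumptions."""
    # Frequency review: summary statistics help analysts spot outlying values.
    print(df[["hours_worked", "monthly_earnings"]].describe())

    # Recode missing items with a standard default code.
    df = df.fillna(MISSING_CODE)

    # Simple nonresponse adjustment: weight respondents by the inverse of the
    # response rate within their site (a stand-in for the actual weighting scheme).
    response_rate = df.groupby("site")["responded"].mean()
    respondents = df[df["responded"] == 1].copy()
    respondents["weight"] = respondents["site"].map(1.0 / response_rate)
    return respondents
```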
Analysts undertake a similar process with the administrative data we collect. They review the files for duplicate entries and incorrect matches, and they review individual items to ensure that they, too, are within the expected ranges.
Once we finalize data processing protocols, the research team will create the final data set and codebook.
A.4 Why We Cannot Use Duplicate Information
The information we are collecting and the manner in which we are collecting it preclude duplication. SSA does not use another collection instrument to obtain similar data. The purpose of the follow-up survey for the BOND evaluation is to obtain current information on the status and wellbeing of individuals in the BOND demonstration study sample. Information about these respondents' educational achievement, employment status, job skills development, physical and mental health, and use of health insurance is not available through any other source. Further, as described in A.3 above, the evaluation uses administrative data in conjunction with survey data to avoid duplication of reporting (e.g., disability benefits or earnings).
We also avoid duplication in this study by use of the centrally maintained BODS, which links all the data we collect from random assignment for use in the Baseline Survey (and during the subsequent active and passive tracking efforts) with subsequent information we gather from administrative sources. This eliminates the need to ask about personal characteristics or background factors for known household members on the interim or follow-up surveys.
A.5 Minimizing Burden on Small Respondents
We will not involve any small businesses or other small entities as respondents in the proposed data collection effort. Respondents are all individuals participating in the BOND evaluation.
A.6 Consequence of Not Collecting Information or Collecting it Less Frequently
Each of the four survey data collection efforts is essential to evaluating BOND. If we collected the data less frequently, it would jeopardize SSA’s ability to conduct the impact analysis. Delays in the administration of the interim survey run an inherent risk that the respondent will have trouble recalling the details about the services received. Delaying the administration of the follow-up surveys allows more time for outside factors—such as changes in the labor market, or declining health of the respondent—to mask the impact of the BOND interventions.
The process study data, particularly the key informant interviews, are critical to understanding the differences in program implementation across sites. The process analysis research team must make frequent visits to each demonstration site to document demonstration activities at critical points in time. We conducted two site visits before randomization; we will conduct two site visits during the enrollment phase; and we will conduct three site visits post enrollment. The first two site visits allowed us to collect baseline data, and to describe the initial implementation of the demonstration. Rounds 3 and 4 will allow us to capture the random assignment process and enrollment. The last three site visits will allow us to describe how the demonstration offset and work incentives counseling services may have contributed to program impacts over time. We began to assess the fidelity of the intervention during Round 2 and continue to do so during all subsequent site visits.
There are no technical or legal obstacles to burden reduction.
The proposed data collection activities are consistent with the guidelines set forth in 5 CFR 1320.5 (Controlling Paperwork Burden on the Public, General Information Collection Guidelines). There are no circumstances that require deviation from these guidelines.
A.8 Solicitation of Public Comment and Other Consultations with the Public
The 60-day advance Federal Register Notice published on November 1, 2013 at 78 FR 65745 and SSA received no public comments. The second Notice published on January 30, 2014 at 79 FR 5014. If we receive any comments in response to the 30-day Notice, we will forward them to OMB. SSA did not consult members of the public in the development or maintenance of the collection instruments.
Abt Associates Inc., the prime contractor, assisted with the BOND evaluation design. Abt Associates is also assisting with the implementation of the demonstration. Key members of the Abt team include Dr. Howard Rolston, Dr. Stephen Bell, Dr. David Stapleton, Dr. Larry Orr, Dr. Judith Feins, Mr. David Long, and Dr. Ray Glazier. Dr. Brian Sokol (Abt Associates) assisted in the coordination of survey data collection and the database environment/data warehouse.
SSA collaborated on the design of the demonstration with the Abt team throughout all phases of the study to date. The purpose of such consultation is to ensure the technical soundness and usefulness of the data collection instruments in carrying out the aims of the evaluation.
A.9 Payment or Gifts to Respondents
We provide incentive payments to respondents for the participant surveys and to the focus group participants. Our contractor has found that incentive payments are a powerful tool for maintaining low attrition rates in longitudinal studies, especially for cases in the control group, because these sample members are not receiving any (other) program benefits or services. The use of incentive payments for the BOND surveys helps ensure a high response rate, which is necessary to ensure unbiased impact estimates. Low response rates increase the danger of differential response rates between the treatment and control groups, leading to non-comparability between the two groups and potentially biased impact estimates.
Three factors helped to determine the incentive amounts for each survey:
Respondent burden, both at the time of the interview and over the life of the demonstration;
Costs associated with participating in the interview at that time; and
Other studies of comparable populations and burden.
The incentive structure for Stage 2 and Stage 1 surveys is as follows:
Stage 2:
Baseline Survey: $40 for all respondents.5
12-Month Interim: $25
36-Month Follow-up: $45
Tracking Letters: $5
Stage 1:
36-Month Follow-up: $25
The respondent burden associated with surveys of the Stage 2 sample in BOND is much greater than that for the Stage 1 sample members. We interview the Stage 2 sample at three different points over a 36‑month follow-up period, whereas we interview the Stage 1 sample just once during that period. Incentives generally increase over time for longitudinal samples with multiple surveys. For BOND, it is especially vital to minimize Stage 2 sample attrition for the survey data. Although the overall sample size seems large, the sample sizes for analysis in any one cell are much smaller when broken down across 10 sites, three random assignment groups, and multiple disability types. Therefore, it is imperative to minimize attrition over the life of the demonstration; respondent incentives help to do that.
The BOND enrollment process requires that, whenever possible, study subjects come into the BOND site office and spend time with the site office staff, give informed consent, complete the baseline interview, and then await the results of random assignment. The incentive for all respondents at baseline is $40, whether they come into the office or complete the interview in their home. (Those who travel to the BOND site office for their intake session and complete the baseline survey also receive $10 to cover transportation or childcare expenses.)
Respondents to the interim survey will receive $25, as that survey is shorter and less complex than the baseline survey. The incentive payment for the Stage 2 36-month follow-up interview is $45, as the burden to the respondent is more significant: the instrument is longer, and continued participation in the study will involve not only the two earlier rounds of data collection but also interim tracking contacts. Given the longer period of follow-up and potential for individuals to drop from the sample, this amount will also be important to help secure the cooperation of the control group.
Stage 2 sample members will receive periodic requests to update their contact information on an enclosed form and return the updated information to Abt. Those who respond to this request will receive $5 as a token of appreciation for their time.
Stage 1 follow-up respondents receive $25 for their participation. Incentives for this group will be very important, as controls will not receive prior information about BOND, and treatment group members may be unlikely to recall information contained in the SSA notice about BOND, given the absence of informed consent.
For the process study, we will offer $25 incentive payments to focus group participants. This incentive payment will cover the cost of their time spent in the focus group and traveling to and from the focus group site.
A.10 Assurances of Confidentiality
The subjects of this information collection and the nature of the information we will collect require strict confidentiality procedures. The information requested under this collection is protected and held confidential in accordance with 42 U.S.C. 1306, 20 CFR 401 and 402, 5 U.S.C. 552 (Freedom of Information Act), 5 U.S.C. 552a (Privacy Act of 1974), and OMB Circular No. A-130. We describe detailed plans for informed consent and data security procedures below.
All potential participants for Stage 2 of the BOND should be able to make a genuinely informed decision about demonstration participation. Vigorous outreach with a clear message and strong supporting materials help to ensure that those we assign to the demonstration treatments understand the opportunities available, and are likely to take advantage of the demonstration’s benefits. In addition, we will screen those members of the Stage 2 Solicitation Pool who express interest in BOND for cognitive ability before they can provide their informed consent (see more below).
We designed BOND to ensure that participants would not be worse off than they would be under current rules. Those who participate actually commit little, and face minimal risk, by agreeing to be part of the demonstration. The outreach materials will clearly explain the risks to participants.
At Stage 1, we base the random assignment of eligible beneficiaries to an offset-only treatment group and a control group entirely on SSA administrative records; hence, it does not require direct recruitment of beneficiaries or informed consent (as discussed below, we solicit informed consent for Stage 2). However, the beneficiaries we assign to the Stage 1 offset-only group receive a notice from SSA explaining the new, more generous treatment of their earnings, the $1 for $2 offset, that applies when they work above SGA for a sustained period.
Abt Associates demonstration implementation staff enroll SSDI-only beneficiaries who become members of the Stage 2 volunteer sample at each demonstration site. We obtain the informed consent of each sample member through a signed consent form, the “$1 for $2 Benefit Offset Demonstration Participant Agreement,” which describes the demonstration, the process of random assignment, and the information requirements of the evaluation. As shown in Appendix K of this submission, this form also indicates to the applicant that participation is voluntary and that agreeing to participate means they give permission for researchers to access information about them from other sources.
A.10.2 Data Confidentiality Protections
We require both the survey data and the administrative data from the sites to evaluate the BOND demonstration. Mailings to potential respondents and all in-person introductions include assurances that participation is voluntary, that all information will remain confidential to the extent allowed by law, and that we will report respondents' information only in aggregate form. Interviewers inform all participants that Abt Associates will ultimately report the data they provide to SSA.
We include an assurance of confidentiality in the Stage 2 informed consent document (see Appendix K). We reiterate this assurance of confidentiality to all respondents in the introduction to each of the Stage 2 surveys. We do not require Stage 1 sample members to give their consent to participate in BOND; however, all survey participants are entitled to know that their participation in the survey is voluntary. The introduction to the Stage 1 survey includes the same assurance of confidentiality as the Stage 2 surveys. Specifically, the assurance of confidentiality explains to the Stage 1 sample that their participation is voluntary, describes the potential risks and benefits of participating in the surveys, and identifies the users of the survey data. All members of the Abt team, including telephone and field interviewers, sign a pledge of confidentiality as a condition of employment. Survey data responses and identifying information reside in separate data files; linking is possible only through a common identification number. For both survey data and corresponding administrative data on sample members, passwords known only to the project staff members who require access to these files help maintain computer security.
A.10.3 Data Storage and Handling of Survey Data
The Systems Security Plan developed for this demonstration governs all data we collect under this demonstration. The Abt Associates team developed this plan in compliance with SSA’s guidelines. Computer-assisted interviewing technology ensures data security and enhances data quality for all of the survey data collection efforts. It provides centralized control of Web, CATI, and CAPI data collection; allows centralized distribution of new and updated surveys (via the internet for Web and CAPI, and via the intranet for CATI); and permits centralized sample management and case distribution. The CATI/CAPI systems encrypt the data, store it with password protection, and prevent access by unauthorized users.
A.11 Justification for Sensitive Questions
The baseline and follow-up surveys include questions about household income and other financial circumstances that we do not deem sensitive. The surveys do not ask about spending on personal items that some participants may consider private or sensitive. The interviews do include questions about physical and emotional health, items we sometimes consider sensitive. We need these items to evaluate the effects of the BOND demonstration. Strict security policies protect all data, and researchers will use the data only for the purposes of the study. To encourage candid responses, interviewers remind respondents that their responses will remain confidential.
The key informant interviews do not contain any sensitive questions.
A.12 Estimates of Public Reporting Burden
The data collection for the BOND evaluation consists of four components we began implementing in early 2011: participant surveys (including the participation agreement), Enhanced Work Incentives Assessments, key informant interviews, and participant focus groups. It also includes data collection efforts that started in 2012 (the Stage 2 interim survey) and 2014 (the Stage 1 and Stage 2 36-month follow-up surveys).
Exhibit A5 shows the total estimated respondent burden for the baseline, interim, and Stage 1 and Stage 2 follow-up interviews. It shows the average time, in minutes, we estimate demonstration participants will need to complete each of the interviews, along with the resulting total burden in hours. The table also shows total burden estimates for the Enhanced Work Incentive Assessments, the participant focus groups, and the Stage 1 first contact letter survey.
Exhibit A5. Estimated Respondent Burden Hours
Form | Number of Respondents | Frequency | Number of Responses | Average Time to Complete (minutes) | Total Burden (hours)
Participation Agreement | 12,954** | 1 | 12,954 | 20 | 4,318
Baseline Survey* | 12,954 | 1 | 12,954 | 41 | 8,852
Stage 2 12-month Interim Survey** | 10,363 | 1 | 10,363 | 29 | 5,009
Stage 1 36-month Survey** | 8,000 | 1 | 8,000 | 49 | 6,533
Stage 2 36-month Survey** | 10,363 | 1 | 10,363 | 60 | 10,363
Enhanced Work Incentives Assessment | 3,089 | 1 | 3,089 | 35 | 1,802
Key Informant Interviews | 100 | 7 | 700 | 60 | 700
Stage 2 Participant Focus Groups | 600 | 1 | 600 | 90 | 900
Stage 1 First Contact Letter Survey | 500 | 1 | 500 | 3 | 25
TOTAL | 58,923 | | 59,523 | | 38,502
* Response to the Stage 2 outreach efforts exceeded original expectations. As a result, 12,954 beneficiaries enrolled in BOND, compared to the original projection of 12,600.
** We used a target response rate of 80 percent of those Stage 2 beneficiaries actually enrolled to calculate the number of respondents and total burden estimates.
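Each burden-hour figure in Exhibit A5 is the number of responses multiplied by the average completion time in minutes, divided by 60 and rounded to whole hours. The short sketch below is illustrative only; it simply reproduces that arithmetic from the figures shown in the exhibit.

# Illustrative arithmetic only: burden hours = responses x average minutes / 60,
# using the figures reported in Exhibit A5.
rows = {
    "Participation Agreement":             (12_954, 20),
    "Baseline Survey":                     (12_954, 41),
    "Stage 2 12-month Interim Survey":     (10_363, 29),
    "Stage 1 36-month Survey":             (8_000, 49),
    "Stage 2 36-month Survey":             (10_363, 60),
    "Enhanced Work Incentives Assessment": (3_089, 35),
    "Key Informant Interviews":            (700, 60),
    "Stage 2 Participant Focus Groups":    (600, 90),
    "Stage 1 First Contact Letter Survey": (500, 3),
}
total = 0
for form, (responses, minutes) in rows.items():
    hours = round(responses * minutes / 60)
    total += hours
    print(f"{form}: {hours:,} hours")
print(f"TOTAL: {total:,} hours")  # 38,502 hours, matching the Exhibit A5 total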
In addition, Exhibit 5a provides the estimated burden information per year for the lifetime of the collection:
Exhibit 5a. Estimated Annual Burden Hours
Form | Total Burden (hours) | 2011 | 2012 | 2013 | 2014 | 2015
Participation Agreement* | 4,318 | 2,159 | 2,159 | N/A | N/A | N/A
Baseline Survey | 8,852 | 4,426 | 4,426 | N/A | N/A | N/A
Stage 2 12-month Interim Survey | 5,008 | N/A | 2,504 | 2,504 | N/A | N/A
Stage 1 36-month Survey** | 6,533 | N/A | N/A | N/A | 6,533 | N/A
Stage 2 36-month Survey** | 10,363 | N/A | N/A | N/A | 5,182 | 5,181
Enhanced Work Incentives Assessment | 1,802 | 901 | 901 | N/A | N/A | N/A
Key Informant Interviews | 700 | 140 | 140 | 140 | 140 | 140
Stage 2 Participant Focus Groups | 900 | N/A | 300 | 300 | N/A | 300
Stage 1 First Contact Letter Survey | 25 | 25 | N/A | N/A | N/A | N/A
TOTAL | 38,501 | 7,651 | 10,430 | 2,944 | 11,855 | 5,621
Exhibits A5 and 5a also show the estimate of burden hours for the key informant interview respondents. The exhibits show the time, in hours, that we estimate site operations staff and local service provider staff will require to complete each of the interviews. The total burden of BOND data collection from these staff is 700 hours over a period of 60 months during 2011-2015, or 140 hours annually.
Using the average completion times shown in Exhibit A5 and the annual breakdown in Exhibit 5a, the total burden of BOND data collection from survey respondents is 38,502 hours over a period of 60 months during 2011-2015. This figure represents burden hours only; we did not calculate a separate cost burden.
A.13 Annual Cost to the Respondents (Other)
This data collection effort involves no recordkeeping or reporting costs for respondents other than those described in item A.12 above. There is no known cost burden to the respondents.
A.14 Annual Cost to Federal Government
Exhibit A6 shows the estimated costs to the Federal Government for the BOND participant survey and key informant interview data collection activities. Costs in years 6 and 7 reflect data processing, analysis, and reporting activities.
Exhibit A6. Estimated Costs to the Federal Government
Year | Participant Surveys | Key Informant Interviews | Total
Year 1 | $398,326 | | $398,326
Year 2 | $2,037,552 | $508,658 | $2,522,236
Year 3 | $3,569,127 | $671,757 | $4,240,884
Year 4 | $1,798,106 | $412,074 | $2,210,180
Year 5 | $3,827,336 | $275,903 | $4,103,239
Year 6 | $1,994,839 | $433,552 | $2,428,391
Year 7 | $1,830,578 | | $1,830,578
Total Cost to the Federal Government | $15,431,890 | $2,301,944 | $17,733,834
A.15 Program Changes or Adjustments to the Information Collection Request
Since we last cleared this information collection request in 2011, we increased the public reporting burden. Upon completing enrollment of the Stage 2 beneficiaries, we found that our Stage 2 outreach efforts were more successful than originally estimated. As a result, we enrolled 12,954 beneficiaries into the study. The increase in burden shown in A.12 above is due to the overall increase in Stage 2 enrollment. While we are reporting a larger number of total burden hours for the Stage 2 survey efforts (participation agreement, baseline, interim, and follow-up surveys), the increase reflects only the larger total sample size. There is no change to the burden estimates for an individual respondent.
A.16 Plans for Publication Information Collection Results
The evaluation contractor, Abt Associates Inc., and its subcontractors will analyze, tabulate, and report the data collected for the BOND evaluation to SSA.
A.16.1 Time Schedule for Analysis and Reporting
The expected period of survey data collection from BOND participants is from early 2011 to late 2015—beginning with the baseline survey and ending with the final follow-up interviews.6 The contractor will clean the survey data set on a rolling basis. We will carry out the analysis of these data in the following months, with production of major reports from the outcome surveys in March 2014 (Stage 2 Interim Survey), December 2015 (Stage 1 36-month survey), and June 2016 (Stage 2 36-month survey), and a final report on the BOND evaluation completed in October 2017. The survey data collection and reporting schedule breaks down as follows:
Data Collection: 5 years beginning in early 2011 through December 2015
Data Analysis: 6 years beginning in early 2012 through June 2017
Final Report: October 2017
The remainder of this section describes the basic analytic framework for the evaluation.
A.16.2 Analytic Techniques, Tabulations, and Reporting
Researchers will estimate the demonstration impacts by comparing the mean outcomes of all sample members assigned to a given treatment (or combination of treatments) with the mean outcomes of all sample members assigned to the appropriate current-law control group(s). This approach yields an estimate of the “intent to treat” (ITT) impact: the average impact on the entire group randomly assigned to a treatment, whether or not its members availed themselves of the treatment. To improve precision, we will complement these simple differences with regressions that control for observed characteristics at baseline, should those by chance differ between the treatment and control groups. To obtain the impact on those who actually make use of the enhanced work incentives counseling, or who work above SGA and benefit from the offset (the groups of most interest for policy), we will adjust the ITT estimate using the adjustment developed by Bloom (1984).7 This produces the “treatment on the treated” (TOT) impact: the average impact on members of the treatment group who actually use the treatment.8 Both ITT and TOT comparisons are appropriate in some analyses,9 such as testing the impact of enhanced work incentives counseling on benefits or earnings, or testing the impact of the benefit offset on benefits. In other cases, such as impacts of the benefit offset on earnings, only ITT comparisons are appropriate, because the assumption of no effect on those whose earnings never trigger the offset is not plausible: other beneficiaries in the offset treatment group could be working and earning more because of the demonstration.
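As an illustration of these estimators (a minimal sketch on simulated data; the variable names, sample size, usage rate, and effect size below are hypothetical and are not drawn from BOND records), the following shows the simple ITT difference in means, a regression-adjusted ITT, and the Bloom (1984) TOT adjustment:

# Illustrative sketch only: ITT, regression-adjusted ITT, and the Bloom (1984)
# "treatment on the treated" adjustment, on simulated (hypothetical) data.
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
treat = rng.integers(0, 2, n)                  # 1 = assigned to the treatment group
baseline_earn = rng.normal(10_000, 3_000, n)   # hypothetical baseline covariate
used = treat * rng.binomial(1, 0.3, n)         # only some treatment group members "use" the treatment
true_effect = 1_500                            # hypothetical effect on users only
outcome = baseline_earn + true_effect * used + rng.normal(0, 2_000, n)

# Intent-to-treat: simple difference in mean outcomes by assigned group
itt = outcome[treat == 1].mean() - outcome[treat == 0].mean()

# Regression-adjusted ITT: control for the baseline covariate to improve precision
X = np.column_stack([np.ones(n), treat, baseline_earn])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
itt_adj = beta[1]

# Bloom adjustment: divide the ITT estimate by the treatment-group usage rate,
# assuming assignment has no effect on non-users
usage_rate = used[treat == 1].mean()
tot = itt_adj / usage_rate

print(f"ITT: {itt:,.0f}   adjusted ITT: {itt_adj:,.0f}   TOT: {tot:,.0f}")

Because the simulated effect operates only on the roughly 30 percent of the treatment group coded as users, the ITT estimate is diluted accordingly, while the Bloom adjustment recovers an estimate near the per-user effect; this is the logic behind applying the adjustment only where the no-effect-on-non-users assumption is plausible.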
In each of the project deliverables that present the demonstration impacts, we will make use of standardized table shells for presenting the impact results. Section B2.2.2 includes samples of these table shells.
A.17 Displaying the OMB Approval Expiration Date
All data collection instruments created for the BOND evaluation prominently display the OMB approval number and expiration date. SSA is not requesting an exception to this requirement.
A.18 Exceptions to Certification Statement
SSA is not requesting an exception to the certification requirements at 5 CFR 1320.9 and related provisions at 5 CFR 1320.8(b)(3).
1 Benefits counseling services are currently available to SSDI beneficiaries through SSA’s Work Incentives Planning and Assistance (WIPA) providers.
2 P.L. 106–170, Section 302.
3 P.L. 106–170, Section 302.
4 The Evaluation Analysis Plan (Deliverable 16.1), due to SSA on July 15, 2010, will provide details about weighting.
5 Those who travel to the BOND site office receive an additional $10 to compensate for travel and/or childcare expenses.
6 This schedule is subject to revision depending upon SSA's schedule for implementing the demonstration.
7 Bloom, Howard S. 1984. "Accounting for No-Shows in Experimental Evaluation Designs," Evaluation Review 8 (April): 225-246.
8 As explained in Bloom (1984), calculation of the TOT estimate requires the assumption that assignment to the treatment group has no effect on beneficiaries who do not use the treatment (i.e., do not participate in enhanced work incentives counseling or work above SGA).
9 A third impact concept—average impact on the “per protocol” population—was considered and rejected because its estimation requires too great a departure from the experimental design of the demonstration. The “per protocol” population consists of correctly randomized treatment group members, who received the full correct treatment, and completed the study as designed in terms of providing all desired outcome data. This subpopulation could be identified in the treatment group, but (since its definition hinges on receipt of the full treatment) not in the control group. Moreover, the broader subpopulation of all treatment “users” seems most relevant to policy, since different kinds of full and partial “users” are bound to arise should we roll out the BOND interventions as national policy.