The Evaluation of the Individual Training Account Experiment: OMB Supporting Statement, Part A

Irma Perez-Johnson
Patricia Nemeth
Kenneth Fortson
Quinn Moore

Submitted to:
U.S. Department of Labor
Employment and Training Administration
200 Constitution Ave., NW, Room N-5637
Washington, DC 20210
Project Officer:

Submitted by:
Mathematica Policy Research, Inc.
P.O. Box 2393
Princeton, NJ 08543-2393
Telephone: (609) 799-3535
Facsimile: (609) 799-0005
Project Director:
CONTENTS

A. Justification
1. Circumstances Necessitating the Data Collection
2. How, by Whom, and for What Purpose the Information Is to Be Used
3. Use of Improved Technology to Reduce Burden
4. Efforts to Identify Duplication
5. Methods to Minimize Burden on Small Businesses or Entities
6. Consequences of Not Collecting the Data
7. Special Data Collection Circumstances
8. Federal Register Notice
9. Respondent Payments
10. Confidentiality
11. Questions of a Sensitive Nature
12. Hour Burden of the Collection of Information
13. Estimated Total Annual Cost Burden to Respondents and Record Keepers
14. Estimated Annualized Cost to the Federal Government
15. Changes in Burden
16. Tabulations, Publication Plans, and Project Schedule
17. Reasons for Not Displaying Expiration Date of OMB Approval
18. Exception to the Certification Statement
REFERENCES
A. Justification

This request for Office of Management and Budget (OMB) clearance seeks approval for a second participant follow-up survey to be conducted as part of the Extension of the Evaluation of the Individual Training Account (ITA) Experiment. This request is a modification of an OMB-approved data collection effort conducted between November 2003 and June 2005 (OMB approval number 1205-0441) for the Evaluation of the ITA Experiment. The experiment and evaluation were conducted between June 1999 and September 2006 by the U.S. Department of Labor (DOL) and implemented, under contract to DOL (contract number N-7731-9-00-87-30), by Mathematica Policy Research, Inc. (MPR) and its subcontractors—Social Policy Research Associates (SPR) and Decision Information Resources, Inc. (DIR). DOL is conducting an extension of the evaluation (hereafter referred to as ITA2), which is also being implemented by MPR under contract to DOL. The objective of the ITA2 study is to evaluate the longer-term impacts and cost-effectiveness of the three approaches originally tested in the ITA Experiment.
1. Circumstances Necessitating the Data Collection

The Workforce Investment Act (WIA) of 1998 brought about substantial changes in the way training and other employment services are provided to DOL customers. WIA required workforce investment areas to establish Individual Training Accounts (ITAs), which provide vouchers or other related funding methods that customers can use to pay for training. ITAs are intended to empower customers to choose the training services they need and to raise the accountability of states, local areas, and service providers for meeting these needs.
a. The Experiment
Under the authority granted the Employment and Training Administration (ETA) in Section 171 of the Workforce Investment Act, the ITA Experiment tested different approaches for managing customer choice in the administration of ITAs. States and local offices have a great deal of flexibility in deciding how much guidance to provide to customers in choosing WIA-funded training. The experiment tested three approaches that differed widely in both the resources available to customers and the involvement of local counselors in guiding customer choice. The three approaches ranged from a highly structured approach, in which customers were steered to the highest-return training options, to a true voucher approach, in which customers were offered a lump sum and allowed to choose any state-approved training.
As Table 1 shows, the three ITA approaches varied along three dimensions related to the management of customer choice: (1) the type of counseling provided and whether it was mandatory or voluntary, (2) the ability of local counselors to reject the choices of customers, and (3) the method used to control each customer’s ITA spending.
TABLE 1. APPROACHES TESTED IN THE EXPERIMENT

 | Approach 1: Structured | Approach 2: Guided | Approach 3: Maximum
Counseling | Mandatory, most intensive | Mandatory, moderate intensity | Voluntary
Can Counselors Reject Choices? | Yes | No | No
Award Amount | Customized | Fixed | Fixed
Approach 1: Structured Customer Choice was the most directive of the three approaches. Customers participated in a series of mandatory assessment and counseling sessions designed to identify promising training opportunities. During these sessions, customers were guided through the estimation of the benefits and costs of alternative training options and directed toward options expected to yield a high return—that is, programs expected to generate earnings on a new job that would be high relative to the resources invested in training. Local counselors were given the authority to disapprove training choices inconsistent with this high-return emphasis. Once appropriate training was chosen, customers received a customized ITA to fully cover the costs of training.
Approach 2: Guided Customer Choice was designed to represent broadly the approach that most local agencies adopted in their transition to WIA. As in Approach 1, customers were required to participate in structured counseling activities, but the activities were less intensive under Approach 2 and not specifically focused on the return to the training investment. Once customers had completed the required counseling, they were free to choose any training program from the state Eligible Training Provider (ETP) list—counselors could not reject their choices. Although customers could choose any training program, they received a fixed ITA award, which limited the ITA resources they could spend on training. Customers could use funds from other sources to supplement their ITAs if they wanted to pursue a training program that cost more than the fixed ITA award.
Approach 3: Maximum Customer Choice was the least structured of the approaches. As in Approach 2, all Approach 3 customers received the same fixed ITA amount and had final authority to choose their own training provider from the ETP list. Unlike Approach 2 customers, however, Approach 3 customers were not required to participate in any counseling activities prior to pursuing the training of their choice, but could request counselor assistance if they felt they needed it.
The three ITA approaches were evaluated through an experiment that randomly assigned new customers to one of the three ITA approaches. Random assignment ensured that each customer had the same probability of being placed in each group, so that customers assigned to the three ITA approaches had the same characteristics, on average. Differences in outcomes between the groups could therefore be interpreted as resulting from differences in the ITA approaches, with a known degree of statistical precision. For example, the difference in average earnings between Approaches 1 and 2 represents the effect of Approach 1 on earnings relative to Approach 2.
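To make the estimation concrete, the short Python sketch below computes the relative impact of Approach 1 versus Approach 2 as a difference in mean earnings, with a conventional standard error. All values are hypothetical placeholders, not study data:

```python
import numpy as np
from scipy import stats

# Hypothetical quarterly earnings for customers randomly assigned to
# Approach 1 and Approach 2 (illustrative values only, not study data).
rng = np.random.default_rng(seed=0)
earnings_a1 = rng.normal(loc=5200, scale=1800, size=500)
earnings_a2 = rng.normal(loc=5000, scale=1800, size=500)

# Under random assignment, the difference in group means is an unbiased
# estimate of the impact of Approach 1 relative to Approach 2.
impact = earnings_a1.mean() - earnings_a2.mean()

# A two-sample t-test supplies the standard error and significance level.
t_stat, p_value = stats.ttest_ind(earnings_a1, earnings_a2)
std_error = impact / t_stat

print(f"Relative impact: ${impact:,.0f} (SE ${std_error:,.0f}, p = {p_value:.3f})")
```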
Eight workforce development agencies were selected by DOL, through a competitive process, to participate in the evaluation of the ITA Experiment. Although these agencies were purposively selected, they offered a mix of program settings in urban, suburban, and rural areas throughout the country. One of the ITA study sites, the Workforce Board of Northern Cook County in Des Plaines, Illinois, was selected to be the pilot site; it began sample intake procedures, including random assignment, in 2001. The other study sites—the Human Services Departments of the City of Phoenix and Maricopa County, in Arizona; the Atlanta Regional Commission and the Northeast Georgia Regional Development Center, in Georgia; The Workplace Inc. in Bridgeport, Connecticut; the Charlotte-Mecklenburg Workforce Development Board, in North Carolina; and First Coast Development, Inc. in Jacksonville, Florida—began sample intake procedures in 2002.
b. The Evaluation
The evaluation of the ITA Experiment examined the relative impacts of the three ITA approaches on four types of outcomes:
Participation in training and related services, including receipt of training, receipt of counseling and other services, and receipt of support services (child care and transportation)
Customer satisfaction, including satisfaction with training and satisfaction with other services
Employment-related outcomes, including employment by quarter, earnings by quarter, and characteristics of jobs (wage rates and fringe benefits)
Dependence on public assistance, including unemployment insurance, cash welfare benefits, and Food Stamps
As noted, the ITA Experiment used a classical random assignment design to estimate the relative impacts of the three ITA approaches. Individuals found eligible for training during the experiment’s intake period in the six grantee sites were randomly assigned to one of the three approaches. Random assignment ensured that differences in mean outcomes between treatments provided unbiased estimates of the net impacts of the different approaches. Based on the estimates of the relative impacts of the three ITA approaches, the evaluation also included an analysis of the relative returns on investment (ROI) for each of the approaches. The objective of the ROI analysis was to assess whether, relative to less expensive models, more expensive ITA models provided additional benefits that were large enough to justify the additional costs.
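The logic of this incremental comparison can be illustrated with a brief sketch (all values below are hypothetical placeholders; the actual ROI analysis used the study's estimated impacts and measured costs):

```python
# Hypothetical per-customer benefits (e.g., earnings gains) and program
# costs for a more expensive approach (1) and a less expensive one (2).
benefits = {"approach_1": 6500.0, "approach_2": 6000.0}
costs = {"approach_1": 4200.0, "approach_2": 3500.0}

incremental_benefit = benefits["approach_1"] - benefits["approach_2"]
incremental_cost = costs["approach_1"] - costs["approach_2"]

# The more expensive approach justifies its added cost only if the extra
# benefits it produces exceed the extra resources it consumes.
ratio = incremental_benefit / incremental_cost
print(f"Incremental benefit-cost ratio: {ratio:.2f}")  # > 1 favors approach 1
```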
Importantly, the interpretation of findings from this type of experiment depends on the implementation and reproducibility of the treatments tested. The evaluation of the ITA Experiment included an implementation study that examined the feasibility of the three ITA approaches and the challenges that emerged in their implementation. Based on implementation findings, the evaluation concluded that Approaches 2 and 3 had been implemented as designed and, thus, were broadly feasible (McConnell et al., 2006). Approach 1, however, was not implemented as designed (ibid.). Counselors in all study sites proved reluctant to be directive in their interactions with Approach 1 customers. They also failed to steer Approach 1 customers to high-return training and instead tended to defer to customer preferences. Lastly, counselors rarely denied training to Approach 1 customers and thus failed to constrain ITA expenditures under this approach. With its higher cap on ITA awards and more intensive counseling requirements, Approach 1 (as implemented) was still reproducible and distinct from the other two approaches tested in the Experiment. The evaluation interpreted findings on the relative impacts and cost-effectiveness of Approach 1 as reflecting how this approach was actually implemented.
Findings from the original evaluation of the ITA Experiment also suggested that a longer-term follow-up was necessary in order to reach more definitive conclusions regarding the impacts and cost-effectiveness of the ITA approaches. The original follow-up participant survey allowed examination of employment outcomes for 15 months following random assignment. At that time, a substantial number of ITA study participants—17 percent of Approach 1 customers and 14 percent of Approach 2 and 3 customers—were still in training, so that the ultimate effects of the ITA approaches had not yet been completely realized. DOL commissioned MPR to conduct an extended evaluation of the ITA Experiment. The ITA2 evaluation will examine the longer-term outcomes of ITA study participants, with updated unemployment insurance (UI) wage records and with a second follow-up survey—for which we are requesting clearance—to be administered between five and seven years after random assignment.
c. Data Needs and Sources
Data items used in the initial evaluation of the ITA Experiment to measure outcomes and to provide background information on sample members are listed in Table 2, together with the sources of these data. The four data sources used in the initial evaluation were as follows:
Program MIS data from the six sites provided information on training participation and use of other services obtained through the WIA system. This source also provided data on the main demographic and other baseline characteristics of study participants.
Unemployment Insurance (UI) wage records were collected to obtain a 27-month history of employment and earnings—12 months prior to random assignment and 15 months after random assignment. UI wage records provided important information to assess the relative impacts of the different ITA approaches. Furthermore, because higher economic output is the primary benefit of training, earnings data from UI wage records provided important information for the ROI analysis.
Unemployment Insurance (UI) program benefits data were collected to create a 15-month history of participation and benefit receipt in the UI program. We attempted to collect these data from the six states in which the ITA study grantees were located. These data helped assess the number of weeks that ITA customers received UI benefits and were essential to the ROI analysis.
TABLE 2. INDIVIDUAL-LEVEL DATA ITEMS AND SOURCES

Data Item | Data Source
Baseline Characteristics |
  Identifying and Contact Information |
    Sample member (name, address, telephone number) | MIS
    Additional contacts (name, address, telephone number) | MIS
  Demographics |
    Age | MIS
    Gender | MIS
    Race/ethnicity | MIS
    Marital status | MIS, I
    Number of children | MIS, I
    Household size | MIS, I
  Prior Experience |
    Education (highest grade, highest degree) | MIS
    Characteristics of last job (wage, benefits, hours, industry, occupation, duration) | MIS, I
    Number of years worked | MIS
    Quarterly earnings prior to random assignment | WR
    Reason for job loss | I
Employment and Training Services and Experiences |
  Receipt of Reemployment Services |
    Assessment and service planning | MIS, I
    Job search assistance and training | MIS, I
    Job counseling | MIS, I
    Timing of service delivery | MIS
  Receipt of Education and Training |
    Basic-skills training | MIS, I
    Occupational classroom training | MIS, I
    On-the-job training (duration, service dates, costs, type/occupation, provider, whether completed) | MIS, I
  Receipt of Support Services |
    Child care | MIS, I
    Transportation | MIS, I
    Other | MIS, I
  Satisfaction with Services and Training | I
Income |
  Unemployment insurance | I or UI
  TANF/food stamps | I
  Spouse's earnings | I
  Other income sources | I
Program Outcomes |
  Employment status, by quarter after baseline | WR, I
  Quarterly earnings, by quarter after baseline | WR, I
  Proportion of follow-up period employed | WR, I
  Number of jobs held | WR, I
  Characteristics of postprogram job (wage, benefits, hours, industry, occupation) | I
  Job search activities | I

Notes: MIS = program management information systems; I = 15-month follow-up interview; WR = UI wage records; UI = unemployment insurance benefits records; TANF = Temporary Assistance for Needy Families.
The fourth data source, the participant follow-up survey, occurred approximately 15 months after random assignment and collected information on a variety of outcomes for people randomly assigned to one of the three ITA approaches. The survey provided more detailed information on employment outcomes than UI wage records, including information on wage rates and fringe benefits, and information on all jobs, not just those covered by the wage record system.1 The survey also provided detailed information on household composition and other demographic characteristics. The follow-up survey was the only source for data on perceptions of and attitudes toward each ITA approach, including the level of customer choice, job search behavior after random assignment, characteristics of post-training jobs, and participation in government programs other than UI. It also provided data on training and other services received outside the WIA system. The survey was conducted by telephone using computer-assisted telephone interviewing (CATI) techniques.
The ITA2 study will conduct a second follow-up survey of participants in the ITA Experiment and collect additional UI wage records. These data will make it possible to examine a more extensive employment history for each ITA study participant and to update the experimental estimates of net impacts and return-on-investment for the three ITA approaches.
The second participant follow-up survey includes only minor modifications to the first follow-up survey. As with the first follow-up survey, the ITA2 survey will collect critical information on the training experiences and employment and earnings of ITA study participants that can only be obtained using survey data.
2. How, by Whom, and for What Purpose the Information Is to Be Used

To determine the relative long-term impacts of the different ITA approaches on training experiences and labor market outcomes, MPR will use updated data from state administrative records and data from a second follow-up survey. These data will make it possible to compare the outcomes of the three ITA approaches and evaluate their cost-effectiveness five to seven years after random assignment. The comparisons will be based on the experiences and outcomes of ITA customers, such as participation in education and training, employment and earnings, and participation in government support programs, and will yield estimates of the relative long-term impacts of the different ITA approaches on key outcomes.
To compare the three ITA approaches, MPR will use the administrative and survey data to compute summary statistics, such as means and proportions, separately for customers assigned to each ITA approach. For example, MPR will compute the percentage of ITA customers served by each approach that received training, whether funded with an ITA or some other source. This percentage will then be compared across approaches to determine whether the different approaches vary in the proportion of customers who participate in training.
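As a minimal illustration of this type of tabulation, the sketch below groups a hypothetical analysis file by approach (the column names are illustrative, not the actual file layout):

```python
import pandas as pd

# Hypothetical analysis file: one row per ITA customer.
df = pd.DataFrame({
    "approach": [1, 1, 2, 2, 3, 3],
    "received_training": [1, 0, 1, 1, 0, 1],  # any training, ITA-funded or not
    "earnings_q8": [5200, 0, 4800, 5100, 3900, 4500],
})

# Summary statistics by approach: the share of customers who received
# training and their average earnings in a given follow-up quarter.
summary = df.groupby("approach").agg(
    pct_trained=("received_training", "mean"),
    mean_earnings=("earnings_q8", "mean"),
)
print(summary)
```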
Notably, as in the original study, we plan to estimate the impacts of the ITA approaches on employment and earnings using both data from the ITA2 follow-up survey and the UI wage records. The advantages of using the UI wage records remain that they are available for the entire ITA study sample and are not subject to recall error (McConnell et al., 2006). Nevertheless, we still consider the administrative UI wage records to be less complete than the survey data, for several reasons. As noted, the administrative records do not cover all jobs. For instance, they exclude federal workers, military staff, self-employed people, railroad employees, workers in service for relatives, most agricultural labor, some domestic service workers, part-time employees of non-profit organizations, insurance and real estate agents on commission, and workers performing "casual labor" (U.S. Department of Labor, 2004). They also exclude workers whose employers (illegally) fail to report their earnings to the UI agency. Finally, because we will collect data only from the UI agencies in the six states where the ITA Experiment was conducted, the administrative records exclude earnings from UI-covered jobs held out of state, including jobs held by participants who moved to a different state during the follow-up period.
Hence, we will examine the robustness of survey-based findings by also estimating impacts on employment and earnings using the UI quarterly earnings records. We also plan to use the survey-based estimates of the impacts of the ITA approaches on employment and earnings as our benchmark estimates of benefits from increased earnings for the return-on-investment analysis.
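As an illustration of how quarterly UI wage records are typically prepared for such an analysis, the sketch below (using a hypothetical record layout; actual state file formats differ) sums earnings across employers within each person-quarter and expresses each quarter relative to the quarter of random assignment:

```python
import pandas as pd

# Hypothetical UI wage records: one row per person, employer, and quarter.
wages = pd.DataFrame({
    "person_id": [1, 1, 1, 2, 2],
    "quarter": pd.PeriodIndex(["2002Q1", "2002Q1", "2002Q2", "2002Q1", "2002Q3"], freq="Q"),
    "earnings": [4000.0, 1500.0, 5200.0, 3000.0, 3500.0],
})

# Hypothetical random assignment quarters drawn from the program MIS.
ra = pd.DataFrame({
    "person_id": [1, 2],
    "ra_quarter": pd.PeriodIndex(["2002Q1", "2002Q2"], freq="Q"),
})

# Sum earnings across employers within each person-quarter, then express
# each quarter relative to random assignment (0 = quarter of assignment).
history = wages.groupby(["person_id", "quarter"], as_index=False)["earnings"].sum()
history = history.merge(ra, on="person_id")
history["rel_quarter"] = (history["quarter"] - history["ra_quarter"]).apply(lambda d: d.n)
print(history)
```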
Based on evaluation findings, DOL can advise local workforce boards on possible modifications to their ITA programs. The goal of the ITA2 evaluation is to determine the relative long-term impacts of different approaches to administering ITAs. The updated data collected from states and through the second participant follow-up survey will provide critical information to make those assessments. The planned data collection efforts are therefore essential to evaluating the different ITA approaches tested in the Experiment.
3. Use of Improved Technology to Reduce Burden

Computer-assisted telephone interviewing (CATI) will be the primary method of data collection for this survey. The CATI program from the first follow-up survey will form the basis for ITA2 but will be slightly revised and updated with current reference periods. CATI was selected because telephone interviews are more cost-effective and impose less burden on respondents than in-person interviews, given the flexibility they allow for scheduling interview times.
CATI is also more cost-effective and less burdensome on respondents than paper-and-pencil interviewing because CATI programs accept only valid responses and can be programmed to check for logical consistency across answers. Interviewers are thus able to correct errors during the interview, eliminating the need to call back respondents to obtain missing data. To minimize burden for sample members who responded to the first follow-up survey, the CATI program will be preloaded with useful information from that survey, such as the respondent's participation in training programs and job history, which should aid recall and ensure that only new information is collected. Also, dialing errors will be virtually eliminated by making calls through an auto-dialer linked to the CATI system. The automated call scheduler will simplify scheduling and rescheduling of calls to respondents at their convenience and can assign cases to specific interviewers, for example, those who are fluent in Spanish.
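The kind of range and consistency checking described above can be sketched as follows (the items and rules are hypothetical examples, not the actual ITA2 instrument logic):

```python
def validate_response(item, value, answers):
    """Return an error message if a response fails a range or consistency
    check, or None if it is acceptable. Rules are illustrative only."""
    if item == "hours_per_week" and not (0 <= value <= 99):
        # Range check: accept only plausible weekly work hours.
        return "Hours per week must be between 0 and 99."
    if item == "end_date" and "start_date" in answers and value < answers["start_date"]:
        # Logical consistency: a job cannot end before it starts.
        return "Job end date is earlier than the recorded start date."
    return None

# The interviewer keys an out-of-range value and is prompted to correct
# it during the interview, avoiding a callback to the respondent.
answers = {"start_date": "2008-03-01"}
print(validate_response("hours_per_week", 120, answers))
print(validate_response("end_date", "2008-01-15", answers))
```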
Sample members who are difficult to find will be located through the efforts of field staff. Field staff will typically not conduct interviews. Instead, they will facilitate the completion of interviews by having sample members call MPR's telephone center using their own telephones or cell phones provided by MPR. These calls will be made to a toll-free number with the field staff member present, and responses will be entered directly into the CATI system.
For a small number of cases, interviews will be conducted in person using hard copy instruments. Some respondents will not have access to telephones and may resist using MPR-provided cell phones to complete the interview. In other cases, phone connections may be problematic, making it more expedient to complete the survey in person using paper and pencil. Having hard copy instruments on hand will enable MPR interviewers to accommodate respondents for whom completing the survey on paper is preferable or more convenient.
4. Efforts to Identify Duplication

This survey will be conducted to collect key long-term employment and earnings data about ITA customers beyond what is available in administrative records. No other survey data collection effort has been conducted as an extension of the evaluation of the ITA Experiment or has been planned to collect similar information.
5. Methods to Minimize Burden on Small Businesses or Entities

No small businesses or entities will be interviewed for this survey.
6. Consequences of Not Collecting the Data

Data will be collected from study participants only once. The survey will provide the only source of long-term data for ITA customers on the following outcomes:
Participation in education and training programs
Job search behavior after random assignment
Characteristics of post-training jobs
Participation in government programs, including UI
Therefore, if the ITA2 follow-up survey were not conducted, the evaluation would be unable to assess the impacts of the different ITA approaches on these outcomes, and the cost-effectiveness of the approaches, in the long-term.
7. Special Data Collection Circumstances

No special circumstances apply to this data collection. In all respects, the data will be collected in a manner consistent with federal guidelines. The sample-based survey will produce valid and reliable results that can be generalized to the universe of participants in the ITA study, and it will include only statistical data classifications that have been reviewed and approved by OMB. It will include a pledge of confidentiality that is supported by authority established in statute or regulation and by disclosure and data security policies that are consistent with the pledge. It will not unnecessarily impede sharing of data with other agencies for compatible confidential use.
8. Federal Register Notice

a. Federal Register Notice and Comments
In accordance with the Paperwork Reduction Act of 1995, the public was given an opportunity to review and comment through the 60-day Federal Register Notice, which was published on July 22, 2008 (FR, Vol. 73, No. 141, pp. 42597-42598). No comments were received from the public.
b. Consultations Outside of the Agency
The following individuals were consulted in developing the design, the data collection plan, and the questionnaire for the initial evaluation of the ITA Experiment and/or the changes to the original questionnaire for the ITA2 evaluation.
Name | Affiliation | Telephone Number
Dr. Irma Perez-Johnson | Mathematica Policy Research | (609) 275-2339
Dr. Kenneth Fortson | Mathematica Policy Research | (312) 867-0496
Ms. Pat Nemeth | Mathematica Policy Research | (609) 275-2294
Dr. Sheena McConnell | Mathematica Policy Research | (202) 484-4518
Dr. Paul Decker | Mathematica Policy Research | (609) 275-2290
Dr. Dan Kasprzyk | Mathematica Policy Research | (202) 264-3482
Dr. John Eltinge | Bureau of Labor Statistics | (202) 691-7404
Dr. Ralph Smith | Congressional Budget Office | (202) 225-3149
9. Respondent Payments

Incentives are among several methods—including locating sample members, refusal conversion, follow-up phone calls, and advance letters—used to help achieve high survey response rates. Offering incentives can increase cooperation among sample members and thus raise response rates. High response rates, in turn, help achieve sample representativeness, which is critical to high data quality—that is, data that are complete, valid, reliable, and unbiased. Given the importance of the ITA2 study for DOL, the second ITA participant follow-up survey must be held to the highest standards on these criteria, and offering incentives can help achieve that goal.
An incentive payment of $25 will be offered to respondents who complete an interview as part of this data collection. Such incentives were used successfully with eight pretest respondents interviewed by telephone from June 24 to June 26, 2008. Notably, in the course of completing the pretest interviews, only one sample member actively refused to be interviewed, even though the pretest sample included both respondents and non-respondents to the initial ITA survey. This high level of cooperation may be attributable in part to the use of incentives.
Several key features of any survey affect response rates, including characteristics of the respondents, characteristics of the survey sponsor, and the design of the survey itself. While characteristics of the respondents and the survey sponsor have not changed substantially between the first and second ITA follow-up surveys, several features of the survey design have changed significantly:
The timing of the survey in relation to when the most recent contact information was obtained. For ITA2, this time lapse is significantly longer.
The salience of the survey topic to ITA participants. Many ITA participants may have participated in a training program five years ago, close to the time when they were contacted for the first ITA follow-up survey. The connection between that training and the ITA2 follow-up survey is now more tenuous.
The relationship of ITA participants to the survey sponsor. ITA participants were offered an ITA voucher around the time when the first ITA follow-up survey was conducted, making the study highly relevant and the request for survey participation at that time more salient. For ITA2, there is no longer a clear reciprocal relationship between these ITA participants and DOL.
In general, the telephone survey environment has become more difficult and costly (Curtin et al. 2005).
An incentive payment may help compensate for the challenges presented by these changes in the ITA2 survey context and significantly enhance both contact and cooperation rates.
Importantly, declining response rates in telephone surveys (Curtin et al. 2005) and the concomitant rise in the effort and costs associated with achieving high response rates have made the use of incentives a more common survey practice. Substantial evidence on the benefits of offering incentives has become available. Incentives can help achieve high response rates by increasing sample members' propensity to respond (Singer, Van Hoewyk, and Maher 2000). Studies offering incentives show decreased refusal rates and increased contact and cooperation rates. Among sample members who initially refuse to participate, incentives increase refusal-conversion rates. By increasing sample members' propensity to respond, incentive payments have been found to significantly reduce the number of calls required to resolve a case and the number of interim refusals. Incentive payments can thus help contain costs, passing some of the cost of conducting the survey to participants as a direct benefit rather than spending it on additional survey operations.
Lastly, while incentives help gain cooperation and increase the overall response rate, they also increase the likelihood of participation among subgroups with a lower propensity to cooperate with the survey request. This is an important component of ensuring the representativeness of the survey respondents and the quality of the data being collected. For example, Jäckle and Lynn (2007) find that incentives increase the participation of sample members who are more likely to be unemployed, a key outcome in the ITA2 study. There is also evidence that incentives bolster participation among those with lower interest in the survey topic (Schwartz, Goble, and English 2006; Jäckle and Lynn 2007; Kay 2001), resulting in data that are more complete. Furthermore, paying incentives does not impair the quality of the data obtained (such as item nonresponse or the distribution of responses) from groups who would otherwise be underrepresented in the survey (Singer et al. 2000).
The importance of achieving a high response rate makes offering incentives a critical addition to our intensive efforts to establish contact with prospective respondents and gain their cooperation with the planned data collection. To leverage fully the benefits of offering incentives in the ITA2 participant follow-up survey, we will mention the incentive for participation in our advance letter to the ITA study participants. Interviewers will also mention the proposed incentive when they establish contact with the participants and attempt to gain their cooperation.
Appropriateness of the Incentive Protocol. The planned incentive amount is on par with studies using similar methodology and populations with similar characteristics. For example, Mack et al. (1998) found that a $20 incentive offered for participation in the U.S. Survey of Income and Program Participation (SIPP) reduced household, person, and item (gross wages) nonresponse rates. Furthermore, they found the $20 incentive was particularly effective for recruiting African American households and households in poverty, while a $10 incentive did not significantly reduce nonresponse. Similarly, in a federal study of low-income participants receiving means-tested benefits, Robbins et al. (2003) offered a $35 incentive. Thus, the $25 incentive proposed for the ITA2 study is within the range of incentive amounts offered in federal surveys of populations that share characteristics with the participants in the ITA study.
While numerous studies have compared the effectiveness of prepaid and postpaid incentives in mail surveys, split-ballot comparisons of prepaid and postpaid incentives in other survey modes (especially telephone surveys) are less common, and those conducted in telephone and in-person surveys have yielded inconsistent results. For example, Singer, Van Hoewyk, and Maher (2000) report on experiments conducted on the Survey of Consumer Attitudes (SCA), a random-digit-dial (RDD) survey, and find that a prepaid incentive (that is, one included with an advance letter) led to higher respondent cooperation. However, in RDD surveys such as the SCA, sample members are unfamiliar with the study, and response rates typically are lower than in list-sample surveys. In other words, prepaid incentive effects are likely to be larger in RDD surveys than in surveys where the sample population has an a priori relationship to the study or the survey organization.
In a longitudinal telephone survey, Robbins et al. (2003) compared a protocol that prepaid $10 to all sample members and postpaid $25 to respondents with a protocol that uniformly postpaid an incentive of $35. They found that neither the response rate nor the amount of calling required to complete the study differed between the two protocols. This suggests that prepaid incentives would add cost and administrative complexity without offering any advantage in the ITA2 survey. The Robbins et al. study is particularly relevant because it shared many characteristics with the ITA2 evaluation. With respect to subject matter, it collected data to evaluate the Individual Development Account (IDA) program offered through the U.S. Department of Health and Human Services; the survey instrument measured IDA-related program services received and outcomes related to employment status, earned income, educational attainment, and receipt of major means-tested benefits, among others. The sample consisted of IDA account holders; like the ITA study (and unlike an RDD study), the sample population had an a priori relationship to the study and the organization sponsoring the survey. In its first wave of interviewing, the IDA study achieved a response rate of 83 percent, similar to that of the first ITA follow-up survey.
Lastly, MPR has recently concluded a rigorous test of the effect of prepaid and postpaid incentive amounts in the survey for the Impact Evaluation of the Trade Adjustment Assistance (TAA) Program for the U.S. Department of Labor, Employment and Training Administration. The results of this test are not yet available. A second incentive experiment for the same project has also begun: non-respondents and new sample members are offered one of three incentive amounts ($25, $50, or $75) in a split-ballot experiment to evaluate the impacts of the incentive protocol. Developing and implementing similar procedures for an incentive study in the ITA2 project would delay the survey timeline. Because this is a longitudinal sample with outdated contact information, and because the timeliness of data collection is crucial to the extended evaluation, it is preferable to implement a single-incentive protocol. Rather than duplicate the TAA study efforts, we propose to wait for the results of that experiment and, when they become available, consider how they may apply to the ITA2 study and to future studies we conduct.
10. Confidentiality

MPR will follow procedures for assuring and maintaining confidentiality consistent with the provisions of the Privacy Act of 1974 (5 U.S.C. § 552a). ITA participants will receive information about confidentiality protections in an advance letter describing the survey (Appendix B) and again at the outset of the interview, as part of the interviewer's introductory comments. Participants will be informed that all information they provide will be treated confidentially. Interviewers will be trained in confidentiality procedures and will be prepared to describe them in full detail, if needed, or to answer any related questions raised by participants. For example, if asked about confidentiality, the interviewer will explain that answers will be combined with those of others and presented in summary form only, and that answers will not affect past or future eligibility for any programs.
All data items that identify respondents will be kept only by the contractor, MPR, for use in assembling records data and in conducting the interview. Any data received by the U.S. Department of Labor, Employment and Training Administration will not contain personal identifiers, thus precluding individual identification.
It is the policy of MPR to efficiently protect confidential information and data in whatever medium it exists, in accordance with applicable federal and state laws and contractual requirements. In conjunction with this policy, all MPR staff shall:
1. Comply with the MPR Confidentiality Pledge, which is signed by all full-time, part-time, and hourly MPR staff, and with the MPR Security Manual procedures to prevent the improper disclosure, use, or alteration of confidential information. Staff may be subject to disciplinary and/or civil or criminal actions for knowingly and willfully allowing the improper disclosure or unauthorized use of confidential information.
2. Access confidential and proprietary information only in the performance of assigned duties.
3. Notify their supervisor, the project director, and the MPR Security Officer if confidential information has been disclosed to an unauthorized individual, used in an improper manner, or altered in an improper manner. All attempts to contact MPR staff about any study or evaluation by individuals who are not authorized access to the confidential information will be reported immediately to both the cognizant MPR Project Director and the MPR Security Officer.
Many MPR staff members have received security clearance by the Social Security Administration (SSA) and are experienced with the stringent security requirements of collecting sensitive and personally identifying information.
a. Statistical Disclosure Limitation Methods
Tabulations in study reports. To ensure that there is no secondary data disclosure that inadvertently identifies a sample member, tabulations in the ITA2 final report will be presented by ITA approach for the full sample in the eight study sites, for the full sample by site, and for subgroups drawn from all sites. Since we do not plan to report findings for subgroups by site, the minimum cell in tabulations at the site level will be about 140 individuals—the number of expected ITA2 survey respondents assigned to one approach in the average site. This number is large enough to avoid secondary data disclosure.
Public use file. A carefully documented public use data file will be an important final product of the extended evaluation that will allow for replication, verification, and testing of analysis results in the broader research community. The public use file will include data used for all major deliverables for the evaluation of the ITA Experiment, including data from the program MIS, the administrative UI records, and the two participant follow-up surveys.
The public use data file we construct will be in compliance with all relevant federal statutes regarding personally identifiable data, particularly the Privacy Act of 1974, but also the Social Security Act and the Health Insurance Portability and Accountability Act (HIPAA), as appropriate. We will implement masking techniques and other strategies, as appropriate, to protect the privacy of the sample members and ensure that the public use file meets the confidentiality requirements of these acts.
The masking techniques we employ will likely involve three steps:
Remove All Individual Identifiers. The study's MIS includes the following information: name, date of birth, Social Security number, address, and telephone number. This information has been used to support locating efforts for the evaluation, but it will not appear in any public use file or in any research file maintained by MPR. Each sample member is assigned a unique, random identification number.
Determine Whether Sample Members May Be Identified from Plausible Combinations of Variables. We will tabulate identifiable demographic characteristics from baseline forms and follow-up surveys to determine whether any sample members could be identified by any item or combination of items. These characteristics include variables such as age, gender, race/ethnicity, household composition, and level of education. Sample members will be placed in "cells" defined by these variables. If a cell contains fewer than five individuals, the variables defining it will be flagged as potentially identifying information. In addition to these characteristics, other survey items, including continuous variables such as earnings or family income and responses to health or functional characteristics questions, may disclose the identity of a sample member. We will analyze frequency distributions of these variables to determine whether this is a problem.
Recode Identifying Variables If Sample Members Could Be Identified. If we determine that combinations of variables create categories with very few members, we will collapse those categories into larger ones. If a single variable may put a sample member's confidentiality at risk, we will recode it into categories and bottom- and top-code it (that is, collapse observations below or above a certain value into that value). In some cases, we may drop the variable from the public use file. A brief illustrative sketch of the cell-size and top-coding checks follows this list.
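A minimal pandas sketch of the cell-size check (step two) and the top-coding step (step three), with hypothetical variables and the five-person threshold named above:

```python
import pandas as pd

# Hypothetical public use file extract with identifying characteristics.
df = pd.DataFrame({
    "age_group": ["<30", "<30", "30-45", "30-45", "46+", "46+", "46+"],
    "gender": ["F", "M", "F", "F", "M", "M", "F"],
    "earnings": [12000, 98000, 45000, 31000, 250000, 52000, 47000],
})

# Step two: flag cells defined by combinations of identifying variables
# that contain fewer than five individuals.
cell_sizes = df.groupby(["age_group", "gender"]).size()
print("Potentially identifying cells:\n", cell_sizes[cell_sizes < 5])

# Step three: top-code a continuous variable by collapsing observations
# above the 95th percentile into that value.
cap = df["earnings"].quantile(0.95)
df["earnings_topcoded"] = df["earnings"].clip(upper=cap)
```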
b. Systems Security
MPR computer facilities include state-of-the-art hardware and software. The hardware and software configurations have been designed to facilitate the secure processing and management of both small- and large-scale data sets.
1) Facilities
The doors to MPR's office space and Survey Operations Center (SOC) are always locked, and all SOC staff are required to display current photo identification while on the premises. Visitors are required to sign in and out and to wear temporary identification badges while on the premises. Any network server containing confidential data must be in a controlled-access area. All authorized external access is through a server under strict password control.
2) Network
Data stored on network drives are protected using the security mechanisms available through the network operating system used on our primary network servers: Novell NetWare versions 5 through 6.5. These versions of NetWare are compliant with the C2/E2 Red Book security specifications, and NetWare is certified at the National Computer Security Center's Trusted Network Interpretation Class C2 level of security at the network level. The network is protected from unauthorized external access by a Cisco PIX firewall, which resides between our network and the communications line over which our Internet traffic flows.
Access to all network features such as software, files, printers, Internet, email, and other peripherals is controlled by userid and password. Network passwords must be a minimum of eight characters in length and must be a combination of numbers and letters. All userids, passwords, and network access privileges are revoked within one working day for departing staff and immediately for terminated staff. All staff are required to log off the network before leaving for the day.
3) Printers
Printer access is granted to all staff with a valid userid and password. The physical hard disks on which the printer queues reside are subject to the same security/crash procedures that apply to the file servers. Staff monitor the printer stations appropriately depending on the sensitivity of the printed output produced. No confidential or proprietary data or information may be directed to a printer outside of MPR’s offices.
4) Electronic Communications
Ethernet is used for internal email communications over the network. Because these communications use Novell NetWare's built-in userid and password protections and the Windows NT Challenge Handshake Authentication Protocol (CHAP), sensitive information in both email text and attachments may be safely transmitted internally. Email transfer is also encrypted when sent to or from the MPR gateway facility, which allows staff to check and send email from home. A dedicated private line supports communications between MPR offices.
c. Treatment of Data with Personal Identifying Information
All data containing personal identifying information (PII)—including Social Security number (SSN), name, home address, and home telephone number—are considered sensitive, or confidential, ITA2 data. The ITA2 project is in compliance with the company security policies described above. Listed below are additional details regarding the handling and processing of confidential ITA2 information in this evaluation.
1) Access
Confidential electronic files are stored in restricted-access network directories. Access to restricted directories is limited, on a need-to-know basis, to staff who have been assigned to and are currently working on the project. When temporarily away from their work areas, ITA2 project staff are instructed to close files and applications. Their workstations lock after a set number of minutes, and staff must enter a password at the protected screen saver to regain access.
2) Electronic Communications
Staff members have been instructed not to transmit sensitive ITA2 information as a regular file attachment to an internal email. Instead, staff are instructed to use the insert-shortcut feature in Outlook to include a shortcut to the file. This allows the recipient to open the file directly but does not give access to unauthorized individuals. Additionally, staff are instructed to avoid including sample member names or other PII in internal emails when possible, so that there is no potential for these to be viewed by others.
Emails sent outside of MPR are not automatically encrypted and therefore neither the text nor attachments are secure. Before sending an email containing sensitive information, the sender is obligated to ensure that the recipient is approved to receive such data. When files must be sent as attachments internally and outside of MPR, staff are instructed to use WinZip 9.0 (256-bit AES encryption) to password protect the file. When sending sample member name and contact information outside of MPR, this information is included in a secure attachment rather than in the text of the email.
3) ITA2 Databases
ITA2 databases containing confidential information are password protected and accessible only to staff who are currently working on the project. To access an ITA2 database, users must first log onto their workstations and then, upon starting the database, log in again at a separate login prompt. ITA2 databases will be removed and securely archived at the end of the data processing period.
4) Telephone Interviewing
Interviewers for the ITA2 evaluation work in a common supervised area when they are conducting telephone interviews. As part of the verification process in the survey, interviewers will have access to respondents' names, birth dates, and the last four digits of their Social Security numbers. The birth date and the last four digits of the Social Security number will be displayed on the computer screen only during the verification process, and the last four digits will be displayed only if the birth date provided by the respondent does not match that in our database. Interviewing staff for this project receive training that includes general SOC security and confidentiality procedures, as well as project-specific confidentiality training. This training covers the highly confidential nature of this information and instructs staff not to share it or any other PII with anyone outside the project team.
5) Locating
Locators update sample member contact information when attempts using the original contact information are unsuccessful, and they must have access to key identifying information for short periods of time. Locating staff receive training that includes general SOC security and confidentiality procedures, as well as project-specific confidentiality training. This training includes clear instructions on which data and databases can be accessed and which data are required and can be recorded.
Locators may talk to a sample member's family, relatives, or other references to obtain updated contact information. To protect the sample member, locators are given scripts specifying what they can and cannot say when using these sources. For example, they are instructed not to tell anyone that the sample member has been selected to participate in a study of unemployment insurance. Rather, they are instructed to indicate that MPR is trying to reach the sample member for an important study concerning job training and employment services.
6) Locating and Calling Contact Sheets
Project team members keep only the minimum amount of printed confidential information needed to perform assigned duties. Hard copy materials (such as locating or calling contact sheets) containing data with any individual-level identifiers (e.g., name, street address) are stored in a locked cabinet/desk when not being used. When in use, such materials are carefully monitored by a project supervisor and are not left unattended. At the conclusion of the project, a complete disposition of all remaining sample will be conducted and the contact sheets and other associated materials will be either archived or destroyed per agreement with the project director.
7) Hardcopy Printouts
Sensitive temporary work files used to create hard copy printouts are stored on local hard drives and deleted periodically. Confidential hard copy output that is no longer needed is shredded or stored securely. Test printouts of data records carrying personal identifiers that are generated during file construction are shredded.
8) Data Files
When possible, electronic files without personal identifiers are created for everyday use. Data and sample files that must contain sensitive data are stored in a restricted-access location on the network. Access to this location is granted only at the request of the project director (Irma Perez-Johnson) or the survey director (Pat Nemeth) and is limited to staff who are currently working on the project and who must have access to the full sample information to select and process the sample or to process the data files. Sensitive data that are no longer needed in the performance of the project will be magnetically erased or overwritten using Hard Disk Scrubber or equivalent software, or otherwise destroyed.
11. Questions of a Sensitive Nature

The ITA2 follow-up survey contains a minimal set of items that may be considered sensitive in nature. These questions concern individual and household income (F1-F4 in the questionnaire) and public assistance receipt (F5-F22). As described in item A.10, all participants will be assured of confidentiality at the outset of the interview. All survey responses will be held in strict confidence and reported only in aggregate, summary form, eliminating the possibility of individual identification. MPR will comply with the requirements of the Privacy Act of 1974 in collecting all information.
All questions in the current survey, including those deemed potentially sensitive, have been pretested and used extensively in prior surveys with no evidence of harm. Questions about income and public assistance receipt are necessary to measure the economic well-being of study participants and the social rate of return to the different ITA approaches.
12. Hour Burden of the Collection of Information

The total hour burden for the ITA2 follow-up survey is 1,120 hours (3,360 expected respondents × 20 minutes each), as shown in the table below. This estimate is based on pretesting of the ITA2 survey questionnaire with eight respondents, including both respondents and non-respondents to the initial ITA survey. Pretest interviews ranged from 9 to 23 minutes and averaged 18 minutes; the shortest interview (9 minutes) was excluded from the burden calculation as an anomalous interview time that skewed the average. The remaining pretest interviews averaged 20 minutes of respondent burden each.
Reference | Total Respondents | Frequency | Average Time per Response | Burden
ITA2 follow-up survey | 3,360 | One time | 20 min. | 1,120 hours
Total | 3,360 | | | 1,120 hours
13. Estimated Total Annual Cost Burden to Respondents and Record Keepers

The estimated total burden cost of collecting this information is $16,128. This figure represents the 20 minutes needed to complete the survey multiplied by the expected number of completers (3,360, or 70 percent of the 4,800 sample members targeted for the ITA2 survey) and by an estimated average wage of $14.40 per hour.2
There will be no start-up or ongoing financial costs incurred by respondents.
14. Estimated Annualized Cost to the Federal Government

The total estimated cost to the federal government of conducting the ITA2 study is $1.9 million, which is the total contractor cost of conducting the extended evaluation. This estimate includes $1,400,900 for the design and conduct of the ITA2 follow-up survey: $30,407 for development of the ITA2 questionnaire; $16,409 for preparation of OMB clearance materials; $44,905 for interviewer training; $227,813 in sample locating costs; $919,708 in interviewing costs; $89,994 for processing, editing, and cleaning of ITA2 survey data; and $71,664 for related information services (e.g., CATI programming and database design). The table below provides additional detail on the total estimated costs for the ITA2 study.
Study Task | Estimated Cost
ITA2 Study Design | $26,282
Design and Conduct ITA2 Survey | 1,400,900
Collect UI Wage Data | 87,246
Update Experimental Impact Estimates | 122,474
Assess Feasibility of a Non-experimental Net Impact Study | 54,823
Final Report | 92,543
Brief ETA on Study Findings | 13,007
Public Use File | 74,474
Review Evidence on Self-Managed Accounts | 28,251
Total | $1,900,000
15. Changes in Burden

Because this data collection effort involves fewer participants than the original collection, it represents a decrease in the hours previously approved for this information collection.
16. Tabulations, Publication Plans, and Project Schedule

The ITA2 follow-up survey data, together with the updated wage records data, will be used to examine long-term impacts on:
Participation in training, including receipt of training, as well as the type and duration of training
Employment-related outcomes, including employment by quarter, earnings by quarter, and characteristics of jobs (wage rates and fringe benefits)
Dependence on public assistance, including unemployment insurance, cash welfare benefits, and food stamps.
Additional details on our approach to the estimation of overall impacts, impacts by subgroup, and standard errors are provided under B.2.
The project schedule and publication plan are provided in the table that follows:
Task | Schedule
Administer ITA2 follow-up survey | March 2009 to July 2009
Collect updated UI wage records | March 2008 to January 2009
Update experimental impact and cost-benefit analysis (final impact report) | February 2009 to December 2009
Brief ETA staff on updated experimental study findings | January 2010
Create public use data file | December 2009 to January 2010
17. Reasons for Not Displaying Expiration Date of OMB Approval

The expiration date will be displayed on the advance letter and on the hard copy version of the questionnaire.
18. Exception to the Certification Statement

There are no exceptions taken to item 19 of OMB Form 83-1.
REFERENCES

Brogan, Donna. "Software for Sample Survey Data, Misuse of Standard Packages." In Encyclopedia of Biostatistics, Vol. 5 (P. Armitage and T. Colton, eds.). New York: Wiley, pp. 4167-4174, 1998.

Curtin, Richard, Stanley Presser, and Eleanor Singer. "Changes in Telephone Survey Nonresponse Over the Past Quarter Century." Public Opinion Quarterly 69 (1): 87-98, 2005.

Dale, Stacy, and Alan Krueger. "Estimating the Payoff to Attending a More Selective College: An Application of Selection on Observables and Unobservables." Quarterly Journal of Economics 117 (4): 1491-1527, 2002.

Jäckle, Annette, and Peter Lynn. "Respondent Incentives in a Multi-Mode Panel Survey: Cumulative Effects on Nonresponse and Bias." Working paper presented to the Institute for Social and Economic Research, University of Essex, Colchester, United Kingdom, 2007.

Kay, Ward R. "The Use of Targeted Incentives to Reluctant Respondents on Response Rates and Data Quality." Proceedings of the American Association for Public Opinion Research. Montreal, Canada: American Association for Public Opinion Research, 2001.

Kling, Jeffrey. "Incarceration Length, Employment, and Earnings." American Economic Review 96 (3): 863-876, 2006.

Mack, S., V. Huggins, D. Keathley, and M. Sundukchi. "Do Monetary Incentives Improve Response Rates in the Survey of Income and Program Participation?" Proceedings of the American Statistical Association, Section on Survey Methodology, pp. 529-534, 1998.

McConnell, Sheena, Elizabeth Stuart, Kenneth Fortson, Paul Decker, Irma Perez-Johnson, Barbara Harris, and Jeffrey Salzman. "Managing Customers' Training Choices: Findings from the Individual Training Account Experiment." Washington, DC: Mathematica Policy Research, Inc., December 2006. (Report available at http://www.mathematica-mpr.com/publications/PDFs/managecust.pdf; technical appendices available at http://www.mathematica-mpr.com/publications/PDFs/managecustappendices.pdf.)

Neter, John, Michael H. Kutner, Christopher J. Nachtsheim, and William Wasserman. Applied Linear Regression Models, 3rd Edition. Boston: Irwin, 1996.

Robbins, Todd, Ting Yan-Abt, Donna Demarco, Erik Paxman, and Rhiannon Patterson. "The Effect of Partial Incentive Pre-Payments on Telephone Survey Response Rates." Paper presented at the 2003 Annual Conference of the American Association for Public Opinion Research, Nashville, TN, May 15-18, 2003.

Schwartz, Lisa K., Lisbeth Goble, and Edward M. English. "Counterbalancing Topic Interest with Cell Quotas and Incentives: Examining Leverage-Salience Theory in the Context of the Poetry in America Survey." Proceedings of the American Association for Public Opinion Research. Montreal, Canada: American Association for Public Opinion Research, 2006.

Singer, Eleanor, John Van Hoewyk, and Mary P. Maher. "Experiments with Incentives in Telephone Surveys." Public Opinion Quarterly 64 (2): 171-188, 2000.

U.S. Department of Labor, Employment and Training Administration, Office of Workforce Security. "Comparison of State Unemployment Insurance Laws." Washington, DC: U.S. Department of Labor, 2004. Available at http://workforcesecurity.doleta.gov/unemploy/comparison.asp.
APPENDIX A
FEDERAL REGISTER NOTICE
[INSERT FRN HERE – Page 1]
[INSERT FRN HERE – Page 2]
APPENDIX B
ADVANCE LETTER
[INSERT ADVANCE LETTER HERE]
APPENDIX C
QUESTIONNAIRE
1 Wage records are not reported on a routine basis for Federal jobs and are unavailable for self-employment and wage and salary jobs not covered by state UI programs.
2 The initial ITA evaluation estimated hourly wages for the ITA study participants to range between $13.60 and $15.20 over quarters 1-5 after random assignment (McConnell et al., 2006). The burden estimate provided above is based on the midpoint for this hourly wage range (that is, $14.40 per hour).