Cascades Job Corps College and Career Academy (CCCA) Pilot Program Evaluation: 18-Month Follow-Up Survey
ICR Reference Number 201902-1290-001
April 2019
The U.S. Department of Labor (DOL) contracted with Abt Associates (in partnership with MDRC) to conduct an evaluation of the Cascades Job Corps College and Career Academy pilot program. As required under the Paperwork Reduction Act, DOL is seeking approval from the Office of Management and Budget (OMB) for data collection instruments associated with the evaluation. The Job Corps program is the Federal government’s largest investment in residential job training for disadvantaged youth. The pilot program will test innovative and promising models that could improve outcomes for students, particularly youth ages 16-21. The evaluation, funded by DOL, will use multiple approaches, including an impact study and an implementation analysis of the Cascades Job Corps College and Career Academy (CCCA) pilot program.
OMB approved initial data collection activities for the CCCA Evaluation under OMB control number 1290-0012 (approved on February 6, 2017). Those approved data collection activities included the baseline information form to support the impact study, tracking data to support the planned 18-month follow-up survey, and stakeholder interviews and student focus groups to support the implementation study.
This supporting statement is the second OMB submission regarding data collection activities for the evaluation of the CCCA pilot. DOL is seeking clearance in this submission for the 18-month follow-up survey. The survey will provide critical information on the experiences and the educational and economic outcomes of both treatment and control group members. Specific outcomes to be considered include the receipt of training and related supports, receipt of credentials, employment, socio-emotional skills, engagement in risky behaviors, receipt of public benefits, and opinions on the education and training services received.
The Workforce Innovation and Opportunity Act (WIOA) of 2014 (P.L. 113-128) authorizes the Job Corps program. The Consolidated Appropriations Act, 2017 (P.L. 115-31) appropriated about $1.7 billion to fund Job Corps for Program Year 2017. The program aims to address the multiple barriers to employment faced by low-income youth ages 16-24 throughout the United States. Research has shown that while the program increases the education and earnings of students, it is more beneficial for older students (i.e., age 20 and older) than for younger students.1
To strengthen program outcomes, DOL committed to using Job Corps’ demonstration authority to test and evaluate innovative and promising models, and to measure impacts on outcomes for these youth. As such, the program contract (DOL-ETA-16-H-0010) to operate the CCCA pilot required participation in an independent, third-party evaluation. Participation in third-party evaluation activities will assist DOL in identifying and studying promising, evidence-based strategies for younger (ages 16-21) Job Corps students. The strategies tested by this evaluation are those implemented by the CCCA contractor and include career pathways models for the IT and healthcare sectors, sector-specific training, partnerships with local community colleges, instruction in life skills, and a Student-Centered Design. These strategies are intended to improve student engagement and retention by building student ownership, instilling a student-led community, and moving students into successful career pathways.
This evaluation is designed to answer research questions related to an implementation analysis, a process and client flow analysis, a service contrast analysis, and an impact analysis. Research questions include (but are not limited to) the following:
Implementation Analysis: How is the CCCA model being implemented? Specifically, how were the components of the pilot program operationalized and the program implemented? For example: How were youth recruited and screened for the program? How did staff interact with students, and how were students involved in the operations of the CCCA pilot program? What factors influenced program implementation? What challenges did the program face in implementation, and how were those challenges overcome? What implementation practices appear promising for replication?
Process/Client Flow Analysis: How do program students flow through and experience the CCCA model? Specifically, did students gain literacy and math skills while at CCCA? While at CCCA, what percentage and which types of students achieved various program milestones, including GED attainment, attendance at a community college, industry-recognized credential attainment, and degree attainment? What percentage and which types of students used job placement and post-placement services?
Service Contrast Analysis: How did program group members’ experiences differ from what their experiences would have been in the absence of CCCA (i.e., what are the impacts on services received)? Specifically, what impact did CCCA have on the dosage of education and training services received? What impact did CCCA have on total months of full-time equivalent enrollment in education or training [this is the confirmatory research question for the 18-month follow-up survey]? What impact did CCCA have on total months of education, training, or employment (including military service)? What impact did CCCA have on receipt of instruction on non-cognitive skills (e.g., social/emotional intelligence)? How do these impacts vary by student characteristics?
Impact Analysis: How do program group members’ outcomes differ from control group members’ outcomes (i.e., what are the impacts of the program)? Specifically, what impact did CCCA have on education, employment, and earnings outcomes? Does CCCA improve critical social-emotional skills, such as self-efficacy, and reduce engagement in risky behaviors? How do these impacts vary by student characteristics?2
The research design is a randomized controlled trial with assignment to either a treatment or control group. Treatment group members are offered a slot at the CCCA Job Corps center. Control group members are not offered a slot at CCCA, but are also not prevented from enrolling in other available training programs, including at other Job Corps centers.
This configuration, comparing access to the focal program’s services with access to other services, is a common design for random assignment studies of training programs. It also answers the relevant policy question: Do the services delivered in the CCCA pilot program improve student outcomes relative to existing Job Corps and non-Job Corps program services available in the area?
Through the Job Corps application and admission process to the CCCA center, individuals are randomly assigned to the treatment or control group. Roughly 1,100 students will be assigned to each group, for a total of approximately 2,200 study members overall (Exhibit A.1). Of these 2,200 total study members, approximately 1,000 will be administered an 18-month follow-up survey; administrative data will be collected on the remainder of the study sample.
Exhibit A.1: Size of Study Groups
Services Offered to Participants | Treatment Group Members | Control Group Members
CCCA | 1,100 | 0
Not CCCA | 0 | 1,100
Total | 1,100 | 1,100
During the evaluation intake and enrollment period, program staff recruit potential participants and determine their eligibility. As part of the intake process, program staff administer the informed consent form, which describes the study, the data collected, and the rights and responsibilities of the participant. For prospective students who are minors, both parental consent and youth assent are administered. Those who consent to participate in the study complete the Baseline Information Form (BIF), which collects demographic information, employment and education history, and contact information; the BIF is accessible online through the Participant Data System (PDS), the web-based system custom-built for the study. Applicants choosing not to sign the informed consent form are excluded from the study and from participating in the CCCA Job Corps program for the duration of the study enrollment period. For consenting prospective students who have completed the BIF, program staff conduct random assignment using the PDS. Individuals assigned to the treatment group are offered the program, while those assigned to the control group receive information about other services in the community, including other Job Corps centers. OMB previously approved data collection for these initial phases (Control No. 1290-0012). An illustrative sketch of the assignment step appears below.
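The actual assignment algorithm runs inside the PDS and is not described in this statement; as a minimal sketch under that caveat, a permuted-block scheme such as the hypothetical one below is one standard way to produce the roughly equal group sizes shown in Exhibit A.1. The function name, block size, and seed are illustrative only.

```python
import random

def blocked_assignments(n_pairs: int, seed: int = 12345) -> list:
    """Hypothetical permuted-block (size 2) random assignment yielding equal arms."""
    rng = random.Random(seed)
    assignments = []
    for _ in range(n_pairs):
        block = ["treatment", "control"]
        rng.shuffle(block)  # randomize the order within each block of two
        assignments.extend(block)
    return assignments

# Roughly 1,100 study members per arm, ~2,200 overall (Exhibit A.1)
labels = blocked_assignments(1100)
print(labels.count("treatment"), labels.count("control"))  # 1100 1100
```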
To address the research questions listed above, the evaluation of the CCCA pilot will include the following data collection activities:
Baseline data (for treatment and control group members) (Control No. 1290-0012)
Implementation site visits (two rounds of site visits) (Control No. 1290-0012)
Follow-up tracking forms to collect updated contact information (for treatment and control group members) (Control No. 1290-0012)
An 18-month follow-up survey (for treatment and control group members) (clearance requested in this package)
Existing administrative data from the Job Corps Management Information System(s), including administrative data from CCCA (no clearance required)
Other administrative data from the National Student Clearinghouse (NSC) and the National Directory of New Hires (NDNH)3 (for treatment and control group members) (no clearance required)
With the submission of this justification, DOL requests clearance for the fourth collection component listed above (i.e., the 18-month follow-up survey). OMB approved the first, second, and third components on February 6, 2017. The fifth and sixth components are existing administrative data for which no clearance is required. We note that use of administrative data greatly reduces the response burden on study participants by decreasing the length of the baseline and 18-month follow-up surveys.
Many of the employment outcomes for the study will be measured using quarterly Unemployment Insurance (UI) records maintained in the National Directory of New Hires (NDNH). The NDNH, which is compiled and maintained by the Office of Child Support Enforcement (OCSE) in the Department of Health and Human Services (HHS), is a national database of new hire, quarterly wage, and unemployment insurance data that State Directories of New Hires and state workforce agencies submit to OCSE on a quarterly basis. The NDNH captures information for all jobs covered by unemployment insurance, and thus will provide quarterly employment and earnings data for the vast majority of study participants and jobs (these records will not, however, include information for jobs that are “off the books”). The evaluation will use NDNH quarterly data to determine sample members’ earnings, employment status (i.e., non-zero earnings), and job tenure.
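As a minimal sketch of how quarterly wage records support these measures, the following illustrates deriving employment status and earnings from person-quarter records. The field names (person_id, quarter, earnings) are hypothetical and do not reflect the actual NDNH file layout or the evaluation team’s processing code.

```python
from collections import defaultdict

def employment_measures(records):
    """records: iterable of (person_id, quarter, earnings) tuples."""
    by_person = defaultdict(dict)
    for person_id, quarter, earnings in records:
        # Sum earnings across employers within the same quarter
        by_person[person_id][quarter] = by_person[person_id].get(quarter, 0.0) + earnings

    measures = {}
    for person_id, quarters in by_person.items():
        # "Employed" in a quarter means non-zero earnings in that quarter
        employed = [q for q, amount in quarters.items() if amount > 0]
        measures[person_id] = {
            "quarters_employed": len(employed),
            "total_earnings": sum(quarters.values()),
        }
    return measures

sample = [("A1", "2018Q1", 3500.0), ("A1", "2018Q2", 0.0), ("B2", "2018Q1", 1200.0)]
print(employment_measures(sample))
```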
For the service contrast and impact analyses, these NDNH employment and earnings data are complemented by the 18-month follow-up survey. The survey, which is the subject of this PRA request, will provide critical information to measure the impact of the CCCA Job Corps program on training and related service receipt, receipt of credentials, employment details, socio-emotional skills, engagement in risky behaviors, public benefits receipt, and opinions on the education and training services received.
The purpose of the 18-month follow-up survey is to collect critical information on service receipt, employment, and other outcomes. The survey is an important source for documenting the type and duration of training and other services received; receipt of credentials; socio-emotional skills; engagement in risky behaviors; employment information that is not available from other sources; and public benefit receipt. In addition, the survey will collect opinions on and experiences with training services received, both at CCCA and at other providers.
The evaluator will use the survey results to describe outcomes for those at CCCA and to estimate the impact of the CCCA Job Corps services on participants’ education, economic, and other related outcomes. The primary beneficiaries of the evaluation results and planned data collection effort will be DOL, the National Office of Job Corps, Job Corps and other training program administrators, future Job Corps students, state and local policymakers, and other federal agencies and policymakers. DOL will use the information to understand which Job Corps strategies and supports are effective in helping younger students engage in services, complete the program, attain credentials and/or degrees, and improve their economic prospects. This information will be important in guiding future Job Corps initiatives, including those using sector-based training and career pathways strategies, as well as those serving youth ages 16-21. Secondary beneficiaries of this data collection will be those in public policy and program administration who are interested in understanding effective youth-focused education and training strategies more broadly.
The follow-up survey will be administered using Computer-Assisted Personal Interviewing (CAPI) technology. CAPI technology reduces respondent burden because interviewers can proceed more quickly and accurately through the survey instrument, minimizing interview length. Computerized questionnaires ensure that skip patterns work properly, minimizing respondent burden by not asking inapplicable questions. For example, respondents who have held no job since randomization will skip the questions about occupation, employment dates, and hours. Computer-assisted interviewing can build in checkpoints that allow the interviewer or respondent to confirm responses, thereby minimizing data entry errors. Finally, automated survey administration can incorporate hard edits that check for allowable ranges on quantity and range-value questions, minimizing out-of-range or unallowable values.
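As a minimal sketch of the skip-pattern and hard-edit logic described above, the following illustrates how an employment module might branch and validate responses. The question wording, response ranges, and function names are hypothetical and are not drawn from the actual instrument or CAPI software.

```python
def ask_int(prompt: str, low: int, high: int) -> int:
    """Hard edit: re-ask until the response falls within the allowable range."""
    while True:
        try:
            value = int(input(f"{prompt} "))
        except ValueError:
            continue  # non-numeric entry; ask again
        if low <= value <= high:
            return value
        print(f"Please enter a value between {low} and {high}.")

def employment_module() -> dict:
    worked = ask_int("Have you worked for pay since random assignment? (1=yes, 0=no)", 0, 1)
    if worked == 0:
        # Skip pattern: occupation, date, and hours questions are never asked
        return {"worked": False}
    hours = ask_int("On average, how many hours per week did you work? (1-80)", 1, 80)
    return {"worked": True, "hours_per_week": hours}
```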
There is minimal duplication of data collection in the evaluation. The follow-up survey will ask a limited set of questions about employment status despite the availability of some employment information in the NDNH data. This survey question is necessary as a screener before asking about specific dates of employment and average hours worked, information not available through the NDNH. Additionally, the follow-up survey will ask about participation in training and credential receipt. For the treatment group, some data on service receipt are available in Job Corps’ administrative data. However, these data are not available for control group members who do not attend another Job Corps center, estimated to be about 80 percent of the control group. (Based on Job Corps administrative reports gathered through the time of this submission, 80 percent of control group members are not attending any other Job Corps center.) Additionally, there are some data about attendance and credentials in the National Student Clearinghouse (NSC) data, but NSC coverage is known to be incomplete for the sub-BA programs that are likely to be most relevant for CCCA students. Finally, neither the Job Corps administrative data nor the NSC data provide critical information on socio-emotional skills, student opinions, the effects of the Student-Centered Design, participation in risky behaviors, or receipt of public benefits.
The evaluation of the pilot program will impose no burden on small businesses or other small entities.
DOL will contact participants approximately 18 months after they are randomly assigned. Currently, this is the only follow-up survey that the evaluation team will administer to sample members. It is a one-time data collection activity, so it would not be possible to collect these data less frequently. The timing of the survey will provide enough time for sample members to participate in training programs and services. While the evaluation could report on the employment outcomes of CCCA participants using NDNH data without the survey, the survey is vital to determining the service and credential receipt differentials between the treatment and control groups; these short-term impacts are the primary impacts of concern for the 18-month follow-up period. Moreover, the survey allows for assessment of additional outcomes, including the effects of the Student-Centered Design aspect of CCCA, program effects on students’ socio-emotional skills and engagement in risky behaviors, and public benefit receipt.
Not collecting data through the 18-month follow-up survey would prevent the estimation of program impacts on training participation and service receipt, credential receipt, receipt of supportive services, participation in on-the-job training or internships, socio-emotional skills development, and engagement in risky behaviors. The CCCA Job Corps program was designed to produce impacts in these domains, which can only be measured for both the treatment and control groups via a follow-up survey. Administrative data, including the Job Corps administrative data, will not provide information on service receipt for the majority of the control group, nor will administrative data provide information on socio-emotional skills, risky behaviors, or student opinions and perceptions. NDNH will only provide quarterly employment information. The National Student Clearinghouse (NSC) will not provide data on training or service receipt outside of the colleges that report to NSC, nor will it provide data on on-the-job training, socio-emotional skills, or risky behaviors.
There are no special circumstances for the proposed data collection.
In accordance with the Paperwork Reduction Act of 1995 (P.L. 104-13) and Office of Management and Budget (OMB) regulations at 5 CFR Part 1320 (60 FR 44978, August 29, 1995), DOL published a 60-Day Federal Register Notice on May 31, 2018 (83 FR 25055), announcing the agency’s intention to request an OMB review of this information collection activity. During the notice and comment period, the government did not receive any comments or requests for copies of the instrument.
Many of the items in the survey are adapted from previously approved data collection instruments, including those from the following studies: the Ready to Work (RTW) Partnership Grants Evaluation (OMB No. 1291-0010), the Pathways for Advancing Careers and Education (PACE) Evaluation (OMB No. 0970-0397), the Health Profession Opportunity Grants (HPOG) Evaluation (OMB Nos. 0970-0394 and 0970-0397), the YouthBuild Impact Evaluation (OMB No. 1205-0503), and the Cascades Job Corps College and Career Academy (CCCA) Pilot Study (OMB No. 1290-0012). Experts in their respective disciplines (statistics, policy analysis, economics, and survey operations) were consulted in developing the survey instrument.
Additionally, while developing the survey instrument, the evaluation team consulted with representatives from the National Office of Job Corps, the Department of Labor’s Employment and Training Administration, and the Department of Labor’s Chief Evaluation Office on theorized outcomes of the CCCA Pilot. The evaluation team also pre-tested the survey instrument on a group of nine study members who will not be part of the follow-up survey sample (i.e., they were randomized during the pilot period; see Part B, page 1). This group consisted of five treatment group members who enrolled at CCCA and four control group members who did not enroll at CCCA. Based on the pre-test results, the evaluation team revised questions and interviewer directions in response to respondent feedback, removed questions, and restructured the instrument to fit the time allotted.
Respondents will receive $25 as a token of appreciation. The CCCA 18-month follow-up survey incentive amount is based on other surveys of comparable populations. ACF’s evaluation of Building Strong Families, conducted by Mathematica, found $25 incentives for low-income youth (and a $25 incentive for parents) effective in retaining the dyads in the study over time and achieving the required completion rates. In multiple studies for the Corporation for National Service, Abt has used a $25 incentive as its standard over the last two decades to achieve the desired survey completion rates with youth populations at all socio-economic levels, although the incentive is sometimes raised to reach the hardest-to-reach youth. The Teen Pregnancy Prevention Replication Study, conducted for the Office of Adolescent Health (OAH) and the Office of the Assistant Secretary for Planning and Evaluation (ASPE) at HHS, used an incentive payment of $25. OMB approved these incentive payments as elements of prior information collection requests.
Incentive payments are a powerful tool for maintaining low attrition rates in longitudinal studies. The use of incentive payments for the CCCA Job Corps Evaluation can help maximize response rates, which is necessary to ensure unbiased impact estimates. Three factors helped to determine the incentive amounts for each survey:
Respondent burden, both at the time of the interview and over the life of the study
Costs associated with participating in the interview at that time
Other studies of comparable populations and burden
Given the importance of this evaluation, the data collection must maintain the highest standards. Providing a modest payment to study subjects who complete a given follow-up interview can contribute to that goal by significantly increasing response rates, thereby helping to ensure data collection from a sample that is truly representative. Because response rates for telephone surveys have been declining in recent years and the costs associated with achieving high response have been increasing, the use of respondent payments has become common practice for survey studies.4 These payments can help achieve high response rates by increasing sample members’ propensity to respond.5 Studies offering respondent payments show decreased refusal rates and increased contact and cooperation rates. Among sample members who initially refuse to participate, the availability of payments increases refusal-conversion rates. These payments also help contain costs by significantly reducing the effort and funds expended to resolve a case and the number of interim refusals. These operational cost savings and direct participant benefits justify offering payments to survey respondents.
In addition to helping gain cooperation to increase the overall response rate, respondent payments also increase the likelihood of participation from subgroups with a lower propensity to cooperate with the survey request. Increased response rates from such subgroups are another important factor in helping to ensure the representative nature of the outcome data and the quality of the data being collected. For example, Jäckle and Lynn6 find that respondent payments increase the participation of sample members who are more likely to be unemployed. Evidence also shows that respondent payments bolster participation among those with lower interest in the survey topic,7,8,9 resulting in data that are more complete. Prior research establishes that payments do not impair the quality of the data obtained from groups who may otherwise be underrepresented in a survey (for example, by increasing item nonresponse or distorting the distribution of responses).10
Offering respondent payments is a final, critical addition to intensive efforts to establish contact with prospective respondents and gain their cooperation with the planned data collection. A $25 payment will be offered to respondents as a token of appreciation for the time they spend participating in the survey. Such a sign of appreciation motivates sample members to participate in the survey and may influence their decision to provide updated contact information during the tracking period. To fully leverage the benefits of the incentive payments, the payments will be mentioned when contact is established with participants and attempts are made to gain their cooperation.
Respondent privacy will be protected to the extent allowed by law. The study team is cognizant of and committed to maintaining federal, state, and DOL data security requirements. All study staff will comply with relevant policies related to secure data collection; data storage, transport, and access; and data dissemination and analysis. DOL recognizes that Job Corps serves vulnerable populations and that centers must protect study participants from any risk of harm from evaluation activities. Accordingly, all evaluation staff also sign a privacy/non-disclosure agreement.
The research team developed strong protocols to help maintain the privacy of respondents to the extent permitted by law. All research staff working with personally identifiable information (PII) will follow strict procedures to protect private information and they will sign a pledge stating that they will keep all information gathered private to the extent permissible by law. All papers that contain participant names or other identifying information will reside in locked areas and workstations will be password protected to safeguard access to electronic data.
The 18-month follow-up survey is voluntary. Prior to the start of each survey, researchers will inform sample members that all of their responses will be kept private, that their names will not appear in any written reports, that responses to the questions are voluntary, and that the study has a Certificate of Confidentiality from the National Institutes of Health (number CC-HD-17-008) to protect their data from subpoena. All consent and survey protocols have been submitted to and approved by Abt Associates’ Institutional Review Board (IRB).
DOL plans to produce a Public Use File (PUF) at the end of the evaluation contract that will include data collected from the baseline information form and the 18-month survey. The contractor will remove PII from the file and will undertake other steps to reduce the risk of re-identification (e.g., collapsing small cells, not reporting exact dates), as illustrated in the sketch below. The PUF will contain administrative data from the Job Corps MIS and the CCCA MIS but will not contain any administrative data from other sources.
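A minimal sketch of the two disclosure-avoidance steps named above, collapsing small cells and coarsening exact dates: the column names (name, ssn, enroll_date, home_state) and the threshold are hypothetical, and the contractor’s actual procedures are not described in this statement.

```python
import pandas as pd

def deidentify(df: pd.DataFrame, min_cell: int = 10) -> pd.DataFrame:
    """Hypothetical disclosure-avoidance pass for a public use file."""
    out = df.drop(columns=["name", "ssn"], errors="ignore")  # strip direct identifiers
    # Coarsen exact dates: keep only the year of enrollment
    out["enroll_year"] = pd.to_datetime(out["enroll_date"]).dt.year
    out = out.drop(columns=["enroll_date"])
    # Collapse small cells: categories with fewer than min_cell cases become "Other"
    counts = out["home_state"].value_counts()
    small = counts[counts < min_cell].index
    out["home_state"] = out["home_state"].where(~out["home_state"].isin(small), "Other")
    return out
```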
The 18-month survey will collect information from participants who have consented – or, as appropriate, have been given parental consent – to participate in this evaluation. Information to be collected includes: services received through the program or through other programs in areas such as training or educational courses and supportive services; any credentials earned; employment information since random assignment; military enlistment; experiences with and opinions on the services received through a program; socio-emotional skills; criminal activity; and public benefit receipt. Most of these types of information are generally collected as part of enrollment in government-funded training programs and are therefore not considered sensitive. However, depending on an individual’s particular circumstances, any question could be perceived as sensitive. In addition, many of the applicants are minors at the time of enrollment, and some may still be minors at the time of the follow-up interview. The evaluation team considers questions of a sensitive nature to include those related to criminal activity, use of alcohol and controlled substances, and socio-emotional measures. Evaluation team interviewers are well trained to show sensitivity while remaining impartial. Also, respondents have the right to refuse to answer any question. Finally, to encourage reporting, reluctant respondents are reminded that their answers will be kept private, to the extent allowable under law, and that the study has obtained a Certificate of Confidentiality to protect their data from subpoena.
Listed below are items that may be considered sensitive or identifying and the justification for including them:
Information about sample members’ criminal activity and use of alcohol and controlled substances during the follow-up period, and socio-emotional skills at the time of the follow-up interview. Many studies have shown that these developmental outcomes affect education and employment outcomes.11 CCCA is explicitly attempting to affect critical social-emotional factors and risky behaviors that impact a young person’s ability to succeed in education and employment; thus, measurement of these domains at follow-up is critical to determining the full impacts of CCCA.
Updated participant contact information is collected so that the gift card can be sent to the respondent. Given that this is a mobile population, it will be necessary to collect this information to ensure that the gift card is sent to the respondent’s current address.
Information on date of birth, address, and telephone numbers is needed to identify and contact participants. This information was collected at baseline, and remains part of the respondent’s information. Except in instances in which errors are found, there will be no need to collect this information again. However, during follow-up survey administration, respondents will be asked to confirm this information.
The evaluator estimates that it will take respondents approximately 35 minutes (0.583 hours) on average to complete the CCCA 18-month follow-up survey. This estimate is based on experience with similar surveys, including the RTW Evaluation 18-month follow-up survey and the GJ-HC Impact Evaluation 18-month survey, both conducted for DOL, and the PACE Evaluation 15-month follow-up survey and the HPOG Impact Evaluation 15-month follow-up survey, both conducted for HHS. The burden estimate is based on a total of 800 respondents (an 80 percent response rate for approximately 1,000 fielded surveys). The burden is annualized over 3 years, the maximum period over which data collection will occur.
To place a value on respondents’ time, the evaluation team used the state minimum wage in the CCCA Evaluation site (Washington State). The team multiplied this minimum hourly wage ($12.00) by 1.4 to account for the value of fringe benefits (estimated to equal 40 percent of the hourly wage), yielding an hourly value of $16.80.12
Exhibit A.2: Estimated Annual Burden for the 18-Month Participant Follow-Up Survey
Type of Instrument | Number of Respondents | Number of Responses per Respondent | Total Number of Responses | Average Burden Time per Response (hours) | Burden Hours | Time Value (per hour) | Monetized Burden Hours
18-Month Follow-Up Survey | 267¹³ | 1 | 267 | 0.583 | 156 | $16.80 | $2,621¹⁴
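For reference, the annualized figures in Exhibit A.2 follow directly from the totals given above:

800 respondents ÷ 3 years ≈ 267 respondents per year
267 responses × 0.583 hours per response ≈ 156 annual burden hours
156 annual burden hours × $16.80 per hour ≈ $2,621 in annualized monetized burden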
This data collection effort involves no recordkeeping or reporting costs for respondents other than their time.
The information collection activity and associated instruments have been developed by the evaluation contractor, Abt Associates, under Contract Number DOLQ129633231. The Chief Evaluation Office is funding the costs of the study.
The estimated annualized cost to the Federal government is $349,367, and the estimated total cost to the Federal government is $1,048,100. This cost estimate comprises two components, itemized below; the arithmetic is summarized after the second component:
1. The estimated cost to the federal government for the contractor to carry out this study is $990,116 for survey data collection. Annualized over 3 years of data collection, this comes to $330,039.
2. DOL expects the annual level of effort for Federal government technical staff to oversee the contract will require 200 hours for one Washington, DC-based GS-14, Step 4 employee earning $60.40 per hour. To account for fringe benefits and other overhead costs, the agency applies a multiplication factor of 1.6. Thus, the estimated annual cost borne by DOL for these duties is $19,328. The data collection period covered by this justification is 36 months, so the estimated total cost for performance of these duties is $57,984.
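These cost figures reconcile as follows:

Contractor data collection: $990,116 ÷ 3 years ≈ $330,039 per year
Federal oversight: 200 hours × $60.40 per hour × 1.6 = $19,328 per year; $19,328 × 3 years = $57,984 in total
Estimated annualized cost: $330,039 + $19,328 = $349,367
Estimated total cost: $990,116 + $57,984 = $1,048,100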
This is a new data collection.
This data collection will contribute to the final report, and a public use data set will be made available. The final report, which will cover all findings from the impact and implementation study portions of the evaluation, will be available to the public.
All instruments created for the CCCA evaluation will display the OMB approval number and the expiration date for OMB approval.
No exceptions are necessary for this information collection.
1 Schochet, P.Z., Burghardt, J., and McConnell, S. (2008). “Does Job Corps Work? Impact Findings from the National Job Corps Study.” The American Economic Review 98.5: 1864-1886.
2 Given that the expected length of stay at CCCA is two to three years, 18 months after enrollment is too early to expect a positive impact on earnings.
3 The primary purpose of the NDNH, operated by the Office of Child Support Enforcement (OCSE), Administration for Children and Families, HHS, is to assist state child support agencies in locating parents and enforcing child support orders; however, Congress has authorized specific state and Federal agencies to receive information from the NDNH for authorized purposes. http://www.acf.hhs.gov/programs/css/resource/a-guide-to-the-national-directory-of-new-hires
4 Curtin, R., Presser, S., and Singer, E. (2005). “Changes in Telephone Survey Nonresponse Over the Past Quarter Century.” Public Opinion Quarterly 69.1: 87–98.
5 Singer, E., Van Hoewyk, J., and Maher, M.P. (2000). “Experiments with Incentives in Telephone Surveys.” Public Opinion Quarterly 64.2: 171–188.
6 Jäckle, A. and Lynn, P. (2007). “Respondent Incentives in a Multi-Mode Panel Survey: Cumulative Effects on Nonresponse and Bias.” Working paper presented to the Institute for Social and Economic Research, University of Essex, Colchester, United Kingdom.
7 Ibid.
8 Kay, W.R. (2001). “The Use of Targeted Incentives to Reluctant Respondents on Response Rates and Data Quality.” Proceedings of the American Association for Public Opinion Research. Montreal, Canada: American Association for Public Opinion Research.
9 Schwartz, L.K., Goble, L., and English, E.M. (2006). “Counterbalancing Topic Interest with Cell Quotas and Incentives: Examining Leverage-Salience Theory in the Context of the Poetry in America Survey.” Proceedings of the American Association for Public Opinion Research. Montreal, Canada: American Association for Public Opinion Research.
10 Singer et al. (2000). “Experiments with Incentives in Telephone Surveys.” (see footnote 5).
11 Eccles, J. and Gootman, J.A. (Eds.) (2002). Community Programs to Promote Youth Development. Washington, DC: National Academies Press. See also: Gambone, M.A., Klem, A.M., and Connell, J.P. (2002). Finding Out What Matters for Youth: Testing Key Links in a Community Action Framework for Youth Development. Philadelphia: Youth Development Strategies, Inc., and Institute for Research and Reform in Education.
12 The minimum wage for Washington State (accessed in April 2019) is available from the U.S. Department of Labor at https://www.dol.gov/whd/minwage/america.htm#stateDetails; the Washington rate is $12.00 per hour. This rate is multiplied by 1.4 to account for fringe benefits. The Washington State minimum wage increased from $11.50 to $12.00 per hour between the publication of the 60-day notice and the submission of this Supporting Statement; the time value and monetized burden hours have been updated to reflect the new rate.
13 Based on a sample of 1,000 with an 80 percent response rate. Data collection will take place over up to 36 months.
14 The monetized burden hours are the product of the annual burden hours (156) and the time value ($16.80).