To: OMB Desk Officer
From: ACF/OPRE
Subject: Responses to OMB Questions on the ISIS Baseline Data Collection Clearance Package
We received questions from OMB about the ISIS baseline data collection clearance package. This memo contains ACF’s responses to these comments, as well as a list of other modifications the team has made since the original OMB submission on April 12, 2011. ACF has modified the following sections of the OMB submission:
Part A First paragraph
A.1.4 Research Questions
A.2.4 Construct Justification
A.3 Use of Information Technology to Collect the Information
A.5 Minimizing Burden on Small Respondents
A.9 Payment to Respondents
A.11 Justification for Sensitive Questions
A.12 Estimates of Public Reporting Burden
B.2.4 Who Will Collect the Information and How It Will Be Done
B.2.5 Procedures with Special Populations
B.4 Tests of Procedures
B.5 Individuals Consulted on Statistical Aspects of the Design
References
Appendix B. ISIS Participation Agreements (Informed Consent Forms)
Appendix C. ISIS Basic Information Form (BIF) for Participants
Appendix D. ISIS Self-Administered Questionnaire (SAQ) for Participants
Below are ACF’s responses to OMB questions.
1. Separate submissions. As a general matter, we would prefer not to have each phase of a single study submitted as a request for a new control number. There are some transparency benefits to either including more information in one ICR, or amending a control number as needed for subsequent phases. That said, we understand there may be instances where separate control numbers make sense. We can talk more about this if that would help – it’s something relevant to all ACF studies, not just ISIS.
Originally, we had planned to revise the original package and use the same number; however, delays in completing the instruments and materials would have resulted in the previous number expiring before ACF could send materials for this collection to OMB. We are requesting a new number for this collection and will, in the future, use only this number for all collections associated with ISIS.
2. Questions about HPOG
a. What HPOG items are being used in this evaluation to minimize the burden on the three sites that are in both studies?
For the HPOG sites that are included in the ISIS evaluation, responses to ISIS items will be used to minimize burden during the HPOG intake data collection. Potential study members will first go through the ISIS informed consent process and fill out the ISIS BIF and ISIS SAQ. After random assignment, those individuals who are assigned to the treatment group will have the data elements that are in common to the ISIS BIF and the HPOG intake form automatically uploaded to the HPOG Performance Reporting System (PRS). Thus, treatment group members will not have to respond again to the same questions. Instead, they will only have to respond to the HPOG items that are not included in the ISIS data collection. The common data elements are listed below.
Proposed HPOG Data Elements | Collected by ISIS in Proposed ISIS Basic Information Form?
Individual Characteristics at Enrollment |
I1. Date of Birth | Yes
I2. Gender | Yes
I3. Race/Ethnicity | Yes
I4. Highest school grade completed | Yes
I5. Marital status | Yes
I6. Number and ages of dependent children | Yes
I7. Parent/guardian status (single, non-custodial, legal guardian) | Yes
I8. Veteran | No
I9. Tribal affiliation | No
I10. Individual with a disability | No
I11. Foster care youth | No
I12. Individual with limited English proficiency | No
I13. Homeless individual | No
I14. Offender | No
I15. Employed at enrollment (if yes: hourly wage and hours worked in last full week) | Yes
I16. Unemployment insurance status | No
I17. Ever worked or trained in health profession prior to participation | No
I18. Public assistance status (TANF, other, none) | Yes
Contact Information |
C1. Social Security Number | Yes
C2. Street Address, City, State, ZIP | Yes
C3. Home, work, cell phone # | Yes
C4. Alternative contacts (name, address, relationship, phone #) – up to 3 | Yes
C5. Referral source | No
b. Is it possible to look at the 3 HPOG sites as a subgroup for analysis purposes?
Yes, it is possible to do this; however, the type of analysis will depend on the final constellation of HPOG grantees and ISIS sites. In general, ACF has recruited interventions in ISIS (including the HPOG sites included in ISIS) that represent some of the most innovative and promising career pathways models in the country. However, across the various sites, the interventions differ from one another – and from other HPOG sites. The currently proposed evaluation design is based on the ability to detect site-specific impacts. In those interventions where it is possible to pool sites that clearly represent a policy-relevant model, and where pooling would yield an additional analytical benefit, we will make every attempt to do so.
c. Would ACF think that the HPOG sites’ approaches would vary systematically from the other sites due to the HPOG grant requirements and/or focus?
Not necessarily. As noted in the answer to the previous question, our intent is to be able to detect site-specific impacts using an experimental design. All ISIS sites, whether funded by HPOG or not, will thus have to meet the same requirements for the evaluation. Because of this, the HPOG sites included in ISIS will not differ from other ISIS sites more than any of the non-HPOG ISIS sites already differ from each other. Moreover, across the current set of ISIS partner sites, there is substantial focus on preparation for health professions. Thus, within ISIS, the HPOG sites will not be unique in focusing on health professions.
3. Supporting Statement Part A
a. Page 1, Introduction – Please note the control number(s) for the prior phase(s) of this study.
We have included more specific language in the introduction that includes references to the past collection number.
b. Item A.1.4 – Please explain how this particular phase of the study addresses these research questions.
Text from revised Section A.1.4:
Research questions the ISIS evaluation will address include:
Implementation—What services are provided under each intervention? What are the characteristics of the populations served? How are services for the target population implemented? How do services for the treatment group compare to the services available to the control group? What are the issues and challenges associated with implementing and operating the service packages and policy approaches studied?
Impact—What are the net impacts of career pathway programs on educational outcomes (program completion, attainment of credentials and degrees) and economic outcomes (earnings, employment levels, and wage progression)? What are the net impacts for hypothesized mediators of these primary impacts in domains such as academic skills, psycho-social skills, career awareness, family resources, and other personal and family challenges? What are the longer-run impacts on indicators of child and family well-being?
Subgroups—How do career pathway program impacts vary by subgroup?
Cost effectiveness—What are the costs of career pathway programs in the study? Do the estimated benefits of providing services outweigh the costs of these programs?
The Interview Guide will be used for the Implementation Study. Researchers will use interviews with program staff and key stakeholders to collect information that will provide a fuller understanding of the initial conditions surrounding these career pathway programs and the contexts in which they operate. This information will also allow researchers to assess the quality of early implementation of these programs—assessments that will be important to the interpretation of program impact results.
Baseline data collected with the BIF and SAQ instruments will be used for the Impact Studies. The contact information collected at baseline is necessary to enhance researchers’ ability to locate respondents for follow-up surveys that will measure outcomes in both the treatment and control groups. ISIS researchers also will utilize information collected at baseline to identify subgroups of interest and to compare impacts across subgroups. Other analytic purposes of the baseline data include describing the ISIS study sample, adjusting for chance differences in observable characteristics and increasing precision of impact estimates, and checking the integrity of random assignment.
Finally, the project will address the cost-effectiveness of programs through comparison of any economic benefits with net program costs in the Benefit-Cost Study. Although the bulk of cost data will come from programs’ existing administrative records, ISIS researchers will augment their understanding of program costs through interviews with program staff.
c. Item A.2.4 – We are unclear on the rationale for allowing sites to customize the questionnaire “to include one or two brief items” – why is this necessary?
ACF and Abt Associates conceived ISIS as an innovative partnership in which sites are engaged in the study design where practical. The opportunity to add one or two brief items is one expression of this approach, which we expect will help promote greater interest and cooperation with study procedures (and perhaps thereby reduce perceived burden for program staff). We will ensure that items added have only a slight effect on the overall respondent burden, and it is likely that some sites will not wish to add items. If OMB wishes, however, we will remove these items from the form.
d. Item A9, incentives – As a general matter, we do not typically “compensate” individuals for participating in federal studies. Instead, if ACF is going to make an assertion about the effectiveness of incentives (in this case that they help decrease item non-response), ACF should offer evidence to support such claims from experiments in studies similar to the one proposed. ACF should also offer a much more detailed discussion in part b about any concerns with item non-response and what the full set of strategies are for ensuring sufficient completes. As written, we do not see any justification for an incentive in the context of this study. Typically, in other studies, we would see enrollment in the study, with the promise of opportunity for a promising intervention, as a sufficient incentive and the lack of such opportunity should one not enroll in the study as a sufficient disincentive. This is not to say that some incentive may not be acceptable during follow up data collection, especially if it is after the completion of program participation.
Text from revised Section A.9:
The payment to individuals is not compensation for their participation in the evaluation; rather, it is an incentive to complete the baseline data collection – specifically, the baseline information form and the self-administered questionnaire. ACF plans to offer respondents a $25 payment for completing these baseline procedures. We believe this payment is justified for two reasons, both well documented in the literature and consistent with ACF’s previous experience in experimental studies: 1) baseline data, specifically covariates, are important for improving the precision of impact estimates and reducing bias, as documented across the education, employment, and training literature; and 2) small incentives are effective and efficient in improving survey response rates. Based on these two principles, we believe it is important to offer the initial incentive both to ensure accurate baseline information and to minimize sample attrition over the life of the project.
Incentive payments are a powerful tool for maintaining low attrition rates in longitudinal studies, especially for cases in the control group because these sample members are not receiving any (other) program benefits or services. There is evidence to suggest the use of incentive payments for the ISIS baseline data collection can help ensure a high response rate in future survey waves. Singer and Kulka (2001) cite evidence from the Survey of Income and Program Participation that incentives given to respondents in the first wave of the survey continued to produce higher response rates as far out as Wave 6, even though no other incentives were given. Likewise, Singer, Van Hoewyk, and Maher (1998) find that respondents of the Survey of Consumer Attitudes (SCA) who were given an initial incentive were more likely than a group of respondents who received no initial incentive to complete a follow-up survey without the payment of an additional incentive.
In addition to evidence on unit response rates, there is evidence to suggest that certain populations are more likely to provide complete information on surveys if they are given an incentive. Singer, Van Hoewyk, and Maher (2000) find that non-white respondents of the SCA who are offered an incentive are significantly less likely to fill out “don’t know” on a range of questions, including questions about family income. Most ISIS sites have high concentrations of minority populations in their programs, and ensuring the best possible quality of baseline data on a range of constructs, including family income, will allow for more precise subgroup analysis.
ACF and Abt Associates have used incentives to minimize attrition in similar situations in the past. For example, the attached appendix documents the previous OMB-approved information collections that ACF has recently submitted, as well as their incentive payment levels and response rates. Across these studies, there is a consistent use of incentives on an order of magnitude similar to what is proposed in the ISIS collection. The incentive payments have consistently produced response rates above the 80 percent threshold requested by OMB. In addition, in the Benefit Offset National Demonstration (BOND), Abt Associates offered respondents a $40 gift card to fill out a baseline survey, even though potential participants had to complete the baseline survey to have any chance of being assigned to the new program. In the Homeless Families Impact Study, Abt Associates offered a $35 incentive at baseline. As in BOND, all potential participants were required to fill out the baseline survey to have a chance of receiving treatment services. In both studies, the incentives were justified as a means of minimizing attrition in subsequent study waves.
e. Item A11, sensitive questions – We disagree that questions related to domestic violence are not sensitive. We would prefer to see a specific citation of the Privacy Act in the confidentiality pledge, rather than the language that ACF uses when there is no specific statutory protection.
We agree that questions about domestic violence are sensitive. We have amended the informed consent form to cite the Privacy Act of 1974.
f. Item A16 -- We would like to see more detail on the impact analysis plan. Is that available at this stage?
A detailed analysis plan is not yet available, but the analytical approach will involve well-tested methods used in many similar evaluations. The analysis will involve, first, calculating means for a series of outcome variables for members of the treatment and control groups at successive follow-up intervals and, then, taking the difference between estimated means for the two groups. Following conventional practices, differences will be adjusted using regression models that control for a series of baseline variables. Regression-adjustment can help both to guard against biases resulting from chance differences arising in random assignment and to improve the power and precision of estimates. At this point we expect to conduct most analyses at the site level, given that program models will differ substantially across sites. We also expect to perform some analyses of differences between subgroups defined with respect to varying baseline characteristics, though such analyses are likely to be treated as exploratory given levels of precision attainable with expected sample sizes. Major outcome domains are summarized in our response to 3b. We will measure outcomes through a combination of administrative data and several waves of follow-up surveys. At this point, we expect to conduct the first survey at the 12-month follow-up interval.
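The regression-adjustment approach described above can be illustrated with a minimal sketch. The data below are simulated for illustration only; the covariate, sample size, and true impact of 2.0 are assumptions, not figures from the ISIS design.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulated evaluation data (illustrative only): one baseline covariate,
# random assignment to treatment, and an outcome with a true impact of 2.0.
n = 2000
baseline = rng.normal(size=n)         # e.g., a standardized baseline characteristic
treated = rng.integers(0, 2, size=n)  # random assignment indicator (0 = control)
outcome = 2.0 * treated + 1.5 * baseline + rng.normal(size=n)

# Unadjusted impact estimate: simple difference in group means.
unadjusted = outcome[treated == 1].mean() - outcome[treated == 0].mean()

# Regression-adjusted estimate: OLS of the outcome on treatment status
# plus baseline covariates, via least squares.
X = np.column_stack([np.ones(n), treated, baseline])
coefs, *_ = np.linalg.lstsq(X, outcome, rcond=None)
adjusted = coefs[1]  # coefficient on the treatment indicator

print(f"unadjusted impact estimate:          {unadjusted:.2f}")
print(f"regression-adjusted impact estimate: {adjusted:.2f}")
```

Both estimators are unbiased under random assignment; the adjusted estimate simply has a smaller standard error because the covariate absorbs part of the outcome variance, which is the precision gain the text refers to.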
4. Supporting Statement Part B
a. Please provide additional information about field procedures related to recruitment and enrollment into the study. Specifically:
i. Are the questionnaires completed at time of initial enrollment into the study and is such enrollment essentially done as part of enrollment into the training program?
Yes. Sites will incorporate ISIS baseline data collection into their normal intake procedures. A more detailed description of enrollment procedures has been added to Section B.2.4.
ii. Is data collection performed by site staff or evaluation staff?
Site staff will collect all baseline information. This process is described in detail in Section B.2.4.
iii. How will site staff (if the data collectors) ensure that ACF provided confidentiality assurances are met?
Excerpt from revised Section B.2.4:
Abt Associates has set up site teams consisting of at least one senior and one junior staff member for each potential partnering site. The site team will be responsible for training partnering site staff to perform baseline data collection activities. ISIS trainers will use a training presentation developed with the assistance of the Abt Associates’ Institutional Review Board to train site staff on human research protections and data security. At the completion of the training sessions, ISIS trainers will have site staff sign a confidentiality agreement containing an Individual Investigator Agreement, confirming the staff is aware of all research guidelines. Initial trainings of site staff will occur in person immediately prior to the pilot phase of the study. In the case of staff turnover, new site staff will be trained using a recorded webinar of the same training module used at the start of the evaluation.
The training module will also include an explanation of the support materials created by Abt Associates to assist partner site staff. Training materials will include a site-specific procedural manual that explains the intake procedure, random assignment, and data collection. As part of the initial training, the ISIS trainer will walk through the training manual with site staff. Site staff will also be able to access additional support tools via the ISIS website or contact their ISIS site team with other questions.
The ISIS team will monitor recruitment, intake, and data security through regular technical assistance calls (likely bi-weekly) and site visits. During these calls, ISIS team members will probe partnering site staff for problems that have come up during the intake process. These probes will have the dual purpose of troubleshooting problems and making sure site staff is properly administering intake procedures. The ISIS site teams will visit each site at least once prior to the pilot period and once during the pilot period or early in the full implementation period to monitor their intake procedures. The frequency of site visits beyond this time is still being determined.
b. Power analysis
i. What does the literature suggest about the expected effect sizes and how did that literature drive the design decisions?
The most pertinent recent study of occupational training is the Sectoral Employment Impact Study (SEIS) conducted by Public/Private Ventures. This study found an impact on earnings of $4,011 in the second year after random assignment. This was a relatively large impact, with an effect size of about 0.28. (To compute this effect size, we first calculated the standard deviation of annual earnings using the standard error of impact, sample sizes, and R-squared found in the published study. The calculated standard deviation of annual earnings is $14,169.)
The MDE for annual earnings at an ISIS site, based upon an expected sample size of 1,200, is $1,887. This is equivalent to an effect size of about 0.13. As the SEIS impact is unusually large compared with most employment interventions, we wanted to have sufficient statistical power to detect a considerably smaller impact on earnings. The impact on annual earnings found in a recent non-experimental analysis of the effects of Workforce Investment Act (WIA) training is roughly the size of the ISIS MDE for annual earnings (Heinrich, Carolyn, Peter R. Mueser, and Kenneth R. Troske. 2009. “Workforce Investment Act Non-Experimental Net Impact Evaluation: Final Report.” Washington, D.C.: U.S. Department of Labor, Employment and Training Administration Occasional Paper 2009-10.)
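The effect-size arithmetic above can be checked directly. Because the underlying SEIS standard error, sample sizes, and R-squared are not reproduced in this memo, the sketch below starts from the stated standard deviation of $14,169; the even 600/600 treatment–control split used to back out the design multiplier is an assumption for illustration.

```python
import math

# Dollar figures stated in the text.
sd_earnings = 14_169   # calculated SD of annual earnings (from SEIS)
seis_impact = 4_011    # SEIS year-2 earnings impact
isis_mde = 1_887       # ISIS site-level MDE with an expected sample of 1,200

# Effect sizes = impact (or MDE) divided by the standard deviation.
seis_effect_size = seis_impact / sd_earnings   # about 0.28
isis_effect_size = isis_mde / sd_earnings      # about 0.13

# Implied MDE design multiplier, assuming a 600/600 split (an assumption):
# MDE = multiplier * SD * sqrt(1/n_T + 1/n_C)
multiplier = isis_mde / (sd_earnings * math.sqrt(1 / 600 + 1 / 600))

print(f"SEIS effect size:     {seis_effect_size:.2f}")
print(f"ISIS MDE effect size: {isis_effect_size:.2f}")
print(f"implied multiplier:   {multiplier:.2f}")
```

The implied multiplier of roughly 2.3 is in the range one would expect for a two-sided test at the 0.10 level with 80 percent power and modest covariate adjustment.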
ii. The rationale for a minimum detectable effect for earnings is useful. What is the comparable logic for the other key outcomes?
We also show MDEs for binary outcomes to represent outcomes involving completion of training, a key outcome in the ISIS evaluation. Expected impacts are somewhat more difficult to benchmark than for earnings, as they will depend on the types of training and credentials involved. Given that ISIS sites will be testing relatively intensive programs, we expect impacts at least as large as those shown in Exhibit B-3 for the overall sample in each site. As noted earlier, we are likely to treat subgroup comparisons as “exploratory” in light of the larger MDEs involved.
iii. The power analysis uses a 0.10 level of significance, while 0.05 is much more typical. Why?
Many recent random assignment studies of social and educational programs use a 0.10 level of significance. We include a list below of recent studies that use this standard of significance. Using a 0.10 level of significance rather than a 0.05 level increases the chance of Type I error (finding statistically significant effects when true effects are zero) and decreases the chance of Type II error (failing to find significant effects when true effects are non-zero). Studies often characterize findings that are significant at the 0.05 level as stronger than those that are significant at the 0.10 level but not the 0.05 level.
Hendra, R., Ray, K., Vegeris, S., Hevenstone, D., & Hudson, M. (2011). Employment, Retention and Advancement (ERA) Demonstration Delivery, Take-Up, and Outcomes of an In-Work Training Support for Lone Parents. New York, NY: MDRC.
Hendra, R., Riccio, J., Dorsett, R., Greenberg, D. H., Knight, G., Phillips, J., Robins, P., Vegeris, S., & Walter, J. (with Hill, A., Ray, K., & Smith, J.). (2011). Breaking the Low-Pay, No-Pay Cycle: Final Evidence from the UK Employment Retention and Advancement (ERA) Demonstration. New York, NY: MDRC.
Maguire, S., Freely, J., Clymer, C., Conway, M., & Schwartz, D. (2010). Tuning In to Local Labor Markets: Findings From the Sectoral Employment Impact Study. Philadelphia, PA: Public/Private Ventures.
Michalopoulos, C., Wittenburg, D., Israel, D. A. R., Schore, J., Warren, A., Zutshi, A., Freedman, S., & Schwartz, L. (2011). The Accelerated Benefits Demonstration and Evaluation Project Impacts on Health and Employment at Twelve Months: Volume 1. New York, NY: MDRC.
Miller, C., Binder, M., Harris, V., & Krause, K. (2011). Staying on Track: Early Findings from a Performance-Based Scholarship Program at the University of New Mexico. New York, NY: MDRC.
Puma, M., Bell, S., Cook, R., Heid, C., Shapiro, G., Broene, P., Jenkins, F., Fletcher, P., Quinn, L., Friedman, J., Ciarico, J., Rohacek, M., Adams, G., & Spier, E. (2010). Head Start Impact Study: Final Report. Washington, D.C.: U.S. Department of Health and Human Services.
Scrivener, S., & Weiss, M. (with Teres, J.). (2009). More Guidance, Better Results? Three-Year Effects of an Enhanced Student Services Program at Two Community Colleges. New York, NY: MDRC.
Wood, R., McConnell, S., Moore, Q., Clarkwest, A., & Hsueh, J. (2010). Strengthening Unmarried Parents’ Relationships: The Early Impacts of Building Strong Families. Princeton, NJ: Mathematica Policy Research.
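The choice of significance level feeds directly into the minimum detectable effect: the standard MDE multiplier is the sum of the critical value for the test and the normal quantile for the desired power. A quick sketch, assuming two-sided tests at 80 percent power (standard parameters for this kind of calculation, not figures taken from the ISIS power analysis):

```python
from statistics import NormalDist

def mde_multiplier(alpha: float, power: float = 0.80) -> float:
    """MDE multiplier for a two-sided test: z_(1 - alpha/2) + z_(power)."""
    z = NormalDist().inv_cdf
    return z(1 - alpha / 2) + z(power)

m10 = mde_multiplier(0.10)  # about 2.49
m05 = mde_multiplier(0.05)  # about 2.80

print(f"alpha = 0.10 multiplier: {m10:.2f}")
print(f"alpha = 0.05 multiplier: {m05:.2f}")
print(f"MDE reduction from using 0.10 instead of 0.05: {1 - m10 / m05:.1%}")
```

Under these assumptions, testing at the 0.10 level shrinks the MDE by roughly 11 percent relative to the 0.05 level, which is the practical trade-off the studies listed above accept.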
c. From what experience does the 80% response rate assumption come? Is that at baseline or first follow up? Is it conditional?
The MDEs presented in Exhibit B-3 refer to outcomes estimated in planned follow-up surveys, not the baseline forms. The 80% response rate standard for follow-up surveys is consistent with rates Abt’s survey group and other professional survey groups have achieved in similar studies in the past. (The 12-month follow-up survey for ISIS will be submitted for OMB approval in a future submission package.)
As stated in Section B.3, all individuals who agree to participate in the evaluation must complete the BIF and the SAQ at baseline in order to have the opportunity to be randomly assigned to the career pathways program. Therefore, a response rate of 100 percent is expected for these baseline instruments.
d. Item B.2.4, is the “additional intake activities associated with ISIS will take under two hours to complete” referring to burden on the sites or the individuals? Is this additional burden included in the burden estimates?
This section has been edited to more clearly explain intake procedures and timing. Activities associated with ISIS will take participants approximately 45 minutes to complete, including informed consent (10 minutes), the BIF (15 minutes), and the SAQ (20 minutes). This burden applies to program participants.
5. Consent Form (Appendix B)
a. What happens to those who do not consent to participation in the study -- are they in effect offered “control group” services?
Yes. The informed consent has been amended to make clear that if someone declines to participate, the site staff person will give him or her a list of other services in the community.
b. The potential risks paragraph seems a bit repetitive and confusing. Can this be streamlined?
Yes. We have added simpler language to minimize confusion.
6. Questionnaires
a. Race question – we appreciate the edits, but the question is not quite in compliance with OMB standards. It should be “mark one or more,” not “all that apply.” Also, please delete the “other” category.
We have made the above changes to the BIF instrument.
b. We understand the desire to collect family composition data. However, we do not consider the rationale provided adequate justification for collecting PII on all individual children in the family, especially when we know from other studies that these data can be considered sensitive by parents. Please delete/modify or justify each variable initially proposed.
Building knowledge on interventions that can promote child and family well-being in the long-run is a key aspect of ACF’s mission. There are both policy and theoretical reasons to expect impacts on wider aspects of child and adult wellbeing from ISIS and other interventions that seek to improve family economic outcomes. Tensions between work, family, and school may affect parenting of children of different ages. Parents’ educational activities, psycho-social effects, and study skills all may have impacts on parenting styles and behaviors.
We expect that the 12-month survey, and potentially the 36-month survey, will be pre-populated with information about children identified at baseline, which will be used to determine children’s whereabouts and to learn about new children in the household. For each child, we will ask a short series of items about school and behavior, and in some cases conduct more intensive interviews or observations with a focal child. This has been the approach in a number of national studies.
Identification information on children is needed to measure important outcomes via linkages to administrative data, potentially from education, criminal justice, and welfare systems. For example, we hypothesize that positive post-secondary results for low-income parents may help to foster higher high school graduation and college attendance for older children, and perhaps better school outcomes for younger children as well. The ability to link to administrative data systems capturing these child outcomes also will lessen any burden that might arise from having to otherwise add questions on these outcomes to follow-up surveys. We have included the appropriate language concerning such linkages in the informed consent form.
Additional Modifications:
A.12 We revised the burden estimates to reflect the times generated in a pre-test of the baseline forms. The estimate for the time to complete the informed consent increased from 5 to 10 minutes, and the estimate for the time to complete the SAQ decreased from 30 minutes to 20 minutes. The estimate for the time to complete the BIF remained constant at 15 minutes. These changes reduced the yearly burden estimate from 4,662 to 4,212 hours and the total burden from 9,162 to 8,262 hours.
B.2.4 In addition to providing a more detailed explanation of baseline data collection procedures, we made a decision to offer the SAQ only on a paper form. Previously we wrote that sites would have the option of using either paper or electronic forms.
B.2.5 We changed the informed consent to be at a 9th grade reading level, instead of an 8th grade reading level. Because these programs are conducted in English at the secondary education level and above, we are confident that all program participants will be comfortable with the language of all program materials.
B.4 We included the results from a pre-test we conducted of our baseline forms.
B.5 The ISIS project leadership changed from the time of the previous submission. David Fein is now a co-Principal Investigator instead of the Project Director. Karen Gardiner has replaced Dr. Fein as the Project Director. Daniel Kitrosser has taken Alan Werner’s position as Deputy Project Director.
Baseline data collection forms: The BIF and SAQ include several changes resulting from further instrument refinement, including problems identified during the pre-test. These changes are described in Section B.4.