Youth Education and Relationship Services (YEARS)
OMB Information Collection Request
New Collection
Supporting Statement
Part B
August 2015
Submitted By:
Office of Planning, Research and Evaluation
Administration for Children and Families
U.S. Department of Health and Human Services
7th Floor, West Aerospace Building
370 L’Enfant Promenade, SW
Washington, D.C. 20447
Project Officers:
Lauren Supplee, ACF
Samantha Illangasekare, ACF
B1. Respondent Universe and Sampling Methods
Web-Based Staff Survey
We will administer a web-based staff survey to approximately 44 Healthy Marriage and Relationship Education (HMRE) grantees funded by the Office of Family Assistance in 2011 that are identified as youth-serving. We will ask that a Program Director/Administrator and a Program Facilitator from each grantee respond to the survey. The respondent universe includes all Program Directors/Administrators and Program Facilitators at 2011 OFA-funded HMRE programs. Individuals in the respondent universe are those who hear about the survey through the recruitment email or word of mouth and have access to the internet. We estimate a sample size of approximately 88 respondents (44 Program Directors/Administrators and 44 Program Facilitators).
Site Visits
Six site visits will be conducted with 2015 OFA-funded HMRE programs serving youth aged 14 to 24. We will prioritize programs that also had funding in a prior funding period (ideally 2011-2015, but we may go back to grantees funded from 2006-2011). At each site visit we will conduct up to three semi-structured interviews and up to two observations of program sessions. Semi-structured interviews will be conducted with a Program Director/Administrator, a Program Facilitator, and an individual who works at a partner organization, such as the school where the HMRE program is implemented or a community-based organization that works closely with the program. The respondent universe for semi-structured interviews includes Program Directors/Administrators at grantee organizations, Program Facilitators working at OFA-funded HMRE programs serving youth aged 14 to 24, and non-program personnel working at organizations that partner with an OFA-funded HMRE program serving youth aged 14 to 24. Individuals in the respondent universe are those who hear about our study through word of mouth or via email and have an internet or phone connection. We will not track characteristics of semi-structured interview respondents.
We will track characteristics of site visit organizations at the organizational level to ensure we visit, and therefore observe, a broad range of programs serving youth aged 14 to 24. These characteristics include geographic region, target population, and the age, race/ethnicity, and gender of youth participants (see Appendix E Screening Matrix YEARS Site Visits).
We estimate a sample size of approximately 15 semi-structured interview participants (6 Program Directors/Administrators, 6 Program Facilitators, and 3 partner organization staff members).
B2. Procedures for Collection of Information
Web-Based Staff Survey
Web-based surveys have become an increasingly popular and powerful method of gathering information in research and evaluation. Research exploring the merits of web-based versus mail, fax, phone, or mixed-mode surveys finds that web-based surveys have higher response rates than mail and fax but lower response rates than mixed-mode methods (Cobanoglu, Warde, & Moreo, 2000; Greenlaw & Brown-Welty, 2009). Web-based surveys are also significantly less costly than mixed-mode efforts, and the literature finds that the use of incentives can substantially improve response rates in some populations (Greenlaw & Brown-Welty, 2009; Sills & Song, 2002).
Our proposed approach includes mailing a cover letter with a hyperlink to the survey, following up by email and phone with non-responders, and offering a $10 gift card as a token of appreciation. We believe that this combination of efforts will help to reduce nonresponse.
Child Trends will recruit OFA-funded HMRE grantees and administer a web-based survey (see Attachment 1 YEARS Web-based Staff Survey and Attachment 2 YEARS Web-Based Staff Survey for Program Facilitators) to a Program Director/Administrator and a Program Facilitator working at each OFA-funded HMRE grantee (for consent, see Appendix C Consent Form YEARS Web-based Staff Survey).
Child Trends will contact grantee staff via email (see Appendix F Letter to Grantees YEARS Survey) to inform them about the survey and to provide background about its goals and objectives. We will send the electronic link to the designated contact person at each grantee organization (identified using the contact information found in the grantee applications and grantee profiles). This correspondence will contain additional instructions about distributing the survey to appropriate staff internally, consent and privacy, the due date, information about the gift card for completing the survey, and whom to contact with questions or concerns. Additionally, the main point of contact at each grantee site will be given the option to have Child Trends send the survey to their staff directly. To ensure anonymity of respondents and to ensure that respondents' contact information is not linked with their survey responses, participants will be provided a link at the end of the survey to a separate form where they can enter their name and the address where they would like the $10 gift card to be sent.
While efforts will be made to triangulate data across different levels of staff (Program Directors/Administrators and Program Facilitators) where appropriate, Child Trends will avoid unnecessary redundancy to minimize participant burden. For instance, questions about recruitment strategies might be posed to both Program Directors/Administrators and Program Facilitators, but items related to enrollment have been reserved for Program Directors/Administrators. The web-based staff survey will be available for one month to ensure survey participants have ample time to complete the survey. After the due date, data will be exported from Survey Gizmo to Microsoft Excel and prepared for analysis.
The web-based staff survey will take up to 20 minutes to complete; following up with some grantees by phone or email to remind them to complete the online survey may require an additional 10 minutes, for a total respondent burden of up to 30 minutes (0.5 hours). Participants will receive thank you letters (see Appendix G Thank You Letter YEARS Survey) and their token of appreciation after completing their survey (either in person or via mail). While the survey is open to respondents, Child Trends will check which grantees have responded and send a reminder email (see Appendix H Reminder Email YEARS Survey) or make a reminder call (see Appendix I Reminder Call Script YEARS Survey) once a week to grantees who have not completed a survey.
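As a worked restatement of the survey burden figures above (no new estimates; the total assumes all 88 sampled staff respond):

\[ 20\ \text{min (survey)} + 10\ \text{min (reminder follow-up)} = 30\ \text{min} = 0.5\ \text{hours per respondent} \]
\[ 88\ \text{respondents} \times 0.5\ \text{hours} = 44\ \text{total burden hours} \]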
Site Visits
Site visits will have two primary components: program observations and semi-structured interviews.
Program observations are typically used to allow a researcher or other third party to assess program quality and other program strengths and weaknesses. A program observation tool guides the observer in rating the program session on various elements depending on what domains are being assessed (Hamre & Maxwell, 2011). Often fidelity to a curriculum, participant engagement, and facilitation quality are important domains. Program observation is not the same as participatory action research; the researcher is not involved in the program session and does not interact with the participants or the program facilitator (Willis, 2007).
Semi-structured interviewing is one of the most common techniques used in small-scale qualitative research. It allows flexibility for the respondent to help guide the interview and focus on the most important topics of conversation. Semi-structured interviews are versatile, which makes them ideal for interviewing respondents at a range of different programs and for being adapted to different respondent types (Program Directors/Administrators, Program Facilitators, and Partner Organization/Provider personnel) (Rubin, 2011).
The study team will recruit OFA-funded HMRE grantees and determine eligibility at the organizational level. They will then schedule a site visit to observe one or two sessions of HMRE programming with youth aged 14 to 24 and conduct semi-structured interviews with a Program Director/Administrator, a Program Facilitator, and an individual from a partner organization. Grantees will be recruited nationwide.
Grantees will be recruited in two ways: 1) by emailing all grantees, using the point of contact names and email addresses available in the grantee profiles, to introduce them to the study (see Appendix J Letter to Grantees YEARS Site Visits); grantees will then be given the option to call a Child Trends study team member if they are interested in taking part in the study; and 2) by calling grantees (see Appendix K Recruitment Script YEARS Site Visits), using publicly available information from the OFA website, after the email has been sent out. Grantees recommended by OFA leadership will be prioritized for these calls.
Child Trends study staff will call interested individuals from HMRE programs to establish eligibility using a 5-minute site visit screener (see Appendix L Screener YEARS Site Visits). Child Trends staff will be trained on a standardized recruitment script (see Appendix K Recruitment Script YEARS Site Visits) to use when calling interested participants and returning their calls. The site visit screener will identify OFA-funded HMRE grantees that serve youth aged 14 to 24. The HMRE grantee must primarily serve youth in their OFA-funded program, defined as 75% or more of the actual population served. We aim to visit sites that serve a diverse array of youth; thus, our screener asks about the race/ethnicity, age, and income of youth served by the HMRE program. We will aim to recruit organizations that serve youth across these characteristics, but if recruitment is not successful we will adjust some of the specifications of the sample as needed. We will use a screening matrix to track these characteristics (see Appendix E Screening Matrix YEARS Site Visits).
Once a site agrees to participate and is determined to be eligible, we will send a follow-up email (see Appendix M Follow Up Letter YEARS Site Visits) with a semi-structured interview consent form attached (see Appendix D Consent Form YEARS Interviews) to work out logistics such as dates for the site visit. We will also ask this point of contact to forward information about the study to potential participants who might be interested in participating in a semi-structured interview and who may be available for the program observation. When potential interview participants return our email or phone call, we will follow the recruitment script (see Appendix N Recruitment Script YEARS Interviews) and send them the consent form (see Appendix D Consent Form YEARS Interviews) and the introduction to study letter (see Appendix O Introduction to Study Letter YEARS Interviews) separately via email (or, if needed, fax or mail). We will schedule a date and time to conduct the semi-structured interview in person or, if absolutely necessary, by phone. Ideally, but not necessarily, the Program Facilitator who participates in the semi-structured interview will be the same individual facilitating the lesson that we observe during the site visit. One week before the scheduled site visit we will send a reminder email with the date and time of the site visit (see Appendix P Reminder Email YEARS Site Visit). In total, the screener, recruitment, and scheduling procedures will take 30 minutes (0.5 hours) for both the program observation and semi-structured interviews.
Recruited participants (the point of contact for the site visit and the semi-structured interview respondents) will receive a reminder call (see Appendix Q Reminder Call Script YEARS Interviews) and email (see Appendix R Reminder Email YEARS Interviews) the day before the site visit and semi-structured interview. These procedures have been found to minimize the number of cancellations and no-shows. Participants for semi-structured interviews will be asked to provide consent at the time of their scheduled semi-structured interview. Participants will receive thank you letters (see Appendix S Thank You Letter YEARS Interviews) and a token of appreciation after completing their semi-structured interview (either in person or via mail).
Each semi-structured interview will be about one hour long; all other site visit activities described above (e.g., the screener, scheduling and reminder emails, reminder calls, and consent) will take place before the semi-structured interview and will take about 0.5 hours, for a total burden of up to 90 minutes for all site visit activities (30 minutes for screening and recruitment, plus a 60-minute semi-structured interview). Semi-structured interviews will be conducted in person at a time and place that is convenient for the participant. If an in-person semi-structured interview is not possible, interviews will be scheduled to take place by phone at a time that is convenient for the participant. Program observations will last as long as the program session typically lasts and thus will vary by program. Program observations will be scheduled at a time that is approved by the organization and that will allow the study team to observe a core component of the program curriculum. However, this part of the site visit does not add to participant burden.
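The site visit burden estimate can be restated the same way (a derivation from the figures above; the total assumes all 15 interview participants complete both steps):

\[ 30\ \text{min (screening and recruitment)} + 60\ \text{min (interview)} = 90\ \text{min} = 1.5\ \text{hours per participant} \]
\[ 15\ \text{participants} \times 1.5\ \text{hours} = 22.5\ \text{total burden hours} \]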
B3. Methods to Maximize Response Rates and Deal with Nonresponse
Expected Response Rates
Web-Based Staff Survey
The web-based staff survey will be distributed to 2011 HMRE grantee staff. For both the pilot study and full-scale administration, we will target multiple positions within each program, including HMRE Program Directors/Administrators and Program Facilitators, with a single survey that uses skip logic, or branching, to provide a unique set of survey items tailored to each type of respondent. We anticipate that the survey will be administered to two staff members from each organization serving youth. We will send the survey to the point of contact (typically the Program Director/Administrator) at each organization and ask that they send it to a Program Facilitator in order to gain two different perspectives about the programs. Accordingly, we estimate a sample size of approximately 88 respondents (assuming 2 participants from each of the 44 grantees in the 2011 cohort complete the survey). Child Trends recognizes that some respondents may have been awarded new grants in 2015 by the time the final survey is developed, approved, and administered.
Site Visits
We expect, based on experience with similar studies in the past, that the maximum of up to 6 sites (including 15 semi-structured interview participants) can be successfully recruited. Up to three interviews will be conducted during each site visit (a Program Facilitator and a Program Director/Administrator at each site, and an additional Partner Organization/Provider staff member at some, but not all, sites), resulting in an expected 15 semi-structured interview participants. Based on prior experience, we anticipate that we will need to contact (call, email, or screen) 12 sites, 12 Program Directors/Administrators, 12 Program Facilitators, and 12 non-program staff to achieve the target of 6 sites, 6 Program Directors/Administrators, 6 Program Facilitators, and 3 non-program staff at partner organizations. We will recruit non-program staff at partner organizations only if services are offered at partner organizations. Since not all grantees partner with other organizations to deliver their programs, we expect to recruit half as many non-program staff from partner organizations.
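Expressed as implied recruitment yields (derived from the contact figures above, not separate estimates):

\[ 6/12 = 50\%\ \text{for sites, Program Directors/Administrators, and Program Facilitators} \]
\[ 3/12 = 25\%\ \text{for non-program staff at partner organizations} \]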
Dealing with Nonresponse
Web-Based Staff Survey
If a staff member declines to participate, they simply will not click the link to the survey. If, after reading the consent form on the first page that describes their rights and responsibilities as a participant, they do not wish to agree ("consent") to participate, they will be taken to a screen that ends the survey. Nonresponse bias may occur if the participants who respond to the survey provide answers that differ from the responses we would have received from staff who chose not to answer. For example, program staff with more time and interest may be more likely to agree to participate in a survey. We hope that our provision of a gift card as a 'thank you' and our follow-up efforts will maximize participation from a variety of respondents.
Site Visits
Program Observations
If the point of contact with whom we are speaking does not agree to let us observe a program session, we will thank them for their time and not move forward with scheduling the site visit. There are many reasons a grantee may not wish to have an observer in their program session. Nonresponse bias may occur if the individuals declining to allow the program observations serve significantly different populations or have different implementation characteristics or staff qualities than programs that do agree to let us observe a program session. Our careful monitoring and screening process will help to ensure that we observe a diverse set of program sessions.
Interview
If an eligible participant declines to participate, the interviewer will discuss and attempt to address the individual's concerns. If the participant cannot or will not participate in the research, we will select another individual (or organization) with similar characteristics. Nonresponse bias may occur if participants provide responses that differ from those of individuals who choose not to participate. For example, programs that are struggling to engage a hard-to-reach youth population may not be as interested in participating in the study, and their responses may differ from the responses we receive from the programs we do recruit. However, our screening and tracking procedures should help to alleviate some of this potential bias: by carefully tracking the characteristics of participants, we can adjust our enrollment decisions to ensure a more balanced sample across a number of specific characteristics.
Maximizing Response Rates
Web-Based Staff Survey
Although the electronic format of the survey will facilitate rapid dissemination, collection, and data management while reducing costs and participant burden, web-based data collection can pose its own problems. It is not uncommon to encounter low response rates with web-based survey data collection; however, we will closely monitor response rates to the web-based staff survey and will follow up as needed to encourage staff participation. Two follow-up emails will be sent to grantee contacts to remind them and their staff to complete the survey: the first two weeks after the survey is sent out, and the second three days before the survey closes. Given that respondents will be asked to report which grantee they work for, we will track response rates weekly and will follow up with a reminder email each week to grantees who have not responded. We will follow up by phone if grantee contacts are unresponsive via email. We also expect that the provision of a gift card as a 'thank you' will result in higher response rates. Furthermore, staff may need assistance when completing the survey; therefore, study staff will be available to provide technical support if necessary. Lastly, we will proactively monitor the quality of the data from early responders to determine whether there are any issues with the survey tool or particular questions that need to be addressed.
Site Visits
To maximize response rates, study staff will schedule site visits, program observations, and semi-structured interviews at dates and times that are most convenient for participants. We will try to interview all participants at each site during the in-person site visits. We will call and email the day before scheduled interviews to remind participants of the interview or program observation. To further ensure a high rate of participation, we will offer interview participants a $25 gift card as a token of appreciation (see Supporting Statement A, Section A.9, for additional information).
B4. Tests of Procedures or Methods to be Undertaken
Web-Based Staff Survey
Child Trends piloted the web-based staff survey with nine respondents in June 2015 and revised the survey based on respondent feedback. The final version of the web-based staff survey will be distributed after OMB approval in January 2016, and the study team will collect data until March 2016.
Site Visits
Child Trends will visit three pilot sites and pilot the program observations and semi-structured interviews between October and December 2015. At these pilot site visits, Child Trends will interview up to nine participants. If edits are found to be necessary based on the pilot site visits, these changes will be submitted to OMB.
B5. Individuals Consulted on Statistical Aspects and Individuals Collecting and/or Analyzing Data
The YEARS team at Child Trends is led by Dr. Scott, principal investigator, and includes Dr. Kelly Murphy, research scientist; Shelby Hickman, senior research analyst; and Julia Goodwin, research assistant. The YEARS project officers at OPRE are Lauren Supplee and Samantha Illangasekare.