SSB Prog Supp GenIC - Multi-Site Study of State-Tribal Collaboration in Home Visiting


Formative Data Collections for ACF Program Support


OMB: 0970-0531


Alternative Supporting Statement for Information Collections Designed for

Research, Public Health Surveillance, and Program Evaluation Purposes



Multi-Site Study of State-Tribal Collaboration

in Home Visiting


Formative Data Collections for ACF Research


0970-0356




Supporting Statement

Part B

June 2022


Submitted By:

Office of Planning, Research, and Evaluation

Administration for Children and Families

U.S. Department of Health and Human Services


4th Floor, Mary E. Switzer Building

330 C Street, SW

Washington, D.C. 20201


Project Officers:

Aleta Meyer, PhD, Office of Planning, Research, and Evaluation

Nicole Denmark, PhD, Office of Planning, Research, and Evaluation








Part B

B1. Objectives

Project Objectives

The purpose of this data collection is to provide descriptive information on collaborations between Maternal, Infant, and Early Childhood Home Visiting (MIECHV) awardees and tribal communities and on how those collaborations influence the planning and implementation of services. This data collection is designed to help ACF and Health Resources and Services Administration (HRSA) program offices tailor program guidance, increase the usefulness of support provided to awardees, and inform the development of future research within the MIECHV Learning Agenda. Information from the interviews can inform changes to program guidance that better support the development and sustainment of awardee-tribal partnerships. The interviews will also identify training and technical assistance resources that can support successful collaboration.


Generalizability of Results

This study is intended to present an internally valid description of collaboration occurring between MIECHV awardees and programs serving tribal communities. It is not intended to support statistical generalization to other programs or service populations. The study results and themes will be useful in informing federal planning, guidance, and technical assistance.


Appropriateness of Study Design and Methods for Planned Uses

As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.  


We will use a case study design because it allows us to deeply explore the unique state-tribal partnerships within MIECHV. We plan to recruit up to six cases (state-tribal partnerships) for the study and to interview up to seven individuals within each case, for a total of up to 42 interview participants.


Based on our prior work with MIECHV, we know that collaboration between MIECHV awardees and tribal communities takes different forms and occurs within diverse contexts. As explained in section A1 of SSA and in section B2 of this supporting statement, a case study design will allow us to examine and describe the different types of partnerships. The study design will rely on individual key informant interviews with state and tribal agency staff members within each of the cases.


The information gathered will be purely descriptive and is not intended to be representative of all state-tribal collaborations. The information collected will not be used to assess service delivery or evaluate impact. Key limitations will be noted in written products associated with this study.


B2. Methods and Design

Target Population

We will attempt to collect data from staff at six MIECHV awardee agencies and from staff at the local agencies they partner with that serve American Indian and Alaska Native (AIAN) families. Within each partnership, we will collect information from staff who are identified as knowledgeable of the partnership. This may include state administrators, tribal site program managers, state-tribal liaisons, supervising home visitors, and others knowledgeable of the partnership.


Sampling

Within the MUSE-STC study, sampling occurs at two levels: (1) identifying cases through a case selection process and (2) identifying individual interview participants at each site. As noted above, we anticipate recruiting up to six cases for the study and interviewing up to seven individuals within each case (a total of up to 42 individual interview participants).


Case Selection

In case studies that strive to describe a type or a larger set of examples, case selection is a critical step in the process because it helps to ensure that the cases selected will result in data that adequately describe the type. For case selection within MUSE-STC, we used existing data to determine a sample of cases that most clearly represent typical partnerships. We also consulted with federal partners and stakeholders throughout the case selection process, which involved three iterative phases: (1) reviewing and prioritizing data to inform case selection, (2) defining case criteria, and (3) selecting cases.


Our multi-step case selection process began by identifying the sampling frame, which is the full universe of potential cases. Through our review of existing data sources, we identified 29 possible cases. We then sequentially applied three exclusion criteria to ensure that the selected cases represented typical state-tribal partnerships, excluding: (1) partnerships that are not active or not focused on serving Tribal communities, (2) partnerships that are less than three years old, and (3) partnerships where only one site meets the case criteria (e.g., type of local agency). After applying the exclusion criteria, we identified characteristics that we wanted to prioritize for inclusion. These characteristics were identified through engagement with stakeholders and ensured that the final selection of cases not only represents typical partnerships but also captures the diverse and unique qualities of these collaborations. After applying the eight inclusion criteria, we selected six cases that represented the characteristics prioritized by the study team and stakeholders.


Participant Selection

The interviewees will be selected through purposive sampling; that is, they will be chosen based on specific characteristics (e.g., role, longevity at agency, knowledge of the partnership) rather than at random. The sample will include staff from both agencies directly participating in the partnership case. MUSE-STC study staff will work with an engagement partner from each case to identify individuals who are knowledgeable of the partnership. Respondents may include state administrators, tribal site program managers, state-tribal liaisons, supervising home visitors, and others knowledgeable of the partnership.


B3. Design of Data Collection Instruments

Development of Data Collection Instruments

Prior to developing the data collection instrument, we met with an expert consultant to obtain input on the structure and flow of the instrument as well as on key domains of inquiry based on the research questions. These domains were presented to the MUSE-STC State-Tribal Expert Panel (STEP) for additional feedback. The STEP consists of five individuals who are involved in state-tribal MIECHV partnerships. Once domains were established, we led a two-hour “working session” with federal partners to brainstorm potential interview questions. Following this planning and input process, we developed an initial draft interview protocol based on OPRE and HRSA’s goals and objectives for the project as well as stakeholder input received from the MUSE-STC STEP. Once we had a working draft, we held a follow-up meeting with the STEP to obtain input on the protocol. The groups that provided information were: (1) STEP members, (2) HRSA staff, and (3) OPRE staff. We piloted the instrument with contractor staff who are not involved in the study to determine an accurate burden estimate and refine the protocol. In response to feedback received during piloting, we refined question wording and removed sub-questions that were deemed redundant. Table 1 outlines how each research question will be addressed in the interviews.


Table 1. Crosswalk of Research Questions and Protocol Questions

Research Questions

Protocol Question Numbers

What do MIECHV awardees and tribal communities consider when deciding whether to partner?

2, 5, 6, 7

How are partnerships between MIECHV awardees and tribal communities established?

2, 3, 5, 6, 17, 21

How are partnerships between MIECHV awardees and tribal communities structured?

2, 3, 4, 6, 16, 17, 19, 21

What are the markers of a well-functioning partnership between MIECHV awardees and tribal communities?

5, 7, 8, 11, 12, 13, 14, 15, 16, 17, 18, 19, 23, 25, 26

How are these partnerships functioning during planning?

2, 5, 8, 9, 10, 11, 13, 14, 17, 18, 21, 23

How are these partnerships functioning during implementation?

5, 8, 9, 10, 11, 13, 14, 15, 17, 18, 19, 21, 22, 23, 25, 26

What are the facilitators and barriers of well-functioning partnerships?

5, 7, 8, 9, 10, 11, 13, 14, 15, 16, 18, 19, 23, 24, 25, 26

What are the challenges and benefits of these partnerships from the perspective of MIECHV awardees and tribal communities? 

5, 7, 8, 9, 11, 13, 14, 15, 16, 18, 19, 22, 23, 24, 25, 26


B4. Collection of Data and Quality Control

Who will be collecting the data (e.g., agency, contractor, local health departments)?

One member of the contractor study team will conduct each interview. With permission from the participant, the interviews will be audio recorded. Interviews will then be transcribed.


What is the recruitment protocol?

Recruitment for MUSE-STC will occur in two stages: case recruitment and individual participant recruitment.


Case Recruitment

During the case recruitment phase, we will work to engage both the awardee agency and the local implementing agency serving AIAN families. To begin case recruitment, we will request an up-to-date contact list of MIECHV awardee leads from HRSA. HRSA Project Officers will email all awardees with tribal partnerships (i.e., entire sampling frame) to inform them of the study, convey HRSA’s support for the study, and notify the awardee that they may be selected for the study and contacted by study staff. See APPENDIX B for a draft of the email. We will then contact each awardee lead selected for the study via email to introduce the goals of the study, the partnership of focus, the staff roles we would like to speak with, the types of questions we will ask, and the time commitment. In this email we will request a phone conversation to provide more details and discuss the processes for engaging them as a partnering agency. See APPENDIX C for a draft of the email. We will attach a flyer describing the study in more detail to the outreach email (APPENDIX D).


In preparation for these initial engagement calls, study staff will gather information about the selected partnership, including agency information, staffing, and any anticipated tribal and state approval processes. This preparation will enable study staff to be more informed during initial discussions, facilitating and expediting decision making. Because of the diversity of partnership arrangements, case recruitment specifics will vary by case. Case recruitment will proceed according to what is appropriate for the entities involved in the specific partnership and in a way that honors tribal sovereignty. Once we have successful engagement from individuals within both partnering agencies who have the authority to consider study participation, we will begin formal local study approval processes with both entities (e.g., Tribal approval, Tribal IRB).


Participant Recruitment

After we have received approval for study participation from all necessary entities within the case, we will begin the participant recruitment phase. We will work with the project director, or individual in an equivalent position, at each agency to identify an engagement partner. MUSE-STC study staff will work with the engagement partner to identify the individuals who fill the roles described above and obtain their contact information. A member of the study staff will contact the respondent via contact information provided (e.g., email, phone) to explain the study and invite them to participate in an interview (See APPENDIX A).


The email will provide at least three proposed times for the interview and offer the respondent an opportunity to identify another time as necessary. The email will also give respondents a way to contact study staff (via email or telephone) if they have questions or want to confirm their participation; let respondents know that we will follow up with telephone and email reminders if they do not respond; include information about the informed consent process; and reference the $75 honorarium. If individuals cannot be reached after six total attempts (up to three emails and up to three phone calls), study staff will return to the designated engagement partner to verify contact information and determine whether another individual in the same role can be identified for recruitment. Recruitment will proceed until the interview has been completed for that role at that site or no further potential participants can be identified.


What is the mode of data collection?

We will use an encrypted Zoom cloud platform to administer the informed consent process and conduct all interviews (see APPENDIX E for the Consent Form). Because interviews will be conducted virtually, the study team will pursue a waiver of documented consent and instead use a verbal consent process prior to each interview. The audio files will be uploaded to a secure, password-protected, cloud-based data storage website, and each file will then be deleted from the Zoom cloud. After transcription, audio files will also be deleted from this data storage website. No direct identifiers will be included in the transcripts used for analysis. All transcripts will be uploaded to a secure Dedoose account and will be deleted upon analysis completion. The original Word transcript file will be stored on the study’s secure, password-protected, cloud-based data storage website as a backup and will be destroyed three years after study completion.


Identifying contact information used for recruitment and consent will be stored on James Bell Associates’ (JBA’s) secure, password-protected, cloud-based data storage website. Any other data collected during the course of virtual site visits (e.g., program materials, reports, contracts) will also be stored on JBA’s secure website.


How are the data collection activities monitored for quality and consistency (e.g., interviewer training)?

The Qualitative Specialist and Principal Investigator (PI) for MUSE-STC will train up to two additional contractor staff to conduct the interviews. During the training, the Qualitative Specialist and PI will provide MIECHV context and discuss interviewing techniques. Training will consist of role-play and didactic approaches. After the training, the team will debrief to ensure the trainees understood why certain questions were probed or modified. Once we are confident that the trainees understand the study goals and interviewing techniques, they will begin conducting interviews. The training will be accompanied by a comprehensive interviewing protocol providing detailed reviews of major interviewing skills (e.g., probing techniques) and step-by-step instructions for interview activities (e.g., using the Zoom platform).


What data evaluation activities are planned as part of monitoring for quality and consistency in this collection, such as re-interviews?

While interviews are taking place, study staff will hold a weekly meeting with all interviewers (including the Qualitative Specialist and Principal Investigator for the project) to discuss issues that are arising, monitor progress and quality, and ensure consistency. The Qualitative Specialist will review a subset of interview transcripts on an ongoing basis.


B5. Response Rates and Potential Nonresponse Bias


Response Rates

We aim to recruit six cases. If either the MIECHV awardee or the local agency declines to participate in the study, we will return to the list of potential cases to select a suitable replacement and proceed through the process with the replacement partnership. Our goal is to interview up to seven individuals knowledgeable of the partnership in each case. If individuals decline to participate, we will attempt to recruit someone in a similar role who could speak to the partnership. The interviews are not designed to produce statistically generalizable findings, and participation is wholly at the respondent’s discretion.


Nonresponse

Participants will not be randomly sampled, and findings are not intended to be representative, so nonresponse bias will not be calculated. We will track interview participation and report the number of individuals who decline to participate as well as those we are unable to reach or confirm for an interview.


B6. Production of Estimates and Projections

The information will not be used to generate population estimates, either for internal use or dissemination, and no policy decisions will be made based on these data. The information will describe how collaborations between MIECHV awardees and local agencies serving tribal communities influence planning and implementation of home visiting services.


B7. Data Handling and Analysis

Data Handling and Data Analysis

Analysis of case study data will involve an iterative process. Analysis and interpretation will be done through ongoing refinement rather than being confined to a pre-established set of concepts or theories. Study staff will initiate analysis when data collection begins, and analysis will proceed throughout the study. We will use content analysis, a process for interpreting meaning from what participants say in interviews (i.e., through reviewing transcripts). The study staff will use a mix of deductive and inductive approaches. Deductive approaches establish at the outset (before data collection) what researchers expect to find in the data. Inductive approaches identify themes, patterns, and trends emerging from the data itself. We will assess data collection findings and potential areas for additional inquiry on an ongoing basis and address them in subsequent interviews or through follow-up with participants. Potential gaps could include uneven recruitment of interview participants across roles, interview questions not generating rich responses, or protocols not comprehensively addressing topics or themes that emerge as critical.


Data (interview transcripts) will be analyzed using both a priori codes (deductive), which are developed before reviewing transcripts, and free codes, which are added during the analysis (inductive). Codes will then be used to identify themes. These themes will be generated and prioritized in a collaborative process drawing on multiple sources, including consultations with the STEP and other study collaborators, related literature on partnerships to support implementation, and the data itself. Study staff will code the data using qualitative software (e.g., Dedoose).


Data Use

MUSE-STC findings will be used to inform and tailor HRSA program guidance and supports. To this end, data will inform an internal memo describing methods, findings, and implications for federal planning and technical assistance and will likely inform a final report on MIECHV collaboration. Review and approval requirements of participating tribes will be followed before publicly disseminating any materials based on study findings.


The MUSE-STC study will not provide open access to raw data either during the study or as an archive (i.e., a secondary data analysis file). The study team recognizes the benefits of open access archives of this kind for optimizing the utility and enhancing the impact of collected data, especially when data are accessible to the communities from which they are collected; however, opening access to this data set would not be appropriate because it would violate case and participant confidentiality.


B8. Contact Person(s)

Erin Geary

MUSE-STC Project Director

Senior Research Associate

James Bell Associates

3033 Wilson Blvd., Suite 650 | Arlington, VA 22201

geary@jbassoc.com

(703) 528-3230


Attachments

INSTRUMENT A: Interview Protocol

APPENDIX A: Draft Recruitment Email and Call Script

APPENDIX B: Draft Study Notification Email from Health Resources and Services Administration (HRSA)

APPENDIX C: Draft Site Engagement Email

APPENDIX D: MUSE-STC Research Flyer

APPENDIX E: Consent Form


