Alternative Supporting Statement for Information Collections Designed for
Research, Public Health Surveillance, and Program Evaluation Purposes
Usability Testing of Resources to Support the Identification and Care of Children with Prenatal Substance or Alcohol Exposure in the Child Welfare System
Formative Data Collections for Program Support
0970 – 0531
Supporting Statement
Part B
January 2022
Submitted By:
Children’s Bureau
Administration for Children and Families
U.S. Department of Health and Human Services
4th Floor, Mary E. Switzer Building
330 C Street, SW
Washington, D.C. 20201
Project Officer:
Sharon Newburg-Rinn, Ph.D.
Social Science Research Analyst
Data Analytics and Reporting Team
Children’s Bureau, ACF, HHS
330 C Street SW Room 3042
Washington, DC 20201
Sharon.Newburg-Rinn@acf.hhs.gov
Part B
B1. Objectives
Study Objectives
The objective of this proposed information collection is to determine the usability of a toolkit of resources being produced by the Children’s Bureau (CB) in collaboration with the Centers for Disease Control and Prevention (CDC) to support child welfare agency staff in identifying and supporting children living with prenatal substance exposure (PSE), in particular prenatal alcohol exposure (PAE). Information will be collected from staff at three local child welfare agencies about the usefulness and ease of use of the toolkit; the agency processes and structures involved in implementing the toolkit; and how the toolkit can be improved. The study team will synthesize the information collected from agency staff and share it with the toolkit development team, who will use it to improve the toolkit. In limited cases, improved sections of the toolkit may be shared back with users for additional feedback. Through this usability testing process, a stable, usable version of the toolkit will be developed, and the study team will gain an understanding of the technical support needed to implement the toolkit. The toolkit can then be piloted through formative evaluation and, eventually, summative evaluation (data collection for those evaluations will be submitted through a separate Information Collection Request).
Generalizability of Results
This study is intended to present an internally valid description of the usability of the toolkit resources from the perspective of those engaged in the usability testing process at selected child welfare agencies. It is not intended to support statistical generalization to other populations; rather, the project’s intent is to gain useful information regarding the utility and feasibility of the toolkit and its components and to learn how the toolkit may work in different child welfare agency contexts.
Appropriateness of Study Design and Methods for Planned Uses
The study design, described in section A2 of Supporting Statement A, is appropriate to achieve the study objectives described above. A core function of usability testing is to quickly assess the adequacy of the toolkit components and to detect deficiencies that require correction by the toolkit development team before the toolkit is finalized and rolled out for implementation during the subsequent evaluation (under a separate OMB Information Collection Request). The proposed study design enables the toolkit development team to understand how the toolkit is received by child welfare agency staff, who are its target end users. The usability testing process will provide the development team with feedback about the strengths and weaknesses of the toolkit and about the supports and considerations needed for its implementation, while there is still time to make modifications. While the child welfare agency staff who will usability-test the toolkit resources typify its intended end users, they are not a representative sample of child welfare staff nationally, and their perspectives are their own. In addition, their perspectives are intended only for the purpose of improving the quality of the toolkit; staff feedback data are not intended to be used to evaluate outcomes of the toolkit. As noted in Supporting Statement A, this information is not intended to be used as the principal basis for public policy decisions and is not expected to meet the threshold of influential or highly influential scientific information.
B2. Methods and Design
Target Population
For each of the three selected usability testing sites, the study team will collect information from agency leadership and from staff in roles that typically provide support or services to children and families living with PAE/PSE (e.g., supervisors, investigation/intake staff, ongoing case management staff) (see Recruitment and Site Selection below). To identify those staff, the study team will work with agency leadership and use non-probability, purposive sampling to select potential respondents who can inform the usability of the toolkit resources. Participants will not be representative of the population of child welfare staff; instead, the study team aims to obtain variation in staff members’ experiences in order to capture a range of practice-informed perspectives on the toolkit from those who work on different aspects of services to children and families living with PAE/PSE.
Recruitment and Site Selection
The study team has worked with selected experts and stakeholders to identify states they perceive as demonstrating strong practice in identifying and caring for children with PAE/PSE. The study team also reviewed state/county child welfare and ancillary system websites and compiled information regarding state and local agency practices to inform site selection. Based on these initial activities, the team developed a preliminary list of state child welfare agencies to engage in discussions about participating in the evaluation of the toolkit. The study team will first approach three prioritized state child welfare agencies. The prioritized states have demonstrated some pre-existing buy-in and motivation to engage with the resources being tested.
The team will communicate first with representatives of the respective ACF Regional Offices for the prioritized states to describe the project and study and to explore the suitability of engaging the state agencies (e.g., confirm there are no competing demands or concerns). The team will then contact the selected states’ agency directors (see recruitment letter in appendix B and project description in appendix E) to invite their participation in discussions about their interest in and availability for the study and to assess whether they meet the study criteria. The criteria for inclusion in the usability testing study are:
Agency has interest in the project, evaluation, and desired outcome
The values and culture of the agency align with the project objectives and activities (e.g., desire to improve screening and care of children with PAE/PSE)
Agency has adequate staff resources to test the toolkit (e.g., sufficient number of supervisors, time to complete usability activities)
Discussions with agency directors will continue until the study team has successfully recruited at least two states. In collaboration with the directors, the team will identify one or two local child welfare agency sites within each state that could potentially serve as usability testing sites. The study team will contact the local agency directors of those sites to repeat the process of study description and invitation to participate (see recruitment letter in appendix C).
The study team will work with site directors to identify and recruit 6-8 staff at each site to participate in usability testing. Staff will be recruited to represent the following agency staff roles at each site:
Child welfare agency director (1 individual)
Supervisors (1-2 individuals)
Staff working in investigation/intake, ongoing case management, foster care/adoption/permanency, family preservation services (2-3 individuals)
Child welfare agency professionals working in specialist roles that align with toolkit resources and targeted processes (1-2 individuals), which may include:
Data/CQI specialists working with PSE-related data and documentation systems
Agency managers involved in determining policy and practice guidance for PAE/PSE (e.g., substance-exposed newborn program manager)
Staff involved in managing training
These potential study participants will receive an emailed invitation to participate from the study team (see recruitment letter in appendix D).
B3. Design of Data Collection Instruments
Development of Data Collection Instrument
Guided by the five questions listed in Supporting Statement A (see section A2), the study team developed a set of interview questions to assess the usability of each component of the toolkit. The study team collaborated with the project’s toolkit development team to streamline questions and ensure the interview protocol reflects the organization and content of the toolkit. To minimize burden on respondents, the protocol was limited to the essential questions needed to understand target users’ perspectives on the usability of each component.
B4. Collection of Data and Quality Control
Data will be collected via interviews with child welfare agency staff who are identified as appropriate informants by the study team and agency leadership (see section B2). Over the course of five months, these study participants will work with the study team to establish a structured process to review and discuss two toolkit components at a time. As participants complete their review of each pair of toolkit components, the study team will contact them via email to schedule group or individual interviews to obtain their feedback on the toolkit components. Child welfare agency directors and staff in specialist roles will be interviewed individually. Child welfare agency supervisors and staff – who typically work together in supervisory teams – will be interviewed in groups.
Interviews will be conducted by phone or video call and facilitated by interviewers from the study team, all of whom have been trained in qualitative data collection. Interviewers will be trained in the interview protocols and will be knowledgeable about the agencies to ensure consistent facilitation of the interviews. To further ensure data quality, primary interviewers will have a second study team member participate in and record the interviews (with the consent of the interviewees). All transcripts derived from recorded interviews will be reviewed for accuracy by the interviewers and de-identified before the content of the interviews is analyzed. If a potential informant is not available for an interview or otherwise needs the flexibility to provide data at their own convenience, an online version of the interview questions will be made available to that respondent. The full study protocol, including recruitment, the consent process, administration, analysis, and reporting, will be approved by the study’s Institutional Review Board.
B5. Response Rates and Potential Nonresponse Bias
Response Rates
Maximizing participation is key to the success of these data collection efforts. The content and format of the interview protocol were designed to collect the needed data with minimal burden on child welfare staff respondents, to encourage and enable participation. These data collection activities are not designed to produce statistically generalizable findings, and participation is wholly at the respondents’ discretion. Data collection strategies that emphasize flexibility, privacy, and respect for the respondent’s time can facilitate participation. The following strategies will be implemented to maximize participation in the data collection:
Introduction and notification: The data collection effort will be introduced to potential respondents in advance, using approaches designed to build buy-in for participation.
Timing of data collection: Discussions will be held with leaders of the partnering agencies to determine optimal periods for data collection to minimize respondent burden and to facilitate recall.
Pre-interview preparation: A copy of the interview protocols will be sent to respondents in advance of interviews. Interviewers will be deeply familiar with the interview protocols and understand the context of each site’s child welfare system to expedite administration of the interview and improve the quality of the data collected.
Administration: The study team will be flexible and accommodating with respect to setting up interviews to meet participants’ availability and rescheduling interviews as necessary.
Alternate response methods: Respondents will be given the option to use an alternative method for providing data – such as completing an online version of the interview questions – if this method helps to increase participation.
Assurances of data privacy: Respondents will be assured that reported data (e.g., opinions, perspectives) will be aggregated and not attributable to individuals.
Provision of non-monetary resources: The team will offer agencies access to expert consultants for program support and recognition in project publications (described in section A13, Costs) to compensate for the time and opportunity costs of participating in the study.
Nonresponse
As participants will not be randomly sampled and findings are not intended to be representative, non-response bias will not be calculated. Respondent demographics will be documented and reported in written materials associated with the data collection.
B6. Production of Estimates and Projections
These data will not be used to generate population estimates, either for internal use or dissemination.
B7. Data Handling and Analysis
Data Handling
The study team will be responsible for the collection, storage, and maintenance of data. All sensitive and personally identifiable information will be stored and maintained in accordance with ACF requirements; the study team has capabilities for the safe storage of sensitive information meeting federal guidelines.
Interview data will come from field notes and transcribed audio recordings. Detailed field notes will document key insights and suggestions for each toolkit component. Audio transcripts (automatically generated by the videoconference platform used for interviews) will be used as needed to verify the accuracy of field notes and to pull direct quotes to use as examples in summaries. When transcripts are downloaded and used for verification, they will be de-identified and cleaned before they are stored and used for analysis, and the audio recordings will be securely deleted.
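For illustration only, the following is a minimal sketch of one way transcript de-identification could be approached, assuming it consists of replacing participant names with bracketed role labels before storage; the function, roster, and names shown are hypothetical, and the project’s actual de-identification procedures will follow ACF and IRB requirements.

```python
# Illustrative sketch only; actual de-identification procedures will follow
# ACF and IRB requirements. The name-to-role roster shown is hypothetical.
import re


def deidentify(transcript: str, name_to_role: dict[str, str]) -> str:
    """Replace participant names in a transcript with bracketed role labels."""
    for name, role in name_to_role.items():
        transcript = re.sub(re.escape(name), f"[{role}]", transcript, flags=re.IGNORECASE)
    return transcript


if __name__ == "__main__":
    roster = {"Jane Doe": "SUPERVISOR", "John Smith": "INTAKE STAFF"}  # hypothetical roster
    raw = "Jane Doe said the screening checklist was clear; John Smith agreed."
    print(deidentify(raw, roster))
    # -> "[SUPERVISOR] said the screening checklist was clear; [INTAKE STAFF] agreed."
```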
Data Analysis
Field notes from interviews will be de-identified, cleaned, and uploaded into Dedoose (a FedRAMP-certified qualitative analysis software). Codes will be developed in Dedoose to align with the interview questions and to organize sub-topics (e.g., common suggestions for improving the toolkit), and the codes will then be applied to the text. Analytic tools in Dedoose will include the generation of data tables with code counts and narrative summaries organized by code. Finally, brief analytic memos will summarize key findings from the interviews for each toolkit component and identify revision needs for the resource development team.
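For illustration only, the following is a minimal sketch of the kind of code-count table described above, assuming a de-identified export of coded excerpts with one row per excerpt and columns for toolkit component and code; the file name, column names, and output format are hypothetical, and the actual coding and counting will be performed in Dedoose.

```python
# Illustrative sketch only; the actual coding and code-count tables will be
# produced in Dedoose. The export file and its column names are hypothetical.
import csv
from collections import Counter


def tally_codes(path: str) -> Counter:
    """Count coded excerpts by (toolkit component, code) from a de-identified export."""
    counts = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row["toolkit_component"], row["code"])] += 1
    return counts


if __name__ == "__main__":
    # Hypothetical export with columns: toolkit_component, code, excerpt
    counts = tally_codes("coded_excerpts_deidentified.csv")
    for (component, code), n in sorted(counts.items()):
        print(f"{component}\t{code}\t{n}")
```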
Data Use
Data collected through this proposed information collection will be used by the toolkit development team to improve the usability and usefulness of the toolkit components. Through iterative rounds of data collection, analysis, and responsive quality improvement of toolkit components, a stable version of the toolkit will be developed that can then be implemented in the field and evaluated to determine its efficacy.
B8. Contact Person
Erin Ingoldsby, Ph.D., Project Director
James Bell Associates
3033 Wilson Boulevard, Suite 60
Arlington, VA 22201
(703) 247-2647
Attachments
Appendix A. PAE Usability Testing Interview Protocol
Appendix B. PAE Recruitment Message for State Child Welfare Directors
Appendix C. PAE Recruitment Message for Local Agency Child Welfare Directors
Appendix D. PAE Recruitment Message for Local Agency Staff
Appendix E. PAE Project Description