Memo to OMB for EAMN Semi-Structured Interviews


Generic Clearance for Cognitive, Pilot and Field Studies for Bureau of Justice Statistics Data Collection Activities


OMB: 1121-0339


MEMORANDUM


MEMORANDUM TO: Shelly Wilkie Martinez

Office of Statistical and Science Policy

Office of Management and Budget


THROUGH: Lynn Murray

Clearance Officer

Justice Management Division


William J. Sabol, Ph.D.

Acting Director

Bureau of Justice Statistics


Howard N. Snyder

Deputy Director

Bureau of Justice Statistics


FROM: Erica L. Smith

Statistician

Bureau of Justice Statistics


DATE: November 1, 2013


SUBJECT: BJS request for OMB clearance to conduct semi-structured telephone interviews with representatives from state agencies responsible for overseeing Adult Protective Services, in support of The Assessment of Administrative Data on Elder Abuse, Mistreatment and Neglect (EAMN) project, through the generic clearance agreement OMB Number 1121-0339



Introduction


The term elder abuse, mistreatment, and neglect (EAMN) refers to a wide range of criminal and civil violations against elderly victims, some of whom may be physically or mentally vulnerable. EAMN may take the form of physical or sexual violence, emotional or psychological abuse, financial or material exploitation, caregiving neglect, or abandonment. EAMN is a growing concern as the baby boomer generation ages and the number of elderly persons in the U.S. population increases.


The response to EAMN has occurred mainly at the state and local levels, primarily through Adult Protective Services (APS) agencies, as well as through local law enforcement, state Attorneys General, and state and local long-term care ombudsmen. As such, the federal role in defining and responding to EAMN has, to date, been relatively limited. National, uniform, comparative data on the incidence and prevalence of and responses to EAMN are not available. This lack of data is due in part to the variation among states in the legal definitions of EAMN, the reporting mechanisms for identifying cases of EAMN, the administrative structures for investigating and responding to reported cases of EAMN, and the ways in which case information is stored. The dearth of comparative data has impeded policymakers’ ability to adequately estimate and track the national incidence and prevalence of EAMN, make comparisons across jurisdictions, and evaluate the effectiveness of responses to EAMN.


To understand the feasibility of, and the challenges associated with, the creation of an ongoing, national data collection on elder abuse, mistreatment and neglect, the Bureau of Justice Statistics (BJS) awarded a cooperative agreement to the Urban Institute (UI) to conduct an assessment of administrative data collected by Adult Protective Services agencies. The project will assess currently available administrative data about elder abuse, mistreatment and neglect as reported to APS agencies, and the feasibility of utilizing those data to uniformly track the incidence and prevalence of reported EAMN and the outcomes of those cases. BJS chose to focus on administrative data collected by Adult Protective Services for several reasons. First, in the vast majority of states, APS agencies are the official government organizations to which most initial reports of suspected abuse, neglect, or exploitation of the elderly and other vulnerable adults are made. For these potential victims, APS agencies are the first responders. Second, many of the cases referred to APS fall into a grey area of victimization not currently captured by other statistical data collections, somewhere between the “dark figure” of unreported victimizations and official offenses known to law enforcement. It is not known how often criminal victimizations involving elderly adults or other adults with disabilities are reported to APS, or whether those victimizations are referred for processing by the criminal justice system. The interaction among APS, law enforcement, and prosecutors is also not well understood, owing both to a lack of data and to the absence of a comprehensive assessment of the different ways in which APS agencies engage with law enforcement and local prosecutors. As part of the work BJS has sponsored, UI will assess whether APS administrative data contain enough detail to determine whether a case is criminal in nature, and will analyze the different ways in which APS agencies engage with local law enforcement and prosecutors to both refer out and receive cases.


This pilot project focuses on understanding the administrative structures and current data collection procedures in APS agencies in all 50 states and Washington, DC, in order to assess the feasibility of utilizing APS data for statistical reporting purposes. The need for national data on abuse of the elderly has been acknowledged by a number of sources, including the Federal Elder Justice Coordinating Council established as part of the Elder Justice Act of 2009, the Government Accountability Office, the Senate Special Committee on Aging, and the National Research Council. However, the reliability of information on cases of elder abuse, neglect, and exploitation from potential data sources has not been thoroughly evaluated. To that end, the four main objectives of the current EAMN project are to: 1) determine, in collaboration with stakeholders and a diverse selection of experts across the field of elder justice, a set of core indicators on which APS staff should collect data for each case of suspected abuse reported to them; 2) develop a broad understanding of states’ current practices related to detecting, reporting, and collecting data on alleged cases of EAMN; 3) map out, to the extent possible, the different methods of interaction between APS and the criminal justice system, namely local law enforcement and prosecutors, and how that interaction impacts data reporting by APS; and 4) assess existing administrative data on EAMN cases compiled by state APS agencies against the set of core indicators and for comparability across states and local jurisdictions.


An effort to implement a national Adult Protective Services data system is also underway within the Department of Health and Human Services (HHS), headed by staff from the Administration for Community Living and the Office of the Assistant Secretary for Planning and Evaluation. That effort will create a nationwide reporting system to uniformly track the incidence and prevalence of and responses to abuse, mistreatment, and neglect of the elderly and other vulnerable adults reported to APS in the United States. HHS and BJS are coordinating these related projects and have been in frequent contact, so that the information gathered in the EAMN project can inform the larger data system project and duplication of activities across federal agencies can be avoided.




Request for developmental work


BJS plans to engage in developmental work for the EAMN project under the generic clearance (OMB Number 1121-0339). Specifically, BJS is requesting clearance to conduct semi-structured phone interviews with state-level APS representatives in each of the 50 states and Washington, DC. The ultimate goal of this project is to determine whether a methodology for collecting data from state and/or local jurisdictions could be developed that would lead to the generation of reliable national statistics on the prevalence of EAMN reported to APS agencies. With that end in mind, the immediate goal of these proposed interviews is to identify, within each state, the highest organizational level at which uniformly collected data on EAMN cases are maintained. The interviews with state APS representatives are designed to confirm, correct, and supplement essential background information that the project has assembled to date about the organizational structure of APS in each state and the impact of that structure on data collection operations at the state and local levels, and to determine the extent, if any, to which data collected across each state are uniform. Future developmental work for this project, for which BJS will seek a separate clearance, will include an online survey of the APS organizational units in each state that maintain uniform data, to uncover information about the specific data elements they collect, how data on reported cases are entered, and other pertinent information about the administrative data they collect and how those data can be accessed. Among other things, the interviews with state APS representatives will clarify the number of individual APS organizational units in each state that have primary responsibility for collecting case data and therefore will need to be included in the online survey.


We will conduct one interview per state. The appropriate contact within each state-level APS agency will be identified through information posted on the agency’s website or through contact information collected during a previous survey of APS agencies conducted in 2012 by the National Adult Protective Services Association (NAPSA). BJS considers the information gathered during interviews with state-level APS representatives in each of the 50 states and Washington, DC critical to determining the correct organizational level for an accurate assessment of the administrative data recorded by APS in each state.


Through these interviews, BJS will learn the appropriate state and local APS agencies to approach for more in-depth investigation into the detailed structure of each APS data system. The project will make this determination during the interview with the state APS representative by reviewing the following dimensions of the state’s APS program: 1) the processes by which allegations of EAMN are reported to APS and the resulting scope of case coverage; 2) which state or local agencies have responsibility for investigating various categories or types of elder abuse, and how the division of responsibility impacts the scope of abuse types reported to APS and subsequently recorded in APS data systems; 3) details about where case-, victim-, and/or incident-level data are stored (in state systems or locally based systems); and 4) the level of centralization and oversight of the data systems, if any, by the state APS agency.

Assessing the feasibility of using administrative data on elder abuse, mistreatment, and neglect cases reported to APS for statistical purposes hinges on gaining this basic understanding of the data systems, in terms of scope, structure and availability.


As detailed in the data collection procedure below, respondents will receive a summary of the project’s background research in advance, and will be asked to confirm, clarify, or elaborate on existing information, some of which was collected for a different purpose by NAPSA through its 2012 survey of APS agencies. The proposed interview will solidify BJS’s knowledge about program operations and data systems that bear on the uniformity of case definitions and data collection. The protocol addresses three categories of APS data: initial reports of suspected abuse, APS investigations that are opened, and APS investigative outcomes. Each category helps to answer substantive questions about the types of EAMN known to APS, program responses, and resource needs. The interview protocol is designed as follows to guide BJS’s decision-making:

  • Section A confirms the respondent’s contact information.

  • Section B identifies the organizational structure of APS within each state. BJS’s background research indicates whether states are organized at the state, county, or other level, but additional information is needed to determine which of these levels would eventually be identified as a sampling unit for statistical reporting purposes. Through this interview, BJS will determine the type of administrative divisions within a state (e.g., counties or regions), the number of local divisions within the state, and the extent to which the state APS agency has oversight of the local administrative divisions. This information will be used, in part, to assess the uniformity of APS operations and administrative data in the state.

  • Section C establishes the location and scope of data on initial reports of suspected EAMN. BJS’s background research indicates the agency that receives initial reports of suspected EAMN, but further clarification is needed to distinguish between the agencies that receive the initial reports and those that maintain the data on initial reports, as they are not always the same entity. Additional questions ask about the scope of the data on initial reports and whether multiple points of contact are needed to obtain full data.

  • Section D examines the scope of APS investigations and how a case is defined within the state. This information is needed to determine whether, and to what extent, APS agencies within the state are afforded discretion when deciding whether to investigate a report of suspected abuse, and how that discretion might affect case statistics. The state APS agency representative will have knowledge of whether uniform case definitions are used across the state in practice. While BJS’s background research provides some information on the scope of APS investigations, the state respondent can confirm case definition criteria and whether those criteria are applied similarly across the state.

  • Section E establishes the electronic recording and location of data on investigations opened by APS staff. BJS has preliminary information about where data on newly opened investigations are housed in each state, based on state responses to the NAPSA survey about investigative outcomes, but this preliminary information needs to be confirmed during the state interviews. Additional questions address local variation in record-keeping and elicit respondent recommendations on the appropriate number of agencies from which to collect data (county-based agencies, regional Area Agencies on Aging, etc.).

  • Section F establishes the electronic recording and location of data on APS investigative outcomes. BJS has preliminary information on each state’s ability to report on substantiated cases, based on aggregate statistics reported in NAPSA’s 2012 survey of APS agencies. However, it is unclear where in each state those data reside and whether the reported data were generated from aggregate counts or individual-level data.

  • Section G synthesizes the information gathered from the interview. Based on responses to the earlier sections, project staff will propose an appropriate number of state and local entities within the state to survey for future development work. The respondent is asked to confirm or clarify this estimate and provide contact information.


Pilot Interviews


In preparation for this request for developmental work, and to test the utility of the interview protocol, we completed three pilot interviews with representatives from three different state APS agencies (two in Southern states and one in the Northwest). Based on these interviews, the protocol was revised to refine language and question ordering. Additionally, the data collection process was refined, based on a request from one pilot participant, to give each state the opportunity to review its interview responses (as they are recorded in the interview database) for clarity and accuracy, if it so chooses. The pilot interviews took 30 minutes, 40 minutes, and 90 minutes to complete. However, the 90-minute interview was atypical due to particular circumstances within that state (the state is currently transitioning from one process of recording APS information to a new process, and the respondent wanted to provide information about both processes, given that the transition will not be complete until later in Fall 2013). We do not anticipate encountering similar circumstances in other states; thus, our burden estimate below is based on the anticipated average interview length.


Design of the Exploratory Study


Sample Design:


Sampling is not needed for these semi-structured interviews because a representative from every state-level APS agency will be interviewed as part of this process. Understanding the role of the state APS agency in the operations of regional, county, and/or local APS agencies hinges on obtaining information from each state agency about its current data collection practices and capacity, APS reporting requirements at the state level, and the structure of APS in each state. Accordingly, BJS will conduct interviews with a representative from each of the 50 states and Washington, DC for this developmental work.


Data Collection Procedure:


Exploratory interviews with APS representatives will take place via phone in all 50 states and Washington, DC. Prior to each interview, Urban Institute staff will compile as much information as possible about state APS agencies, policies, and practices from online public resources, such as state-level APS agency websites and the National Adult Protective Services Association Resource Center, to answer the questions of interest. This background research is being conducted to reduce the length of time required for APS agency staff to participate in the phone interview, and will allow the calls to focus on confirming information, asking for clarification, and deepening our understanding of APS data issues.


To solicit participation for this pilot project, UI staff will first conduct outreach by email using an initial invitation letter (Attachment A). Also included in the initial invitation will be a project description (Attachment B) and a summary of background research for each particular state (Attachment C). The semi-structured telephone interview agenda is included as Attachment D.


Project staff will ask a series of semi-structured questions to confirm background research and obtain supporting information from respondents about participating agency operations pertaining to elder abuse, mistreatment, and neglect. The interview is designed to elicit information on who is eligible for services, standards and protocols for the processing of EAMN cases, details about data collection and the management of client records, and the level at which data are collected (i.e., state or local).


Project staff will use an electronic version of the interview agenda while conducting calls, preloaded with background information for respondents to confirm or clarify. Interview questions will be programmed into Checkbox software, a data collection tool that automates the skip patterns built into the instrument and collects discrete, categorical responses in addition to longer text responses. Using Checkbox as a data collection tool will minimize the burden on respondents because it will allow the respondent to confirm existing information quickly and guide the interviewer to skip questions that are not applicable, based on prior responses. Additionally, the frequent use of structured response categories throughout the interview will minimize the interruptions needed for note-taking during the interview.
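
For illustration only, the skip-pattern behavior described above can be sketched in a few lines of Python. This is a generic sketch of conditional question routing under stated assumptions, not the actual Checkbox product or its API; the question identifiers (B1–B3), wording, and routing rules below are hypothetical.

    # Hypothetical sketch of automated skip-pattern logic (not the Checkbox API).
    # Each question carries a predicate over the answers recorded so far;
    # questions whose predicate is false are skipped automatically, so the
    # interviewer never has to evaluate applicability by hand.

    QUESTIONS = [
        # (question id, prompt, applicability predicate)
        ("B1", "Is APS administered at the state or county level? (state/county)",
         lambda answers: True),
        ("B2", "How many county-level divisions operate APS programs?",
         lambda answers: answers.get("B1") == "county"),  # skipped in state-run systems
        ("B3", "Does the state agency oversee local data systems? (yes/no)",
         lambda answers: True),
    ]

    def run_interview():
        """Ask each applicable question and record the categorical response."""
        answers = {}
        for qid, prompt, applies in QUESTIONS:
            if not applies(answers):
                continue  # automated skip pattern: not applicable given prior answers
            answers[qid] = input(f"{qid}. {prompt} ").strip().lower()
        return answers

    if __name__ == "__main__":
        print(run_interview())

In a state-administered system, for example, the answer "state" to B1 causes B2 to be skipped without any action by the interviewer, which is the burden-reducing behavior the paragraph above attributes to the automated instrument.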


At the conclusion of the interview, project staff will email a copy of the interview responses to the respondent along with a brief note of thanks. This summary will be generated from the Checkbox data collection tool and will be a populated version of the survey instrument. It will provide the respondent with an opportunity to review the information recorded and make corrections, if desired; project staff will not follow up with respondents for further confirmation. Respondents who participated in the pilot testing of the data collection procedure requested this opportunity to ensure that their programs would be accurately represented.


All data collection and recruitment protocols for this work have been approved by UI’s Institutional Review Board (Attachment E).


Burden Hours Estimated for Developmental Study


The total burden for this developmental work was estimated based on three key activities, shown in the table below. Once the 50 states and Washington, DC have committed to participating, a maximum of 2 hours will be required of each respondent. BJS anticipates that it will take each respondent approximately 30 minutes to respond to the initial outreach email and to review the current information (Attachment C for reference) provided by project staff. Interviews are projected to take no more than 1 hour to complete per respondent, with one respondent per site. BJS has projected an additional 30 minutes per respondent for follow-up review of the final responses to the interview questions (based on feedback during the pilot interviews). The total number of burden hours requested for use under the BJS generic clearance is 102 hours, per the table below.



Activity                                | Burden hours per respondent | Number of respondents | Total burden hours per activity
Initial outreach by email               | 0.5                         | 51                    | 25.5
Telephone interview                     | 1.0                         | 51                    | 51.0
Follow-up review of interview responses | 0.5                         | 51                    | 25.5

Total burden hours: 102
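
As a check on the total, each of the 51 respondents contributes at most 2.0 burden hours across the three activities in the table above:

51 \times (0.5 + 1.0 + 0.5)\ \text{hours} = 51 \times 2.0\ \text{hours} = 102\ \text{hours}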


Informed Consent and Data Confidentiality


At the outset of each interview, the project staff conducting the interview will read an informed consent script (included in Attachment D) describing the risks and benefits of participation. The interviews are not intended to collect information on individuals or information that would otherwise be considered sensitive in nature. As such, the activities associated with this task are not considered human subjects research. However, the informed consent script specifies that participation in the interview is voluntary, and that respondents may decline to answer any or all questions and may stop their participation at any time. Project staff will obtain verbal informed consent before continuing with the interview.


Data Security


Information collected during the interview will be stored on the Urban Institute’s computer network that resides behind UI’s firewall. Because the interview elicits factual information about program policies and operations from state APS representatives in their professional capacities, UI’s IRB has determined that the information is neither private nor sensitive.


Contact Information


Questions regarding any aspect of this project can be directed to:


Erica L. Smith

Statistician

Bureau of Justice Statistics

U.S. Department of Justice

810 7th Street NW, Room 2326

Washington, DC 20531

Office Phone: 202-616-3491

Fax: 202-616-1351

E-Mail: Erica.L.Smith@ojp.usdoj.gov


Attachments


Attachment A: Initial invitation letter

Attachment B: Project description

Attachment C: Summary of background research (example)

Attachment D: Semi-structured telephone interview guide

Attachment E: Letter of project approval from Urban Institute IRB
