
Hate Crime Incident Report

OMB: 1110-0015


PART B. Statistical Methods


  1. The potential respondent universe of the Hate Crime Incident Report (OMB No. 1110-0015) includes 18,290 law enforcement agencies voluntarily participating in the FBI Uniform Crime Reporting (UCR) Program. These agencies consist of approximately 11,696 local, 725 college and university, 5,157 county, 511 state, 191 tribal, and 10 federal agencies.

    The majority of law enforcement agencies, especially those that serve larger populations, use automated records management systems (RMS) to capture and track incident reports filed with the agency as a result of citizen complaints or offenses directly observed by law enforcement officers. These systems usually can extract an electronic file of hate crime statistics that complies with the national UCR collection standards and can be forwarded to the state UCR Program or directly to the FBI UCR Program. These agencies do not use a data collection form to provide their hate crime statistics to the UCR Program. Agencies without an automated RMS have traditionally relied upon the FBI’s Hate Crime Incident Report, which they completed and forwarded to the state and federal UCR Programs to participate in the UCR Hate Crime Data Collection.

    As of July 1, 2013, all UCR Program participants must submit data electronically to the FBI. To accommodate the approximately 700 agencies that previously submitted the paper Hate Crime Incident Report form completed by law enforcement personnel, the FBI UCR Program developed a Workbook Tool based in Microsoft Excel as the data collection and reporting mechanism that replaces the FBI’s Hate Crime Incident Report.

Approximate total number of agencies participating in UCR in 2012: 18,000+

Total number of agencies participating in UCR Hate Crime in 2012: 14,595

Approximate percentage of agencies participating in UCR Hate Crime: 80%

Approximate percentage of the 14,595 participating agencies that submitted hate crime incidents: 13%

Approximate percentage of the 14,595 participating agencies that submitted zero hate crime incidents: 87%


Approximately 80 percent of UCR law enforcement agencies participate in the Hate Crime Statistics Program. Of these participating agencies, approximately 13 percent submit hate crime incidents; the remaining 87 percent submit zero reports. The agencies that participate in the Hate Crime Statistics Program, whether they submit incidents or zero reports, represent all population group sizes and many diverse jurisdictional attributes. Based on historical reporting trends, similar response rates are expected in future hate crime data collections; however, the FBI UCR Program actively liaises with national and federal law enforcement agencies to encourage participation in UCR data collections.
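
As a purely illustrative check of the figures above (the counts are taken from the table and the respondent universe in item 1; the script itself is not part of the collection), the participation percentages can be reproduced as follows:

    # Illustrative check of the participation figures cited above.
    ucr_agencies = 18290          # law enforcement agencies in the UCR respondent universe
    hate_crime_agencies = 14595   # agencies participating in the hate crime collection (2012)

    participation = hate_crime_agencies / ucr_agencies
    print(f"Hate crime participation: {participation:.0%}")   # approximately 80%

    # Of the participating agencies, roughly 13 percent submitted one or more
    # hate crime incidents; the remainder submitted zero reports.
    submitted_incidents = round(hate_crime_agencies * 0.13)
    zero_reports = hate_crime_agencies - submitted_incidents
    print(f"Agencies submitting incidents: about {submitted_incidents:,}")
    print(f"Agencies submitting zero reports: about {zero_reports:,}")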



  2. As the UCR Hate Crime Data Collection is intended to capture all hate crimes that come to the attention of law enforcement agencies in the United States, sampling methodologies are not used. Instead, the FBI UCR Program relies upon the complete enumeration of these incidents to make statements about the relative frequency and characteristics of hate crime in the United States. However, the voluntary nature of the UCR Hate Crime Data Collection results in some agencies reporting incomplete information and others not participating in the data collection at all. Accounting for the impact of missing data is difficult because nonreporting agencies may vary in important ways. Attributes of these agencies that should be accounted for include:

  • population density and degree of urbanization;

  • composition of the population, particularly youth concentration;

  • population mobility with respect to residents' mobility, commuting patterns, and transient factors;

  • economic conditions, including median income, poverty level, and job availability;

  • modes of transportation and highway systems;

  • cultural factors and educational, recreational, and religious characteristics;

  • family conditions with respect to divorce and family cohesiveness;

  • climate;

  • effective strength of law enforcement agencies;

  • administrative and investigative emphases of law enforcement;

  • policies of other components of the criminal justice system;

  • citizens' attitudes toward crime; and

  • crime reporting practices of the citizenry.


Law enforcement agencies report hate crimes brought to their attention to the FBI monthly or quarterly, either directly or through their state UCR Programs. These agencies submit hate crime data via a NIBRS submission, an electronic hate crime record layout sent by e-mail, or the UCR Microsoft Excel Workbook Tool.


Agencies that report offense data to the FBI via the NIBRS use a data element within their reporting software to indicate whether an incident was motivated by bias. Because the NIBRS is an incident-based, comprehensive data collection system, these agencies report considerably more information about a hate crime than that captured in the current electronic record or on the paper forms. For example, the data element that indicates bias motivation applies to 45 Group A offenses, and agencies can report information such as the age, sex, and race of victims, offenders, and arrestees. Although the additional data collected via the NIBRS are not maintained in the hate crime database, they are available in the NIBRS flat files. When agencies submit a Group A Incident Report with a bias indicator of “None,” a Group B Arrest Report (because no offenses [bias-motivated or otherwise] occurred in their respective jurisdictions), or a Zero Report (because no offenses [bias-motivated or otherwise] or arrests occurred), the FBI records zero hate crime incidents for that agency for the reporting period.
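
The zero-reporting rule described above can be sketched as follows. This is a hypothetical illustration only; the submission types and field names are assumptions chosen for clarity and do not reflect the actual NIBRS record layout.

    # Hypothetical sketch of the zero-report rule described above; the
    # submission types and field names are illustrative, not the NIBRS layout.
    def records_zero_hate_crime(submission: dict) -> bool:
        """Return True when the FBI would record zero hate crime incidents
        for the agency's reporting period."""
        kind = submission["type"]
        if kind == "GROUP_A_INCIDENT":
            # A Group A Incident Report whose bias indicator is "None"
            return submission["bias_motivation"] == "None"
        # Group B Arrest Reports and Zero Reports also result in zero
        # hate crime incidents being recorded for the period.
        return kind in ("GROUP_B_ARREST", "ZERO_REPORT")

    # Example: a Group A incident reported with no bias motivation
    print(records_zero_hate_crime({"type": "GROUP_A_INCIDENT", "bias_motivation": "None"}))  # True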


Law enforcement agencies that prefer electronic submissions but do not report via the NIBRS may use the hate crime record layout originally specified in the publication Hate Crime Magnetic Media Specifications for Tapes & Diskettes (January 1997, with subsequent amendments), which has since been replaced by the Hate Crime Technical Specification, Version 2.1 (05/25/2012).


Agencies that used the Hate Crime Incident Report and the Quarterly Hate Crime Report paper forms have transitioned to the UCR Microsoft Excel Workbook Tool, which captures the following information about each hate crime incident:

  • Offense type and the respective bias motivation

  • Number, age, and type of victims

  • Location of the incident

  • Number and age of known offenders

  • Race and ethnicity of known offenders

For each calendar quarter, law enforcement agencies submitted a Hate Crime Incident Report for each bias-motivated incident as well as a Quarterly Hate Crime Report, which summarized the total number of incidents reported for the quarter. Agencies used the Quarterly Hate Crime Report to delete any previously reported incidents that were determined through subsequent investigation not to be bias motivated. If no hate crime incidents occurred in their jurisdictions that quarter, agencies still submitted a Quarterly Hate Crime Report to report zero hate crime incidents. Within the design of the Microsoft Excel Workbook Tool, zero hate crime reporting was incorporated into the Agency administration page and hate crime incident deletions were included in the Hate Crime Incident Report; therefore, the Quarterly Hate Crime Report is no longer needed.
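
For illustration only, the information captured for each incident (listed above) could be modeled as a simple record such as the following sketch; the field names are assumptions and are not the Workbook Tool's actual column headings.

    # Illustrative model of one Workbook Tool incident entry; field names are
    # assumptions for clarity, not the tool's actual column headings.
    from dataclasses import dataclass
    from typing import List

    @dataclass
    class HateCrimeIncident:
        offenses: List[str]                   # offense type(s) in the incident
        bias_motivations: List[str]           # bias motivation for each offense
        victim_count: int
        victim_ages: List[int]
        victim_types: List[str]               # e.g., individual, society
        location: str                         # location of the incident
        known_offender_count: int
        known_offender_ages: List[int]
        known_offender_races: List[str]
        known_offender_ethnicities: List[str]
        delete_prior_report: bool = False     # marks a previously reported incident
                                              # later found not to be bias motivated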

The UCR Program has yet to publish a national estimate of hate crime incidence. Because relatively few hate crimes occur, applying hate crime rates per 100,000 persons broadly across United States populations would misrepresent the actual occurrence of hate crime in the Nation. As such, the data are published as they are reported to the FBI. The FBI does not estimate data for law enforcement agencies that do not submit hate crime reports to compensate for the missing jurisdictions, given the already low occurrence of hate crime incidents reported by the 80 percent of participating agencies. However, to assist the reader in interpreting the available data, law enforcement agencies that do not participate in the hate crime program are not included in the hate crime participation agency counts.
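
To illustrate the arithmetic behind this point, a rate per 100,000 persons is simply incidents divided by population, multiplied by 100,000; applying such a rate to nonreporting jurisdictions would impute incidents the FBI never received. All numbers below are hypothetical and are not FBI figures.

    # Hypothetical arithmetic only; these are not FBI figures.
    incidents_reported = 6000            # hypothetical annual count of reported incidents
    population_covered = 250_000_000     # hypothetical population covered by reporting agencies

    rate_per_100k = incidents_reported / population_covered * 100_000
    print(f"{rate_per_100k:.2f} incidents per 100,000 persons")   # 2.40

    # Extrapolating that rate to jurisdictions that did not report would impute
    # incidents the FBI never received, which is why the data are published only
    # as they are reported.
    nonreporting_population = 60_000_000  # hypothetical population of nonreporting jurisdictions
    imputed_incidents = rate_per_100k * nonreporting_population / 100_000
    print(f"Estimated (not observed) incidents: {imputed_incidents:.0f}")  # 1440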

The FBI relies on the integrity of the data reported by contributors; however, the Criminal Justice Information Services (CJIS) Audit Unit conducts Quality Assurance Reviews on a triennial basis. The results of these audits are not used to adjust crime data; rather, they are used to educate reporting agencies on their compliance with national UCR guidelines.


At the end of each reporting year, the FBI UCR Program asks state Program and direct contributor personnel to verify all hate crime incident data submitted to the Program. Specific verifications are also forwarded to these contributors when particular offenses, bias types, and victim types are reported, to ensure the accuracy of the data. For example, incidents containing offenses of murder or rape with a victim type of society are often found, upon re-evaluation, not to be motivated by bias.


Response rates are maximized through liaison with state UCR Programs and with agencies that contribute their data directly to the FBI UCR Program. FBI UCR staff communicate frequently with law enforcement agencies to encourage data submissions. The staff understand the contextual challenges that law enforcement agencies face in reporting valid and reliable data and regularly work to overcome nonresponse issues when such challenges occur. The FBI UCR Program sends nonresponse correspondence to the CJIS Systems Officers (CSOs) for each reporting year. The CSOs oversee all CJIS system operations at the state level and ensure the states are meeting all operational requirements. The FBI UCR Program has asked the CSOs to work with their state Program managers to submit these data.


Preliminary figures show that the number of agencies participating in the hate crime collection in 2013 increased by approximately 400 over the number that participated in 2012. The FBI UCR Program believes this initiative is one contributing factor in the increase in participation.



  3. Eighty percent of FBI UCR Program agencies report hate crime data, and the FBI is working to help the absent 20 percent of law enforcement agencies participate in the hate crime data collection. Several initiatives, once completed, should improve agencies’ ability to provide timely hate crime statistics to the FBI UCR Program.


The FBI is working to promote participation in the hate crime data collection through the FBI CJIS Division's development of a new UCR system. This effort will manage the acquisition, development, and integration of a new information systems solution that affects participating local, state, tribal, and federal law enforcement agencies. The goal is to improve UCR efficiency, usability, and maintainability while increasing the value of UCR products to users.


The FBI UCR Program is also exploring Web-based training, which is expected to be implemented in the future.



  4. The new Hate Crime Incident Report, in the form of the UCR Microsoft Excel Workbook Tool, was pretested with 20 participants. Fifteen of the participants were law enforcement officers, and the remaining five were civilian law enforcement employees. The 20 participants comprised four representatives from a small city police department, two from a large city police department, two from a county sheriff’s office, five from a campus police agency, four from the state police, and three from a federal police force. Two versions of the collection instrument were tested to evaluate the presentation of the bias motivation codes, and the participants were split equally between the two versions.


The purpose of the interviews was to test cognitive and usability elements of the redesigned collection. The interviews yielded the following general observations:

  • Participants found it frustrating to scroll back and forth to remind themselves which information should be reported for particular offenses in multiple-offense incidents.

  • The lengthy instructions at the top of the form made it difficult for participants to get started.

  • Certain portions of the instructions replicated from the paper-based collection were nonfunctional and irrelevant, causing confusion for participants.

  • Participants expressed a desire for certain features to be automated, such as identifying which fields were in error or calculating ages.

  • Participants were frustrated by inconsistent requirements of “zeroes” versus “blank fields.”

  • Participants asked for improved tabbing functionality in order to move quickly from one field to another.

  • Participants often did not realize that the dropdown menus contained lengthy lists and did not initially scroll through all the choices. However, with some exploration, they all eventually realized that there were many more options than those that initially appeared in the list.



The following findings relate to the testing of two separate formats for displaying the bias motivation codes for race and ethnicity:

  • No participants expressed confusion about why anti-Arab bias would be included with the anti-ethnicity bias motivation codes.

  • One participant initially looked for anti-Hispanic bias motivation under the anti-race bias motivation codes, but quickly found it under the anti-ethnicity codes.

  • While most participants were able to correctly identify the proper use of the anti-multiple races, group bias motivation, they questioned why it was collected in this manner.

  • Results from the cognitive testing did not appear to indicate a preference for one display over the other.

  • In general, participants indicated a need for better explanation about the purpose of the reference guide, which provided the definitions for each of the bias motivations on a separate tab in the Excel workbook.



Finally, when participants were asked preliminary questions about the use of the newer bias motivation codes of anti-Sikh, anti-Hindu, and anti-Arab, the following observations were made:



  • If law enforcement personnel work in an environment of limited diversity, they tend to be uncertain about what signs or signifiers would assist them in correctly identifying members of these communities. Most of the incidents were classified as anti-Muslim religion or anti-Arab if there was any indication of ethnic headwear or symbols.

  • When asked how a particular scenario was identified as anti-Arab, participants often mentioned that the victim was speaking Arabic.

  • Interestingly, law enforcement personnel who had recent military experience stated that military training had familiarized them with these communities, regardless of how much diversity existed in the locations they serve.

  • Many participants mentioned that training would have helped both in understanding the form and in understanding the definitions of each of the bias motivations.


Changes to the form resulting from the pretest findings were made for this Paperwork Reduction Act submission. Details pertaining to the cognitive testing can be found in the supplementary documents of this Information Collection Review.



  5. Points of Contact


Amy C. Blasher

CSMU Chief

amy.blasher@ic.fbi.gov

304-625-4840


Cynthia Barnett-Ryan

Statistician

cynthia.barnett-ryan@ic.fbi.gov

304-625-3576


Dr. James H. Noonan

Statistician

james.noonan@ic.fbi.gov

304-625-2927




Kristi Donahue

Hate Crime Program Coordinator

kristi.donahue@ic.fbi.gov

304-625-2972


Patricia Hanning

Technical Information Specialist (OMB PWRA Point of Contact)

patricia.hanning@ic.fbi.gov

304-625-2957

