Cognitive Testing Memo for Census of Jails

Generic Clearance for Cognitive, Pilot and Field Studies for Bureau of Justice Statistics Data Collection Activities

OMB: 1121-0339



U.S. Department of Justice


Office of Justice Programs


Bureau of Justice Statistics

Washington, D.C. 20531


MEMORANDUM TO: Bob Sivinski

Office of Statistical Policy and Planning

Office of Management and Budget



THROUGH: Jeff Anderson, Director, Bureau of Justice Statistics (BJS)

Jeri M. Mulrow, Principal Deputy Director, BJS

Elizabeth Ann Carson, Acting Corrections Unit Chief, BJS



FROM: Zhen Zeng, Statistician, BJS


SUBJECT: BJS request for OMB generic clearance to conduct a cognitive test to evaluate and improve new survey items for the Census of Jails and Annual Survey of Jails, through the BJS generic clearance agreement (OMB Number 1121-0339)



DATE: May 16, 2018




Introduction


The Bureau of Justice Statistics (BJS) is requesting a generic clearance to conduct a cognitive test to assess the feasibility and clarity of new survey items for its jail data collections. The new items will collect inmate counts by age group, U.S. citizenship, foreign-born status, offense type, and conditional release violation, as well as data on jail opioid testing and treatment programs, opioid screening procedures, and the prevalence of opioid use disorder. If shown feasible, the items will be incorporated into the Census of Jails (COJ, OMB Number 1121-0249) and the Annual Survey of Jails (ASJ, OMB Number 1121-0049).


RTI International, the current data collection agent for the ASJ, will conduct the cognitive test on behalf of BJS starting in the summer of 2018. RTI will recruit a total of 49 jails to participate in the test, invite them to fill out a short online survey consisting of the new items, and conduct telephone interviews with participants to discuss their experience filling out the survey. The cognitive test will help BJS understand the availability of the requested data and identify potential issues with question wording and layout. None of the new items has appeared in the COJ or the ASJ before. However, similar questions on offense type and conditional release violations were used in BJS’s 2004 Survey of Large Jails (OMB Number 1121-0049).


Justification of the New Items


There is widespread evidence that the population passing through the U.S. criminal justice system has been aging during the past two decades (see, e.g., James 2004; Carson and Sabol 2016).1 The COJ and ASJ currently collect counts, by sex, of local jail inmates who are age 17 or younger and those who are age 18 or older. While this measure provides the total numbers of juvenile and adult inmates, the age information is not granular enough to track changes in the age structure of jail inmates. BJS proposes to replace the current age item with a new item that collects counts of jail inmates in six age groups (17 or younger, 18-24, 25-34, 35-44, 45-54, and 55 or older), disaggregated by sex (Question 2 of Appendix A). This cognitive test will allow BJS to evaluate whether local jails can provide this level of detail.

The COJ and ASJ currently collect the total number of non-U.S. citizens confined on the last weekday in June. Although the item's response rate was problematic (75-85%) in the past, it has improved significantly, reaching 95%, since BJS changed the ASJ data collection agent in 2015. To support the Department of Justice's initiative to combat illegal immigration, BJS proposes to replace the current item with one that collects one-day counts of non-U.S. citizens by conviction status (Question 3 of Appendix A). The new question asks for the number of jail inmates who were U.S. citizens and the number who were not. Jails that provide a positive count of non-U.S. citizens are then asked to break down that count by conviction status.


BJS also proposes to add an item that collects one-day counts of foreign-born inmates by conviction status (Question 4 of Appendix A), similar to the new item on non-U.S. citizens. Evidence from the recent 2017 Jail Citizenship Reporting Telephone Survey (JCRTS) under BJS generic clearance (OMB Number 1121-0339) suggests that it may be challenging for some jails to report counts of non-U.S. citizens by conviction status due to jail data system limitations. Evidence from the JCRTS also indicates that jails may collect data on place of birth from inmates instead of their U.S. citizenship status. BJS plans to collect both foreign-born counts and non-U.S. citizen counts for data quality checks and missing data imputation.


In 2002, one in every two jail inmates was a probation, parole, or bail-bond violator or absconder, i.e., an individual who failed to comply with the conditions of release by committing a new offense or a technical violation.2,3 High jail reentry rates impose costs on local communities as returning offenders take up jail bed space. BJS currently collects data on conditional release violations of jail inmates only in individual-level inmate surveys, the National Inmate Survey (OMB Number 1121-0311) and the Survey of Inmates in Local Jails (OMB Number 1121-0329). Due to their high costs and complexity, BJS conducts inmate surveys only periodically. BJS proposes to test an item for the COJ and ASJ that collects counts of jail inmates held for pretrial release violations, probation violations, and parole violations (Question 5 of Appendix A). The percentages of inmates with various conditional release violations will provide a broad measure for tracking jail reentry.


Similarly, BJS currently obtains offense information on jail inmates only through inmate surveys. The COJ and ASJ collect inmate counts by severity of offense (felony vs. misdemeanor) but do not capture offense type. BJS proposes to test an item that collects aggregate counts of inmates by offense type (violent, property, drug, etc.; Question 6 of Appendix A) for the COJ and ASJ. The item will provide the data necessary to research a core criminal justice question: why people are incarcerated in jail.


People passing through jail systems have exceedingly high rates of substance use disorders.4 As such, local jails play a pivotal role in the nation's battle against the opioid crisis. Public officials increasingly recognize jails as an important site for initiating opioid use disorder treatment. The President's Commission on Combating Drug Addiction and the Opioid Crisis recommends the use of medication-assisted treatment (MAT) with pretrial detainees and continuing treatment upon release. A growing number of jails have implemented pilot programs to provide MAT, to educate inmates with opioid use disorders, and to link them to treatment in the community upon release.


While prior iterations of the COJ captured overall drug screening policies, accurate and timely statistics are not available on the prevalence of opioid use disorders and treatment programs in jails. To fill this gap, BJS is interested in testing new questions on opioid use disorder screening practices and policies, the prevalence of opioid use disorder among inmates, and the treatment and services provided to inmates (Questions 7 through 9 in Appendix A). BJS envisions that the 2019 COJ will serve as the base year for capturing opioid testing and treatment practices in jails and that these questions will be asked again in the 2025 COJ. Furthermore, some of the opioid questions will be incorporated into the ASJ in 2020 if additional research verifies that BJS can obtain valid and reliable estimates on these items under the ASJ sampling design. In addition, the results from the 2019 COJ may prompt BJS to go beyond a few items in its regular data collections and conduct a special survey in jails on drug use, screening, and treatment practices and policies.



Cognitive Test Design and Procedure


The purpose of the cognitive test is to gain an understanding of how well the questions are understood, and whether they capture the intended data, when administered to a sample of the COJ and ASJ's target population. The cognitive test consists of two activities. Participating jails will first take a short online survey (Appendix A), which contains the new items. Following completion of the survey, staff from RTI International, the current data collection agent for the ASJ, will conduct telephone interviews with participating jails to discuss their responses and their experience taking the survey. The interviewers will ask a series of open-ended questions that probe respondents' interpretation of and reaction to each question (Appendix B). The responses to the survey and interviews will be analyzed to identify issues with the survey items and any technical or other difficulties jails may have in answering the questions, and to estimate respondent burden.


The new survey items will be tested at up to 49 jails that vary by inmate population size, county drug overdose death rate, and past item completion patterns. The interviews will be conducted in three rounds, with time built in between rounds to review findings and revise the items for retesting in subsequent rounds. The first round will include up to 9 interviews; the second and third rounds will include up to 20 interviews each.


Respondent Universe and Sample Design

For the cognitive test, BJS will draw a sample of jail facilities from the most recent COJ conducted in 2013. The COJ collected aggregate data on jail capacity and jail inmate populations from the nation’s approximately 3,200 jails. Quota sampling, a non-probabilistic sampling method, will be used to select jails. The population of jails is segmented into mutually exclusive sub-groups (strata) formed by three variables (panel A of table 1):


  • Inmate population size (Small: average daily population [ADP] under 200; Medium: ADP between 200 and 999; Large: ADP of 1,000 or more)

  • The item completion rate of the 2013 COJ form (Low completion rate: responded to fewer than 50% of the items when data quality follow-up [DQFU] started; High completion rate: responded to 50% of the items or more when DQFU started)

  • 2016 county-level overdose death rates (Low death rate: under 16/100,000; High death rate: 16/100,000 or over).

Judgment is then used to select jails from each stratum to reach predetermined target sample sizes (panel B of table 1). In selecting the sample, jails with known MAT programs (22 were identified through a Google search) will be targeted first. MAT jails are included to ensure that the opioid questions will be tested by a sufficient number of jails, as jails without specific opioid programs will skip two of the three opioid questions being tested. After the 22 MAT jails are contacted and given time to respond, BJS will select additional jails from each stratum to fulfill the predetermined quotas.



In this sampling design, BJS is oversampling large jails to increase the coverage of the inmate population. BJS is also oversampling jails with lower form completion rates in the last iteration of the COJ because these jails may have more difficulty with questions in BJS jail surveys. In addition, jails located in counties with higher overdose death rates are oversampled, as they may be more prepared to talk about their opioid programs.
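
For illustration only, the stratum definitions above and the sampling ratios in panel C of table 1 can be expressed as a short Python sketch. The cut-points, quotas, and frame counts are taken from this memo and table 1; the function and field names are hypothetical and do not correspond to any COJ or ASJ processing code.

def stratum(adp, completion_rate, overdose_death_rate):
    """Assign a jail to one of the 12 strata (size x completion rate x death rate)."""
    size = "large" if adp >= 1000 else "medium" if adp >= 200 else "small"
    completion = "high" if completion_rate >= 0.50 else "low"
    death = "high" if overdose_death_rate >= 16.0 else "low"
    return (size, completion, death)

# Target sample sizes by stratum (panel B of table 1).
quota = {
    ("large", "high", "low"): 6,   ("large", "high", "high"): 6,
    ("large", "low", "low"): 3,    ("large", "low", "high"): 3,
    ("medium", "high", "low"): 6,  ("medium", "high", "high"): 6,
    ("medium", "low", "low"): 3,   ("medium", "low", "high"): 3,
    ("small", "high", "low"): 4,   ("small", "high", "high"): 4,
    ("small", "low", "low"): 2,    ("small", "low", "high"): 3,
}

# Number of reporting units by stratum (panel A of table 1).
frame = {
    ("large", "high", "low"): 53,    ("large", "high", "high"): 36,
    ("large", "low", "low"): 15,     ("large", "low", "high"): 12,
    ("medium", "high", "low"): 347,  ("medium", "high", "high"): 187,
    ("medium", "low", "low"): 52,    ("medium", "low", "high"): 21,
    ("small", "high", "low"): 1321,  ("small", "high", "high"): 474,
    ("small", "low", "low"): 125,    ("small", "low", "high"): 53,
}

# The sampling ratio in panel C is simply quota / frame count within each stratum,
# e.g., large jails with high completion rates and low death rates: 6 / 53 = 11.3%.
for key in quota:
    print(key, f"{quota[key] / frame[key]:.1%}")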



Table 1. Jail Survey Cognitive Testing Sample Design

A. Number of reporting units by stratum

                          Form completion rate and county overdose death rate
                          Reported 50% or more        Reported less than 50%
Jail population size      DR < 15.9     DR >= 16.0    DR < 15.9     DR >= 16.0    Total
Large: ADP > 1000                53             36           15             12      116
Medium: ADP 200-999             347            187           52             21      607
Small: ADP < 200              1,321            474          125             53    1,973
Total                         1,721            697          192             86    2,696

B. Target sample size by stratum

                          Form completion rate and county overdose death rate
                          Reported 50% or more        Reported less than 50%
Jail population size      DR < 15.9     DR >= 16.0    DR < 15.9     DR >= 16.0    Total
Large: ADP > 1000                 6              6            3              3       18
Medium: ADP 200-999               6              6            3              3       18
Small: ADP < 200                  4              4            2              3       13
Total                            16             16            8              9       49

C. Sampling ratio

                          Form completion rate and county overdose death rate
                          Reported 50% or more        Reported less than 50%
Jail population size      DR < 15.9     DR >= 16.0    DR < 15.9     DR >= 16.0    Total
Large: ADP > 1000             11.3%          16.7%        20.0%          25.0%    15.5%
Medium: ADP 200-999            1.7%           3.2%         5.8%          14.3%     3.0%
Small: ADP < 200               0.3%           0.8%         1.6%           5.7%     0.7%
Total                          0.9%           2.3%         4.2%          10.5%     1.8%

Notes: ADP: Average daily population; DR: death rate.



Information to Be Collected

Instrument. The survey questionnaire is provided as Appendix A. The survey begins with instructions and is then organized into two sections:


Section 1. Confined population asks about the confined population on May 31, 2018; counts by age group and sex; counts by U.S. citizenship and conviction status; counts by foreign-born status and conviction status; counts by conditional release status; and counts by offense type. The total confined population (Question 1) is a current item on the ASJ and COJ and is included in the cognitive test for reference because it provides instructions on how to count the confined population, which the later questions refer to.


Section 2. Facility programs, opioid testing, screening and treatment asks about jail policies and practices on opioid testing, screening, services provided for opioid use disorders, number of inmates screened positive for opioid use disorders, and the number of inmates receiving MAT.


Interview. The semi-structured interview guide is provided as Appendix B. The interview guide includes an introductory script that interviewers will read verbatim to ensure that participants have a consistent understanding of the cognitive interview process. The interview guide also contains a list of probes that interviewers will use during the cognitive interview. Some probes are required; other probes are conditional and will be asked based on the content and direction of the interview. Probes listed in the interview guide are called “scripted probes.” The interviewer may also ask unscripted, “spontaneous probes” to inquire more deeply about issues raised by respondents (e.g., Could you tell me more about that?). All probes are neutral and non-directive.


The interviewers will record notes in real-time, but will not audio record the conversation.



Participant Recruitment and Follow-up Emails

RTI will field the cognitive test (online survey and follow-up interviews) using the following recruitment strategy, with periodic reminders to ensure survey completion. RTI recently used this strategy successfully in the cognitive test for the Annual Survey of Probation and Parole, achieving high response rates.


Lead email. Sampled agencies for each of the three rounds will first be contacted via a lead email sent from the project manager at BJS (Appendix C1). This email will:

  • Outline the purpose of the cognitive test and describe the two-part procedure (i.e., completing the online form and participating in a phone interview);

  • Introduce RTI as a data collection partner;

  • State that BJS will not publish or release individual jail data and that participation is voluntary;

  • Encourage interested agencies to contact RTI (either by email or phone).



Recruitment email. RTI will send a follow-up recruitment email (Appendix C2) five days after the BJS lead email to sampled agencies that have not yet responded to BJS’s initial request. This email will:

  • Restate the purpose of the cognitive test and describe the two-part procedure (i.e., completing the online form and participating in a phone interview);

  • Emphasize that BJS will not publish or release individual jail data;

  • Request that the agency contact the RTI recruiter (either by email or phone) to schedule an appointment.

Scheduling phone call. After an agency contacts the RTI recruiter, RTI will reply within 1 business day to schedule an interview (call script in Appendix C3). RTI will schedule interviews at the agency’s discretion, at least seven business days in the future, to allow the agency time to complete the cognitive test form.


Confirmation email. After scheduling an interview, the RTI recruiter will send a confirmation email (Appendix C4) that includes the date and time of the interview, the name of the interviewer, and a link to the online cognitive test form. This email will request that the participant complete the form at least two business days before their scheduled interview. The email will also remind participants that the data they provide will be used solely for cognitive testing purposes.


Survey reminder email. If a participant has not completed the cognitive test form online two days before the scheduled interview, RTI will send an email (Appendix C5) to the agency to remind them of the task.


Interview reminder email. One business day before each scheduled cognitive interview, RTI will send a reminder email (Appendix C6) to the participating agency that includes the date and time of the interview, as well as the name of the interviewer. This email will also instruct participants to contact RTI if they are unable to generate a report of their responses so that one may be provided to them prior to the interview.


Thank-you email. In addition to the contacts above, RTI will send a follow-up thank-you email (Appendix C7) to participating agencies after the cognitive interview.


Collection-closed email. If an agency contacts RTI to participate in a cognitive interview after data collection has closed (overall or for its stratum), RTI will send an email (Appendix C8) thanking them for their interest and informing them that data collection has closed and their participation is no longer needed.


Table 2 outlines the proposed flow for sending recruitment, confirmation, and reminder emails.






Table 2. Recruitment Email Schedule

Contact                                                              Timing
BJS sends lead email (Appendix C1)                                   Initial contact
RTI sends recruitment email (Appendix C2)                            5 days after lead email
RTI calls interested agencies to schedule interview (Appendix C3)    As soon as an interested agency contacts the RTI recruiter
RTI sends participant confirmation email (Appendix C4)               Day of scheduling
RTI sends reminder to complete cognitive test form (Appendix C5)*    2 days before interview
RTI sends appointment reminder email (Appendix C6)                   1 day before interview
RTI sends thank-you email (Appendix C7)                              1 day after interview
RTI sends “collection closed” email (Appendix C8)                    As needed

*Sent to participants that have not completed the web form 2 days prior to their scheduled interview.


Burden Hours Estimated for Pre-test and Follow-up Interview

BJS estimates the total respondent burden to be 91 hours. RTI will reach out to 100 jails with the goal of recruiting 49 respondents. Based on recent experience with a similar project (i.e., the Annual Survey of Probation and Parole cognitive test), BJS expects to complete close to 49 interviews using the recruitment protocol outlined above. The cognitive test will be administered to 49 respondents, with an average self-administration time of 30 minutes and an additional 60 minutes for the follow-up interview. If fewer than 49 respondents end up participating in the cognitive test, the total burden hours will be less than 91.
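
As a check on the figures in table 3, the 91-hour total follows directly from the counts and average times above; the short Python sketch below reproduces the arithmetic (rounding each row up to whole hours is an assumption made here that matches the published row totals).

import math

# Burden rows from table 3: (purpose of contact, number of data providers, minutes each).
rows = [
    ("Recruitment", 100, 10),
    ("Cognitive test form", 49, 30),
    ("Follow-up interview", 49, 60),
]

total = 0
for purpose, n, minutes in rows:
    hours = math.ceil(n * minutes / 60)  # e.g., 100 x 10 / 60 = 16.7 -> 17 hours
    total += hours
    print(f"{purpose}: {hours} hours")

print(f"Total burden: {total} hours")  # 17 + 25 + 49 = 91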


Table 3. Summary of burden hours for the Jail Survey Cognitive Test

Reporting mode        Purpose of contact                                  Number of        Average          Total
                                                                          data providers   reporting time   burden hours
Email and telephone   Recruitment                                         100              10 min           17
Online                Complete the 2018 Jail Survey Cognitive Test form   49               30 min           25
Email and telephone   Respondent follow-up interview                      49               60 min           49
Total                                                                                                       91


Efforts to Identify Duplication

BJS conducted a literature review and environmental scan to identify any potentially duplicative data collection efforts currently taking place in the field. None were identified.



Informed Consent and Data Confidentiality

The surveys are not intended to collect individually identifying information or to ask for information that would otherwise be considered sensitive in nature. As such, the activities associated with this task are not considered human subjects research. Both the interview guide (Appendix B) and the initial cognitive test invitation (Appendix C1) contain an informed consent statement. Respondents will be informed that participation in the survey is voluntary, that they may decline to answer any or all questions, and that they may stop their participation at any time. Data collected from the cognitive test will be used only to improve the COJ and ASJ survey instruments. Individual jail responses collected from the cognitive test will not be published or released.


Data Security

Information collected from the surveys will be stored on RTI’s computer network, which resides behind RTI’s firewall and meets all federal security standards. Precautions will be taken to protect respondents’ contact information by maintaining responses on a password-protected computer for analysis. Because the survey elicits aggregate factual information about jail populations and programs, and the only personal data collected are the name and business contact information needed for follow-up questions, the data collection does not involve human subjects research.


Contact Information

Questions regarding any aspect of this project can be directed to:

Zhen Zeng, Statistician

Bureau of Justice Statistics

U.S. Department of Justice

810 7th Street NW

Washington, DC 20531

Office Phone: 202-598-9955

Email: Zhen.Zeng@usdoj.gov

Appendices

  A. Paper version of the 2018 Jail Survey Cognitive Test

  B. Interview guide

  C. Respondent communication materials (emails and call script)











1 James, Doris. 2004. Profile of Jail Inmates, 2002. http://www.bjs.gov/index.cfm?ty=pbdetail&iid=1118. Carson, Elizabeth and William S. Sabol. 2016. Aging of the State Prison Population, 1993–2013. http://www.bjs.gov/index.cfm?ty=pbdetail&iid=5602

2 James, Doris. 2004. Profile of Jail Inmates, 2002. http://www.bjs.gov/index.cfm?ty=pbdetail&iid=1118.

3 A probation or parole technical violation is misbehavior by an offender under supervision that is not by itself a criminal offense and generally does not result in arrest (e.g., failing to report for a scheduled office visit).

4 Bronson, Jennifer. 2017. Drug Use, Dependence, and Abuse among State Prisoners and Jail Inmates, 2007-2009. http://www.bjs.gov/index.cfm?ty=pbdetail&iid=5966.


