MEMORANDUM
TO: Robert Sivinski
Office of Statistical and Science Policy
Office of Management and Budget
THROUGH: Kevin M. Scott
Acting Director
Bureau of Justice Statistics
Shelley S. Hyland
Senior Statistical Advisor
Bureau of Justice Statistics
Kristin Tennyson
Deputy Director, Statistical Collections
Bureau of Justice Statistics
Rich Kluckow
Chief, Prisons Corrections Statistics Unit
Bureau of Justice Statistics
FROM: Emily Buehler
Statistician, Prisons Corrections Statistics Unit
Bureau of Justice Statistics
DATE: August 12, 2024
SUBJECT: BJS request to conduct a field test for the Survey of Prison Inmates Research & Development under the OMB generic clearance agreement (OMB Number 1121-0339).
The Bureau of Justice Statistics (BJS) requests clearance under its generic clearance agreement (OMB Control Number 1121-0339) to conduct a field test of the 2016 Survey of Prison Inmates (SPI) instrument as part of the SPI Research and Development (SPI R&D) project. The central goals of this field test are to (1) examine whether facilities can feasibly support virtual survey administration, (2) test for potential differences in consent to data linkage based on the length of time for which survey responses could be linked to other data sources, and (3) measure the effects of administration mode on sample member participation and data quality. The field test will use a two-stage design. In the first stage, a convenience sample of state Departments of Correction (DOCs) and the Federal Bureau of Prisons (FBOP) will be recruited to participate, and a total of 40 facilities will be selected based on their resources and capacity to support the research (30 facilities for the primary sample and 10 as potential replacements). In the second stage, a random sample of 100 incarcerated people will be selected from each facility and assigned to complete the approximately hour-long SPI interview either face-to-face with an interviewer on site (50 persons) or virtually over a video conferencing platform (50 persons).
BJS recognizes that the scope of effort needed to successfully conduct the SPI places significant burden on facilities and incarcerated persons and has taken several measures to minimize that burden. Protocols established to reduce burden on facilities include customizing the data collection schedule and limiting the number of days spent in the facility to conduct data collection. The protocols allow for flexibility, given that facilities vary in terms of interviewing space, the number of days and hours each day when interviewing can be conducted, specific rules regarding items that may be brought into the prison, and instructions for arriving at the facility. Furthermore, BJS made efforts to minimize the burden of the SPI administration on participants. The 2016 SPI questionnaire was designed and cognitively tested to maximize respondent comprehension. Also, the interview length was reduced from an average of 83 minutes in the 2013 SPI Pilot Study to an estimated 60 minutes, including the informed consent process, for the 2016 national implementation.
These efforts notwithstanding, opportunities exist to further reduce burden and enhance SPI data quality and utility for the next iteration through innovations to the data collection methodology. The increased capacity of facilities to support virtual Computer-Assisted Personal Interviews (CAPI) – largely resulting from technological changes made during the COVID-19 pandemic – offers the potential to revisit data collection modes. BJS, with its data collection agent RTI International (RTI), will develop and test a virtual CAPI SPI to complement a traditional in-person CAPI. Including a virtual CAPI option will help accommodate facilities' preferences and respondents' circumstances that might affect their ability to participate, as well as improve data collection efficiency. Indeed, using new, and even multiple, interview modes for data collection may reduce facility and respondent burden, enhance accessibility, improve response rates, and reduce nonresponse bias.
BJS is the primary statistical agency of the Department of Justice. The mission of BJS is to collect, analyze, publish, and disseminate information on crime, criminal offenders, victims of crime, and the operation of justice systems at all levels of government. BJS also provides financial and technical support to state, local, and tribal governments to improve both their statistical capabilities and the quality and utility of their criminal history records. In fulfilling its mission to collect data that can be used to develop crime-related policy, administer fair and efficient legal processes, and advance public safety, the Bureau collects information regarding the prison population.
RTI is an independent, nonprofit research institute dedicated to improving the human condition. RTI served as the primary data collector for the 2016 SPI, recruiting 306 state and 58 federal prisons (94% of the sampled facilities) and completing 24,848 interviews in English (94%) and Spanish (6%) using CAPI (Glaze, 2019).
The SPI (OMB Control Number 1121-0152) is a periodic, cross-sectional survey of state and federal prison populations nationwide. BJS first fielded the SPI in 1974 and has collected data six more times since, most recently in 2016. SPI fits within the larger BJS portfolio of collections that inform the nation on the nature and composition of, and changes in, the population sentenced to state and federal prisons. BJS's National Prisoner Statistics program (NPS-1, OMB Control Number 1121-0102) collects annual aggregate counts of the number of persons incarcerated in state and federal prisons, as well as the number admitted to and released from prisons annually. BJS uses NPS to report on the size of the prison population, changes in that size over time, and the flow of persons through state and federal prison systems. The National Corrections Reporting Program (NCRP, OMB Control Number 1121-0065) collects individual-level administrative records on prison admissions, releases, stock population, and discharges from post-custody community supervision in almost all states.
Like prior administrations of the survey, the 2016 SPI used a stratified two-stage sample design in which prisons were selected in the first stage and incarcerated persons within sampled prisons were selected in the second stage. The prison sample was selected from a universe of 2,001 unique prisons (1,808 state and 193 federal) that were either enumerated in the 2012 Census of State and Federal Correctional Facilities or had opened between the completion of the census and July 2014. A total of 364 prisons (306 state and 58 federal) participated in the 2016 SPI out of 370 selected that were eligible (312 state and 58 federal). The first-stage response rate among selected prisons was 98.4% (98.1% among state prisons and 100% among federal prisons). A total of 24,848 incarcerated persons participated in the 2016 SPI (20,064 state and 4,784 federal), based on an eligible sample of 35,509 persons (28,934 state and 6,575 federal). The second-stage response rate among selected persons was 70.0% (69.3% among persons incarcerated in state prisons and 72.8% among those in federal prisons).
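The response rates above are straightforward ratios of participating to eligible units at each stage. The following minimal Python sketch, included for illustration only, recomputes them from the counts cited in this section.

```python
# Recompute the 2016 SPI first- and second-stage response rates from the counts above.
def rate(participated: int, eligible: int) -> float:
    """Response rate as a percentage, rounded to one decimal place."""
    return round(100 * participated / eligible, 1)

# First stage: facilities (state, federal)
print(rate(306 + 58, 312 + 58))  # 98.4 overall
print(rate(306, 312))            # 98.1 state
print(rate(58, 58))              # 100.0 federal

# Second stage: incarcerated persons (state, federal)
print(rate(20_064 + 4_784, 28_934 + 6_575))  # 70.0 overall
print(rate(20_064, 28_934))                  # 69.3 state
print(rate(4_784, 6_575))                    # 72.8 federal
```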
SPI is one of BJS’s most prominent collections and plays a vital role in generating national estimates on risks and needs among individuals sentenced to state and federal prisons. Given the importance of SPI, there is a pressing need to ensure that the survey content continues to address both longstanding and emerging issues affecting the incarcerated population and facilities in which they are housed, and that the survey research methods reflect best practices.
SPI has been a vital source of national statistics on characteristics of adults incarcerated in prisons. SPI revealed that people held in prisons face significant health and socioeconomic challenges. Among individuals in state prisons in 2016, 51% reported chronic health conditions, 43% had a history of mental health issues, and 31% and 39% were under the influence of alcohol or drugs, respectively, at the time of their offense (Maruschak et al., 2021a,b,c). Moreover, approximately one-quarter of people in prison were unemployed and not seeking employment in the 30 days before arrest, and nearly two-thirds lacked a high school education prior to admission to prison (Beatty & Snell, 2021; Maruschak & Snell, 2023). SPI data are also used to track changes over time, describe special populations, and identify policy-relevant shifts in prison populations.
The SPI 2016 instrument will be used for this field test (Appendix A). The only revisions to the instrument since the national survey was conducted in 2016 are programming changes to fix errors in skip logic or other programming specifications. Footnotes describing these errors are noted in the instrument. BJS will not be implementing Statistical Policy Directive 15 (SPD15) for this field test but will update the race and ethnicity items according to SPD15 for the next full SPI data collection.
The 2019 Census of State and Federal Adult Correctional Facilities (CCF) will serve as an initial frame for the field test and will inform the size of each domain in the population (e.g., state and federal, public and private operator, security level, gender of population). Based on the 2019 CCF, there were 968 state and 111 federal prisons. A two-stage design will be used for the field test. Since the field test will not be used to produce national estimates, the first stage will include a convenience sample of state DOCs and facilities that reflect the range of facilities eligible for the next SPI, including federal facilities under the jurisdiction of the FBOP. BJS and RTI will leverage existing relationships with several DOCs and the FBOP to gain cooperation from up to seven state DOCs and the federal system. From the selected DOCs and the FBOP, 40 facilities will be selected for the field test. The main sample will include 30 facilities. Ten additional facilities will be held in reserve as replacements for any facilities in the main sample that are unable or unwilling to participate in the field test.
In the second stage, a random sample of incarcerated people will be selected from each participating facility. RTI will request rosters of incarcerated adults from each facility 3 weeks prior to the data collection visit. The roster will include facility ID number, state or FBOP number, FBI fingerprint ID number, full name, housing unit, gender, date of birth, race, ethnicity, current prison admission date, sentence length, and offense type (FBOP facilities only). This initial roster will be used to confirm that the facility can provide the needed information in a suitable format. RTI will request that a second roster be sent no more than one week prior to data collection. Receiving the roster as close as possible to data collection will ensure the person-level sampling frame is as accurate as possible. Once the roster is received, each record will be reviewed to confirm eligibility for the field test. Eligible participants include all incarcerated people aged 18 and older who are held in a state prison or are serving a sentence in a federal prison in the United States. Any ineligible people will be removed from the roster prior to sampling. RTI statisticians will use this second roster to select a random sample of up to 100 incarcerated people within each selected facility. If a sample of 100 would exceed 75% of a facility's population, the sample size will be capped at 75% of the facility population to avoid overburdening small facilities included in the sample. RTI will consult with the DOC during recruitment and approval of the research application to avoid initially selecting facilities that may be too small. The resulting sample will be approximately 3,000 incarcerated adults (i.e., 100 in each of 30 facilities).
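The within-facility selection rule can be summarized as drawing up to 100 people at random, capped at 75% of the eligible roster for small facilities. The Python sketch below is illustrative only (it is not RTI's sampling program) and uses hypothetical roster identifiers.

```python
import random

def select_sample(eligible_roster: list[str], target: int = 100,
                  cap_fraction: float = 0.75, seed: int | None = None) -> list[str]:
    """Randomly select up to `target` people, capped at `cap_fraction` of the roster."""
    n = min(target, int(cap_fraction * len(eligible_roster)))
    return random.Random(seed).sample(eligible_roster, n)

# A hypothetical facility with 80 eligible people yields a sample of 60 (75% cap);
# a facility with 600 eligible people yields the full target of 100.
print(len(select_sample([f"person_{i}" for i in range(80)], seed=1)))   # 60
print(len(select_sample([f"person_{i}" for i in range(600)], seed=1)))  # 100
```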
The field test will assess potential mode differences by randomly assigning selected incarcerated participants within each facility to be interviewed either in-person or virtually. Both field test data collection modes are interviewer-administered; however, there could be mode effects based on whether the interviewer is physically present during the interview. Within each facility, 50% of sampled incarcerated participants will be randomly assigned to receive an in-person interview and 50% of sampled incarcerated participants will be randomly assigned to participate in a virtual interview.
The field test will also assess differences in consent rates for data linkage by randomly assigning selected incarcerated individuals within each facility to receive a consent form that references either a 5-year or 10-year administrative data linkage period. The consent process, including the data linkage component, is further detailed in Section 6.
Coupled with the mode assignment detailed in Section 4.1C, approximately 25% of sampled participants will be interviewed virtually with a 5-year data linkage request; 25% of sampled participants will be interviewed virtually with a 10-year data linkage request; 25% of sampled participants will be interviewed in-person with a 5-year data linkage request; and 25% of sampled participants will be interviewed in-person with a 10-year data linkage request.
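Taken together, the mode and data linkage assignments describe a 2 x 2 within-facility randomization. The sketch below is a minimal illustration of one way such an assignment could be implemented; it is not the project's actual randomization procedure.

```python
import itertools
import random

def assign_conditions(sample_ids: list[str], seed: int | None = None) -> dict[str, tuple[str, str]]:
    """Assign each sampled person to one of four mode x linkage cells (~25% each)."""
    rng = random.Random(seed)
    ids = sample_ids[:]
    rng.shuffle(ids)
    cells = list(itertools.product(["in-person", "virtual"], ["5-year", "10-year"]))
    # Cycle through the four cells so each receives an equal share of the shuffled sample.
    return {person: cells[i % 4] for i, person in enumerate(ids)}

# With 100 sampled people per facility, each of the four cells receives 25 assignments.
assignments = assign_conditions([f"person_{i}" for i in range(100)], seed=7)
```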
The sample size of approximately 3,000 incarcerated adults was chosen to ensure differences in outcomes between the two modes can be detected at the desired level of precision and with adequate power. Based on previous experience, BJS assumes a response rate of 50%, yielding approximately 1,500 completed interviews, or roughly 750 completed field test interviews per mode. After accounting for design effects, we will be able to detect differences in prevalence rates between modes of 5-7 percentage points or greater at a 5% significance level and with 80% power. BJS also assumes a similar result of 750 completed interviews for each data linkage consent period. We will be able to detect differences in consent to data linkage across the two time periods of 5-7 percentage points or greater at a 5% significance level and with 80% power.
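The detectable-difference figure can be approximated with a standard two-sided, two-proportion normal approximation. The Python sketch below illustrates the calculation; the baseline prevalences and the design effect value are illustrative assumptions rather than figures specified in this memo.

```python
from math import sqrt
from statistics import NormalDist

def min_detectable_diff(n_per_group: int, baseline_p: float, deff: float = 1.0,
                        alpha: float = 0.05, power: float = 0.80) -> float:
    """Approximate minimum detectable difference between two equal-size groups."""
    z = NormalDist()
    z_alpha = z.inv_cdf(1 - alpha / 2)   # ~1.96 for a two-sided 5% test
    z_beta = z.inv_cdf(power)            # ~0.84 for 80% power
    n_eff = n_per_group / deff           # design effect reduces effective sample size
    return (z_alpha + z_beta) * sqrt(2 * baseline_p * (1 - baseline_p) / n_eff)

for p in (0.15, 0.30, 0.50):
    print(f"baseline {p:.2f}: MDD ~ {min_detectable_diff(750, p, deff=1.1):.3f}")
# Yields roughly 0.05-0.08 across these baselines, consistent with the 5-7 point range above.
```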
Recruitment efforts will include engagement with approximately seven DOCs and the FBOP. Forty facilities will be sampled across the states and the FBOP; 30 of these will serve as the main sample and 10 will serve as reserve facilities. Initial outreach will be by mail to the agency heads. A letter from BJS to the DOC and FBOP agency heads (Appendix B) will introduce the field test and request a point of contact (POC). RTI will reach out to any nonresponding agency heads via phone and email to determine willingness to participate. Once identified, RTI, on BJS's behalf, will send a letter to each POC, notifying them of their agency's approval and providing information about the field test (Appendix C). RTI will then follow up by phone with each POC to discuss study goals, general data collection protocols, and any requirements for review and approval of the study protocol by an Institutional Review Board or other research review committee.
Additional discussions between RTI and the POCs will address logistics (e.g., available interview space, technological capacity to support virtual CAPI, roster procedures, participant incentives, interviewer background checks, and data collection schedule). RTI will prepare and submit any required materials for research committee review and the information needed for interviewer clearance.1 Between the time a facility agrees to participate and the launch of data collection at the facility, RTI will document all approved plans and procedures and share that documentation with the POC to ensure a mutual understanding of the protocol. In the days immediately prior to data collection, RTI will confirm the plan and arrange for study staff (i.e., a site coordinator and interviewing team) to travel to the facility in accordance with the agreed-upon schedule. A document of Frequently Asked Questions (FAQs) will be provided to facility staff to assist in their planning and execution of the field test (Appendix D).
A key element of recruitment and engagement is the provision of rosters (Appendix E) using instructions provided by RTI (Appendix F). The week prior to the start of data collection at a given facility, the facility (or DOC/FBOP) will provide RTI with a roster of all eligible persons. The roster will include facility ID number, state or FBOP number, FBI fingerprint ID number, full name, housing unit, gender, date of birth, race, ethnicity, current prison admission date, sentence length, and offense type (FBOP facilities only). To protect sensitive personally identifiable information (PII), POCs will be asked to password protect or encrypt the roster file and upload it via an FTP site, or to send it via encrypted e-mail if they are unable to use the FTP site.
Data collection site visits will be scheduled for up to five business days, Monday through Friday. Per a logistics plan developed in advance by RTI in collaboration with each facility, field staff will arrive as a group to reduce the burden on facility staff conducting the check-in process. The study staff will be escorted to the interview room(s), where they will work with facility staff to ensure the space meets privacy and technical requirements. The site coordinator will confirm the daily schedule, including any pauses for facility population counts or lunch.
The site coordinator will provide designated facility staff with the roster of sampled persons and coordinate with facility staff on sample flow. To manage the flow, they will request that facility staff retrieve no more than two individuals at a time per interviewer and adjust the flow as needed to keep interviews proceeding efficiently. This strategy reduces the waiting time of sampled persons, thus reducing refusals. It also avoids too many people waiting together at any one time, which may interfere with facility protocols. Once an incarcerated individual is escorted to the interview location, the interviewer will verify their identity. If a sample member is unavailable when called, interviewers will make at least two subsequent attempts to interview the individual during the data collection visit.
After the person has been escorted to the interview location, an interviewer will administer the consent protocol (Appendix G). During the consent process, sample members will be informed of the planned five- or ten-year linkage of their survey data and administrative records. If the person declines data linkage, the interviewer will note this refusal, but the survey can still be administered.
RTI will work with BJS to develop specific recruitment goals for participant characteristics. Sampled individuals who are not willing to meet with the interviewer at the time they are called because they have other important activities (e.g., work, class, visits with attorney or family, etc.) will be called again later in the week with the hope that they will be willing to meet with the interviewer and hear about the study. Similarly, individuals who refuse to leave their housing unit to meet with the interviewer will be called again later in the week, unless the facility will not allow the second call. If a sampled individual again refuses to meet with the interviewer, the interviewer will finalize the case as a refusal.
RTI has found that offering a small incentive to sample members helps to motivate participation (e.g., a $2 snack or contribution to participants' prison account).2 For example, on the National Inmate Survey (NIS-3), the average response rate in prisons that allowed incentives was 66%, compared to 57% in those that did not allow incentives (Caspar et al., 2012). During logistics planning for this field test, RTI will discuss with facility POCs the potential use of incentives. Incentives are subject to approval at both the DOC and facility level. Language describing the incentive will be added to the standard consent form if its use is approved. In the case of virtual interviewing, the on-site coordinator will be responsible for giving the incentives to the designated facility staff before sample members are brought to the interview areas. Incentives will be given to each sample member who comes to an interviewing area to meet with an interviewer virtually (regardless of their decision about participating in the interview).
As shown in Table 1, BJS has estimated the total respondent burden for the proposed 2025 SPI field test at 4,517 hours. This estimate is based on the number of selected facilities and incarcerated individuals, not the numbers of expected participating facilities and individuals, which would rely on assumptions about nonresponse rates. Therefore, this estimate represents the maximum estimated burden expected for the field test, including obtaining rosters from facilities and agencies, prison staff escorting sample members to and from interview rooms, participants completing the consent process and interview, and post-data collection follow-up with DOCs and the FBOP.
Based on BJS's experience with other surveys of incarcerated adults, such as the 2016 SPI and NIS-3, facility staff burden is estimated at approximately 0.5 hours per facility to respond to the roster request and provide updates. Additionally, staff will spend approximately 0.5 hours per inmate to escort them to and from the interviewing room.
The field test will involve a sample of 30 state and federal facilities from which approximately 3,000 incarcerated individuals will be sampled to participate. The estimated time to complete the consent process and survey is 60 minutes.
Table 1: Burden Associated with Planned Testing
Category of Respondent and Activity | Maximum number of respondents | Estimated burden per respondent (minutes) | Total burden (hours)
State and federal governments – provide points of contact for facilities that will be part of the sample | 8 | 15 | 2
State and federal governments – provide preliminary and final rosters from which incarcerated individuals are sampled and then assigned to in-person or virtual interview conditions | 30 | 30 | 15
In-person interviews | | |
State and federal governments – staff escorts sampled participants | 1,500 | 30 | 750
Incarcerated individuals – participant interview time | 1,500 | 60 | 1,500
Virtual interviews | | |
State and federal governments – staff escorts sampled participants | 1,500 | 30 | 750
Incarcerated individuals – participant interview time | 1,500 | 60 | 1,500
Total burden | 6,030 | | 4,517
Note: These estimates represent the number of facilities and inmates that will be sampled, not the number expected to participate. As such, these burden hours represent the maximum burden.
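As a cross-check, the 4,517-hour total can be reproduced directly from the respondent counts and per-respondent minutes in Table 1. The short Python sketch below is illustrative only.

```python
# Tally the maximum burden hours from Table 1 (respondents x minutes per respondent).
activities = [
    ("POC identification (DOCs/FBOP)", 8, 15),
    ("Preliminary and final rosters (facilities)", 30, 30),
    ("Escorts, in-person interviews", 1_500, 30),
    ("Interviews, in-person", 1_500, 60),
    ("Escorts, virtual interviews", 1_500, 30),
    ("Interviews, virtual", 1_500, 60),
]

total_hours = sum(count * minutes for _, count, minutes in activities) / 60
print(total_hours)  # 4517.0, matching the total reported in Table 1
```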
The virtual and in-person interviews using the 2016 SPI instrument will be conducted in both English and Spanish.
Table 2. Timeline of 2025 SPI Field Test
Milestone | Start Date | End Date
Obtain OMB generic clearance | 9/1/2024 | 9/15/2024
Recruit DOCs and the FBOP and conduct logistical planning with selected facilities | 9/30/2024 | 2/19/2025
Field testing period | 1/16/2025 | 5/1/2025
Analyze SPI response data for mode effects and debrief with facilities and interviewers | 5/9/2025 | 6/9/2025
Draft and deliver final report | 5/16/2025 | 9/4/2025
The interviewer will read the consent form (Appendix G) to all participants at the start of the interview. The form provides information about the purpose of the 2025 SPI Field Test, the voluntary nature of the study, how the respondent was selected, what types of questions they will be asked, how their responses will be recorded, possible risks and benefits of participating, and confidentiality assurances. The consent form also includes an option related to data linkage. As detailed in Section 4.1D, Random Assignment to Data Linkage Timeframe, sampled individuals will be randomly selected to receive a consent form with either a 5-year or a 10-year data linkage request. Consent to data linkage is not required to complete the interview; participants can decline this option and still continue with the interview. Participants will be given information about whom to contact with questions about the study. The form also states the estimated length of the interview in advance, giving the participant an opportunity to decline. Interviewers will then confirm that they have read all required informed consent text to the respondents and proceed with the interview.
BJS is authorized to conduct this data collection under 34 U.S.C. § 10132. BJS may only use the information it collects for statistical or research purposes, consistent with 34 U.S.C. § 10134. BJS is required to protect information identifiable to a private person from unauthorized disclosure and may not publicly release data in a way that could reasonably identify a specific private person [34 U.S.C. § 10231 and 28 CFR Part 22]. Any person who violates these provisions may be punished by a fine up to $10,000, in addition to any other penalties imposed by law. Further, per the Cybersecurity Enhancement Act of 2015, federal information systems are protected from malicious activities through cybersecurity screening of transmitted data.
The BJS Data Protection Guidelines provide more detailed information on how BJS and its data collection agents will use and protect data collected under BJS’s authority. All data obtained will be maintained on secure servers at BJS and RTI and will not be shared with third parties.
As noted in Section 4.3, there will be an option to offer an incentive to sampled participants. If all DOCs and facilities agree to a $2 snack or contribution to a participant's prison account, and all sampled incarcerated adults agree to participate and therefore receive the incentive, the maximum total costs are shown in Table 3. This cost is already accounted for in the SPI R&D award and budget submitted by RTI. This is not an additional cost that would be incurred by the federal government.
Table 3. Projected maximum costs related to participant compensation
Type of participants being compensated | Number of participants | Compensation amount | Total
Incarcerated adults interviewed in-person | 1,500 | $2 | $3,000
Incarcerated adults interviewed virtually | 1,500 | $2 | $3,000
Total | 3,000 | | $6,000
There will be no compensation to DOCs or the FBOP for their participation.
The estimated annual cost to the Federal government is $1,069,452 for RTI’s portion of the work directly tied to the field test tasks and an estimated $48,659 for the GS-13 project manager’s work, resulting in a total estimated cost of $1,118,111 [$1,069,452 + $48,659].
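The cost figures above combine two simple calculations: the maximum participant compensation from Table 3 and the sum of RTI's field-test costs and the GS-13 project manager's time. A trivial Python sketch, for illustration:

```python
max_incentive_cost = 3_000 * 2                 # 3,000 sampled adults x $2 each = $6,000
total_federal_cost = 1_069_452 + 48_659        # RTI field test work + GS-13 project manager
print(max_incentive_cost, total_federal_cost)  # 6000 1118111
```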
RTI will analyze the mode effects across a range of survey outcomes. Despite the random mode assignment, differences in the sample composition may still exist. Using a multivariate modeling approach can help to correct for differences between the in-person respondents and virtual respondents to isolate the effect of mode on outcomes and to suggest whether simple differences by mode may be mitigated by weighting. This analysis will inform whether there could be a break in trend from previous SPI implementations and how data collected by different modes can be compared over time.
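The memo does not specify the multivariate model. As one illustration only (not RTI's specified analysis plan), the sketch below fits a logistic regression of a binary survey outcome on assigned mode plus roster covariates; the file name and variable names are hypothetical.

```python
import pandas as pd
import statsmodels.formula.api as smf

# One row per respondent: a binary outcome, the assigned mode, and roster covariates.
df = pd.read_csv("spi_field_test_respondents.csv")   # hypothetical file name

model = smf.logit(
    "reported_chronic_condition ~ C(mode) + C(sex) + age + C(race) + C(offense_type)",
    data=df,
).fit()
print(model.summary())   # the mode coefficient estimates the covariate-adjusted mode effect
```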
In addition to evaluating the effects of mode on survey outcomes, differences in survey administration and data quality will also be assessed. This assessment will address the following questions (a sketch of such comparisons appears after the list):
How do unit response and item response rates differ by mode? Does in-person CAPI maintain respondent engagement better than virtual CAPI, resulting in higher survey and item completion rates?
Are sampled incarcerated individuals who do not participate in the study fundamentally different from those who do, and are these differences consistent across mode assignment? Administrative data (e.g., sex, age, race) from the incarcerated population roster provided by each facility will allow RTI and BJS to evaluate whether certain groups are more or less likely to respond in a particular mode.
How long did the interviews take by mode? Instrument length will be reviewed by sub-section and in total to inform decisions about the addition or deletion of content and the length of the data collection period within each facility, and to accurately estimate respondent burden for the next iteration of SPI.
Were there any survey items with unusually long administration times, and did this vary by mode?
Such items will be reviewed to determine whether they need to be clarified to reduce confusion or whether they create excessive burden on respondents and should therefore be omitted from the instrument.
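As referenced above, the sketch below illustrates (with hypothetical file and variable names) the kinds of comparisons these questions imply: unit and item response rates and interview timings compared across modes.

```python
import pandas as pd

# One row per sampled person, with completion status, assigned mode, and timing paradata.
df = pd.read_csv("spi_field_test_paradata.csv")   # hypothetical file name

# Unit response rate by mode.
print(df.groupby("mode")["completed_interview"].mean())

completed = df[df["completed_interview"] == 1]

# Item nonresponse: average share of "don't know"/refused items per respondent, by mode.
print(completed.groupby("mode")["item_missing_rate"].mean())

# Interview length by mode, in total and for selected instrument sections (minutes).
print(completed.groupby("mode")["total_minutes"].describe())
print(completed.groupby("mode")[["section_a_minutes", "section_b_minutes"]].median())
```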
Debriefings with sampled facilities and interviewers will also be conducted to help prepare for the next iteration of SPI. These debriefings will address the following questions:
What is the impact of mode on facility burden? Does the use of virtual interviews increase a facility's ability and willingness to participate in the collection by reducing the number of complex background checks and onsite security procedures, or does it reduce the likelihood of participation because of technological constraints (e.g., limited availability of internet connectivity, lack of incarcerated individuals' access to devices)?
Were there technological issues (e.g., unstable internet connection) that disrupted the virtual interview process?
Could privacy be maintained during virtual interviews?
Were interviewers able to help respondents understand questions and provide support to those who experienced distress?
Did interviewers detect differences in levels of engagement when conducting virtual interviews?
Lastly, RTI will analyze the effects of the two data linkage timeframes (5 years and 10 years) on survey participation and consent to data linkage.
Upon completion of the field test of in-person and virtual interviewing modes of the SPI, RTI will provide BJS with a report describing the findings and final recommendations. It will include findings related to the mode effects of conducting the SPI in-person compared to virtually. The report will also address the questions of feasibility of virtual interviewing, an assessment of burden associated with both modes, and recommendations that will inform a national implementation plan for the next full iteration of the SPI.
RTI’s Institutional Review Board (IRB), which has Federal-wide assurance, is currently reviewing the project per 28 CFR 46 and a determination is pending. No recruitment or data collection will take place until the IRB has issued approval for the project. Once this approval is granted, an official notice will be provided. The application that is currently pending with RTI’s IRB is included (Appendix H).
Contact Information
Questions regarding any aspect of this project can be directed to:
Emily Buehler
Statistician
Bureau of Justice Statistics
U.S. Department of Justice
810 7th Street, NW
Washington, DC 20531
Office Phone: 202-598-1036
E-mail: Emily.buehler@usdoj.gov
Appendices:
Appendix A: 2016 Survey of Prison Inmates Instrument
Appendix B: Initial Contact Commissioner Letter
Appendix C: Sampled Facility Letter
Appendix D: Facility FAQs
Appendix E: SPI R&D Example Roster
Appendix F: Roster Request Instructions
Appendix G: Consent Form
Appendix H: IRB Application
1 Per the agencies’ preference, background checks can be conducted by the DOC/FBOP or by the facility. If an interviewer does not pass a facility-level background check for any reason, the interviewer will not be permitted to conduct any interviews at that facility and will be substituted with another interviewer. If an interviewer does not pass a DOC-level background check applicable throughout the state, they will not be permitted to conduct interviews at any of that state’s facilities.
2 The attached consent form (Appendix G) currently mentions only the snack incentive. Based on experience, this is the most likely incentive to be approved for distribution by the agencies. The consent form will be edited to be facility-specific regarding the approved incentive offered.