Annual Surveys of Probation and Parole
OMB Control Number 1121-0064
OMB Expiration Date: 09/30/2023
B. Collection of Information Employing Statistical Methods
Universe and Respondent Selection
The Annual Surveys of Probation and Parole (ASPP) are designed to collect all probation and parole data from community-supervising jurisdictions within each state. The universe includes all federal, state, and locally administered probation and parole departments (N = 912). For parole, there are 54 respondents: 50 central state reporters, the District of Columbia, two from the federal system, and one state that has two respondents. For probation, there are approximately 858 respondents: 40 state reporters and 718 separate city, county, or court reporters. Information is collected from central reporters within each state wherever possible to reduce the burden on individual agencies. In over 20 states, BJS can collect the entire probation population from one respondent that has aggregate totals for the state. In other states, BJS can collect some state-aggregated probation data (e.g., state-supervised felony probation) from a centralized respondent and then collect the remainder of the probation data from individual agencies. The District of Columbia self-reports, and the data for the federal system are obtained indirectly from the Administrative Office of the U.S. Courts through BJS’s Federal Justice Statistics Program. Together, these respondents provide a full census of probation in the United States.
As described in Part A, since 2012, BJS has been working to update and validate the Annual Survey of Probation frame. For the 2019 collection, BJS added 66 new agencies to the frame, all of which indicated they supervised at least one person convicted of a felony. For the 2023 collection, BJS will administer the revised CJ-8 to 250 probation agencies that supervise people on probation for a felony offense. Additionally, BJS will introduce the CJ-8M to about 600 probation agencies and courts that indicated they supervise only people on probation for a misdemeanor offense. BJS will confirm that these agencies supervised adults on probation in the past year and gather the critical data elements that will allow BJS to produce a comprehensive, valid census of the probation population. Each year of the collection will confirm and refine the frame, resulting in an up-to-date list of agencies to include in the survey each year.
Routinely, BJS employs methods to maintain the accuracy and the completeness of the population frame for each survey:
Agency staff provide information about newly formed, merged, or closed supervising agencies while data collection is ongoing through data retrieval phone calls and emails. This information is used to update the frame prior to the start of each data collection year.
Close attention is paid to unexplained changes in the total population from the end of one year to the beginning of the next, and to large increases or decreases in the total population during the current reporting year. During survey administration, the previous yearend population is compared with the reported beginning-of-year population and, if there is a difference of 10% or greater, respondents are prompted to review their data. They are then asked to enter a reason for the discrepancy between the two populations.
Following data submission, all data are reviewed. Probation agencies with populations of 100 or more, and parole agencies of any size, whose previous yearend population differs by more than 5% from their reported beginning-of-year population are flagged for review and potential follow-up. The same occurs for probation agencies with a population under 100 and a difference of 10% or greater. BJS’s data collection agent, the Research Triangle Institute (RTI), also reviews the information provided by agencies when annual growth (January 1 to December 31) for the current reporting year exceeds 10%.
During follow-up, RTI uses open-ended probes to determine the reasons for differences between the yearend and beginning-of-year populations, or for changes within the current reporting year. Differences may be explained by a variety of reasons, such as data entry error, a reporting method change, a change in the agency’s responsibility (e.g., an agency has taken responsibility for people supervised on probation or parole who were previously supervised by another agency), or, in the case of within-reporting-year change, a genuine growth or decline of the population.
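The discrepancy checks described above can be sketched as simple threshold rules. This is an illustrative reconstruction, not BJS or RTI code; the thresholds (a 10% in-survey prompt, and post-submission review at 5% for parole agencies and large probation agencies versus 10% for small probation agencies) come from the text, while the function and parameter names are assumptions.

```python
def relative_change(old, new):
    """Absolute change as a fraction of the old value."""
    if old == 0:
        return float("inf") if new != 0 else 0.0
    return abs(new - old) / old

def prompt_for_reason(prev_yearend, begin_year):
    """In-survey check: prompt the respondent for an explanation if the
    beginning-of-year population differs from the prior yearend by 10% or more."""
    return relative_change(prev_yearend, begin_year) >= 0.10

def flag_for_review(population, prev_yearend, begin_year, survey="probation"):
    """Post-submission check: parole agencies of any size and probation
    agencies of 100 or more are flagged at a more-than-5% difference;
    smaller probation agencies at a 10%-or-greater difference."""
    change = relative_change(prev_yearend, begin_year)
    if survey == "parole" or population >= 100:
        return change > 0.05
    return change >= 0.10
```

For example, a probation agency of 200 people reporting a beginning-of-year population 6% off its prior yearend would be flagged for review, while a 50-person agency with an 8% difference would not.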
Over the past 2 years, BJS has continued to have almost 100% response for parole and almost 80% for probation agencies. In 2021, the response rate for the Annual Probation Survey was 76% of surveyed agencies (representing 98% of the 2021 yearend probation population) and the response rate for the Annual Parole Survey was 96% of surveyed agencies (representing over 99% of the 2021 yearend parole population) (table 1).
Table 1. Survey response rates, 2020 and 2021

                                                   2020                  2021
                                             Probation  Parole    Probation  Parole
Survey response rate                           79%       98%        76%       96%
Population of non-response as a
  percent of total population                  1.80%     <0.5%      2.60%     <0.5%

Note: The probation non-response includes unit non-response (71 agencies in 2020 and 74 agencies in 2021) as well as those that provided incomplete data for the four core populations (33 agencies in 2020 and 46 in 2021).
Procedures for Collecting Information
Collection Procedure
BJS emphasizes the web as the primary mode of data collection. Hardcopy forms are sent to respondents only after the due date has passed. Table 2 summarizes the outreach milestones for data collection. For the 2023 collection only, an introduction letter will be sent to all agencies receiving the CJ-8 form to indicate changes to the instruments and allow extra time for any programming requests within the agency (Attachment 10). For both the 2024 and 2025 collections, the first outreach will draw attention to the ASPP collection in advance of the formal request to participate: a pre-notification letter is mailed and emailed to agencies in early November (Attachment 11). The letter explains the purpose and importance of the surveys and the type of information that will be requested, so agencies can plan to retain the yearend information they will need. Agencies will also be asked to update or confirm contact information for the most appropriate person to respond to the survey, either by logging onto the website or by using an enclosed designation form (Attachment 12). They are also asked to indicate whether the designated respondent is from a private company contracted to supervise adults on probation.
In December, all agencies receive a survey invitation letter requesting that they complete the survey on the web (Attachment 13). The letter explains the importance of the survey and provides a link to the most recent BJS Probation and Parole in the United States bulletin, states that participation is voluntary, and thanks them for their involvement. Each agency is provided with a unique user ID and password to securely access the survey website to complete the questionnaire. Shortly after the letter is sent, each agency receives a follow-up email that references their invitation, provides a direct link to the survey website, and encourages their agency’s participation.
After this invitation, other communications inform respondents of the status of data collection or serve to remind them to respond. These include the following:
Automatic thank-you emails are sent to those that have submitted their web survey (Attachment 14).
Four reminder messages are sent to non-respondents throughout the data collection period.
The first is sent via email to alert respondents of the impending survey due date (Attachment 15).
The second is sent both via USPS and e-mail within a month of the survey due date (Attachment 16).
The third is a reminder postcard sent about a month after the survey due date, via USPS (Attachment 17).
The fourth is sent via email three weeks before the final cutoff of data collection, around the third week of April (Attachment 18).
Reminder telephone calls are made to non-respondents immediately following the survey due date. The scripts are tailored to the size, type, and reporting history of the agency (Attachment 19).
BJS has instituted the practice of sending a closeout email towards the end of data collection. This message describes the status of the agency’s submission. There are three versions of the closeout email: no data, partial data, or data that require clarification (Attachments 20, 21, and 22).
Additional follow-up is conducted as needed with non-respondents:
Non-respondents that indicate they need more time to provide data receive follow-up contact by telephone to resolve data discrepancies and obtain answers to items left unanswered in the survey (Attachment 19).
Certain subsets of non-responding agencies receive tailored communications applicable to their agency characteristics. These subsets may be identified using RTI’s metadata dashboard also known as the adaptive technology dashboard (ATD).
Table 2. Outreach Schedule for ASPP

Event                              Mode         Attachment   Project Week
Pre-notifications                  Mail, email  11           2 months prior to survey open
Web designation form                            12           1 month prior to survey open
Survey invitations                 Mail, email  13           1 month prior to survey open
Submission thank-you email         Email        14           Ongoing upon submission (weeks 1-18)
Non-response follow-up             Phone        19           Ongoing from due date to close (weeks 8-16)
First reminder                                  15           Week 5
Second reminder                                 15           Week 10
Third reminder                                  16           Week 13
Fourth reminder (April postcard)                17           Week 15
Fifth reminders                    Email, mail  18           Week 17
Closeout communications            Email        20, 21, 22   Weeks 15-18
Processing
Upon receipt of a survey (web or hardcopy), data will be reviewed and edited, and if needed, the respondent will be contacted to clarify answers or provide missing information. Respondents who submit via web will be prompted with real-time validation checks when submitting missing or inconsistent data. Any unresolved items that remain after the respondent submits will prompt RTI staff to recontact the respondent to attempt to resolve these issues. Respondents who submit via hardcopy will have their data entered into the web survey by RTI staff and go through the same data quality checks as other web submissions.
Within 2 weeks of survey submission, follow-up activities by RTI will begin. If critical items are missing or inconsistent, such as the beginning year or yearend population or the number of entries to or exits from supervision, RTI staff will contact respondents to determine if they can provide estimates or explanations for inconsistencies (Attachments 23 and 24). Staff will work with the respondents to estimate missing information if it cannot be easily provided, making sure to obtain agreement from the respondent before disseminating data containing any revisions.
Within the first four weeks of the start of the data collection period, preliminary analysis begins. RTI staff check the data for out-of-range values, missing data, and other types of responses that need data editing/cleaning. These preliminary analyses are undertaken while data collection is still in progress to provide adequate time for follow-up clarification calls.
Imputation Procedures
BJS has developed several imputation methods to estimate January 1 and December 31 populations, as well as entries and exits, when respondents are unable to provide any of the key information. During 2022, RTI presented to BJS its research on the imputation methods used for the ASPP, along with additional options for imputation. This meeting resulted in further research into imputation methods to ensure that the best options were being used. In a second presentation to BJS leadership, the following methods were agreed upon for imputing the four core populations for unit and item non-response.
For unit non-response, a combination of the population, entry, and exit imputation methods is applied. When the January 1 probation population is missing, the December 31 population from the prior year is carried over. When the December 31 probation population is missing and the current year entries and exits are also missing, the December 31 population is set to the last reported December 31 value.
When the current year January 1 probation population is missing, the December 31 probation population from the last reported year (going back up to 5 years) is carried forward. When the January 1 parole population is missing, the December 31 parole population from the prior year is likewise carried forward.
When the current year December 31 parole population, total entries, or total exits are missing, the missing values are imputed by adding to (or subtracting from) the current January 1 parole population an estimate of population change based on what was observed in the prior year. The intra-year change in population from January 1 to December 31 of the prior year, expressed as a proportion of the prior year January 1 total, is multiplied by the current year January 1 total to estimate the current year population change.
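The carry-forward and proportional-change imputations described above can be sketched as follows. This is an illustrative reconstruction under the stated rules, not BJS production code; the variable names are assumptions.

```python
def carry_forward_jan1(prior_dec31):
    """Missing January 1 population: carry over the prior year's
    December 31 count."""
    return prior_dec31

def impute_dec31(curr_jan1, prior_jan1, prior_dec31):
    """Missing December 31 population: scale the current January 1 count
    by the prior year's intra-year change, expressed as a proportion of
    the prior year's January 1 total."""
    change_rate = (prior_dec31 - prior_jan1) / prior_jan1
    return round(curr_jan1 * (1 + change_rate))
```

For example, an agency that grew from 2,000 to 2,100 in the prior year (5% growth) and reported 2,200 on January 1 of the current year would have its missing December 31 population imputed as 2,310.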
BJS uses four methods of ratio estimation to impute probation entries for agencies not reporting these data. The first method estimates entries for probation agencies that did not report entries in the current year but did report them in a prior year: BJS takes the ratio of prior year entries to the agency’s prior year January 1 probation population and applies that ratio to the agency’s current year January 1 population. This method is used for agencies that reported all four key items in at least one of the last 5 years and for which the current year January 1 and December 31 populations are equal (likely due to the imputation of one or both of those values). The entries and exits in the most recent of those years are divided by the beginning-of-year and yearend populations from the same year (a stock-flow ratio), and the resulting ratio is multiplied by the current year January 1 population.
The second method is used to estimate probation entries for agencies that did not report all four core variables in any single year within the last 5 years or that have different beginning-of-year and yearend populations. The ratio of prior year entries to the prior year January 1 population is multiplied by the current year January 1 population to derive current year entries.
The third method estimates entries in agencies with small populations. This method estimates the relationship between current year entries and the January 1 population by calculating the ratio of the sums of these variables across similarly sized agencies within the same state. This ratio is then multiplied by the January 1 value to obtain current year entries. To ensure the stability of the ratio estimator, this method is only employed in states with at least 30 reporting units.
The fourth method used to estimate probation entries takes the ratio of prior year imputed entries to the prior year January 1 probation population and applies that ratio to the agency’s current year January 1 population.
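The ratio-estimation idea common to the four methods above can be sketched as follows. This is a hedged illustration, not BJS code: the agency-level ratio and the pooled small-agency ratio (with the minimum of 30 reporting units noted in the text) are from the descriptions above, while the function signatures are assumptions.

```python
def ratio_estimate_entries(prior_entries, prior_jan1, curr_jan1):
    """Agency-level method: apply the agency's prior year
    entries-to-population ratio to its current January 1 population."""
    return round(prior_entries / prior_jan1 * curr_jan1)

def pooled_ratio_estimate(peer_entries, peer_jan1, curr_jan1, min_units=30):
    """Small-agency method: pool entries and January 1 populations across
    similarly sized agencies in the same state; require at least 30
    reporting units so the ratio estimator is stable."""
    if len(peer_entries) < min_units or len(peer_jan1) < min_units:
        return None  # too few reporting units for a stable ratio
    ratio = sum(peer_entries) / sum(peer_jan1)
    return round(ratio * curr_jan1)
```

For instance, an agency that reported 500 entries against a prior January 1 population of 1,000, and has a current January 1 population of 1,200, would be imputed 600 current year entries.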
The specific methods detailed above, and the jurisdictions to which they apply, are documented and displayed as equations in the “Methodology” section of reports in the series Probation and Parole in the United States.1 The imputed values are used for all analyses and reports published by BJS. Imputed values are flagged as such in the files that are sent to NACJD (http://www.icpsr.umich.edu/icpsrweb/NACJD/index.jsp).
Methods to Maximize Response
BJS continues to push for high response rates each year (in 2021, 96% for parole and 76% for probation) and employs several techniques to maximize response rates. These include the following:
Contacting the agencies prior to the start of data collection and making frequent contacts during the data collection period to solicit participation.
Sending web survey invitations which include login instructions for the web survey in both hard copy through the United States Postal Service (USPS) and in electronic format.
Making it easy for agencies to participate by providing technical support and other help with the survey as needed, offering a response mode other than web, if requested, and providing respondents with real-time online data checks to add efficiency to the response process.
Utilizing a data collection site that allows the respondents to: start and save progress and return at a later date; answer prompts from automatic data checks to perform data quality review prior to submission; download or print a PDF version of their completed survey.
Engaging respondents in the data collection process by highlighting BJS reports that provide information central to agency needs (see BJS Probation and Parole publications).
Using RTI’s metadata dashboard to analyze response patterns and determine the most effective methods for contacting and following up with agencies. The dashboard provides BJS and RTI with real-time response data broken down by agency characteristics and compared to historical trends. The dashboard is available by login only and is used by BJS and RTI staff.
Revising the instruments to streamline the CJ-8M as a data collection option for smaller probation agencies. This option allows for better response from smaller agencies that do not collect the detailed data requested in the CJ-8 (see section A, item 5, “Impact on Small Businesses or Entities/Efforts to Minimize Burden” for more information).
BJS monitors the progress of the data collection by reviewing data about respondents and non-respondents, their response characteristics, and their communications throughout the data collection period to inform and enhance non-response follow-up. BJS currently has real-time access to the following data:
Agency contact information (e.g., names of agency heads and designated respondents, street and email addresses, telephone numbers).
Individual files containing the image of each submitted survey from the web instrument including notes provided by the respondents.
Mode and date of survey submission.
Notes describing contacts with agencies as well as follow-up efforts.
Statistics on the current year’s overall response rates and the response rates for each survey type.
Historical data for the agency, including submission status, images of submitted surveys, and notes for the past 5 years.
During the collection cycle, the data are analyzed to assess response patterns (e.g., whether the same respondents are consistently late responders) and missing data on submitted forms, and to develop strategies to address the timeliness and completeness of data submissions.
Tailored follow-up timelines based on these response patterns are put into place so that respondents are not contacted needlessly when their time of submission can be predicted. For example, the collected data label respondents as on time, up to 1 month late, 1 to 3 months late, or more than 3 months late. Non-respondents who have historically submitted on time or up to 1 month late receive reduced non-response follow-up until later in the follow-up period.
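The lateness categories above amount to a simple bucketing of each agency's historical submission timing. A minimal sketch, assuming the categories are defined by days past the due date (the cutoffs mirror the text; the function name and labels are illustrative):

```python
def lateness_category(days_late):
    """Classify a submission relative to the survey due date, using the
    four categories named in the text."""
    if days_late <= 0:
        return "on time"
    if days_late <= 30:
        return "up to 1 month late"
    if days_late <= 90:
        return "1 to 3 months late"
    return "more than 3 months late"
```

Agencies falling in the first two categories would then be scheduled for reduced follow-up early in the collection period.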
During the 2022 collection, non-respondents who were more than a month late were mailed a paper version of the form along with their log-in information for the website. The mailing resulted in an increase in response from those agencies; however, no agency returned a paper form.
Targeted outreach was also conducted by using the metadata dashboard to identify specific groups that were falling behind in survey response. After the conclusion of each year of data collection, BJS reviews the effectiveness of each method of contact with respondents. The postcard added under the last approval has proven an effective follow-up in addition to the traditional email and letter outreach.
Unit Non-Response
As seen in table 1, survey response rates for the past 2 years are about 96% for parole respondents and 80% for probation respondents. To publish national totals of people supervised on probation, BJS developed strategies to impute missing data for key items for agencies that do not respond. These items include the beginning-of-year count, total entries, total exits, and the yearend count. Imputation specifics are described in section 2, Imputation Procedures. For 2021, these imputations accounted for only 2.6% of the probation population and less than 0.5% of the parole population. The probation non-response includes unit non-response (71 agencies in 2020 and 74 agencies in 2021) as well as those that provided incomplete data for the four core populations (33 agencies in 2020 and 46 in 2021).
Item Non-Response
Rates of item nonresponse on the parole survey vary, with maximum sentence continuing to have the highest nonresponse (table 3). Item nonresponse rates for the probation surveys have been higher than for the parole surveys (table 4). The high nonresponse for detailed data on the probation survey is one reason BJS is proposing changes to the CJ-8M and CJ-8. Collecting detailed data only on people on probation for a felony provides a more accurate picture of the population, while not burdening agencies that supervise only people on misdemeanor probation, which most often do not collect detailed data. BJS does not perform any imputation on the detailed data collected.
Table 3. Percent of parole population missing data, by type of data, 2020 and 2021

Type of data                   2020    2021
Total entries                   6%      9%
Entry detail                   13%     14%
Total exits                     5%      9%
Exit detail                    17%     16%
Sex                            17%      8%
Race                           24%     18%
Type of offense                24%     22%
Maximum sentence               28%     21%
Status of supervision          22%     22%
Type of release from prison    23%     21%
Table 4. Percent of probation population missing data, by type of data, 2020 and 2021

Type of data             2020    2021
Total entries             9%     13%
Entry detail             36%     38%
Total exits               4%      5%
Exit detail              30%     31%
Sex                      26%     25%
Race                     30%     34%
Felony/misdemeanor       22%     19%
Type of offense          47%     40%
Status of supervision    29%     22%

Notes: Data exclude unit nonresponse. The number of agencies that did not respond was 71 in 2020 and 74 in 2021. Data exclude respondents reporting on the CJ-8A (Short Form); these items were not asked on that form.
Final Testing of Procedures
After fielding Census of Adult Probation Supervising Agencies (CAPSA; OMB Control Number 1121-0347), BJS also conducted outreach and search efforts that led to a list of 3,560 potentially eligible entities that were supervising persons on probation for misdemeanors. In late 2017 through 2019, BJS worked with RTI (through a cooperative agreement) to gain OMB generic clearance and define the full extent of the under-coverage by collecting information from all potentially eligible entities on a) whether they supervise persons on probation, and if so, the number of persons on probation supervised for felonies and misdemeanors and b) the ability to report on those populations separately. In 2021, BJS released a report that describes the processes and findings from enhancing the survey frame used to conduct the Annual Probation Survey to improve the survey estimates.2
This effort refined the list of reporters eligible for the probation survey, resulting in 86 entities supervising at least one person convicted of a felony and 268 entities supervising persons convicted of a misdemeanor. For 2019, BJS added 66 entities to the frame that reported supervising at least one person convicted of a felony. For 2020, BJS continued to confirm all remaining agencies thought to be supervising persons on probation discovered over the course of the frame development research, including 20 entities that supervise persons convicted of a felony and 268 entities that supervise persons convicted of a misdemeanor. From 2020 to 2022, the CJ-8 and CJ-8A collections confirmed that the new agencies were supervising persons on probation and asked whether they supervised people on probation for a felony or misdemeanor offense.
Using all the data provided from central reporters and the expanded frame, BJS has modified the forms to best report on what is available from agencies while still providing a full enumeration of people on probation supervision in the United States. Agencies supervising people on probation for more serious crimes, often felonies, have more detailed information on their population than smaller agencies supervising people with less serious misdemeanor offenses. In the revised CJ-8, BJS is asking agencies that supervise people on felony probation to report detailed data for felony and misdemeanor probation separately. In addition to the data collected under the generic clearance, BJS has consulted with nine data collection agencies in the development and testing of the additions to the CJ-8, as well as with leadership at the American Probation and Parole Association (APPA). Each year at the APPA Training Institute, BJS holds a meeting to discuss changes in probation and parole. It was widely agreed that smaller agencies, especially those that supervise only people on misdemeanor probation, do not have access to the detailed data requested on the longer CJ-8 and would much rather answer only a few questions.
Contacts for Statistical Aspects and Data Collection
The Corrections Statistics Unit at BJS takes responsibility for the overall design and management of the activities described in this submission, including fielding of the survey, data cleaning, and data analysis. BJS contacts include:
Danielle Kaeble, Statistician
Bureau of Justice Statistics
U.S. Department of Justice
810 Seventh Street, NW
Washington, DC 20531
202-598-1024
1 See Attachment 4, Probation and Parole in the United States, 2021; other reports in the series are available on the BJS website at https://bjs.ojp.gov.
2 Kennedy, Smith, and Mack. (2021) Enhancement of the Bureau of Justice Statistics’ Annual Probation Survey Frame. https://bjs.ojp.gov/library/publications/enhancement-bureau-justice-statistics-annual-probation-survey-frame